Commit 6bc4066a authored by Davis King

updated docs

--HG--
extra : convert_revision : svn%3Afdd8eb12-d10e-0410-9acb-85c331704f74/trunk%404203
parent 7e9d7f07
......@@ -130,6 +130,9 @@
<li>Relevance vector machines for <a href="ml.html#rvm_trainer">classification</a>
and <a href="ml.html#rvm_regression_trainer">regression</a> </li>
<li>General purpose <a href="ml.html#one_vs_one_trainer">multiclass classification</a> tools</li>
<li>A <a href="ml.html#svm_multiclass_linear_trainer">Multiclass SVM</a></li>
<li>A tool for solving the optimization problem associated with
<a href="optimization.html#structural_svm_problem">structural support vector machines</a>. </li>
<li>An online <a href="ml.html#krls">kernel RLS regression</a> algorithm</li>
<li>An online <a href="ml.html#svm_pegasos">SVM classification</a> algorithm</li>
<li>An online kernelized <a href="ml.html#kcentroid">centroid estimator</a>/novelty detector</li> and
......
......@@ -680,7 +680,7 @@ subject to the following constraint:
A structural SVM, on the other hand, can learn to predict outputs as complex
as entire parse trees. To do this, it learns a function F(x,y) which measures
how well a particular data sample x matches a label y. When used for prediction,
the best label for a new x is then given by the y which maximizes F(x,y).
the best label for a new x is given by the y which maximizes F(x,y).
<br/>
<br/>
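To make the prediction rule above concrete, here is a rough, hypothetical sketch (not dlib's actual structural SVM API; the predict() helper, scorer_type, and candidate_labels container are illustrative only) of picking the label y that maximizes F(x,y):

```cpp
// Hypothetical sketch: predict with a learned scoring function F(x,y) by
// choosing the candidate label y that maximizes F(x,y) for the sample x.
#include <vector>
#include <limits>

template <typename sample_type, typename label_type, typename scorer_type>
label_type predict (
    const sample_type& x,
    const std::vector<label_type>& candidate_labels,
    const scorer_type& F            // F(x,y) scores how well label y fits sample x
)
{
    double best_score = -std::numeric_limits<double>::infinity();
    label_type best_label = candidate_labels.front();
    for (unsigned long i = 0; i < candidate_labels.size(); ++i)
    {
        const double score = F(x, candidate_labels[i]);
        if (score > best_score)
        {
            best_score = score;
            best_label = candidate_labels[i];
        }
    }
    return best_label;              // the y which maximizes F(x,y)
}
```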
......
......@@ -12,12 +12,36 @@
<current>
New Stuff:
- Added a multiclass support vector machine.
- Added a tool for solving the optimization problem associated with
structural support vector machines.
- Added new functions for dealing with sparse vectors: add_to(),
subtract_from(), max_index_plus_one(), fix_nonzero_indexing(), and a
more flexible dot(). Also renamed assign_dense_to_sparse() to assign()
and made it more flexible (see the usage sketch after these notes).
Non-Backwards Compatible Changes:
- Renamed max_index_value_plus_one() (a function for working with graphs) to
max_index_plus_one() so that it uses the same name as the essentially
identical function for working with sparse vectors.
- Simplified the cross_validate_multiclass_trainer(), cross_validate_trainer(),
test_binary_decision_function(), and test_multiclass_decision_function()
routines. They now always return matrices of doubles. This only breaks
previous code if you had been assigning the result into a float or
long double matrix.
- Renamed assign_dense_to_sparse() to assign().
Bug fixes:
- Fixed a bug in load_libsvm_formatted_data(). I had forgotten to clear the
contents of the labels output vector before adding the loaded label data.
- Fixed a bug in the kernel_matrix() function. It didn't compile when used
with sparse samples of type std::vector&lt;std::pair&lt;&gt; &gt;. Moreover, some
of the trainers depend on kernel_matrix(), so this fix also makes those
trainers work with this kind of sparse sample.
Other:
- Added a value_type typedef to matrix_exp so it's easier to write templates
which operate on STL containers and matrix objects.
</current>
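The following is a minimal usage sketch of the new sparse vector helpers listed above. It assumes the usual dlib convention that a sparse vector is a std::vector of (index, value) pairs sorted by index; the exact namespaces and overloads may differ slightly from what is shown here.

```cpp
// Sketch of the new sparse vector helpers: dot(), max_index_plus_one(),
// add_to(), and subtract_from().  Assumes a sparse vector is a
// std::vector<std::pair<unsigned long, double>> sorted by index.
#include <dlib/svm.h>
#include <iostream>
#include <utility>
#include <vector>

int main()
{
    typedef std::vector<std::pair<unsigned long, double> > sparse_vect;

    sparse_vect a, b;
    a.push_back(std::make_pair(0, 1.0));
    a.push_back(std::make_pair(3, 2.0));
    b.push_back(std::make_pair(3, 4.0));

    // dot() handles pairs of sparse vectors.
    std::cout << "dot(a,b) = " << dlib::dot(a, b) << std::endl;          // 8

    // max_index_plus_one() gives the dimensionality implied by the indices.
    std::cout << "dims = " << dlib::max_index_plus_one(a) << std::endl;  // 4

    // add_to()/subtract_from() accumulate sparse vectors into a dense one.
    dlib::matrix<double,0,1> dense = dlib::zeros_matrix<double>(4,1);
    dlib::add_to(dense, a);
    dlib::subtract_from(dense, b);
    std::cout << dlib::trans(dense);                                     // 1 0 0 -2
}
```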
<!-- ******************************************************************************* -->
......
......@@ -133,6 +133,7 @@
<term file="dlib/svm/multiclass_tools_abstract.h.html#find_missing_pairs" name="find_missing_pairs"/>
<term file="ml.html" name="svm_multiclass_linear_trainer"/>
<term link="ml.html#svm_multiclass_linear_trainer" name="Multiclass SVM"/>
<term file="ml.html" name="one_vs_one_trainer"/>
<term file="ml.html" name="one_vs_one_decision_function"/>
<term file="ml.html" name="multiclass_linear_decision_function"/>
......