Commit 278c4bcf authored by Davis King

updated the docs

--HG--
extra : convert_revision : svn%3Afdd8eb12-d10e-0410-9acb-85c331704f74/trunk%402452
parent db7e5405
@@ -328,7 +328,7 @@
<file>dlib/optimization.h</file>
<spec_file link="true">dlib/optimization/optimization_abstract.h</spec_file>
<description>
- Performs an unconstrained minimization of the function f() using the
+ Performs an unconstrained minimization of the potentially nonlinear function f() using the
BFGS quasi newton method.
</description>
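For reference, the call pattern for this routine (linked as find_min_quasi_newton in the HTML docs) looks roughly like the sketch below, which minimizes the Rosenbrock function. The sketch uses the find_min()/bfgs_search_strategy() interface from later dlib releases, so treat the exact argument list as an assumption and consult optimization_abstract.h for the signature documented in this revision.

    // Minimal sketch: minimize the Rosenbrock function with the BFGS
    // quasi-newton search, supplying an analytic gradient.
    #include <dlib/optimization.h>
    #include <cmath>
    #include <iostream>

    using namespace dlib;

    typedef matrix<double,0,1> column_vector;

    // f(x,y) = 100*(y - x^2)^2 + (1 - x)^2, with its minimum at (1,1).
    double rosen (const column_vector& m)
    {
        const double x = m(0);
        const double y = m(1);
        return 100.0*std::pow(y - x*x, 2) + std::pow(1.0 - x, 2);
    }

    // Analytic gradient of rosen(); this overload of the minimizer requires it.
    column_vector rosen_derivative (const column_vector& m)
    {
        const double x = m(0);
        const double y = m(1);
        column_vector res(2);
        res(0) = -400.0*x*(y - x*x) - 2.0*(1.0 - x);
        res(1) = 200.0*(y - x*x);
        return res;
    }

    int main()
    {
        column_vector starting_point(2);
        starting_point = 4, 8;      // dlib matrix comma assignment

        find_min(bfgs_search_strategy(),
                 objective_delta_stop_strategy(1e-7),
                 rosen, rosen_derivative, starting_point,
                 -1);               // -1 is just a lower bound on min f(x)

        std::cout << "solution:\n" << starting_point << std::endl;
    }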
@@ -341,7 +341,7 @@
<file>dlib/optimization.h</file>
<spec_file link="true">dlib/optimization/optimization_abstract.h</spec_file>
<description>
- Performs an unconstrained minimization of the function f() using a
+ Performs an unconstrained minimization of the potentially nonlinear function f() using a
conjugate gradient method.
</description>
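The conjugate gradient version has the same call pattern; reusing rosen() and rosen_derivative() from the sketch above, only the search strategy changes (again assuming the later find_min() interface):

    // Same objective and analytic gradient as before; only the search
    // direction computation differs.
    find_min(cg_search_strategy(),
             objective_delta_stop_strategy(1e-7),
             rosen, rosen_derivative, starting_point, -1);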
@@ -354,7 +354,7 @@
<file>dlib/optimization.h</file>
<spec_file link="true">dlib/optimization/optimization_abstract.h</spec_file>
<description>
- Performs an unconstrained minimization of the function f() using the
+ Performs an unconstrained minimization of the potentially nonlinear function f() using the
BFGS quasi newton method. This version doesn't take a gradient function of f()
but instead numerically approximates the gradient.
</description>
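Since this overload approximates the gradient numerically, no derivative function is supplied. A minimal sketch, reusing rosen() and starting_point from above and assuming the find_min_using_approximate_derivatives() interface of later dlib releases:

    // The gradient of rosen() is estimated by finite differences, so only the
    // objective itself is passed in. This costs extra function evaluations
    // but spares the user from deriving the gradient by hand.
    find_min_using_approximate_derivatives(bfgs_search_strategy(),
                                           objective_delta_stop_strategy(1e-7),
                                           rosen, starting_point, -1);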
@@ -368,7 +368,7 @@
<file>dlib/optimization.h</file>
<spec_file link="true">dlib/optimization/optimization_abstract.h</spec_file>
<description>
- Performs an unconstrained minimization of the function f() using a
+ Performs an unconstrained minimization of the potentially nonlinear function f() using a
conjugate gradient method. This version doesn't take a gradient function of f()
but instead numerically approximates the gradient.
</description>
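The derivative-free conjugate gradient variant follows the same pattern, swapping in the conjugate gradient search strategy (same assumptions as the previous sketches):

    // Conjugate gradient with a numerically approximated gradient.
    find_min_using_approximate_derivatives(cg_search_strategy(),
                                           objective_delta_stop_strategy(1e-7),
                                           rosen, starting_point, -1);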
@@ -106,7 +106,7 @@
<a href="dlib/matrix/matrix_utilities_abstract.h.html#svd">singular value decomposition</a>,
<a href="dlib/matrix/matrix_utilities_abstract.h.html#trans">transpose</a>,
<a href="dlib/matrix/matrix_math_functions_abstract.h.html#sin">trig functions</a>, etc...</li>
- <li>Unconstrained optimization algorithms such as
+ <li>Unconstrained non-linear optimization algorithms such as
<a href="algorithms.html#find_min_conjugate_gradient">conjugate gradient</a> and <a href="algorithms.html#find_min_quasi_newton">quasi newton</a> techniques</li>
<li>A <a href="algorithms.html#bigint">big integer</a> object</li>
<li>A <a href="algorithms.html#rand">random number</a> object</li>