钟尚武 / dlib / Commits / d2545493

Commit d2545493 authored Apr 16, 2016 by Davis King

    renamed EXAMPLE_LAYER_ to EXAMPLE_COMPUTATIONAL_LAYER_

parent 79adbae2

Showing 2 changed files with 109 additions and 105 deletions:
    dlib/dnn/core_abstract.h      +5    -5
    dlib/dnn/layers_abstract.h    +104  -100
dlib/dnn/core_abstract.h

@@ -181,8 +181,8 @@ namespace dlib
     {
         /*!
             REQUIREMENTS ON LAYER_DETAILS
-                - Must be a type that implements the EXAMPLE_LAYER_ interface defined in
-                  layers_abstract.h
+                - Must be a type that implements the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                  defined in layers_abstract.h
             REQUIREMENTS ON SUBNET
                 - One of the following must be true:
...
@@ -1272,9 +1272,9 @@ namespace dlib
     );
     /*!
         ensures
-            - Checks if l correctly implements the EXAMPLE_LAYER_ interface defined in
-              layers_abstract.h.  Importantly, it computes numerical approximations to the
-              gradients and compares them to the outputs of the layer.
+            - Checks if l correctly implements the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+              defined in layers_abstract.h.  Importantly, it computes numerical approximations
+              to the gradients and compares them to the outputs of the layer.
            - The results of the testing are returned.  In particular, if the returned object
              is RESULT then we will have:
                - RESULT.was_good == false if and only if the layer failed the testing.
...
dlib/dnn/layers_abstract.h

@@ -86,21 +86,21 @@ namespace dlib
 // ----------------------------------------------------------------------------------------

-    class EXAMPLE_LAYER_
+    class EXAMPLE_COMPUTATIONAL_LAYER_
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                Each layer in a deep neural network can be thought of as a function,
-                f(data,parameters), that takes in a data tensor, some parameters, and
-                produces an output tensor.  You create an entire deep network by composing
-                these functions.  Importantly, you are able to use a wide range of
-                different functions to accommodate the task you are trying to accomplish.
-                Therefore, dlib includes a number of common layer types but if you want to
-                define your own then you simply implement a class with the same interface
-                as EXAMPLE_LAYER_.
+                Each computational layer in a deep neural network can be thought of as a
+                function, f(data,parameters), that takes in a data tensor, some parameters,
+                and produces an output tensor.  You create an entire deep network by
+                composing these functions.  Importantly, you are able to use a wide range
+                of different functions to accommodate the task you are trying to
+                accomplish.  Therefore, dlib includes a number of common layer types but if
+                you want to define your own then you simply implement a class with the same
+                interface as EXAMPLE_COMPUTATIONAL_LAYER_.

-                Note that there is no dlib::EXAMPLE_LAYER_ type.  It is shown here purely
-                to document the interface that a layer object must implement.
+                Note that there is no dlib::EXAMPLE_COMPUTATIONAL_LAYER_ type.  It is shown
+                here purely to document the interface that a layer object must implement.

                 The central work of defining a layer is implementing the forward and backward
                 methods.  When you do this you have four options:
@@ -127,7 +127,7 @@ namespace dlib
     public:

-        EXAMPLE_LAYER_(
+        EXAMPLE_COMPUTATIONAL_LAYER_(
         );
         /*!
             ensures
...
@@ -136,15 +136,15 @@ namespace dlib
                 layer objects be default constructable.
         !*/

-        EXAMPLE_LAYER_ (
-            const EXAMPLE_LAYER_& item
+        EXAMPLE_COMPUTATIONAL_LAYER_ (
+            const EXAMPLE_COMPUTATIONAL_LAYER_& item
         );
         /*!
             ensures
-                - EXAMPLE_LAYER_ objects are copy constructable
+                - EXAMPLE_COMPUTATIONAL_LAYER_ objects are copy constructable
         !*/

-        EXAMPLE_LAYER_(
+        EXAMPLE_COMPUTATIONAL_LAYER_(
             const some_other_layer_type& item
         );
         /*!
...
@@ -306,8 +306,8 @@ namespace dlib
     };

-    void serialize(const EXAMPLE_LAYER_& item, std::ostream& out);
-    void deserialize(EXAMPLE_LAYER_& item, std::istream& in);
+    void serialize(const EXAMPLE_COMPUTATIONAL_LAYER_& item, std::ostream& out);
+    void deserialize(EXAMPLE_COMPUTATIONAL_LAYER_& item, std::istream& in);
     /*!
         provides serialization support
     !*/
...
@@ -316,7 +316,7 @@ namespace dlib
     // easily composed.  Moreover, the convention is that the layer class ends with an _
     // while the add_layer template has the same name but without the trailing _.
     template <typename SUBNET>
-    using EXAMPLE_LAYER = add_layer<EXAMPLE_LAYER_, SUBNET>;
+    using EXAMPLE_LAYER = add_layer<EXAMPLE_COMPUTATIONAL_LAYER_, SUBNET>;

 // ----------------------------------------------------------------------------------------
 // ----------------------------------------------------------------------------------------
...
@@ -345,9 +345,10 @@ namespace dlib
                 num_outputs > 0

             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a fully connected layer that takes an input
-                tensor and multiplies it by a weight matrix and outputs the results.
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a fully connected layer that
+                takes an input tensor and multiplies it by a weight matrix and outputs the
+                results.
         !*/

     public:
...
@@ -386,7 +387,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/

         friend void serialize(const fc_& item, std::ostream& out);
...
@@ -424,10 +425,10 @@ namespace dlib
                 All of them must be > 0.

             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a convolution layer that takes an input tensor
-                (nominally representing an image) and convolves it with a set of filters
-                and then outputs the results.
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a convolution layer that takes an
+                input tensor (nominally representing an image) and convolves it with a set
+                of filters and then outputs the results.

                 The dimensions of the tensors output by this layer are as follows (letting
                 IN be the input tensor and OUT the output tensor):
...
@@ -496,7 +497,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/

         friend void serialize(const con_& item, std::ostream& out);
...
@@ -523,11 +524,11 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a dropout layer.  Therefore, it passes its inputs
-                through the stochastic function f(x) which outputs either 0 or x.  The
-                probability of 0 being output is given by the drop_rate argument to this
-                object's constructor.
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a dropout layer.  Therefore, it
+                passes its inputs through the stochastic function f(x) which outputs either
+                0 or x.  The probability of 0 being output is given by the drop_rate
+                argument to this object's constructor.

                 Note that, after you finish training a network with dropout, it is a good
                 idea to replace each dropout_ layer with a multiply_ layer because the
...
@@ -560,7 +561,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/
     };
...
@@ -579,10 +580,10 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a basic layer that just multiplies its input
-                tensor with a constant value and returns the result.  It therefore has no
-                learnable parameters.
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a basic layer that just
+                multiplies its input tensor with a constant value and returns the result.
+                It therefore has no learnable parameters.
         !*/

     public:
...
@@ -618,7 +619,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/
     };
...
@@ -646,9 +647,9 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a batch normalization layer that implements the
-                method described in the paper:
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a batch normalization layer that
+                implements the method described in the paper:
                     Batch Normalization: Accelerating Deep Network Training by Reducing
                     Internal Covariate Shift by Sergey Ioffe and Christian Szegedy
...
@@ -722,7 +723,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/

         friend void serialize(const bn_& item, std::ostream& out);
...
@@ -744,10 +745,11 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it applies a simple pointwise linear transformation to an
-                input tensor.  You can think of it as having two parameter tensors, A and
-                B.  If the input tensor is called INPUT then the output of this layer is:
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it applies a simple pointwise linear
+                transformation to an input tensor.  You can think of it as having two
+                parameter tensors, A and B.  If the input tensor is called INPUT then the
+                output of this layer is:
                     A*INPUT+B
                 where all operations are performed element wise and each sample in the
                 INPUT tensor is processed separately.
...
@@ -819,9 +821,10 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Also note that get_layer_params() always returns an empty tensor since there
-            are no learnable parameters in this object.
+            These functions are implemented as described in the
+            EXAMPLE_COMPUTATIONAL_LAYER_ interface.  Also note that get_layer_params()
+            always returns an empty tensor since there are no learnable parameters in this
+            object.
         !*/
     };
...
@@ -850,11 +853,11 @@ namespace dlib
                 All of them must be > 0.

             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a max pooling layer that takes an input tensor
-                and downsamples it.  It does this by sliding a window over the images in an
-                input tensor and outputting, for each channel, the maximum element within
-                the window.
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a max pooling layer that takes an
+                input tensor and downsamples it.  It does this by sliding a window over the
+                images in an input tensor and outputting, for each channel, the maximum
+                element within the window.

                 To be precise, if we call the input tensor IN and the output tensor OUT,
                 then OUT is defined as follows:
...
@@ -920,9 +923,9 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Note that this layer doesn't have any parameters, so the tensor returned by
-            get_layer_params() is always empty.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_
+            interface.  Note that this layer doesn't have any parameters, so the tensor
+            returned by get_layer_params() is always empty.
         !*/

         friend void serialize(const max_pool_& item, std::ostream& out);
...
@@ -956,11 +959,11 @@ namespace dlib
                 All of them must be > 0.

             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines an average pooling layer that takes an input tensor
-                and downsamples it.  It does this by sliding a window over the images in an
-                input tensor and outputting, for each channel, the average element within
-                the window.
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines an average pooling layer that
+                takes an input tensor and downsamples it.  It does this by sliding a window
+                over the images in an input tensor and outputting, for each channel, the
+                average element within the window.

                 To be precise, if we call the input tensor IN and the output tensor OUT,
                 then OUT is defined as follows:
...
@@ -1026,9 +1029,9 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Note that this layer doesn't have any parameters, so the tensor returned by
-            get_layer_params() is always empty.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_
+            interface.  Note that this layer doesn't have any parameters, so the tensor
+            returned by get_layer_params() is always empty.
         !*/

         friend void serialize(const avg_pool_& item, std::ostream& out);
...
@@ -1053,9 +1056,9 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a rectified linear layer.  Therefore, it passes
-                its inputs through the function
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a rectified linear layer.
+                Therefore, it passes its inputs through the function
                     f(x)=max(x,0)
                 where f() is applied pointwise across the input tensor.
         !*/
...
@@ -1071,9 +1074,9 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Note that this layer doesn't have any parameters, so the tensor returned by
-            get_layer_params() is always empty.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_
+            interface.  Note that this layer doesn't have any parameters, so the tensor
+            returned by get_layer_params() is always empty.
         !*/
     };
...
@@ -1092,9 +1095,9 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a parametric rectified linear layer.  Therefore,
-                it passes its inputs through the function
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a parametric rectified linear
+                layer.  Therefore, it passes its inputs through the function
                     f(x) = x>0 ? x : p*x
                 where f() is applied pointwise across the input tensor and p is a scalar
                 parameter learned by this layer.
...
@@ -1130,7 +1133,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/
     };
...
@@ -1149,9 +1152,9 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a sigmoid layer.  Therefore, it passes its inputs
-                through the function
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a sigmoid layer.  Therefore, it
+                passes its inputs through the function
                     f(x)=1/(1+exp(-x))
                 where f() is applied pointwise across the input tensor.
         !*/
...
@@ -1167,9 +1170,9 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Note that this layer doesn't have any parameters, so the tensor returned by
-            get_layer_params() is always empty.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_
+            interface.  Note that this layer doesn't have any parameters, so the tensor
+            returned by get_layer_params() is always empty.
         !*/
     };
...
@@ -1188,9 +1191,9 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a hyperbolic tangent layer.  Therefore, it passes
-                its inputs through the function
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a hyperbolic tangent layer.
+                Therefore, it passes its inputs through the function
                     f(x)=std::tanh(x)
                 where f() is applied pointwise across the input tensor.
         !*/
...
@@ -1206,9 +1209,9 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Note that this layer doesn't have any parameters, so the tensor returned by
-            get_layer_params() is always empty.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_
+            interface.  Note that this layer doesn't have any parameters, so the tensor
+            returned by get_layer_params() is always empty.
         !*/
     };
...
@@ -1227,9 +1230,9 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                In particular, it defines a softmax layer.  To be precise, we define the
-                softmax function s(x) as:
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  In particular, it defines a softmax layer.  To be precise,
+                we define the softmax function s(x) as:
                     s(x) == exp(x)/sum(exp(x))
                 where x is a vector.  Then this layer treats its input tensor as a
                 collection of multi-channel images and applies s() to each spatial location
...
@@ -1253,9 +1256,9 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
-            Note that this layer doesn't have any parameters, so the tensor returned by
-            get_layer_params() is always empty.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_
+            interface.  Note that this layer doesn't have any parameters, so the tensor
+            returned by get_layer_params() is always empty.
         !*/
     };
...
@@ -1277,10 +1280,11 @@ namespace dlib
     {
         /*!
             WHAT THIS OBJECT REPRESENTS
-                This is an implementation of the EXAMPLE_LAYER_ interface defined above.
-                This layer simply adds the output of two previous layers.  In particular,
-                it adds the tensor from its immediate predecessor layer, sub.get_output(),
-                with the tensor from a deeper layer, layer<tag>(sub).get_output().
+                This is an implementation of the EXAMPLE_COMPUTATIONAL_LAYER_ interface
+                defined above.  This layer simply adds the output of two previous layers.
+                In particular, it adds the tensor from its immediate predecessor layer,
+                sub.get_output(), with the tensor from a deeper layer,
+                layer<tag>(sub).get_output().

                 Therefore, you supply a tag via add_prev_'s template argument that tells it
                 what layer to add to the output of the previous layer.  The result of this
...
@@ -1299,7 +1303,7 @@ namespace dlib
         const tensor& get_layer_params() const;
         tensor& get_layer_params();
         /*!
-            These functions are implemented as described in the EXAMPLE_LAYER_ interface.
+            These functions are implemented as described in the EXAMPLE_COMPUTATIONAL_LAYER_ interface.
         !*/
     };