Commit 7f7ddcaa authored by Davis King

merged

parents 20536bec 27f04a37
@@ -161,7 +161,7 @@ namespace dlib
         }
 #ifdef DLIB_HAS_RVALUE_REFERENCES
-        array2d(array2d&& item)
+        array2d(array2d&& item) : array2d()
         {
             swap(item);
         }
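
The added ": array2d()" delegation matters because this move constructor is
implemented via swap: without delegating to the default constructor first,
swap(item) would run while this object's members are still uninitialized, so
item would be left holding indeterminate values that its destructor could then
act on. A minimal sketch of the same delegate-then-swap idiom, shown on a
hypothetical buffer class rather than dlib's actual array2d:

    #include <cstddef>
    #include <utility>

    class buffer
    {
    public:
        buffer() : data(nullptr), size(0) {}

        // Delegating to buffer() first guarantees data/size hold the
        // well-defined values nullptr/0 before the swap.  Without it, the
        // swap would hand item uninitialized garbage, and item's destructor
        // would later call delete[] on an indeterminate pointer.
        buffer(buffer&& item) : buffer()
        {
            swap(item);
        }

        ~buffer() { delete[] data; }

        void swap(buffer& other)
        {
            std::swap(data, other.data);
            std::swap(size, other.size);
        }

    private:
        char* data;
        std::size_t size;
    };
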
@@ -70,8 +70,8 @@ namespace dlib
         /*!
             ensures
                 - #size() == N
-                - Initializes all N elements in this stack with the given item.
-                  E.g. top()==item, pop().top()==item, pop().pop().top()==item, etc.
+                - Initializes all N elements in this stack with the given item. E.g.
+                  top()==item, pop().top()==item, pop().pop().top()==item, etc.
         !*/
         const T& top(
@@ -93,8 +93,8 @@ namespace dlib
         ) const;
         /*!
             ensures
-                - returns the number of elements in this stack. In particular, the
-                  number returned is always N.
+                - returns the number of elements in this stack. In particular, the number
+                  returned is always N.
         !*/
         const sstack<T,N-1>& pop(
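
The spec above describes a stack whose size is fixed at compile time: size()
always returns N, and pop() exposes a stack one element shorter. A simplified
sketch consistent with that interface (a recursive template, not necessarily
dlib's actual implementation):

    #include <cstddef>

    template <typename T, std::size_t N>
    class sstack
    {
    public:
        sstack() {}
        // Fills this element and, recursively, all N-1 below it.
        explicit sstack(const T& item) : elem(item), rest(item) {}

        const T& top() const { return elem; }
        std::size_t size() const { return N; }

        // pop() just exposes the tail stack; nothing is destroyed.
        const sstack<T,N-1>& pop() const { return rest; }

    private:
        T elem;
        sstack<T,N-1> rest;
    };

    // Base case ends the recursion; a 1-element stack has no tail to pop to.
    template <typename T>
    class sstack<T,1>
    {
    public:
        sstack() {}
        explicit sstack(const T& item) : elem(item) {}
        const T& top() const { return elem; }
        std::size_t size() const { return 1; }
    private:
        T elem;
    };

Under this sketch, sstack<int,3> s(42); satisfies the ensures clauses above:
s.top()==42, s.pop().top()==42, and s.pop().pop().top()==42.
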
@@ -44,10 +44,10 @@ namespace dlib
             ensures
                 - Constructs this object from item. This form of constructor is optional
                   but it allows you to provide a conversion from one input layer type to
-                  another. For example, the following code is valid only if my_input2 can
-                  be constructed from my_input1:
-                    relu<fc<relu<fc<my_input1>>>> my_dnn1;
-                    relu<fc<relu<fc<my_input2>>>> my_dnn2(my_dnn1);
+                  another. For example, the following code is valid only if my_input_layer2 can
+                  be constructed from my_input_layer1:
+                    relu<fc<relu<fc<my_input_layer1>>>> my_dnn1;
+                    relu<fc<relu<fc<my_input_layer2>>>> my_dnn2(my_dnn1);
                   This kind of pattern is useful if you want to use one type of input layer
                   during training but a different type of layer during testing since it
                   allows you to easily convert between related deep neural network types.
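
Concretely, all the pattern requires is that the destination network's input
layer have a constructor taking the source network's input layer. A
hypothetical sketch (my_input_layer1 and my_input_layer2 are the placeholder
names from the spec above, not real dlib types):

    // Input layer used during training.
    class my_input_layer1
    {
        // ... to_tensor() and the rest of the input layer interface ...
    };

    // Input layer used during testing.  The converting constructor below is
    // the optional piece of the interface this spec describes: providing it
    // is what makes
    //     relu<fc<relu<fc<my_input_layer2>>>> my_dnn2(my_dnn1);
    // compile when my_dnn1's input layer is my_input_layer1.
    class my_input_layer2
    {
    public:
        my_input_layer2() {}
        my_input_layer2(const my_input_layer1& item)
        {
            // copy over whatever settings carry across from item
        }
        // ... to_tensor() and the rest of the input layer interface ...
    };
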