Commit ea5f89c6 authored Mar 27, 2016 by Davis King
Renamed variable to make things more clear.

parent 030f5a0a
Showing 2 changed files with 25 additions and 25 deletions (+25 -25)

dlib/dnn/core.h            +6  -6
dlib/dnn/core_abstract.h   +19 -19
dlib/dnn/core.h

@@ -1344,7 +1344,7 @@ namespace dlib
     template <
         size_t num,
-        template<typename> class LAYER,
+        template<typename> class REPEATED_LAYER,
         typename SUBNET
         >
     class repeat
@@ -1353,10 +1353,10 @@ namespace dlib
     public:
         typedef SUBNET subnet_type;
         typedef typename SUBNET::input_type input_type;
-        const static size_t num_layers = (LAYER<SUBNET>::num_layers-SUBNET::num_layers)*num + SUBNET::num_layers;
+        const static size_t num_layers = (REPEATED_LAYER<SUBNET>::num_layers-SUBNET::num_layers)*num + SUBNET::num_layers;
         const static unsigned int sample_expansion_factor = SUBNET::sample_expansion_factor;
-        typedef LAYER<impl::repeat_input_layer> repeated_layer_type;
+        typedef REPEATED_LAYER<impl::repeat_input_layer> repeated_layer_type;

         repeat(
         ) :
@@ -1481,7 +1481,7 @@ namespace dlib
         template <typename solver_type>
         void update(const tensor& x, const tensor& gradient_input, sstack<solver_type> solvers, double step_size)
         {
-            const auto cnt = (LAYER<SUBNET>::num_layers-SUBNET::num_layers);
+            const auto cnt = (REPEATED_LAYER<SUBNET>::num_layers-SUBNET::num_layers);
             if (details.size() > 1)
             {
                 details[0].update(details[1].get_output(), gradient_input, solvers, step_size);
@@ -1565,10 +1565,10 @@ namespace dlib
     template <
         size_t num,
-        template<typename> class LAYER,
+        template<typename> class REPEATED_LAYER,
         typename SUBNET
         >
-    struct is_nonloss_layer_type<repeat<num,LAYER,SUBNET>> : std::true_type {};
+    struct is_nonloss_layer_type<repeat<num,REPEATED_LAYER,SUBNET>> : std::true_type {};

// ----------------------------------------------------------------------------------------
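For orientation, here is a minimal usage sketch of the repeat template whose REPEATED_LAYER parameter was renamed above. The block alias and the net_type below are illustrative only, and the fc/relu/input/loss_multiclass_log aliases are assumed to behave as in the released dlib 19.x examples, which may differ slightly from the tree at this commit:

#include <dlib/dnn.h>
using namespace dlib;

// The REPEATED_LAYER argument is a template that stacks layers onto whatever
// subnet it is given, for example a small fully connected block:
template <typename SUBNET> using block = relu<fc<32, SUBNET>>;

// Stack that block 5 times on top of a base network, then finish with a loss
// layer (hypothetical network, in the spirit of dlib's DNN examples).
using net_type = loss_multiclass_log<
                     fc<10,
                     repeat<5, block,          // block plays the role of REPEATED_LAYER
                     relu<fc<64,
                     input<matrix<float>>
                     >>>>>;

int main()
{
    net_type net;  // 5 copies of block now sit between fc<10,...> and fc<64,...>
    return 0;
}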
dlib/dnn/core_abstract.h

@@ -927,7 +927,7 @@ namespace dlib
     template <
         size_t num,
-        template<typename> class LAYER,
+        template<typename> class REPEATED_LAYER,
         typename SUBNET
         >
     class repeat
@@ -936,11 +936,11 @@ namespace dlib
             REQUIREMENTS ON num
                 - num > 0

-            REQUIREMENTS ON LAYER
-                - LAYER must be a template that stacks more layers onto a deep neural
+            REQUIREMENTS ON REPEATED_LAYER
+                - REPEATED_LAYER must be a template that stacks more layers onto a deep neural
                   network.  For example, if net_type were a network without a loss layer,
                   then it should be legal to create a deeper network with a type of
-                  LAYER<net_type>.
+                  REPEATED_LAYER<net_type>.

             REQUIREMENTS ON SUBNET
                 - One of the following must be true:
@@ -951,8 +951,8 @@ namespace dlib
             WHAT THIS OBJECT REPRESENTS
                 This object adds more layers to a deep neural network.  In particular, it
-                adds LAYER on top of SUBNET num times.  So for example, if num were 2 then
-                repeat<2,LAYER,SUBNET> would create a network equivalent to LAYER<LAYER<SUBNET>>.
+                adds REPEATED_LAYER on top of SUBNET num times.  So for example, if num were 2 then
+                repeat<2,REPEATED_LAYER,SUBNET> would create a network equivalent to REPEATED_LAYER<REPEATED_LAYER<SUBNET>>.

                 Also, this object provides an interface identical to the one defined by the
                 add_layer object except that we add the num_repetitions() and
@@ -964,9 +964,9 @@ namespace dlib
         typedef SUBNET subnet_type;
         typedef typename SUBNET::input_type input_type;
-        const static size_t num_layers = (LAYER<SUBNET>::num_layers-SUBNET::num_layers)*num + SUBNET::num_layers;
+        const static size_t num_layers = (REPEATED_LAYER<SUBNET>::num_layers-SUBNET::num_layers)*num + SUBNET::num_layers;
         const static unsigned int sample_expansion_factor = SUBNET::sample_expansion_factor;
-        typedef LAYER<an_unspecified_input_type> repeated_layer_type;
+        typedef REPEATED_LAYER<an_unspecified_input_type> repeated_layer_type;

         template <typename T, typename ...U>
         repeat(
@@ -975,8 +975,8 @@ namespace dlib
         );
         /*!
             ensures
-                - arg1 is used to initialize the num_repetitions() copies of LAYER inside
-                  this object.  That is, all the LAYER elements are initialized identically
+                - arg1 is used to initialize the num_repetitions() copies of REPEATED_LAYER inside
+                  this object.  That is, all the REPEATED_LAYER elements are initialized identically
                   by being given copies of arg1.
                 - The rest of the arguments to the constructor, i.e. args2, are passed to
                   SUBNET's constructor.
@@ -986,7 +986,7 @@ namespace dlib
         ) const;
         /*!
             ensures
-                - returns num (i.e. the number of times LAYER was stacked on top of SUBNET)
+                - returns num (i.e. the number of times REPEATED_LAYER was stacked on top of SUBNET)
         !*/

         const repeated_layer_type& get_repeated_layer (
@@ -996,10 +996,10 @@ namespace dlib
             requires
                 - i < num_repetitions()
             ensures
-                - returns a reference to the i-th instance of LAYER.  For example,
-                  get_repeated_layer(0) returns the instance of LAYER that is on the top of
+                - returns a reference to the i-th instance of REPEATED_LAYER.  For example,
+                  get_repeated_layer(0) returns the instance of REPEATED_LAYER that is on the top of
                   the network while get_repeated_layer(num_repetitions()-1) returns the
-                  instance of LAYER that is stacked immediately on top of SUBNET.
+                  instance of REPEATED_LAYER that is stacked immediately on top of SUBNET.
         !*/

         repeated_layer_type& get_repeated_layer (
@@ -1009,10 +1009,10 @@ namespace dlib
             requires
                 - i < num_repetitions()
             ensures
-                - returns a reference to the i-th instance of LAYER.  For example,
-                  get_repeated_layer(0) returns the instance of LAYER that is on the top of
+                - returns a reference to the i-th instance of REPEATED_LAYER.  For example,
+                  get_repeated_layer(0) returns the instance of REPEATED_LAYER that is on the top of
                   the network while get_repeated_layer(num_repetitions()-1) returns the
-                  instance of LAYER that is stacked immediately on top of SUBNET.
+                  instance of REPEATED_LAYER that is stacked immediately on top of SUBNET.
         !*/

         const subnet_type& subnet(
@@ -1020,7 +1020,7 @@ namespace dlib
         /*!
             ensures
                 - returns the SUBNET base network that repeat sits on top of.  If you want
-                  to access the LAYER components then you must use get_repeated_layer().
+                  to access the REPEATED_LAYER components then you must use get_repeated_layer().
         !*/

         subnet_type& subnet(
@@ -1028,7 +1028,7 @@ namespace dlib
         /*!
             ensures
                 - returns the SUBNET base network that repeat sits on top of.  If you want
-                  to access the LAYER components then you must use get_repeated_layer().
+                  to access the REPEATED_LAYER components then you must use get_repeated_layer().
         !*/
     };
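As a quick sanity check of the documented behavior, the sketch below compares repeat<2,REPEATED_LAYER,SUBNET> against the hand-unrolled REPEATED_LAYER<REPEATED_LAYER<SUBNET>> using the num_layers formula from this diff, and touches the get_repeated_layer()/num_repetitions() accessors described above. The block and base aliases are hypothetical, and the fc/relu/input aliases are assumed to match the released dlib 19.x API:

#include <dlib/dnn.h>
using namespace dlib;

// The repeated block, i.e. what gets passed as the REPEATED_LAYER parameter.
template <typename SUBNET> using block = relu<fc<32, SUBNET>>;

// A small base network playing the role of SUBNET.
using base = relu<fc<16, input<matrix<float>>>>;

// Per the documentation, repeat<2,block,base> is equivalent to
// block<block<base>>, so both should report the same layer count.
using stacked  = repeat<2, block, base>;
using unrolled = block<block<base>>;
static_assert(stacked::num_layers == unrolled::num_layers,
              "repeat<2,...> unrolls to two stacked copies of the block");

int main()
{
    stacked net;
    // get_repeated_layer(0) is the copy of block at the top of the network;
    // get_repeated_layer(net.num_repetitions()-1) sits directly on SUBNET.
    auto& top = net.get_repeated_layer(0);
    (void)top;
    return 0;
}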