- 08 May, 2017 1 commit
  Davis King authored
- 25 Jun, 2016 1 commit
  Davis King authored
  --HG--
  rename : examples/dnn_mnist_advanced_ex.cpp => examples/dnn_introduction2_ex.cpp
  rename : examples/dnn_mnist_ex.cpp => examples/dnn_introduction_ex.cpp
- 16 May, 2016 2 commits
  Davis King authored
  Davis King authored
- 15 May, 2016 1 commit
  Davis King authored
  object as an input. This allows the solvers to exhibit more complex behavior that depends on the specific layer. It also removes the learning rate from the solver's parameter set and pushes it entirely into the core training code, which also removes the need for the separate "step size" that was previously multiplied with the output of the solvers. Most of the code is still the same; in the core and trainer the step_size variables have simply been renamed to learning_rate, and the dnn_trainer's relevant member functions have been renamed accordingly. The examples have been updated to reflect these API changes. I also cleaned up the resnet definition and added better downsampling.
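  In practical terms, this commit means the learning rate lives on the dnn_trainer while the solver keeps only its own settings (weight decay, momentum), and the layer details object is passed to the solver internally on each update. The sketch below shows what the renamed trainer calls look like, assuming dlib's dnn_trainer interface as described above; the tiny network definition and the empty training_images/training_labels containers are placeholders for illustration, not part of the commit.

  ```cpp
  #include <dlib/dnn.h>
  #include <vector>

  using namespace dlib;

  // A deliberately tiny network, just something for the trainer to hold.
  // The layer sizes are placeholders, not taken from the commit.
  using net_type = loss_multiclass_log<
                       fc<10,
                       relu<fc<32,
                       input<matrix<unsigned char>>>>>>;

  int main()
  {
      std::vector<matrix<unsigned char>> training_images;  // placeholder data
      std::vector<unsigned long>         training_labels;  // placeholder labels

      net_type net;

      // The solver is constructed with only its own parameters (weight decay,
      // momentum); the learning rate is no longer part of the solver's state,
      // and the layer details object is handed to the solver on each update.
      dnn_trainer<net_type> trainer(net, sgd(0.0005, 0.9));

      // These were the "step size" setters before the rename described above.
      trainer.set_learning_rate(0.1);
      trainer.set_min_learning_rate(1e-5);
      trainer.be_verbose();

      // trainer.train(training_images, training_labels);  // would run with real data
      return 0;
  }
  ```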
- 12 Apr, 2016 2 commits
  Davis King authored
  Davis King authored
- 11 Apr, 2016 2 commits
  Davis King authored
  Davis King authored
- 09 Apr, 2016 1 commit
  Davis King authored
  to template arguments. This way, the type of a network specifies the entire network architecture, and most of the time the user doesn't need to do anything with layer constructors at all.
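  To illustrate the idea, here is a hedged sketch of a network declared entirely through its type, in the style of dlib's dnn examples: the architecture is spelled out in the template arguments, so default construction yields a usable network. The particular layers and numeric template arguments (filter counts, kernel sizes, strides, fully connected sizes) are arbitrary illustrative choices, not taken from this commit.

  ```cpp
  #include <dlib/dnn.h>

  using namespace dlib;

  // The whole architecture is encoded in the type: an input layer, two
  // conv/relu/pool stages, and a small fully connected classifier.  All the
  // numeric template arguments are illustrative placeholders.
  using net_type = loss_multiclass_log<
                       fc<10,
                       relu<fc<84,
                       max_pool<2, 2, 2, 2,
                       relu<con<16, 5, 5, 1, 1,
                       max_pool<2, 2, 2, 2,
                       relu<con<6, 5, 5, 1, 1,
                       input<matrix<unsigned char>>>>>>>>>>>>;

  int main()
  {
      // Because the type carries the entire architecture, default construction
      // is enough; no per-layer constructor arguments are required.
      net_type net;
      return 0;
  }
  ```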
- 27 Mar, 2016 1 commit
  Davis King authored
- 07 Feb, 2016 1 commit
  Davis King authored