- 06 Sep, 2016 8 commits
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored: …what it was when this example was trained (since I just modified the default value in dlib to something else).
  - Davis King authored
  - Davis King authored
 
- 05 Sep, 2016 13 commits
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored: …code a little.
  - Davis King authored
  - Davis King authored: …batch normalization running stats window size from 1000 to 100.
  - Davis King authored: …instead of rectangle.
  - Davis King authored: …attendant objects. Also fixed a minor bug in the loss layer.
  - Davis King authored
  - Davis King authored: …image.
 
- 04 Sep, 2016 2 commits
  - Davis King authored
  - Davis King authored
 
- 03 Sep, 2016 7 commits
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored: …that use std::vector in addition to dlib::array.
  - Davis King authored: …with the mapping functions necessary at each layer to support these routines.
  - Davis King authored: …visit_layers_range().
 
- 01 Sep, 2016 1 commit
  - jpblackburn authored: Add an overload of dnn_trainer::train_one_step that takes a pair of iterators rather than a std::vector.
 
- 31 Aug, 2016 5 commits
  - Davis King authored: …singular value of the input matrix rather than as an absolute tolerance.
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored: …calling code.
 
- 30 Aug, 2016 4 commits
  - Davis King authored
  - Davis King authored
  - Davis King authored
  - Davis King authored: …are empty, leading to MATLAB complaining about output arguments not being assigned.
 