1. 16 May, 2016 6 commits
  2. 15 May, 2016 2 commits
    • merged · 93bbe5ff
      Davis King authored
    • Changed the solver interface to take the learning rate and the layer details object as an input · 66166c67
      Davis King authored

      This allows the solvers to exhibit more complex behavior that depends on the
      specific layer.  It also removes the learning rate from the solver's parameter
      set and pushes it entirely into the core training code.  This also removes the
      need for the separate "step size" which previously was multiplied with the
      output of the solvers.

      Most of the code is still the same; in the core and trainer the step_size
      variables have just been renamed to learning_rate.  The dnn_trainer's relevant
      member functions have also been renamed.

      The examples have been updated to reflect these API changes.  I also cleaned up
      the resnet definition and added better downsampling.
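      To make the new calling convention concrete, here is a minimal sketch of a
      momentum SGD solver written against the interface described above.  The tensor
      and layer types are simplified stand-ins (std::vector and a small struct), not
      dlib's actual classes, and the per-layer weight decay knob is hypothetical; the
      point is only that the solver receives the learning rate and the layer object
      on every call and returns the update directly, with no separate step size
      applied by the caller.

          #include <cstddef>
          #include <vector>

          using tensor = std::vector<float>;

          // Stand-in for a "layer details" object.  A real layer would expose its
          // parameters plus whatever per-layer settings the solver wants to inspect.
          struct example_layer
          {
              tensor params;
              float weight_decay_multiplier = 1.0f;  // hypothetical per-layer knob
          };

          class sgd_with_momentum
          {
          public:
              explicit sgd_with_momentum(float momentum = 0.9f) : momentum_(momentum) {}

              // The learning rate is an argument rather than a solver parameter, and
              // the returned tensor is the finished update (no extra scaling later).
              const tensor& operator()(float learning_rate,
                                       const example_layer& l,
                                       const tensor& params_grad)
              {
                  if (v_.size() != params_grad.size())
                      v_.assign(params_grad.size(), 0.0f);

                  for (std::size_t i = 0; i < params_grad.size(); ++i)
                  {
                      // Because the solver sees the layer, the update can depend on
                      // layer-specific details such as a weight decay multiplier.
                      const float decay = 0.0005f * l.weight_decay_multiplier;
                      v_[i] = momentum_ * v_[i]
                              - learning_rate * (params_grad[i] + decay * l.params[i]);
                  }
                  return v_;  // the trainer adds this to l.params as-is
              }

          private:
              float momentum_;
              tensor v_;
          };

      A trainer using this interface would simply add the returned update to the
      layer's parameters; there is no longer a separate step size multiplied in
      afterwards.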
  3. 14 May, 2016 2 commits
    • Davis King's avatar
      merged · 9763c471
      Davis King authored
      9763c471
    • Fixed the in-place layers so that they don't interfere with the operation of skip layers and add_prev style layers · 8421f213
      Davis King authored

      In particular, now in-place layers only overwrite the gradient information in
      their child layer if they are operating in in-place mode.  Otherwise, they add
      their gradients to their child layers.

      It should also be noted that it's safe for in-place layers to overwrite
      gradients when in in-place mode, since their child layers are inaccessible when
      in-place layers operate in in-place mode.  This prevents any other layers from
      trying to add to the child layer, thereby avoiding the possibility of layer
      interference.  So the bug this change fixes is that, when not in in-place mode,
      the child layers are still accessible, but in-place layers were *still*
      overwriting child gradients.
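      To illustrate the convention described above, here is a small self-contained
      sketch of a ReLU-style backward pass.  The tensor type is a simplified stand-in
      rather than dlib's tensor class, and in-place operation is detected here by
      comparing addresses (dlib's own layers use an is_same_object style check); the
      point is that the child's gradient is overwritten only when the layer runs
      in-place, and accumulated into otherwise.

          #include <cstddef>
          #include <vector>

          using tensor = std::vector<float>;

          void relu_backward(const tensor& computed_output,  // this layer's forward output
                             const tensor& gradient_input,   // gradient arriving from above
                             tensor& data_grad)              // gradient destined for the child layer
          {
              // In in-place mode the incoming gradient and the child's gradient are
              // the same tensor, so nothing else can be accumulating into it.
              const bool in_place = (&gradient_input == &data_grad);

              for (std::size_t i = 0; i < computed_output.size(); ++i)
              {
                  const float g = computed_output[i] > 0 ? gradient_input[i] : 0.0f;
                  if (in_place)
                      data_grad[i] = g;   // safe to overwrite: the child is not visible elsewhere
                  else
                      data_grad[i] += g;  // skip/add_prev layers may also feed the child, so accumulate
              }
          }

      When not running in-place, accumulating rather than assigning is what lets an
      add_prev or skip layer also contribute to the same child gradient without being
      clobbered.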
  4. 13 May, 2016 7 commits
  5. 10 May, 2016 3 commits
  6. 08 May, 2016 2 commits
  7. 07 May, 2016 1 commit
  8. 05 May, 2016 6 commits
  9. 04 May, 2016 3 commits
  10. 03 May, 2016 1 commit
  11. 01 May, 2016 3 commits
  12. 30 Apr, 2016 2 commits
  13. 29 Apr, 2016 2 commits