1. 26 May, 2016 4 commits
  2. 24 May, 2016 4 commits
  3. 23 May, 2016 5 commits
  4. 22 May, 2016 6 commits
  5. 20 May, 2016 1 commit
  6. 19 May, 2016 1 commit
  7. 17 May, 2016 2 commits
  8. 16 May, 2016 7 commits
  9. 15 May, 2016 2 commits
    • merged · 93bbe5ff
      Davis King authored
    • Changed the solver interface to take the learning rate and the layer details object as an input. · 66166c67
      Davis King authored
      
      This allows the solvers to exhibit a more complex behavior that depends on
      the specific layer.  It also removes the learning rate from the solver's
      parameter set and pushes it entirely into the core training code.  This
      also removes the need for the separate "step size" which previously was
      multiplied with the output of the solvers.
      
      Most of the code is still the same, and in the core and trainer the
      step_size variables have just been renamed to learning_rate.  The
      dnn_trainer's relevant member functions have also been renamed.
      
      The examples have been updated to reflect these API changes.  I also
      cleaned up the resnet definition and added better downsampling.
  10. 14 May, 2016 2 commits
    • merged · 9763c471
      Davis King authored
    • Fixed the in-place layers so that they don't interfere with the operation of skip layers and add_prev style layers. · 8421f213
      Davis King authored
      
      In particular, in-place layers now only overwrite the gradient information
      in their child layer if they are operating in in-place mode.  Otherwise,
      they add their gradients to their child layers.
      
      It should also be noted that it's safe for in-place layers to overwrite
      gradients when in in-place mode, since their child layers are inaccessible
      while in-place layers operate in in-place mode.  This prevents any other
      layers from trying to add to the child layer, thereby avoiding the
      possibility of layer interference.  So the bug this change fixes is that,
      when not in in-place mode, the child layers were still accessible but
      in-place layers were *still* overwriting child gradients.
  11. 13 May, 2016 6 commits