Initial mixed-precision training (#196) · 08fcf12f
Simon Layton authored
* Initial mixed-precision training
    
    Adds fp16 support via apex.amp
    Also switches communication to apex.DistributedDataParallel
    
    * Add Apex install to dockerfile
    
    * Fixes from @fmassa review
    
Added support to tools/test_net.py
SOLVER.MIXED_PRECISION replaced by DTYPE ∈ {float32, float16}
Requesting mixed precision without apex.amp installed now raises ImportError
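The flag-to-dtype change above can be sketched as follows. This is a minimal illustration, not the project's actual config code (which uses a yacs config node); the function and constant names here are hypothetical.

```python
# Hypothetical sketch: a DTYPE string replaces the old boolean
# SOLVER.MIXED_PRECISION flag, restricted to the two values
# named in the commit message.
VALID_DTYPES = {"float32", "float16"}


def use_mixed_precision(dtype):
    """Return True when training should run in fp16.

    Raises ValueError for any dtype outside the allowed set, so a
    typo in the config fails loudly instead of silently training
    in full precision.
    """
    if dtype not in VALID_DTYPES:
        raise ValueError(
            f"DTYPE must be one of {sorted(VALID_DTYPES)}, got {dtype!r}"
        )
    return dtype == "float16"
```

A string-valued DTYPE is easier to extend later (e.g. bfloat16) than a boolean flag, which is a plausible motivation for the review change.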
    
    * Remove extraneous apex DDP import
    
    * Move to new amp API
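The fail-fast ImportError behavior from the review fixes can be sketched like this. The helper name and arguments are hypothetical, for illustration only; the real check lives inside the training code and imports apex.amp directly.

```python
import importlib.util


def require_amp(enabled, module="apex.amp"):
    """Fail fast when mixed precision is requested but apex is absent.

    Hypothetical helper illustrating the behavior described in the
    commit message: running with fp16 enabled but without apex.amp
    installed raises ImportError instead of silently falling back.
    """
    if not enabled:
        return False
    # Probe for the top-level package without importing it.
    if importlib.util.find_spec(module.split(".")[0]) is None:
        raise ImportError(
            f"Mixed-precision training requires {module}; "
            "install NVIDIA apex or set DTYPE to float32."
        )
    return True
```

Raising at startup, rather than at the first fp16 op, gives the user an actionable message before any training time is spent.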
Repository tree at this commit:
config
csrc
data
engine
layers
modeling
solver
structures
utils
__init__.py