    Initial mixed-precision training (#196) · 08fcf12f
    Simon Layton authored
* Initial mixed-precision training
    
    Adds fp16 support via apex.amp
    Also switches communication to apex.DistributedDataParallel
    
    * Add Apex install to dockerfile
    
    * Fixes from @fmassa review
    
Added support to tools/test_net.py
Renamed SOLVER.MIXED_PRECISION to DTYPE ∈ {float32, float16}
Attempting mixed-precision training without apex.amp installed now raises ImportError
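The new DTYPE option replaces the old SOLVER.MIXED_PRECISION boolean. A minimal config fragment might look like this (a sketch based on the commit text; the comment wording is an assumption):

```
# Hypothetical YACS config fragment: select the training precision via
# the top-level DTYPE option, which accepts float32 or float16.
DTYPE: "float16"
```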
    
    * Remove extraneous apex DDP import
    
    * Move to new amp API
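The ImportError behaviour from the review fixes can be sketched as follows (a minimal illustration, not the repository's exact code; the function name and error message are assumptions):

```python
def require_amp():
    """Return apex.amp, failing loudly when apex is not installed."""
    try:
        from apex import amp
    except ImportError:
        # Raise a clear error instead of silently training in fp32.
        raise ImportError(
            "Mixed-precision training requires apex.amp; "
            "install it from https://github.com/NVIDIA/apex"
        )
    return amp
```

Raising eagerly at startup surfaces the missing dependency immediately, rather than partway through a training run.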