    Initial mixed-precision training (#196) · 08fcf12f
    Simon Layton authored
    * Initial mixed-precision training
    
    Adds fp16 support via apex.amp
    Also switches communication to apex.DistributedDataParallel
    
    * Add Apex install to dockerfile
    
    * Fixes from @fmassa's review
    
    Added support to tools/test_net.py
    Renamed SOLVER.MIXED_PRECISION to DTYPE, one of {float32, float16}
    A missing apex.amp install now raises ImportError
    
    * Remove extraneous apex DDP import
    
    * Move to new amp API
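
A minimal sketch of how the DTYPE flag described above might gate the apex.amp setup. Only the DTYPE values and the raise-ImportError behaviour come from the commit message; the helper function name and the opt_level mapping are illustrative assumptions, not the repository's actual code:

```python
# Hypothetical sketch: map a DTYPE config value to an apex.amp opt_level.
# DTYPE in {float32, float16} and the ImportError on missing apex.amp
# follow the commit message; everything else here is an assumption.

def setup_mixed_precision(dtype: str):
    """Return the assumed amp opt_level for DTYPE, or None for full precision."""
    if dtype not in ("float32", "float16"):
        raise ValueError(f"DTYPE must be float32 or float16, got {dtype!r}")
    if dtype == "float32":
        return None  # full precision: amp is not involved
    try:
        from apex import amp  # noqa: F401
    except ImportError:
        # Requesting fp16 without apex installed is a hard error.
        raise ImportError("apex.amp is required for DTYPE=float16")
    return "O1"  # assumed mixed-precision level for the new amp API
```

With the new amp API referenced in the last bullet, the returned level would typically be passed to `amp.initialize(model, optimizer, opt_level=...)` before training.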
Directory contents:
    backbone
    detector
    roi_heads
    rpn
    __init__.py
    balanced_positive_negative_sampler.py
    box_coder.py
    make_layers.py
    matcher.py
    poolers.py
    registry.py
    utils.py