    Initial mixed-precision training (#196) · 08fcf12f
    Simon Layton authored
    * Initial mixed-precision training
    
    Adds fp16 support via apex.amp
    Also switches communication to apex.DistributedDataParallel
    
    * Add Apex install to dockerfile
    
    * Fixes from @fmassa's review
    
    Added mixed-precision support to tools/test_net.py
    Renamed SOLVER.MIXED_PRECISION to DTYPE ∈ {float32, float16}
    Requesting mixed precision without apex.amp installed now raises an ImportError
    
    * Remove extraneous apex DDP import
    
    * Move to new amp API
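The DTYPE option and the ImportError behavior described above can be sketched as follows. This is a hedged illustration, not the repository's actual code: the names `VALID_DTYPES`, `check_dtype`, and `require_amp`, and the exact error messages, are hypothetical.

```python
# Illustrative sketch of the DTYPE config value and the apex.amp import
# guard described in the commit message; all names here are hypothetical.

VALID_DTYPES = ("float32", "float16")


def check_dtype(dtype: str) -> str:
    """Validate a DTYPE setting against the two values the commit allows."""
    if dtype not in VALID_DTYPES:
        raise ValueError(f"DTYPE must be one of {VALID_DTYPES}, got {dtype!r}")
    return dtype


def require_amp():
    """Import apex.amp, raising ImportError with a clear message if absent."""
    try:
        from apex import amp  # NVIDIA's mixed-precision utilities
    except ImportError:
        raise ImportError(
            "Mixed-precision training requires apex.amp; "
            "install NVIDIA Apex to use DTYPE=float16"
        )
    return amp
```

With DTYPE=float32 training proceeds as before; only the float16 path needs Apex, so the import check can be deferred until mixed precision is actually requested.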