Simon Layton authored

* Initial multi-precision training. Adds fp16 support via apex.amp; also switches communication to apex.DistributedDataParallel
* Add Apex install to Dockerfile
* Fixes from @fmassa review: added support to tools/test_net.py; SOLVER.MIXED_PRECISION -> DTYPE \in {float32, float16}; apex.amp not installed now raises ImportError
* Remove extraneous apex DDP import
* Move to new amp API
08fcf12f
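The commit's DTYPE switch and ImportError behavior can be sketched roughly as follows (a minimal illustration, not the repository's actual code: the helper name `wrap_mixed_precision` and the bare `model`/`optimizer` objects are assumptions; `amp.initialize` is the "new amp API" the last bullet refers to):

```python
try:
    from apex import amp  # per the commit: missing apex.amp must raise ImportError
except ImportError:
    amp = None


def wrap_mixed_precision(model, optimizer, dtype="float32"):
    """Enable apex.amp only when the configured DTYPE is float16.

    Mirrors the SOLVER.MIXED_PRECISION -> DTYPE change described in the
    commit: float32 leaves training untouched, float16 requires apex.
    """
    if dtype == "float16":
        if amp is None:
            raise ImportError("apex.amp is required when DTYPE is float16")
        # New amp API: initialize patches the model and optimizer for fp16.
        model, optimizer = amp.initialize(model, optimizer, opt_level="O1")
    return model, optimizer
```

In the float16 path the backward pass would then go through `amp.scale_loss(loss, optimizer)` instead of calling `loss.backward()` directly, so gradients are unscaled before the optimizer step.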
Files:

* dcn/
* __init__.py
* _utils.py
* batch_norm.py
* misc.py
* nms.py
* roi_align.py
* roi_pool.py
* sigmoid_focal_loss.py
* smooth_l1_loss.py