Simon Layton authored 08fcf12f

* Initial multi-precision training: adds fp16 support via apex.amp and switches communication to apex.DistributedDataParallel
* Add Apex install to dockerfile
* Fixes from @fmassa review: added support to tools/test_net.py; SOLVER.MIXED_PRECISION -> DTYPE ∈ {float32, float16}; apex.amp not installed now raises ImportError
* Remove extraneous apex DDP import
* Move to new amp API
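The commit above moves mixed-precision support to the newer apex.amp API (`amp.initialize` / `amp.scale_loss`) and to apex's `DistributedDataParallel`. Below is a minimal sketch of how that wiring typically looks; the `build_amp_model` / `training_step` helpers, the `dtype`/`distributed` parameters, and the DTYPE-to-`opt_level` mapping are illustrative assumptions, not the repository's exact code.

```python
from apex import amp
from apex.parallel import DistributedDataParallel as DDP


def build_amp_model(model, optimizer, dtype="float16", distributed=False):
    # Assumed mapping of the DTYPE config value to an amp optimization level:
    # "float16" -> mixed precision ("O1"), "float32" -> pure fp32 ("O0").
    opt_level = "O1" if dtype == "float16" else "O0"
    model, optimizer = amp.initialize(model, optimizer, opt_level=opt_level)
    if distributed:
        # apex DistributedDataParallel handles gradient communication.
        model = DDP(model)
    return model, optimizer


def training_step(model, optimizer, images, targets):
    # Hypothetical detection-style loss dict, summed into a scalar loss.
    loss_dict = model(images, targets)
    losses = sum(loss for loss in loss_dict.values())

    optimizer.zero_grad()
    # amp handles loss scaling; at opt_level "O0" this is effectively a no-op.
    with amp.scale_loss(losses, optimizer) as scaled_loss:
        scaled_loss.backward()
    optimizer.step()
    return loss_dict
```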
| Name | Last commit | Last update |
|---|---|---|
| .. | | |
| box_head | | |
| keypoint_head | | |
| mask_head | | |
| __init__.py | | |
| roi_heads.py | | |