Simon Layton authored `08fcf12f`:

* Initial multi-precision training: adds fp16 support via `apex.amp` and switches communication to `apex.DistributedDataParallel`
* Add Apex install to Dockerfile
* Fixes from @fmassa review: added support to `tools/test_net.py`; `SOLVER.MIXED_PRECISION` -> `DTYPE` in {`float32`, `float16`}; a missing `apex.amp` install now raises `ImportError`
* Remove extraneous apex DDP import
* Move to new amp API
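A minimal sketch, assuming apex is installed from https://github.com/NVIDIA/apex, of how the pieces described in the commit might fit together: an `ImportError` when `apex.amp` is missing, initialization through the newer `amp.initialize` API, a float32/float16 switch in the spirit of the `DTYPE` config value, and apex's `DistributedDataParallel` for communication. The helper names (`build_fp16_model`, `training_step`) and the `dtype`-to-`opt_level` mapping are illustrative assumptions, not the repository's exact code.

```python
try:
    from apex import amp
    from apex.parallel import DistributedDataParallel as ApexDDP
except ImportError:
    # Mirrors the behaviour described in the commit: a missing apex install
    # surfaces as an ImportError instead of failing later.
    raise ImportError(
        "apex is required for mixed-precision training "
        "(https://github.com/NVIDIA/apex)"
    )


def build_fp16_model(model, optimizer, dtype="float16"):
    """Illustrative helper: wrap model/optimizer with apex.amp and apex DDP.

    `dtype` plays the role of the DTYPE config value in {"float32", "float16"}.
    """
    # Map the dtype choice onto amp optimization levels (assumption).
    opt_level = "O1" if dtype == "float16" else "O0"
    # New-style amp API: initialize returns the patched model and optimizer.
    model, optimizer = amp.initialize(model, optimizer, opt_level=opt_level)
    # Gradient communication handled by apex's DistributedDataParallel
    # (requires torch.distributed to be initialized).
    model = ApexDDP(model)
    return model, optimizer


def training_step(model, optimizer, images, targets):
    # Assumes the model returns a dict of losses in training mode.
    losses = sum(model(images, targets).values())
    optimizer.zero_grad()
    # Scale the loss through amp so fp16 gradients do not underflow.
    with amp.scale_loss(losses, optimizer) as scaled_loss:
        scaled_loss.backward()
    optimizer.step()
    return losses
```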
| Name | Last commit | Last update |
|---|---|---|
| backbone | | |
| detector | | |
| roi_heads | | |
| rpn | | |
| __init__.py | | |
| balanced_positive_negative_sampler.py | | |
| box_coder.py | | |
| make_layers.py | | |
| matcher.py | | |
| poolers.py | | |
| registry.py | | |
| utils.py | | |