Commit f66c1d40, authored Dec 20, 2018 by benjaminrwilson, committed by Francisco Massa on Dec 20, 2018

Added distributed training check (#287)
parent a0d6edd9

Showing 1 changed file with 10 additions and 0 deletions:

maskrcnn_benchmark/utils/comm.py (+10, -0)
@@ -13,18 +13,24 @@ import torch


 def get_world_size():
+    if not torch.distributed.is_available():
+        return 1
     if not torch.distributed.is_initialized():
         return 1
     return torch.distributed.get_world_size()


 def get_rank():
+    if not torch.distributed.is_available():
+        return 0
     if not torch.distributed.is_initialized():
         return 0
     return torch.distributed.get_rank()


 def is_main_process():
+    if not torch.distributed.is_available():
+        return True
     if not torch.distributed.is_initialized():
         return True
     return torch.distributed.get_rank() == 0
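These fallbacks make the helpers safe to call in an ordinary single-process run. A minimal sketch of the resulting behavior, assuming the maskrcnn_benchmark package is importable and torch.distributed.init_process_group() has not been called:

# Sketch: single-process behavior of the patched helpers. Assumes
# maskrcnn_benchmark is installed and no process group is initialized;
# each helper then returns a safe default instead of raising.
from maskrcnn_benchmark.utils.comm import get_world_size, get_rank, is_main_process

assert get_world_size() == 1      # one process in total
assert get_rank() == 0            # this process has rank 0
assert is_main_process() is True  # rank 0 is the main process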
@@ -35,6 +41,8 @@ def synchronize():
     Helper function to synchronize between multiple processes when
     using distributed training
     """
+    if not torch.distributed.is_available():
+        return
     if not torch.distributed.is_initialized():
         return
     world_size = torch.distributed.get_world_size()
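With the same guard, synchronize() degrades to a no-op outside distributed training, so call sites no longer need their own availability checks. An illustrative call site (the checkpoint-saving helper here is hypothetical, not from this repository):

# Hypothetical call site: rank 0 writes the checkpoint, and every
# process waits at synchronize() afterwards. In a single-process run
# both the is_main_process() test and the synchronize() call fall
# through without touching torch.distributed.
import torch
from maskrcnn_benchmark.utils.comm import is_main_process, synchronize

def save_checkpoint(model, path="model.pth"):  # hypothetical helper
    if is_main_process():
        torch.save(model.state_dict(), path)
    synchronize()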
@@ -103,6 +111,8 @@ def scatter_gather(data):
     # each process will then serialize the data to the folder defined by
     # the main process, and then the main process reads all of the serialized
     # files and returns them in a list
+    if not torch.distributed.is_available():
+        return [data]
     if not torch.distributed.is_initialized():
         return [data]
     synchronize()
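The guard in scatter_gather() also keeps the return shape consistent: a single-process run gets its local data back wrapped in a one-element list, matching what a multi-process run would produce. A small sketch, again assuming the package is importable:

# Sketch: without distributed training, scatter_gather() wraps the
# local result, so downstream code can always iterate over a list of
# per-process outputs. The predictions dict is a hypothetical payload.
from maskrcnn_benchmark.utils.comm import scatter_gather

predictions = {"image_0": [0.1, 0.9]}
gathered = scatter_gather(predictions)
assert gathered == [predictions]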