ML / ffm-baseline · Commit bc2dee30
authored Aug 22, 2018 by 张彦钊
parent 41265c35

add print

Showing 1 changed file with 12 additions and 12 deletions:
diaryQueueUpdate.py (+12, -12)
diaryQueueUpdate.py
@@ -216,18 +216,18 @@ def multi_update(key, name_dict):
...
 if __name__ == "__main__":
     start = time.time()
-    # warnings.filterwarnings("ignore")
-    # data_set_cid = pd.read_csv(DIRECTORY_PATH + "data_set_cid.csv")["cid"].values.tolist()
-    #
-    # device_id = "358035085192742"
-    # native_queue_list, nearby_queue_list, nation_queue_list, megacity_queue_list = test_con_sql(device_id)
-    # name_dict = {"native_queue": native_queue_list, "nearby_queue": nearby_queue_list,
-    #              "nation_queue": nation_queue_list, "megacity_queue": megacity_queue_list}
-    # pool = Pool(4)
-    # for key in name_dict.keys():
-    #     pool.apply_async(multi_update,(key,name_dict,))
-    # pool.close()
-    # pool.join()
+    warnings.filterwarnings("ignore")
+    data_set_cid = pd.read_csv(DIRECTORY_PATH + "data_set_cid.csv")["cid"].values.tolist()
+
+    device_id = "358035085192742"
+    native_queue_list, nearby_queue_list, nation_queue_list, megacity_queue_list = test_con_sql(device_id)
+    name_dict = {"native_queue": native_queue_list, "nearby_queue": nearby_queue_list,
+                 "nation_queue": nation_queue_list, "megacity_queue": megacity_queue_list}
+    pool = Pool(4)
+    for key in name_dict.keys():
+        pool.apply_async(multi_update,(key,name_dict,))
+    pool.close()
+    pool.join()
     # # TODO: after launch, switch the per-user prediction to multi-process prediction
     # data_set_cid = pd.read_csv(DIRECTORY_PATH + "data_set_cid.csv")["cid"].values.tolist()
...
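The restored block fans the four queue updates out over a multiprocessing.Pool: one apply_async call per key in name_dict, then close() and join() to wait for all workers. Below is a minimal, self-contained sketch of that same pattern, assuming a placeholder worker; update_queue and the sample queue contents are hypothetical stand-ins for multi_update, test_con_sql, and the real queues built elsewhere in diaryQueueUpdate.py, which are not shown on this page.

    from multiprocessing import Pool


    def update_queue(key, name_dict):
        # Hypothetical stand-in for multi_update: do the per-queue update work.
        print(key, len(name_dict[key]))


    if __name__ == "__main__":
        # Sample data standing in for the queues returned by test_con_sql.
        name_dict = {
            "native_queue": [1, 2, 3],
            "nearby_queue": [4, 5],
            "nation_queue": [6],
            "megacity_queue": [7, 8],
        }
        pool = Pool(4)  # one worker process per queue
        for key in name_dict.keys():
            # apply_async schedules each queue update without blocking the loop
            pool.apply_async(update_queue, (key, name_dict))
        pool.close()  # no more tasks will be submitted
        pool.join()   # block until all four updates have finished

Keeping the worker at module level and guarding the setup with if __name__ == "__main__" keeps the pattern safe under both fork and spawn start methods; apply_async swallows worker exceptions unless you keep the returned AsyncResult and call get() on it, which is worth doing in the real script.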