backend / crawler · Commits

Commit e870b0bd, authored Aug 13, 2020 by litaolemo
update
parent 7e08e22c
Showing 1 changed file with 19 additions and 8 deletions.

crawler_sys/framework/search_page_single_process.py (+19, -8) @ e870b0bd
@@ -7,6 +7,9 @@ Created on Tue Dec 4 14:00:03 2018
 import argparse
 import configparser
+import random
+from concurrent.futures.process import ProcessPoolExecutor
 from elasticsearch.helpers import scan
 from elasticsearch import Elasticsearch
+from crawler.crawler_sys.framework.platform_crawler_register import get_crawler
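A side note on the new imports: concurrent.futures.process is the internal module where ProcessPoolExecutor is defined; the documented spelling imports the same class from the package root and is equivalent here:

# Documented import path for the executor class used below:
from concurrent.futures import ProcessPoolExecutor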
@@ -98,20 +101,28 @@ for platform in PLATFORM_LIST:
     initialize_crawler = get_crawler(platform)
     crawler = initialize_crawler()
     KEYWORD_dic = func_search_keywordlist(platform)
+    KEYWORD_dic = random.shuffle(KEYWORD_dic)
+    executor = ProcessPoolExecutor(max_workers=3)
+    futures = []
     for keyword in KEYWORD_dic:
         print("search keyword '%s' on platform %s" % (keyword, platform))
         search_pages = int(KEYWORD_dic[keyword])
         try:
-            crawler.search_page(keyword=keyword,
-                                search_pages_max=search_pages,
+            # try:
+            #     crawler.search_page(keyword=keyword,
+            #                         search_pages_max=search_pages,
+            #                         output_to_es_raw=OUTPUT_TO_ES_RAW,
+            #                         output_to_es_register=OUTPUT_TO_ES_REGISTER,
+            #                         es_index=ES_INDEX,proxies_num=proxies_num)
+            #
+            # except Exception as e:
+            #     print(e)
+            #     continue
+            future = executor.submit(crawler.search_page, keyword,
+                                     search_pages_max=search_pages,
+                                     output_to_es_raw=OUTPUT_TO_ES_RAW,
+                                     output_to_es_register=OUTPUT_TO_ES_REGISTER,
+                                     es_index=ES_INDEX,
+                                     proxies_num=proxies_num)
         except Exception as e:
             print(e)
             continue
+        futures.append(future)
+    executor.shutdown(True)
 # config file absolute path in serve
 # '/home/hanye/crawlers/crawler_sys/framework/config/search_keywords.ini'
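Two details of the new code are worth noting when reading the hunk above. First, random.shuffle expects a mutable sequence and shuffles it in place, returning None: KEYWORD_dic is indexed by keyword below and so appears to be a dict, on which the call would fail at runtime, and even for a list the assignment KEYWORD_dic = random.shuffle(KEYWORD_dic) would rebind the name to None. Second, executor.submit returns a Future immediately and executor.shutdown(True) blocks until all submitted calls finish; an exception raised inside a worker does not hit the try/except around submit but surfaces later, from future.result(). A minimal self-contained sketch of the intended pattern, in which search_page and the keyword data are invented stand-ins for the crawler's real ones:

import random
from concurrent.futures import ProcessPoolExecutor


def search_page(keyword, search_pages_max=1):
    # Stub standing in for crawler.search_page in the diff above.
    return "crawled '%s' (%d pages)" % (keyword, search_pages_max)


if __name__ == "__main__":
    # Hypothetical stand-in for func_search_keywordlist(platform):
    # each keyword maps to the number of result pages to crawl.
    KEYWORD_dic = {"keyword_a": 3, "keyword_b": 5, "keyword_c": 1}

    # random.shuffle works in place on a sequence and returns None,
    # so shuffle a list of the keys instead of rebinding the dict.
    keywords = list(KEYWORD_dic)
    random.shuffle(keywords)

    executor = ProcessPoolExecutor(max_workers=3)
    futures = []
    for keyword in keywords:
        search_pages = int(KEYWORD_dic[keyword])
        # submit() schedules the call in a worker process and returns
        # a Future without waiting for the result.
        futures.append(executor.submit(search_page, keyword,
                                       search_pages_max=search_pages))

    # shutdown(True) waits for every submitted call to complete.
    executor.shutdown(True)
    for future in futures:
        # result() re-raises any exception the worker call raised.
        print(future.result())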