# crawler

## Releaser-page crawler
1. Deployed on BJ-GM-Prod-Cos-faiss001 under /srv/apps/; scheduled via `crontab -e`
2. Switch to the service user: `sudo su - gmuser`
3. Activate conda: `source /root/anaconda3/bin/activate`
4. Enter/leave the virtual environment: `conda activate crawler_env` / `conda deactivate`
5. Start the fetch process: `nohup python /srv/apps/crawler/crawler_sys/framework/update_data_in_target_releasers_multi_process_by_date_from_redis.py > /data/log/fect_task.log &`
6. Write the URLs to crawl into Redis: `python /srv/apps/crawler/crawler_sys/framework/write_releasers_to_redis.py -p weibo -d 1 -proxies 2`
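Steps 5 and 6 form a producer/consumer pair: `write_releasers_to_redis.py` queues releaser-page tasks in Redis, and the multi-process fetcher drains that queue. A minimal sketch of this handoff is below; the key name, task schema, and the in-memory `FakeRedis` stand-in are all assumptions for illustration, not taken from the repo (the real scripts use an actual Redis server).

```python
import json
from collections import deque


class FakeRedis:
    """Stand-in for redis.Redis so the sketch runs without a server.

    lpush/rpop mirror the redis-py calls a real handoff would use.
    """

    def __init__(self):
        self._lists = {}

    def lpush(self, key, value):
        self._lists.setdefault(key, deque()).appendleft(value)

    def rpop(self, key):
        q = self._lists.get(key)
        return q.pop() if q else None


TASK_KEY = "releaser_page_tasks"  # hypothetical key name


def write_releasers(r, platform, releaser_urls):
    """Producer: queue one JSON task per releaser (cf. write_releasers_to_redis.py)."""
    for url in releaser_urls:
        r.lpush(TASK_KEY, json.dumps({"platform": platform, "releaser_url": url}))


def fetch_tasks(r):
    """Consumer: drain the queue in FIFO order (cf. the multi-process fetcher)."""
    while (raw := r.rpop(TASK_KEY)) is not None:
        task = json.loads(raw)
        yield task["platform"], task["releaser_url"]


r = FakeRedis()
write_releasers(r, "weibo", ["https://weibo.com/u/123", "https://weibo.com/u/456"])
tasks = list(fetch_tasks(r))
print(tasks)  # tasks come back in the order they were queued
```

Because `lpush` adds on the left and `rpop` removes from the right, the queue is FIFO, so multiple fetcher processes can pop from the same key without re-crawling a page.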
## Search-page crawler
pass

## Weekly data report
1. Switch to the service user: `sudo su - gmuser`
2. Activate conda: `source /root/anaconda3/bin/activate`
3. Run the report script: `python crawler/crawler_sys/utils/get_query_result.py`
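The report step boils down to aggregating the week's crawl results per platform. A minimal sketch of that aggregation is below, assuming the records carry a platform name and a fetch date; the field names and sample data are hypothetical, and the real `get_query_result.py` presumably pulls records from a data store instead of a hard-coded list.

```python
from collections import Counter
from datetime import date, timedelta


def weekly_counts(records, today):
    """Count fetched items per platform over the trailing 7 days."""
    week_ago = today - timedelta(days=7)
    counter = Counter(
        rec["platform"]
        for rec in records
        if week_ago <= rec["fetch_date"] <= today
    )
    return dict(counter)


# Hypothetical sample records standing in for real crawl results.
records = [
    {"platform": "weibo", "fetch_date": date(2024, 5, 6)},
    {"platform": "weibo", "fetch_date": date(2024, 5, 7)},
    {"platform": "douyin", "fetch_date": date(2024, 4, 1)},  # outside the window
]
print(weekly_counts(records, today=date(2024, 5, 8)))  # → {'weibo': 2}
```

Records older than seven days are filtered out before counting, so re-running the script on any day yields that day's trailing-week totals.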