1 master and 2 slaves, all Ubuntu virtual machines, configured as follows:
1. master's config.json:
{
    "taskdb": "mysql+taskdb://pyspider:pyspider-pass@192.168.209.128:3306/taskdb",
    "projectdb": "mysql+projectdb://pyspider:pyspider-pass@192.168.209.128:3306/projectdb",
    "resultdb": "mysql+resultdb://pyspider:pyspider-pass@192.168.209.128:3306/resultdb",
    "message_queue": "redis://192.168.209.128:6379/db",
    "phantomjs-proxy": "192.168.209.128:25555",
    "scheduler": {
        "xmlrpc-host": "0.0.0.0",
        "delete-time": 10
    },
    "webui": {
        "port": 5555,
        "username": "",
        "password": "",
        "need-auth": false
    }
}
On the master, run the following in three separate terminals:
/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json scheduler
/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json webui
/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json phantomjs
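Before starting the slaves, a quick sanity check on the master is to confirm the shared services are actually listening on addresses reachable from the LAN. A minimal sketch (the ports are taken from the config above plus pyspider defaults; 23333 as the scheduler's xmlrpc port is an assumption, so check your scheduler's startup output if it differs):

# On the master: list listening TCP sockets for the relevant ports
# 3306 = MySQL, 6379 = Redis, 5555 = webui, 23333 = scheduler xmlrpc (assumed default), 25555 = phantomjs-proxy
ss -ltn | grep -E ':(3306|6379|5555|23333|25555)\b'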
2. slave's config.json:
{
    "taskdb": "mysql+taskdb://pyspider:pyspider-pass@192.168.209.128:3306/taskdb",
    "projectdb": "mysql+projectdb://pyspider:pyspider-pass@192.168.209.128:3306/projectdb",
    "resultdb": "mysql+resultdb://pyspider:pyspider-pass@192.168.209.128:3306/resultdb",
    "message_queue": "redis://192.168.209.128:6379/db",
    "phantomjs-proxy": "192.168.209.128:25555",
    "fetcher": {
        "xmlrpc-host": "192.168.209.128"
    }
}
On each of the two slaves, run the following in three separate terminals:
/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json fetcher
/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json processor
/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json result_worker
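Likewise, before launching the three components on a slave, it can help to verify that the slave can reach the master's MySQL, Redis, phantomjs-proxy and scheduler xmlrpc. A minimal sketch (again, 23333 as the scheduler xmlrpc port is an assumption):

# On each slave: probe the master's shared service ports
for port in 3306 6379 25555 23333; do
    nc -zv -w 3 192.168.209.128 "$port"   # -z: connect only, -w 3: 3-second timeout
done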
I am running everything from the command line for now; I have not brought in Supervisor to manage the processes yet. I want to get the distributed setup debugged first and only then switch to a process manager, so for the moment the command line just means opening a few extra terminals. Here is the strange part: the crawler runs fine, but a single machine and all three machines together finish in essentially the same time, differing by only a few seconds. Looking at the terminal output, the URLs fetched by the two slaves do not overlap, but their activity is interleaved rather than simultaneous: slave1 works for about 4 seconds, then slave2 for about 3 seconds, one after the other instead of in parallel. That seems very odd. Could it be that the scheduler hands out tasks one at a time, so the two fetchers cannot pull tasks concurrently?
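For reference, once the distributed run behaves as expected, a minimal sketch of one Supervisor program entry could look like the following, using the same command and config path as above. The /etc/supervisor/conf.d location and log paths are assumptions based on the Ubuntu supervisor package defaults; the same pattern would be repeated for the other components on each machine:

# Sketch: register one pyspider component (here the fetcher) with Supervisor
sudo tee /etc/supervisor/conf.d/pyspider-fetcher.conf > /dev/null <<'EOF'
[program:pyspider-fetcher]
command=/usr/local/bin/pyspider -c /home/pu/pyspider/conf.json fetcher
autostart=true
autorestart=true
stdout_logfile=/var/log/pyspider-fetcher.out.log
stderr_logfile=/var/log/pyspider-fetcher.err.log
EOF
sudo supervisorctl reread   # pick up the new program definition
sudo supervisorctl update   # start the newly added program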