This article shows how to use Python to collect links from Baidu search results that match a specific URL pattern. The walkthrough is detailed and should serve as a useful reference; if you are interested, read it through to the end!
# coding: utf-8
# Collect links matching a specific URL pattern from Baidu search results.
import re
import threading
from queue import Queue
from argparse import ArgumentParser

import requests
from bs4 import BeautifulSoup as bs

arg = ArgumentParser(description='baidu_url_collect py-script by xiaoye')
arg.add_argument('keyword', help='keyword like "inurl:?id=" for searching sqli sites')
arg.add_argument('-p', '--page', help='page count', dest='pagecount', type=int, default=1)
arg.add_argument('-t', '--thread', help='thread count', dest='thread_count', type=int, default=10)
arg.add_argument('-o', '--outfile', help='file to save the results', dest='outfile', default='result.txt')
result = arg.parse_args()

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:50.0) Gecko/20100101 Firefox/50.0'}

class Bd_url(threading.Thread):
    def __init__(self, que):
        threading.Thread.__init__(self)
        self._que = que

    def run(self):
        # Each worker keeps pulling SERP URLs until the queue is drained
        while not self._que.empty():
            URL = self._que.get()
            try:
                self.bd_url_collect(URL)
            except Exception as e:
                print(e)

    def bd_url_collect(self, url):
        r = requests.get(url, headers=headers, timeout=3)
        soup = bs(r.content, 'lxml', from_encoding='utf-8')
        # Result links on a Baidu SERP are <a> tags carrying a data-click attribute
        bqs = soup.find_all(name='a', attrs={'data-click': re.compile(r'.'), 'class': None})
        for bq in bqs:
            # Baidu wraps each result in a redirect; follow it to get the real link
            r = requests.get(bq['href'], headers=headers, timeout=3)
            if r.status_code == 200:
                print(r.url)
                with open(result.outfile, 'a') as f:
                    f.write(r.url + '\n')

def main():
    threads = []
    thread_count = result.thread_count
    que = Queue()
    # Baidu pages by 10 results: &pn=0, 10, 20, ...
    for i in range(0, result.pagecount * 10, 10):
        que.put('https://www.baidu.com/s?wd=' + result.keyword + '&pn=' + str(i))
    for i in range(thread_count):
        threads.append(Bd_url(que))
    for t in threads:
        t.start()
    for t in threads:
        t.join()

if __name__ == '__main__':
    main()

# Usage: python aaaaa.py "inurl:asp?id=" -p 30 -t 30
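The script hinges on one detail of Baidu's result pages: the href of each result points at a Baidu redirect (typically of the form https://www.baidu.com/link?url=...), not at the target site itself. Because requests follows redirects by default and exposes the final address as Response.url, resolving a single result link is short. A minimal sketch of just that step; the redirect token below is a hypothetical placeholder, not a real link:

import requests

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:50.0) Gecko/20100101 Firefox/50.0'}

# A Baidu result href is a redirect through baidu.com/link?url=...
# (EXAMPLE_TOKEN is a made-up placeholder for illustration)
redirect = 'https://www.baidu.com/link?url=EXAMPLE_TOKEN'

r = requests.get(redirect, headers=headers, timeout=3)
print(r.url)  # requests follows the redirect chain; r.url is the final, real URL

If you only need the final address and not the page body, requests.head(redirect, allow_redirects=True) resolves the same chain with less traffic, though some servers handle HEAD requests poorly.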
That is everything in "how to collect Baidu search result links matching a specific URL pattern with Python". Thanks for reading! I hope the content is helpful; for more on related topics, follow the 亿速云 industry news channel!