This article introduces how to use Python for web scraping. Many people run into trouble with these steps in practice, so let's walk through how to handle each situation. Read carefully and you should come away with something useful!
1. Import modules
import re
from bs4 import BeautifulSoup
import requests
import time
import json
import pandas as pd
import numpy as np

2. Check the status code
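A response's status_code tells you whether a request succeeded. As a quick offline sketch of how codes group by their leading digit (the describe helper below is purely illustrative, not part of requests):

```python
from http import HTTPStatus

def describe(code: int) -> str:
    """Classify an HTTP status code by its hundreds group."""
    if 200 <= code < 300:
        return 'success'
    if 300 <= code < 400:
        return 'redirect'
    if 400 <= code < 500:
        return 'client error'
    if 500 <= code < 600:
        return 'server error'
    return 'other'

print(describe(HTTPStatus.OK))         # → success
print(describe(HTTPStatus.NOT_FOUND))  # → client error
```

In real scraping code, `r.raise_for_status()` is a convenient way to turn 4xx/5xx responses into exceptions instead of checking the number by hand.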
r = requests.get('https://github.com')
r.status_code

3. Scrape Zhihu
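Zhihu expects browser-like headers and logged-in cookies, and a requests.Session can carry both across every request it makes. A minimal offline sketch, where the User-Agent string and cookie value are placeholders rather than real credentials:

```python
import requests

# A Session persists headers and cookies across all requests made through it.
s = requests.Session()
s.headers.update({'User-Agent': 'Mozilla/5.0'})  # placeholder UA string
s.cookies.set('z_c0', 'placeholder-token')       # placeholder login cookie

# Every s.get(...) call will now send these automatically.
print(s.headers['User-Agent'])  # → Mozilla/5.0
print(s.cookies.get('z_c0'))    # → placeholder-token
```

Passing `headers=` and `cookies=` explicitly on each `get` call, as the code below does, works the same way; the Session approach just avoids repeating them.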
# browser headers and cookies
headers = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.87 Safari/537.36'}
# the cookie string is session-specific; copy your own from the browser's developer tools
cookies = {'cookie': '_zap=...; d_c0=...; _xsrf=...; capsion_ticket=...; z_c0=...; KLBRSID=...'}
start_url = 'https://www.zhihu.com/api/v3/feed/topstory/recommend?session_token=c03069ed8f250472b687fd1ee704dd5b&desktop=true&page_number=5&limit=6&action=pull&ad_interval=-1&before_id=23'

4. Parse with BeautifulSoup
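Before running against the live site, it helps to see the extraction pattern on a static snippet shaped like Zhihu's feed markup (the HTML below is made up for illustration; real class names and URLs may differ):

```python
from bs4 import BeautifulSoup

# Made-up HTML mimicking a Zhihu feed card with itemprop meta tags.
html = '''
<div class="Card TopstoryItem">
  <div itemprop="zhihu:question">
    <meta itemprop="url" content="https://www.zhihu.com/question/12345">
    <meta itemprop="name" content="Example question title">
  </div>
</div>
'''
soup = BeautifulSoup(html, 'html.parser')
card = soup.find('div', itemprop='zhihu:question')
print(card.find('meta', itemprop='url').get('content'))   # → https://www.zhihu.com/question/12345
print(card.find('meta', itemprop='name').get('content'))  # → Example question title
```

The live code below follows the same shape: find the question container, then read the `url` and `name` meta tags out of it.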
s = requests.Session()
start_url = 'https://www.zhihu.com/'
html = s.get(url=start_url, headers=headers, cookies=cookies, timeout=5)
soup = BeautifulSoup(html.content)
question = []          # question names
question_address = []  # question URLs
temp1 = soup.find_all('div', class_='Card TopstoryItem TopstoryItem-isRecommend')
for item in temp1:
    temp2 = item.find_all('div', itemprop='zhihu:question')
    # print(temp2)
    if temp2 != []:  # skip cards without a question (columns, etc.)
        question_address.append(temp2[0].find('meta', itemprop='url').get('content'))
        question.append(temp2[0].find('meta', itemprop='name').get('content'))

5. Store the information
question_focus_number = []   # follower counts
question_answer_number = []  # answer counts
for url in question_address:
    test = s.get(url=url, headers=headers, cookies=cookies, timeout=5)
    soup = BeautifulSoup(test.content)
    info = soup.find_all('div', class_='QuestionPage')[0]
    # print(info)
    focus_number = info.find('meta', itemprop='zhihu:followerCount').get('content')
    answer_number = info.find('meta', itemprop='answerCount').get('content')
    question_focus_number.append(focus_number)
    question_answer_number.append(answer_number)

6. Organize and output the results
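The zip-into-DataFrame, cast-to-int, and sort pattern can be tried on toy data first (the question names and counts below are made up; column names are illustrative):

```python
import pandas as pd
import numpy as np

# Made-up stand-ins for the scraped lists; counts arrive as strings.
question = ['Q1', 'Q2', 'Q3']
question_focus_number = ['10', '300', '25']
question_answer_number = ['2', '40', '5']

question_info = pd.DataFrame(
    list(zip(question, question_focus_number, question_answer_number)),
    columns=['question', 'followers', 'answers'])

# Cast the string counts to integers so sorting is numeric, not lexicographic.
for col in ['followers', 'answers']:
    question_info[col] = np.array(question_info[col], dtype='int')

# sort_values returns a new DataFrame; reassign to keep the sorted order.
question_info = question_info.sort_values(by='followers', ascending=False)
print(question_info)
```

Without the integer cast, '300' would sort before '5' because string comparison goes character by character.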
question_info = pd.DataFrame(list(zip(question, question_focus_number, question_answer_number)),
                             columns=['question', 'followers', 'answers'])
for item in ['followers', 'answers']:
    question_info[item] = np.array(question_info[item], dtype='int')
question_info.sort_values(by='followers', ascending=False)

Output:
That wraps up our introduction to web scraping with Python. Thanks for reading! If you'd like to learn more about the field, keep an eye on this site for more practical, high-quality articles.