
[Python crawler] Filtering web page content with the BeautifulSoup library's select method

Date: 2023-04-01 02:10:47


from bs4 import BeautifulSoup

html = """
<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>
<p class="story">Once upon a time there were three little sisters; and their names were
<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>,
<a href="/lacie" class="brother" id="link2">Lacie</a> and
<a href="/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>
<p class="story">...</p>
</body></html>
"""

# Find by tag name

soup = BeautifulSoup(html, "lxml")  # parse the page with the lxml parser

res = soup.select("title")  # find elements whose tag is title

print("tag title : ", res)

>>> tag title :  [<title>The Dormouse's story</title>]  # the result is a list

soup = BeautifulSoup(html, "lxml")

res = soup.select("a")

print("tag a : ", res)

# this list has three elements, because there are three pairs of <a> tags

>>> tag a :  [<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>, <a href="/lacie" class="brother" id="link2">Lacie</a>, <a href="/tillie" class="sister" id="link3">Tillie</a>]

# Find by class name

soup = BeautifulSoup(html, "lxml")

res = soup.select(".title")  # find elements with class="title"

print("find by class name class=title : ", res)

>>> find by class name class=title :  [<p class="title"><b>The Dormouse's story</b></p>]

soup = BeautifulSoup(html, "lxml")

res = soup.select(".sister")

print("find by class name class=sister : ", res)

>>> find by class name class=sister :  [<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>, <a href="/tillie" class="sister" id="link3">Tillie</a>]

# Find by id

soup = BeautifulSoup(html, "lxml")

res = soup.select("#link1")  # find the element with id="link1"

print("find by id id=link1 : ", res)

>>> find by id id=link1 :  [<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>]

# Combined lookup

Combining works the same way as combining tag names with class names or ids when writing CSS. For example, to find the element with id="link1" inside a <p> tag, the two selectors are separated by a space.

soup = BeautifulSoup(html, "lxml")

res = soup.select("p #link1")  # find id=link1 inside a <p> tag

print("combined lookup group=p #link1 : ", res)

>>> combined lookup group=p #link1 :  [<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>]

soup = BeautifulSoup(html, "lxml")

res = soup.select("p .brother")

print("combined lookup group=p .brother : ", res)

>>> combined lookup group=p .brother :  [<a href="/lacie" class="brother" id="link2">Lacie</a>]
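As a complementary sketch (not part of the original article): in CSS selectors the space is significant. Tag and id (or class) written with no space between them select a single tag that itself carries that id, while the spaced form selects a matching descendant. A minimal runnable example, using the same sample document and Python's built-in "html.parser" so it runs without lxml installed:

```python
from bs4 import BeautifulSoup

html = """
<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>
<p class="story">Once upon a time there were three little sisters; and their names were
<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>,
<a href="/lacie" class="brother" id="link2">Lacie</a> and
<a href="/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>
<p class="story">...</p>
</body></html>
"""

# html.parser is the parser bundled with Python; "lxml" behaves the same here.
soup = BeautifulSoup(html, "html.parser")

# No space: the <a> tag that itself has id="link2".
same_tag = soup.select("a#link2")

# With a space: any element with id="link2" that is a descendant of a <p>.
descendant = soup.select("p #link2")

print(same_tag[0].get_text())   # both selectors match the Lacie link here
```

In this document both forms happen to match the same element, but on a page where the id sits on the <p> itself rather than on an <a> inside it, the two selectors would return different results.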

# Find by direct child

soup = BeautifulSoup(html, "lxml")

res = soup.select("head > title")  # the title tag that is a direct child of the head tag

print("find by direct child head > title : ", res)

>>> find by direct child head > title :  [<title>The Dormouse's story</title>]

# Find by attribute

An attribute can also be added to the selector; it must be wrapped in square brackets. Note that the attribute and the tag belong to the same node, so no space may appear between them, otherwise nothing will match.

soup = BeautifulSoup(html, "lxml")

res = soup.select('a[href="/elsie"]')  # <a> tags whose href attribute is /elsie

print("find by attribute : ", res)

>>> find by attribute :  [<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>]
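One natural follow-up the examples above stop short of (a sketch added here, not from the original article): select() returns a list of Tag objects, so the usual next step in a crawler is to pull out text and attribute values. BeautifulSoup also provides select_one(), which returns only the first match (or None). Again using the built-in "html.parser":

```python
from bs4 import BeautifulSoup

html = """
<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>
<p class="story">Once upon a time there were three little sisters; and their names were
<a href="/elsie" class="sister" id="link1"><!-- Elsie --></a>,
<a href="/lacie" class="brother" id="link2">Lacie</a> and
<a href="/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>
<p class="story">...</p>
</body></html>
"""

soup = BeautifulSoup(html, "html.parser")  # "lxml" also works if installed

# Each matched Tag supports dictionary-style attribute access and get_text().
hrefs = [a["href"] for a in soup.select("a")]
print(hrefs)  # the href attribute of every <a> tag

# select_one() returns the first match directly instead of a list.
title = soup.select_one("head > title")
print(title.get_text())
```

Using tag.get("href") instead of tag["href"] is the safer choice when some matched tags might lack the attribute, since it returns None rather than raising KeyError.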
