Issue
I'm building a scraper to crawl a page and return multiple items (h3 & p tags) from within a div. For some reason, the scraper will print all 'name' fields when called, but is only saving info for the last item on the page.
Here's my code:
import scrapy

class FoodSpider(scrapy.Spider):
    name = 'food'
    allowed_domains = ['https://blog.feedspot.com/food_blogs/']
    start_urls = ['https://blog.feedspot.com/food_blogs/']

    def parse(self, response):
        blogs = response.xpath("//div[@class='fsb v4']")
        for blog in blogs:
            names = blog.xpath('.//h3/a[@class="tlink"]/text()'[0:]).extract()
            links = blog.xpath('.//p/a[@class="ext"]/@href'[0:]).extract()
            locations = blog.xpath('.//p/span[@class="location"]/text()'[0:]).extract()
            abouts = blog.xpath('.//p[@class="trow trow-wrap"]/text()[4]'[0:]).extract()
            post_freqs = blog.xpath('.//p[@class="trow trow-wrap"]/text()[6]'[0:]).extract()
            networks = blog.xpath('.//p[@class="trow trow-wrap"]/text()[9]'[0:]).extract()
            for name in names:
                name.split(',')
                # print(name)
            for link in links:
                link.split(',')
            for location in locations:
                location.split(',')
            for about in abouts:
                about.split(',')
            for post_freq in post_freqs:
                post_freq.split(',')
            for network in networks:
                network.split(',')
            yield {'name': name,
                   'link': link,
                   'location': location,
                   'about': about,
                   'post_freq': post_freq,
                   'network': network
                   }
Anyone have an idea on what I'm doing wrong?
Solution
If you run //div[@class='fsb v4'] in DevTools, it returns only a single element. That means your for blog in blogs: loop runs exactly once, each inner loop finishes before the yield, and the yield only ever sees the values left in name, link, etc. after the last iteration. You have to find a selector that matches each individual profile block instead, so that one item is yielded per blog.
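The "only saves the last item" symptom comes from how Python loop variables work: after a for-loop finishes, the loop variable still holds the value from the final iteration. A minimal plain-Python sketch of what happens in the original parse method:

```python
# After a for-loop completes, its loop variable keeps the value
# from the final iteration -- so a yield placed after several such
# loops only ever sees the last element of each extracted list.
names = ["Alice", "Bob", "Carol"]
for name in names:
    pass  # the body runs once per element ...

print(name)  # ... but afterwards `name` is just the last element, "Carol"
```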
import scrapy

class FoodSpider(scrapy.Spider):
    name = 'food'
    # allowed_domains expects bare domain names, not full URLs
    allowed_domains = ['blog.feedspot.com']
    start_urls = ['https://blog.feedspot.com/food_blogs/']

    def parse(self, response):
        # Select each profile row, then read every field relative to
        # that row, so one item is yielded per blog
        for blog in response.css("p.trow.trow-wrap"):
            yield {'name': blog.css(".thumb.alignnone::attr(alt)").extract_first(),
                   'link': "https://www.feedspot.com/?followfeedid=%s" % blog.css("::attr(data)").extract_first(),
                   'location': blog.css(".location::text").extract_first(),
                   }
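If you would rather keep the original approach of extracting parallel lists, the other fix is to zip the lists together and yield one dict per tuple, instead of yielding once after the loops finish. A sketch with plain lists standing in for the .extract() results (the values are made up for illustration):

```python
# Stand-ins for the lists that blog.xpath(...).extract() would return
names = ["Blog A", "Blog B"]
links = ["https://a.example", "https://b.example"]
locations = ["US", "UK"]

def items():
    # zip pairs the parallel lists element-by-element, so each
    # yielded dict describes one blog instead of repeating the last
    for name, link, location in zip(names, links, locations):
        yield {"name": name, "link": link, "location": location}

for item in items():
    print(item)
```

Note that zip silently truncates to the shortest list, so this only works cleanly when every field is present for every blog; the per-row selector above is more robust when some fields are missing.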
Answered By - Umair Ayub