Issue
So as you are about to see, I am just starting with Python/Scrapy/programming in general. I am trying to figure out how to do multiple form requests in the same spider. I am trying to scrape data from a clerk and recorder’s webpage, but for two (or more) different names. Here is what gets me the first pages of desired results (for the name “Cruz”):
import scrapy

class LoginSpider(scrapy.Spider):
    name = "CRSpider5"

    login_url = 'http://recordingsearch.car.elpasoco.com/rsui/opr/search.aspx'
    start_urls = [login_url]

    def parse(self, response):
        # grab the hidden ASP.NET fields required for the form postback
        validation = response.css('input[name="__EVENTVALIDATION"]::attr(value)').extract_first()
        state = response.css('input[name="__VIEWSTATE"]::attr(value)').extract_first()
        generator = response.css('input[name="__VIEWSTATEGENERATOR"]::attr(value)').extract_first()

        data = {
            '__EVENTVALIDATION': validation,
            '__VIEWSTATE': state,
            '__VIEWSTATEGENERATOR': generator,
            '__LASTFOCUS': '',
            '__EVENTTARGET': '',
            '__EVENTARGUMENT': '',
            'ctl00$ContentPlaceHolder1$btnSubmit': 'Submit+Search',
            'ctl00$ContentPlaceHolder1$lbxDocumentTypes': 'TRANS',
            'ctl00$ContentPlaceHolder1$txtGrantorGranteeName': 'cruz',
        }
        yield scrapy.FormRequest(url=self.login_url, formdata=data, callback=self.parse_quotes)

    def parse_quotes(self, response):
        # skip the header row and the trailing pager rows of the results table
        for test in response.css('table#ctl00_ContentPlaceHolder1_gvSearchResults tr')[1:-2]:
            yield {
                'Debtor': test.css("span::text").extract_first(),
                'Creditor': test.css("span::text")[1].extract(),
                'Date Recorded': test.css('font::text')[3].extract(),
                'Instrument Number': test.css('font::text').extract_first(),
                'County': 'El Paso'
            }
I would like to do the same thing above but with multiple names (changing the 'ctl00$ContentPlaceHolder1$txtGrantorGranteeName' field to a different name like “smith” or “Jones”). How would I do this in the same spider? Thanks!
Solution
If you want to use a random name for the FormRequest, you can do the following:
import scrapy
import random

class LoginSpider(scrapy.Spider):
    name = "CRSpider5"

    login_url = 'http://recordingsearch.car.elpasoco.com/rsui/opr/search.aspx'
    start_urls = [login_url]

    # names to search for (use a separate attribute so it doesn't
    # overwrite the spider's required "name" attribute)
    names = ['smith', 'Jones']

    def parse(self, response):
        validation = response.css('input[name="__EVENTVALIDATION"]::attr(value)').extract_first()
        state = response.css('input[name="__VIEWSTATE"]::attr(value)').extract_first()
        generator = response.css('input[name="__VIEWSTATEGENERATOR"]::attr(value)').extract_first()

        data = {
            '__EVENTVALIDATION': validation,
            '__VIEWSTATE': state,
            '__VIEWSTATEGENERATOR': generator,
            '__LASTFOCUS': '',
            '__EVENTTARGET': '',
            '__EVENTARGUMENT': '',
            'ctl00$ContentPlaceHolder1$btnSubmit': 'Submit+Search',
            'ctl00$ContentPlaceHolder1$lbxDocumentTypes': 'TRANS',
            # pick one of the names at random for this search
            'ctl00$ContentPlaceHolder1$txtGrantorGranteeName': random.choice(self.names),
        }
        yield scrapy.FormRequest(url=self.login_url, formdata=data, callback=self.parse_quotes)
If you want to search for every name rather than a random one, you can loop over the names list and yield one request per name, as in the sketch below.
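As a rough sketch (not part of the original answer), the parse method of the spider above could be rewritten like this: the hidden ASP.NET fields are extracted once and reused for each request, and only the grantor/grantee name field changes per FormRequest.

    def parse(self, response):
        validation = response.css('input[name="__EVENTVALIDATION"]::attr(value)').extract_first()
        state = response.css('input[name="__VIEWSTATE"]::attr(value)').extract_first()
        generator = response.css('input[name="__VIEWSTATEGENERATOR"]::attr(value)').extract_first()

        # yield one search request per entry in self.names
        for search_name in self.names:
            data = {
                '__EVENTVALIDATION': validation,
                '__VIEWSTATE': state,
                '__VIEWSTATEGENERATOR': generator,
                '__LASTFOCUS': '',
                '__EVENTTARGET': '',
                '__EVENTARGUMENT': '',
                'ctl00$ContentPlaceHolder1$btnSubmit': 'Submit+Search',
                'ctl00$ContentPlaceHolder1$lbxDocumentTypes': 'TRANS',
                'ctl00$ContentPlaceHolder1$txtGrantorGranteeName': search_name,
            }
            yield scrapy.FormRequest(
                url=self.login_url,
                formdata=data,
                callback=self.parse_quotes,
            )

Each request posts to the same URL but with different form data, so Scrapy treats them as distinct requests and parse_quotes runs once per name.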
Answered By - v-wiil