
Scrapy Getting Values From Multiple Sites

I'm trying to pass a value from one callback function to another. I looked up the docs but didn't understand them. Ref:

def parse_page1(self, response):
    item = MyItem()
    item['main_url'] = res

Solution 1:

This is how you can pass any value, link, etc. from one callback method to another:

import scrapy

class GotoSpider(scrapy.Spider):
    name = 'goto'
    allowed_domains = ['first.com', 'second.com']
    start_urls = ['http://first.com/']

    def parse(self, response):
        name = response.xpath(...).get()
        link = response.xpath(...).get()  # link to second.com, where you may find the price
        request = scrapy.Request(url=link, callback=self.parse_check)
        request.meta['name'] = name  # attach the value to the request
        yield request

    def parse_check(self, response):
        name = response.meta['name']  # read the value back in the second callback
        price = response.xpath(...).get()
        # assuming the fields in your "items.py" are declared as name and price
        yield {'name': name, 'price': price}
