Coroutines

    Scrapy has partial support for the coroutine syntax.

    Warning

    Coroutine support in Scrapy is experimental. Future Scrapy versions may introduce related API and behavior changes without a deprecation period or warning.

    The following callables may be defined as coroutines using async def, and hence use coroutine syntax (e.g. await, async for, async with):

    • Request callbacks.

    The callback output is not processed until the whole callback finishes; as a side effect, if the callback raises an exception, none of its output is processed.

    Because asynchronous generators were introduced in Python 3.6, you can only use yield inside a callback defined with async def on Python 3.6 or later. If you need to output multiple items or requests and you are using Python 3.5, return an iterable (e.g. a list) instead.
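
    For example, on Python 3.5 a callback defined with async def can collect its output into a list and return it (a minimal sketch; the site, selectors and item fields are only placeholders):

        import scrapy


        class QuotesSpider(scrapy.Spider):
            name = 'quotes'
            start_urls = ['http://quotes.toscrape.com']

            async def parse(self, response):
                # Build the output as a regular list instead of yielding,
                # which keeps the callback compatible with Python 3.5.
                items = []
                for quote in response.css('div.quote'):
                    items.append({'text': quote.css('span.text::text').get()})
                return items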

    Usage

    There are several use cases for coroutines in Scrapy. Code that would return Deferreds when written for previous Scrapy versions, such as downloader middlewares and signal handlers, can be rewritten to be shorter and cleaner:
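
    The sketch below shows such Deferred-based code in an item pipeline (db and its get_some_data() method are hypothetical stand-ins for any client whose calls return Deferreds); the coroutine rewrite follows after "becomes:".

        class DbPipeline:
            def _update_item(self, data, item):
                item['field'] = data
                return item

            def process_item(self, item, spider):
                # db.get_some_data() stands in for any client call that
                # returns a Twisted Deferred.
                dfd = db.get_some_data(item['id'])
                dfd.addCallback(self._update_item, item)
                return dfd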

    becomes:
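
        class DbPipeline:
            async def process_item(self, item, spider):
                # Awaiting the Deferred from the same hypothetical db client
                # replaces the separate callback method above.
                item['field'] = await db.get_some_data(item['id'])
                return item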

    Coroutines may be used to call asynchronous code. This includes other coroutines, functions that return Deferreds and functions that return awaitable objects such as Future. This means you can use many useful Python libraries providing such code:
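
    For example, a callback could fetch an additional URL with treq, a Twisted-based HTTP client whose functions return Deferreds (a sketch; the URLs and the returned item are placeholders):

        import scrapy
        import treq


        class MySpider(scrapy.Spider):
            name = 'myspider'
            start_urls = ['https://example.com']

            async def parse(self, response):
                # treq.get() and treq.content() return Deferreds, which can
                # be awaited directly inside the coroutine.
                additional_response = await treq.get('https://additional.example.com')
                additional_data = await treq.content(additional_response)
                # ... use response and additional_data to build items and requests
                return [{'url': response.url, 'extra_bytes': len(additional_data)}]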

    Note

    Many libraries that use coroutines, such as aio-libs, require the asyncio event loop; to use them you need to enable asyncio support in Scrapy.

    Common use cases for asynchronous code include:

    • requesting data from websites, databases and other services (in callbacks, pipelines and middlewares);
    • storing data in databases (in pipelines and middlewares);
    • delaying the spider initialization until some external event (in the spider_opened handler), as sketched below;
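
    As an illustration of the last case, a spider_opened handler written as a coroutine can hold back crawling until an external resource is ready (a sketch; connect_to_db() is a hypothetical coroutine that opens a database connection):

        from scrapy import signals


        class DbConnectionExtension:
            @classmethod
            def from_crawler(cls, crawler):
                ext = cls()
                # spider_opened supports deferred handlers, so a coroutine
                # handler works as well.
                crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
                return ext

            async def spider_opened(self, spider):
                # Crawling is delayed until this coroutine finishes;
                # connect_to_db() is a hypothetical helper.
                spider.db = await connect_to_db(spider.settings)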