Scrapy limits allowed response sizes by default through the DOWNLOAD_MAXSIZE and DOWNLOAD_WARNSIZE settings.
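For reference, both limits can be tuned in a project's `settings.py`; the values below are, to my knowledge, Scrapy's documented defaults (a 1 GiB hard limit and a 32 MiB warning threshold):

```python
# settings.py — the two size limits this advisory concerns.
# The values shown are Scrapy's documented defaults.
DOWNLOAD_MAXSIZE = 1073741824   # 1 GiB: larger downloads are aborted
DOWNLOAD_WARNSIZE = 33554432    # 32 MiB: larger downloads log a warning
```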
However, those limits were only enforced while downloading the raw, usually compressed, response bodies, not during decompression, making Scrapy vulnerable to decompression bombs.
A malicious website being scraped could send a small compressed response that, once decompressed, exhausts the memory available to the Scrapy process, potentially affecting any other process sharing that memory; it could also exhaust disk space if uncompressed responses are cached.
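To give a sense of the amplification involved, here is a small, self-contained illustration (not Scrapy-specific) of how little compressed data it takes to produce a very large decompressed body:

```python
import zlib

# Illustration only: a highly repetitive 100 MiB payload shrinks to roughly
# 100 KiB when deflate-compressed, an amplification factor of about 1000x.
original = b"\x00" * (100 * 1024 * 1024)        # 100 MiB of zeros
compressed = zlib.compress(original)

print(f"on the wire: {len(compressed):,} bytes")                    # ~100 KiB
print(f"in memory:   {len(zlib.decompress(compressed)):,} bytes")   # 100 MiB
```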
Upgrade to Scrapy 2.11.1.
If you are using Scrapy 1.8 or a lower version, and upgrading to Scrapy 2.11.1 is not an option, you may upgrade to Scrapy 1.8.4 instead.
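If it helps, a startup-time guard along these lines can confirm that the running version is patched. This is a sketch assuming the `packaging` library is installed; note that it only covers the main line, since a 1.8.x install would need a branch-aware check against 1.8.4 instead:

```python
import scrapy
from packaging.version import Version

# Minimal sketch: fail fast if the installed Scrapy predates the fix.
# Only valid for the main line; the 1.8 series is fixed as of 1.8.4.
if Version(scrapy.__version__) < Version("2.11.1"):
    raise RuntimeError(
        f"Scrapy {scrapy.__version__} is vulnerable to decompression bombs; "
        "upgrade to 2.11.1 or later."
    )
```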
There is no easy workaround.
Disabling HTTP decompression altogether is impractical, as HTTP compression is common practice.
However, it is technically possible to backport the 2.11.1 or 1.8.4 fix manually, copying the patched components of Scrapy into your own code and configuring Scrapy to use them in place of the unpatched originals.
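For example, since the fix touches HTTP compression handling, a backport would typically mean copying the patched middleware into your project and swapping it in through DOWNLOADER_MIDDLEWARES. The sketch below assumes a hypothetical module, `myproject.patched_httpcompression`, holding your copy of the patched code; the priority 590 mirrors the built-in middleware's default position:

```python
# settings.py — a sketch, not part of the official fix: disable the stock
# HttpCompressionMiddleware and register a patched copy in its place.
# "myproject.patched_httpcompression" is a hypothetical module you create
# by copying the fixed code from Scrapy 2.11.1 (or 1.8.4).
DOWNLOADER_MIDDLEWARES = {
    "scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware": None,
    "myproject.patched_httpcompression.HttpCompressionMiddleware": 590,
}
```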
This security issue was reported by @dmandefy through huntr.com.
{ "nvd_published_at": null, "cwe_ids": [ "CWE-409" ], "severity": "HIGH", "github_reviewed": true, "github_reviewed_at": "2024-02-16T16:07:13Z" }