Wikimedia has seen a 50 percent increase in bandwidth used for downloading multimedia content since January 2024, driven by AI crawlers scraping its content to train generative AI models. It needs to address the problem, because the extra load could slow down real readers' access to its pages and assets.
Is there an easy way to download Wikipedia for offline use and periodically update it? I realise it will be a lot of data.
Yes — Wikipedia publishes full database dumps for exactly this purpose: https://en.wikipedia.org/wiki/Wikipedia:Database_download
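As a rough sketch of how that works in practice: the dumps are served from dumps.wikimedia.org, and the `latest` directory always points at the most recent complete dump, so you can re-run the same command periodically to refresh your copy. The exact filename below assumes the standard `pages-articles` dump of English Wikipedia (article text only, no media; on the order of tens of GB compressed) — check the download page for the current layout and for mirrors, which Wikimedia asks heavy users to prefer.

```shell
# Hypothetical refresh script, assuming the documented
# dumps.wikimedia.org/<wiki>/latest/ layout for the enwiki
# pages-articles dump (article text only, bzip2-compressed).
DUMP_URL="https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2"

# Initial download; -c resumes a partially completed transfer
# if the connection drops mid-download.
wget -c "$DUMP_URL"

# Periodic update: -N (timestamping) only re-fetches the file
# when the server's copy is newer than the local one.
wget -N "$DUMP_URL"
```

Note that `latest/` files are overwritten in place with each dump run, so `wget -N` effectively gives you a "download only if a new dump exists" update step; if you need reproducible snapshots, fetch from a dated directory (e.g. `enwiki/20240401/`) instead.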