Question

Is there a way to batch copy a certain set of Wikipedia articles (about 10,000) to my own MediaWiki site?

EDIT: How do I do this without overwriting similarly named articles/pages? Also, I don't plan on using illegal means (crawlers, etc.).

Solution

If you're looking to obtain a specific set of articles, you can use the Export page (http://en.wikipedia.org/wiki/Special:Export) to obtain an XML dump of the pages involved. You can export multiple pages per request, although you should space out your requests so you don't overload Wikipedia's servers.
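As a rough sketch of how you might script this (the batch size, delay, file names, and User-Agent string below are all illustrative assumptions, not anything Wikipedia mandates):

    # Sketch: POST batches of titles to Special:Export and save the XML.
    import time
    import urllib.parse
    import urllib.request

    EXPORT_URL = "https://en.wikipedia.org/w/index.php?title=Special:Export&action=submit"

    def export_batch(titles, out_path):
        data = urllib.parse.urlencode({
            "pages": "\n".join(titles),  # one title per line
            "curonly": "1",              # current revision only, no history
        }).encode()
        req = urllib.request.Request(EXPORT_URL, data=data,
                                     headers={"User-Agent": "ExportBot/0.1 (me@example.com)"})
        with urllib.request.urlopen(req) as resp, open(out_path, "wb") as f:
            f.write(resp.read())

    titles = [line.strip() for line in open("titles.txt", encoding="utf-8")]
    for i in range(0, len(titles), 100):       # ~100 titles per request
        export_batch(titles[i:i + 100], "batch_%05d.xml" % i)
        time.sleep(5)                          # space out the requests

Each batch_*.xml file is a standard MediaWiki export document, so the import step below applies to it directly.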

You can import the XML dumps into MediaWiki using Special:Import (fine for a handful of small files) or one of the import scripts in the maintenance/ directory (better suited to a bulk job like this). Note that the importer adds imported revisions to the history of any existing page with the same title rather than deleting what's there, though an imported revision with a newer timestamp will become the page's current version.
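On the server side, a minimal sketch of the maintenance-script route, assuming the batch files from above sit in the wiki's installation directory (importDump.php and rebuildrecentchanges.php are standard maintenance scripts, but check their --help output for your MediaWiki version):

    # Sketch: run the MediaWiki importer over each exported batch file.
    import glob
    import subprocess

    for dump in sorted(glob.glob("batch_*.xml")):
        subprocess.run(["php", "maintenance/importDump.php", dump], check=True)

    # Recent-changes data isn't updated during a bulk import; rebuild it once.
    subprocess.run(["php", "maintenance/rebuildrecentchanges.php"], check=True)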

OTHER TIPS

The Wikipedia database is available for download as complete XML dumps (https://dumps.wikimedia.org/), which sidesteps crawling entirely; you can filter out just the pages you want before importing.
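If you go that route, something like the following sketch could filter a full pages-articles dump down to your ~10,000 titles using only the standard library. The file names are placeholders, the export namespace is read from the dump itself, and whether your MediaWiki version accepts the filtered file without a <siteinfo> block is worth testing on a small sample first:

    # Sketch: stream a huge dump and keep only the pages listed in titles.txt.
    import xml.etree.ElementTree as ET

    wanted = {line.strip() for line in open("titles.txt", encoding="utf-8")}

    context = ET.iterparse("enwiki-pages-articles.xml", events=("start", "end"))
    _, root = next(context)               # the <mediawiki> root element
    ns = root.tag[1:root.tag.index("}")]  # export namespace varies by dump version

    with open("filtered.xml", "wb") as out:
        out.write(b'<mediawiki xmlns="' + ns.encode() + b'">\n')
        for event, elem in context:
            if event == "end" and elem.tag == "{%s}page" % ns:
                if elem.findtext("{%s}title" % ns) in wanted:
                    out.write(ET.tostring(elem))
                root.clear()              # keep memory bounded on a multi-GB dump
        out.write(b"</mediawiki>\n")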

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow