Question

I am trying to create an exact local copy of a wiki on my Linux machine, but no matter what I try, it never works properly in the end.

The challenge is that I have no access to the wiki other than web access, but a snapshot of the current state would be sufficient. I tried wget, but it fails to download the files properly and does not convert the links inside those pages.
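
For context, a typical wget mirroring command looks something like this (the wiki URL is only a placeholder); these are standard wget options for rewriting links and appending extensions, yet the result still is not a usable copy of a wiki:

    # Mirror the site, rewrite links for local browsing and append .html to
    # extensionless pages (wiki.example.org is a placeholder URL).
    wget --mirror --convert-links --adjust-extension --page-requisites \
         --no-parent https://wiki.example.org/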

I tried websucker.py, but again it did not convert the links properly, and since most wiki pages have no file extension, I could not get my web server (lighttpd) to serve them as text/html.

Does anyone know a working tool, or can anyone tell me which parameters to use with either wget or websucker.py to create a working clone of an existing wiki?


Solution

Since nobody seems to know, I spent a few more hours on Google and found the answer myself. I am posting it here in case others run into the same issue.

Every MediaWiki wiki has an API that, among other features, supports dumps. You can use that API to create a full-history or current-revisions dump of any wiki. See here for a tutorial on how to use dumpgenerator.py, created by the WikiTeam.
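
As a rough sketch of how that works (the wiki URL is a placeholder, and the exact flags can differ between dumpgenerator.py versions, so check its --help first):

    # Fetch the WikiTeam scripts (repository location is an assumption; adjust if it has moved).
    git clone https://github.com/WikiTeam/wikiteam.git
    cd wikiteam

    # Dump the current revisions of every page as XML, plus uploaded images.
    # --api points at the target wiki's api.php endpoint;
    # drop --curonly if you want the full page history.
    python dumpgenerator.py --api=https://wiki.example.org/w/api.php --xml --curonly --images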

You can later import that XML dump either through the Special:Import page or with the importDump.php script, as explained in the MediaWiki manual.
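
For large dumps the command-line route is the safer bet, since Special:Import is subject to the web server's upload limits. A minimal sketch, assuming a standard MediaWiki installation layout (all paths are placeholders):

    # Run from the root of the local MediaWiki installation; dump.xml is the
    # file produced by dumpgenerator.py.
    php maintenance/importDump.php --conf LocalSettings.php /path/to/dump.xml

    # The manual recommends rebuilding derived data such as recent changes afterwards:
    php maintenance/rebuildrecentchanges.php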
