Question

How do I save a 10-page website in one piece to view it offline (with its CSS and JS in place)?

I tried saving a webpage for offline viewing from the browser, in Firefox:

File > Save page as > Webpage (complete), HTML.

The webpage is saved in a folder containing its HTML, CSS, and JS files, but it appears messy, not as it looks online.

The [internet archive wayback machine (IAWM)][1] can save one page at a time to the archive but I want to save all in one piece.

I could use Firefox Screengrab to make screenshots of all pages on the website but this isn't saving a whole page and is also repetitive (AFAIR).

What options do I have left? Is there still a way to save a website of 30 pages in one piece to view offline (with CSS and JS in place so it doesn't look messy, as in the first option)?

Maybe some JS code could help, or maybe curl or wget in WSL. I don't know what approach to take here for a minimal solution, as what I want is "saving the whole website at once so that all web pages look just like they do on the web".

Solution

Alternatively, if you have control of the website/app, try to design it to work offline. The basic idea is to store as much data as you can on the client while you are online, for later offline use. With a service worker, very little work is needed for the developer to decide whether data is being fetched from the network (e.g. via AJAX) or served from the offline cache. For more on service workers, see https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API
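
For illustration, here is a minimal cache-first service worker sketch in the spirit of the MDN guide linked above; the file name sw.js and the asset list are placeholders you would adapt to your own site:

    // sw.js — minimal cache-first service worker (asset names are placeholders)
    const CACHE_NAME = 'offline-cache-v1';
    const ASSETS = ['/', '/index.html', '/styles.css', '/app.js'];

    self.addEventListener('install', (event) => {
      // Pre-cache the core assets while we are still online.
      event.waitUntil(caches.open(CACHE_NAME).then((cache) => cache.addAll(ASSETS)));
    });

    self.addEventListener('fetch', (event) => {
      // Serve from the cache first, fall back to the network, and store
      // successful GET responses for later offline use.
      event.respondWith(
        caches.match(event.request).then((cached) => {
          if (cached) return cached;
          return fetch(event.request).then((response) => {
            if (event.request.method === 'GET' && response.ok) {
              const copy = response.clone();
              caches.open(CACHE_NAME).then((cache) => cache.put(event.request, copy));
            }
            return response;
          });
        })
      );
    });

The page would register it with something like `navigator.serviceWorker.register('/sw.js')`; browsers only allow this over HTTPS or on localhost.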

Check out https://angular.io/tutorial; it works quite well offline in Chrome.

Other tips

Tools like wget can spider the site and capture all the static content. If you have dynamic content based on forms, JavaScript, or other technologies, it is unlikely to be practical to do easily.
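
As a hedged example, assuming the site lives at https://example.com (substitute the real address), the usual wget invocation for an offline mirror grabs the pages plus their CSS/JS/image requisites and rewrites links to point at the local copies:

    wget --mirror --page-requisites --convert-links --adjust-extension --no-parent https://example.com/

Here `--mirror` turns on recursive downloading, `--page-requisites` pulls in the stylesheets, scripts, and images each page needs, `--convert-links` rewrites links for local browsing, `--adjust-extension` saves files with .html extensions, and `--no-parent` keeps the crawl inside the given path.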

It may be possible to run a clone of the site on your laptop. Many web servers have versions that will run on a laptop. This is a common approach for website development.
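
For example, a folder produced by wget (or by Firefox's "Save page as") can be served locally so relative URLs resolve the same way they do online. Assuming Node.js is installed and `./saved-site` is the hypothetical folder name:

    npx http-server ./saved-site

Any small static server works here; serving the files just avoids the quirks some pages have when opened via file:// URLs.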

It's not a straightforward task at all. If it were just HTML and CSS it would be easy, but modern websites have state--lots of it.

Here's a general idea that may work:

  1. The user inputs the address of the site to save (in your tool)
  2. Have your tool record all requests/responses for the given site (XHR requests, images, fonts, documents, etc.) and store the requests and responses in some kind of database or file. The user can follow links and trigger the JavaScript actions that they wish to use offline.
  3. Later, when the user wishes to view the site offline, have them pick the file saved in #2
  4. Open the base document (a response saved in #2) and mock each request the document makes (for images, fonts, XHR, etc.), returning the response that was captured in #2

This idea has some issues that you'll have to resolve. E.g. requests issued in #4 may not exactly match the requests you captured in #2. You may need some fuzzy matching logic to link requests up.
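
As a rough sketch of steps #2 and #4 using the service worker API mentioned in the accepted answer: the RECORDED map and the normalizeKey helper below are hypothetical stand-ins for whatever store your tool builds in #2, and dropping query strings is one crude form of the fuzzy matching described above.

    // replay-sw.js — hypothetical sketch: serve recorded responses while offline.
    // RECORDED maps a normalized request key to a stored body and content type;
    // in a real tool this would come from the database/file produced in step #2.
    const RECORDED = new Map([
      ['GET https://example.com/api/items', { body: '[{"id":1}]', type: 'application/json' }],
    ]);

    // Crude "fuzzy" matching: ignore query strings and hash fragments, since the
    // live page may add cache-busting parameters that were not captured verbatim.
    function normalizeKey(request) {
      const url = new URL(request.url);
      return `${request.method} ${url.origin}${url.pathname}`;
    }

    self.addEventListener('fetch', (event) => {
      const hit = RECORDED.get(normalizeKey(event.request));
      if (hit) {
        // Mock the network: answer from the recording instead of going online.
        event.respondWith(new Response(hit.body, { headers: { 'Content-Type': hit.type } }));
      }
      // If there is no recording, fall through and let the request hit the network.
    });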

Licensed under: CC-BY-SA with attribution