Today I learned a really cool command-line trick for downloading an entire website and turning it into an offline archive. It's great if you're developing sites for clients and want to back up their existing site, especially when the new one won't have exactly the same content. This is the command I used:

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://www.yourwebsitehere.com

Explanation

--mirror – Enables mirroring mode (turns on recursion and time stamping)
--convert-links – Converts links in downloaded HTML to work offline (makes them relative)
--adjust-extension – Adds proper file extensions (.html, .css) to files that lack them
--page-requisites – Downloads all assets needed to display the page (CSS, images, JS)
--no-parent – Doesn’t ascend to parent directories (stays within the specified URL path)
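If the site is large, it's worth being a bit gentler on the server. This is a variant I'd reach for, assuming a reasonably recent GNU Wget: --wait pauses between requests and --limit-rate caps the download speed.

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent --wait=1 --limit-rate=500k https://www.yourwebsitehere.com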

The command will create a folder in the current working directory named after the host (www.yourwebsitehere.com), which you can then open on your computer starting from the index.html file. The archive still includes all the third-party scripts, including analytics, so use it with caution.
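Because --convert-links rewrites links to relative paths, you can usually just open index.html directly, but some assets behave better over HTTP than via file://. A quick way to browse the archive locally, assuming Python 3 is installed:

cd www.yourwebsitehere.com
python3 -m http.server 8000

Then open http://localhost:8000 in your browser.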
