SuperITManBlog

Bash - Download a copy of website

December 02, 2018 - 1 min read


Recently, I took a course whose labs were hosted online so we could consult them during the sessions. Nevertheless, I wanted to be able to retry some exercises later, and I could not be sure the website would still be accessible.

Of course, on the internet everything can disappear at any moment, so it can be worth saving what is important to you. I remembered an article on LinuxJournal…

On Linux, you can easily download a website through the terminal using wget.

$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains website.org \
     --no-parent \
     www.website.org/tutorials/html/
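Briefly, per the wget manual: --recursive follows links and downloads pages recursively, --no-clobber skips files that already exist locally, --page-requisites fetches the images and CSS each page needs to display, --html-extension saves pages with a .html suffix, --convert-links rewrites links so the copy browses offline, --restrict-file-names=windows avoids characters invalid on Windows filesystems, --domains limits the crawl to the listed domain, and --no-parent stops wget from climbing above the starting directory. wget also offers a --mirror shortcut, which the manual defines as -r -N -l inf --no-remove-listing. A roughly equivalent one-liner is sketched below as a dry run (it prints the command instead of executing it, and website.org is just the example domain from above, not a real target; --adjust-extension is the newer name for --html-extension in wget 1.12+):

```shell
# Equivalent mirror command using the --mirror shortcut.
# Stored in a variable and echoed so you can review before running it.
CMD='wget --mirror --convert-links --adjust-extension --page-requisites \
     --no-parent --domains website.org www.website.org/tutorials/html/'

# Dry run: print the command instead of executing it.
echo "$CMD"
```

To actually run it, paste the printed command into your terminal (or replace `echo "$CMD"` with `eval "$CMD"`).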




Written by Alexis GEORGES
Living in Brussels (beers, cheeses, chocolates, etc.), a geek passionate about Open Source initiatives, opposed to censorship, and careful about privacy.
You can follow me on Twitter