I recently followed a course whose lab materials were hosted online for its duration. I wanted to be able to redo some of the exercises later, though, without counting on the website still being up. On the internet, anything can disappear at any moment, so it is worth saving whatever matters to you. That reminded me of an article on LinuxJournal…
On Linux, you can easily download an entire website from the terminal using wget.
$ wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --domains website.org \
    --no-parent \
    www.website.org/tutorials/html/
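For reference, most of these long options have one-letter equivalents, so the same mirror can be written more compactly. This is a sketch of the equivalent invocation (-r = --recursive, -nc = --no-clobber, -p = --page-requisites, -E = --html-extension, -k = --convert-links, -D = --domains, -np = --no-parent); website.org is the placeholder domain from the command above, not a real target.

$ wget -r -nc -p -E -k \
    --restrict-file-names=windows \
    -D website.org -np \
    www.website.org/tutorials/html/

The key pair for offline browsing is -p plus -k: the first fetches the images, stylesheets, and scripts each page needs, and the second rewrites the links so the saved copy works without a network connection.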