Wget is a non-interactive network downloader, which means a download does not need you to sit at the terminal; it can run in the background or from a script. You can therefore download just about any file from the Internet with the wget command.
Wget also lets you download an entire website. The download starts with a simple wget command followed by the website URL. Wget supports the HTTP, HTTPS and FTP protocols for downloading files.
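For instance, downloads over HTTPS and FTP use exactly the same syntax; the hosts and file names below are placeholders, not real downloads:
wget https://example.com/archive.tar.gz
wget ftp://ftp.example.com/pub/readme.txt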
Download web pages using wget
For example, suppose you want to download the front page of jijokjose.com. You can simply run the command
wget jijokjose.com
The download starts and the file is saved in the current working directory, ready for later offline use. Now suppose you have a connection problem, or the jijokjose server has an issue. By default, wget tries to reconnect to the server 20 times. You can change that retry limit with the following command
wget -t 8 jijokjose.com
Now wget will try to reconnect to the server only 8 times. The commands above download the index.html file only. What if you want to download a complete website? The command below downloads the site recursively, up to the default depth of five levels:
wget -r jijokjose.com
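If you want a shallower or deeper crawl, the -l (--level) option controls the recursion depth; the depth value here is only an illustration:
wget -r -l 2 jijokjose.com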
To download the whole site and rewrite its links so it can be browsed offline, while writing progress messages to a log file:
wget --convert-links -r linux.about.com -o logfile
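A slightly fuller variant, if you also want the images and stylesheets each page needs and want wget to stay inside the starting directory, could look like this; treat it as a sketch using standard wget options:
wget -r --convert-links --page-requisites --no-parent -o logfile jijokjose.com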
Download and resume your download using wget
Suppose you are downloading a file and get disconnected from the Internet. With wget you can resume the download by running the command a second time, so wget also acts as a download manager. To download a file, simply open your terminal and enter wget followed by the URL. See the example:
wget URL
wget http://www.jijokjose.com/projects/images/extension/Extension%20Fetcher%20for%2032%20bit%20OS.zip
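If you prefer to save the file under a different name, the -O option handles that; the local file name below is only an example:
wget -O extension-fetcher-32bit.zip http://www.jijokjose.com/projects/images/extension/Extension%20Fetcher%20for%2032%20bit%20OS.zip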
Now, what if you are downloading a large file, for example a game, and you are disconnected from the server? Without a download manager you would have to start the download from the beginning. Wget lets you continue the file from the point where the connection dropped. For that purpose, use the following command:
wget -c http://www.jijokjose.com/projects/images/extension/Extension%20Fetcher%20for%2032%20bit%20OS.zip
This will resume your download from the point where it stopped, as long as the partially downloaded file is still present.
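On a flaky connection you may want to combine resuming with unlimited retries; -t 0 tells wget to keep retrying, and --limit-rate is optional if you want to cap bandwidth (the rate shown is arbitrary):
wget -c -t 0 --limit-rate=500k http://www.jijokjose.com/projects/images/extension/Extension%20Fetcher%20for%2032%20bit%20OS.zip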
Download a dynamic website as a static local copy
What if you want to download a dynamic website built with WordPress, Joomla or Drupal, where much of the content lives in a database? For this purpose you can use the following command, which fetches the rendered pages and converts them into static copies you can browse locally:
wget --mirror --convert-links URL
wget --mirror --convert-links jijokjose.com
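If the mirrored copy is missing images or the pages open with the wrong file extensions, a common variation adds a couple of extra standard options; consider this a sketch rather than a one-size-fits-all recipe:
wget --mirror --convert-links --page-requisites --adjust-extension jijokjose.com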
Analyze HTTP requests
The --spider option makes wget check URLs without saving any files, so combined with -r it crawls the site and reports each request, which is handy for spotting broken links. For this purpose use the following command:
wget -r --spider URL
wget -r --spider jijokjose.com
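Because nothing is saved, it is usually worth capturing the output in a log so you can search it for errors afterwards; the log file name and grep pattern below are just examples:
wget -r --spider -o spider.log jijokjose.com
grep -B 2 '404 Not Found' spider.log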