wget

The wget command is one of the most frequently used commands in Linux/Unix-like operating systems. It is a non-interactive network downloader that retrieves files from the web, supporting HTTP, HTTPS, and FTP, as well as retrieval through HTTP proxies. Unlike interactive browsers, wget can work in the background after users log off, retrieving files recursively and creating local versions of remote sites.

Quick Reference

Command Name: wget
Category: networking
Platform: Linux/Unix/Windows
Basic Usage: wget https://example.com/file.zip

Common Use Cases

  • File download: Download files from the web
  • Data transfer: Transfer files between systems efficiently
  • Scripting: Use in shell scripts to automate file downloads (see the sketch after this list)
  • Web scraping: Download data from websites for analysis
  • Backup: Create backups of important files and directories
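
For the scripting case, a minimal sketch of a download script, using placeholder URLs and paths:

#!/bin/sh
# Hypothetical nightly fetch: retry up to 5 times, resume a partial file,
# save under /data/downloads, and append log messages to a log file.
wget --tries=5 --continue \
     --directory-prefix=/data/downloads \
     --append-output=/var/log/nightly-download.log \
     https://example.com/exports/latest.csv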

Syntax

wget [options] [URL...]

Options

Option Description
-V, --version Display the version of wget
-h, --help Print help information
-b, --background Go to background after startup
-e, --execute=COMMAND Execute a command as if it were part of .wgetrc
-o, --output-file=FILE Log messages to FILE
-a, --append-output=FILE Append messages to FILE
-d, --debug Print debugging information
-q, --quiet Turn off output
-v, --verbose Be verbose (default)
-i, --input-file=FILE Download URLs found in FILE
-F, --force-html Treat input file as HTML
-B, --base=URL Resolve HTML input file links relative to URL
-t, --tries=NUMBER Set number of retries to NUMBER (0 unlimits)
-O, --output-document=FILE Write documents to FILE
-c, --continue Resume getting a partially-downloaded file
-P, --directory-prefix=PREFIX Save files to PREFIX/...
-r, --recursive Specify recursive download
-l, --level=NUMBER Maximum recursion depth (inf or 0 for infinite)
--no-parent Don't ascend to the parent directory
-A, --accept=LIST Comma-separated list of accepted extensions
-R, --reject=LIST Comma-separated list of rejected extensions
--limit-rate=RATE Limit download rate to RATE
--wait=SECONDS Wait SECONDS between retrievals
--random-wait Wait from 0.5*WAIT to 1.5*WAIT seconds between retrievals
--mirror Shortcut for -N -r -l inf --no-remove-listing
-k, --convert-links Make links in downloaded HTML point to local files
-p, --page-requisites Get all images, etc. needed to display HTML page
--user=USER Set both FTP and HTTP user to USER
--password=PASS Set both FTP and HTTP password to PASS
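
Options combine freely. As an illustration (file and log names are placeholders), the following retries up to five times, resumes a partial download, saves into a downloads directory, and appends log messages to download.log:

wget -c -t 5 -P downloads -a download.log https://example.com/file.iso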

Examples

How to Use These Examples

The examples below show common ways to use the wget command. Try them in your terminal to see the results.

Basic Examples:

wget https://example.com/file.zip

Download a file and save it to the current directory.

wget -O custom_name.zip https://example.com/file.zip

Download a file and save it with a custom name.

wget -P /path/to/directory https://example.com/file.zip

Download a file to a specific directory.

Advanced Examples:

wget -c https://example.com/largefile.zip

Resume a previously interrupted download.

wget -r -np -k https://example.com/

Recursively download a website, converting links for local viewing (-k) and not ascending to the parent directory (-np).

wget --limit-rate=200k https://example.com/largefile.zip

Limit the download speed to 200 KB/s.

wget -b https://example.com/largefile.zip

Download in the background, logging output to a wget-log file in the current directory.

wget -i urls.txt

Download files listed in a text file, one URL per line.
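
The input file is plain text with one URL per line; a sketch with placeholder URLs:

https://example.com/file1.zip
https://example.com/file2.zip
https://example.com/docs/report.pdf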

wget --user=username --password=password https://example.com/protected

Download from a site requiring authentication. Note that credentials given on the command line are visible to other users in the process list; see Security Considerations below.

wget -r -l 2 -A .pdf https://example.com/

Recursively download all PDF files up to a depth of 2 levels.

wget --wait=2 --random-wait -r https://example.com/

Download with a delay between requests to avoid server overload.

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/

Create a complete mirror of a website suitable for offline browsing.

Notes

Key Points:

  • wget is a robust utility for downloading files from the web
  • It works non-interactively, making it perfect for scripts and background tasks
  • Unlike browser-based downloads, wget can continue after a connection is lost
  • It excels at mirroring websites by recursively downloading content
  • wget can follow links in HTML pages, retrieving embedded files
  • It's widely available on most Unix-like systems and can be installed on Windows
  • wget supports both HTTP/HTTPS and FTP protocols

Website Mirroring Features:

  • Recursive retrieval of directories
  • HTML link conversion for local browsing
  • Creation of directory structures like those on the remote server
  • Page requisite downloading (images, CSS, JS) for proper display
  • Respect for the robots.txt protocol
  • Control over depth of recursion and types of files to download
  • Bandwidth throttling to avoid overloading servers (several of these controls are combined in the sketch after this list)
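
A sketch combining several of these controls (the URL and accept list are placeholders): recursion limited to depth 3, only selected file types accepted, the download rate capped and requests paced, with links converted and page requisites fetched:

wget -r -l 3 --no-parent -A html,css,js,png -k -p --limit-rate=500k --wait=1 https://example.com/docs/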

Resuming Downloads:

  • The -c option allows resuming a partially downloaded file (see the sketch after this list)
  • This is particularly useful for large files over unstable connections
  • wget compares the local file's size with the remote file to decide where to resume
  • Resuming requires the server to support byte-range requests
  • If the remote file changed since the partial download, the resumed copy can be corrupted, so -c is safest for files that do not change
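
A minimal resilient-download sketch (the URL is a placeholder): resume if interrupted and retry indefinitely:

# -c resumes, -t 0 retries without limit; --retry-connrefused (a wget
# option not shown in the table above) also retries refused connections.
wget -c -t 0 --retry-connrefused https://example.com/largefile.iso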

Security Considerations:

  • Be careful with --user and --password options as credentials appear in the command line
  • Consider using a .netrc file or .wgetrc for storing credentials securely (see the sketch after this list)
  • Use --no-check-certificate only when absolutely necessary
  • Set appropriate --wait times to avoid being blocked by servers
  • Be respectful of website owners by adhering to robots.txt guidelines
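
A minimal ~/.netrc sketch with placeholder values; wget consults this file when no credentials are supplied on the command line:

machine example.com login myuser password mysecret

Restrict its permissions with chmod 600 ~/.netrc so other users cannot read it.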

Common Use Cases:

  • Downloading large files reliably with resume capability
  • Mirroring websites for offline browsing
  • Automated downloading of regularly updated files (see the cron sketch after this list)
  • Web scraping for data collection (with appropriate permissions)
  • Creating local copies of documentation sites
  • Batch downloading files listed in a text file
  • Archiving web content that might become unavailable
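
For the scheduled-download case, a crontab sketch assuming a hypothetical data URL and target directory; -N (--timestamping) re-downloads only when the remote copy is newer than the local one:

# m h dom mon dow  command
0 3 * * * wget -q -N -P /srv/data https://example.com/exports/latest.csv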

Related Commands:

  • curl - Client for URL transfer, with more protocol support and request customization
  • aria2 - Lightweight multi-protocol & multi-source command-line download utility
  • httrack - Website copier for creating offline mirrors
  • lynx - Text-based web browser with download capabilities
  • lftp - Sophisticated file transfer program
  • youtube-dl - Command-line program to download videos from YouTube and other sites

Tips & Tricks

1. Use the -b option to download in the background
2. Use the -O option to specify the output file name
3. Use the -c option to continue a partially downloaded file
4. Use the -p option to download the page requisites (images, CSS, JS) needed to display an HTML page
5. Use the -r option to download recursively

Learn By Doing

The best way to learn Linux commands is by practicing. Try out these examples in your terminal to build muscle memory and understand how the wget command works in different scenarios.
