To download an entire site, use:
wget -m http://www.yoursite/yourfolder/
If you also want the pages that the site links to, use:
wget -H -r --level=1 -k -p http://www.yoursite/yourfolder/
If you want to download only JPG files, use:
wget -A .jpg -l 3 -r http://www.yoursite/yourfolder/
Some useful wget options:
-r : recursive download.
-l depth (--level=depth) : maximum recursion depth (inf or 0 for infinite).
-k (--convert-links) : make links in downloaded HTML point to local files.
-p (--page-requisites) : get all images, etc. needed to display an HTML page.
-e robots=off : ignore robots.txt files and just download.
-np (--no-parent) : don't ascend to the parent directory.
-N (--timestamping) : only download files newer than what's already been downloaded.
-nd (--no-directories) : don't create directories; by default wget recreates the site's directory tree.
-w seconds (--wait=seconds) : wait the specified number of seconds between retrievals.
--random-wait : vary the wait between 0.5 and 1.5 times the --wait value (see the --wait option).
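Putting several of these options together, a polite recursive download might look like the sketch below (the URL is a placeholder; adjust depth and wait times for the site you are fetching):

```shell
# Recursively fetch up to 2 levels deep, staying below the start folder,
# converting links for local browsing and fetching page requisites.
# -N skips files already downloaded; -w 1 --random-wait spaces out requests.
wget -r -l 2 -np -k -p -N \
     -w 1 --random-wait \
     http://www.yoursite/yourfolder/
```

Note that -k rewrites links only after the download finishes, so an interrupted run may leave pages pointing at the original site.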