[HJG] Someone has uploaded a lot of pictures to [Flickr], and I want to show them someplace where no internet is available.
The pages at Flickr have a lot of links, icons etc., so a simple recursive download with e.g. wget would fetch lots of unwanted stuff.

So the first step is to download the html-pages from that person, extract the links to the photos from them,
then download the photo-pages (containing titles and descriptions), and finally the pictures in the selected size.
Then we can make our [Flickr Offline Photoalbum].

First draft for the download:

 package require http

 # Fetch one page via http and return its contents as a string:
 proc getPage { url } {
     set token [::http::geturl $url]
     set data  [::http::data  $token]
     ::http::cleanup $token
     return $data
 }

 #catch {console show} ;##

 set url      http://www.flickr.com/photos/siegfrieden
 set filename "s01.html"

 set data [ getPage $url ]
 #puts "$data" ;##

 set fileId [open $filename "w"]
 puts -nonewline $fileId $data
 close $fileId

This will only get the first html-page, so the next step is to also get the other pages,
extract all the information we need, and then fetch the pictures - see the sketch at the end of this page. ...

----
See also:
   * [http] - [Download file via HTTP]
   * [A little file searcher]
----
[Category Internet] - [Category File]
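----
Here is a rough, untested sketch of that next step. It reuses the getPage proc from the first draft above.
The URL scheme .../pageN/ for the overview pages, the hard-coded count of 3 pages, the regexp for the
photo-page links, and the names baseUrl / photoLinks are all assumptions that need checking against the real Flickr html.

 # Rough, untested continuation of the first draft above.
 # Assumptions: the overview pages are reachable as .../pageN/,
 # three pages are enough, and photo-page links in the html look
 # like  /photos/siegfrieden/1234567/  - check against the real pages!

 set baseUrl    http://www.flickr.com/photos/siegfrieden
 set photoLinks {}

 for {set page 1} {$page <= 3} {incr page} {
     set data [ getPage $baseUrl/page$page/ ]

     # Keep a local copy of each overview page: s01.html, s02.html, ...
     set fileId [open [format "s%02d.html" $page] "w"]
     puts -nonewline $fileId $data
     close $fileId

     # Collect the links to the individual photo-pages:
     foreach {match link} [regexp -all -inline {href="(/photos/siegfrieden/[0-9]+/)"} $data] {
         lappend photoLinks http://www.flickr.com$link
     }
 }
 puts "Found [llength $photoLinks] photo-pages"

In a real run the number of overview pages should be taken from the paging links in the first page instead of
being hard-coded, and duplicate links (the same photo may be linked more than once per page) could be removed
with lsort -unique. Looping over photoLinks with getPage then saves the photo-pages and, from those, the pictures.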