[HJG] Someone has uploaded a lot of pictures to [Flickr], and I want to show them someplace where no internet is available. The pages at Flickr have a lot of links, icons etc., so a simple recursive download with e.g. wget would fetch lots of unwanted stuff. Of course, I could tweak the parameters for calling wget (-accept, -reject, etc.), but doing roughly the same thing in Tcl looks like more fun :-)

So the first step is to download the html-pages of that person, extract the links to the photos from them, then download the photo-pages (containing titles and descriptions), and the pictures in the selected size. Then we can make our [Flickr Offline Photoalbum].

First draft for the download:

 package require http

 # Fetch a single webpage and return its contents:
 proc getPage { url } {
     set token [::http::geturl $url]
     set data  [::http::data $token]
     ::http::cleanup $token
     return $data
 }

 catch {console show} ;##

 set url      http://www.flickr.com/photos/siegfrieden
 set filename "s01.html"
 set url      http://www.flickr.com/photos/siegfrieden/page2  ;# select which album-page to fetch
 set filename "s02.html"

 set data [ getPage $url ]
 #puts "$data" ;##

 # Save the page to disk:
 set fileId [open $filename "w"]
 puts -nonewline $fileId $data
 close $fileId

 # Show the lines we are interested in:
 set n 0
 foreach line [split $data \n] {
     if {[regexp -- "<title>"       $line]} { puts "1: $line";   incr n }
     if {[regexp -- "<h4>"          $line]} { puts "2: $line";   incr n }
     if {[regexp -- (class="Photo") $line]} { puts "3: $line";   incr n }
     if {[regexp -- (class="Desc")  $line]} { puts "4: $line";   incr n }
     if {[regexp -- (class="end")   $line]} { puts "5: $line\n"; incr n; break }
 }

This will only get the one html-page, so the next step is to also get the other pages of the album, extract from them all the information we need, and then fetch the pictures (see the sketches further down).

Strings to look for:
   * "<title>" - Title for the album
   * "<h4>" - Title for an image (but also an occurrence of "Search by tag" below all images)
   * (class="Desc") - Description for an image (if present)
   * (class="Photo") - Links to preview-image and to the page with a single photo
   * (class="end") - Link to the last album-page
   * "profile" - Link to the profile-page of the album-owner
   * "/page" - Links to more album-pages

...

----
See also:
   * [http] - [Download file via HTTP]
   * [A little file searcher] - [incrfilter]
----
[Category Internet] - [Category File]
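----
A rough sketch of how "get the other album-pages" could look, building on the getPage proc from the first draft. The url-pattern baseUrl/pageN and the idea that each photo-link sits on a line containing class="Photo" are only assumptions taken from the notes above, so the regular expressions will probably need adjusting against the real pages:

 package require http

 # getPage from the first draft above:
 proc getPage { url } {
     set token [::http::geturl $url]
     set data  [::http::data $token]
     ::http::cleanup $token
     return $data
 }

 # Collect the links to the single-photo pages from all album-pages.
 # Assumed: the album-pages are baseUrl, baseUrl/page2, baseUrl/page3, ...
 # and each photo-link is on a line containing class="Photo".
 proc getPhotoLinks { baseUrl } {
     set links {}
     set page  1
     while {1} {
         if {$page == 1} { set url $baseUrl } else { set url $baseUrl/page$page }
         set data [getPage $url]
         foreach line [split $data \n] {
             if {[regexp -- (class="Photo") $line] &&
                 [regexp -- {href="([^"]+)"} $line -> href]} {
                 lappend links $href   ;# e.g. /photos/xyz/123456/
             }
         }
         # Continue only while the current page links to a following album-page:
         if {![regexp -- "/page[expr {$page+1}]" $data]} { break }
         incr page
     }
     return $links
 }

 #puts [getPhotoLinks http://www.flickr.com/photos/siegfrieden] ;##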
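The pictures themselves are binary data, so instead of going through ::http::data it is easier to let ::http::geturl write directly into a file opened in binary mode. A minimal sketch (the image-url in the example is made up):

 package require http

 # Download one picture to a local file - images are binary,
 # so write via a channel configured for binary translation:
 proc getFile { url filename } {
     set fileId [open $filename "w"]
     fconfigure $fileId -translation binary
     set token [::http::geturl $url -channel $fileId]
     ::http::cleanup $token
     close $fileId
 }

 # Example (hypothetical image-url):
 #getFile http://static.flickr.com/12/3456789_abcdef.jpg "photo001.jpg"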