[HJG] Someone has uploaded a lot of pictures to [Flickr], and I want to show them someplace where no internet is available. The pages at Flickr contain a lot of links, icons etc., so a simple recursive download with e.g. wget would fetch lots of unwanted stuff. Of course, I could tweak the parameters for calling wget (-accept, -reject, etc.), or get the html pages and then filter their contents with [awk] or perl, but doing roughly the same thing in Tcl looks like more fun :-) Moreover, with a Tcl script I can also get the titles and descriptions of the images.

So the first step is to download the html pages from that person, extract the links to the photos, then download the photo pages (containing titles and complete descriptions) and the pictures in the selected size (Thumbnail=100x75, Small=240x180, Medium=500x375, Large=1024x768, Original=as taken). Then we can make a [Flickr Offline Photoalbum] out of them, or just use a program like IrfanView [http://www.irfanview.com] to present the pictures as a slideshow.

This is the alpha version of the downloader:

----
 #!/bin/sh
 # Restart with tcl: -*- mode: tcl; tab-width: 4; -*- \
 exec wish $0 ${1+"$@"}

 # FlickrDownload.tcl - HaJo Gurt - 2006-01-20 - http://wiki.tcl.tk/15303
 #: Download webpages and images for a photo-album from flickr.com
 #
 # 2005-11-22 First Version
 # Todo:
 # * Save infos to file for next stage (album-maker)
 # * Bug: !!
 #        End of Multiline-Descriptions not detected

 package require Tk
 package require Img
 package require http

 #########1#########2#########3#########4#########5#########6#########7#####
 proc Print { Str {tag ""} } {
 #: Output to text-window
   # puts $Str
   .txt1 insert end "\n"
   .txt1 insert end "$Str" $tag
   .txt1 see end   ;# scroll to bottom
   update
 }

 #########1#########2#########3#########4#########5#########6#########7#####
 proc getPage { url } {
 #: Fetch a webpage from the web
   set token [::http::geturl $url]
   set data  [::http::data $token]
   ::http::cleanup $token
   return $data
 }

 proc FetchImage { url fname } {
 #: Fetch a picture from the web / see also: [Polling web images with Tk]
   #puts -nonewline "Fetch: \"$url\" "
   Print "Fetch: \"$url\" " DL
   set f [open $fname w]
   fconfigure $f -translation binary
   set imgtok [http::geturl $url -binary true -channel $f]
   flush $f
   close $f
   http::cleanup $imgtok
   Print " ok." Ok
 }

 #########1#########2#########3#########4#########5#########6#########7#####
 proc Analyse1 { url1 page } {
 #: Analyse flickr album-webpage,
 #  like http://www.flickr.com/photos/PERSON
 #  or   http://www.flickr.com/photos/PERSON/page2
   global PicNr

   set filename [format "s%02d.html" $page]
   if {$page == 1} {
     set url $url1
   } else {
     set url [format "$url1/page%d" $page]
   }

   set base $url1
   set p1 [string first "//" $url 0];  incr p1 2
   set p2 [string first "/" $url $p1]; incr p2 -1
   set p1 0
   set base [string range $url $p1 $p2]
   #Print "$base: $p1 $p2: '$base'"

   Print "# filename: $filename"  ;##
   Print "# url : $url"           ;##
   Print "# base: $base"          ;##

   # Deactivate for offline-testing:
   if 1 {
     set data [getPage $url]
     #puts "$data"  ;##
     set fileId [open $filename "w"]
     puts -nonewline $fileId $data   ;# save fetched page to file
     close $fileId
   }

   set fileId [open $filename r]
   set data [read $fileId]
   close $fileId

   foreach line [split $data \n] {
     # Sample lines from the fetched page, e.g.:
     #   im Khao Sok
     #   Figuren aus dem alten China, auf...
     if {[regexp -- {(class="Desc")} $line]} {
       #Print "4: $line";
       set p1 [string first "Desc" $line 0];  incr p1 6
       set p2 [string first "<"    $line $p1]; incr p2 -1
       set sD [string range $line $p1 $p2]
       Print "Descr: $p1 $p2: '$sD'"
     }

     # 12
     # 1
     #-if {[regexp -- (class="end") $line]} { ... }
     if {[regexp -- {(page.*class="end")} $line]} {
       #Print "5: $line";
       set p1 [string first "page" $line 0];  incr p1 4
       set p2 [string first "/"    $line $p1]; incr p2 -1
       set s9 [string range $line $p1 $p2]
       Print "\nEnd: $p1 $p2: '$s9'"
       return [incr s9 0]
       #break
     }

     #if {[regexp -- (class="Activity") $line]} {   ;# now get photo-page
     Analyse2 $base $sL $filename
     #break }

     # if {[regexp -- "
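Aside: the description-extraction above locates `class="Desc"` with [string first] and then does index arithmetic with [string range]. The same thing can be written as a single regexp with a capture group. A minimal standalone sketch, not part of the script above (the sample line is invented; the real flickr markup may differ):

```tcl
# Extract the text between the Desc attribute and the next tag
# with one regexp capture, instead of string first / string range:
set line {<h4 class="Desc">im Khao Sok</h4>}
if {[regexp {class="Desc">([^<]*)<} $line -> sD]} {
    puts "Descr: '$sD'"   ;# -> Descr: 'im Khao Sok'
}
```

The capture group `([^<]*)` grabs everything up to the next `<`, so the index arithmetic (and its off-by-one incr adjustments) disappears.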