'''[http://www.tcl.tk/man/tcl/TclCmd/http.htm%|%http]''', a package bundled with [Tcl], is a client-side implementation of the [Hypertext Transfer Protocol%|%HTTP]/[http 1.1%|%1.1] protocol.

** Documentation **

[http://www.tcl.tk/man/tcl/TclCmd/http.htm%|%official reference]:

** Commands **

[http::geturl%|%http::geturl]: performs an HTTP transaction

** History **

As of version 2.8.5, `http` fully supports [HTTP 1.1].

** Simple Examples **

[Playing HTTP], by [Richard Suchenwirth]:

** See Also **

[An HTTP robot in Tcl]:
[Download file via HTTP]:
[Downloading a File over HTTP]:
[File Upload with tcl's http]:
[Google Translation via http Module]:
[http authentication]: isn't difficult.
[HTML to DOM via http/htmlparse/struct::tree packages]:
[cookies]: also easy.
[TclCurl]: a higher-level API than http
[w3m]: a higher-level API than http
[Parallel Geturl]: a package built on top of http to download a large number of URLs in an efficient, parallel manner.
[Voicent Telephone Call Interface]: an HTTP client package for making telephone calls from your Tcl/Tk programs using Voicent Gateway. It uses HTTP POST to communicate with the gateway. [http://www.voicent.com]
[Official library of extensions]:
[single command http fetcher]:
[Tcl chatroom snaphost history (2)]:
[Tclers Chat Tk GUI]:
[Getting stock quotes over the internet]:
[Uploading files to Flickr]:
[HTTPS]:
[http://groups.google.com/group/comp.lang.tcl/browse_thread/thread/b46f7f3880ad42a5/16607ff5b9bac959?lnk=gst&q=http+keep-alive+#16607ff5b9bac959%|%TclSOAP & SSL -- Can I reuse connections?], [comp.lang.tcl], 2001-11-28: a deep and productive discussion of keep-alive, certificates, other HTTP 1.1 aspects, and much more. Pat summarized it in http://tclsoap.sourceforge.net/http.html
[Web Site Status]: a simple tool to determine the status of a web site: type a URL in the "Web Site:" text field, then press return or click the "Get Status" button. The HTTP status, code, filesize (in bytes), and raw HTML output will be displayed for the requested resource. Bookmarks can be stored in a file called 'status-bookmarks.txt', which should reside in the same directory as the application.

** Synopsis **

: '''http::config''' ?''options''?
: '''[http://wiki.tcl.tk/24061%|%http::geturl]''' ''url'' ?''options''?
: '''[http://wiki.tcl.tk/21934%|%http::formatQuery]''' ''key value'' ?''key value ...''?
: '''http::reset''' ''token'' ?''why''?
: '''http::wait''' ''token''
: '''http::data''' ''token''
: '''http::error''' ''token''
: '''http::status''' ''token''
: '''http::code''' ''token''
: '''http::ncode''' ''token''
: '''http::size''' ''token''
: '''http::meta''' ''token''
: '''http::cleanup''' ''token''
: '''http::register''' ''proto port command''
: '''http::unregister''' ''proto''

** Versions and Forks **

Version 2.8.5, with full HTTP/1.1 support, is distributed with [Tcl] 8.6. Version 2.7, with partial HTTP/1.1 support, is distributed with [Tcl] 8.5.2. Version 2.5.3 is distributed with [Tcl] 8.4.18. The [TclSOAP] project also contains a distribution of a [http://tclsoap.sf.net/http.html%|%proposed version 2.5] (with some [http 1.1] support), and [tclvfs] extends that to a proposed version 2.6 (with some webdav support), merged into the TclSOAP project's version on 2003-06-23. However, versions 2.5/2.6 seem to have introduced at least one bug (reported against tclsoap on SourceForge).
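Since the package version determines which protocol features are available, it can be worth checking which version was actually loaded; `package require` returns the version it selected (a trivial sketch):

======
set version [package require http]   ;# e.g. 2.8.5 on a Tcl 8.6 installation
puts "http $version on Tcl [info patchlevel]"
======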
** Description **

'''HTTP''' is an acronym for '''HyperText Transfer Protocol''', the protocol used by the World Wide Web (WWW) - see http://www.w3c.org/Protocols/ for more about HTTP itself.

`::http::size` is the number of bytes of page data that geturl has returned. `geturl -validate 1` retrieves only the metadata about the page, and since no page data has been fetched, `::http::size` returns `0`. In this case `$state(totalsize)` can be used.

One nice feature of the http package is the '''support of different http transport protocols''' via the command:

======
::http::register proto port command
======

The initial setting for `http` itself is as if the following command had been issued:

======
::http::register http 80 ::socket
======

This can be expanded for [HTTPS] with the [tls] package:

======
package require tls
::http::register https 443 ::tls::socket
======

But it is also possible to override the normal http transport. For example, to get support for multiple internet/ethernet interfaces in a server that has more than one network card or uses aliased IP addresses ([http://www.linuxdig.com/howto/ldp/IP-Alias.php]), register another version of http:

======
set myIP 192.168.10.1
::http::register http 80 [list ::socket -myaddr $myIP]
======

[TR]: which just expands the initial behaviour.

** A POST Request **

[Silas]: Here is probably the easiest example of how to POST HTTP data using the http package:

======
package require http
set url http://example.com/form   ;# placeholder - substitute the target URL
::http::geturl $url -query [::http::formatQuery field1 value1 field2 value2 field3 value3]
======

----

[David Welton] gives examples of POSTing HTTP data (that is, use of -query) in [http://groups.google.com/group/comp.lang.tcl/browse_thread/thread/78aaf163e0303b82/a546c0e70f83b118?lnk=gst&q=david+welton+http+post+%22-query%22#a546c0e70f83b118%|%Web scraping with Tcl help anyone?], [comp.lang.tcl], 2002-01-18

** Examples **

[RS]: '''Minimal downloader''' to stdout:

======
package require http
puts [http::data [http::geturl [lindex $argv 0]]]
======

[Bruce Hartweg] offers this (slightly paraphrased) minimal to-file version:

======
package require http
http::geturl $theURL -channel [open $theFile w]
======

along with observations that a more robust version would check for redirects, close channels, call http::cleanup, ...

----

[DKF]: To '''get the title of a webpage''', use this:

======
package require http
set token [http::geturl $theURL]
regexp {(?i)<title>([^<>]+)</title>} [http::data $token] -> title
http::cleanup $token
puts "Title was \"$title\""
======

If you're doing more than getting the title, use [tdom] and not [regexp] for the parsing...

[MJ]: With [tdom] this becomes:

======
package require http
package require tdom
set token [http::geturl $theURL]
set doc [dom parse [http::data $token]]
set title [[$doc selectNodes {/html/head/title}] asText]
$doc delete
http::cleanup $token
puts "Title was \"$title\""
======

----

A sample of catching an error when attempting to get a WWW page:

======
proc t url {
    if {[catch {set tok [::http::geturl $url]} msg]} {
        puts "oops: $msg"
    } else {
        return $tok
    }
    puts leaving
}
======

----

[DGP]: It's a simple thing, but I've found use of Tcl's http package the simplest way to discover what Content-Type an HTTP server is sending back with the resource.

[RS]: me too, when [playing HTTP]
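A minimal sketch of that trick (the tcl.tk URL is just a placeholder; header names vary in capitalisation between servers, so compare them case-insensitively):

======
package require http
set token [http::geturl http://www.tcl.tk/ -validate 1]   ;# HEAD request: headers only
foreach {name value} [http::meta $token] {
    if {[string equal -nocase $name Content-Type]} {
        puts "Content-Type: $value"
    }
}
http::cleanup $token
======

`-validate 1` makes geturl issue a HEAD request, so only the headers travel over the wire.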
** A Cross-Posting Blog Client **

[tonytraductor]: I've used http to build a crossposting blog client (see http://tonyb.us/xpost) that posts to wordpress, livejournal, tumblr, friendica, and others.

An example: send a post to tumblr:

======
# where .txt.txt is a text widget, and tags, title, and other
# parameters are set with tk::entry widgets in the gui
############################################
# post to tumblr
proc tbpost {} {
    set ptext [.txt.txt get 1.0 {end -1c}]
    set login [::http::formatQuery mode login user $::email password $::tpswd]
    set log [http::geturl http://www.tumblr.com/api/authenticate -query $login]
    set post [http::formatQuery mode postevent auth_method clear \
        email $::email password $::tpswd type regular generator Xpostulate \
        tags $::tags title $::subject body $ptext]
    set dopost [http::geturl http://www.tumblr.com/api/write -query $post]
    set mymeta [http::meta $dopost]
    set mystat [http::status $dopost]
    set length [http::size $dopost]
    toplevel .rsp
    wm title .rsp "Post Status"
    grid [tk::label .rsp.lbl -text "Tumblr says: $mystat\nPost length: $length"]
    grid [tk::button .rsp.view -text "View Journal" -command {
        set turl "http://$::tname.tumblr.com"
        exec $::brow $turl &
    }] \
        [tk::button .rsp.ok -text "DONE" -command {destroy .rsp}]
}
======

Today I'm trying to get it working with posterous, however, and am having difficulty.

** Restarting A Download **

[LES]: is not superstitious and asks a question on 2004-08-13: ''What if the download is too large? How is it possible to... er... "cache" the download, i.e. save part of the stream and free up memory?''

[schlenk]: The http geturl command has various options for this special case. Either you give it a channel, so the data is written directly to a file, for example, or you register a progress callback to deal with the situation.

----

[Peter Newman] 2004-03-08: '''Resuming?''' Does anyone know if it's possible to resume (MP3 downloads) with `http`? And if so, how? And if it's not possible to resume with [http], could you let me know that too. (So I don't have to waste time on a lost cause.) Thanks.

[schlenk]: It is possible if the http server supports range requests and you know the length of the file from the content-length headers. You just need to add the appropriate HTTP header fields when doing the request; see RFC 2616 section 3.12 [http://www.w3.org/Protocols/rfc2616/rfc2616-sec3.html#sec3.12].
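A sketch of what that looks like, assuming the server honours '''Range''' requests (the `resume` helper is hypothetical):

======
package require http

# Append the missing tail of $url to $file, assuming the server
# supports byte-range requests.
proc resume {url file} {
    set offset [file size $file]
    set chan [open $file a]
    fconfigure $chan -translation binary
    set token [http::geturl $url \
        -headers [list Range bytes=$offset-] -channel $chan]
    close $chan
    set code [http::ncode $token]
    http::cleanup $token
    if {$code != 206} {
        error "server ignored the Range request (HTTP $code)"
    }
}
======

A server that ignores the header answers 200 and resends the whole resource from the start, so a robust version would truncate the file back to its original size (or start over) in that case.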
** Proxy Handling **

Identification and handling of proxies can be a pain when using the http package, so I'm trying to write a package to handle as much of this as possible - see [autoproxy].

** Blocking Behaviour when Resolving a Host Address **

Complaint: http blocks while resolving the name of a non-existent or disconnected server.

[DGP]: This complaint maps to the complaint that [socket] blocks in the form [[socket $host $port]] when `$host` does not exist/respond, even when the `-async` option is used. This basically further maps into a complaint that `gethostbyname()` blocks. Other [C] programs apparently have non-blocking solutions for this. We should discover what those solutions are and see if the [socket] implementation can make use of them.

[Andreas Kupries]' memory is that we collectively decided the best solution is "to have the core spawn a helper thread to process (and wait for) `gethostbyname()` while the rest of the core goes on crunching."

[Darren New] observes that `gethostbyname()` can't be trusted to be thread-safe ...

NOTE: (hint) I don't see this problem with [socket] reported as a bug at SF.

[TV]: It's been a while for me, but isn't that inet function in fact opening a (maybe udp) socket to a [DNS] server, which could be `select()`-able on decent systems?

[nl]: until this is fixed you can use the [tcllib] dns package to do the DNS lookup and then point http at the IP address (note that this has some implications, such as assuming that your DNS host responds, and that the http library will send a wrong Host header, but it is usually better than having your application hang on a bad DNS entry).

[PT] 2003-06-23: This all assumes that DNS is what is being used. However, there are various ways to resolve hostnames, and the local C library resolver knows how to handle them according to local configuration. Maybe we are using a hosts file, maybe we have NIS. It is unfortunately not as simple as it appears - otherwise we'd have fixed it. Ultimately, using an external process to do the lookups a la netscape's resolver proxy is likely the only way to avoid this delay.

[TV]: A separate process would still leave you with the delay: when the answer to the query isn't readily available, you wait for better alternatives or a needed correction, or until your connections to the informing party are no longer cluttered or broken. But at least you could do something else in the meanwhile, which is a major, normal reason for having processes or threads in the context of communication: pacing things.

[DKF]: A separate process would let you do other things while the delay was happening. You could even keep a pool of helper processes around and use them in round-robin fashion.

** Bug: Errors in Callback Disappear **

''This bug may be fixed in more recent versions of http.''

How can you catch an error in a callback? E.g., if I call

======
http::geturl $url -command somecommand
======

any errors raised in '''somecommand''' just vanish instead of being passed to [bgerror] as I expect.
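Until you are on a version where this is fixed, one workaround (a sketch; `somecommand` is the callback from the example above) is to have the callback trap its own errors and re-raise them through the event loop, where they do reach [bgerror]:

======
package require http

# Hypothetical wrapper: run the real callback under catch and
# re-raise any error as a background error via the event loop.
proc safecallback {token} {
    if {[catch {somecommand $token} err]} {
        after idle [list error $err]   ;# surfaces via bgerror
    }
}

http::geturl $url -command safecallback
======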
** Website Up? **

Here is some code recently mentioned on news:comp.lang.tcl for querying whether a site is alive:

======
if {$argc == 0} {
    set site http://purl.org/thecliff/tcl/wiki/
} else {
    set site [lindex $argv 0]
}
package require http 2.3

# this proc contributed by [Donal Fellows]
proc geturl_followRedirects {url args} {
    while 1 {
        set token [eval [list http::geturl $url] $args]
        switch -glob [http::ncode $token] {
            30[1237] {
                # redirect - follow the Location header below
                # (note: this intermediate token is never cleaned up)
            }
            default {return $token}
        }
        upvar #0 $token state
        array set meta [set ${token}(meta)]
        if {![info exists meta(Location)]} {
            return $token
        }
        set url $meta(Location)
        unset meta
    }
}

set token [geturl_followRedirects $site -validate 1]
if {[regexp -nocase ok [::http::code $token]]} {
    puts "$site is alive"
} else {
    puts "$site is dead: [::http::code $token]"
}
::http::cleanup $token
======

[HaO] 2013-05-02: IMHO it would be more secure to limit the redirections to 5.

** Backwards Incompatibility: http-2.7 **

[http://groups.google.com/group/comp.lang.tcl/browse_thread/thread/9a1a0b7f57141b12/3d834b2849323562#3d834b2849323562%|%Tcl/Tk 8.5.2 Release Candidates Options (new behaviour with http -handler)], [comp.lang.tcl], 2008-03-28: discusses a problem with http version 2.7.

** Misc **

There is also a SourceForge project at [http://sourceforge.net/projects/tclhttp1-1] to build an HTTP/1.1-capable http package.

[snichols]: This project has not released any files yet as of 2004-11-01 and has had 0% activity.

[KJN]: still no releases as of 2007-07-22.

[LV]: So, has anyone submitted a [TIP] to take the various forks of the code and create a unified http package with all the working features?

----

[TV] 2003-04-24: I just found behaviour I didn't get:

======none
(Tcl) 68 % info vars http::*
::http::urlTypes ::http::http ::http::1 ::http::alphanumeric ::http::encodings ::http::formMap ::http::defaultCharset
(Tcl) 68 % unset ::http::1
can't unset "::http::1": no such variable
(Tcl) 69 % info vars http::*
::http::urlTypes ::http::http ::http::alphanumeric ::http::encodings ::http::formMap ::http::defaultCharset
======

It's wish 8.4.1, and it runs [bwise] and a webserver ([tclhttpd] with some alterations), and this is clearly from the http package being used to fetch a webpage. Maybe the manual gives a neat answer; I just found it noteworthy that an erroneous `[unset]` still seems to do its unsetting.

[RS]: ... or the variable was removed by the web server between the first two commands? What happens if you just call the first command repeatedly?

[TV]: It would seem to be stable. It's the array variable with the page content and url info etc., which sticks around until deleted; that's the whole reason I was looking for some garbage collection, or delayed freeing. It could be there is an event linked with some element; I don't know, I didn't write the (at least handy) [http] package...

----

<<discussion>> http(s) Link Verification

[HaO] 2013-05-10: Here is my http(s) link (url) verification code, inspired by the example above from Kevin Kenny. This code follows at most 5 redirects and requires Tcl 8.6 due to its use of tailcall:

======tcl
proc ::linkCheck {urlIn {timeout 10000} {recursionLimit 5}} {
    if {[catch {
        set requestHandle [::http::geturl $urlIn -validate 1 -timeout $timeout]
    } err]} {
        return -code error [mc "Unknown host '%s'" $urlIn]
    }
    set fError 1
    if {[::http::status $requestHandle] ne {ok}} {
        set errMsg [::http::status $requestHandle]
    } else {
        switch -glob -- [::http::ncode $requestHandle] {
            2* {set fError 0}
            30[12378] {
                # redirect
                if {0 < $recursionLimit
                        && [info exists ${requestHandle}(meta)]
                        && [dict exists [set ${requestHandle}(meta)] Location]
                } {
                    incr recursionLimit -1
                    set url [dict get [set ${requestHandle}(meta)] Location]
                    ::http::cleanup $requestHandle
                    tailcall ::linkCheck $url $timeout $recursionLimit
                }
            }
        }
        set errMsg [::http::code $requestHandle]
    }
    ::http::cleanup $requestHandle
    if {$fError} {
        return -code error [mc "Error '%s' accessing url '%s'" $errMsg $urlIn]
    }
    return
}
======

<<discussion>>

<<categories>> Tcl syntax | Arts and crafts of Tcl-Tk programming | Command | Tcl | Internet | Package | Web