exec ampersand problem

The exec command treats ampersands (&) in its arguments specially, e.g. a trailing & makes execution asynchronous, and any command that is handed on to a shell such as cmd.exe has & interpreted as a command separator. This can be a problem if an argument, like a URL, contains ampersands.
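
A minimal sketch of how the problem typically shows up (untested here; assumes Windows and uses the URL from the question below): cmd.exe re-parses the command line it is handed and splits it at the unescaped &, so only the part before the & reaches start, and the rest is treated as a separate command.

   set url http://127.0.0.1/cgi-bin/script?var1=abc&var2=xyz
   # cmd.exe cuts the line at the &: start gets only ...?var1=abc,
   # and cmd then tries to run "var2=xyz" as a command of its own,
   # which is the error caught here.
   catch {exec $env(COMSPEC) /c start $url} err
   puts $err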


Petteri K writes in the comp.lang.tcl newsgroup: [working on Microsoft Windows] Would anybody know a workaround in case $url contains the character '&', e.g. http://127.0.0.1/cgi-bin/script?var1=abc&var2=xyz

Benny Riefenstahl replied: exec plays tricks with quoting which make it impossible for me to get the right command executed in Tcl 8.3.4. What works on W2K here though is this:

   set url "http://www.tcl.tk?hello&hi"
   set shell [open "|[file join $env(COMSPEC)]" w]
   fconfigure $shell -buffering line
   puts $shell "start \"\" \"$url\""
   puts $shell exit
   close $shell

I hope this works on Windows 9x/Me, too.

CL suggests: In the context of avoiding exec's unescapable special characters, I prefer to use the far-too-little-understood << argument, in the manner of [slightly edited]:

   set url http://ats.nist.gov/cgi-bin/cgi.tcl/echo.cgi?hello=1&hi=2
   # << feeds the string to cmd.exe as its standard input, so the URL is
   # never part of a command line that exec has to quote.
   exec [file join $env(COMSPEC)] << "start \"\" \"$url\" \n exit \n"

SLB I have occasionally encountered a similar problem using rsh to execute a process on a remote machine, with characters such as & | < > being interpreted on the remote machine. Unfortunately, the implementations of rsh on Windows I have used only support the stdout and stderr streams; stdin is disconnected, and the << argument relies on having a working stdin channel. One proposal for changing Tcl to resolve this problem is given here [L1 ].
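
For illustration, a hedged sketch of the remote-interpretation issue SLB describes (somehost is a hypothetical host, untested; it assumes a Unix-style rsh that forwards its arguments to the remote shell verbatim):

   # The remote shell, not Tcl, parses the forwarded command line, so the &
   # backgrounds "echo a" remotely and "b" is run as a second command:
   catch {exec rsh somehost echo a&b} result
   # Quoting aimed at the remote shell keeps the & literal:
   exec rsh somehost {echo 'a&b'}   ;# prints a&b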


Here is the simplest way that I've found, and I use it in my mindweb [L2 ]. It relies on the rundll32 utility available with some versions of Windows.

  proc goUrl x {
    global tcl_platform
    if {$tcl_platform(platform) eq "windows"} {
      # Percent-encode the "m" in "htm" (%6D) before handing the URL over.
      set x [regsub -all -nocase {htm} $x {ht%6D}]
      # rundll32 is started directly, not via cmd.exe, so the & in the URL
      # needs no escaping; the trailing & keeps exec from waiting for it.
      exec rundll32 url.dll,FileProtocolHandler $x &
    }
  } ;# This definitely works on Win95 and Win98.
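
Hypothetical usage (untested), with an ampersand in the query part:

  goUrl http://www.tcl.tk?hello&hi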

See invoking browsers --Ro


For Unix/Linux, it is highly probable that the line

   set shell [open "|/bin/sh" w]

works as well (you'd just have to substitute a more specific command in place of the "start" command...). Actually:

        set shell [open "|/bin/sh -s 2>@stdout" w+]
        fconfigure $shell -blocking off
        fconfigure $shell -buffering line

Comes closer to doing what is expected... -sluggo
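
Putting those pieces together, a hedged end-to-end sketch for Unix (untested; xdg-open is an assumption here, substitute firefox, open or whatever opener the system provides):

        set url "http://www.tcl.tk?hello&hi"
        set shell [open "|/bin/sh" w]
        fconfigure $shell -buffering line
        # The URL travels through the pipe as plain data, and the single
        # quotes keep /bin/sh from treating the & as a command separator.
        puts $shell "xdg-open '$url'"
        puts $shell exit
        close $shell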

And when the time comes to read the output back, something like:

  while { [ gets $shell line ] >= 0 } {
     append retval "$line\n"
     after 1
  }

Will get you past things that buffer, like ps. The 'after 1' seems to be quite magical in the case of buffering, whereas 'after 0' does nothing useful.
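
An alternative sketch (untested) that does not depend on timing at all: switch the channel back to blocking, echo a sentinel after the command (the __DONE__ marker is an arbitrary choice), and read until it arrives.

  fconfigure $shell -blocking on
  puts $shell "ps; echo __DONE__"
  set retval ""
  while { [gets $shell line] >= 0 && $line ne "__DONE__" } {
     append retval "$line\n"
  }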

But one idea for better cross-platform support: Unix has (guaranteed?) env(SHELL) to give the default shell; Windows has env(COMSPEC); for the Mac I don't know. Would it be worthwhile to unify these into, say, a ::tcl_platform(shell) element? (RS)

No, auto_execok should cover that.
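
A hedged illustration of that answer (untested): auto_execok resolves a command name into something exec can run, including the cmd.exe built-ins such as start on Windows, so no unified shell element is needed.

   puts [auto_execok start]  ;# on Windows NT/XP: something like "C:/WINDOWS/system32/cmd.exe /c start"
   puts [auto_execok sh]     ;# on Unix: typically /bin/sh
   # The usual cross-platform invocation (see invoking browsers) is then:
   #   eval exec [auto_execok start] [list $url]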


(Snaury) It seems I have found the true solution to the problem. Cmd.exe on WinXP treats '&', '|', '(' and ')' as special characters, and its help says that to pass these characters as arguments you need to either quote them or escape them with '^'.

Try the following code:

        exec >&@stdout cmd /c echo {a^&b}
        OUTPUT: a&b

However, exec seems to auto-quote its arguments if they contain blanks. Try the following:

        exec >&@stdout cmd /c echo {a & b}
        OUTPUT: a & b

The real question now is: how can one force quoting of certain arguments? Apparently, adding the quotes manually produces the wrong result:

        exec >&@stdout cmd /c echo {"a&b"}
        OUTPUT: \"a&b\"
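
Applying that ^-escaping to the original URL problem might then look like the sketch below (untested; it only handles &, and a URL containing ^ or other cmd.exe metacharacters would need those escaped as well):

        set url http://127.0.0.1/cgi-bin/script?var1=abc&var2=xyz
        # cmd.exe strips the ^ and hands start the literal &.
        exec $env(COMSPEC) /c start [string map {& ^&} $url]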