The exec command treats ampersands (&) in its arguments specially, e.g. for making execution asynchronous. This can be a problem if an argument, such as a URL, contains ampersands.
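For example (on a Unix-like system), a trailing & never reaches the child program at all; exec consumes it, runs the pipeline in the background, and returns immediately:

# The final & is consumed by exec itself: the pipeline runs in the
# background and exec returns at once with the process id(s).
set pid [exec sleep 5 &]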
[email protected] (Petteri K) writes in comp.lang.tcl: [working on Microsoft Windows] Would anybody know a workaround in case $url contains character '&' e.g. http://127.0.0.1/cgi-bin/script?var1=abc&var2=xyz
Benny Riefenstahl replied: exec plays tricks with quoting which make it impossible for me to get the right command executed in Tcl 8.3.4. What works on W2K here though is this:
set url "https://www.tcl-lang.org?hello&hi" set shell [open "|[file join $env(COMSPEC)]" w] fconfigure $shell -buffering line puts $shell "start \"\" \"$url\"" puts $shell exit close $shell
I hope this works on Windows 9x/Me, too.
CL suggests: In the context of avoiding exec's unescapable special characters, I prefer to use the far-too-little-understood << argument, in the manner of [slightly edited]:
set url http://ats.nist.gov/cgi-bin/cgi.tcl/echo.cgi?hello=1&hi=2
exec [file join $env(COMSPEC)] << "start \"\" \"$url\" \n exit \n"
SLB: I have occasionally encountered a similar problem using rsh to execute a process on a remote machine, with characters such as &|<> interpreted on the remote machine. Unfortunately, the implementations of rsh on Windows I have used only support the stdout and stderr streams; stdin is disconnected, and the << argument relies on having a working stdin channel. One proposal for changing Tcl to resolve this problem is given here [L1 ]
Here is the simplest way that I've found, and I use it in my mindweb [L2 ]. It relies on the rundll32 utility available with some versions of Windows.
proc goUrl x {
    global tcl_platform
    if {$tcl_platform(platform) eq "windows"} {
        set x [regsub -all -nocase {htm} $x {ht%6D}]
        exec rundll32 url.dll,FileProtocolHandler $x &
    }
} ;# This definitely works on Win95 and Win98.
See invoking browsers --Ro
For Unix/Linux, it is highly probable that the line
set shell [open "|/bin/sh" w]
works as well (in place of the "start" command you'd have to substitute a more specific one; a sketch follows below). Actually:
set shell [open "|/bin/sh -s 2>@stdout" w+]
fconfigure $shell -blocking off
fconfigure $shell -buffering line
Comes closer to doing what is expected... -sluggo
And when the time comes to read it back, something like:
while {[gets $shell line] >= 0} {
    append retval "$line\n"
    after 1
}
will get you past things that buffer, like ps. The 'after 1' seems to be quite magical in the case of buffering, whereas 'after 0' does nothing useful.
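Back to the substitution question above: a Unix stand-in for the Windows start line might look like this (a sketch; it assumes xdg-open is installed, and the single quotes only protect a URL that contains no single quote itself):

set url "https://www.tcl-lang.org?hello&hi"
set shell [open "|/bin/sh" w]
fconfigure $shell -buffering line
puts $shell "xdg-open '$url'"   ;# the quotes keep ? and & away from the shell
puts $shell exit
close $shell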
But one idea for more cross-platformity: Unix has (guaranteed?) env(SHELL) to give the default shell; Windows has env(COMSPEC); for the Mac I don't know. Would it be worthwhile to unify these into, say, a ::tcl_platform(shell) element? (RS)
No, auto_execok should cover that.
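For example (a sketch; the {*} expansion needs Tcl 8.5): auto_execok resolves a command name according to the platform's conventions, including cmd.exe builtins such as start on Windows, and returns an empty result when nothing matches:

# auto_execok returns the word(s) needed to invoke the command,
# or an empty string if it cannot be resolved on this platform.
set cmd [auto_execok start]
if {[llength $cmd]} {
    exec {*}$cmd "" https://www.tcl-lang.org/ &
}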
peterc: Mac OS X users tend to prefer using AppleScript for that sort of thing.
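For instance (a sketch, not from the original discussion; it assumes Mac OS X's stock osascript tool), AppleScript's "open location" launches the default browser, and the & passes through exec untouched since it is not the last word:

set url "https://www.tcl-lang.org?hello&hi"
exec osascript -e "open location \"$url\""   ;# & is not special mid-argument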
(Snaury) It seems I have found the true solution to the problem. Cmd.exe on WinXP treats '&', '|', '(' and ')' as special characters, and its help states that if you want to pass these symbols as arguments, you need to either quote them or escape them with '^'.
Try the following code:
exec >&@stdout cmd /c echo {a^&b}
OUTPUT: a&b
However, exec seems to auto-quote its arguments if they contain blanks. Try the following:
exec >&@stdout cmd /c echo {a & b}
OUTPUT: "a & b"
The real question now is: how can one force quoting of certain arguments? Apparently, adding quotes manually produces the wrong result:
exec >&@stdout cmd /c echo {"a&b"} OUTPUT: \"a&b\"
DKF: This is caused by the fact that cmd.exe uses non-standard quoting rules compared with the normal case (basically the rules implemented by the MSVC runtime library, which we target because that's what third-party apps use or emulate). I remember David Gravereaux scratching his head a lot over this.
FWIW, you don't get this problem on Unixes, where argument processing is substantially different.
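Given those rules, a partial workaround is to caret-escape cmd.exe's metacharacters before the string reaches cmd /c. The helper below is a hypothetical sketch, not from this thread, and as Snaury's experiments show it is only dependable for arguments without spaces, because exec's automatic quoting changes the rules again:

# Hypothetical sketch: prefix each cmd.exe metacharacter with ^.
# Only reliable for arguments without whitespace, since exec adds its
# own quotes around arguments containing spaces.
proc cmdEscape {s} {
    regsub -all {[&|<>^()]} $s {^&} s
    return $s
}
exec >&@stdout cmd /c echo [cmdEscape {a&b}]   ;# prints: a&b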
neb: Aside from the escape character (^) and the command separator (&), DOS also has two boolean operators: && and ||.
exec cmd /c dir /B filex & type filex
will show the existence of filex, then type the contents.
exec cmd /c dir /B filex && type filex
will only type the contents of filex if it exists.
exec cmd /c dir /B filex || echo hello
will only echo 'hello' if filex doesn't exist.
I thought I would have to escape these for Tcl (\&\&) to pass them through, but it appears to work with or without Tcl escapes.
Under Linux I had trouble opening a pipe with arguments containing <, > and the like; this does not work, e.g.:
close [open "|ls <filename" "w"]
will cause the "open" command to interpret "<filename" as a redirection of stdin, and there is no trivial way to escape this.
Lars H: For file names, you might try including the path (e.g. ./<filename if you really want it to be relative). That's not a general solution though.
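A one-line illustration of that trick (a sketch, assuming a file literally named <filename exists in the current directory):

exec ls ./<filename   ;# the ./ prefix keeps exec from parsing < as redirection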
I found the following non-trivial solution:
proc shell_escape {s} {
    regsub -all -- {[^a-zA-Z0-9]} $s {\\\0} s
    return $s
}
proc pipeline args {
    set p "|bash -c \{"
    foreach a $args {
        append p [shell_escape $a]
        append p " "
    }
    append p "\} 2>@stderr "
    return $p
}
close [open [pipeline ls <filename] "w"]
This works for other special characters (&, |, spaces etc.), too.
The only disadvantage, of course, is that there's another executable (bash) involved, which is acceptable in many cases.
Regards,
Peter Niemayer <niemayer at isg.de>
HZe: I have a similar problem with the need to keep > < | from being interpreted by the exec command.
If you e.g. create a file named ">b.txt" on Linux:
> touch a.txt \>b.txt
Then inside Tcl you do this:
> tclsh
% exec ls {*}[glob *.txt]
You notice that there is no output. Instead, the file b.txt was created:
% ls
'>b.txt'  a.txt  b.txt
How can I make sure that arguments starting with >, <, or | are not treated by the exec command as I/O redirections? The above is only an example; changing the argument before it reaches the external command is not a solution, since the command may genuinely need to receive an argument like ">b.txt" without triggering I/O redirection.
The same happens on Windows.
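One way to sidestep this, at least on Linux (a sketch echoing Lars H's ./ trick above, not a general answer), is to rewrite each argument so it no longer begins with a redirection character; the external command then receives an equivalent path:

# Prefix each name with ./ so exec cannot mistake a leading >, < or |
# for an I/O redirection; ls sees an equivalent relative path.
set names {}
foreach f [glob *.txt] {
    lappend names ./$f
}
exec ls {*}$names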