Today I was trying to write a convenience wrapper script that ran commands remotely on a server (for one of the people we work with). But, for some reason, ssh handles arguments in a surprising (and annoyingly inconvenient) manner that completely ignores quotes.
For example:
timcharper@timcharper:~ $ ssh my_server grep "4 5 6" ./
grep: 5: No such file or directory
grep: 6: No such file or directory
Grrr… this made me feel ANGRY. Because of this, I couldn’t use the “$@” trick to splat all the arguments on the end and just move on with life. But it’s cool, because I’ve got a thick table to bang my head against, combined with an overly aggressive problem-solving drive that won’t accept no for an answer.
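For the record, the wrapper I wanted to write was basically this (a sketch; naive-remote.sh is a made-up name and my_server is the same placeholder host as elsewhere in this post):

#!/bin/bash
# naive-remote.sh -- the version that does NOT work.
# ssh joins its arguments into one space-separated string and the remote
# shell re-splits that string, so the quoting below never reaches grep.
ssh my_server "$@"

$ ./naive-remote.sh grep "hello there" .

Run like that, the remote side sees grep hello there . and goes looking for a file named there.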
So, I came across an awk trick to escape every character in a string, and another trick to iterate over arguments. Combining the two techniques did the trick, and I can now properly pass command line arguments through an ssh wrapper script. Since it depends on bash and awk, and bash and awk are on EVERY POSIX system out there, it’s a winner.
The working script:
#!/bin/bash
# bash is needed for the C-style for loop below
CMD=""
for (( i = 1; i <= $# ; i++ )); do
  eval ARG=\$$i
  # backslash-escape every character of the argument before appending it
  CMD="$CMD $(echo "$ARG" | awk '{gsub(".", "\\\\&");print}')"
done
ssh my_server cd /path/to/app \&\& RAILS_ENV=production $CMD
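If you’re wondering what that awk incantation actually does: it just prefixes every single character with a backslash. Here’s the escaping step pulled out on its own (a quick illustration):

$ echo "hello there" | awk '{gsub(".", "\\\\&");print}'
\h\e\l\l\o\ \t\h\e\r\e

By the time the remote shell has stripped those backslashes off again, the original word boundaries are back, which is why the quoted arguments survive the round trip.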
Surprisingly, this is quite bulletproof, and properly escapes strings and preserves arguments like:
./remote.sh grep "hello there" . -R
./remote.sh grep "So I says to the typewriter, \"Hey, I'm in quotes\"" . -R
./remote.sh grep "\"" . -R
I wish somebody had posted a solution like this for me to find, so, here you go. If you came here looking for this solution, I probably saved you an hour of your life, and a lot of stress to boot. You’re welcome.
Have you ever wasted more than an hour on a trivial problem like this? Is there an easier way to do this?
5 comments:
From Jeff Snyder (relayed via email because his company blocks blogger.com)
Hi Tim,
I found myself trying to pass arguments verbatim through ssh today and came across your blog :-)
I'd have left this as a comment, but my company web proxy blocks blogger.com, so email it is.
Anyway… I was also somewhat irritated that bash doesn't provide a builtin for this; you really shouldn't need to resort to awk magic to get something like this to work. But your solution was the best I found from a quick googling.
I tested a few things with it and thought it was bulletproof at first, but unfortunately it chokes on newlines, and ssh ends up executing everything after a newline as a separate command on the remote host.
Here's what I've ended up with:
COMMAND="$1"; shift;
ARGS=""
for (( i = 1; i <= $# ; i++ )); do
  eval ARG=\$$i
  # backslash-escape each argument, and turn literal newlines into the
  # ANSI-C quoted form $'\n' so they survive the trip to the remote shell
  ARGS="$ARGS $(echo -n "$ARG" | awk 'BEGIN{RS=""} {gsub("[^\\\n]", "\\\\&");gsub("\\\n", "$'"'"'\\n'"'"'");print}')"
done
ssh user@host "$COMMAND" "$ARGS"
.. not very pretty.
Have you found anything better since you blogged about it?
That's weird, for me a simple \" seems to work:
$ ssh me@server grep "\"1 2\"" somefile
does the right thing.
Hello Tim,
be careful: your script works for at most 9 arguments, because eval ARG=\$$i expands $10 and beyond as ${1} followed by extra digits. To get around this restriction, try this change:
COMMAND="$1"; shift;
ARGS=""
for (( i = 1; i <= $# ; i++ )); do
  eval ARG=\${$i}
  ARGS="$ARGS $(echo -n "$ARG" | awk 'BEGIN{RS=""} {gsub("[^\\\n]", "\\\\&");gsub("\\\n", "$'"'"'\\n'"'"'");print}')"
done
ssh user@host "$COMMAND" "$ARGS"
http://mywiki.wooledge.org/BashFAQ/096
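If you want to see the difference for yourself, here's a tiny standalone snippet (it has nothing to do with ssh, it just pokes at the positional parameters):

set -- a b c d e f g h i j    # ten positional parameters
i=10
eval ARG=\$$i;   echo "$ARG"  # prints "a0": $10 is read as ${1} followed by a literal 0
eval ARG=\${$i}; echo "$ARG"  # prints "j": ${10} really is the tenth parameter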
I was about to use your trick but some coworkers pointed out that there's a more elegant way to escape a string in bash, namely:
function escape() {
  # %q re-quotes each argument so it survives another round of shell parsing
  printf '%q ' "$@"
}
then you can just
ssh user@host "$(escape $COMMAND $ARGS)"
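For what it's worth, the whole wrapper can then shrink to something like this (just a sketch: ESCAPED is a name I made up, my_server and the cd path are the same placeholders as in the original script, %q is a bash extension, and the $'\n' style quoting it emits for embedded newlines assumes the remote login shell is also bash):

#!/bin/bash
# remote.sh rebuilt around printf %q: each argument is re-quoted so the
# remote shell parses it back into exactly the words passed to this script
ESCAPED=$(printf '%q ' "$@")
ssh my_server "cd /path/to/app && RAILS_ENV=production $ESCAPED"

The examples from the post, including the ones with embedded quotes, should behave the same way with this version.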