Problem

I have a file that has fairly long lines. The longest line has length 4609:

% perl -nle 'print length' ~/very_large_file | sort -nu | tail -1
4609

Now, when I just run cat ~/very_large_file it runs fine. But when I put it inside backticks, it gives a 'Word too long' error:

% foreach line (`cat ~/very_large_file`)
Word too long.

% set x = `cat ~/very_large_file`
Word too long.

Is there an alternative to using backticks in csh to process each line of such a file?

Update: My problem was solved by using a different language, but I still couldn't figure out why csh was failing. I just came across this page, which describes how to find ARG_MAX. In particular, the getconf command is useful. Of course, I am still not sure whether this limit is the root cause, and whether the limit applies to languages other than csh.
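
For reference, a quick way to check that limit on a given system is the getconf call below; the printed value is only an example (2097152 bytes, i.e. 2 MiB, is a common default on Linux, and other systems will differ):

% getconf ARG_MAX
2097152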


Solution

I don't mean to beat a dead horse, but if you're scripting, do consider moving to bash, zsh, or even the Korn shell. csh has its disadvantages.

What you can try without abandoning csh completely:

  • Move to tcsh if you're using the regular old (very old) csh.
  • Recompile tcsh with a longer word length (the default is 1000 bytes, I think) or with dynamic allocation.
  • If possible, move the line processing to a secondary script or program, and write that loop like this (a sketch of such a script follows below):

    cat ~/very_large_file | xargs secondary_script
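
A minimal sketch of what that secondary script might look like, assuming plain POSIX sh; secondary_script is just the placeholder name from the pipeline above, and the body is illustrative only. Note that plain xargs splits its input on whitespace, so if you need each whole line as a single argument you would use something like GNU xargs -d '\n' or tr '\n' '\0' | xargs -0 instead:

    #!/bin/sh
    # secondary_script: handle the items xargs passes in as arguments.
    # xargs keeps each invocation's argument list under ARG_MAX, and
    # csh's per-word length limit never comes into play here.
    for item in "$@"; do
        printf 'processing: %s\n' "$item"
    done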
