Question

I'm working with large variables, and it can be very slow "looping" through them with while read line. I found that the smaller the variable, the faster it works.

How can I split a large variable into smaller variables and then read them one by one?

For example, what I would like to achieve:

bigVar=$(echo "$bigVar" | split_var)

for var in "${bigVar[@]}"; do
  while read line; do
    ...
  done <<< "${var}"
done

or maybe split it into bigVar1, bigVar2, bigVar3, etc., and then read them one by one.


Solution

Instead of doing:

bigVar=$(someCommand)
while read line
do
   ...
done <<< "$bigVar"

use:

while read line
do
   ...
done < <(someCommand)

This way, you avoid the problem with big variables entirely, and someCommand can output gigabyte after gigabyte with no problem.
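As a minimal sketch of the pattern above (seq 5 stands in for someCommand, which is just a placeholder name), process substitution streams the command's output into the loop without ever storing it in a variable:

```shell
#!/usr/bin/env bash
# Count lines produced by a command without buffering its output
# in a shell variable. Requires bash (process substitution).
count=0
while IFS= read -r line; do
  count=$((count + 1))
done < <(seq 5)   # seq 5 is a stand-in for someCommand
echo "$count"     # prints 5
```

Note the loop runs in the current shell, so variables set inside it (like count) survive after the loop, unlike a `someCommand | while read ...` pipeline.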

If the reason you put it in a variable was to do work in multiple steps on it, rewrite it as a pipeline.
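For instance, a hedged sketch of turning a multi-step job into a pipeline (printf here stands in for the real data source): instead of capturing each intermediate result in a variable, chain the steps so data streams through them:

```shell
#!/usr/bin/env bash
# Instead of:
#   tmp=$(someCommand); filtered=$(grep foo <<< "$tmp"); ...
# chain the steps so no intermediate result is held in memory at once.
printf '%s\n' foo1 bar foo2 \
  | grep foo \
  | sort -r
# prints:
# foo2
# foo1
```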

Other tips

If bigVar is made of words, you could use xargs to split it into lines no longer than the maximum length of a command line, usually 32 KB or 64 KB:

someCommand | xargs | while read line
do
    ...
done

In this case xargs uses its default command, which is echo.
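To illustrate the default-echo behavior (the four sample words are made up for demonstration): with no command argument, xargs batches its input words and passes them to echo, one batch per output line:

```shell
#!/usr/bin/env bash
# xargs with no command runs echo, joining input words into lines
# up to the system's argument-length limit.
printf '%s\n' alpha beta gamma delta | xargs
# prints: alpha beta gamma delta
```

A small input like this fits in a single batch, so it comes out as one line; gigabytes of words would be split into many echo invocations, one line each.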

I'm curious about what you want to do in the while loop, since it may be possible to optimize it into a pipeline.

License: CC-BY-SA with attribution
Not affiliated with StackOverflow