Does this cause any problems if I use it for large files?
We can't say, because we don't know how fast your internet connection is, how much RAM you have, or how fast the pipe is from the host you're downloading the file from.
Fundamentally, though, you're reading the file twice: once into memory to see how big it is, then again if it meets your requirement, which seems really... silly.
You're doubling the traffic to the host you're reading from and on your own network connection, and if the file is larger than the RAM on your local machine, the first read alone can push it into heavy swapping.
As Darshan says, look at using Net::SFTP. It lets you query the file's size before you try to load it, without pulling the entire thing down. It's a bit more complicated to use, but that complexity translates into flexibility.
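A sketch of what that size check might look like, assuming the net-sftp gem is installed. The host, credentials, paths, and the 50 MB limit are all hypothetical placeholders; `stat!` asks the server for the file's attributes without transferring any of its contents:

```ruby
# Hypothetical cap on how big a file we're willing to pull down.
MAX_BYTES = 50 * 1024 * 1024

# Pure helper: is this size within our limit?
def small_enough?(size, limit = MAX_BYTES)
  size <= limit
end

# Download remote_path to local_path only if it's under the limit.
def fetch_if_small(host, user, password, remote_path, local_path)
  require 'net/sftp' # gem install net-sftp
  Net::SFTP.start(host, user, password: password) do |sftp|
    # stat! is a metadata round trip; no file data is transferred.
    attrs = sftp.stat!(remote_path)
    if small_enough?(attrs.size)
      sftp.download!(remote_path, local_path)
    else
      warn "skipping #{remote_path}: #{attrs.size} bytes exceeds #{MAX_BYTES}"
    end
  end
end
```

The point is that the decision happens on a few bytes of metadata, so the worst case costs you one extra round trip instead of one extra full download.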
"/user/myname/filename"
(S)FTP might not necessarily have its base path somewhere it can see that file. To find out, ask the server, over the SFTP connection, what its current directory is when you first log in, then ask it which files it can see, using something like this example from the Net::SFTP docs:
sftp.dir.glob("/base/path", "*/**/*.rb") do |entry|
  p entry.name
end
That will recursively look through the "/base/path"
hierarchy, searching for all "*.rb" files.
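Probing where the server drops you at login could look like the following sketch, again assuming the net-sftp gem and hypothetical connection details:

```ruby
# List the login directory and its immediate contents.
# host, user, and password are hypothetical -- substitute your own.
def probe_remote_layout(host, user, password)
  require 'net/sftp' # gem install net-sftp
  Net::SFTP.start(host, user, password: password) do |sftp|
    # realpath! resolves "." to the absolute directory you landed in.
    puts "logged in at: #{sftp.realpath!('.').name}"
    # List what's visible from there before guessing at absolute paths.
    sftp.dir.foreach('.') { |entry| puts entry.name }
  end
end
```

Once you know the real base path, you can build the glob above relative to it instead of guessing at "/user/myname/filename".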