Question

This is similar to this question, but I want to include the paths relative to the current directory in Unix. If I do the following:

ls -LR | grep .txt

It doesn't include the full paths. For example, I have the following directory structure:

test1/file.txt
test2/file1.txt
test2/file2.txt

The code above will return:

file.txt
file1.txt
file2.txt

How can I get it to include the paths relative to the current directory using standard Unix commands?


Solution

Use find:

find . -name \*.txt -print

On systems that use GNU find, like most GNU/Linux distributions, you can leave out the -print.
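
With the directory layout from the question, the output should look something like this (the order of entries may vary):

$ find . -name \*.txt -print
./test1/file.txt
./test2/file1.txt
./test2/file2.txt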

OTHER TIPS

Use tree, with -f (full path) and -i (no indentation lines):

tree -if --noreport .
tree -if --noreport directory/

You can then use grep to filter out the ones you want.
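
For example, to keep only the .txt files, something like this should work:

tree -if --noreport . | grep '\.txt$'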


If the command is not found, you can install it:

Type the following command to install the tree command on RHEL/CentOS and Fedora Linux:

# yum install tree -y

If you are using Debian/Ubuntu or Linux Mint, type the following command in your terminal:

$ sudo apt-get install tree -y

Try find. You can look up the exact syntax in the man page, but it's sorta like this:

find [start directory] -name [what to find]

so for your example

find . -name "*.txt"

should give you what you want.

You could use find instead:

find . -name '*.txt'

That does the trick:

ls -R1 $PWD | while read l; do case $l in *:) d=${l%:};; "") d=;; *) echo "$d/$l";; esac; done | grep -i ".txt"

But it does that by "sinning" with the parsing of ls, which is generally considered bad form, since ls output is not meant to be parsed by scripts.
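
The same pipeline, spread over several lines for readability:

ls -R1 "$PWD" | while read l; do
    case $l in
        *:) d=${l%:} ;;       # a line ending in ":" names the directory that follows
        "") d= ;;             # a blank line separates directory listings
        *)  echo "$d/$l" ;;   # anything else is a file inside directory $d
    esac
done | grep -i ".txt"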

DIR=your_path
find "$DIR" | sed "s:^$DIR/*::"

'sed' will erase 'your_path' from the start of every 'find' result, so you receive paths relative to 'DIR' (double quotes are used so that $DIR is expanded inside the sed expression).

To get the actual full path file names of the desired files using the find command, use it with the pwd command:

find $(pwd) -name \*.txt -print
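
Assuming the current directory is /home/user (a hypothetical path), that prints absolute paths for the layout from the question:

/home/user/test1/file.txt
/home/user/test2/file1.txt
/home/user/test2/file2.txt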

Here is a Perl script:

sub format_lines($)
{
    my $refonlines = shift;
    my @lines = @{$refonlines};
    my $tmppath = "-";

    foreach (@lines)
    {
        # skip blank or indented separator lines
        next if ($_ =~ /^\s+/);
        # directory headers in "ls -LR" output look like ".:", "./test1:" or "test1:"
        if ($_ =~ /^(\S.*):$/)
        {
            $tmppath = $1 if defined $1;
            next;
        }
        # everything else is a file name; prefix it with the current directory
        print "$tmppath/$_";
    }
}

sub main()
{
    my @lines = ();

    # slurp the whole listing from standard input
    while (<>)
    {
        push(@lines, $_);
    }
    format_lines(\@lines);
}

main();

usage:

ls -LR | perl format_ls-LR.pl

You could create a shell function, e.g. in your .zshrc or .bashrc:

filepath() {
    echo "$PWD/$1"
}

filepath2() {
    for i in "$@"; do
        echo "$PWD/$i"
    done
}

The first one would work on single files only, obviously.
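
For example, assuming the current directory is /home/user (a hypothetical path):

$ filepath2 test1/file.txt test2/file1.txt
/home/user/test1/file.txt
/home/user/test2/file1.txt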

Find a file called "filename" on your filesystem, starting the search from the root directory "/":

find / -name "filename" 

If you want to preserve the details that come with ls, like file size, in your output, then this should work:

sed "s|<OLDPATH>|<NEWPATH>|g" input_file > output_file

You can implement this functionality like this:
Point the find command at the targeted directory and let it do the filtering (piping ls into find does not help, because find does not read file names from standard input). From your case, it sounds like the filenames always start with the word file and end in .txt:

find /some/path/here -name 'file*.txt'   (* is a wildcard)

In the fish shell, you can do this to list all pdfs recursively, including the ones in the current directory:

$ ls **pdf

Just remove 'pdf' if you want files of any type.
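
For the .txt files in the question, the equivalent would presumably be:

$ ls **.txt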

Licensed under: CC-BY-SA with attribution