
ls command: how can I get a recursive full-path listing, one line per file?

How can I get ls to spit out a flat list of recursive one-per-line paths?

For example, I just want a flat listing of files with their full paths:

/home/dreftymac/.
/home/dreftymac/foo.txt
/home/dreftymac/bar.txt
/home/dreftymac/stackoverflow
/home/dreftymac/stackoverflow/alpha.txt
/home/dreftymac/stackoverflow/bravo.txt
/home/dreftymac/stackoverflow/charlie.txt

ls -a1 almost does what I need, but I do not want path fragments, I want full paths.

See also: tree
tree -aflix --noreport works, but if there are any symbolic links in the path you will have to deal with them, or use an alternative from one of the suggested answers.

Jonathan Leffler

Use find:

find .
find /home/dreftymac

If you want files only (omit directories, devices, etc):

find . -type f
find /home/dreftymac -type f

Can ls parameters like --sort=extension be "redeemed" by this solution?
You can even use printf output in order to display needed contextual info (e.g. find . -type f -printf '%p %u\n')
Can this be formatted with a flag? i.e. like Python's pprint.pprint(files)
@Shayan find with the -printf predicate allows you to do everything ls does, and then some. However, it is not standard. You can use find -exec stat {} \; but unfortunately the options to stat are not standardized, either.
... In the end, unfortunately, the most portable solution might be a Perl or Python script.
approxiblue

If you really want to use ls, then format its output using awk:

ls -R /path | awk '
/:$/&&f{s=$0;f=0}
/:$/&&!f{sub(/:$/,"");s=$0;f=1;next}
NF&&f{ print s"/"$0 }'

Can someone please explain the above awk expressions?
This solution doesn't omit the directories (each directory gets its own line)
Thanks. Btw, would be nice to have this in 1 line for quick copy & paste.
add this to your .bashrc file: function lsr () { ls -R "$@" | awk ' /:$/&&f{s=$0;f=0} /:$/&&!f{sub(/:$/,"");s=$0;f=1;next} NF&&f{ print s"/"$0 }' } so you can use lsr /path to use this wherever
Do you mind explaining the awk code? It looks like you are using a regex to catch lines that end in ":" (the "headers" with parent directory paths), but I get lost after that and definitely don't understand the part where the last field NF is being evaluated as true/false. Thanks!
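Since a couple of comments ask for an explanation: here is the same awk program wrapped in a function, with comments added (the annotation is mine; the logic is unchanged):

```shell
# lspaths DIR: recursive one-per-line listing using only ls and awk.
lspaths() {
  ls -R "$1" | awk '
    # ls -R prints a header line "dir:" before each directory section.
    /:$/ && f  { s = $0; f = 0 }  # header while in a section: clear flag, fall through
    /:$/ && !f {                  # header: remember the directory in s
      sub(/:$/, "")               # strip the trailing colon
      s = $0; f = 1; next
    }
    NF && f { print s "/" $0 }    # non-blank entry line: print dir/entry
    # blank lines (NF == 0) merely separate sections and match no rule
  '
}
```

The NF test in the last rule is just "line is non-empty": NF is the number of fields, which is 0 on the blank separator lines.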
others

ls -ld $(find .)

if you want to sort your output by modification time:

ls -ltd $(find .)


-bash: /bin/ls: Argument list too long
+1 worked for me with 12106 files, and I could use the --sort=extension parameter of ls
Thanks. I wouldn't have thought by myself of that (nice and short) syntax - i would have used find . -name "*" -exec ls -ld '{}' \; (that one works whatever the number of files is), but your command is way shorter to write ;)
ls -ld $(find .) breaks for me if I'm listing a ntfs disk where files have spaces: ls: cannot access ./System: No such file or directory however find with quotes by @SRG works
A shorter alternative (depending on your needs) would be find . -ls.
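Both issues reported above (the "Argument list too long" error and names with spaces) can be avoided by letting find invoke ls itself; a sketch:

```shell
# "+" makes find batch as many paths per ls invocation as the system
# allows, so no argument-list limit is hit, and each name is passed as
# its own argument, so spaces survive.
find . -exec ls -ld {} +
```

Note that sort options such as --sort=extension then only apply within each batch that find hands to ls.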
tripleee

Best command is: tree -fi

-f: print the full path prefix for each file
-i: don't print indentation lines

e.g.

$ tree -fi
.
./README.md
./node_modules
./package.json
./src
./src/datasources
./src/datasources/bookmarks.js
./src/example.json
./src/index.js
./src/resolvers.js
./src/schema.js

To keep the files but not the symbolic links (which tree prints as link -> target), filter > out of the output:

tree -fi |grep -v \>

If you want to know the nature of each file (for example, to read only ASCII files), try a while loop:

tree -fi |
grep -v \> |
while read -r first ; do 
    file "${first}"
done |
grep ASCII

@Nakilon what's the closest thing? Does it display output similarly? How would you easily display similar output with a short command?
@om01: on OS X it is as difficult as brew install tree, given you are using Homebrew
I couldn't list the files exclusively; it always lists the directories too. How can I do that?
@kommrad You could adapt the file example and pipe to grep -v directory
kenorb

Try the following simpler way:

find "$PWD"

find "`pwd`" if the path contains spaces or some other special characters.
How is this any different than find .? -.-
@SalmanPK If you give find an absolute path like pwd to start with, it will print absolute paths. By the way, "How is this any different than find" ;-)
find without an argument is a syntax error on some platforms. Where it isn't, just find is equivalent to find ..
Sushant Verma

Oh, what a long list of answers. It helped a lot, and finally I put together the one I was looking for:

To List All the Files in a directory and its sub-directories:

find "$PWD" -type f

To List All the Directories in a directory and its sub-directories:

find "$PWD" -type d

To List All the Directories and Files in a directory and its sub-directories:

find "$PWD"

And to filter by extension: find "$PWD" -type f | grep '\.json$'
No need for post-processing with grep; use -name in find: find "$PWD" -type f -name '*.json' (quote the pattern so the shell doesn't expand it). If you want to delete the files listed: find "$PWD" -type f -name '*.json' -exec rm {} \; Similarly, to copy instead, replace rm with cp and add a destination: -exec cp {} destination \;
I had to use "$PWD/" in my case: find "$PWD/" -type f
And to list a particular file with its full path find "$PWD/README.md"
Justin Johnson

I don't know about the full path, but you can use -R for recursion. Alternatively, if you're not bent on ls, you can just do find *.


Ry-
du -a

Handy for some limited appliance shells where find/locate aren't available.


Any idea how I can remove the file size without awk?
Idelic

Using no external commands other than ls:

ls -R1 /path |
  while IFS= read -r l; do case $l in *:) d=${l%:};; "") d=;; *) echo "$d/$l";; esac; done

Unknown option '-1'. Aborting.
@ilw That's weird; I'd think ls -1 is fairly standard; but try just leaving it out if it's unsupported. The purpose of that option is to force ls to print one line per file but that's usually its behavior out of the box anyway. (But then of course, don't use ls in scripts.) (Looking at the POSIX doco, this option was traditionally BSD only, but was introduced in POSIX in 2017.)
All the subfolders will be in the list, not just files.
apaderno

find / will do the trick


David Golembiowski

Run a bash command with the following format:

find /path -type f -exec ls -l \{\} \;

Likewise, to trim away -l details and return only the absolute paths:

find /path -type f -exec ls \{\} \;

find -ls avoids running an external process for each file and is a lot easier to type.
You don't need the -exec ls \{\} \; part, since the default behavior of find is to print the full path. That is, find /path -type f does the job if you don't need the file attributes from ls -l.
oers

The easiest way for all you future people is simply:

du

This, however, also shows the size of what's contained in each folder. You can use awk to output only the folder name:

du | awk '{print $2}'
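Note that awk '{print $2}' drops everything after the first space in a name; since du separates the size and the path with a tab, cut is a safer sketch:

```shell
# du output is "<size><TAB><path>"; printing from the second
# tab-delimited field onward keeps paths with spaces intact.
du | cut -f2-
```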

Edit: sorry, my bad. I thought only folders were needed. I'll leave this here in case anyone in the future needs it anyway...


Interesting, because it shows me stuff I didn't know I wanted to know -- kind of like Google suggest. It turns out, I like knowing how much space each file takes.
halfer

Don't make it complicated. I just used this and got a beautiful output:

ls -lR /path/I/need

OP wants just the full path and nothing else. ls -lR wouldn't meet that goal.
sorry, it doesn't do it
Sorry man, doesn't fit the bill. If you need a list of full paths, you won't get it this way. At least not with bash or zsh on BSD or MacOS
Grzegorz Luczywo

With the freedom to use all possible ls options:

find -type f | xargs ls -1


Dimitrios

I think for a flat list the best way is:

find -D tree /fullpath/to-dir/ 

(or in order to save it in a txt file)

find -D tree /fullpath/to-dir/ > file.txt

Tried on Mac OS 10.13.3. find: illegal option -- D
Kevin

Here is a partial answer that shows the directory names.

ls -mR * | sed -n 's/://p'

Explanation:

ls -mR * lists the full directory names ending in a ':', then lists the files in that directory separately

sed -n 's/://p' finds the lines that end in a colon, strips off the colon, and prints them

By iterating over the list of directories, we should be able to find the files as well. Still working on it. It is a challenge to get the wildcards through xargs.
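One way to finish that iteration without fighting xargs is a read loop over the directory list (a sketch; it prints each directory's entries with the directory path prepended):

```shell
# Extract the "dir:" headers from ls -mR with sed, then list each
# directory's entries one per line with the directory path prepended.
ls -mR * | sed -n 's/://p' | while IFS= read -r dir; do
  ls -1 "$dir" | sed "s|^|$dir/|"
done
```

Like the sed-only version, this is still partial: top-level files matched by * are not captured, and subdirectories appear both as entries and as headers.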


great answer.. exactly what i needed!
koeselitz

Adding a wildcard to the end of an ls directory forces full paths. Right now you have this:

$ ls /home/dreftymac/
foo.txt
bar.txt
stackoverflow
stackoverflow/alpha.txt
stackoverflow/bravo.txt
stackoverflow/charlie.txt

You could do this instead:

$ ls /home/dreftymac/*
/home/dreftymac/.
/home/dreftymac/foo.txt
/home/dreftymac/bar.txt
/home/dreftymac/stackoverflow:
alpha.txt
bravo.txt
charlie.txt

Unfortunately this does not print the full path for directories recursed into, so it may not be the full solution you're looking for.


Also unfortunately you can't sudo ls with a wildcard (because the wildcard is expanded as the normal user).
Also unfortunately, ls has a lot of pesky corner cases; see parsing ls
Steve

There are a lot of answers here. This is mine, and I think it is quite useful if you are working on a Mac.

I'm sure you know there are some "bundle" files (.app, .rtfd, .workflow, and so on). Looking at a Finder window they seem to be single files, but they are not, and ls or find see them as directories... So, unless you need to list their contents as well, this works for me:

find . -not -name ".*" -not -name "." | egrep -v "\.rtfd/|\.app/|\.lpdf/|\.workflow/"

Of course this is for the working directory, and you could add other bundle extensions (but always with a / after them), or any non-bundle extensions without the /.

The ".lpdf/" case (multilingual PDF) is rather interesting: Finder shows it with a normal ".pdf" extension (!!) or none at all. This way you get (or count) one file for such a PDF instead of a bunch of its internals…


sivi

ls -lR is what you were looking for, or at least it is what I was looking for. Cheers


OP wants just the full path and nothing else. ls -lR wouldn't meet that goal.
tripleee

If the directory is passed as a relative path, you will need to convert it to an absolute path before calling find. In the following example, the directory is passed as the first parameter to the script:

#!/bin/bash

# get absolute path
directory=$(cd "$1" && pwd)
# print out list of files and directories
find "$directory"

If your system has readlink you can do directory=$(readlink -e "$1")
True, but cd/pwd combination will work on every system. readlink on OS X 10.5.8 does not support -e option.
Diego C Nascimento
tar cf - "$PWD" | tar tvf -

This is slow but works recursively and prints both directories and files. You can pipe it with awk/grep if you just want the file names without all the other info/directories:

tar cf - "$PWD" | tar tvf - | awk '{print $6}' | grep -v "/$"
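Since awk '{print $6}' also breaks on names containing spaces, tar tf (without v) is a simpler sketch: it prints only the member names, one per line, with directories marked by a trailing slash:

```shell
# "tf" lists member names only, so no column parsing is needed;
# grep -v '/$' drops the directory entries.
tar cf - . | tar tf - | grep -v '/$'
```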

You can also simply use: tar cvf /dev/null "$PWD"
Pavlo Neiman

Recursive list of all files from current location:

ls -l $(find . -type f)


Better yet find . -type f -ls which won't choke if the output from find is too long for ls
Paul Rougieux

The realpath command prints the resolved path:

realpath *

To include dot files, pipe the output of ls -a to realpath:

ls -a | xargs realpath

To list subdirectories recursively:

ls -aR | xargs realpath

In case you have spaces in file names, man xargs recommends the -0 option to prevent file names from being processed incorrectly; it works together with the output of find -print0, and it starts to look a lot more complex than the other answers:

find -print0 | xargs -0 realpath

See also Unix and Linux stackexchange question on how to list all files in a directory with absolute path.


Brian Burns

I knew the file name but wanted the directory as well.

find $PWD | fgrep filename

worked perfectly in Mac OS 10.12.1


Works like a charm, also on MacOS 10.15.7, in Terminal, using zsh as default shell. Wonderful, and thanks!
This is rather imprecise and inefficient. A better attempt is find . -name filename
tripleee

@ghostdog74: a little tweak to your solution. The following can be used to search for a file by its full absolute path.

sudo ls -R / | awk '
/:$/&&f{s=$0;f=0}
/:$/&&!f{sub(/:$/,"");s=$0;f=1;next}
NF&&f{ print s"/"$0 }' | grep [file_to_search]

Mirko Cianfarani

If you have to search something big, like 100 GB or more, I suggest the tree command that @kerkael posted, not find or ls.

Run tree with only one difference: I suggest writing the output to a file.

Example:

tree -fi > result.txt

Afterwards, grep the file for a pattern, e.g. grep -i '\.docx' result.txt, so you don't lose time: searching the file is much faster than re-scanning a big disk.
