2019-06-10 14:40:44
Today a Linux instructor at 千锋扣丁学堂 shares a detailed introduction to identifying files with identical content on Linux. Duplicate copies of files can waste a surprising amount of disk space and cause confusion when you later want to update one of them. Below are six commands for identifying such files. In a recent post, we looked at how to identify and locate hard-linked files (that is, files that point to the same disk content and share an inode). In this article, we look at commands that find files with identical content that are not linked to each other.
$ diff index.html backup.html
2438a2439,2441
> <pre>
> That's all there is to report.
> </pre>
$ diff home.html index.html
$
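If you want to script this check rather than eyeball diff's output, the exit status (0 when the files match) is enough on its own. A minimal sketch, using throwaway example files created just to make it self-contained:

```shell
#!/bin/sh
# diff prints nothing and exits 0 when two files are identical, so the
# exit status alone can drive a script.
dir=$(mktemp -d)
printf 'same content\n' > "$dir/home.html"
printf 'same content\n' > "$dir/index.html"

if diff -q "$dir/home.html" "$dir/index.html" >/dev/null; then
  verdict="identical"
else
  verdict="different"
fi
echo "$verdict"

rm -rf "$dir"
```

The `-q` option suppresses the detailed comparison and only reports whether the files differ, which is all a script needs.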
$ cksum *.html
2819078353 228029 backup.html
4073570409 227985 home.html
4073570409 227985 index.html
$ find . -name "*.html" -exec cksum {} \;
4073570409 227985 ./home.html
2819078353 228029 ./backup.html
4073570409 227985 ./index.html
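Matching checksums by eye gets tedious with more than a few files. A small sketch (example files are created on the fly; any directory works the same way) that groups the `cksum` output and prints only the files whose checksum and size appear more than once:

```shell
#!/bin/sh
# Group cksum output by checksum+size and keep only duplicated groups.
dir=$(mktemp -d)
printf 'hello\n' > "$dir/home.html"
printf 'hello\n' > "$dir/index.html"
printf 'backup\n' > "$dir/backup.html"

dupes=$(find "$dir" -name "*.html" -exec cksum {} \; | sort |
  awk '{key = $1 " " $2; n[key]++; files[key] = files[key] $3 " "}
       END {for (k in n) if (n[k] > 1) print files[k]}')
echo "$dupes"

rm -rf "$dir"
```

Here the CRC and byte count together form the grouping key, so two files are only reported as duplicates when both values match.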
$ fslint .
-----------------------------------file name lint
-------------------------------Invalid utf8 names
-----------------------------------file case lint
----------------------------------DUPlicate files   <==
home.html
index.html
-----------------------------------Dangling links
--------------------redundant characters in links
------------------------------------suspect links
--------------------------------Empty Directories
./.gnupg
----------------------------------Temporary Files
----------------------duplicate/conflicting Names
------------------------------------------Bad ids
-------------------------Non Stripped executables
$ export PATH=$PATH:/usr/share/fslint/fslint
$ rdfind ~
Now scanning "/home/shark", found 12 files.
Now have 12 files in total.
Removed 1 files due to nonunique device and inode.
Total size is 699498 bytes or 683 KiB
Removed 9 files due to unique sizes from list. 2 files left.
Now eliminating candidates based on first bytes: removed 0 files from list. 2 files left.
Now eliminating candidates based on last bytes: removed 0 files from list. 2 files left.
Now eliminating candidates based on sha1 checksum: removed 0 files from list. 2 files left.
It seems like you have 2 files that are not unique
Totally, 223 KiB can be reduced.
Now making results file results.txt
$ rdfind -dryrun true ~
(DRYRUN MODE) Now scanning "/home/shark", found 12 files.
(DRYRUN MODE) Now have 12 files in total.
(DRYRUN MODE) Removed 1 files due to nonunique device and inode.
(DRYRUN MODE) Total size is 699352 bytes or 683 KiB
(DRYRUN MODE) Removed 9 files due to unique sizes from list. 2 files left.
(DRYRUN MODE) Now eliminating candidates based on first bytes: removed 0 files from list. 2 files left.
(DRYRUN MODE) Now eliminating candidates based on last bytes: removed 0 files from list. 2 files left.
(DRYRUN MODE) Now eliminating candidates based on sha1 checksum: removed 0 files from list. 2 files left.
(DRYRUN MODE) It seems like you have 2 files that are not unique
(DRYRUN MODE) Totally, 223 KiB can be reduced.
(DRYRUN MODE) Now making results file results.txt
-ignoreempty       ignore empty files
-minsize           ignore files smaller than specified size
-followsymlinks    follow symbolic links
-removeidentinode  remove files referring to identical inode
-checksum          identify checksum type to be used
-deterministic     determines how to sort files
-makesymlinks      turn duplicate files into symbolic links
-makehardlinks     replace duplicate files with hard links
-makeresultsfile   create a results file in the current directory
-outputname        provide name for results file
-deleteduplicates  delete/unlink duplicate files
-sleep             set sleep time between reading files (milliseconds)
-n, -dryrun        display what would have been done, but don't do it
$ rdfind -deleteduplicates true .
...
Deleted 1 files. <==
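rdfind's winnowing order, first file sizes, then first and last bytes, then a checksum, is worth noting because it avoids checksumming files that cannot possibly match. A rough shell approximation of the size-then-checksum idea, assuming invented example files (real rdfind also compares first and last bytes before checksumming):

```shell
#!/bin/sh
# Approximate rdfind's strategy: discard unique sizes first, then
# checksum only the remaining same-size candidates.
dir=$(mktemp -d)
printf 'aaaa\n' > "$dir/a.txt"
printf 'aaaa\n' > "$dir/b.txt"
printf 'something longer and unique\n' > "$dir/c.txt"

# Step 1: keep only files whose size occurs more than once
candidates=$(find "$dir" -type f -exec wc -c {} \; |
  awk '{size[$1] = size[$1] " " $2}
       END {for (s in size) if (split(size[s], f, " ") > 1) print size[s]}')

# Step 2: checksum just those candidates and report duplicated groups
dupes=$(for f in $candidates; do cksum "$f"; done | sort |
  awk '{key = $1 " " $2; n[key]++; files[key] = files[key] $3 " "}
       END {for (k in n) if (n[k] > 1) print files[k]}')
echo "$dupes"

rm -rf "$dir"
```

On a large tree the size pass eliminates most files before any content is read, which is why rdfind's output above shows "Removed 9 files due to unique sizes" before the checksum step runs.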
$ fdupes ~
/home/shs/UPGRADE
/home/shs/mytwin

/home/shs/lp.txt
/home/shs/lp.man

/home/shs/penguin.png
/home/shs/penguin0.png
/home/shs/hideme.png
# fdupes -r /home
/home/shark/home.html
/home/shark/index.html

/home/dory/.bashrc
/home/eel/.bashrc

/home/nemo/.profile
/home/dory/.profile
/home/shark/.profile

/home/nemo/tryme
/home/shs/tryme

/home/shs/arrow.png
/home/shs/PNGs/arrow.png

/home/shs/11/files_11.zip
/home/shs/ERIC/file_11.zip

/home/shs/penguin0.jpg
/home/shs/PNGs/penguin.jpg
/home/shs/PNGs/penguin0.jpg

/home/shs/Sandra_rotated.png
/home/shs/PNGs/Sandra_rotated.png
-r --recurse   recurse
-R --recurse:  recurse through specified directories
-s --symlinks  follow symlinked directories
-H --hardlinks  treat hard links as duplicates
-n --noempty   ignore empty files
-f --omitfirst  omit the first file in each set of matches
-A --nohidden  ignore hidden files
-1 --sameline  list matches on a single line
-S --size    show size of duplicate files
-m --summarize  summarize duplicate files information
-q --quiet    hide progress indicator
-d --delete   prompt user for files to preserve
-N --noprompt  when used with --delete, preserve the first file in set
-I --immediate  delete duplicates as they are encountered
-p --permissions don't consider files with different owner/group or
         permission bits as duplicates
-o --order=WORD order files according to specification
-i --reverse   reverse order while sorting
-v --version   display fdupes version
-h --help    displays help
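The `-m`/`--summarize` option condenses all of this into a count of redundant files and the space they waste. As a rough illustration of what such a summary computes, here is a hedged shell sketch (the file names are invented examples, and this is not fdupes' actual implementation, just the same bookkeeping done with `cksum`):

```shell
#!/bin/sh
# Compute an fdupes-style summary: how many files are redundant copies,
# and how many bytes deleting the extras would reclaim.
dir=$(mktemp -d)
printf '0123456789\n' > "$dir/penguin.png"
printf '0123456789\n' > "$dir/penguin0.png"
printf 'different\n' > "$dir/hideme.png"

summary=$(find "$dir" -type f -exec cksum {} \; |
  awk '{key = $1 " " $2; n[key]++; size[key] = $2}
       END {for (k in n) if (n[k] > 1) {extra += n[k] - 1; waste += (n[k] - 1) * size[k]}
            printf "%d duplicate files, %d bytes wasted", extra, waste}')
echo "$summary"

rm -rf "$dir"
```

The arithmetic mirrors the summary's logic: in each group of N identical files, N-1 copies are redundant, so the reclaimable space is (N-1) times the file size, summed over all groups.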