linux - Writing a function to replace duplicate files with hardlinks -




I need to write a bash script that iterates through the files of a specified directory and replaces duplicate files with hardlinks. Right now, the entire function looks like this:

#!/bin/bash
# sameln --- remove duplicate copies of files in specified directory

D=$1
cd $D #go to the directory specified as default input

fileNum=0 #loop counter

DIR=".*|*"
for f in $DIR #for every file in the directory
do
    files[$fileNum]=$f #save that file into the array
    fileNum=$((fileNum+1)) #increment the counter
done

for (( j=0; j<$fileNum; j++ )) #for every file
do
    if [ -f "$files[$j]" ] #access that file in the array
    then
        for (( k=0; k<$fileNum; k++ )) #for every other file
        do
            if [ -f "$files[$k]" ] #access the other files in the array
            then
                test[ cmp -s ${files[$j]} ${files[$k]} ] #compare if the files are identical
                [ ln ${files[$j]} ${files[$k]} ] #change the second file to a hard link
            fi
        done
    fi
done

basically:

loop through the files at depth 1 in the specified directory, put the files into an array, compare each array item to every other array item, and replace duplicates with hardlinks.

The test directory has four files: a, b, c and d.

a and b are different; c and d are duplicates (they are both empty). After running the script, ls -l shows that all of the files still have a link count of 1, so the script appears to have done nothing.

Where am I going wrong?

DIR=".*|*"
for f in $DIR #for every file in the directory
do
    echo $f
done

this code outputs

.*|*

You should not loop over files like this; use the find command instead. You see, your code doesn't work because the first loop is faulty: ".*|*" is a quoted string, not a glob, so it is never expanded, and the loop runs exactly once with the literal string as $f.
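For reference, here is one way the whole script could be corrected, as a sketch rather than a definitive fix: expand a real glob into an array, compare each pair of regular files with cmp, and relink duplicates with ln -f. The sameln function name is taken from the script's own comment; the -ef check that skips already-linked pairs is an addition, not part of the original code.

```shell
#!/bin/bash
# sameln --- replace duplicate regular files in a directory with hardlinks
# (a corrected sketch of the asker's approach)
sameln() (
    cd "${1:-.}" || exit 1
    shopt -s nullglob dotglob        # match hidden files too; empty dir -> no entries
    files=(*)                        # an unquoted glob actually expands
    for (( j = 0; j < ${#files[@]}; j++ )); do
        [ -f "${files[j]}" ] || continue                 # skip non-regular files
        for (( k = j + 1; k < ${#files[@]}; k++ )); do
            [ -f "${files[k]}" ] || continue
            [ "${files[j]}" -ef "${files[k]}" ] && continue  # already the same inode
            if cmp -s -- "${files[j]}" "${files[k]}"; then   # byte-identical?
                ln -f -- "${files[j]}" "${files[k]}"         # replace with a hardlink
            fi
        done
    done
)
```

Note the inner loop starts at j+1, since comparing a pair twice (or a file with itself) is pointless, and the function body runs in a subshell so the cd and shopt do not leak into the caller.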

By the way, don't name your variables in uppercase; those names are reserved for shell variables, I believe.
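The risk with uppercase names is colliding with variables the shell itself uses. A tiny demo, assuming the assigned value /some/dir is hypothetical: clobbering PATH breaks command lookup until it is restored.

```shell
#!/bin/bash
# Why uppercase variable names are risky: PATH is used by the shell
# to find commands, so assigning to it has side effects.
saved=$PATH
PATH="/some/dir"                  # hypothetical value; clobbers command lookup
if ls / >/dev/null 2>&1; then     # external commands can no longer be found
    result="ls still works"
else
    result="ls: command not found"
fi
PATH=$saved                       # restore so the rest of the script works
echo "$result"
```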

linux bash
