Changing links in lots of files

I’ve recently changed my cloud storage provider, from the now defunct Copy to ownCloud. It’s generally been a smooth transition, and I’m really pleased I’ve taken the leap to open source cloud infrastructure! I’ve gone with a paid (but good value) host, in the hope that it’s a more sustainable business model and will last longer than Ubuntu One and Copy, cloud providers I’ve previously used.

One problem that did appear is with links inside documents, especially in LaTeX and R code, e.g. calling a dataset from a different directory. These paths would have pointed into my ~/Copy folder, which has now become ~/Cloud (what it should always have been). There’s a separate argument here for relative paths, but let’s avoid that for now!

To solve this, without tediously opening every file and changing /Copy/ to /Cloud/ by hand, I’ve used some simple bash wizardry. If you’re a Windows user you can get bash for Windows, or very soon Microsoft will be bringing bash to Windows themselves. Crazy times.

So, this will tell you how many matching lines there are to change:

grep -r "/Copy/" ./* | wc -l

We’ve used the pipe (|) to send the output of grep to wc (word count), with -l counting lines. We can also look at the grep output directly, which will show us every place /Copy/ occurs (the -r flag makes the search recursive, so grep descends into folders):
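One caveat: wc -l counts matching lines, not matches, so a line containing /Copy/ twice is only counted once. If you want every occurrence, grep’s -o flag prints each match on its own line. A quick sketch in a throwaway directory (the file name and its contents are invented for the demo):

```shell
# Demo in a throwaway directory (file and contents are made up):
cd "$(mktemp -d)"
printf '%s\n' 'd <- read.csv("~/Copy/data/a.csv") # was ~/Copy/data' > script.R

grep -r "/Copy/" ./* | wc -l    # 1 matching line...
grep -ro "/Copy/" ./* | wc -l   # ...but 2 actual occurrences
```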

grep -r "/Copy/" ./*

You should use the output from the above to make sure your search pattern (/Copy/) only matches what you want, which is why we’re searching for /Copy/ with the slashes rather than the bare word “Copy”!
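As an aside, if you’d rather not run sed over every file, one option is to feed it only the files grep finds, via grep -l (list matching files) and xargs. This is just a sketch, not the command from this post: the demo files are made up, and it assumes GNU sed’s -i:

```shell
# Only rewrite files that actually contain the pattern
# (demo files are made up; assumes GNU sed):
cd "$(mktemp -d)"
printf '%s\n' 'path: ~/Copy/tex/refs.bib' > notes.txt
printf '%s\n' 'nothing to change here'    > other.txt

grep -rl "/Copy/" . | xargs sed -i 's/\/Copy\//\/Cloud\//g'
cat notes.txt   # path: ~/Cloud/tex/refs.bib
```

A nice side effect is that files without a match keep their modification times untouched.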

Finally, we can fix the problem:

# Non-recursive
sed -i 's/\/Copy\//\/Cloud\//g' *

# Recursive (looks through folders)
find . -type f -exec sed -i 's/\/Copy\//\/Cloud\//g' {} +

Note I’ve used the backslash (\) as an escape character, so sed treats the forward slashes in “/Copy/” as literal characters in the pattern rather than as the delimiters of the s command.
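If all that escaping looks ugly: sed lets you pick a different delimiter for the s command, so using | instead of / means the slashes in the paths need no escaping at all. An equivalent sketch (demo file invented, GNU sed assumed):

```shell
# Same substitution, with | as the s command's delimiter
# so the forward slashes need no escaping (demo file made up):
cd "$(mktemp -d)"
printf '%s\n' 'source("~/Copy/R/utils.R")' > demo.R

sed -i 's|/Copy/|/Cloud/|g' demo.R
cat demo.R   # source("~/Cloud/R/utils.R")
```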