Linux Horror Stories and Protection Spells (Volume I)

Don’t get me wrong: I love Linux. After many years of using it, I have come to appreciate how flexible, powerful, and even beautiful it is. However, using Linux has never been a bed of roses, and every single Linux user I know has had to deal with plenty of problems from the very beginning. Indeed, I still remember how frustrating my first Linux installation was, especially after realizing that my network card was not working. Had I given up, I would never have written this post.

Although many of the problems I have faced while using Linux are related to updates and drivers (how painful NVidia driver updates can be! I will write another post about that in the future), I must admit that on many other occasions I was the only one responsible for such problems. Consequently, I want to warn the reader about a couple of the mistakes I made in the past and provide some tips on how to deal with them.

My worst nightmare: rm -r *

Removing files using the Linux terminal is dangerous. Unlike removing things through the graphical user interface, once you remove a file there is no recycle bin to undo the removal. Although, technically speaking, your file is not immediately wiped from the disk after the rm command, the reference to its location is lost; you will not be able to get it back without advanced recovery tools, and even those are not guaranteed to succeed. What is even worse, your chances of success decrease the longer you keep working with the file system, so if you need to recover an accidentally removed file, you had better do it right away. And please, keep your expectations low if trying to recover what you lost after rm -r *.
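
One damage-control measure worth knowing is to stop all writes to the affected file system immediately, for instance by remounting it read-only. The mount point below is just a placeholder; adjust it to wherever the deleted file actually lived:

#stop further writes so that recovery tools have a fighting chance
#(assumes the deleted file lived on a partition mounted at /data)
sudo mount -o remount,ro /data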

You may think that you’ll never be idiot enough to remove something important such as your home directory. I used to think that way, and although, so far, I have never removed my home folder, it is also true that on more than one occasion I have unintentionally removed valuable information (so I confirmed what an idiot I am). One of my worst mistakes was accidentally pressing the space bar while typing rm -r /this/is/my/folder/[ACCIDENTAL_SPACE]* and, unbelievably, I did not notice it even when pressing the enter key. Fortunately, this kind of mistake can be easily minimized if you are prompted to confirm the removal. Using rm -i (which asks before every single removal) or rm -I (which asks only once, before removing more than three files or before removing recursively) instead of plain rm will activate the confirmation prompt. You can permanently change the behavior of the command by adding an alias to your $HOME/.bashrc file:

alias rm="rm -I"
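
To make the difference concrete, here is a quick sketch of how the two flags behave with GNU rm (the file and folder names are just examples, and the exact prompt wording may vary between versions):

#-i asks before every single removal
rm -i a.txt b.txt
#-I asks only once when removing more than three files...
rm -I a.txt b.txt c.txt d.txt
#...or when removing recursively
rm -I -r myFolder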

An alternative way of deleting files that I find quite convenient is to use the find command. find is a search tool that allows you to recursively locate files matching a given name or pattern. For instance, if you want to find all the files and folders whose names end in txt, you can use the commands below (note that the pattern is quoted so that the shell does not expand it before find sees it):

#Match any file or directory ending in txt
find . -name '*txt'
#Match any file (but no directory) ending in txt
find . -type f -name '*txt'

What makes find so powerful is that you can execute commands on the files it finds using the option -exec. Thus, removing all files matching the pattern could be done with find . -type f -name '*txt' -exec rm {} \; or, for the specific case of deletion, with the built-in find . -name '*txt' -delete. So, when I want to be sure I am deleting what I actually want, I first run find . -name '*txt' | head, and once I have confirmed that those are the files I want to delete, I run find again with the -delete option.
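
As a further illustration of -exec, here is a sketch of a “soft delete” that moves the matches into a trash folder instead of removing them (the ~/trash location is just an example I made up, not a standard path):

#create a trash folder if it does not exist yet
mkdir -p ~/trash
#move every matching file there instead of deleting it
find . -type f -name '*txt' -exec mv {} ~/trash/ \;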

There are a few hackish ways of protecting a given folder from removal. If we remove the execute (search) permission of a folder (chmod a-x folderName), we will not be able to do anything with its content, including removing it. Although this option is quite effective, it is also quite inconvenient, since we need to grant the permission again before accessing the content. Another way of protecting a file or folder consists of making it immutable, using the command sudo chattr +i fileOrDir (note that chattr requires root and only works on file systems that support it, such as the ext family). To make it mutable again, use the command sudo chattr -i fileOrDir.
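
Here is a quick sketch of both tricks in action (the folder and file names are just examples):

#drop the execute permission: the content becomes unreachable
chmod a-x importantFolder
rm -r importantFolder    #fails with "Permission denied"
chmod a+x importantFolder
#make a file immutable: not even root can remove it until the flag is cleared
sudo chattr +i importantFile
rm importantFile         #fails with "Operation not permitted"
sudo chattr -i importantFile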

Finally, I strongly recommend backing up your machine frequently, since it is the only way to be sure you can recover your data from either mistakes or hardware failures.
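
As a minimal example, a single rsync line can keep a mirror of your home directory on an external drive (the mount point /mnt/backup is hypothetical; adjust it to your setup):

#mirror the home directory; -a preserves permissions and timestamps,
#--delete removes files from the mirror that no longer exist at the source
rsync -a --delete "$HOME/" /mnt/backup/home-backup/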

Too many files in a single directory

Have you ever tried to store one million files in a single directory? Don’t do it; it’s a bad idea. It will slow down any operation that iterates over the files contained in the directory, and wildcard expansion (e.g., ls *) will fail because the expanded argument list becomes too long for the system to handle. If you don’t believe me, try it yourself:

#create a directory and cd inside
mkdir fakeDir && cd fakeDir
#generate one million files
for i in $(seq 1 1000000); do touch "$i.txt"; done
#try to list txt files only
ls *txt

Exactly: you get an odd error, “Argument list too long”. The shell did expand the wildcard, but the resulting list of one million arguments exceeds the maximum length the kernel accepts for a single command. So if you want to delete only the txt files we just “accidentally” created, while preserving any others that may be present, you are in trouble. Fortunately, our new favorite command, find, can help us: just type find . -type f -name '*txt' -delete and press enter. This should even be faster than removing the whole directory using rm -r.
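
And if you ever need to run a command that has no built-in -delete, the same trick generalizes: pipe find into xargs, which splits the file list into batches that stay below the argument-size limit. A small sketch:

#-print0/-0 keep file names containing spaces or newlines intact
find . -type f -name '*txt' -print0 | xargs -0 rm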

Again, you may think that this is a highly unlikely scenario, and although I agree with you, it has happened to me several times while compiling large datasets. And although in many cases rm -r folderName could have solved the problem, when you are working on a cluster with shared storage you really want to use find over rm. It could save you hours.

That’s all for today! I hope you enjoyed my misfortunes and, hopefully, learned something from them. But if that is not the case, please just take a look at today’s favorite command, find. I promise you won’t regret it.
