One of my bad habits is constantly buying computing books. This wouldn’t be so bad if I actually read them, but I have amassed a huge backlog of books that will most probably never be read, which ends up being a waste of money.
A couple of posts I read recently have led me to the decision that I should stop, or at least drastically cut down on, buying computing books. The first post talked about “learning voyeurism”, where you are really more interested in the idea of learning new things than in learning the thing itself, and the second talked about spending time going deeper into topics instead of bouncing lightly through many different ones.
It is very difficult to stay focused on one thing when there is so much happening in the world of computing. New and exciting languages and frameworks are released all the time, and while it would be great to try them all, doing so means you’ll never actually become good at any of them.
I have always been a bit of a generalist, and while knowing a bit about everything isn’t a bad thing, there is a fine line between knowing a bit of everything while being proficient at some things, and knowing not quite enough about anything to be able to do anything at all.