I’ve been programming for the better part of my life. I started with BASIC, in which I wrote a small piece of software that let me manually paint a series of bitmaps and animate them. My home computer had a 4-color CGA screen, and the computer at my school was a blue-and-black Commodore 64, but I knew I was hooked.
In high school I wrote my first big project as part of a computer graphics course. I ended up creating a (very bad) car racing simulator in Pascal, complete with track editing and several car models. I created all the graphics from scratch, from line rendering to lighting gradients on surfaces, camera tracking, and even font display. The year ended just as I was about to add the simplest AI for an opponent’s car. Pascal was also the language in which I learned the basic concepts of object-oriented design and the use of pointers.
University is just a blur of C/C++, Java, and Matlab, but mostly it was about theory. Nothing in university prepares you to become a better programmer. I’m not saying that it should, either; I’m just stating a fact. In the basic curriculum, even though many classes require you to program, almost none of them teach best practices, the process of developing as part of a team, source control, bug tracking, unit testing, and so on. Even design patterns are hard to come by. When I left university, I wasn’t a much better programmer. I was just a little more experienced, and I had a much better resume (Java? Sure, I’m very experienced with Java!).
What did I learn?
Bits of Useful Knowledge
Surprisingly enough, linear algebra rises to the top. Most of the math classes are about learning how to learn and learning how to think: how you approach a new subject, how you dissect it into pieces and build knowledge on top of other knowledge, how you read a scientific article. Linear algebra was the only subject I used in later classes, and later on in my career, in areas such as machine learning and image processing.
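As a toy illustration of my own (not an example from any of those classes), here is the kind of place linear algebra keeps showing up in image and graphics work: rotating a point is just multiplication by a 2×2 rotation matrix, the same operation that underlies image rotation and camera transforms.

```python
import math

def rotate(point, angle_rad):
    """Multiply a 2D point by the rotation matrix [[cos, -sin], [sin, cos]]."""
    x, y = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

# A quarter turn takes the point (1, 0) to (0, 1), up to floating-point noise.
x, y = rotate((1.0, 0.0), math.pi / 2)
print(round(x, 6), round(y, 6))  # → 0.0 1.0
```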
Machine learning algorithms, which I later used for data mining. This course, which I took on a whim, was broad enough that I could navigate my way around the subject during my years at BigCo. I have come back to it again and again when devising classification and clustering algorithms.
Data structures and basic concepts of complexity. This is the single most important thing a programmer can get out of a BSc in computer science as it is built today. Even though these data structures are already implemented in most languages, knowing how they work and when to use them is very useful.
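A small sketch of my own of what "knowing when to use them" means in practice: membership tests on a Python list are O(n), while a set answers the same question in O(1) on average, which changes how you would write even a simple deduplication pass.

```python
def dedupe_preserving_order(items):
    """Remove duplicates in one pass, keeping first occurrences in order."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:   # O(1) average with a set; O(n) if `seen` were a list
            seen.add(item)
            out.append(item)
    return out

print(dedupe_preserving_order([3, 1, 3, 2, 1]))  # → [3, 1, 2]
```

The whole loop stays O(n); swap the set for a list and the identical-looking code quietly becomes O(n²).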
Bytes of Outdated Knowledge
Turing machines. Calculus. Outdated models of database design and concepts for relational databases. Classical mechanics.
Things I’ve Learned in the Real World That I Wish Someone Had Told Me
I really started honing my craft when I arrived at BigCo, fresh out of university. The first thing I had to do was read a long and boring book about the workings of relational databases and schema design. Most programmers hate the database. They hate the interfaces to databases and wish they didn’t have to deal with them. They don’t like the fact that it has its own language called SQL. Most of all, they don’t like the fact that there’s a lot of black magic in dealing with a database: the thing is so damn smart, you can hardly predict how it will behave.
At this point I realized that most of what I’d learned at university was outdated and wrong. Complexity models are a terrible way to anticipate real-world behavior. In the real world, you are not plagued by O(N log N) problems. You are troubled by the time it takes the spindle in your hard drive to spin, the seek time, the read speed, the round-trip time between servers. You are interested in cache behavior and what to do when not everything fits in memory.
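To put rough numbers on that claim (my own order-of-magnitude assumptions, in the spirit of the well-known "latency numbers every programmer should know" lists, not figures from the text): one seek on a rotational disk costs on the order of 10 ms, while a main-memory reference costs on the order of 100 ns.

```python
# Rough, order-of-magnitude latency figures (assumptions for illustration).
DISK_SEEK_S = 10e-3        # ~10 ms: one spindle seek on a rotational disk
MEMORY_ACCESS_S = 100e-9   # ~100 ns: one main-memory reference

# How many in-memory accesses fit in the time of a single disk seek?
ratio = DISK_SEEK_S / MEMORY_ACCESS_S
print(round(ratio))  # → 100000
```

Five orders of magnitude: with constants like these, one stray seek buries any N log N versus N distinction for modest N.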
The basic concepts of schema design don’t work. Normalization? Huh. No wonder programmers are flocking to NoSQL solutions. Everything they think they know about databases is wrong, because universities teach bad practices.
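For concreteness, here is a minimal toy sketch (my own example, not the book’s) of the normalized design the textbook teaches: authors live in one table, posts reference them by id, and a JOIN reassembles the denormalized view that a NoSQL document store would simply keep inline.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE posts (
        id INTEGER PRIMARY KEY,
        author_id INTEGER NOT NULL REFERENCES authors(id),
        title TEXT NOT NULL
    );
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO posts VALUES (1, 1, 'On Engines'), (2, 1, 'Notes'), (3, 2, 'Compilers');
""")

# The JOIN rebuilds the flat (author, title) view at query time.
rows = conn.execute("""
    SELECT authors.name, posts.title
    FROM posts JOIN authors ON posts.author_id = authors.id
    ORDER BY posts.id
""").fetchall()
print(rows)  # → [('Ada', 'On Engines'), ('Ada', 'Notes'), ('Grace', 'Compilers')]
```

The trade-off in one line: normalization stores each fact once and pays with a JOIN on every read; denormalization stores the author name in every post and pays on every update.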
There is no Spoon
Once I was free of the notions of academia, I was free to start reading on my own. Google, blogs, and Wikipedia really came to the rescue here. I’m not big on books (teach yourself nothing in 21 days); I’d rather have a constant stream of news, tips, and relevant knowledge as I need it. My reading lists change often. C# and .NET gave way to Python. Centralized source control was replaced by decentralized source control. Model-View-Controller (MVC) is now Model-Template-View (MTV). Waterfall became agile.
Until I left BigCo last year, I was always learning new things, and I had the privilege of testing new ideas both inside and outside of BigCo. I learned that the most important thing is not taught in schools: it’s about the people. A small, dedicated group can outmaneuver a bigger group by adopting better processes, not better technologies. Human interactions and communication matter more to a team’s success than anything you can learn in computer science.
I haven’t learned any new technology this year. Instead, I’m now spending part of my time honing my craft and becoming a better programmer by learning about processes. I think you should too.