Years ago, I found myself sitting in a darkened cinema surrounded by bored, fidgety kids. My elder son was eating his body weight in Pick ‘n Mix sweets, while suffering through endless adverts and previews.
Then the screen dimmed. An expectant hush fell over the audience. The movie was about to start.
Except it didn’t.
Instead, they played a cartoon. Entertaining. Funny. A trip back in time.
100 years ago, Walt Disney got his start making short animations that were shown before movies. 75 years later, Pixar was doing similar things. The first time I encountered The Simpsons, it too was an animated short screened before a feature film.
This time, a ninety-second cartoon showed a snowman and a moose wrestling over a carrot on a frozen lake. There was slapstick comedy. Imaginative cheating. A heartwarming act of friendship. And a surprise ending.
My over-sugared son laughed so hard he almost wet himself. The other kids in the movie theatre, and most of the adults, were doing the same.
That short cartoon turned out to be a clever advertisement for what would become Disney’s most successful movie: Frozen.
A couple of months later, movie posters featuring the same snowman started appearing on the sides of London buses. My son begged me to take him to the cinema to watch it.
But here is the thing. That scene did not appear in the movie.
Nor was Frozen a slapstick comedy. The snowman was a sidekick. The moose barely featured at all.
Instead, we saw a formulaic Disney movie featuring a princess having an adventure, overcoming adversity, and falling in love. The only memorable thing was the soundtrack. “Let it go!”
The film wasn’t any better or worse than all the other Disney movies we had seen. Yet we were disappointed because our expectations had not been met.
A few years later I watched the movie again while visiting my nieces. They were obsessed with Disney Princesses at the time, insisting that everyone dress up to watch it. To this day, I’m still not quite sure how it happened, but I found myself seated next to them on the couch wearing a purple fairy princess dress they had liberated from their mother’s wardrobe. I would be lying if I said it went well with my Santa Claus beard, hairy chest, or hirsute legs. Regardless, it made the tribe of little girls happy.
The movie wasn’t nearly as bad as I remembered.
Same annoying song.
What had changed were my expectations.
Before I could enjoy it, I had to first unlearn much of what I thought I knew about it.
This week a miraculous thing happened: free time.
No children needing to be homeschooled.
No conference calls occupying me from dawn to dusk.
No self-important C-suite executives making unreasonable requests.
This was not a fleeting unoccupied moment between clamours for my attention or demands upon my time.
It was wonderful: contiguous blocks of unallocated time. A dwindling to-do list. An empty calendar. Freedom!
I can’t recall the last time this occurred. It had been a while. Before the pandemic, certainly. Possibly as far back as my last seasonal semi-retirement.
For the first couple of days, I didn’t trust the sensation.
Constantly experiencing a feeling that something was wrong. Had I forgotten something? Wasn’t there a conference call I was supposed to be attending? A panic to extinguish?
Then I would remember that my deliverables had been submitted. Deadlines met. Final invoice paid.
My time was now my own. The kids are on summer holidays, and unlike previous years, both are of an age where they are largely content to amuse themselves.
I decided to use my newfound free time to take on a side project and dust off a long-forgotten hobby.
As a child, I enjoyed torturing computers.
Teaching myself BASIC in primary school.
Teaching Turbo Pascal to the computing class in high school.
The programming language that I enjoyed the most as a kid was Visual Basic. It was easy to learn. Quick to deliver results. Simple. A drag-and-drop design canvas with immediate feedback.
I studied programming during my final year at school, planning to use the “easy” marks on offer to help secure a place at a good university. My plan worked. That year I topped the state in computing; once the marks had been fitted to their standardised bell curve, I received a score of 121 out of 100 for the subject.
At university, I initially enrolled in a combined Accounting and Computing degree.
I arrived for my first day on campus with the naïve hope that I would learn some tangible skills that would help me get ahead of the 30+% youth unemployment rate in my home town at the time.
Over the next six months, the Computer Science academics proceeded to kill off my love of programming.
They taught it badly.
They taught it theoretically.
They taught it wrong.
Instead of teaching programming using any of the widely used commercial languages that could help get me a job, they opted for a “teaching” language that nobody used in real life.
Rather than teaching students about “fat clients” and the client-server model that then dominated the industry, they continued to rehash their twenty-year-old syllabus on mainframes. A technology that had been shrinking in relevance and demand throughout that entire twenty-year period!
I asked the department head when the faculty would update their curriculum to cover modern computing languages.
He frowned and told me there was no problem with the curriculum. “Modern” languages were mere passing fancies. The stuff of fashion and trend. Computational mathematics. Computing theory. Engineering fundamentals. History. These were timeless.
That is what they taught.
That is what they had always taught.
That is what they would always teach.
To succeed, I would need to unlearn all that “self-taught hacking” knowledge I had acquired. Develop the sound theoretical basis upon which any successful career in computing would depend. If I didn’t like it, I was free to pursue an alternative discipline.
I enrolled in a Commerce degree the next day.
More than a decade has passed since I last worked “on the tools”. A conscious decision, taken once I observed that customer-facing roles in high-cost locales were less vulnerable to being outsourced or offshored.
Since that time, the world of computing has changed in many ways. APIs. Apps. Big Data. Cloud. Containers. Data Privacy. DevOps. Immutability. Mobile-first. The Internet of Things. Zero Trust.
Once upon a time, “script kiddies” were mocked because they couldn’t program in “proper” languages and didn’t understand how computers worked. They ended up winning the day.
Easier to learn.
Established frameworks to work within.
Development becoming paint by numbers.
An unsurprising outcome in hindsight. A simple cost-benefit equation.
Meanwhile, those “proper” languages like C++ and Java feature heavily in the job advertisements for megacorp legacy support engineers. Often offshore.
Today, few programmers understand how the machines that their programs run upon actually work, or even whether they physically exist. Just as few commuters understand how the cars, planes, and trains they depend upon for transport actually operate. Working seemingly by magic.
Quaint concepts from a decade ago, such as being safely inside a firewall or trusted on a network, have given way to a world where everything is on the internet, always, and nothing is trusted. Ever.
Many of the “best” practices from back then are the anti-patterns of today.
Many of the anti-patterns from back then aren’t far off the preferred design patterns today. It is funny how these things cycle! “Fashions and trends”, as the old department head had proclaimed long ago.
Mainframes were large central computers accessed via dumb terminals in the 1950s and 60s.
Virtual Desktop Environments were large central computers accessed via remote dumb terminals in the 1990s.
Today’s websites are essentially large central computers accessed via (mostly) dumb browsers.
Old ideas endlessly repackaged in shiny new ways. Each generation believing they have invented something extraordinary. Mostly they have merely incrementally enhanced what came before. Sometimes not even that, simply reinventing the wheel.
For my side project, my childish instinctive approach had been to jump back on the tools and code, just like I used to in my youth.
The only certainties were that I would burn loads of time. Never be finished. And have little of value to show for my efforts.
Then I thought about the things people are forever complaining about on technical sites I work at.
Budget constraints. Compromise. Legacy hell. Not doing things “properly”. Politics. Poor communication. Technical debt.
If I were going to do some programming for myself, then the only thing stopping me from doing things properly would be me.
Over the next week, I spent many hours unlearning much of my early career development approach and researching what “properly” currently looked like.
“Best” tools for the job.
Conscious of how accurate that “fashions and trends” observation had been, I attempted to look beyond the current fads, and focus on the things that would get the job done well.
Before long I had assembled a collection of useful toys with funny-sounding names. Few had been widely used the last time I had done any serious programming. Today they are ubiquitous.
Bitbucket. Confluence. D3. Django. Docker. Jira. Kafka. Kubernetes. Python. Spark. Terraform. Travis.
Meanwhile, many of those time-consuming ancillary tasks that developers used to worry about, which once took hours or days to resolve, have been automated out of mind or eliminated entirely.
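To give one illustration of what that automation looks like in practice (a minimal sketch, assuming a Python/Django project like the one my toolkit suggests; the file layout and commands are my own assumptions, not specifics from this project): a dozen declarative lines in a Dockerfile now capture an entire development environment that once took days of manual setup to reproduce.

```dockerfile
# Hypothetical sketch: the whole development environment, pinned and reproducible.
FROM python:3.9-slim

WORKDIR /app

# Install pinned dependencies once; every rebuild yields the identical environment.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and declare how to run it.
COPY . .
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
```

Anyone, on any machine, can then recreate the same environment with a single `docker build`, which is precisely the kind of ancillary toil that used to consume those hours and days.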
By the end of the week, I had done very little coding.
Yet I had managed to string together a collection of powerful tools that, when used correctly in pursuit of a well-defined goal, are capable of performing amazing feats.
Though the learning curve was steep, I enjoyed the process. Knocking some of the rust from my long-dormant computer torturing skills. Breathing new life and currency into my dated technical knowledge.
Most of all, it was refreshing to be doing the doing again.
Not spending every waking hour in meetings.
Not fighting for survival in the corporate Game of Thrones.
Not acting as agony aunt. Angry Dad. Cheerleader. Mentor. Traffic cop. Or troubleshooter-in-chief.
For one old dinosaur playing alone on his computer, this approach seems as jarringly out of place as my wearing that fairy princess outfit and singing along with my nieces to “Let it go!”.
Yet I long ago learned to apply the timeless approach of hoping for the best while preparing for the worst.
My professional career has taught me that the best results come from simple, repeatable, automated processes. Designing machines that, once built, reliably allow us to achieve our desired outcome again and again simply by turning a handle.
My side project may well start with a burst of enthusiasm, before fizzling out as the excitement fades. Or it may have legs, requiring the hiring of staff to cope with the workload as things expand and grow.
The processes and tooling I have put in place now will support either option. It will be fun to see where it goes.
Paul Graham once wrote:
“Work with people you want to become like, because you’ll become like whoever you work with.”
This thought struck a chord with me. It stung more than I am willing to admit.
The more I thought about it, the more I realised that it not only applies to who you work with, but also to the type of activities you invest your precious time in performing.
There are a great many “bullshit jobs” that consume vast numbers of hours performing low-value tasks.
Do those things for long enough, and before you realise it, that is all you are qualified to do.
Difficult to sell during an economic downturn.
Unsatisfying at the best of times. Unfulfilling. No fun at all.
Difficult to unlearn.
My most recent role involved an uncomfortable amount of that type of low-value activity. In contrast, the side project involves very little.
Getting the balance right is difficult, the result of choices.
Both those we consciously make, and those we avoid making.
- Graeber, D. (2018), ‘Bullshit Jobs: A Theory’, Simon & Schuster
- Graham, P. (2020), ‘Work with people you want to become like, because you’ll become like whoever you work with.’, Twitter
- Pixar Animation Studios (1986), ‘Luxo Jr.’, YouTube
- The Simpsons (1987), ‘Watching TV’, YouTube
- Victor, B. (2012), ‘Inventing on Principle’, YouTube
- Walt Disney Animation Studios (1928), ‘Steamboat Willie’, YouTube
- Walt Disney Animation Studios (2013), ‘Disney’s Frozen Teaser Trailer’, YouTube
- Walt Disney Animation Studios (2013), ‘Frozen’, IMDB