Back in the early 1970s, when programmers actually typed their code and computers didn’t pretend to be clever, we had a simple rule:
“If it works, don’t touch it.”
(Modern translation: Ship it, pray, and blame the intern.)
The Golden Age of Restraint
We wrote software knowing every byte cost money and every cycle mattered. Edsger Dijkstra solemnly reminded us that unstructured programming was sinful. Niklaus Wirth offered entire languages designed to prevent us from shooting ourselves in the foot.
Naturally, the industry ignored both of them immediately.
C, C++, and the Worship of Footguns
In 1972, Dennis Ritchie gave us C, a language built on the refreshing principle that “safety checks are for cowards.” Its spiritual successor, C++, arrived about a decade later, a language best described as:
“What if an octopus tried to nail extra legs onto a dog, for performance reasons?”
Programmers adored it. Managers adored that programmers adored it. Industry veterans wept quietly.
Ada existed, of course. Strong typing, contracts, verification, correctness guarantees — utterly unacceptable. It was almost as if Ada was trying to make software reliable.
We couldn’t have that.
The 1990s: The Web, Where Hope Went to Rot
Then came the Web, an environment built on three pillars:
- A document markup language accidentally repurposed as an application platform.
- A scripting language invented in ten days.
- The belief that “users won’t click that.”
By this point, any illusions that correctness mattered had fully evaporated.
The motto became: “If it compiles, deploy it.”
The 2000s–2010s: Automation, But Make It Worse
Automation arrived as the promised savior. Continuous Integration! Continuous Deployment!
Finally, we would eliminate human error.
Instead, we automated it.
A misconfigured YAML file could now take down an entire multinational business in under a minute. A malformed JSON blob could melt a data center. A stray space in a Terraform file could reconfigure half of Europe.
Efficiency had been achieved.
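For anyone wondering how much damage whitespace can really do, here is a minimal sketch in Python (assuming the third-party PyYAML package; the job name and keys are invented for illustration, not any real CI system’s schema). One stray dedent and the “wait for a human” guard quietly stops applying to the deploy job:

```python
# A sketch of the failure mode, assuming the third-party PyYAML package.
import yaml

intended = """
deploy:
  environment: production
  when: manual        # a human must press the button
"""

# Same keys, one stray dedent: "when" is now a top-level key,
# so the deploy job no longer waits for anyone.
actually_shipped = """
deploy:
  environment: production
when: manual
"""

print(yaml.safe_load(intended))
# -> {'deploy': {'environment': 'production', 'when': 'manual'}}
print(yaml.safe_load(actually_shipped))
# -> {'deploy': {'environment': 'production'}, 'when': 'manual'}
```

Both files look fine in a diff at 2 a.m. Only one of them asks permission before deploying.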
2020s: Welcome to Global Single Points of Failure
As of 2025, we have perfected the art of centralisation:
- One CDN that everyone uses
- One DNS provider that everyone relies on
- One cloud that everyone deploys to
- One shared belief that “this time it’ll work fine”
And when it doesn’t?
A small typo in a bot-mitigation config can — and recently did — break major sections of the global internet. A marvel of engineering, really. The blast radius has grown from “local server room smells like burning” to “half the planet loses access to AI chatbots for an hour.”
This is progress.
The State of Software Today
We have:
- Faster machines
- Larger frameworks
- Longer dependency trees
- More abstraction layers
- And the exact same bugs we had in 1973, but now distributed at light speed
Modern systems are “self-healing,” meaning they break in new and exciting ways.
We have “microservices,” meaning your bugs can now collaborate.
We have “DevOps,” meaning everyone is responsible, therefore no one is responsible.
In Conclusion:
Fifty years later, computing has evolved magnificently — aesthetically, at least. Terminals are prettier. Errors are friendlier. The code is more colorful.
Under the hood?
We’re still duct-taping complexity together with hope, writing billion-dollar systems in languages originally meant for toy projects, and deploying at global scale with the same cautious discipline one uses when opening a can of petrol near a campfire.
If Knuth and Wirth were writing today, they’d likely say:
“The art of computer programming has advanced tremendously —
except for the computing part, and the programming part.”