The Value of Mistakes
“An Expensive Education” – A Timeless IBM Leadership Tale and What Cybersecurity Has in Common with First Aid Response
In the mid-20th century, under the visionary leadership of Thomas J. Watson Sr., IBM was building not only machines but also a culture of boldness, curiosity, and character.
One day, a young salesman made a mistake. A big one, the kind that stings on a balance sheet. According to company lore, it cost IBM somewhere between $600,000 and $12 million, depending on who's telling the story.
Shaken by the fallout, the employee did what many in his position might: he walked into Watson’s office, apologized profusely, and offered his resignation.
Watson looked at him quietly and said:
“Resign? Now? After we just invested [millions] in your education?”
No reprimand. No lecture. Just a powerful reframing. The salesman left not just still employed—but transformed. Mistakes, in the right hands, become tuition. And leaders who understand this cultivate loyalty, innovation, and a culture of growth that outlasts any single transaction.
Mistakes can be costly. But if you hide them, you lose the ability to learn from them, and the money spent fixing them is wasted. Learning from the error can turn that cost into an investment.
I once had an experience that taught me this lesson profoundly. Years ago, I worked as part of an ambulance crew—back when paramedic training wasn’t what it is today. We had a handbook, Emergency Care in the Streets, and we practiced it to the best of our ability.
We responded to a call about a child suffering from convulsions. I was first on the scene and found a young mother holding her five-month-old son, bundled in a duvet and blankets. I recognized the symptoms of febrile seizures (fever cramps) immediately.
By the book, I reacted: I gently unwrapped him, carried him to the bathroom, and ran a shower at a temperature just below normal body heat. As I held him under the mist, the cramps eased, his color returned, and calm was restored. I was soaked—but the child stabilized, and we took both him and his mother to the hospital.
Expecting praise, I was surprised when my supervisor pulled me aside and said:
“You had two patients: the child, and the mother. You treated the child—by the book—but you overlooked the mother's trauma. You didn’t speak to her, didn’t reassure her. You made her feel like a bad parent for trying to keep her child warm.”
It stung. But once I swallowed my pride, I saw he was right. That moment changed how I respond to crises—not just as someone first on an accident scene, but professionally and personally.
Cybersecurity operations follow the same arc:
Observe before engaging – Know your environment.
Analyze before reacting – Context matters.
Act decisively – Use your tools and training.
Debrief and learn – Even success holds insights.
Mistakes will happen. Security will never be perfect. Risk assessments may miss the mark. Openness and accountability are vital—they foster resilience and adaptation.
Leading cyber standards don’t just aim to prevent incidents; they require structured “lessons learned” after every event. Here are a few examples:
ISO/IEC 27001: mandates post-incident reviews & corrective actions
NIS2 Directive: introduces legally binding after-action requirements
DORA (Digital Operational Resilience Act): demands that financial entities test and learn from real and simulated crises
Cyber Essentials: encourages organizations to assess failures and improve
They all share one principle: Turn mistakes into insight.
Value your mistakes. They expose gaps, they teach humility, they strengthen people and systems.
IBM didn’t fire; they invested.
As a first responder, I found that a tough critique changed everything.
Cybersecurity frameworks mandate reflection.
In every domain—from boardroom to ambulance to SOC—the path to excellence is paved with lessons, not with perfection.