Doing the Right Thing Isn’t Natural

What would you do?

The alarm blares. The system — a system you helped design — says an attack is coming. Protocol demands you act immediately. Fear demands it too. Your training, your leaders, the flashing lights: all push you in one direction.

Do you follow the script? Or do you trust your judgment — and risk everything?

We like to think that doing the right thing comes naturally — that when faced with a decision, our values will automatically guide us.

But reality tells a different story.

In organizations everywhere, good people stay silent, risks are overlooked, and small warnings are missed. Not because they don't care, but because the environment around them makes doing the right thing harder than we admit.

Courage is not just an individual trait. It is a cultural product.

If we want ethical, resilient organizations, we have to build the conditions where integrity isn't an act of heroism — it's the everyday norm.

On September 26, 1983, Soviet Lieutenant Colonel Stanislav Petrov received the nightmare alert: U.S. missiles inbound. His orders were clear: report the launch immediately and set the machinery of nuclear retaliation in motion.

But Petrov didn’t act. He hesitated.

An engineer by training and one of the architects of the early warning system, Petrov knew something others didn't: the system was new, untested, and fallible. Logic said "follow orders." Training said "report." Fear screamed "act now."

Petrov chose something else: to think.

He judged the attack unlikely. He trusted his knowledge over the system. And he made the loneliest decision of his life — to report it as a false alarm.

He was right. The alarms were triggered by sunlight reflecting off clouds. His decision likely saved hundreds of millions of lives.

Most of us believe we would be brave like Petrov. But in truth, every day in our workplaces, we trust systems, procedures, and "the way things are" without question. We assume someone else will act. We assume safety is handled.

And that's when danger grows invisible.

The Myth of Rational Courage

We often carry an inner image of ourselves as principled, rational actors, ready to live by our values when the moment calls. But James G. March, one of the greatest organizational thinkers, showed how deeply flawed this self-image can be. March argued that much of human behavior is not driven by careful weighing of consequences ("logic of consequences") or by conscious moral reflection. Instead, we often act based on a quieter mechanism: the "logic of appropriateness" — an internal script of what "someone like me" does in "a situation like this."

Crucially, the content of that script is shaped by our environment.

In psychologically unsafe environments, the appropriate behavior might be to stay silent, protect oneself, and avoid risk. In psychologically safe environments, the appropriate behavior can align with courage, integrity, and speaking up.

Our real challenge, then, is not just to trust that people will do the right thing, but to build cultures where doing the right thing is what feels natural.

In high-stakes moments like Petrov's, where the connection between values and action is immediate and unavoidable, the choice stands stark and clear. But in our world, the decisions are usually smaller, quieter, less urgent. Reporting a security flaw. Questioning a shortcut. Challenging a risky decision. These acts feel minor, their connection to our larger values blurred.

And when the personal risk — to reputation, relationships, career — feels immediate and real, while the consequences of silence seem distant and abstract, it's easy to rationalize doing nothing. Easy to tell ourselves the decision wasn't important. That it wasn't really our responsibility.

This everyday drift between our professed values and our quiet actions is not because we are bad people. It's because we are human — and humans are exquisitely talented at rationalizing their choices when fear or uncertainty enters the picture.

Doing the right thing isn't natural. It's fragile. It requires support.

The Role of Psychological Safety

This is where psychological safety becomes critical — not just as a “nice-to-have,” but as the bridge between our values and our actions.

Psychological safety empowers individuals to act according to their better selves. It softens the fear that warps our decision-making. Without it, even people whose "logic of appropriateness" points toward doing the right thing may find themselves paralyzed by the "logic of consequences": the calculation of what acting will cost them personally.

In an environment without psychological safety, people don't ask, "What is the right thing to do?" They ask, "What will happen to me if I speak up?" Fear transforms small, value-driven decisions into perceived existential threats.

But in a psychologically safe environment, individuals feel supported to close the gap between their internal values and their external behavior. They are more likely to:

  • Recognize when something is wrong.

  • Trust their own judgment even when the group is silent.

  • Act in alignment with the organization's mission, not just their own self-preservation.

Creating psychological safety doesn't guarantee that people will always make the courageous choice. But it dramatically increases the chances that, when faced with the quiet, cumulative decisions that shape organizational security and ethics, individuals will feel able — and expected — to act with integrity.

Bridging the Gap: What You Can Do

As an individual:

  • Recognize that your instinct to "trust the system" is natural but not infallible.

  • Regularly ask yourself: "What is the right thing to do here?" not just "What’s safest for me?"

  • Find allies who value integrity over comfort.

  • Practice small acts of speaking up — even when it’s uncomfortable — to build your own courage muscle.

As a leader:

  • Foster psychological safety explicitly: reward truth-telling, curiosity, and constructive dissent.

  • Make "doing the right thing" central to how roles in your organization are framed, not just "achieving results."

  • Model vulnerability: admit mistakes, ask for feedback, and show that learning outweighs blame.

  • Create rituals and structures (like safety checks or ethics moments) that normalize critical questioning.

In Conclusion

Doing the right thing isn't natural. It doesn't happen automatically.

By understanding our human decision-making biases and deliberately building environments where psychological safety thrives, we can move from false safety to true resilience. Not by being superhuman, but by being a little more humble, a little more deliberate, and a lot more human.

Our safety depends on it.
