Safety on Purpose
Podcast where safety meets leadership, culture, and human connection. Hosted by Joe Garcia—speaker, culture advocate, and safety leader—this show dives beyond checklists and compliance to explore what really keeps people safe: purpose-driven leadership, trust, communication, and mindset.
Just Culture in Action - From Blame to Learning
We flip safety’s script from blame to learning and lay out a practical model for just culture that blends accountability with compassion. Two case studies show how fixing systems cuts harm, boosts reporting, and builds trust without letting reckless behavior slide.
• why blame shuts down reporting and learning
• symptoms of blame culture across teams and leadership
• definition of just culture and its balance of fairness and accountability
• differentiating human error, at‑risk behavior and reckless behavior
• leadership behaviors that create psychological safety
• manufacturing case: redesigning guards and targets to cut jams
• decision tree for fair, consistent post‑incident responses
• healthcare case: label redesign, backup scanning and stronger reporting
• five practices to embed learning into daily work
• a 48‑hour challenge to start culture change now
If today's episode flips a switch for you, do two things. Share it with someone who still starts incident reviews with "who did it?" Then subscribe and review. It helps more leaders move from blame to learning.
Hosted by: Joe Garcia, Safety Leader & Culture Advocate
New Episodes Every Other Tuesday
Safety on Purpose
Follow & Connect:
🔸 Instagram
🔸 LinkedIn: Joe Garcia
🔸 Spotify | Apple Podcasts: Search "Safety on Purpose"
Welcome to Safety on Purpose, the show where we turn rules into relationships and compliance into commitment. I'm your host, Joe Garcia, and today we're tackling a topic that flips traditional safety on its head: Just Culture in Action, from blame to learning. If you've ever sat in an incident review that felt like a witch hunt, or watched people hide near misses because they feared the fallout, then today's conversation is for you. Buckle up, let's dive in.

Why does blame fail in safety? Blame is a shortcut. It gives the illusion of resolution without real understanding. When something goes wrong, it's tempting to point a finger, to find the who instead of understanding the why. But here's the problem: blame shuts down learning. When people fear being blamed, they hide mistakes. They under-report near misses, or don't report them at all. They avoid speaking up about hazards. They stick to the rules even when they know the rules don't reflect reality. Blame creates silence, and silence is dangerous in any high-risk work environment. At its core, safety isn't just about preventing bad outcomes, it's about learning from all outcomes. Blame prevents that learning; it focuses on the past instead of improving the future.

Let's talk about the symptoms of a blame culture. If your organization is experiencing any of the following, blame might be quietly driving the culture. Incident investigations stop at human error, at "they should have known better," instead of asking why that decision made sense to that person at that time. People hesitate to report injuries or near misses because they fear discipline, retaliation, or being labeled careless. Leadership focuses on rule violations, not system gaps. Policies are treated as infallible even when they deviate from what is really happening, so every failure looks like a failure of the person rather than the system. Safety is weaponized, used to punish instead of protect. And frontline workers feel disconnected from safety decisions; they're told what to do, not asked what they need.

The alternative is a learning culture. Instead of blame, progressive safety cultures focus on curiosity over judgment, conversations over conclusions, context over control, and learning over liability. People aren't the problem; they're the source of insight. The goal is not to assign fault, but to understand how the system failed to support safe decisions. Because in the end, you can punish error or you can learn from it. You can't do both.

So what exactly is just culture? Just culture is the foundation of a high-trust, high-performing organization. It strikes a balance between accountability and learning. It recognizes that humans are fallible, even in well-designed systems, and that our response to mistakes determines the future of safety, trust, and engagement. Instead of asking who messed up, a just culture asks: what was the context? Why did this make sense to that person at that time? Was this a system issue, a behavioral choice, or a skill gap? It's a mindset and an operating model that encourages reporting, learning, and fairness without letting personal accountability fall by the wayside.

Let's talk about some core principles of just culture. Number one: people make mistakes; systems enable or prevent them. Humans are not perfect, we all know that. A just culture recognizes that most errors are unintentional and often caused or enabled by system weaknesses. We don't fix people, we fix systems so people can succeed. Number two: differentiate between human error, at-risk behavior, and reckless behavior.
Not all actions are the same, and neither should be the responses. Human error covers slips, lapses, and honest mistakes; we support the person, figure out what led to the error, and improve the system around them. At-risk behavior means taking shortcuts without fully recognizing the risk; we coach, educate, and reframe that risk perception. Reckless behavior is conscious disregard for a known risk; we address it through appropriate accountability and hold people accountable for that reckless choice. This framework removes emotion and subjectivity, helping leaders respond fairly and consistently.

Number three: learning is more valuable than blaming. Blame stops the conversation; learning opens it. When you prioritize learning over punishment, you improve reporting rates, gain insight into hidden issues, empower employees to speak up, and build organizational resilience. Every incident, near miss, and deviation is a gift wrapped in insight, if we're willing to look deeper.

Number four: leaders set the tone. In a just culture, leadership behaviors matter. Leaders must respond to mistakes with curiosity, not punishment; model humility and transparency; separate outcomes from intent; acknowledge systemic influences; and create psychological safety. Your response to failure teaches your people how safe it is to speak the truth.

Number five: accountability and compassion can coexist. Just culture is not a free pass. It's not about avoiding consequences, it's about assigning them fairly. The goal is to hold people accountable for their choices, not for being human. A healthy organization doesn't swing between punishment and protection. It lives in the space of trust, fairness, and shared responsibility.

So why does just culture matter? It isn't just a safety strategy, it's a leadership imperative. It builds trust, boosts morale, and ultimately creates a workplace where people feel safe to speak up, systems continually improve, accountability is fair and clear, and safety is owned by everyone. Because when people feel safe, they perform better. When they trust leadership, they engage more fully in the mission of safety, growth, and care.

So I have a story from back in my consulting days. At one manufacturing site, a packaging line jammed every Friday, and one day an actual injury occurred to somebody's hand. The traditional reaction would have been: was it operator error, do we discipline, do we need to retrain, where's the focus? Let's look at how a just culture approach would have handled it. The just culture shift starts by gathering a learning team: the operator, a maintenance tech, the supervisor. Then we ask: walk us through what made sense in that moment. And the findings? The guard was frequently removed for minor jams, driven by a 90-second restart expectation that, in reality, was never going to happen. The guard bolts were stripped and couldn't be tightened, because people were rushing, trying to get everything done quickly so that 90 seconds could become a reality. What was really happening is that hitting the 90-second goal had become more important than anything else, and that's where the focus was lost.
So engineering added quick-release guards with safety interlocks, and it helped a lot. The target was reset to a more realistic three-minute restart time on that machine. And the operator began to be recognized for his honesty in saying, hey, this 90 seconds isn't real, we need to be realistic: if you want me to be safe, if you want me to do things the way they should be done, I can't be trying to make things happen that quickly, or I'm going to end up taking shortcuts and doing things the wrong way instead of the right, safe way. The result: a 75% reduction in jams and zero injuries in 18 months. When systems change, behaviors change.

Now let's talk about the just culture decision tree. I don't know how many of you have heard of it, but the decision tree is a structured tool used by leaders and safety professionals to analyze behavior after an incident and guide a fair, consistent response. It helps separate human error from at-risk or reckless behavior, ensuring accountability without blame. So how does it work? The decision tree walks through a series of questions. Number one: was the action intentional? If yes, explore whether it was a rule violation and why that happened. Number two: did the individual know the risk? If no, it may be at-risk behavior or a training issue, and we need to retrain. Number three: were similar past actions supported or rewarded? If yes, it might be a cultural or system flaw, not an individual failure. Number four: was the behavior reckless? If there was conscious disregard for a known risk, then disciplinary action may be appropriate. Why is this valuable? Why do we use the decision tree? Because it promotes fairness and consistency and removes bias and emotion from decision making, and we all know how difficult it is to remove our own bias and emotion from any decision. It encourages learning and improvement over punishment, and it reinforces psychological safety and trust. In short, the just culture decision tree helps organizations respond to mistakes with clarity, fairness, and empathy, not blame.
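If it helps to see that branching laid out concretely, here is a minimal sketch in Python of how those four questions could be strung together. The function name, category labels, suggested responses, and the final "intentional deviation" branch are illustrative assumptions, not an official Just Culture tool, so treat it as a thinking aid rather than a definitive implementation.

```python
# Illustrative sketch of the decision tree described above.
# Labels and suggested responses are assumptions made for the example.
from dataclasses import dataclass


@dataclass
class Assessment:
    classification: str       # e.g. "human error", "at-risk behavior", "reckless behavior"
    suggested_response: str   # a fair, consistent next step


def walk_decision_tree(
    intentional: bool,
    knew_the_risk: bool,
    similar_actions_rewarded: bool,
    conscious_disregard: bool,
) -> Assessment:
    """Walk the four questions from the episode and suggest a response."""
    # Q1: Was the action intentional? If not, treat it as human error.
    if not intentional:
        return Assessment(
            "human error",
            "Support the person, investigate the system conditions, and fix the process.",
        )

    # Q2: Did the individual know the risk? If not, it's at-risk behavior / a training gap.
    if not knew_the_risk:
        return Assessment(
            "at-risk behavior",
            "Coach, educate, and reframe the risk perception; consider retraining.",
        )

    # Q3: Were similar past actions supported or rewarded? If so, look at the system.
    if similar_actions_rewarded:
        return Assessment(
            "system / cultural flaw",
            "Address the norms, targets, or incentives that made the behavior acceptable.",
        )

    # Q4: Was there conscious disregard for a known risk? If so, accountability applies.
    if conscious_disregard:
        return Assessment(
            "reckless behavior",
            "Apply fair, proportionate disciplinary action alongside any system fixes.",
        )

    # Intentional, risk known, no reckless disregard: a deliberate deviation worth
    # exploring as a possible rule or procedure problem.
    return Assessment(
        "intentional deviation",
        "Explore why the rule was violated and whether the rule itself reflects reality.",
    )


if __name__ == "__main__":
    # Example: the packaging-line guard removal driven by the 90-second restart target.
    print(walk_decision_tree(
        intentional=True,
        knew_the_risk=True,
        similar_actions_rewarded=True,   # the unrealistic target rewarded shortcuts
        conscious_disregard=False,
    ))
```

Run on the packaging-line story, the sketch lands on the system rather than the person: the guard removal was intentional and the risk was known, but the 90-second target had effectively rewarded the shortcut, which is exactly the kind of cultural or system flaw the third question is there to surface.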
Here's a second story, this time from healthcare. A nurse had administered ten times the intended dose to a patient. Fortunately, the patient was not harmed; they survived and everything was fine. The traditional route for this kind of event was termination; there was no gray area, it was normally termination. So let's look at what the investigation found. There were two vials that looked exactly the same. The barcode scanner used to verify each vial was not working; it was down for maintenance. And then, of course, what we've heard over the years: staffing shortages. They didn't have enough people, nurses were covering too many areas, and this particular nurse was covering two halls, understaffed and overworked. So let's talk about the behavior classification. Was this at-risk behavior? Yes, it definitely was, compounded by system design, because there weren't enough people in that area; one person had to carry all of that weight, and that's just way too much. As for the actions, they relabeled the vials in bold colors to tell them apart: instead of two vials that look exactly the same, there's now an orange one and a blue one, a complete difference. A backup scanner was purchased, so when the primary scanner goes down, there's a backup for it. And they debriefed the team. The nurse ultimately kept her job and became a trainer, sharing the lesson she learned from that day. At the six-month follow-up, there was a 40% increase in voluntary near-miss reports, a visible culture of learning. It was a horrible situation; ten times the dose is not what we want to happen, right? But in the end, the system got better and everybody learned from the mistake. We were fortunate this wasn't an error that cost somebody their life, and it allowed them to train more people not to make that same mistake. Ultimately, in a just culture, we want people to learn, not to get blamed. So this is another perfect example of how just culture can become a learning culture rather than a blame culture.

Now let's talk about five practices to embed learning into your safety culture. Number one: after-action reviews. Conduct quick, structured reflections after incidents, near misses, or even successful operations. Sometimes when you have a success, it takes real effort to figure out what happened, and we want to figure that out so we know how to repeat it. Ask what happened, why, and what we can do differently next time. Make learning a routine, not a reaction. Number two: psychological safety. Foster an environment where people feel safe to speak up, share concerns, and admit mistakes without fear of punishment. Learning can't happen where silence thrives. Number three: learning-focused investigations. Shift from "who is to blame" to "what allowed this to happen." Investigations should uncover systemic gaps, not just individual actions. Number four: systemic fixes over quick fixes. Focus on fixing the process, not just the person. Embed the lessons in policies and design them into training so that learning sticks and the same errors and mistakes don't keep happening; if we can fix it in the policy and in the training, it shouldn't happen again. And number five: feedback loops. Close the loop with those involved. Share what was learned and what was changed because of it. This builds trust and reinforces a true learning culture. These practices move organizations from reactive to proactive, from blame to improvement, and from policy-driven to people-centered.

So I want you to challenge yourself with four things over the next 48 hours. Number one, pick one recent incident, big or small. Number two, map it out using the decision tree. Number three, ask two "what made sense?" questions with the team involved. And number four, share one system change, however minor, by the end of the week. Then watch what happens to trust. Remember: blame is fast, learning is lasting. Human error is inevitable, but learning is optional, and leaders choose whether that option is on the table.

If today's episode flips a switch for you, do two things. Share it with someone who still starts incident reviews with "who did it?" Then subscribe and review; it helps more leaders move from blame to learning. Again, I'm Joe Garcia. Thanks for leading safety on purpose. Until next time, stay curious, stay human, and keep learning.