Volume 40 Number 2 | April 2026
Summary
The article explains how parallel processing affects medical laboratory work and why errors often stem from cognitive overload rather than individual negligence. Drawing on human factors and neurobiology, it urges root cause analyses to examine system‑imposed cognitive demands, interruptions, and missing supports. Understanding these pressures leads to more ethical, effective approaches to improving safety and reducing error.
Julie Freidhoff, MS, MLS(ASCP), Volunteer Contributor

She was completely unaware that her door had opened. Unaware of me standing in her doorway. Unaware of my arm grandly gesturing for her attention.
“Hello?!” How could she not see me? I was in her line of sight. “Is she ignoring me?” I wondered.
She was so focused on all her work, peripheral vision narrowed as she concentrated on the needle going through the bead—anticipating her next step, responding to her friend in the chat function of her game, then moving her character to acquire the newly hatched fruit.
She happened to move her head just enough for me to suddenly appear in her line of sight. Her body jumped. Startled, she smiled sheepishly as she turned down her music. “Ugh … what?”
There was a part of me that wanted to give in to the lightning-fast, reactionary inner mom dialogue:
Are you kidding me right now?
I’ve been standing here trying to get your attention forever!
Why were you ignoring me?
But I didn’t. (It certainly helped that she smiled when she responded!) I smiled back and gently said, “Lunch is ready.”
Does this sound familiar? At home? Maybe at work?

“What appears as singular focus is often parallel processing—with attention distributed across sensory channels and procedural tasks running in the background.”
From a relational neurobiology perspective, this matters because the brain is constantly making one primary assessment: Am I safe enough to stay engaged? When the answer is yes, attention can be flexibly distributed, emotion is regulated, and executive functioning is accessible.
Another term that fits this scenario is multimodal attention—attending to different sensory channels at once (auditory music, visual tutorial, fine motor task). Or perhaps you are more familiar with divided attention. This classic term is situationally dependent: people with expertise can divide attention effectively, but only when tasks draw on different cognitive resources and the nervous system remains regulated.
This distinction is well supported in the human factors literature. Cognitive psychologist Christopher D. Wickens, known for his Multiple Resource Theory, demonstrated that humans can process information in parallel only when tasks draw on separate cognitive resources, for example, combining visual work with auditory input. When tasks compete for the same resources, such as two visual–manual demands or simultaneous decision-making requirements, the brain encounters serial bottlenecks rather than true parallel processing. Performance slows, workload increases, and error likelihood rises—not because of individual failure, but because of predictable limits in human neurobiology.
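Wickens' point can be illustrated with a small sketch. The dimensions below (processing stage, perceptual modality, processing code) follow his model, but the scoring and the example task profiles are illustrative assumptions, not his published computational values:

```python
# Toy sketch of Multiple Resource Theory-style interference scoring.
# Dimension names follow Wickens' model; the simple one-point-per-overlap
# scoring and the task profiles below are illustrative, not published values.

def conflict_score(task_a: dict, task_b: dict) -> int:
    """Count shared resource dimensions between two concurrent tasks.

    Each shared dimension adds one point of interference: more overlap
    means more competition for the same cognitive resource, and hence
    a serial bottleneck rather than true parallel processing.
    """
    return sum(1 for dim in ("stage", "modality", "code")
               if task_a.get(dim) == task_b.get(dim))

# Two visual tasks (e.g., pipetting while reading a worklist)
pipetting = {"stage": "response", "modality": "visual", "code": "spatial"}
reading = {"stage": "perception", "modality": "visual", "code": "verbal"}

# A visual-manual task paired with an auditory one (benchwork plus music)
music = {"stage": "perception", "modality": "auditory", "code": "verbal"}

print(conflict_score(pipetting, reading))  # -> 1: shared visual channel
print(conflict_score(pipetting, music))    # -> 0: separate resources
```

The higher score for the two visual tasks mirrors the article's claim: pairing demands that draw on the same channel produces bottlenecks, while pairing across channels can proceed in parallel.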
In today’s lab, new employees are expected to master parallel processing without first mastering the benchwork. They have not yet become experts, and expecting them to safely carry this load is a design flaw.
“Parallel processing is not inherently distracting—it’s a neurobiological strategy that supports regulation and efficiency or, when capacity is exceeded, shifts the brain from learning and connection into survival.”
Inside the Invisible Work
Scene: a person sitting on her bed, listening to music; laptop open with a tutorial; iPad gaming; bead organizers splayed; threading beads.
From the outside, it looked like quiet, focused work. Inside, multiple layers were happening at once: the music stabilized arousal, the tutorial guided attention, procedural memory carried hand movements, and her brain anticipated and adjusted for every slip of thread.
If you’ve ever cooked while carrying on a conversation, driven while listening to a podcast, or prepared a specimen while mentally rehearsing next steps, you understand this experience. We live in a state of parallel processing far more often than we realize. Most of the time, we don’t call it that. We just call it doing our jobs—while monitoring instruments, documenting results, responding to questions, managing time pressure, maintaining accuracy, regulating emotion, and staying patient-focused. All at the same time. Oh my.
And that casual naming matters. When parallel processing remains unnamed and unmeasured, new tasks are quietly added, expectations compound, and open cognitive loops grow without acknowledgment. Capacity is not infinite. Every brain—regardless of experience or neurotype—can carry only so much load. When systems continue to add invisible work without protecting slack, the risk of error accumulates.
Relational neurobiology reminds us that executive functioning is state-dependent. When cognitive load remains within capacity and the nervous system stays regulated, these layered demands can feel fluid. When they exceed capacity, survival and emotional responses take over, and decision-making narrows.
Human Factors: Seeing What Really Happened
Human factors asks a deceptively simple question: “What did the system require the human brain to do at that moment?”
Relational neurobiology adds: “What state was the nervous system in while doing it?”
Parallel processing often shows up in traditional root cause analysis (RCA) as an active failure. In reality, it is usually a latent condition created by system design:
- Simultaneous monitoring, documentation, and communication
- Cognitive tasks layered with emotional labor
- Frequent interruptions treated as background noise
As load accumulates, the nervous system may shift out of a cognitive, relational state and into an emotional or survival state. Access to the prefrontal cortex, our center for executive functioning, working memory, and flexible thinking, becomes limited. The brain is no longer prioritizing learning or connection; it is prioritizing safety. Even basic perceptual processes can be affected: saccades, the rapid eye movements that guide visual scanning, become less efficient under stress or cognitive overload, meaning the brain may literally miss visual information in the environment. That’s why attributing error to “loss of focus” or instructing someone to simply “read the SOP” may feel satisfying, but it is incomplete and misaligned with human neurobiology.
Cognitive Load: What RCA Often Misses
Traditional RCA asks: What step failed? Who was responsible?
A human factors lens asks instead: How many cognitive streams were active at the same time? Which tasks required executive control versus procedural memory? What was competing for attention, both seen and unseen?
Even when an experienced brain thrives on divided attention, the capacity to handle multiple streams is not limitless. Errors still happen, especially when systems add pressure, layer on invisible tasks, compound expectations, and remove protective slack.
Why It Made Sense at the Time
Behavior is locally rational. As cognitive load increases, executive function becomes strained; emotional reactivity rises; survival responses dominate; and efficiency and habit (muscle memory) take precedence.
From the outside, narrowed situational awareness may look like negligence. From the inside, it often feels like doing the best possible thing with the available capacity. Recognizing this distinction is both ethically and practically crucial.
Supports We Don’t Notice Until They’re Gone
Routine patterns, rhythmic motion, music, and background regulation are not distractions; they are stabilizers. Removing them without understanding their role pushes the nervous system closer to emotional or survival states, reducing access to executive functioning at exactly the moment precision is needed.
“Which parallel processes were quietly supporting regulation and performance, and were they altered or removed?”
Interruptions, Slack, and Fragility
Interruptions disrupt working memory and trigger stress responses. Safe resumption requires cues, structure, and time. If another interruption occurs before recovery is complete, the effect compounds, further straining attention and impairing executive functions like decision-making and language skills. In that moment, even a simple request for help becomes an unreasonable expectation. Systems without slack leave the nervous system little margin to recover, increasing the risk that minor deviations cascade into mistakes. Parallel processing consumes slack even when nothing goes wrong.
Not all “interruptions” are harmful. A colleague stopping by to check in or offer help can be protective—providing connection, co-regulation, and a sense of felt safety that supports cognitive functioning.
If a system only works when cognitive capacity is maxed out, it isn’t efficient—it’s fragile.
Changing the Story We Tell about Error
Language shapes learning and nervous system response. In RCA discussions, framing observations with a human factors lens can shift the conversation from blame to understanding. Consider statements like:
- “The task required sustained parallel processing across multiple cognitive domains.”
- “Cognitive load exceeded what the system was designed to support.”
- “The response was consistent with efficiency under pressure.”
To make RCA more effective, teams can also ask:
- “What parallel tasks—visible and invisible—was the individual managing at the moment of the event?”
- “Which cognitive, emotional, or sensory demands were layered at the same time?”
- “Were there recovery supports, cues, or slack in the system, and were they sufficient?”
- “Did any interruptions or environmental demands compound the cognitive load?”
These prompts guide teams to examine what the system required the human brain to do, not just what went wrong. The conversation shifts from blame toward understanding how work is actually done—and toward designing systems that keep people in regulated, cognitive, relational states as long as possible.
A Gentle Invitation
Parallel processing is not a flaw to correct—it is a reality to design for. When systems fail to acknowledge the invisible cognitive work, errors are predictable, not personal failings.

The next time an error occurs, the question is not “Why did they lose focus?” but rather: “What was the system asking of their nervous system in that moment?”
Viewing error through this lens is not just practical; it is ethical, supporting both quality and patient safety. If that question gives you pause—if it feels uncomfortably familiar—you’re already doing the work this article hopes to invite.
Julie Freidhoff is a Coach and Consultant at SafeSpace Coaching and Consulting in Rochester, Minnesota.