Any boater who spends time online has seen the boat-fail videos at Haulover Inlet in Florida. It’s easy, and even fun, to critique them and wonder what you would have done differently. There is no doubt that many of those boaters lack basic boathandling skills (and I truly mean BASIC!). And some of them are probably (literally) showboating for the camera. But these videos also highlight the fact that the majority of boating incidents and accidents are caused directly by the operator, not by some mechanical issue. What are some of these causes?
These human-factors issues can be broken down into several elements.
There is the decision-making process:
- Situational awareness
- Evaluation of the options available to you
- Choosing an option and acting on it
- Evaluating whether the decision produced the desired result
And a factor that affects every stage of decision making:
- Risk Management
I’m definitely not a psychologist, but being licensed as both a pilot and mariner I’ve always been interested in the human factors portions of both aviation and maritime operations. In both operating areas, there has been much study of the factors that lead to poor outcomes. These are some of those topics:
- Normalization of Deviance
- Expectation Bias
- Continuation Bias
- Inattentional Blindness
- Change Blindness
Normalization of deviance happens in many aspects of our daily lives, at work or at home. It is characterized by accepting a lower standard of performance to “get the job done,” and eventually that lower standard becomes the new norm. The effect can start with very subtle relaxations of performance and spiral into a deeper departure from the original standard.
In the boating world, an example might be adherence to a rigorous pre-departure checklist. This checklist would include verifying that the raw water valves are open for engine cooling, and so on. Under pressure from passengers to get going, a boater might become less rigorous about completing the checklist. The process gets glossed over, and soon the norm has shifted to completing some checklist items from memory. Every time this succeeds, it becomes easier to do the next time. That becomes the new norm, and all is fine until that one day when a prior maintenance project has left a raw water valve closed. The engine is started and promptly cooked from overheating, or at the very least the water pump impeller is destroyed.
Expectation bias happens when a person’s initial thoughts or perceptions of what is going to happen influence their future behavior. In aviation it is defined as “having a strong belief or mindset towards a particular outcome.” When things become repetitive, a person can become blind to anything that differs from what is expected.
There are plenty of examples in the maritime world. Imagine that you often work with the same crewmembers on a boat, and the routine when docking has been to first make fast Line 2 (the after bow spring). Now suppose today’s crosswind is such that you want Line 1 made fast first. You might ask the crew to make Line 1 (the bow line), and they repeat back Line 2. They didn’t properly hear what you said, and you didn’t properly hear what they repeated back, because it was different from what you expected. Nobody realizes it until the boat is behaving differently than planned, and by then it may be too late to avoid an incident.
Continuation bias is defined as the unconscious cognitive bias to continue with the original plan in spite of changing conditions. This is also sometimes called “get-home-itis” – and the effect is stronger the closer you are to the completion of the task.
This is a big one, and I believe it is a very common cause of incidents. It can operate on a large scale, such as when approaching a harbor in fog or other adverse conditions. The operator is “so close,” with “just a bit more to go,” and hazards are not properly assessed until too late.
It can also happen on a very small scale, such as pulling the boat into a slip. On a windy day, you JUST WANT TO BE DONE and get her in there. The approach might not be ideal, and it’s obvious that the boat is going to strike a piling or another boat, but the desire to finish is so strong that the risk is ignored. This is different from lacking the basic skill to complete the evolution. The skill might be there, but fatigue and other factors are allowing this bias to cloud judgement.
Inattentional blindness is defined as the failure to perceive an unexpected stimulus in plain sight, purely as a result of a lack of attention rather than any vision defect or deficit.
A great example is the well-known “invisible gorilla” selective-attention test, in which viewers are asked to count the passes made by basketball players wearing white. If you’re like most folks who take the test, you become so focused on the white players that you fail to see the person in a gorilla suit walk right through the scene, even though it is in plain sight.
Episodes of inattentional blindness are very often characterized by a high level of focus on something else. What’s a boating example? Imagine a tear in the jib, or a loose anchor on the bow, has totally captured your visual attention. You might completely miss a developing collision situation with a crossing boat right in front of you.
Another SUPER common cause is distraction from cell phone usage. We’ve all seen it, and maybe even experienced it ourselves: on the road, drivers simply cannot stay in their lane or maintain proper spacing while engrossed in an intense phone call, even though they are looking straight ahead.
Change blindness is a perceptual phenomenon in which a change is introduced into a visual stimulus and the observer does not notice it.
A famous study on this topic is the “door study,” in which an experimenter asking a pedestrian for directions was swapped for a different person while two workers carried a door between them; roughly half of the pedestrians never noticed that they were now talking to someone else.
In the boating world, I notice this most often when monitoring engine instrumentation. While scanning across the gauges, watching oil pressures, coolant temperatures, etc., a change in the values (especially on digital displays) is easy to miss even though you are looking right at it. There may be some expectation bias at work in this example as well.
So how do we work on avoiding these traps? The first step is to understand that biases are real. They are natural responses to the environment and the volume of information we receive. The only defense is training and education. Learn to recognize what is happening as it is happening, and modify the outcome. And educate yourself: read about the experiences of others and what would have helped them, and ingrain those ideas into your own behavior.
CASA Deputy Principal Medical Officer Tony Hochberg suggests a Trekkie solution – invoke the extraterrestrial, emotionless and implacably logical intelligence of Star Trek’s Mr. Spock, who was never afraid to contradict the mercurial Captain Kirk.
It’s sometimes good to be like Spock. Ask yourself, “What would Spock do?”
Click here for a great aviation article on this topic.