Any critical software application must be usable, secure and reliable, and that means taking into account how people interact with the system. Poor usability or an unfriendly interface can itself become a cause of failure. In this article we discuss why this aspect matters so much in critical systems.
Why it is important to develop user-friendly systems
We’ve all had a bad user experience at some point in our lives: an app that refuses to perform the action we want, a website where we can’t find the information we need, or a form that is almost impossible to complete.
In these cases we will probably uninstall the app or leave the website, which may mean the loss of a customer. That is a serious problem in itself, but nothing compared to the consequences a usability problem can have in a safety-critical system.
A system that is not user friendly can put a great deal of good engineering work in jeopardy. The consequences range from increased costs due to rework, to failure to pass verification and never reaching the market at all. In the worst-case scenario, the result is the loss of human life and the destruction of critical equipment or facilities.


Consequences of designs with poor usability
The goal in critical systems engineering is to achieve a level of quality that minimizes the likelihood of errors with serious consequences. To this end, there are working procedures that ensure everything is documented, tested and verified to the appropriate level of reliability.
This is where user experience design (UXD) comes into play. Many people still see this discipline as purely aesthetic, but usability goes well beyond looks. The reliability of critical systems depends not only on the hardware and software, but also on the people who will use them and the context in which they do so. The design of an aircraft cockpit can make the difference between life and death.
To better understand this, here are two examples.
John Denver’s plane crash
In 1997 John Denver was killed in California when his small aircraft crashed. The investigation by the National Transportation Safety Board (NTSB) found that the pilot’s attention had been diverted by an attempt to switch fuel tanks. The fuel selector valve was located behind his left shoulder, forcing him to turn around in his seat to operate it; that movement likely produced an inadvertent rudder input that led to loss of control of the aircraft.
Jeep Grand Cherokee electronic gear lever
In 2014 this range of vehicles introduced an automatic gear selector with an unconventional design. Instead of resting in a distinct position for each gear, the monostable lever sprang back to the same central point after every input. The only indication of which gear was engaged was a small display on the dashboard and on the lever itself.
This made it very difficult for the driver to tell at a glance whether the vehicle was in park (P) or reverse (R). As a result, on numerous occasions cars were left with reverse engaged by mistake and moved off on their own. Worse, the vehicle had no warning or alarm to alert the driver when this happened.
The result: more than 100 collisions, 38 people injured and one driver fatally struck by his own car. After studying the case, the National Highway Traffic Safety Administration (NHTSA) concluded that the gear lever was not intuitive and provided poor tactile and visual feedback, increasing the likelihood of erroneous gear selection.
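To make the missing safeguard concrete, here is a minimal sketch in C of a rollaway interlock of the kind the vehicle lacked (similar in spirit to the auto-park behavior later added by recall). All names and signals are hypothetical; a real implementation would run in a body or transmission ECU and act on CAN messages, not printf calls.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical gear states reported by the transmission. */
typedef enum { GEAR_PARK, GEAR_REVERSE, GEAR_NEUTRAL, GEAR_DRIVE } gear_t;

/* Snapshot of the vehicle signals this interlock needs. */
typedef struct {
    gear_t gear;
    bool   driver_door_open;
    bool   seat_occupied;
    double speed_kmh;
} vehicle_state_t;

/* Stand-ins for actuator commands. */
static void sound_chime(void)       { printf("CHIME: vehicle not in Park\n"); }
static void request_auto_park(void) { printf("ACTUATOR: shifting to Park\n"); }

/* Called periodically: if the driver appears to be leaving while the
 * car is not in Park, warn immediately and, at walking speed or below,
 * shift to Park automatically. */
void rollaway_interlock(const vehicle_state_t *s)
{
    bool driver_leaving = s->driver_door_open && !s->seat_occupied;

    if (s->gear != GEAR_PARK && driver_leaving) {
        sound_chime();
        if (s->speed_kmh < 2.0) {
            request_auto_park();
        }
    }
}

int main(void)
{
    /* Driver steps out with reverse still engaged. */
    vehicle_state_t s = { GEAR_REVERSE, true, false, 0.5 };
    rollaway_interlock(&s);
    return 0;
}
```

The point is not the few lines of logic but the design decision behind them: the system, not the human, carries the burden of detecting the unsafe state.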


Safety-critical systems and usability
As you have seen, poor usability can have serious consequences. In critical systems, where it is essential to minimize failures, this is simply not acceptable. When analyzing the criticality of a system, it is necessary to look not only at the software or hardware itself, but also at how users may behave in the context of use. This is crucial for the safety and control of systems, especially in environments such as aviation, transportation, energy and finance.
To avoid situations like those in the examples above, the development of software and hardware must take into account the other complex systems already in place and how they all interact. In addition, human nature itself must be treated as a potential source of failure, with preventative measures built into the design, as in the sketch below. The context of use also affects the operator’s state: during an in-flight emergency, for example, a pilot may be overloaded and under high stress, which degrades cognitive performance.
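One common preventative measure of this kind is to make hazardous, irreversible commands deliberate two-step actions rather than single touches, so a momentary slip does not become an accident. A minimal sketch of such a guarded-command pattern (hypothetical names, simplified timing) might look like this:

```c
#include <stdbool.h>
#include <stdio.h>
#include <time.h>

#define CONFIRM_WINDOW_S 5

typedef struct {
    bool   armed;
    time_t armed_at;
} guard_t;

/* Step one: arming announces the hazardous command and starts the clock. */
void guard_arm(guard_t *g)
{
    g->armed = true;
    g->armed_at = time(NULL);
    printf("WARNING: command armed, confirm within %d s\n", CONFIRM_WINDOW_S);
}

/* Step two: the action runs only on a timely, deliberate confirmation. */
bool guard_confirm(guard_t *g, void (*action)(void))
{
    bool ok = g->armed &&
              difftime(time(NULL), g->armed_at) <= CONFIRM_WINDOW_S;
    g->armed = false;                 /* one confirmation per arming */
    if (ok) {
        action();
    }
    return ok;
}

static void cut_fuel_pump(void) { printf("ACTION: fuel pump stopped\n"); }

int main(void)
{
    guard_t g = { false, 0 };
    guard_confirm(&g, cut_fuel_pump);   /* ignored: never armed */
    guard_arm(&g);
    guard_confirm(&g, cut_fuel_pump);   /* runs: confirmed in time */
    return 0;
}
```

The confirmation window forces a conscious second act from the operator while keeping the added workload negligible.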
It is therefore also necessary to follow and comply with all the requirements and regulations imposed in each sector. These standards and regulations exist precisely to guarantee a minimum level of safety and reliability in critical systems.
Critical systems engineering
The cost of fixing a fault once the system has been developed can be up to 100 times higher than fixing it in the design phases. This is one reason why many projects fail in sectors such as aviation, defense and transportation, where very demanding regulations apply and must be taken into account from the earliest design stages.
CENTUM Digital has more than 16 years of experience offering Critical Systems Engineering services in the most demanding environments, such as the Aeronautical, Naval, Defense, Railway and Automotive industries, mainly geared towards Certification, Safety, Environmental Rating and HW/SW Assurance processes.
For more information about our services you can click here.