How to reduce the risks of vehicles with partially automated systems

Most car enthusiasts are aware that completely self-driving cars are not yet a reality. However, some vehicles are fitted with partial automation systems, and their owners are inclined to treat them almost like self-driving cars. Numerous warnings have been issued about this, along with reports of dangerous crashes. Treating these cars as self-driving puts those in the car, as well as others on the road, at risk. The US-based Insurance Institute for Highway Safety (IIHS) is highly concerned about this dangerous trend and has been carrying out detailed research. IIHS researchers interviewed hundreds of such car owners, and a surprising majority said they thought their cars were almost fully self-driving. Drivers often take both hands off the steering wheel when the partial automation system is engaged.

Partial automation systems still require the complete attention of the driver. The two main features of partial automation systems available today are adaptive cruise control, which keeps the car moving at a speed set by the driver, and lane centring, which maintains the car’s position in the driving lane. Tests and studies of real-world crashes show that today’s partial automation systems still cannot recognise and react as expected to many road features and situations. Drivers report having had to take over because the car did something unexpected while they were engaged in some other activity.

Misleading names of cars

IIHS is of the opinion that, in their zeal to sell as many cars as possible, automakers are not informing owners about the risks of misusing partially automated systems. The names given to such systems are often misleading too: adding ‘autopilot’ or ‘pro-pilot’ to a car’s name can suggest far more capability than the system actually has. Drivers of such cars assume they can read or text, or even sleep, while the car is driving itself!

Sensor technology needs more development

Experts are of the opinion that automakers should include better sensing technology in such cars, so that the software knows when the driver is looking away from the road. Attention reminders and system lockouts are other safeguards that can keep drivers engaged when using partial automation systems. Warnings that become progressively more annoying could also be incorporated to make people pay attention.

Existing automation technology

Automated driving does not mean zero involvement of humans. True, automakers are continuously adding safety features as well as self-driving features to their vehicles in an effort to reduce accidents and get people used to the idea of driving completely hands-free someday, but that day has not arrived yet. The Society of Automotive Engineers (SAE) has developed a scale from zero to five to categorise driving automation, based on how longitudinal control (acceleration and braking) and lateral control (steering) are shared between the driver and the system. There are occasional areas of overlap between the levels too.

Level 0 - No automation: At this level, there is zero automation and the driver is completely in control. Such vehicles may have automatic emergency braking, backup cameras, and blind-spot, forward-collision and lane-departure warnings, but that is not enough to provide complete safety.

Level 1 - Driver assistance: Adaptive cruise control is an example of this level of automation. The system controls acceleration and braking, especially on highways; lane-keeping assistance is also provided in many instances.

Level 2 - Partial automation: Highway pilot is an example of this level. Steering, acceleration and braking can all be performed by the system, but the driver remains in charge at all times.

Given the current state of automation technology, level 2 is the highest level available. The next three levels — level 3: conditional automation, level 4: high automation, level 5: full automation — are not yet ready for the road. Drivers must be made aware of these facts.

Who’s to blame?

When a partially automated vehicle crashes, more often than not, the driver is blamed completely. However, there is another school of thought: is it always the driver’s fault? When the automation system fails, the driver is expected to take over immediately. For that to happen, the driver must be 100 per cent vigilant at every moment, but is that a reasonable expectation? Because the system is partially automated, the driver may not be as alert as someone driving an unassisted car. It takes a high degree of skill and competence, and enough time, for the driver to gauge what went wrong and take the right action to override the error. Crashes usually happen because drivers cannot make the perfect decision in the very narrow window between becoming aware of a problem and taking action.

It is quite normal for human attention to waver when supervising an automated system over prolonged periods, and supervising a partially automated car is no different. In addition, drivers may be engaged in other activities, such as using a mobile phone or the entertainment system. In other words, both the automation system and the driver make mistakes leading to a crash, yet the blame is heaped entirely on the driver. Is that always fair?
