Volume 26 | Issue 2
The rapid adoption of artificial intelligence (AI) will significantly affect risk management in 2024, and the insurance community has a critical role to play in helping businesses think about the risks that could slow down deployment. Carriers and brokers should focus on using risk management expertise to improve data quality, testing, warnings, checks and other processes that eliminate risk or mitigate the exposure if something does go wrong.
For instance, in the coming year, it is likely that more testing will be conducted around self-driving vehicles and around factory robots in the manufacturing process. Software is now controlling equipment that can cause bodily injury or property damage, which notably changes the dynamics of general liability. While the risk of an autonomous vehicle is quite different from that of a robot in a factory, it is imperative that highly trained industry risk specialists oversee the underwriting of these emerging technologies. That risk expertise is where a skilled human insurance underwriter comes in.
The industrial use of AI-driven systems also raises the question: Who is liable when something goes wrong with computer software? Assessing who is liable for damage or injury caused by a self-driving car is tricky. Is it the software developer, the hardware manufacturer, or the human in the car? These questions are still being debated even as the use of AI technology has become mainstream.
There are also no general guidelines. Different states have gradually developed their own rules for what is allowed on the road, and some have rules about who is liable in the event of an accident. However, given the various levels of ‘self-driving’ capability and the lack of case history with autonomous vehicles, it’s hard to gauge which rules will prevail and in what circumstances.
In addition, there are currently no global standards for developing AI, and no general best practice has emerged. Today, the field is self-policing, with unbridled property and casualty risks. It’s a race to see who can best capitalize on AI’s potential in a responsible way, and who will get there first. The Hartford believes this can create a risky commercial business environment, and it’s essential that companies partner with specialized insurance underwriters who can help mitigate the risks.
While some businesses may be demanding AI products, AI tech companies do not yet have fully proven products, which could lead to a seller overselling its product and a willing buyer believing the hype. If the product falls short, all parties lose. AI companies need to be realistic in their sales pitches, and AI buyers need to be skeptical. This is where technology errors & omissions (E&O) and professional liability insurance can help mitigate risk.
Today, most businesses run on technology. If that technology, including AI, fails, it can have a significant impact on their finances. And if a technology company supplied the AI that failed, it could face legal action.
Unfortunately, traditional liability policies usually will not cover pure financial losses. It’s beneficial to partner with an insurer like The Hartford that offers technology E&O insurance. This coverage can help pay a business’ legal fees and other related costs if the software or equipment sold to a client fails, or if, for example, a website the business designed looked too much like a key competitor’s site.
Most companies understand that they need to start testing AI or risk falling behind their competition. Partnering with an experienced carrier like The Hartford, with a team of underwriters who specialize in AI technology, can help protect that investment.
About the Author:
Andrew Zarkowsky is Technology Industry Practice Leader for The Hartford. He has more than two decades of experience in insurance, holding various leadership roles in underwriting.