

The State of Real Time for Property and Casualty Insurers

Seth Luersen

Former Head of Training, Curriculum, and Certification Programs


Some industries remain hesitant to recognize the commodifying force of real-time solutions. These industries often rely on orthodox tenets as barriers to market entry, such as regulatory compliance, traditional value propositions, brand recognition, and market penetration. The term “ripe for disruption” often characterizes these industries and their respective leaders.

Arguably, an illustrative industry in the midst of responding to commodification, adapting to real-time technology, and fearing disruption is the Property and Casualty Insurance industry. An examination of this industry’s most commodified products and services serves as a litmus test for our common understanding of the state of real time.

Let’s consider the most commodified line of business: personal auto insurance. Here, we find that many traditional insurers capture little to no real-time data about driver behavior and vehicle performance. Instead of real-time systems that capture, analyze, learn, and predict, these insurers rely on expensive national advertising campaigns, extensive commission networks, quarter-hour policy quotes, lengthy claim processes, long call center queues, and monthly billing cycles.

Bringing IoT to Property and Casualty

Metromile exemplifies a Property and Casualty insurer with a modern, transformative model for personal auto insurance. Using a smartphone app and an Internet of Things (IoT) telematics device called Pulse that plugs into a car’s diagnostic port, Metromile owns the customer journey. Data-driven insights and services embody the digital experience for customers beyond common coverages: gas usage optimization, engine light demystification, parked vehicle location, and street sweeping parking ticket avoidance.
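To make that kind of data concrete, here is a minimal sketch of a telematics event and one derived insight, the parked-vehicle location. The field names and the zero-speed heuristic are illustrative assumptions, not Metromile’s actual schema or logic.

```python
# Illustrative sketch: a telematics reading from a plug-in diagnostic device
# and a simple "where did I park?" lookup. Field names are assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class TelematicsEvent:
    vehicle_id: str                 # identifier assigned when the device is paired
    recorded_at: datetime           # timestamp of the reading
    latitude: float
    longitude: float
    speed_kph: float                # current speed from the diagnostic port
    fuel_level_pct: float           # remaining fuel, for gas usage insights
    engine_fault_code: Optional[str]  # OBD-II code behind a check-engine light

def last_parked_location(events: List[TelematicsEvent]) -> Optional[Tuple[float, float]]:
    """Return the coordinates of the most recent zero-speed reading,
    i.e. where the car was last seen parked."""
    for event in sorted(events, key=lambda e: e.recorded_at, reverse=True):
        if event.speed_kph == 0:
            return (event.latitude, event.longitude)
    return None
```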

An obvious technological challenge for system-of-record businesses like Property and Casualty insurers is real-time processing at scale. Even with hybrid (on-premises and cloud) datacenter infrastructures, many enterprise messaging and database technologies struggle to maintain linear scale at commodity costs when processing, analyzing, learning, and predicting from streaming data in real time.

Why Enterprise Systems Struggle to Adapt to Real Time

The reasons enterprise systems struggle to adapt to real time include:

  1. Event-based messaging and service-oriented architectures remain overly verbose and complex for internal and external integrations.
  2. Batch jobs that extract, transform, and load data require additional computing resources to schedule, coordinate, and monitor (a caricature of this pattern is sketched after this list).
  3. Disk-based databases read and write only as fast as the best non-volatile solid state disks and IO caches perform.
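As a point of reference for the second item, here is a caricature of the nightly batch pattern. The file, table, and column names are hypothetical; the point is the scheduling and staleness overhead, not the specific code.

```python
# Caricature of a nightly batch ETL job: a scheduler wakes up, extracts
# yesterday's export, transforms it in application code, and loads it into a
# reporting store. All names are hypothetical.
import csv
import sqlite3
import time
from datetime import date, timedelta

def run_nightly_etl(db_path: str = "warehouse.db") -> None:
    yesterday = date.today() - timedelta(days=1)
    # Extract: read the previous day's export produced by another system.
    with open(f"policy_events_{yesterday.isoformat()}.csv", newline="") as f:
        rows = list(csv.DictReader(f))
    # Transform: filter and reshape records in application code.
    cleaned = [(r["policy_id"], float(r["premium"]))
               for r in rows if r["status"] == "active"]
    # Load: bulk-insert into the reporting store.
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS daily_premiums"
                     " (policy_id TEXT, premium REAL)")
        conn.executemany(
            "INSERT INTO daily_premiums (policy_id, premium) VALUES (?, ?)",
            cleaned)

# The job itself is simple; the operational burden is scheduling it, monitoring
# it, retrying failures, and waiting a full cycle before fresh data appears.
while True:
    run_nightly_etl()
    time.sleep(24 * 60 * 60)
```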

Light at the End of the Tunnel

In examining the state of real time through the lens of the Property and Casualty insurance industry, there is good news! Competitors are taking notice of the technology behind usage-based insurance. As of 2016, several US insurers, including Nationwide, Progressive, and State Farm, underwrite auto insurance policies that require a telematics device. To better segment risk profiles and enhance claim processing, 36% of auto insurers expect to implement usage-based insurance products by 2020. This trend is representative of enterprise businesses looking to benefit from IoT devices and from operating in real time.

For enterprises looking to compete today, real-time technology is available on commodity hardware. With messaging systems that behave like infinite iterators, such as Apache Kafka, paired with a real-time database like SingleStore, today’s traditional enterprises can eliminate batch jobs, reduce integration complexity, improve network operations, and replace disk-based I/O with in-memory operations that are orders of magnitude faster. Such systems produce, consume, and ingest millions of events per second while simultaneously analyzing, learning, predicting, and responding to real-time data. Most importantly, they do it at linear scale, meaning the cost of scaling remains proportional as data and services grow. The only question now is how enterprises, in the Property and Casualty insurance industry and many others, will harness the power of massively parallel, distributed, in-memory SingleStore database technology to make real-time products and solutions possible for their customers.
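A minimal sketch of that pairing follows, assuming a kafka-python producer on the application side and SingleStore’s pipeline feature on the database side. Broker addresses, credentials, topic and table names are placeholders, and the exact LOAD DATA clauses vary by SingleStore version.

```python
# Sketch: publish telemetry events to Kafka and have a SingleStore pipeline
# ingest the topic continuously, with no batch ETL step in between.
import json
from kafka import KafkaProducer   # pip install kafka-python
import pymysql                    # SingleStore speaks the MySQL wire protocol

# Produce events as they happen rather than queuing them for a nightly job.
producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("telemetry", {"vehicle_id": "veh-42", "speed_kph": 61.0})
producer.flush()

# On the database side, a pipeline subscribes to the same topic and keeps the
# table continuously up to date.
conn = pymysql.connect(host="singlestore.example.com", user="app",
                       password="secret", database="insurance")
with conn.cursor() as cur:
    cur.execute("CREATE TABLE IF NOT EXISTS telemetry_events"
                " (vehicle_id TEXT, speed_kph DOUBLE)")
    cur.execute("""
        CREATE PIPELINE telemetry_pipeline AS
        LOAD DATA KAFKA 'kafka.example.com:9092/telemetry'
        INTO TABLE telemetry_events
        FORMAT JSON
    """)
    cur.execute("START PIPELINE telemetry_pipeline")
```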



