Digital Transformation: Old Wine in New Bottles?

So much of what we find new and exciting requires what we too often write off as outmoded.
Today’s insurance technology initiatives are increasingly motivated by our latest term of art, digital transformation. We love to throw those words around as if they were some magical incantation that, when invoked, will produce brilliant solutions lifting us to otherwise unattainable competitive positions – masterworks that evoke awe eons after their creation.
Of course, we’ve been “digitally transforming” for decades. Setting aside the nineteenth-century innovations of Charles Babbage for a moment, modern “digital” computing is easily traced at least as far back as 1945 with the introduction of ENIAC, “the first programmable, general-purpose electronic digital computer”.[1] The intervening years have seen a remarkable explosion of computing power. Famously, the Apollo Guidance Computer (AGC) used to put men on the moon in 1969, with its 2 MHz CPU speed, had roughly the same computing power as the Nintendo Entertainment System (1.8 MHz) released sixteen years later. An old iPhone 4 (2010), with its 800 MHz CPU speed, outgunned the 244 MHz, $32 million Cray 2 supercomputer (1985) by a factor of three.[2] And today’s iPhone 12 (2.99 GHz) and Sony PlayStation 5 (3.5 GHz) make those computing milestones seem quaint.
The growth in computing power, and therefore the number of practical applications that can be handled by affordable computers, has been astonishing. Indeed, it has made the aspirations of computer scientists who only dreamed about artificial intelligence and virtual reality just a few decades ago – dreams because they would require rooms full of very expensive hardware – available to the masses in tiny packages for very modest sums.
So it follows that today when we hear about insurers wishing to undertake digital transformation initiatives, we understand that their desire is to leverage today’s massive computing power to gain a competitive advantage. Otherwise, we’re simply talking about modernization, which was all the rage way, way back in 2015. Today’s initiatives have the far more ambitious goal of producing novel solutions, in the sense that competitors haven’t yet discovered – let alone adopted – them, and so they’re in a very real sense disruptive.
But disruption comes out of tolerance for mistakes. Disruption comes from having the wherewithal to experiment and fail repeatedly. Disruption comes from having the courage to engage in radically candid conversations laced with dissent and debate. So disruption can only happen if the company culture permits it to happen – an idea antithetical to an insurance company’s traditional mission, which is to avoid undue risk.
This frosty bit of insight demands an entirely different approach to insurance company operations, one that goes well beyond technology. Famously linear thinkers, insurance professionals have historically worked to place a price x on some risk y in anticipation of a positive return z. We press this button and that happens. Of course, this approach has turned out to be of dubious value, evidenced by the prevalence of combined ratios that exceed the century mark. Instead, a confluence of factors across many dimensions conspires to destroy our bottom lines, if not our innocence: Geopolitics. The environment. Social movements. Generational sensibilities. Competitive moves. Regulatory constraints. Human psychology. Solar flares?! And yes, the rapid pace of technological change. After all, how popular was cyber insurance – arguably influenced by each of those factors – in 1950?
Woke (forgive me, but the term seems to work in this context, too) insurers have accepted this. And so their efforts are directed toward aggregating not just traditional datasets that populate rating algorithms or underwriting rules, but those many ancillary bits of information that influence risk selection and loss potential in a far more informed (read: non-linear) way. They utilize Big Data. They leverage artificial intelligence. They employ dedicated predictive analytics units. They automate routine operational processes. They invest in new technology. And they adopt change management programs to support those initiatives. That’s a long list of expensive undertakings for a smaller insurer. But that’s the world in which they have to compete.
Middle-tier regionals with relatively modest means must contend, on one end, with tiny upstarts holding tens of millions in capital investment and unburdened by years of legacy operations, and on the other, with multibillion-dollar behemoths spinning off autonomous innovation centers – all vying for their share of the hundreds of billions of premium dollars blown skyward by the shattering of preconceived notions.
And so we arrive at the intersection of culture and technology, of art and science, of hard skills and soft skills. In an industry famously fixated on risk avoidance and profit margins, this juncture becomes an especially challenging moment in time. Indeed, a quick review of recent literature on disruption in the insurance industry makes scant mention of the behavioral changes that must accompany any radical innovation, both within an organization among its constituents and outside among its customers and suppliers.
The impact on many well-established insurers? InsureTech startups are eating their lunch. That is, unless those veteran organizations were prescient (and well-capitalized) enough to develop their own skunkworks, separate and apart from their core organizations, to replicate the risk-tolerant cultures of their more nimble adversaries. That’s fine if you’re a major player – one of the billion-dollar insurers who can afford separately funded venture arms – or an agile start-up with fifty million smackers to burn. But what of the middle tier, those thousands of regional insurers vying for market share in the face of old threats (mainstays) and new (InsureTechs)?
The obvious answer is they need to think a little differently. With no discretionary trove of millions to casually deploy, the focus must be on manifesting beneficial change. And beneficial change begins with vision, culture, and leadership – not bits and bytes. Old wine in new bottles, you might say.
I’m not suggesting plastering office walls with poster-sized admonitions to “embrace change,” nor am I suggesting that beneficial change is a thing that happens if you hire the right consultants. I am suggesting, however, that with all of the marvels of technology available in the twenty-first century, it’s still people who matter most. It’s still paying attention to what motivates – inspires – every individual responsible for the welfare of the organizations in which they toil that separates leaders from laggards. And most importantly, it’s regularly respecting and acknowledging their contributions to ensure they stay focused and motivated, long after the paint is dry on that beautifully executed automation project.
Of course, standard “tactical” practices for operational improvements and technology deployments involving proven toolsets for workflow analysis, business process design, and technical project management are essential for a successful digital transformation initiative. But no amount of funding will replace the unbridled enthusiasm of a group of colleagues setting out to effect change for the better. It’s that enthusiasm and commitment that drive organizations to prosperity; it is rarely prosperity – and never technology – that drives individuals to become enthused if they’re not adequately engaged and committed to the work they do.

Contact Perr&Knight to support your digital transformation initiative with experienced project managers, business analysts, and process improvement experts well-versed in the ‘people part’ of transformation, who can assist with the requirements management, process redesign, and change management capabilities that are essential for any such project.

[1] Swaine, M. ENIAC. (n.d.). Britannica. Retrieved January 25, 2021 from https://www.britannica.com/technology/ENIAC
[2] Routley, N. (2017, November 4). Visualizing the trillion-fold increase in computing power. Visual Capitalist. https://www.visualcapitalist.com/visualizing-trillion-fold-increase-computing-power/.

Improve State Filing Efficiency, Even Working Remotely

Authors: Jessica Witvoet API, AIS, AINS, AIT, Diane Karis AINS, CPCU, and Neresa Torres
Many insurance companies were faced with a difficult transition when state or county orders meant to mitigate the spread of COVID-19 forced some or all of their staff to stay home earlier this year. Those who weren’t prepared for extensive remote working scrambled to set up secure systems easily accessible by employees unable to return to the office where servers, desktop computers and physical files are stored.
Because of Perr&Knight’s five regional offices across the United States, we’ve already had extensive experience using digital tools to collaborate from geographically dispersed locations. Our state filings support team conducts the majority of our work online using sophisticated web-based software, enabling us to work together seamlessly from anywhere.
StateFilings.com is a proprietary software tool we use internally to provide insurance filings support for our clients. It is also available for subscription, so insurance companies can more efficiently manage their own rate, rule and form filings.
StateFilings.com enables companies to maintain the pace and accuracy of filings even when working offsite. The software includes built-in features controlling three aspects of the process: project management, research and workflow. Here’s how these tools support the entire scope of insurance product filings.

Project Management

Because all aspects of our form filings services are online – including access to SERFF – StateFilings.com updates all phases of the project in real-time and provides visibility to team members who have been granted access. This means an individual can create and submit a filing to a DOI and others can see exactly what has been done and how far along the filing is in the approval process.
This real-time visibility eliminates the need for lengthy internal back-and-forth communication via email and enables all members of the state filings department to provide support or peer review without a cumbersome catch-up process.
On our end, easy access and full transparency for all filings enable our state filings support team to process any countrywide filing project within ten business days. Insurance companies that subscribe to StateFilings.com for their own filings departments also report that access to a single, user-friendly filing repository has dramatically increased their efficiency.
Real-time processing also eliminates the need to regularly check with DOIs to monitor actions. StateFilings.com has access to SERFF via a secure API, so the system automatically downloads approvals or objections and updates project status. Dispositions trigger automatic emails noting the filing has been closed, along with a link to the approval. This high level of automation eliminates time-consuming batch processing and ensures individuals in filing departments always have instant access to current information.
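In outline, that disposition-to-notification flow is a simple transformation. The sketch below is purely illustrative – it is not StateFilings.com’s actual implementation, and the payload field names (`tracking_number`, `status`, `approval_url`) are hypothetical:

```python
def process_dispositions(dispositions):
    """Turn closed-filing dispositions into notification email payloads.

    Each disposition is assumed to be a dict like:
      {"tracking_number": "ABC-123", "status": "Approved",
       "approval_url": "https://example.com/dispositions/abc-123"}
    (hypothetical field names for illustration only).
    """
    closed_statuses = ("Approved", "Disapproved", "Withdrawn")
    notifications = []
    for d in dispositions:
        # Only closed filings trigger an email; pending ones are skipped.
        if d["status"] in closed_statuses:
            notifications.append({
                "subject": f"Filing {d['tracking_number']} closed: {d['status']}",
                "body": f"View the disposition: {d['approval_url']}",
            })
    return notifications
```

A batch of downloaded dispositions goes in; only the closed filings come back out as ready-to-send messages, which is what replaces the manual DOI status checks described above.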

Research

StateFilings.com can be utilized as a research tool to evaluate various historic countrywide projects. We use StateFilings.com to enhance our insurance product filings support by checking previous filings to determine particular jurisdictional nuances that may impact our clients’ filings. We can also calculate the average DOI turnaround time by state and line of business to accurately gauge the anticipated time to approval for similar filings.
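The turnaround-time calculation above is a straightforward aggregation. A minimal sketch, assuming each filing record carries submission and disposition dates (the field names are hypothetical, not StateFilings.com’s schema):

```python
from collections import defaultdict
from datetime import date

def average_turnaround(filings):
    """Average days from submission to disposition, keyed by (state, line of business)."""
    totals = defaultdict(lambda: [0, 0])  # (state, lob) -> [total_days, count]
    for f in filings:
        days = (f["disposition_date"] - f["submission_date"]).days
        bucket = totals[(f["state"], f["lob"])]
        bucket[0] += days
        bucket[1] += 1
    return {key: total / count for key, (total, count) in totals.items()}
```

Running this over a library of historical filings yields the per-state, per-line averages used to gauge anticipated time to approval.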
The software also keeps approved forms and rates and rules on file, accessible via the web. By removing the need to house this information on location-specific servers, filing department staff can quickly review important information, even while working remotely.
Companies that subscribe to StateFilings.com have access to all these features for their own current and historic filings.

Workflow Assignment 

Activity Manager allows users to view outstanding items on a filing – including the activity type, due dates and follow-up dates – for any state included in a project. Assignments can be divided by line of business or state, providing at-a-glance insight into outstanding issues with filings, project assignments and approval status. Workload can quickly be assessed, delegated and reassigned within Activity Manager to one or multiple users. This functionality allows our team’s supervisor to ensure efficiency and productivity so that we deliver the most value to our customers.
This level of organization is also helpful for quickly onboarding new members to the State Filings Department. Access to historical filings enables new employees to easily review submitted information, required materials, questions from regulators and any notes made during previous filings.
Because of our nationwide presence and focus on technology, Perr&Knight has been structured to support virtual collaboration for years now, so the shift to remote work was not disruptive to our workflow. We know many of the tools and processes we employ can help other insurance companies improve productivity and get their products to market faster, even in today’s uncertain climate. For insurance filings support, StateFilings.com has been a crucial asset to our business model and we have seen it help other companies achieve the same high level of filing efficiency. 

Perr&Knight is ready to help you add the technological advantage of StateFilings.com to your organization. Contact Perr&Knight today to talk.

Predictive Analytics Provide Big Gains for Small Insurance Companies

It is no secret that the amount of data in the world is expanding at an extremely rapid pace, and at a time when business is being conducted online due to COVID-19, data production will accelerate faster than ever. Of course, this mass influx of data is only useful to companies when they have access to it.
This data gap between insurance companies will widen even more due to COVID-19, as large companies with direct-to-consumer online platforms see increased business from stay-at-home orders and less in-person commerce. The data generated by that online business can feed predictive models that analyze marketing trends and win new market share, which leads to more business to enrich pricing and claim triage models, which in turn increases profits and the ability to gain still more market share. From there, the cycle starts all over again.
A large data gap already exists between large and small companies, and now the gap for online business capabilities is also growing at an increasing rate. What can small companies do to help make sure that gap does not become insurmountable?
Luckily, external data is now more readily available and easily accessed than ever before. Companies with little data (or in some cases even no data) can take advantage of external information for predictive modeling. Some examples of external data sources include:

  • Government information such as census demographics, weather databases, occupational statistics, geospatial, and property tax information.
  • Numerous industry statistics and services from advisory organizations such as ISO, NISS, or NCCI.
  • Industry information from publicly available rate filings and financial statements.
  • Quote comparison services for competitive analyses.

Misconceptions about small companies’ ability to use predictive analytics are not limited to data constraints. There is also a common misconception about the models themselves having to be extremely sophisticated. While it may be true that many companies are using such complex models, smaller companies can still benefit from the use of analytics by simplifying their scope to accommodate less data. Some examples include:

  • Creating models that assist with monitoring programs by ranking predictors with the largest impact on results. These give quick insights to help focus additional research.
  • Using simpler assumptions and grouping variable levels in order to increase the credibility of the model.
  • Combining company data with external data sources to add additional predictors to your results.
  • Consulting with industry experts to follow modeling best practices such as removing data outliers and/or missing values in order to maximize the amount of usable data and not skew results.
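Two of the simplifications above – grouping thin variable levels and removing outliers – can be sketched in a few lines. This is an illustration only; the thresholds are arbitrary assumptions, and real modeling work would tune them to the book of business:

```python
from collections import Counter

def group_sparse_levels(values, min_count=30, other="OTHER"):
    """Collapse levels with too few observations into a single catch-all level,
    increasing the credibility of each remaining level."""
    counts = Counter(values)
    return [v if counts[v] >= min_count else other for v in values]

def trim_outliers(losses, multiple=10):
    """Drop loss amounts more than `multiple` times the median.
    A crude outlier rule, used here only to illustrate the idea."""
    ordered = sorted(losses)
    median = ordered[len(ordered) // 2]
    return [x for x in losses if x <= multiple * median]
```

Applied before fitting, these steps keep a small company’s limited data from being dominated by sparse categories or a handful of extreme claims.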

Not only can these approaches be applied today, but Perr&Knight has experience assisting clients with each of them. With both experienced predictive modeling personnel and industry expertise in virtually all lines of insurance, Perr&Knight is uniquely qualified to help small companies implement predictive analytics that improve performance and profitability.
As the world continues to evolve technologically, so too does the sophistication of insurance products and the insurance process. It is important for small companies to modernize their approaches to help minimize the data gap in an increasingly data-driven environment.

Leverage the power of predictive analytics. Contact the experts at Perr&Knight to learn more about how your company can use data from predictive analytics to improve business outcomes.

Digital Transformation: Are You Sure You’re Ready?

Motivations behind digital transformation initiatives usually involve improving speed, cost, or quality of products or services. Migrating to the cloud, increasing mobility, implementing robotic process automation (RPA), deploying intelligent automation solutions, or capitalizing on data from the internet of things (IoT) all have the power to profoundly impact an insurance company’s competitive standing – if not their very survival.
With 67% of insurers considering implementing digital transformation initiatives over the next twelve to eighteen months[1], companies not considering these changes will soon be left behind.
However, undertaking an initiative too hastily may overlook critical organizational considerations likely to inflate project costs and jeopardize the success of the program.
Throughout hundreds of insurance technology consulting engagements, we have identified five phases comprising successful preparation for every tech project: Initiate, Design, Experiment, Prioritize, and Plan. Each is a valuable component of a process that, executed carefully and correctly, can double the chances of your IT project’s success.
In this article, we’ll cover an often-overlooked – but vitally important – aspect of the Initiate phase: determining your readiness to even begin.

The Picture Is Bigger Than You Think

Because every company’s systems are so deeply intertwined, each affects more aspects of the business than may be initially apparent. Even if equipped with good intentions, simply jumping into major structural or process changes can create serious roadblocks for other departments or processes down the line.
In our decades providing insurance technology consulting services, we have seen dozens of costly – and avoidable – challenges arise when companies implement new initiatives without adequately ascertaining the impact of their project on the whole of the organization. It’s not uncommon for organizations to discover their planned initiative reflects only a superficial portion or “end of the line” aspect of the required change, when they should plan for complementary tech upgrades or cultural impacts in other departments as well.
By “preparing to prepare,” you determine your organizational readiness to undertake any transformation – digital or otherwise. And, taking a holistic view reveals the true scope of the initiative, one that may extend well beyond initial expectations.

Conduct a Readiness Assessment

We believe so strongly in the value of determining whether organizational readiness supports – or inhibits – a project’s ultimate success that we have designed a comprehensive organizational Readiness Assessment.
Conducted during a sixty- to ninety-minute web meeting, the assessment reviews key questions about your organization to determine its level of readiness in six areas of your business: Personnel, Processes, Technology, Metrics, Governance, and Environment.
The assessment drills down into a range of relevant factors:

  • Personnel availability
  • Team skills
  • Individual and team empowerment
  • Staff commitment
  • Impact of program on processes
  • Process indoctrination
  • Benefits realization
  • Technology infrastructure
  • Software considerations
  • Interfaces
  • Metrics definitions
  • Visibility into KPIs
  • Response to met or unmet KPIs
  • Extent of existing program planning
  • Program organization
  • Program procurement
  • Program implementation & deployment
  • Program support, monitoring & evaluation
  • Physical location
  • Company culture
  • Team morale
  • Other business-specific considerations

[Image: Readiness Assessment for Insurance Technology – a page from Perr&Knight’s Digital Transformation Readiness Workbook]

Answers to straightforward questions on these subjects reveal a clear picture of your organization’s strengths and weaknesses in areas with the potential to impact the result of your transformation initiative. This valuable contextual view enables your team to further develop your strategy before beginning project planning, in order to avoid playing costly catch-up later.
The results of this exercise create the foundation for the remaining four steps of preparation for your project. In addition to determining your level of readiness overall, this assessment provides a useful starting point to prioritize areas for work before and during your digital transformation initiative.
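In spirit, an assessment like this reduces to scoring each question and rolling the scores up by area. The sketch below is purely illustrative – the 1-to-5 scale and equal weighting are assumptions, not the actual workbook’s methodology:

```python
def readiness_summary(scores):
    """Average question scores by area, plus an overall readiness score.

    `scores` maps an area name (e.g. "Personnel") to a list of 1-5
    question scores for that area (hypothetical scale for illustration).
    """
    by_area = {area: sum(qs) / len(qs) for area, qs in scores.items()}
    overall = sum(by_area.values()) / len(by_area)
    return by_area, overall
```

The per-area averages are what highlight which of the six areas need attention before the transformation initiative begins.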
[Image: the results area of Perr&Knight’s Digital Transformation Readiness Workbook]

Preparation Is Important – But So Is a Tolerance for Risk

Successfully executing any major change requires commitment, tenacity, and a risk tolerance from leadership and the organization as a whole. To promote innovation, there must be support for experimentation and a willingness to endure the challenges of repeatedly failing, learning, and failing again before success finally takes root. The values that represent your organizational culture are crucial – if individuals are punished for failure, they will cease to experiment, and innovation will become a distant hope rather than a realized goal.
Gaining a thorough understanding of the benefits of a successful endeavor, balanced against the pitfalls that lie ahead – even before approaching the starting line – gives you a much stronger chance of completing a successful digital transformation initiative and truly remaking the way you do business.
To support the insurance industry’s unprecedented embrace of digital transformation, the workshop is being offered by Perr&Knight on a complimentary basis to organizations contemplating any type of transformation project. The accompanying readiness workbook and resulting assessment will be provided at the workshop’s conclusion.

Interested? Contact Perr&Knight today and carve out just ninety minutes to significantly increase the likelihood of a successful digital transformation initiative – ready or not.

[1] SOURCE: 2020 Financial Services Digital Transformation Survey, BDO

Pioneering Insurance Automation

The automation of time-consuming manual processes has unlocked ever-increasing levels of efficiency for businesses across the insurance industry. At Perr&Knight, we have long recognized the value of offloading process-heavy tasks to machines in order to free up actuaries, agents, and filing teams to focus on tasks requiring human judgment.
Let’s take a look at how our own automation evolution has opened up greater efficiencies internally, as well as for our clients.

A Breakthrough in Automation: Ratefilings.com

Anyone who has been in the insurance industry a few decades shudders to think of the inefficient early process of obtaining publicly available insurance company filings from the Department of Insurance for competitive analysis.
Perr&Knight was the first in the industry to aggregate these filings on RateFilings.com. In the early days, we physically sent someone down to the state department of insurance (DOI) building, equipped with a scanner. The rep would spend all day buried in the stacks, scanning documents until the job was done. From there, the person would head back to our office, transfer the scanned PDFs to the Data Entry Department, and then spend hours manually entering metadata into the database. The number of documents that could be entered was capped at about thirty per person per day.
Around 2005-06, the NAIC’s System for Electronic Rate and Form Filing (SERFF) achieved widespread adoption, greatly reducing the number of paper filings requiring scanning. SERFF also standardized many formats, further streamlining the process by increasing the uniformity of filing requirements.
As DOIs posted publicly available filings to their websites, we did less scanning and more downloading – itself an important time-saver. The downloadable, standardized SERFF format enabled our Data Entry department to copy and paste data instead of manually typing it out, further increasing accuracy and speed.
The massive breakthrough in automation came in 2008, when we developed “The Auto-Indexer,” a PDF parsing software program that could read a PDF document and copy and paste the data from the PDF directly to our RateFilings.com database.
Now, instead of entering the data, our human staff member was tasked only with auditing and validating that the data entered by the system was correct. Though all filings were reviewed by human eyes, the computer could automatically process straightforward filings as long as there were no errors. Complicated, high-priority filings received closer scrutiny from our staff.
With this advancement, productivity increased tenfold. We could now complete up to 300 rate filings per day per person, instead of a mere thirty.
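In outline, an auto-indexer of this kind extracts metadata from a filing’s text and flags anything it cannot validate for human review. A simplified sketch – the field patterns below are hypothetical, not the actual Auto-Indexer’s logic, and it assumes the PDF text has already been extracted:

```python
import re

# Hypothetical metadata fields and patterns, for illustration only.
FIELD_PATTERNS = {
    "serff_tracking": re.compile(r"SERFF Tracking (?:Number|#):\s*([A-Z]{4}-\d+)"),
    "state": re.compile(r"State:\s*([A-Za-z ]+)"),
    "toi": re.compile(r"TOI:\s*([\w.\- ]+)"),
}

def index_filing(text):
    """Extract metadata fields from filing text.

    Returns (record, needs_review): if any field fails to parse,
    the filing is flagged for a human auditor instead of being
    processed automatically.
    """
    record = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = pattern.search(text)
        record[field] = match.group(1).strip() if match else None
    needs_review = any(v is None for v in record.values())
    return record, needs_review
```

The division of labor described above falls out of the flag: clean filings flow straight into the database, while anything the parser cannot resolve lands in a human audit queue.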

Statefilings.com Expands the Scope of Automation

Perr&Knight’s StateFilings.com shares a similar history, but took automation even further. When StateFilings.com launched in 2003, we manually entered filings, objections, responses, and all correspondence into the system. Then we adapted the Auto-Indexer’s parsing technology to automate much of the data entry.
Further building on our process, Perr&Knight began talking with the NAIC, ultimately becoming the first vendor to integrate a new RESTful API developed by the NAIC into our StateFilings.com software.
Not only did this drastically reduce the amount of uploading and manual labor required to enter data, but the updates were virtually instant. The API also gave us easy access to granular filing data. For example, forms and rules could now be broken out from the filings. As such, Perr&Knight was the first company with an automated, real-time forms library and rule library.
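Breaking forms and rules out of filing payloads into separately searchable libraries amounts to a simple flattening step. A sketch assuming a hypothetical payload shape – not the actual NAIC API schema:

```python
def build_libraries(filings):
    """Split filing payloads into separate forms and rules indexes.

    Each filing is assumed to be a dict with an "id" plus optional
    "forms" and "rules" lists (illustrative structure only). Every
    entry keeps a back-reference to its source filing.
    """
    forms, rules = [], []
    for filing in filings:
        for form in filing.get("forms", []):
            forms.append({"filing_id": filing["id"], **form})
        for rule in filing.get("rules", []):
            rules.append({"filing_id": filing["id"], **rule})
    return forms, rules
```

Once flattened this way, forms and rules can be indexed and searched on their own, independent of the filings that delivered them – the essence of a real-time forms and rule library.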
Our clients could now access and search DOI documents and company forms instantly from any web-enabled device. The fees our clients paid to license the software were offset by time savings and ease of searching and segmenting data from a single, cloud-based location.

The Future of Automation at Perr&Knight

In the coming years, we envision increased use of automation for two-way data exchange.
As of right now, the SERFF API lets us extract filing data, but the information flow is limited to one direction. With two-way integration, we’ll begin to automate the filing creation process. Imagine one-click Bureau adoption filings and auto-generated actuarial support for rate change filings.
Perr&Knight is continuing to develop software tools that will ultimately become a bridge between StateFilings.com and an insurance company’s IT systems, eliminating the need for manual handoff and reducing the chance of errors.
Working with rate filing teams, actuaries, and IT departments, we’re developing and brainstorming new software and systems that offload more time-consuming burdens to machines, so valuable human teams can direct their focus where it’s needed most.

Looking for ways your company can streamline state filings or other operational procedures? Our insurance technology experts are here to help.

The Race to Autonomous Vehicles

The $2 trillion global automotive industry is ripe for disruption from autonomous vehicle technologies that make driving safer, more energy-efficient and more convenient. Driver error causes more than nine out of ten crashes. Autonomous vehicles are robots on wheels that eliminate errors of driver perception, distraction and incapacitation. While cybersecurity risk poses a safety threat, there is little doubt that robots can drive better than humans under normal conditions. Most autonomous vehicles are powered by eco-friendly, zero-emission electric batteries, and they are designed to drive safely and efficiently. Autonomous vehicles offer limitless opportunities for convenience by turning the driver into a passenger.
Following several years of product and strategy improvements, along with progress in gaining regulatory approvals for road testing, the major players are emerging in the race to commercialize fully autonomous vehicles. To name a few: Waymo, which started as Google’s self-driving car project, now operates a self-driving taxi service in Phoenix, Arizona, with a pilot program for employees in California, and is the largest active self-driving company in terms of daily miles driven. General Motors’ Cruise provides an autonomous ride-hailing service for its employees in San Francisco and recently unveiled plans for its fully autonomous Origin, which has no steering wheel or pedals. Volkswagen and Ford have made large investments in self-driving software company Argo AI, with plans to implement Argo AI’s software in new vehicles in the early 2020s. Uber is heavily investing in replacing its human fleet with a driverless one. Startups like Optimus Ride and Pony.ai have launched self-driving ride-hailing services in designated areas such as Brooklyn’s Navy Yard.
These companies have really smart people, breakthrough technologies and deepening pockets. And they are all watching Tesla whiz by in the race to commercialize autonomous vehicles.
Here are several reasons why.

ADVANCED TECHNOLOGY WITH NO BOUNDARIES

Tesla’s autonomous vehicle system primarily uses cameras to identify stationary and moving objects in the vehicle’s surroundings. Radar and other sensors help it see in dark and adverse weather conditions. The Autopilot system is a standard feature and currently qualifies Teslas for SAE Level 2 Automation, meaning the vehicle can control steering and acceleration under certain conditions, but the driver must supervise and be ready to intervene at all times. Tesla deploys the hardware needed for self-driving in all of its vehicles sold to consumers, and it uses that hardware to train artificial intelligence systems called neural networks, which are designed to automatically improve with new data.
Virtually all major players except Tesla are using LiDAR technology to build autonomous vehicles. LiDAR is a sensor system that measures reflections from laser pulses to build a 3D representation of the environment around the vehicle. Geofencing is used to define spatial boundaries, and detailed maps of the terrain and objects within the geofence are developed. The self-driving car projects the sensor data on top of the map to gather information and determine the safest path.
Proponents of LiDAR argue the technology is crucial to reliably assess and measure the environment around the car in all conditions. Argo AI describes a “street-by-street, block-by-block” mindset[1] underlying their LiDAR-based technologies to make self-driving vehicles safe and accepted by society. The goal of this approach is SAE Level 4 Automation, which does not require any human intervention in limited spatial areas.
Elon Musk, Tesla’s founder and CEO, criticized the use of LiDAR in autonomous vehicles at Tesla’s 2019 Autonomy Day event. “In cars, it’s freaking stupid. It’s expensive and unnecessary…once you solve vision, it’s worthless. So you have expensive hardware that is worthless on the car.”[2] He has a point. Although the per-unit cost of LiDAR is dropping, it still adds a few thousand dollars per vehicle. Researchers at Cornell University found that cameras can detect objects with nearly the precision of LiDAR at a fraction of the cost[3]. Also, expanding LiDAR-based systems requires geofencing and mapping each new community, which is costly and slow, whereas camera-based systems can be deployed in cars anywhere in the world. Musk’s goal for Tesla is SAE Level 5 Automation, which requires no human intervention and has no spatial limitations.

TESLA HAS THE DATA

Training a self-driving car requires a lot of data. Tesla has over 3.3 billion Autopilot miles and 22.5 billion miles in Tesla vehicles[4] from its fleet approaching 1 million units sold worldwide. On an average day, Tesla collects approximately 650x more driving data than Waymo.[5] Tesla feeds the vast amount of data it is collecting into its advanced neural networks, which use the data to improve the vehicle’s ability to predict common behaviors as well as behaviors for rare situations that are difficult to simulate. Although Autopilot is currently intended only for use on highways, Tesla is using the data it gathers in all environments to train its cars how to handle intersections, traffic lights and pedestrians.

VERTICAL INTEGRATION FOSTERS INNOVATION

Many autonomous vehicle companies are partnering with automotive companies to implement their self-driving platforms in new vehicles. Waymo has equipped several types of cars with its self-driving equipment. Argo AI partnered with Ford and Volkswagen to roll out its autonomous vehicle technology in both the U.S. and Europe. Daimler has partnered with Baidu to install Baidu’s Apollo program, an open-source autonomous vehicle platform, on Daimler’s Mercedes-Benz vehicles to test self-driving in Beijing, China.
Tesla is both an automotive company and an autonomous vehicle company, allowing it to fully integrate hardware and software autonomous vehicle specifications into its vehicle design and build processes. Large automobile companies typically source their parts from suppliers all over the world who can meet their quality demands at the lowest cost. Tesla learned the dangers of a global supply chain the hard way when its Model X deliveries fell far short of demand in early 2016 because of a parts shortage at a supplier. Tesla has since moved many parts manufacturing operations in-house, which has led to new types of batteries, seats, motors, windows and other parts that differentiate Tesla from the competition. Bringing parts manufacturing in-house allows Tesla to be flexible and nimble in pushing improvements into its products. Musk noted Tesla pushed 20+ improvements per week into the product development process of the Model S[6]. Tesla’s culture of continuous improvement is key for automation, where iterative development is required to make driverless cars safe.

SELF-DRIVING TESLAS ARE STILL PERSONAL AUTOMOBILES

Most autonomous vehicle companies intend to provide ride-hailing services. These companies are making big bets on the future of shared vehicles, but they don’t have much choice. Consumers do not want to buy a personal automobile that cannot operate outside of the town’s geofence, and LiDAR-based systems are costly equipment to pass on to the consumer. A vehicle-sharing model makes sense in highly congested urban areas where parking space is limited, but it will not displace personal automobiles anytime soon. Car owners value the accessibility and independence of having their own vehicle. Also, in the new era of social distancing and extra health safety precautions, vehicle sharing and ride sharing face serious headwinds.
In contrast, when Musk and the regulators determine Tesla’s fully autonomous vehicle technology is safe for use, a simple over-the-air software update can transform Tesla’s automobile fleet into a fleet of driving robots with human-driver capabilities.
In 2019, Musk predicted Tesla’s self-driving vehicle technology would be feature-complete by the end of 2020. While this timeframe seems overly aggressive, I hesitate to doubt Musk. After all, one of Musk’s other companies, SpaceX, just became the first private company to send humans into orbit, and the company is seeking to send humans to Mars and beyond. Compared to space travel, teaching robots to drive safely at 55 miles per hour is a manageable problem.
In reality, there will be room for many winners in the autonomous vehicle market. Global automakers like Volvo, BMW, Nissan and Toyota have stumbled out of the gates in building self-driving vehicles, but they continue to invest and will not be far behind. Ride-hailing startups could shift consumer preferences on car ownership if people are able to order a ride on their phone anytime, anywhere. Autonomous vehicles have the potential to be used for a multitude of purposes including for commercial cargo transportation and in vehicles used for urban commuting or long-distance transit.

The time is now to start planning your insurance needs for the autonomous vehicle age. Contact our product development and product design experts for help.

 
[1] https://www.argo.ai/2019/09/the-argo-ai-approach-to-deploying-self-driving-technology-street-by-street-block-by-block/
[2] https://www.theverge.com/2019/4/24/18512580/elon-musk-tesla-driverless-cars-lidar-simulation-waymo
[3] https://www.therobotreport.com/researchers-back-teslas-non-lidar-approach-to-self-driving-cars/
[4] https://lexfridman.com/tesla-autopilot-miles-and-vehicles/#:~:text=The%20following%20is%20a%20plot,Tesla%20vehicles%3A%2022.5%20billion%20miles
[5] https://towardsdatascience.com/why-teslas-fleet-miles-matter-for-autonomous-driving-8e48503a462f
[6] https://www.caradvice.com.au/367472/tesla-model-s-gains-20-engineering-changes-per-week/

Why Stat Reporting Shouldn’t Be an Afterthought

Authors: Jason Hudson, Principal Director, Statistical Reporting Services, and Mark Nawrath, Principal Director, Account Management
When insurance companies prepare to implement new software for policy and claims administration, regulatory reporting of the data captured is often an afterthought. What appear to be turnkey systems often turn out to require more retrofitting and configuration than initially expected to meet statistical reporting requirements, resulting in a larger investment and a longer timeline to launch.
Here’s why it’s necessary to consider statistical reporting needs throughout the entire development and implementation process.

Modern systems are powerful and flexible…and they require more configuration

Legacy technology (early mainframe systems) demanded a ton of programming to account for every possible scenario required for policy and claims administration. Building the complex logic required to encode, transform and format data into compliant statistical plan formats was an assumed part of the implementation process.
However, when client-server technology started to take off in the 1990s and 2000s, new client-server-based technology vendors decided not to invest in complex logic to comply with statistical reporting mandates. These new products were like warm Jell-O waiting to be molded: they had the ingredients to perform policy and claims administration processing, but required heavy configuration and customization, not just to write insurance business (that is, all the transaction sets in a business life cycle: endorsements, changes, renewals, cancellations, etc.), but to conform to the regulatory mandates for statistical reporting. It was up to insurance companies to make sure they were covered.
In plain terms: many of today’s systems are set up for collecting information, but how they store data on the back end is not designed to meet statistical reporting requirements.

Set yourself up for stat reporting success

In the rush to get new products to market, insurance companies often get caught up in launching their new system (or product or policy) as quickly as possible. Today’s client-server-based systems are not less capable than previous systems; they’re just more malleable. In providing insurance companies with more flexibility, the vendors put the onus on the insurance companies to configure their systems to perform and ensure compliance. Unfortunately, companies tend to focus on business functions (product rating, forms, coverages, claims handling, etc.) and overlook the importance of collecting and formatting the specific transaction sets and data points needed to meet regulatory standards. This is why it’s so crucial to consider statistical reporting requirements from the very outset of your new technology implementation. Here are some strategies that work:

  • Involve the right people from the start

Bringing statistical reporting compliance stakeholders to the table late in the game increases the odds of revealing functionality and data needs previously unaccounted for in the implementation specifications. Therefore, it’s important to have statistical reporting subject matter experts work together with experts in rating, underwriting and claims early in the process to understand what products, coverages and claim events are contemplated and to define the required transaction sets and data elements.

  • Account for configuration in your budget

Because of the heavy amount of programming required for legacy systems, it was very difficult to ascertain exactly how much was invested in programming for statistical reporting. These days, it’s easier to identify and quantify. Avoid sticker shock on the final project by earmarking a section of the budget for statistical reporting requirements definition, configuration and testing.

  • Clarify your specifications

Identify the statistical file generation processes you’re currently using to inform your needs for your new system. From there, generate a comprehensive list of specifications and make sure they are reviewed by the teams who will be responsible for statistical reporting. Statistical reporting subject matter experts and third-party reporting consultants can come in handy here, as they can make you aware of current industry best practices and other information “you don’t know that you don’t know.”

  • Produce usable test policy data

One part of the transition that is often overlooked is the availability of “production-like test data,” which is essential to ensure completeness in the encoding/transformation process and is often required for bureau electronic testing certifications. A number of statistical agents and rating bureaus require you to compare the captured and encoded statistical data (for risks, coverages, policy and claims transactions) to what is being produced on the front end for the insured. That means classifications of business, coverages offered, rates and premium amounts must be replicated exactly on the back end for the statistical reporting process. Account for this in your roll-out plan and dedicate appropriate resources to it.
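That front-end-to-back-end comparison lends itself to automation during testing. A minimal sketch of such a reconciliation check, using hypothetical field names and records invented purely for illustration (not any bureau’s actual statistical plan format):

```python
# Hypothetical reconciliation check: does the statistical extract mirror
# what the front end produced for the insured? Field names and records
# below are invented for illustration only.

FIELDS = ("class_code", "coverage", "premium")

def reconcile(front_end, stat_extract, fields=FIELDS):
    """Return (policy key, description) pairs where the back end diverges."""
    mismatches = []
    for key, front in front_end.items():
        stat = stat_extract.get(key)
        if stat is None:
            mismatches.append((key, "missing from statistical extract"))
            continue
        for field in fields:
            if front[field] != stat[field]:
                mismatches.append(
                    (key, f"{field}: {front[field]!r} != {stat[field]!r}")
                )
    return mismatches

front_end = {"POL-001": {"class_code": "0042", "coverage": "BI", "premium": 1250.00}}
stat_file = {"POL-001": {"class_code": "0042", "coverage": "BI", "premium": 1200.00}}

for key, issue in reconcile(front_end, stat_file):
    print(key, issue)  # flags the premium discrepancy on POL-001
```

Running a check like this across a full set of production-like test policies surfaces encoding and transformation gaps before a statistical agent does.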

  • Don’t rely on your data warehouse for stat reporting

Data warehouse solutions are typically not architected to satisfy statistical reporting mandates (including rating, premium and claims detail at the line/subline/coverage/transaction level, policy and endorsement form data, and onset/offset entries for regular and out-of-sequence endorsements). Rather than stretching your data warehouse beyond its design, let your statistical reporting experts and programmers work with the native data that comes from your policy and claims administration systems.

Create a comprehensive game plan

Develop a proactive strategy to test how your system will issue policies and transactions once policies are in force. Don’t make the mistake of testing front-end functionality without an end-to-end review of how those policies and claims compare to the statistical data you will also be collecting on the back end.
Involving an outside insurance reporting and development consultant to guide you through the process can be especially valuable here. At Perr&Knight, we offer workshops as a part of our Statistical Reporting Solution service offering. These workshops involve all relevant stakeholders and cover key topics that will inform the game plan that guides you forward.
In these multi-day workshops, we discuss with your teams how to interlace statistical reporting file creation and testing processes into your information technology objectives, the risks associated with delivering an improved statistical reporting capability, and the coordination of your team and third-party participants to schedule projects related to implementing an enhanced statistical reporting solution. The final deliverable is an evaluation of strengths and potential vulnerabilities, plus a plan for moving ahead that includes clear roles and responsibilities, cost projections and duration estimates. Whether you decide to partner with us or not, you end up with an impartial implementation strategy that you can use as a roadmap.
Because statistical reporting is not seen as a revenue-generating aspect of the business, it’s often overlooked during technology development. However, overlooking it only short-changes you at the back end of your project implementation, as teams scramble to retrofit new systems to meet statistical reporting mandates. Instead, keep statistical reporting requirements in mind from the start and save yourself the headache of costly corrections later – or fines for non-compliance.

Want to discuss how to make statistical reporting more manageable for your in-house staff? Our insurance technology experts can help.

InsurTech: The New Frontier for A&H

As troves of data and lightning-fast processing capabilities become increasingly available to insurance companies, cumbersome manual processes are being replaced with faster, more advanced data capture and analysis. The applications for property and casualty insurance, particularly with personal home and auto coverage, were evident straight away; therefore, P&C providers quickly began utilizing innovative technologies from InsurTechs to streamline their workflows, increase rating accuracy, and improve the customer experience.
These technologies are now starting to expand to additional insurance types, ushering in an exciting new era for accident and health coverage providers as well.

InsurTech’s new tools and new opportunities

As millennials and Gen Z buy homes, start families and advance their careers, their needs for insurance increase. However, these emerging customers are unwilling to compromise on the speed and accessibility of any products they buy – including insurance. Therefore, the traditional methods of over-the-phone insurance sales and person-to-person broker relationships no longer apply. These customers demand control, transparency and ease. They want to complete transactions with a few clicks.
They are also accustomed to an unprecedented level of control and customization in their own lives. Non-traditional career trajectories, home-ownership as a second income stream, greater flexibility with travel and work schedules…all add up to a clientele that demands fast, flexible coverage that conforms to their specific needs. This often means shorter coverage periods, specific add-on coverage, and instant payment – again, all accessible via website or smart phone app.

The changing face of A&H

Traditionally, insurance product development for accident, health and travel has adopted a “one size fits all” approach, offering protection that covers a wide variety of scenarios over an extended period of time. However, new technologies enable A&H coverage to achieve an entirely new level of customizability that can provide customers with exactly what they need, only when they need it. Some forward-thinking examples of InsurTech applications for A&H that we have seen include:

  • Travel Insurance
  • Short-term Accidental Injury coverage for specific activities
  • Customizable Supplemental Health Insurance plans such as Critical Illness
  • Major Medical price transparency comparisons
  • Health benefits packages for gig economy workers

This level of tailoring serves customers more effectively, generates new product potentials, and creates efficiencies that ultimately lower internal operations costs for insurance companies.

Apps, IoT and AI – oh my!

InsurTechs have evolved many aspects of today’s insurance industry, but we have seen the most advancement to A&H in the areas of smartphone apps, the Internet of Things (“IoT”), and Artificial Intelligence.
Insurance companies are finally beginning to recognize the value of smart phone apps in connecting with their customers. Mobile technologies are invaluable to insurers, enabling more efficient product marketing, a direct point of sale, and the ability to collect data from wearables. These streamlined products and advanced data collection can reduce or even eliminate the need for underwriting. The result for insurance companies: more efficiency for a lower cost.
“Smart devices” that connect to the internet and transmit data over a network are known collectively as the Internet of Things. These devices work quietly in the background to collect and transmit data that can help insurers provide more accurate premiums to customers. Some major medical insurance companies offer incentives such as premium discounts or gift cards for meeting exercise goals while wearing specific devices (think: Fitbit trackers). Insurers can now tie premiums and rewards to real data, not theoretical projections.
Finally, artificial intelligence (or “AI”) is releasing insurers from burdensome manual processes. These technologies have the ability to learn and reason, freeing up their human counterparts to focus on areas that require more complex reasoning or subtle discretion. Insurance companies have successfully used AI to develop chatbots that streamline the customer service experience and applied machine learning to build more accurate algorithms and models for analyzing data. By applying machine learning to predictive analytics, insurance companies can analyze key consumer data for claims risk, fraud detection, anticipated demand for a new product, claims processing and underwriting. This could lead to better rate adequacy and a better overall risk profile.
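To make that concrete, here is a deliberately tiny sketch of the idea: a linear classifier (a perceptron) learning to flag potentially suspicious claims. The features, scaling and labels are synthetic assumptions for illustration, not any insurer’s actual model, and production systems would use far richer data and algorithms:

```python
# Purely illustrative: a tiny perceptron learns to flag potentially
# suspicious claims from synthetic history. Features, scaling and labels
# are assumptions for this sketch, not any insurer's actual variables.

def normalize(amount_usd, days_since_inception, prior_claims):
    """Scale raw claim attributes into comparable 0-1 features."""
    return (
        amount_usd / 20_000,                        # claim size
        max(0.0, 1 - days_since_inception / 365),   # how soon after policy start
        prior_claims / 5,                           # prior claim history
    )

def predict(weights, bias, x):
    return 1 if sum(w * v for w, v in zip(weights, x)) + bias > 0 else 0

def train(rows, labels, epochs=50, lr=0.1):
    """Perceptron rule: nudge the weights whenever a prediction is wrong."""
    weights, bias = [0.0] * len(rows[0]), 0.0
    for _ in range(epochs):
        for x, label in zip(rows, labels):
            error = label - predict(weights, bias, x)
            weights = [w + lr * error * v for w, v in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Synthetic history: (amount, days since policy start, prior claims).
claims = [(1_200, 400, 0), (900, 700, 1), (15_000, 10, 3),
          (18_000, 20, 4), (2_000, 365, 0), (16_500, 15, 2)]
labels = [0, 0, 1, 1, 0, 1]  # 1 = flagged in a past manual review

rows = [normalize(*c) for c in claims]
weights, bias = train(rows, labels)

# A large claim filed days after inception gets flagged for human review.
print(predict(weights, bias, normalize(14_000, 12, 2)))  # → 1
```

The point of the sketch is the workflow, not the algorithm: historical outcomes train a model, and the model then scores incoming claims so adjusters can prioritize the risky ones.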

Control the risks

Emerging technologies are already transforming the insurance industry, but regulation is still woefully behind the curve. Though coverage offerings are more flexible than ever, insurance product development is still subject to a rigid regulatory environment. Regulation of coverage periods, marketing materials, and underwriting processes are still rooted in traditional ways of thinking.
Additionally, InsurTechs may bring the technological expertise, but they often lack industry-specific knowledge. They usually do not even have an underwriting company or reinsurer to take on the insurance risk. This can come back to haunt you if you’re not careful. With this in mind, it’s smart to connect with an experienced independent insurance product development partner to manage regulatory requirements as you incorporate new technologies into your product suite. Their expertise regarding the jurisdiction-by-jurisdiction requirements will be invaluable as you head into the approvals process.
InsurTechs are set to make sweeping changes across the insurance industry as their technologies provide opportunities for insurance companies to respond to never-before-seen coverage needs. These innovations are not trends – they’re here to stay. As data collection and analysis evolve, A&H insurers are positioned to develop systems and products that feature faster policy uptake and fulfillment, greater flexibility in coverage, and increasingly targeted customer service.

Want to know more about how technology can advance your A&H offerings? Our team of insurance and actuarial experts can help.

Predictive Modeling: 5 Benefits of an Independent Review

The practice of predictive modeling is a powerful risk assessment tool for today’s insurance industry. What was once considered a new technique for insurance pricing is now being used in all aspects of the industry.
Your models are only as sound as the industry knowledge that goes into their development. Meanwhile, the lack of complete regulatory support for predictive models has slowed InsurTech companies and carriers on their path to regulatory approval.
Instead of dealing with the expensive and time-consuming fallout of stalled approvals, it makes more sense to get ahead of potential pitfalls by investing in an independent review of your model from experienced insurance actuarial consulting experts.
Here are five reasons an independent review of your predictive model is worth the investment.

1. Discover your model’s strengths and weaknesses

Independent review by actuarial consulting experts will reveal areas where your model can be improved as well as verify its biggest strengths. A review that combines proven industry benchmarks with professional actuarial judgment will surface erroneous assumptions and incomplete support, leading to model improvement.

2. Comply with state regulations

Many predictive models have been rejected by state insurance departments due to lack of compliance in a given jurisdiction. Each state has its own unique regulations, and you want to be prepared. By partnering with an independent reviewer who knows the nuances of each state’s regulatory process, you’ll strengthen your chances of approval.

3. Strengthen your case with key decision-makers

Achieving buy-in from the customer is crucial when marketing an InsurTech predictive model to insurance carriers. Though your model may perform impeccably, if your company has a limited track record in the insurance industry, it may be a hard sell to the carrier’s executive team. Getting an independent review with comprehensive documentation will demonstrate to decision-makers that your product has been carefully evaluated by insurance industry professionals. This vetting of your model, and the accompanying written proof, may be the deciding factor between your product and a competitor’s.

4. Increase your speed to market

Presenting your model to regulators without thorough pre-submission scrutiny may reveal surprise shortcomings. Discovering these deficiencies while your model is deep into the review process adds unnecessary time. It’s much smarter to pressure-test your model before submitting to state insurance departments to speed up approval for your model’s implementation.

5. Trust in your results

Your data may support strong predictors used in your model, but to be truly effective, results must be combined with subject matter expertise. Insurance experts who understand all steps in the insurance process give you insights for model improvement.
High-level assessment of your model’s viability, paired with detailed scrutiny from subject matter experts who specialize in insurance, is a smart way to protect your investment. An independent reviewer will ask tough questions and follow best practices for predictive modeling to assess your methodology, adding credibility and strength to your work product. It’s like investing in “insurance” for your insurance product.

Get your independent predictive model review today! Perr&Knight’s experienced actuarial consulting team can help.

How to Make the Most of Your Software Evaluation Period

Determining whether a new software system will meet your company’s needs may seem like a gamble.

  • How can you tell if this new product will really streamline your workflow?
  • Will it actually increase process efficiency?
  • Can you guarantee that it will add value to your organization?

Vendors understand that you’ll need to get your hands on the product before you can make an informed decision. Luckily, some offer a time-boxed “trial” period where you and your team have a chance to apply the software to your workflow and obtain a much sharper perspective. By taking full advantage of this evaluation period, you’ll be able to determine whether or not the solution will work for you.
Here’s how to make the most of your evaluation period before investing in full.

Take time to prepare. Don’t just dive in.

After identifying software that seemingly checks all the boxes, it’s tempting to want to begin trying it immediately. However, preparation is a crucial step that lays the foundation for an accurate assessment. Here are some less obvious things to do in order to be thoroughly ready to analyze the new product:
Develop test cases: Include both standard transactions and anomalous outliers, as well as things that are on your team’s “wish list.” Identify and document an array of scenarios such as initiating records from scratch, conducting tests on reporting at the beginning/middle/end of the workflow, and interfacing/importing external records.
Use existing data: If your prospective software will replace an existing system, use your trial period to evaluate how the new solution works with your existing data. Your IT department can provide the vendor with a detailed list of the types of data objects you’ll need to manage, which can be used to populate the new system with your current data set. Even if you must scrub some of your data before sharing it with the vendor, the more you can provide, the more accurately they can assess your needs. If you license the software, the data from the evaluation period may be reused in your implementation of the product.
Gather requirements from your legacy system: Your current system was built on a set of requirements and design documents. If you want your replacement system to mimic those assumptions, share these details with your vendor. This information will help them determine if the system will meet your needs as-is or if it will require customization.

Treat it seriously, like a project

Though your evaluation period might not cost anything upfront, there is a very real opportunity cost if the software ultimately doesn’t deliver or if you dismiss a quality product without sufficient review. Therefore, treat your software evaluation period like a mini-project. Dedicate the same level of attention as if your company had already invested capital.
Create a one-page “mini” charter that defines important specs such as: What is the purpose of this project? What is its perceived benefit? Who is assigned to work on it? Who ultimately “owns” it? What are the roles and responsibilities of each individual or team? What is the scope of the project? Developing an official “work order” enables you to obtain signoff for teams to dedicate their time to evaluating the software, thus increasing the chance that all parties will take the evaluation period seriously.
As part of defining the scope of this mini-project, don’t hesitate to bring in an outside subject matter expert who can springboard you into software usage to accomplish your goals straight away. Their detailed answers about your specific questions can save you plenty of time that may otherwise be wasted on back-and-forth between you and your vendor.
Finally, be sure to hit your dates. All projects have a timeline and this one should be no different. Work with your vendor to establish an achievable schedule that will keep your teams on track. During evaluation periods for our own state filings software, StateFilings.com, we create a week-by-week plan that outlines what we will accomplish, from training to full implementation. A timeline creates a necessary structure that keeps everyone on track and enables teams to identify any hiccups early on.

Ask plenty of questions

Don’t make assumptions about what the software may or may not be able to do. If it is important to your particular process and the answer is not apparent, ask! Be specific about what you need. By describing your workflow requirements in detail, the vendor can train you on how to perform your desired task, explain how the software handles the requirement in a more efficient manner, or determine that the software may need further customization to meet your needs. Don’t be afraid to ask questions that may seem obvious. It’s faster and easier to confirm that your needs will be met than to worry whether or not the software can perform the basics.
Additionally, feel free to ask the vendor to share the product road map with you, which will detail upcoming features. This might answer some of your questions about functionality and may reveal opportunities for you to improve your own workflow when new features are launched.

Determine how close you are to going live

If you present your vendor with a full set of current data, going live may be as easy as “flipping a switch” after signing the licensing agreement. If the data is only a partial set, or requires additional software customization, fully operational real-world use may take longer. Work with your internal teams and vendor to determine what functionality is mission-critical and what can be postponed until after launch. Strategic configuration during the evaluation period can minimize implementation delays. Perr&Knight’s StateFilings.com evaluation period has resulted in many near-immediate go-live implementations for customers.

Help your vendor help you

Your vendor wants to do more than just sell you a product. Ultimately, they want to solve your problem. Make sure that your teams are not impeding the process. When the vendor asks for information, keep turnaround time to a minimum. Establish a regular meeting schedule (once a week is preferable, but bi-weekly meetings can suffice) and keep communication clear and frequent. As mentioned above, if you have questions, ask them ASAP. Finally, if it becomes clear that the software solution is not what you expected and will not meet your needs, let your vendor know immediately. Instead of wasting everyone’s time, it’s better to move on.
The goal of your software evaluation period is to reduce surprises further down the line. Pressure-testing the system from all angles will provide a clear perspective on what the software can and cannot do. By taking the trial period as seriously as you would any other project, you stand the best chance of achieving your ultimate goal: implementing a smarter solution for your organization.

Want to know if your proposed insurance software will actually perform? Our insurance experts can help you better capitalize on your evaluation period.