From data to disclosure: a disaster or a delight?


I have experienced many different actuarial tools, systems, and processes over the last 20 years. I missed the jump from physical to digital, but I still got to experience running highly model-pointed data on low-memory 32-bit desktop computers, then distributing runs across machines stacked in the corner. Now I have the luxury of on-demand cloud computing with full data and granularity available.

By Andrew Blackburn FIA, Principal, Insurance & Longevity Consulting at Barnett Waddingham

 New actuaries won’t know what it was like to do late-night shifts or weekend office drop-bys to check on runs when remote access wasn’t a thing and running out of hours was the most efficient method to hit deadlines. I’ve looked through reams of data on printouts to identify issues, updated assumptions cell by cell, pasted results from one table to another and re-started so many runs due to memory limits that even my memory has hit its limit.

 Overall, not the most delightful of experiences.

 So, given we are now in an age of automation, constant connectivity, and potentially infinite computing power, why does it still take insurers months to produce the information required by businesses, the market, and the regulator? Why does it take hundreds of resource days to produce all of this in an apparent information vacuum to the outside world?

 Are the insurers, the software vendors or the regulatory and professional bodies holding back innovation? Can anything be done differently to improve the experience of processing data through to disclosure? In this blog, we’ll explore all of these questions, as we look at some of the possibilities for change.

 Why are we where we are?

 Insurers

 The insurer’s remit:

- Value the business in line with the regulations to provide protection to customers.
- Determine, understand, and manage the risks appropriately.
- Extract value for investors.

If the insurance company in question has always managed to comply with the regulations, the auditors are satisfied, and investors achieve desired returns, why should it change anything?

New regulatory and industry approaches may not be compatible with the existing set-up. Developments that were originally sandboxed in Excel and only supposed to be used temporarily have been subsumed into the BAU process, leaving half the process taking place in Excel with a few heavy models in the middle. Departments may also have been allowed to choose their own software and hardware, creating disparate silos with inconsistent data formats, audit trails and review processes.

The Finance and Actuarial functions are vital cogs in the insurance machine, but because they are back-office functions they are often overlooked as places where improvements are required, or where innovation could improve time efficiency, reduce costs, and increase understanding.

 Vendors

 The software vendor’s remit:

- Provide valuation software and services to clients that are fit for purpose.
- In return, receive a fee commensurate with the deliverables.

 The first valuation systems were designed for a single user, single machine, single run. These systems cost a lot to develop initially and so fees were charged in accordance with that - a large upfront fee with a recurring annual fee and then, as new features, new functionality and access to more hardware became available, additional fees were added on top.

These pricing structures created significant barriers to change: paying existing fees, implementation costs and any additional fees meant potential cost savings were easily wiped out. Couple this with IP ownership of code, models embedded in workflow processes alongside data, and analytics and reporting that can't access third-party offerings, and yet more hurdles are placed in the way. The result is that companies are running decades-old models containing layers of unknown development as part of processes, using elements which are no longer best in class.

Yet despite all these hurdles, new software offerings are still being created with new approaches and open connectivity, embracing new technology and trying to improve the processes required.

 Regulatory & Professional Bodies

 The regulator’s remit:

- Ensure customers are protected.
- Ensure companies comply with the regulation.
- Continually review regulation and suggest new methods, approaches and practices given the latest research, market changes or technological developments.

Invariably, changes to regulation create the need for development within companies, and this drives investment in new technology and skills.

The move from commutation factors to cashflow modelling, and the push towards multiple shock runs and the stochastic modelling that underpins Solvency II and IFRS 17, drove companies to upgrade and invest where they could. The actuarial profession's use of open-source modelling as part of the syllabus in recent years ratified open source as part of the toolkit. But after years of underinvestment, limited resource, and budgets stretched by a changing economic environment, the mark is sometimes missed as to what can and should be upgraded or replaced.
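To make the contrast concrete, here is a toy Python sketch of the cashflow approach: valuing an annuity by projecting survival-weighted payments year by year. The mortality rates and discount rate are purely illustrative, not drawn from any standard table.

```python
def annuity_value_cashflow(qx, rate=0.03):
    """Value an annuity of 1 p.a. paid in arrears by projecting
    survival-weighted cashflows year by year - the approach that
    cashflow models generalise to full product complexity."""
    value, survival = 0.0, 1.0
    for t, q in enumerate(qx, start=1):
        survival *= (1 - q)                  # probability of surviving to time t
        value += survival / (1 + rate) ** t  # discounted expected payment
    return value

# Illustrative mortality rates for a short projection
qx = [0.01, 0.012, 0.015, 0.02, 1.0]  # final year forces termination
print(round(annuity_value_cashflow(qx), 4))
```

A commutation-factor approach would pre-tabulate the discounting and survival terms; projecting the cashflows explicitly is what lets the same engine handle shocks and scenario-dependent assumptions.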

But if it wasn’t for this continual updating of the regulations, which creates opportunity, I don’t think we would have seen the advances in technology that we have experienced.

The path to discovery and delight

So what is the aim? What should companies be looking for, and is it even possible? I think companies are looking for the seemingly conflicting scenarios of technology that is:

 Open but secure: They want to have access to a range of skilled resource, not pay large fees or be locked in for many years, but need to know it has the correct functionality and is supported should questions arise.

 Quick but cheap: Be able to run in minutes or seconds, not hours, by leveraging the use of on-demand infrastructure - yet doing so at a reasonable cost.

Accurate but understandable: Can model the complexities of the products and regulations in force, with the ability to easily analyse the logic and extract numbers to understand and explain the results and any movements.

Are these scenarios even possible? I think so, given the major advancements and developments that have been made recently, specifically in areas such as:

Investment in, and accessibility of, cloud computing: Since the global pandemic, many companies that were resistant to change are embracing the move to cloud computing. AWS and Azure are leading the way in offering off-site, on-demand infrastructure, with Google, IBM, Salesforce and Oracle providing alternatives.

Embracing open-source offerings with a shared community of coders: Many of the open-source languages in use have existed for many years, but with the increase in data available, the use of open source such as Python and R as data-science tools has introduced actuaries to options other than vendor-driven software and brought in more developer-minded individuals.
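As a flavour of the kind of task actuaries now script directly in open-source tools, here is a minimal Python standard-library sketch that filters and de-duplicates a messy policy extract. The file layout and field names are hypothetical, inlined so the example runs on its own.

```python
import csv
import io

# Hypothetical policy extract with messy fields, inlined for the example
raw = io.StringIO(
    "policy_id,sum_assured,dob\n"
    "P001, 100000 ,1980-01-31\n"
    "P002,,1975-06-15\n"          # missing sum assured
    "P001, 100000 ,1980-01-31\n"  # duplicate row
)

seen, clean = set(), []
for row in csv.DictReader(raw):
    if not row["sum_assured"].strip():
        continue                   # drop records with no sum assured
    if row["policy_id"] in seen:
        continue                   # drop duplicate policies
    seen.add(row["policy_id"])
    row["sum_assured"] = float(row["sum_assured"])
    clean.append(row)

print(len(clean), "valid, de-duplicated record(s)")
```

In practice the same few lines of scripting replace what was once manual scanning of printouts, and the checks become a repeatable, auditable part of the process.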

Changes in approach by vendors, with licensing and costs becoming more flexible and reflecting the on-demand nature of computing: Vendors are now moving to more flexible subscription pricing structures, rather than a large upfront fee with a reduced annual renewal for x years. Vendors are also offering to host the solution within a partitioned cloud, further embracing new ways of working while maintaining confidentiality and security.

Vendors re-imagining implementation approaches and opening up software to third-party connections: The libraries of old are no longer the de facto approach. Bespoke, focused code developed via code generators and AI algorithms looks set to vastly reduce implementation time while ensuring minimal code-path redundancy. Models can become packaged executables in a workflow, opening companies up to a wider range of tools for workflow, data and assumption management, results storage, MI and disclosure.

So technically we are in a world where we can implement and test models more quickly than in the past, then automate the full end-to-end process from data to disclosure if we choose. Data extracts can be moved and cleansed automatically, assumption tables set up for all stresses from a single base run, thousands of scenarios created on the fly, calculations distributed to the cloud as required, and results can appear in close to real time.
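In miniature, the "distribute to the cloud" step of that pipeline might look like the following Python sketch. A local thread pool stands in for on-demand cloud workers, and the valuation function and shock scenarios are deliberately simplistic placeholders.

```python
from concurrent.futures import ThreadPoolExecutor

def value_scenario(shock):
    """Stand-in valuation: discount a flat cashflow of 100 p.a. for 10 years
    at a base rate of 3% plus the scenario's interest-rate shock."""
    rate = 0.03 + shock
    return sum(100 / (1 + rate) ** t for t in range(1, 11))

shocks = [-0.01, 0.0, 0.01, 0.02]  # illustrative stress scenarios

# In practice each scenario would be farmed out to on-demand cloud compute;
# a local pool demonstrates the same fan-out-and-collect pattern.
with ThreadPoolExecutor() as pool:
    results = dict(zip(shocks, pool.map(value_scenario, shocks)))

for shock in shocks:
    print(f"shock {shock:+.2f}: {results[shock]:,.2f}")
```

The design point is that once each run is an independent function of its inputs, scaling from four scenarios to thousands is a change of infrastructure, not of model code.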

Then we have accessible dashboards across multiple devices providing automated analysis of results with successes, failures or warnings. Email notifications along the way prompt users to review, check and approve key steps. Finally, interactive, dynamic reports are created with a combination of text, tables and charts, ready for review and editing prior to disclosure, all under a flexible and cost-efficient subscription pricing structure.

 Taking the next steps
So, if we have answered the question of ‘is there a better experience of processing data through to disclosure?’, the next question is how insurance companies can start to realise the possibilities available to them within their budget, timescales, and resourcing skillset.

BW consultants have been working with clients for many years on a multitude of projects to enhance their experience. Historically the approach tended to be a large-scale, full-time, multi-year project; now, thanks to the modularisation of the process, this no longer has to be the case.

The elements that provide the most benefit for the investment can be undertaken first, and others brought up to date over time. These elements may initially be about using new technology or software to improve the efficiency of existing processes, before shifting to implement previously untenable approaches that are expected by the regulator or are becoming industry standard, e.g. stochastic-on-stochastic or long-term granular-timestep projections.
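To give a feel for why stochastic-on-stochastic was previously untenable, here is a toy nested simulation in Python: an outer loop of real-world paths, each needing its own inner risk-neutral valuation, so the run counts multiply. The distributions, parameters and guarantee are all illustrative.

```python
import random
import statistics

random.seed(1)  # reproducible illustration

def inner_valuation(start, n_inner=200):
    """Inner risk-neutral loop: value a guarantee of 100 on a fund
    starting at `start` (toy one-step model, illustrative volatility)."""
    payoffs = []
    for _ in range(n_inner):
        fund = start * (1 + random.gauss(0.0, 0.15))
        payoffs.append(max(100 - fund, 0.0))  # guarantee shortfall
    return statistics.mean(payoffs)

def nested_run(n_outer=100):
    """Outer real-world loop: one full inner valuation per outer path."""
    values = []
    for _ in range(n_outer):
        fund = 100 * (1 + random.gauss(0.05, 0.20))  # real-world one-year step
        values.append(inner_valuation(fund))
    return values

values = nested_run()
print(f"{len(values)} outer paths x 200 inner paths = "
      f"{len(values) * 200} total simulations")
```

Even this toy already needs 20,000 simulations; realistic outer and inner counts push the total into the millions, which is exactly where on-demand distributed compute earns its keep.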

 To determine where to begin, we recommend that some key steps should be taken:

 Nurture an environment for change: Accept that hardware, software, models and processes all have a shelf life, and you can’t keep adding layer after layer of development – at some point you will need to replace elements, learn new skills and embrace new approaches.

 Undertake a review and audit of your systems and processes: Identify what you have, what are your current capabilities, what is required, what might be missing, and what more you may want to achieve.

Strategy review: Define your vision and roadmap, ensuring all parties are involved from IT, Finance, Actuarial and Business Development. Without full cross-function buy-in, the end goals may still be achievable, but it will take more time and effort.

 Put in place an initial plan: Focus on time/budget/skills/urgency/impact – rank all aspects and identify the most achievable/important/impactful aspects you would like to implement.
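One simple way to make that ranking concrete is a weighted scoring matrix. The initiatives, criterion scores and weights below are entirely illustrative, but the pattern adapts directly to a real shortlist.

```python
# Hypothetical candidate initiatives scored 1-5 on each criterion
initiatives = {
    "Automate data cleansing": {"impact": 4, "urgency": 3, "cost": 2, "skills": 4},
    "Move runs to cloud":      {"impact": 5, "urgency": 4, "cost": 4, "skills": 3},
    "Replace Excel handoffs":  {"impact": 3, "urgency": 5, "cost": 1, "skills": 5},
}

# Illustrative weights; cost is negative so higher cost ranks lower
weights = {"impact": 0.4, "urgency": 0.3, "cost": -0.2, "skills": 0.1}

def score(criteria):
    """Weighted sum of criterion scores for one initiative."""
    return sum(weights[c] * v for c, v in criteria.items())

ranked = sorted(initiatives, key=lambda name: score(initiatives[name]), reverse=True)
for name in ranked:
    print(f"{score(initiatives[name]):.2f}  {name}")
```

The value is less in the arithmetic than in forcing the criteria and weights to be agreed explicitly across functions before the plan is set.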

 Implement: Be realistic on resource, time, and cost. Ensure contingencies for uncertainties are allowed for, but also embrace the potential for some scope creep as new ideas arise as you get to understand more about the capabilities and functionality of any new tools.

Review periodically: To ensure the plan remains feasible, the whole process should never be a one-off but a continually revisited and evolving one. As these reviews become embedded in your day-to-day work, they become easier and are no longer viewed as a standalone project but as normal practice, which is a good place to be.
