Machine Learning: Mixture of Experts is the answer, but...


Recent work, led by Dugas and Bengio, has shown that mixture of experts models can be one of the best tools available to actuaries for pricing. What are mixture of experts models? Is this the end for traditional techniques? And is this finding relevant for most of us?

 Firstly, what are mixture of experts models?

Imagine you visit your GP with a stomach complaint. He has a think about whom you should see and sends you to be treated by a certain specialist. A little while later you visit your GP again, this time with a foot complaint. You would be surprised if he sent you to the same specialist. Indeed he does not, and you are sent to see someone who specialises in foot problems. Of course, sometimes your GP may not be entirely sure which specialist is best, and may send you to see two specialists and then take the advice of both.

 Why have this kind of system? Because the human body is complicated. One specialist cannot be an expert in all of it. We need local experts.

This is exactly how a mixture of (local) experts model works. The patterns we are trying to find in data are often too complex and confusing for one model to fit all the data well. We therefore train a few different local models (the specialists) over different parts of the data and then use a gater model (the GP) to decide which local model to use for the case we are trying to predict.

We do not even need to know in advance how the data should be split between the different experts: the gater and the experts are trained together, so the split is learned adaptively from the data.
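To make the idea concrete, below is a minimal sketch of a mixture of experts regressor in Python, assuming PyTorch is available. The number of experts, the layer sizes and the placeholder data are illustrative assumptions only, not the architecture used by Dugas and Bengio.

import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    def __init__(self, n_features, n_experts=3, hidden=16):
        super().__init__()
        # Each expert is a small feed-forward network predicting a single
        # value (e.g. expected claim cost) from the risk factors.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU(),
                          nn.Linear(hidden, 1))
            for _ in range(n_experts)
        ])
        # The gater (the "GP") turns the same inputs into a weight per expert.
        self.gate = nn.Sequential(nn.Linear(n_features, n_experts),
                                  nn.Softmax(dim=-1))

    def forward(self, x):
        expert_out = torch.cat([e(x) for e in self.experts], dim=-1)  # (batch, n_experts)
        weights = self.gate(x)                                        # (batch, n_experts)
        # The prediction is the gate-weighted combination of the experts.
        return (weights * expert_out).sum(dim=-1, keepdim=True)

# Experts and gater are trained jointly, so the split of the data between
# experts emerges during fitting rather than being chosen in advance.
# Placeholder data only.
model = MixtureOfExperts(n_features=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
X, y = torch.randn(256, 10), torch.randn(256, 1)
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(X), y).backward()
    opt.step()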

Dugas and Bengio found that mixture of experts models in which each expert was a neural network performed best: better than ordinary neural networks, generalised linear models, decision trees or support vector machines.

Does this mean the end for generalised linear models? Far from it. For many of us, especially those working in the London Market, we might need reasonably good models, but more importantly we need relatively simple and transparent models which can be discussed with underwriters and which stand some chance of being implemented in practice. Even in personal lines, eking out small fractions of improvement in model performance at the expense of ever-increasing model complexity is not necessarily the best way forward. If GLMs are used, though, they should be properly fitted, and machine learning techniques can help ensure this is done quickly and efficiently.
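For comparison, a claim-frequency GLM of the kind referred to above can be fitted in a few lines. This is a minimal sketch assuming the statsmodels package is available; the rating factors and data are purely illustrative.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
# Two illustrative rating factors, e.g. driver age and vehicle group.
age = rng.uniform(18, 75, n)
vehicle_group = rng.integers(1, 21, n)
X = sm.add_constant(np.column_stack([age, vehicle_group]))
exposure = rng.uniform(0.1, 1.0, n)       # earned policy years
claims = rng.poisson(0.1 * exposure)      # observed claim counts

# Poisson GLM with a log link and an exposure offset - the usual
# formulation for claim frequency in personal lines pricing.
freq_model = sm.GLM(claims, X, family=sm.families.Poisson(), exposure=exposure)
print(freq_model.fit().summary())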

Whatever your preference, at MLS our aim is to provide you and your team with the training you need to be able to apply machine learning techniques in practice. Visit our site http://machinelearningsolutions.co.uk/training/ to find out more. If you do want to make some quick initial progress, we can also help you with our consultancy services: feel free to email us at info@machinelearningsolutions.co.uk

  
