Professor: Europe must regulate ‘personalisation’ of insurance

A study from the University of Zürich looks at how insurance companies use data to assess, select, price, predict and prevent risks. Big data analytics and artificial intelligence are used in areas such as telematics in car insurance, fraud detection, and quantified-self applications for health and life insurance. This can serve the public good, but it can also mean unfairness, discrimination and violations of privacy. The university therefore offers a list of recommendations, including regulation.

“In California price optimization is illegal. But in Switzerland and Europe you can do it.”

Such were the words of Professor Florent Thouvenin of the University of Zürich, part of NRP 75, a national research programme on big data looking at insurance and data, at a conference in Zurich on 19 September discussing the research results.

By ‘price optimization’ he means adjusting the price according to what the data suggests the customer is willing to pay. In the US this is called ‘price dynamics’; in the EU it is called ‘price discrimination.’ Price differentiation probably lies somewhere in between.

“You can prohibit it or allow it with no limits – and then there are a lot of options in between. For example, it could be allowed in car insurance and house insurance and banned elsewhere,” said Thouvenin. One such option would be to allow personalisation in car insurance and household contents insurance, and to prohibit it otherwise.

At the conference there was general agreement that personalisation of health insurance is not fair: it opens the door to discrimination and undermines the solidarity principle of insurance.

According to the study, regulators in Europe need to determine under what conditions and to what extent insurance companies should be allowed to personalise their insurance contracts based on big data analytics. They should also continuously monitor and anticipate the use of data for the personalisation of insurance contracts, identify unwanted forms of personalisation, and, where needed, create specific provisions in insurance law that either prohibit personalisation or define the conditions and extent of permissible personalisation. Furthermore, the study says, insurance companies should:

  • Avoid using data sources that are not related to the insured risk, as this may undermine the customer’s trust in the products and services of the industry.
  • Demonstrate to their clients how they protect core values such as privacy, fairness or solidarity from risks posed by Big Data analytics.
  • Increase their awareness of the nature and impact of unwanted discriminatory uses of big data-based machine learning in prediction, pricing and fraud detection.
  • Adapt their general business-ethics principles for achieving accountability to the systematic handling of ethical issues resulting from the digitalisation of the industry.

Get the study here

