How appropriate is it for insurers, lenders and banks to snoop on their customers?

It’s an important question that arises after Facebook felt compelled to take action against insurer Admiral last week, banning it from viewing young drivers’ Facebook profiles to help set premiums. With their customers’ permission the insurer had planned to use their posts and likes to help assess how safe they might be as drivers, and award discounts where it deemed them appropriate.

For Facebook and many other critics, that crossed a line. However, there are already many financial products and insurance offerings that require the provider to delve deep into the personal lives of its clients. What’s more, a growing number of service providers, including landlords, are researching potential customers more thoroughly than ever before, as our ability to gather data increases.

New technology, same questions

It is not unusual for insurers and other financial services companies to seek out in-depth information about their clients. Life insurance is almost always granted only after extensive questioning about an individual’s medical history and current health, for example, including queries about their weight and even whether they indulge in casual sex.

And technology has made it easier for companies to glean more information. Many young drivers have agreed to have a “black box” fitted to their vehicles to monitor how and where they drive in order to bring down the cost of cover. That’s something that previous generations might have considered to be Nineteen Eighty-Four-style surveillance but which is now seen as a way for careful young motorists to enjoy more affordable premiums than their riskier peers.

Rod Jones, insurance spokesperson at uSwitch.com, says: “To calculate any changes to your premium on a telematics policy, insurers will typically look at how many miles you have driven as well as your speed, whether you are prone to sudden braking or sharp cornering, and the time of day you are driving.

“Some policies also track where you drive and how frequently you travel along the same route, so motorists who do the same routes regularly, and are therefore seen as a lower risk, will likely enjoy cheaper premiums. For black-box insurance to appeal to all, the telematics industry needs to make it clear to motorists exactly how their personal data is used and, just as importantly, protected. A common misconception is that this data is used elsewhere, when in fact insurers work very hard to ensure it doesn’t happen.”

Following the Mortgage Market Review, mortgage providers are also now required to carry out far more intrusive checks on applicants’ finances.

Holly Andrews, managing director of KIS Finance, explains: “Mortgage lenders are now obliged to ask far more questions, particularly about what else you spend your money on, to make sure you can afford the mortgage. As customers, we cannot decide alone that we can afford a mortgage, and advisors have to ask questions covering whether or not you have expensive hobbies, and how much you spend on gifts, holidays, clothes and entertainment.”

And it’s not just financial organisations. Research from ClearScore, a company that provides free credit scoring, found that 89 per cent of landlords and letting agents check the credit files of prospective tenants and 68 per cent say they are more likely to credit check tenants now than they were just five years ago.

If the data is available, companies will try to use it.

Dystopian discounts

It seems too that younger customers are more willing to put up with intrusive questions or even monitoring if it means they can bag a bargain. Customer engagement software company Pegasystems, which works with many global financial services providers, polled millennials in the UK and discovered that more than a quarter strongly agreed that they would be willing to provide an insurer with a regular blood or urine sample to prove their good health if it meant they could claim a discount.

Rather more dramatically, 22 per cent said they would be willing to have a chip or tracker inserted into their bodies to monitor their behaviour if it meant spending less on insurance.

Tony Tarquini is head of insurance at Pegasystems and takes a strong interest in how insurers evolve their systems to make use of social media and other data sets. He suggests that many insurers will be less interested in social media data and more keen to access information from the new wave of smart home devices, such as leak and smoke detectors.

Tarquini adds: “All credit to Admiral for attempting what I know other insurers are exploring too. Clearly you must have the consent of customers, and when there’s a hazy line it’s sensible to pause and rethink. The future of insurers lies in having a closer, ongoing relationship with customers that’s enriched by how insurers gain useful insights from the data customers share.

“Many digital businesses already deliver great customer experiences based on how they do predictive analytics of customer behaviour to make relevant offers. To succeed, insurers need to become data and customer centric businesses too. This case was an example of growing pains in a natural evolution for the sector.”

Selfies not statistics

Whatever the justification for using tangible data such as driving style, credit history or intra-body tracking devices, there are many justified concerns over the use of social media by service providers or any commercial organisation. Pam Cowburn, communications director at privacy campaigners the Open Rights Group, says: “Using non-financial data to determine financial decisions is problematic, especially when we don't know or understand the criteria that such decisions are being based on.”

That is particularly important because there is no specific regulation covering how a company might choose to interpret information that has been shared through social media platforms. A jokey selfie with a bottle of wine or a series of updates about a varied romantic life might be interpreted in very different ways by a car or life insurance provider.

Cowburn adds: “Algorithms and poor data can perpetuate social biases, for example around race, gender, religion or sexuality. We could also see people who don't use social media or share their data being treated less favourably. This could push people into sharing data with companies, undermining the notion of consent.”