Facebook wanted to test the loyalty of its users to breaking point

Facebook is the most popular social network in the world, with more than 1.5 billion monthly active users. The US social network, which was launched by CEO Mark Zuckerberg in 2004, is now valued at a staggering £197 billion. But details of an experiment to test social media users' loyalty to Facebook have now emerged online. The Californian social network is believed to be preparing for the eventuality that Google one day removes Facebook's apps from its Play Store marketplace for competitive reasons. As a result, Facebook tried to test the loyalty and patience of its Android users to the limit.

The US firm secretly rolled out a slew of artificial errors within the Android app that would automatically crash the mobile app for hours at a time, a source has claimed. The experiment was designed to test at what point a Facebook user would give up and ditch the Facebook app from their device altogether. Speaking anonymously to The Information, a source familiar with the one-time test, which is believed to have taken place a few years ago, said Facebook was never able to reach this threshold. "People never stopped coming back," the source said.

Facebook CEO Mark Zuckerberg founded the site in 2004, which is now valued at £197 billion

Android users were logged out of the app to test whether they would delete it

Facebook wanted to see whether users would abandon the social network or simply switch to the far-inferior mobile website while their Android app was artificially broken. Former Facebook data scientist JJ Maxwell defended the move, saying tests like these are "hugely valuable" to the company and "their prerogative," The Verge reports. Admittedly, Facebook is not alone – many technology firms quietly test new features on users. Google famously cycled between 41 different shades of blue on its homepage, to see which prompted the best response from its users. But tweaking a shade of blue is very different to testing the loyalty of your users by deliberately crashing their access to the service. Especially when you state your company mission is to "connect the world" and you have a feature – dubbed Safety Check – to allow users to log in and signal to one another that they are safe in a time of disaster. It's critical to ensure people can stay connected.

Facebook promises to 'connect the world'

The latest revelation follows the controversial 2014 experiment which manipulated users' emotions using the Facebook News Feed. Devised by the social network's on-staff data scientists, Facebook scientifically tweaked the News Feed of hundreds of thousands of users. Some were sent an onslaught of upsetting or negative posts, while another group was given a barrage of positive posts. A number of critics highlighted the potential dangers of this type of manipulation, following the publication of two separate studies from the University of Houston which linked Facebook to depression. Entitled "Seeing Everyone Else's Highlight Reels: How Facebook Usage is Linked to Depressive Symptoms," the study provided evidence that Facebook users felt depressed when comparing themselves to others.

But Facebook data scientist and co-author of the study Adam Kramer said: "The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."