At least half of the 50 most popular Android mobile apps have inherited security vulnerabilities through the reckless re-use of software libraries, according to the security team that uncovered the ‘Heartbleed’ vulnerability in OpenSSL.

Researchers at Codenomicon, which first published information about the OpenSSL vulnerability and coined the ‘Heartbleed’ name, will this week publish findings which name and shame some of the world’s most successful app developers for their lax approach to security.

Preliminary results from the study reveal that over half of the 50 apps send user data to third-party advertising networks without user permission, often in clear text.

The researchers concluded that many of the developers of these applications were not aware of the vulnerabilities they were shipping in the code.

Olli Jarva, chief security specialist at Codenomicon, said 80 to 90 percent of mobile app software is made up of re-used libraries, most of which are available under open source licences.

He said it was natural that developers did not want to “invest in reinventing the wheel” with every app they push out.

But while “in theory” the open source model should produce better-quality code, owing to the number of developers contributing to and reviewing it, the numerous bugs in OpenSSL proved this was not the case, he said.

“We’re seeing the end products inherit vulnerabilities - sometimes it’s just poor software design or logic errors in implementations, and sometimes those bugs are identified and patched. Sometimes, like in the case of Heartbleed, they are not identified for two years."

More concerning is when “developers act intentionally,” Jarva said.

“Some people might have been providing a vulnerability on purpose in order to do something nasty” once the code has been distributed.

It’s rare that developers do their due diligence on who created the libraries before they embed them in the apps, he said.

“Who are they working with? Do they have sideline jobs somewhere else? The developers might be getting their dollars from ad networks," Jarva said.

End users who purchase or commission the development of mobile apps are unlikely to be aware that the apps reuse software libraries connecting them to advertising networks which exfiltrate private data without user consent, he said.

The preliminary study found that close to half of the top 50 Android apps on the market submit the user’s Android ID to third party advertising networks.

One in ten apps sends either the user’s device ID (IMEI code) or location data to a third party, and one even sends the user’s mobile phone number. One in ten applications connects to more than two ad networks.

The study found that over 30 percent of the apps transmit private data in plain text, and many more fail to encrypt the transfer of this data to best-practice standards.
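To illustrate the kind of finding described above, the minimal sketch below flags captured network requests that carry identifier-like parameters over plain HTTP. The hostnames and parameter names (`android_id`, `imei`, `msisdn`) are illustrative assumptions, not taken from the Codenomicon study.

```python
from urllib.parse import urlsplit, parse_qs

# Illustrative set of identifier-like query parameters an auditor might watch for.
SENSITIVE_KEYS = {"android_id", "imei", "lat", "lon", "msisdn"}

def flag_cleartext_leaks(request_urls):
    """Return URLs that use plain HTTP and carry a sensitive-looking parameter."""
    flagged = []
    for url in request_urls:
        parts = urlsplit(url)
        if parts.scheme != "http":
            continue  # HTTPS at least encrypts the identifier in transit
        params = set(parse_qs(parts.query))
        if params & SENSITIVE_KEYS:
            flagged.append(url)
    return flagged

# Hypothetical traffic capture from an app under test.
captured = [
    "https://ads.example.net/track?android_id=abc123",    # encrypted in transit
    "http://ads.example.net/track?imei=356938035643809",  # cleartext leak
    "http://cdn.example.net/banner.png",                  # cleartext, but no identifier
]
print(flag_cleartext_leaks(captured))
# → ['http://ads.example.net/track?imei=356938035643809']
```

A real audit would inspect request bodies and headers as well as URLs, but even this crude check separates the two failure modes the study distinguishes: sending identifiers at all, and sending them unencrypted.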

“The issues are invisible to users,” Jarva said. “A lot of things are happening behind the scenes; it is only afterwards that users learn what has been done.”

Jarva said IT security teams should be concerned by any app that sends irrelevant or sensitive information to third, fourth and fifth parties if this communication doesn’t align with what the app purports to do.

There are sandboxing tools available that allow an administrator to scan the binaries in an installation file and “reveal the true characteristics of the app” in under a minute, he said.
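The article doesn't name the tools, but the underlying idea can be sketched: an Android installation file (APK) is a ZIP archive, so a scanner can unpack its binaries and search them for known ad-network hostnames. The domain list below is an illustrative assumption, and a toy archive is built in memory so the sketch runs end to end.

```python
import io
import zipfile

# Hypothetical watchlist of ad-network hostnames; a real scanner would ship a
# curated, regularly updated list.
AD_NETWORK_DOMAINS = [b"ads.example.net", b"tracker.example.org"]

def scan_apk(apk_bytes):
    """Return {archive member: [matched domains]} for every file that
    contains one of the watched hostnames."""
    hits = {}
    with zipfile.ZipFile(io.BytesIO(apk_bytes)) as apk:
        for name in apk.namelist():
            data = apk.read(name)
            matched = [d.decode() for d in AD_NETWORK_DOMAINS if d in data]
            if matched:
                hits[name] = matched
    return hits

# Build a toy "APK" in memory: compiled code plus an innocuous resource.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("classes.dex", b"...http://ads.example.net/track...")
    z.writestr("res/layout.xml", b"<LinearLayout/>")
print(scan_apk(buf.getvalue()))
# → {'classes.dex': ['ads.example.net']}
```

String-matching the binaries this way is cheap, which is consistent with Jarva's point that a scan can finish in under a minute; production tools add decompilation and permission analysis on top.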

“It’s not a huge amount of data to analyse.”

Jarva said open source does not provide a "free lunch".

“We have to take care to test well enough the libraries we use so we can be confident they are safe enough to be used,” he said.

Jarva said IT managers usually turn to a whitelisting strategy to overcome these issues, but struggle to keep up with the volume of new apps being released every day.

“It’s too time consuming,” he said.

“At the end of the day, we have to make developers and those who employ them understand the importance of testing.

"The difficulty we face is that the motivating factor for app delivery is rarely the quality of security. More testing means more time spent, and that means more cost for the developer and a higher price for the solution. It only becomes an issue when something bad happens.”