Political parties are trying to find ways to use WhatsApp as a campaign tool in the general election, but Britons’ reluctance to join group chats involving strangers is hindering the effort.

The pro-Labour campaign group Momentum is urging supporters to sign up to its WhatsApp broadcast service to receive messages to forward to their friends, while messages targeting British Hindu voters with fear-mongering claims about a Labour government have been spotted.

But these examples pale in comparison with India and Brazil, where WhatsApp has been credited with key roles both in electing political candidates and in spreading a rash of misinformation. Hundreds of people can join a single shared chat, which can make WhatsApp an effective medium for sharing viral memes and videos.

Rasmus Kleis Nielsen, of the University of Oxford’s Reuters Institute, said there may be cultural factors limiting the use of WhatsApp for political campaigning in the UK.

“In countries where people express concern that sharing their views online may influence how they’re seen by friends or family or by colleagues or acquaintances, we see more people engaging with news and politics in the relatively private environment of WhatsApp than some public social media,” he said.

Although Britons are increasingly heavy users of the service, they are substantially less likely to join groups with people they don’t know. This makes it much harder for political activists to broadcast disinformation to an unwitting audience.

“In the UK it’s overwhelmingly a social infrastructure rather than one that’s used for more public purposes. The groups people are in are with friends, family or colleagues,” said Nielsen.

His institute found that six in 10 WhatsApp users in Brazil joined groups with people they did not know, compared with just over one in 10 in the UK. While almost a fifth of Brazilian users were happy to discuss news and politics in a public WhatsApp group, just 2% in the UK felt likewise.

The flipside is that when a political WhatsApp message or video does go viral in the UK, it tends to do so organically, without the involvement of a broadcast group. In addition, it will tend to be forwarded into a group by a friend or family member, increasing the potency of a meme mocking a politician or a shocking news story because it comes from a trusted source.

One Labour campaigner suggested WhatsApp could have more impact in communities with close connections to friends and family in other countries, claiming such communities were more likely to be members of large group chats built around shared interests.

This category includes many British Indian voters who are being targeted with pro-Conservative WhatsApp messaging by campaigners associated with India’s ruling Hindu nationalist party. In one message seen by the Guardian, readers are warned about supposed connections between the Labour party and Pakistan before being told to “pass this to every true Indian”.

However, it is impossible to measure the reach and influence of such WhatsApp messages, which spread person to person in a manner more like a chain letter than a traditional media broadcast. While public posts on Twitter and Facebook are accompanied by share counts, which give a rough idea of the reach for a video or article, there is no such count for WhatsApp posts. The anti-Labour message could have been seen by dozens of people or hundreds of thousands.

Facebook’s main approach to reducing the spread of disinformation on WhatsApp is to make it harder to reach a large number of people. This year it stopped users forwarding messages to more than five individuals or groups at the same time, in an attempt to stop users sending information to thousands of people in seconds.

Instead, journalists and academics are reduced to making educated guesses as to the audiences for a viral WhatsApp message. One study by the Tow Center during this year’s Indian election described closed networks such as WhatsApp as “a way for political campaigns and activists to avoid scrutiny from regulators and reporters” and found widespread false information.

It predicted a future in which the “information ecosystem heads further toward closed networks where it’s easier to micro-target groups of people for nefarious political purposes”.