The Ministry of Defence is developing a secret, multimillion-pound research programme into the future of cyberwarfare, including how emerging technologies such as social media and psychological techniques can be harnessed by the military to influence people's beliefs.

Programmes ranging from studies into the role of online avatars to research drawing on psychological theories and the impact of live video-sharing are being funded by the MoD in partnership with arms companies, academics, marketing experts and thinktanks.

The Guardian has seen a list of those hired to deliver research projects, which have titles such as Understanding Online Avatars, Cognitive and Behaviour Concepts of Cyber Activities, and Novel Techniques for Public Sentiment and Perception Elicitation.

The projects are being awarded by a "centre of excellence" managed by BAE Systems, which has received about £20m-worth of MoD funding since 2012. The MoD plans to procure a further £10m-worth of research through the centre this year.

While the centre commissions a wide range of research, such as studies of alcohol consumption in the armed forces, a substantial stream of research comes under the heading of "information activities and outreach". The term is significant in that it has its roots in Britain's 2010 strategic defence review and national security strategy. Its aims include understanding the behaviour of internet users from different cultures, the influence of social media such as Twitter and Facebook and the psychological impact of increased online video usage on sites such as YouTube.

Typical targets, for now, would include groups of young internet users deemed at risk of being incited or recruited online to commit terrorism.

Dr Tim Stevens of King's College London, who studies cyberwar and strategy, said there was increased state interest in the role of emergent technologies such as social media and in the development of powerful psychological techniques to wield influence.

"The current furore over inter-state cyberwar is probably not where the game's at. What is far more likely is that states will seek to influence their own populations and others through so-called 'cyber' methods, which basically means the internet and the device du jour, currently smartphones and tablets," he said.

"With the advent of sophisticated data-processing capabilities (including big data), the big number-crunchers can detect, model and counter all manner of online activities just by detecting the behavioural patterns they see in the data and adjusting their tactics accordingly.

"Cyberwarfare of the future may be less about hacking electrical power grids and more about hacking minds by shaping the environment in which political debate takes place," he added.

The current MoD research drive in the area is being run by the Defence Human Capability Science and Technology Centre (DHCSTC), which is administered by BAE.

While most projects remain under wraps, an insight into the area of research has been provided by a previous report commissioned by the MoD, which has been released under the Freedom of Information Act. It examined how chatbots – computer programs that make human-sounding small talk and which have been used in everything from customer relations to sex industry marketing – could take on military roles in intelligence and propaganda operations to influence targets.

The research into the programs, which are designed to emulate human conversation and are familiar as "virtual assistants" on retailers' websites, envisages a future in which "an influence bot could be deployed in both covert and overt ways – on the web, in IM/chatrooms/forums or in virtual worlds".

"It could be a declared bot and fairly overt influence play, or pretend to be a human and conduct its influencing in less obvious ways," says the 2011 report by Daden, a technology group that develops chatbots for commercial and educational clients.

Daden also suggested chatbots could be used as "cyberbuddies" shadowing soldiers through their careers or as data-gatherers in digital environments such as chatrooms and forums, where they could "scout for targets, potentially analyse behaviour, and record and relay conversation".

The report cautions, however, that the barriers to their use in data-gathering and influence operations include ethical issues, adding that "the adverse effect that the unmasking of a non-declared bot would have on the subject, and their wider group needs to be carefully considered".

It says: "One approach, as in real life, is for the bot to withdraw if it thinks it may be compromised. In the early days, it may be better that the bot activity is declared and overt – in the same way as much broadcast and UK plc promotional activity."

BAE said it had nothing to add to a statement given by the MoD.

An MoD spokesperson said that the strategic defence review and national security strategy recognise "the need for influence to sit at the heart of future UK military operations".

"The goal is to tackle security risks at source and in advance of crises spilling over into conflict, through earlier engagement and more effective use of 'soft power' to shape behaviour and exert UK influence in international affairs," she said.

"Key to this is an understanding of the complex human terrain in which our armed forces must operate, and giving them the capability to promote the UK's messages and values over those of our adversaries in a global information environment that is connected, congested and contested."

She said that the Defence Human Capability Science and Technology Centre was effectively harnessing cross-disciplinary innovations from a world-class supply base in industry and academia.

"These innovations support the entire spectrum of military operations – from early warning, rapid crisis response and effective 'upstream prevention' to post-conflict stabilisation, security and capacity-building. The benefits of the work are measured ultimately by the reduced frequency, duration and costs – human and financial – of conflict that result from improved conflict prevention, and enhanced UK and international security."

The projects

• Full Spectrum Targeting – a sophisticated new concept that is growing in influence at the MoD and measures future battlefields in social and cognitive terms rather than just physical spaces. Emphasis is put on identifying and co-opting influential individuals, controlling channels of information and destroying targets based on morale rather than military necessity. The £65,285 project is being delivered by the Change Institute (a London-based thinktank whose previous work includes carrying out research for the government into understanding Muslim ethnic communities), the BAE subsidiary Detica, and another defence and security-orientated company, Montvieux.

• Cognitive and Behaviour Concepts of Cyber Activities – a £310,822 project being delivered by Baines Associates, a strategic marketing firm, i to i Research, a consultancy in "social and behavioural change", and universities including Northumbria, Kent and University College London.

• Innovation: Tools and Techniques for Influence Activities – a £28,474 project being delivered by the Change Institute, the University of Kent and QinetiQ, a company spun out of the MoD research department.

• Understanding Online Avatars – a £17,150 project being delivered by the Change Institute.

• This article was amended on 18 March 2014 to include a statement from the Ministry of Defence.