Facebook and Twitter may be forced to reveal detailed information about how and why users were targeted for political advertising, the information commissioner Elizabeth Denham has suggested.

Speaking to the digital, culture, media and sport (DCMS) select committee, which is holding an inquiry into fake news, Denham said that transparency in political campaigning was crucial.

“Our intention is to be able to pull back the curtain and to be able to explain and expose for the public, for parliamentarians, for civil society, what happens with their personal information in the context of political advertising and political messaging,” she said.

“There needs to be transparency for the people who are receiving [political adverts on Facebook], so that they can understand how their data was matched up and was used to be the audience for the receipt of that message. And I think that’s where people are asking for more transparency,” Denham added.

Q&A: What is a Twitter bot?

Strictly defined, a Twitter bot is any automated account on the social network. That can range from something as simple as automatically tweeting links to news articles – most of the Guardian's social media accounts are technically Twitter bots, for instance – to complex interactions like automatically generating emoji-based art or automatically replying to climate change deniers with scientific evidence.

But, as with "troll" and "fake news", the strict definition has been forgotten as the term has become one of political conflict. The core of the debate is the accusation that a number of political tweets were sent by "Russian bots", with the intention of subverting political debate, or simply creating chaos generally.

Based on what we know about Russian information warfare, the Twitter accounts run by the country's "troll army", based in a nondescript office building in St Petersburg, are unlikely to be automated at all. Instead, accounts like @SouthLoneStar, which pretended to be a Texan right-winger, were probably run by individuals paid 45-65,000 rubles a month to sow discord in Western politics.

In other ways, they resembled bots – hence the confusion. They rarely tweeted about themselves, sent far more posts than a typical user, and were single-minded in what they shared. People behaving like bots pretending to be people: this is the nature of modern propaganda.

Additionally, she suggested that more wide-ranging reforms may be needed. “I think the use of social media in political campaigns, referendums, elections etcetera, may have got ahead of where the law is, and I think it might be time for a code of conduct so that everybody is on a level playing field and everybody knows what the rules are.

“Some MPs are concerned about the use of these new tools, particularly when there is analytics and algorithms that are determining how to microtarget someone where they may not have the transparency and the law behind them.”

The recommendations will be made as part of a formal investigation into data processing for political ends, announced by the Information Commissioner’s Office (ICO) in May 2017. Currently, Denham said, that investigation has 10 full-time staff working on it.

“It involves more than 30 organisations: social media platforms as well as data analytics companies like Cambridge Analytica; it involves political campaigns and parties.”

The investigation will have two main outcomes: the creation of a wide-ranging report that will recommend changes to the regulation of data use in politics, and potential enforcement action against parties and data processors who have broken data protection laws as they currently stand.

Denham said: “If we find contraventions of the Data Protection Act, then we’ll be taking enforcement action and that enforcement action could take some time.”

The ICO has issued seven information notices so far in the course of its investigation; these notices formally require organisations to hand over information. One of the seven, issued to UKIP, has been contested, with the information rights tribunal due to rule in the coming months.

Before Denham spoke to the DCMS select committee, it heard evidence from Edward Lucas, the senior vice president of the Center for European Policy Analysis, who warned that the focus on past misinformation campaigns by Russia risked blinding legislators to potentially different attacks in the future.

“Please don’t spend too much time looking in the rear view mirror at what Russia did to us,” Lucas said. “Look through the windscreen at what’s coming down the road. That’s much more dangerous.”