Social media platforms wanting to help curb political disinformation on their sites are offering new ways to combat it. Twitter has discontinued all political ads and Facebook has new tools for content moderation. This is a good start, even if not all platforms are showing initiative, and not all such attempts are fruitful.

However, electioneering and political campaigning don’t start with technology firms. They start with the political parties and their affinity groups. To keep our elections free and fair, and to be confident in their outcomes, parties need to disclose what ads they’re buying, on which platforms, and at what price. Now is the chance for campaigns to demonstrate their commitment to transparency and accountability.


Over the past few months, Oxford University’s Technology and Elections Commission (of which we are a part) has been investigating the range of ways that political parties, social media platforms and election officials need to collaborate to make sure the next UK election is a fair fight, without outside interference. There are lots of regulatory gaps, and social media platforms and data brokers have much to answer for. However, the customers buying the misleading ads and launching problematic campaigns are usually political parties and campaign managers.

In preparing for 12 December, political campaigners in the UK are relying heavily on data-driven targeted advertising, but sharing little information about its content, placement and pricing. After 10 months of research and consultation with policymakers, the tech industry, security experts and academia, our central recommendation is that advertising transparency must start with political parties.

Digital political advertising differs from traditional campaigns in print or broadcasting. The advertising shown on mobile phones and laptops is selected using data on individual behaviour and targeted at small groups within the population. We can’t be confident that our neighbours are seeing the same content as us, and we already know these techniques are used to discriminate or to send conflicting messages to different audiences.

And while data-driven advertising and content marketing have mushroomed into a multibillion-dollar industry, it is political parties and their affinity groups that are pouring money into it. Campaigners in the UK spend their budgets placing content on platforms like Facebook, Instagram, Snapchat and YouTube, buying voter files from data brokers, and licensing data analytics software.

Transparency rules that were once designed for analogue campaigns have been rendered obsolete in the digital age. Right now, digital campaign spending is reported in ways that do not reveal information about where advertising was placed or how much it cost. A legal requirement for imprints (indicating who is responsible for content) on campaign material does not extend to online ads. Advertisers can therefore withhold their identity for political purposes. Democratic citizenship demands that voters have access to meaningful information on campaign spending, and on who is behind the sponsored messages.


Dubious campaign messages and junk news have circulated online during critical moments of democratic life in the United Kingdom. For the general election in December, we need to watch out for unorthodox campaigning in the political mainstream. Cambridge Analytica may be out of business, but there is no shortage of digital campaign experts with questionable track records.

Unfortunately, transparency efforts by technology firms are wildly inconsistent. Twitter has just banned political advertising altogether, but that raises questions about what exactly the platform will consider to be political. Researchers have concluded that Facebook’s ad library (which includes some limited information about targeting and spending) is inadequate for meaningful analysis, and the company has already said it will not fact-check political ads. It is not primarily classic “paid ads” that are the problem, but other forms of sponsored content and messages that spread organically.

Social media companies should be archiving all ads all the time. If political parties are spending money on different platforms, in a variety of formats, they should also archive the ads they buy.

Parties should also report clear information about the sources of the data they acquire, to ensure full transparency of its provenance. This should include data from the electoral register, third parties and data brokers, open data, and their own sources of data. They should disclose what profiling tools and analytics software they use to process data and infer personal information. This will help to increase transparency, advance the creation of accountable codes of practice, and help to protect electoral integrity and people’s data.

Down the road, there is a range of regulatory reforms we could consider. There is a strong, existing framework for the lawful uses of data for political campaigning and democratic engagement (in the form of GDPR, the Data Protection Act 2018, and the EU’s Privacy and Electronic Communications directive). But practice shows that campaigners struggle with attesting to the integrity of third-party data and external analytics software. The Information Commissioner’s Office has identified this gap, and a guide to the use of personal data for political campaigning is in development. But with election day rapidly approaching, the immediate task is to get political parties to be transparent about their communications.

Because data has become deeply entrenched in campaigning, we need rules for its use that go beyond just protecting our political systems, and actually strengthen democracy.

Elections are perhaps the most fundamental exercise of democracy. The tech industry has taken a few steps in the right direction, but the UK’s political parties need to lead the way.

• Lisa-Maria Neudert is a doctoral researcher at the Oxford Internet Institute, and secretary to OxTEC. Phil Howard is director of the Oxford Internet Institute and OxTEC commissioner