
A University of Portsmouth researcher spoke to MPs investigating last year’s UK riots, and the rise of false and harmful content online

19 March 2025


Dr Karen Middleton from the University of Portsmouth has been called as an expert witness in the UK Parliament's inquiry into social media, misinformation, and harmful algorithms.

The inquiry, led by the Science, Innovation and Technology Committee, was set up following rising concern that UK online safety laws risk being outpaced by rapidly advancing technology and the politicisation of social media platforms.

MPs are investigating the consequences of AI-driven algorithms, which amplified widely shared images posted on Facebook and X that incited people to join Islamophobic protests after the killing of three schoolgirls in Southport in August 2024.

The first hearing took place last month, with representatives from Meta, TikTok and X giving evidence. Yesterday (18 March), the committee examined the risks posed by generative AI, the monetisation of harmful content, and the role of digital advertising in spreading misinformation.

Right now, the way digital advertising works means that harmful content - misinformation, hate speech, and even fraud - can end up being funded without advertisers even realising it.

Dr Middleton, School of Strategy, Marketing, and Innovation, University of Portsmouth

Dr Middleton, from the University’s School of Strategy, Marketing, and Innovation and a Volunteer Advisor to the Conscious Advertising Network (CAN), spoke alongside several advertising, technology and online safety experts.

She highlighted the lack of accountability in the current advertising ecosystem, which allows misinformation and hate speech to be financially supported through programmatic advertising models.

In 2024, programmatic advertising spending in the UK reached an estimated £3.7 billion and is expected to rise significantly. With more than 80 per cent of UK digital advertising transactions occurring programmatically, the challenge of ensuring ethical and responsible advertising has never been more critical. 

Programmatic advertising uses AI to buy ad space automatically and target the audiences most likely to respond, grouping users into segments only where doing so improves a campaign's performance. Because placements are bought and sold in automated real-time auctions, advertisers often have little direct visibility into where their ads ultimately appear.
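To make that concrete, below is a deliberately simplified Python sketch of a real-time bidding auction. Every name, number and URL in it is a hypothetical illustration, not a real ad-tech API; the point it demonstrates is the one Dr Middleton raised, namely that the automated bidder scores an impression on audience signals alone and never inspects the page itself.

from __future__ import annotations
from dataclasses import dataclass

@dataclass
class BidRequest:
    page_url: str                # where the ad would appear
    audience_segments: set[str]  # inferred interests of the viewer

@dataclass
class Campaign:
    advertiser: str
    target_segments: set[str]    # audiences the advertiser wants to reach
    max_bid: float               # the most it will pay per impression

def bid(campaign: Campaign, request: BidRequest) -> float:
    # Automated bidder: scores the impression on audience match alone.
    # It never reads page_url, which is how an advertiser can end up
    # funding a page it has never seen.
    overlap = campaign.target_segments & request.audience_segments
    if not overlap:
        return 0.0
    return campaign.max_bid * len(overlap) / len(campaign.target_segments)

def run_auction(campaigns: list[Campaign], request: BidRequest) -> Campaign | None:
    # Ad exchange: the highest non-zero bid wins the placement.
    bids = [(bid(c, request), c) for c in campaigns]
    price, winner = max(bids, key=lambda b: b[0])
    return winner if price > 0 else None

request = BidRequest("https://example-site.invalid/article", {"sports", "news"})
campaigns = [
    Campaign("BrandA", {"sports", "travel"}, max_bid=2.00),
    Campaign("BrandB", {"news"}, max_bid=1.50),
]
winner = run_auction(campaigns, request)
print(winner.advertiser if winner else "no fill")  # BrandB wins (1.50 vs 1.00)

In this toy run, BrandB wins the impression purely on audience match; neither brand's bidder ever looked at the page its budget went to.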

During the panel session, Dr Middleton explained how social media platforms such as Meta, TikTok, and X prioritise engagement-driven content, even when it is misleading or harmful. 

She warned that current digital advertising supply chains make it difficult to track the funding of misinformation, and stressed the urgent need for regulatory interventions to ensure advertising revenue does not inadvertently fund harmful content.

Dr Middleton also provided recommendations for embedding safety-by-design principles, increasing transparency in ad placement, and leveraging AI-driven safety technology to proactively moderate content.
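Purely as an illustration of the idea, a "safety-by-design" pre-bid gate might resemble the toy sketch below. The keyword list stands in for a real AI content classifier, and the URLs, threshold and scoring are invented for the example.

from __future__ import annotations

UNSAFE_TERMS = {"hate", "hoax", "scam"}  # toy stand-in for an AI content classifier

def safety_score(page_text: str) -> float:
    # Placeholder scorer: fraction of unsafe terms found on the page.
    # Real systems would use a trained moderation model here.
    words = set(page_text.lower().split())
    return len(words & UNSAFE_TERMS) / len(UNSAFE_TERMS)

def approve_placement(page_url: str, page_text: str, threshold: float = 0.0) -> bool:
    # Pre-bid gate: screen the page before any money changes hands,
    # and log every decision so advertisers can audit their placements.
    score = safety_score(page_text)
    approved = score <= threshold
    print(f"{page_url}: score={score:.2f} approved={approved}")
    return approved

approve_placement("https://news.example.invalid/story", "local sports report")
approve_placement("https://junk.example.invalid/post", "shocking hoax scam exposed")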

We were pleased to support Dr Karen Middleton as she gave evidence to the Science, Innovation and Technology Committee on how online advertising relates to the business models of social media companies, and the algorithms their platforms use. We hope these important conversations about the need for transparency and accountability in the advertising ecosystem continue to be had in Parliament.

Alex Murray, Head of Advocacy at CAN

Commenting on her evidence, Dr Middleton said: "Right now, the way digital advertising works means that harmful content - misinformation, hate speech, and even fraud - can end up being funded without advertisers even realising it. 

“The system is so complex and automated that there's little accountability. We need stronger regulations and better industry practices to make sure that advertising money isn’t fuelling the very problems we're trying to fight online. It’s about making the internet a safer place for everyone."

Dr Middleton brings over 25 years of experience in marketing practice, teaching, and research. She actively collaborates with the marketing industry, policymakers, and civil society, working with organisations like the Conscious Advertising Network (CAN) to ensure advertising combats hate speech and harmful content. Her expertise also informs key policy discussions on issues such as violence against women and girls.

Alex Murray, Head of Advocacy at CAN, said: "We were pleased to support Dr Karen Middleton as she gave evidence to the Science, Innovation and Technology Committee on how online advertising relates to the business models of social media companies, and the algorithms their platforms use. We hope these important conversations about the need for transparency and accountability in the advertising ecosystem continue to be had in Parliament."

Dr Middleton’s panel discussion is available to watch back online. Other expert witnesses included Phil Smith, Director General of the Incorporated Society of British Advertisers; Dr Eirliani Abdul Rahman, Former Twitter Trust & Safety Council Member and online safety advocate; and Lyric Jain, CEO of Logically.