A father in Lahore, an amateur hockey player from Nova Scotia, and a man named Kevin from Houston, Texas, are all connected through Channel3Now, a website widely quoted in viral posts on X after it published a false name for the 17-year-old charged over the Southport attack. The site also wrongly claimed that the attacker was an asylum seeker who had arrived in the UK by boat last year. These false claims, along with unsubstantiated reports from other sources that the attacker was a Muslim, have been widely blamed for sparking riots across the UK, some of which targeted mosques and Muslim communities.
The BBC has investigated several people linked to Channel3Now, speaking to friends and colleagues who confirmed their identities, and has questioned someone claiming to be part of the site's management. The investigation suggests Channel3Now is a commercial operation that aggregates crime news and monetizes it on social media. It found no evidence to support claims that the site's misinformation is connected to the Russian state.
A person from Channel3Now’s management admitted that the publication of the false name “shouldn’t have happened, but it was an error, not intentional.” The false article lacked a named byline, making it unclear who was responsible for it.
James, an amateur hockey player from Nova Scotia, has one of the site's rare bylines, on a different article. His LinkedIn profile and a linked Facebook account, which lists a friend named Farhan, suggest he is a journalist for the site. A social media account belonging to James's school and one of his friends both confirmed his identity. The friend said he was curious about James's involvement in the article and would ask him about it, but then stopped responding.
Farhan, who is based in Pakistan, was identified through former colleagues and through his social media posts about his Islamic faith and his children. His name is not attached to the false article. Farhan blocked the BBC investigator on Instagram shortly after being contacted.
Kevin, based in Houston, Texas, responded from Channel3Now's official email address. He declined to give his surname, leaving doubts about his true identity, but he answered questions over email. He claimed to be speaking from the site's "main office" in the US, a claim consistent with the timing of the site's social media posts and of his email replies. Having initially called himself "the editor-in-chief," he later described himself as the "verification producer." He refused to name the site's owner, citing concerns for everyone's safety.
Kevin said more than 30 people work for Channel3Now across the US, UK, Pakistan, and India, most of them, including Farhan and James, recruited from freelancing platforms. He stressed that Farhan was not involved in the false Southport story, for which the site has publicly apologized, blaming its "UK-based team."
Channel3Now faced accusations of links to the Russian state because of old Russian-language videos on its YouTube channel. Kevin said the site had bought a Russian-language YouTube channel about car rallies many years ago and later renamed it. The channel posted no videos for six years before it began uploading content about Pakistan, where Farhan is based and where the site says it has writers.
“Just because we purchased a YouTube channel from a Russian seller doesn’t mean we have any affiliations,” Kevin said. “We are an independent digital news media website covering news from around the world.”
Buying and repurposing a monetized YouTube channel can quickly build an audience and start generating revenue.
Kevin said the site is a commercial operation and that "covering as many stories as possible" helps generate income. Most of its stories appear accurate, drawing on reliable sources for reports of US shootings and car accidents. But the site has also shared false speculation, about the Southport attacker and about the man who attempted to assassinate Donald Trump. Kevin said that following the false Southport story and the media coverage of it, the site's YouTube channel and most of its Facebook pages were suspended, though its X accounts remain active. A Facebook page that re-shares the site's content, called the Daily Felon, is also still live.
Kevin argued that blame for the social media storm and the subsequent riots cannot be placed solely on a "small Twitter account" that made "a mistake." Even so, Channel3Now's incorrect story became a source for many social media accounts that helped spread the false accusations.
Several of these accounts, based in the UK and US, have histories of posting disinformation about the pandemic, vaccines, and climate change. They have gained large followings and have benefited from changes Elon Musk made to the platform after he acquired Twitter.
Bernadette Spofforth, accused of publishing the first post featuring the false name of the Southport attacker, denied being its source, saying she had seen the name in another post that has since been deleted. Speaking to the BBC, she said she was "horrified" by the attack and deleted her post as soon as she realized it was false. She insisted her account was "not motivated by making money." "Why on earth would I make something up like that? I have nothing to gain and everything to lose," she said, condemning the recent violence.
Spofforth had previously questioned lockdown measures and net-zero climate change initiatives. Her profile was temporarily removed by Twitter in 2021 for allegedly promoting misinformation about the Covid-19 vaccine and the pandemic, claims she disputed. Since Musk’s takeover, her posts have regularly received over a million views.
The false claim Spofforth posted about the Southport attacker was quickly reshared by conspiracy theory influencers and by profiles with histories of anti-immigration and far-right content. Many of these accounts have purchased blue ticks, giving their posts greater prominence on X.
Another change made by Musk has turned promoting these ideas into a source of income, both for conspiracy theory accounts and for commercially driven sites like Channel3Now. Some profiles have garnered millions of views over the past week by posting about the Southport attack and the subsequent riots. X's "ads revenue sharing" scheme allows blue-tick users to earn a share of the revenue from the ads shown in their replies. Accounts with fewer than half a million followers can make $10 to $20 per million views or impressions. Some accounts sharing disinformation achieve more than a million impressions per post and post several times a day.
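For a rough sense of scale, the sketch below turns those figures into a back-of-envelope daily estimate. Only the $10 to $20 rate per million impressions comes from the reporting above; the posting volume of five posts a day at around a million impressions each is a hypothetical assumption based on the behavior described, not data about any specific account.

```python
# Back-of-envelope estimate only. The $10-$20 per million impressions
# rate is the figure cited above; the posting volume is a hypothetical
# assumption, not data about any specific account.

def daily_ad_revenue(posts_per_day: int,
                     impressions_per_post: int,
                     rate_per_million_usd: float) -> float:
    """Estimated daily income (USD) under X's ads revenue sharing."""
    daily_impressions = posts_per_day * impressions_per_post
    return daily_impressions / 1_000_000 * rate_per_million_usd

# Hypothetical account: 5 posts a day, ~1 million impressions per post.
low = daily_ad_revenue(5, 1_000_000, 10)   # lower bound of cited rate
high = daily_ad_revenue(5, 1_000_000, 20)  # upper bound of cited rate
print(f"Estimated ${low:.0f}-${high:.0f} per day")  # $50-$100 per day
```

At that volume, the cited rate would imply roughly $50 to $100 a day, or on the order of $1,500 to $3,000 a month, from a single account.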
Other social media platforms, including YouTube, TikTok, Instagram, and Facebook, also allow users to monetize views, but they have previously demonetized or suspended profiles that violate their misinformation guidelines. X, by contrast, has no specific guidelines on misinformation beyond rules against faked AI content.
Politicians have called for social media companies to do more in response to the riots, but the UK's recently enacted Online Safety Act does not legislate against disinformation, because of concerns about limiting freedom of expression. Tracking down Channel3Now's writers shows that those posting false information are often based abroad, complicating efforts to take action against them. Responsibility for addressing such content therefore rests with the social media companies themselves. X has not responded to the BBC's request for comment.