Google said last week it plans to expand its campaign to “inoculate” people against misinformation — as if it were a virus — into Germany, after seeing “promising results” in Eastern Europe.

The campaign is based on an approach called “prebunking,” designed to teach people how to spot false claims before they encounter them, thereby “inoculating” them against the “disease” of misinformation “like a vaccine does” against a physical disease, Euronews reported.

The tech giant will release a series of short videos that highlight techniques — such as fear-mongering, scapegoating, false comparisons, exaggeration and missing context — that are commonly used to promote misleading claims.

The videos dissect these different techniques so viewers can more readily recognize them when consuming media.

Proponents of the campaign say it’s an “efficient way to address misinformation at scale.”

But some critics allege Google’s campaign is selectively targeting information related to corporate and government interests and is motivated by money rather than a sincere desire to protect readers from false information.

“They [Google’s leaders] want Google to be what they see as a safe place for advertisers,” said Clayton Morris, a former Fox News anchor who co-hosts the online news show “Redacted.”

Videos to run as ads on Facebook, YouTube, TikTok

Google’s “prebunking” videos will run as advertisements on Facebook, YouTube or TikTok in Germany. A similar campaign in India is also in the works, the AP reported.

Last fall, Google ran a test video campaign in Poland, the Czech Republic and Slovakia.

The campaign focused on inoculating viewers against “false claims about Ukrainian refugees” and showed techniques commonly used to support such claims, such as alarming or unfounded stories about refugees committing crimes or taking jobs away from residents.

The AP did not report the specific statements regarding Ukrainian refugees that Google deemed false.

The videos were viewed 38 million times on Facebook, TikTok, YouTube and Twitter.

Researchers said people who viewed the videos were more likely to be able to identify misinformation techniques, and less likely to spread false claims, than people who hadn’t watched the videos.

‘You can think of misinformation as a virus’

Alex Mahadevan, director of MediaWise, a media literacy initiative of the Poynter Institute, told the AP that the strategy was a “pretty efficient way to address misinformation at scale, because you can reach a lot of people while at the same time address a wide range of misinformation.”

In November 2022, Google and YouTube gave the Poynter Institute $13.5 million to strengthen its fact-checking efforts with $12 million earmarked to create a Global Fact Check Fund.

“You can think of misinformation as a virus,” Sander van der Linden, Ph.D., professor of social psychology in society at the University of Cambridge, told the AP. “It spreads. It lingers. It can make people act in certain ways.”

The “inoculation” also sometimes needs a periodic “booster,” according to the AP, because the effects of the videos eventually wear off.

Van der Linden assisted Google in developing its prebunking campaign and is presently advising Meta, which owns Facebook and Instagram, the AP said.

Google announced its expanded campaign just before the Feb. 17 start of the Munich Security Conference.

According to the AP, the timing of the announcement reflected the heightened concerns of government officials and tech companies regarding the impact of misinformation.

“There’s a real appetite for solutions,” Beth Goldberg, head of research and development at Jigsaw, a unit of Google that “explores threats to open societies,” told the AP. “Using ads as a vehicle to counter a disinformation technique is pretty novel. And we’re excited about the results.”

Google has not announced plans to expand its campaign to the U.S., so it remains unknown if and when the California-based company will apply its misinformation prebunking tactics on its home turf.

‘Prebunking’ campaign more about promoting ‘corporate and government interests,’ say critics

Commenting on Google’s latest announcement, “Redacted” co-host Natali Morris said, “It’s clear the corporate idea of disinformation is really only related to corporate and government interests — not at all to human interest.”

The topics addressed by Google’s misinformation campaign, such as COVID-19 and climate change, are topics “that will either give governments or corporations more power,” Natali said.

Meanwhile, other topics of human interest — such as child trafficking — remain untouched by Google, she said.

Clayton Morris, who co-hosts “Redacted” with his wife Natali, said he believed Google’s efforts to fight misinformation are motivated by money.

For example, Clayton said, companies like Pfizer and Moderna spend billions of dollars in advertising and Google wants to be an “advertiser-friendly environment” so it “pushes down” information and opinions that criticize the companies’ pharmaceutical products.

“Imagine if Pfizer execs are sitting there and they’re thinking about where to put their ad dollars and they start seeing Google search results that are like Pfizer this, Pfizer that … that’s why they [Google] downrank all of this stuff to push that stuff away and they continue to make their good ad revenue,” he said.

Now the pharmaceutical companies don’t even have to produce ads anymore because Google is essentially doing it for them, Natali added.

“Imagine what a prebunking video would look like around a COVID vaccine,” she said.
