The ads were among more than a dozen similar examples found on YouTube that appeared before videos spreading falsehoods around the 2024 election.

On YouTube, Major Brands’ Ads Appear Alongside Racist Falsehoods About Haitian Immigrants

Large organizations and brands saw their advertising dollars funneled to videos amplifying inflammatory narratives, underscoring how difficult it can be to maintain brand safety online.


On YouTube, an ad for the car company Mazda appeared before a video that repeated the racist falsehood that Haitian migrants in Ohio were “eating ducks on the side of the road.” An ad for the software giant Adobe showed up alongside another video that claimed “people have had their pets abducted and eaten by people who shouldn’t be in this country.”

Even an ad for Vice President Kamala Harris, the Democratic nominee for president, was placed ahead of a video that spread the unsupported statement that migrants were “going to parks, grabbing ducks, cutting their heads off and eating them.”

Many advertisers have tried for years to avoid sharing space with content about polarizing politics, pandemics, hate speech or misinformation, for fear of damaging customer perception and risking public censure. That ads appeared anyway on YouTube ahead of falsehoods about Haitian migrants underscores the difficulty advertisers face in maintaining brand safety in an especially volatile election year.

Just this month, researchers discovered advertisements on YouTube for more than a dozen large organizations and consumer brands running alongside videos that monetized xenophobic (and quickly debunked) claims. Advertising dollars flowed both to YouTube and to the commentators it allowed to amplify inflammatory and racist narratives, according to a report by Eko, a group focused on corporate accountability.

The videos that were accompanied by the ads garnered nearly 1.6 million views on YouTube in a 72-hour period after former President Donald J. Trump promoted a falsehood about Haitian immigrants in Springfield, Ohio, during the presidential debate on Sept. 10, Eko found. The group estimated that the commentators likely earned a few thousand dollars collectively from the advertisements.

Claire Atkin, a co-founder of Check My Ads, a digital advertising watchdog, said that tech platforms were infested this election season with conspiracy theories, false narratives and misinformation.

“We are dealing with information disorder, and advertisers can’t trust it,” she said.

A spokesman for YouTube said videos appearing on its site might be restricted from earning money if they had violated its “advertiser-friendly guidelines” or other policies. The company said it had removed one video flagged by Eko for violating its policies about deceptive content and was reviewing others.

Adobe, the Harris campaign and Mazda did not respond to requests for comment.

Advertisers increasingly rely on algorithms that distribute ads automatically on YouTube and other websites. The United Nations has criticized the practice as “opaque” and called on tech companies to scale it back to avoid inadvertently funding disinformation or hate.

Ads for companies that used digital advertising platforms were 10 times more likely to appear on misinformation websites than ads for companies that did not use the technology: nearly 80 percent of the most active advertisers relying on such platforms from 2019 to 2021 had ads appear on those sites, according to a report published in Nature in June by researchers from Stanford and Carnegie Mellon.

Having an advertisement land next to misinformation can be financially damaging: the share of people who click on such ads is 46 percent lower than for ads placed away from toxic content, according to Integral Ad Science, a company focused on brand safety.

YouTube is far from the only online platform that has sparked advertiser anxiety over the quality of its content. In 2020, more than 1,000 advertisers publicly joined a boycott of Facebook that was organized by civil rights groups protesting the platform’s handling of hate speech and misinformation. Hundreds of advertisers left X last year over similar concerns.

Now, a majority of marketing executives are on edge, as heightened political tensions make consumers especially sensitive to how brands engage with political topics, according to a recent report from Forrester. Analysts at the research company called it “a seemingly no-win situation” for advertisers.

In interviews, many ad executives said they felt that YouTube and other tech platforms on which they advertise were not doing enough to protect against harmful content.

YouTube banned or temporarily restricted a number of accounts after the attack on the Capitol on Jan. 6, 2021, including Mr. Trump’s. The company appeared to walk back some of those restrictions in 2023, when it reinstated Mr. Trump’s account and reversed a ban on misinformation about the 2020 election. The platform has remained a prominent home for a number of right-wing commentators known for sharing misinformation, and it has been accused of profiting from videos that presented climate change as a hoax or exaggeration.

“Frankly, brands are promised a lot of things by the platforms — that platforms are using technology and human content moderators at scale to make sure this doesn’t happen,” said Harriet Kingaby, who co-founded and co-chairs the Conscious Advertising Network, a coalition of advertisers, technology providers and others. “There is a real sense that things aren’t working. The trust has been broken.”

Now, brands are trying to regain more control by dictating specifically where their ads can be placed and demanding more manual audits, Ms. Kingaby said.

YouTube says it allows companies to block ads from appearing alongside certain creators, on certain websites or next to “sensitive content” like disasters or graphic scenes of combat or war.

“Brands really do want to get it right in terms of appearing in places that are not just safe, but suitable,” Ms. Kingaby said. “But at the moment, it is a very, very, very difficult space to navigate.”

