Researchers explain why they believe Facebook mishandles political ads
Facebook has worked for years to revamp its handling of political ads — but researchers who conducted a comprehensive audit of millions of ads say the social media company's efforts have had uneven results.
The problems, they say, include overcounting political ads in the U.S. — and undercounting them in other countries.
And despite Facebook's ban on political ads around the time of last year's U.S. elections, the platform allowed more than 70,000 political ads to run anyway, according to a research team based at NYU Cybersecurity for Democracy and at the Belgian university KU Leuven.
Their research study was released early Thursday. They also plan to present their findings at a security conference next August.
After analyzing more than 4.2 million political ads and 29.6 million non-political ads from more than 215,000 advertisers, the researchers concluded of Facebook's enforcement efforts: "61% more ads are missed than are detected worldwide, and 55% of U.S. detected ads are in fact non-political."
Researcher criticizes Facebook's use of 'rudimentary' methods
Laura Edelson of NYU, a lead author of the study, says two things that emerged from the research surprised her.
"One is this very high false positive rate in the U.S.," Edelson said.
Part of the surprise there, she added, is due to what she calls the "rudimentary" way Facebook seems to use keyword models to classify advertising and content.
"We can do a lot better," Edelson said. "This is not the state of the art of content moderation or detection of problematic content. There are many more sophisticated methods that could be employed here that Facebook doesn't seem to be using."
"Facebook does involve humans in some portion of the ad and content moderation policy, but it's definitely automated first," she said. "That approach just has accuracy problems."
The other surprise, she said, was the problem Facebook apparently had in enforcing its ban on political ads in the U.S. After the policy was announced, many political advertisers simply stopped running ads. But not everyone observed the ban, Edelson said: "Quite a few of them kept running ads and just stopped declaring them as political."
It should have been easy to detect a political advertiser who was flouting the ban, she said.
"The errors here aren't subtle," Edelson said. "They are just really reflective of lack of investment."
Facebook responds to the researchers' findings
Responding to a request for comment about the pending study, a spokesperson for Meta, Facebook's parent company, told NPR:
"The vast majority of political ads they studied were disclosed and labeled just as they should be. In fact, their findings suggest there were potential issues with less than 5 percent of all political ads.
"If it were a complete view, this report would have also noted we offer more transparency into political advertising than TV, radio or any other digital advertising platform."
At issue in the U.S.: more topics have become politicized
As for what is considered a political ad, the researchers note that Facebook itself says its political ad policy applies to "ads about social issues, elections or politics." It then configures its system to enforce the rules based on that definition.
In recent years, the definition of what might be construed as a political message has broadened, as language around social and health issues has increasingly become politicized. The researchers link that trend to Facebook's tendency in the U.S. to mistakenly label non-political ads as political.
Edelson points to the way Facebook approaches COVID-19 information.
"A lot of pandemic and COVID-related content became politicized," she said. "A lot of vaccine-related content became politicized. But the way that Facebook managed that was not in a very subtle or nuanced way."
A big part of the problem, Edelson said, is that Facebook relies on automated detection mechanisms that in her view are simply not very accurate.
"Ads with a person who had a mask on were getting flagged as political. Ads that mentioned or talked about vaccines or COVID were getting flagged as political," she said.
By mislabeling health messages as political ads, the researchers said, Facebook created new problems that it had to solve.
"Facebook created a policy carve-out explicitly for government health bodies," Edelson said, "so that they could be exempted from these policies on political speech. Because they just didn't seem to know how to apply them — to catch stuff that was political about COVID without also catching stuff that was just like, 'this is where you can go get a vaccine.' "
In that scenario, if a community organization wanted to run an ad saying they're hosting a weekend vaccine drive, "Facebook could potentially flag that ad as a political ad," KU Leuven's Victor Le Pochat, another lead author of the study, said. From there, he said, the ad might be taken down.
"If Facebook does this kind of incorrect detection, it might end up preventing these kind of community organizations from ever publicizing their vaccine drive," he said.
Researchers also found some improvements
In light of what we know about Facebook's handling of political ads in the 2016 elections, we asked the researchers whether they have seen any improvements since then.
"I would say it has gotten somewhat better," Edelson said.
"We do see it in our data," Le Pochat added.
"We do see that Facebook is able to get more ads which were improperly declared," compared to a few years ago, he said.
The majority of the ads were properly declared, Le Pochat said, "so advertisers do follow Facebook's policy. We found over 4 million political ads which were correctly declared. And then we found over 150,000 which the advertisers failed to declare."
The study corroborates recently leaked Facebook records
The study is emerging weeks after a trove of Facebook documents from whistleblower Frances Haugen depicted the social media giant as failing to deal with a number of political and social complexities, particularly in countries where people post content in Arabic, Hindi and other widely spoken languages.
In some cases, the company accidentally banned everyday words, according to the documents. In others, Facebook's screening systems reportedly allowed incendiary language to spread.
"Our findings corroborate that Facebook is really not paying a lot of attention to making sure that communities outside the U.S. ... are also protected from the harms that are done by misleading political advertising," Le Pochat said.
"We see this very high rate of false positives in the United States," Edelson said. "It really looks like this is due to the fact that Facebook seems to use a keyword model to detect political content in the United States. And we don't see that pattern in other countries anywhere near as much."
It's a reflection, she said, of where Facebook has chosen to invest money and time.
"To employ a keyword model like that, you need to have some knowledge of the politics of that country so that you can build that keyword list," Edelson said. "And it just doesn't look like Facebook has invested in understanding the politics of all the countries in which it runs political ads. It just cannot do that kind of detection in Malaysia or Macedonia or Argentina if it hasn't spent the money to understand the political landscape in those countries."
Editor's note: Meta pays NPR to license NPR content.