Meta plans to remove thousands of sensitive ad-targeting categories.

Ad buyers will no longer be able to use topics such as health, race or sexual orientation to target people with unwanted ads on Facebook or other Meta apps.


Meta’s headquarters in Menlo Park, Calif. Credit: Jim Wilson/The New York Times

Nov. 9, 2021, 3:00 p.m. ET

Meta, the social media company formerly known as Facebook, said on Tuesday that it planned to eliminate advertisers’ ability to target people with promotions based on their interactions with content related to health, race and ethnicity, political affiliation, religion, sexual orientation and thousands of other topics.

The move, which takes effect on Jan. 19, affects advertisers on Meta’s apps such as Facebook, Instagram and Messenger and the company’s audience network, which places ads in third-party apps. The Silicon Valley company said it was making the changes to limit the way that its targeting tools can be abused. In the past, these features have been used to discriminate against people or to spam them with unwanted messaging.

“We’ve heard concerns from experts that targeting options like these could be used in ways that lead to negative experiences for people in underrepresented groups,” said Graham Mudd, a vice president of product marketing for Meta.

Meta relies on targeted advertising for the bulk of its $86 billion in annual revenue. The company has excelled at giving advertisers a place to personalize promotions, with brands often able to aim their ads at Facebook, Instagram and Messenger users who are interested in topics as specific as L.G.B.T.Q. culture or Catholicism. Such tailored ads often have a better chance of sparking a sale or prompting users to join a particular Facebook group or support an online organization than more generalized ads.


But Meta has also faced a litany of complaints about advertisers abusing these targeting abilities.

Before the Jan. 6 storming of the U.S. Capitol, for example, advertisers used targeting tools to direct promotions for body armor, gun holsters and rifle enhancements at far-right militia groups on Facebook. In 2020, auditors concluded that Facebook had not done enough to protect people who use its service from discriminatory posts and ads.

In 2019, the Department of Housing and Urban Development sued Facebook for allowing landlords and home sellers to unfairly restrict who could see ads for their properties on the platform based on characteristics like race, religion and national origin. And in 2017, ProPublica found that Facebook’s algorithms had generated ad categories for users interested in topics such as “Jew hater” and “how to burn jews.”

In response to the abuse, the social network has tweaked its ad-targeting tools over time. In 2018, it removed 5,000 ad-targeting classifications to keep advertisers from excluding certain users. Facebook also disabled the anti-Semitic ad categories after the ProPublica report.

But Meta’s latest changes may be unpopular with the millions of organizations that rely on the company’s tools to grow their audiences and build their businesses. Advertising on Facebook, Instagram and Messenger that is finely tuned to people’s interests is often more affordable and effective than advertising on broadcast television and other media.

“Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions,” Mr. Mudd said. He added that some of the ad changes have been under discussion since 2016.


Augustine Fou, an independent ad fraud researcher, said advertising on Facebook and its other apps has long worked “better than any other display ads elsewhere because Facebook has years of people volunteering information, and it’s pretty accurate.” He added that personalized advertising outside the platform often relies on guesswork that is “so wildly inaccurate that when you try to target based on that, you’re worse off than trying to spray and pray.”

Yet Meta has often struggled with how to take advantage of consumer data without abusing it.

“Of course, Facebook can deduce that you’re gay, or that you’re African American, but then the question becomes whether it is ethical to use those categories for targeting,” Mr. Fou said.

The new changes do not mean Meta is getting out of ad targeting. The company will still allow advertisers to aim ads at users based on tens of thousands of other categories and topics. It added that it would continue to use tools such as location targeting.

The company also said it would let users, who can already limit their exposure to ads about topics such as politics and alcohol, start blocking promotions related to gambling and weight loss early next year.

“We continue to believe strongly in personalized advertising, and frankly personalized experiences overall are core to who we are and what we do,” Mr. Mudd said.
