The company formerly known as Facebook will be getting rid of several ad-targeting options that “people may perceive as sensitive,” it said this week. Targeting users based on topics related to health, sexual orientation, religious practices and groups, political beliefs, and social issues will soon be a no-go.
So, what does this mean in practice? Starting January 19, advertisers using Instagram and Facebook will no longer be able to send ads to users based on phrases like “Catholic church,” “same-sex marriage,” or “chemotherapy.”
The decision, announced Tuesday, was made to “better match people’s evolving expectations of how advertisers may reach them on our platform” after feedback from “civil rights experts, policymakers and other stakeholders,” Meta’s VP of product marketing and ads, Graham Mudd, wrote in a company blog post, calling it a “difficult decision.”
But, but, but: It isn’t a uniform ban on targeting. Advertisers can still target profiles using customer data, with permission, and Meta still allows ads to be served via “lookalike audiences” and “location targeting.”
Not the company’s first rodeo:
- In 2018, Facebook removed 5,000 ad-targeting terms related to religious and cultural interests, like “Passover” and “Buddhism.”
- In 2019, Facebook limited how advertisers running housing, employment, and credit ads could target people after “civil rights organizations” (and a lawsuit from a fair-housing group) alleged the ad platform was used to discriminate.
- In July, it blocked advertisers from targeting users under 18 “based on their interests or on their activities on other websites and apps.”
Related, unrelated: A reminder that Meta has had a pretty rough few months: Whistleblower Frances Haugen leaked a trove of internal documents showing the company was well aware of the harm it has caused. Last week, Facebook said it would shut down its facial-recognition program, but Meta can’t say the same, according to our friends at Emerging Tech Brew.—RB