For Dolese Bros. Co., a construction and supply company with a fleet of 300 trucks, recruiting enough qualified drivers in rural Oklahoma has been a challenge. The company has hung banners at its plants. It has bought classified ads in newspapers. It has even turned its massive mixer trucks into moving billboards, with bumper stickers telling people how to apply.
But Dolese fills most jobs, according to community relations director Kermit Frank, by advertising on Facebook. On Nov. 4, the company placed a video ad featuring a longtime driver in a hardhat wiping down his truck, talking about all the reasons he appreciates the company: “Here, I’m home every night. And I make really good money. And I get to see my family a lot more.”
The company used Facebook’s new special ads portal, which doesn’t allow targeting by gender, age, race or ethnicity. That was fine with Dolese. While its drivers tend to be men, the company has no gender preference. “The gals we have in our group are fabulous,” Frank said. “We’d take any and all of them we could ever get.”
By the time the ad stopped running 10 days later, more than 20,000 people had seen it. Eighty-seven percent of them were men.
In March, Facebook reached a “historic” settlement of five lawsuits by civil rights groups. Under the terms of the settlement, the social media giant created the special ads portal to prevent discrimination in employment, housing and credit ads against legally protected groups such as women and older workers. The new portal also restricts Facebook’s algorithm from considering gender and age when finding audiences for these ads. “Getting this right,” Facebook COO Sheryl Sandberg said in a press release, “is deeply important to me and all of us at Facebook because inclusivity is a core value for our company.”
But new research as well as advertising information available on Facebook suggest that the social media giant has not gotten this right. As Facebook promised in the settlement, advertisers on the new portal can no longer explicitly target by age or gender. Nevertheless, the composition of audiences can still tilt toward demographic groups such as men or younger workers, according to a study published today by researchers at Northeastern University and Upturn, a nonprofit group that focuses on digital inequities. ProPublica helped design the research with Northeastern and Upturn and placed some additional ads of its own.
“We’ve gone above and beyond others to help prevent discrimination in ads by restricting targeting and adding transparency,” Facebook spokesman Joe Osborne said in an emailed statement. “An advertiser determined to discriminate against people can do so on any online or offline medium today, which is why laws exist. ... We are the only digital media platform to make such meaningful changes in ads and we’re proud of our progress.” Osborne did not respond to questions about specific ads.
One reason for the persistent bias is that Facebook’s modified algorithm appears to rely on proxy characteristics that correlate with age or gender, said Alan Mislove, a Northeastern University professor of computer science and one of the study’s co-authors. “Our research shows that simply removing a few protected features from an algorithm is unlikely to provide any meaningful protection against discrimination,” Mislove said. If the advertiser provides a sample audience of software engineers, for example, that might be considered a proxy for male profiles.
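To make the proxy effect concrete, here is a minimal sketch in Python using entirely synthetic data; the interest categories and correlations are invented for illustration and do not reflect Facebook’s actual features or code. Even though gender is never given to the model, it can be recovered from correlated interests:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 10_000
gender = rng.integers(0, 2, size=n)  # 1 = "male" in this synthetic population

# Hypothetical interest flags that merely *correlate* with gender, the way
# "trucking" or "software engineering" might in practice.
p_trucking = np.where(gender == 1, 0.50, 0.05)
p_parenting = np.where(gender == 1, 0.10, 0.40)
X = np.column_stack([
    rng.random(n) < p_trucking,
    rng.random(n) < p_parenting,
    rng.random(n) < 0.40,  # an uncorrelated control interest
]).astype(float)

# Gender itself is never an input, yet it is easy to predict from the proxies.
X_tr, X_te, y_tr, y_te = train_test_split(X, gender, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"Accuracy predicting gender from interests alone: {clf.score(X_te, y_te):.2f}")
# Typically prints about 0.7, well above the 0.5 chance rate.
```

Removing the protected column accomplishes little here, because the remaining columns carry much of the same signal; that is the dynamic Mislove describes.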
Facebook’s ad delivery algorithm further skews the audience based on the content of the ad itself, the researchers previously found. As a result, even when advertisers try to reach a diverse audience, they aren’t always able to. Dolese’s ad, for example, could have reached a predominantly male audience because it featured a man, or because an interest in trucking acts as a proxy for maleness, or both. (A Dolese spokeswoman said the ad targeted categories “that would appeal to someone in this line of work.”) The settlement did not resolve the potential bias from proxies and ad content, but said Facebook would study the issue.
ProPublica spotted multiple real-world employment advertisements that favored men or excluded older potential applicants. We found these ads and others in Facebook’s ad library, an archive of advertisements on the platform. Some inactive ads in the library, such as those from Dolese, contain information about how they performed, including a breakdown of the audience by age and gender. The library does not show the targeting choices the advertiser made on the front end.
In addition, testing by ProPublica found that housing and employment advertisers could circumvent the special ads portal and go through Facebook’s old system, which allowed them to target by age, race and gender. The settlement imposed a Sept. 30 deadline for implementing the new portal, but didn’t set a date for rerouting misplaced ads into it. A Facebook spokesman said the company stepped up policing on Dec. 4 with the launch of a special archive for housing. Employment and credit ads are expected to be added to the archive.
The findings by Northeastern and Upturn are likely to fuel the growing debate over whether algorithms that appear to favor one gender, age group or race over another violate the law even when they don’t explicitly consider such factors — an area of civil rights doctrine known as “disparate impact.” For example, the New York State Department of Financial Services is scrutinizing the criteria used by Apple’s credit card after reports last month that the company was extending smaller lines of credit to women than to men. Goldman Sachs, the bank behind the Apple Card, told ProPublica that it “has not and will never make decisions based on factors like gender, race, age, sexual orientation or any other legally prohibited factors when determining credit worthiness.” New York’s financial services department has also opened an investigation of Facebook over discrimination in ads for housing and other opportunities. The department declined to comment because the investigation is pending.
The Trump administration, which has backed away from Obama-era policies against disparate impact in education, is similarly retreating in its interpretation of housing law. In September, the U.S. Department of Housing and Urban Development introduced a rule under which banks and landlords cannot be held liable for an algorithm’s disproportionate effect on people of color, as long as the algorithm involved in housing decisions is not programmed with explicit information about protected classes, like race, or closely related proxies, like ZIP code. In an apparent contradiction of this stance, HUD also has a pending lawsuit against Facebook, alleging that its advertising system violated fair housing laws. HUD didn’t respond to repeated requests for comment.
Osborne did not respond to emailed questions about the investigations by HUD and the New York financial services department.
Fair housing, employment and credit laws, which prohibit overt discrimination against protected groups such as women, minorities and older people, have been interpreted to apply to digital advertising. In September, the Equal Employment Opportunity Commission found that seven employers broke the law when they excluded women and older workers from seeing job ads on Facebook. The EEOC has said it is considering charges against dozens more employers.
The settlement talks in the five lawsuits accusing Facebook of enabling advertising discrimination were confidential. But the civil rights groups involved in the case say that, while they anticipated the potential for machine bias, the disparate impact the new portal can produce doesn’t violate the settlement’s terms.
Galen Sherwin, a senior staff attorney at the ACLU, which represented the plaintiffs in the civil rights lawsuits, called the findings “unfortunate but not unexpected. The settlement was a step forward to eliminate the most overt forms of targeting handed to advertisers, but we knew that it wouldn’t solve the continued problem of targeting advertising based on vast troves of user data. The removal of just a few data points is not going to change the targeting outcome. There’s still a lot of work to be done to eradicate the bias that remains on the platform.”
Since targeted advertising is the core of Facebook’s business model, any effort to reduce a marketer’s ability to target by age or gender can hurt the company’s bottom line. Facebook has acknowledged in SEC filings that its anti-discrimination initiatives have had “a small negative impact” on its advertising business.
The company, which controls 22% of the U.S. digital ad market, according to eMarketer, has repeatedly come under fire for allowing advertisers to unlawfully discriminate against black homebuyers, older people and female debtors, among others. In 2016, ProPublica first reported that Facebook’s tools allowed housing marketers to exclude blacks, Hispanics and other groups protected by the Fair Housing Act. Facebook promised to fix the problem, but in 2017, we found that it hadn’t: Advertisers could still target housing ads to whites only and exclude African Americans, Jews and Spanish speakers from seeing them. This reporting spurred the five lawsuits resolved by this year’s settlement.
Prior to the March settlement, housing, credit and employment advertisers had access to a “Lookalike audience” feature, in which they could upload a list of users, known as a “source audience,” and Facebook would reach more people who “look like” them, based on criteria that included age, race and gender. Now, if these advertisers want to upload a source audience, they use a modified version of the “lookalike feature” known as a “special ad audience.” The algorithm for the special ad audiences no longer considers the fields on a Facebook user’s profile identifying “age, gender, relationship status, religious views, school, political views, interested in, or zip code” as inputs.
Nevertheless, the researchers found that Facebook’s algorithm perpetuates existing discrepancies in gender and age in a given source audience through a myriad of related characteristics. The study suggests that a similar bias would appear for race, but it is harder to document; Facebook gives advertisers data on the people they reached by age and gender, but not by race.
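The researchers’ finding can be illustrated with a toy version of a lookalike-style expansion, sketched below in Python with synthetic data and scikit-learn’s nearest-neighbor search. This is an assumed stand-in for, not a reconstruction of, Facebook’s system: matching on non-protected traits alone is enough for the expanded audience to inherit the source audience’s gender skew.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(7)

def make_users(n, frac_male):
    """Synthetic users whose interests correlate with (but never include) gender."""
    gender = (rng.random(n) < frac_male).astype(int)
    traits = np.column_stack([
        rng.random(n) < np.where(gender == 1, 0.50, 0.10),  # e.g. "software engineering"
        rng.random(n) < np.where(gender == 1, 0.10, 0.40),  # e.g. "parenting"
        rng.random(n),                                      # unrelated, continuous signal
    ]).astype(float)
    return traits, gender

pool_X, pool_gender = make_users(50_000, frac_male=0.50)  # the general user pool
src_X, src_gender = make_users(1_000, frac_male=0.85)     # a skewed source audience

# Expand the audience by finding pool users most similar on non-gender traits.
nn = NearestNeighbors(n_neighbors=5).fit(pool_X)
_, idx = nn.kneighbors(src_X)
expanded = np.unique(idx.ravel())

print(f"Source audience:   {src_gender.mean():.0%} male")
print(f"Expanded audience: {pool_gender[expanded].mean():.0%} male")
# The expansion skews male even though the gender field was never used.
```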
In one experiment, the researchers created two source audiences: one composed of more than 11,000 Facebook employees and the other of more than 11,000 random Americans. They uploaded each of these source audiences into Facebook’s new special ad audience system, ran the same generic employment ad — “Find your next career!” with a link to the Indeed.com job search site — and compared the results. When the special audience was generated from the source list of Facebook’s employees, the ad was delivered to a new audience that was significantly biased along age and gender lines, reaching 88% men, nearly half of whom were aged 25 to 34.
These proportions roughly reflected Facebook’s workforce, which has been criticized for its lack of diversity. According to its 2019 diversity report, 77% of the company’s technical workers are male. Facebook does not disclose ages, but according to the market research firm Statista, the median age of a Facebook employee is 28. In another indication that the algorithm tends to replicate the source demographics, half of this special audience lived in California, where Facebook is headquartered. By comparison, the ad audience generated from the list of random Americans reached 54% men, of whom 15% were aged 25 to 34, and 2% lived in California.
Such examples raise concerns that employers could still tailor their audience to focus on groups that they consider the most appealing job candidates, said Peter Romer-Friedman, a lawyer with Outten & Golden in Washington who represented several plaintiffs in the cases against Facebook. “It was certainly a fear of mine that advertisers in good or bad faith could circumvent anti-discrimination measures by uploading their own unbalanced list of Facebook users,” he said. “We knew that this would be a very real risk and that Facebook had not taken any measures to prevent discriminatory custom audience lists from being used.”
Also potentially contributing to disparate impact is Facebook’s ad delivery algorithm. To make sure that ads are seen by people who Facebook thinks are most likely to click on and engage with them, Facebook skews the audience depending on the content of the ad itself, said Piotr Sapiezynski, an associate research scientist at Northeastern and the lead author of the new study. How many women see a job listing for an open janitorial position, for instance, depends not just on what the advertiser tells Facebook, but on how Facebook interprets the advertisement’s image and text.
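A simplified way to picture that delivery step: rank users by predicted click-through rate and show the ad to the top of the ranking. In the Python sketch below, the per-gender click rates are invented assumptions standing in for patterns a real engagement model might learn from past behavior; the point is only that engagement optimization plus neutral targeting can still produce a skewed audience.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
gender = rng.integers(0, 2, size=n)  # 1 = "male" in this synthetic population

# Invented per-gender click-through estimates for two ad creatives, standing
# in for whatever a real engagement model might have learned from past clicks.
ctr_supermarket = np.where(gender == 1, 0.010, 0.022) * rng.lognormal(0, 0.5, n)
ctr_ai_job      = np.where(gender == 1, 0.024, 0.012) * rng.lognormal(0, 0.5, n)

def deliver(predicted_ctr, budget=10_000):
    """Show the ad to the `budget` users with the highest predicted engagement."""
    return np.argsort(predicted_ctr)[-budget:]

for name, scores in [("supermarket job", ctr_supermarket), ("AI job", ctr_ai_job)]:
    shown = deliver(scores)
    print(f"{name}: {gender[shown].mean():.0%} of impressions went to men")
# With no targeting at all, the supermarket ad skews female and the AI ad
# skews male, purely because delivery optimizes predicted engagement.
```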
This effect persists in the new portal. The researchers uploaded a demographically representative audience and ran ads for artificial intelligence and supermarket jobs without any targeting. An ad for a supermarket job reached an audience that was 72% women and mostly aged 35 or older, whereas the audience for AI jobs was 66% male and almost exclusively younger than 35. Why the algorithm picked these audiences is not known, though it may have observed that women are more likely to shop for groceries and men are more likely to work in computer science.
Facebook’s system appears to draw similar conclusions about construction, a historically male trade. This assumption, however, can thwart advertisers who are trying to make the future of construction look different from its past.
The Chicago-based chapter of the International Union of Operating Engineers, Local 150, has long used Facebook, along with traditional TV and print advertising, to publicize an apprenticeship program for people interested in a career in construction. Ed Maher, the union’s communications director, said that the goal of its advertising was to attract diverse candidates and show that “there’s room for everyone in Local 150 trades.”
But the union’s goals may have been stymied, as Maher put it, by “nuances of the algorithm that lay entirely outside our control.” Several of the union’s Facebook ads, purchased in November, feature videos and photos of female members and women of color; the ads are not targeted by gender, but by location and to those interested in construction. Nevertheless, Maher said, the audience reached by the ads is about two-thirds men.
Paradoxically, because advertisers can no longer target by age or gender, they have little recourse to remedy these disproportions. “It’s an unintended consequence,” Mislove said. “You can’t say, ‘Steer it toward men instead.’ Facebook gives you no way to say, ‘I want this to be balanced.’”
The Muslim Public Affairs Council, a nonpartisan advocacy group for Muslim representation, has experienced this constraint firsthand. Each summer, the nonprofit runs a 10-week program for college-aged students in Washington, D.C. Known as the Congressional Leadership Development Program, the internship offers housing, a stipend and the opportunity to work on the Hill. According to Ann Vallebuona, a digital media manager with MPAC, the program goes out of its way to reach a diverse group of applicants. One Facebook ad, which started running on Nov. 12, read: “Apply to MPAC’s Congressional Leadership Development Program today. WORK ON CAPITOL HILL! Don’t wait. Apply now. 🏛” The accompanying video features six young Muslim American men and women discussing the internship’s highlights over peppy techno beats.
When ProPublica told Vallebuona that Facebook’s archive indicated that the council’s ad was reaching 73% men, she was shocked. “It really is a quite alarming difference between females and males being reached,” she said, adding that far more women than men actually apply to the program.
Osborne did not respond to questions about whether Facebook’s algorithm considered words like “leadership” or “congressional” more relevant to men. He said that the ad system is designed to show ads to people most likely to take action, based on their behavior and intent.
Similarly, last week, the Christian charity Lifewater International advertised an engineering job to support its missions in Africa and Southeast Asia. Gary Weyel, director of marketing and communications at Lifewater International, said that he used the special ads portal to run his job advertisement and had no desire to target by age or gender. Yet the ad’s audience was almost 70% men, and mostly between the ages of 18 and 34.
“I want to get this in front of qualified applicants, who are aligned with our mission and values. They’re making it difficult to do that, especially if there are behind-the-scenes algorithms like this,” he said.
Most of Facebook’s safeguards against bias live within the new special ads portal. Advertisers who want to run housing, credit or employment ads are supposed to click a button identifying their ads, and Facebook requires them to sign a “self-certification” statement agreeing to follow anti-discrimination laws.
As part of the settlement, the company has introduced an automated classification system to detect these “special ads” and route them into the portal. It also promised to add automated and human reviewers to make sure that advertisers were using the new system. But, at least until last week, Facebook may not have been catching many of the discriminatory housing, employment or credit ads bought through its traditional tools, which allow for targeting by age, gender and race.
Last month, ProPublica bought dozens of housing and employment ads on Facebook, but did not designate them as such. The classification program rejected a few of them, including one that read, “We’re hiring. Are you the right gentleman for the job?”
But it did not stop most of them, even when the ad copy specifically said, “We’re hiring” or, “Rent this home.”
We purchased multiple ads linking to employment sites such as Indeed.com, with copy such as “Come work with us,” targeting these mock opportunities exclusively to men under the age of 40. We also bought housing ads targeted by age, gender and family status. One ad reached only people between the ages of 25 and 45 with the relationship status of “married” and with an interest in parenting school-aged children or pre-teens.
The ads started running within minutes. ProPublica has since removed them.
Facebook’s Osborne said that the company had not been tightly monitoring how employment, housing and credit ads were purchased across its advertising platforms until Dec. 4. Facebook’s protections would have caught the ads had they run after that date, he said.
We may not be the only group to have run employment ads outside of the portal. Some ads in Facebook’s library have been shown to audiences that were almost 100% men or 100% women, a breakdown that should be impossible without specific targeting. This includes a Nov. 4 ad from Barker and Sons Plumbing, recruiting plumbers in California into the “Barker and Sons family.” According to the ad, the jobs offered “competitive wages, comprehensive medical insurance, training programs and more.” It was shown only to men.
Several days later, Quantum Health, a medical benefits counseling firm, advertised openings for a “healthcare hero,” a “customer service pro,” or “a recent college graduate driven by a passion for helping people.” The opportunities were shown almost exclusively to Facebook users under the age of 44.
Barker and Quantum did not respond to repeated requests for comment.
As the settlement requires, Facebook is studying how to protect against potential discrimination in ads and engaging with academics, civil rights groups and other experts, Osborne said. The company is not required to take specific actions or make this research public, according to plaintiffs in the case.
Steven Lindner, a talent acquisition expert with the Society for Human Resource Management, a coalition of human resources professionals, said employers concerned about legal liability for recruiting discrimination should take the new findings seriously: “If the algorithm is proving to be discriminatory, the employer should stop using Facebook,” he said. “That’s what I would advise my clients.”
Correction, Dec. 14, 2019: This story originally misstated the last name of an associate research scientist at Northeastern. It is Piotr Sapiezynski, not Sapiensinski.
Correction, Dec. 19, 2019: This story originally misstated the last name of a talent acquisition expert with the Society for Human Resource Management. He is Steven Lindner, not Linder.