As the conversation around racial justice continues throughout the country, many organizations are examining how their platforms and policies may introduce bias and contribute to systemic discrimination.
Albany Law School’s Associate Dean for Strategic Initiatives Antony Haynes is serving as a valuable resource in one such project, lending his law and technology expertise to Airbnb. The popular travel and lodging service has launched the initiative Project Lighthouse, which the company says will uncover, measure, and combat discrimination and bias experienced by people of color on its service. Dean Haynes served as a reviewer for Airbnb’s recently released paper, “Measuring discrepancies in Airbnb guest acceptance rates using anonymized demographic data,” which was written by the project’s anti-discrimination team.
The information uncovered through Project Lighthouse will help Airbnb create a new benchmark to fight discrimination—all while balancing the collection and protection of user data. A common roadblock in eliminating bias in algorithms and machine learning is that the programs often involve protected trade secrets, according to Dean Haynes.
Dean Haynes, who also serves as director of Cybersecurity and Privacy Law, was one of several data privacy experts consulted ahead of the project’s launch. Airbnb opening up its model for study is a refreshing step in the right direction, he said.
“I haven’t seen any [other entity] try to preserve the anonymity of the human beings whose data we’re collecting and using to feed into our big data analysis to determine if there’s a racially disparate impact,” he said. “It shows you what’s possible.”
Project Lighthouse—a partnership with Color of Change, the nation’s largest online racial justice organization—will focus specifically on Airbnb’s reservation process, including why cancellations occur, user reviews, in-app correspondence, and search data. Users can opt out of the project, and the information collected will be kept separate from any individual’s profile.
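To illustrate the kind of analysis the paper’s title describes, the sketch below compares booking-acceptance rates across perceived demographic groups using anonymized records. This is purely an illustrative example, not Airbnb’s actual methodology or code; the group labels and data are hypothetical.

```python
# Illustrative sketch only: not Airbnb's actual methodology or code.
# Compares acceptance rates across perceived demographic groups using
# anonymized records that carry no names or profile links.

from collections import defaultdict

# Hypothetical anonymized records: (perceived_group, request_accepted)
records = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

def acceptance_rates(rows):
    """Return each group's share of accepted booking requests."""
    totals = defaultdict(int)
    accepted = defaultdict(int)
    for group, ok in rows:
        totals[group] += 1
        if ok:
            accepted[group] += 1
    return {g: accepted[g] / totals[g] for g in totals}

rates = acceptance_rates(records)
# Discrepancy: gap between the highest and lowest group acceptance rate.
gap = max(rates.values()) - min(rates.values())
print(rates, round(gap, 2))
```

Because the records are decoupled from individual profiles before analysis, a gap like this can be measured in aggregate without identifying any user, which is the privacy balance the project aims for.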
“This is the first really powerful solution to a very complicated problem that also incorporates data privacy concerns,” Dean Haynes said. “It’s a very important and complicated challenge.”
Data from this project will also help to build a set of standards for building new software programs, he added. Right now, there is no set of parameters; developers and designers build and fix things as they break. Ideally, one day there will be a framework to follow, and initiatives such as Project Lighthouse will have a similar impact on technology as Upton Sinclair’s novel The Jungle had on the meat-packing industry, Dean Haynes said.
“I think that where we are with software today [is similar to] where the country was in the [early] 1900s with food and drugs. At that time, that was the most radical, most intrusive government regulation. It upended all food preparation—cosmetics and drugs, everything—and to this day companies are unhappy with the level of requirements of the FDA. But it was necessary. That’s where we are with algorithms,” he said.
Earlier this year, at Towson University’s President’s Inclusive Leadership Institute, Dean Haynes presented on the ways unconscious bias enters hiring practices. An example: a startup with a “cultural fit” portion of the interview process, during which current employees meet with candidates to see how they get along.
“You’re more likely to replicate people who are already there and whatever their demographic is,” he said. “You have to understand what it means—you may have an unconscious bias in your system. There’s no algorithm there but there’s a process you’re following.”
The presentation was a culmination of Dean Haynes’ ongoing study of "immoral software" and bias in automation. When humans program, their biases, unconscious or sometimes conscious, appear in the result, and he has been working to add to the body of evidence.
Project Lighthouse is another example—and Dean Haynes said he’s looking forward to seeing the data and calibrating the model to work in other settings.
Dean Haynes believes the project will set a crucial precedent for other companies to eliminate bias as they build software, use machine learning, and rely on algorithms for information. By getting the framework started now, Airbnb is providing a concrete example of the underlying anti-discrimination work for other companies—especially as they hire and promote employees to do that work.
“The disparities around race and gender become even more profound and pronounced as you move up the managerial and executive levels,” Dean Haynes said. “The idea of diversifying tech is very important. If companies take the ideas—the high-level ideas behind Airbnb’s Project Lighthouse, which focuses on one problem—and you generalize out the entire organization, you can positively impact the level of diversity in an organization.”