
On February 1, 2025, Colorado’s Transportation Network Transparency Bill (SB24-075) took effect. CITP scholars, along with their colleagues, played a key role in supporting advocacy around the bill by creating the FairFare app, which provided transparent data to help drivers, union organizers, and policymakers better understand the ride-hail industry.
Among other things, the new law requires Transportation Network Companies (TNCs) like Uber and Lyft to disclose to drivers and riders what the driver is paid and how much of the fare the company keeps for itself. SB24-075 requires TNCs to show the driver, immediately at the end of a ride and on a single screen, the total paid by the rider excluding any tip; and to show the rider, again prominently on a single screen, the total amount the driver will receive, quickly enough that the information appears before the rider leaves a tip.
The Team Behind The Workers’ Algorithmic Observatory
This sort of transparency is intended to allow both drivers and riders to make their own decisions about how much to work, ride, and tip. It’s an area of research interest for some scholars at CITP and their colleagues from other universities; together they form the Workers’ Algorithmic Observatory (WAO). The team of researchers looking into transparency issues with TNCs includes Andrés Monroy-Hernández, Princeton Assistant Professor of Computer Science and co-leader of the Princeton HCI Lab; Varun Rao, Princeton Computer Science PhD student; Samantha Dalal, University of Colorado Boulder PhD student; and Dana Calacci, Assistant Professor at Penn State and former CITP post-doc. Professor Monroy-Hernández also leads the AI & Labor Seminar Series hosted at CITP.
The scholars at the Observatory worked with a Colorado-based drivers’ union to co-design FairFare, through which drivers can voluntarily share information about their rides. After consulting with the union, the researchers aimed to calculate what they call the “take rate” – the percentage of the rider’s price (minus tips) that the TNC takes as a platform fee. They found that the TNCs’ take rate was consistently higher than what drivers considered fair.
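The take rate described above can be computed directly from the two numbers SB24-075 requires TNCs to disclose: what the rider paid (excluding tip) and what the driver received. A minimal sketch, assuming those two figures are available; the function and argument names are illustrative, not taken from FairFare’s actual code:

```python
def take_rate(rider_paid: float, driver_received: float) -> float:
    """Percentage of the rider's price (excluding tip) kept by the TNC.

    rider_paid: total paid by the rider, excluding any tip.
    driver_received: total amount the driver receives for the ride.
    """
    if rider_paid <= 0:
        raise ValueError("rider_paid must be positive")
    return 100 * (rider_paid - driver_received) / rider_paid

# Example: rider pays $25 before tip and the driver receives $15,
# so the platform keeps $10 of the $25 fare.
print(take_rate(25.0, 15.0))  # -> 40.0
```

Crowdsourcing both numbers through an app like FairFare is what makes this simple ratio computable at scale, ride by ride.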
A Peek Under the Hood
Dalal explains, “For many years, drivers in Colorado have felt that platforms are pushing down their wages while raising rider prices. Without transparency into how much drivers are making and how much riders are paying, it is impossible for workers to empirically investigate changes in their working conditions and come to the bargaining table with companies on equal footing.” This is why the data is crucial – “Increased transparency into pay conditions is a first step towards putting workers on an equal playing field with app companies,” Dalal says.
Their recent paper, “FairFare: A Tool for Crowdsourcing Rideshare Data to Empower Labor Organizers,” presenting their research findings, is now available. These findings – along with studies of drivers’ online communications to gauge their concerns – are the type of research that can help inform legislators about TNC platform regulations.
“In Colorado, we are using the FairFare data to investigate the impacts of increased transparency on rider tipping behavior. Drivers hope that when riders see how little of their fare goes to the driver, they will tip more. But that remains to be seen,” says Dalal.
Rao reflects, “I view transparency as the first step toward greater accountability. Right now, we don’t fully understand how these AI and algorithmic platform decisions, impacting people’s work, are made.” He goes on to say, “Increased transparency can help explain what’s happening under the hood and help us design effective interventions to monitor platform behavior and improve worker well-being.”
Sifting Through Reddit
The team analyzed data from Reddit from r/uberdrivers and r/lyftdrivers to explore transparency concerns among rideshare drivers at scale. These online communities are rich with discussions on various aspects of rideshare work, including transparency. However, much of the data was irrelevant to their study, so WAO developed a method to filter and transform unstructured conversations into structured data, resembling community survey responses.
Using LLMs, they devised a mixed-methods approach—combining LLM-based analysis with interviews—which, to the researchers’ knowledge, is the first of its kind. They call this methodology QuaLLM; it will appear as a separate paper in Findings of the Association for Computational Linguistics: NAACL 2025. Their findings from Reddit discussions validated insights from their interviews and demonstrated the generalizability of their approach.
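The filtering-and-structuring step described above can be thought of as a two-stage pipeline: discard posts irrelevant to the research question, then map each remaining post to a structured, survey-like record. The sketch below is only an illustration of that shape; it stands in for the LLM with a trivial keyword heuristic (the actual QuaLLM pipeline prompts an LLM for both stages, and every name and category here is hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Record:
    post_id: str
    concern: str   # survey-like category, e.g. "pay_transparency"
    excerpt: str   # supporting snippet from the post

# Stand-in relevance filter; QuaLLM uses LLM prompting for this stage.
KEYWORDS = {"fare", "pay", "take rate", "cut", "wage"}

def is_relevant(text: str) -> bool:
    lowered = text.lower()
    return any(keyword in lowered for keyword in KEYWORDS)

def to_record(post_id: str, text: str) -> Record:
    # Stand-in categorizer; the real pipeline prompts an LLM to label concerns.
    lowered = text.lower()
    concern = "pay_transparency" if "fare" in lowered or "take rate" in lowered else "other"
    return Record(post_id=post_id, concern=concern, excerpt=text[:80])

posts = [
    ("p1", "Uber's cut of my fare keeps growing and I can't see why."),
    ("p2", "Best car freshener for long shifts?"),
]
records = [to_record(pid, txt) for pid, txt in posts if is_relevant(txt)]
print([r.concern for r in records])  # -> ['pay_transparency']
```

The payoff of this shape is that unstructured forum chatter ends up as uniform records that can be tallied and compared like community survey responses.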
Public Interest Technology Enabling Accountability
The data from FairFare helped union organizers get a better sense of the broader conditions of work in Colorado, which informed their negotiations with platform companies. As one union advocate for drivers said in an interview for their research, the FairFare calculations “gave us a clearer picture of what to debate as far as the language in the bill and what would resonate with lawmakers,” and “helped us demand the need for transparency in our bill.”
As Calacci points out, “Workers and consumers deserve to know where their money is coming from and where it’s going. This bill helps workers make better decisions about their work while ensuring that companies like Uber treat riders and drivers with the basic respect they deserve.”
Princeton and the Workers’ Algorithmic Observatory are at the forefront of a national trend to demand transparency around take rates. According to another organizer who participated in the study, “I’ve heard your name [FairFare] in different circles just randomly because this research is going in many different people’s hands, so it’s also building a more national narrative.”
Envisioning What’s Next
Perhaps the most promising use of this data could be in gauging the effectiveness of various state and local interventions. For instance, after a locale passes a minimum wage law, how does that affect the take rate? As Rao explains, “We now have over 800 drivers participating and a dataset covering more than 800,000 rides. We’re excited to expand our research using this data—for example, enabling organizers to estimate unemployment compensation drivers could reclaim during deactivation periods, or investigating ‘algorithmic wage discrimination’, a term coined by legal scholar Professor Veena Dubal to describe unequal pay for similar work.”
Uber unsuccessfully challenged the new Colorado law in court, citing First Amendment free-speech concerns, despite having played a cooperative role in drafting it. Regardless, the WAO will continue assisting driver advocates with the data they need to make their case to state legislatures. The team hopes to continue and expand the FairFare project, allowing take rates to be calculated even if Uber refuses to share real-time data on each ride.
As Professor Monroy-Hernández puts it, “The WAO is an exciting initiative for me because it prioritizes supporting worker organizations over traditional academic research pursuits. Tools like FairFare help worker organizations collect and use data to serve needs we didn’t even think about. For example, FairFare assists worker organizations that provide legal representation for unfairly deactivated drivers by calculating their owed pay in locations where they have a legal right to compensation.”
“I look forward to expanding the suite of tools the WAO provides and the range of partnerships that can make use of them.”