Meet the Researcher: Sam Hafferty

Pictured: Sam Hafferty (left); Jason Persaud ’27 (right)

Sam Hafferty is part of the 2024 – 2026 Emerging Scholar Program cohort at the Princeton Center for Information Technology Policy (CITP). Hafferty is contributing to work concerning data privacy regulation and broadband equity. Princeton undergraduate Jason Persaud ’27 recently sat down with Hafferty to discuss how they got started in this type of work, how their creative background ties into their interests, and what advice they would offer others interested in responsible tech.

Jason Persaud: Could you tell us a little bit about yourself and some of the work that you do here at CITP?

Sam Hafferty: Yes. I’m an Emerging Scholar Research Fellow here. My main areas of interest and work have been digital literacy, broadband equity, and privacy – mainly digital privacy. 

Jason: I saw from your background that you majored in postcolonial studies and history in undergrad, and then you got your master’s in telecommunications. How has that shaped the way that you think about technology and privacy?

Sam: I think coming from the liberal arts background that I have from undergrad specifically really taught me to think more critically about dominant systems – systems that hold a lot of power hierarchically. And that sort of led me to approach things, particularly modern technologies, with a bit of skepticism and try to always think things through as I’m responding to them, or as they’re being implemented.

And in my break between undergrad and the master’s program that I did, I got very into reading about privacy and surveillance capitalism – just because it was lockdown. In 2020 we saw a lot of very quick adoption of digital technology throughout education: Zoom, Chromebooks being sent home with kids, so on and so forth. And so I was confronted with that while working in a public library with children, observing a lot of their experiences with modern technologies in the classroom. Coming from that liberal arts background informed the way that I was able to approach that.

Jason: On that note, I saw some of your work regarding regulating FRT – facial recognition technology – in New York City. Could you speak a little bit more about that?

Sam: So outside of CITP, in my personal life, I am involved in some legislative advocacy work related to regulating how facial recognition technology can and should be adopted in various settings. There are some pending bills regarding public accommodations and FRT used in the classroom. Currently, there are few guardrails around how data is captured and protected in these contexts. So yeah, that’s a personal passion of mine as well – not only making sure that these bills are sufficient and moving by holding city council members accountable, but also trying to increase general public awareness on these topics.

“I’m very, very glad that I just sort of followed my gut on things that I thought were important and interesting and sought out other folks working on similar things…So don’t doubt yourself”

Jason: Nice. I also saw a little bit about your work on Discernible Tech. Could you explain what that project is?

Sam: That was my thesis project for the master’s program that I was in. That program – the Interactive Telecommunications Program – was also an interdisciplinary type of thing: art, tech, design. We had a lot of people interested in user experience, user design, that sort of thing. And for my thesis, which was prompted to have some sort of design or creative output, I focused on, again, public understanding of privacy: surveillance infrastructure, what privacy rights we have. And I made use of data visualization and mapping tools in order to get that point across. Specifically, it focused on facial recognition technology-enabled cameras in Brooklyn and sort of mapped those out to form an interface for people who lived in the area to see how many cameras were on their walk. And it’s from a very messy data set, so it was more of a speculative kind of thing in the end. But surrounding that work, I had a lot of very interesting conversations with my neighbors.

Jason: Yeah, that’s really cool. I know you mentioned in your previous answer – as well as kind of alluding to it right now – but surveillance technology is being used in schools, like in the classroom. Is that for safety? What are your thoughts on that trade-off between safety and surveillance, especially for minors?

Sam: This is actually the focus of a project that Nitya [Nadgir] – who you recently spoke to – and I have worked on at CITP along with a really solid team of folks at the center and beyond. We looked at school-issued devices like Chromebooks (laptops) that are given to students to work at home on their homework or whatever else.

Those devices typically have software loaded onto them in order to protect kids from harmful content – which, you know, maybe sounds vague, and it can be. But the proposed aim of implementing surveillance technology, whether it be physical like facial recognition technology or digital monitoring like some of these products offer, is typically to protect students from what might be considered harmful content and communications by the controlling school, for example.

And secondly – and this is a more recent thing that I’m kind of tackling in some upcoming and current work, also in this realm of safety – the aim is to sort of understand contextually whether a student might be harming themselves or planning to harm themselves or others.

That’s what a lot of these private companies that we’re looking at share as their guiding aim. For example, with computer vision, there’s a company called ZeroEyes that employs object-detection technology in schools in order to identify guns in a school environment.

Another example is some of the products that we have looked at in the digital space – on the school-issued Chromebooks and such – that employ natural language processing and image analysis in order to contextually understand if a student is writing something in a Google Doc that might allude to them planning to harm themselves, or if they were just making a joke.

Safety is a huge conversation around the use of these technologies. But again, almost all of the products that we looked at – at least in our work and a lot that are at the forefront of these discussions – are made by private, for-profit companies, and they’re being employed in public settings, right? And there’s very little modern regulation that is equipped to even understand how these technologies work, let alone ensure rights-preserving use cases of them.

Jason: It’s really interesting. Wow, I didn’t realize technology was used in that manner, especially in the classroom. Is that a case study in a certain school, or is this across the United States?

Sam: This is deployed across the United States. Senators Elizabeth Warren and Ed Markey put out a report just a few years ago raising similar concerns about how quickly these were being adopted and deployed all throughout the country. Not every school uses them, right, but many, many do. There are, at this point, dozens and dozens of companies – at least in the digital software realm – that are marketing products for online safety and detection and things of this nature.

Jason: Okay, your work covers a lot, so what advice would you give to undergrads who are interested in the sort of space you’re studying or involved in?

Sam: Well, number one, you don’t have to be an expert on anything to get started or to really, really immerse yourself in these topics. If you’re interested in it – even just a little bit – see what work people at your university or in your community are doing, and make it known that you are interested and that you are available to work on these projects. 


As mentioned, I come from a liberal arts background. And making that jump to focusing on issues related to technology when I didn’t even know how to change a Word doc into a PDF when I finished liberal arts school – I was not very technologically literate at that time – was a scary jump to make. I struggled a lot with confidence, but I’m very, very glad that I just sort of followed my gut on things that I thought were important and interesting and sought out other folks working on similar things in order to support that work and work on projects related to it. So don’t doubt yourself is my number one thing.

Jason: Amazing, yeah. Okay, final question: What’s a myth about either surveillance or digital equity – or both – that you wish people would stop repeating, and what’s the better take they should have?

Sam: Well, this is a really common one, but I have to say it because it’s the first thing that comes to my mind: “Why should I worry about surveillance if I have nothing to hide?” Right? Which, you know, fair – if you don’t want to worry yourself over it, you don’t have to. But I definitely think the level of data collection present throughout our lives – whether in our personal lives, work lives, or school lives – might go a lot deeper than many people expect.

And this ties pretty closely into digital literacy, too – how we understand the capabilities of these devices and the internet, period. I think things go a lot deeper than people expect, and there’s a lot more than just finding a “bad guy” or enforcing the law using surveillance technology. There’s an entire ecosystem of your data – our data – being collected and monetized in different ways, and certain predictions being made about us that, down the line and currently, can have a ton of unintended impact on the way that we’re able to live our lives.

Jason Persaud is a Princeton University junior majoring in Operations Research & Financial Engineering (ORFE), pursuing minors in Finance and Machine Learning & Statistics. He works at the Center for Information Technology Policy as a Student Associate. Jason helped launch the Meet the Researcher series at CITP in the spring of 2025.
