Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp013r074x98x
Full metadata record
DC Field: Value
dc.contributor.advisor: Knox, Dean
dc.contributor.author: Ryoo, Haneul
dc.date.accessioned: 2020-10-01T14:15:33Z
dc.date.available: 2020-10-01T14:15:33Z
dc.date.created: 2020-04-27
dc.date.issued: 2020-10-01
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp013r074x98x
dc.description.abstract: After a string of police-caused deaths of African American men in the mid-2010s, the United States had no choice but to turn its attention to the state of policing. The graphic videos of these killings, and the subsequent acquittals of the officers responsible, signaled to the public that the system was broken. Protestors rallied in the streets demanding justice and change. These cries were met with a quick response from policymakers and law enforcement officials, who all agreed that something must be done. The answer to the broken system? Police body-worn cameras. Body-worn cameras were quickly adopted by law enforcement agencies across the United States, and today more than half of all police departments use them in some capacity. However, five years after the beginning of the body-worn camera movement, the sweeping support for these tools has died down. Questions of effectiveness, privacy violations, and financial cost have politicians, police officers, and civilians reconsidering the role that body-worn cameras should play in America. One of the largest oversights in the implementation of police body-worn cameras was the underestimation of just how much data these tools would produce. The overwhelming volume of footage currently being captured by police officers is not only too expensive for police departments to store, but also impossible for supervisors to adequately review. These issues have led numerous police departments to consider ending their body-worn camera programs. In response, a team of researchers at Princeton University has proposed a project to create an AI system that would automate analyses of police body-worn camera footage. This paper lays the groundwork for that project, offering an evaluation of police departments' needs to guide the team in a direction that would be most beneficial to police officers.
After establishing the goals of the project, this paper describes the technical requirements necessary to achieve them, from the data required to the types of computer vision techniques the team will need to use. It then asks whether these requirements can be met, and whether the project is ultimately feasible. The answer to both questions is yes, given the collaboration of a police department. The thesis then moves on to the ethical considerations of the Automated Analysis Project, first establishing that the continued use of police body-worn cameras is justified given the long-term expectation that benefits will increase and costs will decrease with advances in technology and legislation. Then the ethics of the project itself are considered, reaching the conclusion that although the project has the potential to make a significant positive impact on the future of policing in America, its mental health risks must first be addressed. The paper concludes by tackling the too-often-ignored issue of researcher protection for the Automated Analysis Project team. Specifically, it puts forth a set of recommendations for the team to follow in establishing a data pipeline that effectively mitigates the detrimental mental health effects that vicarious exposure to police body-worn camera footage may have on researchers. While these recommendations are targeted toward the Automated Analysis Project team, the thesis serves as an example for all researchers who plan to engage in projects that pose risks to the researchers themselves. It underscores the need for researchers to take responsibility for the well-being of their team members, and to give researchers the same respect and consideration that has been afforded to research subjects in research environments.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: The Untold Side of Artificial Intelligence: A Call For Researcher Protection From Vicarious Traumatization
dc.type: Princeton University Senior Theses
pu.date.classyear: 2020
pu.department: Princeton School of Public and International Affairs
pu.pdf.coverpage: SeniorThesisCoverPage
pu.contributor.authorid: 961157110
Appears in Collections: Princeton School of Public and International Affairs, 1929-2020

Files in This Item:
File: RYOO-HANEUL-THESIS.pdf (2.38 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.