Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp014b29b837r
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Felten, Edward W | en_US |
dc.contributor.author | Kroll, Joshua Alexander | en_US |
dc.contributor.other | Computer Science Department | en_US |
dc.date.accessioned | 2015-12-07T20:00:03Z | - |
dc.date.available | 2015-12-07T20:00:03Z | - |
dc.date.issued | 2015 | en_US |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp014b29b837r | - |
dc.description.abstract | Important decisions about people are increasingly made by algorithms: votes are counted; voter rolls are purged; financial aid decisions are made; taxpayers are chosen for audits; air travelers are selected for search; credit eligibility decisions are made. Citizens, and society as a whole, have an interest in making these processes more transparent. Yet the full basis for these decisions is rarely available to affected people: the algorithm or some inputs may be secret; the implementation may be secret; or the process may not be precisely described. A person who suspects the process went wrong has little recourse. Traditionally, computer science addresses these problems by demanding a specification of the desired behavior, which can then be enforced or verified. But this model is poorly suited to real-world oversight tasks: real specifications are complicated or may not be known in advance, and laws are often ambiguous precisely because it would be politically infeasible to give a precise description of their meaning. People do their best to approximate what they believe the law will allow; disputes about what is acceptable happen after the fact via expensive adjudication; actual oversight happens only rarely, if at all. This dissertation relates the tools of technology to the problem of overseeing decision-making processes. These methods use computer science to ensure properties that can be proven, while providing the information necessary for a political, legal, or social oversight process to operate effectively. First, we present an example of the current state of the art in technical systems for ensuring accountability: a well-defined policy, specified in advance, is operationalized with technical tools, and those same tools are used to convince outsiders or auditors. Our system allows the accountable execution of orders by a judge for compelled access to private records by an investigator. Moving beyond these methods, we present a general framework for accountable algorithms, unifying a suite of cryptographic tools to design processes that enable after-the-fact oversight, consistent with the norm in law and policy. Accountable algorithms can attest to the valid operation of a decision policy even when all or part of that policy is kept secret. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Princeton, NJ : Princeton University | en_US |
dc.relation.isformatof | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu/ | en_US |
dc.subject | Accountability | en_US |
dc.subject | Assurance | en_US |
dc.subject | Cryptography | en_US |
dc.subject | Oversight | en_US |
dc.subject | Security | en_US |
dc.subject | Zero-Knowledge | en_US |
dc.subject.classification | Computer science | en_US |
dc.subject.classification | Public policy | en_US |
dc.subject.classification | Law | en_US |
dc.title | Accountable Algorithms | en_US |
dc.type | Academic dissertations (Ph.D.) | en_US |
pu.projectgrantnumber | 690-2143 | en_US |
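The abstract's closing claim, that a decision maker can attest to the valid operation of a policy kept secret, rests on cryptographic commitments. The following is a minimal illustrative sketch of that idea only, not the dissertation's actual protocol; the policy string and function names are hypothetical:

```python
import hashlib
import hmac
import secrets

def commit(policy: bytes) -> tuple[bytes, bytes]:
    """Commit to a secret decision policy: publish the digest now,
    reveal the policy and nonce later so anyone can verify."""
    nonce = secrets.token_bytes(32)  # blinds the commitment
    digest = hashlib.sha256(nonce + policy).digest()
    return digest, nonce

def verify(digest: bytes, policy: bytes, nonce: bytes) -> bool:
    """Check a revealed policy against the previously published digest."""
    return hmac.compare_digest(digest, hashlib.sha256(nonce + policy).digest())

# The decision maker publishes `digest` before any decisions are made;
# an overseer later checks the revealed policy against it.
policy = b"audit if reported_income < 0.5 * industry_median"  # hypothetical
digest, nonce = commit(policy)
assert verify(digest, policy, nonce)                    # honest reveal passes
assert not verify(digest, b"a different policy", nonce) # substitution detected
```

A full accountable-algorithms construction goes further, using zero-knowledge proofs so that correct application of the committed policy can be demonstrated without revealing the policy at all; the commitment above only binds the decision maker to a single policy for later audit.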
Appears in Collections: | Computer Science |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Kroll_princeton_0181D_11518.pdf | | 1.3 MB | Adobe PDF | View/Download |
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.