insight. There can be no royal road to becoming Goethe. In scientific atlas after
scientific atlas, one sees explicit argument that “subjective” factors had to be part of the
scientific work needed to create, classify, and interpret scientific images.
What we see in so many of the algorists’ claims is a tremendous desire to abandon
judgment and rely instead on mechanical procedures, all in the name of scientific
objectivity. Many American states have legislated
the use of sentencing and parole algorithms. Better a machine, it is argued, than the
vagaries of a judge’s judgment.
So here is a warning from the sciences. Hands-off algorithmic proceduralism did
indeed have its heyday in the 19th century, and of course still plays a role in many of the
most successful technical and scientific endeavors. But the idea that mechanical
objectivity, construed as binding self-restraint, follows a simple, monotonic curve
increasing from the bad impressionistic clinician to the good externalized actuary
does not do justice to the more interesting and nuanced history of the sciences.
There is a more important lesson from the sciences. Mechanical objectivity is one
scientific virtue among others, a lesson the hard sciences have had to learn again and
again. We must
do the same in the legal and social scientific domains. What happens, for example, when
the secret, proprietary algorithm sends one person to prison for ten years and another for
five years, for the same crime? Rebecca Wexler, visiting fellow at the Yale Law School
Information Society Project, has explored that question, and the tremendous cost that
trade-secret algorithms impose on the possibility of a fair legal defense.44 Indeed, for a
variety of reasons, law enforcement may not want to share the algorithms used to make
DNA, chemical, or fingerprint identifications, which leaves the defense in a much
weaker position to make its case. In the courtroom, objectivity, trade secrets, and
judicial transparency may pull in different directions. It reminds me of a moment in the
history of physics. Just after World War II, the film giants Kodak and Ilford perfected a
film that could be used to reveal the interactions and decays of elementary particles. The
physicists were thrilled, of course—until the film companies told them that the
composition of the film was a trade secret, so the scientists would never gain complete
confidence that they understood the processes they were studying. Proving things with
unopenable black boxes can be a dangerous game for scientists, and doubly so for
criminal justice.
Other critics have underscored how perilous it is to rely on an accused (or
convicted) person’s address or other variables that can easily become, inside the black
box of algorithmic sentencing, a proxy for race. By dint of everyday experience, we have
grown used to the fact that airport security is different for children under the age of
twelve and adults over the age of seventy-five. What factors do we want the algorists to
have in their often hidden procedures? Education? Income? Employment history? What
one has read, watched, visited, or bought? Prior contact with law enforcement? How do
we want algorists to weight those factors? Predictive analytics predicated on mechanical
objectivity comes at a price. Sometimes it may be a price worth paying; sometimes that
price would be devastating for the just society we want to have.
More generally, as the convergence of algorithms and Big Data governs a greater
and greater part of our lives, it would be well worth keeping in mind these two lessons
from the sciences.
44 Rebecca Wexler, “Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System,”
70 Stanford Law Review XXX (2018).