The recent hard copy issue of Washington Lawyer arrived in my mailbox this weekend. After being amused that the content still arrives on paper via regular mail, I was struck by the cover story and imagery. The story is captioned “A Flawed Record: The Fragility of Eyewitness Memory”; the imagery, however, displays a visualization of the electronic connections, synapses, and measurement output processed by a human brain. The article by Sarah Kellogg is superb at explaining how memories can be shaped and altered by our biases, our experiences, and our preferences for a particular outcome. For me, the most telling insight was the observation that DNA testing of blood samples and other human fluids “. . . is keeping eyewitness testimony honest—painfully so.” In other words, the measured test results are proving superior as objective evidence with which to build an accurate understanding of the events and actors under examination.
But while the article intelligently highlights reforms being made to improve the reliability of eyewitness testimony, it overlooks something far more important—the emergence of the computer as a witness, one unswayed by biases, experiences, or personal preferences for a particular outcome. To be sure, for acts of physical violence that occur outside the surveillance of any machine, human eyewitness testimony remains important to reconstructing the events and actors involved. Yet, more and more, our computers are becoming monitors of and witnesses to our lives, our actions, our movements, our decisions, and our communications. As machines, computers are remarkably objective—their event logs, access logs, operations logs, keystroke logs, video recordings, and audio recordings merely capture and preserve data documenting what has, or has not, occurred in fact. There is no fiction and, unless programmed to exclude, omit, or delete specific content, the machines are comprehensive in what they capture and preserve.
As anyone remotely familiar with current trends in electronic discovery knows, the most compelling evidence of fraud or mismanagement is being found not in the direct communications or records (the digital equivalents of paper documents) but in the metadata, operating logs, and similar assets that reveal attempts to delete, modify, or otherwise wipe away the incriminating content preserved by the computers. Only the most sophisticated actors realize they must also wipe away the “mouse-prints” of their efforts to delete that objective evidence.
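To make the “mouse-prints” idea concrete, here is a minimal Python sketch of my own (an illustration, not a forensic tool): even after a file’s content is wiped, the file system’s own metadata records that the file was touched again after it was first written.

```python
import os
import tempfile
import time

# Hypothetical illustration: a record is created, then its content is
# "wiped" -- but the modification timestamp betrays the later edit.
with tempfile.NamedTemporaryFile(mode="w", delete=False, suffix=".txt") as f:
    f.write("original record")
    path = f.name

created_mtime = os.stat(path).st_mtime

time.sleep(1.1)  # ensure a measurable gap between the two writes

# Simulate an attempt to erase the incriminating content.
with open(path, "w") as f:
    f.write("")

wiped_mtime = os.stat(path).st_mtime

# The content is gone, but the metadata shows the file was modified
# again after its original creation -- a "mouse-print" of the wipe.
print(wiped_mtime > created_mtime)  # → True

os.unlink(path)
```

Real forensic work, of course, examines far richer traces (journal entries, deleted-file remnants, log correlations), but the principle is the same: the machine quietly documents the act of alteration itself.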
So, while assuring the reliability of eyewitness testimony remains important, just as DNA testing keeps that testimony honest, computers are rapidly emerging as superior witnesses—ones upon which judges, juries, and regulatory authorities are increasingly prepared to rely in building their understanding of the history of events against which the rule of law will be applied. Indeed, particularly for corporations and businesses, their computers are now the primary source of the evidence of the truth on which regulators rely in applying their rules. Oral eyewitness testimony is rapidly becoming obsolete and, for all of the reasons noted in the Kellogg article, less and less reliable.
But smart attorneys know that, just like those of human eyewitnesses, the reliability of computers and the trustworthiness of their digital evidence can be challenged. Indeed, building digital trust includes building the rules by which computers will be evaluated, both as eyewitnesses to events with legal significance and as custodians of the resulting records. Simply put, we cannot presume that any digital record is conclusive and objective evidence of the truth. Nor can we presume that every application, device, system, and network preserves information assets with the functional effectiveness needed to maintain their integrity and trustworthiness.
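One concrete way custodians defend the integrity of a digital record is cryptographic fingerprinting. The sketch below is my own hypothetical example (the log line and function name are invented for illustration): a hash recorded when the evidence is collected lets any later reviewer confirm that not even a single byte has changed.

```python
import hashlib

def fingerprint(record: bytes) -> str:
    """Return a SHA-256 digest of the record's exact bytes."""
    return hashlib.sha256(record).hexdigest()

# Hypothetical log entry collected as evidence.
original = b"2024-01-15 09:32:01 user=jdoe action=login result=ok"

# The custodian stores this digest at the moment of collection.
baseline = fingerprint(original)

# An unaltered copy verifies against the baseline...
print(fingerprint(original) == baseline)  # → True

# ...while even a small alteration produces a different digest.
tampered = original.replace(b"ok", b"FAIL")
print(fingerprint(tampered) == baseline)  # → False
```

The hash proves only that the bytes are unchanged since collection; whether the record was trustworthy when captured remains exactly the kind of question attorneys must still ask.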
Technology companies, IT divisions, and CIOs that include these variables and considerations in their designs will surely gain traction and market advantage. Those that do not will, ultimately, fall to the side of the road, incapable of delivering what society and the rule of law require—efficient, trusted production of the evidence of the truth.
As graphically illustrated by the cover imagery of the Washington Lawyer, we now know that our brains are, themselves, computers, wired and processing billions of bytes a day as we evaluate the world around us, including the information we rely upon to make decisions, take actions, and apply the rule of law. As our computers accelerate in their capacity to replicate human decision processing, including the decisions to trust specific information assets, the world is going to become very interesting.
In my new book, I lay out the tools for improving the velocity of that acceleration. I was able to teach these tools for the first time at the University of Oxford last month—the world has already become interesting! If you would like to know more, be sure to reach out to me at firstname.lastname@example.org.