In professional cycling, the ultimate cheat is to be pulled along by a team car through a tough moment in the race in order to catch your competition or recover from a calamity—a crash, a flat tire, or even an unexpectedly long response to a call from nature. Each rider is expected to compete with his own power and to use that power to overcome such calamities.
Yesterday, in Milan-San Remo, one of the first major one-day races of the cycling season, other cyclists complained following a somewhat unexpected win by a somewhat unexpected winner, Arnaud Démare. Démare had been involved in a crash before a major climb and was, with others, far behind the front of the race.
The complaints allege that, on the last major climb, Démare was pulled up by holding onto his team car. The complainants could not say whether Démare was directly holding the car frame or a “sticky” bottle (where a bottle is offered to a rider and he holds onto it for a long time, thereby being pulled along by the car’s occupant). But the race officials dismissed the complaints in the absence of any video proof and confirmed Démare as the winner.
You are right to ask, “Where is the connection with digital trust?” Immediately after the official confirmation of the winner, the cycling world erupted, asking for the cycling equivalent of a digital replay review. If Démare cheated, there are digital records that would confirm or invalidate the complaints.

Professional cyclists have two digital devices on their bikes. First, there is a transponder issued by the race officials, which transmits a radio signal that confirms the location and speed of each rider. Second, each rider has a cycling computer that reads various sensors installed on the bike and rider to record heart rate, speed, cadence, location, and power (the watts of energy generated by the cyclist’s effort).
The real-time data these devices generate are then captured and analyzed. For the race officials, the transponder is there to confirm that no one deviates from the course and to provide general data for reporting the overall status of a race to the media and public. For the cyclists and their managers, the data supports ongoing training and conditioning analysis to bring riders to their maximum fitness and keep them there.
No one had thought to use the data to investigate a claim of cheating until yesterday. The intrigue runs even higher: many riders immediately post their race data files to public sites (one of which is Strava.com), and Démare’s Strava file was suddenly deleted after the race. As of this morning, there are no reports on whether the race officials’ files and the data within the cyclist’s own computer still exist.
What could any of the digital data reveal? By evaluating Démare’s cadence, speed, and power, particularly in comparison to the cyclists he was reportedly passing, one could conclude whether or not he was climbing the final ascent under his own power. If he was cheating, his cadence would be different, and he would be using less power, rather than more, to pass other cyclists and regain the front of the race, from which he sprinted for the win. If he was entirely on his own, the data would prove his innocence, recording the greater effort required.
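The comparison described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not an analysis of any real race file: the sample values, the 75% power threshold, and the function names are all assumptions chosen for the example, and real adjudication would require calibrated sensors and far more careful statistics.

```python
from statistics import mean

def climb_summary(samples):
    """Summarize per-second samples of (watts, cadence_rpm, speed_kmh)
    over a climb segment. All values here are illustrative, not real data."""
    watts, cadence, speed = zip(*samples)
    return {"avg_watts": mean(watts),
            "avg_cadence": mean(cadence),
            "avg_speed": mean(speed)}

def flag_anomaly(rider, peers, power_margin=0.75):
    """Flag a rider whose speed matches or exceeds the peer group while his
    power output falls well below it -- the signature of outside assistance.
    The 0.75 margin is an arbitrary placeholder threshold."""
    peer_watts = mean(p["avg_watts"] for p in peers)
    peer_speed = mean(p["avg_speed"] for p in peers)
    return (rider["avg_speed"] >= peer_speed
            and rider["avg_watts"] < power_margin * peer_watts)

# Hypothetical one-minute climb segments (60 one-second samples each):
suspect = climb_summary([(250, 70, 22)] * 60)   # high speed, low power
peers = [climb_summary([(400, 85, 21)] * 60),
         climb_summary([(390, 83, 20)] * 60)]
print(flag_anomaly(suspect, peers))  # True: matching the peers' speed on far less power
```

The point of the sketch is the shape of the inference, not the numbers: a rider genuinely regaining the front on a climb should show more power than the group he passes, not less.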
This incident brings into focus two important and controversial issues of digital transparency.
First, when digital data is collected for defined purposes, should it be available to be used for other purposes? The race officials had no rules in place for managing this investigation, and the insistence that only video records could be relied upon is almost quaint and simplistic in these times of instant replays and computer simulations of the flight of tennis balls at 125 mph. Does the existence of data mandate its consideration as evidence to be evaluated or, since it was never intended to be used for that purpose, would the race officials be justified in refusing to do so?
Second, what standards will be used to trust the data as superior evidence (against which the oral testimony amounts to offsetting he-said-he-saids) with which to reach a conclusion? How well were all of the mechanical units calibrated? How is the deletion of a public, duplicate file to be weighed as evidence against the accused? If the data does indeed confirm that cheating occurred, what happens next? What time frames exist in which a complaint is to be filed and adjudicated? As with the error in announcing the Miss Universe winner, do officials take back the bouquet of flowers and the winner’s check?
The social drumbeat is that technologies are creating greater and greater transparency. But it is not the degree of transparency that is changing; it is the ubiquitous capture of more and more data recording our events, our actions, our decisions, and our competitions. When the rules are in place for how to use that data, transparency becomes a possible, but not an inherent, outcome.
A cyclist could legitimately argue his ride data (from his personal onboard computer) is personal information not available to race officials for consideration. The officials could, themselves, argue the data they capture from the transponders has not been designed to be used as post-finish line adjudicative evidence and reject any claim they should consider it.
But the capture of the records will continue, and we will always pursue capturing data that measures our contexts, our actions, and their outcomes with greater and greater granularity and precision. The race data is there, but are the rules in place to allow that data to serve as evidence of a cyclist’s integrity?
Today, they are not. But as March Madness unfolds in the US, and college basketball fans watch more and more moments when the striped shirts of officials huddle over video replays from multiple angles, there is no question the rules will come into being. As a species, we do not welcome cheaters. But we have so often resisted technologies that record and document us acting as cheaters. The best example? Not a single toll road in the United States uses the sensor data of a car entering and leaving a toll road as evidence of unlawful speeding.
When we can create and record data that portrays performance, and express our rules for using the recorded data as evidence to evaluate whether performance fell within the boundaries of the rules, greater efficiency is always the outcome. So is a fair race. But will global companies welcome the transparency that results? Will they resist performance data internally generated being used to measure their compliance with the rules? And what will the rules be for doing so?
Among cyclists, and I am one, it is often said, “Everything I needed to learn about life I learned on my bike.” The same seems to be true about what we are about to learn about transparency, ubiquitous surveillance, authorized usage of digital records, and competition in commerce.
What will happen when all of our systems’ data is available to measure whether our companies play within the rules? What will be our appetite for evaluating the data of our competitors to know if they are competing fairly? Will we start deleting our data to preclude its availability, or will we adapt and compete fairly, knowing our winnings could be taken from us if the data reveals our cheating? Will we encrypt our data to preclude its utility as evidence? Will we have rules for requiring decryption when the alleged crime justifies doing so? Will we honor the intended uses for data (such as privacy laws) or disregard those intentions when the digital record can serve as definitive evidence of the truth?
As I was finishing this post, a friend texted me a link to the Edelman trust barometer survey results for 2016. You can find them here. Since the beginning of the 21st Century, Edelman has been measuring varying levels of trust within different communities and economic sectors. Fascinating stuff, to be sure. But their surveys measure the emotions of trust—how do people feel about certain institutions, products, companies, governments, or defined categories of actors?
In Achieving Digital Trust, I make the case that trust is not an emotion, but a calculated decision fueled by objective information. We are at an evolutionary pivot point where we are about to shift away from the measurement of emotions, and instead turn to measuring our performances, as recorded by digital devices, as the foundation from which we calculate compliance and trust.