On Saturday, my son-in-law described his passion and interest in the neuroscience of fear. As a black belt martial artist, he is skilled in knowing how to defend and, when necessary, attack. But he explained that there is a real difference between the emotions of worry and fear. The skilled martial artist knows how to avoid fear, in part because they control the way they react to risk—what might be truly fearsome to most of us is, to the trained fighter, merely a worry. Action is required to avoid injury, but the action is not fear-based. There are certainly times when we need to react because we truly fear some outcome—but often the reaction can be ill-considered and, sadly, ineffective at deterring what has made us afraid. How does one evaluate the situation at a particular instant in time and control the fear? This question is his focus—absolutely fascinating.
Then, today I spoke with Jeff Lowder, president of the Society for Information Risk Analysts (SIRA). SIRA exists to improve how we analyze risks to information. In our discussion, we began to explore what it means to manage risk—what is one managing? Where did the idea originate that business management embraces managing risk? Is one managing the objectives of the business, or trying to manage the likelihood of events interfering with those objectives? If we accept that one cannot manage what cannot be measured (the essence of Six Sigma and other enduring management models), what must be measured to gain control of risks, and the likelihood of bad things happening?
These are surprisingly difficult questions to answer. But I wonder if those managing information risk can learn something by working out in the gym with a martial artist. When done well, both benefit from slowing down the passage of time, learning how to assess all of the surrounding circumstances, process and evaluate all of the relevant indicators and evidence, and then make rational, informed decisions. Yet, so often those addressing risk management in business act more on fear than on the actual evidence around them. The professionals develop extensive controls—both offensive and defensive—for responding to risks, but do not really try to calculate the real probabilities and make the controls proportionate to the probabilities.
One dominant method of organizing risk analysis is to grade risks by color—red for extreme risk, yellow for moderate risk, green for low risk. But, in a world in which we can measure and automate the assessment of so many variables, why are we still relying on a methodology that is not much better than fighting blindfolded, unable to sense and evaluate any of the variables around us?
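One quantitative alternative—offered here purely as an illustration, with entirely hypothetical numbers—is annualized loss expectancy (ALE): expected event frequency multiplied by expected loss per event. A minimal sketch shows how two risks a color matrix would both paint "red" can demand very different responses:

```python
# A minimal sketch of quantitative risk scoring, contrasted with
# red/yellow/green grading. All event names and dollar figures below
# are hypothetical, chosen only to illustrate the arithmetic.

def annualized_loss(frequency_per_year, loss_per_event):
    """Annualized Loss Expectancy: expected events per year times loss per event."""
    return frequency_per_year * loss_per_event

# Two risks a color matrix might both grade "red":
phishing = annualized_loss(frequency_per_year=12, loss_per_event=5_000)
breach = annualized_loss(frequency_per_year=0.1, loss_per_event=2_000_000)

# phishing: frequent but cheap; breach: rare but catastrophic.
# The numbers, not the colors, tell you how to size each control.
print(f"phishing ALE: ${phishing:,.0f}")
print(f"breach ALE:   ${breach:,.0f}")
```

The point is not the precision of the inputs; it is that even rough numbers let controls be sized proportionately to the exposure, something a three-color grade cannot do.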
On Friday of last week, my wife was routinely reviewing our bank statement online and saw 12 transactions in four states within the preceding 24 hours. Of course, I had not left our home except to get groceries. Dang it—my debit card had been compromised, something we had suffered through last year when my wife’s card was compromised while we were travelling in France. Twice in one year! We were able to immediately call our bank, report the transactions, cancel the card, and already the credits are being restored. On the one hand, the risks to us were properly managed—and our bank provided terrific support. But it left me wondering—if we have been compromised twice in one year, are the risks being properly managed? Is the fact that banks need fraud reporting hotlines some indication that, in martial arts parlance, not enough training is occurring?
Over the next few months, I will be exploring the questions and the answers.
As anyone following the Olympics on even a casual basis knows, a few days ago four badminton teams were disqualified for trying to intentionally lose matches in order to improve their position in subsequent rounds. They were shown the dreaded “black card”.
At first, I was not going to write anything, then a British cyclist intentionally crashed half-way through a team race on the velodrome, which entitled the team to a restart. On the “do over” the team, indeed, had a superior time. There was no disqualification. Then today, a runner was disqualified for not running hard enough in a qualifying heat, it being reported that he was “saving himself” for another event in which he was entered. So, similar behavior in different sports, and with different outcomes. But there is unanimity in the ethical conclusion that the behavior is not acceptable.
Now, I am truly perplexed. Are there any circumstances in business where one might wish to intentionally “lose” in competition? Once you start thinking about it, you realize there are tons of situations where losing on purpose makes sense. Securities markets trade “short” and “long”; employees “throw” transactions in exchange for other consideration, etc. So, in designing your business systems, what controls are needed to protect your company against the possibility that an employee may wish to pretend to play the game?
Using my Trust Prism is somewhat like a truth serum. Just like a prism disperses light, my analytical process demands that all of the operating data from a system or process be capable of being examined. Only by doing so can we expose the type of data that may be indicative of when someone on our team is actually playing to lose.
We were watching a game of women’s handball between Angola and Russia. For those who like their sport fast-paced, physical, and tough, this is a great event! It is somewhat surprising the sport is not more popular in the US.
But here we are, sitting in Virginia, watching live video from London on a laptop, two national teams from different continents, speaking entirely different languages, playing the same game.
Even without the commentator’s audio, as spectators we were able to understand the game. Though separated by distance, language, and radically different cultures, both teams were able to play to win. When the game was over, both teams knew who had won, and graciously accepted the outcome.
So, what makes this possible? The rules of the game create the shared language for playing. The rules are the building blocks of understanding, communication, scoring, penalties, winning and losing. The rules prescribe the court, the size of the net, the shooting lines, the size of the ball, and the materials with which the ball is constructed.
Even when translated into different human languages, the rules deliver the consistency, the structure, the methodology, and the manner of scoring and winning. They enable everyone to trust the game, and play with vigor and resolve.
In the global inventory of complex information systems, each system begins as its own nation. But to engage in commerce, to exchange information, to execute transactions, and to create wealth for the operators, the systems must move onto a new, shared playing field.
It is not enough to look at the Internet as the only field of play. Health care, consumer games, financial services, supply chain management—each adds its own rules to create the game to be played. But playing the game requires that a system (and a system’s owner) know, and execute, all of the rules—the business rules, the technology rules, and the legal rules. The rules must be authored so that everyone who wants to play can access, and understand, the rules.
Using my Trust Prism, I help companies see and better analyze what all the rules are, build an inventory of those rules, and develop the systems and processes for assuring those rules are followed. In doing so, a company is better prepared to play to win.
During both of the Olympic bicycle road races, the rules of the race prohibited the cyclists from using radios to stay in contact with their team directors. In both races, the winning move was made by cyclists at the front of the race who were quickly out of sight due to the bending, winding roads.
Several of the losing racers—all professionals who are allowed to use radios in major races like the Tour de France—attributed their loss to the lack of audio input; they were, in effect, racing deaf. They had lost access to valuable information about what their competition was doing. Losing that knowledge, in the minds of the competitors, made all the difference.
In any level of commerce, we value information about how our competitors are performing. There is even a Society of Competitive Intelligence Professionals that teaches the ethical collection of competitive intelligence.
For many networks and systems, collaboration among competitors is an indispensable element of how things work. Stock exchanges are a great example. That collaboration, like racing, demands a certain level of transparency among competitors. Take that transparency away, and one of the key benefits of collaboration within the system may be lost. Just like the cyclists this past weekend, no one likes to lose access to competitive intelligence.
Using my Trust Prism, I help companies understand the competitive value of the information they produce, and what information they can lawfully gather when participating in systems involving their competitors. The results are the same as giving you a race radio for the first time. You can build better controls to govern your digital information—protecting what is valued, and sharing what can be shared to improve competitive outcomes.
But, more and more, in our interdependent, wired world, we also resist investing in systems that do not enable us to know what our competitors and suppliers are doing. No one who is playing to win can afford to play the game without access to what their competitors are doing. How about you—will you make your own performance visible to your competitors in order to know what they are doing to try to beat you?
In surfing across the Olympics coverage, I happened upon the final rounds of the gold medal team archery competition between Italy and USA. It was the first time I had ever watched and it was breathtaking.
Each team shoots 30 arrows at a target, with 10 points for the bullseye, and 9, 8, 7, etc. scored for the farther rings (each just 2.4” in width). The archers stand 70 meters away—that is about 230 feet. The arrow’s shaft is perhaps ¼” in width. An arrow touching a line between the rings scores the inner ring’s value.
The gold medal came down to the final arrow from Italy—a 10 would win, 9 would tie, and 8 would lose. The arrow’s shaft was in the 9 ring, but it did touch the outer edge of the bullseye—and was scored a 10.
We often refer to football as a game of inches; same as soccer. Gotta tell you—archery wins—victory by 1/16th of an inch over a distance of 230 feet! But what made the competition so incredibly exciting is that you immediately knew how the rules were applied, and how victory, or defeat, would be measured.
In business online, companies continue to invest millions in fancier websites and interactive pages and services. But they fail to make the additional investment in the controls for measuring how well those features produce results that convert into success. More and more, the difference between success and failure is measured in smaller and smaller increments.
What are the measures of your success? Can you measure them with enough precision to know you are winning? Or, as is often the case, are you shooting for the target without any sense of knowing whether you hit the bullseye?
My Trust Prism is really cool to use; when you start asking questions like the ones above, you discover how important it is to build in the ability to measure success.
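Those questions can be made concrete with a little statistics. A minimal sketch, assuming (hypothetically) that "success" is a website conversion rate: the Wilson score interval makes explicit how much a measured rate can actually be trusted at a given sample size.

```python
import math

def wilson_interval(conversions, visitors, z=1.96):
    """95% Wilson score confidence interval for a conversion rate."""
    if visitors == 0:
        return (0.0, 0.0)
    p = conversions / visitors
    denom = 1 + z**2 / visitors
    center = (p + z**2 / (2 * visitors)) / denom
    margin = (z * math.sqrt(p * (1 - p) / visitors
                            + z**2 / (4 * visitors**2))) / denom
    return (center - margin, center + margin)

# Hypothetical traffic: 30 conversions out of 1,000 visitors.
low, high = wilson_interval(conversions=30, visitors=1000)
print(f"measured 3.0%, true rate plausibly {low:.1%} to {high:.1%}")
```

With 1,000 visitors and 30 conversions, the 95% interval runs from roughly 2.1% to 4.3%—so a claimed half-point improvement is invisible at that sample size. Knowing whether you hit the bullseye requires knowing the width of your rings.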
In watching the Summer Olympic Games, I realized that there are only a few sports that have rules in place for temporarily removing a player after committing a violation. Water polo allows the official to remove a player, creating a “man down” advantage for the opposing team. Lacrosse is similar. During winter, in ice hockey, if a skater commits certain types of fouls, the official sends them to the penalty box for a measured time period. The opposing team has a “power play” opportunity to score.
Of course, in nearly every sport, there are rules that impose sanctions on a player who violates the rules. In soccer, a single “red card” penalty can result in permanent removal (without substitution); in basketball, five or six fouls (depending on the level of play) can result in being expelled from a game. But why do only a few games have the “penalty box” concept?
Clearly, a “man down” creates advantages for the opposing team; in theory, the risk of these advantages being realized is intended to discourage the conduct.
Sometimes the threat of sanction is not enough; the player must be sat down for the officials to make their point that the rules will be enforced, and sanctions will be imposed. Those sanctions can be outcome-determinative, such as when additional goals are scored that change the final result.
But these “man down” and “penalty box” rules are fascinating. The rules allow the game to continue, with both teams still going for the win. Though sanctioned, the offending player has a chance to return to the game and re-engage in the competition. The synergy and interdependence of the game itself, and of all the participants, is allowed to be sustained.
When drafting agreements, companies rarely consider introducing “penalty box” provisions for how to deal with violations of the rules. The contracts become playbooks for enabling the lawyers to have really big disputes after the relationship has been terminated. Rarely are the contracts authored to enable the game to go on, empowering everyone to continue to work toward the reason they are playing together in the first place.
But, in cloud services today, the game has changed. There are qualities of synergy, interdependence, and mutually shared objectives that transform how companies work with one another. Often, the dependence is so strong that it is difficult for the parties to just walk away, and leave the mess to the lawyers.
The next time you draft a contract for cloud services, consider using a “penalty box” concept. When someone does not follow the rules, sanctions may be appropriate, but they should be explicit, swift, certain, and allow everyone to get back to the game—working together to achieve new profits. Just like in water polo, lacrosse, or ice hockey, you need to trust that the continuation of the game is why everyone began doing business in the first place.
As a serious amateur cyclist, I follow closely all of the news about blood doping in sport. Blood doping, or any other drug-based technique to enhance performance, violates the rules.
Athletes who dope shatter a cardinal principle in sport—the expectation that all athletes come to the playing field as equals, prepared to compete based on their strength, their speed, and their agility. The athlete is trusted to participate fairly, and those who cheer the sport trust the athletes to compete fairly.
Of course, athletes try to gain every possible advantage before taking the field. But some innovations or strategies violate our sense of fair play and we make rules that prohibit the behavior. Speedsuits in the swimming pool, pine tar on a baseball bat, amphetamines in the blood stream—you get the point. We simply don’t trust that the competition remains fair.
The doping stories from London are not surprising: new dopers are getting caught and thrown out. Do they really believe it is worth the risk? What makes doping in amateur sport so confounding is that the gains are so marginal and short-term: glory, perhaps some financial payment from sponsors, but the fame is truly fleeting. But, if doping athletes are not caught, if they are able to win and be glorious, that fame means everything.
In today’s world, digital information is the fuel of business. We cannot afford to make business decisions without trusting our information. Companies, systems, and individuals that try to play by cheating, and by offering up information that is not real, are getting caught. As in athletics, we are seeing a transformation—“data doping” is no longer sustainable as a business model for competing.
Much of the mortgage loan crisis can be traced to “data doping”. Driven by the desire for amazing economic returns (the gold medal in business), companies stopped testing and auditing the veracity of information. Employment histories were not checked; financial data was not verified; home values were not confirmed.
My Trust Prism is a powerful tool for companies to use to evaluate whether their own data is being “doped” by inadequate governance. The Trust Prism is also invaluable for testing the integrity of inbound data from customers and suppliers. Don’t get caught data doping!
So, the games have begun. And I am not talking about the sporting events in London. No, I am referencing the political games that are played around, about, under, and inside the venues that exploit the athletes to make political statements.
As reported nearly everywhere, the women soccer team from North Korea takes the field to play their first game in the Olympics. They look up and see the flag of South Korea displayed on the large video screen. The team walks off the field, protests, and squats in their locker room for an hour before finally taking the field.
Immense waves of apologies follow—“simple human error” is the explanation. There was, of course, no confusion as to the identity of the team on the field, their origin, or their competitive ambitions. We knew, they knew, everyone knew who they were. Their identity was certain!
Yet, the confusion in displaying a flag froze the proceedings. The event generated several dozen linear feet of news stories across the global reach of cyberspace, and diverted the attention of millions from focusing on the game itself. But it also diverted people from focusing on their work or whatever other activities they were sitting at the computer to perform. We can only imagine the adverse economic drain that such diversions caused across the global workplace.
So, a simple human error in displaying an identity badge (i.e., the flag), that did nothing to actually confirm the identity of the team on the field, had a direct, negative economic effect on global productivity.
Across our information systems, we invest billions in building and operating controls and processes for confirming identity as a condition to accessing and using those systems. We do so because one of the critical building blocks of trust is identity. . . and our ability to confirm the validity of that identity.
When those systems fail, we can spend a lot tracking down why they failed, and strengthening the controls and processes. But, what happened in Glasgow at a soccer game had nothing to do with confirming the validity of identity. It had everything to do with an actor (in this case, the team) being properly identified to the audience (i.e., the users of the displays in the stadium).
In business, the first challenge is to strengthen and enforce true tests of identity. But it is equally important that, once identity is known, the identity is consistently represented across the systems and the lifecycle of the processes. In other words, in order to maintain trust in its operations, the system has to fly the right flag for every user.
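A minimal sketch of that principle, with entirely hypothetical names and identifiers: once an identity has been verified, every downstream display should render from a single canonical record, so no part of the system can fly the wrong flag by re-deriving attributes on its own.

```python
# Illustrative only (not any real product's API): a single canonical
# identity registry that every display surface must render from.
# The ids, names, and file names below are hypothetical.

CANONICAL = {
    "team-prk": {"name": "DPR Korea", "flag": "flag_prk.svg"},
    "team-kor": {"name": "Republic of Korea", "flag": "flag_kor.svg"},
}

def render_badge(identity_id):
    """Render a display badge from the one canonical record.

    An unknown identity fails loudly rather than showing a plausible
    but wrong badge -- the digital equivalent of the wrong flag.
    """
    record = CANONICAL.get(identity_id)
    if record is None:
        raise KeyError(f"unverified identity: {identity_id}")
    return f"{record['name']} [{record['flag']}]"

print(render_badge("team-prk"))
```

The design choice is that display attributes are never assembled locally by each screen or report; consistency across the lifecycle comes from having exactly one source of truth for each verified identity.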
In my Trust Prism—a decision model for how companies build trust into their digital information—identity, and managing identity, is a critical building block. Truly understanding the costs of flying the wrong flag can make all the difference.
On Tuesday, January 31, 2012, I will be speaking at the ALM Law Firm CIO/CTO Forum, held in conjunction with LegalTech in New York City. It is a pleasure to be the featured speaker following the legendary Don Tapscott. My topic—“Building and Securing Digital Trust”—represents the first time I have presented my argument for trust as an economic asset to law firm executives. I am looking forward to it. If you will be at LegalTech in New York, and are a CIO/CTO type, please be sure to join us. If you cannot join us, drop me a note at email@example.com and I will be sure to send along an executive summary of my remarks.
Following a break, I will also be the facilitator for a panel on “Navigating the Rapids—Delivering on Mobility, Security, and Privacy”. The Ritter Academy has been invited to set up a table—our first appearance at LegalTech. All in all, it should be a great experience.
There is a quiet revolution occurring in how we evaluate whether companies can be trusted. After all, whether buying widgets or the securities of a company, the purchase decision involves a calculation of confidence that we will get what we believe we are purchasing. Indeed, much of the modern legal framework regulates what information is required to be disclosed to make the commercial transaction acceptable: financial statements, food product content, drug warning labels, new car purchase stickers, contractual warranties, etc. All of these exist to enable informed decisions, but somehow we have made little progress in our ability, as consumers, to obtain insights into the digital trust a seller or supplier can demonstrate.
Whether looking to make stock investments or obtain Internet-based services from “cloud providers”, a growing momentum exists to demand increased transparency regarding how well companies manage their digital infrastructure. On different fronts, companies are facing new formal legal requirements and the pressure of voluntary compliance with mechanisms to report and disclose information about how they create and manage their information systems.
Public Company Disclosures
In October 2011, the U.S. Securities and Exchange Commission published a Guidance on the obligations of registered companies to disclose cybersecurity risks and cyber incidents. The Guidance is available here. It examines the different types of disclosures to be made so that a reasonable investor making an investment decision does not rely on disclosures that would otherwise be misleading. Here are some examples:
· If an investment would be speculative or risky because of the risk of cyber incidents, those risks should be disclosed if they would be among the significant risk factors. Risk factors that are disclosed need to describe the nature of the risks and their impact on the company.
· Actual cyber incidents that have a material impact on the company (an example given by the SEC is a cyber attack embedding malware that compromises customer data) may also require disclosure.
· The financial impact of specific attacks may also require discussion in the “Management’s Discussion and Analysis”.
· Material pending legal proceedings involving a cyber incident may be appropriate for disclosure.
· Financial statements can take into account both investments in preventing cyber incidents and their impact on diminished cash flows, customer goodwill, and customer-related intangible assets.
There is no question that this guidance will provoke considerable discussions in corporate board rooms regarding whether any real cyber security risks exist. But the SEC has taken an important step forward—it has empowered investors to have (a) a legitimate basis of inquiry regarding the information systems of a company, and (b) a basis to ask questions regarding the security and integrity with which those systems are maintained—questions no different in their value to the investment decision than questions regarding the physical facilities, human resources, intellectual property, and other assets of a company.
In doing so, the SEC has also put into motion a further dimension of the dialogue that occurs between public companies and their service providers regarding cyber security and cyber incidents. Now, addressing information security, system security risks, and the security controls that are required by commercial contracts is no longer a discretionary item. Far too often, these topics are minimized in the contracting process, addressed with general, non-binding language, or avoided completely.
Now, the public companies have the incentive to demand a quality of security across their entire operations (including those that engage cloud-based service providers) that enables (a) material cyber risks and the potential for adverse incidents to be controlled (thereby avoiding the public disclosure of those risks), and (b) remedial and corrective action plans to be in place to assure that any incident, if it does occur, is less likely to create a reportable event.
Disclosing Cybersecurity Controls
The Cloud Security Alliance (CSA) has taken a different, and perhaps more influential, step. Many lawyers and information security managers have been frustrated by the inflexibility of cloud service providers in addressing security concerns substantively in the related commercial contracts. Often, the service providers, whether providing software, platform, or infrastructure as a service, oppose making contractual undertakings that would give their customers better confidence that they can meet increasingly complex legal rules for maintaining information security controls.
CSA has announced a free and publicly accessible registry that allows service providers to file and document the security controls they offer. The service is called the Security, Trust and Assurance Registry (STAR). Detailed information on STAR is available here. The service was launched with the initial filings of Google, Microsoft, Verizon, Intel and McAfee. Vendors may submit one of two types of reports, each of which requires detailed disclosures regarding their security practices, and the alignment of those practices with published CSA best practices.
STAR is an important innovation because it sets a standard of care in place for how vendors earn the confidence and trust of their customers, as well as the larger ecosystem a specific customer may support. For example, ABC Manufacturing selects Vendor X to host in the cloud various services that provide data and reporting on 6,500 distributors of ABC products. Those distributors now have greater visibility into the security controls that enable the data services, and in turn, should have better ability to address their own compliance mandates.
Clearly, competitive advantage is going to be realized by those who understand, and do not try to deny, the importance of transparency. Those vendors who decline to participate face two new hurdles. First, customers will make competitive peer-to-peer comparisons between vendors who participate in STAR and those that do not. Second, the SEC guidance, which will surely be copied over time by other national agencies, creates regulatory demands that make it difficult for any public company (or their providers) to do business with a cloud vendor that cannot offer the transparency required to document that cyber incident risks are controlled and not reportable.
Where does this lead?
As consumers and customers, and as vendors and sellers, all of us see both sides of the process through which trust is secured that enables a transaction to be executed. Working for our company, we often try to secure the sale with our smiles, great pricing, and advertising. But, in the evening, as consumers and investors, we aggressively investigate any seller, blowing past smiles, pricing and advertising to seek knowledge that informs our decision: crowd-sourcing (such as eBay merchant evaluations), investment analyses, product reviews, etc. It is an essential truth in an open, competitive digital market that the vendor that does not provide comparable information, in both type and quality, will be disfavored by the consumer.
Technology enables transparency, but it also enables us to express and incorporate into our purchasing our own criteria and preferences. There is no barrier today that precludes customers from demanding transparency on security controls, as well as the effectiveness of those controls. Nor is there any barrier to expanding the criteria on which we seek information until, for any single transaction, we reach a point of indifference. I believe this drive toward transparency will continue and gain momentum across a much larger catalog of criteria than security controls. Each additional object of information allows us to lower our risk that a decision to trust a vendor is subsequently voided by performance failures on which we could have asked better questions.
Companies must anticipate this level of disclosure and build into their system designs the expectation that their controls, their performance, and their failings, will be reportable events, not just internally but to external audiences (such as regulators or customers). It will no longer be sufficient to offer that “we employ commercially reasonable information security procedures”; instead, transparency will be competitively required to enable the trust decisions a customer must make.