Did you know that the US has a “national intelligence official for warning”? Me neither.
His name is Kenneth Knight, and he oversees a staff of a half-dozen analysts whose job is essentially to monitor the rest of the intelligence community, challenging their analyses and assumptions. Because this seemed like a timely sort of job given today’s perilous and uncertain economic environment, I thought it worth a moment to see how the perspectives he expressed in his interview with McKinsey might adapt themselves to law-land.
What is his job? As he puts it, “to avoid surprise.”
This is of course not a new concept; its original incarnation dates to the Cold War. Back then, as Knight puts it,
“it was focused on what I would call static, more military-oriented problems, where you could define the bad outcome ahead of time and kind of monitor against that–country A invading country B.
“I think in today’s world, that’s a necessary but not sufficient part of the overall warning mission.”
Is this starting to sound a bit like the strategic challenges you were fixated on from, say, 2001 until September 2008? Firm A invading your cherished practice area space in a key city? Firm B poaching some partners? Firm C acquiring a sweetly positioned boutique you had your eye on?
We all know the world is not so simple any more. It’s no longer a bipolar or even a multipolar competitive landscape you’re facing–configurations where at least everything you had to worry about lay on a single plane, in the geometric sense. Now it’s gone 3-D.
To be sure, you still have your own moves, and competitors’ potential counter-moves, to worry about, as well as clients’ heightened demands and expectations. And don’t fail to plan for those counter-moves: as they memorably put it in the military, “the enemy gets a vote.” I’m repeatedly amazed by the failure to anticipate the most elementary reactions of other firms to a planned initiative.
But as I said, that’s old news. The new news is that I believe there are also changes rumbling under the surface in the basic business model. Just to suggest a few:
- Associate career paths: Is the “Cravath system” under pressure?
- Leverage: Are the days of 4:1 or 3:1 or even 2:1 associate:partner ratios history?
- Alternative fees: Is the billable hour under serious challenge for the first time, despite 30+ years of ineffectual spilled ink to that effect?
I’ll write more about these in the future. But back to Knight.
One of his core insights is the difference between what he calls the “simple likelihood-of-the-event versus impact-of-the-event calculation.” In other words, you can be pretty certain a low-impact event will happen if you wait around long enough to read all the tea leaves. But so what? On the other hand, you would be well advised to send up distant early warning flags about high-impact events even if they remain deeply uncertain or even positively improbable. The challenge is to tread the fine line between missing major eventualities and overwarning (crying wolf).
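To put rough numbers on that calculation (they’re mine, purely for illustration, not Knight’s): an event with a 90% likelihood of shaving 1% off your firm’s revenue carries an expected impact of 0.9%; an event with a 5% likelihood of erasing 25% of revenue carries an expected impact of 1.25%. Even the bare arithmetic favors watching the second, and expected value actually understates the case: the first is a routine cost of doing business, while the second could threaten the firm’s survival. That asymmetry is exactly why improbable-but-profound events deserve the early warning flags.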
As for big-deal but improbable events: understand that I am not suggesting you indulge in the news-entertainment-media complex’s obsession with profoundly improbable events–being attacked by a shark or having octuplets–but, ludicrous examples aside, it is still worth devoting more-than-average attention to developments that could have profound consequences even if they’re not evidently likely.
The question remains whether you can systematize this type of analysis.
Knight thinks you can, or at least that there are ways to improve your odds.
First of all, understand and beware the cognitive biases of experts. Experts develop analytic frameworks over time, and these can be wonderfully economical of thought, ceteris paribus. But, as Knight explains, therein lies the danger:
A lot of times surprises occur when those analytic frameworks that the experts have–and have built their career on, and have built their experience on–no longer apply. And so we are constantly kind of pushing that and causing tension, where a nonexpert from my office is engaging with an expert and challenging their expert bias. I think the ways to get around that: there’s really two. And we’ve tried in both areas, and I think have made progress in both areas.
The first is training. You can train individual analysts and expose them to the research that you mentioned in your question. Try to make them cognizant of their own potential biases. Give them tools for challenging their own assumptions, for doing kind of competing alternative analyses. Do things internally to their organizations, such as conduct peer reviews. Try to get their managers to embrace some of these ideas also. And we’ve done a lot over the last five or six years to try to train the workforce–all analysts–in those kinds of techniques, analytic procedures, and methodologies.
I think the second thing is really what my organization represents. You can assume the analyst will always have some kind of pathologies and create then an institutional check–so a warning staff that has the mission of challenging those. We have a number of organizations that have some kind of function along those lines, whether it’s red teams, or red cells, or alternative analysis groups. And I think that’s fairly successful.
If there’s a third way, I’d be happy to hear about that because this expert-bias notion and the fact that you’re asking an expert in a crisis to maybe challenge or abandon an analytic framework that has served them well, I mean it just goes against human nature, especially with the time and pressure that you have in a crisis situation.
Knight adds, not entirely facetiously, that he tries to make his analysts “the dumbest guys in the room.” In other words, they are the ones with permission to ask bedrock questions and take essentially nothing for granted. Usefully, he embraces the analogy of his father’s job as a Washington, DC, policeman, distinguishing between the “stakeout,” where you’re surveilling some known activity of interest, and “walking the beat,” where you’re just going through the neighborhood trying to be attuned to things that “don’t look right.” Those things may or may not be innocent–but you need to understand them first to make that evaluation.
Another aspect of the “walking the beat” mindset is being tolerant of ambiguity. Maybe heightened activity on the corner by the bodega is pure happenstance; maybe it’s going to turn into something pathological; maybe it’s going to germinate instead into a youth soccer league. In other words, don’t assume you have to jump to making a dichotomous, either-or call. Think more in terms of process and evolution than in terms of end result: “Leader A is going to be removed from power or he’s not.” If you have to make that call, you’re going to wait (if you care about your career) until the evidence is overwhelming one way or the other. At which point your “intelligence” is common knowledge, and not worth much more than yesterday’s front page.
And get this: Knight thinks relying too much on experience is an obstacle to clarity of thought, not a vehicle for it:
I’ve been in this business 30 years. I would say almost everything I learned in the first 25 years is challengeable. And so to believe that expertise comes with longevity, to believe that my successful past model has always got me through and therefore will continue to get me through, … to assume that the way that was 5 years ago is the way it is today or will be that way 5 years in the future, I just think gets us into trouble.
I think if you get far enough in the future, [the system will be] unrecognizable compared to where we are today or where we were five years ago. But I [insist upon] this constant need to look at your own business model or your own, in our case, intelligence model, and to look at what are the baseline assumptions that you have that that model rests on and to challenge those–and not just in a check-the-box way, not just do an alternative assessment at the end of your baseline analysis and reconfirm your analysis, but to truly challenge them–to me it’s the only way, I think, to stay current and perhaps stay ahead of the curve.
Knight alludes to the difference between “mysteries” and “secrets,” as first outlined in testimony that Joseph Nye, then Dean of the Kennedy School of Government at Harvard, gave in 1996 to the Commission on the Roles and Capabilities of the US Intelligence Community:
Yet a fourth reason why analysis is more difficult [in the post-Cold War era] is what I’ve sometimes called before and others have called the difference between mysteries and secrets. Secrets are things which you can steal, like the size of a warhead on the SS-20. Mysteries are things which it doesn’t do you any good to steal, because the people you’re stealing from don’t know the answer, either, such as will Yeltsin be in power a year from now? Yeltsin doesn’t know the answer to that.
The Cold War, I think, was a setting in which many more secrets were important. And in the period after the Cold War I think a higher proportion of the questions that decisionmakers and policymakers want answered are mysteries. And that leads to a problem in terms of how you do good analysis. If you use your clandestine means and focus too heavily on just stealing secrets, and you think that that’s where the value-added comes from, you may miss the fact that a great deal of what’s good analysis of mysteries comes from diverse, open sources.
In that sense, the problem that we have, to use a metaphor, is that sometimes the intelligence community has felt that [it] assembled a jigsaw puzzle [from] several very nifty pieces which had been obtained by clandestine means, but it often had difficulty seeing the picture of the puzzle as a whole that was on the cover of the box. And it was those open sources, those outside sources, which often provided that information or that perspective that allowed you to determine where the clandestinely obtained piece of the jigsaw puzzle really fit.
Where does this all lead us?
- Do not let perfect intelligence be the enemy of the good.
- Alternatively phrased, remember that a good plan executed today beats a perfect plan executed tomorrow.
- Permit yourself to be “the dumbest guy [gal] in the room.”
- Look outside your own organization–to “open source” intelligence.
- Recognize that what you’re looking for may not be secret; it may be a mystery.
- Tolerate ambiguity.
- Beware expertise; it can equate to calcification of thought.
And keep scanning the neighborhood for things that just don’t look right.