Fourth public hearing of the National Commission on Terrorist Attacks Upon the United States

Statement of Mary O. McCarthy to the National Commission on Terrorist Attacks Upon The United States
October 14, 2003

Mr. Chairman, Governor, Commissioners:

It is my privilege to be here today to contribute in any way that I can to your important work. In your quest to determine the points of failure in our Government that permitted our national tragedy of September 11, 2001, and thereby to shape remedies that will restore our citizens' confidence in their Government's ability to protect them in the future, you have rightly identified as an area of inquiry the question of the Intelligence Community's capabilities to provide persuasive and timely intelligence warning. It is on this question of warning capabilities that I wish to focus my remarks.

The quick answer is that the Intelligence Community has an extraordinary workforce of talented and motivated public servants who are entirely capable of producing the best possible intelligence and analysis. The system that should exist to produce timely and credible warning information, however, repeatedly has broken down, as a brief review of history will show. Successive recommendations over the decades to construct and maintain a system by which the Intelligence Community can reliably warn of potential outcomes have faltered in the implementation.

In my remarks today, I would like to provide a brief history to illustrate the cyclical nature of the warning reform issue, explain what is meant by "warning", describe a schematic for producing reliable warning intelligence that applies to the so-called conventional threats as well as to transnational threats, point out the dangers that arise if this kind of approach is not used, and offer some suggestions for breaking out of the cycle of review of failure followed by imperfectly implemented reform.

The History

It has been widely noted since September 11th that the Intelligence Community, shaped and grown as it had been during the Cold War to address a single threat, was ill suited to deal with the changed world of the early Twenty-first Century. During the Cold War, these observers maintain, the Intelligence Community developed sound systems for providing warning concerning the Soviet Union and other situations involving the mobilization of large militaries. In particular, this view suggests that, until a dozen or so years ago, the Intelligence Community hardly thought about the so-called transnational threats, or about other parts of the world except as they related to the Soviet Union. Therefore, we should not expect the Intelligence Community to be able to warn of terrorist activity. This view is simply wrong on two counts.

First, even during the Cold War, geopolitical realities required that the Intelligence Community collect on and analyze, indeed truly understand, virtually all areas of the world. Long before the end of the Cold War, the Community began to focus on the so-called "new threats", including political instability, terrorism, and proliferation. Nonetheless, the magnitude and complexity of these threats have increased over time and clearly now require a greater level of effort, more comparable to the level, both in number of personnel and array of capabilities, that was devoted to the Soviet Union during the Cold War.

Second, the Intelligence Community's record of providing timely and persuasive warning during the Cold War was mixed. On the issue of Soviet military mobilization, perhaps we were never tested. On other Soviet matters, the August 1991 coup, for example, our huge intelligence apparatus did not produce a warning.

A staff specifically devoted to monitoring the world's largest militaries and providing warning of mobilizations was formed in 1953, only after spectacular failures to warn of the Berlin Blockade, the North Korean invasion of the South, and the Chinese entry into the Korean War. A string of intelligence failures with far-reaching strategic and policy implications followed, clustering in the 1970s, and prompting Congressional recommendations that led to the establishment of the National Intelligence Officer for Warning and the National Warning Staff. In particular, the Staff of the House Permanent Select Committee on Intelligence, which had amassed considerable expertise on intelligence matters through their work on the Pike Committee, issued a Staff Report in 1979, in the wake of the latest intelligence failures on Iran. The Report strongly recommended an empowered National Intelligence Officer for Warning and highlighted the kinds of threats the Intelligence Community would have to address in the future.

The failure of the Community to warn persuasively of the Iraqi invasion of Kuwait in 1990 prompted yet another study, and yet another affirmation of the role of the National Intelligence Officer for Warning. (The NIO/W had in fact warned clearly that Saddam Hussein intended to invade and occupy Kuwait, but with the rest of the Community demurring, he was unable on his own to persuade senior policymakers.) Then-DCI Gates issued a formal Directive to reorganize the warning effort: he established a single point of accountability in the NIO for Warning, but directed that the warning function be integrated into each and every analytic unit of each agency of the Intelligence Community. That system atrophied with Gates' departure.

Nonetheless, over the decades the idea of doing warning on generic threats such as political instability, terrorism, insurgency, and proliferation has taken hold, and analysts have adopted and tested innovative methodologies for making assessments and identifying threats. This kind of analysis, however, tends to be carried out in units that are separated from the principal regional and functional analysts, and thus their work tends to be marginalized. Over the decades, analysts who were methodically analyzing alternative hypotheses, especially the "high-impact, low-probability" outcomes, faced enormous bureaucratic battles as managers of the mainstream analysis often urged their bosses to "let the people who know about this problem take care of it."

What is Warning?

Warning is a process of communicating threat information to decisionmakers in time for them to take action to manage or deter the threat. Warning is not simply sounding an alarm. The warning process is iterative: a series of assessments of increasing precision regarding the likelihood and proximity of the unwelcome outcome.

The warner's job is not complete until the decisionmaker has taken action, even if that action is a decision to do nothing for now. Thus, if the threat is becoming more likely, and if the consequence of the event occurring would be high, then the warner must persuade. Sometimes persuasion can be accomplished because the messenger is credible (the DCI, for example), but even then the warner usually must convince by arguing from the facts and demonstrating that the threat is more likely than the alternatives. Or, if a less likely alternative would have severe consequences, then an early warning of that outcome is necessary. Analysts often hesitate even to try to persuade, knowing that the underlying rationale for the warning is highly ambiguous; but that is precisely the point at which the decisionmaker needs warning, for if there is no ambiguity, the situation will be evident. The warning mission is all about sorting out ambiguity to develop a clear statement of probabilities.

Warning Analysis

There is no question that analysis of and collection on transnational threats have become more complicated. The mobility and small footprint of non-state actors present new challenges. Yet the principles of sound analysis are a constant. The same approach works for warning of North Korean movement across the DMZ, of the overthrow of the Shah, or indeed, of terrorist operations. Here are seven steps:

  1. Envision the outcomes: craft a series of outcomes ranging from the devastating (or merely the undesirable) to the benign.
    • What can we infer or imagine the adversary may wish to do to achieve his goals? These we often call the "alternative hypotheses." The key is to posit a range of outcomes.

  2. Reach an understanding of, or carefully explore, what acts the adversary would have to take to achieve the particular outcome, or what circumstances would have to converge to produce that outcome. We often call this process "thinking backwards." For example,
    • If the North Koreans wanted to move across the DMZ, what units would they have to move? What kinds of logistic activities would they have to undertake? What political actions would we see?
    • If terrorists wanted to use airplanes as weapons, what actions would they take to prepare?

    Sometimes we need to employ special methodologies to think backwards, and in those cases, we need to consult theoreticians and tap academic research.

    • What makes a state fail, a government collapse? How does an insurgency win?
    • Then we make lists of those indications.

  3. Now we rank those "indicators," so that we can begin to establish some diagnostics.
    • We ask, which actions are absolutely essential to the outcome?

  4. Next we start to array the data under each of the hypotheses. Most important here is that we continue to let these hypotheses compete with each other. We don't eliminate one, unless we have unassailable evidence that absolutely negates it.

  5. At this point we begin to see glaring gaps in what we know. Although we probably have known they were there all along, it is only when we begin trying to prove a hypothesis that the gap becomes really relevant.
    • Here is a point about collection: only the analysts involved in this meticulous and rigorous process can really understand the salience of the information, and thus the analyst must be in continual dialogue with the collectors.
    • Go back to those "diagnostic" indicators that we identified earlier. That is the particular information which the analyst must have in order to understand the probability of that particular outcome. If we do not have that information, then we have no business adopting that hypothesis as the sole track of our analysis, or discarding others, unless they have been disproved.
    • Analysts, therefore, must work together, and with collectors, to devise creative ways of obtaining that information; and analysts should not accept the "too hard" message of collectors.

  6. As the data are arrayed, often new hypotheses will present themselves, and thus the process continues.

  7. And as the process continues, iterative warnings are given. Probabilities inevitably shift. But the process allows the Intelligence Community to convey to the decisionmaker the basis for the judgments and some sense of probability and proximity.
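The seven steps above parallel what analysts now call analysis of competing hypotheses. As a purely illustrative sketch, and with the caveat that every hypothesis, indicator, and observation named here is hypothetical and invented for the example, the bookkeeping the testimony describes (array the data, keep hypotheses competing, flag collection gaps) might look like this:

```python
# Illustrative sketch of the seven-step warning process (analysis of
# competing hypotheses). All hypotheses, indicators, and observations
# below are hypothetical examples, not real intelligence.

# Steps 1-2: envision outcomes and list the indicators each would require.
hypotheses = {
    "attack_abroad":  ["overseas_surveillance", "weapons_procurement"],
    "attack_at_home": ["flight_training", "domestic_surveillance",
                       "weapons_procurement"],
    "no_attack":      [],
}

# Step 3: rank indicators by diagnosticity (3 = essential to the outcome).
diagnosticity = {
    "overseas_surveillance": 2,
    "domestic_surveillance": 3,
    "flight_training": 3,
    "weapons_procurement": 1,
}

# Step 4: array the observed data. None marks an indicator with no
# collection at all, i.e. a gap (step 5).
observations = {
    "overseas_surveillance": False,
    "domestic_surveillance": True,
    "flight_training": True,
    "weapons_procurement": None,
}

def assess(hypotheses, diagnosticity, observations):
    """Score each hypothesis against observed indicators and flag gaps.

    A hypothesis is eliminated only if an essential indicator is
    affirmatively negated (step 4: keep the hypotheses competing)."""
    report = {}
    for name, indicators in hypotheses.items():
        score, gaps, eliminated = 0, [], False
        for ind in indicators:
            seen = observations.get(ind)
            if seen is None:
                gaps.append(ind)              # step 5: a glaring gap
            elif seen:
                score += diagnosticity[ind]   # supporting evidence
            elif diagnosticity[ind] >= 3:
                eliminated = True             # essential indicator disproved
        report[name] = {"score": score, "gaps": gaps, "eliminated": eliminated}
    return report

# Steps 6-7: rerun as new data arrive; the shifting scores and listed
# gaps are the basis for iterative warnings and collection requests.
report = assess(hypotheses, diagnosticity, observations)
for name, result in sorted(report.items(), key=lambda kv: -kv[1]["score"]):
    print(name, result)
```

The point of the sketch is the discipline, not the arithmetic: no hypothesis is dropped merely for scoring low, and every unobserved diagnostic indicator is surfaced explicitly as a collection requirement rather than quietly assumed away.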

Much of terrorism analysis ends up being "small picture", or tactical, rather than "big picture", or strategic, warning. Yet strategic warning analysis is fundamental. It will lead to a greater understanding of the overall issues, help identify gaps in intelligence so that the information is more likely to be available when needed, assist the decisionmaker in developing policy choices, and elucidate terrorist tactics and intentions, thus increasing the instances and chances of success for the operational aspects of counterterrorism (interdiction, disruption, apprehension).

It must be added that a vibrant warning system is dependent upon access to the full array of terrorism intelligence. Without systematic sharing of intelligence across agency boundaries in accessible, integrated databases, the warning function cannot be reliable.

Why A Separate Warning System is Necessary

Various studies of intelligence failures, both academic and governmental, have identified common causes that emanate from analytic tendencies. The process for analyzing threats described above offers a way to minimize these problems. For example, an enduring, endemic problem has been that of the existing expectations of analysts and the way those expectations distort the effect of new information. Discrepant information that challenges a belief (e.g., that terrorists are likely to attack abroad) is required to meet a higher standard of evidence, and to pass stricter tests to gain acceptance, than is evidence tending to support the favored hypothesis.

Institutional politics in intelligence agencies also plays a major role. Alex George, in his study almost 25 years ago of a series of intelligence failures and surprises, noted that the only common factor in all the cases was "the influence of organizational politics." The safe analytic line, i.e., the one that challenges least the conventional wisdom or causes the least stress on the organization, held sway.

But probably the greatest single impediment to the institutionalization of a Community-wide warning focus is the increasing emphasis over the last decade or more on the production of what is called "current intelligence." An institutional focus on daily publications for top policymakers can overpower an agency's top analysts, pushing them toward the daily "take" on the situation, and away from the heavy lifting of warning analysis. Often this emphasis has been in response to urgent needs: support to the warfighter, for example, or the round-the-clock effort to support the war on terrorism. Across the Community, however, much of the activity has been geared toward answering the policymaker's question of the day, responding to a specific query, or addressing an expression of interest.

Intelligence managers cannot expect analysts who are responsible for daily production to be able to discern early or subtle developments that would indicate an emerging threat. Talented as these individuals may be (and usually managers assign the best to current duties on important accounts, or the managers themselves are the best experts), the pace of work does not permit reflection, research, or the application of methodological techniques that might help them weigh alternative hypotheses when processing new data.

Breaking the Cycle

After every major intelligence failure there has been a commission of some kind that has recommended a strengthened effort on warning. The scheme Bob Gates devised as DCI seems to me to be the right one: make the NIO for Warning the substantive leader on warning analysis and an advisor to the DCI, but integrate the warning function into each and every analytic unit in the Intelligence Community. However, with the Gates plan, as with every other so far, the bureaucracy inevitably wins out. Warning analysis often upsets the applecart; good, honest, hardworking, and patriotic Americans though they are, intelligence officers are also human beings and do not want to be second-guessed, especially by those who may be considered less knowledgeable.

It is clear from recent history, as well as from the past, that simply having a National Intelligence Officer for Warning is not sufficient. The NIO can fulfill an important role: providing substantive leadership on the discipline of warning, acting as advisor to the DCI, and even serving as the single person accountable for warning failure. But the NIO as a practical matter cannot really be responsible for the day-to-day implementation of warning analysis throughout the Intelligence Community. It is simply a physical and intellectual impossibility for one person to accomplish, and it is questionable whether he or she would ever be perceived as having sufficient authority to do so.

It is, rather, a system of warning analysis that is needed; that is, individuals accountable for the conduct of warning analysis in each analytic unit, regional and functional, throughout the Intelligence Community, and senior officials each also accountable for warning analysis in their own areas of substantive responsibility. The most accomplished and respected analysts, along with those known for their creative abilities, should be assigned to these warning units.

The Congress, too, has a role in warning. Unlike other functions of democratic government, the conduct of intelligence is purposely, and with the consent of the public, carried out in secret, out of public view and without much public debate. Instead, Americans rely on their representatives in Congress to ensure that the intelligence function is performed, not only in a way that keeps us safe, but also in a way that is consistent with our democratic values.

Thus, the intelligence oversight committees have a heavy burden. Unlike other committees which regularly receive citizen input, and are assisted by the scrutiny of the public over "what the government is up to," the intelligence committees must depend on small staffs and input from the very agencies they are charged with overseeing.

The issue of warning analysis has not received much Congressional oversight in recent years. It has been almost 25 years since the last major look at the issue. Now, the Congress should move quickly to: (1) demand a baseline review of Intelligence Community organization and capabilities devoted specifically to warning analysis; (2) increase the size and expertise of the oversight committee staffs to support such an examination; (3) authorize and appropriate funds to create more balance between so-called current intelligence (current situation and near-term outlook) and methodical analysis of threats such as that presented above (or according to some other rigorous analytic scheme); (4) legislate managerial accountability for failure; (5) mandate the creation of integrated Intelligence Community information technology; and (6) establish ongoing review mechanisms, possibly through the General Accounting Office or a similar structure.

In the end, we are all human beings. And even with the help of increasingly capable machines, we cannot know everything. Nor can we imagine everything. We cannot know every fact. We cannot even find out every fact. We most certainly cannot eliminate failure. But we can do better, and we can be safer.

Dr. Mary O. McCarthy, a CIA officer, is currently a Visiting Fellow at the Center for Strategic and International Studies, researching and writing on intelligence matters. Previously (from 1996 to 2001) she served as a Special Assistant to the President (Clinton and Bush) and Senior Director for Intelligence Programs on the National Security Council Staff. Prior to moving to the White House, Dr. McCarthy served on the National Intelligence Council as National Intelligence Officer for Warning, and Deputy NIO for Warning. She began her career at CIA in the Directorate of Intelligence in analytic and managerial positions in the areas of Africa and Latin America.

Prior to beginning her government service, Dr. McCarthy spent time in the private sector as a Director, then Vice President, of BERI, S.A., a Swiss-based company conducting risk assessments for international businesses and banks; and in academia, teaching at the University of Minnesota and serving as Director, Social Science Data Archive, at Yale University.

Dr. McCarthy has lectured and written on the relationship between policy and intelligence, on the problem of intelligence warning, and on numerous topics involving the risks associated with international business. She has also published a book on the social history of Ghana. Her Ph.D. in history is from the University of Minnesota.
