1
Analysis in the U.S. Intelligence Community: Missions, Masters, and Methods

Thomas Fingar


The intelligence establishment of the United States is a vast enterprise with more than a dozen agencies, roughly 100,000 employees (Sanders, 2008), and a budget larger than the gross domestic product of many nations.1 Approximately 20 percent of the employees are analysts,2 a category that subsumes photo interpreters, those who interpret intercepted signals, specialists on foreign military systems, and a number of other specialists in addition to those who analyze political, economic, societal, and other security-related developments. All are members of the intelligence community (IC), but their missions, customers, professional identities, and organizational cultures are to a substantial extent determined by the agency (or agency component) to which they are assigned.3 They work on different kinds of problems for diverse sets of institutional and individual customers. The diversity of missions and masters has resulted in a pluralistic structure with sensible—if not always optimal—divisions of labor and professional specialization.

1

The National Intelligence Program budget for fiscal year 2008 was $47.5 billion (Office of the Director of National Intelligence, 2009).

2

The approximate percentage of analysts is based on the number of analysts listed in the Analytic Resources Catalog and the total number of military and civilian U.S. government personnel working in the IC (Sanders, 2008).

3

For descriptions of IC organizations and their primary missions, see Members of the Intelligence Community at http://www.dni.gov/members_IC.htm [accessed December 2009], 2009 National Intelligence: A Consumer’s Guide at http://www.dni.gov/IC_Consumers_Guide_2009.pdf [accessed December 2009], and An Overview of the United States Intelligence Community for the 111th Congress at http://www.dni.gov/overview.pdf [accessed December 2009].



This essay is intended to set the stage for the discipline- and field-specific essays of the other contributors. It seeks to identify key characteristics of the IC and to explicate, albeit in abbreviated fashion, why the IC is organized as it is and how mission, expectations, and structure empower and constrain the work of individuals, agencies, and the IC as a whole.

ANALYTIC MISSION OF THE INTELLIGENCE ENTERPRISE

The mission of intelligence analysis is to evaluate, integrate, and interpret information in order to provide warning, reduce uncertainty, and identify opportunities. Providing insight on trends, the political calculus of particular foreign leaders, or the way problems are perceived by people outside the United States is often more helpful to decision makers than is the presentation of additional “facts” or speculation about “worst case” possibilities.4 Discovering that a country is cheating on a treaty commitment may be less important than providing insight into why it is doing so.5 Ferreting out all details of an adversary’s new weapon system may be less useful than finding a vulnerability that can be exploited. Prompting decision makers to rethink their own assumptions and preliminary judgments may be more beneficial to the national security enterprise than providing definitive answers to specific questions.6

Intelligence, especially analytic support, is useful to decision makers in direct proportion to the degree to which it is timely, targeted, and trusted by those who receive it. Thorough examination of all relevant factors and how they interact is seldom possible within the real-world decision timelines of U.S. officials, and getting it completely right is often less important than providing useful information and insights to the right people at the right time. Even data-rich and methodologically brilliant analytic products may contribute little to the national security enterprise they are supposed to support if they are prepared without understanding the knowledge, timelines, and objectives of officials working on the issue.7

In addition to being factually accurate, intelligence analysis must be—and be seen to be—both objective and germane to the needs of those for whom it is intended. The importance of tailored support is one of the reasons the U.S. intelligence enterprise has so many different and somewhat specialized components. Oversimplifying greatly, the 16 constituent agencies—with 19 analytic components counting the National Intelligence Council, National Counterintelligence Executive, and National Counterterrorism Center—exist because each serves different, and somewhat unique, customers and missions. Each has developed expertise and analytic tools to meet the needs of its primary customers. Their customers have confidence in the work performed by “their” intelligence unit because they know the people and routinely find the work they produce to be more useful than that provided by analysts elsewhere who perforce are less well attuned to the specific intelligence requirements of the parent department.8

4 Examples of trends affecting the agendas and capabilities of governments include the rapid “graying” of populations in Europe and Japan and youth bulges in African and Central Asian countries already struggling to meet demands for education and jobs (National Intelligence Council, 2008a). Political leaders widely considered “close” to the United States who found it expedient to distance themselves from Washington when running for reelection include Iraqi Prime Minister Nouri al-Maliki (Steele, 2008) and Afghan President Hamid Karzai (Voice of America, 2009). For an example of how other countries view U.S. policies, see Tiron (2007).

5 For example, “Russia” failed to honor its obligations under the Chemical Weapons Convention because the retired general assigned to oversee dismantlement of now-prohibited activities failed to do what he was supposed to do. When this was discovered, the general was fired by President Yeltsin (Boudreaux, 1994).

6 For example, the way in which U.S. policy makers approached the problem of illicit Chinese sales of chemical weapon precursors changed when they understood that part of the problem stemmed from the limited ability of the Chinese government to enforce its own export regulations (Nuclear Threat Initiative, 2007).

7 For example, after U.S. policy makers became convinced that they needed to work with the Chinese government to halt the sale of missiles to countries in the Middle East, they wanted information and insight from the IC that would help them to determine how to do that, not additional reports confirming that sales had occurred in the past (Gordon, 1990).

8 The Central Intelligence Agency does not have a “parent department” in the sense that this term is used here, but it is the primary source of analytic support for the National Security Council staff and a primary or secondary source for customers across the U.S. government.

FORM FOLLOWS FUNCTION

Legacy arrangements whereby individual and institutional customers rely primarily on analysts and agencies that look at issues and intelligence through lenses keyed to their own mission requirements are logical and often sufficient to meet core requirements. Indeed, the approach adopted by the Office of the Director of National Intelligence (ODNI) in 2005 and implemented thereafter has sought to preserve and build on the best features of a de facto federated system of intelligence support. That approach made it easier to take advantage of complementary skills, achieve more rational divisions of labor, and improve the overall performance of the analytic community by improving the performance of all analysts and each of the analytic components.9

This approach deliberately eschewed institutional consolidation and the formation of country- and/or issue-specific centers intended to “rationalize” organization charts and lower institutional barriers to information exchange and collaboration because the ODNI judged that potential gains from co-locating analysts working on similar problems would be less than the probable loss of insight and trust resulting from proximity to particular customers.10 Rather than consolidating analysts, the ODNI approach sought to preserve and enhance the advantages of analytic boutiques (e.g., the Marine Corps Intelligence Activity and the State Department’s Bureau of Intelligence and Research) that were able to provide tailored support while making it easier for them to contribute to, and benefit from, the work of colleagues elsewhere in the IC. Furthermore, the approach aimed to reduce the autarky and isolation of analysts by facilitating knowledge of, access to, and collaboration with colleagues and counterparts in other components of the intelligence enterprise. The notional “model” for the analytic enterprise was more like Radio Shack’s networking of widely dispersed affiliates located near their customers than Walmart’s distribution of standardized goods through megastores located far from people previously served by neighborhood shops.

9 The ODNI was established by the Intelligence Reform and Terrorism Prevention Act of 2004. The position of Director of National Intelligence was created to enhance integration of the IC and was given a specific mandate to improve the quality of analytic products. See Fingar (2006) for discussion of many elements of the approach adopted by the ODNI.

10 The call for formation of subject-specific centers was made, i.a., in The 9/11 Commission Report (National Commission on Terrorist Attacks, 2004, pp. 411–413). Preservation of multiple analytic components that had evolved independently in a context that made it difficult to rely on work done by colleagues in other components—because of impediments to knowing precisely who did what, the expertise of analysts elsewhere, or how responsive they would be to requests for assistance—also preserved unnecessary as well as appropriate duplications of effort. It also perpetuated cultural differences, bureaucratic rivalries, and other organizational pathologies (in this volume, see Zegart, Chapter 13; Tinsley, Chapter 9; and Spellman, Chapter 6). Knowing more about the capabilities, staffing, and missions of each component was a requisite for identifying which capabilities were redundant and which could be eliminated without risking a single point of failure or jeopardizing the ability of the IC to obtain multiple independent analyses of critical issues. Reducing and realigning independent capabilities was postponed until more was known about individual and aggregate strengths and weaknesses.

PARAMETERS AND PRESSURES AFFECTING ANALYTIC PERFORMANCE

Implementation of the blueprint summarized above has begun, and the initial results suggest it is both workable and worthwhile. The results also demonstrate, however, that several more challenges must be understood and addressed to minimize unnecessary duplication while providing more accurate, insightful, and useful analytic support to the IC’s large, diverse, and demanding customer base.11 The magnitude of the task is complicated and compounded by the explosive growth of requirements and escalating expectations of customers, overseers, and the attentive public. Simply stated, in addition to their many other challenges, IC analysts must contend with more requirements from more customers, and must answer more difficult questions more quickly and with greater precision than ever. Moreover, they must do so while coping with exponentially increasing volumes of information (for further discussion, see Fingar, 2011b). Each of these interconnected challenges warrants both explication and illustrative examples of their implications for the analytic enterprise.

In the years since the demise of the Soviet Union, and especially since the attacks of 9/11, “national security” has been redefined, often implicitly, in ways that require radically different approaches to analysis, the way analysts engage with one another, and the missions they support. Once limited almost exclusively to concerns about military, diplomatic, and political/ideological threats to “American national interests,” national security now subsumes concerns about the geopolitics of energy, global financial flows, spread of infectious disease, and the safety of individual American citizens anywhere on the globe.12 Expansion of the concept and concerns of “national security” has also expanded the scope (i.e., number and variety) of institutions and individuals who desire or demand analytic support from the IC.13 Because intelligence support has long been treated as a “free good,” there are few constraints on what customers can request or what members of Congress expect to be provided.14

The proliferation of customers and topics on which the IC was expected to acquire information, develop expertise, and deliver analytic insights raised questions about how to do so. The default setting was for new customers to go to the Central Intelligence Agency (CIA) because its mandate was to support all national security customers, and the CIA initially accepted the new requirements. Rather quickly, however, customers and intelligence analysts rediscovered the value of proximity and tasking authority that had spawned the creation of so many different analytic components. Simply stated, the U.S. government faced, at least implicitly, the question of whether to replicate the old approach of creating new specialized units co-located with customers, or to develop better ways to frame requirements and tap expertise without creating new units. In other words, the IC had to find a way to provide boutique-like service and attention to customers without creating new bureaucratic units or substantially increasing the number of analysts.

In addition to coping with a wider range of requirements from a larger and more diverse set of customers, intelligence analysts had to address many questions that were inherently more complex than most of those that had become routine during the Cold War. One dimension involved the shift of focus from the national level (e.g., what does Moscow or Cairo want?) to subnational and nongovernmental organizations and groups (e.g., is the basis for the insurgency political, tribal, economic, religious, or something else?). Addressing such questions requires both greater and different kinds of expertise and analytic techniques than were sufficient in the past. These challenges are further compounded by shorter deadlines—to be useful now, analytic insights often must be provided in days or hours rather than weeks or months—and the demand for more “actionable intelligence” (i.e., information that can be used to disrupt a terrorist plot, prevent the delivery of chemical precursors, or freeze bank accounts being used for illicit purposes).15

More numerous and more complex issues require use of more and different types of information. Much of the required information is readily available to anyone at no or little cost; other types of information can only be acquired, if at all, through clandestine methods. Knowing what to look for, where to seek it, and how to provide guidance to collectors have become much more demanding aspects of an analyst’s job than in days when much of the job often entailed evaluating and explaining secrets and other bits of information collected and disseminated “because it was obtainable” rather than because it addressed high-priority analytic questions. Moreover, the dramatic increase in publicly available information that has characterized the past two decades, and the extraordinary capabilities of new methods of technical collection and data storage, have greatly increased the height of the information haystack. It probably does contain more “needles” than before, but they are often much harder to find.

11 These are the personal observations of a participant observer. I made many of the decisions incorporated into the approach summarized here and closely monitored their implementation, but the judgments about their efficacy are largely subjective and impressionistic.

12 The broader scope of questions addressed to the IC is illustrated by the titles of unclassified reports published by the National Intelligence Council during the past decade. They include: The Impact of Climate Change to 2030: Commissioned Research and Conference Reports (National Intelligence Council, 2009), Strategic Implications of Global Health (National Intelligence Council, 2008b), SARS: Down But Still a Threat (National Intelligence Council, 2003), and Global Humanitarian Emergencies: Trends and Projections 2001–2002 (National Intelligence Council, 2001).

13 Perhaps the clearest example of this expansion is the creation of the Homeland Security Council by the George W. Bush Administration and the subsequent incorporation of “domestic” agencies into the restructured National Security Council undertaken by the Obama Administration. It is also reflected in the redefinition of “national intelligence” in the Intelligence Reform and Terrorism Prevention Act of 2004 (Section 1012).

14 Members of the Intelligence Oversight Committees in both the Senate and the House of Representatives have raised questions about the appropriateness of devoting intelligence resources to nontraditional issues and customers, but members who sit on committees with responsibility for the nontraditional issues and agencies generally take the opposite view. For examples of debate over the proper scope of topics to be addressed by the IC, see the blog, Kent’s Imperative (n.d.), http://kentsimperative.blogspot.com/ [accessed May 2010]. For an example of disagreement among members of Congress, see Congressional Record–House (2007).

15 The kinds of difficulties and dilemmas inherent in meeting demands for actionable intelligence can be illustrated with a simple, but typical, example. Foreign governments being asked to search a shipment or block a flight suspected of carrying illicit material want detailed information about the content and source of the intelligence that triggered the request because they want to make an independent judgment about whether it is sufficiently reliable to jeopardize their own interests by taking the requested action. Sometimes, such requests for information about the intelligence are also intended to learn more about U.S. intelligence capabilities; foreign governments want to know how much—and how—the United States knows about what happens in their countries. It is not always easy to provide information that is sufficiently detailed to be persuasive and to minimize the likelihood of error without jeopardizing sources and methods (for further discussion, see Fingar, 2011b).

WHAT ANALYSTS DO: INDIVIDUAL AND COLLECTIVE RESPONSIBILITIES

Every analyst’s job is multifaceted and somewhat unique, but all entail core responsibilities and employ—or should employ—the same high standards of analytic tradecraft. The challenge, and it is a significant one, is for every individual and the analytic community as a whole to strike the right balance when allocating time and effort to each component of the job. This cannot be achieved by assigning arbitrary priorities or percentages of time. The generic tasks summarized below are—or should be—complementary, but they are more often characterized as zero-sum with a bias for addressing what is current at the expense of what might be more important. This is a long-standing lament, but most proposals to alleviate competing demands do not go beyond calling for more long-term strategic analysis and less attention to current issues (e.g., Russell, 2007; Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, 2005).

Answer Questions

A portion of every analyst’s job involves answering questions. Sometimes the questions are posed in the course of a meeting and may require both an immediate answer and a longer and more considered response. One’s ability to provide confident answers with adequate levels of detail is a function of one’s expertise and ability to anticipate what the customer or meeting is likely to require; the adequacy of the response is, in part, a function of the degree to which those present have confidence in the analyst.16 Sometimes the most important “answers” are the ones provided by an analyst to questions that customers should have asked, but did not.17 To be useful, the analyst needs to find out what his or her customers “know,” what they are trying to accomplish, and what approach is being used to formulate and evaluate policy options. Questions that are more difficult to address include those that come to an analyst indirectly, with little or no information on why the question was asked.18 The objective in all cases is to provide more than “just the facts.” Good analytic tradecraft requires providing information on context, patterns, the quantity and character of intelligence germane to the subject, and other insights likely to help customers to understand the issues that prompted the query (Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, 2005).19 Three keys to providing timely and useful answers are (1) command of one’s portfolio, (2) knowledge of where to go for data and help to interpret what it means, and (3) practice of good tradecraft even on routine or quick turnaround matters.

16 For purposes of this paper, “expertise” is a function of formal academic study; time spent working on particular places, people, or problems; and understanding of U.S. interests and objectives germane to one’s areas of specialization.

17 The following examples of questions that should have been asked, but were not addressed until an intelligence analyst injected them into the conversations, are real, not hypothetical. The first occurred in the context of a long discussion of how Moscow would respond to a variety of U.S. and/or European moves intended to affect developments in the Caucasus and how various scenarios were likely to play out. A question that should have been asked early in the discussion was, “How do the Russians view the situation and what do they want to happen?” The second occurred in the context of a discussion of the efficacy of a new program to protect Iraqi oil pipelines from attack by insurgents by paying local militias to act as a pipeline protection force. The indicator used to gauge the efficacy of the program was the number of attacks after the stand-up of the new protection force, and several people noted with pleasure that no attacks had been made on a particular section of the pipeline for more than 6 weeks. What should have been asked—and addressed sooner than it was—was whether the pipeline was operative before or during the period under discussion. It wasn’t. Until noted by the intelligence analyst, no one had considered that the reason there had been no attacks on the pipeline was probably because it had already been put out of commission.

18 When I was Chairman of the National Intelligence Council, I received a message from the staff of a senior director at the National Security Council requesting an update on political reconciliation, economic reconstruction, and public safety in Iraq. The request was misleadingly clear in that it seemed to require updated information of the kind that was regularly incorporated into spreadsheets and graphics used to depict progress and problems. The IC contributed to, but usually did not prepare, those graphics, which were the responsibility of officials in the State and Defense Departments. When I sought clarification, I learned from the senior director that he was looking for an analytical assessment of the relationships among reconciliation, reconstruction, and the security of different ethnic and religious groups. The operative assumption up to that time had been that progress toward political reconciliation—elections—would facilitate progress on the other dimensions of interest. He was asking if the evidence supported that hypothesis. It did not.

19 The need to identify assumptions, characterize sources of information, and specify levels of confidence in both judgments and underlying information is codified in IC directives (see Director of National Intelligence, 2007a, 2007c).

Provide Warning

Every analyst has a responsibility to monitor developments and trends in his or her portfolio in order to determine where they seem to be headed and whether they might “threaten” American interests or the viability of approaches being considered or implemented by those they support. Analysts should also be alerted to potential opportunities for policy intervention to mitigate or capitalize on what is taking place. For most analysts, most of the time, the focus should be on providing strategic warning—informing customers what could happen far enough in advance to allow deliberation and the formulation of policies to encourage what appears desirable and to thwart or mitigate unfavorable or dangerous developments. But no policy maker likes to be surprised. Too often their expectations and demands for “warning” are conveyed or interpreted as demands to be informed or alerted about any development that might be made known to colleagues and counterparts, or about which they might be asked questions by Congress or the media. This desire for “no surprises” often skews the work of analysts too far in the direction of “current intelligence,” amounting to little more than duplicative and ill-informed commentary on developments that, in the grand scheme of things, are not all that important (e.g., Russell, 2007).

Monitor and Assess Current Developments and New Information

The ability to provide warning of what lies over the horizon, around the bend, or behind a tree requires continuous and close monitoring of developments that might affect places, problems, people, or policy maker requirements in every analyst’s portfolio. This dimension of the analyst’s job involves more than just evaluating, assessing, interpreting, and transmitting the latest fruits of collection efforts.20 Many analysts feel overwhelmed because they attempt to—and cannot—“read everything” that collectors push at them and they know is available in unclassified materials (“open source” in the argot of the IC). The days when an analyst could, or could be expected to, read everything are long gone. It would be counterproductive and fruitless to try to solve the problem by narrowing the scope of portfolios and adding more analysts.21 What is required is better understanding of complex problems, not a large contingent of analysts who know more and more about less and less.

To perform this part of the job, analysts must begin with a clear (or as clear as their relationship with customers allows) understanding of what customers are working on, worry about, and want to know.22 Armed with this knowledge, and the analyst’s own subject matter expertise and understanding of the issues and dynamics involved, the analyst can narrow the scope of his or her search and analysis efforts to what are thought to be key drivers, key indicators, and key developments germane to the concerns of customers and, as importantly, to their own ability to understand what is happening, why, and where events appear to be headed. There are obvious advantages to divisions of labor with fellow analysts and increasing opportunities to work together via collaborative tools such as A-Space and other capabilities to access and assemble data and to garner insights from colleagues. However, at the end of the day, each analyst is responsible for identifying and interpreting information germane to the interests of his or her customers that might affect their understanding of the situation and ability to achieve their objectives.23

20 “Evaluation” of intelligence involves determination of its use (e.g., whether it contains new information, whether it corroborates or contradicts previously acquired information, and whether it requires reconsideration of previously reached judgments). “Assessment” deals with other dimensions of a report’s possible use such as its reliability, credibility, and importance to understanding the particular issues under study. “Interpretation” of intelligence addresses “what does it mean” questions germane to understanding capabilities, intentions, trends, and drivers. “Transmission” of intelligence requires communication of information about its provenance, reliability, significance, and implications to other analysts and to customers outside of the IC.

21 In the past, when many existing systems and procedures were developed, a substantial part of the intelligence enterprise was devoted to ferreting out secrets and attempting to learn “anything we could” about dangerous and denied areas on the assumption that knowing a little about some dimension of a place or problem was better than knowing very little about most dimensions. The job of the analyst was to explain and interpret whatever “facts” happened to be collected on a subject in his or her portfolio. Now such an approach is impractical, unnecessary, and often unhelpful. Much—even most—of the time, analysts begin with a question that, if answered, will provide key insights into the subjects they study and seek information that promises to help answer that question.

22 This can happen in many ways. One is the “EAP Informal,” a weekly meeting chaired by the Assistant Secretary of State for East Asia and the Pacific that I found to be an exceptionally effective forum and format for ensuring that policy maker and intelligence counterparts understand what each is working on, attempting to accomplish, and worried about. The six or seven senior participants from the Department of State, Department of Defense, the Joint Chiefs, the National Security Council, the National Intelligence Council, and the Department of State’s Bureau of Intelligence and Research (INR) come together as peers who check their bureaucratic roles at the door so they can speak freely about issues, options, and objectives without worrying about turf or other bureaucratic issues. I participated in these meetings as Director of INR’s Office of Analysis for East Asia and the Pacific and found them extremely useful for providing guidance to collectors and focusing the work of IC analysts.

23 A-Space is a pathbreaking collaborative workspace that enables analysts from across the IC to access and share highly classified information, pose questions, and post observations without having to know precisely who might have answers or find the observations of interest, mentor and obtain help at a distance, and collaborate to produce and update reports and data repositories. Time named A-Space (which the media calls Facebook for Spies) one of the 50 best inventions of 2008 (Time, 2008).

Building Expertise and Strategic Analysis

Observations—and criticism—that analysts devote too much time to “current intelligence” often lament that too little time is spent on “strategic analysis.” Many prescribe corrective measures that include setting up separate staff to conduct long-range studies or assigning all “current intelligence work” to a small staff so that most analysts can engage in strategic analysis (e.g., Treverton and Gabbard, 2008). From my perspective, both the diagnosis and the prescriptions are somewhat off the mark. The IC certainly can do a lot better in terms of the way it monitors and reports breaking developments (what Secretary Powell correctly referred to as “the news”). Yet that does not obviate the need for the vast majority of analysts to address issues already or soon to be on the agendas of those they support because if they do not and cannot do that, the IC will not meet the requirements and expectations of those it supports.

Second, although many proclaim the need for more strategic analysis, I have found the “market” for such work to be both small and episodic. So-called “tyranny of the inbox” is a bigger problem for policy makers than for analysts, and the needs of customers drive the process. Perhaps policy makers should think more about the long-term future, but few do so on more than an intermittent basis, and all tend to have less interest in long-term issues as they spend more time on the job. One can lament or decry the situation, but it is difficult for officials to think about how events might play out after their term of office while piranhas are working on their legs.24 Intelligence is fundamentally a support function; it exists to provide information and insight that will help customers to perform their assigned missions in the national security enterprise (for further discussion, see George and Rishikof, 2011). Analysts can, should, and do regard reminding customers of long-term trends and strategic implications of current decisions as an important part of their job, but they must do so within the parameters of trust, temporal pressures, and the agendas of those they support. The alternative is to be regarded as unhelpful or irrelevant (e.g., Treverton, 2008).

Rather than focusing on structural solutions such as creating strategic analysis units, or on changing the behavior and expectations of decision makers, the most useful proposals to improve analytic support begin from the premise that providing useful insights and context when addressing “current” issues requires both deep expertise and understanding of strategic

24 One senior official with whom I worked earlier in my career described the problem somewhat more colorfully when he said, “I’d love to spend more time thinking about the future but right now I’m up to my ass in alligators.”

[Pages 14 through 17 of the chapter are not included in this excerpt.]

OCR for page 3
18 INTELLIGENCE ANALYSIS: FOUNDATIONS customer needs into guidance for collectors.31 The IC collection system is both vast and nimble. It can be tweaked to go after specific topics and targets, and collectors do their best—which is a substantial effort—to meet the ever-changing panoply of needs given to them by analysts. Analysts do not—or should not—simply relay questions from customers. They translate such requests by asking themselves—and their colleagues—what the ques- tion is intended to illuminate, what kind of information would produce the greatest understanding of the underlying problem, and where collectors should look to find that information. The formal process for translating information needs into guidance to collectors is still a work in progress and is still more cumbersome than it should be, but analysts are and will remain the key to its success.32 The schematic summary above somewhat obscures the extent to which all of these job elements are interrelated and occupy a continuum rather than compartmented activities. It also omits activities such as updating databases that occupy significant portions of some analysts’ time. That said, many analysts and commentators speak as if the different elements are in zero-sum competition and lament that certain ones constrain what can be done to address others. As one might imagine, there is a natural tendency to decry and exaggerate the amount of time that must be allocated to tasks an 31 The NIPF is the formal mechanism for translating policy-maker priorities into collection and analytic requirements. A matrix is formed by arraying 32 intelligence topics (subdivided into 3 prioritized bands) against roughly 220 countries and non-state actors. Each cell in the matrix receives a “score” ranging from zero to five, with zero meaning the cell is empty and that the topic will receive essentially no attention from the IC. One is the highest priority, and topics in that group will receive a great deal of attention. Category five topics are essentially “global coverage” accounts maintained to support diplomatic and other ongoing responsi- bilities, or “fire extinguisher” accounts maintained at a low level of effort because of their potential to flare up with significant implications for U.S. interests (for additional information on the NIPF, see Director of National Intelligence, 2007b). 32 In addition to the formal NIPF process through which senior policy makers update their priorities every 6 months, analysts provide regular guidance to collectors through a variety of less formal, but more nimble, procedures. One is a biweekly update compiled by the Analytic Mission Management team in the Office of the Deputy Director of National Intelligence for Analysis by soliciting input from the 12 National Intelligence Officers (NIOs; 6 regional and 6 dealing with transnational issues). These updates reflect input and insight obtained through close interaction between the NIOs and their counterparts on the National Security Council and other executive branch agencies as well as similar input from senior analysts in individual IC components. A second mechanism, also managed by the Analytic Mission Management team, involves the convening, usually by a National Intelligence Officer, of senior analysts working on a particular place or problem. The purpose of these meetings is to clarify what policy makers want and need to understand and determine collectively what kinds of informa- tion might provide greatest insight into the problem. 
This is refined into specific guidance to collectors to “look in these places for this kind of information on these topics.” The guidance is also keyed to decision timelines to ensure that input from the IC is delivered in time to make a difference.

OCR for page 3
19 ANALYSIS IN THE U.S. INTELLIGENCE COMMUNITY individual finds more difficult or less rewarding than those on which he or she would rather spend time. All are important and interconnected efforts to make analysis more accurate, more useful, and more efficient. They will have the greatest impact if they address all of the elements in the job jar as parts of an integrated whole rather than as specialties that can be compart- mentalized and assigned to discrete groups of analysts. MAKINg NECESSITY A VIRTUE LEADS TO A BETTER WAY OF DOINg BUSINESS In the past—and here the past is as recent as the immediate post-9/11 period during which there was tremendous growth in the IC budget and the number of analysts—the standard response to increased demands was to add people and/or create new analytic components.33 To improve infor- mation sharing, reduce “cultural” barriers to collaboration, and consoli- date work on important issues, the 9/11 Commission recommended and the Intelligence Reform and Terrorism Prevention Act of 2004 (IRTPA) endorsed the creation of specialized “centers.” The IRTPA also gave statu- tory authority to the National Counterterrorism Center (NCTC) that had been created four months earlier by Executive Order (National Commis- sion on Terrorist Attacks upon the United States, 2004; Section 1021 of the 2004 IRTPA). The newly established ODNI made a conscious decision not to adopt that approach. By mid-2005, calls to integrate and rationalize the IC and competing demands for “more analysts” made it impractical and imprudent to create and staff new analytic units to support new missions and new cus- tomers. New units would have had to be either too small to achieve critical mass on any issue or so large that duplication of effort would have been inevitable. Moreover, the start-up problems of the NCTC and the fact that no agency abolished or significantly downsized its own counterterrorism unit when NCTC was established underscored previously learned lessons about distancing analysts from their primary customers and providing what looks like one-size-fits-nobody analytic support (see DeYoung, 2006; Whitelaw, 2006; for additional analysis, see Fingar, 2011a).34 The ODNI 33 The IC budget was classified until 2008, but authoritative figures were released occasion- ally. In 1998, Director of Central Intelligence George Tenet cited the figure of $26.7 billion (Central Intelligence Agency, 1998). The figure for 2009 was $49.8 billion (Office of the Director of National Intelligence, 2009). 34 The NCTC’s start-up problems resulted from the concatenation of many factors, includ- ing ambiguities in law and policy, resistance from certain agencies, and the sheer magnitude of the task. In addition, agencies were understandably reluctant to downsize their existing capabilities before the NCTC had proven that it could meet their specific counterterrorism requirements.

OCR for page 3
20 INTELLIGENCE ANALYSIS: FOUNDATIONS was determined to find a better way to organize and integrate IC analytic capabilities. To address the need to bring new types of expertise to bear on new problems for new customers without creating new units or adding signifi- cantly to the analytic workforce, the ODNI set out to discover whether such expertise existed anywhere in the IC, where this expertise was considered critical to the performance of core missions, and where it was vestigial or serendipitous.35 This effort also revealed how strong or weak the analytic community was in each area, now and when factoring in projected retire- ments and other forms of attrition. Using loose and subjective criteria, the ODNI set out to determine where the IC had sufficient expertise (if it could be harnessed effectively), where gaps existed in specific agencies and in the IC as a whole, and where there was potential to “grow” expertise by mentoring across agency boundaries. Developing the “better way” is still a work in progress, but the prin- cipal building blocks of the approach are relatively clear and experience to date provides an empirical basis for adjustments and improvement. The first building block was to identify with a fair degree of precision what each of the component analytic elements did (i.e., the missions and customers they supported, the areas of expertise they had developed, and the kinds of assessments they produced). This inventory revealed less redundancy than many assumed, especially when one examined specific areas of focus subsumed under broad rubrics such as “China” or “missiles.” Yet it also indicated that many agencies had developed small elements to address subjects tangential to their core missions because they did not know where relevant expertise could be found elsewhere in the IC, could not “task” analysts elsewhere to provide necessary input, or could not have confidence in the quality of the work done by people they did not know and could not evaluate on their own.36 This mapping exercise also revealed that most components judged that they lacked a “critical mass” of expertise on all but 35 Vestigial capabilities exist for many reasons, ranging from the magnitude of the IC effort against the former Soviet Union and its Warsaw Pact allies to expertise on Libya’s weapons of mass destruction programs that became less relevant when Muammar Gaddhafi decided to dismantle those programs and surrender key components to the United States. Serendipitous capabilities also have many variations, including the linguistic abilities and cultural knowledge of first- and second-generation Americans, the skills and knowledge individuals acquired in previous assignments, and the ability to use friendships and professional ties to secure assis- tance on certain types of issues. 36 In the parlance of the IC, the ability to “task” an assignment entails the ability to ensure that it is carried out and to hold individuals and organizations accountable for their perfor- mance. “Tasking” carries a lot more weight than merely “asking” that something be done. The former is mandatory; the latter can be trumped by assignments that have been specifically “tasked” to an analyst or agency.

OCR for page 3
21 ANALYSIS IN THE U.S. INTELLIGENCE COMMUNITY a small number of topics. When combined, these agency-by-agency map- ping exercises provided a reasonably complete picture of the customers and activities supported by the IC and a first-cut approximation of duplication and deficiencies. The second building block was to inventory the skills and experience of the analysts themselves by reinvigorating and making enrollment man- datory in the Analytic Resources Catalog (ARC).37 A primary objective of the ARC was to map what analysts know, individually and collectively. An assumption confirmed by the ARC data was that expertise on many subjects is deeper than organizational and staffing charts would suggest because analysts retain knowledge from previous assignments even as they assume new responsibilities. The mapping exercise, in conjunction with demo- graphic data using years of experience as a proxy for age and similar ways to avoid running afoul of privacy laws, also revealed areas where expertise was concentrated in particular age cohorts (e.g., a disproportionate percent- age of those working a given subject had 20 or more years of experience, suggesting an upcoming problem of simultaneous retirements with no suc- Prior to the creation of the ODNI and the adoption of uniform tradecraft standards for all agencies, there were both real and exaggerated differences in the quality of work done by dif- ferent components of the IC. Cultural differences and their consequences magnified negative perceptions and stereotyping of the people in and work done by counterparts in other agencies. This was a serious impediment to collaboration and meaningful divisions of labor. In addition to the adoption of uniform tradecraft standards, the ODNI launched a course in basic ana- lytic tradecraft attended by new analysts from across the IC. More than 3,000 analysts have graduated from “Analysis 101.” They have trained together and know that their counterparts elsewhere are as talented and well prepared as they are because they have learned the same skills at the same time in the same classrooms (see Kelly, 2007). 37 The ARC was the brainchild of John Gannon, the first Assistant Director of Central Intel- ligence for Analysis and Production (ADCI/AP) and one of my predecessors as Chairman of the National Intelligence Council. Gannon’s objective was to facilitate more efficient use of ana- lytic skills and the ability to make better informed decisions about what skills to pursue when filling vacant positions. His efforts were thwarted by officials who used counterintelligence concerns and other arguments to stifle this early effort to integrate the analytic community. A few years later, Mark Lowenthal, who succeeded Gannon as ADCI/AP, revived the ARC with somewhat more success, but he gave it a rationale that caused both individual analysts and managers to avoid entering pertinent information out of concern that the ARC would provide the basis for recruiting analysts for task forces, undesirable assignments, or other activities they might not wish to do. When I inherited their positions and the ARC, I made clear that it was to be a database of expertise, not a free agent list or roster of candidates for reassignment. The ARC has now become the model and foundation for similar and integrated databases of collectors, scientists, and other IC professionals.

OCR for page 3
22 INTELLIGENCE ANALYSIS: FOUNDATIONS cessors in the pipeline).38 One objective of this inventory of expertise was to make it easier for analysts to find potential collaborators and for analytic managers to find persons with the skills and experience needed to address subjects beyond the competence of their own agency. Stated another way, the goal was to be able to harness the totality of expertise in the analytic community, not just that of persons currently occupying particular billets (e.g., “Southeast Asia terrorism” or “Andean economics”). The exercise described above made clear that the IC had more expertise than suggested by staffing patterns if it could find a way to tap what people already knew, even if that knowledge was from previous assignments, and if the IC found a way to enable analysts to collaborate at a distance. The goal was to facilitate voluntary formation of “virtual” teams with the advantages of proximity to key customers and synergistic benefits from collaboration.39 Realizing the potential benefits inherent in this vision required overcoming a number of technical, policy, and cultural obstacles.40 Some have been surmounted; others have yet to be tackled. It also made clear, however, that the IC did not have and was unlikely ever to have enough people with 38 The specific case referenced in the text was drawn from a pilot project to map with preci- sion the capabilities and experience profiles of analysts working on Africa. When the initial results were shown to me, my first reaction was that the numbers were dangerously small, but the demographic profile of IC Africa analysts did not look that bad. When I looked more closely at the high end of the experience curve, however, I realized that we had a more serious problem than indicated by the data. Specifically, the profile showed that roughly 10 percent of the analysts had more than 20 years of experience (that was the way the question had been asked). But I knew several of the analysts in that category, and also knew they had all served for more than 30 years and were already eligible to retire. This discovery lent new urgency to efforts to recruit at higher levels of experience and to use the experienced veterans to mentor more junior analysts without regard to home agency affiliation. 39 The impetus for formation of virtual teams was the desire to create “critical masses” of expertise sufficient to address the complex analytic problems assigned to the IC without sacrificing the advantages of proximity to key customers that would result from the creation of “centers” (as called for by the 9/11 Commission) or similar arrangements intended to overcome cultural differences and impediments to information sharing through proximity. It was also the result of my experience in the Bureau of Intelligence and Research, where I had a number of teams composed of analysts scattered across three floors of the State Department. They interacted primarily through e-mail, a method that could easily be used to link analysts in different buildings, different agencies, and different cities. This experience was reinforced by the findings of the study of teams prepared for the Intelligence Science Board (Hackman and O’Connor, 2004). 40 Technical impediments included incompatibilities among legacy systems that made it dif- ficult to “wire” together certain components of the IC and use of different meta-data standards that impeded use of materials resident in different databases. 
Policy obstacles ranged from measures to address counterintelligence concerns spawned under conditions very different from those of today to “rules” that precluded e-mailing certain categories of documents to anyone in certain agencies. Fixing the technical obstacles was easier than reducing policy impediments. Cultural obstacles often boiled down to some version of “why would I want to work with anyone in that organization?”

OCR for page 3
23 ANALYSIS IN THE U.S. INTELLIGENCE COMMUNITY sufficient expertise to cover all of its missions in the small time frames that had become the norm. It was imperative to find ways to develop continuing relationships with scholars, journalists, think tank researchers, diplomats, and others with deep knowledge of subjects of interest to policy makers and essential to the analytic mission of the IC.41 This had to involve more than just compiling a list of “experts on everything.” Indeed, one objective was to make the incorporation of information and insights from outside experts a regular part of each analyst’s job in order to raise the level of individual and corporate expertise in the IC. A second was to be able to use the outside expert as a sounding board for ideas and as a source of guidance on where to look for answers to specific questions. A third objective was to nurture these relationships so they could be activated immediately in the event of a crisis or extremely short fuse requirements. The advantages are obvious, but not enough to overcome concerns, many of them legitimate, about interchange with people outside the IC.42 The proposed arrangements also raise important questions about deference to authority figures, protection of sources and methods, and other methodological concerns. DEMOgRAPHICS Perhaps the most important characteristic of the analytic workforce is its youth. Any plans to improve the quality of analytic products must give proper attention to the fact that more than 50 percent have joined the IC since 2001. A second is that the age distribution of the other 50 percent is skewed toward the retirement end of the scale, largely because the hiring freezes, downsizing, rightsizing, and organizational turmoil of the 1990s limited intake and caused many younger analysts to seek employment else- where, often in firms that do contract work for the IC. These demographics create a number of challenges (e.g., the need to use and capitalize on the expertise of senior analysts now serving in managerial positions, and to pull junior analysts up the learning curve faster than would normally have been the case in the IC).43 This also means that more formal training is required 41 Arguments for more and continuous interchange between IC analysts and specialists from outside the IC are developed at greater length in Fingar (2007). See also Intelligence Com- munity Directive (ICD) 205: Analytic Outreach (Director of National Intelligence, 2008). 42 Concerns about interchange between IC analysts and experts from outside of the IC in- clude the potential for unintentional disclosure of sensitive intelligence or information about sources and methods, increased vulnerability to the intelligence collection efforts of other nations, and the desire of some “collectors” to monopolize contacts with outside experts. 43 The IC has a long tradition of developing talent and shaping careers through approaches that would be familiar to guilds in the Middle Ages. Analysts gradually are exposed to more dimensions of the intelligence business and expected to more or less replicate the career paths and behaviors of those who have gone before. Such approaches are no longer adequate for the challenges of today or acceptable to the talented and ambitious new analysts who have joined

Those are the downsides of demography, but there are a great many upsides as well. For example, the cohort that has joined in the past 7 to 8 years is extremely talented and exceptionally well trained in the disciplines they pursued in graduate school (and most of the new analysts do have graduate training, many from leading universities). They are also of the “digital generation” and completely at home in environments requiring collaboration at a distance, sharing and providing information to trusted interlocutors, experimenting with analytic tools, searching the Internet and classified databases, and performing other tasks (Palfrey and Gasser, 2008). They routinely communicate with friends across institutional boundaries and expect to do the same in their professional lives. Persuading them to adopt new techniques and to work differently than the generations they are succeeding is easy. What is less easy—but essential—is developing modern-day means to vet information, exercise quality control on products developed using wikis and blogs, and maintain the requisite security safeguards when dealing with persons outside of the IC.44 In doing so, IC leaders must diligently adapt policies and procedures developed for a different time, different types of problems, and different generations to suit the capabilities and expectations of the youthful workforce.

WILL AND ABILITY TO ADAPT

The IC as a whole and the analytic community in particular are neither broken nor bad, but they can and want to be better. They want to be better for the right reasons: to ensure the security of our country, the safety of our fellow citizens, and the success of policies to protect American interests and promote American ideals. The majority of analysts are new to the IC, but they are not new to analysis. As a group, they represent and reflect the best training available in America’s best universities. Their seniors, in both age and position, are among the most knowledgeable subject matter experts in their fields. Most of them feel a strong sense of professional responsibility to move successors up the learning curve as rapidly as possible. Top analytic managers “get it” and are (mostly) eager to do what is necessary to transform the way analysis is done in the IC in order to satisfy burgeoning requirements, support new missions, realize the full potential of the analytic workforce, and retain the talented people who have joined the IC in the past decade. Getting it right will not be easy or quick, but conditions for sustained improvement have never been better.

44 Vetting information entails discovering and conveying to others details about its provenance that could affect judgments about reliability, accuracy, and intent (e.g., Was it published in a government- or party-controlled newspaper? Did the source have first-hand or only indirect access to the information? Or is it simply the repackaging of information published previously in another media outlet?).

REFERENCES

Boudreaux, R. 1994. Yeltsin fires chemical warfare chief. Los Angeles Times. April 8. Available: http://articles.latimes.com/1994-04-08/news/mn-43642_1_chemical-weapons [accessed May 2010].

Calabresi, M. 2009. Wikipedia for spies: The CIA discovers Web 2.0. Time. Wednesday, April 8. Available: http://www.time.com/time/nation/article/0,8599,1890084,00.html [accessed October 2010].

Central Intelligence Agency. 1998. Statement by the Director of Central Intelligence regarding the disclosure of the aggregate intelligence budget for fiscal year 1998. Press release, March 20. Available: https://www.cia.gov/news-information/press-releases-statements/press-release-archive-1998/ps032098.html [accessed May 2010].

Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction. 2005. Report to the President of the United States. March 31. BookSurge, LLC. Available: http://www.gpoaccess.gov/wmd/pdf/full_wmd_report.pdf [accessed April 2010].

Congressional Record–House. 2007. Proceedings and debates of the 110th Congress, first session, 153: 77—Part II, May 10, pp. H4895–H4896. Available: http://www.gpo.gov/fdsys/pkg/CREC-2007-05-10/pdf/CREC-2007-05-10-pt2-PgH4881-3.pdf [accessed October 2010].

DeYoung, K. 2006. A fight against terrorism and disorganization. Washington Post, August 9. Available: http://www.washingtonpost.com/wp-dyn/content/article/2006/08/08/AR2006080800964_pf.html [accessed May 2010].

Director of National Intelligence. 2007a. Intelligence community directive (ICD) 203: Analytic standards. June 21. Available: http://www.dni.gov/electronic_reading_room/ICD_203.pdf [accessed May 2010].

Director of National Intelligence. 2007b. Intelligence community directive (ICD) 204: Roles and responsibilities for the National Intelligence Priorities Framework. September 13. Available: http://www.dni.gov/electronic_reading_room/ICD_204.pdf [accessed May 2010].

Director of National Intelligence. 2007c. Intelligence community directive (ICD) 206: Sourcing requirements for disseminated analytic products. October 17. Available: http://www.dni.gov/electronic_reading_room/ICD_206.pdf [accessed May 2010].

Director of National Intelligence. 2008. Intelligence community directive (ICD) 205: Analytic outreach. July 16. Available: http://www.dni.gov/electronic_reading_room/ICD_205.pdf [accessed May 2010].

Fingar, T. 2006. DDNI/A [Deputy Director of National Intelligence for Analysis] addresses the DNI’s information sharing conference and technology exposition. Intelink and Beyond: Dare to Share. Denver, CO, August 21. Available: http://www.dni.gov/speeches/20060821_2_speech.pdf [accessed May 2010].

Fingar, T. 2007. Remarks and Q&A by the Deputy Director of National Intelligence for Analysis & Chairman, National Intelligence Council, at the ODNI Open Source Conference. Washington, DC, July 17. Available: http://www.dni.gov/speeches/20070717_speech_3.pdf [accessed May 2010].

Fingar, T. 2011a. Office of the Director of National Intelligence: Promising start despite ambiguity, ambivalence, and animosity. In R. Z. George and H. Rishikof, eds., The national security enterprise: Navigating the labyrinth. Washington, DC: Georgetown University Press.

Fingar, T. 2011b. Reducing uncertainty: Intelligence analysis and national security. Stanford, CA: Stanford University Press.

George, R. Z., and H. Rishikof (Eds.). 2011. The national security enterprise: Navigating the labyrinth. Washington, DC: Georgetown University Press.

Gordon, M. R. 1990. Beijing avoids new missile sales assurances. The New York Times. March 30. Available: http://www.nytimes.com/1990/03/30/world/beijing-avoids-new-missile-sales-assurances.html [accessed May 2010].

Hackman, J. R., and M. O’Connor. 2004. What makes for a great analytic team? Individual versus team approaches to intelligence analysis. February. Available: http://www.fas.org/irp/dni/isb/analytic.pdf [accessed May 2010].

Kelly, M. L. 2007. Intelligence community unites for “Analysis 101.” National Public Radio, May 7. Available: http://www.npr.org/templates/story/story.php?storyId=10040625 [accessed May 2010].

National Commission on Terrorist Attacks Upon the United States. 2004. The 9/11 Commission report. New York: W.W. Norton. Available: http://www.9-11commission.gov/report/911Report.pdf [accessed April 2010].

National Intelligence Council. 2001. Global humanitarian emergencies: Trends and projections 2001–2002. Available: http://www.dni.gov/nic/special_globalhuman2001.html [accessed May 2010].

National Intelligence Council. 2003. SARS: Down but still a threat. Available: http://www.dni.gov/nic/special_sarsthreat.html [accessed May 2010].

National Intelligence Council. 2008a. Global trends 2025: A transformed world. Available: http://www.dni.gov/nic/PDF_2025/2025_Global_Trends_Final_Report.pdf [accessed May 2010].

National Intelligence Council. 2008b. Strategic implications of global health. Available: http://www.dni.gov/nic/PDF_GIF_otherprod/ICA_Global_Health_2008.pdf [accessed May 2010].

National Intelligence Council. 2009. The impact of climate change to 2030: Commissioned research and conference reports. Available: http://www.dni.gov/nic/special_climate2030.html [accessed May 2010].

Nuclear Threat Initiative. 2007. China’s chemical and biological weapon-related exports to Iran. Available: http://www.nti.org/db/China/cbwiran.htm [accessed May 2010].

Office of the Director of National Intelligence. 2009. DNI releases budget figure for 2009 National Intelligence Program. October 30. Available: http://www.dni.gov/press_releases/20091030_release.pdf [accessed January 2010].

Palfrey, J., and U. Gasser. 2008. Born digital: Understanding the first generation of digital natives. New York: Basic Books.

Partnership for Public Service. 2009. The best places to work in the federal government 2009. Available: http://data.bestplacestowork.org/bptw/index [accessed May 2010].

Russell, R. L. 2007. Sharpening strategic intelligence. New York: Cambridge University Press.

Sanders, R. 2008. Conference call with Dr. Ronald Sanders, associate director of national intelligence for human capital. August 27. Available: http://www.asisonline.org/secman/20080827_interview.pdf [accessed April 2010].

Shaughnessy, L. 2008. CIA, FBI push “Facebook for Spies.” CNN.com/technology. September 5. Available: http://edition.cnn.com/2008/TECH/ptech/09/05/facebook.spies/ [accessed May 2010].

Steele, J. 2008. Maliki drops the mask: With his tough stance on U.S. withdrawal, Sunni militias and the Kurds, Iraq’s leader risks doom. The Guardian. September 5. Available: http://www.guardian.co.uk/commentisfree/2008/sep/05/iraq.middleeast [accessed December 2009].

Time. 2008. Time’s best inventions of 2008. October 29. Available: http://www.time.com/time/specials/packages/completelist/0,29569,1852747,00.html [accessed May 2010].

Tiron, R. 2007. Afghanistan officials question drug-eradication nomination. The Hill. Available: http://thehill.com/business-a-lobbying/2261-afghanistan-officials-question-drug-eradication-nomination [accessed May 2010].

Treverton, G. F. 2008. Intelligence analysis: Between “politicization” and irrelevance. In R. Z. George and J. B. Bruce, eds., Analyzing intelligence: Origins, obstacles, and innovations (pp. 91–106). Washington, DC: Georgetown University Press.

Treverton, G. F., and C. B. Gabbard. 2008. Assessing the tradecraft of intelligence analysts. The RAND Corporation, National Security Research Division. Available: http://www.rand.org/pubs/technical_reports/2008/RAND_TR293.pdf [accessed May 2010].

Voice of America. 2009. Afghan election poses policy dilemmas for U.S. August 3. Available: http://www.voanews.com/english/2009-08-03-voa34.cfm [accessed May 2010].

Whitelaw, K. 2006. The eye of the storm. U.S. News and World Report 141(7):48–52.
