4
The Workforce

The quality of analyses depends on the intellect, talents, and skills of the analysts and the efficacy of their training and career development.

The intelligence community (IC) operates in an increasingly complex, turbulent, and fast-changing threat environment (see Fingar, 2011). This environment has important implications for recruiting, training, organizing, retaining, and managing the IC workforce. In stable environments, organizations can rely on stable work practices overseen by a relatively rigid administration that directs a generally hierarchical and compliant workforce. In turbulent environments, organizations require innovative work practices, flexible administration, and a creative, inventive workforce, given rein to find new approaches and novel solutions.

This chapter considers the findings on workforce issues from the behavioral and social sciences. Specifically, it looks at the discipline of strategic human resource management, whose focus is determining the best ways to create and manage a workforce that meets an organization’s needs. Building an IC workforce that is well suited for the challenges of the 21st century will require two broad efforts: (1) recruiting and selecting analysts—and other specialists—with the abilities, skills, and temperaments needed for success in this new environment; and (2) building the capabilities of that workforce by enhancing continuous learning, motivation, and collaboration (see Crook et al., 2011, for an analysis of the relationship between human capital and organization performance).



Copyright © National Academy of Sciences. All rights reserved.





RECRUITMENT AND SELECTION

The IC employs approximately 20,000 analysts with a wide range of talents and expertise, and it has begun to define the array of competencies that analysts will need through their careers (e.g., Director of National Intelligence, 2008). However, the IC definitions rely on "face validity" or intuitive appeal rather than on an evidence-based evaluation.

Strategic human resource management offers an objective, scientific approach to developing the best possible workforce. It is grounded in the findings that individuals differ on a wide range of psychological characteristics, such as cognitive ability, personality, and values, that predict corresponding differences in educational achievement, job performance, and career success. Some of these characteristics are relatively stable, such as cognitive ability, personality, and values, while others are more malleable, such as job knowledge, job-specific skills, attitudes, and motivational characteristics.1 Stable characteristics can influence the malleable ones. For example, it is well established that individuals with relatively higher cognitive ability gain more from experience and training than those with relatively lower cognitive ability (e.g., Judge et al., 2010; Schmidt and Hunter, 2004).

To assemble and develop individuals with the optimal collection of characteristics, the IC needs to pay attention to recruiting and selecting the right people, as well as to their training, motivation, and support. Both of these efforts will be important, but recruitment and selection is especially important because the quality of the human resources pool assembled in this first step facilitates or constrains an organization's subsequent ability to build and develop its workforce. A failure to maximize the talent pool at this step cannot be rectified by subsequent efforts.
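The predictive claims above are typically quantified as criterion-related validity: the correlation between a selection measure (such as a cognitive ability test) and later job performance. The following sketch is illustrative only; the scores are invented, not IC data, and real validation studies use far larger samples with corrections for range restriction and criterion unreliability.

```python
# Hypothetical data: test scores at hiring and later performance ratings
# for ten applicants. Invented for illustration only.
from statistics import mean, stdev

test_scores = [102, 115, 98, 130, 110, 95, 121, 108, 117, 104]
performance = [3.1, 3.8, 2.9, 4.5, 3.5, 2.7, 4.0, 3.3, 3.9, 3.2]

def pearson_r(x, y):
    """Pearson correlation: sample covariance divided by the product of
    the sample standard deviations."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

validity = pearson_r(test_scores, performance)
print(f"criterion-related validity r = {validity:.2f}")
```

A validity coefficient near zero would mean the measure adds nothing to selection; meta-analytic estimates for general mental ability tests in real settings (e.g., Schmidt and Hunter, 2004) are far below the near-perfect value this toy dataset produces.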
Psychological research has identified a wide range of characteristics that differ from individual to individual and can help to identify people with the greatest potential to become successful analysts. It is important to note that the optimal qualities for IC analysts may turn out to be quite different from the current criteria. For example, current practices may undervalue raw cognitive ability (a stable characteristic) and overvalue historical or political area knowledge (a malleable characteristic). Furthermore, the IC may need to shift from proxy measures, such as having a college degree, to direct measures of cognitive ability, as there is generally substantial variation in the cognitive abilities of college degree holders even from the same institution. Direct measures with strong psychometric validation are readily available. Ignoring them will cause the IC to lose the opportunity to ensure the highest quality pool of human resources for its needs.

1 Attainments such as an advanced graduate degree or extensive domain-specific expertise, although technically malleable, may entail a sufficiently long developmental period as to be considered stable characteristics for practical purposes.

The IC should also design its recruitment strategies to reach the best possible pool of potential recruits, from which the best candidates can then be selected. Finding the best candidates will require overcoming the common tendency of organizations to rely on "traditional" sources or pathways by which potential recruits become part of the applicant pool. These practices can create insufficiently diverse and talented applicant pools. There are well-developed recruitment methods (e.g., Rynes and Cable, 2003) to help ensure that applicants to the IC offer a wide range of the capabilities needed for intelligence analysis.

Given the difficulty and importance of its mission, the IC needs to use methods that have been evaluated and proven to be effective. For example, it is very common for an unstructured interview to be one component, or even the only component, of a selection system. But unstructured interviews are known to suffer from significant problems (see Huffcutt and Arthur, 1994; Kozlowski, 2011; McDaniel et al., 1994). One such problem is that interviewers select candidates they like personally, who tend to be people who are similar to themselves. In the current fast-changing intelligence environment, that tendency could be costly in terms of the diversity of people and skills needed.

DEVELOPMENT

In addition to improving its recruitment and selection practices, the IC needs to provide its workforce with the training, management, and organization needed to maximize its potential. Recruitment and selection establish the quality of the human resources pool, but the knowledge, skills, abilities, and other characteristics of potential new analysts are valuable in a wide range of jobs and organizational settings, which makes potential analysts desirable to other employers as well.
It is the specialized training that analysts should receive from the IC that develops the unique skills necessary to be successful analysts. The IC needs to make optimal use of its information advantages, its knowledge of what national security decision makers need from the IC, and its ability to tap into expertise both inside and outside the federal government. To do so successfully, it needs to create knowledge and skills in its workforce that are specific to the organizational mission and that provide an advantage in innovation and agility (i.e., to ensure U.S. intelligence is superior to that of its adversaries) (Barney and Wright, 1998; Crook et al., 2011; Ployhart, 2006).

In contrast to recruitment and selection, the process of workforce development will unfold over a long period of time and should evolve as scientific knowledge and IC experience dictate. A new approach to workforce development requires not a one-time fix but, rather, a basic shift in managerial practices to enhance the value and effectiveness of the IC workforce. The rest of this section discusses three key elements of development: continuous learning, motivation and assessment, and collaboration.

Continuous Learning

As demands on analysts shift, their performance will be largely determined by the extent to which the IC embraces and values continuous learning and training in the face of the normal pressures to give higher priority to the demands of the moment. Training should not be viewed as an impediment to "getting work done," nor should it be provided only to entry-level personnel. Instead, it must be seen as a career-long commitment, as much a part of the job as preparing analyses or providing guidance to intelligence collectors.

A good starting place would be the creation of a common curriculum of essential analytic tradecraft skills taught in joint settings, such as those taught in Analysis 101 by the Office of the Director of National Intelligence (ODNI) and its newer companion, Analysis 101 for managers, which trains managers to support the use of new analytic methods. Neither of these is, nor can be viewed as, taking time away from analysts' "real work." These two starting elements would be important steps toward enhancing skills and overcoming organizational and cultural barriers to collaboration. But more needs to be done to develop a culture of continuous learning. All such efforts at improving performance should receive publication-quality evaluation, even if the results of those studies will never be shared outside the IC.

One key element of training is a proper range of procedures and contents. Chapter 3 identifies basic behavioral and social science knowledge that would be helpful to analysts.
In addition to this basic skills training, there is a science of training effectiveness that has well-developed methods for identifying training needs, designing instructional methods, and evaluating their validity (for a review, see Kozlowski, 2011). Analysts would also benefit from hands-on experience on a wide range of problems to improve their analytic tradecraft skills.

Analysts today face several obstacles to improving their judgments (for a review, see Fingar, 2011). First, feedback regarding the accuracy and value of assessments is often limited, which makes learning from experience difficult and ambiguous. It has been extensively documented that effective feedback is necessary for learning (e.g., Kluger and DeNisi, 1996; for a review, see McClelland, 2011). Second, many critical analytic problems have a relatively low base rate (i.e., they are encountered only rarely), which creates great pressure to analyze them right the first time. Analysts should not be forced to "feel their way" through major challenges.
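The low-base-rate obstacle can be made concrete with the signal detection framing cited above (McClelland, 2011). In this hedged sketch, with all numbers invented for illustration, Bayes' rule shows why even an accurate warning indicator produces mostly false alarms when the underlying event is rare:

```python
# Illustrative numbers only (not drawn from IC data): why low base rates
# make rare-event analysis hard. Even a sensitive, specific indicator
# yields mostly false alarms when the event itself is rare.

def positive_predictive_value(base_rate, sensitivity, specificity):
    """P(event | warning) via Bayes' rule."""
    true_alarms = base_rate * sensitivity            # P(event) * P(warn | event)
    false_alarms = (1 - base_rate) * (1 - specificity)  # P(no event) * P(warn | no event)
    return true_alarms / (true_alarms + false_alarms)

# A hypothetical indicator that flags 90% of real events and stays quiet
# for 95% of non-events, applied to an event with a 1-in-1,000 base rate:
ppv = positive_predictive_value(base_rate=0.001, sensitivity=0.90, specificity=0.95)
print(f"P(event | warning) = {ppv:.3f}")  # fewer than 2 warnings in 100 are real
```

The same indicator applied to a common event (base rate 0.5) would make the warning trustworthy more than nine times in ten, which is one reason experience with frequent problems transfers poorly to rare ones.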

One way to address both issues is to use simulations to create synthetic experiences, providing exposure to infrequently encountered events along with timely, precise feedback. Simulations also make it possible to role play in situations in which "failure" can be used for reflection, learning, and innovation, rather than being a source of blame. As noted in Chapter 3, research shows that more learning generally accrues from failure than from success. The science of simulation-based training is well established and in widespread use by the aviation industry, the military, the medical community, the National Aeronautics and Space Administration, and other organizations that face analogous challenges (e.g., Bell et al., 2008; Cannon-Bowers and Bowers, 2010; Salas et al., 2011).

The culture of training should also include informal practices that provide opportunities for learning on the job (e.g., socialization, mentoring, cross training, job rotation, and career progression paths), stronger institutional supports (e.g., time, money, and encouragement) for continuing education both inside and outside the organization, and the removal of institutional barriers that inhibit continuous learning.

Several features of the IC environment require substantive and systematic training. Analysts must communicate with others both up and down the information chain, and they must collaborate with others who have different information and different types of expertise. At the same time, the IC environment can change swiftly. This situation suggests the need for various types of cross training, from acting in a different role during a training simulation to serving in different organizations and even on different types of assignments.
Experiencing other people's jobs and situations from their perspectives can help analysts better know how to communicate with people in those roles, and the expertise that one develops from experiencing a variety of situations leads to greater flexibility and insight when dealing with a new situation.

Everyone should be involved in training. Those who "know the most" should teach what they know. Teaching is itself a learning experience: in trying to explain things to other people, a teacher must first see those things from the student's point of view, which can lead to a questioning of one's own knowledge and assumptions.2

The IC does not now embody strong self-reflective norms in its teaching and mentoring programs. Rather, the programs seem to emphasize the transfer and preservation of institutional knowledge and practices. Although that is certainly an important task (especially given the expected high turnover because of the IC's younger, more fluid analytic workforce), it is also important for those doing the teaching to receive feedback from the new generation of analysts and for senior managers to create conditions that foster and capitalize on the skills and backgrounds of both current and future analysts.

2 Imparting the wisdom of more experienced analysts is an effective way to broadly leverage knowledge. However, experience alone does not make one an effective trainer (see Marsh, 1995). It is necessary to "train the trainer" so that experience can be translated into learning for others.

Promoting continuous learning requires serious investments in training and development. It also requires the systematic elimination of practices that inhibit continuous learning, such as insufficient time, resources, and incentives. If the IC is to develop a scientific approach to continuous learning, its senior and middle managers need to be committed to the concept and communicate that commitment in their goals and programs.

Motivation and Assessment

Once valuable, unique, and difficult-to-replicate capabilities have been developed in a workforce, management practices can help shape employee attention, provide motivation, reward effectiveness, and encourage continuous improvement. Some widely practiced strategies, such as annual evaluations, often produce inaccurate feedback (Murphy and Cleveland, 1995), and it is difficult to ensure that the factors being rated are aligned with organizational goals (for further discussion, see Kerr, 1995; Lawler and Rhode, 1976). In contrast, research shows the value of ensuring that supervisors have appropriate training in continuous performance management, including skill at setting goals, providing frequent feedback, coaching, and development (e.g., Aguinis, 2007; Smither, 2011).

One important factor in motivation is that the most effective reward system differs for different types of employees. With extrinsically motivated employees (those motivated mostly by external factors, such as pay, recognition, and advancement), it is important to link compensation to desired performance (see Bartol and Locke, 2000).
With employees who are intrinsically motivated by their work (those for whom internal satisfaction is more important than external recognition), signs of the inherent value of their work are the key. In the IC, intrinsic motivation is quite common, with analysts and other staff motivated by such things as the opportunity to save lives, serve their country, or solve important and interesting puzzles. For such workers, the most important approach is often to let them use and expand their skills with as few obstacles as possible. Fortunately, although extrinsic rewards are always constrained and limited in supply, intrinsic rewards are not.

Several structured rating methods for assessing performance effectiveness could be adapted for use by the IC (for further discussion of performance appraisals, see Murphy and Cleveland, 1991, 1995). It has long been known that, if assessments fail to address organizational needs, employees will focus on only those things that they see as being rewarded, to the detriment of other goals (Lawler and Rhode, 1976), or they will seek workplaces that better fit their needs. For the IC, a particular challenge will be to design performance evaluations that take into account both individual actions and teamwork.

Collaboration

When organizations face complex problems in changing and uncertain environments, teamwork and collaboration are essential. Team-based work systems locate decision making closer to the source of problems (in the IC, this refers to analytic decisions), capitalize on diverse perspectives and expertise, and encourage innovation and agility3 (Ilgen et al., 2005; Kozlowski et al., 1999). Team-based intelligence analysis can bring more information to bear on the analytic task and allow teams to dampen errors (for a review, see Hastie, 2011). If done correctly, expanding the information search and generating more alternatives have been shown to improve the effectiveness of forecasting (e.g., Dailey and Mumford, 2006). Creating communities of practice can allow like-minded analysts and others to focus on common interests, pool their knowledge, generate more alternatives for problem solving, and disseminate innovations.

The issue of collaboration is explored more thoroughly in the next chapter, but it is worth noting here that increasing teamwork and collaboration, in appropriate ways, can improve analysts' performance. For example, differentiation of expertise (distributing knowledge and abilities among numerous individuals, each with specific areas of expertise) promotes greater exploration and innovation (Argote, 2011; Jansen et al., 2006). In order to harness this potential, the IC has to integrate diverse expertise (Miller et al., 2007; Fang et al., 2010).
A positive example of the IC's growing focus on teamwork and collaboration is the Analytic Resources Catalog, which allows intelligence officers to locate other IC personnel with particular knowledge and skills. Another recent development, A-Space, is aimed at helping the development of collaborative, self-organizing networks of analytic activity. In particular, A-Space allows analysts to query the community, share information and perspectives, and collaborate on solving problems.

Electronic communities of practice initially grow and improve more slowly than other knowledge management tools, such as knowledge portals and team rooms, but over the long term they demonstrate more continual improvement (Kane and Alavi, 2007). A-Space allows the formation of flexible networks for linking disparate expertise, allowing a self-organizing, exploratory, and agile integration of skills and knowledge that makes it possible to take advantage of the IC's differentiation. A-Space is conceptualized as an ongoing "experiment," so it is particularly important that A-Space be studied and improved over time and that the results be disseminated as a way of helping the IC embrace high-quality teamwork in a continuous learning environment.

3 The National Intelligence Strategy states the IC must be "agile: an enterprise with an adaptive, diverse, continually learning, and mission-driven intelligence workforce that embraces innovation and takes initiative" (Office of the Director of National Intelligence, 2009, p. 2).

EVALUATION

As discussed in Chapter 2, objective evaluation is key to organizational learning. Given the importance of the IC's missions and the complexity of the issues it deals with, the IC's adoption and application of outcome-based evaluation has the potential to significantly improve its various workforce-related practices, from recruitment and hiring to training and incentives.

Applying scientific methods to understanding its own work environment will allow the IC to make appropriate changes to recruitment, selection, and training. One key topic is identifying the specific factors that affect analysts' ability to learn and be successful in their jobs, such as cognitive ability, personality, education, and training. These factors can then be sought when recruiting and selecting employees. It is important to note, however, that the IC should carry out its evaluations in a way that focuses on systemic learning rather than on the assessment of individual analysts. Evaluation should be seen as a positive factor for the community, a process that will enable the entire IC to become more effective, rather than as a punitive process to be dreaded and, if possible, avoided. Effective evaluation, reflection, and continuous improvement are the underpinnings of organizational learning, innovation, and agility.
The IC, like any organization or group, must decide how to balance process evaluation (how well correct procedures are followed) and outcome evaluation (the quality of the analyses). This balancing act is especially difficult in environments, like that of the IC, in which there appears to be little scientific foundation for the best practices embodied in many current procedures, however sincerely and conscientiously they are applied (for a review, see Tetlock and Mellers, 2011).

It is notoriously difficult to devise either process or outcome evaluation procedures that do not create perverse incentives (Kerr, 1975; Wilson, 1989). In the case of intelligence analysis, a natural risk is emphasizing the number of work products, which is easily assessed, over their quality, which is very difficult to assess. Indeed, poorly designed evaluation processes can undermine morale and productivity by encouraging extrinsically motivated employees to game the system and discouraging intrinsically motivated employees, who just want to do interesting work and be treated fairly. Conversely, well-designed evaluations look separately at the performance of the organization, given its methods and structures, and at the performance of individuals, given the opportunities and constraints that the organization presents to them.

REFERENCES

Aguinis, H. (2007). Performance Management. Upper Saddle River, NJ: Pearson Prentice Hall.
Argote, L. (2011). Organizational learning and knowledge management. In S.W.J. Kozlowski, ed., The Oxford Handbook of Industrial and Organizational Psychology. New York: Oxford University Press.
Barney, J.B., and P.W. Wright. (1998). On becoming a strategic partner: The role of human resources in gaining competitive advantage. Human Resource Management, (1), 31-46.
Bartol, K.M., and E.A. Locke. (2000). Incentives and motivation. In S.L. Rynes and B. Gerhart, eds., Compensation in Organizations: Current Research and Practice (pp. 104-147). San Francisco, CA: Jossey-Bass.
Bell, B.S., A.M. Kanar, and S.W.J. Kozlowski. (2008). Current issues and future directions in simulation-based training in North America. International Journal of Human Resource Management, (8), 1416-1434.
Cannon-Bowers, J., and C. Bowers. (2010). Synthetic learning environments: On developing a science of simulation, games, and virtual worlds for training. In S.W.J. Kozlowski and E. Salas, eds., Learning, Training, and Development in Organizations (pp. 229-260). New York: Routledge Academic.
Crook, T.R., S.Y. Todd, J.G. Combs, D.J. Woehr, and D.J. Ketchen. (2011). Does human capital matter? A meta-analysis of the relationship between human capital and firm performance. Journal of Applied Psychology.
Dailey, L., and M. Mumford. (2006). Evaluative aspects of creative thought: Errors in appraising the implications of new ideas. Creativity Research Journal, (3), 367-384.
Director of National Intelligence. (2008). Intelligence Community Directive (ICD) 610: Competency Directories for the Intelligence Community Workforce. July 16, 2008. Available: http://www.dni.gov/electronic_reading_room/ICD_610.pdf [June 2010].
Fang, C., J. Lee, and M.A. Schilling. (2010). Balancing exploration and exploitation through structural design: The isolation of subgroups and organizational learning. Organization Science, (3), 625-642.
Fingar, T. (2011). Analysis in the U.S. intelligence community: Missions, masters, and methods. In National Research Council, Intelligence Analysis: Behavioral and Social Scientific Foundations. Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security, B. Fischhoff and C. Chauvin, eds. Board on Behavioral, Cognitive, and Sensory Sciences, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Hastie, R. (2011). Group processes in intelligence analysis. In National Research Council, Intelligence Analysis: Behavioral and Social Scientific Foundations. Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security, B. Fischhoff and C. Chauvin, eds. Board on Behavioral, Cognitive, and Sensory Sciences, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

Huffcutt, A.I., and W. Arthur, Jr. (1994). Hunter and Hunter (1984) revisited: Interview validity for entry-level jobs. Journal of Applied Psychology, (2), 184-190.
Ilgen, D.R., J.R. Hollenbeck, M. Johnson, and D. Jundt. (2005). Teams in organizations: From input-process-output models to IMOI models. Annual Review of Psychology, 517-543.
Jansen, J.L.P., F.A.J. Van Den Bosch, and H.W. Volberda. (2006). Exploratory innovation, exploitative innovation, and performance: Effects of organizational antecedents and environmental moderators. Management Science, (11), 1661-1674.
Judge, T.A., R. Klinger, and L. Simon. (2010). Time is on my side: Time, general mental ability, human capital, and extrinsic career success. Journal of Applied Psychology, (1), 97-102.
Kane, G.C., and M. Alavi. (2007). Information technology and organizational learning: An investigation of exploration and exploitation processes. Organization Science, (5), 796-812.
Kerr, S. (1975). On the folly of rewarding A, while hoping for B. Academy of Management Journal, (4), 769-783. Republished in 1995 in Academy of Management Executive, (1), 7-14.
Kluger, A.N., and A. DeNisi. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, (2), 254-284.
Kozlowski, S.W.J., S.M. Gully, E.R. Nason, and E.M. Smith. (1999). Developing adaptive teams: A theory of compilation and performance across levels and time. In D.R. Ilgen and E.D. Pulakos, eds., The Changing Nature of Work Performance: Implications for Staffing, Personnel Actions, and Development (pp. 240-292). San Francisco, CA: Jossey-Bass.
Kozlowski, S.W.J. (2011). Human resources and human capital: Acquiring, building, and sustaining an effective workforce. In National Research Council, Intelligence Analysis: Behavioral and Social Scientific Foundations. Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security, B. Fischhoff and C. Chauvin, eds. Board on Behavioral, Cognitive, and Sensory Sciences, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Lawler, E.E., and J.G. Rhode. (1976). Information and Control in Organizations. Pacific Palisades, CA: Goodyear Publishing.
Lewis, M.M. (2003). Moneyball: The Art of Winning an Unfair Game. New York: W.W. Norton.
Marsh, P. (1995). Training trainers. Technical and Skills Training, 10-13.
McClelland, G. (2011). Use of signal detection theory as a tool for enhancing performance and evaluating tradecraft in intelligence analysis. In National Research Council, Intelligence Analysis: Behavioral and Social Scientific Foundations. Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security, B. Fischhoff and C. Chauvin, eds. Board on Behavioral, Cognitive, and Sensory Sciences, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
McDaniel, M.A., D.L. Whetzel, F.L. Schmidt, and S.D. Maurer. (1994). The validity of employment interviews: A comprehensive review and meta-analysis. Journal of Applied Psychology, (4), 599-616.
Miller, D.J., M.J. Fern, and L.B. Cardinal. (2007). The use of knowledge for technological innovation within diversified firms. Academy of Management Journal, 50(2), 308-326.
Murphy, K.R., and J.N. Cleveland. (1991). Performance Appraisal: An Organizational Perspective. Boston, MA: Allyn and Bacon.
Murphy, K.R., and J.N. Cleveland. (1995). Understanding Performance Appraisal: Social, Organizational, and Goal-Based Perspectives. Thousand Oaks, CA: Sage.

Office of the Director of National Intelligence. (2009). National Intelligence Strategy of the United States of America. Available: http://www.dni.gov/reports/2009_NIS.pdf [August 2010].
Ployhart, R.E. (2006). Staffing in the 21st century: New challenges and strategic opportunities. Journal of Management, (6), 868-897.
Rynes, S.L., and D.M. Cable. (2003). Recruitment research in the twenty-first century. In W.C. Borman and D.R. Ilgen, eds., Handbook of Psychology: Industrial and Organizational Psychology (pp. 55-76). New York: John Wiley and Sons.
Salas, E., S.J. Weaver, and M.S. Porter. (2011). Learning, training, and development in organizations. In S.W.J. Kozlowski, ed., The Oxford Handbook of Industrial and Organizational Psychology. New York: Oxford University Press.
Schmidt, F.L., and J. Hunter. (2004). General mental ability in the world of work: Occupational attainment and job performance. Journal of Personality and Social Psychology, (1), 162-173.
Smither, J.W. (2011). Performance management. In S.W.J. Kozlowski, ed., The Oxford Handbook of Industrial and Organizational Psychology. New York: Oxford University Press.
Tetlock, P.E., and B. Mellers. (2011). Structuring accountability systems in organizations: Key trade-offs and critical unknowns. In National Research Council, Intelligence Analysis: Behavioral and Social Scientific Foundations. Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security, B. Fischhoff and C. Chauvin, eds. Board on Behavioral, Cognitive, and Sensory Sciences, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Wilson, J.Q. (1989). Bureaucracy: What Government Agencies Do and Why. New York: Basic Books.
