Economists, researchers, and educators from all over the country recently took turns here looking into a crystal ball with two urgent questions: No. 1, what job skills will employers need in the decades ahead? And, No. 2, are students getting the education they’ll need to be employable?
As with most prognostications, the answers depended on whom you consulted.
Ask Dixie Sommers, an assistant commissioner for occupational statistics and employment projections at the U.S. Bureau of Labor Statistics, and she would tell you that, for example, employment in some occupational categories was projected to jump 20 percent from 2004 to 2014.
But ask Stuart W. Elliott, the director of the Board on Testing and Assessment at the National Research Council—part of the National Academies, a private, nonprofit quartet of institutions chartered by Congress to provide science, technology, and health-policy advice—and you’d get a very different answer. He’d tell you that by 2030, the question of what skills current employers might want could be moot for most jobs.
By then, according to a pilot analysis Mr. Elliott presented, 60 percent of human jobs as we now know them—including 74 percent of U.S. library, training, and teaching positions—may disappear.
Or maybe both of those scenarios will happen. Or neither.
“My summary of what skills demands there will be is, ‘Who knows?’ ” said Harry J. Holzer, a professor of public policy at Georgetown University in Washington.
The important thing, all the participants seemed to agree, was that all those ideas got shared and debated.
Researching the Future
Admittedly, the agenda for the May 31-June 1 conference—analyzing and assessing the wildly varying research methods that produce predictions as different as Ms. Sommers’ and Mr. Elliott’s—was ambitious. After all, how many people in the 1970s could have correctly calculated which jobs would most be in demand by now, let alone how best to prepare for them?
But the stakes don’t get much higher. Considering the potential for computers’ widespread displacement of human jobs, the projected rise in career competition from highly educated immigrants, and employer expectations that new employees come to work already trained, there is “a huge quantum paradigm leap in the way employers are thinking about work,” said Peter Cappelli, a professor of management at the University of Pennsylvania’s Wharton School in Philadelphia.
A preliminary analysis by Mr. Elliott suggests that computers will take over many of the current jobs in some employment sectors by 2030.
“If you’re a K-12 teacher, the workforce that you’re influencing is one that will exist several decades into the future, not the one that exists now,” said Mr. Elliott. “You need to shift your focus into the future.”
His projections for future work displacement were based in part on “Moore’s Law,” the rule of thumb that the number of transistors—and therefore the amount of computing power—that can fit onto an integrated circuit chip doubles about every 18 months. If that rate holds steady, Mr. Elliott said, computers will be able to perform effectively all current human work by 2100, or perhaps as early as 2050.
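To see how quickly that compounding runs away, here is a back-of-the-envelope sketch in Python. It assumes only the 18-month doubling period cited above; the 2007 start year and the horizon years are illustrative choices, not figures from the workshop.

```python
# Back-of-the-envelope compounding under the Moore's Law rule of thumb
# cited above: computing power doubles roughly every 18 months (1.5 years).
# The 2007 start year and the horizons below are illustrative assumptions.

DOUBLING_PERIOD_YEARS = 1.5

def power_multiple(start_year: int, end_year: int) -> float:
    """How many times computing power multiplies between two years."""
    doublings = (end_year - start_year) / DOUBLING_PERIOD_YEARS
    return 2.0 ** doublings

for horizon in (2030, 2050, 2100):
    print(f"2007 -> {horizon}: about {power_multiple(2007, horizon):,.0f}x "
          "more computing power")
```

Under those assumptions, computing power multiplies roughly 40,000-fold by 2030 and by a factor of hundreds of millions by 2050, which is the scale of change behind the displacement estimates.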
That doesn’t mean that 60 percent of the workforce will necessarily be unemployed by 2030, he said—just that “unprecedented investments in education for both children and adults” will need to be in place by the next decade if a majority of workers are to stay ahead of computer capability.
‘Soft Skills’ Needed
Mr. Elliott stressed that the pilot analysis was a “coarse approximation” of the kind of full-scale, computer-science-based analysis of future job-skills needs he has proposed.
But in the room full of educators, it was his pilot analysis’s projection that nearly three-quarters of currently configured teaching, training, and library jobs would be taken over by computers that drew the most attention.
“A little absurd” was how the projection struck Helen F. Ladd, a professor of public-policy studies and economics at Duke University in Durham, N.C. “Who programs the computers on an ongoing basis?” she wanted to know. “We haven’t developed the software for what teachers do. It’s different from classroom to classroom.”
There was broader agreement on the necessity for effective self-management, interpersonal and written communication, and other so-called “soft skills.” As several participants noted, employers are already decrying the lack of those attributes, particularly in science and software-engineering jobs.
“Work is in fact very social—even engineering,” said Beth A. Bechky, a professor in the graduate school of management at the University of California, Davis. Students at all levels get hit with plenty of core content, she added, but “the things that students don’t get training in is, ‘How do you communicate?’ In science or sales, you need to know how to talk to people.”
There was also, however, a broad sense of déjà vu on that topic. “If you Google ‘SCANS’ and see what employers needed in the 1980s and 1990s, the skill list was the same as we just heard,” said Thomas R. Bailey, a professor of economics and education at Columbia University’s Teachers College in New York City, referring to the Secretary’s Commission on Achieving Necessary Skills, a panel appointed by the U.S. secretary of labor.
The commission, which issued several reports from 1990 to 1992, routinely cited interpersonal skills, thinking skills, and personal qualities such as responsibility and integrity as must-have employee traits that schools were failing to teach.
“We’ve been teaching these skills for 10 to 15 years,” said economist and Harvard University education professor Richard J. Murnane. “What have we learned?”
Into the Void
Not much, according to Susan Traiman, the director of education and workforce policy at the Business Roundtable, a Washington-based nonprofit association of chief executive officers that advocates maintaining a well-trained workforce.
“It’s obvious that we’ve got to learn to walk and chew gum at the same time,” she said, referring to the need to teach both core-course content and the soft skills needed to make use of it. “Unfortunately,” the former teacher added, “most teachers don’t know how to do that without lowering standards.”
Others at the workshop suggested the blame lies elsewhere. “These kinds of interactive skills are not measured” by the welter of content-based standardized tests for which educators must prepare students, Mr. Murnane noted. “It often comes down to a drill-and-kill approach,” he argued, “which is not good at teaching these skills.”
Moreover, suggested Mr. Bailey, who also directs the Teachers College Community College Research Center and the National Center for Postsecondary Research, K-12 educators can only do so much to prepare their students for future needs.
“It takes 20 years or 30 years for what’s being taught to make its way into the workforce,” he observed. Nor, he said, is clairvoyance about future work skills a prerequisite for good workforce development.
Community colleges, for example, “don’t really need to have good forecasts,” he said, “because they have almost day-to-day contacts with local employers.”
For all the data and discussion, the researchers and academicians inevitably found themselves at the lip of an abyss, peering into the as-yet-unbridgeable gulf between what’s known and what won’t be known for years.
In the end, said Mr. Cappelli, the Wharton School professor, the skills that might turn out to be most valuable for students to learn are “the skills for managing uncertainty.”
“When you’ve got major technology shocks and global warming and everything else, [the future] is no longer knowable,” said Ms. Ladd, the Duke University professor, who also serves on a blue-ribbon commission on testing and accountability in North Carolina. “But that doesn’t mean we shouldn’t have these discussions.”