Not a day goes by that we are not reminded of the growing gap between the preparation provided by institutions of higher education and the rapidly changing job market. The worry caused by this perceived gap has as much to do with a seeming decline in the return on investment for the rising tuition costs of traditional higher education as it does with an apparent lack of preparation for graduates’ first job.
The ensuing negative press and political commentary on what to do about this gap often point to one of two possible solutions. The first is to infuse higher education curricula with vocational-skills training, an idea that often inspires much hand-wringing in university classrooms over losing sight of the critical thinking and leadership skills that are stated hallmarks of higher education. This response can lead to a variety of defensive postures, expressed either in angry claims that business is taking over the liberal arts or in more apologist approaches in which broad swaths of fields such as the humanities remind us of their relevance in the workplace.
In more practical terms, concern over the skills gap has produced a myriad of undergraduate and master’s degree programs that claim to consult with various industries to produce the trained workers they need. This approach has rightly been viewed as having industry dictate college and graduate curricula, which—given the variability in needs across even a single sector, as well as the rapid rate of change—is an effort with short-term benefits at best.
A second, more severe response is to view traditional higher education as an outdated remnant of the past and to focus instead on the expanding universe of microcredentials that are primarily online and significantly lower in cost than standard degrees. These various certificates, microdegrees and other assorted digital badges are far narrower in focus, organized around specific skills or content objectives, and are designed to give a more granular view of learner competencies.
The pragmatic appeal of this approach, especially for employers, is somewhat offset by the reductionist view of each individual and potential employee as a simple collection of utilities. After all, we are seeking to hire human beings and not simply downloading software that performs a set of expected functions. For the time being, we still seek human intelligence rather than what currently passes as artificial intelligence.
So where does this leave us? Perhaps no closer to closing the gap between higher education and work today, much less addressing the very likely widening one between both in the future. But there is cause for hope. Arguably, all of the parts of the solution are on the table; what is required is an integrative effort that goes beyond simplistic Band-Aids in the form of changing who seems to be in charge of what education should achieve. Rather, it is the moment to rethink what we mean by “work” while also revisiting the ground rules for how theory and praxis should be mutually supportive.
Simply put, work must be more broadly defined with a straightforward recognition that it encompasses everything. We must put an end to the false dichotomy between what scholars and researchers do in academia and what the rest of society does for a career. Isn’t being a scholar a job? The days of the gentleman scholar of significant financial means, who could pursue research without having to think about a paycheck, are long gone.
Universities have made strides to make scholarship and research more inclusive, and the growing support for an appropriate work-life balance in academia mirrors improvements in other places of work. Once we recognize that everything we do in the name of career or vocation that carries recompense is indeed work, we can let go of the counterproductive notion that academia and every other kind of work are disconnected, if not at odds with each other.
When I went to graduate school in biology more than 20 years ago, many science departments taught in ways primarily designed to train the next generation of scientists. But over the past decade, quite a few departments, including my own, have recognized that most of our students will not go on to be scientists. And while we take special care to foster and, indeed, expand the relatively small numbers that do go on to graduate school in our fields, our courses increasingly recognize that understanding science and having the intellectual tools of the scientific method are critical for everyone, regardless of future occupation.
This change, which is sweeping through liberal arts and STEM (science, technology, engineering and mathematics) curricula, is the first step in a crucial integration of what is taught in universities with the broader swath of students and their motivations and subsequent careers. Some fret that this approach dilutes our duty as scientists to focus solely on creating more scientists. That attitude is truly shortsighted; the broadening of our charge not only lays the groundwork for connecting the creation of knowledge that happens in universities with the widest scope of what people care about but also prepares our students for a present and future in which career choices continually evolve.
We know that those starting their first job today will likely change careers at least three to five times during their working years. It is not uncommon for today’s graduate students and young scientists to move between the domains of academia and what we usually call the workforce, or to keep one foot in each. A more integrated curriculum that combines the foundational concepts and habits of mind of the liberal arts and sciences with the skill sets more traditionally thought of as preprofessional or vocational will better prepare our students for this more dynamic future.
Integration of what has traditionally been kept separate in four-year colleges, however, requires more than a better mingling of classes. It calls for a curriculum that is dynamic and continuously taught in context. This is where challenge-driven learning presents new opportunities for meaningful integration of higher education and work.
Challenge-driven education is based on specific and current problems that provide fodder for team-based learning and innovation among students. Many of today’s most pressing challenges, including those defined by international efforts such as the United Nations’ 17 sustainable development goals, require solutions that cross the boundaries of traditional disciplines. Using concrete challenges that are appropriately placed in a local context provides an opportunity for students to learn by bootstrapping the basic knowledge and skills needed to understand problems and develop creative and actionable solutions.
This scenario is most often what happens in both research and the traditional workplace. It is time for us to recognize that education can no longer fill a static toolbox that we then use for the rest of our careers; it must give us the wherewithal to constantly refill our own toolboxes with a personal agency born of genuine motivation and the lifelong learning capacities required by the work of tomorrow. This is an important moment of curricular transformation in higher education and an opportunity not only to close the gap between the college classroom and jobs but to foster an essential symbiosis that recognizes that the future of work is and always has been the future of higher education.