Abstract
The importance of active, “experiential” learning, and the belief that schools and colleges need to engage the world outside their walls in new ways to equip students for a rapidly changing future, is increasingly held to be fundamental to any educational program or reform effort, and in particular to the current rethinking of what (and how) students need to learn. In fact, however, this truth was recognized as early as ancient Athens. In recent years the world has been transformed, and we are now well into a new era of individual and social experience. With the unprecedented processing power of mobile devices, the seemingly endless expansion of storage capacity, and nearly unlimited access to information, we confront a vastly different landscape for education. This essay examines what all this means for the future of education.
“Education is not preparation for life; education is life itself.”
John Dewey (239)
The importance of active, “experiential” learning, and the belief that schools and colleges need to engage the world outside their walls in new ways to equip students for a rapidly changing future, is increasingly held to be fundamental to any educational program or reform effort, and in particular to the current rethinking of what (and how) students need to learn. In fact, however, this truth was recognized as early as ancient Athens, where Socrates used the dialogue as his major medium of instruction, walking around the city with his students in “peripatetic” fashion. Socrates’ great student Plato noted that “a free soul ought not to pursue any study slavishly… (for) nothing that is learnt under compulsion stays with the mind” (Plato, p. 536). The conviction that ideas required institutions other than the church – the staid and controlling institutional container for education for more than a millennium – influenced the emergence of medieval European universities in Bologna and Paris, while many “big ideas” circulated well beyond monastic walls across multiple global centers of learning, where knowledge not just from Greece and Rome but from China, India, Persia, Byzantium, and Egypt was transmitted and exchanged. Circulating globally, ideas and approaches devised for a new and changing world – concerning astronomy, astrology, medicine, philosophy, warfare, and education – often took root, or were revived, far from the usual institutions (let alone their places of origin). Without the zero invented in India, modern mathematics would never have emerged. Translations of Aristotle survived to the present day in large part because of the work of scholars such as Avicenna, a Persian polymath.
The need for educational reform has been central to the agendas of canonic philosophers for centuries. John Locke, the seventeenth-century English philosopher whose ideas of property, freedom, and the state had enormous influence on the founding fathers of the United States, wrote in “Some Thoughts Concerning Education” (Locke, 1693) that children were born as a blank slate. While he believed that the mind had inherent capacities, predispositions, and inclinations, he held that these were triggered only when the mind received ideas from experience. Locke’s educational musings were echoed a few decades later by the Swiss philosopher Jean-Jacques Rousseau in his seminal book “Emile, or On Education” (Rousseau, 1762). Rousseau believed in the original goodness of human nature, blaming experiences imposed by advanced society for the degradation of social life. A forerunner of more contemporary social constructivists, and of a variety of theorists who believed education could mold children in new (and sometimes utopian) ways through early and careful cultivation, Rousseau saw his views deemed heretical by some and his books banned and burned during his lifetime. The greatest source of resistance to both Locke’s and Rousseau’s views on education came from the church, which many felt should control education solely for the cultivation of spiritual lives.
Indeed, for most of human history, religious institutions, and later the state, have taken the lead in offering and designing education. In early India, education was reserved for the priesthood, and in particular for the sacerdotal elite, while in China education was only for the elite cadre of officials who served the state. In the West, the flowering of new ideas about education had to await the long historic break known as secularization, a process propelled not just by periodic ages of enlightenment but also by the industrial revolution. And yet, as much as enlightenment values opened educational institutions to new ideas and ambitions, those institutions were also shaped by the growing needs of industrial society. Industrialization entailed not just the massive movement of labor from agricultural fields to urban factories; it also required the training of larger and larger populations in skills that could no longer be conveyed simply through guilds and apprentice-based relationships. As technology advanced and industrialization grew in scale and significance during the nineteenth century, so too did the need for a standardized workforce to meet the demands of industry and capital. The mass production of the factory thus came to be mirrored in the mass production of workers through schooling. This involved the wholesale standardization of pedagogical practices, narrow specialization in the curriculum, and a focus on limited profession-based training for the masses. National compulsory education was introduced during this time, first by Prussia and quickly thereafter by other European nations, with the expectation and hope that it would be the foundation for industrial prosperity and military power. In Asia, Japan was the first to introduce a centralized education system as part of the post-1868 Meiji era reforms, again driven by the desire for industrial and military progress.
In the United States, the growing needs of industry were compounded by the arrival of millions of immigrants during the late nineteenth and early twentieth centuries, many fleeing poverty and famine in Ireland and Italy, or war and restricted opportunity in Prussia, Scandinavia, and many other European points of origin. In the late nineteenth century, traditional educators still saw even high school as an elite affair, since it was predominantly meant to prepare only the most advanced students for college (a relatively small number: in 1890 there were only 120,000 college students across the country, with a much smaller number actually completing college and receiving a bachelor’s degree). But it was increasingly recognized that the growing and newly diverse population needed some greater measure of education even when college was not seen as necessary. The National Education Association appointed the “Committee of Ten” in 1892 to adjudicate this and several other debates about educational policy. Chaired by Charles Eliot, the long-serving President of Harvard University, and made up of educators including the presidents of the University of Michigan, the University of Colorado, the University of Missouri, and Vassar College, and the headmaster of the Lawrenceville School, the committee made several major recommendations, among them standardizing eight years of primary school and four years of high school. First, and most importantly, while it maintained an emphasis on classical subjects, and in particular Latin and Greek, it included courses on mathematics, science, English, contemporary foreign languages, and history. Second, it recommended a common curriculum for all students, regardless of whether they intended to go to college.
The committee concluded that: “every subject which is taught at all in a secondary school should be taught in the same way and to the same extent to every pupil so long as he pursues it, no matter what the probable destination of the pupil may be, or at what point his education is to cease.”
In addition to reminding us of a time when universities played a key role in setting standards for K-12 education, it is noteworthy, and in theory admirable, that the American system differentiated itself early on from Europe by formally eschewing any kind of tracking, predicated on the conviction that all students deserve an equal education. Unfortunately, these blanket assurances for many years excluded, or at the very least marginalized, minority groups as well as women. Segregation and inferior schools were a glaring part of the educational mix until Brown v. Board of Education in 1954 reversed the discriminatory conceit of “separate but equal.” Looking back, however, even the relative progressivism of the Committee of Ten meant entering the era of mass education with overriding importance granted not so much to learning as to standards of efficiency and the demands of scale – mandating a one-size-fits-all approach to education in the US that persisted through World War II. And despite the formal emphasis on equality, this led not just to mass standardization but also to efforts to differentiate students through new criteria of selection and merit. Hence the simultaneous push to measure achievement and aptitude “objectively” through mass testing, first through the creation of intelligence tests, and then through reliance on a range of purportedly neutral exams on aptitude for specific subjects. Lewis Terman, a psychologist at Stanford and a noted early twentieth-century eugenicist, adapted and popularized the idea of IQ in the United States, and launched a national movement to use tests to measure the aptitudes of American school children, exemplified by the “National Intelligence Test” for grades 3 to 8 as well as the Stanford Achievement Tests for individual skills and disciplines such as reading comprehension, mathematics, and science.
The movement for testing was soon adopted by college admission officers in New England colleges, some of whom believed that tests like the SAT could provide the basis for an American meritocracy. Unfortunately, however, “old world” beliefs about natural social distinctions and predispositions were smuggled into American egalitarian ideals through the strong correlation between performance on exams and socio-economic background.
In the wake of World War II the American university system expanded exponentially, first with the influx of veterans funded by the GI Bill, and then with the growing expectation that a college degree was a necessity rather than a luxury in postwar America. But this growing sense of educational opportunity came with a renewed commitment to tracking students as early as middle school, and to pooling public resources for college preparation on the basis of objective testing and putatively meritocratic methods. Tracking was further legitimated by the idea that some children were naturally “gifted.” And tracking was the natural outcome of the growing acceptance that an industrial economy such as that of the United States depended on the principles of Taylorism and Fordism: in the first instance a technique of labor discipline and workplace organization based on conceptions of efficiency and incentives, and in the second the reorganization of industrial production around the moving assembly line and the mass market. These guiding principles continued to hold sway not just through the end of the twentieth century but into the present century as well. They can be seen in policy initiatives such as the 2001 “No Child Left Behind” Act in the US, as well as in the National Literacy Strategy and the OFSTED school inspectorate in the UK. And of course the pattern of mass standardized testing had already taken root in systems as various as China’s, with the centrality of the Gaokao, and India’s, with its different school-leaving and college-entry exams.
But even as American educational practice during the first half of the twentieth century was shaped and then set on this “industrial” path, new voices emerged that echoed earlier educational theories in favor of focusing more on individual talents and interests, animated by the belief that education should be organized around how young people actually learn in the world rather than solely around what society might feel they need to study in school. Perhaps the most important educational thinker along these lines was the philosopher John Dewey, one of the founding figures of American pragmatism. Beginning his working life as a teacher (he went to graduate school when he realized he didn’t have the aptitude to teach secondary school), he thought about education throughout his career, publishing his most influential book on the subject after his retirement. In his classic “Experience and Education” (1938), he made a full and robust case for the role of experience as the foundation of learning. But he was clear that this did not mean that any random experience on its own constituted learning: for experience to be truly educational, it had to be part of a cycle of inquiry, research, interrogation, and argument; in short, “experience” had to be “structured experience.” Thus, he explained, experience was critical for learning, but it required both intentional planning and rigorous processing for students to take full advantage of what was in effect a sequence of curated experiences.
Although Dewey was part of a distinctive American tradition of thinking, he spent two years in China between 1919 and 1921, during which time he gave more than 200 lectures, not just in Beijing, where he was invited to teach by his former student Hu Shih, later President of Peking University, but also in cities such as Nanjing, Shanghai, Suzhou, and Hangzhou. Hu Shih was deeply influenced by Dewey’s pragmatism, which informed his own pioneering work to promote the use of vernacular, and streamlined, Chinese in place of classical forms and scripts. And although Dewey’s educational theories were not widely adopted at the time, either in China or the U.S., they – like the ideas of Confucius, Locke, Rousseau, and other great thinkers before him – have had their greatest influence after his death and are today seen as more relevant than ever. Whether in the work of educational psychologists who focus on human development, of learning scientists who draw on new methods in neuroscience to help us understand the physiology of attention and cognition, or of educational critics and disruptors who focus on the skills and aptitudes necessary for success in the world rather than on traditional forms of knowledge as narrowly conceived in schools and colleges, Dewey’s emphasis on the importance of experience has been repeatedly invoked.
But the world has been transformed during this same period, and we are now well into a new era of individual and social experience. What began as the information age in the second half of the twentieth century with the development of the computer has now spawned what Klaus Schwab of the World Economic Forum has called the “Fourth Industrial Revolution”. To recap his (and others’) argument: the first industrial revolution used water and steam power to mechanize production; the second created mass production through the harnessing of electric power; and the third used electronics and information technology to automate production. The fourth industrial revolution is an intensification and transformation of the third, predicated on a fusion of technologies that is blurring the lines between the physical, the digital, and the biological spheres. While still based on machine computation, the kinds of changes brought first by Moore’s law and the exponential expansion of computing power, and second by the interconnectivity of the internet, new kinds of sensors, and the advance of machine intelligence, robotics, and bio-technology, characterize what Schwab calls “the transformation of entire systems of production, management, and governance” (Schwab, 2016).
With the unprecedented processing power of mobile devices, the seemingly endless expansion of storage capacity, and nearly unlimited access to information, we confront a vastly different landscape for education. We also confront a vastly different landscape for the economy, because so many of the skills that used to translate into gainful employment will disappear with the digital transformation of production regimes. When contemplating a future made up of autonomous vehicles, advanced robotics, the internet of things, and the 3-D printing of new materials and even biologically derived organs and body parts, some believe we are headed toward a utopian future in which our lives will be made ever more convenient, healthy, and prosperous. There is little doubt that technological innovation will continue to lead to long-term gains in efficiency and productivity, and to accelerate medical and other forms of science. And yet early returns strongly suggest that while the digital economy makes many things more available and less expensive, the combination of the net displacement of workers by machines and the growing concentration of wealth in the hands of the providers of intellectual and physical capital for technological innovation will only intensify growing inequality, along with the stagnation of wages and the diminution of employment opportunities for the providers of labor, especially those without high levels of skill relevant to the new digital economy.
Unlike earlier industrial revolutions, which commenced in particular regions of the world and were distributed extremely unevenly in regional terms, the fourth industrial revolution is profoundly global in its immediate spread as well as in its impact. China, which ended the last century as the world’s largest producer of new digital technology, whether laptops or iPhones, is now the world’s fastest adopter of new digital technology and will soon be the world’s largest technology consumer. China is also poised to overtake the West in the development of some features of artificial intelligence and machine learning in the next two or three decades, given its extraordinary entrepreneurial energy and the huge datasets at its disposal. India is projected to outpace China in population in 2027, and in all probability in cell phone use as well – over the last five years, India’s smartphone market has grown by more than 200%, from 47 million units sold in 2013 to a staggering 142 million units sold in 2018. And Africa, with its rapidly growing population and high birthrates, will play an ever more prominent role in technological consumption (and, soon, in production as well). In short, the fourth industrial revolution will be global not just in its scope but also in the forces that will be central to the organization of production and consumption.
What does all this mean for education? First, knowledge – by which I mean the capacity not just to access information but to use it in ways that address, contribute to, and for that matter take advantage of, some of the core features of the new digital economy of production – will be more central and important than ever. As low-skill jobs either disappear or become increasingly concentrated in service industries and caregiving occupations, the gap between high-skill/high-pay and low-skill/low-pay segments will likely grow. This alone creates growing economic and social value for the knowledge-based skills that only education can provide. But there are many other reasons why education will only grow in importance, ranging from the cultural and intellectual capital that is still highly correlated with success in private and public life to the need for the kind of expertise required to comprehend the massive technological changes taking place. These changes in turn produce major challenges. Perhaps most salient is the growing recognition (from all positions on the political spectrum) that the economic inequality and accompanying social tensions an advanced knowledge economy will produce, combined with climate change, increased global migration, public health crises, and geo-political tensions and conflict, will only increase the complexity of managing (and surviving) these technological, ecological, political, and economic transformations.
Predictions about the future are, as Yogi Berra quipped, characteristically more about the present (and for that matter the past) than about our capacity to know for certain what will happen in years still to come. But a few thoughts anyway. First, while wealth will be distributed more globally than was the case in the first three industrial revolutions, it is likely at the same time to become even more concentrated in the hands of global elites with access either to capital or to knowledge, as well as to global social and cultural influence. While capital will not be widely distributed without major social and political interventions, knowledge will be, even as the roles that relate to knowledge creation and its deployment not only expand but become ever more critical for social mobility and social survival. Despite the steadily growing importance of technological knowledge, however, the kind of knowledge that will be instrumental in its deployment, both as an economic good and as a force for political and social action, will increasingly be not merely technical but also analytic, synoptic, and eclectic. On the one hand, it is already becoming clear that even fundamental technical skills such as coding will in many cases themselves be automated, meaning that technical knowledge alone will hardly be sufficient to ensure the success, let alone the happiness, of an educated person. On the other hand, neither the big ideas that will drive the economy nor the monumental challenges that vex our political, social, and economic life will be only technical in scope or application.
There is good reason that colleges and universities globally are overwhelmed by student demand for courses and programs in computer science, coding, and an array of advanced digital skills. And there is no doubt that the recent push for STEM, not just in North America and Europe but in particular in parts of the world such as China and India, reflects the recognition that skills directly connected to the digital economy have immediate bearing on employability and on the capacity to control important elements of the emergent technological landscape of the fourth industrial revolution. And yet a growing number of critics from science and technology backgrounds are joining forces with humanists (like me) in stressing that serious training in STEM must also be conjoined with appreciation of and genuine exposure to the arts and design, along with deep immersion in multiple languages, a full understanding of the power of humanistic study, and comprehensive engagement with social, political, cultural, and economic modes of understanding and analysis – which includes serious global understanding and cultural competence.
All this is not simply because of an old-fashioned commitment to the disciplinary assumptions of bodies like the Committee of Ten in the late nineteenth century. Nor is it merely because the aesthetic is a pleasing luxury, or because the human value of encountering the great if outmoded works, follies, and legacies of diverse cultures and world civilizations might enhance an abstract appreciation for the past. It is not even just in the service of the pursuit of human fulfillment. The more complex our technology, the more we will rely both on collaboration across teams – and thus the capacity to communicate and collaborate broadly – and on insight and vision that transcend the immediate instrumental ends of any particular project or task. The more vexing our global challenges – whether in the sphere of climate change or of continuing (and for that matter growing) inequality – the more we will rely on a combination of digital understanding, creative imagination, and ethical engagement. If the future is not to appear increasingly dystopian – as so many in our younger generations believe it will – we will also need to maintain an educational commitment to a robust humanism alongside the advances of science and technology.
What all this means for our educational institutions is still unclear; confident predictions about the imminent collapse of traditional educational institutions and practices have so far been belied by actual trends and changes, in large part because knowledge acquisition is still difficult to separate from the social, cultural, and psychological support system that schools and colleges provide, not to mention traditional beliefs about brand and reputation. And most of the voices urging or predicting disruption in education have both underestimated the enduring value of some traditional educational values and goals and taken too narrow a view of what knowledge is and how it is actually acquired by individual students (even as the idea that traditional schools can simply be replaced by non-school experience remains naïve). Still, change of a major kind is probably coming sooner than most traditionalists would like to think. And if that is true, the present is not a particularly good guide for thinking about the future.
Whatever the future holds, I believe we must acknowledge that the real challenges we confront, both for ourselves and for our societies, require multiple skills and competencies: among them, the cultivation of aesthetic and artistic sensibilities, the development of both secure roots in local communities and cosmopolitan, cross-cultural (and genuinely global) attitudes and understandings, and the inculcation of moral and ethical responsibility. Cognizant that these goals are enormously ambitious, and that they must sit astride mastery of ever more complex fields of knowledge ranging from the applied and theoretical sciences (social, biological, and physical) to the specific skills relevant to digital, computational, and data-driven technology, we must ask: how might we encode all this into a pragmatic program for student learning and competency in the educational futures we are called upon to design (and re-design) going forward?
As we turn our attention to this task, I would encourage us to remember the long intellectual history – from Plato to Dewey – that has emphasized the importance of experiential modes of learning (and of experience as the basis for learning itself). But we must also remember that Dewey always insisted that for these experiences to become constitutive of individual mastery and progress they must be both carefully structured and calibrated to individual learning styles and aptitudes. While schools must open their doors in a much more systematic way to the outside world, the worldly experiences that can be fundamental to improved student competencies and outcomes must also be structured with the same kind of thought that has been applied to school and college curricula and “knowledge progressions” across the history of our existing educational institutions. Even as our social experiences are fundamentally changing in the landscape of the fourth industrial revolution, our modes of engagement with this world must continue to be predicated on a range of competencies and a set of values that go well beyond the narrowly defined needs of the present.
Dewey worried that the rise of progressive thinking about education would lead to a narrowly instrumental focus on student competencies, treating knowledge primarily as a set of methods. On the one hand, he believed that traditional education “employed as the subject matter for study facts and ideas so bound up with the past as to give little help in dealing with the issues of the present and future” (Dewey, 1938, p. 23). On the other hand, he worried that addressing this could in turn lead to the “problem of discovering the connection which actually exists within experience between the achievements of the past and the issues of the present. We may reject knowledge of the past as the end of education and thereby only emphasize its importance as a means. When we do that we have a problem that is new in the story of education. How shall the young become acquainted with the past in such a way that the acquaintance is a potent agent in appreciation of the living present” (Dewey cited in De Nicholas, 1989, p. 320)? History may always be in some sense a history of the present (or, rather, a history in which interests, interpretations, and emphases are always rooted in present contexts), but its relevance and its scope should never be limited to or by an instrumental reading of the present. Dewey advocated as he practiced, stressing rigor and systematic thought even as he understood the importance of engagement with the world. In doing so, he championed a commitment to education as fundamental to life itself, in all its guises, as the ground of being in the world. As he wrote, he believed, “finally, that education must be conceived as a continuing reconstruction of experience; that the process and the goal of education are one and the same thing” (Dewey, 1897, p. 77).
Footnotes
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
