What Does Digital Disruption Mean for Education and Training? (Part 1)
This is the first installment in a series of posts that explore the significance of digital technology for education and training in Africa and elsewhere in the developing world. There is no question that mobile technology, the scarcity of brick-and-mortar educational institutions, a new wave of population growth, and the cost and availability of formal education at all levels are the critical factors driving change and digital disruption. The big unanswered questions are what it all means now and what it will likely mean in the medium and long term.
Where Disruption Is Headed
According to technology expert Shankar Maruwada, educators need to prepare for a coming “education revolution” that will see learners breaking from traditional models to embrace shorter spurts of education throughout their lives.
Maruwada, former head of marketing for India’s biometric identification system, now runs EkStep — an open-source education platform designed to host digital tools and infrastructure to deliver primary education to 200 million children in India. His prediction is that learners will move toward shorter courses of education and life-long learning in order to equip themselves for the “gig economy,” where more and more people are taking on freelance and short-term work, rather than long-term positions with regular wages.
In Sub-Saharan Africa, where unemployment and underemployment rates fluctuate between 25% and 35%, the job situation rarely reflects a deliberate choice about the type of employment one pursues. There it is more about which jobs are available, and more often than not about creating a path to self-employment, which means that a great many enterprising people of all ages find themselves at the heart of the bustling informal economy.
Speaking on a panel session during the two-day Global Skills & Education Forum, a Davos-style event aimed at raising the status of global education in Dubai, Maruwada said “micro is the new mega-trend” when it comes to jobs.
He believes this employment shift will necessitate a similar transition in education, asking “if jobs go micro then can learning be far behind?”
As part of this change, “learning will be a lifelong journey as opposed to the current stage of learning being a rite of passage where you learn, then you earn and then you retire. The future will be about … lifelong cycles of learning and earning,” he suggested, and the next generation “may not have the luxury of retiring.”
The shift will also put greater emphasis on applied learning, Maruwada said. The focus will no longer be on “what you learned or how you learned … but how you can apply what you’ve learned.”
As a result, students will start to move away from traditional education models, including four-year degree programs, and instead take short courses in order to learn specific skills they need at that time, which he described as “just in time expertise.”
This is the direction in which digital disruption seems to be headed. But that raises the question: where did it start?
Where Disruption Started
Education has been a highly stable activity throughout the history of human civilization. It has largely rested upon two pillars: the teacher and the text. If we are a little more imaginative, we might include a third pillar: the learning community. The best educational experiences often involve a tight and sometimes competitive group of learners spurring one another on, arguing, resolving puzzles, and trying to outdo one another. The common foundation for these three pillars has been place. There had to be a place where teachers, texts, and students gathered so that learning could occur. Alongside this tradition of education in community (which was often centered on church and faith), there has also been a rich tradition of autodidacticism, or self-directed learning.
These elements explain the growth of the early universities. They couldn’t take root just anywhere. Texts were not widely available, for one thing. Harvard University, for example, is called by that name because John Harvard donated his collection of books to be used by students. (I’m not sure we could find many seed investments that have paid off so handsomely.) The teachers gathered near the texts and the students followed.
As books became radically less expensive thanks to the growth of printing technology and societies grew in wealth and sophistication, it became possible to replicate the model of the college or university many times over. As a result, access to a traditional education increased substantially (the 20th-century U.S. demonstrates the point), but the fundamental elements were constant, and place continued to be a limiting factor: the student had to go to the place of learning. This limited access for those who could not devote themselves to education full-time. Night school opportunities helped, but access remained a problem.
For roughly as long as broadcast technology has existed, people have believed it could substantially alleviate the limitations imposed by geography. When radio reigned, some dreamed of “colleges of the air” that would broadcast lessons to learners anywhere a receiver could be appropriately tuned. The same idea applied to television. While broadcast never got far as a substitute for a college education, there was clear progress in instructional television. One might think of Bishop Fulton Sheen’s religious lessons, or of public television programs on crafts such as woodworking, landscape painting, cooking, and home renovation.
The closest television came to solving the problem of place was through the medium of videotape. Universities used videotape when they had more students than could fit into a lecture hall: instead of cramming into the hall at one time, students might opt for one of several showings of the lecture. MBA programs of the 1980s showed videotaped lectures in downtown locations so that degree seekers could use lunch hours, or postpone their commute, to gain valuable seat time. Through the use of professor-extenders, such as graduate students, these courses could accommodate much larger numbers of students while maximizing the use of recorded lectures and presentations. Still, videotape (and simulcast, also based on television) failed to make substantial inroads. Likely reasons include a lack of interactivity and the continuing development of newer technologies that could provide a more satisfying experience.
Then we entered the internet era. The University of Phoenix and other schools offered courses online even when dial-up access provided slow and inconsistent connections. Their appeal expanded considerably as broadband internet access became standard. It has been the promise of this new technology (and the speed and customizability that comes with it) that has finally begun to present a real challenge to the traditional system.