When I was in middle school my grandfather bought a computer. It was state of the art and ran Windows 95. As the computer-savvy grandchild, I became the designated support guy. As I explained the system preferences panel, my granddad would take detailed notes in his doctor’s scrawl, which I imagined was legible only to pharmacists used to deciphering prescriptions. We practiced conjuring up menus and files that he managed to accidentally hide; he dutifully took notes at every repetition. When I handed him the mouse for a practice run, he carefully aligned the cursor over the box he was supposed to click, only to veer off target ever so slightly, every time, as the tremor in his hand pressed his index finger on the mouse button. This was an obvious source of frustration for both of us, but he persevered. It wasn’t only an aging physique that impeded a smooth user experience of the Windows 95 system preferences; its very logic was alien to my granddad — despite his detailed notes, menus kept disappearing, and only after long support calls (or the next visit) could we coax them back into sight. My granddad was by no means technologically illiterate. As a radiologist he dealt with complex, state-of-the-art machinery his entire professional life. Nor was logic alien to him; he attended university lectures for senior citizens and rejoiced in debating Kant, evolution, and other matters beyond the comprehension of my narrow, adolescent intellectual horizon. Yet I was the whizz kid at home on computers, sporting a rapport with Windows 95 that he could never approximate.
The crackling sound of the dial-up modem, JPEG images loading line by line, and the constraints of the 1.4-megabyte floppy disk were defining experiences of my induction into computers; they are now museum-worthy sentimentalities. Today’s youngsters send snaps and like selfies on Instagram; they have no memories of that first PC arriving in the household or of the primitive homepages that made up web 1.0. They are growing up surrounded by information and communication technology that has become relatively inexpensive, speedy, and ubiquitous. This fact led to the proclamation of a new generation of ‘digital natives’ who, by virtue of having been born into this technological environment – rather than having migrated into it from the past – supposedly have innate skills and knowledge of information technologies, as well as distinct learning styles as a result of this exposure. The same claim is made for healthcare. The website iQ by Intel writes under the headline “Digital Natives Push for Personalized Healthcare Technology”:
Caring for a rapidly aging population is challenging. Experts working to revitalize healthcare for the 21st century are tackling this challenge by shifting from a one-size-fits-all to a more personalized healthcare approach, one that is heavily influenced by how young people use technology.
To combat skyrocketing healthcare costs for an American population of 326 million people spanning six generations, experts are turning to bioscience and new technologies as well as to young, tech-savvy digital natives who are already nudging healthcare into the Internet age.
“We’re already seeing that millennials and younger generations won’t be the same kinds of patients as their parents,” said Eric Dishman, an Intel Fellow and general manager of Intel’s Health and Life Sciences.
“These 18-to-34-year-olds already expect to have data and tools to help them manage their health just like they do for everything else in their lives.”
There is no doubt that the idea of digital natives has enjoyed widespread popularity, but is it actually useful for anticipating plausible futures of ICTs in health and education?
For starters, it is far from clear whether there is indeed a generational gap in how people engage with ICTs. Empirical research in both education and health communication finds the assumption of a generational gap vastly overstated: using information and communication technologies effectively depends on many different social factors, of which age is but one. danah boyd, for instance, finds that teenagers are addicted not so much to their devices as to their groups of friends.
Beyond the empirically dubious claim of a cohort of digital natives, the very idea is flawed — the Snapchat and Instagram generation was born into a material world ruled by the laws of physics and social institutions just as much as my grandfather was; why should these forces suddenly have less traction on today’s teenagers? It is simply a fallacy of the technological imagination that technology can somehow magically operate outside the social realm, whether in romantic ideas about the internet being a magical vector of democratization (a very popular idea when I was a political science undergraduate in the early 2000s) or in fancies of breaking the gridlock on climate change through schemes of geo-engineering. Technologies certainly change societies, but from within rather than from the outside. Social media may have helped to mobilize protesters, but it didn’t make the Arab Spring; resources and social organization, not Twitter, proved to be the sustaining forces that kept people in the streets, just as during the civil rights movement. As Kentaro Toyama puts it, technologies act as amplifiers of social processes. The effects are not just incremental; after all, the internet didn’t just make communication easier and expand the reach of agents of all shades: it stopped information from fading into entropy by archiving everything. And that is bound to change our lives profoundly.
Arguably, the implementation of new ICTs in education and healthcare affects how service providers and recipients interact, not to mention the kinds of services that are provided. Since these are public services in many countries, funded by taxes, the obvious question is whether ICTs deliver real improvements to service quality and coverage. There is likely no easy answer. Healthcare is one of the most complex policy and technology arenas, and what counts as a positive outcome depends as much on the specific circumstances of a particular case as on the values that underwrite our evaluative criteria. The notion of digital natives may fit the promises of some of the world’s most valuable companies, which have huge stakes in the education and health markets of tomorrow. It is moot, though, as a device for imagining societal transformation and technology governance.
Just as my grandfather once struggled to keep the cursor on target, I never made friends with typing on a touch screen. Instead of forming a barely perceptible membrane between the material and virtual worlds, its slick, featureless surface leaves my fingertips somewhat bereft of orientation; my text messages remain short to the point of coming across as rude. There is no birthright to technological mastery; tomorrow’s technology will be new to every generation alive.