Tuesday, June 1, 2010

The Importance of the Term "Digital Native" and Ethical Ethnography


The recent Frontline documentary Digital Nation brings back the term "Digital Native" to describe technology users who were born into a world in which personal digital technologies already existed and were within their reach. Marc Prensky coined the term, and his 2001 article "Digital Natives, Digital Immigrants" introduced it to academic parlance.

Now that academic disciplines and approaches in the digital humanities increasingly overlap, the importance of the term Digital Native bears reemphasizing. One intervention the term can make is to bring ethical ethnography more clearly to bear on analyses of participants in the digital age. For example, scholars and researchers have long avoided disdainfully turning up their noses at unfamiliar cultural practices simply because of their foreignness. In those cases, the researcher is often an "outsider" and the subjects are often "natives" of some sort. Further, when the researcher presents her/his research to other scholars, the same rules of behavior apply: the researcher's peers and colleagues likewise do not turn their noses up at something just because it is foreign. This has not always been the case, but it has been ever since academics who employ anthropological, ethnographic, and folkloric approaches began conscientiously grappling with the colonial and cultural imperialist undertones of their disciplines.

You can probably see where I'm going with this. Unlike in the case of studying unfamiliar cultures, when engaging with emerging digital practices, scholars in the humanities very often proudly display disgust at new developments, regardless of whether those developments are morally or ethically objectionable or are merely alien or unpopular with Luddites. Here is where the term digital native bears remembering. Most academics would never want to be associated with the ex-pat or tourist who smugly and ignorantly sits in judgment of natives without being able to speak their language, while ironically expressing disdain at the natives' accented or broken English.

Still, the situation is not as dire as that of the tourist I describe above. In many ways, most scholars remain cultural insiders to the digital cultures they critique, whether or not they use those technologies, because they share nationality, economic status, and the like with digital natives. However, I paint that stark image to give pause to anyone too ready to occupy that comfortably disdainful position toward technology out of habit, without a second thought.

I am also not trying to shut down reflective critiques of technology. Here is a good example, from the University of Richmond Writing Center’s blog, of an approach that is neither blindly celebratory of technology nor automatically wary of it.

Finally, I am not unsympathetic to academics who are suspicious of technology. I too was skeptical when I read the overly celebratory takes many writers offered in response to a study demonstrating that using Google engages more brain cells than reading a bound book. The study's findings made sense: Google activates both the reading and the decision-making parts of your brain, while reading a book does not usually engage decision-making processes. However, I do not think that finding should have any bearing on whether one activity is better for your brain than the other. I was glad when Gary Small, the doctor who performed the study, was interviewed in Digital Nation by media scholar Douglas Rushkoff and made clear that no such conclusions could be drawn from it (image of Small and Rushkoff above).