
Researched Argument

Exploring the perils of losing ourselves in modern media 

through the lens of Brave New World.

 

 

 

Don't Talk to Me, I'm Texting

Introduction

            Have you read Aldous Huxley's Brave New World? In this dystopian novel, we encounter a world where everything is turned into entertainment: politics, news, education, religion, business, even sex. Happiness is the only goal of this world's leaders, and they achieve it by "inflicting pleasure." Maybe that sounds utopian. "Inflict a little pleasure on me!" we cry. And we are pacified with four-second memes, three-minute news stories, and two-thumbed communication. Neil Postman, a prolific writer in the fields of language and media theory, postulated, "America is engaged in the world’s most ambitious experiment to accommodate itself to the technological distractions made possible by the electric plug" (449). Amidst the dystopian celebrations of 1984, this observation was somewhat alarming. In 2016, we are wireless and not concerned in the least. Distraction from reality is in the palm of our hand. The appealing chime of the cell phone grabs our attention, and we heed its call without a second thought for our present surroundings. Listening, talking, multitasking, even driving doesn't stop us from checking, many times a minute, what the chime has to say. We are controlled by our tech, and we don't mind at all. Is computer technology undermining our education and our families? Is this the "soma" that is leading us into a brave new world? Hard-core techies and casual users alike argue that today's computer technology is incredibly useful, amazing, and fun. While conceding that these descriptors are accurate, I maintain that computer technology is undermining education and families by being an ever-present distraction, displacing spontaneous critical thinking, and supplanting real-life experience with an alternate pseudo-reality.

Background

            Today's stunning advances in technology would be considered miraculous if they weren't so commonplace. Nearly every day another groundbreaking invention arrives: smarter phones, smaller laptops, self-driving cars, 3-D printers, not to mention the horde of movies and games that are computer generated or enhanced. Software, hardware, applications, updates, add-ons... the list is never-ending, and neither are the tweets. CQ staff writer Marcia Clemmitt reports that in 2008 Twitter users posted 100 million tweets every three months, and by 2013 the Library of Congress had archived 170 billion. Clemmitt also notes that Facebook is not a minute behind; it has boasted well over 500 million users since opening to the public in 2006 ("Social Media" 91). Social networking isn't the only niche to see explosive growth in the past ten years. Patrick Marshall, technology columnist for "The Seattle Times" and "Government Computer News," reports that applications for architecture, psychology, medicine, the military and entertainment are in high production; the latter two markets in particular have driven the development of virtual-reality technologies (Marshall 201). Alicia Ault, contributing writer for "Smithsonian" and "Medscape" magazines, reports that "gamification," the process of turning activities and tasks into games, will have a strong presence in education, health care and many workplaces by 2020 (147). In 1971, "Oregon Trail," one of the first educational computer games, was introduced by teachers for classroom use. From 1967 to 1981, computer games sauntered their way into homes via the Brown Box, Odyssey, Atari, Pac-Man, and Donkey Kong. Yet by 1982, U.S. Surgeon General C. Everett Koop warned that children can become addicted to video games (Ault 155). But the party is not over. Gaming addiction isn't the only challenge for tech as the 21st century advances.

            From 1983 to 1993, business owners and a U.S. panel reported that the national intellect was at risk due to a "rising tide of mediocrity" among American graduates. By 2011, Florida, Idaho and New York had instituted virtual and online schools as an educational option (Clemmitt, "Digital Education" 1011). Yet in a 2015 article titled "Teaching Critical Thinking," Marcia Clemmitt reports that students lack higher-order thinking skills, and 81% of employers affirm that critical thinking and analytical reasoning are the top qualities they look for when hiring new employees (323). The distraction of computer games concerned the American Psychiatric Association, which called for further study of gaming addiction in 2013 (Ault 155). But the development of games is undaunted: from 1991 to 2012, arcade games using Virtuality Group's 3-D goggles morphed into augmented-reality glasses produced by Google. In 2016, we will find virtual-reality headsets by Microsoft, Sony, HTC and Oculus under the Christmas tree (Marshall 203).

Lines of Argument

            Technology has sparked the development of incredibly useful applications due to the work of brilliant developers and the insatiable appetite of tech-loving consumers. Health care, education, business communications, product design, and industry training markets have adopted technological advances to improve the lives of employees and consumers. Simulations have been developed to train students and employees in public speaking, negate the fear of flying, help victims cope with trauma from violence, and gather people in real time, face to face, for business meetings. These innovations have provided a boost to the entertainment and business sectors of the economy (Marshall 195-8). The University of Washington's Human Interface Technology Lab is developing virtual-reality games powerful enough to "cancel out" the pain burn victims feel (Marshall 210). Technology is useful in administering compassion, yet this is where enthusiasm for the miraculous should turn to serious concern.

            Mental manipulation by technology is a dangerous practice to embrace. We should ask: What else can manipulative technology "cancel out"? Who develops the software programs that educate our children, and what are their qualifications and motives? How does this affect a child's developing mind? The reality is that experts simply do not know. Scientists from the Stanford Center on Longevity and the Max Planck Institute for Human Development in Berlin assert that "there is little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life." Pediatricians, too, are concerned that very young children could be acutely affected by video games at this critical developmental stage (Ault 150). Because games are ubiquitously available, psychologists are seeing a surge in Internet gaming disorder. For decades, Douglas Gentile, a psychologist at Iowa State University, has worked on defining why gaming has such power. Fittingly, he uses the ABCs: "The A is autonomy, we like to feel we're in control. B is belonging, we like to feel connected to other people. And the C is competence, we like to feel that we're good at what we do" (Bresnahan and Worley). These are basic human needs that should be filled by basic human relationships, not devices.

            Hand-held devices offer amazing autonomy and flexibility. When experiencing an automotive malfunction in the middle of nowhere, all one needs to do is ask Siri or Google, "Why won't my car start?" Thousands of responses will instantly be available and the solution at hand. Of course, it takes knowledge of mechanics to know just what question to ask, and a handiness with tools to accomplish the repair, and that is exactly why it is vital to have a well-rounded, experience-rich education. Apprenticeships offer hands-on experience for the student and opportunities for critical thinking related directly to applications in life and in employment. In an article by Susan Ladika, freelance writer for "HR Magazine," "Workforce" and "The Wall Street Journal-Europe," Jamaine King, an apprentice for BlueCross BlueShield, confirms that he received technical training as well as professional and communication skills (855). Yet among those coming from the digital generation, employers find new apprentices deficient in basic math skills, the ability to communicate effectively, and the self-discipline needed to perform well on the job (Ladika 853). Understandably, this has alarmed employers, who now carry the burden of teaching students as well as grooming employees. Pooja Anand, the head of strategic projects and talent acquisition at Siemens USA, concurs: “We’re running out of talent, so something will have to happen” (Ladika 858).

            Clearly, both critical thinking and digital skills are necessary to navigate the 21st century. The recently adopted model of teaching to the test doesn't encourage critical thinking (Clemmitt, "Teaching" 317). To resist technology's absorbing pull and yet employ it where it excels in education will require the very cognitive skills that are under attack. Digital technology research professor Steve Higgins of Durham University asks:

Do we need a curriculum with less specified knowledge, allowing a greater emphasis on skills, based on the argument that information (and therefore knowledge) is more readily accessible? Or do we need more knowledge, as the basis for developing greater expertise and the ability to make informed and complex judgements, based on a deeper understanding of a topic or field? (571)

Learning to both distinguish the quality and manage the quantity of the mass of 21st century information must be incorporated into educational models (Higgins 571). Beginning with a strong love of learning grounded in the home, supported with proper education, and facilitated by incredible advances in technology, balance can be achieved. But to get there, we must first conquer the addiction to distraction.

            Phones that enable calls, reminders, video, texts and tweets; games that demand constant interaction, upgrades and alerts; Facebook, Instagram, Pinterest; and the constant anxiety to recharge our devices: these are more prevalent in our lives than any other activity. Parents and employers find this unacceptable. “Children now spend more time with digital media than with any other single influence,” reported the American Academy of Pediatrics in October 2015 (Ault 149). The influence of social media on the family is hard to measure; however, analysts agree that it is replacing face-to-face conversation. Among the digital generation, conversing in person ranks last, behind texting, Facebook, instant messaging and phone calls (Clemmitt, "Social Media" 83). Larry Rosen, a professor of psychology at California State University, Dominguez Hills, finds that kids in the digital age particularly dislike talking to adults. Non-verbal cues like tone, posture, intonation, pauses, hand gestures and facial expressions are essential to understanding others, and without real-life practice, Rosen asserts, miscommunication is inevitable (Clemmitt, "Social Media" 84). Technological distractions erode the integrity of the personal and family relationships that are the foundation of society. A major problem with mobile device use is distraction: “You’re constantly interrupted, and you’re self-interrupting,” with an obsessive need to check the phone. If the device is taken away, anxiety builds within only five minutes, distracting from the present moment. Family dinners and conversations become ineffective because attention is spent worrying about what messages are being missed (Clemmitt, "Social Media" 87). But who can resist that personalized ringtone?

            The 21st century is truly an amazing time to live. Technology has been developed to supply all of our wants, needs, whims, wishes and desires. Do you want to work from home? There is hardware for that. Do you need to learn multivariable calculus? There is software for that. Do you feel like scuba diving in the Bahamas? There is virtual reality for that. Our lives are now immersed in digitized stimuli. Digital advances have begun an overhaul of education by producing games that offer learning platforms designed to engage students in strategic thinking (Clemmitt, "Teaching" 326). Professor Higgins, whose focus is digital technologies, admits that both critical thinking and digital skills are needed to navigate the 21st century, but they aren't enough for success. Higgins concludes that education must include "productive thinking which helps the individual to surmount challenges and find solutions to problems" (571). Consider how critical hand-eye coordination is for surgeons. The Lycra DataGlove, developed at MIT, employs optic fibers to track the minute movements of a student practicing surgical techniques (Marshall 207). This accelerates the aspiring medical student's opportunities for learning and ensures the quality of education.

            The prospect of virtual and augmented reality propelling education into the future is exciting for both developers and students, because this technology has the power to revolutionize teaching methods and promises more engaged students. Games designed for education lead students to think like scientists, discover history by recreating historical scenarios, engage in teamwork, and even practice math equations until they calculate them correctly. Can you imagine a 12-year-old practicing brain surgery? Well, you won't have to imagine it. Soon Reactive Grip technology, currently in the prototype stage at the University of Utah, will enable any object, a virtual scalpel or bipolar forceps for example, to be manipulated during virtual surgery, thus "combin[ing] the experiences of real-life and digital life" (Marshall 204-205). Robert Kiener, educated at both Hong Kong and Cambridge Universities, predicts in his article "Future of Public Universities" that MOOCs (massive open online courses) are revolutionizing educational models by providing free online access to thousands of instructional videos or by partnering with universities to offer low-cost access to hundreds of courses (58). Khan Academy specializes in K-12 mathematics, history, medicine and computer science; it offers more than 3,600 free online video tutorials, is used by public, private and home schools (Kiener 67), and it's fun. Other MOOCs, like Hillsdale College's free online courses on history, the U.S. Constitution and economics, appeal to adults. Modern technology has created opportunities for learning that benefit society. More people than ever, including those who would not otherwise have the opportunity because of time, location or money, can view lectures by world-renowned professors at prestigious universities. Computer software programs can also recognize speech, correct pronunciation, and track the progress of the student (Clemmitt, "Digital Education" 1003).
            Educational video game programs are developed to accommodate the exact developmental level of the student. This personalized teaching method excels over the classroom model of teaching to the middle of ability, which pressures the slower students and ignores the advanced ones (Clemmitt, "Digital Education" 1003). Games like Minecraft allow students to jump in and build their niche, since the game has no rules; even the code can be modified by the computer savvy to further creative opportunities. Many see this as developing needed skills, while others are not convinced those skills transfer to real-life applications. Joel Levin, co-founder of "MinecraftEdu," admits it is difficult to measure what a student is learning from Minecraft, especially when it comes time to take standardized tests (Ault 157).

            Education is suffering because of the distractions tolerated in a culture that bows to every technological advance. When a love of learning is not the goal and educators instead "teach to the test," there is a marked increase in cheating. Sarah Glazer, a graduate of the University of Chicago, explains that disengaged students use their cell phones to cheat and claim they do so because integrity doesn't matter when the subject matter is boring (3-4). Copy-pasting, retweeting and remixing all contribute to a misunderstanding of what plagiarism really is in the mind of someone immersed in today's digital world. Addressing the confusion, Urs Gasser, executive director of Harvard’s Berkman Center for Internet & Society, explains: “Sharing is in the DNA of the Internet...It’s no longer so clear — not only for youth but, honestly, also for adults — what is plagiarism” (Glazer 5). Real life demands that we fail as well as succeed. Learning to fail gracefully and try again begins in families: riding a bike, baking a cake, losing a board game. Real-life, face-to-face experiences cannot be replaced by watching someone else have them on YouTube or by "experiencing" them through a virtual-reality headset.

What They Ignore

            Proponents of everything tech fail to consider that while tech offers the same tools to everyone, not everyone interacts with them in the same way. Online access via Skype, Zoom, FaceTime, YouTube or virtual-reality applications is an avenue for high-quality education, yet this model doesn't work for students who thrive on personal interaction and external energy for motivation. The pseudo-realities that games and video worlds create are an immersive escape from the responsibilities of our own reality. In games and VR, relationships are built with algorithms and synthetic personas; in social media, with people with whom we never rub shoulders. The quality of family time is strained with defiant attitudes as members grouse about turning off the digital world. Reality demands a measure of unselfishness, which includes chores, talking about subjects important to others and, sometimes, even boredom. Building relationships is more successful when it is not in direct competition with the adrenaline rush of the digital world and the call of online "friends." There is no substitute for real-life experience, even if virtual-reality software is completely convincing. Even digitally enhanced schoolwork cheats reality with auto-correct, auto-fill, Google Glass, Google Earth, Google Play, Google everything. Google and a plethora of other digital applications think for us. Social media tells us what to think in seven-second sound bites and memes. Deep interest in, and capacity for, contemplation and serious study of worthy subjects is quickly losing ground. Instead, information must be entertained to us quickly and without effort. Who decides how and what children learn? Parents, or programmers? If parents abdicate the responsibility for education, then it is whoever pays for the development of the software. Is it Soros or Trump? Either way, we are manipulated; controlled. But we have been persuaded subtly, by degrees.
We like the convenience; we embrace the tech and the entertainment, and we accept their programming bias when we tap "Accept." As parents and teachers, we are placing too much time and trust in unproven and dubious sources of education, entertainment, and even potential mentors and heroes for our children. The alternate pseudo-realities we offer our children should be scrutinized for value, not simply accepted because everyone else is doing it. Although there are great uses for social media in connecting people, the accompanying isolation and loss of speaking, negotiating and counseling skills are drastic and damaging for socialization. Communicating by text and tweet is not the same as speaking, does not develop writing skills, and does not transfer to personal, business or real relationships. Alas, the goal of youth in using social media or gaming is not to increase their exposure to people of other cultures or backgrounds or to improve their communication skills. Youth merely get absorbed in the computerized world of their choice and do not develop real interpersonal relationship skills.

Conclusion

            Do not allow families to be undermined by technology. Distractions from our highest priorities always have abounded and always will, yet if we are conscious of them, we can sift the destructive from the worthwhile. If not, we will live in an alternate pseudo-reality designed and controlled by those who develop the software and applications of our technology. Obsession with tech has the power to destroy the intimacy of families and the enthusiasm for experiential learning. "As Huxley saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think" (Postman 448). Critical thinking is fostered, and creativity inspired, by individualized education. In fact, the miraculous advances in technology we debate have come from the creative minds of students whose education was clearly effective. Utilizing the best of advancing tech and real-life-inspired experiential learning, the future of education will be both cutting-edge and timeless. But proceed with caution: technology is the "soma" of the brave new world we live in. Consider carefully the dose you prescribe yourself. It will be the turning point for future generations.

Works Cited

Ault, Alicia. "Video Games and Learning." CQ Researcher 12 Feb. 2016: 145-68. Web. 25 Oct. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2016021200&type=hitlist&num=0.

Bresnahan, Samantha, and Will Worley. “When Video Games Become an Addiction.” CNN, 6 Jan. 2016. http://tinyurl.com/zbng5by.

Clemmitt, Marcia. "Digital Education." CQ Researcher 2 Dec. 2011: 1001-24. Web. 25 Oct. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2011120200&type=hitlist&num=0.

---. "Social Media Explosion." CQ Researcher 25 Jan. 2013: 81-104. Web. 23 Nov. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2013012500&type=hitlist&num=0.

---. "Teaching Critical Thinking." CQ Researcher 10 Apr. 2015: 313-36. Web. 23 Nov. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2015041000&type=hitlist&num=0.

Glazer, Sarah. "Plagiarism and Cheating." CQ Researcher 4 Jan. 2013: 1-28. Web. 25 Oct. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2013010400&type=hitlist&num=0.

Higgins, Steve. "Critical Thinking for 21st-Century Education: A Cyber-Tooth Curriculum?" Prospects Dec. 2014: 559-574. Web. 18 Oct. 2016. web.b.ebscohost.com.byui.idm.oclc.org/ehost/pdfviewer/pdfviewer?vid=2&sid=3fa1eee6-2104-4a70-a941-6de7c0058865%40sessionmgr105&hid=118.

Kiener, Robert. "Future of Public Universities." CQ Researcher 18 Jan. 2013: 53-80. Web. 25 Oct. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2013011800&type=hitlist&num=0.

Ladika, Susan. "Apprenticeships." CQ Researcher 14 Oct. 2016: 841-64. Web. 25 Oct. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/getpdf.php?id=cqresrre2016101400.

Marshall, Patrick. "Virtual Reality." CQ Researcher 26 Feb. 2016: 193-216. Web. 23 Nov. 2016. library.cqpress.com.byui.idm.oclc.org/cqresearcher/document.php?id=cqresrre2016022600&type=hitlist&num=0.

Postman, Neil. "Amusing Ourselves to Death." 1984. Web. 27 Sept. 2016. emp.byui.edu/davisr/202/Neil_Postman.htm.

© 2023 by Michelle Ryder.
