In my own write: Digital dementia

Are we surrendering too much of our brain power to electronic media?

Apple's iPhone 6 (R) and iPhone 6 Plus. (photo credit: REUTERS)
As if the word “dementia” wasn’t enough to scare us with troubling images of memory loss and severely impaired cognitive and social abilities as we grow older, modern societies now appear to be facing an alarming condition called “digital dementia.”
I first heard the term from a European friend here on a visit and, as it implies, this deterioration of brain function is linked not to our increasing age but to an enthusiastic overuse of computers, smartphones and the Internet.
I remember back in high school how we had to learn a host of things “by heart” – mathematical and scientific formulas, famous poems, speeches from great plays, dates in history; even, going further back, multiplication tables. I still get a lot of pleasure from remembering my favorite poems, and though my mathematical ability is singularly unimpressive, I can instantly multiply any two numbers up to 12 – something I’ve noticed many younger people cannot.
Easy digital access to all this information has made this kind of learning seem burdensome and anachronistic, and in line with the unbending principle of “use it or lose it,” our memories have weakened accordingly.
THE TERM digital dementia was already being used by The Korea Times back in 2007, unsurprising in a country which, according to the World Bank, has one of the world's highest rates of Internet use: 83.8 percent of all South Koreans are online (some statisticians say over 90%).
And do they love their Internet. We’re talking about a country in which a man called Lee dropped dead in the city of Taegu after playing an online computer game for 50 hours with few breaks; where a couple was sent to jail in 2010 after their three-month-old baby died of neglect while they raised a “virtual” child in a fantasy game.
The West is somewhat behind in Internet use, with 71.2% in Western Europe and 64.3% in North America, but both percentages are predicted to rise to around 80% by 2017.
According to a respected Internet analyst, North Americans spend an average of 7.4 hours of their day looking at screens: TV, computer, smartphone or tablet. (Indonesians spend nine hours.) In a shocking mirroring of the Korean tragedy, in 2013 an Oklahoma couple were so immersed in a fantasy video game world, where their avatars married and held jobs, that their real-life two-year-old was only just saved from starving to death.
IN ISRAEL, a survey carried out in 2013 found that 57% of the population have access to a smartphone (nearly 70% in South Korea). Although these phones, which perform many of the functions of a computer, entered the Israeli market relatively late (only in 2009), Israel has emerged as a world leader in their use.
Which prompts the question: Are modern societies surrendering too much of their brain power to digital media, making themselves dependent on it and, worse, becoming victims of a condition in which the brain’s ability to transfer information to long-term memory is impaired by heavy exposure to digital gadgets?

I WAS disturbed by the robot-like images evoked by experts who note an “emotional flattening” among children and teens who are heavy Internet users. It reminded me of my amusement, mixed with dismay, at the sight of a couple of two-year-olds I saw in North America toddling around clutching iPads not much smaller than themselves, which they proved quite adept at using.
How adept will those toddlers be in social situations when they are grown? A column in the Wall Street Journal poignantly titled “Just look me in the eye already” quoted a communications-analytics company’s study of 3,000 people in speaking situations. It concluded that American adults make eye contact between 30% and 60% of the time in an average conversation, when eye contact 60% to 70% of the time is needed to create a sense of emotional connection. It noted that one barrier to contact is the use of mobile devices for multitasking, adding that among twentysomethings, “it’s almost become culturally acceptable to answer that phone at dinner, or to glance down at the baseball scores.”
An earlier behavioral study cited FOMO, or “Fear of Missing Out” on social opportunities, in which young adults dissatisfied with their lives or relationships feel compelled to check mobile gadgets repeatedly to see what social opportunities they are missing.
NEUROSCIENTIST Manfred Spitzer is medical director of the Psychiatric University Hospital in Ulm, Germany, and author of Digital Dementia: What We and Our Children are Doing to Our Minds (2012). Emphatic about the inadvisability of “outsourcing” one’s mental activity, he holds that while computers can be fine tools for adults, they are “poison” for kids. He abhors the constant “clicking around” involved, which he says is distracting, as is multitasking. Computers, he says, impair children’s learning and lead to stunted social skills.
Young people look at their smartphones about 150 times a day, Spitzer says, a frequency of screen use that raises stress and anxiety in all ages. Better to read a newspaper, he advises; you retain more and are less distracted.
Spitzer believes – perhaps unrealistically, given the ubiquity of digital devices among all age groups – that the minimum age for media consumption should be between 15 and 18.
He told an interviewer that in South Korea doctors had reported seeing young patients with memory and cognitive problems that were more commonly linked to brain injuries.
“The more you train kids with computer games, the more attention deficit you get,” he said.
OTHER EXPERTS consider such fears overblown.
Neuroscientist Michael Madeja has called digital dementia “a term primarily intended to have advertising appeal,” saying there is no evidence that using digital media leads to harmful changes in the brain, particularly not those found in types of dementia such as Alzheimer’s disease.
“The brain constantly alters and adapts itself and is therefore a system that is constantly learning,” he says. “It expands its capacity to process what is required, and downsizes capacity where it is no longer needed.
“For example, if you sit in front of a screen a lot playing computer games, your brain will optimize itself in line with this challenge. Your fine motor skills, reaction times and decision-making ability will improve – you will be learning in the truest sense of the word. But conversely, if you learn less thoroughly in this way, your cognitive performance will decline.”
Decrease in memory function? The importance of that is determined by society, he says.
Other experts have criticized Spitzer’s arguments as the same ones that used to be made about movies, TV and comic books, all of which are today widely accepted as legitimate art forms, with their effects on children and adults judged ambiguous, at most.
STILL, THE American Academy of Pediatrics in 2011 urged no TV for children under two, promoting instead unstructured playtime as critical for learning problem-solving skills and fostering creativity.
“Media use has been associated with obesity, sleep issues, aggressive behaviors and attention issues in preschool- and school-aged children,” the academy said, recommending that parents limit screen time for children of all ages to two hours a day and set “screen-free zones,” including bedrooms.
This last recommendation I can relate to, as I have a visceral aversion to having a TV in my bedroom.
UNLESS WE are prepared to take ourselves off to some scantily inhabited location and raise our children there in the bosom of nature, it would be foolish to pretend that digital media, having entered our everyday lives with such awesome power, are not here to stay. It behooves us to learn their ways and use them for our benefit.
At the same time, we need to stay aware of that power and ensure that we, and not it, remain in control of our lives, and the lives of our children.