Technological development has always had a profound impact on our society. Better medical technologies save more lives today than ever before. More efficient industrial technology helps us produce things more quickly and cheaply. Computer use at the office has reduced the time needed for many complex tasks from hours or days to mere minutes or seconds.
In a world where technological evolution is so constant and rapid, differences in available tech from one generation to the next have a marked influence on how we relate to one another. They shape who we are and what we think of others, especially our youth.
In the span of a half century we have gone from telephones to social media. Each step of the way, adult generations have felt generally uneasy about the transition.
For the most part, the argument remains largely the same: This new tech seems like it makes growing up very different from what I knew. Why change? It worked for us.
The apprehension is understandable. It’s also generally wrong.
We know it’s wrong. Television was supposed to make us all dumber, but it didn’t. Though we aren’t measurably more intelligent in any strict psychological sense, we can tackle more complicated tasks than ever before.
Video games, we were told, would only make children lazy, inactive, and more violent, serving no positive purpose. It turns out this view is also likely inaccurate. Sure, there remains plenty of research to be done on the topic. Still, it’s safe to say a simple black-and-white view of the effects of gaming doesn’t hold up.
In short, doomsday predictions about the impact of new tech on our youth have a remarkable habit of being wrong.
In 2018, much of this is readily understood. Netflix subscriptions have gone from around 30 million in 2012 to 120 million in 2017. Stories of binge-watching the latest craze are barely worth noting now. We’ve come to expect that when someone says they’re watching show Y, they mean they’re watching it regularly and will have finished all of it within a relatively short time frame. We watch more TV than ever, and we don’t seem bothered by any dumbing down.
By the same token, video games have grown from a children’s hobby to an activity enjoyed by people of all ages. Data from a 2015 US study pegs the average age of gamers at 34 years old. At this point, we already have an entire generation of people who grew up on video games—and they’re still playing. The vast majority also get up and go to work the next morning. Gamers are no more lazy or violent than anyone else.
Technology may have made our world more complex, but new generations have risen to the task every time. In fact, looking at the research and statistics on the issue, the safe bet seems to be much closer to “everything will be fine” than “we’re going down in a ball of flames.”
That’s not to say such conversations aren’t important. There are very real questions that require answers. For instance, while the impact of social media on individuals remains a nebulous topic at best, ideological echo chambers and targeted content are a growing phenomenon that may have tremendous social implications.
Of course, recent history suggests even that is unlikely to have irreversibly negative consequences in the long run. Yes, we need to have these conversations, but I’m not sure we should be worried about the imminent collapse of childhood.
When it comes to technology, learning to adopt it as an adult is often far more troublesome than learning as a child. Our youth are astoundingly flexible and capable. While we do need to keep an eye on things, recent history suggests we should probably also give them the benefit of the doubt.
We all need someone to set the time on our VCRs eventually.
Note: This blog piece is taken from our Spring 2018 e-bulletin. To sign up to our e-bulletin mailing list, click here!