“Digital immigrants and natives alike are bombarded with vast volumes of information in today’s electronic society, which… calls for an even greater emphasis on critical thinking and research skills – the very sort of ‘legacy’ content that teachers have focused on since classical times” – Timothy VanSlyke

One of the most common symptoms of digital age communication is the steady decline of fact-based debate, a reality of which many of us are painfully aware. “The facts, however interesting, are irrelevant” has almost become a watchword on social media. It was in that vein that I shared a recent article titled “Professional Commentary in the Public Domain” from the Wavell Room, a website that serves as a crucible for contemporary British military thought. I seized on one portion of the article that emphasized “stick to the facts.” My point was that it’s not enough to just stick to the facts; we need to know them as well.

Though most of the ensuing dialogue was positive, one particular comment gave me pause. While the comment itself underscored my point, it also highlighted a generational difference in how we view facts. I am, the commenter noted, a digital immigrant; digital natives are not as beholden to the facts as my generation. Even as I responded, I stopped in my tracks. Am I really a digital immigrant? To be fair, I’m part of a generation that grew up with neither the benefits nor the challenges of information age technology. Does that make me an immigrant, or just old?

Growing up, I was like a poor man’s Tony Stark, with all the curiosity and none of the money. While still in grade school, I used two transistor radios and a walkie-talkie to make a working CB radio (yes, I know that makes me sound old). I built a functioning wind generator in ninth grade but lost out in the science fair because no judge would believe that a 14-year-old was capable of creating something that advanced (for the record, it really wasn’t all that advanced). I learned to code as a high school student, first on old punch card machines (something few people have even seen), then graduated to other platforms, systems, and languages. My first substantive program recreated the climactic Death Star battle from Star Wars: A New Hope (when it was simply known as “Star Wars” and sequels were still a couple of years away). I built my first computer in college – complete with a monitor salvaged from the refuse of a television repair shop – and have never lived a day without one since. When the internet era arrived, I taught myself HTML and established a side business developing websites. When social media finally arrived on the scene, it was just the natural evolution of a process that began before I was old enough to drive.

Am I really a digital immigrant?

There are a variety of definitions used to distinguish digital immigrants from digital natives. I recently read a Medium post from a self-proclaimed expert in the field who defined the difference primarily in terms of fear: immigrants fear new technology while natives do not. Until recently, there was a widely held belief (since debunked) that digital natives learn differently than their prehistoric predecessors. Marc Prensky’s definition – detailed in his 2001 article “Digital Natives, Digital Immigrants,” published in On the Horizon, in which he coined the terms – remains the standard: you were either born into the digital age or you were not.

So, where does that leave me? Semantics.

Felix Frankfurter, who served on the Supreme Court from 1939 to 1962, put semantics into perspective: “All our work, our whole life is a matter of semantics, because words are the tools with which we work, the material out of which laws are made, out of which the Constitution was written. Everything depends on our understanding of them.” Semantics matter.

Some trace the dawn of the digital age to the invention of the transistor in 1947, but that was merely a waypoint on a much longer journey. In fact, the digital revolution – the breakthrough that would allow us to shift from analog to digital electronics – didn’t come for another eleven years, when Jack Kilby invented the first integrated circuit. Within three years, the U.S. Air Force fielded the first computer to use Kilby’s technology, and the digital age was in full swing. The integrated circuit spurred the modern digital industry, paving the way for a technological revolution that has transformed every facet of our society. Without Jack Kilby there is no integrated circuit, there is no handheld calculator, and there is no Fourth Industrial Revolution.

Forget what you read on the internet. Jack Kilby was the Daniel Boone of the digital age, blazing the trail that would one day become the information superhighway. But it all started with Kilby’s invention of the integrated circuit in the summer of 1958; my entry into the world was still a few years into the future. So, if facts matter as much as I like to think, that makes me a digital native.

I might be old, but I’m no digital immigrant.

Steve Leonard is a former senior military strategist and the creative force behind the defense microblog, Doctrine Man!!. A career writer and speaker with a passion for developing and mentoring the next generation of thought leaders, he is a co-founder and emeritus board member of the Military Writers Guild; the co-founder of the national security blog, Divergent Options; a member of the editorial review board of the Arthur D. Simons Center’s Interagency Journal; a member of the editorial advisory panel of Military Strategy Magazine; and an emeritus senior fellow at the Modern War Institute at West Point. He is the author, co-author, or editor of several books and is a prolific military cartoonist.