The Event Horizon and the Death Star: SINC Talks the Digital Economy with Kenneth McGee, Info-Tech

SINC’s Director of Content Annie Liljegren spoke with Kenneth McGee in October 2021 for this interview, which has been edited for length and clarity.

I recently sat down with Kenneth McGee, Research Fellow at Info-Tech, who’ll be featured at our 2021 Canada IT & Security Leaders Virtual Forum, November 15th-17th, delivering a keynote presentation entitled “The Digital Economy.”

Ken’s notable career includes 29 years as a VP at Gartner, and VP roles at Goldman Sachs and later Salomon Brothers, where he served as VP International Communications Director in London, and VP IT Budget Director in New York.


Great to have you here, Ken. I'd like to start by asking about digitalization. It would be too simple to ask whether this would have happened without COVID, but as companies move into that space, or embody it more fully–what would digitalization, and specifically this whole idea of the digital economy, have looked like without the recent acceleration?

Kenneth McGee: We’d first have to establish a definition of the digital economy, and therefore there’s fodder for disagreement and debate. But having said that, no—COVID did not bring about this idea. Claude Shannon did in 1948, when he wrote the first paper on converting physical atoms to digital bits. So we have to blame Claude.

Tongue in cheek, of course, but no–it’s been going on for decades. But, we are at this level (indicates gap between thumb and forefinger) of completion so far. It’s been an initiative that’s been undertaken for decades, but we have a long way to go.

There have been recessions, there have been pandemics throughout history. This is the first time that we are coming out of a recession where there is a digital economy, a fundamental change in economic principles, waiting for us. Never happened before.

The last time we came out of a recession we were still doing manufacturing, we were still doing services. This is brand-new stuff.

COVID, and the need to recover from it, and therefore the recession, is accelerating digital transformation. And the way it’s doing that is digitally transforming manual things to electronic things–things performed by humans will now be performed by non-humans. Digital transformation is nothing if not that.

"There have been recessions and pandemics throughout history.

This is the first time where, coming out of a recession, a fundamental change in economic principles is waiting for us..."

When you used that thumb-and-forefinger gesture just now, to indicate we are 'here' with digitalization, you obviously meant that as a percentage of a whole. What is the "whole" you had in mind? Is it the entire hand spread out, is it arms-wide? What's the ratio of that thumb-and-forefinger gap to full completion?

KM: That’s a very fair question. And let me see if I can take it this way, because it’s a very, very fair question. Michael Hammer used to say: a good question is one you can muddle through, a very good question is one you can actually answer, and a fabulous question is one where you can show a slide.

An economy is predicated upon some fundamental occupations, activities, tasks, processes, and procedures: we have to advertise stuff. The world of advertising is heading toward one trillion dollars of global revenue. It is the only category in commerce that has passed 50% in moving from manual–from atoms like newspapers, magazines, etc.–to digital.

But when you look at all the wholesale trade, all the retail trade that takes place on Earth–that’s why I held up a small gap, just a little. It has been underway for many decades since Claude wrote his paper, but we have a long way to go.

Is this a permanent trajectory? Do you see the potential for any sort of backlash or are we committed to full digital, as in, digitalization on and on in that direction?

KM: I certainly don’t see backlash, but there is an element here that’s worth considering: once you go past this line you’re never coming back. You’re never coming back. And therefore you see things like blockchain and disappearing intermediaries.

But here is the point: I do not believe in the merit of the comment that technology is changing all the time. I think that’s rubbish. Products change all the time, and therefore it’s the difference between discoveries and inventions. Inventions change a lot of things, but there are only so many elements on the periodic table, for example. We don’t keep discovering fundamental elements.

And therefore, there are things that come along that—like the elements on the periodic table—are so distinctly different they forever change things, and will forever change the future, and from which you will never return. One of them is the telegraph, another is penicillin: fundamentally changing the course of the human species. And we are saying that aspects of the digital economy are similar to that. They are of this ilk. They are ‘realm’ changes that will change humans forever and from which we will never return.

To that idea of an event horizon: Clearly we can see this happening with tangible products, with IoT or whatever else, but at a deeper level, what does the world look like past that point of no return? You have a deep background in the financial vertical, and now in addition to your work at Info-Tech you're also an adjunct professor—as we consider that point of no return, what are some tangible ideas of how the world fundamentally looks different–financially, for the educational sector, or maybe in ways of which we're not even fully cognizant yet?

KM: First of all, Annie, you did your homework–thank you. A couple of things, but I think most profound amongst them is this: for perhaps the first time in the history of commerce–and that’s at least 7,000 recordable years of evidence of humans conducting commerce–transactions can take place in a trusted fashion.

Where once you identify the most verifiable version of the truth–a transaction, for example–and it is locked in blockchain, it can’t be changed…yet. Or can only be changed with evidence. It will be the first time where we have an environment of perhaps unprecedented degrees of trust within transactions. We can’t say that we’ve had that in the past.
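The tamper-evidence Ken describes comes from hash-linking: each record commits to everything before it, so a change to any earlier entry leaves evidence. A minimal sketch in Python (a toy illustration, not any particular blockchain implementation; all names are hypothetical):

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link records into a chain; each block commits to all prior ones."""
    chain, prev = [], "0" * 64  # genesis value
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; any edit to an earlier record breaks the links."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain([{"buyer": "A", "amount": 100}, {"buyer": "B", "amount": 250}])
print(verify(chain))             # True: untampered
chain[0]["record"]["amount"] = 1  # silently alter an old transaction
print(verify(chain))             # False: tampering leaves evidence
```

This is only the integrity half of the story; real blockchains add distributed consensus so no single party controls which chain counts as "the most verifiable version of the truth."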

It is also the first time in history where we no longer need intermediaries as much. We don’t need literary agents, we don’t need real estate agents, we don’t need attorneys, for example, in order to make smart contracts and such. So the demise of intermediaries, for the first time.

And then the third and final–of many, I would propose–is the ability to have equal status with peers, even though those peers may be banks, or people who supply retail with wholesale, whatever the entire definition of commerce covers. We can conduct those trades not in a master/slave relationship in terms of computing, but as equals, because everybody has the same database everywhere–period, end of sentence.

Those are three pretty big ones. But because it’s so nascent in the history of our experience, we just don’t have enough people yet who have fully internalized what the heck that means, certainly.

One of my favorite questions to ask in these interviews: What's top-of-mind for you right now, as far as either something that you wish was being discussed at all, or something that gets a lot of buzz but you feel the conversation needs to be different, needs to be deeper, needs to take a different approach. What are you thinking about, Ken McGee, that we should all be thinking about?

KM: I’m going to perhaps give an adjacent answer, though I think it’s in the same zip code. The whole world of commerce in a digital economy has to face one reality, and it’s something that I do not believe anybody in the IT advisory industry is willing to say: the fact that we have had so many profoundly important breaches in cybersecurity.

It is time to call out the cybersecurity world as a profound, abject, and complete failure, and to arrive at a better solution than having individual companies, individual agencies, and individuals use their limited resources to combat what is very obviously state-sponsored cyber terrorism.

We have to reach a point where we can say: what other example can we cite where so many failures have occurred, on an ongoing basis, and yet we continue to spend money on it?

Keep talking about spending money–you’re wasting your time. Blow it up, and let’s start over.

"It's time to call out cybersecurity as a profound, abject, and complete failure, and arrive at a better solution than having individuals use limited resources to combat what is, very obviously, state-sponsored cyber terrorism."

Okay–some might call this a cynical view–but one response to that might be to suppose that, as consumers, we've allowed this as an acceptable risk: that we've accepted the holes in cybersecurity, the abject failures, as you put it, in exchange for everything that's afforded to us.

So to what degree would that sort of radical change necessarily need to have the consumer involved? Is that going to happen from within industry alone, or does it need to come from other stakeholders as well?

KM: The answer is that it is not going to come from the industry. Secondly, it is not going to come from anywhere more potent than nations realizing they are already at war, and that they are–as in the common adage complaining about generals–preparing for the last war. The next war? I don’t yet see it on the radar screen, being formulated and compiled by anybody.

Now if it is, I’m not even sure I would be amongst those who would detect it. But I know this: my daughter works at the White House. She had her name stolen by a foreign government, this was all over the papers, and the fact that they could reach in to this kind of secure domain and have her name, among many thousands of others, from the other side of the planet is so not acceptable that I cannot see the scenario that could accommodate: Well, we have to get used to it.

That is against nature; it is so wrong. And yet we are continuing to use the 1980s and ’90s book of how to prepare and how to protect your data. For the love of God, let me get in my DeLorean.

So, as important as training is, it's not as simple as saying training is the solution–that we'll just teach people not to click malicious links and be done...

KM: Yes, of course. Of course it is. But to have the Death Star so vulnerable to a little flying machine is not an acceptable option, nor is it an acceptable destination.

The principal issue is funding. The principal issue is not the techniques to prevent attack. The point is to recognize that we will spend the country’s money to protect our people when a country physically invades our border. But when they do so via ones and zeros, we’re going to be stupid about it and expect everyone else to prepare for that kind of onslaught.

No, it requires funding from the national level, no different than it requires funding when they come across our borders. And any country would feel that way. So let’s not ask companies to be the sole source of funding for security. Let’s take a look at what it is–it is war.

Because, in a digital world, a breach is a border invasion.

KM: And what is the difference? Claude Shannon told us in 1948 what the difference is, and we haven’t caught on yet.