In many ways, for me, 2013 has been the year of the Dennings. It was at the end of April this year, after publishing a profile I wrote of Dorothy Denning – a giant in the world of cybersecurity research – that I received an email from her husband, Peter Denning, a computer scientist with an impressive list of accomplishments of his own.
“I’ve seen many interviews with Dorothy”, he tells me in a subsequent conversation, adding that my profile “was the only one that seemed to grok her – you kind of got who she is.” I was flattered by the feedback, which was accompanied by his offer to have our own chat about his views on the computing landscape.
Our scheduled one-hour conversation turned into a two-hour-plus marathon that took me from the decade of Free Love right up through today’s era of handheld smartphones with more computing power than was available on the Apollo 11 Command Module.
A “Pure” Computer Scientist
Denning’s resume is an enviable one: educated at Manhattan College and then MIT, followed by faculty positions at Princeton and Purdue, and then on to NASA’s Ames Research Center in California, where he founded its Research Institute for Advanced Computer Science (RIACS) – and that’s just a sample. All the while, Denning churned out academic papers and books on his chosen field of computer science – and on topics beyond his own research pursuits.
For the past eleven years, Denning has served as the chair of the Computer Science Department at the Naval Postgraduate School in Monterey, California. Born in 1942 in Queens, New York, and then raised in Connecticut, he recalls that, a little more than a decade ago, Dorothy “caught wind” of the NPS’ need for a department chair, “and we always had this notion that we would return to California someday.”
He describes himself as “more of a pure computer scientist” than his better half. The two took positions in different departments at the school, where they have both been ever since. His expertise lies in the design, control and management of operating systems, and the principles they employ.
I ask Denning to tell me about the balance his current position at NPS requires. “I teach six sections a year. I like being with the students. I like to know who they are, and what’s on their minds.” Recent courses he has taught to his pupils – who, at about age 30, are older than your typical graduate student – include operating systems and Great Principles of Computing.
“The majority of our students come to us with bachelor’s degrees in other fields. We’ve learned how to help students transition into computer science, and then fulfill all the Master of Science degree requirements, including the thesis, and graduate on time”, he tells me. “This is not a job that I’ve ever come across in other universities”, Denning says. In his experience, at other institutions, it’s hard if not impossible to crack into a master’s program if you don’t have a bachelor’s in computer science.
Denning says the US Navy goes through cycles of emphasis regarding its advanced degrees. Not too long ago, the focus was on those with business administration skills, but he has witnessed a change in this philosophy as of late. “The Navy and the military forces are beginning to shift back in the direction of wanting more science and technology, because they need their officers to be smart and creative in the pitch of battle.”
A Dream Come True
Peter Denning is considered a pioneer in the fields of computing and networking. It’s a label that can cut both ways: we think of pioneers as being at the cutting edge of a particular development, but as time goes by, these same pioneers can be characterized as out of touch. This is not the case with Denning, from what I can gather.
"We wound up cultivating a whole generation of people who where building computers and operating systems, who knew nothing about the past, and cared nothing about the past" |
I ask Denning to take me on a trip back to MIT, and paint a picture of how different the world of computing was in the late 1960s from the one we know today. “I was inspired by all the visions that the designers had, about things that could be accomplished with these new kinds of systems”, he relays. “Time sharing” on these computing systems was needed to spread their immense cost over many users, and as he recounts, “personal computing was a dream.”
I ask him about the pocket-sized devices most of us carry around today, and whether he ever pondered such advancements during his time at MIT. “When I was a graduate student that was like a dream”, he reminisces. “There were people running around at MIT saying, ‘One day we’re going to have a desktop computer’. It never ceases to amaze me, that stuff I dreamed about when I was a kid, or a graduate student, actually came true!”
Early Recognition
It is often said in the information security field that as computing and networking advanced, security was an afterthought – a bolt-on feature that lacked attention during the development and design phases. But considering that both computing and networking had their origins in military and government uses, this is likely a misconception. As Denning points out, the formative years of computing lacked the widespread inter-connectivity of today’s environment. When it comes to data protection, he affirms, connectivity “just expands the problem.”
Shortly after joining the faculty at Princeton in the late sixties, Denning was asked to chair a national task force to design a core course on operating systems. “We had five main topic areas, and one of them was protection and security. So this was actually recognized from the very beginning”, he says.
I then ask Denning what he admits is a rather controversial question: How do today’s operating systems stack up in terms of security when compared with those like Multics from Project MAC and the IBM commercial systems of his early career?
Back in the late 1960s to early 1970s, he says, computers were highly expensive machines – each costing a couple of million dollars in a time when a million dollars meant something. These forerunners to the personal computing era – the commercial operating systems he referred to – “had all this fantastic functionality” and addressed fundamental security issues. Denning adds, however, that others in the field of computer science were looking for cheaper alternatives to help bring this price down, yet still maintain the functionality of the commercial mainframe systems.
Power to the People
“They called it Unix”, Denning says, referring to the first attempts by colleagues at Bell Labs to scale down computers – both in price and size. “They really wanted the interactive features of Multics…and the protection and security and sharing features, but they didn’t want to pay huge sums of money to get them.”
What Denning describes to me sounds a lot like an earlier version of the consumerization trend of the last several years – albeit on a higher level. It brings us to the late 1970s to early 1980s timeframe, “when the chips started to come out”, Denning continues. “Apple Computer, and Atari and all those companies, were little garage kind of things, and they said, ‘We want to make the computer so simple that you can have one, and you can be an ordinary person in your house, and own one.’”
So that’s what companies like Apple set out to do, Denning says in reflection. “Their computers were so small and simple that they couldn’t really hold an operating system, the memories were too small. It was more like ham radio.”
I then ask Denning if he was one of those garage-based tinkerers, in the spirit of a Steve Wozniak. “I was interested in it, big time”, he responds.
“By the time the PC revolution started, operating systems had got pretty big”, but Denning admits that in today’s terms, they were a “historical curiosity. So the folks who were trying to pioneer [PCs] had a kind of animosity against operating systems, because they thought that it resulted in corporations blocking the small guy out of using computers.”
Let History be Your Guide
I then transition our conversation to the contemporary era of computing – the handheld, smartphone/mobile era we currently occupy – and ask Denning how this evolution has affected both security and his approach to teaching about the operating systems that underlie them.
“I think there’s a deeper issue here, when you’re talking about security, which is that these people produced a new generation of computer builders who started out disliking operating systems and anything having to do with big computers, and disavowing them, and trying to go down their own path”, Denning observes.
The issue here, he concludes, is that this new group of PC builders lacked a historical knowledge of the security issues that confronted an older generation of mainframe technologists. “So, we wound up cultivating a whole generation of people who were building computers and operating systems, who knew nothing about the past, and cared nothing about the past.”
The result, as he sees it, was a decades-long delay in confronting the security issues that the forebears of the PC had contemplated – and in many cases addressed.
With history as a guide, however, Denning says recent years have witnessed a return to the fundamental security issues that the old mainframers encountered. “There’s been a lot of security research in the last few years, which is basically oriented on resurrecting old knowledge and adapting it to the new world.” He then offers a clarification: “It’s not simply going to the ACM [Association for Computing Machinery] digital library and looking up the old papers, it’s kind of like re-discovering and re-inventing the knowledge.”
Sometimes Older is Better
It’s with more than a bit of nostalgia that Denning laments the rise of the mobile operating system, its security issues aside. Take Apple’s OS X, which he calls one of his favorite operating systems. A user of it for years, Denning regrets that Apple is adapting newer versions to mirror the appearance and functionality of the iPhone and iPad.
I ask him about the progress of security over the last two decades or so, specifically the progress made by that other popular vendor of operating systems. “Microsoft had started with DOS, a really, really awful and insecure operating system, and then they came out with Windows as a way to try and respond to Apple”.
Microsoft wasn’t serious about security during the formative years of the Windows operating system, Denning remembers. This changed soon enough, he admits, as the company realized that security issues were resulting in big hits to its reputation – the trust factor. “So Microsoft started to get very serious about this, and they brought in Steve Lipner, who seems to have been a big godsend for them…they made some very significant improvements in the security of their systems.”
As we close the conversation – at least the security portion – I ask Denning if, from his perspective, there is one operating system that stands out as being the most secure. His response is certainly diplomatic, not to mention grounded in decades of experience.
“I don’t think it’s completely the operating system anymore”, he tells me. “We worry about the network as a whole, we worry about the way we organize the servers on the network.”
Security, in the view of this operating system ‘expert’, is more about networking, connectivity, and the people who use and operate these devices. “These issues transcend individual operating systems.”
Perhaps the most important lesson I learned from this accomplished teacher is that there are basic constants we can’t forget if we expect to enjoy success in any field – data security included. “Although technologies have changed a lot, we still keep on using the same fundamental principles for design. And the more we understand those principles, the better off we’ll probably be”, Denning concludes. I can’t help but recall great American football coaches like Vince Lombardi and Chuck Noll, who stressed that sound fundamentals are the foundation of any successful endeavor, regardless of how the game changes.
The issues facing security professionals, software designers, hardware manufacturers, and end-users are “not something that one operating system is going to solve”, he contends. “Each one has to make its little contribution.”