Kyle Schurman
Oct 17, 2011

Dennis Ritchie: Programming pioneer passes away

When Steve Jobs passed away a couple of weeks ago, the mainstream media and social media networks were filled with tributes.

Understandably so. After all, Jobs was a well-respected – almost beloved – co-founder of Apple and a pioneer of society-changing devices such as the Macintosh computer, the iPod digital music player, and the iPad tablet computer. His story of starting Apple with Steve Wozniak in a garage is inspiring.

Jobs led an innovation revolution.

When Dennis Ritchie – or dmr, as he was commonly known online – passed away last week, the tributes weren’t quite as widespread.

Again, understandably so. Ritchie, the 70-year-old creator of the C programming language and co-creator of the Unix operating system, didn’t develop the “fun” products that Jobs did. Nor was Ritchie at the forefront of a company that was a household name.

Yet, perhaps Ritchie should have received more public accolades.

Anyone who has done any computer programming, especially with some of the older computers and older languages – or, God forbid, with punch cards – knows the incredible impact the C programming language had on the industry. Without C, developed in the early 1970s at Bell Telephone Laboratories, and its successor C++, programming would not have progressed as quickly as it did, sparking amazing technological improvements over the past four decades.

“The tools that Dennis built, and their direct descendants, run pretty much everything today,” Brian Kernighan, who worked with Ritchie at Bell Labs, told The New York Times after Ritchie’s death.

C was one of the first programming languages that could be moved from machine to machine rather than being tied to a single computer’s hardware, and its relative simplicity – at least compared with other languages of its day – meant that many different types of programmers could use it. C and its descendants remain among the most widely used programming languages in the world.
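
To get a sense of that simplicity, here is the famous “hello, world” program that Kernighan and Ritchie used to open their 1978 book The C Programming Language, lightly modernized here; the same few lines compile and run on virtually any machine with a C compiler:

```c
#include <stdio.h>

/* The classic first program from Kernighan and Ritchie's
   "The C Programming Language" (1978), lightly modernized.
   Its brevity is the point: the entire program is one call
   into the portable standard library. */
int main(void)
{
    printf("hello, world\n");
    return 0;
}
```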

Ritchie developed C specifically for use with the Unix operating system, which he had created at Bell Labs in 1969 with Ken Thompson and a few others.

Unix’s reach is almost as widespread as that of C, as many different operating systems have their roots in Unix. More importantly, Unix made multi-user time-sharing practical and widely available, which was extremely important to researchers on university campuses in the 1970s. Unix let multiple users share time on a single computer by managing the workload among their programs.
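
The heart of that workload management is the Unix process model. Below is a minimal sketch – illustrative C, not code from Unix itself – using the classic Unix fork() and wait() system calls to show how one program becomes two processes the kernel can schedule independently:

```c
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

/* Minimal sketch of the Unix process model: fork() clones the
   calling process, and the kernel then schedules parent and
   child independently, interleaving their work on the CPU.
   The same mechanism lets many users' programs share one machine. */
int main(void)
{
    pid_t pid = fork();

    if (pid < 0) {           /* fork failed */
        perror("fork");
        return 1;
    }
    if (pid == 0) {          /* child process */
        printf("child:  pid %d\n", (int)getpid());
        return 0;
    }
    /* parent process */
    printf("parent: pid %d, child %d\n", (int)getpid(), (int)pid);
    wait(NULL);              /* reap the child */
    return 0;
}
```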

Before Unix, programmers had to schedule blocks of time when they alone had access to a computer. This was expensive and made it impractical to test small programs: if you had to wait days or even weeks for your turn, you wanted to get the most bang for your buck, so you ran large, complex jobs instead.

Today, with the prevalence of multitasking and multi-core processors, such problems seem as though they belong in the Stone Age rather than in the early days of computing. Still, without Unix and C, computing’s migration out of that Stone Age would have taken much longer than it did. With a good operating system and a good programming language, more programmers could make use of the available technology, and more innovations could occur.

For a time, computer experts thought Unix might end up as the dominant operating system for personal computers, at least until DOS came onto the scene, followed by Windows. Think about how different the world of technology would be if Unix, rather than Microsoft’s DOS, had won out on the PC platform.

Still, Ritchie never received the widespread recognition that some computing pioneers have enjoyed.

Granted, Ritchie was well respected and honored among his peers, and his list of high-tech awards was impressive. He received the Turing Award, considered the “Nobel Prize” of computing; the Hamming Medal from the IEEE; and, from then-President Clinton, the National Medal of Technology.

The Apple products and innovations spearheaded by Jobs will always be better known to society than anything Ritchie did. Yet, without Ritchie’s work, Apple might never have existed … or Microsoft … or Linux … or … you get the idea. Unix and C showed what was possible in the world of computing. It’s no coincidence that high-tech pioneers – guys like Jobs, Gates, and Torvalds – were just learning about computing and programming around the same time Unix and C were taking off.

Although the pioneers who head companies and stand on stage at media gatherings, wearing turtlenecks and jeans while demonstrating the latest cool product, tend to receive most of the attention and praise, it’s often the behind-the-scenes people who make those products possible. Ritchie is a prime example.