Some things we now know to be good ideas:
Writing operating systems in a compiled machine-independent programming language.
Performing file I/O by reading, writing, or overwriting integral numbers of bytes at integral offsets.
Creating processes by duplicating existing processes.
Null-terminated byte strings.
Investing a substantial proportion of programmers’ time in building tooling to make themselves more productive.
When explaining a new programming technique, starting with “Hello, world”.
Connecting programs together by piping the output of one to the input of the other.
It’s hard to believe that there was a time when any of these weren’t conventional wisdom, but there was such a time. Unix combines more obvious-in-retrospect engineering design choices than anything else I’ve seen or am likely to see in my lifetime.
It is impossible — absolutely impossible — to overstate the debt my profession owes to Dennis Ritchie. I’ve been living in a world he helped invent for over thirty years.
Comment feed for ongoing:
From: Alex Esplin (Oct 12 2011, at 20:13)
I learned to program only 8 years ago, but the more I learn, the more I feel indebted to the genius that is Unix and the Unix philosophy.
[link]
From: Jarrett Vance (Oct 12 2011, at 20:26)
What an elegant way to speak morbidly.
[link]
From: alectheheek (Oct 12 2011, at 20:36)
He was also a delightful and modest man.
[link]
From: Brad Gilbert (Oct 12 2011, at 21:02)
Actually, null-terminated strings are a bad idea.
To copy a string you need to know how long it is, and to find that out you have to scan the string for the null terminator.
So during a simple copy, the string is read twice. Sure, there are ways to scan the string while it is being copied, but that still isn't as fast as knowing the length ahead of time.
Which is why the D language stores the length of the string with the string, and appends a null byte when calling C code.
[link]
From: Mike Cantelon (Oct 12 2011, at 21:08)
DMR's contributions were simply fundamental to modern computing. RIP.
[link]
From: Alastair Dallas (Oct 12 2011, at 21:14)
I'm sad to learn of Dennis Ritchie's passing. He made coding cool in many ways. Your list of advances left off ++. The most widely-used languages today (Java, JavaScript, C++, C#, even PHP) were all begat by C.
[link]
From: Hub (Oct 12 2011, at 21:27)
DMR's contribution to computing is invaluable to me. I wouldn't be in this field if it weren't for his work.
[link]
From: Jan Gray (Oct 12 2011, at 22:25)
See also Ritchie's Retrospective on Unix in the 1978 Bell System Technical Journal:
http://www.alcatel-lucent.com/bstj/vol57-1978/articles/bstj57-6-1947.pdf
[link]
From: Tony Fisk (Oct 12 2011, at 23:00)
Actually, Brad, you miss the point and fall into the hell reserved by Knuth for premature optimisers.
It is certainly more *efficient* to store the length of the string at the start of the sequence. However, you also have to consider how many bytes to reserve for that length. One? Two? Four? Perhaps a format that defines that number?
This is mildly complicated, although certainly solvable.
The real point is, though, that a brain-dead scanner will stop copying the string when it encounters a 0, wherever it happens.
I think you'll find that D would hedge its bets and include a terminating zero as well.
[link]
From: Dr. Azrael Tod (Oct 12 2011, at 23:15)
WTF? At least one of these is anything but a good idea.
Null-terminated strings are one of the worst problems we still have to live with.
And interprocess communication is by no means solved just by piping.
I could continue here with a few more minutes of thinking, but I guess you get what I wanted to say.
[link]
From: Mike Oliver (Oct 12 2011, at 23:18)
Yes, for sure, DMR has given most of us in programming more than even the most informed of us know. The fact that we don't know is testimony to his humility and greatness.
[link]
From: Skc (Oct 13 2011, at 00:03)
Null-terminated strings cannot contain null bytes, and they are responsible for an incredible number of bugs and security problems.
[link]
From: Geoff Teale (Oct 13 2011, at 01:27)
DMR was a great man. For clarification, wasn't it actually Doug McIlroy who invented piping data from one program to another?
[link]
From: JustSomeGuy (Oct 13 2011, at 02:24)
Now *this* I can get emotional about. Steve Jobs? Pah! While I applaud Pixar, I consider Apple to have been a bad mistake. But I can't find fault in *any* of Ritchie's stuff (that I know about, anyway).
[link]
From: Phil Stefans (Oct 13 2011, at 02:26)
Great post. Vale, DMR.
[link]
From: webreac (Oct 13 2011, at 02:47)
Some of the points are real wisdom (OS not in assembly, programmer tooling, hello world).
The remaining is questionable.
Pipes were a key to Unix's success when the command line was omnipresent; today they are a relic of the past. The "everything is a file" of Plan 9 is optimal for that category of OS, but may be useless for (future) object-oriented OSes. The C string is ideal for low-level languages but is not an issue for higher-level languages (it is done differently in Ada or Python, with no inconvenience).
The wisdom of Dennis is not in this kind of list, but mainly in the articles where he explains how these choices influenced evolution and what could have been done better.
[link]
From: Thomas Downing (Oct 13 2011, at 03:28)
The passing of one of the great pioneers of modern computing. I am distressed by some of the somewhat negative comments here. I wonder if the commenters truly appreciate the scope of the problems faced and solved, that now seem obvious.
They weren't obvious at the time; that they now seem so marks the genius.
Finally, no one solution to a class of problem is the most appropriate to all members of the class - but that in no way detracts from the compelling value of the solutions Ritchie produced.
[link]
From: Bill Walker (Oct 13 2011, at 03:28)
The fact that we can debate the merits of some of his epiphanies 39 years after their invention shows his true impact on the industry.
Right, wrong... NULL terminated strings were earth-shattering in the age of 80 character cards and paper tapes.
I started my career with AT&T System 3 and BSD versions beginning with 2. I had the honor of spending a few evenings in my misspent youth in the company of DMR, and will cherish those conversations and debates. I learned more about *how* to think in the span of a half dozen beers than I learned in a half dozen years of university.
[link]
From: Saurav Sengupta (Oct 13 2011, at 04:44)
I am saddened to see people using tributes like this to the great man as vehicles for criticising his work, not just here but in several places on the Internet. People should have the sensibility to refrain from this on the occasion of remembering a person after his demise, whether they personally like his work or not.
[link]
From: Mike Garrity (Oct 13 2011, at 05:36)
I think that Dennis has said that NULL terminated strings don't belong on this list. He said it was the right design at the time, but that they knew then that it had a lot of shortcomings. This was discussed in the comment section of the Queue article this summer: http://queue.acm.org/detail.cfm?id=2010365
But in a lot of ways, NULL terminated strings are a good example of his knack for recognizing what's important and not letting a search for the perfect solution get in the way of shipping something great. As Steve said: "Real Artists Ship". Dennis was certainly a real artist.
[link]
From: Jeremy (Oct 13 2011, at 06:19)
"Goodbye, World."
[link]
From: Nicolás Conde (Oct 13 2011, at 07:21)
I learned C about 10 years ago, and I'm still enjoying working with it.
Dennis Ritchie (et al.) solved a lot of problems that got us to where we are today. It's easy to say he got many things wrong, but those were the only solutions back then.
He will be missed, RIP.
[link]
From: pltr (Oct 13 2011, at 07:29)
Except Ritchie didn't invent half of these things, and null-terminated strings are not a good idea; they're a TERRIBLE idea, one of the worst in the history of CS in fact.
I do think Dennis Ritchie was a great man and his contribution is hard to overestimate, but epitaphs like this, made by degenerates, make me cringe
[link]
From: Selena Deckelmann (Oct 13 2011, at 07:35)
Reading K&R for the first time as an undergraduate Chemistry major, I decided that I could write code.
I owe my career to that book.
[link]
From: Senpo (Oct 13 2011, at 08:26)
We pray for all. 777.
Deep bow!
Senpo
[link]
From: Raymond Chiu (Oct 13 2011, at 09:05)
Nanos gigantium humeris insidentes: dwarfs standing on the shoulders of giants. DMR is a giant who has propelled us, and continues to rocket us, into the digital age. Rest in peace.
[link]
From: Kevin (Oct 13 2011, at 10:06)
We can debate the merits (or otherwise) of null terminated strings until the cows come home, and also argue about who invented what, but speaking as a computing lecturer of 27 years experience - a job which means that I am always learning something new every day - it is clear to me that Dennis Ritchie had more knowledge of computer science in his little fingernail than I will ever manage to acquire.
RIP
[link]
From: Gregg (Oct 13 2011, at 11:28)
The examples in K&R were illustrative and never intended to be "the C library" by anyone involved as I recall. The most important thing that DMR did was solve real problems to get multi-user computing into the mainstream world as a place to "research" what computing could really be used for.
Simple things like pipes and ttys and "everything is a file" made it possible to reuse applications for lots of different purposes. That was a simplification that was needed for the "text processing" crew that PDP UNIX was written for. They needed NROFF and a text editor and not much more, but the pipe made it possible to look at the output on their screen for pagination, and then to print the same thing when needed.
Go look up Rob Pike on Google+ and ask him about DMR. DMR was a gentle giant that I got to be in the presence of a couple of different times, once when I was working at Bell Labs. He was always about "learning what's best" and not about "I know best". A very modest man.
[link]
From: Another Kevin (Oct 13 2011, at 11:49)
Length-prefixed strings are indeed a premature optimization. It is possible to do all the common string operations on null-terminated strings in amortized O(N) time where N is the string length; even building up a string by successive substring concatenation can be done in amortized O(N) time as long as you're willing to pay a space penalty of o(N).
gets() and scanf("%s") were, however, abominations. Dennis once told me that he considered the lack of range checking on input buffers the second biggest botch of the design of C. (The first was the fact that there was no graceful way to specify conformant dimensions when passing 2-d arrays as parameters.) He was a humble man - he told me of the botches in reply to an innocent question on comp.lang.c in the mid-1980's about how one might go about implementing a conformant array type. Completely out of the blue; I never expected to hear from him.
He was a brilliant light, and the world will be darker for his absence.
[link]
From: Andrew (Oct 13 2011, at 13:55)
@Brad Gilbert
Really, just cock off. As Rob Pike said, the world has lost a great mind. Under the circumstances, quibbling about null-terminated strings is at best inappropriate.
But while I'm here: they were a brilliant innovation back in the 70s, I'm guessing before you were even in nappies, and they continue to be invaluable, enabling most elegant and efficient code. The fact that strings in D are more efficient under some circumstances is neither here nor there. And why do you imagine Walter Bright called the language D in any case?
RIP, dmr.
[link]
From: VijayK (Oct 13 2011, at 14:07)
I owe the passion and fruits of my profession to the ingenuity of Dennis Ritchie and his colleagues (Ken Thompson, et al.) at Bell Labs, for the creation of C and Unix.
[link]
From: Huge (Oct 13 2011, at 14:34)
dmr was one of the giants on whose shoulders Steve Jobs was standing.
[link]
From: Robert Young (Oct 13 2011, at 15:35)
FWIW, much of what's been mentioned originated with Multics. How much DMR was involved in Multics, I know not.
[link]
From: Sainagakishore Srikantham (Oct 13 2011, at 16:06)
#include <stdio.h>

int main(void)
{
    int index;
    for ( index = 0; ; ++index ) /* infinite loop */
    {
        printf( "\n R.I.P Dennis Ritchie!" );
    }
}
[link]
From: Osvaldo Doederlein (Oct 13 2011, at 16:40)
FWIW, NULL-terminated strings (NTS) were a good idea at the time, but not today. With length-prefixed strings (LPS) you can process [most of] the string in big chunks with SIMD + loop unrolling + ILP tricks. Not just for copy but also compares and some other operations, and even for non-ASCII strings, it's only a bit trickier but still pays off. In safe languages, unrolling is also critical to reduce the cost of bounds checking. You can adapt these optimizations for NTS, but it's never as good.
And, (@Another Kevin): O(N) is not always the best perf possible. LPS can implement length() in O(1), and this is often critical; e.g. if you only want to know if two strings are distinct (don't care about ordering), different lengths = instant result false, and that's often the vastly-dominant case. This works even for strings with arbitrary encoding (if both strings have the same encoding and are normalized).
And before I hear "premature optimization" again, just look at modern implementations; people are still striving to make strings faster - e.g. JDK 7's single-byte strings and tons of VM intrinsics for common patterns of string code. Half of the heap of the typical app is taken by strings; it's the most important kind of structured data by far for a huge number of apps. Any library/compiler trick that wins 5% of cycles or bytes off important string operations has a measurable impact on the overall perf of real-world apps.
[link]
From: Yigit Turgut (Oct 13 2011, at 19:03)
He made amazing contributions to mankind, changed and shaped the way we look at things. RIP Dennis Ritchie.
[link]
From: Mike P (Oct 13 2011, at 19:14)
Absolutely Tim. Everything I've done over the last 8 years is owed to him.
[link]
From: Djun Kim (Oct 13 2011, at 20:12)
To Tim's list of DMR's contributions, I would add: the notion of directories ('folders'), shell scripts, environment variables, setuid programs.
I'm not sure how many people nowadays have experienced any operating system pre-dating Unix, but they were almost uniformly awful from the user's perspective. Unix, forty years later, is still a joy to use.
I also want to mention K&R. Almost 35 years after it was first published, it's still in print. In all of those years, there has been no other programming book that compares in terms of clarity, elegance, and succinctness. I'd recommend it to any programmer, regardless of whether they use C, as a model of exposition and style.
[link]
From: Shoma (Oct 13 2011, at 22:22)
I absolutely echo the author's thoughts - It is impossible — absolutely impossible — to overstate the debt my profession owes to Dennis Ritchie. I’ve been living in a world he helped invent ..... RIP
[link]
From: Paul Blackburn (Oct 14 2011, at 00:39)
computing_genius--;
sad_day++; /* rip dmr */
I think null terminated strings are a good idea. Easy to handle and process.
Dr Ritchie's work in C and Unix prepared the foundations for our Information Age.
I am glad I had the opportunity to meet him once.
[link]
From: deepee (Oct 14 2011, at 00:49)
wtf?? All those arguing, please step outside right now; this is not the place to argue or discuss, just a place to remember and to thank DMR for his immense contribution to the software industry. We wouldn't be where we are today without this great man's work, and I for one wish to thank him for that. As one other poster put it, "Goodbye, World." That said it all for me. RIP, and thank you.
[link]
From: g (Oct 14 2011, at 04:04)
Osvaldo Doederlein, I agree that null-terminated strings are at best debatable, but I think your argument against them is unconvincing. Yes, with length-prefixing you can often establish quickly that two strings are unequal -- but so can you with null-termination, since two different strings commonly differ early. And since many strings are short, allowing (say) 4 bytes for a length prefix instead of 1 byte for a terminating null can mean a substantial increase in that large fraction of heap size you mentioned. (Though maybe in the cases where half the heap is strings, they tend to be long strings.)
I entirely disagree with everyone who's saying it's improper to question the merits of null-terminated strings here. The point isn't that anyone thinks it was stupid or evil to introduce them back in the 1970s -- surely it wasn't -- the question is whether Tim was right to say that *now* we know they're a good idea.
[link]
From: Michael Lynn (Oct 14 2011, at 04:19)
I'm sorry to hear of Dennis's passing. C is a beautiful, simple, powerful language. The C Programming Language is one of the best computing books ever written.
[link]
From: SteveL (Oct 14 2011, at 05:00)
Pipes are profound, along with the notion that all devices are files. When you think how the Unix select() call lets you wait for file input, user events, or network traffic in one go, with only one API to learn, you can see the elegance of this design. It even scales to the cluster, with named pipes in a cluster OS.
If you ever look at the Win32 API, you can see where some people didn't play with Unix enough, as it lacks that coherence. Mouse and KB events: one API call (GetMessage()). Network: another (select(), and WsaRecvEx()). Pipe I/O: another (::WaitNamedPipe(), etc.).
Then they had to add yet another function to wait on different sources...
One concept: char streams. One API, one conceptual model. Profound.
[link]
From: Christian Sciberras (Oct 14 2011, at 06:09)
Null-terminated strings are a GROSSLY BAD idea. They're the worst mistake regarding strings in the history of computers. If you don't see why, please don't bother with a reply.
Connecting programs through pipes, while a good idea in general, still doesn't excuse the developers from building a more unified and infallible data-sharing mechanism. Because this antiquated concept has been widely adopted, stifling any possible competitors, it is also a bad idea in practice.
While Dennis made some major changes, for which we all owe him, he also did some things which, unfortunately, are still with us and which everyone secretly wants to get rid of (except, maybe, Unix purists). It takes a good budget to write a non-conventional OS with features that go beyond the 80s mindset. Sadly, such efforts are widely criticized, mostly because of this rotten culture of incompetence.
[link]
From: Mike O'Connor (Oct 14 2011, at 06:46)
"It is impossible — absolutely impossible — to overstate the debt my profession owes to Dennis Ritchie. I’ve been living in a world he helped invent for over thirty years."
The single best statement I've seen, and am likely to see, on Dennis Ritchie's passing. Perfectly put.
[link]
From: Makerofthings7 (Oct 14 2011, at 06:56)
The end of a great man's life is more significant than the end of a computer string.
Please refrain from debating that topic here, out of respect.
[link]
From: KrzysztofCiba (Oct 14 2011, at 07:20)
The real master has faded away without much noise in the media: the creator of great tools, far from the fancy, overpriced gadgets.
I still have the K&R book somewhere around, slightly covered with dust.
[link]
From: Gjorgji (Oct 14 2011, at 07:42)
DMR is a guy who really changed the world,
and the quote "everything is a file" should be "everything behaves as a file", which can be translated as "everything behaves as an object".
[link]
From: Chris Stueck (Oct 14 2011, at 08:38)
Before there was Steve Jobs, there was DMR; before there was Apple or Macintosh or iPhone, there was DMR, Unix, Linux... To those of us who date back to the age of Large Computing Dinosaurs, he was truly an intellect and an innovator. He began the journey to personal computing in the 70's. We will miss you, Dennis.
[link]
From: Bond (Oct 14 2011, at 10:03)
Thanks Tim for putting it more eloquently than I ever could.
Many years ago I sat down at a diskless Sun 3/60 with a copy of K&R, after previously spending my time meddling with PDPs and VAX 11/780 Fortran 77. I never looked back until today. What an incredible contribution.
[link]
From: Christopher P. Kile (Oct 14 2011, at 11:28)
Just looking at the wide variety of responses, and the several running debates threading through them which have lost the R.I.P. tone completely, I can't think but that this is as fine a tribute as can be. Reminds me of my first bachelor party: two of my brothers watching porn on a VCR while ten of us gathered around the best man's new Amiga 1000. It's the way of our world.
When I found out about C from Bell Labs' yellow spiral-bound C Language Summary, I saw my future. Five years later, I had my first job as a C programmer, and earned my bread-and-butter that way for ten years; it's still my favorite language for playing what-if. I programmed on MSDOS, a bit like learning to skate in the dark on thin ice, but at least the command structure was simple to learn, like the operating system from which it copied directories, piping, etc., which was of course Unix. Thus, to this day I stand on the shoulders of this giant, and mourn his passing greatly.
It's been a bad week for giants. Let's raise one in hope that a new crop is growing.
[link]
From: John Roth (Oct 14 2011, at 15:34)
Dennis was a true pioneer to whom we owe a great deal. Not every decision he made was optimal, but then that's true of every pioneer in my memory.
As far as the null terminated string thing goes, I think most commentators are missing the point. The true killer is not taking account of the bounds of the fixed length containers in which those variable length strings exist. The way the variable length is described is secondary.
History has shown that, like manual memory management, it's a very bad idea to require even the best and most meticulous programmers to handle that detail manually. It's an invitation to disaster, and disaster is quite willing to accept the invitation.
[link]
From: Mario Enríquez (Oct 14 2011, at 15:58)
A great person. Dennis Ritchie contributed from the very beginnings of the knowledge society, and I am grateful for his great contribution to the foundations of programming and to the administration of systems such as Unix.
A great loss for the world; yet, small as we are in this universe, he has left his mark on the world with his contributions.
My condolences to the family.
Regards: Mario Enríquez.
[link]
From: Venky (Oct 15 2011, at 00:31)
This has been a sad two weeks. First Steve Jobs and now Dennis Ritchie. Both their contributions to technology and mankind in general are equally historic and revolutionary!
I can only hope and pray to GOD that their families and loved ones have the strength to bear this big loss. However, there is one more thing I pray for: that in this world where we take technology for granted, the next generation may be blessed with people like Steve Jobs and Dennis Ritchie, able to contribute to technology and mankind as they did!
[link]
From: Danish Farman (Oct 15 2011, at 06:15)
Truly a genius!
It was his book that made me the programmer I am today!
I still wonder if anyone will ever achieve such great skill in programming!
[link]
From: Idan (Oct 15 2011, at 13:08)
I read K&R (1st release) as a teenager. It was eye-opening and inspiring. Truly a foundation for our craft.
[link]
From: Mrinmay Bhattacharjee (Oct 16 2011, at 05:08)
Rest in Peace Sir. Dennis Ritchie
[link]
From: sean (Oct 18 2011, at 08:53)
We miss you Dr. Ritchie. Your contributions to computer science are well appreciated. You are our hero.
What I won't miss (referring to posts here), and have to deal with every day, are immature geeks who can't stop being jackasses for a second.
[link]
From: Andrew (Oct 18 2011, at 16:54)
AFAICT, they were invented by Doug McIlroy, not Dennis - the following is from an interview with Rob Pike:
(And speaking of Doug, he's the unsung hero of Unix. He was manager of the group that produced it and a huge creative force in the group, but he's almost unknown in the Unix community. He invented a couple of things you might have heard of: pipes and - get this - macros. Well, someone had to do it and that someone was Doug. As Ken once said when we were talking one day in the Unix room, "There's no one smarter than Doug.")
[link]
From: sandro (Oct 19 2011, at 22:05)
Forty years in computer technology are comparable to ages in any other engineering discipline. In the early seventies we had kilobytes; terabytes were simply inconceivable.
Nowadays terabytes are a common concept even in consumer electronics, but the inventions of Mr. Ritchie and the team of scientists and engineers with whom he worked forty years ago are still here, at the basis of almost every development performed since then.
So if forty years are comparable to ages in computer technology, what a legacy Mr. Ritchie has left to the ages to come.
[link]