You may not know Dennis Ritchie, but you are using his work right now. He was instrumental in developing Unix (which ultimately led to Linux, which you are using right now because this web site is delivered via a Linux server), but he's probably better known for his work developing the C language.
Every time an old computer language is mentioned for some historical reason, we learn that even though we thought the language was dead, or obscure, or irrelevant, it turns out to be used in some wide range of functions and life as we know it would be impossible without it. I think with C it is the opposite. If you know anything about anything, you know that C or some variant of C is basic to everything. C is arguably one of the hardest languages to program in, but it is also the language that most stuff that needs to be efficient gets programmed in. It will probably turn out that it is not used anywhere for anything. (But I doubt that.)
(Code Trolls will tell you in the comments below the many ways in which what I just said is unconscionably wrong. I’m sure they’re right. I’m sure they’ll be polite, though.)
If you learned C you probably read K&R. R is Ritchie.
Sorry to hear this. I have a copy of K&R on my shelf;
Programming in my younger days made that book quite dog-eared. The mantra then: Java is for pretty. C is for fast;
Oh my. You just put a pall over my whole day;
I have K&R on my bookshelf. I used it to teach myself C about 20 years ago;
Up until March, I still used it every day. C on AIX Unix on an IBM machine;
Forced to read K&R, I cursed him, but he was a genius and, like so many, will leave us almost unnoticed;
Windows core components are written in C.
The Linux kernel is C, and thus all the Android stuff is C at its core, as are the Amazon and Google and Facebook systems.
OS X is based on Unix and BSD, so it has a C core; all the Apple stuff is built on C or on a direct derivative, Objective-C.
Many embedded systems use C.
Many other computer languages' compilers and virtual machines are written in C or C++, a direct derivative of C.
The Unix and C philosophy of “good enough” has ruled the systems landscape for decades.
Ritchie was one of the people who directed the flow of systems and computers to be the way they are today. One of many, but I do think the world would be quite different if K&R and C and Unix hadn't been developed. Open languages with several different compilers, open linkage interfaces – I think these might not have been the norm without C and Unix.
I celebrate that we had such wonderful smart people to help build our world.
Everyone always says programming in C is so hard. I never found that. It was the second language I ever learned and the first one that really made sense. Maybe I’m just lucky.
Why are some of you people leaving off your semi-colons?;
Semicolons are for the wishy-washy. Either use a comma or a period and leave that half-way garbage for 8th grade English.
😛
printf("Sorry Dennis, just kidding.");
timberwoof.sad = TRUE;
C is hard? It’s a matter of degree. It’s harder than Basic, but once you “get it” it makes sense. IBM 360 assembler is hard. PL/I is hard. COBOL is tedious and writing anything worthwhile (to me) in it would be hard. Fortran … writing good Fortran is hard, but it’s easy to write Fortran in any language.
C’s syntax has found its way into many other languages: PHP, Java, LSL, even Perl!
I used BCPL at uni (sometimes expanded as Before C Programming Language) to write a small DOS and a compiler for a mini that had been acquired by our comp. sci. lab.
Several years later I got a job writing some system interface stuff for a PDP-11. I was going to write it in assembler, but a friend lent me K&R. I was amazed (after working with COBOL and RPGII) at the language’s compactness and ease of use…and its similarity to BCPL. In the beginning, the only trouble I had was with the stronger data typing of “C”.
Anyway, a big thank you to Dennis Ritchie for “C” (Kernighan gives Ritchie the design credit).
main()
{
    char *ptr;        /* never initialized */
    *(ptr--) = 0;     /* write through a wild pointer, then walk it backwards */
}
I long for the days when this could do some real damage. C taught me to take care in my development.
C is hard if your definition of hard is, “A casual coder shouldn’t expect to just bang something out and expect it to compile on the first try.” But it does make intuitive logical sense, and it forces you to be careful about how you code.
I learned C coding as a grad student, from The Gospel According to Saints Kernighan and Ritchie (2nd edition) and one other book that was specifically aimed at n00bs. Subsequently, I picked up IDL as a tool for turning my data into figures. At times, I felt frustration when something that C would have flagged at the compiler stage would cause my IDL program to crash and burn after 20 minutes or more of data crunching (often, a speling eror in a variable name). Other times, I found that I could get away with sloppy thinking in my IDL programming (which is not a good thing).
One minor detail that I encountered: C passes variables by value, while most other languages (at least the ones a science grad student might encounter in the 1990s) pass variables by reference. That one tripped me up at least once when I forgot IDL would be modifying my original variable rather than a copy of it as C would.
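For anyone who hasn't hit that one: here is a minimal sketch (my example, not the commenter's) of C's pass-by-value, plus the pointer idiom you use when you actually want the callee to modify the original:

#include <stdio.h>

/* Receives a copy of the caller's int; changes here are invisible outside. */
void by_value(int x)
{
    x = 99;
}

/* Receives the address of the caller's int; writing through the pointer
   changes the original. */
void by_pointer(int *x)
{
    *x = 99;
}

int main(void)
{
    int a = 1;
    by_value(a);
    printf("%d\n", a);   /* prints 1: only the copy was modified */
    by_pointer(&a);
    printf("%d\n", a);   /* prints 99: the original was modified */
    return 0;
}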
main(){printf("goodbye, world"); return 0;}
Holy cow, the troll is in the blog itself and not the comments!-)
I make my living programming in C and Linux. Trust me when I say that C is everywhere, including your toaster and/or microwave oven. (A micro-controller? Programmed in C.)
Everywhere.
Think of it as object oriented blogging: Data and instructions in the same entity.
Kernighan and Ritchie were great – but I wouldn’t hire anyone who still uses the K&R book except in a history class. Bell Labs has an article on the history of C: http://cm.bell-labs.com/who/dmr/chist.html
@Markk: If WinDuhs is C, why all the FAR PASCAL declarations? It may have been built with a C compiler for the past 25 years, but PASCAL was the flavor of the month way back then and it still shows. The same goes for the Apple monitor program and Apple’s SystemX.
@Greg #15: Don’t you mean von Neumann blogging? (code and data living together)
I never understood the “C is hard” thing – every time I hear a self-proclaimed programmer say something like that, or “assembly is hard” or “writing device drivers is hard” or “multithreading/multiprocessing is hard”, my immediate response is “well, you’re no goddamned good then, are you?”
timberwoof: actually, C syntax is a reduced variant of ALGOL, the ancestor of that imperative style. The form has changed very little, although the specific characters used for the syntactical tokens may change a bit (e.g. ALGOL delimits blocks with begin … end, while C uses { … }).
C continues to be a mainstay at the OS level, and not just in the big 2 1/2 (Darwin derives from BSD). Note that C was designed to be an easier alternative to ASM specifically for writing portable operating systems. Only the really small (8-bit) embedded machines these days still have to drop down to ASM or FORTH for their OSes.
C abstracts out the hardware while leaving the algorithmic and program structure details to the programmer. Higher-level languages typically try to abstract above that level, in support of certain algorithmic or syntactical styles, of “safer” programming, or both (show me a language that guarantees you cannot shoot yourself in the foot, and I’ll show you one that makes it all but impossible to do anything of serious magnitude – or I’ll show you how to shoot your foot in it).
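To make that concrete, here is a minimal sketch (mine, not the comment author's) of the kind of hardware-level poking C permits and “safer” languages forbid: reading the raw bytes of an integer through a char pointer, which incidentally reveals the machine’s byte order.

#include <stdio.h>

int main(void)
{
    unsigned int n = 0x12345678;
    unsigned char *p = (unsigned char *)&n;   /* char may alias any object */

    /* Print each byte of n as it actually sits in memory; the order
       you see tells you the machine's endianness. */
    for (size_t i = 0; i < sizeof n; i++)
        printf("%02x ", p[i]);
    printf("\n");
    return 0;
}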
MadScientist: FAR PASCAL was not because Pascal was used; it names the calling convention. The PASCAL convention pushes arguments left to right and has the called function clean up its own stack, which yields (slightly) more efficient object code; the C convention, with the caller cleaning up, is the one that allows variable argument lists. The FAR part refers to the segmented addressing architecture of the 16-bit x86 processors.
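For the curious, a small sketch of how the two conventions looked in declarations of that era. The function names are mine, and the old FAR/PASCAL macros are stubbed out so the fragment compiles on a modern compiler:

/* The old 16-bit Windows headers defined these; stub them out so this
   sketch still compiles today. */
#define FAR
#define PASCAL

/* C convention (the default): arguments pushed right to left, caller
   pops them. Only the caller knows how many it pushed, which is what
   makes variadic functions like printf possible. */
int log_message(const char *fmt, ...);

/* PASCAL convention: arguments pushed left to right, callee pops them
   with a single "ret n" – slightly smaller, faster object code, but the
   argument count must be fixed. FAR meant a full segment:offset pointer
   on 16-bit x86. */
long FAR PASCAL window_proc(unsigned msg, unsigned wparam, long lparam);

int main(void) { return 0; }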
Sad news indeed; I have just been informed that Dennis Ritchie, developer of the C language, Multics and our beloved UNIX operating system, has passed away (September 9, 1941 – October 8, 2011).
I have spent 20+ years working as a UNIX Systems Engineer / Admin, thanks to the groundbreaking and innovative development Dennis and the rest of the Bell Labs gang laid out in the ’70s. RIP Dennis and thank you!
I think the most significant thing about C is that it was a great leap forward in cross-platform compatibility.
http://herbsutter.com/2011/10/12/dennis-ritchie/
@GrayGaffer: Is that really why there were so many parts of MSWin declared to pass parameters via the pascal convention?
@Rich#21: Code portability is something the AT&T guys worked on in the later years of C, but Richard Miller and others in Australia deserve the credit for really pushing portability and cross-compilation: http://www.uow.edu.au/content/groups/public/@web/@inf/@scsse/documents/doc/uow103747.pdf
R.I.P Dennis Ritchie..!! We will Miss you..!!
See this:
http://scienceblogs.com/gregladen/2011/10/i_got_this_note_from_john_ritc.php