Why is C taking over, and what can be done about it?

My company, a FORTRAN shop since its inception 14 years ago that employs 200 very good programmers, is now switching to C for one reason only: it is becoming impossible to find good FORTRAN programmers. We already use C in a few special areas where it is appropriate, but for 90% of the work we do there is no computer-science-related advantage to using C over FORTRAN, and there are some disadvantages (not to mention all the work required to make our old code C-compatible).



-- Joe Shipman, January 10, 1997

Answers

I wrote something about this a few years ago when I was trying to figure out what I'd learned from building a huge Common Lisp system; check out my lessons-learned page. Richard Gabriel, one of the old-time great hackers, surveyed the late-1980s collapse of the Stanford/MIT "build the right thing" philosophy in his long "Worse Is Better" paper about why quick hacks so often snowball into popular systems. I have some new thoughts on this subject.

Thought 1: Companies don't actually use C for anything important. If it matters, it is programmed in SQL and the hard stuff is left to the relational database management system. Corporations have been burned so many times by buggy C/Fortran/COBOL programs that they aren't willing to take risks. They pay Oracle a huge ransom and suffer what is sometimes a factor of 1000 in reduced performance just so that they are sure never to lose a transaction. The company might write a little C code for an application to run on a clerk's desktop; if it crashes, they'll reboot the machine. No big deal.

Thought 2: Unix became popular because it was the simplest operating system to port to a new computer. The source code was not written in assembler, and the operating system didn't try to do anything ambitious, so it didn't need any special memory-management hardware. Once people were forced to learn C to program their Unix box, it became just as easy for them to use C for other things. Having to use multiple programming languages is the worst of all situations, because you never get good at using any one of them.

Thought 3 (related): it used to strike me as odd that the range of technology available in automobiles, a relatively mature industry, is much narrower than that available in programming environments. A Lexus may be more advanced than a Hyundai, but they are both way ahead of where cars were in the 1970s. Yet in programming we have people using really nice source-level debuggers in safe languages like Common Lisp (a 1980s thing) alongside people using print statements in C programs (a 1950s thing). I've decided that it is because the initial shock of changing programming environments is so large. I can step out of my Hyundai into a nice new Lexus and say, "this is a way better car; I can immediately drive faster." But if I step out of my comfortable 1950s-esque C world into a fancy Smalltalk or Java programming environment, I can't even make "Hello World" run for a day or two, and it might take me a month before I'm more productive in the new environment. Who has a month to spare?

Anyway, Joe, I think your company is probably making the wrong decision. C may still be the right language for developing Adobe Photoshop: Adobe needs the ultimate in speed, and they probably have at least 200 good people just to test that one program. However, most companies have a broader range of programs to write, fewer programmers, and an even higher need for reliability than Adobe (after all, if Photoshop crashes, the user can just restart it under WinNT). Perhaps the tools for Java aren't fully mature, but I do think it might be a smart idea to build up future libraries of software in Java instead of C. Life is just too short for programmers to be manually allocating memory and manipulating pointers. Your company probably has enough money to pay programmers to hunt down bugs in their ad hoc memory-allocation schemes, but your competitor may already have figured out a way to deliver the same service to a customer using the Netscape Enterprise Server and a small Java applet.



-- Philip Greenspun, January 18, 1997

Have you heard about f2c, the FORTRAN-to-C converter? It's pretty good, actually; if all you want is for your code to compile and run, it's really good. As for producing C code that is maintainable by humans, it's pretty lousy, because it produces amazing spaghetti code. But it would make all your existing code C-compatible, and then you could develop all your new code in C and easily integrate it with the old. I think I'd have to argue with you about "no computer-science-related advantage to using C over FORTRAN." It all really depends on your application, and I know that both languages have their disadvantages. My gut feeling is that C is a little easier to maintain and debug than FORTRAN, especially when you consider the tools that are available for C and not for FORTRAN. Well, I could go on here, but I'll restrain myself...



-- Steve Lacy, January 13, 1997

Vogue, as of the mid-80s. I must admit that I cringed at the way C initially promoted clever-clever techno-gibberish coding, stroking the egos of those who thought it superior to baffle others with impenetrable programs, all demanding that the reader remember tracts of arcane variable and function names. It has been said that the greatest benefit of C is job security.

When I first met C I was using Pascal, was focusing on transparency, and was writing stuff like

  KeyboardInterruptsOn;
  EyeColour(Jane) := blue;

whilst C coders seemed (then) to produce

  &rgbff +15 = (&rgbff +15) & 0x1C4; // to diddle a device register for the interrupts-off bit pattern
  ca[psn[getc[k]]]

(my C is too rusty... what I want is to read the keyboard, map the key to a person array and then set a char in that array to a second keystroke, all on one line, please). Well, THEN, it was a matter of pride to me that a secretary could pick up one of my programs and understand it. My programs focused on what was going on, not on how things were to be done; that sort of detail was handled by the compiler (all very neat, but this didn't make for job security). With later, tighter compilers and more of an eye to style, C has turned out... not so bad after all. I do wonder, though, about a language that needs to offer 8 ways of adding 1 to a number...

Anyhow, vogue remains an issue. The company I worked for at that time was very vogue-driven. For example: the upgrade of an I/O controller. This used a Z80 and assembler to handle networking, disk and serial-comms I/O. The Z80 was good at comms and it worked fine. The replacement was the long-awaited and hyped 68000, with the same functionality implemented in C. What a speedup! The Z80 took a quarter of an hour to low-level format a big hard disk; the 68000 + C took... an hour and a half for the same job. But the customers wanted the latest CPU, so we sold the 68k card as standard. Change for change's sake!!

What can be done? Supplant the vogue cachet with... something more in vogue! Thus, net languages will rise and displace C. ?;^)

SteveB



-- Steve Broderick, January 16, 1997

As Phil says, C (or C++) is probably the right language for certain applications -- mainly commercial applications that require efficient, low-level languages. I would disagree, in a sense, that "companies don't use it for anything important." It's true that end-user application writers usually rely on RDBMSs for the important stuff, but RDBMSs are written in C too -- just by different people.



A lot of people complain about C syntax, but I just have to say "get over it." Put some standards in place to ensure readability and enforce them via code reviews. If your management doesn't have that much will, then you'll be out of business soon enough anyway.



I wonder why your company feels like it needs to find "good FORTRAN programmers." Why not find good programmers and train them in FORTRAN? Actually, I know the answer to this: few good programmers want to learn FORTRAN, because it's a career-limiting move. Why is it career-limiting? Because most other companies are out there doing the same thing yours is -- hiring only people that are pretrained in the particular technologies they use, and not many of them use FORTRAN.



Most companies would rather hire a complete hack who's pretrained than make an investment in a talented programmer who has been using slightly different technology. All the time I see employment ads that require N years of experience in seven or eight disparate technologies -- particular languages, environments, databases, etc. I remember seeing an ad at least a year ago for candidates with at least two years of experience in Java! This kind of thinking insults the professionalism of all software developers and is short-sighted to the point of being Dilbertesque. Yet 90% of the companies out there are doing it.



The opinions expressed are those of the author, and not necessarily those of Microsoft Corp.



-- Mark C., January 31, 1997

Actually, Mark, we have always had extensive training classes in FORTRAN and our company's systems; a typical new hire fresh out of college gets paid a full salary for three months while participating in a training class. A few quit or are punted, but most "graduate" as good programmers. A minority of our new hires (like myself four years ago) already have enough relevant experience to skip the training class. As you suspected, our problem now is that new programmers don't WANT to learn FORTRAN when they already know C. We will solve this problem with a lot of internal retraining and systems work over the next year or so; the process will be painful but finite.

I completely agree with you that a good programmer can learn any language and system, and that requiring n years of experience with specific systems is stupidly shortsighted. My company has always known this and been willing to train, but the other side of that coin is that programmers themselves need to be willing to learn something different, and it is a sad state of affairs that they no longer do. The attitude of management that you decried as Dilbertesque (only hire someone with exactly the right experience so he can start with zero training) is only half the problem; even enlightened management must cope with programmers who are similarly shortsighted. Learning new languages, platforms, systems, etc. can only help a programmer in the long run, but somehow the new graduates have acquired the attitude that they only need to know what they already know and will somehow damage their careers by learning other stuff.



-- Joe Shipman, February 1, 1997

Two comments:

(1) "Some want to leave us and go to another planet And blow up the earth They'd still be murderous, lying and they'd want it all For whatever it's worth." ("Animals on Wheels" by Sam Phillips, from her album _Omnipop_)

Yes, C is a non-ideal language. So what? No language is ideal; if you believe otherwise, I say you have a very limited definition of "ideal", or you haven't asked the question "ideal for what purpose?". I find it somewhat entertaining that everyone is developing these theories as to why C is taking over. I'm sure it's really just because it's convenient. Are -you- going to write real-time MPEG decompression software in Java? LISP? Anything but a language as low-level as C? Yes, C is a bit irritating, and I certainly don't use it without good reason, but neither am I a C hater. It does what it does, and for what it does, it works well enough.

(2) In a previous response, Steve Broderick (steveb@tds.bt.co.uk) said, "I do wonder though about a language that needs to offer 8 ways of adding 1 to a number...", to which I might respond: I am curious, however, about a language that feels compelled to offer almost ten ways of adding one to something. Nevertheless, any language that has to offer eight ways to increment a number makes me a little suspicious. The fact that there are no less than 8 ways of increasing the value of an integer by one in a single language concerns me a bit. I find it a tad strange, nay worrying, that this language seemingly could not be made with fewer than 8 ways of advancing the value of a number.

The point being, of course, that I'm not convinced it's a bad thing for there to be more than one way of expressing something in a single language. Yes, it does add to what you have to learn to be able to use and read the language, but if done properly it doesn't add that much, and (more to the point) it can add a great deal to the expressiveness of the language and even to the ability to write clear code, because with this flexibility the programmer is free to explore alternate ways of phrasing things that might be clearer. Furthermore, I think you'll find that the human brain is perfectly happy with a non-minimalistic grammar -- witness how cumbersome and even comical Newspeak is in George Orwell's _1984_.

Boy, it sure is fun to wax philosophical about obscure topics such as this, especially knowing that someone may very well take me seriously or actually care.



-- Logan Shaw, March 10, 1997

One more comment: AAAAAAAIEEE!! Something formatted my (quoted) verse into prose without telling me. No fair. (I am beginning to have less and less faith in the ideal of leaving the low-level details to the software.)



-- Logan Shaw, March 10, 1997

I am not a "C hater"; I will work with it when the application makes it appropriate. The problem I'm complaining about is that programmers are not willing to learn a language such as Fortran when they already know C, even in an environment where Fortran is as efficient as C and more straightforward to program in (because of all the tools that have been developed for and in Fortran in this environment). -- JS



-- Joe Shipman, March 11, 1997

I think if a person won't learn something new to ensure that a job is well done, that person should not be hired. I learned FORTRAN because I come from a background in theoretical physics and I had to run Monte Carlo simulations on an IBM vector supercomputer. If I'd insisted on C, these simulations would probably still be running, 8 years later. I learned C++ because good toolkits nowadays are in C++, and anyway it was fun learning it. If a person won't learn FORTRAN, or C, or C++, or Java, or Smalltalk, because he or she already knows COBOL, or Scheme, or SNOBOL, or Tcl (sorry, Philip) then what makes anyone think that person would learn, say, Metropolis algorithms if that's what it takes to solve a problem? A lack of desire to learn something new should not be rewarded by changing the job description.



-- Obi Thomas, September 25, 1997

I think the push to C comes from lazy programmers, ignorant management and fear, and I know I suffer from the last of these. When I talk to potential employers, the questions are usually "what language/API/whatever are you familiar with", and usually the subject they're asking about is the flavor of the month (right now, "MFC and Visual anything"). Thus, when I'm career planning, I'm going to be very reluctant to look at a job where I'll walk out saying "FORTRAN on a CRAY XMP-48", because when your average employer is thumbing through resumes, mine's one of the first that'll hit the circular file. It might even get passed around the office with a "Look at the dinosaurs this guy works on" Post-It, but I'm not likely to get a call back. They're looking for buzzwords. On the other hand, if I pepper a resume with "MMX 3D games developer, C++ with Windows 95", even the folks using MIPS-based SGI machines sit up and take notice. This is a horribly counterproductive management attitude, because it screams for employee turnover ("MMX programming? We've never done that, better hire an outside expert"), but it is a fact of life.

But the reality is that it doesn't take me, or many programmers, very long to become productive in nearly any language or API. So I'm willing to take that risk, but you're going to get better work from me if you structure things well. And computers are getting powerful enough that even on large-market applications we can start using special-purpose languages: witness Delphi, the current availability of SQL servers, and the way the web has led to a lot of systems using Perl, Tcl, or various other special-purpose scripting languages. So if you're going to work in an off-brand language:

Have a language lawyer -- a person in the company whose sole job is to respond to questions about the APIs and the languages you use. This person's job is to walk around and talk with programmers, ferret out problems they've got, help them optimize, keep current on what language constructs lead to what sort of output, and know the ins and outs of any APIs you're using. This is a hard place to be politically, because the job isn't billed to a specific cost center and it's hard to point to it and say "here's what this person is producing", but it is necessary. The person filling the job has to be very proactive, not reactive, and it's not for everyone.

Make sure that the reasons for using the non-mainstream stuff are valid. All languages suck; some suck less than others. I've heard a lot of C++ bashing, but if you're a C house, the advantages C++ offers (for one thing, just the better "lint" capabilities) should be jumped on immediately, as long as your staff is mature enough to know that there are hidden data structures and function calls that can bite hard. Also, realize that C optimizers on microcomputers have gotten extremely good, and that FORTRAN optimizers probably won't keep up.

Make sure that your fears about the language of the month are well founded. I've heard a bunch of people complaining about C++ versus C, but when pressed they couldn't come up with anything but "It's slower" (not if you're smart about using virtual functions) or "Overloaded operators can get you in trouble" (yeah, and so can casts). You can write BASIC (or COBOL, or whatever) in any language; some languages encourage certain paradigms, but in the end it's programmer discipline that makes a language useful.

And if you hear upper management saying "I've heard good things about Java, I think we should transfer this 500,000-line C system to...", run, don't walk, to a safe distance and send your resignation letter via the postal service... unless you still have fond dreams of UCSD Pascal...



-- Dan Lyke, January 7, 1998

I agree that Joe's company is choosing the wrong language, but what can be done? Perhaps a lesson can be learned from the rebirth of Cobol. Now you can find Cobol for PCs, Visual Cobol, web-enabled Cobol and so on. How did the B-movies do it? Blob, Return of the Blob, Son of Blob, Beware the Blob, and so on. What Fortran needs is a good PR campaign, and the release of a slew of products: PC Fortran, Fortran++, Visual Fortran, Javatran, Fortran Script, Web Fortran, Fortran/2000, MyFortran...



-- Kevin Kelleher, May 5, 1998

Obviously, it must be because no one has written a "Fortran for Dummies" book...



-- Carl Layton, May 6, 1998

FORTRAN... (checking my 'Lighthouses of the East Coast' calendar) Isn't it 1998 or so? I used WATFIV Fortran at the engineering school at Rutgers U in the (uggh...) middle 70's. Hmmm... I learned from that experience that I shouldn't be a programmer, and I'm not today (I'm a systems administrator for a bunch of NetWare 3.2 servers, which will be upgraded in Q1 '99 to 4.11). I can understand why the new kids being hired don't wanna learn FORTRAN. I'll bet the owner of the company knows FORTRAN. ;-)



-- Richard Argentieri, August 30, 1998

If you're writing numerical code, Fortran (77, or 90, or HPF) is and always will be the only logical choice. I didn't understand this until I worked at a supercomputing center. The optimizing and parallelizing compilers available for Fortran are not available for C/C++, and any programmer who says otherwise is probably talking out of his ass. IBM and Cray didn't spend 30 years writing C tools. I suspect you (Joe Shipman) already know this full well; I just wanted to point it out to others reading the thread.

But for everything else, people need a language that is geared towards data structures (incidentally, Java and Smalltalk are superbly well suited for such work) or frameworks (SQL comes to mind, though it is a declarative VHLL; n.b. it's great for testing theorem/dependency provers) -- especially for business applications, which after all are where the money is. People who are interested in Grand Challenge problems will end up writing Fortran, but why on earth would anyone else do so? Especially since there's Matlab for the other 99% of the world. Distressing, perhaps, but it points to why your company may need to switch to the Dark Side to survive.

What the future should look like (for database/business-app programmers) was developed 10+ years ago at IBM Almaden, among other places. Lots of data-visualization people had the same (at a certain level of abstraction) problems over and over again; encapsulating these operations as visual building blocks allowed dynamic modifications to the program's workings, and visual dataflow debugging. The minor annoyance here is that programs like Data Explorer and AVS only run on high-end workstations (or ideally on low-end SMP supercomputers) and cost $10k or so per seat. D'oh. Java Studio is a poor knock-off of this idea which (hopefully) will get much better. C and C++ are/were (I hope) progressions along the path to this sort of thing; visual tools for RDBMS apps certainly are.

Somebody still has to run the numerical codes to figure out the Grand Challenge problems, though, and that's why I finally started to learn Fortran 77. There are actually some very nice C++ features for high-level and parallel code (simultaneously), like the HPC++ group's reimplementation of the standard template library, and MPI-2 has the capability to accommodate C++-style dynamic bindings. But in the end I suspect that f77/HPF with a dash of C++ will rule the world of supercomputing, while everyone else eventually migrates to encapsulated, safe languages in visual, type-checking environments. It doesn't make sense to use C++ where VBasic will do, or to roll your own RDBMS when you can use Postgres or Oracle.

So, what about the people to write good Fortran code? Hell, maybe the only purpose for supercomputers will be to house monster databases in the near future. There's no money in research, so why learn Fortran? (Or: "...don't you have something better to do with an afternoon?") On the other hand, it's kind of satisfying to watch the same program your neighbor wrote in C++ run 10 times faster just because you coded it in Fortran 77 and ran it through a good optimizing compiler. ;-)



-- Tim Triche, Jr., September 9, 1998

C will not take over; ergo, there is nothing more that needs to be done. Look at the Windows world: unless you are writing a system of some sort (like a compiler or web server), you will probably use Visual Basic for standard "IT kind of" programming. Now, I have no love for VB, but it makes a lot more sense to use for higher-level applications than C does. Similarly, look at the growing popularity of Java -- and Tcl, for that matter. While Tcl is a dreadful language, I would much rather write AOLserver Tcl scripts than CGI programs in C. The unfortunate thing with C and C++ is that you have to know so much before you can do anything useful with the language. In other words, the care and feeding of the language (memory allocation, pointer manipulation, avoiding leaks, etc.) becomes much more of a headache than the solution the program was intended to provide. Look at any well-implemented C++ class: about nine-tenths of the code is directly or indirectly related to pointer fiddling and memory management. It would have been nice if Scheme or Lisp were more mainstream. In their absence, I would gladly use Perl, Tcl or VB for my programs unless I was writing an operating system, a compiler, or something similarly complex.



-- Jagadeesh Venugopal, August 26, 1999

I always thought it was either machismo or ignorance. People often cite portability, optimizability, speed, and size -- all of which I think are misplaced priorities when it comes to vertical-app or abstracted-level programming. C is mainstream, but 4GLs are probably better for most work that does not involve communication with hardware. What can one do? That's a good question.



-- Zhiwen Chong, April 14, 2000

I'm just worried C will be displaced by something worse

"The determined Real Programmer can write FORTRAN programs in any language." -- Ed Post, Datamation, 1983. There used to be a copy of this article ("Real Programmers Don't Use Pascal") on the corkboard at any big-iron programming shop.

I always assumed I was a dinosaur because I still program in Fortran (F77, more or less). I'm a couple of years younger than Philip Greenspun, which is awfully young to be a dinosaur. I figured it would be a disadvantage if I tried to bail out of academia. I wonder if Joe's company has tried to get Fortran programmers out of physics departments (though maybe they aren't all good programmers).

If you want to do hard-core calculations, Fortran still beats C, because in C arrays are second-class citizens, not to mention the speed advantage. However, anything involving complex data structures is a pain in Fortran -- you essentially have to write your own package. C is much more flexible. Unfortunately, it is so flexible that it is easy to create code that is very difficult to read, let alone maintain. (While in Fortran, code is difficult to read because of the inflexibility of the language.) I think C has come to dominate because of the Unix influence, because you can program interesting data structures in it (which provides homework assignments for CS classes), and because it is a strongly typed language which is not so restrictive as to be utterly useless -- Pascal, I'm looking at you. Remember when CS departments used Pascal to teach us kiddies programming? No? Thank your lucky stars.

Can anything be done about it? Maybe not. After all, I'm using Fortran partly for performance reasons, but also because in grad school I inherited 10,000+ lines of legacy code (originally written for a CDC Cyber, yikes!). Just think of how many lines of C legacy code there are now.

The other problem is, what if C is replaced by something even worse? Suppose ten years from now you can't find a C programmer because the industry standard is some sort of pointy-clicky Visual Studio-like brain-damage-ware. Philip's comments about why companies might take a 1000x performance hit and do everything in Oracle suggest to me that there are not enough good, clean C programmers around. Your average system spends most of its cycles doing one or two things. In the bad old days, you got the local guru to write that bit in assembler and then called it from a high-level program (C, Fortran, Lisp, BLISS-10, whatever). Because it was a little bit, it was easy to do it right in assembler the first time. These days, you could use C instead of assembler without suffering pain, and SQL takes the place of the high-level language. Unless I completely misunderstand how these systems are put together, which is possible.



-- Ben W., June 20, 2000

A good start is Eiffel. Read "Object Oriented Software Construction" available at this site.



-- Neal Lester, July 17, 2000

What is there to say against the use of Visual Basic, or any derivative, on any system?