Why C Is Not My Favourite Programming Language

By James A C Joyce in Technology
Mon Feb 09, 2004 at 03:24:52 AM EST

Brian Kernighan, the documenter of the C programming language, wrote a rant entitled Why Pascal is Not My Favourite Programming Language. I can picture him thinking to himself smugly as he facetiously strikes at Pascal, describing a few of its small flaws over and over again.

Unfortunately, time has not been kind to Kernighan's tract. Pascal has matured and grown in leaps and bounds, becoming a premier commercial language. Meanwhile, C has continued to stagnate for over thirty years with few fundamental improvements made. It's time to redress the balance; here's why C is now owned by Pascal.


No string type

C has no string type. Huh? Most sane programming languages have a string type which allows one to just say "this is a string" and let the compiler take care of the rest. Not so with C. It's so stubborn and dumb that it only has three types of variable; everything is either a number, a bigger number, a pointer or a combination of those three. Thus, we don't have proper strings but "arrays of small integers". "char" is basically only a really small number. And now we have to start using wide integer types just to represent multibyte characters.

What. A. Crock. An ugly hack.

Functions for insignificant operations

Copying one string from another requires including <string.h> in your source code, and there are two functions for copying a string. One could even conceivably copy strings using other functions (if one wanted to, though I can't imagine why). Why does any normal language need two functions just for copying a string? Why can't we just use the assignment operator ('=') like for the other types? Oh, I forgot. There's no such thing as strings in C; just a big continuous stick of memory. Great! Better still, there's no syntax for:

  • string concatenation
  • string comparison
  • substrings

Ditto for converting numbers to strings, or vice versa. You have to use something like atol(), or strtod(), or a variant on printf(). Three families of functions for variable type conversion. Hello? Flexible casting? Hello?
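
To make that concrete, here's a minimal sketch (plain standard C, nothing exotic) of what "assignment", concatenation, comparison and conversion actually cost you:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char greeting[32];   /* room for the result, guessed in advance */
    long n;
    double d;

    strcpy(greeting, "Hello");                  /* "assignment" */
    strcat(greeting, ", world");                /* "concatenation" */

    if (strcmp(greeting, "Hello, world") == 0)  /* "comparison" */
        puts(greeting);

    n = atol("42");                /* string to long: one family... */
    d = strtod("3.14", NULL);      /* ...string to double: another */
    printf("%ld %g\n", n, d);
    return 0;
}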

And don't even get me started on the lack of exponentiation operators.

No string type: the redux

Because there's no real string type, we have two options: arrays or pointers. Array sizes can only be constants. This means we run the risk of buffer overflow since we have to try (in vain) to guess in advance how many characters we need. Pathetic. The only alternative is to use malloc(), which is just filled with pitfalls. The whole concept of pointers is an accident waiting to happen. You can't free the same pointer twice. You have to always check the return value of malloc() and you mustn't cast it. There's no built-in way of telling if a spot of memory is in use, or if a pointer's been freed, and so on and so forth. Having to resort to low-level memory operations just to be able to store a line of text is asking for...
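
Here's a hedged sketch of both options and their traps; the variable names are mine:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char fixed[64];        /* guess the size; guess low and you overflow */
    char *heap;

    heap = malloc(64);     /* no cast... */
    if (heap == NULL) {    /* ...and the return value must be checked */
        fprintf(stderr, "out of memory\n");
        return 1;
    }

    strcpy(heap, "a line of text");
    puts(heap);

    free(heap);
    /* free(heap);  <- freeing twice is undefined behaviour, and nothing
       in the language tells you 'heap' is now dangling */

    (void)fixed;           /* unused; here only to show the array option */
    return 0;
}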

The encouragement of buffer overflows

Buffer overflows abound in virtually any substantial piece of C code. This is caused by programmers accidentally putting too much data in one space or leaving a pointer pointing somewhere because a returning function ballsed up somewhere along the line. C includes no way of telling when the end of an array or allocated block of memory is overrun. The only way of telling is to run, test, and wait for a segfault. Or a spectacular crash. Or a slow, steady leakage of memory from a program, agonisingly 'bleeding' it to death.

Functions which encourage buffer overflows

  • gets()
  • strcat()
  • strcpy()
  • sprintf()
  • vsprintf()
  • bcopy()
  • scanf()
  • fscanf()
  • sscanf()
  • getwd()
  • getopt()
  • realpath()
  • getpass()

The list goes on and on and on. Need I say more? Well, yes I do.
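
For the record, here's the canonical specimen, a sketch you should read and never compile:

#include <stdio.h>

int main(void)
{
    char buf[8];

    gets(buf);   /* reads a whole line with no idea how big buf is;
                    eight or more characters of input and it tramples
                    whatever lives next to buf on the stack */
    puts(buf);
    return 0;
}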

You see, even if you're not writing to any memory you can still read memory you're not supposed to. C can't be bothered to keep track of the ends of strings; the end of a string is indicated by a null '\0' character. All fine, right? Well, some functions in your C library, such as strlen(), will just run off the end of a 'string' if it doesn't have a null in it. What if you're using a binary string? Careless programming this may be, but we all make mistakes and so the language authors have to take some responsibility for being so intolerant.
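
A small sketch of that trap, with made-up array names:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char ok[] = { 'h', 'i', '\0' };   /* a real C string */
    char raw[] = { 'h', 'i' };        /* binary data: no '\0' anywhere */

    printf("%lu\n", (unsigned long)strlen(ok));   /* fine: prints 2 */

    /* strlen(raw) would be undefined behaviour: it marches through
       memory past the end of 'raw' hunting for a '\0' that isn't there */
    (void)raw;
    return 0;
}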

No built-in Boolean type

If you don't believe me, just watch:

$ cat > test.c
int main(void)
{
    bool b;
    return 0;
}

$ gcc -ansi -pedantic -Wall -W test.c
test.c: In function 'main':
test.c:3: 'bool' undeclared (first use in this function)

Not until the 1999 ISO C standard were we finally able to use 'bool' as a data type. But guess what? It's implemented as a macro and one actually has to include a header file to be able to use it!
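
A minimal sketch of the C99 state of affairs (compile with something like gcc -std=c99 -pedantic -Wall -W):

#include <stdbool.h>   /* without this line, 'bool' is still undeclared */

int main(void)
{
    bool b = true;     /* 'bool', 'true' and 'false' all arrive as macros */
    return b ? 0 : 1;
}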

High-level or low-level?

On the one hand, we have the fact that there is no string type, and direct memory management, implying a low-level language. On the other hand, we have a mass of library functions, a preprocessor and a plethora of other things which imply a high-level language. C tries to be both, and as a result spreads itself too thinly.

The great thing about this is that when C is lacking a genuinely useful feature, such as reasonably strong data typing, the excuse "C's a low-level language" can always be used, functioning as a perfect 'reason' for C to remain unhelpfully and fatally sparse.

The original intention for C was for it to be a portable assembly language for writing UNIX. Unfortunately, from its very inception C has had extra things packed into it which make it fail as an assembly language. Its kludgy strings are a good example. If it were at least portable these failings might be forgivable, but C is not portable.

Integer overflow without warning

Self explanatory. One minute you have a fifteen-digit number; then you try to double or triple it and boom! Its value is suddenly -234891234890892 or something similar. Stupid, stupid, stupid. How hard would it have been to give a warning or overflow error, or even reset the variable to zero?

This is widely known as bad practice. Most competent developers acknowledge that silently ignoring an error is a bad attitude to have; this is especially true for such a commonly used language as C.
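
A minimal sketch of the silent wraparound (strictly speaking the standard calls signed overflow undefined behaviour, but on most machines you simply get a wrapped value):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    long big = LONG_MAX;    /* the largest value a long can hold */

    big = big * 2;          /* no warning, no error, no trap... */
    printf("%ld\n", big);   /* ...on most machines, just a negative number */
    return 0;
}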

Portability?!

Please. There are at least four official specifications of C I could name off the top of my head and no compiler has properly implemented all of them. They conflict, and they grow and grow. The problem isn't subsiding; it's increasing each day. New compilers and libraries keep appearing, and proprietary extensions keep being developed. GNU C isn't the same as ANSI C isn't the same as K&R C isn't the same as Microsoft C isn't the same as POSIX C. C isn't portable; all kinds of machine architectures are totally different, and C can't properly adapt because it's so muttonheaded. It's trapped in The Unix Paradigm.

If it weren't for the C preprocessor, then it would be virtually impossible to get C to run on multiple families of processor hardware, or even just slightly differing operating systems. A programming language should not require a preprocessor just so that it can run on FreeBSD, Linux and Windows without failing to compile.

C is unable to adapt to new conditions for the sake of "backward compatibility", throwing away the opportunity to get rid of stupid, utterly useless and downright dangerous functions for a nonexistent goal. And yet C is growing new tentacles and unnecessary features because of idiots who think adding seven new functions to their C library will make life easier. It does not.

Even the C89 and C99 standards conflict with each other in ridiculous ways. Can you use the long long type or can't you? Is a certain constant defined by a preprocessor macro hidden deep, deep inside my C library? Is using a function in this particular way going to be undefined, or acceptable? What do you mean, getch() isn't a proper function but getchar() is?
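
A sketch of what coping with that looks like in practice; the typedef name is made up:

#include <stdio.h>

#if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
typedef long long big_int;   /* C99: perfectly legal */
#else
typedef long big_int;        /* C89: 'long long' doesn't exist */
#endif

int main(void)
{
    big_int x = 42;
    printf("%ld\n", (long)x);
    return 0;
}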

The implications of this false 'portability'

Because C pretends to be portable, even professional C programmers can be caught out by hardware and an unforgiving programming language; almost anything like comparisons, character assignments, arithmetic, or string output can blow up spectacularly for no apparent reason because of endianness or because your particular processor treats all chars as unsigned or silly, subtle, deadly traps like that.
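
One classic specimen, sketched: this loop works on machines where plain char is signed and spins forever where it is unsigned:

#include <stdio.h>

int main(void)
{
    char c;   /* the bug: this should be an int */

    while ((c = getchar()) != EOF)   /* where plain char is unsigned, c can
                                        never compare equal to EOF (-1), so
                                        this loop never terminates */
        putchar(c);
    return 0;
}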

Archaic, unexplained conventions

In addition to the aforementioned problems, C also has various idiosyncrasies (invariably unreported) which not even some teachers of C are aware of: "Don't use fflush(stdin), gets() is evil, main() must return an integer, main() can only take one of three sets of arguments, you mustn't cast the return value of malloc(), fileno() isn't an ANSI compliant function..." all these unnecessary and unmentioned quirks mean buggy code. Death by a thousand cuts. Ironic when you consider that Kernighan thinks of Pascal in the same way when C has just as many little gotchas that bleed you to death gradually and painfully.

Blaming The Programmer

Due to the fact that C is pretty difficult to learn and even harder to actually use without breaking something in a subtle yet horrific way, it's assumed that anything which goes wrong is the programmer's fault. If your program segfaults, it's your fault. If it crashes, mysteriously returning 184 with no error message, it's your fault. When one single condition you just happened to forget about whilst coding screws up, it's your fault.

Obviously the programmer has to shoulder most of the responsibility for a broken program. But as we've already seen, C positively tries to make the programmer fail. This increases the failure rate and yet for some reason we don't blame the language when yet another buffer overflow is discovered. C programmers try to cover up C's inconsistencies and inadequacies by creating a culture of 'tua culpa'; if something's wrong, it's your fault, not that of the compiler, linker, assembler, specification, documentation, or hardware.

Compilers have to take some of the blame. Two reasons. The first is that most compilers have proprietary extensions built into them. Let me remind you that half of the point of using C is that it should be portable and compile anywhere. Adding extensions violates the original spirit of C and removes one of its advantages (albeit an already diminished advantage).

The other (and perhaps more pressing) reason is the lack of anything beyond minimal error checking which C compilers do. For every ten types of errors your compiler catches, another fifty will slip through. Beyond variable type and syntax checking the compiler does not look for anything else. All it can do is give warnings on unusual behaviour, though these warnings are often spurious. On the other hand, a single error can cause a ridiculous cascade, or make the compiler fall over and die because of a misplaced semicolon, or, more accurately and incriminatingly, a badly constructed parser and grammar. And yet, despite this, it's your fault.

To quote The Unix Haters' Handbook:

"If you make even a small omission, like a single semicolon, a C compiler tends to get so confused and annoyed that it bursts into tears and complains that it just can't compile the rest of the file since one missing semicolon has thrown it off so much."

So C compilers may well give literally hundreds of errors stating that half of your code is wrong if you miss out a single semicolon. Can it get worse? Of course it can! This is C!

You see, a compiler will often not deluge you with error information when compiling. Sometimes it will give you no warning whatsoever even if you write totally foolish code like this:

#include <stdio.h>

int main(void)
{
    char *p;    /* uninitialised: p points nowhere in particular */
    puts(p);    /* undefined behaviour, and not a peep from the compiler */
    return 0;
}

When we compile this with our 'trusty' compiler gcc, we get no errors or warnings at all. Even when using the '-W' and '-Wall' flags to make it watch out for dangerous code it says nothing.

In fact, no warning is ever given unless you try to optimise the program with a '-O' flag. But what if you never optimise your program? Well, you now have a dangerous program. And unless you check the code again you may well never notice that error.

What this section (and entire document) is really about is the sheer unfriendliness of C and how it is as if it takes great pains to be as difficult to use as possible. It is flexible in the wrong way; it can do many, many different things, but this makes it impossible to do any single thing with it.

Trapped in the 1970s

C is over thirty years old, and it shows. It lacks features that modern languages have such as exception handling, many useful data types, function overloading, optional function arguments and garbage collection. This is hardly surprising considering that it was constructed from an assembler language with just one data type on a computer from 1970.

C was designed for the computer and programmer of the 1970s, sacrificing stability and programmer time for the sake of memory. Despite the fact that the most recent standard is just half a decade old, C has not been updated to take advantage of increased memory and processor power to implement such things as automatic memory management. What for? The illusion of backward compatibility and portability.

Yet more missing data types

Hash tables. Why was this so difficult to implement? C is intended for the programming of things like kernels and system utilities, which frequently use hash tables. And yet it didn't occur to C's creators that maybe including hash tables as a type of array might be a good idea when writing UNIX? Perl has them. PHP has them. With C you have to fake hash tables, and even then it doesn't really work at all.
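
For the unconvinced, here's a hedged sketch of what "faking" one looks like; every name here is made up, and this is the abbreviated version without deletion, resizing or memory cleanup:

#include <stdlib.h>
#include <string.h>

#define NBUCKETS 64

struct node {
    const char  *key;
    const char  *value;
    struct node *next;
};

static struct node *buckets[NBUCKETS];

static unsigned hash(const char *s)
{
    unsigned h = 0;
    while (*s != '\0')
        h = h * 31 + (unsigned char)*s++;
    return h % NBUCKETS;
}

/* stores the caller's pointers; a real implementation would copy */
static int put(const char *key, const char *value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return -1;           /* ...which every caller must remember to check */
    n->key = key;
    n->value = value;
    n->next = buckets[hash(key)];
    buckets[hash(key)] = n;
    return 0;
}

static const char *get(const char *key)
{
    struct node *n;
    for (n = buckets[hash(key)]; n != NULL; n = n->next)
        if (strcmp(n->key, key) == 0)
            return n->value;
    return NULL;
}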

Multidimensional arrays. Before you tell me that you can do stuff like int multiarray[50][50][50] I think that I should point out that that's an array of arrays of arrays. Different thing. Especially when you consider that you can also use it as a bunch of pointers. C programmers call this "flexibility". Others call it "redundancy", or, more accurately, "mess".
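
A small sketch of the "array of arrays" reality:

#include <stdio.h>

int main(void)
{
    int grid[3][3];   /* an array of arrays: one contiguous block of ints */

    grid[1][2] = 7;
    printf("%d\n", *(*(grid + 1) + 2));   /* the same element: [i][j] is
                                             just pointer arithmetic */
    return 0;
}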

Complex numbers. They may be in C99, but how many compilers support that? It's not exactly difficult to get your head round the concept of complex numbers, so why weren't they included in the first place? Were complex numbers not discovered back in 1989?

Binary strings. It wouldn't have been that hard just to make a compulsory struct with a mere two members: a char * for the string of bytes and a size_t for the length of the string. Binary strings have always been around on Unix, so why wasn't C more accommodating?
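
Something like this hypothetical struct, which is in no C standard, is all it would have taken:

#include <stddef.h>

struct bstring {
    char  *data;   /* the bytes, '\0' or no '\0' */
    size_t len;    /* how many of them there are */
};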

Library size

The actual core of C is admirably small, even if some of the syntax isn't the most efficient or readable (case in point: the '? :' conditional operator). One thing that is bloated is the C library. The number of functions in a full C library which complies with all significant standards runs to four figures. There's a great deal of redundancy, and code which really shouldn't be there.

This has knock-on effects, such as the large number of configuration constants which are defined by the preprocessor (which shouldn't be necessary), the size of libraries (the GNU C library almost fills a floppy disk, and its documentation fills three) and inconsistently named groups of functions in addition to duplication.

For example, a function for converting a string to a long integer is atol(). One can also use strtol() for exactly the same thing. Boom - instant redundancy. Worse still, both functions are included in the C99, POSIX and SUSv3 standards!

Can it get worse? Of course it can! This is C!

As a result, you'd logically expect an equivalent pair of atod() and strtod() functions for converting a string to a double. As you've probably guessed, this isn't so. They are called atof() and strtod(). This is very foolish. There are yet more examples scattered through the standard C library like a dog's smelly surprises in a park.

The Single Unix Specification version three specifies 1,123 functions which must be available to the C programmer of the compliant system. We already know about the redundancies and unnecessary functions, but across how many header files are these 1,123 functions spread out? 62. That's right, on average a C library header will declare approximately eighteen functions. Even if you only need to use maybe one function from each of, say, five headers (a common occurrence) you may well wind up including 90, 100 or even 150 function declarations you will never need. Bloat, bloat, bloat. Python has the right idea; its import statement allows you to pull in exactly the functions (and global variables!) you need from each library if you prefer. But C? Oh, no.

Specifying structure members

Why does this need two operators? Why do I have to pick between '.' and '->' for a ridiculous, arbitrary reason? Oh, I forgot; it's just yet another of C's gotchas.

Limited syntax

A couple of examples should illustrate what I mean quite nicely. If you've ever programmed in PHP for a substantial period of time, you're probably aware of the 'break' keyword. You can use it to break out from nested loops of arbitrary depth by using it with an integer, such as "break 3"; this would break out of three levels of loops.

There is no way of doing this in C. If you want to break out from a series of nested for or while loops then you have to use a goto. This is what is known as a crude hack.
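
A minimal sketch of the crude hack in question:

#include <stdio.h>

int main(void)
{
    int i, j, k;

    for (i = 0; i < 10; i++)
        for (j = 0; j < 10; j++)
            for (k = 0; k < 10; k++)
                if (i + j + k == 15)
                    goto done;   /* PHP's "break 3", C style */
done:
    printf("%d %d %d\n", i, j, k);
    return 0;
}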

In addition to this, there is no way to compare any non-numerical data type using a switch statement. C does not allow you to use switch and case statements for strings. One must use several variables to iterate through an array of case strings and compare them to the given string with strcmp(). This reduces performance and is just yet another hack.

In fact, this is an example of gratuitous library functions running wild once again. Even comparing one string to another requires use of the strcmp() function.
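
Here's a sketch of what passes for "switch on a string" in C; the function and strings are invented for illustration:

#include <stdio.h>
#include <string.h>

static void dispatch(const char *cmd)
{
    if (strcmp(cmd, "start") == 0)        /* what 'case "start":' would be */
        puts("starting");
    else if (strcmp(cmd, "stop") == 0)
        puts("stopping");
    else
        puts("unknown command");          /* the 'default' case */
}

int main(void)
{
    dispatch("start");
    dispatch("frobnicate");
    return 0;
}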

Flushing standard I/O

A simple microcosm of the "you can do this, but not that" philosophy of C; one has to do two different things to flush standard input and standard output.

To flush the standard output stream, one can use fflush() (defined by <stdio.h>). One doesn't usually need to do this after every bit of text is printed, but it's nice to know it's there, right?

Unfortunately, one cannot use fflush() to flush the contents of standard input. Some C standards explicitly define it as having undefined behaviour, but this is so illogical that even textbook authors sometimes mistakenly use fflush(stdin) in examples and some compilers won't bother to warn you about it. One shouldn't even have to flush standard input; you ask for a character with getchar(), and the program should just read in the first character given and disregard the rest. But I digress...

There is no 'real' way to flush standard input up to, say, the end of a line. Instead one has to use a kludge like so:

/* assumes <stdio.h>, <errno.h> and <string.h> have been included */
int c;
do {
    errno = 0;
    c = getchar();

    if (errno) {
        fprintf(stderr,
                "Error flushing standard input buffer: %s\n",
                strerror(errno));
    }
} while ((c != '\n') && (!feof(stdin)));

That's right; you need to use a variable, a looping construct, two library functions and several lines of error-handling code just to flush the standard input buffer.

Inconsistent error handling

A seasoned C programmer will be able to tell what I'm talking about just by reading the title of this section. There are many incompatible ways in which a C library function indicates that an error has occurred:

  • Returning zero.
  • Returning nonzero.
  • Returning a NULL pointer.
  • Setting errno.
  • Requiring a call to another function.
  • Outputting a diagnostic message to the user.

Some functions may actually use up to three of these methods. But the thing is that none of these are compatible with each other and error handling does not occur automatically; every time a C programmer uses a library function they must check manually for an error. This bloats code which would otherwise be perfectly readable without if-blocks for error handling and variables to keep track of errors. In a large software project one must write a section of code for error handling hundreds of times. If you forget, something can go horribly wrong. For example, if you don't check the return value of malloc() you may accidentally try to use a null pointer. Oops...
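
A short sketch showing three of those conventions side by side:

#include <stdio.h>
#include <stdlib.h>
#include <errno.h>

int main(void)
{
    FILE *f;
    long n;

    f = fopen("no-such-file", "r");
    if (f == NULL)                      /* convention: NULL return */
        perror("fopen");

    errno = 0;
    n = strtol("99999999999999999999", NULL, 10);
    if (errno == ERANGE)                /* convention: errno on the side */
        fprintf(stderr, "strtol overflowed, clamped to %ld\n", n);

    if (f != NULL && fclose(f) != 0)    /* convention: nonzero return */
        perror("fclose");
    return 0;
}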

Commutative array subscripting

"Hey, Thompson, how can I make C's syntax even more obfuscated and difficult to understand?"

"How about you allow 5[var] to mean the same as var[5]?"

"Wow; unnecessary and confusing syntactic idiocy! Thanks!"

"You're welcome, Dennis."

Variadic anonymous macros

In case you don't understand what variadic anonymous macros are, they're macros (i.e. pseudofunctions defined by the preprocessor) which can take a variable number of arguments. Sounds like a simple thing to implement. I mean, it's all done by the preprocessor, right? And besides, you can define proper functions with variable numbers of arguments even in the original K&R C, right?

In that case, why can't I do:

#define error(...) fprintf(stderr, __VA_ARGS__)

without getting a warning from GCC?

warning: anonymous variadic macros were introduced in C99

That's right, folks. Not until late 1999, 30 years after development on the C programming language began, were we allowed to do such a simple task with the preprocessor.
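
For completeness, a sketch of the macro finally working under C99 (note the __VA_ARGS__ spelling):

#include <stdio.h>

#define error(...) fprintf(stderr, __VA_ARGS__)

int main(void)
{
    /* expands to fprintf(stderr, "%s:%d: %s\n", "widget.c", 42, "oops") */
    error("%s:%d: %s\n", "widget.c", 42, "oops");
    return 0;
}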

The C standards don't make sense

Only one simple quote from the ANSI C standard - nay, a single footnote - is needed to demonstrate the immense idiocy of the whole thing. Ladies, gentlemen, and everyone else, I present to you...footnote 82:

All whitespace is equivalent except in certain situations.

I'd make a cutting remark about this, but it'd be too easy.

Too much preprocessor power

Rather foolishly, half of the actual C language is reimplemented in the preprocessor. (This should be a concern from the start; redundancy usually indicates an underlying problem.) We can #define fake variables, fake conditions with #ifdef and #ifndef, and look, there's even #if, #endif and the rest of the crew! How useful!

Erm, sorry, no.

Preprocessors are a good idea for a language like C. As has already been noted, C is not portable. Preprocessors are vital to bridging the gap between different computer architectures and libraries and allowing a program to compile on multiple machines without having to rely on external programs. The #define statement, in this case, can be used perfectly validly to set 'flags' that can be used by a program to determine all sorts of things: which C standard is being used, which library, who wrote it, and so on and so forth.

Now, the situation isn't as bad as for C++. In C++, the preprocessor is so packed with unnecessary rubbish that one can actually use it to calculate an arbitrary series of Fibonacci numbers at compile-time. However, C comes dangerously close; it allows the programmer to define fake global variables with wacky values which would not otherwise be proper code, and then compare values of these variables. Why? It's not needed; the C language of the Plan 9 operating system doesn't let you play around with preprocessor definitions like this. It's all just bloat.

"But what about when we want to use a constant throughout a program? We don't want to have to go through the program changing the value each time we want to change the constant!" some may complain. Well, there's these things called global variables. And there's this keyword, const. It makes a constant variable. Do you see where I'm going with this?

You can do search and replace without the preprocessor, too. In fact, they were able to do it back in the seventies on the very first versions of Unix. They called it sed. Need something more like cpp? Use m4 and stop complaining. It's the Unix way!

Why C Is Not My Favourite Programming Language | 556 comments (448 topical, 108 editorial, 3 hidden)
+1 FP: addresses a real problem (2.00 / 10) (#6)
by DJ Google on Sat Feb 07, 2004 at 03:28:27 PM EST

Even Microsoft knows better than to use C. Shit C sucks so badly they had to make their own language (C#) which corrects most of the mistakes mentioned in this article.

--
Join me on irc.slashnet.org #Kuro5hin.org - the official Kuro5hin IRC channel.

Uh... (none / 2) (#103)
by bigchris on Sun Feb 08, 2004 at 06:47:35 AM EST

Shyeah, that's right. Microsoft made C# to correct the mistakes of C.

More like they decided that Java had some pretty good ideas they liked, but they also liked C++ so they created C#.

---
I Hate Jesus: -1: Bible thumper
kpaul: YAAT. YHL. HAND. btw, YAHWEH wins ;) [mt]
[ Parent ]

Evolution of C# (Greatly simplified) (none / 2) (#251)
by tassach on Mon Feb 09, 2004 at 10:46:29 AM EST

The C++ language grew out of C because C wasn't object oriented.

Java was designed to be a better C++ by eliminating a lot of the dangerous features in C++ and imposing more structure.

C# grew out of Java to address flaws in Java.  (and because Bill and Scott can't play nice with each other)

Each language has its purpose.  They're different tools designed for different purposes.  A good engineer knows how to select the proper tool for the job at hand, and learns to use new tools if he has to.  A bad engineer uses the same set of tools for every job because that's all he knows and is unwilling to learn something new.

"The tree of liberty must be refreshed from time to time with the blood of patriots & tyrants" -- Thomas Jefferson
[ Parent ]

+1FP, James A C Joyce (1.28 / 14) (#7)
by Michael Jackson on Sat Feb 07, 2004 at 03:35:21 PM EST

Child porn is much more enjoyable than C.

#kuro5hin.org -- irc.slashnet.org -- On the fucking spoke.
drdink -- gimpy pedo-fag felching drwiii off in the weeds

At least (2.69 / 13) (#8)
by flo on Sat Feb 07, 2004 at 03:40:05 PM EST

C lets you shoot yourself in the foot.
---------
"Look upon my works, ye mighty, and despair!"
-1, !clue (2.57 / 19) (#10)
by Bad Harmony on Sat Feb 07, 2004 at 03:56:12 PM EST

It might help if you understood what you are criticizing.

The signedness of chars is implementation dependent. On some systems they are signed.

C lets the programmer choose the string format. The language does not predefine it. Null terminated strings are a convention of libraries and system calls on certain operating systems.

Overflow detection is implementation dependent.

C was not designed to be a user-friendly language that holds your hand, wipes your bottom, and kisses your scrapes. It is a minimal set of abstractions over the hardware. It is not a general purpose high-level language for developing applications! It is a systems programming language, for use by experienced programmers. If you don't like that, go away and use some other language that is more appropriate for your task.

5440' or Fight!

Thank you captain obvious. (2.50 / 4) (#14)
by fae on Sat Feb 07, 2004 at 04:04:24 PM EST

C was not designed to be a user-friendly language that holds your hand, wipes your bottom, and kisses your scrapes. It is a minimal set of abstractions over the hardware. It is not a general purpose high-level language for developing applications! It is a systems programming language, for use by experienced programmers. If you don't like that, go away and use some other language that is more appropriate for your task.

REALLY NOW I DID NOT KNOW THAT.

-- fae: but an atom in the great mass of humanity
[ Parent ]

It isn't obvious to many people (2.50 / 4) (#17)
by Bad Harmony on Sat Feb 07, 2004 at 04:25:09 PM EST

Every time revisions to the C language are discussed, there are hordes of people who whine "Why doesn't it have X", where X is their favorite feature from some other language. They have little or no comprehension of the underlying philosophy of the C language. Rather than use a more appropriate language, they want to turn C into a Frankenstein's monster of trendy features.

5440' or Fight!
[ Parent ]

You don't get it (2.60 / 5) (#126)
by stuaart on Sun Feb 08, 2004 at 08:26:30 AM EST

There was an excuse in the 1970s for writing the mess that is C. There is no excuse nowadays.

``It is a systems programming language, for use by experienced programmers. If you don't like that, go away and use some other language that is more appropriate for your task.''

Saying that is ignoring the questions raised in the article. To be honest, I don't trust the ``experienced programmers'' enough to think that they are immune to the stupid little problems present in C. Why should we have to fit awkwardly to an old language? It's purely for embedded historical reasons that we still use C. People are wasting their productive time when they have to program ``around'' C; I don't understand why having this arcane knowledge about C peculiarities is somehow a virtue. Believing that this is somehow acceptable is an incredibly old-fashioned attitude. I feel that C has had its day, and should now be retired.

Linkwhore: [Hidden stories.] Baldrtainment: Corporate concubines and Baldrson: An Introspective


[ Parent ]
Thought experiment (none / 2) (#136)
by curien on Sun Feb 08, 2004 at 10:06:50 AM EST

Construct a usable desktop computer (hardware and software system) without using anything written in C or assembly.

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
Care to elaborate? :-) nt (none / 0) (#142)
by curien on Sun Feb 08, 2004 at 10:56:26 AM EST



--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
Well, I was expecting (none / 0) (#149)
by curien on Sun Feb 08, 2004 at 12:05:39 PM EST

you to say, "machine language", which is already another (set of) language(s) that does the same thing as assembly.

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
I'm curious... (none / 0) (#270)
by ksandstr on Mon Feb 09, 2004 at 12:06:53 PM EST

What exactly do you mean by writing ``around C''? I can't say I see any other ways of writing around C than that which is done by ill-educated nitwits who cling desperately to their misconceptions about "objects living in a pool happily squirting their warm fluids around" being the ideal way that things should exist in a software system. This confusion of mine is compounded by the understanding that there is, at the end of the day, so very little to C that writing around it would be like driving a main battle tank around a blade of grass -- doable, but ultimately a futile exercise. After all, it's ultimately a better idea to program with the language rather than against it, just as it is easier to move an object in a direction it's already headed (or wants to go) than where it is not.

Or maybe you're just annoyed that unlike other, trendier languages, C actually has a tradition older than yourself? And in the proud uhmerrykuun tribal style, you're so very ready to energetically flout and ridicule any such thing that you personally don't understand or cannot be arsed to learn?


Fin.
[ Parent ]

C is very much an American language. (none / 0) (#300)
by tkatchev on Mon Feb 09, 2004 at 03:29:56 PM EST

Americans are extremely hung up on theory, for some reason, and prefer writing languages that are non-functional, but adhere to some weird standard of "purity".

(Even Pascal is very much a practical language, but that is a topic for another discussion.)


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Clarification (none / 0) (#355)
by stuaart on Mon Feb 09, 2004 at 08:13:04 PM EST

Writing ``around C'' is something that all languages have as a negative point. It is that action of being restricted by a language that unnecessarily (in our historically advantaged view) forces us to write code in a particular way; to be restricted unfairly by the very structure and nature of the language.

Now, these accusations may be levelled at all languages, however my point was that C has more of these niggles than some other, more modern languages. Without C there would probably be no Unix or C++ or vast quantities of other software, so don't get all whiney and presume I don't understand the historical context. C is an incredibly useful and powerful language. Having said this, if we are being honest about this historical perspective, we are also forced to admit that there are major flaws in C (some of which James A C Joyce has pointed out --- though I do think that he also does not account for this historical context, and his comments about strings are therefore unfair).

By bringing out the argument:

``Or maybe you're just annoyed that unlike other, trendier languages, C actually has a tradition older than yourself?''

you have implied that you are unwilling to seriously take on and accept these objections some people may have with C. To write off criticism of C is a dangerous thing, especially given the greater understanding we now have in hindsight about C's position, merits and demerits (however artful the design may have been) relative to other programming paradigms. It is my contention that C has weathered very well, but that we need to accept that it must be replaced by a new low-level language that supports a more modern featureset. Alternatively, I contend that we should keep C but ensure that it is used in the right place (ie. basic operating system level software).

Commenting on ``proud uhmerrykuun[sic] tribal style'' has merely enhanced the impression that you are unable to view the other side of the coin. Programmers are far too infrequently predisposed to engage in a little reflective commentary on programming itself; they want to be coding (usually is their job) and their bosses (if they have them) also want them to be coding. I am not holier than thou --- I fully understand programmers wanting to code --- I just wish to encourage evaluation, which is, I feel, a greatly underdone thing.

Linkwhore: [Hidden stories.] Baldrtainment: Corporate concubines and Baldrson: An Introspective


[ Parent ]
There're some things I love about C. (2.88 / 18) (#11)
by fae on Sat Feb 07, 2004 at 03:58:08 PM EST

It is very easy to learn the ins and outs of the core of C. It's all simple and consistent and elegant. I also love what you can do with C pointers. They are so powerful. Function pointers just make the deal so much sweeter.

I agree with much of your submission. Too difficult to do unicode, easy to accidentally overflow, rarely used functions (strncpy, anyone?), etc.

Yet, C is assembly language made (mostly) portable. Some of the difficulties of C are just a necessary outcome of its low-levelness. Other difficulties came when they decided to add bits and pieces of high-level. (C++ took this bloating even further and that is why I hate it.)

By the way, you should have a conclusion section.

-- fae: but an atom in the great mass of humanity

Um... easy to learn compared to what? (none / 1) (#446)
by Merc on Tue Feb 10, 2004 at 02:14:18 PM EST

I've found that C is one of the hardest languages to learn. I guess your statement depends on what you consider to be 'the core' of the language, but take this trivial example. Write a program that asks for a user's name, then says hello to that person.

In C, you'll first have to remember to allocate a variable to store the name someone enters. Next, you'll have to make sure you don't allow the space you allocate to overflow. You'll then have to use a completely convoluted syntax just to print that name out using 'printf' and its odd format string method.
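
For the record, here's roughly what that trivial task ends up looking like (a sketch; I've picked fgets over gets for obvious reasons):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char name[64];                          /* guess a size and hope */

    printf("What is your name? ");
    if (fgets(name, sizeof name, stdin) != NULL) {
        name[strcspn(name, "\n")] = '\0';   /* strip the trailing newline */
        printf("Hello, %s!\n", name);
    }
    return 0;
}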

If you want to find out what functions you'll need to do these operations, you'll have to either have a good book, get lucky with man pages, or ask someone. How else would you know that you'd need the stdio.h header file, and the string.h one as well?

Compare that to most scripted languages where you don't need to pre-allocate storage, you don't need to worry about overflows, and strings are first-class data types. If you don't know what functions you'll need, you'll have a much better chance of browsing around to find the right package or group.

Even if you compare C to other compiled languages, most are more straightforward than this.

I've been programming C for years, and I still don't really know the ins and outs of the core of the language.

Take just the standard data types: 'char', 'short', 'int' and 'long'. 'char' is a character, and it's mostly 8 bits long, but sometimes more. 'short' is a short integer, and it is in fact just shorthand for 'short int'. Sometimes it's 16 bits, but you can't count on that either. 'int' is well... an integer, it's tied to the 'word' size of the machine you compile the code for. 'long' is... another type of integer, but is actually just shorthand for 'long int', which on most machines I've used is the same as an 'int'.

What if you know you want 32 bits? Well you can't assume that you can use an 'int', because on some machines it might be 16 or 64 bits. Instead, you have to find or create a #define or typedef to specify what kind of type has 32 bits on your current architecture. But if you do that, you should always surround it by obscure #ifdef statements, so if the code is used on another machine, you won't be wrong about the size.

And then there's the core of the language dealing with strings. How do you safely, portably tokenize a string in C? How do you efficiently concatenate strings, without worrying about buffer overflows?

I don't think there's anything about C that's simple or elegant. It's a great language if you have limited resources, or need to have total control, but really, it makes the programmer work way too hard.



[ Parent ]
Word size (none / 0) (#471)
by smallstepforman on Tue Feb 10, 2004 at 06:04:31 PM EST

These days I've moved towards int32 and int64 typedefs.  Likewise, I have uint32, uint16 etc.  It should allow easier migration in the future.

One of the things I don't like about C99 is that they missed a very important chance with int64s, and instead use long long.

To summarise, I would ideally love to have
8 bit = char, unsigned char
16 bit = short, unsigned short
32 bit = int, unsigned int
64 bit = long, unsigned long

or int8, int16, int32, int64 and equivalent. But that makes too much sense.
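
Something like this sketch, which assumes the common ILP32/LP64 sizes:

typedef signed char        int8;     /* assumes char is 8 bits */
typedef unsigned char      uint8;
typedef short              int16;
typedef unsigned short     uint16;
typedef int                int32;
typedef unsigned int       uint32;
typedef long long          int64;    /* C99, or a compiler extension */
typedef unsigned long long uint64;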

[ Parent ]

Right, instead you have: (none / 0) (#477)
by Merc on Tue Feb 10, 2004 at 08:42:49 PM EST

u32, uint32, ui32, _u32, bob_uns_int32, etc. Everybody seems to have their own different way of marking something as an unsigned 32 bit number. Argh.



[ Parent ]
That could break some code (none / 0) (#493)
by squigly on Wed Feb 11, 2004 at 05:46:59 AM EST

Some applications are written assuming that an int is of unknown length (either 16 or 32 bits), and therefore use a long to mean explicitly 32 bits.  Very few compilers treat a long as 64 bits.

Personally, I would have liked to be able to specify arbitrary-length integers, especially >64 bits.  Rules for very long integers are simple to deal with in a compiler, or in assembly/machine code, but a lot harder to implement in C, simply because the processor has a carry flag and C does not expose one.

[ Parent ]

shamelessly stolen from /. (1.85 / 7) (#16)
by pertubation theory on Sat Feb 07, 2004 at 04:21:16 PM EST

RespeCt the C++k! And tame the C#nt!

----
Dice are small polka-dotted cubes of ivory constructed like a lawyer to lie upon any side, commonly the wrong one.
- Ambrose Bierce
Just for the hell of it, a rebuttal (2.83 / 24) (#18)
by curien on Sat Feb 07, 2004 at 04:28:37 PM EST

I'm not going to go point-by-point, but I'll refute what comes to mind easily (ie, things that don't take too much thought to refute). I'm also skipping the ones I feel are completely retarded.

No string type
In C, a string is not a type, it's a data format. I suppose you can see this as a weakness, but the large number of routines available for manipulating this data format has put the C-style string into the realm of abstract data type, IMO. I'd like to hear any argument otherwise. You might as well rant about how C doesn't have a matrix type.

Functions for insignificant operations
You're kidding, right? What are you, an APL fanatic? Or maybe you liked having to use keywords for everything (as in Cobol and Pascal)?

Flexible casting
You are showing your ignorance. Casting is not conversion, and "conversion" is what you really meant.

Array sizes can only be constants
Bzzt... wrong. Next!

The encouragement of buffer overflows
This is indeed an issue one must face when using C. I'm mostly a C++ programmer, and I only use raw memory (pointers, etc) when writing interfaces to low-level functionality. Without manipulating raw memory, programming wouldn't be possible, but it must be localized and (heh) buffered from the client programmer.

Integer overflow without warning
It's because C's a low-level language. :-} Seriously, if you want a language that holds your hand, use Ada. If you want one that only does exactly what you tell it, use assembly. If you want a portable  assembly, use C.

There are at least four official specifications of C I could name from the top of my head
Nope, only three, and one of them is just a library extension. Two of those three have been implemented completely in all four major compilers that I've used over the years.

If it weren't for the C preprocessor, then it would be virtually impossible to get C to run on multiple families of processor hardware
Only if it wants to take advantage of OS-specific functionality. Many Unix command-line utilities, for example, can be implemented in macro-free, standard C.

Even the C89 and C99 standards conflict with each other in ridiculous ways. Can you use the long long type or can't you?
Huh? First, it's not a conflict. Second, are you saying that new versions shouldn't provide new features?

almost anything like comparisons, character assignments, arithmetic, or string output can blow up spectacularly for no apparent reason because of endianness or because your particular processor treats all chars as unsigned or silly, subtle, deadly traps like that
Present one real-world example where a character assignment fails for platform-specific reasons.

various idiosyncrasies... all these unnecessary and unmentioned quirks
Umm... how can they be unmentioned if people mention them? While you're at it, let's wonder why Pascal requires a period at the end of the main block's "END".

When one single condition you'd just happened to have forgotten about whilst coding screws up, it's your fault.
Well, yeah. Obviously the programmer has to shoulder most of the responsibility for a broken program. See, you agree!

Unfortunately, one cannot use fflush() to flush the contents of standard input.
You can't use sync as a filename completion utility, either.

About your code sample: unnecessarily complex.

  int c;
  while ((c = getchar()) != '\n' && c != EOF)
      ;

Besides, you betray a basic fallacy in your complaint. There is no 'real' way to flush standard input up to, say, the end of a line. That's because the "end of the line" part is awfully arbitrary. Why should the library decide for you how much of the buffer to flush? Flushing the entire thing (which is what fflush does with output streams) is simply infeasible with input streams, as it would leave the stream at EOF. Not very useful, is it?

Inconsistent error handling
Oh, the horrors of backwards compatibility. This is indeed a failing.

Variadic anonymous macros
You could do it before, just not as easily. Also, do I have this right... you're complaining that C does provide a feature you're asking for?

The C standards don't make sense
First, footnotes are non-normative. Second, any document can be snipped in such a way as to appear to not make sense. (The C standard has real defects, as any paper of that size will, but that footnote is not one of them.)

In C++, the preprocessor is so packed with unnecessary rubbish that one can actually use it to calculate an arbitrary series of Fibonacci numbers at compile-time.
This is unrelated to the real argument, but you're thinking of templates, not the preprocessor. The C++ preprocessor is the same as the C90 preprocessor. This comment is interesting to note, however, in that it is indicative of your general lack of subject knowledge.

--
All God's critters got a place in the choir
Some sing low, some sing higher

The particular footnote... (none / 1) (#20)
by fae on Sat Feb 07, 2004 at 04:35:22 PM EST

...would make sense if you consider that a space is different than a tab, in a string.

-- fae: but an atom in the great mass of humanity
[ Parent ]
The real problem. (2.37 / 8) (#31)
by tkatchev on Sat Feb 07, 2004 at 06:12:30 PM EST

The real problem is the fact that C is an ideal abstract, portable assembler for the PDP-11.

Unfortunately, though, modern processors are quite a bit different from those in the era of PDP. We need something that understands and takes advantage of the vastly improved processor architectures of today.


   -- Signed, Lev Andropoff, cosmonaut.

Nobody wants to learn Occam. (none / 1) (#36)
by it certainly is on Sat Feb 07, 2004 at 06:34:31 PM EST

So if they just add the "vector" and "vect_add" keywords to C, like Apple did, everything will be right again.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

Not quite. (none / 0) (#106)
by tkatchev on Sun Feb 08, 2004 at 06:54:48 AM EST

This won't fix the problems with branch prediction, pipelining, caches and all the other features of high-performance processors nowadays.

No, in reality the modern architectures are coming closer and closer to the computing model used in functional languages. In the future, functional compilers will be more and more mainstream.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

I was being sarcastic. (none / 0) (#220)
by it certainly is on Mon Feb 09, 2004 at 03:27:20 AM EST

Obviously you are too intelligent and ADD-raddled to notice.

However, I don't think modern architectures are becoming like functional languages, as I haven't seen any modern architectures built from nothing but combinators, and modern architectures are able to write to an address, rather than have to make a duplicate of the entire environment but with that address changed, to be able to pretend that all data is involatile.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

Well. (none / 0) (#223)
by tkatchev on Mon Feb 09, 2004 at 04:33:33 AM EST

Stacks are a functional feature. So is evaluating things out of order, and so are concurrent threads.

Think out of the box, there are no purely functional languages, just like there aren't any purely imperative languages. (Well, except GW-BASIC, but whatever.)

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Stacks (none / 0) (#295)
by irwoodhouse on Mon Feb 09, 2004 at 02:13:38 PM EST

You mistake cause and effect.

Stacks are a consequence of primitive recursion (recursive function theory, Kleene schemes) and as a basic premise of what-is-computable it is hardly surprising that functional languages implement them.

But recursion is also supported by imperative languages.

On the other hand iteration (which is more directly implementable by Turing machines or unlimited register machines) is not a feature of functional languages.

Concurrent threads and out-of-order execution are a "feature" of functional languages only because by definition they do not implement the ordering inherent in imperative languages.

As far as microprocessors are concerned, until they have two Instruction Pointers all concurrency is virtual. Concurrency/parallelism is therefore a mapping into sequential time in the same way that recursion and iteration can model each other.

[ Parent ]

You professor ruined your brain. (none / 0) (#304)
by tkatchev on Mon Feb 09, 2004 at 03:48:55 PM EST

Besides the "theoretical foundations" claptrap, functional programming has a very real, very practical and hands-on utility.

"Purely functional programming" is, essentially, programming without state. That is, you only have values and functions that compute other values.

"Purely imperative programming" would be programming an abstract state machine, where you only have state and state transitions. (A Turing machine is very similar to this.)

For obvious reasons, neither of these "pure" models are useful or practical in the real world. The real-world problem is in finding a proper balance between the two models for particular hands-on problems.

Nowadays, the big interest in functional programming is due, in large part, to the fact that (for the moment, at least) it is virtually impossible to write parallel programs without getting heavily into functional programming. Trying to write parallel programs in a purely imperative fashion is likely to short-circuit your brain.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

No he didn't! (none / 0) (#339)
by irwoodhouse on Mon Feb 09, 2004 at 06:35:39 PM EST

I got out of the ivory tower of mathematical computer science and joined the engineering camp years ago and spend my time solving real problems. The background is still useful though, sometimes.

Besides the "theoretical foundations" claptrap, functional programming has a very real, very practical and hands-on utility

I did not dispute that. I interpreted your comment to mean that microprocessor design was taking lessons from functional programming whereas both stem from an underlying framework. As stated, you mistake cause and effect.

As an example, out-of-order execution is an effect in functional programming because the latter does not implement that sort of flow control, and it doesn't matter which branch is done first: without state, neither branch can affect the other.

Microprocessors can only execute out-of-order when instructions do not interfere with one another. This is down to the ability to manipulate flow of control for performance and relies on the existence of timing of atomic operations, which is absent from functional programming.

... it is virtually impossible to write parallel programs without getting heavily into functional programming ...

This is currently out of my field - score a point :)

[ Parent ]

No, you misunderstand. (none / 0) (#341)
by tkatchev on Mon Feb 09, 2004 at 06:49:32 PM EST

Rather, the current developments in processor design put functional languages in a somewhat privileged position. Certainly nobody designs processors with a specific type of programming language in mind.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Concurrency. (none / 0) (#434)
by warrax on Tue Feb 10, 2004 at 12:00:54 PM EST

As far as microprocessors are concerned, until they have two Instruction Pointers all concurrency is virtual. Concurrency/parallelism is therefore a mapping into sequential time in the same way that recursion and iteration can model each other.
Ever heard of SMP? There's true concurrency for you.

-- "Guns don't kill people. I kill people."
[ Parent ]
No shit, professor. (none / 0) (#437)
by tkatchev on Tue Feb 10, 2004 at 12:39:23 PM EST

Problem is, trying to program SMP in a purely imperative style is a lost cause.

Any serious attempt to do SMP invariably ends up re-inventing some sort of functional language. (Which, BTW, is exactly why SMP is still essentially unused in the real world. There is a lack of decent functional compilers.)


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

My word you are rude. (none / 0) (#441)
by warrax on Tue Feb 10, 2004 at 12:59:56 PM EST

I was simply pointing out that the parent didn't seem to grasp the fact that there is such a thing as true concurrency at the hardware level.

And then you spew forth this nonsense:

Problem is, trying to program SMP in a purely imperative style is a lost cause.
If the programmer is an idiot, yes. I've done it quite successfully on machines with lots of processors (64). Scaled almost perfectly too.
Any serious attempt to do SMP invariably ends up re-inventing some sort of functional language. (Which, BTW, is exactly why SMP is still essentially unused in the real world. There is a lack of decent functional compilers.)
Are you on crack? 'Cause I think I want some! SMP is NOT unused in the real world (where people program in C, Java and C#). Have you ever used a GUI? Well, that's probably multithreaded ("SMP-enabled", in other words). Photoshop filters? Multithreaded (many of them -- of course if the data flow is serial there's no point in trying to write an SMP version). Apache? Multithreaded (well, apache2 is).

Bleh. I have a feeling IHBT.

-- "Guns don't kill people. I kill people."
[ Parent ]

Of course I'm rude. (none / 0) (#444)
by tkatchev on Tue Feb 10, 2004 at 02:12:21 PM EST

You're incredibly tacky, conceited and mal-educated.

Look, writing good SMP programs without race conditions, lockups and incoherent data is still an unsolved problem. There are whole university programs devoted to the problem, and we haven't even begun to research the problems of scalability.

Lastly, multithreaded isn't SMP, OK? Thanks.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Ok, bad explanation (none / 0) (#499)
by irwoodhouse on Wed Feb 11, 2004 at 09:18:09 AM EST

... the parent didn't seem to grasp the fact that there is such a thing as true concurrency at the hardware level ...

Actually, SMP exhibits limited concurrency but not true concurrency (total parallelism). Trivially, memory is shared - one thread CAN affect another, if permitted.

The better example is beowulf-class parallelism, where what happens on one node absolutely cannot affect any other.

My comment - two IPs - is flippant and directed at a person who obviously understands parallel algorithms. You have interpreted my comment in literal terms.

To make it clear, we're discussing whether functional languages are consciously influencing hardware design (I maintain they aren't). That is not to say that other designs - truly parallel machines - are not built specifically for functional languages (which tkatchev maintains).

[ Parent ]

That's not what I'm saying. (none / 0) (#507)
by tkatchev on Wed Feb 11, 2004 at 01:09:08 PM EST

Obviously, language design doesn't at all influence processor architectures.

Rather, as parallel processors start becoming more important, functional languages will become more useful and prominent to take advantage of these modern new-fangled processors.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Re Not Quite (none / 0) (#296)
by irwoodhouse on Mon Feb 09, 2004 at 02:23:39 PM EST

The device for mapping a language to a machine architecture is called a compiler.

You don't modify the language. Compilers are becoming more sophisticated as architectures evolve.

The "computing model" of functional languages has the same foundation as that used by imperative languages (Church-Turing).

We're talking commutative diagrams here.

Functional languages, Prolog, Lisp and expert systems all share the requirement for a powerful inference engine in order to unify LHS/RHS. Machine architectures do not show any sign of providing this.

[ Parent ]

Fallacies everywhere, where to start? (none / 1) (#306)
by tkatchev on Mon Feb 09, 2004 at 03:58:32 PM EST

First, it is dumb to delegate responsibility to compilers. You make it out as if "compilers" are everything, and "language" is nothing. If that were true, we'd be still using PL/I and COBOL.

Second, Turing's computational model is useful for describing the computer architectures we have, but not the other way around. We use a Turing machine as a way of simplifying the internal workings of your processor chip, but nobody in their right mind would ever even think about designing a chip based on a Turing machine.

Also, there are lots of other equally valid computational models that have nothing at all to do with Turing's machine.

Thirdly, functional languages aren't in any way whatsoever related to inference engines. (That's called "logic programming", and it is a completely different beast.)

Fourthly, it is very much possible to write functional compilers that are extremely small, extremely lean and map extremely well to current processor architectures. (Such a compiler would be good at solving a completely different class of problems, though.)

P.S. Look into something called "Markov algorithms". It is the functional equivalent of a Turing machine, except it is trivially easy to implement, maps very well to machine code and doesn't need an infinite tape.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Fallacies? Careful... (none / 1) (#352)
by irwoodhouse on Mon Feb 09, 2004 at 07:50:52 PM EST

First, it is dumb to delegate responsibility to compilers. You make it out as if "compilers" are everything, and "language" is nothing.

Oho - this statement can be interpreted two ways.

I was talking about making use of hardware features, in reply to your comment:

This won't fix the problems with branch prediction, pipelining, caches and all the other features of high-performance processors nowadays

As far as mapping to current execution architectures, what matters is what code is generated, regardless of paradigm and language. If the compiler cannot generate code to use a particular architectural feature, then the programming language can't use it (even if the language possesses that feature). Thus the compiler is everything.

For the second interpretation, it can be read as "why don't we all write in one language since the compiler makes the difference?" Viz your comment in this post:

If that were true, we'd be still using PL/I and COBOL.

Neither of these can twiddle bits or write to arbitrary memory locations, so you don't find OS or compiler writers using them. You certainly could design a record in COBOL to mirror an OS task structure, but you'd have to come to some understanding with the compiler about copying it into the CPU's TSS register. Similarly, Java is effectively interpreted (virtual machine) and is platform independent whereas C is not, so C cannot be used for writing applets.

So, when talking about applicability of languages (relevant to Joyce's article), it's horses-for-courses language-wise. When discussing hardware features, it's down to the compiler, so I do have my cake and eat it :)

Whoa. Next.

Second, Turing's computational model is useful for describing the computer architectures we have, but not the other way around.

Two ways I can answer this. Firstly, I don't believe I actually claimed that - it certainly wasn't intentional.

However, you're wrong, as follows:

Turing is a model of computability using a state machine. Kleene is a model using recursive functions. Church is a model using transformations. I've heard of Markov but not read the theory.

Regardless, all of the above are equally expressive and it has been proved there are exact mappings between them (this is basic Theory of Computation).

It is further true that a microprocessor is in itself a model and cannot be more computationally expressive than any of the above, otherwise, de jure, we would not be using the above theories as models of computation. (A real microprocessor also implements things outside the realm of computation, such as I/O, which is out of scope in this argument.)

Therefore, there must exist a direct commutative mapping between any of the above theories and any microprocessor. Therefore any microprocessor is an equally valid (albeit damn complicated) model of both computation and any of the above theories. And, in fact, the Berkeley RISC II is just such a model (a concrete implementation of which is called the Sparc...)

whew! (end of rant)

nobody in their right mind would ever even think about designing a chip based on a Turing machine

You'd be surprised. Some site (slashdot?) recently posted a link to a software emulation of either a Turing machine or something similar.

Thirdly, functional languages aren't in any way whatsoever related to inference engines. (That's called "logic programming", and it is a completely different beast.)

My bad. And I should know, I had the joy of studying a prolog inference engine written in prolog AND I also played with a functional language called Gofer and its implementation in C. I think I got riled about your apparent insistence that microprocessors are getting features from functional languages which can be explained via alternative routes.

I apologise.

... it is very much possible to write functional compilers ...

True, but that is down as much to the design of the compiler as to the language. As noted, the Gofer interpreter is written in C therefore we have a transitive path to execution with the graph edges being compilers.

P.S. Look into something called "Markov algorithms"

I will. You might like to look at a model called the Unlimited Register Machine (URM), which is Turing-equivalent but replaces infinite tape with unlimited (or limited but indefinite) registers which are directly addressable (meaning the model programs are shorter). It's the closest academic model I've seen to assembly. I doubt I'll reply to any more comments (I can't keep up replies to four different subthreads at once) :( but I will try to read any more replies.

[ Parent ]

Perhaps the main difference... (none / 1) (#157)
by joto on Sun Feb 08, 2004 at 12:58:34 PM EST

...between "modern" processors and C is that C essentially assumes that memory access is cheap.

Case in point: C makes it really hard to stuff more stuff into registers. This can be seen by the liberal use of the & operator. This is most obvious in calling conventions, e.g. call-by-reference.

Case in point: Restricted pointers. In C, two arrays can overlap the same memory area, so the compiler can't optimize code that reads from one array and writes to another. C9x introduced restricted pointers to override this, but it's really more of a hack than an elegant solution.
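To make that concrete, here's a minimal sketch (the function names are my own; C99 syntax):

    /* Without restrict, the compiler must assume dst and src could
       overlap, so it cannot freely reorder, unroll or vectorise the loop. */
    void scale(int n, double *dst, const double *src)
    {
        for (int i = 0; i < n; i++)
            dst[i] = 2.0 * src[i];
    }

    /* With C99 restrict, the programmer promises the arrays don't
       overlap, which frees the compiler's hands. */
    void scale_r(int n, double * restrict dst, const double * restrict src)
    {
        for (int i = 0; i < n; i++)
            dst[i] = 2.0 * src[i];
    }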

[ Parent ]

Yeah. (none / 0) (#224)
by tkatchev on Mon Feb 09, 2004 at 04:34:25 AM EST

Also, vectors and parallelism.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Memory access IS cheap. (none / 1) (#285)
by gte910h on Mon Feb 09, 2004 at 01:07:24 PM EST

You just have to hit the cache :)

[ Parent ]
So does every non-assembly language. (none / 2) (#297)
by irwoodhouse on Mon Feb 09, 2004 at 02:44:59 PM EST

That's the point of abstraction.

C has no notion of registers. Are you confusing the language with the compiler?

Even the register keyword in K&R was just a hint to the compiler that this variable was going to be hit frequently.

You really ought to look at the assembly output of gcc for some sample code before making such a generalisation (with various levels of optimisation). The compiler's opinion of what should go in a register isn't necessarily the same as yours.

For overlapping arrays, I assume you mean loop unrolling that writes over something you haven't read yet. It certainly makes it harder to get the CPU to do the work for you (e.g. intel family and the ESI/EDI register operations).

memmove() and bcopy() are both carefully written to handle this as efficiently as possible. For most systems, memXXXX and strXXXX are hand-coded either in native assembly (e.g. solaris on sparc) or independent assembly and mapped to native (e.g. glibc via gcc).

[ Parent ]

C has no notion of registers. (none / 0) (#308)
by tkatchev on Mon Feb 09, 2004 at 04:00:10 PM EST

Though it really, really should.

Stuff that made sense in 1970 sounds patently ridiculous nowadays.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

1970s stuff (off topic) (none / 0) (#360)
by irwoodhouse on Mon Feb 09, 2004 at 09:04:13 PM EST

That the 1970s stuff - C, Unix - got this far in the face of academic competition from other languages and market competition from other operating systems, without dramatic modification, is either a ringing endorsement of said stuff or a damning indictment of utter public stupidity.

Which is to say that the free market has made its choice and there ain't a damn thing you can do about it.

I like the question of why a paperless office never happened - nobody would have questioned this if paper had been invented first. The (academic) theory is that technology improves over time, but it seems that shuffling paper fits better with human psychology. So the theory is not universal.


[ Parent ]

A slight misunderstanding on your side? (none / 1) (#328)
by joto on Mon Feb 09, 2004 at 05:19:02 PM EST

C has no notion of registers.

I wasn't talking about explicitly putting stuff into registers. I was talking about ways the semantics of C, and common calling conventions, prevent the compiler from putting stuff into registers when it otherwise should have been able to.

Are you confusing the language with the compiler?

No.

Even the register keyword ...[snip]

I didn't even mention the register keyword. Where did you get the idea that I was speaking about it?

For overlapping arrays, I assume you mean loop unrolling that writes over something you haven't read yet. It certainly makes it harder to get the CPU to do the work for you (e.g. intel family and the ESI/EDI register operations).

Exactly, loop unrolling. This is essential for getting the most out of cache performance.

By the way, x86 string operations have been slower than just looping since at least the pentium. They're there for compatibility only, as are a lot of the other complex instructions (decimal arithmetic, etc). I believe the fastest bitblit is achieved by using floating point registers, or maybe one of the MMX/SSE/3DNow/whatever instructions.

memmove() and bcopy() are both carefully written to handle this ...[snip]

Handle what? Certainly you want to do other things to arrays than just copying them? There's nothing wrong with a C implementation that notices array-copying code and replaces it with a call to memmove, but that was not what I was talking about.

[ Parent ]

A slight lack of detail on your side? (none / 0) (#358)
by irwoodhouse on Mon Feb 09, 2004 at 08:44:34 PM EST

You'll need to be more specific. You said:

C makes it really hard to stuff more stuff into registers. This can be seen by the liberal use of the & operator. This is most obvious in calling conventions, e.g. call-by-reference.

Liberal use of the & operator... call by reference? What? C is explicitly call by value. C++ has call by reference (using the & operator in function definitions) but we aren't talking about C++ here. Otherwise you must mean actual parameters:

int a; call(&a);

I spend more time passing pointers to structs than I do passing simple types so even if C passed arguments in registers (which it doesn't - it's via the stack) it breaks down there.

Handle what? Certainly you want to do other things to arrays than just copying them? There's nothing wrong with a C implementation that notices array-copying code and replaces it with a call to memmove, but that was not what I was talking about.

Again, be more specific. You said:

In C, two arrays can overlap the same memory area, so the compiler can't optimize code that reads from one array and writes to another.

If it's not straight read-write then it's read-compute-write, and the compute step might be simple (add four) or complex (calculate standard deviation). If it's the latter, optimisation of the loop is the least of your problems.

As for Handle What? - memmove and bcopy handle copying of overlapping arrays whereas memcpy doesn't.
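A minimal illustration of the overlap case (my own example):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[] = "abcdef";

        /* Overlapping copy: shift "abcde" one byte to the right.
           memmove() is defined to handle the overlap correctly;
           memcpy() with these same arguments is undefined behaviour. */
        memmove(buf + 1, buf, 5);
        printf("%s\n", buf);    /* prints "aabcde" */
        return 0;
    }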

Exactly, loop unrolling. This is essential for getting the most out of cache performance.

Optimising for hardware is outside my experience, but I do recall an example of a trivial loop not easily automatically unrolled which can bugger the cache on a sparc because the array size equals the cache line size (Adrian's - the Porsche Guy from Sun - book on performance).

Some things have to be hand coded *shrug*. Applies double when talking to hardware because compilers see registers and memory and little else. This is more about being a clever programmer (which I assume you must be) than a problem with the language.

It's also hardware dependent. It's far easier for a compiler to make clever use of registers on the sparc (32) than intel architecture (4+2).

By the way, x86 string operations have been slower than just looping since at least the pentium.

I didn't know that. My reference is intel part number 230985-003 386DX Microprocessor Programmer's Reference Manual which doesn't feature the niceties of MMX (I do have a more recent IArch PRM in pdf somewhere but I've not had cause to use it).

Regardless, please be specific in your comments so the rest of us Mere Mortals can understand what you mean.

[ Parent ]

Sorry... (none / 1) (#363)
by joto on Mon Feb 09, 2004 at 09:52:18 PM EST

Liberal use of the & operator... call by reference? What?

Exactly. C doesn't have call by reference or out parameters. These could be passed by the compiler in clever ways (read - registers). And pointer arithmetic and so on causes all sorts of problems for alias analysis.
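A rough sketch of the pattern (divide is a made-up name): simulating an out parameter with & forces the variable to have a memory address, which the compiler can only optimise away if it can prove the pointer doesn't escape.

    #include <stdio.h>

    /* C has no out parameters, so callers pass a pointer instead. */
    static int divide(int a, int b, int *rem)
    {
        *rem = a % b;       /* the callee writes through the pointer */
        return a / b;
    }

    int main(void)
    {
        int rem;                        /* taking &rem pins rem to the
                                           stack unless the compiler can
                                           prove the pointer never escapes */
        int q = divide(17, 5, &rem);
        printf("q=%d rem=%d\n", q, rem);
        return 0;
    }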

I spend more time passing pointers to structs than I do passing simple types so even if C passed arguments in registers (which it doesn't - it's via the stack) it breaks down there.

C does pass arguments in registers. On x86 it doesn't, unless you use a nonstandard calling convention, but on any sane platform, it does so by default.

If you pass a struct (i.e. not a pointer to a struct), it can be broken up by the compiler and passed in registers too. ML typically does this. There is nothing stopping C compilers from doing it, and I think there's at least one common platform where that is part of the standard calling convention (although I don't really remember which).

If it's not straight read-write then it's read-compute-write, and the compute step might be simple (add four) or complex (calculate standard deviation). If it's the latter, optimisation of the loop is the least of your problems.

Ahh, yes. My mind was focused on memory operations at the moment. Sorry about the confusion.

It's also hardware dependent. It's far easier for a compiler to make clever use of registers on the sparc (32) than intel architecture (4+2).

Yes, x86 is atypical in that respect, and smells funny ;-) And yes, it's hardware dependent. But that was exactly my point. While C might have been close to the metal back in the days of PDP-11, today it is an odd match for modern computers, and makes it hard for the compiler to produce fast code. A little bit more abstraction would have made it possible to run even faster. Then again, it wouldn't have been C, but then again, we shouldn't need to use C for application programming anymore either.

[ Parent ]

Understood (none / 0) (#502)
by irwoodhouse on Wed Feb 11, 2004 at 09:56:29 AM EST

By and large I agree with you, now I understand what you mean. What you say makes particular sense on architectures where large register windows are used.

I have two comments, though.

C wasn't designed with writing the fastest possible code in mind. It was designed to add language features not available in B, and rewritten several times to enable the compiler to fit in the PDP-11's small memory.

Whilst not necessarily a design goal, C also ended up reasonably portable because it neither made assumptions about, nor abstracted too far from, the hardware. This is why it is still so pervasive.

Nothing is faster than assembly hand-written by someone who knows, down to the clock cycles required per opcode (spot the x86 programmer?), the exact nature of the target CPU.

Obviously the intelligence behind hand-coding cannot be automated in the form of a compiler but (my second note) I still think the compiler has a large responsibility to sensibly make use of registers and otherwise optimise code.

To illustrate, an anecdote I recall from my lecturer in high performance architectures. From a piece of C for VMS on VAX, the antique PCC produced approximately three printed pages of assembly. GCC optimised and produced two pages. Digital's own compiler produced a single instruction, having determined that the code actually did nothing other than spit out a constant (clearly bad news if the rest was inserted for timing :)

GCC is very portable, but pales by comparison with the likes of Digital's compilers for VAX and alpha, and Intel's optimising compiler for iArch.

A second anecdote, this time from personal experience. I support machines used to solve classified partial differential equations (competition: guess the field). The tools written to do this use fortran (i.e. not C) and a compiler written by the vendor, for speed. However, I do not imagine the vendor is going to rewrite the operating system in fortran any time soon.

[ Parent ]

Please clarify (none / 1) (#396)
by ttsalo on Tue Feb 10, 2004 at 07:23:08 AM EST

For some time now, the number one rule for C optimization (other than algorithmic optimization, like replacing bubblesort with quicksort) has been: "Don't. The compiler does it better." And processor architectures that are becoming more and more convoluted, with 3-level caches, 20-stage pipelines, speculative execution and stuff like that, are making this even more true.

So what exactly are you proposing here?

[ Parent ]

Don't let the door hit you on the way out... (2.42 / 14) (#33)
by BenJackson on Sat Feb 07, 2004 at 06:13:51 PM EST

I for one applaud your dislike of C.  That leaves more jobs available for those of us who can be trusted to use sharp tools.

By the way, how come your rant didn't include the lack of a native complex number type, or the lack of proper tail recursion?  Not to mention no lazy evaluation, no garbage collection and it makes you use symbols for the syntax instead of plain English.  It can't even figure out your program structure from looking at the whitespace.  And it's taking up 1/26th of all of the namespace available for one-letter programming language names!

re: complex numbers (none / 1) (#45)
by fae on Sat Feb 07, 2004 at 07:31:16 PM EST

The article does have a bit on that.

(Personally, I don't think C99 should have included that particular bloat.)

-- fae: but an atom in the great mass of humanity
[ Parent ]

Variable sized arrays (none / 0) (#62)
by Norkakn on Sat Feb 07, 2004 at 09:44:31 PM EST

Wouldn't they be implemented about the same time as variably sized arrays?

Standards are fun, but actually getting stuff to compile is another matter (bloody out-of-date compilers that some nitwits still use).

[ Parent ]

Depends what you mean (none / 0) (#65)
by fae on Sat Feb 07, 2004 at 09:59:11 PM EST

In C, all things called "arrays" are declared at compile time. These are always of a fixed size.

You can also make things very similar to "arrays" at runtime, through malloc, and then resize them using realloc.
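Something like this rough sketch of the usual grow-by-doubling idiom (my own code):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t cap = 4;
        int *a = malloc(cap * sizeof *a);
        if (a == NULL) return 1;

        for (size_t i = 0; i < 10; i++) {
            if (i == cap) {                    /* out of room: double it */
                int *tmp = realloc(a, 2 * cap * sizeof *a);
                if (tmp == NULL) { free(a); return 1; }
                a = tmp;
                cap *= 2;
            }
            a[i] = (int)i;
        }
        printf("stored 10 ints in a \"resizable array\"\n");
        free(a);
        return 0;
    }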

-- fae: but an atom in the great mass of humanity
[ Parent ]

You're out of date (none / 0) (#69)
by curien on Sat Feb 07, 2004 at 10:35:20 PM EST

C99 has true variable-length arrays.
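For instance (a minimal C99 sketch; print_squares is a name I just made up):

    #include <stdio.h>

    void print_squares(int n)
    {
        int a[n];                  /* C99 VLA: size chosen at run time */
        for (int i = 0; i < n; i++)
            a[i] = i * i;
        for (int i = 0; i < n; i++)
            printf("%d ", a[i]);
        printf("\n");
    }

    int main(void)
    {
        print_squares(5);          /* prints: 0 1 4 9 16 */
        return 0;
    }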

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
ah (none / 0) (#70)
by fae on Sat Feb 07, 2004 at 10:59:37 PM EST

I thought he was talking about resizable arrays. Neeever mind.

-- fae: but an atom in the great mass of humanity
[ Parent ]
Written "variable sized arrays" (none / 0) (#114)
by tkatchev on Sun Feb 08, 2004 at 07:15:25 AM EST

Read as "garbage collection".

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

actually.... (none / 0) (#212)
by pb on Mon Feb 09, 2004 at 12:30:04 AM EST

gcc has gotten better at tail recursion... not great, but at least good. Also, it's dead easy to do garbage collection in C, provided you use the Boehm-Demers-Weiser conservative garbage collector, or some similar package.

I guess it just goes to show, you don't necessarily have to bolt everything into the language specification. :)
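A minimal sketch of what using the Boehm collector looks like (assuming the stock gc.h interface; link with -lgc):

    #include <gc.h>
    #include <stdio.h>

    int main(void)
    {
        GC_INIT();
        for (int i = 0; i < 1000000; i++) {
            int *p = GC_MALLOC(sizeof *p);  /* allocated, never free()d */
            *p = i;                         /* the collector reclaims it
                                               once it's unreachable */
        }
        printf("done\n");
        return 0;
    }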
---
"See what the drooling, ravening, flesh-eating hordes^W^W^W^WKuro5hin.org readers have to say."
-- pwhysall
[ Parent ]

+1FP Original! (2.42 / 7) (#37)
by Run4YourLives on Sat Feb 07, 2004 at 06:34:44 PM EST

Finally, someone has found something wrong with this language and speaks up.

Please go away.

It's slightly Japanese, but without all of that fanatical devotion to the workplace. - CheeseburgerBrown

wtf is pascal? [nt] (2.57 / 7) (#39)
by quartz on Sat Feb 07, 2004 at 06:51:09 PM EST



--
Fuck 'em if they can't take a joke, and fuck 'em even if they can.
Borland renamed it to Delphi (none / 0) (#57)
by lukme on Sat Feb 07, 2004 at 08:32:09 PM EST

Pascal was envisioned as a teaching language, to aid in teaching structured programming. It was very popular in academic course work in the 70's and 80's.

One of the things I find interesting is how old things get reinvented as new things. Pascal was to compile things to P-Code, which was an intermediate language meant to be interpreted by a virtual machine. Doesn't this sound like Java?

Quite frankly, in '89 I learned C and never went back. There were some things that were more tedious to do in Pascal than in C (I can't remember exactly what they were, since it's been 15 years).




-----------------------------------
It's awfully hard to fly with eagles when you're a turkey.
[ Parent ]
vm's.. (none / 0) (#67)
by Work on Sat Feb 07, 2004 at 10:01:34 PM EST

Pascal was to compile things to P-Code, which was an intermediate language meant to be interpreted by a virtual machine. Doesn't this sound like Java?

Those ideas have been around since the 50s and 60s... Lisp (and its dialects) is also usually interpreted and running on a VM. And lisp is probably the oldest language still in use. Older than fortran even.

[ Parent ]

doh, im wrong (none / 1) (#68)
by Work on Sat Feb 07, 2004 at 10:02:38 PM EST

fortran *is* older than lisp by less than a handful of years.

[ Parent ]
Interpreted Lisp? (none / 1) (#147)
by ChazR on Sun Feb 08, 2004 at 12:00:00 PM EST

Most popular Lisps are compiled these days. Certainly, CMU CL, GNU CL and SBCL all compile to native machine code. In many cases, they're as fast as or faster than C.

[ Parent ]
scheme isn't usually... (none / 0) (#153)
by Work on Sun Feb 08, 2004 at 12:30:27 PM EST

Of course any lisp (or language for that matter...) can be compiled into machine code.

[ Parent ]
Whoa there! (none / 0) (#271)
by ksandstr on Mon Feb 09, 2004 at 12:12:29 PM EST

Most LISP used to be compiled, if only to byte-code, even back when compilation to machine instructions wasn't as common as it is today, if only because skipping a comment within a loop (or mapcar or whatever) as many times as the loop was executed was wasteful even back then. The byte-code you'd get in a LISP environment would obviously be quite different from what you'd see in, say, a JVM, but that is really beside the point.

For some reason, people seem to mistake a bulky and extensive runtime for an interpreter these days, which is kind of funny considering things like C# and Javur.


Fin.
[ Parent ]

I think people confuse Lisp with Emacs. (none / 0) (#309)
by tkatchev on Mon Feb 09, 2004 at 04:01:20 PM EST

I say blame RMS.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

VM-based Lisps. (none / 0) (#524)
by voodoo1man on Wed Feb 11, 2004 at 08:16:20 PM EST

Those ideas have been around since the 50s and 60s... Lisp (and its dialects) is also usually interpreted and running on a VM. And lisp is probably the oldest language still in use. Older than fortran even.

Not quite true. The first VM-based Lisp implementation was Interlisp (I forget whether the -10 on the TENEX or the -D on the Xerox machines was actually first), and this was in the mid 70s. Up until then, most Lisps were interpreted, and Maclisp had a compiler. Interlisp was heavily influenced by some of the ideas from Smalltalk, and this is probably where the idea of a persistent, machine-independent VM came from (I'm pretty sure that Smalltalk was the first language to use one, although APL had a similar notion of "workspaces" which may have predated Smalltalk).

Nowadays, the only* Common Lisp that I'm aware of that runs on a VM is GNU CLISP (it's also incidentally the only Common Lisp implementation without a native code compiler). There's also Symbolics Open Genera (which is literally a virtual machine emulating the Symbolics Ivory hardware), which runs on the DEC Alpha, and the Medley, which is an Interlisp VM which used to work on a variety of machines, but hasn't been maintained in quite a few years.

* - In the near future (as soon as it supports more of the standard) Armed Bear lisp will also be on this list, since it runs on top of the JVM.

Oh, yeah. As other people have noted, Fortran predates Lisp by a number of years, which makes Fortran and Lisp, respectively, the 1st and 2nd oldest programming languages still in use.

[ Parent ]

Pascal. (none / 1) (#115)
by tkatchev on Sun Feb 08, 2004 at 07:16:22 AM EST

Pascal had no support for opening files, for example. How cool is that?

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Nor any support for pointers (none / 0) (#365)
by lukme on Mon Feb 09, 2004 at 10:16:57 PM EST

and dynamically allocated memory, for that matter.


-----------------------------------
It's awfully hard to fly with eagles when you're a turkey.
[ Parent ]
Though there is nothing wrong with that. (none / 0) (#397)
by tkatchev on Tue Feb 10, 2004 at 07:34:47 AM EST

Originally, the point of Pascal was to create a minimalist language that would be dead-simple to compile straight to machine code.

If that meant sacrificing some features, well, so be it.

So the "limitedness" of Pascal is not necessarily bad.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Another fine rant. (2.84 / 25) (#41)
by it certainly is on Sat Feb 07, 2004 at 07:13:44 PM EST

Very good, sir. Here are some of my disagreements, at random.
  • no string type: C is not a high level language. Repeat after me: C is not a high level language. No modern processor actually has "string" support (x86 has some useless pascal-style string instructions, limited to 255 byte strings), and C always works in terms of the processor. As strings these days need to include full unicode support, I think C made the right choice: use an external library to do strings. The standard library has better things to do than bloat. The string library as it stands is small and useful for the occasional use of strings. For heavy use of string manipulation, use Perl. C is a low level language. It is not a string manipulating language.
  • no bool type: what is it with you and strong typing?!?!?!?! Gah, I can't stand that. Processors have a "zero" flag. The zero flag gets set on comparison of equal integers. In C, something is false if it's 0 (Branch If Zero), or else it's true (Branch If Not Zero). Actually forcing people to change nonzeros to explicitly the "true" constant is the kind of paper-pushing, dotting-the-'i's-and-crossing-the-'t's shit that low level languages like C do not put up with.
  • anything else to do with explicit language support for datatypes and structures: fuck off. C is a low level language. Use something else for frilly bits.
  • any language features that are not basic features of the von Neumann architecture: fuck off. C is a low level language. Use something else for frilly bits.
  • library cruft: we should just re-write old programs, then? You'll note that atoi() and atol() are just simple macro front-ends to the generic strtol().
  • operators: "*", "[]" and "->" are dereferences. "." is a structure offset. They are two different things, and if you don't know the difference then you are probably writing inefficient code with far more dereferences than necessary.
  • string cases: are you mad? Using ifs and strcmp()s is identical to how a higher level language would implement switch() on actual strings. You'll probably complain that case labels can't be variables, too. That is deliberate, as the switch() construct compiles directly to a jump table, or a list of well-ordered "subtract #constant / branch if zero" commands. That's what it's for.
  • endianism and alignment: if you didn't know this and take it into consideration, it's your own damned fault. The problem is never the actual endianness or alignment rules, it's those idiots who read and write data to disk or network where it might be used by different computers or software. Never do this. EA wrote the Interchange File Format just for you. All data should be serialised as a well-defined bytestream, not as a raw memory dump (see the sketch after this list). Raw memory dumps are what debuggers are for.
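To spell that out, a minimal sketch of the difference for one 32-bit value (write_raw and write_be32 are my own names):

    #include <stdint.h>
    #include <stdio.h>

    /* Wrong: raw memory dump. The byte order (and, for structs, the
       padding) depends on the machine that wrote it. */
    void write_raw(FILE *f, uint32_t v)
    {
        fwrite(&v, sizeof v, 1, f);
    }

    /* Right: a well-defined bytestream (big-endian here), readable by
       any machine regardless of its native endianness. */
    void write_be32(FILE *f, uint32_t v)
    {
        unsigned char b[4] = {
            (unsigned char)(v >> 24), (unsigned char)(v >> 16),
            (unsigned char)(v >> 8),  (unsigned char)(v >> 0)
        };
        fwrite(b, sizeof b, 1, f);
    }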


kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.

When other people do this, (none / 0) (#46)
by rmg on Sat Feb 07, 2004 at 07:37:09 PM EST

I can look past it when most people around here do this kind of thing, but you, it certainly is? Tsk, tsk.

_____ intellectual tiddlywinks
[ Parent ]

I am thinking of the lurkers. (3.00 / 4) (#47)
by it certainly is on Sat Feb 07, 2004 at 07:45:26 PM EST

I do not want them to read this troll and think C is a bad language. I want them to go away thinking C is a low-level language, and understand why that could be both good and bad, and in what scenarios.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

Let the lurkers fend for themselves. (none / 0) (#48)
by rmg on Sat Feb 07, 2004 at 07:47:43 PM EST

If you're going to bite, hold out for the good stuff.

_____ intellectual tiddlywinks
[ Parent ]

It's not bad, you know. (none / 3) (#49)
by it certainly is on Sat Feb 07, 2004 at 07:48:51 PM EST

He raises many interesting points. Yes, I can see it now. He definitely has something there.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

Nitpickings (none / 1) (#92)
by joto on Sun Feb 08, 2004 at 04:52:23 AM EST

Processors have a "zero" flag.

Not all processors. Some RISC architectures just use a normal register. There are probably other approaches too.

"*", "[]" and "->" are dereferences. "." is a structure offset. They are two different things,

Yes, any programmer worth their salt knows this. But so does the compiler! And therefore it's just another annoyance, a design-bug that complicates the language for no apparent reason. The only argument in favour of -> is C++ smart pointers, but they were certainly not invented at the time.

[ Parent ]

But, being explicit is Good. (none / 0) (#266)
by ksandstr on Mon Feb 09, 2004 at 11:48:49 AM EST

A counter-example is when an ill-disciplined C++ neophyte implements a basic vector/matrix math library and sees fit to include overloaded operators for every single fucking datatype that he declares. Then, every unit test he'll write will be littered with opaque expressions such as "Vector<4, float> nutter = (mash*potato) | yoink;" -- good luck refactoring that kind of crap without some kind of an AI-complete text editor that'll instantaneously know what type your expressions are and wipe the programmer's ass with the appropriate (lack of) documentation.

Being explicit is Good, since it promotes transparency, and the virtues of transparency should be known to all but the most chronic of OO layering wankers.


Fin.
[ Parent ]

What are you talking about? (none / 0) (#320)
by joto on Mon Feb 09, 2004 at 04:54:08 PM EST

You mean you regularly refactor unit tests from neophytes attempting to write a basic vector/math library in C++?

[ Parent ]
I think what he's trying to say (none / 1) (#391)
by it certainly is on Tue Feb 10, 2004 at 06:03:29 AM EST

is what I said. A programmer should know when he is doing a pointer dereference, and he should have to write it explicitly. That way, he may stop and think, and he may devise a method to remove the need for a dereference, thus saving CPU cycles. It's good mental discipline. These Java fools don't realise how horribly inefficient their programs are.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

What's so bad about dereferencing pointers? (none / 0) (#398)
by tkatchev on Tue Feb 10, 2004 at 07:36:33 AM EST

A broken pipeline can do more damage than a thousand dereferenced pointers could.

Really, it is for this reason that C should be banned. It gives a totally unrealistic and plain wrong view of the hardware architecture.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Data locality. (none / 0) (#420)
by it certainly is on Tue Feb 10, 2004 at 09:18:02 AM EST

If you are dereferencing a pointer, the chances are that the target is not in the cache. Which, of course, leads to the same broken pipeline that incorrect branch predictions, etc, lead to.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

Chances are. (none / 0) (#430)
by tkatchev on Tue Feb 10, 2004 at 11:21:43 AM EST

But, really, you have no way of telling from inside C.

Which is precisely why C sucks.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

No, not really (none / 0) (#401)
by ksandstr on Tue Feb 10, 2004 at 08:09:03 AM EST

The vector/math library unit test example was just the first thing that came to mind with regard to the opaqueness problems often introduced by operator overloading. What I was trying to say is that while, at the end of the day, the compiler may know what to do with a given expression it doesn't mean that it's a good idea to write code that cannot be understood without memorizing documentation that may not even exist.


Fin.
[ Parent ]
Von Neumann architectures. (none / 1) (#117)
by tkatchev on Sun Feb 08, 2004 at 07:19:40 AM EST

Really, how much of von Neumann is left in modern processors? I'd guess not much.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

C has a boolean type (1.57 / 7) (#42)
by Rupert Pupkin on Sat Feb 07, 2004 at 07:15:34 PM EST

ranting about things you don't know makes you look stupid, stupid.

Why a chainsaw is not my favortie tool (2.78 / 19) (#50)
by bugmaster on Sat Feb 07, 2004 at 08:06:12 PM EST

It is nearly impossible to screw in bolts with a chainsaw: you just end up with broken bolts and/or chainsaw. It doesn't even have a phillips head ! How stupid is that ?

That's because the chainsaw is a specific tool made to do a specific job. So is a screwdriver. So are C, Java, PHP, and whatnot. You wouldn't use a microscope to hammer nails (ok, maybe you would, I don't know), and you wouldn't use Java for embedded or real-time programming, and you wouldn't use C to create GUIs. What's the problem ?
>|<*:=

Actually... (3.00 / 3) (#90)
by joto on Sun Feb 08, 2004 at 04:40:21 AM EST

I would use java for embedded and/or realtime programming. Come to think of it, I already have. And if the rest of your program is already in C, it makes a lot of sense to write the GUI in C too, something I have also done. So what's your point?

[ Parent ]
Not the best tool (none / 2) (#119)
by bugmaster on Sun Feb 08, 2004 at 07:27:47 AM EST

Yes, you can use anything for anything else. However, it makes more sense to use the right tool -- if you have a choice, that is. Of course, if the rest of your program is in C, you'd write the GUI in C as well, but in this case your choices are limited. I've had to write GUI in C as well, and I didn't like it.

Embedded-wise, it's a grey area. Sure, there are some devices powerful enough to run Java -- cellphones come to mind. However, if you're using something like the PIC16C or even the AVR microcontroller, there's no way you can run Java. I mean, come on, PIC16C has one working register, no stack, and no RAM. How would you run a JVM on it ?
>|<*:=
[ Parent ]

Java for Realtime Programming? (none / 1) (#178)
by rbt on Sun Feb 08, 2004 at 04:11:25 PM EST

I would use java for embedded and/or realtime programming

I've often wondered how to prevent the garbage collector from running at exactly the wrong point in time in realtime or time-sensitive situations.

[ Parent ]
True, it's a tricky problem. (none / 1) (#204)
by joto on Sun Feb 08, 2004 at 08:52:36 PM EST

There are some ways out, however:
  1. You can avoid allocations after initialization time. (True, java makes that hard, but it's possible).
  2. You can work with something that isn't hard realtime, e.g. it's allowed to fail once in a while. (Many data-acquisition systems fit into this scheme).
  3. Since you wrote the program, you probably know better than java when it's safe to GC, and you can take over the control of this.
  4. You can use a realtime garbage collector.

The last point is probably the most important. It really exists.

Realtime GC will typically give you much better guarantees than e.g. malloc/free. But if you are the type of guy that inspects assembly code to count processor cycles with cache disabled, then you'll probably do better the traditional way.

[ Parent ]

Re: Why a chainsaw is not my favortie tool (none / 0) (#395)
by ttsalo on Tue Feb 10, 2004 at 07:05:48 AM EST

What's the problem ?

The problem is that I don't want to carry around the whole friggin' toolshed with me just to always have exactly the right tool at hand. It's too inconvenient.

[ Parent ]

C is not Portable? (2.66 / 9) (#58)
by lukme on Sat Feb 07, 2004 at 08:44:18 PM EST

ANSI C is the standard.

K&R C is the older standard.

POSIX C defines some things in addition to ANSI C.

GNU C, MicroSoft C, Think C, Power C, Borland C, ... are all implementations of ANSI C.

I bet the following:

1) you have never read the ANSI C standard.
2) you don't realize that ANSI C compilers can compile about 95% of K&R C (there are a few minor differences you need to be aware of).
3) You have never worked on a portable C project.
4) You have never needed to have your code run fast.


-----------------------------------
It's awfully hard to fly with eagles when you're a turkey.
Bzzt. (2.50 / 4) (#89)
by joto on Sun Feb 08, 2004 at 04:36:37 AM EST

ANSI C is the standard.

It's the American standard, yes. The rest of the world prefers to use ISO C. They are more or less the same.

K&R C is the older standard.

No, it never was, and never will be, a "standard". It was just the language described by Kernighan and Ritchie in their book about C. This book never strove to be a standards document. But since the ANSI standard for C changed a lot of things, it became necessary to invent a way to specify which dialect you talked about. Thus K&R.

POSIX C defines some thing in addition to ANSI C

Yes. So does any other unix spec, and there are so many to choose from... Luckily, there are relatively few contradictions.

GNU C, MicroSoft C, Think C, Power C, Borland C, ... are all implementations of ANSI C.

Well, they are all implementations of a subset of the superset of all official ANSI/ISO C specs, with extensions for K&R C, and private extensions specific to that compiler/platform, a few documented and undocumented bugs, etc...

[ Parent ]

Here are some of my honest questions for you. (none / 1) (#371)
by lukme on Mon Feb 09, 2004 at 10:52:22 PM EST

1) Does the ANSI C committee consist only of Americans, or is there more of an international flavor to it?

2) Didn't ISO just adopt the ANSI C standard?

3) Wasn't the starting point for any of the "K&R" implementations their book? Would you agree that their book would be a de facto standard?

4) When the ANSI C committee started, didn't they consider the K&R book, as well as common implementations and working code?

5) Of all of the features defined in the ANSI/ISO standard, aren't the ones that are specified as SHALL for the most part (aside from bugs) implemented in the various implementations? Do you think that this subset could be used to build a cross-platform application?

6) This may be an odd question: do you believe 1-2-3 evaluates to 0 as opposed to -4?




-----------------------------------
It's awfully hard to fly with eagles when you're a turkey.
[ Parent ]
portability (none / 0) (#448)
by mikpos on Tue Feb 10, 2004 at 02:27:51 PM EST

ANSI does allow international members. Unfortunately their rosters are in Microsoft Word format, so I can't tell what international members they have right now. If I remember right, the 1989 ANSI standard was used to create the 1990 ISO standard (the texts would be the same). For the current 1999 standard, I don't recall hearing about any direct involvement from ANSI. Mind you I'm making this all up, so it's likely wrong.

If K&R is the de facto standard for anything, it would be programming style: specifically, it's the only document I'm aware of that promotes correct formatting. It would be a de facto standard in a rather loose way, in that implementations didn't follow it to the letter. It's not the sort of standard that one could "follow" in any meaningful way as a user, though, as when push came to shove your compiler and K&R would disagree.

In my mind, there are only two C standards: C89 and C99. POSIX C I would not consider a C standard. It provides a standard for Unix system calls and library function calls, and makes some requirements along the lines of "a POSIX system should have a C compiler installed", but the standard for the C language on POSIX systems is C89, not POSIX. This is the same for all other platforms: the C standard is C89 or C99; there are no other standards.

Yes, when the standard was created, they looked at common implementations. What's your point? The only other option is to create an entirely new language (design by committee) and have no one use it. The point of standard C is not to have no one use it.

I don't understand your question 5. Are you asking if one can write an application in standard C? The answer is "yes" up to certain definitions of "application". One can receive input, perform calculations, and produce output. Maybe that's not exciting enough, I don't know.

[ Parent ]

C Is Not Your Favourite Prog Language because (2.69 / 13) (#61)
by Idioteque on Sat Feb 07, 2004 at 09:40:13 PM EST

you're using it for the wrong applications and you don't understand how to use it correctly. C is very powerful and much closer to the hardware than you seem to want it to be; obviously you need another language. Please don't knock this great language just because you don't understand its uses.


I have seen too much; I haven't seen enough - Radiohead
+1FP, Finally someone brave enough to mention it! (1.66 / 6) (#66)
by Azmeen on Sat Feb 07, 2004 at 09:59:13 PM EST




HTNet | Blings.info
once again... (2.33 / 6) (#71)
by teichos on Sat Feb 07, 2004 at 11:12:07 PM EST

You belittle the technology and its standards when they don't save you from your own stupidity. Perhaps you're just not doing it right?

flames and modbombs are the most pathetic forms of flattery
C is stupid. (1.80 / 5) (#72)
by SIGNOR SPAGHETTI on Sun Feb 08, 2004 at 12:13:33 AM EST

You can't write a useful program in C without first implementing a LISP interpreter.

--
Stop dreaming and finish your spaghetti.

So you don't consider Lisp useful? (none / 2) (#87)
by joto on Sun Feb 08, 2004 at 04:14:44 AM EST

That would be the logical conclusion of what you said. But why would you need to write it then?

[ Parent ]
No, (none / 1) (#274)
by ksandstr on Mon Feb 09, 2004 at 12:19:18 PM EST

You can't write a useful program in C without first implementing a LISP interpreter.

And that is plainly your problem and not mine and not a problem of the C standard body or especially one of its users either.


Fin.
[ Parent ]

What if you want to write a Lisp Interpreter? (nt) (none / 0) (#525)
by UserGoogol on Wed Feb 11, 2004 at 09:03:00 PM EST



[ Parent ]
C programming is for artists (2.83 / 12) (#75)
by mstefan on Sun Feb 08, 2004 at 01:31:15 AM EST

You can make good art, or bad art. But at least you have complete control over the canvas, the paints and the brushes. And if you want to paint outside the lines, you can do that too. Whatever the machine can do, C (and a little inline assembly where required) will let you do.

The difference between Pascal and C is the difference between using a paint-by-number kit with crayons and a canvas with oil paints. You end up with a picture in both cases, and it's certainly easier to make a mess with oils. But in the hands of a master, there's no doubt which is the superior medium.



.Please enter a subject for your comment. (none / 1) (#93)
by qwertyuiop666 on Sun Feb 08, 2004 at 04:59:18 AM EST

That pretty much hits the nail on the head in the whole C vs C++ vs Java vs C# vs Python etc etc debate.

Do you mind if I use that quote?

[ Parent ]

Uh so tell me again... (none / 2) (#118)
by tkatchev on Sun Feb 08, 2004 at 07:24:59 AM EST

How do I control branch prediction from C? There was also something about vector architectures I heard somewhere sometime...

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

With gcc (3.00 / 3) (#121)
by Hot For The Teacher on Sun Feb 08, 2004 at 08:00:57 AM EST

__builtin_expect

But I don't understand what you're trying to criticize.
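For example (gcc-specific; the likely/unlikely macros are the common idiom, not part of the language):

    #define likely(x)   __builtin_expect(!!(x), 1)
    #define unlikely(x) __builtin_expect(!!(x), 0)

    int process(const int *p)
    {
        if (unlikely(p == NULL))   /* hint: this branch is rarely taken */
            return -1;
        return *p + 1;
    }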

[ Parent ]

Agreed (3.00 / 3) (#133)
by mstefan on Sun Feb 08, 2004 at 09:58:10 AM EST

I'm not sure why the ability to control branch prediction would be considered a good thing in the context of this article. The general argument, that C gives you too much rope to hang yourself, wouldn't be helped by adding branch prediction functions to the standard. If anything it would make it worse because most programmers suck at manual optimization of their code, and even when they're very good at it, it's often at the expense of readability.

I'd tend to think of __builtin_expect in the same way as __attribute__((always_inline)) or __forceinline; useful in some limited cases, but easily abused and, in general, a job best left for the compiler.



[ Parent ]
My point is: (none / 1) (#171)
by tkatchev on Sun Feb 08, 2004 at 02:55:48 PM EST

Either C is a low-level language, or it isn't.

What's the point of writing code for a low-level machine that doesn't correspond to your processor architecture? That's about as dumb as trying to program the JVM by writing bytecode directly to a file.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Why so black and white? (3.00 / 4) (#173)
by mstefan on Sun Feb 08, 2004 at 03:36:55 PM EST

It would be nice if problems could be viewed this way, but often they can't be. I'd consider C a hybrid "medium level" language, providing some higher level constructs while still letting the developer address the problem in terms of the machine and still providing some degree of portability between platforms. C can be ideal for operating systems, drivers, components (where footprint and performance are paramount) and so on. For that user application that talks to your SQL database, it's not, and in fact would be counter-productive. So we're back to the "if all you have is a hammer, everything looks like a nail" issue. But it doesn't have to be that way.

Don't get hung up on trying to pigeon-hole the language; whether it's considered low-level or not is beside the point. The original article makes a few valid points, but more than half of it is complaining about the lack of a string type and the various non-conforming implementations and extensions out there. I don't really consider a lot of what has been raised to be an issue with the language per se; well, there was that whole multilevel break snivel, but an old professor of mine once told me "If you find yourself needing to use goto or a multilevel break, then your code is muddled; don't hack, rewrite." But that's a whole other issue.



[ Parent ]
No, that's not the problem. (none / 2) (#222)
by tkatchev on Mon Feb 09, 2004 at 04:27:32 AM EST

C models an outdated processor architecture of the 1970s. What we need is something exactly like C, except with support for the radically different processors we have today.

It turns out that when the compiler tries to "recode" this 30-year-old architecture into a form that is palatable to your desktop machine, the compiler becomes insanely complex, bloated and produces code that is relatively slow.

We need C with modern semantics, that is all.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Well then, (none / 2) (#254)
by ksandstr on Mon Feb 09, 2004 at 11:10:47 AM EST

You go right ahead and pick up some Haskell. In the three years you'll spend, at the least, getting proficient with it, the compiler technology will have improved to the point where it's irrelevant from an execution performance standpoint whether you're writing in Haskell, yh-wur or old grandpa C. From a C.S. point of view, it'd obviously be a good thing to give the compiler but an idealized abstract description of the algorithm (i.e. Haskell source or some equivalent form) and let it apply whatever transformations it sees necessary, but at the end of the day, could you imagine the hordes of ill-disciplined Java wankers spending 1095 times what the "in 24 hours flat!" book promised learning a language so easily dismissed as esoteric computing science?

By the way, I'd be quite interested in hearing which particular machine architecture today doesn't conform to the old "bunch of registers here, bunch of memory there, stream of instructions here" model.  Oh sure, there's SIMD and SMP and SMT and everything, but two of those things are but an instance of the optimization paths that occur in a modern multi-tasking operating system and the other is just an instruction-set performance hack readily wrapped in sets of low-to-middle level compiler intrinsics. So pray tell, what new primitives and semantics would you like in "C, Next Generation"? (And don't give us the tired old "object features!!!" crap, we've all seen where that particular path leads.)


Fin.
[ Parent ]

Think out of the box. (2.50 / 4) (#275)
by tkatchev on Mon Feb 09, 2004 at 12:21:38 PM EST

The Turing machine is useful as a concept that describes our desktop computer in an easy way, but it isn't a guide for designing computers.

There are lots and lots of other perfectly fine, perfectly functional and performant computational models.

For example, the currently popular functional paradigm. It is very well suited for parallel stream processing. Good old imperative languages are good for writing things like event handling loops, etc. But neither of the two is more complicated than the other, and neither is more, or less, efficient than the other. It all depends on the problem you're trying to solve.

There are lots and lots of problems where coding it in Haskell would be much simpler, more straightforward and more efficient than coding it in C. (Besides, Haskell is a very easy language to learn and to program in; doubly so for a beginner who hasn't been exposed too much to C or Pascal before.)

The problem, at the moment, is the lack of decent functional compilers.

The lack of functional compilers is due to the fact that most people write code that targets the 30-year-old C "virtual machine" that is implanted in their brains. (Few people want to learn arcane machine-code architectures just to write a compiler. Targeting C is ever so much easier.)

Meanwhile, modern processors are diverging more and more from the 1970's-style architecture of C. It makes progressively less and less sense to stick to the architecture C proposes.

In practical terms, what C needs is support for proper tail calls, parallelism, concurrency and SIMD. Also, explicit control of the stack, heap and registers. Controlling the cache and branch prediction would be nice as well. Control of the exact length and format of integers is needed. (Or at least a decent way of figuring out this information.) The ability to change (or specify) calling conventions would be good. Etc., lots of other stuff I forgot.

(Check out this site for more ideas. These guys are on the right track.)


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

That's quite cool. (none / 0) (#170)
by tkatchev on Sun Feb 08, 2004 at 02:53:55 PM EST

Unfortunately, it isn't C.


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Branch prediction hints (none / 0) (#218)
by lamont116 on Mon Feb 09, 2004 at 01:30:00 AM EST

Don't do it. "The P4 and G4e do actually have one more branch prediction trick up their sleeves that's worth at least noting (if only because if I don't note it I'll get lots of email about it). That trick comes in the form of "software branch hints," or little prefixes that a compiler or coder can attach to conditional branch instructions. These prefixes give the branch predictor clues as to the expected behavior of the branch, whether the compiler or coder expects it to be taken or not taken. There's not much information available on how big of a help these hints are, and Intel at least recommends that they be used sparingly since they can increase code size." http://www.arstechnica.com/cpu/01q2/p4andg4e/p4andg4e-4.html

[ Parent ]
Ah yes, the "cs" prefix (none / 0) (#257)
by ksandstr on Mon Feb 09, 2004 at 11:17:47 AM EST

The last I checked, it was but a single byte in front of a branch instruction. While it is certainly true that using those before each and every branch will increase code size and thus hamper instruction decoding, the better reason to Just Trust The Compiler is that the branch hints tend to be taken absolutely literally by the processor. Meaning that if you declare a branch as going to be taken almost always, the processor won't apply its own branch prediction logic (which is still right about 90% of the time, given non-random behaviour), and so if your branch goes the other way more than 10% of the time you'll end up slowing your program down.

Really, the only place for "__builtin_expect()" and the like is in an "assert()" type macro. Not to mention that optimizing by hand for something that could result, in the worst case, in a 20-cycle penalty in a 2-gigahertz architecture is, relatively speaking, a waste of time.
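Something like this rough sketch (my_assert is a made-up name; gcc-specific):

    #include <stdio.h>
    #include <stdlib.h>

    /* assert()-style macro: the failure branch is hinted as unlikely,
       so the common path stays straight-line code. */
    #define my_assert(cond)                                           \
        do {                                                          \
            if (__builtin_expect(!(cond), 0)) {                       \
                fprintf(stderr, "assertion failed: %s\n", #cond);     \
                abort();                                              \
            }                                                         \
        } while (0)

    int main(void)
    {
        my_assert(1 + 1 == 2);    /* passes silently */
        return 0;
    }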


Fin.
[ Parent ]

Art v. production (none / 2) (#137)
by debillitatus on Sun Feb 08, 2004 at 10:13:02 AM EST

So, C is inappropriate in any situation where you're actually trying to produce code for something useful, but appropriate when you're trying to be clever in some abstract sense?

In fact, I think you might be reiterating the point of the article.

It seems to me that this might be the wrong attitude to take. Code should be simple to implement and powerful. Also, since you're on a computer, it should automate tasks which should be canonical. Joyce has a great point about error handling.

This whole argument reminds me of the *nix vs. everything else debates. For example, when I've told colleagues that I have switched from Linux to XP for certain tasks because XP just plain works better in these contexts, a very common response is something along the lines of "Well, sure, if you don't care about being a man, go ahead and use Windows." Something along the lines of the comments we used to make when we were teenagers about someone driving with an automatic transmission v. a manual one. But that comment has two subtexts. The first is obvious, and kind of useless, since why on earth are computers a macho thing? The second is more subtle, but a tacit admission that one tool is actually easier to use. And if it is, why not use it?

Anyway, your comment reminds me of this old debate.

Damn you and your daily doubles, you brigand!
[ Parent ]

To some extent (none / 0) (#168)
by mstefan on Sun Feb 08, 2004 at 02:17:28 PM EST

It all depends on how you look at the software that you're writing and what its goals are. An analogy would be to the art of glass blowing. You can go out, buy yourself a set of manufactured, mass produced tumblers and they'll be functional and do what they're supposed to: hold liquids. That's commodity business software. But there is a place for hand-blown, custom crafted glassware. Just like there is a place for carefully crafted, "lower level" code written in C.

The problem isn't with the language. The problem is that for most programmers, whatever language they know best is like a hammer in their toolbox -- and every problem looks like a nail. It's not an affliction exclusive to C programmers.



[ Parent ]
the art of C (none / 0) (#190)
by debillitatus on Sun Feb 08, 2004 at 06:17:14 PM EST

What you say is true, there is a time and a place for C.

The one main beef I would have with it is that it became really prevalent, some might say ubiquitous, in the early- to mid-90s. You just couldn't get away from it. And it was used for everything...

Damn you and your daily doubles, you brigand!
[ Parent ]

The manly chest-beating thing (none / 0) (#259)
by ksandstr on Mon Feb 09, 2004 at 11:31:26 AM EST

From what I can tell (by self-examination and looking around my peer group), the expressions of "well, maybe you're the bitch then" etc aren't actually referring to lack of manly characteristics but to a perceived mental limp-wristedness on the part of the person who would rather switch over to a drool-proof "do sort of what you think I should want" type system to do some things than shit blood for six hours trying to get over their own misconception of how the same thing should be done in a different sort of a system.

Personally, I've seen previously rather sharp people turn into the computing equivalent of a drooling moron (you know, the kind who can't tell an IP address from an internet protocol and their arse from their elbow) simply by letting themselves be led on by drool-compatible user interfaces, effectively allowing the operating system to tell them what they should expect and prefer and how they should do things. Thankfully this process can be inhibited, even reversed, by exposing the victim to an unforgiving environment [a language that lets you shoot yourself in the bottom with a pointer is just fine], but witnessing the first change was rather disturbing in my experience.


Fin.
[ Parent ]

Sometimes art isn't necessary (none / 0) (#476)
by Merc on Tue Feb 10, 2004 at 08:36:46 PM EST

Art generally takes much longer, and many artists aren't as good as they think they are.

Say a co-worker had an idea of how to rearrange the office furniture. You asked him to sketch out what he meant, and he pulled out the oil paints. Argh. An hour later, he's finished his sketch, there's paint everywhere, but you do have a pretty good sketch of what he meant. If he'd just used the etch-a-sketch you'd have had an idea what he meant in a few seconds, and cleaning up would simply involve a few shakes.

First of all, most people who code in C aren't very good programmers. The power it gives them is wasted, and they mostly just make a mess. Secondly, there are tasks for which C just isn't appropriate. Sure, it's great for device drivers, but for a quick one-off GUI program, or a quick-and-dirty tool, C is an awful choice.



[ Parent ]
Artists (none / 0) (#485)
by Pseudonym on Wed Feb 11, 2004 at 12:28:47 AM EST

Art generally takes much longer, and many artists aren't as good as they think they are.

Artists also tend to get pissed off if you criticise their work. "It's my artistic vision!"


sub f{($f)=@_;print"$f(q{$f});";}f(q{sub f{($f)=@_;print"$f(q{$f});";}f});
[ Parent ]
No, pre-stretched canvases are for artists. (none / 0) (#481)
by startled on Tue Feb 10, 2004 at 10:14:35 PM EST

As are brushes. All you can buy for C are rolls of canvas and wooden beams, and bristles and handles. In C, the first ten times you try to stretch the canvas, it tears in half. And when you try to bind the bristles into a brush, they disintegrate after several strokes.

Yes, there are many artists who like to make their materials from scratch to see what different results they can get. But many great artists buy their materials. You wouldn't call a painter who buys paint, as opposed to mixing it from dyes, a "paint-by-number" hack, would you?

[ Parent ]
Other languages are for non-artists (none / 0) (#484)
by Pseudonym on Wed Feb 11, 2004 at 12:22:50 AM EST

Fine art is non-functional, and exists as pure communication. Fine art is a wonderful thing. I can admire a great work of art as much as the next person.

Meanwhile, in the other corner, we see people not making fine art. They're the ones with a job to do, with deadlines and budgets. They're the ones using Photoshop filters, stock photography and CAD templates. People in this position are well advised not to emulate fine artists.

C may well be for artists, but other, more modern languages are for illustrators, graphic designers, industrial designers, typographers and architects.


sub f{($f)=@_;print"$f(q{$f});";}f(q{sub f{($f)=@_;print"$f(q{$f});";}f});
[ Parent ]
Yeah ... um ... you don't get it. (2.90 / 11) (#77)
by Mr.Surly on Sun Feb 08, 2004 at 01:43:53 AM EST

C really is just one step up from machine language. Go do some machine language by writing it on paper, then inputting it using a hex keypad; then maybe you'll have a little perspective on where C came from. C is old, and it is weird. Deal with it, or don't use it.

This article really smacks of "Oh, poop! C is really hard, so I'll write an article complaining about it instead."  As such, it's probably a long, subtle troll.

And why not? (none / 2) (#85)
by it certainly is on Sun Feb 08, 2004 at 04:07:01 AM EST

The author is proud to admit that he is a crapflooder. To call him a troll is a compliment.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

He is right ... (3.00 / 5) (#156)
by Mr.Surly on Sun Feb 08, 2004 at 12:57:38 PM EST

... If you're expecting C to be more like Java, C#, Perl, whatever. As it is, he's expecting C to be something it isn't.

How about if I write a long article about how my Honda Civic can't go 0-60 in 6 seconds? I mean goddamnit, many modern cars have no problem with this! What kind of stone-age automotive technology is Honda selling these days? To hell with that, I'm buying a Ferrari next time.


[ Parent ]

Is don't use it an option? (none / 0) (#475)
by Merc on Tue Feb 10, 2004 at 08:19:05 PM EST

Actually, C is one step up from assembly language, which is one step up from machine language. If you don't know that, then you can't be taken seriously.

The author does have a point. Nobody would try to write a device driver in Python, so why are people writing GUI programs in C? Admittedly, if you are a very good programmer, doing that will allow you to squeeze every last bit of performance out of your system. On the other hand, you end up with a nasty, unmaintainable program, prone to security holes, with 10x as many lines as it needs to have.

Writing programs in C is like making a meal from scratch. If you're a good cook, you might end up with a much better meal. But even the best cook will take longer to do it than if they just used a few pre-made sauces and such. If a master chef wants to cook from scratch, let them! On the other hand, if I just want a quick meal, sometimes a TV dinner is all I need.



[ Parent ]
I'm glad you don't like C (2.00 / 7) (#79)
by Sapien on Sun Feb 08, 2004 at 02:23:19 AM EST

You really suck at it.

buffer overflows is the biggie here... (none / 3) (#94)
by reklaw on Sun Feb 08, 2004 at 05:09:38 AM EST

... and it's the reason why using C for anything (especially anything large) is a very bad idea. Think of how much time and effort could have been saved over the years if C handled buffer overflows gracefully instead of crashing hard and/or letting people throw code into memory when they happen...
-
Yeah, just think (none / 2) (#125)
by curien on Sun Feb 08, 2004 at 08:11:34 AM EST

All those core libraries and device drivers that depend on direct memory manipulation would have had to be written in assembly. I bet it would have set the computing world back ten years, and Linux would never have taken off (if it'd even been written).

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
uh (none / 0) (#131)
by reklaw on Sun Feb 08, 2004 at 09:41:28 AM EST

I didn't say "if C didn't allow direct memory manipulation", I said "if C handled buffer overflows gracefully". Would you like to try again, and argue the actual point this time?
-
[ Parent ]
Can't do one without the other (none / 1) (#134)
by curien on Sun Feb 08, 2004 at 10:01:14 AM EST

C treats overflowing buffers more gracefully than any other language I know (except assembly): it doesn't consider it an error at all. Your OS might, but the language doesn't. This is what makes direct memory access outside the scope of the language possible.
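To illustrate with a hypothetical fragment: this compiles without complaint under a typical invocation, and the language itself raises no objection; what happens at run time is simply undefined:

    int main(void)
    {
        int arr[4];
        arr[10] = 42;   /* well past the end: no language-level error,
                           no bounds check -- the OS might object, or
                           might silently let it stand */
        return 0;
    }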

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
what!? (none / 0) (#139)
by reklaw on Sun Feb 08, 2004 at 10:46:26 AM EST

That's probably the stupidest thing I've ever read. Graceful handling means not treating it as an error when it obviously is?

It would be easily possible for C to offer direct memory access but still notice when you've gone off the end of a buffer -- name a situation where you'd ever want to do that deliberately.
-
[ Parent ]

Sure (none / 1) (#141)
by curien on Sun Feb 08, 2004 at 10:55:47 AM EST

  • Color text in DOS
  • Direct access of malloc meta-data
  • Aliasing
  • Treating multi-dimensional arrays as data-blocks
  • IPC
I could probably think of a few more, if you like.

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
Strange things (none / 0) (#155)
by svampa on Sun Feb 08, 2004 at 12:35:48 PM EST

Color text in DOS

That's a DOS problem, not a need

Direct access of malloc meta-data

Why should you want to do so?

Aliasing

The language could have a way of doing so safely, for example "A aliases C". The compiler could then check a lot of things.

Treating multi-dimensional arrays as data-blocks

Why should you want to do so?

IPC

Use system features; don't try to invade others' territory.



[ Parent ]
Um (none / 0) (#159)
by curien on Sun Feb 08, 2004 at 01:06:22 PM EST

At some point, there must be a way to interface with the OS (IPC and color text were two examples I presented). If you don't want the language to let you do so directly, you must use an API. How would you implement an API? With C (or something lower-level). I've no problem with saying that application programmers shouldn't use C -- I mostly agree with the sentiment. But not everyone's an application programmer.

As far as aliasing, yeah, that's great, but requiring that places undue limits on the programmer. For example, it would make

  strcmp(arr, arr + 5)

illegal. C took the opposite approach with the restrict keyword: that is, you can specify when two pointers must not alias each other.
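For anyone who hasn't met it, a minimal sketch of restrict (standardised in C99; the function here is hypothetical, the keyword is not):

    /* The restrict qualifiers promise the compiler that dst and src never
       overlap, so it is free to reorder or vectorise the copy. Calling
       this with overlapping pointers is undefined behaviour. */
    void copy4(int * restrict dst, const int * restrict src)
    {
        for (int i = 0; i < 4; i++)
            dst[i] = src[i];
    }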

As for the multi-dim array as a single block of memory thing, you haven't done much 3D programming, have you? It's all about matrix manipulation (logically a 2D array), but it's too expensive to copy several chunks of memory, so the 2D matrix must occasionally be treated as a single large block of memory.

In all honesty, I've never had to directly access malloc meta-data. But I've seen it done in GC plug-ins for the C runtime. (Note that nothing actually prevents a C implementation from providing GC, bounds checking, etc -- they're just not high-demand features.)

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]

You kid, but it's true (none / 0) (#164)
by curien on Sun Feb 08, 2004 at 01:19:52 PM EST

Reverse-engineering often depends on direct memory manipulation. Damn, shoulda thought of that one myself.

--
All God's critters got a place in the choir
Some sing low, some sing higher
[ Parent ]
Re: buffer overflows is the biggie here... (none / 0) (#394)
by ttsalo on Tue Feb 10, 2004 at 06:58:38 AM EST

C doesn't have "buffers", so how could it know when one is being overrun? C merely provides a way of using indexes with pointers. If you want a buffer class, that's not the language's job.

At my work, a non-overflowing (dynamically expanding and contracting) buffer class (I call a structure with a related family of functions a class here) is one of the main ways of avoiding buffer overflows. The second is always checking the destination size when storing anything in a conventional array. It's really not all that hard, and I would never blame the failure to do these things on the language. The need to do these things you can blame on the language, but not the failure to do them.
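Roughly the shape of the thing, as a minimal sketch with hypothetical names (the real class has more operations than this):

    #include <stdlib.h>
    #include <string.h>

    /* A growable buffer "class": a struct plus a family of functions
       that never write past what they have allocated.
       Usage: struct buf b = {0}; buf_append(&b, "hello", 5); */
    struct buf {
        char   *data;
        size_t  len, cap;
    };

    int buf_append(struct buf *b, const void *src, size_t n)
    {
        if (b->len + n > b->cap) {              /* grow before writing */
            size_t newcap = b->cap ? b->cap * 2 : 16;
            while (newcap < b->len + n)
                newcap *= 2;
            char *p = realloc(b->data, newcap);
            if (p == NULL)
                return -1;                      /* report, don't overflow */
            b->data = p;
            b->cap  = newcap;
        }
        memcpy(b->data + b->len, src, n);       /* now guaranteed to fit */
        b->len += n;
        return 0;
    }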

That's also the beauty of C: you can know pretty well what happens on the underlying hardware by looking at the code (though it has been getting harder all the time). There is very little hidden crap like bounds checking or garbage collection going on, for better or worse.



[ Parent ]

C wastes my time (2.60 / 5) (#130)
by meaningless pseudonym on Sun Feb 08, 2004 at 09:34:51 AM EST

Bravo, sir!

If I'm writing in C, there's all manner of things I have to remember but which are effectively useless in day-to-day coding. The practical outcome is to make it easier for me to write a hard-to-find bug.

We have various posters here screaming 'But it's a low-level language!'. Well, that's not a virtue in itself, and in any case, is it appropriate to use a low-level language for so many of the things we do with it? It may well be a fantastic low-level language, but if I'm writing a Minesweeper clone then that's completely irrelevant, so I'm left with a language that's merely easier to bug and harder to debug. I'd wager that the average coder spends their time closer to writing Minesweeper than the Linux kernel.

Don't get me started on that syntax. Huge numbers of little symbols are _not_ easier to read than keywords. There's a reason we don't use APL any more.

Find me the times when I really need that memory-level control or speed and I'll gladly find a C-loving masochist and get them to write the code. Well over 99% of the time, that's not relevant, and it's nothing more than vanity to suggest that it is. Most of what we write is emphatically not processor-bound, performance-critical code. The rest of the time, give me any one of a number of better designed, more programmer-friendly languages that have all the control I need with none of the gotchas, and watch me turn out higher quality code in less time.

Dear Sir (none / 1) (#214)
by kraant on Mon Feb 09, 2004 at 12:43:15 AM EST

Whyfore are you writing a minesweeper program in C?

Unless it's for fun or to learn C, in which case why are you complaining?
--
"kraant, open source guru" -- tumeric
Never In Our Names...
[ Parent ]

100% correct (none / 0) (#283)
by TheLastUser on Mon Feb 09, 2004 at 01:02:26 PM EST

I am willing to concede that there are applications for C, just as I am willing to concede that assembler is useful. Neither of these languages should be used to write code that runs in user space. They are for kernels and device drivers.

Most programmers write applications code that can be written better and in 10% of the time using VB, Java, or the like.

The speed mavens always whine about how much "faster" C is. Turns out this is not always the case: unless you are Linus Torvalds or somebody like that, any significant application that you write in C will be SLOWER than if you wrote it in C# or Java. Reason? Because the people who write VMs know much more about coding speedy software than most application coders do.

In the end, speed has got to be the worst criterion by which to select a language. Applications spend the majority of their existence waiting for the user to maneuver the pointer and click. Ease of development and ease of maintenance are, in fact, the two most important criteria for selecting a language. That's why big apps are written in high-level object-oriented languages.

[ Parent ]

C sucks. And blows. At the same time. (2.22 / 9) (#132)
by localroger on Sun Feb 08, 2004 at 09:50:38 AM EST

I learned to program by reverse engineering a BASIC interpreter with a tool similar to DEBUG and writing a new one for myself, so I'm not afraid of low-level programming. You have labels and variable names? What luxury! To this day I remember quite a few 8080 opcodes.

When I first encountered C around 1985 I was stunned at how ugly it was. Trying to be low-level and high-level at the same time, it manages to be neither. Every "enhancement" makes it more bloated and more complicated without making your life easier. Every once in a while I've decided to bite the bullet and learn this piece of crap language just so I'll be current, and after a few chapters I rinse my eyes out with lye to make sure I never repeat the error.

I've had a pretty long career and I've done quite a few interesting things with truly low level and truly high level languages -- low level in assembly, and high level in various BASIC dialects or proprietary control languages. Nothing in between.

When you need performance, there is no substitute for assembly. C isn't portable assembly. If you have been taught that, you were lied to. It takes about five minutes examining the object code shat out by a C compiler to understand that, if you actually know how to program in assembly yourself. And with Intel and AMD going on six generations of object-code compatible CPUs, the lure of "portable assembly" is dimmer than ever.

I've managed to go almost 20 years as a professional programmer without ever writing a line of C code, and if I can go another 20 it will represent one of the great successes of my career. With any luck I'll even live to see this horrible language die the death it so richly deserves.

What will people of the future think of us? Will they say, as Roger Williams said of some of the Massachusetts Indians, that we were wolves with the min

I quite like C (none / 2) (#191)
by squigly on Sun Feb 08, 2004 at 06:24:31 PM EST

Well, yeah. C isn't the most elegant of languages, but raw C (no libraries or anything) does have a lot of predictability, and for most purposes has many of the advantages of assembler without the need to learn an entirely new architecture if you want to write for a new processor.

You do have a lot of control over your data.  It is very useful for producing code that requires direct access to memory.  You can write very fast efficient code.  If you use C, you will typically produce code with fewer errors than if you use assembler.  Perhaps you can write smaller faster code in assembler, but it takes longer, and if C is fast enough, I really don't see the benefit of using another language.

[ Parent ]

my take (none / 0) (#298)
by phred on Mon Feb 09, 2004 at 03:17:26 PM EST

Much of the ugliness of C isn't really C at all; I just view it as what I intuitively understand to be the crap system of logic for this iteration of the universe. Given that, C seems to be one of the nicest ways to express the crap you have to go through to get computers to compute.

[ Parent ]
Sort of (none / 0) (#412)
by bugmaster on Tue Feb 10, 2004 at 08:49:24 AM EST

This is probably true for gigantic desktop CPUs such as Pentiums, etc. However, for tiny little microcontrollers, this is false -- the C that their manufacturer provides usually maps directly to Assembly instructions that you'd write by hand anyway. The reason for this is that these controllers have a very limited instruction set, and there's really only one way to do anything.

Now, granted, one can always invent some clever Assembly trick that would shave off one or two clock cycles here and there; and, for microcontrollers, this is actually important. However, C is a much nicer tool for handling the general logic of the program (such as "if they pressed this button, and this value is less than that, decrement counter") than raw Assembly: it produces virtually equivalent code, but it's actually readable.
>|<*:=
[ Parent ]

tradition (2.50 / 8) (#148)
by svampa on Sun Feb 08, 2004 at 12:05:31 PM EST

C is a very old language, and the reason why it's everywhere is similar to the reason why COBOL is still in a lot of bank software: a legacy of old times.

C is a middle-level language, and it should be used for what it was made for: systems and driver programming. And even in these cases, people should think about using another kind of language; C has too many problems.

  • Pointer arithmetic is suicide, and should be used only in a few special cases.
  • The need for "break" in the "switch" statement (probably inherited from assembler) is dangerous. No compiler dares not warn about a missing "break"; that shows how absurd this "feature" of the language is. (See the sketch at the end of this comment.)
  • If you write a single "=" instead of "==", you are in trouble. It's a dangerous syntax. (Again, see the sketch below.)
  • No boolean type
  • ...
  • Languages like Java try to solve a lot of the known problems of C. But what bothers me about new languages is that, in order to ease learning for C programmers, they imitate C syntax: "}", uppercase/lowercase, etc. What a pity. And C++ has extended a problematic language, inheriting all the problems that C had.

High-level languages and strongly typed languages are not academic games; they are the result of research. They make the compiler aware of your logic, so a lot of bugs are caught at compile time, not at runtime.

I'm sure that part of the problem of today's unstable software is C. I think that the day software developers (and so companies) dump C will be a bright day for software.
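A hypothetical sketch of the first two pitfalls, neither of which draws a peep from the compiler unless you ask for warnings:

    #include <stdio.h>

    int main(void)
    {
        int n = 0;

        switch (n) {
        case 0:
            puts("zero");   /* no break: control falls through... */
        case 1:
            puts("one");    /* ...so this runs too when n == 0 */
            break;
        }

        if (n = 1)          /* assigns 1 to n; the test is always true */
            puts("oops");   /* almost certainly meant: if (n == 1) */

        return 0;
    }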



    break; /*or lack thereof*/ (none / 0) (#165)
    by Norkakn on Sun Feb 08, 2004 at 01:40:30 PM EST

    I actually like the ability to leave out break statements; it allows one to set up a function in which data can enter at any point and fall through all the statements.

    hmm, that didn't really make sense; I'll try again

    If you have ten steps that need to be done to a piece of data and you don't know which step you will start on, you can use a case label to declare the starting point and let execution fall through the remaining steps.

    [ Parent ]

    Yeah, but (none / 0) (#166)
    by curien on Sun Feb 08, 2004 at 01:52:11 PM EST

    it should have been the other way around: breaking should be the default, and you should have to use "continue" to fall through, IMO.

    --
    All God's critters got a place in the choir
    Some sing low, some sing higher
    [ Parent ]
    I see your thinking (none / 1) (#197)
    by Norkakn on Sun Feb 08, 2004 at 07:01:05 PM EST

    I do understand why you think that, and it might make for a better language, but my understanding is that the designers of C, when unsure of which way syntax should go, went with whatever mapped most directly onto the implementation. Fall-through is how jump tables behave in assembly, and that is why they picked it for C.

    [ Parent ]
    bad style (none / 0) (#189)
    by svampa on Sun Feb 08, 2004 at 06:12:46 PM EST

    Use 'if'. That's writing obfuscated code. If someone looks at your code, at first sight he will think it's a "normal" switch. Including a break is the expected behaviour of a switch.



    [ Parent ]
    Not really (none / 0) (#194)
    by curien on Sun Feb 08, 2004 at 06:57:57 PM EST

    You see, there are these things called "comments", which we use to describe unusual portions of code. For example...

    /* Duff's Device: lack of breaks is purposeful. Don't fsck with this unless you know what you're doing. */

    --
    All God's critters got a place in the choir
    Some sing low, some sing higher
    [ Parent ]

    Ifs are slow (none / 0) (#195)
    by Norkakn on Sun Feb 08, 2004 at 06:58:02 PM EST

    switch(data.step) {
           case 0:
              do0(data);
           case 1:
              do1(data);
           case 2:
              do2(data);
              break;
           default:
               dodefault(data);
               break;
    }

    as opposed to

    if( data.step == 0)
              do0(data);
    if( data.step == 1)
              do1(data);
    if( data.step == 2)
              do2(data);
    if(data.step > 2)
               dodefault(data);

    The latter is a whole lot slower; if more steps are added one has to change the default test, there are a lot more tests, and the do functions need to increment data.step as they go.
             

    [ Parent ]

    You do realize... (none / 2) (#246)
    by Kenoubi on Mon Feb 09, 2004 at 09:11:36 AM EST

    that the code snippets you posted don't do the same thing?

    Specifically, you excluded the break statements from your switch, so a 0 in data.step will cause all of do0, do1 and do2 to be executed, but with the if statements, only do0 would be executed.  I do agree that the ability not to break in a switch statement can be useful, but you're giving us a perfect argument for why having it as the default is a bad idea.

    [ Parent ]

    It is the if statements that are confusing (none / 0) (#277)
    by Norkakn on Mon Feb 09, 2004 at 12:30:22 PM EST

    and the do functions need to increment data.step as they go

    *snooty airs*
    if one would take the time to read the comments at the bottom one would see that the snippets do the same thing

    I'm not an idiot, just stupid

    and doing multiple dos is the whole friggen point of it!

    [ Parent ]

    I guess it *could* work (none / 1) (#284)
    by Kenoubi on Mon Feb 09, 2004 at 01:05:33 PM EST

    But I think you'll have to forgive me for being confused.  You passed the "data" struct (presumably it's a struct, since this is C and you're using the dot operator on it) to the do* functions by value, not by reference.  (Yes, it is possible to pass a struct by value in C.  I was surprised to learn this, but I'm pretty sure it's true.)  Thus, any modifications done to the parameter in the do* functions will have no effect on the "data" struct in the surrounding code.
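    A quick hypothetical demonstration of the difference:

        #include <stdio.h>

        struct data { int step; };

        static void by_value(struct data d)    { d.step++; }  /* mutates a copy */
        static void by_pointer(struct data *d) { d->step++; } /* mutates the original */

        int main(void)
        {
            struct data x = { 0 };
            by_value(x);
            printf("%d\n", x.step);   /* still 0 */
            by_pointer(&x);
            printf("%d\n", x.step);   /* now 1 */
            return 0;
        }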

    It is still possible for this to work, if the do* functions just happen to have a pointer to the data struct lying around, or it's stored in a global somewhere.  But that kind of code would be extraordinarily non-obvious and not at all a Good Idea.

    [ Parent ]

    Duff's Device (none / 1) (#235)
    by harryh on Mon Feb 09, 2004 at 06:11:23 AM EST

    The need for "break" in the "switch" statement (probably inherited from assembler) is dangerous. No compiler dares not warn about a missing "break"; that shows how absurd this "feature" of the language is.

    You've heard of Duff's Device right?
    Many people (even bwk?) have said that the worst feature of C is that switches don't break automatically before each case label. This code forms some sort of argument in that debate, but I'm not sure whether it's for or against.
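    For those who haven't seen it, here is the device itself, lightly modernised from Tom Duff's 1983 original. "to" points at a memory-mapped output register, which is why it is never incremented, and count is assumed to be positive:

        /* Copies count shorts to a memory-mapped register, eight per
           trip around the loop; the case labels jump into the middle
           of the loop body, which is perfectly legal C. */
        void send(volatile short *to, short *from, int count)
        {
            int n = (count + 7) / 8;
            switch (count % 8) {
            case 0: do { *to = *from++;
            case 7:      *to = *from++;
            case 6:      *to = *from++;
            case 5:      *to = *from++;
            case 4:      *to = *from++;
            case 3:      *to = *from++;
            case 2:      *to = *from++;
            case 1:      *to = *from++;
                    } while (--n > 0);
            }
        }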


    [ Parent ]
    break in switch (none / 0) (#372)
    by awgsilyari on Mon Feb 09, 2004 at 11:18:59 PM EST

    No compiler dares not warn about a missing "break"; that shows how absurd this "feature" of the language is.

    Clearly you've never heard of 'coroutine'.

    At the time C was invented it was an extremely common thing to implement a coroutine as a function which switches on a state variable. Of course, at that time they probably didn't call it a 'coroutine' but it did the same thing: implement a function that resumes execution after the last place it returned from. This uses 'Duff's device' to jump to the appropriate starting place.

    If you still don't know what I'm talking about, picture a DFA which receives input in finite-sized chunks and returns to its caller when its input buffer is exhausted.

    In systems programming (think device drivers), these sorts of coroutines are extremely common, since they naturally track the device state. Guess what: C was written to do systems programming, so it makes sense that switch statements work in a way that makes it easy to program such things without using 'goto' all over the place.
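    To make the pattern concrete, a hypothetical miniature: a scanner fed one byte at a time that resumes where it left off, because the switch dispatches on state that persists between calls:

        #include <stdio.h>

        /* Returns 1 each time a complete <...> tag has been seen. */
        static int saw_tag(int c)
        {
            static enum { OUTSIDE, INSIDE } s = OUTSIDE;

            switch (s) {
            case OUTSIDE:
                if (c == '<')
                    s = INSIDE;     /* tag started; wait for more input */
                return 0;
            case INSIDE:
                if (c == '>') {
                    s = OUTSIDE;    /* tag finished */
                    return 1;
                }
                return 0;
            }
            return 0;
        }

        int main(void)
        {
            const char *in = "x<y>z";
            for (int i = 0; in[i] != '\0'; i++)
                if (saw_tag(in[i]))
                    puts("tag!");
            return 0;
        }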

    It's nonsense to criticize a language for its raison d'etre. You think you're being insightful by pointing out that C is a lousy applications language? Care to tell us anything else obvious?

    --------
    Please direct SPAM to john@neuralnw.com
    [ Parent ]

    Once again, wrong applications (2.88 / 9) (#167)
    by Idioteque on Sun Feb 08, 2004 at 02:08:19 PM EST

    Throughout my programming career I've worked in basically two areas of programming: writing device drivers and writing video and audio CODECs that run on both PCs and embedded systems. I can't begin to think of another language more suited for these two areas.

    Do I use C for my web photo album or my CD ripping and encoding scripts? No. Would I want to write a graphical email client in C? No. Would I want to write a video decoder in any language besides C? No. When you're trying to squeeze out as much processing as possible, the easy integration of assembly language comes in quite handy too. Don't give me this nonsense about compilers being so good these days that you don't need to hand-code any assembly. Yes, compilers are very good, but only up to a point. When you're trying to implement SIMD instructions for further optimizations, hand-coding in assembly is sometimes the only option.

    Furthermore, a good C programmer is one who understands how C translates into assembly, which tells you how close to assembly C really is.


    I have seen too much; I haven't seen enough - Radiohead
    Well said. (none / 1) (#202)
    by porkchop_d_clown on Sun Feb 08, 2004 at 08:49:43 PM EST

    C was designed for writing device drivers and operating systems. It's still the best language for that task.

    But no sane person deliberately uses C for user apps, no more than a sane person would try to write an OS in Perl, or a video game in SQL.

    --
    "the internet is to the techno-capable disaffected what the United Nations is to marginal states: it offers the illusion of empowerment and c
    [ Parent ]

    Damn... (none / 1) (#247)
    by mold on Mon Feb 09, 2004 at 09:35:32 AM EST

    Are you saying my SQL99 port of Super Mario Sunshine won't work?

    Figures. And I thought I had had a stunning revelation.

    ---
    Beware of peanuts! There's a 0.00001% peanut fatality rate in the USA alone! You could be next!
    [ Parent ]

    SQL is not Turing complete (none / 1) (#356)
    by FlipFlop on Mon Feb 09, 2004 at 08:13:49 PM EST

    I'm sorry to have to tell you, but SQL is not Turing complete. Its functionality was intentionally limited to guarantee that it would always terminate. That way, no one can create a query that winds up in an infinite loop.

    AdTI - The think tank that didn't
    [ Parent ]

    Whoa there! (none / 0) (#264)
    by ksandstr on Mon Feb 09, 2004 at 11:38:06 AM EST

    Are you saying that the entirety of the GNOME posse is somehow, you know, bonkers?

    Looking at the kind of function names some of their libraries export makes me wonder though if they aren't just a little nostalgic over COBOL...


    Fin.
    [ Parent ]

    If they're using plain C, yeah. (none / 0) (#426)
    by porkchop_d_clown on Tue Feb 10, 2004 at 10:49:05 AM EST

    I'm not real familiar with Gnome internals - but didn't the Gnome people also get their stuff working with C# and Mono?

    --
    "the internet is to the techno-capable disaffected what the United Nations is to marginal states: it offers the illusion of empowerment and c
    [ Parent ]

    I've written a web photo album program in C, (none / 0) (#261)
    by ksandstr on Mon Feb 09, 2004 at 11:36:22 AM EST

    And I can tell you that it wasn't much of a hassle. Though I must admit that using a proper string utility library and garbage collection (the Boehm-Demers-Weiser GC, in my case) did make it a bit less of a pain.

    I'll readily admit that if I had to write the whole thing again now, I'd use Perl or Haskell to do it, but doing it in C wasn't so bad either.


    Fin.
    [ Parent ]

    A good programmer understands how... (none / 1) (#262)
    by wumpus on Mon Feb 09, 2004 at 11:36:27 AM EST

    Check out this diary by localroger. While C is indeed closely tied to assembler (and can even be thought of as a somewhat portable assembler), any programmer should have some knowledge of assembler.

    If you don't know assembler, you don't know how the computer is running your program.

    Wumpus

    [ Parent ]

    C obviously sucks (1.40 / 5) (#177)
    by psychologist on Sun Feb 08, 2004 at 04:08:07 PM EST

    After reading this article, I have come to the conclusion that  when I learn to program, I shall not use C.

    What is your opinion on C+ or D?


    I prefer Emin7 myself. (none / 2) (#180)
    by epepke on Sun Feb 08, 2004 at 04:46:58 PM EST


    The truth may be out there, but lies are inside your head.--Terry Pratchett


    [ Parent ]
    I haven't heard of C+, and D felt icky. (none / 1) (#186)
    by James A C Joyce on Sun Feb 08, 2004 at 05:30:06 PM EST

    D just seems to change a lot of things gratuitously. I don't want a C that feels totally different; I want a language that feels like C but has no suckiness.

    I bought this account on eBay
    [ Parent ]

    I agree (none / 3) (#187)
    by psychologist on Sun Feb 08, 2004 at 05:55:33 PM EST

    When I was a kid, I always felt there was something a bit wrong about D. Whenever I heard the words on Sesame Street, "And today's show is brought to you by the letter D", I immediately switched channel.

    C, D, F and X are the letters I hate the most. I like P, O and Z. I also like S, because it is so sneaky. You can feel its sneakiness on your tongue.

    I didn't quite realise that these alphabets had such a complex .. thing .. behind them till I read your article. No wonder I had trouble learning in school. I tol everyone those alphabets were darned hard, and they said I was stupid. Haha! Who is stupid now! I bet they could not even understand your essay, and I understood it after reading it only twice.

    [ Parent ]

    OCaml. (none / 1) (#225)
    by tkatchev on Mon Feb 09, 2004 at 04:42:30 AM EST

    I'm advocating OCaml these days.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    good (none / 0) (#240)
    by 49399 on Mon Feb 09, 2004 at 08:12:45 AM EST

    the article misses the lack of a decent way of writing a tree in C

    [ Parent ]
    Why This Article Is Not My Favorite Article (2.46 / 15) (#179)
    by kitten on Sun Feb 08, 2004 at 04:20:28 PM EST

    It is tedious, long-winded crap, largely incorrect owing to gross generalizations and misapplications, desperately trying to be funny and failing, and is written by a crapflooding nitwit.
    mirrorshades radio - darkwave, synthpop, industrial, futurepop.
    It's better than anything you'll ever write. (1.75 / 8) (#184)
    by James A C Joyce on Sun Feb 08, 2004 at 05:22:43 PM EST

    "tedious, long-winded crap...desperately trying to be funny and failing..."

    Gee, that kinda reminds me of this other story I read once.

    And my story bubbles up through the queue, slowly but very much surely, score constantly ratcheting gradually onwards.

    Now, I'm going to stop and let my story do the talking instead of flaming people without provocation.

    I bought this account on eBay
    [ Parent ]

    Heh. (none / 0) (#192)
    by it certainly is on Sun Feb 08, 2004 at 06:46:17 PM EST

    You obviously haven't read kitten's story about the police being hoaxed into arresting him. That was classic.

    kur0shin.org -- it certainly is

    Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
    [ Parent ]

    Haha (none / 0) (#327)
    by bc on Mon Feb 09, 2004 at 05:16:01 PM EST

    Forgot about that. What an enormous tit he is, hoaxing a bomb threat. Idiot.

    ♥, bc.
    [ Parent ]
    Yeah (none / 1) (#514)
    by kitten on Wed Feb 11, 2004 at 02:57:19 PM EST

    Forgot about that. What an enormous tit he is, hoaxing a bomb threat. Idiot

    You clearly did forget, since I'm not the one that "hoaxed" the bomb threat.

    But I expect you shan't be deterred by that pesky reality thing.
    mirrorshades radio - darkwave, synthpop, industrial, futurepop.
    [ Parent ]
    Sure kitten (none / 0) (#540)
    by bc on Thu Feb 12, 2004 at 03:54:29 PM EST

    But this is just another one of your hoaxes, eh? Tsk. What a cad.

    ♥, bc.
    [ Parent ]
    A trick, within a trick, within a trick. (none / 1) (#541)
    by kitten on Thu Feb 12, 2004 at 04:43:44 PM EST

    But you're an enigma wrapped in a twinkie. Not quite the same.
    mirrorshades radio - darkwave, synthpop, industrial, futurepop.
    [ Parent ]
    Look out everyone! (1.50 / 4) (#199)
    by kitten on Sun Feb 08, 2004 at 07:23:31 PM EST

    He's waving the credential wang around!
    mirrorshades radio - darkwave, synthpop, industrial, futurepop.
    [ Parent ]
    I prefer. (none / 0) (#206)
    by Kal on Sun Feb 08, 2004 at 10:34:58 PM EST

    I prefer to call it an e-peen when talking about this sort of thing in a virtual environment.

    [ Parent ]
    Why can't we program in English? (1.80 / 5) (#181)
    by United Fools on Sun Feb 08, 2004 at 04:52:38 PM EST

    Would that solve all the problems?
    We are united, we are fools, and we are America!
    They tried that. (3.00 / 5) (#205)
    by Kal on Sun Feb 08, 2004 at 10:33:45 PM EST

    It's called COBOL. No one who's used it ever wants to go near it again.

    [ Parent ]
    You're looking for lawyerspeak (nt) (none / 0) (#273)
    by ksandstr on Mon Feb 09, 2004 at 12:16:49 PM EST



    [ Parent ]
    It's Pretty Obvious (none / 0) (#382)
    by Gysh on Tue Feb 10, 2004 at 03:40:50 AM EST

    I'm not sure whether or not this is a serious comment, but I'll bite.

    We can't program in English because English is too imprecise. I forget who originally said this, but take the instructions on a shampoo bottle for example:
    • Lather
    • Rinse
    • Repeat
    Simple enough, right? But try following those instructions directly and you'll find that it's quite impossible since you can't keep lathering and rinsing forever, and the instructions don't tell you where to stop. For one thing, you'd run out of shampoo - shampoo in this case is a good metaphor for memory.

    Things like these are simple enough for humans because we have common sense - we know we're only supposed to "repeat" as many times as we feel necessary (hence, hopefully, not unto eternity) - but computers don't have that, so it kind of presents a problem.

    [ Parent ]
    Do we have common sense? (nt) (none / 1) (#384)
    by United Fools on Tue Feb 10, 2004 at 03:52:19 AM EST


    We are united, we are fools, and we are America!
    [ Parent ]
    Well, uh, sometimes... <NT> (none / 0) (#555)
    by Gysh on Mon Feb 16, 2004 at 05:38:54 PM EST



    [ Parent ]
    "here's why C is now owned by Pascal. " (2.25 / 4) (#182)
    by horny smurf on Sun Feb 08, 2004 at 04:55:36 PM EST

    You somehow forgot to mention why Pascal is superior. Perhaps because almost all of the "problems" (with the exception of the preprocessor) you listed also apply (in spades) to Pascal?



    Pascal's Advantage (none / 0) (#503)
    by netbogan on Wed Feb 11, 2004 at 09:57:49 AM EST

    While I wouldn't say Pascal is superior (right tool for the right job), it does have some major advantages. Now these vary between dialects, and some of the earlier ones did have major limitations (eg. ISO Pascal and Turbo/Borland Pascal).

    However, modern Pascal is not the language you learned in high school; it has been improved significantly.

    For advantages have a look at Free Pascal's Advantages Page

    This compiler is itself written in Pascal (it compiles itself), and there is currently a framework available for those wishing to build a kernel (x86 32-bit Multiboot compatible), so it is not lacking flexibility.

    [ Parent ]

    Why crack is your favorite drug (2.80 / 5) (#183)
    by strlen on Sun Feb 08, 2004 at 05:01:12 PM EST

    First, let's start with the first one, and the one closest (heh) to my heart. As others have stated, C is not a high-level language. Why shouldn't C have a built-in string class? Ironically, you provided a reason yourself, with the idea of multi-byte strings: duplicating the functionality of strcpy() etc. for 16-bit strings would be rather trivial. In addition, C's handling of strings as NUL-terminated arrays of characters allows us to use efficient CPU-level manipulation techniques on some platforms, or high degrees of optimization. So, if you want a string-handling language, I highly suggest trying Perl, or sed/awk/grep.

    As for the lack of what's implemented as functions rather than as parts of the language... again: C is not high-level. C is translated to assembly. It is entirely possible to write a boot loader, or an OS kernel, in C, at a level where you have no libc.

    As for buffer overflows: first, this isn't an issue with a decent OS and a decent architecture; adding "set noexec_user_stack = 1" to /etc/system should be sufficient. Secondly, all sorts of patches exist to remedy the situation already: ProPolice, safe strlcpy() functions, and fucking strncpy() functions already.
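    For anyone unclear on why plain strncpy() isn't enough by itself, a hypothetical fragment (strlcpy() is a BSD extension, not ISO C):

        #include <string.h>

        /* dstsz must be at least 1 */
        void copy_name(char *dst, size_t dstsz, const char *src)
        {
            /* strncpy does NOT terminate dst when src is too long, so
               the classic safe pattern terminates it by hand: */
            strncpy(dst, src, dstsz - 1);
            dst[dstsz - 1] = '\0';

            /* On the BSDs the same thing is one call that always
               terminates: strlcpy(dst, src, dstsz); it returns
               strlen(src) so callers can detect truncation. */
        }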

    As for pre-processor power, I suggest looking into plan9's C-compiler suite, which takes significant steps to curb that.

    And let me restate again: C is not a high-level language. What you're looking for is called "C++". Have you ever looked at assembly code generated from C++ and assembly code generated from C? That's where the difference lies. You don't write CGI applications in C; you don't write an OS in Perl. C++ provides greater flexibility, but those who code in an object-oriented language without any knowledge of C are generally the ones getting paid $15 an hour to write horrible Java code which will crash under most any conditions (see Friendster's JSP backend for a prime illustration of that).

    --
    [T]he strongest man in the world is he who stands most alone. - Henrik Ibsen.

    Amen to that (none / 0) (#375)
    by Arkaein on Mon Feb 09, 2004 at 11:50:07 PM EST

    I wasn't going to talk about C++ in a reply to a C bashing article, but since it's been mentioned, what the hey.

    The beauty of C++ is that it lets the programmer write to the appropriate level for different parts of an application. Well written, high level C++ code can look just as good as Java or other more developer friendly languages. With the STL and reference types programmers should almost never need to do direct pointer manipulation even for managing complex data structures. Using the vector class made me wish I had looked more closely into the STL during a few years of heavy C++ programming without it. The string support is not at the same level as a language like Perl or even Java, but it's there. Also, the exception handling features may not be the greatest but they're there and they work.

    While these nice, safer features are there in C++, it's trivial to get down and dirty with purely C-like operations. I think the main reason C++ gets a bad rap, and is not seen as a superior alternative to C for applications development, is that a lot of C++ programmers bring their C habits along with them and do not make use of superior C++ features when available. This leads to inefficiency and inconsistencies in code written by different developers.

    ----
    The ultimate plays for Madden 2003 and Madden 2004
    [ Parent ]

    [Slight OT] string manipulation in C++ (none / 0) (#378)
    by strlen on Tue Feb 10, 2004 at 12:25:38 AM EST

    You should check out Boost, a collection of classes for C++; amongst them is a regexp class, which makes string processing in C++ a great deal easier.

    Honestly, I don't write C++ unless I'm paid to do so, I either code C or Assembly, or Perl, but in any case C++ sounded like the proper solution for the article's author. What he suggested is simply against C philosophy and design.

    --
    [T]he strongest man in the world is he who stands most alone. - Henrik Ibsen.
    [ Parent ]

    Why loose words on the evident? (2.00 / 4) (#185)
    by jope on Sun Feb 08, 2004 at 05:29:50 PM EST

    C is an anachronism, a better macro assembler. It is a terrible, ugly language that includes none of the developments of computer language design that occurred in the last 30 years. C++ and C# are nearly as bad. The only reason why people are using these languages is because they have to, or they do not know decent modern languages, or both.

    Ok. (none / 2) (#302)
    by valar on Mon Feb 09, 2004 at 03:35:49 PM EST

    What would you suggest? What is a decent modern language? I'll assume it isn't any of the ones you mentioned, obviously, and Java is very similar to C# (in fact, C# implements everything from Java that I've ever needed). Also eliminated are the other .NET languages, because they implement the same features and libraries and only differ in syntax. Lisp, Algol, COBOL, Fortran, etc. are definitely not modern languages. Perl is essentially shell and grep rolled into one (object-oriented) package. Of course, there are lots more, but in particular I am left with Python, Oberon and Ruby in my head, which aren't all that different than the others, really... So honestly, what is your ideal modern language (maybe we can start a language-religious-war thread [oops, too late])?

    [ Parent ]
    Python, oberon, ruby. (none / 0) (#321)
    by tkatchev on Mon Feb 09, 2004 at 05:01:20 PM EST

    All three have virtually nothing in common.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    to quote myself: (none / 0) (#368)
    by valar on Mon Feb 09, 2004 at 10:39:16 PM EST

    " head, which aren't all that different, than the others, really"

    Which, if correctly parsed according to English grammar, means that these things (Python, Oberon, Ruby) are similar to the other things (C++, C, Java, etc). If you still don't believe it... well, I doubt your understanding of those languages...

    [ Parent ]
    I doubt *your* understanding of anything. (none / 0) (#399)
    by tkatchev on Tue Feb 10, 2004 at 07:43:31 AM EST

    Python is a faux-functional language; a type of dynamically-typed ML for shell scripts, essentially.

    Ruby is a re-implementation of Smalltalk for writing shell scripts.

    Oberon is essentially a modern, safe version of Fortran compiled to efficient machine code.

    Not only do the three languages have nothing in common, they all have completely different and unrelated roots.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Heres a thought ... (none / 2) (#198)
    by blackpaw on Sun Feb 08, 2004 at 07:06:50 PM EST

    If you don't like it, *DON'T USE IT !*

    Sheeze

    In defence of the author... (none / 3) (#209)
    by Lord Snott on Sun Feb 08, 2004 at 11:44:48 PM EST

    As much as I want to say James A C Joyce is full of shit (he always has been before), I kind of agree with him here.

    While he and I may not use C, a lot of people do, including those that create the software we use (FreeBSD, Linux, Apache, Samba, etc).

    When C is used well, it's great, when C is used badly, it's horrendous.

    I (mostly) trust software from certain places, like FreeBSD.org or Apache.org, but... well...

    I wouldn't use C.
    ~~~~~~~~~~~~~~~~~~~~~~~~
    This sig in violation of U.S. trademark
    registration number 2,347,676.
    Bummer :-(

    [ Parent ]

    exactly (none / 2) (#219)
    by coderlemming on Mon Feb 09, 2004 at 03:15:48 AM EST

    When C is used well, it's great, when C is used badly, it's horrendous.

    This line pretty much sums up why I think this article is needless.

    C has a very specific niche in today's world.  It's extremely fast, and you can have a great deal of control over exactly what's going on.  It's just a bit higher-level than assembly language.  

    There are a precious few situations where C is the right language for the job, but when it is, it's about the only language for the job.  In this case, one simply has to be careful to avoid making horrendous code due to the pitfalls described in the article.  When using a razor, be careful not to cut yourself.

    C doesn't try to be a perfect language for everything, which is what the author seems to want.


    --
    Go be impersonally used as an organic semen collector!  (porkchop_d_clown)
    [ Parent ]

    Don't forget the wankers! (none / 2) (#221)
    by Lord Snott on Mon Feb 09, 2004 at 03:55:37 AM EST

    C has a very specific niche in today's world.

    Spot on. Only a wanker would disagree. Unfortunately there are too many wankers loose, using the wrong tool for the job.

    I've seen people using knives instead of screwdrivers, and screwdrivers instead of hammers. And I've seen people using C instead of Java, Delphi, or Visual Basic.

    Wankers.

    ~~~~~~~~~~~~~~~~~~~~~~~~
    This sig in violation of U.S. trademark
    registration number 2,347,676.
    Bummer :-(

    [ Parent ]
    Usage (none / 0) (#267)
    by ZorbaTHut on Mon Feb 09, 2004 at 11:49:53 AM EST

    When C is used well, it's great, when C is used badly, it's horrendous.

    Doesn't that apply to every language? Well, except the ones that are horrendous even when they're used as well as possible (see: brainfuck, befunge, intercal, malbolge.)

    Honestly, show me one language that's still fantastic outside its niche, and I'll show you someone who's straining to make a point and failing.

    [ Parent ]

    Let me re-phrase that... (none / 1) (#361)
    by Lord Snott on Mon Feb 09, 2004 at 09:30:06 PM EST

    I have trouble with English, so I'll try to spell this out as best I can.

    I don't like C. My mind doesn't work the way C wants it to. I wouldn't use C.

    C has its place; talented people use C well in its niche. People also use C outside its niche. This is a Bad Thing.

    People shouldn't use C outside its place, because even if you're a talented professional, it's too easy to make a mistake. Experienced and professional butchers who have been in their trade for decades still cut themselves, and C programmers who know exactly what they're doing still make mistakes.

    Most guns have a basic built-in safety in the form of a trigger guard. It doesn't get in the way (it doesn't hinder 99.99% of usage situations), but it stops 99% of accidental firings. It simply stops people (or things) bumping the trigger.

    C has no such basic protection. It is too easy to make a mistake and not realise it. Too many people use C unnecessarily; that was my point.

    I strained, but hopefully I still made my point :-)

    ~~~~~~~~~~~~~~~~~~~~~~~~
    This sig in violation of U.S. trademark
    registration number 2,347,676.
    Bummer :-(

    [ Parent ]

    Why C is not my favorite language (1.25 / 4) (#208)
    by Big Sexxy Joe on Sun Feb 08, 2004 at 11:34:46 PM EST

    Because Java is. Not that C suffers any terrible faults in general, but Java is the most fantastically designed language I have learned. Anything you can do in C you can do better in Java.

    I'm like Jesus, only better.
    Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
    Can you ensure data locality in Java... (none / 1) (#217)
    by lamont116 on Mon Feb 09, 2004 at 01:15:59 AM EST

    ...so you don't thrash the living shit out of my cache subsystem?

    [ Parent ]
    ...sort of. (none / 0) (#229)
    by Will Sargent on Mon Feb 09, 2004 at 05:36:18 AM EST

    You're talking about making sure that some objects are only held in memory?

    There's no way to do that portably, because memory management is owned by the VM.  If you're on a very generous VM, you can maybe indicate that instances of some class could help, but garbage collectors are so much smarter these days about short and long term references it probably wouldn't be a help.

    On the other hand, Java 1.5 will have the following:

    "On other platforms, each Java application consumes some system memory, so you might end up using more memory than you need to when running multiple Java applications. Other languages, such as C or C++, solve this problem using what's called shared libraries. Apple developed an innovative new technology that allows Java code to be shared across multiple applications. This reduces the amount of memory that Java applications normally use. And it fits right into Sun's Hot Spot VM, allowing Mac OS X to remain compatible with standard Java. In addition, Apple has given this implementation to Sun so the company can deploy it on other platforms. It's just one example of how Apple supports standards and shares ideas to benefit all."

    I think this means a copy of rt.jar is shared in memory between multiple JVMs on the same system instead of creating their own unique copy.
    ----
    I'm pickle. I'm stealing your pregnant.
    [ Parent ]

    no, not that cache (none / 2) (#305)
    by coderlemming on Mon Feb 09, 2004 at 03:50:06 PM EST

    Parent of parent is probably referring to the CPU cache subsystem. When a miss happens in this cache, the cache pulls in that piece of memory and a few nearby, on the assumption that you'll probably use them soon too. This is usually correct because of arrays, and because local variables in a function are all stored near each other.

    In Java, there's no way to make sure that variables that will be used together are stored near each other in memory, so the programmer can't provide hints to the CPU cache to help it speed things up.

    Such is the price you pay for using a portable virtual machine... the question is just an irrelevant dodge, since each language has its benefits and drawbacks.
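    For the curious, the kind of control being talked about, as a hypothetical C sketch -- same data, same arithmetic, wildly different cache behaviour:

        #define N 1024
        static int a[N][N];            /* rows are contiguous in memory */

        long sum_by_rows(void)         /* sequential addresses: cache-friendly */
        {
            long s = 0;
            for (int i = 0; i < N; i++)
                for (int j = 0; j < N; j++)
                    s += a[i][j];
            return s;
        }

        long sum_by_cols(void)         /* 4 KB stride per access: cache-hostile */
        {
            long s = 0;
            for (int j = 0; j < N; j++)
                for (int i = 0; i < N; i++)
                    s += a[i][j];
            return s;
        }

    In C the programmer chooses both the layout and the traversal order; a JVM is free to put objects wherever it likes.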


    --
    Go be impersonally used as an organic semen collector!  (porkchop_d_clown)
    [ Parent ]

    That's what you think (none / 0) (#230)
    by Will Sargent on Mon Feb 09, 2004 at 05:38:18 AM EST

    "Anything you can do in C you can do better in Java"...

    Ever tried to write a JNI program in Java?
    ----
    I'm pickle. I'm stealing your pregnant.
    [ Parent ]

    Yes (none / 0) (#288)
    by Big Sexxy Joe on Mon Feb 09, 2004 at 01:34:59 PM EST

    Contrary to popular belief, JNI's can be written in Java.  I strongly prefer writing JNI programs in Java, in fact.

    I'm like Jesus, only better.
    Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
    [ Parent ]
    C++ (none / 0) (#265)
    by ZorbaTHut on Mon Feb 09, 2004 at 11:40:58 AM EST

    Out of curiosity, do you make the same claim about C++ vs Java?

    [ Parent ]
    No (none / 0) (#290)
    by Big Sexxy Joe on Mon Feb 09, 2004 at 01:36:31 PM EST

    But anything C++ can do, Visual Basic can do better.

    I'm like Jesus, only better.
    Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
    [ Parent ]
    Heh (none / 1) (#316)
    by ZorbaTHut on Mon Feb 09, 2004 at 04:31:16 PM EST

    Now you're just trolling. :)

    [ Parent ]
    Now? nt (none / 0) (#479)
    by Big Sexxy Joe on Tue Feb 10, 2004 at 09:26:30 PM EST



    I'm like Jesus, only better.
    Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
    [ Parent ]
    Anything you can do in Java... (none / 1) (#353)
    by the on Mon Feb 09, 2004 at 07:51:39 PM EST

    ...I can do better in Forth. Of course I haven't bothered defining 'better' because that would just be splitting hairs.

    --
    The Definite Article
    [ Parent ]
    Java rules of-course, but you cannot do everything (none / 2) (#379)
    by Roman on Tue Feb 10, 2004 at 02:37:00 AM EST

    You cannot do everything in Java that is available to you in C; you must be kidding me. Forking processes (not threads, mind you), signal handling, system calls and direct memory manipulation are just a few of the things not available to you in Java. I have programmed in Java for over 5 years now, and in C for over 10, so I have already had to deal with many limitations. How do you create a plain function in Java? The object-oriented paradigm is not a good answer to many programming problems; object orientation doesn't make much sense when you need to implement an algorithm that simply works with some arbitrary parameters and stands on its own, rather than as a method on an object.

    Do you realize that Java is simply the next COBOL - a high-level business rule description language, and not even a good one? A real business rule description language would be a functional language such as Lisp or Scheme or ML. Java is an object manipulation language where objects are glorified structs with methods in them, so these are 'classes'.

    C++ is the definitive language: it supports procedural programming as well as 'object-oriented' programming, as well as low-level system calls, and it has some amazing functionality hardly available elsewhere (templates, etc.), but C++ sounds too complicated for business programming precisely because it does all of these things and business programming does not need many of them.

    Don't tell me multiple inheritance is available in Java, or friend classes (internal classes do not count; Java compilers reimplement these as simply public classes with methods protected for only the current package to use).

    Can you force memory clearance in Java? I mean, you cannot release memory even by an explicit System.gc() call. I miss something like System.free(Object) in Java; sure, it would leave nulls instead of references in many objects, but it would be very useful in special cases. What about running native code - do you like JNI?

    Java is a nice language, but don't confuse that property with being a superior language.

    [ Parent ]

    Procedural (none / 0) (#457)
    by MilesTeg on Tue Feb 10, 2004 at 03:26:51 PM EST

    You've been writing Java for 5 years and you've yet to discover the "static" modifier eh?

    [ Parent ]
    c'mon (none / 0) (#473)
    by Roman on Tue Feb 10, 2004 at 07:32:34 PM EST

    don't be an a-hole, sure there is static, but everything is still a class, isn't it? So you end up with a kludge. good.

    [ Parent ]
    Okay (none / 0) (#480)
    by Big Sexxy Joe on Tue Feb 10, 2004 at 09:28:05 PM EST

    So since it refutes your whole argument, it must be a kludge.

    I'm like Jesus, only better.
    Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
    [ Parent ]
    that's interesting (none / 0) (#511)
    by Roman on Wed Feb 11, 2004 at 02:04:43 PM EST

    You do believe that it refutes my whole argument?  Sorry that I did not mention it in the original comment - I did think about it.  It is true that static gives you the ability to have just 'code' that does not have to belong to an instance of a class.

    Firstly I argued about more than just that single point.

    Secondly, I do not like all the kludge-work that goes with the 'static' modifier if all you want is a real procedural paradigm.  Why, Java isn't procedural, remember?  It is OO, but not pure OO.  Like BASIC, which was procedural until an 'object-oriented' part was bolted onto VB.  Is VB a good OO language?  No.

    I am still confused as to how the 'static' modifier refutes the entire argument of my first comment.

    [ Parent ]

    Why C++ is my favorite language (none / 0) (#392)
    by ttsalo on Tue Feb 10, 2004 at 06:25:04 AM EST

    Come on, Java is retarded. You can't even write a function max(x, y) that works for all types that have a comparison operation defined. Granted, you can't in C either, and that's why C++ is my favourite language.
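    For the record, the C++ version is a single compile-time-checked template (max_of is my own name, to steer clear of std::max, but the idea is exactly this):

        /* works for any T with operator< defined; checked at compile time */
        template <typename T>
        const T &max_of(const T &x, const T &y)
        {
            return (x < y) ? y : x;
        }

        /* usage: max_of(3, 7) == 7, max_of(2.5, 1.0) == 2.5,
           max_of(std::string("a"), std::string("b")) == "b" */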



    [ Parent ]

    Ignorant (none / 0) (#456)
    by MilesTeg on Tue Feb 10, 2004 at 03:24:09 PM EST

    try: public max ( Object x, Object y ) { ... } Any class which has a comparitor defined will work.

    [ Parent ]
    Re: Ignorant (none / 0) (#488)
    by ttsalo on Wed Feb 11, 2004 at 03:47:31 AM EST

    I said all types, not just all classes. Does this really work with ints, doubles etc? If it does, Java has really been doing some catching up while I've been away...

    [ Parent ]
    Please continue. (none / 0) (#506)
    by i on Wed Feb 11, 2004 at 01:08:03 PM EST

    What's inside the curly braces?

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]
    Java?! (none / 0) (#533)
    by Arevos on Thu Feb 12, 2004 at 10:02:26 AM EST

    Not that C suffers any terrible faults in general, but Java is the most fantastically designed language I have learned.

    Java?! Java is the most fantastically designed language you've learnt?! How many languages do you know?

    As a language, Java is pretty damn bad. It's essentially a stripped-down version of C++ with garbage collection and a good library behind it. The library's nice, but if you ignore it for a moment (as it has no bearing on the actual language itself), then Java's really bad.

    Anything you can do in C you can do better in Java.

    Not really. Some things Java can do better - or should I say, some things are easier to implement with the Java libraries than their C equivalents - but Java misses a lot of the features C++ has.

    The programming project I'm working on at the moment needs multiple inheritance and pointers for it to work nicely. I can't think of a good way to get around such a lack in Java. Not that C++ is any good either, but at least it doesn't pretend to be anything more than a hack on C.

    [ Parent ]

    Don't mess (2.83 / 6) (#226)
    by bloat on Mon Feb 09, 2004 at 05:10:09 AM EST

    If you find yourself writing a comment that says:

    /* Don't touch this code unless you know what you're doing */

    Then do everyone on your team a favour - delete the comment and rewrite the code.

    CheersAndrewC.
    --
    There are no PanAsian supermarkets down in Hell, so you can't buy Golden Boy peanuts there.
    Doh! (none / 0) (#227)
    by bloat on Mon Feb 09, 2004 at 05:13:22 AM EST

    That was supposed to go here
    --
    There are no PanAsian supermarkets down in Hell, so you can't buy Golden Boy peanuts there.
    [ Parent ]
    BS (none / 1) (#282)
    by curien on Mon Feb 09, 2004 at 01:02:09 PM EST

    Sometimes the performance of a tight loop is more important than the readability of the code. The example given was Duff's Device. People don't use it just to look flashy; they use it to make partially-unrolled loops that, say, fit inside a small, fast cache.
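    For anyone who hasn't seen it, here is a sketch of the device (Duff's original wrote to a memory-mapped output register; this variant just copies memory, and the function name is made up). The switch jumps into the middle of an eight-way unrolled loop to soak up the remainder:

        void copy8(short *to, const short *from, int count)
        {
            if (count <= 0) return;
            int n = (count + 7) / 8;        /* number of passes through the loop */
            switch (count % 8) {            /* jump into the middle for the remainder */
            case 0: do { *to++ = *from++;
            case 7:      *to++ = *from++;
            case 6:      *to++ = *from++;
            case 5:      *to++ = *from++;
            case 4:      *to++ = *from++;
            case 3:      *to++ = *from++;
            case 2:      *to++ = *from++;
            case 1:      *to++ = *from++;
                    } while (--n > 0);
            }
        }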

    --
    All God's critters got a place in the choir
    Some sing low, some sing higher
    [ Parent ]
    Then explain what happens in the comment! (none / 0) (#478)
    by Merc on Tue Feb 10, 2004 at 08:47:25 PM EST

    First of all, profile the code to be sure that that really is a bottleneck. If it isn't, don't optimize it! If it is, sure, go ahead and optimize it, but explain what the hell you're doing. Maybe you're an uber-elite programmer and can get something crammed into a cache. Maybe you're a sloppy ass who will start leaking memory because you forget you malloced above. Do your team a favour and comment the damn code.



    [ Parent ]
    Um (none / 0) (#509)
    by curien on Wed Feb 11, 2004 at 01:21:50 PM EST

    That's exactly what I said to do.

    --
    All God's critters got a place in the choir
    Some sing low, some sing higher
    [ Parent ]
    Where? (none / 0) (#512)
    by Merc on Wed Feb 11, 2004 at 02:42:11 PM EST

    In your comment you say that sometimes the performance is more important than the readability of the code. Nowhere in there do I see anything about commenting the code properly.



    [ Parent ]
    Because OP (none / 0) (#518)
    by curien on Wed Feb 11, 2004 at 06:31:31 PM EST

    meant to reply to another of my comments but responded top-level on accident, so the conversation got clipped.

    --
    All God's critters got a place in the choir
    Some sing low, some sing higher
    [ Parent ]
    Use Perl; (none / 2) (#228)
    by zentara on Mon Feb 09, 2004 at 05:34:53 AM EST

    Many, many ex-C programmers have found a "happy compromise" with Perl. It takes care of most of the problems you discuss. Perl is like a "user-friendly" front-end to C. It makes it so easy to whip out a program, and if you feel the "need for speed", you can include Inline C or even Inline Assembly.

    Before I posted this, I wanted to see if anyone else mentioned Perl, so I searched the page for it. The matches I got were "pro perl y". Sort of a "zen moment". :-)

    Perl sucks. (1.00 / 6) (#232)
    by James A C Joyce on Mon Feb 09, 2004 at 05:51:23 AM EST

    I'm sure you understand.

    I bought this account on eBay
    [ Parent ]

    Matrix?! (none / 1) (#233)
    by bkhl on Mon Feb 09, 2004 at 06:02:02 AM EST

    Since the fsck when is "zen moment" a The Matrix reference?

    [ Parent ]
    LOL, that's my signature. (1.28 / 7) (#239)
    by James A C Joyce on Mon Feb 09, 2004 at 07:50:52 AM EST

    HAHAHAHAHAHAA YOU SUCK!

    I bought this account on eBay
    [ Parent ]

    It's his sig. (none / 0) (#383)
    by squigly on Tue Feb 10, 2004 at 03:48:27 AM EST

    Doesn't apply to your comment.

    [ Parent ]
    Unix Hater's handbook reference (none / 2) (#231)
    by Will Sargent on Mon Feb 09, 2004 at 05:43:21 AM EST

    http://librenix.com/?inode=3046
    ----
    I'm pickle. I'm stealing your pregnant.
    Efficiency is (NOT) king (2.83 / 6) (#234)
    by gidds on Mon Feb 09, 2004 at 06:09:16 AM EST

    Gosh! I'm surprised this story generated so much interest - and such polarised responses...

    As a professional software developer who's used C for yonks, I agree with most of what you say. C is a pain to use in many ways; structuring large systems is a pain, making them robust is a pain, memory management is a pain, the standard library is a pain... Yes, you can learn to work around most of these, but such a low-level approach is appropriate far less often than people seem to think.

    Where I disagree is in the reasons you attribute these problems to. It's not just ancient fashion, ignorance, or perverse pleasure. C is as it is because of a philosophy that there should be no hidden code: all code the compiler emits should map directly to a statement in your program. Most of the extra features you'd like to see - memory management, exceptions, more types, string handling - would involve extra work 'behind the scenes' to manage. C is designed for 'no overheads'.

    Why? Efficiency. Efficiency is the driving force behind C. (Runtime efficiency, that is. Programmer efficiency doesn't come into it...) And this is both its greatest blessing and its greatest handicap. It means that compilers are relatively easy to write, that the language is relatively small and concise, and C code is small and fast. But it also means that programmers have to spend time and mental effort working around the lack of higher-level facilities, or doing without and risking ugliness and flakiness.

    Worse than this, though: it encourages bad habits. Error handling is difficult and ugly, so people don't bother. Bounds checking is hard work, so people just assume that there's enough memory. C teaches that efficiency is the most important thing, so people use all sorts of ugly hacks and fragile constructs in pursuit of ultimate efficiency, when something slightly slower but perfectly usable would allow something far more robust, extensible, and reusable.

    Andy/

    my pet peeve with C (2.25 / 4) (#236)
    by the sixth replicant on Mon Feb 09, 2004 at 06:51:54 AM EST

    is that it forced a lot of languages to look like it so they could be taken "seriously". hence {}, if () etc etc. nothing too bad here, but the biggest fuckup is that for some godforsaken reason a dumb arse computer programmer thought he was smarter than 3000 years of mathematical history and decided to start counting things from 0. yes, zero. was it the first language to do this? i don't think so (i guess there are more up-their-arse programmers out there that just love showing how "original" they are), but it was one of the most popular that started this trend. in fact, you can say this about computer languages:
    if you need to start counting from 0 - then it's a real computer language; if not - it's mickey mouse.

    i used C to do most of my combinatorial computing since it was the fastest language and i needed very large arrays to handle the finite field arithmetic (for which, by the way, having arrays start from 0 was very convenient!!!) - making me wait a few days on what would have otherwise taken a few weeks of computation time. but my programs were "simple" recursion, no maths library stuff. if we now had to use C in our work environment (web company) i think most people would have gone crazy just reading each other's code, let alone trying to debug it.

    so i've shown that C has its place - speed issues, but can we shoot the guy (and it has to be a guy) who thought counting things starting from zero somehow made sense (yeah, i'm sure it's a compiler thing etc and maybe assembler related - but jesus - 21st century, people)

    ciao

    Counting from 0 (none / 1) (#241)
    by zakalwe on Mon Feb 09, 2004 at 08:13:38 AM EST

    Indexing from zero does make sense, and it's nothing to do with compilers or assembler - it's because it results in the most natural and convenient properties. What makes this "counter to 3000 years of mathematical history" (or at least the last 1400 years of it, after 0 caught on)? Have you never seen the first term of a sequence referred to as t_0? You mention yourself that indexing from 0 was very convenient for maths - had you been indexing from 1, you would often have found the need for much more ugly (and bug-producing) +1s and -1s everywhere.

    For a good description of why 0 based indexing is desirable (as well as the "inclusive lower bound, exclusive upper bound" convention also common in C for loops) see this note by Dijkstra.

    [ Parent ]

    ok (none / 0) (#242)
    by the sixth replicant on Mon Feb 09, 2004 at 08:38:14 AM EST

    you got me with the t_0 bit :) but that's because we're indexing over the natural numbers and not counting. i get your point though.

    i think the +1s and -1s are more common now, not less common, due to the fact that things like size-1 and length-1 have to be used, though we might have a few more <= than < running around in our loops

    also it just confuses things: what does int[5] mean? if we forget about the syntax and abstractly think that we need to
    1. define the name of the array
    2. define the types of containers it needs, and
    3. how many containers

    in some (most) languages the number defined in 3 is the number of containers, in VB it's that number plus one, and yet you still start counting from zero... ugh, messy and counterintuitive for 99% of the population, especially those who thought they knew maths

    no, there is a big difference between labeling things and counting things, and to be honest we're too ingrained to notice this

    anyway i needed to get it off my chest :)

    ciao

    [ Parent ]

    Counting vs indexing (none / 0) (#245)
    by zakalwe on Mon Feb 09, 2004 at 09:07:20 AM EST

    but that's because we're indexing over the natural numbers and not counting
    I'm not sure what you mean by counting. In C, counting - as in "how many elements are there" - doesn't start from 0; just indexing does. int[5] in C denotes an array with five elements. (I agree that VB is horrible in its worst-of-both-worlds approach - it just leads to sloppiness and ambiguity.)

    i think the +1 and -1 are more common, not less common, now due to the fact that things like size-1, length-1, have to be used
    I disagree. The link I referenced gives, IMHO, a pretty good reason to prefer the i <= x < j convention for things like looping, slicing etc., and if you accept that, then 0-based indexing gets rid of having to use (num_elements+1) all the time for j here. It leads to a more natural and consistent system with fewer tacked-on constants.
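    A tiny example (the array contents are arbitrary) of how the constants disappear with the 0-based, half-open convention:

        #include <cstdio>

        int main()
        {
            int a[] = { 3, 1, 4, 1, 5 };
            int n = 5, sum = 0, sum2 = 0;

            for (int i = 0; i < n; ++i)      /* 0-based, half-open: the bound is the count */
                sum += a[i];

            for (int i = 1; i <= n; ++i)     /* 1-based, inclusive: a -1 on every subscript */
                sum2 += a[i - 1];

            std::printf("%d %d\n", sum, sum2);   /* both print 14 */
            return 0;
        }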

    [ Parent ]
    No, no. (none / 0) (#253)
    by tkatchev on Mon Feb 09, 2004 at 11:10:05 AM EST

    t[5] is the fifth offset of t, not the fifth element of t.

    That's the difference between an array and a real data structure.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Certainly, (none / 0) (#279)
    by ksandstr on Mon Feb 09, 2004 at 12:33:07 PM EST

    You can keep beating yourself over the head with doubt over "what does int whatchamacallit[5] REALLY mean?", if you really want to. I'd personally suggest that you take a good long look at an authoritative source on what that particular expression means in the context of the programming language you're using and get rid of that sort of doubt right away.

    I personally used to obsess over irrelevant little details such as "will this int be 32 bits, or will it be 33 with a sign bit on some future architecture that I've never ever even heard of?" but then I got over it. It's really just a matter of saying "Today I'm writing for this architecture, and I know that my set of assumptions with regard to its features are correct" and keeping a reference within an arm's reach.

    What I guess I'm really saying is that if you don't know your tools, you don't need a compiler to stab you in the back. Know your tools, but above all, know yourself.


    Fin.
    [ Parent ]

    Count? (none / 1) (#317)
    by FieryTaco on Mon Feb 09, 2004 at 04:42:55 PM EST

    C doesn't count. You're talking about indexes. It's not a failing of any kind in the language. I don't mean to insult you, because obviously you are quite intelligent, but the problem lies between the seat and the keyboard.

    [ Parent ]
    Needing C for speed? Not in years. (none / 2) (#260)
    by Smerdy on Mon Feb 09, 2004 at 11:32:02 AM EST

    If you think you need to code in C to get acceptable speed on modern computers, then you are out of touch. See the results from the classic Great Computer Language Shootout: among free compilers, two ML compilers are only _one_ unit below gcc's first-place spot in the runtime speed score determined for the benchmarks that were performed. Unfortunately, there aren't any commercial ML compilers, so C benefits from having more money thrown at efforts to develop uber-optimizing compilers for it. I think ML compilers would be just as effective or better with the same kind of political backing.

    [ Parent ]
    Offsets vs indexing... (none / 0) (#268)
    by trimethyl on Mon Feb 09, 2004 at 11:51:49 AM EST

    I suspect that the real reason why array offsets start with 0 rather than 1 is the manner in which addresses are computed. If you have a label representing the address of an array, an element of the array exists at

    Address of element = (index * size_of(type)) + (address_of(array)).

    Thus, finding the machine address for element zero is rather simple. For data types whose size is a power of 2, the index can be loaded into a register, shifted left by log2(size_of), and added to the array address in another register. It's a very simple, very fast machine operation. Since the original sizes of the primary datatypes (char, int, and float) were 1, 2, and 4 bytes respectively, this convention makes array access fast and compiler writing easier.

    Now it is true that we could use any arbitrary base and subtract the base from the index before computing the offset. However, this must be done at runtime, and it introduces an additional instruction (or sometimes 3, if the architecture doesn't support a "dec" instruction). It would not be difficult, but it would incur a speed penalty. And because array access is often done at the innermost level of loops, this would be a rather substantial penalty.
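    To make the arithmetic concrete, here is a check one can actually compile (the array is arbitrary): with a zero base, &a[i] is exactly the start address plus i * sizeof(element), and element zero needs no arithmetic at all.

        #include <cassert>

        int main()
        {
            int a[5];
            assert((char *)&a[2] == (char *)a + 2 * sizeof(int));
            assert(&a[0] == a);   /* offset zero: no computation needed */
            return 0;
        }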

    I think that the zero-based system is probably the best one. Anyone who uses indexes for counting is bound to get confused; for example, the count of elements n to m is m - n + 1. Someone who accesses elements indexed 1 through 9 has done 9 accesses, but someone accessing 2 through 9 has done 8. By convention, we start counting at 1, so the first seems obvious. The second sequence, though, is confusing, because 9 - 2 = 7. OTOH, someone accessing 0 through 9 has done 10, which means that a correct count of the accesses always requires computing m - n + 1, whereas with a base of 1 it is easy for a programmer to become lazy and assume that the highest index gives the count of accesses.

    I suppose that people would complain either way, though. At least with 0 based indexes, programmers are forced to apply a formula every time, rather than some times and not others.



    [ Parent ]
    Nah (none / 0) (#325)
    by ZorbaTHut on Mon Feb 09, 2004 at 05:15:16 PM EST

    You're making a simple mistake. Just eliminate the concept of "X through Y". It doesn't exist. Instead, use "X to Y".

    So "0 to 10" is 10 items. "5 to 19" is 14 items. It's all simple at that point.

    Once you start allowing "through", everything gets horribly messy. :P

    [ Parent ]

    Matter of culture. (none / 0) (#504)
    by trimethyl on Wed Feb 11, 2004 at 12:27:35 PM EST

    Yes, either way has its advantages (I won't rehash Dijkstra's argument, but it is very good).

    The problem is that "to" is much less intuitive than "through". If you process an array from 1 to 10, using your method, you've processed 9 elements. But the code would appear as if you've processed all 10. And you'd inevitably get someone who would complain:

    Why can't I just use the indexes 1 to array-max when processing an array? I get sick and tired of having to start my loops at zero...

    The fundamental problem is that most people were taught to count starting from 1 in elementary school, yet they weren't taught how to determine the number of elements in a sequence from the relative position numbers. With either base 0 or base 1, you still have the problem that the span of the index range and the size of an array are never equal; that is, accessing elements n through m always results in m - n + 1 accesses. This problem isn't going to go away; it's a mathematical property of sequences.

    By using a base of zero, other parts of the language (such as loops) are often made more intuitive. For example, for loops in different languages do different things. In C, it is intuitive - the middle expression must be true for the code block to execute. In BASIC, you must just "know" that FOR I = 1 TO 20 means that the loop will execute until I is 21. But notice that it says TO, not THROUGH.

    A zero bound also makes bounds checking easier. For an array of size n:

    • Any negative number is out of bounds.
    • Any number n or larger is out of bounds.

    But for a 1-based system:

    • Any negative number is out of bounds.
    • Zero is out of bounds - yet it is a valid loop starter?
    • Any number larger than n is out of bounds.
    No matter how you slice it, bounds checking on a 1-based array is going to result in at least one additional instruction. Combine this with having to generate an additional instruction for addressing (as opposed to a zero base), and you've incurred at least a 2-instruction penalty per access on a bounds-checked array.
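    One way to see the instruction count in code (the function names are made up): with 0-based arrays and an unsigned index, a single comparison covers both ends, because a negative index converted to unsigned wraps around to a huge value.

        #include <cstddef>

        bool in_bounds_0based(std::size_t i, std::size_t n)
        {
            return i < n;             /* one comparison covers both bounds */
        }

        bool in_bounds_1based(std::size_t i, std::size_t n)
        {
            return i >= 1 && i <= n;  /* the extra check for the zero hole */
        }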

    But either way, you won't please everyone.



    [ Parent ]
    Just a couple of points on this crackrock story (2.75 / 4) (#237)
    by axel on Mon Feb 09, 2004 at 07:07:54 AM EST

    - The browser most people are using to read K5 is written in C.
    - The desktops on which most people work are written in C (gnome, any gtk-based stuff) or C++ (kde, ms-windows, etc).
    - All your OSes are belong to us too.
    - If you can't live without a string type, there are _lots_ of libraries that implement wrapper string types that you can use (see for example glib). But anyway, live with it: or you can always go back to BASIC.
    - If you can't use pointers properly, then it's not the language's fault: rule #1 for C compilers is 'the programmer knows what he's doing'. No point in having a smartass compiler that tries to guess stuff or fix people's code. The first thing you have to learn to code C is pointer discipline.
    - Of course C is nasty and not suited for coding text-filtering tools and scripts: that's what perl, awk, sed etc. are for. Oh wait, they're written in C as well. We should rewrite them in Java! That way they'll be more efficient!!!

    Good point. (none / 1) (#255)
    by tkatchev on Mon Feb 09, 2004 at 11:11:18 AM EST

    This is exactly why computers suck so much.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Rebuttal (none / 1) (#258)
    by Smerdy on Mon Feb 09, 2004 at 11:19:59 AM EST

    - The browser most people are using to read K5 is written in C.
    - The desktops on which most people work are written in C (gnome, any gtk-based stuff) or C++ (kde, ms-windows, etc).
    - All your OSes are belong to us too.
    The fact that a methodology has been used to achieve a goal in no way precludes the possibility that there are much better methodologies for achieving that goal. How is this a better argument in favor of C than this one for traveling long distances by horse-and-buggy: "Look at all the famous people who have traveled by horse and buggy!"
    - If you can't use pointers properly, then it's not the language's fault: rule #1 for C compilers is 'the programmer knows what he's doing'. No point in having a smartass compiler that tries to guess stuff or fix people's code. The first thing you have to learn to code C is pointer discipline.
    Why is this "discipline" valuable if it increases the amount of time it takes to produce quality software, since extra time may be taken up tracking down memory related bugs?

    [ Parent ]
    Counter-counter-points (none / 0) (#482)
    by Pseudonym on Wed Feb 11, 2004 at 12:01:57 AM EST

    The browser most people are using to read K5 is written in C.

    Which one would that be? Every version of Netscape that I'm aware of was written in C++, and ditto for MSIE, Opera, Konqueror... Are people still using Mosaic or something?

    The desktops on which most people work are written in C (gnome, any gtk-based stuff) or C++ (kde, ms-windows, etc).

    This is a rant against C, not C++. C++ is a very different language from C. Most people's desktops are written in C++, with the exception of non-GTK X setups. This does, admittedly, comprise a majority of X setups. However, it excludes both Windows and Mac OS.

    All your OSes are belong to us too.

    Not all. Most of Windows is written in C++. A lot of BeOS is, too. Linux, MacOS and QNX are mostly C, though.

    However, this is not necessarily a point in C's favour. I'm on the debian-security mailing list, and I get at least one report a week about a buffer overflow in some package (or, rarely, the kernel). Most of the potential attacks are theoretical, of course, but still.


    sub f{($f)=@_;print"$f(q{$f});";}f(q{sub f{($f)=@_;print"$f(q{$f});";}f});
    [ Parent ]
    Casting return value of malloc() (1.20 / 5) (#243)
    by fishpi on Mon Feb 09, 2004 at 08:50:02 AM EST

    You claim that you "mustn't" cast the return value of malloc(), which is totally false. Casting the return value of malloc() is totally acceptable within the standard, and some people (notably P. J. Plauger) advocate this.

    What is true is that you don't have to cast the return value of malloc. Casting it has no effect other than to hide possibly useful warnings. In the vast majority of cases the right thing to do is not to cast it.

    A language is a tool (none / 3) (#244)
    by gbd on Mon Feb 09, 2004 at 08:59:26 AM EST

    And like all tools, the more powerful it is, the more dangerous it can be if not used properly. C is sort of like an industrial strength nail gun; if wielded improperly, it can cause untold carnage. Used correctly, however, it can accomplish virtually anything with unparalleled speed and efficiency. Languages like Java, on the other hand, are more like the padded "Whack-A-Mole" mallets you might find at Chuck E. Cheese. If you screw up, the worst thing that will happen is that you'll give some kid a bloody nose and get a lecture from a man in a big mouse suit. But nobody (seriously) suggests writing an operating system or a CPU-intensive image processing algorithm in Java, because reasonable people realize that it's not the right tool for the job.

    --
    Gunter glieben glauchen globen.
    Not in Java, buuuut.... (none / 1) (#252)
    by Smerdy on Mon Feb 09, 2004 at 10:59:30 AM EST

    Competent people do suggest implementing CPU-intensive image processing algorithms in OCaml or Standard ML.

    [ Parent ]
    That makes sense. (none / 0) (#256)
    by tkatchev on Mon Feb 09, 2004 at 11:13:27 AM EST

    You need to really contort yourself to make C do parallel computation. Writing a good library with concurrency and vector operations, etc., would just about amount to writing a simple ML compiler, so why bother with C?


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Different Tools (none / 0) (#411)
    by bugmaster on Tue Feb 10, 2004 at 08:37:32 AM EST

    C and Java are made for different purposes. You wouldn't write real-time programs in Java (it's too slow), and you wouldn't write GUIs in C (well, you could, but you'd be sorry you did). C is designed to be just marginally more complex than assembly, while retaining most of its speed -- hence the pointer arithmetic, the unions, and the total lack of error-checking. Java is designed to maintain complex, binary-portable, multithreaded projects where speed is not important -- hence the OOP, the JVM, and the built-in error checks everywhere.

    You wouldn't use a jackhammer for neurosurgery, and you wouldn't use an MRI machine to break up pavement. They're different tools made for different purposes.
    >|<*:=
    [ Parent ]

    Quote in context (none / 0) (#248)
    by Protagonist on Mon Feb 09, 2004 at 09:58:30 AM EST

    Just for the sake of accuracy, here's the full text of the footnote on whitespace:

    Thus, preprocessing directives are commonly called "lines". These "lines" have no other syntactic significance, as all white space is equivalent except in certain situations during preprocessing (see the # character string literal creation operator in cpp.stringize, for example).

    ----
    Hahah! Your ferris-wheel attack is as pathetic and ineffective as your system of government!

    For the sake of accuracy... (none / 0) (#249)
    by KrispyKringle on Mon Feb 09, 2004 at 10:04:51 AM EST

    don't bother reading the article. I'm not a fan of programming in C - I find it fairly painful, in fact. But the author makes a number of factual errors; C, despite its irritants, certainly does have value (hence its widespread use), and the author hasn't convinced me otherwise.

    I'd point out the errors, but I think enough other people did, as well. And I just woke up.

    [ Parent ]
    C might be ok (none / 3) (#250)
    by Cro Magnon on Mon Feb 09, 2004 at 10:24:11 AM EST

    if you're writing an OS kernel or device driver. But why the fcsk would anyone write a high-level program in it? Pascal, Ada, Python, and Perl are better suited for most programming.
    Information wants to be beer.
    Perl my ass. (none / 1) (#307)
    by FieryTaco on Mon Feb 09, 2004 at 03:59:59 PM EST

    Perl is absolutely not suited for programming. The fact that people write programs/applications in it is merely an oddity, but in no way justifies saying that perl is appropriate for applications development.

    [ Parent ]
    article time... (none / 0) (#314)
    by coderlemming on Mon Feb 09, 2004 at 04:25:27 PM EST

    Why PERL Is Not My Favorite Programming Language, by FieryTaco.  :P

    Point:  To each his own.


    --
    Go be impersonally used as an organic semen collector!  (porkchop_d_clown)
    [ Parent ]

    In a word: $_ [nt] (none / 0) (#345)
    by esrever on Mon Feb 09, 2004 at 07:03:27 PM EST



    Audit NTFS permissions on Windows
    [ Parent ]
    Mixed opinion (none / 0) (#315)
    by Cro Magnon on Mon Feb 09, 2004 at 04:26:50 PM EST

    I've fiddled with Perl at home and found it far less error-prone than C/C++. However, I wouldn't want to maintain someone else's Perl code!
    Information wants to be beer.
    [ Parent ]
    In similar news: (none / 0) (#322)
    by tkatchev on Mon Feb 09, 2004 at 05:04:07 PM EST

    I heard from an authoritative source that maintaining COBOL code is much, much less error-prone than maintaining IBM 360 assembler punchcards.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Yes it is (none / 0) (#332)
    by Cro Magnon on Mon Feb 09, 2004 at 05:35:23 PM EST

    I've done it. The COBOL that is. Thank God, I've never had to maintain IBM assembly!
    Information wants to be beer.
    [ Parent ]
    The author doesn't understand his own argument (2.92 / 14) (#263)
    by irwoodhouse on Mon Feb 09, 2004 at 11:36:56 AM EST

    I learned to program in 6502 assembly and BBC BASIC.

    Then I was taught to program in standard (similar to ISO-) pascal.

    Then I taught myself C in order to maintain a fairly arcane piece of software. I've about 10 years experience with C.

    BWK's article (which I've actually read - I wonder how many other posters have?) is a report following his experience of trying to rewrite a suite of Software Tools from C into Pascal. It details the problems, and makes observations about the constraints (constraint != limitation; constraint = disallowed, limitation = inability) Pascal imposes in order to enforce Good Programming Style (in the opinion of Wirth).

    Joyce's article on K5 appears by comparison to be just a plain rant.

    As many posters have noted, Pascal and C are entirely different tools. Most importantly, C was written to enable KT to get Unix working (practical problem, real-world solution) whereas Pascal was written to teach Structured Programming (at the time thought to be The Solution to shaky software, since superseded by Object Orientation, which has itself been superseded by Formal Methods).

    As people wanted to use the language for things it wasn't designed for, features were added.

    To pick an example: when I learned Pascal, it didn't have strings (they were introduced by Turbo Pascal). We had packed arrays of char, which were very inflexible.

    It occurs to me that Joyce doesn't really grasp the reasons behind the differences between programming languages, particularly where C is involved, because of its (probably) unique position.

    To analyse every one of his comments requires an article in itself (I'm tempted), so I'll select a couple:

    "char" is basically only a really small number

    Not until the 1999 ISO C standard were we finally able to use 'bool' as a data type

    CPUs generally don't understand the concept of characters, booleans, strings, structs. They understand bytes and words, and must be taught everything else. Even Pascal has the "Type" keyword for custom data types.

    Inconsistent error handling

    A follow-on from the above (constraints of machine types). If an in-band error value is to be returned, it must be outside the valid range. For file descriptor functions, the error value is therefore negative. For functions returning pointers, -1 might well be interpreted as an unsigned value and be a valid machine address; NULL is typically mapped to zero, and zero is typically guaranteed to be outside the valid address space of a process.

    errno is used where there are multiple possible causes of errors to avoid further polluting the return space with in-band error values.
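    The two conventions side by side, as a small sketch (POSIX-flavoured; the file name is made up):

        #include <fcntl.h>     /* open */
        #include <unistd.h>    /* close */
        #include <cstdio>      /* fopen, fclose, perror */

        int main()
        {
            int fd = open("data.txt", O_RDONLY);
            if (fd < 0)                  /* descriptors: negative is the error band */
                std::perror("open");     /* errno says which error it was */
            else
                close(fd);

            std::FILE *fp = std::fopen("data.txt", "r");
            if (fp == 0)                 /* pointers: NULL is the error band */
                std::perror("fopen");
            else
                std::fclose(fp);
            return 0;
        }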

    Joyce also lays some criticism without exploring the implications. He cites exception handling. C as a language has practically no run-time environment beyond that provided by the operating system. Pascal has a fairly sophisticated one (amongst other things, it checks bounds on arrays even where the index is a variable). This run-time is implemented by the compiler, which provides the exception handling.

    The implications are two-fold. Firstly, said exception handling is an overhead not visible to the programmer and not under his control. Whilst useful for novices, it can be infuriating when it interferes with a programmer who knows exactly what he is doing. Compilers are undoubtedly clever, but (in the case of C) they should not attempt to second-guess the programmer.

    Secondly, the entire run-time environment including error handling has to be written somewhere, so at some level of programming it doesn't exist.

    C's design is for systems tasks (and Joyce seems to have forgotten this includes compilers) where there are no cosy environments to assist you. Writing systems-level tasks in assembly is not easy and is definitely NOT portable, which is why only the minimum is done in assembly.

    To my knowledge, other than assembly and C, only one other language has been used to implement an operating system and compiler, and that is Oberon (also by Wirth, and a complete disaster: read the book).

    The above points should indicate exactly why.

    Elaborate on Oberon, please. (none / 0) (#276)
    by tkatchev on Mon Feb 09, 2004 at 12:29:51 PM EST

    AFAIK, it is currently very popular in niche segments because it is more efficient than C while at the same time providing garbage collection and other hand-holding features.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Re: Oberon (none / 0) (#292)
    by irwoodhouse on Mon Feb 09, 2004 at 01:43:58 PM EST

    I don't have a problem with the language itself. Every language has its purpose.

    My impression is from the Project Oberon book in which - assuming I've not confused this with something else - the scope was the implementation of an operating system and compiler in Oberon itself.

    My problem with this is that Wirth is still stuck with the notion of "how to program" from the point of view of reliable software and was extending into systems areas where flexibility is more important.

    At the time I was working on an abstract algebraic specification for the 386 and was looking for example mappings into languages other than assembly (and C didn't look nice from the formal methods point of view). A PhD student whose thesis was in formal methods suggested Oberon.

    At this point I was already competent with C, and the 386 is a horribly intricate beast (memory access modes), so my judgement may be coloured, but I just didn't think C had a competitor here.

    As an aside, you say "garbage collection and other hand-holding techniques" and I'm not sure whether you approve of GC or not. Having studied under people from the formal methods community (who disliked C), I shudder when I hear "garbage collection" because, in the cases I've seen, it implies trying to write the program to behave like the algebra and mopping up after programming inefficiencies.

    I am not aware of many languages which can be used to implement their own compiler (obviously C; I believe Java can - I'm sure there are more, but I don't pay enough attention to other languages). For those which can't, you're ultimately relying on another language to provide your environment.

    My interpretation of Kernighan's article was that Pascal is fine as a teaching language, but for a number of tasks it isn't enough, because it prevents some things considered "bad" which are often very useful. This cannot be fixed without modifying the language.

    By contrast, Joyce attacks C for being non-intuitive and dangerous, which can be addressed by becoming a Better Programmer.

    [ Parent ]

    GC. (none / 0) (#299)
    by tkatchev on Mon Feb 09, 2004 at 03:27:05 PM EST

    Of course GC is good.

    Unless you're competent enough to program your own garbage collection scheme, manual memory management will be buggy, unstable, leaky -- and, what's important, much more inefficient. (Because "malloc" and "free" are, in fact, very expensive calls and need to be optimised aggressively.)

    But of course, to program the original garbage collector you need a language that doesn't support it. It takes all kinds.

    (Me, I prefer to simply declare all arrays and data structures as static when programming in C. As an added bonus, you can skip linking the standard library when you do that.)

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    GC vs malloc (none / 0) (#324)
    by irwoodhouse on Mon Feb 09, 2004 at 05:13:59 PM EST

    "malloc" and "free" are, in fact, very expensive calls and need to be optimised aggressively

    GC introduces overhead just as does malloc-based management. Whether that overhead per se is less depends on the frequency of GC. However, as GC rewrites multiple pages of memory, you risk further overhead by touching the LRU algorithm the OS is using to manage those pages. In a large memory space, you may even force page-in from otherwise very inactive virtual memory.

    It would also be interesting to see the effect of running a GC language on a GC OS (e.g. MacOS prior to OS X) where the schemes may clash.

    manual memory management will be buggy, unstable, leaky

    Malloc is one of the most studied and debugged functions in the C library. That being said, any general-purpose memory management system can be made to fail or perform badly. There is a paper to this effect, though I haven't the reference to hand. It categorises allocation behaviour as "peaks", "ramps", etc. and has graphs showing the memory load, if that rings a bell.

    Dynamic data structures can't be declared static. If you don't use these, you have two opposing problems: not making the arrays big enough and running out of space, or making them too big and wasting memory.

    Me, I prefer to build structs whose sizes are multiples of one another where possible, which reduces fragmentation in the first place by simplifying malloc's job. In several malloc implementations this is done for you, using fragments or multiples of the physical page size.

    As an added bonus, you can skip linking the standard library when you do that

    Not unless you also don't call printf(), strcmp(), isalpha(), and everything else. And in any case, this is a non-issue with shared libraries if you're thinking of saving space. In fact I believe that some Unixes (solaris?) automatically link the C library because that's where crti.o and friends live which contain the process startup/teardown code.

    Relying on things like GC is the same as relying on the RTE of Pascal - you're better off being a better programmer in the first place.

    to program the original garbage collector ...

    May I reference the original koan here regarding building better garbage collectors by making them self-referencing?

    [ Parent ]

    Malloc (none / 2) (#337)
    by joto on Mon Feb 09, 2004 at 06:25:26 PM EST

    Me I prefer to build structs whose sizes are multiples of one another where possible, which reduces fragmentation in the first place by simplifying malloc's job. In several malloc implentations this is done for you using fragments or multiple of the physical page size.

    In that case, if you are worried about using a bad malloc implementation, you'd better write your own, or download any of the many free ones. Or write your own per-size-alloc/free functions (guaranteed savings of one word per object!).

    Building structs which are multiples of one another doesn't really work, because all allocations from malloc are padded (at least the block size needs to be there), and unless you know exactly how much padding there is (typically one or two words), and take that into account, those multiples aren't really multiples. Also, if your malloc implementation likes fixed sizes, they don't necessarily have to be the same sizes you decide upon.

    Besides, if your malloc implementation really cares about object sizes, it should be able to pad your requests to its own favourite size by itself. It isn't exactly the hardest problem in computer science. So, either trust malloc, or roll your own memory management system. Fine-tuning the sizes of your structs is just foolishness.

    [ Parent ]

    Malloc. (none / 0) (#342)
    by tkatchev on Mon Feb 09, 2004 at 06:55:49 PM EST

    In any case, managing memory manually is so hard and requires so much intricate knowledge to do properly that you'd better leave memory management to the runtime.

    Modern garbage collectors manage memory much better than the average C coder does by hand. You have to be very knowledgeable in the intricate workings of your stdlib to surpass the garbage collector in this matter, and normal people should never bother.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Ummm (none / 0) (#350)
    by kraant on Mon Feb 09, 2004 at 07:44:32 PM EST

    There are ways to write code where memory allocation isn't a huge issue - most filters, for example. All you need to do is either operate on the input stream in one pass, or load all the input at startup, allocate all memory then, and free it when the program finishes.
    --
    "kraant, open source guru" -- tumeric
    Never In Our Names...
    [ Parent ]
    Exactly. (none / 0) (#402)
    by tkatchev on Tue Feb 10, 2004 at 08:16:40 AM EST

    But in this case it doesn't matter whether or not you have garbage collection -- you're not really allocating memory anyways.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    The other side of it (none / 0) (#472)
    by kraant on Tue Feb 10, 2004 at 06:10:39 PM EST

    Filters generally need to be fast. There's little to no memory allocation. It's a perfect application for C. ;)

    But seriously, a fairly large class of applications can be trivially written this way - most command-line apps, for example. I just find it funny that people complain about the lack of GC in C when, most of the time when I'm coding in C, I barely use malloc(), free() etc. anyway.
    --
    "kraant, open source guru" -- tumeric
    Never In Our Names...
    [ Parent ]

    Eh? (none / 1) (#468)
    by ZorbaTHut on Tue Feb 10, 2004 at 05:09:07 PM EST

    What are you smoking? :P

    Memory management is hard if you have a design that implies hard memory management. If you don't do lots of crazy manual allocation and passing items by pointer - in other words, if you have it set up so that every object is owned by precisely one other object or chunk of code, which is responsible for deleting it, and ideally does so automatically even in the case of failure (which is easy with C++, since objects going out of scope = destructor, and destructor should = deleting contained objects), then memory management is dead easy.

    Yes, copying things by value wastes a bit of RAM and a bit of CPU, but who cares? At least you don't have to worry about modifying your copy - or, alternatively, duplicating your object every time you want to modify it.

    Or, if you want to be *really* clever, it's not hard to set up a basic refcounting-and-copy-on-change system in C++ with - and here's the fun part - the exact same semantics as pass-by-value. Meaning all your code just plain works, even after you make what is - let's be honest here - a pretty gigantic change to the object's functionality.
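    In code, that ownership discipline costs about this much (a throwaway example class, written without anything fancier than plain 1998-vintage C++):

        #include <cstdio>

        class File {
            std::FILE *fp_;
        public:
            explicit File(const char *name) : fp_(std::fopen(name, "r")) {}
            ~File() { if (fp_) std::fclose(fp_); }   /* runs on scope exit */
            std::FILE *get() const { return fp_; }
        private:
            File(const File &);              /* copying disabled: declared, never defined */
            File &operator=(const File &);
        };

        void read_something()
        {
            File f("data.txt");         /* hypothetical file name */
            if (!f.get()) return;       /* open failed; destructor still runs */
            /* ... use f.get() ... */
        }                               /* file closed here, even on early return */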

    [ Parent ]
    Well, guess what... (none / 1) (#490)
    by tkatchev on Wed Feb 11, 2004 at 05:28:07 AM EST

    ...you've just implemented a crappy and very inefficient garbage collector.

    What's the point? Wouldn't it have been better to just use a decent and stable garbage collector that doesn't suck in the first place?

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Eh again? (none / 1) (#517)
    by ZorbaTHut on Wed Feb 11, 2004 at 04:37:41 PM EST

    I hate to say it, but I don't see how "deallocate stuff precisely when it is no longer useful and no later, using virtually no CPU cycles" can possibly be considered crappy or inefficient.

    How could you get more efficient or effective than that?

    Unless you're looking at the whole pass-by-value vs. pass-by-reference thing, in which case it's worth pointing out that Java, at least, is no better - you simply don't have the option to pass by value, you *have* to pass by reference, and if you want the same copy-on-change semantics it's just as hard - if not harder - than doing the same thing in C++.

    But, yes, you don't have to worry about refcounting, that's true.

    I still don't get it though. My programs spend practically zero time in allocation, use very little excess RAM, and don't leak. It ain't hard.

    [ Parent ]

    Let me explain. (none / 0) (#519)
    by tkatchev on Wed Feb 11, 2004 at 06:35:16 PM EST

    There are lots and lots of reasons why "deallocating when it is no longer useful and no later" is a very stupid and inefficient way of handling memory.

    I'll give you just one, very simple and obvious explanation: both "malloc" and "free" are actually very complex and time-consuming operations. (Contrary to what 90% of C kewlhaxurs believe, for whom allocating and freeing memory is some sort of magic fairy procedure that instantly returns unlimited amounts of RAM.)

    When you call malloc and free repeatedly one after another, you end up fragmenting the memory heap. Not only does successive allocation and freeing become much slower, you also end up wasting much more memory on useless tiny fragments.

    Any modern garbage collector will defragment memory blocks along the way and free memory in one clean, fast pass. Also, you can find a proper time for cleaning memory so as not to disturb other functions that depend on allocation and freeing of memory. (i.e., you never have to wait while the allocator searches for a free spot in the extremely fragmented heap while in an inner loop.)

    Again, like I said: proper memory management is extremely hard. Almost nobody who writes C manages memory efficiently, and those that do just end up reimplementing their own version of a crappy garbage collector.

    P.S. There are lots and lots of other issues with memory management; I just brought up some of the most obvious and braindead of them.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Optimize the Programmer, not the Code (none / 2) (#373)
    by hardburn on Mon Feb 09, 2004 at 11:28:15 PM EST

    GC introduces overhead just as does malloc-based management.

    Modern GCs are quite efficient, but ignoring that for the moment, consider:

    • Cost of 256 MB RAM: $50 (DDR4000)
    • Cost of 3.0 GHz Xeon Processor: $404
    • Salary of Programmer per year: $40k and up

    (Prices of hardware taken from PriceWatch.com).

    Insanely fast computer hardware is dirt cheap. If you're a company, the hardware itself is a small fraction of the cost. It makes good business sense to improve your programmers' efficiency instead of the efficiency of the code they write. You can go overboard and suck up so many resources that the system doesn't run well, but you can always profile your code and modify it later if you need to.

    Admittedly, this same philosophy probably shouldn't be applied to C. Though it can be improved, C does a good job of filling in the portions where speed really does matter. Further, when talking about GC, it's usually better that your language supports it from the start instead of bolting it on later. So GC is likely something that shouldn't go into C. However, that's not a good reason to leave it out of other languages.

    If you're a programmer writing Free Software and care more about intellectual satisfaction than "business sense", I would argue that higher-level languages offer plenty to satisfy a curious mind. I was quite astounded when I first figured out how the Schwartzian Transform works (a common sorting idiom in Perl), or saw an example of how ML's strong type system was able to catch an infinite loop bug.
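    For the curious, the idea behind the transform is decorate-sort-undecorate; a rough C++ translation (the function and the sort key are my own choices - in Perl it's a map/sort/map one-liner) computes each sort key once rather than once per comparison:

        #include <algorithm>
        #include <cstddef>
        #include <string>
        #include <utility>
        #include <vector>

        std::vector<std::string> sort_by_length(const std::vector<std::string> &in)
        {
            /* decorate: pair each value with its precomputed key */
            std::vector<std::pair<std::size_t, std::string> > dec;
            for (std::size_t i = 0; i < in.size(); ++i)
                dec.push_back(std::make_pair(in[i].size(), in[i]));

            std::sort(dec.begin(), dec.end());   /* pairs sort by key first */

            /* undecorate: strip the keys back off */
            std::vector<std::string> out;
            for (std::size_t i = 0; i < dec.size(); ++i)
                out.push_back(dec[i].second);
            return out;
        }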


    ----
    while($story = K5::Story->new()) { $story->vote(-1) if($story->section() == $POLITICS); }


    [ Parent ]
    Another way of looking at this problem (none / 2) (#374)
    by smallstepforman on Mon Feb 09, 2004 at 11:44:41 PM EST

    Say you run an application not written in C (with garbage collection, JIT compilation, etc.), with a performance penalty of 25% compared to the C version.  This application is developed twice as fast as the C application.  As far as the software company is concerned, it makes business sense to develop in Java/.Net, since you can produce the same software faster (less development time), hence saving on the $40K+ developer wages.

    On the other hand, imagine you're a business which has to use the previously mentioned software.  Do you want to run an application from company A which runs 25% slower, or do you want to run an application from company B which is faster but costs twice as much?  At the end of the day, your employees (who are paid $20-$30 an hour) spend more time twiddling their thumbs, waiting on an hourglass etc. for many hours a week, maybe even for several weeks a year.  Does this improve the bottom line?

    At the company I work for, we are forced to use massive bloated apps which run like snails.  I spend an enormous amount of time waiting for calculations to finish, waiting for the network, waiting on unnecessary locks/semaphores, waiting on an OS which has a global lock whenever one of its applications accesses the network (hence rendering the rest of the system unusable).  My company is paying me major dollars to sit on my arse staring at an hourglass.  If they purchased tools which were twice as expensive but faster, that may have been better for their bottom line.

    I'm an old school engineer who doesn't say "memory is cheap" or "CPU power is cheap", I say "employee labour is expensive", hence I'd rather use the efficient/faster tools which cost 2x more than cheaper tools which are bloated and slow.

    [ Parent ]

    Good Argument for Profiling (none / 0) (#417)
    by hardburn on Tue Feb 10, 2004 at 09:04:49 AM EST

    This is why profiling is important. The company with the cheaper software could profile their code and optimize specifically those sections which are resource-critical. The resulting Java/C#/whatever code probably still won't be as fast as a C/C++ solution, but it should be fast enough that employees aren't just sitting there. Further, the solution likely won't have taken as much development time as the C/C++ solution, even after profiling.


    ----
    while($story = K5::Story->new()) { $story->vote(-1) if($story->section() == $POLITICS); }


    [ Parent ]
    Employees just sitting there. (none / 0) (#431)
    by tkatchev on Tue Feb 10, 2004 at 11:25:11 AM EST

    95% of the wasted time is in reading crappy websites and taking coffee breaks.

    Sheesh, have any of you ever actually held a real job?


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    It all depends on where the slowdown is. (none / 0) (#440)
    by squigly on Tue Feb 10, 2004 at 12:57:07 PM EST

    Point 1: Most tasks that people do on a day-to-day basis are not performance-critical.  It doesn't matter if a system takes 50ms to complete a task that could be done in 10ms; there's no time saved - the bottleneck is the user.

    Point 2: Most applications are not consumer-level desktop apps.  I've worked on quite a number of applications, and precisely 2 were intended for release.  The rest were internal tools intended for a single purpose within the company.

    Even with consumer level apps, there is something to be said for faster development.  Most of the causes of slowdown are not because of the language used, but because of poor algorithms.  Faster development gives more time to fix the problems of slowdown.


    [ Parent ]

    Those numbers don't always work. (none / 1) (#377)
    by Kal on Tue Feb 10, 2004 at 12:18:33 AM EST

    If you're a company, the hardware itself is a small fraction of the cost.

    What about systems that don't run on commodity hardware? The cost of a faster Sun machine to handle your GC could easily be $10K or so.

    As an example the system I work on is largely written in C++ because we need it to run fast on the hardware we have available; 20-30 Sun workstations and servers ranging from 450Mhz up to 1Ghz. The cost for replacing those machines is prohibitive, both because of the price and because the hardware and maintenance you get on the Suns is just better than you'd get on commodity hardware.

    [ Parent ]
    Still Cheaper for Hardware (none / 0) (#416)
    by hardburn on Tue Feb 10, 2004 at 08:58:50 AM EST

    Your hardware costs have to be several hundred thousand per year (possibly in the millions) before the hardware becomes more expensive than a team of programmers. If you have such a situation, then yes, you need to balance the programmer/resources ratio in favor of the computer. Most companies don't have that situation, so C/C++ is a waste for them.


    ----
    while($story = K5::Story->new()) { $story->vote(-1) if($story->section() == $POLITICS); }


    [ Parent ]
    I prefer (none / 1) (#330)
    by ZorbaTHut on Mon Feb 09, 2004 at 05:20:36 PM EST

    simply setting everything up so it cleans up automatically when it goes out of scope. 99% of the time, that's trivial with C++ destructors - most of the time I can just use the library destructors, in fact.

    Occasionally you need to do something more clever. But rarely.
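    For instance, the library containers already do all the bookkeeping (a trivial sketch):

        #include <string>
        #include <vector>

        void build()
        {
            std::vector<std::string> lines;   /* manages its own storage */
            lines.push_back("no manual free needed");
        }   /* the vector and its strings are released here automatically */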

    [ Parent ]

    German vs. Californian (none / 0) (#310)
    by tgibbs on Mon Feb 09, 2004 at 04:03:09 PM EST

    I see the Pascal vs. C debate as a clash of cultures. Both arose out of the structured programming concepts of the time, but they come with very different cultural baggage. Pascal is a very "Germanic" programming language: "Ve haf vays of making you write structured code". Everything not compulsory is verboten. Pascal protects you from yourself.

    C has a very Californian outlook: "Hey, do your own thing, man!" If you want to write comprehensible, structured code, C supports that. If you want to write "Hey, betcha can't guess what this statement does?" code, C supports that too. You're a responsible adult, make your own decisions. But if you get into trouble, don't come running to us for help...

    I tend to think that everybody should spend some time programming in Pascal before being turned loose on C.

    [ Parent ]

    Germany vs California (none / 0) (#483)
    by Pseudonym on Wed Feb 11, 2004 at 12:12:08 AM EST

    Pascal is a very "Germanic" programming language
    C has a very Californian outlook

    That's actually a really good analogy.

    Pascal is like a great European work of art. It's austere, refined and has a beauty which many can admire. At the same time, it serves little practical purpose and doesn't relate to the masses.

    C is more like a Hollywood movie. It's cheap, it's good for quick 'n dirty tasks where you don't want to think too hard. On the other hand, it is a throw-away item with no inherent beauty, it appeals only to our baser instincts and it does nothing for our intellect.


    sub f{($f)=@_;print"$f(q{$f});";}f(q{sub f{($f)=@_;print"$f(q{$f});";}f});
    [ Parent ]
    C's real beauty (none / 0) (#535)
    by Geno Z Heinlein on Thu Feb 12, 2004 at 12:24:08 PM EST

    C is more like a Hollywood movie. It's cheap, it's good for quick 'n dirty tasks where you don't want to think too hard. On the other hand, it is a throw-away item with no inherent beauty, it appeals only to our baser instincts and it does nothing for our intellect.

    I'll respectfully disagree with that. The beauty of C is that C is the HLL that gets you closest to the machine. Pascal versus C is apples and oranges: one is intended to model the computer, one is not.

    The pro-Pascal people remind me of the "natural language programming" advocates. I always get the impression they want the computer to take care of things for them. The real problem, of course, is that you can only program a computer to the extent that you understand and think like the computer. Pascal and NLP are both ways of saying, "Listen, I don't want to deal with this; you take care of it for me."

    (And, yes, I know that the same argument rages regarding the original referents. There, too, "austere" and "refined" creep me out. There are people who think Terminator 2 is not a profound film because it's painted in broad strokes of its own colors, and not delicate or sophisticated in the traditional art school way. I find those people kind of negative; I wish they'd just pull a gun and start shooting instead of sucking the life from me a little at a time.)

    Geno Z Heinlein
    [ Parent ]

    Close to the machine (none / 0) (#545)
    by Pseudonym on Fri Feb 13, 2004 at 01:26:29 AM EST

    I'll respectfully disagree with that. The beauty of C is that C is the HLL that gets you closest to the machine.

    True enough, but it's all a matter of degree. C doesn't let you tamper with register allocation, pipelining, d-cache optimisation or anything of that sort. It also doesn't, by default, give you access to your CPU's more advanced features (e.g. the Pentium SIMD instructions). It doesn't even give you any control over how you want to call routines in shared libraries.

    The difference is not that C programmers want to be close to the machine and everyone else doesn't. It's actually a matter of how close. Personally, if I want to be close to the machine, I use C++, because it lets me get close where that matters and far away where that matters. Mixed-language programmers have it best, though.


    sub f{($f)=@_;print"$f(q{$f});";}f(q{sub f{($f)=@_;print"$f(q{$f});";}f});
    [ Parent ]
    Languages used to implement OSes and Compilers (none / 1) (#338)
    by Tjalfi on Mon Feb 09, 2004 at 06:30:51 PM EST

    Many compilers are bootstrapped in their own language: Haskell, Common Lisp, Pascal, ML. For some languages a compiler is the first and last  significant app that's written.

    The following languages have been used to implement commercially available OSes:

    Ada - Rational systems had an Ada machine which ran an OS written in Ada.

    Algol60 - An extended form was used for HP's MPE/ix and on Burroughs mainframes.

    Algol68 - Xerox Dorado?

    BLISS - OpenVMS used this until 1996 when development shifted to C (it was getting hard to find BLISS programmers).

    C++ - IBM's OS/400, the Windows (NT, 2k, 2003) GDI

    Lisp - Symbolics Genera, LMI's OS, and TI Explorer

    Mesa - Xerox Alto

    Modula-3 - University of Washington's research OS SPIN.

    Oberon - There are a couple of research OSes written in Oberon from Wirth's institute.

    Pascal - Apollo's Domain OS was written in Pascal.

    PL/I - Multics, Stratus' VOS

    PL/S - Several IBM mainframe OSes.
    "With energy and sleepless vigilance go forward and give us victories." - Abraham Lincoln to Major General Joseph Hooker, 1863
    [ Parent ]

    C++ in Windows (none / 0) (#551)
    by demon on Sat Feb 14, 2004 at 08:16:07 AM EST

    C++ - IBM's OS/400, the Windows (NT, 2k, 2003) GDI

    Actually, Windows' GDI (and all of Windows, basically) is still implemented in C. It was so from day 1, especially with NT, for portability purposes. (Yes, that's since gone down the toilet, but...) The only C++ that's used is for user apps, and that often involves the evil abomination that is the MFC class libraries.

    [ Parent ]
    Re: C++ in Windows (none / 0) (#552)
    by Tjalfi on Sat Feb 14, 2004 at 06:10:28 PM EST

    I believe you're mistaken on this. Showstopper, a mediocre book about the development of Windows NT, mentions that C++ was used for the GDI on pages 70, 87-88, 164, 233 (I happened to have the book sitting 3m from this computer). Michael Abrash also mentions the use of C++ in this video (evil Realmedia format). You're correct that the rest of the kernel code is in C, albeit an extended version with exception handling. A friend of mine who works on NT device drivers at the University of Washington said that it's a reasonable system.

    If you're using C++ for Windows development, WTL is a pretty good library although it's unsupported and not particularly well documented. It's a template library along the lines of ATL. There's also the Atilla library which is an extension of ATL for applications.


    "With energy and sleepless vigilance go forward and give us victories." - Abraham Lincoln to Major General Joseph Hooker, 1863
    [ Parent ]
    OS Implementations: (none / 1) (#340)
    by jdougan on Mon Feb 09, 2004 at 06:44:34 PM EST

    Not just C, assembler and Oberon have been used to implement an OS.  Just off the top of my head:

      Lisp (Lisp Machines)
      Smalltalk  (Originally Smalltalk on the Alto, others later)
      Forth  (Various embedded systems)
      BLISS (VMS)

    And I seem to recall some of the Algols and PL/1 were used back in the 60's.

    [ Parent ]

    Header files are evil (none / 2) (#272)
    by X-Nc on Mon Feb 09, 2004 at 12:13:23 PM EST

    It's been a while since I was a professional programmer but I still remember coding in COBOL, C/C++, perl, php, shell/sed/awk/etc. and some CICS. At this point in time I have decided that languages which require header files are evil. If you have to include something just to do fundamental or basic tasks then you're screwed. Why do I feel this way? Personal annoyance, I guess. You have to have every bit of every header file memorized in order to really do anything. This is the trouble with C/C++ and, to a lesser extent, perl. php has "include" in it but most of the things you need to do are all base parts of the language. php includes are more like subroutines.

    If I were still a "real" programmer I think I'd break things out this way...

    1. For web apps: php
    2. For system/admin/one-off's: shell & co.
    3. For back-end monster sized apps: COBOL
    4. For large apps: ruby
    That's not to say I would be opposed to using other languages as needed. They are, after all, tools, and each one has a place in the grand scheme of coding.

    The only thing that really bothers me is when someone says that such-and-such language is the "silver bullet" suitable for every possible coding situation. Knew a guy once who claimed that BASIC was the only language anyone ever needed to learn 'cause it could do everything anyone could ever want. I asked him about building an OS with it and he said that was a task that was not needed. "No one would ever want to build an OS." I guess that Linux, the BSD's, MenuetOS and the like are just figments of our collective imagination.

    --
    Aaahhhh!!!! My K5 subscription expired. Now I can't spell anymore.

    Modules (none / 0) (#289)
    by hardburn on Mon Feb 09, 2004 at 01:35:26 PM EST

    While I don't like that any non-trivial C program (and many trivial ones) needs to have some header file imported, I disagree that importing things is evil in itself. It helps break things up. Without this kind of modularization, everything would need to be imported into the default language. Perl already has too much junk in its core (the text formatting should have been a module from the start). I don't want to have to download all of CPAN every time I want to upgrade Perl.


    ----
    while($story = K5::Story->new()) { $story->vote(-1) if($story->section() == $POLITICS); }


    [ Parent ]
    Modularization (none / 0) (#301)
    by X-Nc on Mon Feb 09, 2004 at 03:34:03 PM EST

    True. I should have been more precise in my comments. Modules (or what we old-timers might call subroutines) are definitely a Good Thing<tm>. But defining things separately from the procedural code makes things difficult at best. Trying to memorize all the data structures and macros contained in every .h file is unhealthy.

    --
    Aaahhhh!!!! My K5 subscription expired. Now I can't spell anymore.
    [ Parent ]
    Well (none / 0) (#329)
    by ZorbaTHut on Mon Feb 09, 2004 at 05:19:08 PM EST

    they're just text files. Often you can just, you know, open the file and do a search to find the function you want. No memorization required.

    Alternatively, you can check your docs.

    I guess I don't see the point of saying "It's in the source!" as being any better. You still gotta search for it, or read your docs, it's just in a slightly different place. In some ways I *prefer* having a "reference copy" of the interfaces, that I can look at without having to muddle through the implementation.

    [ Parent ]

    #include and -l (none / 0) (#347)
    by Trepalium on Mon Feb 09, 2004 at 07:22:43 PM EST

    I think a larger problem isn't the include files themselves, but rather that you need to import the function prototype by #including it in the source code, and then also tell the linker which library to link into the executable. The headers can't tell the linker which libraries are required for the program to link successfully, so you're really doing the same task twice.
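
    To make the duplication concrete, here's a minimal sketch (the file name and the choice of the math library are just for illustration):

    /* demo.c */
    #include <math.h>                 /* step 1: prototypes for the compiler */

    int main(void)
    {
        return (int)sqrt(16.0);       /* step 2 happens separately, at link time */
    }

    $ cc demo.c -lm                   <- the -lm names the same library again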

    Changing the C spec at this point to fix this rather minor problem would be a very bad idea, as far as I'm concerned.

    [ Parent ]

    Um, (none / 0) (#376)
    by it certainly is on Tue Feb 10, 2004 at 12:03:59 AM EST

    compiling and linking are two different things; cc is just a front-end to both. I would not like my cc to search all the #include'd files for secret magic words to tell it where the libraries are and what their names are. However, Mac OS X's "-framework" flag is cool in this regard. Adding a framework to the system automatically modifies the include search path, and "-framework <framework name>" automatically adds all the framework's libraries (of course, ld weeds out unnecessary libraries).

    kur0shin.org -- it certainly is

    Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
    [ Parent ]

    Yeah (none / 0) (#432)
    by ZorbaTHut on Tue Feb 10, 2004 at 11:34:38 AM EST

    I don't really disagree here. There are some good things about it being set up this way (for example, being able to choose which library you want to link in) but it wouldn't be hard to set up something where you could override their settings.

    MSVC has had this functionality for a while, incidentally - there's a "#pragma comment(lib, ...)" directive that you can put in a header file (or anywhere else, for that matter) that adds a record to the object file telling the linker to pull in a specific library. Overridable on the command line, of course. I don't think many people use it, though.
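
    For what it's worth, a header using that directive might look like this (the Winsock library is just an illustrative choice):

    /* any source file including this header asks the MSVC linker for ws2_32.lib */
    #pragma comment(lib, "ws2_32.lib")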

    [ Parent ]

    You've been out of the loop. (none / 0) (#311)
    by tkatchev on Mon Feb 09, 2004 at 04:06:41 PM EST

    It shows.

    Nobody uses COBOL or .sh anymore. (Well, unless you are a Kewlinux programmer, and in that case you have bigger problems with adopting modern technology -- things like showers and hygiene and not going insane...)


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Hmm (none / 0) (#359)
    by Kal on Mon Feb 09, 2004 at 08:45:09 PM EST

    I'd agree with you that no one would intentionally write anything in Cobol anymore, but there are plenty of Cobol systems out there that are still active and in need of maintenance.

    As for shell scripting I don't quite see what your problem is with that.

    Of course, seeing your other posts in this article, you're quite opinionated so it may just be that.

    [ Parent ]
    Shell sucks. (none / 1) (#403)
    by tkatchev on Tue Feb 10, 2004 at 08:17:32 AM EST

    Really sucks.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Ah yes. (none / 0) (#410)
    by Kal on Tue Feb 10, 2004 at 08:36:19 AM EST

    Thank you for the insightful and useful comment. How could I not have seen the uselessness of shell scripting.

    [ Parent ]
    Uselessness? (none / 0) (#413)
    by tkatchev on Tue Feb 10, 2004 at 08:52:13 AM EST

    A stone hammer is also very useful, but you're a dumbass if you try to build a house with a stone hammer.

    Though undoubtedly there are lots of real manly men programmers that would claim that real men only use stone hammers.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Nice analogy... (none / 0) (#419)
    by Kal on Tue Feb 10, 2004 at 09:08:18 AM EST

    That also goes nowhere near answering the original question I asked. What problem do you have with shell scripting, other than a vague "it sucks" or "it's old" argument?

    [ Parent ]
    But it really does suck. (none / 0) (#428)
    by tkatchev on Tue Feb 10, 2004 at 11:20:00 AM EST

    It accomplishes nothing useful, except with orders of magnitude more work, pain and suffering than practically anything else out there.

    Its only redeeming feature is that it is almost as common as other forms of refuse.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    and you would suggest using ....? [nt] (none / 0) (#433)
    by needless on Tue Feb 10, 2004 at 11:52:02 AM EST



    [ Parent ]
    Sheesh, even Perl would be better. (none / 0) (#438)
    by tkatchev on Tue Feb 10, 2004 at 12:40:08 PM EST

    Anything, really.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    I'd have to disagree. (none / 1) (#462)
    by Kal on Tue Feb 10, 2004 at 04:43:52 PM EST

    It's very useful for what it does. It's often far easier to write a quick shell script than it is to write a script in Tcl or Perl to do the same function. I'm sorry if you don't like it but I think you're just wrong.

    [ Parent ]
    I'd have to disagree. (none / 1) (#463)
    by tkatchev on Tue Feb 10, 2004 at 04:47:24 PM EST

    Anything more complex than just "run such-and-such command with such-and-such arguments" would be much better written in Perl or Python or whatever.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    In my experience... (none / 1) (#470)
    by Kal on Tue Feb 10, 2004 at 05:50:24 PM EST

    People that think like that don't really know how to use their operating system. In addition, what you've said here has no more value than your other comments. You give no reasons, merely opinions. I could easily say that everything written in Perl or Python should be written in Tcl instead but it wouldn't be a correct or useful statement.

    Of course, judging by your other comments in this article, actually trying to have a meaningful conversation with you is fairly useless.

    [ Parent ]
    Whatever. (none / 0) (#491)
    by tkatchev on Wed Feb 11, 2004 at 05:30:15 AM EST

    Look, are you honestly claiming that shell, as a programming language, is as good as Perl or Python?

    If you are, then you're either an idiot or a very stupid troll.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Shell scripts have their place (none / 1) (#495)
    by squigly on Wed Feb 11, 2004 at 07:06:24 AM EST

    Shell scripts are very useful for running several applications in a row, with possible command line parameters, and a couple of simple conditions (e.g. if a file exists, then use application X on it).

    They're not programs.  They are scripts.  They do what they do very well.  C, perl, python will also do the same, but then you need to worry about other things, like making sure you have the right compiler or interpreter.  Generally speaking, you can expect sh to be installed on just about any Linux or Unix machine.  This cannot be said of perl.

    [ Parent ]

    Yes, that's the problem. (none / 0) (#497)
    by tkatchev on Wed Feb 11, 2004 at 08:54:21 AM EST

    Basically, it boils down to distributions being braindead.

    Somebody finally needs to make a distribution without bash. I'd certainly support them.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    It's called BSD. (none / 0) (#505)
    by it certainly is on Wed Feb 11, 2004 at 12:52:35 PM EST

    You're welcome.

    kur0shin.org -- it certainly is

    Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
    [ Parent ]

    Huh? (none / 0) (#508)
    by tkatchev on Wed Feb 11, 2004 at 01:12:31 PM EST

    BSD doesn't have sh? That would be news to me.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    It doesn't have bash, (none / 1) (#526)
    by it certainly is on Wed Feb 11, 2004 at 09:39:27 PM EST

    it has the original /bin/sh, and it doesn't have /bin/sh symlinked to /bin/bash like most Linux distros do.

    kur0shin.org -- it certainly is

    Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
    [ Parent ]

    In addition... (none / 0) (#500)
    by Kal on Wed Feb 11, 2004 at 09:27:01 AM EST

    Generally, the majority of what you want to do with a shell script, and with a lot of perl scripts I've seen, is readily available from the command line via built-in macros, utilities on the system, and pipes. I'd say if you're writing something in Perl instead of a shell language, you're not using the OS properly, as it should have the majority of the stuff you need already written. The shell script just chains it all together and does simple logic on input and output.

    [ Parent ]
    Depends. (none / 0) (#501)
    by Kal on Wed Feb 11, 2004 at 09:30:42 AM EST

    If you're writing Perl or Python to avoid writing a shell script, then yes, shell is equally good. I only use a scripting language when things get more complicated than what shell is good at, or when I need something with more structure.

    As an example, on the system I work on I have a shell script wrapped around make to fill out a series of environment variables and handle custom command line arguments. I've looked at it a number of times and thought about rewriting it in Tcl, but each time I do I realize it's smaller, simpler, and easier to understand as a shell script.

    [ Parent ]
    COBOL today (none / 0) (#560)
    by X-Nc on Thu Feb 26, 2004 at 05:00:42 PM EST

    Actually, there's more new COBOL code being written today than in any other language except C. A lot of people still believe that COBOL is what they saw in 1968. It's actually been evolving quite well, with the latest standard dated 2002. Today's COBOL is more OOP than C++ and cleaner than python or perl. Now, it's not a language one would develop an OS or write the next great embedded tool with. But if you have loads of data that needs to be crunched and mangled and spat back out, you can't find anything better than COBOL for the task.

    --
    Aaahhhh!!!! My K5 subscription expired. Now I can't spell anymore.
    [ Parent ]
    Two Factual Errors (none / 3) (#278)
    by gte910h on Mon Feb 09, 2004 at 12:32:10 PM EST

    On Library Size:

    You don't have to include a whole header if you only want a couple of functions. Just prototype them. This is a common practice when working with multiple architectures, some of which may not implement some functions.

    And the entire library is only built into the executable if you use your linker improperly. Almost all modern linkers can do function-based linking, only linking in the functions that may be called in the execution of the application. (Yup, you have to statically link to do this, but if you were looking to decrease execution footprint, you're probably already doing that.)
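
    A minimal sketch of the first practice (puts() is merely an example; normally it comes from <stdio.h>):

    /* declare only the one function we use, instead of including the header */
    int puts(const char *s);

    int main(void)
    {
        puts("no header included");
        return 0;
    }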

    On Multiple Functions that do the same thing:

    This is called "history": it has to do with the multiple places C was used, and now we'd like code from all of those development families to use the same compiler. If you look at most late-80s to early-90s systems, you will find only one of the two or three functions you describe.

             --Michael

     

    C is good for certain tasks. (none / 3) (#280)
    by gte910h on Mon Feb 09, 2004 at 12:51:01 PM EST

    C is good for performance-based tasks. I have written things in lisp or java, only to have to redo chunks of them in C when some feature was too slow on certain platforms. C+Python or C+Lisp are GREAT programming environments these days (although you have to have a lot of confidence from your boss to be allowed the second).

    I agree that exceptions are useful when writing libraries and the like, but unfortunately no maintainable* language comes close to C/C++ in speed, and C++ libraries can't even be linked into another compiler's C++ applications.

    *as in the "not perl" family of languages.

    C++ is slow. (none / 2) (#312)
    by tkatchev on Mon Feb 09, 2004 at 04:08:59 PM EST

    C++ is actually one of the slowest compiled languages out there.

    Delphi/Oberon/Pascal clones, the ML family, compiled Lisps, etc., are all typically faster than C++.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Are you talking about facts (none / 3) (#319)
    by i on Mon Feb 09, 2004 at 04:49:04 PM EST

    or opinions? What are your sources?

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]
    Read some benchmarks. (none / 3) (#323)
    by tkatchev on Mon Feb 09, 2004 at 05:05:28 PM EST

    Whatever.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    I have... (none / 2) (#334)
    by joto on Mon Feb 09, 2004 at 06:00:14 PM EST

    My conclusion is that C++ is fast. As is C. There are hardly any contenders. Look at e.g. the old computer language shootout.

    Delphi/Oberon/Pascal/Modula-1/-2/-3/Ada/whatever should probably be just as fast, as they are just about the same language as C or C++, with a different syntax. They do have some advantages in calling conventions, etc. (C's is patently stupid), as well as some disadvantages (bounds checking, etc.). But in the end, it's more likely to be a question of how much effort you put into writing your backend. And due to market competition, C and C++ win hands down.

    ML is fast in theory, but in practice, only ocaml lives up to the claim. Probably for the same reasons as above.

    For compiled lisp, by the time your program starts to get comparable in speed to C, it no longer looks like lisp. Ok, there are smart compilers doing type-inference and so on, but writing fast lisp still takes a lot of work, while it's the default in C/C++.

    Fortran does have a real advantage. But it's not my idea of what I would like to write in. Then again, I haven't invested much effort in learning anything newer than Fortran 77.

    Java and .NET are still slow.

    But perhaps the most important thing, is that when language implementation X claims to be faster than something (usually on some obscure benchmark), it's C (or C++) they compete against. There's a reason for that.

    [ Parent ]

    C calling convention (none / 1) (#344)
    by i on Mon Feb 09, 2004 at 07:01:24 PM EST

    isn't stupid. At all.

    That particular computer language shootout is out of date and/or incorrect. I measured a couple of C and C++ programs from it myself. The difference in speed is much smaller than the benchmark claims. In every case where C++ is slower, it can be attributed to the C++ standard library; and in every such case, the C standard library doesn't contain comparable functionality.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    C calling convention... (none / 2) (#348)
    by joto on Mon Feb 09, 2004 at 07:32:10 PM EST

    Is stupid. It dictates that the caller pop arguments, which results in code bloat, since almost every function is called from more than one place. The reason for this is that early C didn't have prototypes. And it's needed for varargs, but varargs is rare, and could have a special calling convention anyway.

    Also, C lacks out-parameters, meaning that out-parameters, or parameters "passed by reference", must be implemented by passing the address, which also results in suboptimal code. References in C++ could theoretically fix this, but I'm not aware of an implementation where they actually do. Then again, that might be due to my own ignorance.

    On x86 in particular, no arguments will ever be passed in registers unless you use a nonstandard calling convention (__fastcall on Windows, __attribute__((stdcall, regparm(3))) with gcc). By investing at least a second of thought, you should also come to the conclusion that not passing arguments in registers is always suboptimal. It probably made sense back when x86 was an accumulator machine, though...
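
    A hedged sketch of the gcc workaround just mentioned (x86 only; the attribute and example function are purely illustrative):

    /* regparm(3): the first three integer arguments arrive in EAX, EDX, ECX
       instead of on the stack */
    int __attribute__((regparm(3))) add3(int a, int b, int c)
    {
        return a + b + c;
    }

    int main(void)
    {
        return add3(1, 2, 3);
    }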

    As for the computer language shootout, yes, it certainly has its flaws. What would you expect? It's still the best resource for coming up with arbitrary benchmark numbers in this respect :-)

    [ Parent ]

    The caller pops arguments. (none / 1) (#351)
    by i on Mon Feb 09, 2004 at 07:49:19 PM EST

    This results in about two instructions per call, which may or may not be a large amount of code. But if a function calls many other functions, it may defer popping arguments until the last call. This is slightly faster.
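
    A small test case for the deferred pop, assuming gcc on 32-bit x86 (compile with "gcc -m32 -O2 -S defer.c" and read the assembly; the file name is invented):

    /* defer.c */
    __attribute__((noinline)) int add(int a, int b) { return a + b; }

    int caller(void)
    {
        /* under cdecl the caller removes the pushed arguments; it may clean
           up after each call, or once after the whole sequence */
        return add(1, 2) + add(3, 4);
    }

    int main(void) { return caller(); }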

    How are out-parameters normally implemented in a language that has them?

    On x86 arguments may not be passed in registers, but that's x86 convention, not C convention. On other architectures arguments get passed in registers jolly well.

    As for the computer language shootout, I would expect people to stop quoting it at once. If you need arbitrary numbers, use drand48(). If you need meaningful language comparison, you ought to compare languages meaningfully.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    Another reply (none / 1) (#357)
    by joto on Mon Feb 09, 2004 at 08:20:54 PM EST

    But if a function calls many other functions, it may defer popping arguments until the last call. This is slightly faster.

    Good point. My objection would be to ask: what is the common case (at runtime, not in lines of code)? Then again, it isn't many percent anyway...

    How out-parameters are normally implemented in a language that has them?

    I would expect that if by "normally" you mean "traditionally", then it would probably be just as stupid. If you mean in a modern language, or an implementation that didn't have to care about backwards compatibility, with a real native-code backend (not C or a VM), then I would assume they would use registers, if possible.

    [ Parent ]

    eheheh (none / 1) (#385)
    by 49399 on Tue Feb 10, 2004 at 04:23:24 AM EST

    increase the weight of the test called "Exceptions" and watch C++ sink, sink, sink...

    [ Parent ]
    Benchmarks are useless. (none / 2) (#335)
    by i on Mon Feb 09, 2004 at 06:06:05 PM EST

    Show me some real programs. A physics engine. A LAPACK implementation. Whatever. I don't really care how long it takes to compute the Ackermann function.

    By the way, those benchmarks tend to compare various compilers against g++, which doesn't quite produce state-of-the-art optimised code and has a relatively slow (even for C++) IO library.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    My point exactly. (none / 2) (#343)
    by tkatchev on Mon Feb 09, 2004 at 06:58:54 PM EST

    So STFU, C++ advocates.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Your point is (none / 2) (#346)
    by i on Mon Feb 09, 2004 at 07:07:25 PM EST

    what exactly?

    You say: "c++ is a slow language".
    I say: "g++ is not the best c++ compiler out there".
    You say: "exactly, so STFU".

    I beg your pardon? That's the sort of logic they teach now in universities?

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    Uh, I'll do this slowly. (none / 1) (#404)
    by tkatchev on Tue Feb 10, 2004 at 08:21:36 AM EST

    You can't claim that C++ is fast, just like I can't really claim that it is slow.

    From now on, I consider all claims that C++ is "efficient" to be retarded trolls.

    (BTW, there are lots and lots of claims circulating where people say that C++ is 10, 20, 100 or however many times slower than Fortran, Delphi or Oberon. Why shouldn't I trust those people, if you don't even have real benchmarks for C++?)


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Uh, no, I can. (none / 1) (#443)
    by i on Tue Feb 10, 2004 at 01:53:32 PM EST

    And so can you. Download Intel's C++ compiler for x86, or the latest stable version of GCC. Download the benchmarks in question. Check them for obvious signs of incompetence; that's an important step. Run them. That's it. That's what I did. You don't have to trust anyone. You can check for yourself. Or you can continue to "trust" (whatever that means) random people on teh Intarweb. It's your choice.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]
    Hey dumbass. (none / 1) (#445)
    by tkatchev on Tue Feb 10, 2004 at 02:14:15 PM EST

    I can provide "benchmarks" for you that "prove" that Oberon is 200 times faster than raw C.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    I'm not sure what point you're trying to make. (none / 0) (#455)
    by i on Tue Feb 10, 2004 at 03:22:26 PM EST

    If there's any, please restate it in plain English. If you absolutely cannot refrain from gratuitous insults and name-calling, please confine them to a separate paragraph or three. Thank you.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]
    My point. (none / 0) (#464)
    by tkatchev on Tue Feb 10, 2004 at 04:47:51 PM EST

    Is that your claim that C++ is "efficient" is laughably unfounded.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    I don't make any such claims. (none / 1) (#469)
    by i on Tue Feb 10, 2004 at 05:16:31 PM EST

    I checked for myself. I've looked at benchmarks, I've run some of them, and I've looked inside the benchmarks I've run. C++ is efficient enough for me, most of the time.

    Note: this is my conclusion. It may or may not be valid for you. The only way for you to know is to check for yourself.

    My other conclusion is that any benchmark that claims hundreds of percent of performance difference between major compiled languages is highly suspicious. But that's probably just me.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    OK. (none / 0) (#492)
    by tkatchev on Wed Feb 11, 2004 at 05:30:52 AM EST

    I agree with you.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Addendum. (none / 0) (#496)
    by i on Wed Feb 11, 2004 at 08:12:18 AM EST

    Fortran is very fast at looping over huge piles of floating-point numbers. This is one area where it can be 10 or 100 times faster than the best C compiler. C++ can be very close to Fortran if you use the right library, like Blitz++, though it's still slower. I don't have access to big hardware or fast Fortran compilers, so I must rely on published benchmarks here.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]
    I was going to avoid this debate, but (none / 0) (#527)
    by Bill Barth on Thu Feb 12, 2004 at 01:33:10 AM EST

    for the love of god please give an example. It's been a long time since the gap between FORTRAN and C was anywhere near 10x. I've seen C kick the crap out of FORTRAN on some systems b/c the FORTRAN compiler hasn't kept up as well. For the most part C is at least as fast as FORTRAN these days unless your C programmer is brain damaged.

    Yes...I am a rocket scientist.
    [ Parent ]

    Well. (none / 0) (#528)
    by i on Thu Feb 12, 2004 at 06:49:49 AM EST

    See e.g. here. That's straight C++, which was six times slower than Fortran, but it shouldn't be very different from C. Here too.

    I don't have access to a high perf Fortran compiler now, so I can't check this stuff.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    That's pretty funny.... (none / 0) (#534)
    by Bill Barth on Thu Feb 12, 2004 at 11:15:06 AM EST

    You'll notice that they don't provide the plain C++ code, and I bet I can tell you why. I'm certain they used a C-style doubly-indexed array:

    a[i][j]=foo;

    That'll certainly kill the C/C++ performance. That or they were using std::vector and comparing it to the blitz::vector and blitz::array (or whatever they're called).

    Either way, those benchmarks are crap. The folks that put up that page don't even have enough integrity to put up the C++ code that they're comparing against. (OK, maybe that's a bit extreme; I've emailed them for the C++ code.)

    I did say that the C programmer had to not be brain damaged. :)

    Yes...I am a rocket scientist.
    [ Parent ]

    All benchmarks are crap. (none / 0) (#538)
    by tkatchev on Thu Feb 12, 2004 at 02:10:55 PM EST

    Of course all languages are fundamentally the same, and whatever can be done in one can be done in any other.

    (BASIC is the fastest language ever invented if one sticks to using PEEK and POKE properly...)


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Really? (none / 0) (#543)
    by Bill Barth on Thu Feb 12, 2004 at 07:00:28 PM EST

    I think that well-done benchmarks are useful in charting our progress and showing where we need to improve. If two languages, using their usual constructs, produce similarly performing code in all but a few cases, then we might look at what the better one does in the cases where it wins, to see what needs to be done to the slower. There's always something to learn.

    That is, BTW, why I wanted to know what the plain C++ version of the code looks like in that example above.

    Yes...I am a rocket scientist.
    [ Parent ]

    That's the point. (none / 0) (#539)
    by i on Thu Feb 12, 2004 at 02:36:44 PM EST

    Look at their Fortran code. Rewrite it in plain C, line by line. Should be easy.

    Now, why is a[i][j]=foo; brain damaged, while A(I,J)=FOO is not?

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    Well... (none / 0) (#542)
    by Bill Barth on Thu Feb 12, 2004 at 06:53:54 PM EST

    So that's what I did. icc and ifc produced very similar run times (within 5%) with maximum optimizations turned on. The C/C++ version was the faster one.

    In C, 'a[i][j]' is a double lookup, always (i.e. *(*(a+i)+j)=foo), because 'int a[m][n];' is roughly equivalent to 'int **a;', as far as the type goes at least. In Fortran, the array sizes are known, always, and the compiler is guaranteed to be able to unfold A(i,j) into a single lookup (A((i-1)*N+j) or the equivalent).

    What the authors of that page did was compare a probably disingenuous implementation in plain C++ (say, using a[][] or a std::vector of std::vectors) with Blitz++, which gives the user something that looks like a(i,j), which is supposedly more 'intuitive'. They do this to make their syntactic sugar (templates) look so much better than what the supposed 'naive user' might do on their first try. The reality is that the C preprocessor could have done the same thing (in that case).

    I think that not providing the plain C++ implementation is a glaring tip-off in this case. I'm sure that this library does great things, and I'd probably use it if it was really necessary. But for this example, they haven't done much special.

    Yes...I am a rocket scientist.
    [ Parent ]

    No, not really. (none / 0) (#546)
    by i on Fri Feb 13, 2004 at 04:21:43 AM EST

    In Fortran, array sizes are not necessarily known, but that's beside the point. Modern compilers try hard to convert indexing inside loops into pointer arithmetic. On x86, Fortran and C code should compile to approximately the same thing.

    Fortran really shines on big vector hardware, which x86 isn't. On a Cray, a Fortran compiler is able to vectorise more loops than a C compiler, because Fortran is a more restrictive language. E.g. two Fortran arrays are either the same array or don't overlap at all. This is not the case with C (unless you use a C99 compiler and the new "restrict" keyword).

    I don't know why Blitz++ arrays are different from plain C arrays in this regard.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    Yes really..... (none / 0) (#548)
    by Bill Barth on Fri Feb 13, 2004 at 11:15:41 AM EST

    I'm not sure what you mean by "[m]odern compilers try hard to convert indexing inside loops into pointer arithmetic." In C there's no difference. Doubly-indexed arrays in C _mean_ something entirely different than they do in FORTRAN. This is especially true for Fortran 77. The C compiler can't arbitrarily convert a double index (i.e. a double lookup) into a single stride-1 access the way the F77 compiler can.

    The Blitz++ template library makes its internal stride 1 access look like a double index. It's syntactic sugar to make C++ look like Fortran. (Nothing wrong with that, BTW.)

    The Cray C/C++ compiler has directives which allow the compiler know that there's no overlap between two arrays and that it can safely vectorize the loop.

    Yes...I am a rocket scientist.
    [ Parent ]

    Pointer arithmetic. (none / 0) (#549)
    by i on Fri Feb 13, 2004 at 06:20:39 PM EST

    I mean the following.

    A loop like this (assuming declarations along the lines of int a[N][M]; int (*a_i)[M]; int *a_ij;):

    for (i = 0; i < N; i++)
        for (j = 0; j < M; j++)
            ...a[i][j]...

    can be converted to a loop like this:

    for (a_i = a; a_i < a+N; a_i++)
        for (a_ij = *a_i; a_ij < *a_i+M; a_ij++)
            ...*a_ij...

    As you can see, the lookup in the inner loop is a single dereference.

    and we have a contradicton according to our assumptions and the factor theorem

    [ Parent ]

    Please do. I'm actually curious. (n/t) (none / 0) (#536)
    by Bill Barth on Thu Feb 12, 2004 at 01:44:04 PM EST


    Yes...I am a rocket scientist.
    [ Parent ]

    In which case... (none / 0) (#390)
    by codemonkey_uk on Tue Feb 10, 2004 at 05:35:10 AM EST

    Why is it that programming contests that place an emphasis on performance are usually won by programs written in C or C++? (For reference, search for "PFC" in the archives of this website).
    ---
    Thad
    "The most savage controversies are those about matters as to which there is no good evidence either way." - Bertrand Russell
    [ Parent ]
    Why? (none / 0) (#405)
    by tkatchev on Tue Feb 10, 2004 at 08:23:45 AM EST

    Because Unix only includes one compiler, the gcc.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Its not one compiler... (none / 0) (#459)
    by gte910h on Tue Feb 10, 2004 at 04:08:32 PM EST

    ...it's a Fortran, C, C++, Java and Objective-C compiler.

    [ Parent ]
    Well, it's sort of a one compiler. (none / 0) (#465)
    by tkatchev on Tue Feb 10, 2004 at 04:48:31 PM EST

    With several front-ends, though.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    There are no 'fast' or 'slow' languages (none / 0) (#547)
    by Shubin on Fri Feb 13, 2004 at 08:43:42 AM EST

    There are compiled and interpreted languages, but this distinction is based only on the particular implementation of a language on particular hardware. There are FORTH machines and LISP machines where C would hardly be an efficient language. C compiles reasonably well onto modern computer architectures, but some modern architectures are also tweaked specifically to make C (or similar languages) more efficient.

    There is a more important point: a good language should encourage good programming practices. C does not. Without good style, it is possible to make any program run forever.


    [ Parent ]
    On Types and Breaks (2.75 / 4) (#286)
    by hardburn on Mon Feb 09, 2004 at 01:27:51 PM EST

    Hello? Flexible casting? Hello?

    With a good type system, you don't need casting. Needless to say, C's type system sucks. Take a look at Strong Typing and Perl, where Dominus gives an overview of why that is, and why the ML languages have a really great type system and thus need no casts at all. This presentation convinced me that I really need to learn OCaml.

    You can use it to break out from nested loops of arbitrary depth by using it with an integer, such as "break 3"; this would break out of three levels of loops.

    I wouldn't hold up PHP on a pedestal here. Doing break by the depth of the loop structure makes the code much more fragile in the event that you need to restructure your loop. A much better way is with the label syntax allowed in Java or Perl, so you can break by name:

    OUTER: foreach my $i (0 .. 3) {
        INNER: foreach my $j (0 .. 3) {
            last OUTER if $j == $i;
        }
    }

    IMHO, you should only code in C when your only other alternative is assembly. Thus, C should be considered purely a low-level language, and any attempt to make it otherwise should be ditched. If your problem can be solved at a higher level, don't waste your time with C.


    ----
    while($story = K5::Story->new()) { $story->vote(-1) if($story->section() == $POLITICS); }


    Just a couple points. (none / 3) (#287)
    by jmv on Mon Feb 09, 2004 at 01:30:55 PM EST

    You seem to bash C in favor of Pascal in a couple of places where Pascal is no better. Note that my Pascal experience dates back to Turbo Pascal, and while there are probably many more extensions now, I doubt the "standardized" language has changed much.

    Strings: What does it matter whether you have two functions or just one (nobody forces you to use both)? Also, Pascal strings have a fixed length, which is why I don't like them (the C++ string class is much better).

    Buffer overflows: The fixed-length Pascal string is about as dangerous wrt buffer overflows.

    Integer overflow without warning: If Pascal does it, it's probably a compiler option (that could be done in C) and it would be *really* slow.

    Portability: Come on, you can write non-portable code in about any language. With C, if you use only C89, your chances are quite good. With Pascal, you depend on the extensions supported by your compiler. Last time I used Pascal, casting pointers was an extension, which means you couldn't even do dynamic array allocation without using extensions (i.e. non-standard code). Last thing: more or less all platforms have a C compiler; that's not quite true for Pascal. On many platforms you can only compile Pascal by using p2c :)

    Trapped in the 1970s: I thought the design of Pascal dated just as much...

    Library size: Ever tried diet libc? You can use it to create static executables that are less than 1k in size. The size of libc is not caused by the language, but by the choices (speed vs. size) that are normally made. There's no reason for the Pascal lib to be smaller or larger than the C lib because of the language.

    OK, I could go on forever, but I better get some work done today.

    Some good points-- but, (none / 3) (#291)
    by valar on Mon Feb 09, 2004 at 01:41:41 PM EST

    The reason for no string class: a) your processor doesn't natively support a string type (OK, so maybe pseudo-strings of up to 255 characters, but that is the best I've heard of); b) there are no classes in C. In C, aggregate types are only loosely coupled with the methods that act on them. In my opinion this is one weakness of C that can no longer be justified as a performance issue.

    As far as them being stored as arrays of characters, that is how every programming language does it-- though some hide it better than others.

    Buffer overflows? Yes. This falls under the "shit happens" school of programming. Buffer overflows are 'easy' in C and hard in other languages. But still very possible (I once demonstrated a buffer overflow in Ada to prove to an Ada developer that it was possible).

    Low level or high level? As a computer engineer, I'm inclined to say everything other than machine code or assembly are high level. That said, C is obviously lower level than C# or Java (my point is helped by the fact that several implementations of the .NET runtime and java virtual machines are written in C). As far as the library being huge-- well, it isn't. It is significantly smaller than the standard library of most modern languages (compare with .net or java, or even perl, python, or ruby).

    I had never seen array subscripting commute like that. In fact, I had to compile a test program before I even believed you. You taught me a new trick to confuse and astound at parties. :)

    Integer overflows-- a) There are several good algorithms that depend on a numerical overflow to know when to stop. b) The reason you get weird numbers, and not zero or some kind of error condition, is that signed numbers in C (and in most computers) are stored as 2s-complement numbers. c) In order to create a runtime overflow-handling system, one would have to add three or four instructions for every arithmetic operation performed. That is a tremendous overhead. d) Because integers can overflow in C, you can treat an array of integers as a larger integer type, if you define your addition operation correctly (unfortunately, this requires an add() function, because you can't overload operators in 'pure C').
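
    A sketch of point (d), using unsigned limbs, where wraparound is well-defined (the function and limb layout are invented for illustration):

    #include <stdint.h>
    #include <stdio.h>

    /* a += b, where a and b are n-limb little-endian big integers */
    void bigadd(uint32_t *a, const uint32_t *b, int n)
    {
        uint32_t carry = 0;
        for (int i = 0; i < n; i++) {
            uint32_t sum = a[i] + b[i] + carry;
            /* the addition overflowed iff the result wrapped below a[i] */
            carry = (sum < a[i] || (carry && sum == a[i])) ? 1 : 0;
            a[i] = sum;
        }
    }

    int main(void)
    {
        uint32_t a[2] = { 0xFFFFFFFFu, 0 }, b[2] = { 1, 0 };
        bigadd(a, b, 2);
        printf("%u %u\n", (unsigned)a[0], (unsigned)a[1]);  /* prints: 0 1 */
        return 0;
    }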

    You can, and in some schools of programming are highly encouraged to, cast the return value of malloc. You can't fflush(stdin) because flushing is for output. I've never heard that gets() is evil.

    Most compilers get confused if you use bad syntax. Some are better than others, but this is a compiler design issue, not a language one.

    If you pass a bad address to puts(), what the hell is it supposed to do? Guess what string you want to output? In more modern languages, the only difference is that the error happens at a different time.

    Hash tables: what hash function should they use to handle all cases well?

    I'll note that in C# (and IIRC, Java) multidimensional arrays work the same way. If you need a jagged array, you compose it using pointers (or references, whatever). If not, you just declare a straightforward multidimensional array.

    Float is not the same as double. I repeat, FLOAT IS NOT THE SAME AS DOUBLE. The reason atof() is called atof is that it returns a float.

    Break X: Yeah, that would be nice, wouldn't it. In C you are presented with two options: use goto and a label (yes, that's right: goto), or set a variable and check it in each of the outer loops. The goto way is probably more readable than either break x or the other C solution, for most situations.
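
    For comparison, a sketch of the second option (the flag checked by each enclosing loop; the loop bounds are arbitrary):

    #include <stdio.h>

    int main(void)
    {
        int i, j, done = 0;
        for (i = 0; i < 4 && !done; i++) {
            for (j = 0; j < 4; j++) {
                if (i + j == 5) { done = 1; break; }  /* break leaves only the inner loop */
                printf("%d %d\n", i, j);
            }
        }
        return 0;
    }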

    Error handling: diagnostic messages are intended to catch programming errors. Numerical return errors are intended for errors that 'aren't your fault.'

    string type (none / 1) (#362)
    by kubalaa on Mon Feb 09, 2004 at 09:38:14 PM EST

    There's no such thing as arrays either, so you can't say that strings are arrays of characters. Both are just pointers. Therefore arrays always need to carry around a size argument indicating how many elements they hold. Strings often do, but sometimes they don't because they may be null-terminated. I don't understand why they couldn't have defined these standard types as structs so that you didn't have to manually use two variables everywhere for one conceptual value.
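
    A sketch of the struct the poster has in mind (the names are invented for illustration):

    #include <stddef.h>
    #include <stdio.h>

    /* pointer and length travel together as one value */
    struct str {
        const char *data;
        size_t      len;
    };

    int main(void)
    {
        struct str hello = { "hello", 5 };
        printf("%.*s (%d chars)\n", (int)hello.len, hello.data, (int)hello.len);
        return 0;
    }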

    [ Parent ]
    almost (none / 0) (#367)
    by valar on Mon Feb 09, 2004 at 10:33:08 PM EST

    Except that arrays have a size declared at compilation: the size attribute is in the compiler and not in the program, which is why you can buffer overflow easily. What is implied in a string class is that such things are handled programmatically, so that the string can be dynamically reallocated if it needs to expand. There is little parallel between this ability and the size of an array.

    [ Parent ]
    gets() not evil?! (none / 1) (#370)
    by magney on Mon Feb 09, 2004 at 10:48:27 PM EST

    It's probably the single most evil function ever concocted! If a newline or \0 never comes in on the standard input, it'll merrily write to whatever buffer you've got defined until the end of time, or until you wrap around your address space, whichever comes first. And god help you when the function returns.
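
    A small illustration of the difference (the commented-out gets() call is the dangerous one):

    #include <stdio.h>

    int main(void)
    {
        char buf[16];
        /* gets(buf);  -- would write past buf on any input line of 16+ chars */
        if (fgets(buf, sizeof buf, stdin))  /* bounded: at most 15 chars + '\0' */
            fputs(buf, stdout);
        return 0;
    }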

    Do I look like I speak for my employer?
    [ Parent ]

    No, atof does not return a float. (none / 0) (#423)
    by Haeleth on Tue Feb 10, 2004 at 10:30:31 AM EST

    I don't know what your atof(3) manpage says, but mine reads as follows:

    NAME
      `atof', `atoff'--string to double or float

    SYNOPSIS
       #include <stdlib.h>
       double atof(const char *S);
       float atoff(const char *S);

    DESCRIPTION
       `atof' converts the initial portion of a string to a `double'.  `atoff' converts the initial portion of a string to a `float'.


    [ Parent ]

    ..And pascal is standard and portable..? (none / 3) (#293)
    by beavan on Mon Feb 09, 2004 at 01:59:42 PM EST

    Not as such. C is probably the most common, portable and standard language out there. Can anyone name a single platform that has more than 2 users that DOES NOT have a decent C compiler?

    C is just what I need when I'm developing for a low-memory, next-to-zero-CPU cellular phone; even the object files are small and lean. Try comparing the size of a Pascal or C++ generated object file and see what I'm talking about... Why? Because C doesn't make the compiler do anything you didn't INTEND it to do. Can't cope? Maybe it's time for a career switch.

    Having said that, on my big fat Sparc I use C++ (except for important low-level stuff), a very well structured and lovely language. C++ still requires a brain. In fact, in my opinion it's much harder and more complex to use, even if it supports strings (the STL is actually a part of C++) in the way you think strings should be supported...

    BTW, if one of my programmers used strcat and not strncat (for instance), I'd have him write Pascal for a whole week!
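
    For instance, a bounded concatenation along those lines (a minimal sketch; note that strncat's size argument is the space remaining, not the buffer size):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[16] = "Hello, ";
        /* append at most the space left, minus one for the terminator */
        strncat(buf, "world, how are you?", sizeof buf - strlen(buf) - 1);
        puts(buf);  /* truncated but safe: "Hello, world, h" */
        return 0;
    }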

    I love burekas in the morning
    Wow, you manly man of all the manly men. (none / 2) (#313)
    by tkatchev on Mon Feb 09, 2004 at 04:19:40 PM EST

    I bet your ego feels really gratified that your giant, manly brain (a brain that can cope with anything, much less the vagaries of C++) is being used to write programs that produce dumps of core.

    I think this is called "autofellatio".


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Thoughts of a massive male brain (none / 0) (#389)
    by beavan on Tue Feb 10, 2004 at 05:06:21 AM EST

    Since I have a massive, giant and manly brain, my programs don't core dump, at least not that often :-)
    I guess you've often worked with programs written by other people of gigantic and, according to your perception, manly brain.
    Most of the programs I use don't unexpectedly crash; I guess the ones that do are written in Pascal.
    Again, if your program constantly core dumps and you can't fix it, you should switch to something else.
    How about becoming a surgeon?
    If things go wrong, you could always claim medicine is a complex science, and that it's a perfectly common mistake to have a patient's liver removed instead of his appendix.


    I love burekas in the morning
    [ Parent ]
    I'm not so manly. (none / 0) (#406)
    by tkatchev on Tue Feb 10, 2004 at 08:25:19 AM EST

    So I write programs in languages where it is logically impossible to produce a core dump.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    can't we all just get along? (none / 0) (#425)
    by beavan on Tue Feb 10, 2004 at 10:46:02 AM EST

    I totally agree that there are tasks that C is not the right tool for.
    It would be senseless to use C for rendering web pages (server side..), for instance.
    PHP and ASP do it with a fraction of the amount of work a C equivalent would require.
    My point is that in some cases C is the only alternative, and the risk of having a program blow up in your face is a calculated risk you must take.

    I love burekas in the morning
    [ Parent ]
    Yes, C is very useful. (none / 0) (#427)
    by tkatchev on Tue Feb 10, 2004 at 11:18:33 AM EST

    Its usefulness definitely does not lie in the fact that it allows you to act the part of the Real Manly Man Programmer, though.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    A more thorough article by the same title... (2.50 / 4) (#303)
    by jason on Mon Feb 09, 2004 at 03:41:52 PM EST

    Prof. Fateman of Berkeley has an article with more research behind it, but with more-or-less the same title:

    Software Fault Prevention by Language Choice: Why C is Not my Favorite Language

    Fateman's article is worth some thought, especially if you're designing a C library interface.



    he's a smart guy too (none / 1) (#366)
    by kobayashi on Mon Feb 09, 2004 at 10:32:03 PM EST

    at least where computer algebra is concerned.

    [ Parent ]
    Yay!!! (none / 3) (#331)
    by Deus Horribilus on Mon Feb 09, 2004 at 05:24:35 PM EST

    This is what I have been trying to explain to my workmates for the past month. I am currently programming scientific software in C++ (okay, it is a little better than C), but before this the standard language was FORTRAN.

    Why the change, you ask? To quote a fellow researcher:

    "Everybody else is doing it, so we have to."

    It's the whole jumping off a cliff argument all over again. If your current solution works, why bother with the trouble of changing your entire programming language?

    I have to add to this article the point that C and all its offshoots are NOT scientific programming languages. Surely the lack of an exponentiation operator and complex numbers would have demonstrated this. It's another argument against its use. Moreover, the propensity of C for errors (both seen and unseen) makes it unsuitable for programs that depend on accurate algorithms. And I haven't even mentioned its unreadability in code form (oh, wait, I just did).

    Great article, now if only my colleagues would read it...

    _________________________________________
    "Beliefs are never concrete, they change direction like autumn leaves in a windstorm..."

    YHBT (2.25 / 4) (#336)
    by ph317 on Mon Feb 09, 2004 at 06:13:55 PM EST

    Unfortunately, time has not been kind to Kernighan's tract. Pascal has matured and grown in leaps and bounds, becoming a premier commercial language. Meanwhile, C has continued to stagnate over the last 35 years with few fundamental improvements made. It's time to redress the balance; here's why C is now owned by Pascal.

    It's a brilliant piece of writing, and I support its section vote.  But c'mon guys, YHBT, so stop arguing the finer points of it.

    What's your problem? (none / 3) (#354)
    by the on Mon Feb 09, 2004 at 07:55:16 PM EST

    "How about you allow 5[var] to mean the same as var[5]?"

    And what exactly is the cause of your difficulty with the commutativity of .[.]? Too confusing for you? Goes against your religious beliefs?

    --
    The Definite Article

    C is owned by Pascal? (1.75 / 4) (#364)
    by JayGarner on Mon Feb 09, 2004 at 10:07:03 PM EST

    WTF is this doing here? I feel like I went to the Journey fan board by mistake, and there's an article on how much Foreigner sucks.

    WTF?!

    C Isn't Supposed to Be High Level (2.75 / 4) (#369)
    by NeantHumain on Mon Feb 09, 2004 at 10:40:03 PM EST

    If you want a string type, operator overloading, exception handling, etc., use C++, Java, C#, or any of the other new-fangled programming languages out there. C was never meant to do these high-level things because it was originally meant to write an operating system in.

    C's simple treatment of all things as either variables or pointers is its beauty. Its beauty is also its ugliness. It's in the eye of the beholder.

    I use C++ or Java most of the time because I can use the abstraction that C doesn't provide. If you want to deal with things at the low level, use C or maybe assembly.


    I hate my sig.


    Great Story (1.60 / 5) (#380)
    by Gysh on Tue Feb 10, 2004 at 03:09:36 AM EST

    Wow.

    I'd like to apologize for all the abuse in your other article, because while I still don't agree with it, this one rocks. I don't mind using C or C++, and there were a few areas where I disagreed with you, but most of your points were (in my opinion) more than valid, and I really enjoyed reading the article.

    C(++) definitely isn't a fun language to learn compared to others, but I tend to use whatever works. It bugs me, however, when people take up the attitude that you're not a skilled programmer if you don't do everything in C(++) regardless of whether or not it's a good idea. "l337 skillz forevaR!" and such.

    Of course, now I feel like a groveling idiot... "Forgive me, oh wise one!"... but that's what I get for going off on a rant despite my better judgment. Heh.

    Hey, no problem. (none / 0) (#387)
    by James A C Joyce on Tue Feb 10, 2004 at 04:34:04 AM EST

    "I'd like to apologize for all the in your other article, because while I still don't agree with it, this one rocks."

    I'm a K5 troll; it's my job to get people to scream abuse at me!

    I bought this account on eBay
    [ Parent ]

    Egg Troll does technology better (1.75 / 4) (#388)
    by bigchris on Tue Feb 10, 2004 at 04:40:06 AM EST

    Linky

    ---
    I Hate Jesus: -1: Bible thumper
    kpaul: YAAT. YHL. HAND. btw, YAHWEH wins ;) [mt]
    I disagree. (none / 0) (#400)
    by James A C Joyce on Tue Feb 10, 2004 at 07:48:16 AM EST

    I think I'm much better.

    I bought this account on eBay
    [ Parent ]

    Bogus claims (2.77 / 9) (#393)
    by ttsalo on Tue Feb 10, 2004 at 06:27:08 AM EST

    The article is a troll, but anyway...

    Buffer overflows abound in virtually any substantial piece of C code. This is caused by programmers accidentally putting too much data in one space or leaving a pointer pointing somewhere because a returning function ballsed up somewhere along the line. C includes no way of telling when the end of an array or allocated block of memory is overrun. The only way of telling is to run, test, and wait for a segfault.

    Bullshit. I write security-critical software for a living with C and I don't think we've ever had a buffer overflow vulnerability in our code. Why? We don't, ever, store anything in an array without checking that it fits. sprintf and its ilk are absolutely banned everywhere in our code. (snprintf is good.) You really think that the only solution is running the code and seeing whether it segfaults? Are you a retard?

    If you want to break out from a series of nested for or while loops then you have to use a goto. This is what is known as a crude hack.

    Nonsense. Why the hell would a goto fail; be more of a crude hack than break 3;? Breaking out of nested control structures with a goto to a clearly specified location is much, much cleaner than some break n; would be.

    Multidimensional arrays. Before you tell me that you can do stuff like int multiarray[50][50][50] I think that I should point out that that's an array of arrays of arrays. Different thing.

    Same thing. Tell me, what does your multidimensional array do that you can't do with multiarray[50][50][50]?
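
    (A minimal sketch of the goto-cleanup idiom being defended here; the function and resource names are illustrative only:)

    #include <stdlib.h>

    /* Acquire several resources, bail out through one cleanup path on failure. */
    int process(void)
    {
        char *a = NULL, *b = NULL, *c = NULL;
        int ret = -1;

        a = malloc(100);
        if (a == NULL) goto fail;
        b = malloc(100);
        if (b == NULL) goto fail;
        c = malloc(100);
        if (c == NULL) goto fail;

        /* ... real work on a, b and c would go here ... */
        ret = 0;

    fail:
        free(c);   /* free(NULL) is a no-op, so this is safe */
        free(b);
        free(a);
        return ret;
    }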



    Manly manlyness. (none / 1) (#407)
    by tkatchev on Tue Feb 10, 2004 at 08:27:56 AM EST

    Real programmers program bits by flipping switches on the front panel of their mainframe, etc.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Fuck vi and emacs... (none / 2) (#453)
    by skyknight on Tue Feb 10, 2004 at 03:06:31 PM EST

    I write files by holding a magnet over my disk.

    It's not much fun at the top. I envy the common people, their hearty meals and Bruce Springsteen and voting. --SIGNOR SPAGHETTI
    [ Parent ]
    sn* and strn* aren't always safe, either. (none / 0) (#450)
    by billion on Tue Feb 10, 2004 at 02:48:32 PM EST

    Even the s*n* functions aren't very safe:

    char buf[10];

    strncpy(buf, "string longer than 10", sizeof(buf));

    printf("%s", buf); // OOPS!

    Do you know what is wrong with this code?  If you guessed that it's missing a buf[sizeof(buf)-1] = '\0'; after the strncpy (strncpy does not NUL-terminate when the source doesn't fit), then you guessed right!

    [ Parent ]
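
    (For completeness, one safer pattern, sketched under the same assumptions as the snippet above: snprintf always NUL-terminates and never writes past the size you give it, or you can terminate by hand after strncpy:)

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[10];

        /* snprintf truncates safely and always NUL-terminates */
        snprintf(buf, sizeof(buf), "%s", "string longer than 10");
        printf("%s\n", buf);   /* prints "string lo" */

        /* or, with strncpy, terminate explicitly */
        strncpy(buf, "string longer than 10", sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        printf("%s\n", buf);   /* also "string lo" */

        return 0;
    }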

    Yes (1.33 / 3) (#458)
    by wji on Tue Feb 10, 2004 at 03:48:13 PM EST

    And C even lets you write

    char *a = "/bin/sh";
    char *const *b = NULL;
    execve(a, b, NULL);   /* needs <unistd.h> */


    Instant root access! Imagine that! A safe language would *clearly* prevent this kind of thing.

    In conclusion, the Powerpuff Girls are a reactionary, pseudo-feminist enterprise.
    [ Parent ]

    how does it hack root? (none / 0) (#522)
    by army of phred on Wed Feb 11, 2004 at 07:16:10 PM EST

    seems like it should be patched!

    "Republicans are evil." lildebbie
    "I have no fucking clue what I'm talking about." motormachinemercenary
    "my wife is getting a blowjob" ghostoft1ber
    [ Parent ]
    Yeah, totally annoying. (none / 1) (#467)
    by tkatchev on Tue Feb 10, 2004 at 04:51:37 PM EST

    This retarded mis-feature totally defeats the point of the "safe" strncpy function.

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    More to the point... (none / 0) (#454)
    by skyknight on Tue Feb 10, 2004 at 03:07:33 PM EST

    If your code is getting that deeply nested, then you're probably not doing a very good job of breaking your program into subroutines.

    It's not much fun at the top. I envy the common people, their hearty meals and Bruce Springsteen and voting. --SIGNOR SPAGHETTI
    [ Parent ]
    Deeply nested (none / 0) (#487)
    by ttsalo on Wed Feb 11, 2004 at 03:39:13 AM EST

    I know... But sometimes I pretty much have broken all the actual work into subroutines, and have one function calling a sequence of them, and then it's damn convenient to be able to do that goto fail; from inside nested ifs instead of thinking up a way of bubbling the error status up from there. The goto often allows much cleaner code.



    [ Parent ]

    Offtopic, but... (2.75 / 4) (#408)
    by bugmaster on Tue Feb 10, 2004 at 08:28:50 AM EST

    How come no modern compiled language (that I know of) supports binary constants? You would think that C would support them, being a low-level language and all (and hence, incidentally, out of scope of the author's article), but it doesn't. 0xA5? Sure. 165? Sure. 0245? Sure. 0b10100101 (or something similar)? No.

    Why not? People have to use binary constants all the time, because that's how you make flags and masks. In fact, I can only think of a single case I've encountered where a hex constant would be better than a binary one.
    >|<*:=
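
    (In the absence of binary literals, the usual C workaround is to define flags by bit position with shifts, which stays readable without hand-converting to hex; the flag names below are made up for illustration, and some compilers have since grown 0b... literals as an extension:)

    #include <stdio.h>

    /* Hypothetical flags, defined by bit position rather than by value */
    #define FLAG_READ    (1u << 0)   /* 0x01 = binary 00000001 */
    #define FLAG_WRITE   (1u << 1)   /* 0x02 = binary 00000010 */
    #define FLAG_EXEC    (1u << 2)   /* 0x04 = binary 00000100 */
    #define FLAG_HIDDEN  (1u << 3)   /* 0x08 = binary 00001000 */

    int main(void)
    {
        unsigned mode = FLAG_READ | FLAG_WRITE;   /* build a mask */

        if (mode & FLAG_WRITE)                    /* test a bit */
            printf("writable\n");
        return 0;
    }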

    Not true. (none / 1) (#415)
    by tkatchev on Tue Feb 10, 2004 at 08:54:59 AM EST

    Lisp and Scheme, AFAIK.

    (Though I don't see why you would need this, ever. Binary is too verbose and error prone; I advise you to memorise 0x1, 0x2, 0x4, 0x8 instead.)


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    I said "compiled" (none / 0) (#421)
    by bugmaster on Tue Feb 10, 2004 at 09:48:53 AM EST

    Lisp and Scheme are normally interpreted (though yes, I know, you can compile them). I actually don't remember whether Scheme has binary constants or not, but it's plausible. In any case, which bitmask is easier to read? 0b1101 or 0xD? I think the answer is pretty self-evident.
    >|<*:=
    [ Parent ]
    I said Lisp. (none / 0) (#429)
    by tkatchev on Tue Feb 10, 2004 at 11:20:39 AM EST

    Is it normal when kewl programmers confuse Lisp with Emacs?

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    okay (none / 0) (#461)
    by hading on Tue Feb 10, 2004 at 04:28:01 PM EST

    Except that Common Lisp is not normally interpreted. In fact, some implementations can't be interpreted because they only have compilers. And even with the others, it's pretty darn rare not to compile one's code once it's working.

    [ Parent ]
    Oh no. (none / 1) (#422)
    by it certainly is on Tue Feb 10, 2004 at 10:03:27 AM EST

    I hope this isn't a new branch of Fen's "decimal" meme.

    As tkatchev says, memorise 0x1, 0x2, 0x4 and 0x8. Now you can build any flag. Secondly, learn to split binary into clumps of 4. Bingo, now you can easily convert those four binary digits into a single hex char.

    Languages support hex because it's a very compact and human-readable representation of binary. On the other hand, I have no idea why octal is supported at all, save for unix file permissions... historical accident, perhaps? Counting in octal was trendy at the time and has now died out?

    kur0shin.org -- it certainly is

    Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
    [ Parent ]

    Octal (none / 1) (#436)
    by squigly on Tue Feb 10, 2004 at 12:36:08 PM EST

    Octal has various advantages - Firstly, it's quicker to type since you only need the keypad.  Secondly, it's easier to convert to binary since you only have to remember 8 conversions rather than 16, and those are easy to work out.  3=1+2; 5=4+1; 6=4+2; 7=4+3=4+2+1.  Everyone knows those.   What do you get if you add 3 to 8 in hex?

    Personally, I found working out what bits A-F gave a little confusing at first since I would really expect A, C and E to represent odd numbers.  My fault for counting letters from 1 rather than 0, I guess, but I'm sure I'm not the only one who has done this.

    [ Parent ]

    Conversion (none / 1) (#486)
    by bugmaster on Wed Feb 11, 2004 at 01:46:04 AM EST

    Yes, I know how to convert from hex to binary to decimal, and yes, I can do it in my head. My point is that I shouldn't have to do it in my head; computers were made to lighten my mental load, not burden me with trivia. After all, C does many other monotonous, repetitive tasks for me -- like saving registers during function calls, allocating stack space for local variables, etc. Why should another common operation -- bitmasking -- occupy my own valuable time ?
    >|<*:=
    [ Parent ]
    Actually C often does... (none / 0) (#460)
    by gte910h on Tue Feb 10, 2004 at 04:11:10 PM EST

    on embedded compilers.

       --Michael

    [ Parent ]

    Not constants, literals (none / 0) (#561)
    by pieroxy on Thu Mar 04, 2004 at 09:03:41 PM EST

    Not to be a grammar Nazi, but what you're referring to is called "literals" and not "constants". It is a problem in the grammar of the language itself, purely syntactic. It would take a very small amount of time to add that to gcc, for example.

    [ Parent ]
    This is like complaining that... (2.77 / 9) (#418)
    by skyknight on Tue Feb 10, 2004 at 09:06:07 AM EST

    a soldering iron is a lousy tool for creating web applications. You personally, at your dotcom company, aren't going to sit down with such a tool, but somewhere along the line a soldering iron was in fact involved with putting the pieces in place to render the webapp on someone's monitor. It certainly isn't the right tool for building your piece of the pipeline, but it was the right tool for building some of the hardware components of the systems that people are employing to both develop and use the webapps that you write.

    Quite simply, high level languages and tools do not materialize out of thin air. Perl, a beautiful high level language that supports all of the wizardry that you want, is written in C. It's not crafted from hand written assembly. All of mankind's technology is hierarchical, with more specialized and powerful technology being crafted from simpler, more "stupid" tools. Do you think that industrial circuit board etching tools with precisions smaller than that of the resolution of the human eye are built by a guy with a hammer and chisel?

    You lament that C doesn't hold your hand all along the way, but it's a trade-off. When choosing a language, you have to ask yourself many questions. What is the user base? How long will this software be around? Is man time or machine time more important? If you're writing a little bit of glue code, you're going to be the only one using it, and it's going to be thrown away at the end of the day, then you'd be insane to write it in C instead of Perl. If you're writing a device driver, it's going to be distributed to millions of people and hang around forever, and squeezing every bit of performance out of it is the ultimate imperative, then you'd be a fool not to use C.

    There Ain't No Such Thing As A Free Lunch. In a language that checks for out of bounds array accesses, you'll certainly stamp out bugs faster, but it means that every time you access an array there is the overhead of the check. Shifting the burden to the machine makes perfect sense in a lot of situations, but not all of them.

    Real software engineers are not language bigots. Real software engineers carry a diverse collection of tools on their virtual belts, knowing both how and, more importantly, when to use each one of them.



    It's not much fun at the top. I envy the common people, their hearty meals and Bruce Springsteen and voting. --SIGNOR SPAGHETTI
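
    (A sketch of the trade-off described above, not taken from any particular library: a bounds-checked accessor pays for a comparison on every single access, which is exactly the cost C declines to impose by default:)

    #include <stdio.h>
    #include <stdlib.h>

    /* Checked access: every call pays for the comparison and branch. */
    int get_checked(const int *arr, size_t len, size_t i)
    {
        if (i >= len) {
            fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, len);
            abort();
        }
        return arr[i];
    }

    /* Unchecked access, as plain C does it: just arr[i]. */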
    A choice comment..... (2.00 / 4) (#424)
    by wabbitwatcher on Tue Feb 10, 2004 at 10:35:40 AM EST

    When one single condition you'd just happened to have forgotten about whilst coding screws up, it's your fault.

    Who else is there to blame but the programmer?

    Such languages do exist. (none / 3) (#435)
    by wrg on Tue Feb 10, 2004 at 12:35:20 PM EST

    You can define binary constants in Ada, and you can even embed underscores to aid in visually grouping them.  So, for instance, you could write the same number in hexadecimal, octal, decimal, and binary thus:

    16#AF#
    8#257#
    175  -- or 10#175#
    2#1010_1111#  -- or 2#10_101_111#

    You can also define binary constants ("bit-strings") in PL/I, for example:

    '10101111'B

    Ada is the uber-language. (none / 0) (#439)
    by tkatchev on Tue Feb 10, 2004 at 12:42:18 PM EST

    Wonder why it isn't used more. (Probably because people are scared of the U.S. government...)

       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Binary constants? (none / 0) (#532)
    by fatgeekuk on Thu Feb 12, 2004 at 09:48:24 AM EST

    I am sorry, but if you cannot convert without thinking from binary into hex and back you should not be mucking around with software that needs binary constants.

    This may sound elitist of me.

    so be it.

    [ Parent ]

    C has sprouted C++, Objective-C, Java and C# (2.75 / 4) (#451)
    by akuzi on Tue Feb 10, 2004 at 02:59:49 PM EST

    > Pascal has matured and grown in leaps and bounds,
    > becoming a premier commercial language. Meanwhile,
    > C has continued to stagnate over the last 35 years
    > with few fundamental improvements made

    This post strikes me as a troll, but I'll give the author the benefit of the doubt.

    C itself has remained constant, but it has sprouted a whole tree of derived languages that completely dominate modern applications programming.

    Whether you consider these premier commercial versions of Pascal you are talking about to be the same language as the original 'Pascal' or not is really a matter of nomenclature, since they are at least as different from the original Pascal as C is from C++. Borland Delphi is really an object-oriented language, whereas the original Pascal had no OO features whatsoever.

    Article is Misleading but Partly True (2.66 / 6) (#474)
    by OldCoder on Tue Feb 10, 2004 at 08:04:02 PM EST

    C lacks a formal string type and hash tables because, when the language was designed, it wasn't clear how to create a string or hash that was good enough for all applications, so they created a language in which you could write whatever kind of string or hash table you wanted.

    This was a true breakthrough, compared to languages that came before. Languages that came before overspecified, so that you could be stuck with a string or hash that was not appropriate.

    The source for the bad string functions you hate was available, and the idea was that people would write their own string types as needed. That's why it was part of the standard library and not in the language, where you could not get away from it.

    The reason that so many projects used the standard string type rather than implementing or buying their own is basically the cowardice and stupidity of management. Most or many of the software managers in the first several decades of C development were not great programmers themselves, or programmers at all, and were very frightened by the idea of deviating from the standard. I know, I fought that battle.

    The reasons for avoiding buffer overflow, and the techniques to use for avoiding buffer overflow, were well known and widely publicized by the early 1980's. I remember learning them then. But the only way to enforce them is formal code review, which was too expensive for most budgets. Another management failure.

    To emphasize this, consider the widespread problem of buffer overflow in Microsoft products written in C. Many or most of the people building the software knew about the potential for buffer overflow, but management could not get organized enough to create and enforce coding standards.

    Of course, the fact that managing programmers was similar to herding cats only made management more difficult. The current commoditization of programming jobs to the level of low-level clerk-like status, and the regimentation that is the norm in Indian workplaces, might help here...

    Deeper Reasons
    Building a safe string type requires building a memory allocation system underneath, such as garbage collection or following the malloc/free discipline. Once one of those is in your program, you cannot get rid of it. But some programs are simply not compatible with a one-size-fits-all memory allocation scheme, and so the C language needed to provide programmers with the flexibility that comes with not bolting in a memory allocation scheme.

    C was built for the programming problems of the 1970's and 80's, when machines were made of wood and men were made of iron, as they say. The luxury of a multi-gigahertz processor was just a pipe dream...

    --
    By reading this signature, you have agreed.
    Copyright © 2003 OldCoder
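
    (A minimal sketch of the kind of roll-your-own counted string described above, with one allocation policy (plain malloc) baked in underneath; the type and function names are illustrative:)

    #include <stdlib.h>
    #include <string.h>

    /* Illustrative counted-string type: the length travels with the data,
       and the choice of allocator is wired in underneath. */
    struct str {
        size_t len;
        char  *data;
    };

    struct str *str_new(const char *s)
    {
        struct str *p = malloc(sizeof *p);
        if (p == NULL) return NULL;
        p->len = strlen(s);
        p->data = malloc(p->len + 1);
        if (p->data == NULL) { free(p); return NULL; }
        memcpy(p->data, s, p->len + 1);
        return p;
    }

    void str_free(struct str *p)
    {
        if (p != NULL) { free(p->data); free(p); }
    }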

    Problems persist (none / 0) (#513)
    by JamesThiele on Wed Feb 11, 2004 at 02:50:59 PM EST

    C was built for the programming problems of the 1970's and 80's, when machines were made of wood and men were made of iron, as they say. The luxury of a multi-gigahertz processor was just a pipe dream...

    Actually, there are still situations where the luxury of fast processors doesn't exist. Many embedded systems are built with processors with low MHz clock speeds, particularly on battery powered devices. These speeds are comparable to minicomputers built in the 70s or 80s. In a battery powered system you trade off ease of programming in a higher level language against power - every cycle used drains the batteries.

    [ Parent ]

    It has nothing to do with speed. (none / 0) (#521)
    by tkatchev on Wed Feb 11, 2004 at 06:37:58 PM EST

    Modern processor architectures are radically different from those that were common in the 1970's.

    Unlike software, hardware design tends to advance at a very frenetic pace.


       -- Signed, Lev Andropoff, cosmonaut.
    [ Parent ]

    Yes, but (none / 0) (#559)
    by OldCoder on Sun Feb 22, 2004 at 12:26:29 AM EST

    I have coded embedded real-time systems of this type. I have also worked on multi-million line embedded systems written in "Bad C" (by others) that needed all the help a language could provide, except randomly intrusive garbage collection. And even that would have been acceptable if we had the ability to write threads that ran in a higher priority than the GC code.

    --
    By reading this signature, you have agreed.
    Copyright © 2003 OldCoder
    [ Parent ]
    C is old. But its offspring aren't. (2.62 / 8) (#489)
    by Xtapolapocetl on Wed Feb 11, 2004 at 04:14:55 AM EST

    First off, a little background. I'm a longtime C guy (23 years old, been programming in C on a regular basis since I was 9, and damn near daily since I was 15). I love C. But I've gotten to the point where its limitations just get in my way.

    You're right about C being an old language, and it shows. But you're comparing a new version of Pascal (Delphi, which is a far cry from old-school Pascal, being that it's an OO language while Pascal originally was nothing of the sort) to an old version of C - try comparing it to C++ or Objective C, which are basically modern versions of C. I don't consider C99 to be a new version of C, either - I consider it to be an update to an old version of C. While it fixes a lot of annoyances with C and is in my opinion a good thing, it still doesn't add the types of things that make software development less of a battle.

    First I'll talk about Objective C, because it's become my favorite language over the past year. Combined with the NextStep API (I use its best and most current implementation, Cocoa, under MacOS X), it's amazing how rapidly you can develop a GUI application. The standard library is unbelievable, including a through-and-through standardized memory management scheme (NSObject's retain and release reference-counting mechanism). If you follow the rules, you will never have a memory leak, because the rules of who allocates and who releases objects are very well-defined and sane. If I'm writing a utility or something for my own use, I'll do it in Objective C (unless it's the type of thing for which Perl makes more sense, like a quick script). On its own, Objective C is nothing terribly special (although it does let you do some neat things that are difficult or impossible in other languages), but when combined with the NextStep API (which is, for all intents and purposes, Objective C's standard library), it's truly a beautiful thing, and if it makes sense to use it for something I'm working on, I absolutely would not consider using anything else.

    But that only really does you good if you're developing under and for MacOS X (it's good to be a Mac user these days! Come join us!). My day job, though, is that of game developer, and let's face it - the Mac market is not a big enough target. So we use C++.

    C++ is a funny beast. Taken on its surface, it looks to be little more than C with some object-oriented extensions. And while that's what it is, the extensions in question make it a very different programming environment to work with, and a much more enjoyable one. A well-designed class hierarchy does wonders for code correctness and maintainability and allows interesting design patterns that would be somewhat of a hassle to do in C. Templates are another story altogether - template metaprogramming (especially in a field such as mine where performance is king) is incredibly useful - it allows truly generic code with zero runtime overhead. You do have to be careful of code bloat, though, but again a good design will help avoid that sort of problem.

    When it comes to memory management, C++ is still basically where C was - the programmer is responsible for allocating and freeing memory from the heap. But I wouldn't have it any other way! Again, things are different in less performance-sensitive situations, but I need to have full control over everything the program is doing - the last thing I want my code to do is dump into a garbage collection routine that'll waste 200ms for no good reason. 200 milliseconds? Shit - the performance goal for my engine is 60 frames per second, which leaves a scant 16.666 milliseconds to do everything required to render a frame. A 200ms trip down Garbage Collection Lane would waste enough time to render an entire 12 frames. It, and any other language or library feature with similarly unpredictable performance characteristics, are entirely unwelcome in my world. Thank God C(++) gives me control over my own program.

    But shit, man - you're a smart guy. You can come up with a workable solution for organizing resource management - I did it for this engine early in development, and 2 years later, resource leaks are very rare and always caught and fixed almost immediately. Nobody said programming would be brainless.

    Now about your complaints with the standard library - again, it was written for a different computing world. Of COURSE it feels dated and doesn't support some things it should. But once again, look at the current incarnations of the C family - I've ranted about Cocoa enough, so suffice it to say your complaints don't apply to it. C++ too, though - the STL is a wonderful library, and a huge timesaver. If you know how to use it properly - not just the basics of its containers, but also its algorithms - you can accomplish a ton of work in a very short time by allowing your code to be "written" by the template processor instead of doing it by hand. And as an added bonus, you're building on code you can rely on - it's been debugged by someone else already! I've never been affected by a bug in STLPort, which is the STL implementation we use for our engine (due to both its very good performance and its portability - nothing is more of a pain in the ass than writing code using different implementations of a core library on different platforms. Too much hassle. So we use STLPort on Win32/Linux/OS X).

    As far as C goes, you're right - it's old, and it shows. But even though C was my first and longest love in the programming world (not the first language I learned - that would be Pascal. But I fell in love with C), I wouldn't consider starting a new project in it, no matter what the project (with the possible exception of embedded programming for a platform where a decent C++ compiler doesn't exist). C++ just offers the programmer too many advantages over plain C, and as I said, if you know what the hell you're doing, you can get equivalent performance out of C and C++. And while you can't really write code in C++ that's faster than anything you could write in C, templates make it easier to write efficient code in certain situations that would require much more (mostly repeated) code in C.

    Although I truly love C++, it is certainly not appropriate in all situations either. If performance isn't an issue (or security is) I would generally use a safer, interpreted language (possibly Perl, but lately I've been digging Python. Those aren't the only options, though - many other languages would work fine too). Especially for any code which interfaces with a network (or arbitrary file data, or any other case where a malicious user can feed your program garbage, especially when the program is running with elevated permissions), using a language like C which has no security features is really asking for trouble. Despite my talk about memory management above, I'm not perfect when it comes to details, and neither is any other programmer, and those details will be the death of you in a security-conscious environment. Considering that networking programs are generally I/O bound as opposed to CPU bound, a slower but safer language makes sense.

    And while I'm on the subject, the same applies to command-line utilities, system daemons, and things of that nature. Frankly I'm shocked that the OpenBSD project hasn't started an initiative to rewrite good chunks of their userland in a non-C language. I'm sure they are well aware that it would result in a safer, more secure system than what they're doing now (combing through their code with a fine-toothed comb looking for security problems). Obviously buffer overflows aren't the only security problem there is, but Perl (and I'd imagine Python and others as well, although I'm less familiar with those) has a ton of libraries, both in its standard library and in public code archives like CPAN, which are designed to help avoid other security problems.

    Ok, I've rambled enough (it's 4 in the morning, and I gotta get back to work - this physics engine isn't gonna debug itself!). I guess my point is that yes, C is old. But many of your issues are addressed quite well by C's modern derivatives (although not all - that wouldn't be possible without sacrificing control and efficiency, which are exactly the things that necessitate using a C-based language in the first place). The problem is that they are used when they shouldn't be. My job couldn't be done without a C-based language, though, and neither could many others. When raw performance counts, there's no reasonable alternative. Nothing else gets out of your way and lets you at the raw horsepower of your machine quite the same way (well, except assembly, but aside from being non-portable, it's generally unnecessary these days unless you're doing very low-level code interfacing directly with the hardware or you're trying to do optimizations which are beyond the scope of the compiler, such as using SIMD instructions). With power always comes responsibility, and C and friends are no exception. If you know how to use them, and you have a reason to do so, they can be a lifesaver. But if you fuck up, you have no one to blame but yourself. The computer just did what you told it to do, nothing more, nothing less.

    ~Xtapolapocetl

    --
    zen and the art of procrastination

    Smalltalk-80 (none / 3) (#516)
    by Phillip Asheo on Wed Feb 11, 2004 at 04:27:12 PM EST

    Kicks every other language's ass including Java and C++.

    If only Steve Jobs had stolen that from Xerox too...

    --
    "Never say what you can grunt. Never grunt what you can wink. Never wink what you can nod, never nod what you can shrug, and don't shrug when it ain't necessary"
    -Earl Long

    small talk (none / 1) (#553)
    by horny smurf on Sun Feb 15, 2004 at 12:23:49 AM EST

    Well, NeXTStep/OpenStep/Cocoa are based on Objective C, which is C with Smalltalk-inspired OOP and messaging and a truly dynamic run-time.



    [ Parent ]

    Indeed. (none / 0) (#556)
    by Phillip Asheo on Mon Feb 16, 2004 at 07:57:16 PM EST

    And I've just bought myself an iBook, so I will be investigating all that stuff real soon now.

    --
    "Never say what you can grunt. Never grunt what you can wink. Never wink what you can nod, never nod what you can shrug, and don't shrug when it ain't necessary"
    -Earl Long
    [ Parent ]

    Java descending from C? (none / 2) (#529)
    by marcovje on Thu Feb 12, 2004 at 08:15:30 AM EST

    Java only borrows some basic C syntax (like {} and the post/pre-increment operators). Actually, Delphi is much closer to Java than e.g. C++ is. Also check the credits for the Java VM. One contributor is a certain... Niklaus Wirth :)

    Huh? (none / 3) (#530)
    by Arevos on Thu Feb 12, 2004 at 09:38:37 AM EST

    Good programming is all about choosing your tools. You're complaining that your screwdriver is a really bad tool for hammering in nails. Your whole rant is, well, utterly pointless.

    C isn't high level. It's a step above assembly. If you want to manipulate strings and hash tables or whatever, then either get a good library or don't use C!

    It really is that simple.

    There is a reason that C has remained static. (none / 1) (#531)
    by fatgeekuk on Thu Feb 12, 2004 at 09:46:05 AM EST

    Even with all of its faults, it is just useful enough without getting in the way.

    It's a tool, just like every other programming language. It has stood the test of time.

    Although C++ is good, it is also more complex and takes greater dexterity to make use of it PROPERLY.

    Do you feel that you would be able to give a student a C++ compiler and a 10 week course and expect elegant results?

    With C you have to work harder to get the same results as C++, but a larger population can actually do the work and actually understand what is happening.

    [ Parent ]

    Why are you ranting about this? (none / 1) (#544)
    by Verdeboy on Fri Feb 13, 2004 at 01:10:35 AM EST

    This rant is pointless; C++ is much better than C.

    About your rants about the preprocessor: a programmer worth his salt DOESN'T USE THAT STUFF, or at least I don't.

    About your string rants: there is a C++ header called <string> (string.h is the old C one) which defines an object-oriented string type, std::string, and it already overloads the appropriate operators--you concatenate with operator+() or operator+=(). If I want to do complicated string operations I use Perl scripts embedded in my C++ code--which is very easy to do, since that excellent language was written in C.

    The reason there is no exponentiation operator is that they ran out of operators--operator^() is already taken for bitwise XOR.

    Finally, check your UNIX makefiles and note most of them call a C compiler--why on earth they don't use C++ I will NEVER know.
    But all in all, this kinda thing is a personal preference issue; you should have figured out mine by now.
    --Verde

    DETECTING WINDOWS USE DOWNLOAD SLACKWARE LINUX
    Preprocessors (none / 1) (#562)
    by ruylopez on Tue Jun 15, 2004 at 10:31:38 AM EST

    About your rants about the preprocessor: a programmer worth his salt DOESN'T USE THAT STUFF, or at least I don't.

    I think you should re-read his section on portability, specifically: "If it weren't for the C preprocessor, then it would be virtually impossible to get C to run on multiple families of processor hardware, or even just slightly differing operating systems."

    If you're running a C program on only one OS, using only one chip, then you may be able to get away with using few or no preprocessor directives. If you're writing a C program that will be used on different platforms, however, it is a different story. For example, the fcntl() call on most UNIXes is called ioctlsocket() on Windows. Actually, 99% or maybe even 100% of the #ifdefs I've ever done were due to Windows' desire not to follow standards, but that's how it is. I generally develop on Linux, testing the code out once in a while on Windows, where I will sometimes be reminded of things like: sockets are closed not with close() but with closesocket() on Windows.

    [ Parent ]
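
    (A small sketch of the kind of preprocessor shim being described; the wrapper name sock_close is made up for illustration:)

    /* Hypothetical portability wrapper for closing a socket. */
    #ifdef _WIN32
    #include <winsock2.h>
    #define sock_close(s) closesocket(s)
    #else
    #include <unistd.h>
    #define sock_close(s) close(s)
    #endif

    /* Callers just write sock_close(fd); and the #ifdef lives in one place. */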

    Hilarious (none / 1) (#550)
    by loqi on Fri Feb 13, 2004 at 06:48:58 PM EST

    This is even better than that "Why C++ is the coolest language" article. Keep taking him seriously, folks, it's got me in stitches.

    that's strange... (none / 0) (#563)
    by busfahrer on Mon Jul 12, 2004 at 02:58:17 PM EST

    ...he listed all the reasons for which I like C.
    --
    GCS d s:+ a19 C++ UL P+>P++ L+>L++ E- W++ N+ o? K? w+>w++ O! M- V? PS+ PE-- Y+ PGP t 5? X+ R(R+) tv b- DI D++ G e h! y
    THE BEST PROGRAMMING LANGUAGE (none / 0) (#564)
    by THE TRUTH on Sun Jun 12, 2005 at 06:18:40 PM EST

    EVERYTHING YOU CAN DO IN C (C++, C#, OBJECTIVE-C ...) YOU CAN DO IN PASCAL (FREE PASCAL, OBJECT PASCAL AND DELPHI), EXCEPT THE ERRORS THAT THE C COMPILER ALLOWS. :))

    THE SIZE, THE PORTABILITY AND THE SPEED OF THE CODE IS SIMILAR IN ALL COMPARABLE LANGUAGES.

    I HAVE WORKED IN ALL OF THE PROGRAMMING LANGUAGES ABOVE BUT NONE COMPARES TO DELPHI.

    DELPHI RULZ!!!!!!!!!!

    IT IS THE FUTURE. TRY IT! IT'S THE BEST!:-)

    C IS OUTDATED, PASCAL HAS EVOLVED.

    Whoa, what the shit? (none / 0) (#565)
    by Patrick Chalmers on Sat Oct 15, 2005 at 12:41:27 PM EST

    This story still hasn't been archived? Laughing online!

    BTW, hi bleep.
    Holy crap, working comment search!
