This is the C programming language item. The C programming language has a long history and is intimately tied to the Unix environment, where it was initially created. Indeed, one of the major motivations behind C's development was to have a systems programming language in which to implement Unix!

But the story of C begins several years before Unix, with the CPL language developed at Cambridge University, and its descendant BCPL, which was ported to the MIT MULTICS system running on the GE 645 mainframe. It was under that system that Bell Labs researcher Ken Thompson encountered BCPL. After creating the embryonic Unix system, Thompson set out to re-implement BCPL, but found that the PDP-7 system he was working on was too cramped to support the full language, which led to a simplified version called B. Dennis Ritchie took this early work and extended it, adding structs and types, eventually arriving at the first versions of C, in which he rewrote the Unix kernel.

A much more detailed history of the language appears here: http://plan9.bell-labs.com/who/dmr/chist.html

The Wikipedia article on C is also quite interesting: http://en.wikipedia.org/wiki/C_programming_language

Today, C has been standardized, and the current version of the standard, referred to as C99, is supported by most compilers (though some lack the more esoteric features). C is an extremely portable language, and C programs will often run without modification on vastly different platforms. C is also a rather low-level language, providing few high-level constructs in the language itself, preferring instead to implement most common subsystems, such as I/O and generic data structures and algorithms, as libraries. Because of its portability and low-level nature, C is often called a "portable assembly language" and is a target for higher-level language compilers: those compilers generate C, which is in turn compiled and linked by the underlying system's tool chain.

C is frequently criticized as being too low level, and it is easy to write incorrect programs in it. It relies heavily on things like pointers and implicit type conversions, which are well-known sources of bugs for unwary programmers and which make compiler optimization difficult. It lacks higher-level constructs such as classes and exceptions; that is part of the low-level nature of the language, but it is often annoying for application programmers, and without special care C programs can quickly become unwieldy. But C was not designed for that type of environment. It was, and is, a systems programming language created by master programmers for advanced programmers. Its widespread adoption and popularity are due largely to the academic impact that Unix had from the 1970s through the 1990s and today, and to the fact that, if care is exercised, C can be astonishingly efficient in terms of both space and time.

Today, C is being supplanted as an applications development language by newer, safer, higher-level languages. Yet it endures, and new code is written in it all the time. The open source movement, which largely revolves around Unix-like systems, has breathed new life into it, and it is well worth a place in any programmer's toolkit. This is the place to talk about C, its implementations and standards, and other things related to this astonishingly successful language.
52 responses total.
I'm taking a C class this semester. The following text is from an email message that I sent to my college tutor and a few programmer friends. The tutor hasn't written back yet; I think he may have been eaten by small dogs. ---------------------------------------------------------------------- C seems to me generally less readable, less consistent and a little more "clunky" than Pascal, although there are some similarities, such as referencing members of a struct (we called them "records" in Pascal) in the format cake.sponge, cake.icing, cake.filling etc. Because I'm used to pointers in machine code and assembler, it felt odd to declare them with the data type of the object to which they point, rather than a data type just large enough to hold the pointer itself. I daresay the compiler reserves just enough space for the pointer though. C's access to pointers and addresses could be used to write insanely bad code, but I can see that it also grants a lot of flexibility and power that would be useful in embedded and systems programming. Come to think of it, C even feels like Unix in that it's terse, yet powerful. Pascal feels more like DEC VMS in that it's more readable and friendly, but takes longer to type ;-)
I think that's fair. Pascal comes, originally, from a CDC background, which may
have more in common with VMS than with Unix; a few weird CDC-isms
remain in modern Pascal: the packed data types, for instance.
With respect to pointers, C makes you declare the type associated with the
pointer so it has some idea of what you're pointing at, not necessarily so
it knows how much space to reserve for it (on some machines that doesn't
matter, though on some it does). What I mean by this is that the compiler
uses the type to figure out what to do when you say "p + 1", where p is a
pointer type. If p points to an array of characters, and the size of a
character is 1 addressing unit, then "p + 1" is the value of p plus one. On
the other hand, if p points to an array of, say, integers, and an integer
occupies 4 addressing units, then "p + 1" is the value of p plus
*four*. Why is this? Why not just let the programmer deal with it him or
herself? Because of the way that arrays are implemented: internally, a[i]
is the same as *(a + i). Incidentally, a[i] = *(a + i) = *(i + a) = i[a],
and a fun thing to play around with is the following:
#include <stdio.h>
int main(void) {
    char a[] = "Hi there";
    int i;
    for (i = 0; i < sizeof(a) - 1; i++)   /* - 1 skips the terminating NUL */
        printf("%c\n", i[a]);
    return 0;
}
Does that look weird? It is, but it'll compile and do what you may expect:
addition is commutative, and this is a good way to demonstrate the parity
between arrays and pointers.
But more to the point, if C required the programmer to specify explicit type
widths when incrementing pointers, it wouldn't be very portable across
platforms where type widths vary. For instance, if you hypothetically
always incremented integer pointers by four, because that's how wide
integers are on one platform, what's going to happen when you need to compile
it for a platform with integers that are 8 units wide? Or 16, 2, etc.? You
would have to find all the places that used pointers to iterate over integer
arrays and modify them, or define weird constants to increment by, or
something of that nature. Of course this would extend to other data types
as well. Eww. But I think that's the primary reason you declare types for
pointers.
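To see the scaling concretely, here's a minimal sketch (mine, not from the posts above). Subtracting char pointers yields a distance in bytes, so it prints how far "p + 1" actually moves for an int pointer versus a char pointer:

#include <stdio.h>

int main(void)
{
    int nums[2];
    char chars[2];
    int *ip = nums;
    char *cp = chars;

    /* p + 1 advances by sizeof(*p): the compiler scales pointer
       arithmetic by the size of the pointed-to type. */
    printf("int step:  %zu bytes\n", (size_t)((char *)(ip + 1) - (char *)ip));
    printf("char step: %zu bytes\n", (size_t)((char *)(cp + 1) - (char *)cp));
    return 0;
}

On a machine with 4-byte ints this prints 4 and 1.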
I taught C a lot. My students' minds were always boggled when I pointed out that i[a] was equivalent to a[i]. Putting the intelligence about type widths in the compiler, rather than making it the programmer's responsibility as had been the case in some earlier "system programming" languages that incorporated pointer arithmetic, was a stroke of genius on the part of the designers of C.
It does sound like a good idea. It just felt strange to someone more used to machine code and assembler.
C's portability is often overstated. Trivial programs are portable. Programs that do anything more complicated than standard I/O generally are not, because the system libraries are not standardized. This has led to abominations like ./configure scripts that try to automate the process of fixing up the program for each system's eccentricities.
Regarding #6; to pick a nit: the standard libraries *are* standardized. But many other useful libraries are not (or, rather, are standardized by different standards. For instance, POSIX and Single Unix include sockets, while Windows has winsock, which is slightly different. Both are "standard," yet they differ. That's the nice thing about standards: there are so many to choose from). I've found that, if you constrain yourself to a reasonable subset of available libraries that conforms to, say, POSIX, you can write quite portable programs that don't require things like configure. Ironically, the problem autoconf was designed to solve has largely gone away through near-universal adoption of C89 (the 1989 ANSI and ISO C standard - I believe ISO differs from ANSI only cosmetically, but they may have added a preface or something), POSIX and X11R6. Now it's largely used to handle differences between other, higher-level libraries (what version of GTK+ is installed? etc.). Even where I need to account for system differences, I've found it best to implement some intermediate compatibility layer, or emulate an existing, standard interface, rather than resort to autoconf. But I digress.... C can be portable if used with care. If not, you'll have problems, and probably more so than with other languages!
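As a rough illustration of that "compatibility layer" idea (a sketch of mine, not from the post; the net_* names are made up), you can hide the POSIX/winsock differences behind one small header and have the rest of the program call only the net_* functions:

/* net_compat.h - hypothetical shim hiding POSIX/winsock differences */
#ifdef _WIN32
#include <winsock2.h>
typedef SOCKET net_sock_t;
/* Winsock needs explicit startup before any socket calls */
static int net_init(void) { WSADATA wsa; return WSAStartup(MAKEWORD(2, 2), &wsa); }
static void net_close(net_sock_t s) { closesocket(s); }
#else
#include <sys/socket.h>
#include <unistd.h>
typedef int net_sock_t;
static int net_init(void) { return 0; } /* POSIX needs no startup */
static void net_close(net_sock_t s) { close(s); }
#endif

The calling code then compiles unchanged on both platforms.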
Is it required that the sizes of char and int types be the same in C?
No, it is not; in fact, the standard guarantees that sizeof(char) is exactly 1, and int is usually wider. For instance, on grex, sizeof(char) = 1 and sizeof(int) = 4; this is perfectly legal.
Okay, have never seen that code before. Pretty funky. . .
Oh, that wasn't real code, just pseudo-code. You can't assign to the sizeof
operator; I'm just saying that the two are already different. In "real" C,
you could run the following on grex:
if (sizeof(char) == 1 && sizeof(int) == 4)
    printf("Hey, that works...\n");
and it would print "Hey, that works...\n".
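Or, to just print the sizes directly, here's a complete little program (a sketch of mine, using C99's %zu format for size_t):

#include <stdio.h>

int main(void)
{
    /* sizeof measures in units of char, so sizeof(char) is 1 by definition */
    printf("sizeof(char) = %zu, sizeof(int) = %zu\n",
           sizeof(char), sizeof(int));
    return 0;
}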
Re: 11- was talking about the a[i] == i[a] stuff, sorry. :)
Oh, okay.
I've lost my momentum in the C class because the CBT software is shoddy and yet again my lecturer has gone AWOL (different lecturer, different college). He's not replying to my emails or voicemail messages.
This response has been erased.
I have a strange problem when I compile with GCC in an xterm. Often the names of identifiers in the error messages are replaced with "â", an a-with-a-circumflex character. This doesn't happen in other terminals. It seems like it must be some kind of character encoding problem. Has anyone else run into this?
What's your locale set to?
en_US
Hmm.. Wouldn't think you'd be getting funky international characters from gcc on purpose, then. In other terminals, are the values that are munged in xterm marked up in some way (e.g. bolded, colored, etc..)? I haven't run into that. As a practical suggestion, take a screenshot and post a query to a gcc forum with a pointer to the screenshot and I bet you'll get an answer pretty quickly.
In other terminals they're surrounded by open and close single quotes. I'm starting to think MacOS X's xterm is buggier than most.
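For what it's worth, that's consistent with an encoding mismatch rather than an xterm bug: GCC quotes identifiers with Unicode single quotes when it can, U+2018 encodes in UTF-8 as the bytes 0xE2 0x80 0x98, and a terminal decoding those bytes as Latin-1 shows 0xE2 as â. A tiny sketch (mine) that emits the raw bytes so you can reproduce the effect in any terminal:

#include <stdio.h>

int main(void)
{
    /* U+2018 (left single quote) encoded as UTF-8: 0xE2 0x80 0x98.
       A Latin-1 terminal renders 0xE2 as 'â' and mangles the rest. */
    const unsigned char quote[] = { 0xE2, 0x80, 0x98 };
    fwrite(quote, 1, sizeof quote, stdout);
    putchar('\n');
    return 0;
}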
I would be happier if I could just get xterm /on/ the MacOS X machine I have at home. MacOS X "Terminal" works after a fashion, but the colours are so screwed up that some programs end up printing white text on a white background.
Why can't you get xterm?
He can, and the "terminal" program can emulate it.
Re: 24 xterm probably requires the presence of an X server, which is another useful thing that's missing from the machine in question. I don't have the MacOS X install disc either.
You might want to look into http://www.finkproject.org/ which allows installation of UNIX-like (and Linux-like) software on Mac OS X (alongside the BSD/Mach stuff). Most people get it primarily so they can have the X Window System on the Mac.
I just installed X11 that I downloaded from Apple and it worked fine.
one thing: u lost the central topic of this forum :D :D
this forum is supposed to be about the C language
printf("Hello World");
C'ya
Hi Temo. Discussions here often change topic. Where in the World do you live?
mehico.. could be tor though and jvmv in disguise..
Holy crap it's Pete. PFV What's up?
Re resp:23: You can change the background color in Terminal. BTW, the version of X11 that comes with Leopard is horrendously buggy. I ended up installing XQuartz from here, to get a working version: http://trac.macosforge.org/projects/xquartz
i live in the same world as u do... but come on.. for other topics there exist other forums, right?
does somebody know books about "Object-Oriented Programming" in C? Right now I'm reading "Object-Oriented Programming with ANSI-C" by Axel-Tobias Schreiner, but I want to know if there are others. cheers :) PS: Sorry if my english sucks :P
Most object-oriented programming books for C programmers are probably going to deal with learning C++ or Objective-C or one of the other C language variants rather than try to impose object-oriented constructs on a language that isn't intended to support them.
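That said, the flavor of what Schreiner's book does can be shown in a few lines: plain C can fake a "class" with a struct that carries data plus function pointers acting as methods. A minimal sketch of mine (the shape/rect_area names are made up for illustration):

#include <stdio.h>

/* A struct holds the data plus function pointers that act as methods. */
struct shape {
    double w, h;
    double (*area)(const struct shape *self);
};

static double rect_area(const struct shape *self)
{
    return self->w * self->h;
}

int main(void)
{
    struct shape r = { 3.0, 4.0, rect_area };
    printf("area = %g\n", r.area(&r));  /* "method call": prints area = 12 */
    return 0;
}

Real designs, Schreiner's included, add constructors, a shared class descriptor, and inheritance by embedding, but the basic mechanism is roughly this.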
I recommend UNIX Scripting
tod, scripting is good, but sometimes you need something faster and more powerful, things that C can give you.
This response has been erased.