Kuro5hin.org: technology and culture, from the trenches

What ever happened to...

By maketo in Technology
Tue Dec 12, 2000 at 07:29:35 AM EST
Tags: Software

...the joy of coding up a new demo, or implementing that compression algorithm, or reading that magazine article about the new Borland compiler? I bought my first computer (a Commodore 64) exactly twelve years ago. Then an Amiga followed, then an AT, and so on, with even a short excursion to a Motorola 68030-based HP 9000/360. Even before I had a machine, I was reading all those computer articles about the newest best trick for coding up a scroller in asm, or about how to directly access the video memory on your new VGA. But now....


....now those times are gone. Computer magazines, even the ones that claim to be made for your next-door Joe Programmer, even they ramble about the reusable components of Java, the new better and improved version of Perl, or the latest craze, OO Python. Now, to manage a web site, you need to implement it in Zope, or Cocoon, or AxKit. But you have to learn these frameworks first. It is not enough for your data to be in just any file; it now has to be in XML. We all sit marveling at the nice web page on the Net and the PHP or god-knows-what-today technology used to make it all look the way it does. Your average Joe Manager driving the Mercedes has it all nicely prepared for you. You, the programmer, are supposed to be maximally productive. All software you see online has to have certain functions. We now run ten projects at a time. Run-of-the-mill software. Now the professor teaches you the design of a complex piece of software, since, didn't you know, the problems are getting more and more complex and the solutions too. No more "code it yourself so it can run in the tightest possible space and the least time"; now you just "plug and play". No one has the time to stop and reflect.

There was a time when I learned my first programming steps from a computer magazine. The people writing there were literally wizards. There was something intriguing about coding up a fast search for a specific word in a file, or writing a TSR program to extend your keyboard buffer. Now I find myself reading countless books and magazines just to stay on top of the new interface to the new component model of the new <BUZZWORD OF TODAY>.

And it goes on and on. The magazines are now specialized. There are languages with off-the-shelf components to code up everything in No Time (tm). We are all downloading, developing, adding, bloating, exchanging....

I don't know if this is good or bad. I just know I am digging up my old HP 9000/360 to write something cool on it, in assembly language. Maybe something that boots itself. Maybe something new. I sure hope I will spend countless nights writing code to make it tighter and faster for the poor 8-meg machine. I will do that because, like all the Robin Hoods of today who know that chivalry in fact _is_ dead, I know real programming is dead too.

What ever happened to... | 95 comments (95 topical, 0 editorial, 0 hidden)
Hell yeah. (3.11 / 9) (#1)
by pb on Tue Dec 12, 2000 at 01:48:57 AM EST

This should probably be a rant (Op-Ed, that is), but I voted (+1, Front Page) anyhow, because I completely agree.

I started on the Commodore 64 too, and I was amazed at the things people could do on those machines. I admired the small, fast, beautiful programs of yesterday. And I wonder who will be left to code them for tomorrow.

There are still some real programmers out there, though. And the stuff they do is still amazing. I'm sure we can still learn from them, and I'm sure that they are still learning, every day.
---
"See what the drooling, ravening, flesh-eating hordes^W^W^W^WKuro5hin.org readers have to say."
-- pwhysall
set platform (4.33 / 3) (#25)
by sety on Tue Dec 12, 2000 at 09:56:33 AM EST

I'm just commenting on your and others' use of yesterday's computers as an example:

The C-64 was static. That is why amazing things were (and still can be) done with it. The platform never changed. The C-128 came out, but didn't really catch on except for some word processing. I got my first C-64 in 1982 and continued to use it until about 1990, when I got an Amiga. So for 8 years or so a mainstream computer didn't change at all. Nothing. No new sound cards, no new graphics cards, no new memory. So everything could be hard-coded and optimized, and you could re-use most of this optimized stuff again.

The Amiga was the same story. The Amiga 500 with 512K was the standard, although porting games from IBM PCs eventually forced that up to 1 meg with an expansion card.

So the point is that computers of the past were static, like the game consoles of today. If people want to be "cool" programmers, go hack on a console and make some games. The PS2 won't change anytime soon, so you can make a game for it today and it will run fine in 7 years' time.

Today's PCs are a hodgepodge of hardware. Abstraction and generics are a necessity.....

[ Parent ]

There's a reason for that... (3.25 / 4) (#28)
by pb on Tue Dec 12, 2000 at 10:21:11 AM EST

There wasn't anything for the Commodore line of computers after the C128. However, the line did go quite a ways; I had two disk drives, for example. Much better than a VIC-20 with one tape drive, or a C64 with only cartridges...

The Amiga is a much better example, because that had a whole line of different machines and chips; it was definitely more upgradable.

But yeah, you're right. It's a whole lot easier to optimize for a static platform. That doesn't mean that people don't make assumptions like that today, though. In fact, anytime you find an application that breaks under a later version of the same OS (think, say, a V86-mode DOS box in Windows running an old DOS app), someone probably made an assumption that didn't hold true. I can't get sound working for the Future Crew demo, for instance, because it needs real Sound Blaster hardware there...

Today's PCs generally provide an interface of some sort, but it's often too little, too late. VESA 1.2 (and now 2.0) would have been great if more cards had supported them. Sound Blaster emulation doesn't cut it, virtual DOS boxes aren't enough... And so you have to write drivers for everything, and trap random stuff that used to work, and hope people use the same interface. So basically we have none of the speed hacks of a static platform, and none of the benefits of an actual portable platform... ;)
---
"See what the drooling, ravening, flesh-eating hordes^W^W^W^WKuro5hin.org readers have to say."
-- pwhysall
[ Parent ]
Embedded... (3.00 / 1) (#45)
by tzanger on Tue Dec 12, 2000 at 04:07:01 PM EST

And I wonder who will be left to code them for tomorrow.

The embedded designers will do this. Take the Microchip PICs, Atmel AVRs, the venerable 8051s, and Motorola 6800 (not 68000) derivatives.

Personally I code for all of the above, but seem to prefer PICs. Small, fast and efficient coding is a necessity there, not just a desire.



[ Parent ]
Damn... (3.14 / 7) (#2)
by fluffy grue on Tue Dec 12, 2000 at 02:07:20 AM EST

...now I REALLY need to get around to writing the software renderer for Solace. :) Maybe over the winter break I'll finally break down and do it...

I was intending to write a software renderer all along, by the way; it's a pervasive architecture decision I made. I was intending, at the very least, to reuse the triangle fillers I wrote for MAGE (my first 3D engine, completely software-only). And I'm getting so sick of the brokenness of OpenGL in Linux these days that maybe I really should just get around to writing the software renderer so I can, at the very least, keep experimenting with stuff without things fouling up in ways that I have no control over. It's been so long since I've done any really low-level graphics programming that I'm not sure whether I can do a competent job of it anymore, though...

Eh, like riding a bicycle, or something. I still know how to write fast interpolation code, fragment operations aren't exactly difficult to do... I can manage, I'm sure.
--
"Is not a quine" is not a quine.
I have a master's degree in science!

[ Hug Your Trikuare ]

Software rendering ... (3.66 / 3) (#22)
by StrontiumDog on Tue Dec 12, 2000 at 09:35:34 AM EST

... speaking as someone who has an engine or two listed on the 3D engines list, I must say software rendering suffers from some serious problems:

One is bandwidth: I can write (and have written) a software renderer that can compete with hardware for 8 bpp 320x200 viewscreens @ 30 FPS. A 1024x768 32 bpp screen at 120 FPS (think Quake 3 on a high-end PC + GeForce) is impossible to emulate in software -- the bus simply can't shove pixels from system memory to video memory that fast.
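A quick back-of-the-envelope check, assuming 4 bytes per pixel: 1024 x 768 x 4 bytes is roughly 3 MB per frame, so 120 FPS needs roughly 360 MB/s -- nearly three times the 133 MB/s theoretical peak of a 32-bit, 33 MHz PCI bus, before the CPU has done any actual rendering work.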

Caching is the second: with large framebuffers and textures, your CPU cache will be seriously thrashed. This means that system RAM speed is going to be the limiting factor for real-time performance, and that is still at the 100-133 MHz stage.

Correctness: it is harder than it seems to write sub-pixel-accurate renderers and shaders that can still maintain decent speed. It took a long time for 3D-chip driver writers to wise up to the necessity of accuracy, but they're finally getting their act together. When I pull out old software-rendered games and demos, the first thing that strikes my eye is the inaccuracy: the jaggies, the jumping pixels, the lack of filtering and anti-aliasing, the cracks between polygons, no mipmapping, etc.

The sheer amount of work: back in the old days, by the time I had finally gotten my 3D renderer working I was so sick of coding that I let the rest of the engine slide, which is why there are thousands of renderers on Karsten Isakovitch's list but few decent engines.

OpenGL extensions offer some solace, though. What would really be nifty would be a lower-level standardized API aimed at programmable graphics DSPs or FPGAs -- standardized to prevent Glide-like fiascos or incompatible proprietary lock-ins to hardware. Control over the accumulation, stencil, z, alpha and frame buffers, for instance; customizable per-pixel shader routines; fine control over hardware T&L so that hardware LOD and curved surfaces can be implemented at any level of sophistication desired.

But I think hardware 3D graphics are here to stay, and software rendering is gone forever.

[ Parent ]

Yeah, I know all this (4.00 / 1) (#39)
by fluffy grue on Tue Dec 12, 2000 at 02:49:06 PM EST

I've written software renderers before, and yeah, I'm well aware of all of these issues. The bandwidth one is the worst; MAGE ran at ungodly speeds at 320x240, but when I upped the resolution to even 640x480 it started to choke (much more than the 1/4 framerate that would be expected, due to cache thrashing). And yeah, correctness is a pain. I don't think I'm going to bother with perspective correction or subpixel accuracy - I'm just sick of the G400 DRI driver being in a perennially broken state.

As far as the sheer amount of work goes, I've already gotten the rendering engine written, and until I finally get around to working on the clientside stuff (which is hard to do with broken OpenGL), a minimal implementation of a software renderer for my rendering framework would be a Good Thing. And also some spare-time fun.

I don't disagree with your last sentence, either. I just want to finally write the software backend, though. It's something I promised myself I would do eventually.
--
"Is not a quine" is not a quine.
I have a master's degree in science!

[ Hug Your Trikuare ]
[ Parent ]

/me wonders what if... (3.21 / 14) (#3)
by Justinfinity on Tue Dec 12, 2000 at 02:24:51 AM EST

...the pilot of an airliner didn't have an intricate knowledge of his plane. Would you want to fly with him if he didn't know what everything did, or at the very least how to find out in a big hurry for the really technical things? I sure wouldn't.

So why do people continue to use products without having any knowledge of how they work? I don't claim to know everything about computer systems and how they work. I do, however, know enough to find out the information I don't know. And I use this little power called inference to read between the lines and figure things out. When I try to help my computer-illiterate friends fix something, they don't realize that 90% of what I tell them is a guess based on my knowledge and past experiences. Even when I work on my own machines, half of what I do is experimentation. One of my mother's friends called me today to help with an Outlook e-mail virus. I basically said "tell your husband to buy an anti-virus program, or start over from scratch", because I didn't feel like explaining how booting off a floppy would keep the virus from being loaded (or how it possibly might be loaded anyway and she'd really be screwed). My boss called me last night asking if the IDE port on the SoundBlaster 16 PnP I just gave him could be causing an error in Media Player. I just told him to read the damn help, because I didn't want to explain the difference between ISA, PCI, and IDE, and how sub-systems physically on the motherboard can be on the PCI bus, etc., for the 15th time.

OK, back to the topic. If everyone had to do a little programming, the whole world would be better. I'm not talking about "Hello World." I mean something real. Program a graphics display program, or even a text reader. Hell, I'd be happier if Reading The F* Manual was a requirement before buying a computer. We all need a license to drive, and since the gov't thinks computers are a "huge threat" (sic) to society, why not have something to prove you know enough to use the machine? And no age limit, of course (that's for qslack :-P ).

Most "geeks" spent most of their adolesence coding in someway, or building hardware, or both, or even just read the computer mags and books until they got their own machine. When we got our second-hand Commodore 64 on Christmas about 8 years ago, I was ecstatic. Especialy since my dad also got me a BASIC programming book. Ever since then, I've been compelled to learn as much as I could, and not only about computers, about everything. I cook for living currently, and since I have no formal culinary training I'm constantly asking what certain spices do, and how a certain food cooks the way it does. Even though I most likely will not be continuing cooking fulltime for much longer, I still want to learn.

OK, back on topic for real this time. Real programming does seem to be dead. Laziness has killed it. The plug-and-play ideal has been taken to the extreme. People always say "Oh, we have the CPU power, we don't have to optimize so much anymore." Screw that!

According to the specs, 128 MB of RAM is recommended for Mac OS X. 128 MB!! If the OS has a footprint so huge that you need 128 MB, something is wrong IMO. I understand that systems are getting more complicated, but come on. And you know most of that memory will be used for eye-candy. Which brings me to the next point. Real programming is dying because Real programming (with exceptions) usually doesn't give you a result that any Joe Schmoe off the street will notice and be awed by. Coding a new compression algorithm may get you faster compression and great ratios, but that only looks good on charts and graphs. The [stupid, useless, wasteful] graphical effects in OS X do grab people's eyes and suck them in. Granted, a lot of cool graphical effects are done using Old Skool Real Programming, but the modern stuff is done using packaged code, usually bloated already.

Enough ranting. maketo, I agree. Yes, Real programming does seem to be a lost art. But I feel it will make a comeback. We are coming up on a point in history where a lot of things are going to be changing. Barring a nuclear war, asteroid, massive EMP, or other catastrophic event, the human race will be changing, very soon I hope and believe.

-justin

You know... (2.40 / 5) (#11)
by Elendale on Tue Dec 12, 2000 at 04:42:15 AM EST

Windows ME (and by extension, I assume, Win2k) starts up with 80 MB of RAM used. I know it can run fine with only 64 megs, but that is a lot of memory. It seems like every new Windows requires another stick of RAM. I also think that much of the Mac OS's footprint is spent (for better or worse) on backwards compatibility.

-Elendale (just my thoughts)
---

When free speech is outlawed, only criminals will complain.


[ Parent ]
Windows (2.50 / 2) (#35)
by nstenz on Tue Dec 12, 2000 at 01:20:01 PM EST

<short rant>Windows 95/98/ME != Windows NT/2000. The codebases split at Windows 3.1, and 95 implemented most of the Win32 API from NT later on... they're now trying to re-integrate the codebases so 2000 can take over the world, but right now they're still separate and quite a bit different.</rant>

But anyhow... Windows 2000 Server requires 128 MB, and I don't remember about Professional (but I think it's less). As for Windows 98 SE, I have many machines that will use 130-200 MB of memory with no programs loaded, straight from a cold boot. There may be a utility or two running, but nothing that would hog that much memory. I can only guess Windows is making use of the free memory as disk cache until the memory is actually needed, then swapping programs back into physical memory when necessary. Anyone else have an idea?



[ Parent ]
Actually (none / 0) (#57)
by Elendale on Tue Dec 12, 2000 at 09:18:03 PM EST

AFAIK, Win98 does not report memory usage correctly. Due to a bug (which was, I believe, fixed in NT/ME/2k) it reported that 100% or nearly 100% of memory was used. That might have been the problem, but then again I never claimed to be a Windows expert.

-Elendale
---

When free speech is outlawed, only criminals will complain.


[ Parent ]
Memory Usage in Win9x (none / 0) (#72)
by jfpoole on Wed Dec 13, 2000 at 09:13:21 AM EST

I can only guess Windows is making use of the free memory as disk cache until the memory is actually needed, then swapping programs back into physical memory when necessary. Anyone else have an idea?

Win9x does indeed use free memory as a disk cache until it's requested by a program. So, while the memory used by the disk cache is reported as used, it's actually available for use by applications. Note, though, that Windows will try and keep about 3 MB around at all times for the disk cache.

As for the memory-reporting problems another poster mentioned, there are some problems determining the amount of free memory in Win9x, simply because you have to take into account the amount of memory used by the disk cache. It's not hard, but it is another hoop to jump through.

-j

[ Parent ]

Fear the threat from AOL users! (2.25 / 4) (#15)
by lastwolf on Tue Dec 12, 2000 at 08:02:18 AM EST

"[..] since the gov't thinks computers are a "huge threat" (sic) to society, why not have something to prove you know enough to use the machine."

So, you think the ones who don't know anything about the machine are a threat to society? The only ones I can think of who might be are the so-called "geeks" and "hackers". And to me it seems they actually do know a lot about their systems. They care about them, want to understand them, build them and program them. They are the ones who understand networks; they are the threat to society. In the gov's eyes, at least...
Though I don't think most AOL users would be.


LastWOLF "Take your wings, go out and fly.
Learn, read and soar the sky."


[ Parent ]
Modern complexity (4.50 / 4) (#21)
by Aquarius on Tue Dec 12, 2000 at 09:34:20 AM EST

[What if] the pilot of an airliner didn't have an intricate knowledge of his plane. Would you want to fly with him if he didn't know what everything did, or at the very least how to find out in a big hurry for the really technical things? I sure wouldn't.
So, I take it that you never travel on planes, then?

Modern planes can't be flown efficiently without the help of computers. Fighter planes can't be flown at all without their help; you can fly a commercial plane solo without machine help, but the computers are constantly making minute adjustments to the fuel consumption and so on to make the plane fly as efficiently as possible.

Hey, let's re-extend that metaphor back into programming, whence it came. Programming is not fighter planes; you do not have to use toolkits, data file formats, and so forth. If you want to write an X application, you don't have to use Qt, or Gtk, or Motif, or any of the other windowing toolkits. You could use Xlib. Actually, while I think about it, Xlib is for wusses. Why not just open a socket and talk to the X server directly? Yeah! Actually, the hell with that. Just run your job as root, and write things directly into /dev/kmem! Yeah!

OK, that's an exaggeration.

You see the point, though.

If you feel that these abstraction layers (windowing toolkits, XML, sockets, whatever) are getting in the way of coding, then I fear that you have missed the point of the way coding is going. There's a repeated desire to not reinvent the wheel, to make things easier for coders, to provide a consistent UI for users, to let coding you do in one place be done again in another place, to not have to relearn all your skills every two years. Yes, Real Programming works a bit faster. Yes, there's an intellectual challenge in bumming another two cycles out of your app's main refresh loop. But making everyone code like that, all the time, throws us back into the horror days of incompatibility, when you couldn't change the database your data was stored in, because then all the programs that looked at the data would stop working, because they used the proprietary database query language instead of SQL. You had to write, test and debug your own data file parsers instead of just picking up someone else's precoded, tested and stable XML parser. You rolled your own window primitives for every project. Oh, look, I need a hash tree. Ah well, I can't use the one from the last project, because it was "optimised" to run faster, by which I mean that it wasn't generic. Gee, I'll just roll my own, I've only done that a thousand times, to quote jwz (out of context).
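To make the "precoded, tested and stable XML parser" point concrete, here's a minimal sketch using the expat C library (this assumes libexpat is installed and linked with -lexpat; the document and handler names are illustrative):

    // A sketch of "picking up someone else's precoded XML parser":
    // expat does the tokenizing, entity handling and well-formedness
    // checking; we only supply the two callbacks below.
    #include <expat.h>
    #include <cstdio>

    // Called at every opening tag; userData carries our indent depth.
    static void startElement(void *userData, const XML_Char *name,
                             const XML_Char **atts) {
        (void)atts;
        int *depth = static_cast<int *>(userData);
        std::printf("%*s<%s>\n", *depth * 2, "", name);
        ++*depth;
    }

    static void endElement(void *userData, const XML_Char *name) {
        (void)name;
        --*static_cast<int *>(userData);
    }

    int main() {
        const char doc[] = "<config><window w='320' h='200'/></config>";
        int depth = 0;
        XML_Parser p = XML_ParserCreate(NULL);
        XML_SetUserData(p, &depth);
        XML_SetElementHandler(p, startElement, endElement);
        // Parse the whole document in one call; the final argument 1
        // marks this as the last chunk of input.
        if (!XML_Parse(p, doc, sizeof(doc) - 1, 1))
            std::fprintf(stderr, "parse error: %s\n",
                         XML_ErrorString(XML_GetErrorCode(p)));
        XML_ParserFree(p);
        return 0;
    }

Rolling the equivalent by hand means reimplementing exactly the boring, bug-prone parts -- and testing them.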

Abstraction layers are there to save work and save rework. They might slow things slightly, and they might mean you're further from the bare metal than you might otherwise be, but they make it easier to build the kinds of applications that seem to be required, today. If you want to code on the metal, hack the Linux kernel. Or the NetBSD kernel. Or X graphics drivers.

Oh, and say hello to Mel for me.

Aq.

"The grand plan that is Aquarius proceeds apace" -- Ronin, Frank Miller
[ Parent ]
Re: Modern complexity (3.00 / 1) (#38)
by cezarg on Tue Dec 12, 2000 at 02:49:00 PM EST

If you feel that these abstraction layers (windowing toolkits, XML, sockets, whatever) are getting in the way of coding, then I fear that you have missed the point of the way coding is going. There's repeated desire to not reinvent the wheel, to make things easier for coders, to provide a consistent UI for users, to mean that coding you do in one place can be done again in another place, to not have to relearn all your skills every two years.

I think you're missing something important here. All abstractions are nice, and hierarchical design is necessary to manage complexity, but look at how software engineering went about it as opposed to the electronics industry. A flip-flop is still two logic gates no matter how complex your design, but a push button (with just simple functionality) can weigh anything from a few kbytes to a few hundred kilobytes depending on the platform and the level of bloat. And the beef of Real Programmers is that a push button in FLTK is just as functional as a push button in MFC. Why is the resultant executable ten times larger in the case of MFC?
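For what it's worth, the FLTK side of that comparison really is tiny -- a minimal sketch along the lines of FLTK's standard hello-world (the geometry and labels here are arbitrary):

    // A complete FLTK program containing one push button.
    #include <FL/Fl.H>
    #include <FL/Fl_Window.H>
    #include <FL/Fl_Button.H>

    // Callback invoked on each press; relabels the button.
    static void pushed(Fl_Widget *w, void *) {
        w->label("Ouch!");
    }

    int main(int argc, char **argv) {
        Fl_Window win(320, 200, "Push me");
        Fl_Button btn(110, 80, 100, 40, "Push");
        btn.callback(pushed);
        win.end();              // stop adding child widgets to the window
        win.show(argc, argv);
        return Fl::run();       // enter the event loop
    }

The functionality is the same as the MFC equivalent; the difference is in everything the framework drags along behind it.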

Just like you, I don't want to relearn new APIs every two years, but the current hype-driven IT world forces me to learn stupid new toolkits and productivity frameworks every six months, driving me crazy, forcing me to learn other people's designs (which are often very imperfect) and to work around bugs in all those 'cutting edge' toolkits. Frankly, these toolkits never live up to their initial hype and are getting progressively bigger, slower and less stable. If lines of code are anything to go by, my productivity today writing a 'soft realtime' app on a PIII 800 is no greater than when I was developing cool stuff on my ZX Spectrum back in 1984. To me the only difference is the amount of fun involved.

[ Parent ]

Airplanes and minimalism... (3.00 / 1) (#49)
by bgalehouse on Tue Dec 12, 2000 at 06:22:34 PM EST

Many airplanes, even Cessna 182s, have autopilots. No analogy is perfect, but I think that the autopilot is closer to the windowing toolkit than the automated fuel management system, or even the glass cockpit, is. What the pilot does changes when he is using an autopilot. On the other hand, electronic altimeter displays aren't so very different.

The most important button on any autopilot, the first button a good pilot looks for, is the off button. When the shit hits the fan, any pilot worth his salt wants a stick with all the functionality of the first airplane yoke he ever set hands on. How this functionality is implemented is irrelevant.

So I see the military fly-by-wire systems (without which modern fighters would spin out of control very quickly) as being far more like a JIT compiler, or a new processor - not a new language. They reimplement the old interface with new technology.

Most airliners cannot recover from a true stall. This is because the tail is high and the wings swept back; in case of a stall, the turbulence from the wings keeps the tail from doing its job. The 'solution' is to add a system which makes the stick shake when the plane approaches a stall - just like it shakes on approach to a stall in a light plane.

[ Parent ]

memory is cheap (2.80 / 10) (#4)
by enterfornone on Tue Dec 12, 2000 at 02:38:48 AM EST

Back when you only had 64k to play with, that sort of stuff was important. There's no point trying to find the best way to squeeze your demo into 128 meg. The world has moved on from that.

--
efn 26/m/syd
Will sponsor new accounts for porn.
Well maybe .... (4.00 / 6) (#7)
by farlukar on Tue Dec 12, 2000 at 03:12:07 AM EST

WordStar on the Apple II didn't take an eternity to load, like Word or StarOffice on my Duron 700 w/128MB do.
Having a lot of system resources shouldn't necessarily mean using them inefficiently.
______________________
$ make install not war

[ Parent ]
Raw speed is not important anymore. (3.14 / 7) (#10)
by Holloway on Tue Dec 12, 2000 at 04:39:35 AM EST

Yes, but the Apple II didn't have to worry about a zillion colours, more resolution, more than 4-bit sound, interrupts and drivers, or operating systems (with their current expectations), or cross-platform code, or the developers' expectations of code reuse and structure and portability, and varying input devices and widgets and OpenGL and TCP/IP and, um, well, others, yeah.

Of course there is just shoddy software too. But now we have the leg-room to spend on other things we value as important, like code maintenance and structure and high-level languages (I don't want to start another 'and' list, so I'll stop).


== Humans wear pants; if they don't wear pants they stand out in a crowd. But if a monkey didn't wear pants it would be anonymous

[ Parent ]

Yes and no. (4.42 / 7) (#8)
by bgalehouse on Tue Dec 12, 2000 at 03:15:49 AM EST

Memory and processor capacity don't enforce careful programming the way they used to. On the other hand, careless coding and poorly evolved design don't just lead to bloat. They also lead to instability.

Faster processors begat slower programs (with limited additional utility), which begat faster processors. I think that we might be nearing the limit of this cycle - not because we are at the limit of processor design, but because code bloat leads to instabilities. The computer now has the performance to manage more widgets than the programming team can.

[ Parent ]

Code bloat (none / 0) (#51)
by kagaku_ninja on Tue Dec 12, 2000 at 06:47:34 PM EST

Careless programming can lead to bloat, but not all "bloat" is due to poor programming. For example, if my language or library automatically performs range checks on array accesses, the code size will be larger, and the program will be slower. Is this being careless?

All of the following techniques can lead to less efficient design, but have enormous benefits: object oriented programming, highly abstract code, garbage collection, structured exception handling, generic programming, thread safety, location transparency...

As software becomes more complex, these techniques are needed precisely to avoid instability.
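As a concrete face for the range-check example above, in C++ the checked and unchecked accesses sit side by side, so you can see exactly what the "bloat" buys (a minimal sketch):

    #include <cstdio>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v(10, 0);

        // Unchecked access: compiles to little more than a load, but an
        // out-of-range index is silent memory corruption.
        int a = v[9];
        (void)a;

        // Checked access: a few extra instructions per access, but a wild
        // index becomes a catchable error here, not a mystery crash in a
        // totally different module later.
        try {
            int b = v.at(10);   // one past the end
            (void)b;
        } catch (const std::out_of_range &e) {
            std::printf("caught: %s\n", e.what());
        }
        return 0;
    }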

[ Parent ]
Actually (3.66 / 3) (#9)
by pwhysall on Tue Dec 12, 2000 at 03:45:30 AM EST

If you go to www.scene.org, and visit the Viewing Tips page, you'll find that the intro scene is still alive and kicking...
--
Peter
K5 Editors
I'm going to wager that the story keeps getting dumped because it is a steaming pile of badly formatted fool-meme.
CheeseBurgerBrown
[ Parent ]
Memory was never an issue (3.50 / 2) (#16)
by squigly on Tue Dec 12, 2000 at 08:06:55 AM EST

When I got my 486, I had a demo that required 4k. This was at a time when it was impossible to buy a PC with less than 4 megs. Some people keep memory usage down because they like to see how small they can go.

--
People who sig other people have nothing intelligent to say for themselves - anonimouse
[ Parent ]
Memory will always be an issue (3.50 / 2) (#29)
by pianoman113 on Tue Dec 12, 2000 at 10:44:30 AM EST

As memory size increases, we programmers will always find neat things to put in it. As far as demos are concerned it may now be a non-issue, but for commercial (and free) software there will always be a lot of little things to take up space. Sure, much of it is bloat and should be done away with, but that bloat could be replaced with some useful advancement in the complexity of algorithms, such as AI and graphical rendering.


A recent survey of universities nation-wide yielded astounding results: when asked which was worse, ignorance or apathy, 36% responded "I don't know," and 24% responded "I don't care." The remaining 40% just wanted the free pen.
[ Parent ]
I'm ashamed of myself (3.11 / 9) (#5)
by jesterzog on Tue Dec 12, 2000 at 02:47:42 AM EST

Here at home I boot to Windows around 80% of the time, and it was only just now that I realised how much of a slave I am to it.

I'm used to using unix systems in places that aren't home, where normally writing a short text-processing utility would be a two-minute job.

Unfortunately I don't have any nice interpreters installed here, and I deleted my DOS-based Pascal and C++ compilers a few months ago to free up space. I ended up having to copy and paste the stuff I wanted into a new table in a new Access database and write a VB script to do what I wanted.

VB has its uses, but this isn't one of them. It's not so much that that I'm worried about as the frustration of not having simple utilities available to do a simple job.


jesterzog Fight the light


Don't give in! (3.25 / 4) (#6)
by pb on Tue Dec 12, 2000 at 03:00:56 AM EST

If you're running up against that problem, at least install Cygwin.

If it isn't actually a Unix, then it's the next best thing.
---
"See what the drooling, ravening, flesh-eating hordes^W^W^W^WKuro5hin.org readers have to say."
-- pwhysall
[ Parent ]

And... (3.00 / 2) (#43)
by 0xdeadbeef on Tue Dec 12, 2000 at 03:47:11 PM EST

Get Perl and Python from ActiveState. Never touch VB again!

[ Parent ]
Time changes (3.90 / 10) (#12)
by Nickus on Tue Dec 12, 2000 at 06:12:57 AM EST

In my youth I coded a few demos on the C64. Back then you had to use assembler to make something "c00l". Today we have a lot more resources. Why should I tweak out the last percent of performance of an application when I don't have to? And let's talk about productivity... you simply couldn't create the things we have today using assembler (sure, in theory...).

Sometimes I miss the days when computers were simpler and you actually could understand your operating system in detail. But there is no need to live in yesterday.



Due to budget cuts, light at end of tunnel will be out. --Unknown
Is Chivalry Dead? (4.57 / 21) (#13)
by moshez on Tue Dec 12, 2000 at 07:19:11 AM EST

Hardly. The programmers of 20 years from now will be laughing/amazed when we tell them "In my day, there was no wimpy-assed abstraction for HTTP -- when we wanted state, we had to choose our own names for cookies and a scheme for arranging them, send them ourselves, and try to recover from lost cookies ourselves." Ever coded up a web application? It's not that different from programming on the Altair: you allocate the registers (cookies) yourself, you code up the interrupt (URL) handlers explicitly, and you put them in place yourself.
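A sketch of what "choosing your own names and scheme" looks like in a CGI program (the HTTP_COOKIE environment variable is the standard CGI convention; the cookie name "sessid" and the recovery policy are ours to invent, which is exactly the point):

    // Hand-rolled session state, CGI style: we parse "name=value; ..."
    // out of HTTP_COOKIE ourselves and invent our own register names.
    #include <cstdio>
    #include <cstdlib>
    #include <map>
    #include <string>

    // Split the raw Cookie header into a name -> value map.
    static std::map<std::string, std::string> parseCookies(const char *raw) {
        std::map<std::string, std::string> jar;
        if (!raw) return jar;              // no cookies at all: lost state
        std::string s(raw);
        size_t pos = 0;
        while (pos < s.size()) {
            size_t end = s.find(';', pos);
            if (end == std::string::npos) end = s.size();
            std::string pair = s.substr(pos, end - pos);
            size_t eq = pair.find('=');
            if (eq != std::string::npos) {
                size_t b = pair.find_first_not_of(' '); // skip "; " space
                jar[pair.substr(b, eq - b)] = pair.substr(eq + 1);
            }
            pos = end + 1;
        }
        return jar;
    }

    int main() {
        std::map<std::string, std::string> jar =
            parseCookies(std::getenv("HTTP_COOKIE"));
        // "sessid" is a register name we chose; recovering when the
        // client dropped it is our problem too.
        std::string sessid = jar.count("sessid") ? jar["sessid"] : "";
        if (sessid.empty())
            std::printf("Set-Cookie: sessid=new-session\r\n");
        std::printf("Content-Type: text/plain\r\n\r\nsession=%s\n",
                    sessid.c_str());
        return 0;
    }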

Now that we have an OS, we use it to manage data sent to other machines by hand. Each generation will have its own crop of "real programmers", who code without the infrastructure long enough to understand what the infrastructure must look like. They are the pioneers, always braving uncharted territories, terraforming them for the rest.

[T]he k5 troll HOWTO has been updated ... This update is dedicated to moshez, and other bitter anti-trolls.
The demoscene lives on (3.71 / 7) (#14)
by Dion on Tue Dec 12, 2000 at 07:58:00 AM EST

As an old fart in the Scene, I think I can safely say that the scene has been pronounced dead every year since people started coding - and that it has been true for every year except this one.

This year something new started to happen: sceners have started to realize that the [gl]amers are not the future sceners, and they have started to band together and just do cool stuff, instead of trying to impress everybody.

Assembly helped a lot by allowing the old-skool people to hang out in their separate area, and that produced the best entries that I have seen in years (I've been an organizer at TG, DH, SE and ASM since around '96). I saw the same thing happen, at least a little, at DH around 2 weeks ago and at SE2k this summer.

My point is that the art of coding cool routines just for the sake of it hasn't died, it has only been diluted by the other cool things that can be done with a computer.

Anyone still looking for a demoparty in this day and age should take a peek at these (disclaimer: yes, I'm an organizer at all of them):

  • Scene Event 2k: very small (250-300 guests), but very cozy and very scene-oriented, in Denmark
  • Dreamhack: nice small party (around 2700 guests) in Sweden
  • Assembly: very cool party (around 3000 guests) in Finland
  • The Gathering: more of an IRC get-together, but still nice; the world's largest, 4500+ guests, in Norway



Follow the money (3.25 / 4) (#18)
by pallex on Tue Dec 12, 2000 at 08:36:39 AM EST

I think the problem is that the people who used to spend hours of their spare time (outside school, college or work) coding demos can now spend the same amount of time putting together websites, or learning technologies which will earn them loads of money. Playing around with PCs for loads of money vs. playing around with PCs for no money. Hmmm. Not the hardest problem you're likely to have to solve...

[ Parent ]
Following the money (2.66 / 3) (#24)
by Dion on Tue Dec 12, 2000 at 09:52:53 AM EST

Yep, it's really as easy as that. Long ago (stares into the distance) there were really two ways of making money with your computer:
  • Code cool stuff for games: this meant having to code demos first.
  • Code boring stuff: anybody could do that, so there was no glory to be had.

Now you can whip something together in (HTML/Java/3D Studio) that will impress mortals as well as make you money.

[ Parent ]
Association of C & C++ Users (4.14 / 7) (#17)
by codemonkey_uk on Tue Dec 12, 2000 at 08:14:35 AM EST

Join the ACCU.

I know C isn't ASM, and some people will throw up their arms in disgust at C++, but the ACCU produces some fine programming literature, namely CVu and Overload - magazines that remind me of the good old days, when an algorithm was a decent topic of conversation. When you got printed listings. When you wrote in to point out an optimisation, and got a reply.

Anyway, I won't harp on. Check out the link; it might be what you're looking for.
---
Thad
"The most savage controversies are those about matters as to which there is no good evidence either way." - Bertrand Russell

Not dead, but maybe buried (4.25 / 8) (#19)
by mjs on Tue Dec 12, 2000 at 09:08:24 AM EST

My personal opinion is that we all suffer from the effects of selective memory. We remember the ends of the bell curve, not the middle. Most programmers in the 'old days' were doing much the same thing as most programmers are doing now: writing dull but profitable code. There are still experimenters but since the business world woke up to the obscene profit potential of software, the interesting stuff tends to get drowned out in a sea of earnings reports.

Basically, that's why I don't code for a living any more. I just couldn't get excited about getting up in the morning and tracking down yet another bug in Accounts Payable. Profitable? Quite. Fun? 'bout to the same degree as picking nose hairs.

The thing is, everyone needs to pay the rent. What's the first thing new CS grads do once they've got their ticket stamped? They look for the job which pays the maximum return (hey, dude: I've got loans to pay off!). The fun stuff -- the stuff that got them into computers to begin with -- becomes an evening-and-weekend gig, soon squeezed between the demands of 24x7 corporatism and family.

I'm not saying that it's a bad thing; I'm not bright enough to know or interested enough to care. If you keep looking for it, eventually you'll find it!

mjs

I don't think many people are understanding... (4.36 / 19) (#20)
by stinkwrinkle on Tue Dec 12, 2000 at 09:10:57 AM EST

I'm seeing rants against assembler, and raves about the demoscene, and I don't think that's what the author was talking about. I, too, remember the days when a computer magazine was more than a buzzword repository for suits. Back In The Day, when you bought a nifty piece of hardware, you didn't get a disk with a crappy thrown-together piece of driver software; you got (gasp!) real INSTRUCTIONS on how it worked, and enough documentation to be able to code your own uses for it! Nowadays you hear things like "That information is proprietary" and "Windows doesn't support it", which is sad. Computer magazines used to have real information in them; they were actually worth keeping for reference, what with algorithms and real computer science topics and all.

I guess the author and I are just nostalgic for the days when computer geeks, and ONLY computer geeks, were interested in computers. The pool has been somewhat diluted today. Computers have entered the mainstream, and while we old-skool freaks might miss the old days, we've benefited right along with everybody else. I got my 25th anniversary issue of Dr. Dobb's the other day and started thinking along these lines myself.... Sigh. Even user groups now have people who want to leverage Bean and servlet technologies to implement enterprise-level B2B solutions. It's enough to bring tears to the eyes of a guy who used to skip school so he could go buy the latest issue of Byte.

Sorry for rambling.

Ah, the Good'Ol'Days(TM) (4.00 / 1) (#66)
by Ming D. Merciless on Wed Dec 13, 2000 at 01:44:52 AM EST

when Steve Ciarcia cobbled together technological wonders every month in his Circuit Cellar! Let's not forget the hardware hacking that a lot of us did back then, too. I think this is a big part of the problem: in the '80s, we were all much closer to the hardware and thus had a greater understanding of the machine. Designing and constructing your own bank-switching memory extension for your VIC-20 gave you a hell of a better understanding of the machine and how to write code for it. This is something I strongly believe - that you must understand the hardware in depth to be a good programmer.

==============================================
A little slice of 1987 on the internet. Visit KAOS -- Central NY's premiere BBS. Multi-user, telnetable, Citadel/UX.
[ Parent ]
The industry grew up (3.80 / 5) (#23)
by leviathan on Tue Dec 12, 2000 at 09:39:40 AM EST

Don't get me wrong. Coding by the seat of your pants in the old days was what got me started. There are still pockets of this going on, especially in the demo scenes.

But I'll tell you what. I'm glad I can talk to the guy down the corridor and plug his code into mine after only reading the API for a few minutes - and he didn't even design the code for me. I'm glad I can read and write XML as easily as I used to read and write hex files - I'm getting future-proofing for free.

You can still code like you used to if you want to code 64k intros, but if you want to do something bigger, or newer, the old practices just don't scale as well.

--
I wish everyone was peaceful. Then I could take over the planet with a butter knife.
- Dogbert

Not dead yet! (3.55 / 9) (#26)
by keick on Tue Dec 12, 2000 at 10:04:41 AM EST

I'm not that old - 25 - but I was one of the lucky few who had access to computers at a real young age. I remember trying to code on a Timex Sinclair 1000 connected to a 12" B&W TV and a tape drive. Later I also ended up with a VIC-20, and even a C64.

I wasn't a member of any scenes then; I don't even know if I knew what they were. I was content, however, typing in the machine code for cool new games that came in magazines like Compute. Those were the days? Maybe, but I think today is more 'the days' than yesterday ever was!

Thanks mostly to the reluctance of the gov't aerospace sector to use newer technology, there are still jobs out there where the old ways still rule. I'm currently working on flight control software for the X-31. Ever heard of JOVIAL? Me neither, but my point is: this is the first professional job where I actually get to spend a few hours trying to tweak the last few cycles out of an ASM routine.

Just last week I was coding up an ASM routine to checksum memory, and got it 2.4 times faster than the original ASM.

Granted, I miss coding in Delphi 5, but the feeling you get when you've written a routine in pure ASM is wonderful!
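For flavour, here's the generic shape of that kind of tweak (not keick's actual routine -- a sketch in C++ rather than asm): summing a word at a time with the loop unrolled, instead of a byte at a time, is often where a naive checksum spends most of its cycles.

    #include <cstddef>
    #include <cstdint>

    // Naive version: one load, one add, one compare-and-branch per byte.
    uint32_t checksumBytes(const uint8_t *p, size_t n) {
        uint32_t sum = 0;
        for (size_t i = 0; i < n; ++i)
            sum += p[i];
        return sum;
    }

    // Word-at-a-time, unrolled 4x: wider loads, far fewer adds and
    // branches. Note this computes a *different* checksum definition
    // than the byte version; for a fixed flash image that's fine, since
    // you control both sides. Assumes p is 4-byte aligned and n_words
    // is a multiple of 4.
    uint32_t checksumWords(const uint32_t *p, size_t n_words) {
        uint32_t sum = 0;
        for (size_t i = 0; i < n_words; i += 4)
            sum += p[i] + p[i + 1] + p[i + 2] + p[i + 3];
        return sum;
    }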


demo secne isnt dead yet!!!1 (1.82 / 17) (#27)
by JeffK on Tue Dec 12, 2000 at 10:18:10 AM EST

ther are still peple riting lots of cool demmo porgrams in asembler and stuff!! i find themn on sites wen i am bowrsing and downlode them and run!

by teh way check out my new comic Levelord Solves a Crim at Somehting Awful!!!!1

ps does anyone now were i can find a new fire demo i love that siht!!!



Good enough... (3.60 / 5) (#30)
by chewie on Tue Dec 12, 2000 at 11:58:04 AM EST

Believe it or not, there are still programmers out here who really want to be the best at what we do. C, C++, Assembly, Python, Java - it doesn't matter, really. Good solid programming involves a well-thought-out design, a well-executed plan, and a thorough review of the work you completed. It's when you work for an employer or a team of developers who don't really care about what they're doing that programming gets dragged into the depths of "cookie cutter code." I recently left a very well-paid position at a growing publicly-traded company because this "good enough" mentality was encouraged under the guise of "code reuse."

I don't have to tell you how disappointing it is to see a project lead or library maintainer who has no solid experience in application or routine design, especially when you're looking forward to good mentorship. I've always wanted to be "the best" at what I do. I reveled in the days at college when I was the guy with "all the answers." Now it doesn't matter - or rather, it didn't at my last employment.

I agree with the sentiment that this "good enough" mentality seems to be spreading. I only hope that there are enough people like us, who want to know the intricacies of EVERY programmatic building block, to lead the would-be "cookie cutters" to a better product. It's just difficult trying to lead when you're at the bottom of the totem pole.


assert(expired(knowledge)); /* core dump */
There is a reason for high-level tools (4.12 / 8) (#31)
by andreas on Tue Dec 12, 2000 at 12:04:51 PM EST

Yes, back then I was writing demos too, and that was in assembler. I did all the dirty tricks, including self-modifying code, hooking to the PC timer chip to emulate copper tricks and play samples on the PC speaker, writing my own sound drivers from scratch, and so forth.

There is a reason I stopped doing this. After I had implemented triangle drawing routines, I wanted to add 3D transformations for real 3D graphics. I implemented the matrix multiplication in C, so I would get it right. Then I took the assembler code generated by the compiler and tried to make it faster. Imagine my surprise when days of hard work could only squeeze a few percent of speedup out of it. The total speedup of the whole program was not even measurable (see Amdahl's Law if you don't know it).

But implementing backface culling made the program 30% faster, and in C it was easy to do. That made me stop writing assembler and concentrate instead on the high-level aspects of the problem, because you can gain a lot more speed there. After I saw DOOM, I knew that I wasn't the only one thinking this way: DOOM is mostly written in C.
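For readers who haven't met it: backface culling is a one-line test per triangle, which is why it was so easy in C. A sketch of the screen-space version (this assumes counter-clockwise front faces; with a y-down screen the sign flips):

    // After projection to 2D, the sign of the cross product of two edges
    // gives the triangle's winding. Back-facing triangles can be skipped
    // before any shading or filling -- often a large win, since roughly
    // half of a closed mesh faces away from the camera.
    struct Vec2 { float x, y; };

    bool isFrontFacing(const Vec2 &a, const Vec2 &b, const Vec2 &c) {
        float cross = (b.x - a.x) * (c.y - a.y)
                    - (b.y - a.y) * (c.x - a.x);
        return cross > 0.0f;    // positive == counter-clockwise == visible
    }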

The key insight here is that the only thing more scarce than RAM and CPU power is cycles in the programmer's brain. And the complexity of a problem the brain can tackle is always limited, so it is better if I don't have to waste brain capacity on low-level aspects of the system (such as whether instructions stall each other in the pipeline) and can rather concentrate on the problem I want to solve. Even C is way too low-level for sophisticated programming; I don't want to care about array bounds, freeing memory, string length checking and all the other boring stuff.

Also, the lower the level of your programming language, the easier it is to shoot yourself in the foot and spend days debugging crashing code. How often have you traced a mysterious, hard-to-reproduce bug which turned out to be caused by some bad pointer in a totally different module? In high-level languages you get a fat red arrow pointing right at the bad access, because the compiler inserts all necessary checks for you.

A crucial aspect is of course compiler technology. Somebody has to care whether two instructions stall each other, whether a type or bounds check is needed, when memory needs to be freed. But why not let a program do it? There are some high-level languages out there that are easy to use and learn but don't produce good enough code (Python being the prime example). But it's possible to compile languages that even exceed the features of Python and still get out code that's as fast as the equivalent in C. A lot of the groundwork for this was laid in the functional and Lisp compiler communities, and a lot of knowledge of how to do fast high-level languages was learned when C was already the dominant language.

I'm one of the core maintainers of the free Gwydion Dylan compiler. Dylan is a dynamic, object-oriented language based on the ideas of Common Lisp and Smalltalk. Dylan was designed at Apple, in cooperation with Carnegie Mellon University, and was originally planned to be used as the language for the Apple Newton. Dylan has multiple dispatch, multiple inheritance, garbage collection, first-class functions, first-class objects, real macros (that work on the parse tree, not on the characters) and bounds checking. Everything is an object (even integers), and the syntax is easy to read and easy to write. See also here for a comparison of Java and Dylan. Don't forget that Dylan is not only the nicer language, but also generates faster code.

People don't want to waste time and effort (4.30 / 10) (#32)
by amokscience on Tue Dec 12, 2000 at 12:06:07 PM EST

Coding the ultra-neat inner loop, or hacking in retrieving data from the machine code of the binary, maybe gives you that high; however, it is ultimately grossly unproductive. You can't re-use that code. Why is qsort() part of a standard library? Because you don't want to keep recoding the same thing.
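The qsort() point in miniature -- the library owns the algorithm, you own one comparator (a minimal sketch):

    #include <cstdio>
    #include <cstdlib>

    // Comparator contract: negative, zero or positive, like strcmp.
    static int cmpInt(const void *a, const void *b) {
        int x = *static_cast<const int *>(a);
        int y = *static_cast<const int *>(b);
        return (x > y) - (x < y);
    }

    int main() {
        int v[] = {42, 7, 19, 3, 25};
        const size_t n = sizeof(v) / sizeof(v[0]);
        std::qsort(v, n, sizeof(v[0]), cmpInt);
        for (size_t i = 0; i < n; ++i)
            std::printf("%d ", v[i]);   // prints: 3 7 19 25 42
        std::printf("\n");
        return 0;
    }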

These high-level languages and application frameworks allow for minimal work to create complex applications. Instead of having to rewrite a save/load routine from scratch each time, let the language do a lot of the work for you.

Again, it's all about saving time. With so much money to be made in software, people who take the time to do something cool will probably miss out on the cash basket. Computer resources are hardly scarce anymore, either. Many of the neat code examples existed precisely because you couldn't do a whole lot on that 12 MHz 286 w/ 2 MB of RAM...

Surely you've read the Story of Mel, a Real Programmer. That's infinitely cool IMO, but utterly impractical today. It would be a waste of time to put the effort into low-level optimized code for a program that could be created using high-level objects.

Remember, the industry-wide baselines for measuring productivity are still measured in KLOCs. It's not a very accurate measure by any means, but it's the only reasonable one that's widespread. You could also argue that fewer lines of code (Java versus asm) lead to lower defect figures.

Neat code is also extremely difficult to maintain. Try going back to one of these neat algorithms a year from now: you'll spend as much time figuring it out as you initially did coding it. For others, who have no idea how it works, it will be much worse. Elite code leads to headaches, grumbling, and frequently hard-to-trace bugs.

If you want to get under the hood and dig at hardware, your best bet is to become an embedded systems programmer (DSPs, microcontrollers, console game boxes, set-top boxes, etc). I'm all for hacking and being cool... as someone who was heavily involved in the DOS demo scene for several years, I can appreciate that type of code. Yet it isn't a philosophy that translates well into actual productivity.

So, my bottom lines: Time is money. Cleanliness (clarity) is next to godliness. Don't reinvent the wheel, wagon, road, sky... etc.

IT sucks (2.25 / 8) (#33)
by evvk on Tue Dec 12, 2000 at 12:11:42 PM EST

I can do nothing but agree. Computers and programming just are not fun anymore. There was a time when I thought I'd like to do programming for a living. It was fun to make the computer do what you wanted, experiment with it, play good games. Not anymore. I don't like where the IT industry is going at all. Modern games suck. Bloated and inefficient software sucks. New media sucks; crappy DHTML/XML/JavaScript/Java c00l-design web pages suck. Modern GUIs suck. Many standards (read: compromises) suck. The APIs suck. I don't want to buy a new computer every year. I don't want to know all the latest hype crap. I don't want carpal tunnel syndrome from using the mouse (whenever I have to use Windows longer, I can feel it in my hands). I just want simple keyboard-friendly software that does what I want, and the damn box to be quiet!


if everything sucks.... (none / 0) (#71)
by krogoth on Wed Dec 13, 2000 at 09:00:08 AM EST

why not try to make it better? If you want something to change, you can always do it yourself. Go contribute to an Open Source project to make Linux and Windows better, or contribute to LiteStep (my current Windows shell, www.litestep.net) to replace the Explorer shell. There's still a lot you can do to change things that suck.
--
"If you've never removed your pants and climbed into a tree to swear drunkenly at stuck-up rich kids, I highly recommend it."
:wq
[ Parent ]
Re: if everything sucks.... (none / 0) (#73)
by evvk on Wed Dec 13, 2000 at 09:34:51 AM EST

Well, I _have_ done something. One man just cannot do everything and change the world alone with a few programs and limited time; I'd rather have an army of code monkeys implement my ideas :-).


[ Parent ]
Try sourceforge (none / 0) (#85)
by krogoth on Wed Dec 13, 2000 at 06:59:25 PM EST

I started a project manager for a game I'm making, and within 2 days of posting help-wanted ads on SourceForge I had 4 other coders (down to two now, but I closed the ad after those 4 because I thought I would just get flooded if I left it open), even though most people who use SourceForge wouldn't need an external project manager. Leave an ad open for a month and you could have a whole army of coders working on your project. Get noticed and people will help you.
--
"If you've never removed your pants and climbed into a tree to swear drunkenly at stuck-up rich kids, I highly recommend it."
:wq
[ Parent ]
Two words: Embedded Systems (4.41 / 12) (#34)
by mosburger on Tue Dec 12, 2000 at 12:41:00 PM EST

Miss assembly language? Like optimizing? Think writing "tight code" is fun? Then you shouldn't be working in IT. Find yourself a job in embedded systems, where each penny you can squeeze out of the product cost by reducing the amount of RAM you need is actually appreciated. People still code in C or assembly, too, 'cuz the code needs to be damned small and damned fast.

--- I want to be different, just like everybody else. ---

Listen to mosburger... (4.66 / 6) (#36)
by the coose on Tue Dec 12, 2000 at 01:40:01 PM EST

He is exactly right. The system I work on has 1 meg of flash and 1 meg of RAM. Probably 98% is in C/C++; assembly is used 1) to boot up the system (i.e. get the vector table loaded, initialize RAM, call the C++ initializer thunks, etc.) and 2) for CPU-hungry routines that need to be done really fast (I work on a real-time telecommunication system). It is fun to squeeze all the cycles you can out of a 33 MHz 68302, and sometimes just the simple things will give you more cycles than you'd think - like re-writing a loop with inline assembly (amazing how inefficient compiler-generated code can be :).

[ Parent ]
Re: Listen to mosburger (4.33 / 3) (#55)
by tjb on Tue Dec 12, 2000 at 07:50:33 PM EST

1 Meg! I'd kill for one meg! :)

One processor I work on is an 8051 USB controller. It has a grand total of 8K: 6.5K of code, 1.5K of data (this can be adjusted somewhat depending on demand). C is an exercise in futility. Normally I'll use C as a RAD tool, cutting out extra features when I need to test something new, but the production code is pure ASM.

The other processor I work on is a DSP back-end that has 1K of code and 1K of data (non-negotiable). It is quite a piece of work: cooperative multi-tasking, pipelined (by hand!), and a 3-deep call stack (a function call is a major design decision :8). It is by far the most challenging processor I've ever written code for. The problems take tremendous thought, usually requiring three or four people to work for a couple of weeks just to bang out 500 lines of code that'll work (and fit).

Anyway, I must say that I find the amount of thought required for embedded coding makes it quite satisfying when you finally get it working.

[ Parent ]
career advice? (3.50 / 2) (#68)
by nickwkg on Wed Dec 13, 2000 at 06:48:05 AM EST

But how does one go about getting a job in embedded systems?

I remember I used to play with 68k and custom-chip coding on my old Amiga years ago, but haven't touched low-level stuff like that since. After leaving uni I've only had commercial C and C++ experience - what should I do to get into an embedded job?

[ Parent ]
Problem Solving (3.75 / 4) (#37)
by drodger on Tue Dec 12, 2000 at 01:54:10 PM EST

One thing I've noticed is that those who used to do their work or their hobby programming in assembly language, but now work with some "higher-level" language, still have the benefit of their past experience. For those of you who feel you no longer do "real programming": do you at least feel that the knowledge you gained from experimenting, from solving little problems that required you to change your thinking, still helps you solve problems today, regardless of the language(s) you use? Or has exposure to large/complex programs changed the way you solve problems now?
--- Official beta tester of the 21st century... Linux security...to mend and defend! Cruisin' around in my modemmobile...
It's all ones and zeros (2.00 / 1) (#77)
by pmk on Wed Dec 13, 2000 at 12:50:52 PM EST

Amen to this comment. If you don't understand how it works all the way down to the metal, you don't understand it.

[ Parent ]
The place where old ASL programmers go to play. (4.00 / 5) (#40)
by neoliminal on Tue Dec 12, 2000 at 03:05:52 PM EST

Corewars.

With a little over 14 instructions, Corewars is probably the most enjoyable outlet there is for frustrated ASL programmers. Originally designed as a way to teach assembly, the game consists of programs fighting for control of a simulated core.

You can find tons of pages on this, but probably the most comprehensive would be www.koth.org. Here you can find "King of the hill" competitions and resources for playing.

Then there's the venerable alt.games.corewar, which has been around for years.

If you are at all interested in ASL programming, you will find others like yourself here.
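
For a taste of the game, here is the classic one-instruction warrior, Dewdney's Imp, in Redcode:

    MOV 0, 1    ; copy the instruction at offset 0 (this one) to offset 1

Execution falls through to the fresh copy, so the Imp crawls through the core one cell at a time, overwriting everything in its path.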

Corewars interpreter and debugger (3.00 / 1) (#44)
by Michael Leuchtenburg on Tue Dec 12, 2000 at 03:58:03 PM EST

Is there one for Linux? That actually works? I got the VM to run, but all it told me was which ops got executed (nothing about, for instance, the addresses of those locations). I've also seen screenshots of pretty graphical debuggers. While nice, I'd settle for anything. It's better than nothing, after all.

[ #k5: dyfrgi ]
[ TINK5C ]
[ Parent ]
Check out the "Corewars Now" link on this page (5.00 / 1) (#47)
by neoliminal on Tue Dec 12, 2000 at 04:55:50 PM EST

Corewars now.

[ Parent ]
Do it yourself... (3.66 / 6) (#41)
by ameoba on Tue Dec 12, 2000 at 03:43:37 PM EST

If you really miss that aspect of computing, why don't you start doing it yourself? Get a few like-minded people together and set up a web-zine that has articles covering whatever aspect of computing it is that you miss. I mean, if it comes down to it, anybody can get a free 10-20meg site to start from, and build it from there.

With how many people are out there using computers, I'm sure you should have no problem finding some people to join you on your journey. And I'm sure, along the way, you'll find more and more people with the same ideals towards coding...

'Sides... if nobody steps up to the plate to keep this knowledge alive, it'll be lost forever. Somebody needs to step up and write down the knowledge of those guys with 20 years of bare-metal coding experience...


If not you, then who else?

blahblahblah (2.60 / 5) (#42)
by jeanlucpikachu on Tue Dec 12, 2000 at 03:43:48 PM EST

you're absolutely right, most print mags utterly suck (ddj and 2600 being exceptions in my mind, what do you think?) but that's kinda because the web has taken over. you want to program for fun? flipcode.com, gamedev.net... don't ever say the fun has gone out of programming. if you're not having fun, go do something you like, but there's still lots of fun to be had...

--
Peace,
Capt. Jean-Luc Pikachu AIM: jeanlucpikachu
DDJ is OK (3.00 / 2) (#48)
by maketo on Tue Dec 12, 2000 at 05:33:48 PM EST

But they seem to be far away from what they used to write. I can understand that - times just suck lately.
agents, bugs, nanites....see the connection?
[ Parent ]
True enough. (3.20 / 5) (#46)
by rebelcool on Tue Dec 12, 2000 at 04:52:00 PM EST

I recall learning programming using good ole QBasic, and marveling as I drew simple circles and lines and... *gasp* played music. Or figuring out how to use random-access files... all simple stuff.

Today's programming is simply too complicated. Back in the day, it was possible for someone to learn nearly every aspect of programming a computer... I think in the future we're going to see much more specialized fields. People who specialize in networks, web design, database design and so on... there won't be a generic programmer anymore.

COG. Build your own community. Free, easy, powerful. Demo site

We're already there (4.00 / 1) (#58)
by ChannelX on Tue Dec 12, 2000 at 10:25:16 PM EST

"...i think in the future we're going to see much more specialized fields. People who specialize in networks, webdesign, database design and so on.. there wont be a generic programmer anymore."

I think we're already there. In most jobs today you specialize in something. Unless you're at a very small company, you're most likely not a jack of all trades. You do web programming, or DB design, etc. You might dabble a little in other stuff as part of your job, but you aren't responsible for the majority of the work in other areas.

[ Parent ]

So true.. (4.00 / 1) (#80)
by use strict on Wed Dec 13, 2000 at 03:39:44 PM EST

Where I work (an online store), we have a DB guy, a sysadmin, our manager (a jack of all trades), and myself; I normally implement the 'new and experimental' stuff.

IMHO I have the coolest job, but I think anyone there could say the same about their own job.

The point is that when I was younger I was interested in programming, but more from a usability angle. I ran bulletin board systems (I'm sure most involved in this discussion remember those... although that seems to be a dying breed as well) and spent my coding time on systems that broke new ground in usability and interface issues... 'Web programming' came easily, for obvious reasons. I still do a bit of work in C, but most of it is in perl. (as you can probably tell by my username)

However, I did develop (IMHO, of course) a feel for 'good programming'. The nice thing about perl, in particular, is that you have the choice of implementing things in a lower-level fashion if you want (XS, a very odd C interface to perl), without sacrificing the power of the higher-level aspects. Reusability, IMHO, is best, and component use is only beneficial if the component is actually going to be reused. The Windows/GNOME ORB approach to programming is of no interest to me, as most of what I have seen is either not used enough or too general to be of any real-world use.

The big point is that regardless of what you're doing, you can still always do it 'your way'. That is the genius of using language to program computers.

As for the demo scene? Most of the talent pool is in gaming now, where they can actually make money and still display their 'l33t skills. For instance, Future Crew's Skaven did some music tracks for Unreal Tournament, and the rest of the guys are working on hardware projects.

Is hornet.org still active? (I haven't checked yet.) If so, it's a great place to meet others who think just like the guy who wrote the article.



[ Parent ]
ah bbses... (3.00 / 1) (#81)
by rebelcool on Wed Dec 13, 2000 at 04:41:37 PM EST

Heh, I ran one. And I develop web software that does something very similar (much truer to the old BBSes than the "communities" you see on the web).. check out The Machine sometime.

COG. Build your own community. Free, easy, powerful. Demo site
[ Parent ]

The evolution of software engineering (2.75 / 4) (#50)
by (void *)0x00000000UL on Tue Dec 12, 2000 at 06:35:23 PM EST

There are reasons why people are no longer coding in assembly. Real-world developers need to release code fast, and it has to be portable and maintainable. Maintainability is the key issue; it's the part that costs the most over a software life cycle. By using a higher-level language, your code is usually more maintainable and self-documenting.

Also, it's a natural evolution. Back in the industrial revolution, there were no standards. Today, we take for granted that any 3/8" bolt will do the job, regardless of who manufactured it. But in those days, you couldn't. Steam boilers exploded because there were no codes regulating them. Over the years, the mechanical industry wrote standards, so that today you can buy parts from anybody and know they will fit. That metal bolt may look boring, but every parameter of it is standardized. Now, instead of machining pieces (which is costly), we buy them and fit them together. The less stuff you have to do yourself, the better. If you want to take a look at those little bolts and such, check out the Machinery's Handbook.

The electronics world now has multitudes of ICs that can do anything. You no longer buy single MSI chips and hardwire connections; you buy reprogrammable FPGA chips that come with development tools (check out www.xilinx.com). You no longer implement an RS-232C serial subsystem, because today it comes on a single chip that costs far less than if you produced it yourself.

In the civil engineering field it's the same thing: steel beams are standardized, as are water pipes, ventilation ducts, cement recipes, etc... The more off-the-shelf stuff you can get, the less you need to do yourself, the better the quality and the lower the price.

You see, standardization is one of the greatest achievements, because everything fits and is more reliable.

The software industry is gaining some maturity and is going the same way. One day we will have a lot of standardized components, and we will be able to fit them together easily to do what we want. There is no point in re-inventing the wheel; it's unreliable and unmaintainable. Of course it will take knowledge to properly choose and fit those software "parts", and we will always have to do some custom work ourselves. This is where software engineering is going, in my opinion.

Anyway, today's compilers do a very good job of optimizing your code. Also, today's hardware is so complex and fast-changing that low-level coding is not an option except where it is absolutely needed. E.g., your old i386 asm tricks will no longer work when you're programming an IA-64 processor. Software and hardware will only get more complex.

Now, if you enjoy low-level work and writing for high performance, try designing hardware. Try to optimize the gates so that the design is the fastest possible. Fast here means low propagation delay, so that you can drive the circuit at higher clock frequencies. Tip: the fastest designs are almost never the ones with the fewest gates.

The one remaining problem with software components is that currently you can't be sure a component behaves exactly as its spec says and is free of bugs. In the mechanical world, you can stress-test a few random pieces from a manufacturer, and if they all meet your needs, you can usually be sure that most pieces will behave correctly too. Testing a software component is harder, but with higher-level tools it will become easier to ensure they meet their specs.

Also, hiring a lot of code monkeys who got a 1.5-year pseudo-CS course is not the right way to come up with properly working software. There are just too many monkeys out there who only know about coding. Coding is just one part of software development.

Those who still think high-level languages suck are mostly kiddies and monkeys. Software and hardware development will always move to higher and higher levels. I'm not making this up; the director of electrical engineering said it in a course, and I agree with him. Whether you want this to happen or not, it will.

I'm a second-year BSCE student, if you're wondering who I am.

you missed the point (4.50 / 2) (#52)
by maketo on Tue Dec 12, 2000 at 07:09:59 PM EST

Well, I generally agree with you. But you missed the point - I was reminiscing about the good old times when we tweaked the hardware and software to make things cool. Now there is nothing to tweak; it is mostly done by other people (components). 1.5-year code monkeys? I think it takes more than that to become a good programmer. The director of electrical engineering said that the way to progress is standardization; well, it is. Asm was just an example I used. You can hack something useful in Java if your idea is novel. But it just bothers me when I have to sit down and use a component coded by someone else. It bothers me when I open a programmer's magazine only to read about JavaScript and servlets and how to connect to your Oracle backend... Call me irrational, call me too focused on detail... I still miss the good old times.

As for your comment on fast-moving hardware - that's nice... how many people actually use the 128+ megs of RAM... you start Windows and it expands to fill it... You start Microsoft Word and it takes 3 minutes to load (don't tell me StarOffice, because it is even worse). Your 700 MHz Duron is sitting idle running an event loop... of course, you do have a 700 MHz Duron to run the loop and listen to MP3s, don't you?
agents, bugs, nanites....see the connection?
[ Parent ]

Computers (3.00 / 1) (#53)
by (void *)0x00000000UL on Tue Dec 12, 2000 at 07:31:10 PM EST

Yes, I do have a 660 MHz Duron that's mostly sitting idle, but when I do some heavy crunching in Matlab or whatever, it kicks in. That's fast enough, as I don't play games. I also think it's fun sometimes to hack or do lower-level stuff, but it will get harder and harder as hardware gets more complex. But like one guy said, if you like that stuff, try embedded work. You can buy robot kits and microcontrollers. Last year we did some robot work in a course. We did the programming in an interpreted subset of C. It was not assembly, but it was really cool. The control system was an MIT HandyBoard running an HC11. Check this for an image of the robot: http://www.gel.usherb.ca/crj/images/photos2000/p0001605.jpg

[ Parent ]
a thought on topic (4.00 / 1) (#63)
by TigerBaer on Wed Dec 13, 2000 at 12:33:35 AM EST

maketo - I agree with you that a lot of the basic principles are overlooked, such as being able to factor out some compiled asm and make it wickedly efficient. I am a third-year CS major, and my biggest problem has been the overconcentration on OO and design. I agree that these are the KEY points in software development, but I am taking CS for the science, not the software engineering (otherwise I would be a software engineer).

I find a lot of the algorithmic work and language theory is the core knowledge, and I think that all computer science students should be able to write any system call in C... not just understand its functionality and roughly what it does.

My curriculum focuses a lot on Java, which is great because Java is an ideal learning language, but sometimes I crave bytes, and I wish that more of my courses detailed the methods of making functions and methods extremely efficient.

There is an elegance to both Java's structure and C's simplicity.

[ Parent ]
Want Fun? (3.00 / 3) (#54)
by granto on Tue Dec 12, 2000 at 07:48:39 PM EST

When I want fun, I try to implement the most complicated and abstract algorithm I can find, in as few lines of code as I can, in a functional language like Miranda, Haskell, Clean or Erlang; or even in APL. Fun, difficult and super cool.


Sweet! (none / 0) (#65)
by Greyjack on Wed Dec 13, 2000 at 01:16:50 AM EST

Hey, cool--could you whip me up negascout search with alpha-beta pruning in INTERCAL? ;)

(Yes, I'm a smartass)

--
Here is my philosophy: Everything changes (the word "everything" has just changed as the word "change" has: it now means "no change") --Ron Padgett


[ Parent ]
APL, sigh (none / 0) (#76)
by pmk on Wed Dec 13, 2000 at 12:48:15 PM EST

I remember totally geeking out on APL when I was in junior high. I would not be the programmer that I am today without having had my consciousness expanded by thinking in terms of neat tricks on whole arrays in APL. I still pull out my old copies of Gilman and Rose and the SIGAPL Quote Quad for fun.

It wasn't the notation that was important; it was the idea. I can't stand J because the notation is horribly ugly. If only there were a good free real APL available...



[ Parent ]

An idea (2.33 / 3) (#56)
by skim123 on Tue Dec 12, 2000 at 08:17:43 PM EST

A friend of mine and I felt this same sentiment... so we reinvigorated our desire to code by defining the terms for a task a function would have to complete (say, calculating the square root of a number) and seeing who could do it in C in the fewest characters. (A sketch of the idea follows below.)
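
For instance - a made-up round of the same game, not one of our actual entries - an integer square root by Newton's method, first crammed together, then spelled out:

    unsigned isqrt(unsigned n){unsigned x=n,y=(x+1)/2;while(y<x){x=y;y=(x+n/x)/2;}return x;}

    /* the same thing, readable: repeatedly average the guess x with n/x;
       with integer division the iteration settles on floor(sqrt(n)) */
    unsigned isqrt_readable(unsigned n)
    {
        unsigned x = n, y = (x + 1) / 2;
        while (y < x) {
            x = y;
            y = (x + n / x) / 2;
        }
        return x;
    }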

If you are looking for "real programming" fun in magazines or at work, I think you're SOL. Although a friend of mine has been impressed with Dr. Dobb's and its coverage of neat C++ stuff (I don't read it, so I can't comment on it...).

Money is in some respects like fire; it is a very excellent servant but a terrible master.
PT Barnum


Joy reading about "that new Borland compiler& (1.50 / 2) (#59)
by deeznuts on Tue Dec 12, 2000 at 10:33:48 PM EST

Delphi for Linux is coming Q1 2001! I can't wait to start RADeveloping all kinds of random (but useful!) applications. My goal is to single-handedly make Linux a viable desktop operating system for home users.

I'm as giddy as a schoolgirl! Don't tell the Japanese.
---
Deez Nuts!!

Lost Fun (4.16 / 6) (#60)
by Fred Nerk on Tue Dec 12, 2000 at 11:16:40 PM EST

This article brings a tear to my eye, as I was lamenting just the other day the lack of anything interesting in the huge range of computer magazines at the local newsagent.
In the IT section there were probably about 30 different magazines, all with their own buzzword on the cover. After browsing through them for about an hour, I couldn't find one single magazine with anything coming close to the quality of the Ol' TJ's Workshop from the original Australian Personal Computer magazine, back before it became a 150-page advertisement.
When I was a few years younger, I owned a TRS-80 Color Computer 2 which my dad bought me. I used to wait impatiently for each month's edition of "Rainbow Magazine", which my dad imported for me to read. I dutifully entered all the code in it, line by line, and fixed all the bugs, line by line.
The amount I learned just from entering a "Kung-Fu" game in TRS-80 BASIC is incredible. At the time I wasn't interested in making a business application that I could churn out in 5 weeks and make a pile of money off, even though the quality was terrible. I was (and still am) interested in writing code for the pure pleasure of creating something beautiful. I wasn't born with any sort of artistic ability in the normal sense. I can't sing or paint or write poetry, so I express myself in code.
Back in the Day, there were others like me, which is why magazines like "Rainbow" existed. There are others like me still, which is why the whole open-source mess exists. The community has grown with the advent of the Internet, but I think it's being more and more pushed out of sight.
I think it's time to go read some early articles by Michael Abrash (my hero).

Dude, the TRS-80 rocked! (4.00 / 1) (#62)
by Sheepdot on Wed Dec 13, 2000 at 12:16:13 AM EST

What model did you have? The spankin' new Model II or III? I had a Model I that didn't have an INKEY$ function. Oh the HORROR!!

I coded on that beast around 1988 or 89 or so when I was 9. It was my first computer and my parents paid 100 bucks for it. I then decided I wanted to be a computer guy instead of a lawyer.

I've regretted that choice ever since. :)


[ Parent ]
Just PEEK(14400) (4.00 / 1) (#64)
by Greyjack on Wed Dec 13, 2000 at 01:12:14 AM EST

Sure, you may not have INKEY$, but you can just PEEK(14400) and read the status bits for the arrow keys, shift keys, control key, and space bar directly. That's all you really need to write a good game anyway :)

--
Here is my philosophy: Everything changes (the word "everything" has just changed as the word "change" has: it now means "no change") --Ron Padgett


[ Parent ]
OMG (3.00 / 1) (#74)
by Sheepdot on Wed Dec 13, 2000 at 10:20:13 AM EST

You mean I could have done that the whole time? Jeez!

And to think I griped about my parents paying money for such a system, when a way around the thing I hated most about it was possible the entire time.

Thanks for the info. I really do appreciate it. :)


[ Parent ]
Here, have some sample code (5.00 / 1) (#78)
by Greyjack on Wed Dec 13, 2000 at 01:22:41 PM EST

10 PRINT PEEK(14400)
20 GOTO 10

So, go fire up your TRS-80 emulator and try it out already. (emulators r0x0r :)

Kinda wigs me out how much I remember about that machine. Want to put stuff straight to video? POKE values into (15360) through (16383). But then, that's probably true for whatever computer anyone cut their teeth on.

--
Here is my philosophy: Everything changes (the word "everything" has just changed as the word "change" has: it now means "no change") --Ron Padgett


[ Parent ]
TRS-80 (5.00 / 2) (#86)
by Fred Nerk on Wed Dec 13, 2000 at 07:12:33 PM EST

I actually started off with a Co-Co 2. I built up my supply of cartridges for it, including the ever-wonderful "EDTASM+", which I made extremely good use of, and learned a lot of really low-level stuff about the system, none of which I can remember, except maybe some of the 6809 instruction set (so much cleaner and simpler than the x86).

After bumming around with the tape player for it, I managed to get hold of a *gasp* floppy disk drive!

This came with a special interface cartridge that had its own modified version of TRS-80 DOS on it, including things like DIR and random read-write calls from BASIC.

Unfortunately, when I upgraded to a Co-Co 3, I found out that the cartridge wasn't compatible - but only after plugging it in. When the computer turned on, instead of the normal "OK" prompt, you got a full-screen graphic of the people who designed the hardware for the Co-Co 3.

I had to shell out for a multiport cartridge box so that I could plug in EDTASM+ as well as the disk drive, so that I could modify the code for the drive controller. I even made an EPROM writer so that I could update the chip on the controller, but I never got around to finishing it, because it was about then that we got our first PC.

I've still got all the TRS-80 crap in a box somewhere.. I should pull it out and finish my project!

[ Parent ]

the joy is still out there ... (3.50 / 4) (#61)
by mx on Tue Dec 12, 2000 at 11:36:05 PM EST

I ran into the same mental block a while back, until I realized that that sort of stuff is still out there... I was just focusing on the wrong things.

My history can be traced back to my Atari 800, through the 130XE, 520ST, 1040STE, Amiga, 8088 XT, and 386 AT, to now on my pathetic P3-800/500 and 667. There were times in each era, with each language, each paradigm, when I couldn't find that spark that got me going... but it was there if I looked in the right place. I found it to be partly motivation, and partly the way I looked at things. There are still the little projects, demos, utilities, etc., waiting to be created. One just needs to see them, which is no simple feat.

Recently I found myself moping around at work, thinking that I missed developing cool-looking useful apps (I am in the middle of a long server development). After enough moping, I decided to start developing some cool stuff, like the stuff I started with years ago. A few minutes later I was on my way to developing a tool to make a Windows thing easier (for testing). Then I found an open source project to contribute to... and now I am working on some GUI ideas. That stuff is there; you have just forgotten to ignore the technology and focus on creating cool things.
- mx
Very much alive and kicking (4.00 / 6) (#67)
by linklater on Wed Dec 13, 2000 at 05:04:17 AM EST

I remember the scene you are talking about perfectly well. I started programming in 1981 with a Sinclair ZX81, then moved through the Commodore range etc... until finally ending up with an Amiga 4000. Once the Amiga died I had to move over to PCs, which was a bit of a shame, because it felt like the old cavalier days were gone.

But no, those days are still alive and kicking, if you know where to look. I was lucky enough to get a job as a console games programmer when I left University, and after 8 years of doing it professionally (and after starting my own dev. company too) I can tell you that the cavalier spirit still lives on in the video game world.

Sure, some PC products are bloated and need mega-processors to get them up to a decent frame rate, but in the world of console development the programmer is still battling with the hardware at a low level. No matter how clever you think you are being, there is always someone else who will do something a little cleverer than you and pull off some cool stuff. This forces up the stakes and makes you squeeze the silicon even more.

That is why I love console games programming.

Sure, I've learnt some PHP and some other higher-level languages, but they are strictly relegated to my website or to tool programming. Most of my time is still spent looking at the hardware and thinking about performance.

As far as I'm concerned, you can take your 4GL database high level scripted interpreted do-it-all-for-you languages and leave them at the door - they don't interest me much at all. I'm far more interested in speeding my code up than in waiting for a faster CPU.

Rant over - I've just got in to work, and the caffeine ain't absorbed yet 8(.

PS - The reason I like Linux/BSD is that it's a bare system, made for experimenting. Sure, I program Win32 too, but it seems too 'easy' somehow... Like a brand new shiny car with aircon and satnav. I'd rather have an old ragtop with 'interesting' handling 8). (Well, that or an R1 - mmmmmmmm)


---- 'Who dares not speak his free thought is a slave.' - Euripides

Design hardware (4.75 / 4) (#69)
by d40cht on Wed Dec 13, 2000 at 07:06:54 AM EST

The final frontier for those who enjoy doing it all themselves has got to be hardware design. With 200,000-gate FPGAs (Xilinx or Altera being the leading manufacturers) coming in at a few tens of dollars apiece now, why not design your own computer from scratch at the gate level - including the processor and frame buffer? Reconfigurable logic is the way forward for the hackers of today...

Tools... Tools... Tools... (4.00 / 1) (#89)
by bored on Thu Dec 14, 2000 at 01:07:35 PM EST

I've considered this as an amusing exercise. The problem with FPGAs is that the tools cost so damn much. Sure, you can get the vendors' proprietary gate-level tools, but who wants that? What I want is a decent VHDL or Verilog tool. Look at Development Systems: I don't consider $1000 to $3000 pocket change for an amusing little project I will only spend a few days on.

[ Parent ]
Tools... (none / 0) (#91)
by d40cht on Fri Dec 15, 2000 at 05:38:58 AM EST

I guess the solution is to work for a hardware design house... I'm working for a company in England (Celoxica) with a C-to-hardware compiler (C with some hardware extensions - for explicit parallelism, arbitrary-width variables, etc...). Our tool is priced similarly to the VHDL and Verilog synthesisers, but we do special academic deals... Guess this is probably not relevant to the hobbyist... :)

[ Parent ]
Magazine quality... (3.66 / 3) (#70)
by akihabara on Wed Dec 13, 2000 at 08:04:59 AM EST

I couldn't agree more about the poor state of computer magazines nowadays. Scanning the racks at the Borders here in London is depressing; not one of them is worth reading, let alone buying or subscribing to. In my experience, by far the best quality, and variety, of computer magazines is in Japan. The obvious downside being, of course, that they're in Japanese. But if you can read Japanese, I thoroughly recommend two ASCII publications: "BSD Magazine", which is incredible, and "Unix Magazine". They also do a "Linux Magazine", which is not quite in the same league but still better than, say, the Linux Journal in English. When I came back from Japan I was so disgusted at the state of affairs over here that I renewed my subscriptions to these magazines, despite their already being expensive and the price doubling to include the shipping. Japan's great for other things computer-wise too, like Akihabara. Absolutely unique on planet earth. I can't even find one decent screwdriver shop here in London; there are about 100 in one small area of Tokyo alone.

Further recommendation... (3.00 / 1) (#79)
by frenetik on Wed Dec 13, 2000 at 02:38:25 PM EST

For those who grok German rather than Japanese, check out any publication by the Heise group (c't, iX). These magazines are absolutely excellent. They did a multipart (10+ parts, I think) special article series about Delphi and DirectX, which alone got me to learn Delphi.

(In case this sounds like a blatant plug: I have no relation to them, commercial or otherwise)

Friends are like plants. They need attention and they need to drink. -- SPYvSPY
[ Parent ]

Some puzzles to make you feel better (4.71 / 7) (#75)
by pmk on Wed Dec 13, 2000 at 11:07:50 AM EST

1) Given a pointer to the first node in a singly-linked list, determine in linear time whether the list terminates or has a cycle, without storing to memory.

2) How can you construct a linked list that can be traversed forward and backward using only enough space for one pointer per node, but still allowing the nodes to have arbitrary addresses?

3) Prove or disprove: Any Boolean expression with multiple occurrences of at least one variable can be simplified.

4) Can you always divide an integer by a power of two using the sign-extending "arithmetic right shift" instruction? Construct a sequence that works for all inputs on your machine. Would this have been easier to write back in the 60's?

5) If your processor doesn't have a bit population count instruction that counts the 1 bits in a word, construct a fast sequence that does so without looping or loading from memory.

6) Write code that counts the number of rightmost zero bits in a word, again without looping or loading or using a special bit scan instruction. Is it easier to count the number of leftmost zero bits?

I collect tricks like this and sometimes use them as interview questions when I'm feeling nasty. Warning: some of these are really subtle, and I design instruction sets for a living at a hardware vendor. Have fun!

Hum (3.66 / 3) (#82)
by cybaea on Wed Dec 13, 2000 at 05:13:47 PM EST

1) Something like keeping a second pointer that moves forward at "half speed" (i.e. once every two (three?) iterations) and checking if new == old? Along those lines anyhow... Is it really linear? I think so...

2) Classic! Store the XOR of the prev and next pointers, using the fact that you "know where you are coming from", no matter which way you traverse. (Sketches of both answers below.)

3-4) Saving something for tomorrow...

5) Really a classic!! I always have to look it up, so what I really want to know is: how did he come up with that crazy expression in the first place??
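
For the record, rough sketches of 1) and 2) in C (type and function names invented; not claiming these are the puzzle-setter's intended answers):

    #include <stddef.h>
    #include <stdint.h>

    /* 1) The two-pointer trick: the hare advances two nodes per step,
       the tortoise one; they collide iff the list has a cycle. No
       stores to memory, and at most about 2n steps, so it is linear. */
    struct node { struct node *next; };

    int has_cycle(const struct node *head)
    {
        const struct node *slow = head, *fast = head;
        while (fast != NULL && fast->next != NULL) {
            slow = slow->next;           /* one step */
            fast = fast->next->next;     /* two steps */
            if (slow == fast)
                return 1;                /* met somewhere inside the loop */
        }
        return 0;                        /* fell off the end: no cycle */
    }

    /* 2) The XOR list: each node stores prev ^ next in a single field,
       and the traversal carries the address it came from in order to
       recover the next one. The same step works in both directions. */
    struct xnode { int value; uintptr_t link; };   /* link = prev ^ next */

    struct xnode *step(struct xnode *from, struct xnode *cur)
    {
        return (struct xnode *)(cur->link ^ (uintptr_t)from);
    }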

[ Parent ]

Pop count (3.00 / 2) (#83)
by pmk on Wed Dec 13, 2000 at 05:29:22 PM EST

I look forward to seeing your sequence for the bit pop count (5). The one that I use, when dealing with a processor that doesn't support it as a native instruction, is actually very straightforward, so maybe it's different from yours.

I have a pile of bit-twisting puzzles like this, and can post more if people are interested.



[ Parent ]

Is this the solution you're looking for? (4.25 / 4) (#87)
by mahlen on Thu Dec 14, 2000 at 01:26:44 AM EST

It's been so many eons since I've done assembler that I'll have to express it in a higher-level language, but how's this?

  • Value with bits we're counting I'll call A.
  • Shift A 1 bit to the right and add it to A. Call that B.
  • Shift B 2 bits to right and add to B, call that C.
  • Shift C 4 bits to right and add to C.

That would work for an 8-bit word; 6502 was my last assembler, and words were 8 bits then :). But the idea scales upward.

It works because you can think of each original bit in A as a 1-bit field containing the number of on bits in that field. If you add A and A >> 1 (masking alternate fields so the sums don't carry into their neighbours), then in your answer you have four 2-bit fields. And so on...

But I only knew this one because my friend Sean Flynn interviewed at Microsoft in '87 and they asked him this question (loosely, they just said: how would you count the bits in a word). Their preferred solution was a 256x4-bit array of precomputed values, as that'd be faster.
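
For what it's worth, here is that sequence as a 32-bit C sketch, with the masking step written out - without the masks, a 2-bit field holding 11 would add up to 100 and the carry would corrupt its neighbour. Puzzle 6 then falls out of puzzle 5 almost for free (the trailing-zero trick below is a well-known identity, not necessarily the poster's intended answer):

    #include <stdint.h>

    /* 5) parallel bit count: fold pairs, nibbles, bytes, halfwords */
    unsigned popcount32(uint32_t x)
    {
        x = (x & 0x55555555u) + ((x >> 1) & 0x55555555u);  /* 16 2-bit counts */
        x = (x & 0x33333333u) + ((x >> 2) & 0x33333333u);  /*  8 4-bit counts */
        x = (x & 0x0F0F0F0Fu) + ((x >> 4) & 0x0F0F0F0Fu);  /*  4 byte counts  */
        x = (x & 0x00FF00FFu) + ((x >> 8) & 0x00FF00FFu);  /*  2 16-bit counts*/
        return (x & 0x0000FFFFu) + (x >> 16);              /* grand total     */
    }

    /* 6) rightmost zero bits: x & -x isolates the lowest set bit,
       subtracting 1 turns everything below it into ones, and those
       ones are exactly the trailing zeros (returns 32 for x == 0). */
    unsigned ctz32(uint32_t x)
    {
        return popcount32((x & -x) - 1);
    }

Note that -x on an unsigned type is well-defined (it wraps modulo 2^32), which is exactly what the trick relies on.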

mahlen

The optimist thinks this is the best of all possible worlds, and the pessimist knows it. -- J. Robert Oppenheimer, "The Bulletin of the Atomic Scientists", 1951



[ Parent ]
Thank you (3.50 / 2) (#84)
by Elendale on Wed Dec 13, 2000 at 05:40:25 PM EST

Now I have something witty to send to my CS prof at the bottom of my latest Java hack^H^H^H^Hprogram. I was worried I wouldn't be able to come up with something :)

-Elendale
---

When free speech is outlawed, only criminals will complain.


[ Parent ]
Depressing... (3.75 / 4) (#88)
by Pantheon on Thu Dec 14, 2000 at 06:15:06 AM EST

I learned to program in BASIC 16 years ago, and have practically lived with computers since then.

Lots of situations like the ones you describe have happened over the years.

* You're really good at making music with 4 channels; suddenly computers with 8-channel sound come out, and your music starts sounding like crap.

* You can pixel like a god with 32 colors, and new graphics chipsets can display 256.

* You can code like a god in assembly language, but the hype is all about C++.

* You are really good with CGI, but that technology gets replaced by JSP and ASP.

I eventually lost the desire to learn new technology, because I know it will be replaced by something else in a year or two.

I think it is important not to confuse work with hobby. At work my customer pays me by the hour, so I have to get the job done as quickly as possible, and I go about it as high-level as I can.
About the only programming satisfaction I get at work is database design, which can still be quite low-level.
Other than that, it is usually quick-and-dirty stuff, no time for elegant solutions. No time to stop and think. I don't think of this as programming, just putting pieces together as rapidly as possible. I can't remember the last time I implemented something even remotely elegant (actually, anything elegant makes the code harder to read, and so worth less. How depressing).

As a hobby, I like programming stuff for PDAs, the Lego Mindstorms, and LPC for MUDs. Yeah, it's fun, but things will never be the same again.


Driver development (4.00 / 4) (#90)
by bored on Thu Dec 14, 2000 at 01:13:24 PM EST

If you like tweaking the HW and trying to get the last little ounce of speed out of some piece of hardware, you might try doing some driver programming. I spend my days at ring 0 messing with page tables, hardware registers, OS annoyances, etc... It's a good job and keeps me amused with relatively low-level programming. There are also OS development, BIOS development and embedded systems for when you get bored with what you're doing.

I can relate to that (3.66 / 3) (#92)
by nymia_g on Sat Dec 16, 2000 at 04:16:30 AM EST

Oh yeah, definitely, I know the feeling. It's a little bit sad thinking about what has happened to computing. Up to now, I'm still wondering how they've managed to make the latest and greatest Intel boxes perform like a 386-class PC. With things like that happening, it's easy to conclude that there is some fundamental flaw in how software is constructed today. Yet most of us are not even aware that the processing power of a Pentium III-class PC is probably the same as that of a mid-range cabinet-type machine back in the '80s.

Back in my day, microprocessor trainers were part of the curriculum, and assemblers and low-level translators were used to program custom-built hardware made with 8088 or Z80 chips, coupled with 74LS244s and other 74-series chips. Even oscilloscopes and logic probes were common in those days. But now these tools seem to have faded into oblivion; they seem to have been tossed out the window in favor of off-the-shelf software from vendors.

Anyway, these are just my ramblings, worth 2 cents.

BTW, I have a pet project called "The Real Programmers" which basically treats PCs as if translators hadn't been invented yet. I'm also into compiler construction, specifically pasm, as this stuff really makes my day. I know it sounds too '70s and '80s, but that is where I get my fun.

tradeoffs (5.00 / 1) (#94)
by klamath on Sun Dec 17, 2000 at 08:33:58 PM EST

Back in my day, microprocessor trainers were part of the curriculum, and assemblers and low-level translators were used to program custom-built hardware made with 8088 or Z80 chips, coupled with 74LS244s and other 74-series chips.

Fine - but how much software actually got written? Did you have anything near the complexity or quality of the software we have now? I really doubt it. There is obviously a trade-off between raw runtime performance and code clarity/good "software engineering" methodology. You might have saved on hardware, but hardware is relatively cheap. Programmer time would have been much more expensive, and the time needed to write a given program would also have been much higher.

[ Parent ]

Evolution (4.00 / 2) (#93)
by JohnHopfrog on Sun Dec 17, 2000 at 03:35:56 PM EST

There is always that tendency to stick to the stuff you know. But that hampers progress.
We no longer need to code for 8MB - or 8KB - machines.
If you really wish to do so, pretend that the 64MB on the end user's machine is 8MB, and code your application. By the time you are done optimising for 8MB, the average end user will probably have 128MB anyway...

-John Hopfrog.

Depends what you call progress (none / 0) (#95)
by maketo on Mon Dec 18, 2000 at 01:51:00 AM EST

Oh, I get what you mean. I hamper progress because I don't like the fact that now every moron can produce a 2-gig app to display a window, and a 6-gig server-side app full of holes to handle _your_ credit card number online. Hmmm... let me see. Sorry, I will keep hampering until I die.
agents, bugs, nanites....see the connection?
[ Parent ]