Kuro5hin.org: technology and culture, from the trenches

Coding Standards are Falling

By tumeric in Technology
Wed Dec 06, 2000 at 05:09:52 PM EST
Tags: Software

Despite research into the software development process, better tools and an improved understanding of employee needs, the standard of coding is falling.

I have no empirical data to back this up, so this is based purely on my own experience and prejudice. Whenever I talk to other developers I find myself shocked at their attitude and their mistakes. Not only are their mistakes the same ones I've made, but they are finding newer and better reasons for making them.

The software industry is eating away at its already low standards. I blame the following:
  • Programmers keen on building their CVs. For instance, the rush to build things with Enterprise Java Beans when they were unproven and nobody had any experience with them. I've sat in meetings where the technology was decided before the approach to solving the problem.
  • Good evolutions in the development process being used as buzzwords to cover the cracks in the basics. I've seen some awful systems expressed in UML and beautiful, coherent database schemas that were months out of date; I've had to maintain stupid OO frameworks; and I've had spec-less, chaotic hacks of projects described as 'rapid development'.
  • The urge of designers/programmers to climb mountains just because they are there. The toaster story floating around the net shows this is not a new problem. However, I think things have got worse as architects and techies gain the upper hand in a world increasingly dominated by new technology.

Good software is incredibly hard to write. By getting the things we needed (technology, methods and respect) we are further away than we ever were from achieving that original goal.




Your current software project is failing because:
o You're trying new technology 5%
o The design has basic flaws 17%
o You're using a sledgehammer to crack a nut 5%
o The old reasons (no time, spec, tools) 30%
o It isn't failing 41%

Votes: 130

Related Links
o toaster story

Coding Standards are Falling | 38 comments (27 topical, 11 editorial, 0 hidden)
A simpler reason... (3.33 / 9) (#3)
by trhurler on Wed Dec 06, 2000 at 02:24:33 PM EST

Success means more people. In any sufficiently large group, the majority are idiots. Therefore, as we experience more success, the majority of programmers are now idiots. My projects don't fail. I don't mean "rarely" fail. I mean nothing that I've ever taken past a prototype has ever failed since I've been employed. Why? Because I'm not an idiot, and I don't work with idiots. It is possible to have a failure even though you are not an idiot, to be sure, but the odds are actually quite low. The reason this viewpoint is not more commonly expressed is that most people are idiots.

Interestingly, I'm not one of those people in California making $150,000 a year to dye my hair blue and use a flat panel display and a wireless keyboard 120 hours a week until I burn out and throw myself off a bridge. I'm also not old and wise. I have to make do with merely being competent, which so far has put me far above 99% of the people I've met who work in the field.

Just for completeness, I'm a C programmer. This makes me largely immune to trendy bullshit like Java Beans and "DHTML." However, it also means that an even larger percentage of the people who do what I do get it wrong, because there's more to being a good C programmer than there is to being a good VB programmer, if the latter is even possible.

'God dammit, your posts make me hard.' --LilDebbie

Hardly (3.60 / 5) (#7)
by Pac on Wed Dec 06, 2000 at 02:34:38 PM EST

I suppose everyone in this part of town has their own set of anecdotal data to support a thesis like this. But I feel that is hardly the case.

I feel people now live faster and they need to be reassured every now and then that yes, Software Engineering still is a new discipline and yes, it will still need some decades to reach the maturity of its cousins.

Meanwhile things are not getting worse, they are improving. OO is the de-facto standard for software development, UML has a reasonable chance of becoming the blueprint standard for SE, and I see more and more new programmers getting a solid background while still in college.

Your arguments are not so compelling. If your point is that there are lazy and incompetent developers in the field, I would agree. But then again, we don't get to hear much about the Roman bridge builders whose bridges fell; they were probably fed to the lions. The same goes for failed pyramid builders, except instead of lions they got crocodiles.

You also seem to think the adoption of a new, unproven technology is a light, easy decision taken by some junior programmers over a lunch table. It is not. New technology usually holds the promise of solving more problems, better. So the decision to be an early adopter is usually the result of a very careful cost-benefit analysis. Or at least it's so in the projects I usually see. Remember that a new technology carries not only a greater technological risk but also a higher price of adoption. So the decision to go ahead is not purely technical, and most of the time it is career-betting for many of those involved.

Evolution doesn't take prisoners

Measures of Improvement (4.75 / 4) (#13)
by tumeric on Wed Dec 06, 2000 at 02:52:35 PM EST

Meanwhile things are not getting worse, they are improving. OO is the de-facto standard for software development, UML has a reasonable chance of becoming the blueprint standard for SE, and I see more and more new programmers getting a solid background while still in college.

This is the sort of thing I was talking about. OO guarantees nothing. Yes, it's potentially better than functional decomposition, but saying "I'm using OO -- look how great things are" papers over the real issues. UML documents are no good when they are out of date.

I agree that these methods are good. I'm saying that they are not improving matters because of our own weaknesses.

[ Parent ]

Invention, adoption, standardization (3.00 / 2) (#17)
by Pac on Wed Dec 06, 2000 at 03:35:48 PM EST

When a new method (or machine) for doing something is invented, some people will start using it right away, exploring its possibilities, discovering its problems and trying to improve it. If the method can survive this batch of "beta testers", it will usually hit the market and start its rise to standardization. Once it becomes a standard, no other way of doing whatever the method does is acceptable.

I am using the word "standard" in the hard sense, not the soft, W3C sense. A standard is not something you choose to use if it fits your marketing plans or your project's time frame. A standard here is something that is usually required by law and always enforced by peer pressure and job competition. You don't get to build a building with any material you want. You must also draw the right set of blueprints or you won't get a license to build anything.

Software engineering is not yet at the point where there are laws stating how you must go about developing software. So, you can basically do whatever you want. But I do not think that will be the case for much longer.

And we are not so weak where it matters. There are hundreds of very sharp, cold and deadly serious teams building defense, health, aviation and even good old commercial applications. Naturally, you will not usually find these people making small client-server apps in VB or building a Web site over the weekend. These are the people who live in the domain of very large software, for whom and by whom OO, UML and things like that were invented in the first place. And this is where you can see these tools really shine.

Evolution doesn't take prisoners

[ Parent ]
That Bridge (3.33 / 3) (#19)
by tumeric on Wed Dec 06, 2000 at 04:08:47 PM EST

Software engineering is not yet at the point where there are laws stating how you must go about developing software. So, you can basically do whatever you want. But I do not think that will be the case for much longer.

Was engineering at the point where its laws had matured when the bridge at the top of the page was built? Could the engineers of a hundred years before have built a better one?

[ Parent ]

Bridges, Laws (3.66 / 3) (#21)
by Pac on Wed Dec 06, 2000 at 04:27:32 PM EST

Obviously I am not saying that the laws are responsible for better bridges. They are at most responsible for preventing the engineer whose bridge collapsed from building more bridges that collapse. The kind of law we are discussing is usually an after-the-fact realization of some standard. Most of the time the law will refer to some recognizable standards body that has been codifying the standard since the beginning of time. So, the law will not tell you how to build the bridge; it will say that your bridge must conform to the Saudi Arabian Institute for Bridge Building standard.

My answer to the question "Can someone build software today that will be considered good in a hundred years?" is clear: I think it is possible. But if the question is "Can the present average software developer do so?", my answer would be "almost certainly not". And the reasons, I think, are clear from my first post. We don't have all the necessary standards, and the quasi-standards we have are not widespread to the point where one can't be a software developer without using them.

Evolution doesn't take prisoners

[ Parent ]
Software engineering (4.33 / 3) (#25)
by trhurler on Wed Dec 06, 2000 at 08:27:57 PM EST

I know the tools and so on that are being developed are in use by a few people here and there, and apparently you are involved in that. I respect this, but I also disagree with your prognostications. Software development will change, but the idea that process-oriented rules are going to create a software engineering future much like EE or ME is a pipe dream. The reason software engineering works is that the people working in that area are bright and dedicated, anyone who isn't gets weeded out, and SE projects do not face the same constraints as most other software development.

In other words, those neat figures SE guys like to throw around about lowering average bug rates and so on are as much a function of their talent pool and the circumstances they work under as their process rules and fancy tools. When you try to scale SE to the whole software industry, you'll see the advantages evaporate for two reasons:

First, because you'll have to hire enough people to get the job done, or else your methodology will never be accepted by business. This will mean hiring some people who, by any reasonable standard, are basically idiots. They'll love the process more than anyone else, because it provides clear rules for how to do things, but it won't actually remedy the fact that they're idiots. You can't fix stupid.

Second, because you will face obstacles you now do not: ridiculous deadlines, changing (and even conflicting) requirements, rapid rates of employee turnover, more software to write than your staff can possibly produce, and so on. SE guys like to pretend their methods will alleviate this, but the fact is, good management is the only thing that alleviates this, and even good management can't really fix the problem so much as just keep the developers from going insane from it. The software industry demands far more from itself than it can reasonably produce, and until THAT changes, nothing will significantly improve quality except in isolated cases here and there.

Add the basic fact of software being a huge growth industry with not enough people to the additional fact that, demographically speaking, some tiny fraction of programmers are orders of magnitude better than their fellows and actually perform better without rules than they do with them, and all of a sudden, you get a very different picture of the future. SE will no doubt take off, but it will be applied to two kinds of software: the kind that simply can NOT be allowed to fail (antilock brakes, avionics, etc.), and the kind that is so huge that it cannot be effectively produced any other way. Both of those are and will remain the minority, but they are respectable, and I wish them the best of luck. I certainly will never do that kind of work, but please understand that I in no way mean to denigrate it.

'God dammit, your posts make me hard.' --LilDebbie

[ Parent ]
People and process (none / 0) (#33)
by Pac on Thu Dec 07, 2000 at 10:14:57 AM EST

I agree with you that people matter far more than process in this business. Give an infinite number of idiots a good C++ compiler and, given infinite time, they will not produce the next-generation operating system (they may come up with Windows 3.11, though).

But I feel process helps in an intermediate situation, where you do not have enough very bright developers to cover all your bases, but you have a good number of average developers. These people can benefit hugely from having a good process in place.

Also, a good process will let you use the bright ones better, by helping identify the places where they are most needed.

But do not restrict the meaning of "process". You do not need thousands of years of accumulated house-building knowledge, an architect, two engineers and a construction crew of five to build a dog house. But you need some planning anyway, and if you can scale down the same process, the dog will have its house faster.

I think the software industry will eventually get to a point where components will be available to build most everyday software. The average people will work mostly on specifications, which will feed a quasi-factory building process. Some artisans will be available to craft needed new components from the old ones, rarely from scratch.

Elsewhere, components and completely new software will be built using the same process, but this activity will involve far more R&D and a completely different set of skills than the everyday operation.

Evolution doesn't take prisoners

[ Parent ]
Not falling...coding standards have always sucked (4.00 / 9) (#8)
by ucblockhead on Wed Dec 06, 2000 at 02:35:23 PM EST

I've been modifying other people's programs since 1983, and have been programming professionally since 1987. Some of the code I've had to maintain during my career dates back as far as 1978. And it is my experience that coding standards are not falling; they've always sucked badly.

I saw examples of all your bullets way back in the mid-eighties.

This is not to say that there isn't a problem. There sure as hell is. But it is not by any means new.

I could tell stories, God, could I tell stories!
This is k5. We're all tools - duxup

Perhaps more prevalent (3.33 / 3) (#15)
by retinaburn on Wed Dec 06, 2000 at 03:06:55 PM EST

Due to the increase in rapid-coding companies, and in the sheer volume of people generating code, it only seems like coding practices are suffering. Sure, we are teaching more kids "how to program" and all the good things like indenting and commenting, but when your pointy-haired boss is breathing down your neck, all that goes out the window in the name of survival.

I think that we are a young species that often fucks with things we don't know how to unfuck. -- Tycho

[ Parent ]
The Toaster Story (3.40 / 5) (#14)
by reshippie on Wed Dec 06, 2000 at 03:02:03 PM EST

I've never heard it before, but I think there should be 3 people, not 2.

The 3rd person should respond to the embedded computer question with a simple question "Why do you want a computer in your toaster?"

I know you didn't write it, but I thought I'd bring this up.

Had more to say, but couldn't figure out a coherent way of expressing it. Hope I did an ok job with what I did.

Those who don't know me, probably shouldn't trust me. Those who do DEFINITELY shouldn't trust me. :-)

But that wasn't the question... (2.50 / 2) (#26)
by sec on Wed Dec 06, 2000 at 09:40:28 PM EST

The king asked how one would design an embedded controller for a toaster, not whether one was necessary, and he didn't necessarily ask that the person actually do it.

But, point taken. Why use high-tech when low-tech will do?

Actually, though, as a counterpoint to the toaster story, I would cite the IBM PC and its descendants as a horror story of what happens when the 'engineering' approach is used where it isn't appropriate. Don't even need to make up a fantasy story for that. :)

[ Parent ]

Blame OSS? (2.75 / 4) (#16)
by dreamfish on Wed Dec 06, 2000 at 03:35:44 PM EST

I wonder if you have open-source software in mind, given that the 'release early and often' mantra seems to be in direct conflict with most software engineering principles, which demand that you spend lavish amounts of time and resources on the design stage and leave coding well alone until the design is complete and robust.

However from my experience most of the software lifecycle approaches don't take account of business processes. That is, they are very good for long deadlines and known technologies but can be overly restrictive (and prescriptive) for shorter projects, like 2-3 months, involving newer technologies and interfacing issues.

I believe that OO methodologies are better at this than classical waterfall techniques but are any suitable for the faster and more haphazard world of open source?

Some reasons (3.33 / 3) (#18)
by Pac on Wed Dec 06, 2000 at 03:52:34 PM EST

The OSS development process is based not only upon the "release early, release often" principle, but also upon a very definite set of (free) tools. Rational Rose is not one of them (by the way, at the price Rational wants for Rose, it will not be part of most software projects, OSS or not). CVS does not understand petal files and, more importantly, can't control changes in them.

But UML seems to be catching on, and I think that as soon as we have a nice open source UML editor it will become very common. Also, many OSS projects are now being developed in Java, favouring a more modern and maintainable approach.

Evolution doesn't take prisoners

[ Parent ]
Rational rose (rant rant) (none / 0) (#38)
by Dion on Fri Dec 08, 2000 at 05:46:22 AM EST

I have had my brush with Rational Rose, and I must say that it is, IMHO, the most horrible piece of sh*t I have ever evaluated: the UI is clunky, the program produces horrible code, and it has a hard time doing round-trip engineering. I have tried Together, though; it did everything I needed in the GUI and allowed excellent integration with the code. Together lets you code and draw the design at the same time: you can create a class in the GUI, add attributes to it in the code, and then change the type of the attributes in the GUI if you want to.

[ Parent ]
Blame all software (4.33 / 3) (#20)
by tumeric on Wed Dec 06, 2000 at 04:26:23 PM EST

I wouldn't call the 'release early and often' rule a violation of software engineering principles. It's possible to release well-designed software early and often.

Some OSS projects are done to get a technology on a CV. Some do try to build the ultimate toaster. Others achieve wonderfully simple solutions by meeting the needs of the users. I actually think the third type is harder to achieve in the commercial world.

[ Parent ]

Re: Blame all software (3.25 / 4) (#24)
by dreamfish on Wed Dec 06, 2000 at 05:52:31 PM EST

The problem I have with 'release early, release often' is its emphasis on validation through testing rather than design. This means the testing becomes 'test out bugs' rather than what software engineering principles demand, which is 'test to determine whether the software satisfies its requirements'.
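The distinction can be made concrete with a small sketch. Everything here is invented for illustration (the `applyDiscount` routine, its "never go negative" requirement, and the class names are all hypothetical, not taken from any project in this thread): a bug-hunt check pokes at one input that once misbehaved, while a requirements check asserts the stated rule across a range of inputs.

```java
public class PricingSpec {
    // Hypothetical routine under test; the rule it must obey is:
    // "a discount never makes the price negative."
    static double applyDiscount(double price, double discount) {
        double result = price - discount;
        return result < 0 ? 0 : result; // clamp, per the requirement
    }

    // Bug-hunt style: re-check one input that once produced a bug report.
    static boolean regressionCheck() {
        return applyDiscount(10.0, 12.0) == 0.0;
    }

    // Requirements style: assert the rule itself over many inputs.
    static boolean requirementHolds() {
        for (double d = 0.0; d <= 20.0; d += 0.5) {
            if (applyDiscount(10.0, d) < 0.0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(regressionCheck() && requirementHolds()
                ? "requirement satisfied" : "requirement violated");
        // prints "requirement satisfied"
    }
}
```

The second style is what "test to determine if software satisfies requirements" looks like in miniature: the test is a restatement of the spec, not a memorial to a past bug.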

[ Parent ]
Design Validation through Testing (3.00 / 1) (#30)
by tumeric on Thu Dec 07, 2000 at 03:22:33 AM EST

I would argue that an inbox full of bug reports is reason enough to design hard. The testing phase is more full-on: the audience is wider, often technically literate, suggests requirement changes, and some of them even go through the code.

The testing out the bugs emphasis comes from the fact that open source releases are more liable to have bugs found in them (not more liable to have bugs).

[ Parent ]

I'm not sure standards are falling (4.50 / 6) (#22)
by Simon Kinahan on Wed Dec 06, 2000 at 05:08:58 PM EST

I see a few things going on in the industry over the last few years, many of them contributing to poor coding, but I do believe poor code has always been, and always will be, with us.

The most important factors are the ones no one has found a way to deal with yet. Building good software requires that someone on the project have, on a day-to-day basis, a solid mental picture of what the entire system does and how it does it. This is the proper role of a project architect, but unfortunately that title has become a synonym for "overpaid, irresponsible person who doesn't write, or even understand, code." Many projects fail on this count, and thus doom themselves to bad, buggy code from the start. If no one knows what it should do, how will anyone know whether it's doing the right thing?

The second factor that will always be with us is irreducible complexity. Software is hard because it sets out to solve complex problems. There's a limit to how many people you can have working on such a problem, and a limit to how hard they can work. Try to put too many people on the project, or make them work too fast, and the architectural vision breaks down and the whole thing goes to pot. This goes back to Brooks, but still no senior manager in a software company seems able to resist the urge to push schedules by cutting out design and test time, or by throwing inexperienced staff at late projects.

There are also some new phenomena at work. The first is a huge rise in the number of software projects, thanks to the web and corporate computerisation. This inevitably means people will be pulled into programming who lack the skills at abstract thinking that in the old days, or in academia today, everyone had. These people can be perfectly good programmers, but they make poor architects and designers, and this must be recognised.

Secondly, and connected to the above, is the rise of the journeyman programmer. Not everyone is a master craftsman, nor does everyone need to be. The fact this can be so is a tribute to newer tools, especially relational databases. However, once again, someone who can put together a perfectly competent database-backed CGI app in Perl is not necessarily competent to write a relational database themselves. It's amazing how common it is to regard programming as a single skill. It isn't, and some kinds are much harder than others.

Thirdly, software product companies suck at producing quality code. Since there are so many more of them now, this is becoming an issue. This is an inevitable aspect of their position in the market. They are so far detached from their customers, they cannot obtain requirements for new products by any means other than guesswork. Thus, once they get past a certain size, and a single architectural vision can no longer cover the whole product line, requirements thrash inevitably sets in, developers no longer know what they should be building, and code quality plummets as the source base is pulled backwards and forwards between objectives.


If you disagree, post, don't moderate
Specialization (3.66 / 3) (#27)
by aphrael on Wed Dec 06, 2000 at 09:49:27 PM EST

It's amazing how common it is to regard programming as a single skill. It isn't, and some kinds are much harder than others.

It's also hard to identify what the skills are, and who actually has which skills, except by trial and error. Eventually those skills will have to be classified; I wonder if that will lead to the kind of over-specialization in computing that is seen in other scientific fields.

Me? I do variables. Just variables.

[ Parent ]
Defense of the Toaster Story (4.00 / 6) (#23)
by interiot on Wed Dec 06, 2000 at 05:43:13 PM EST

I have to suggest a defense to the toaster story. It's not so much that a programmer gets a new toy (eg. a hammer) and eternally uses it to whack everything. It's more that... at first, the programmer doesn't know what a hammer is good for, so he tries it on everything. Eventually he learns the situations where it's useful to whack, and trims back on his whack-everything heuristic.

This behavior is especially important in the field of programming. Programmers are constantly being exposed to new tools. So, they cope by learning how to learn, and this is one of the strategies.

Maybe this behavior looks strange to someone who already knows what a hammer should be used for?

Not trying to solve the problem (4.00 / 1) (#35)
by gcmillwood on Thu Dec 07, 2000 at 10:50:54 AM EST

In the story, the computer scientist is not trying to solve the problem. Instead, he has a solution that is looking for something to solve.

This is something I may have been guilty of in the past. I want to move on to something new (new programming language/computer/design method/whatever), so I try and find a 'problem' this new thing is required for. In itself this is not a bad thing - difficulties only arise when you can't find an appropriate problem for what you want to do, and end up applying your 'solution' to something it cannot solve.

[ Parent ]
Taking a breather from stability (3.00 / 1) (#31)
by slaytanic killer on Thu Dec 07, 2000 at 07:42:05 AM EST

The software industry is eating away at its already low standards.

When I read things like this, I get the impression that people are worried about the wrong things.

There is nothing wrong with technologies being pushed before their times. Throughout human technological history, that has been a driving impulse: Make worse systems, because that pushes the state of the art. Ideas do not have to be completely fleshed out properly; sometimes it is just best to power ahead and create so that there is a demand for new tools to be created. People in the future can pick up where you left off and make special, stable evolutions of your work. This creates an innovation-heavy and stability-light world.

The advantage of this is that there are a lot more ideas to pick up! People abandon good things, such as ML and the Amiga, which you can pick up later. We are soon getting to the point where old ideas are a more efficient source of "innovation" than new discoveries, and at some point an enterprising businessperson will pick up on that fact.

The disadvantage of our world being so innovation-heavy, is that people fall for marketing at the drop of a hat. But there are always people who fall for it, and there are those who don't. I like the explosive growth; I don't mind using "worse" tools, because they can be used with foresight.

I don't own a Swiss Army Knife. I do well with just normal knives and screwdrivers because I know their limitations and they are far more abundant.

Picking things up later? (none / 0) (#34)
by Deven on Thu Dec 07, 2000 at 10:19:59 AM EST

The advantage of this is that there are a lot more ideas to pick up! People abandon good things, such as ML and the Amiga, which you can pick up later.

Unfortunately, most of the good things that have been abandoned (at least in the computer arena) are protected by Intellectual Property laws, such as copyright, patents, etc. This means you can't pick up where someone else left off, no matter how little interest the original people have in continuing the "thing" in question. (Sure, they could grant you the right to continue from where they left off, but how common is that? Lawyers will almost invariably recommend against it, just on general principle. Why give up rights for no reason?)

I would love to have the source code to the old 1.3 version of the Amiga operating system. Even on an 8 MHz 68000, it was FAST. And it didn't even need ONE megabyte of RAM to multitask efficiently. It would be interesting to port the OS to a PC hardware platform and see how fast it runs on machines with 1,000 times the processing power. But the source isn't available, so the only option would be to try to reimplement the same ideas in a new OS from scratch instead.

The Open Source community is often accused of reinventing the wheel, or "chasing the taillights", etc. This is necessary to get to a point where we have an infrastructure that everyone CAN build upon freely. Once that infrastructure is solidly in place, you're likely to see a lot more Open Source advancing the state of the art rather than imitating proprietary systems...



"Simple things should be simple, and complex things should be possible." - Alan Kay
[ Parent ]

sheer numbers (3.50 / 2) (#32)
by holzp on Thu Dec 07, 2000 at 09:31:56 AM EST

I think the problem may be explained by sheer numbers: as the number of programmers needed in the industry grows, more young programmers are being pushed to do things without a proper 'apprenticeship.' Couple that with the levels of abstraction newer languages offer, like Java. There are many young programmers who will pump everything into one of the standard library's data structures just because it is built in, without a thought for the cost. The idea that two lines of self-written code can save 200 lines of library code is still a foreign concept.

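A tiny sketch of the point, with invented names and data: when the keys are small, dense integers, the reflex is to reach for the library's map, but the self-written version is a plain array lookup with no boxing and no hashing.

```java
import java.util.HashMap;
import java.util.Map;

public class DayName {
    // Reflex version: the library has a map, so everything goes in a map,
    // boxing each int key and hashing it on every lookup.
    static final Map<Integer, String> BY_MAP = new HashMap<>();
    static {
        String[] names = {"Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"};
        for (int i = 0; i < names.length; i++) BY_MAP.put(i, names[i]);
    }
    static String viaMap(int day) { return BY_MAP.get(day); }

    // "Two lines of self-written code": the keys are 0..6, so an array
    // does exactly the same job directly.
    static final String[] NAMES = {"Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"};
    static String viaArray(int day) { return NAMES[day]; }

    public static void main(String[] args) {
        System.out.println(viaMap(3) + " == " + viaArray(3)); // prints "Wed == Wed"
    }
}
```

Neither version is wrong; the point is only that the library structure carries a cost the array does not, and reaching for it should be a decision rather than a habit.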
Is this real or merely a perception? (4.80 / 5) (#36)
by kostya on Thu Dec 07, 2000 at 11:46:58 AM EST

I ask the question because I think there is some validity to the idea that this is merely perception.

Ask yourself this: when you started at your first job and made your first mistakes, do you think the "old timers" at your first job looked at you and said "Geez, where are they getting these kids?"

I've been in the industry for almost 7 years. I'm not saying that is ancient, but I have been around the block at least twice ;-) And in that time, I have made my share of stupid mistakes. I have also used a bunch of technologies before they were "proven" (although they all later became proven standards). I've also changed my philosophy gradually over the years, learning and growing as a person and as a professional.

I would like to humbly suggest that all of us (and I am definitely guilty) who say "Standards are falling" or "Programmers are getting less skilled" might just be guilty of generational prejudice. That is to say, we look at the younger additions to the field and view them with a prejudicial skepticism. We forget how unskilled we were. We forget how bad our mistakes were. We simply start thinking us vs. them.

This isn't to say that there are not horrible abuses of coding practices and fundamentals today. It is just to say that perhaps there always have been. And despite our positive outlook on our own skills, there was a day when we were just as guilty.

If that is true, the solution is very simple and within all our powers: help them. Most of the reasons for our past mistakes were because no one would help us or we wouldn't ask for help. As senior members of the CS field, we can make it better by mentoring and training junior developers. And also by being very patient. ;-)

Veritas otium parit. --Terence
The tools are good but results vary (4.50 / 4) (#37)
by amokscience on Thu Dec 07, 2000 at 01:56:26 PM EST

It's my opinion from personal experience that most good programmers have excellent coding standards. They have a consistent style, explore technology before diving into new stuff, and create good designs. It's common sense and it works well when dealing only with that developer.

However, start throwing in other developers (often less experienced or plain bad), ignorant customers, ignorant management, and, probably most importantly, market pressures, and you'll find that those good coding standards are quickly sacrificed. Ironically, adhering to at least some of the standards would probably save more time down the road, but none of us can see five months down the road very accurately, no matter how often we check our hindsight.

There's another thing. I've found that most programs created from a decent design are really nice to maintain and extend, right up until you've exceeded the design specifications. Then the code becomes a horrible kludge. The old design decisions no longer make any sense, and because redesign is the costliest part of the SE process, the code is rarely ever redesigned properly. This usually precipitates the good ole 'rewrite from scratch'.

One other thing, never, ever let one programmer do a whole module on his or her own without code reviews. When he/she leaves you're most often going to be very surprised at what 'design' changes have been made (I'm going through this at work, Mmmm FUN, not).

Anyways, the tools are there; people just don't use them as they should. You can build a crappy soapbox derby racer and you can build an excellent one with much the same tools. I believe it's very much the same with software: we all have the same basic tools, yet some of us produce much better software than others. We may just be getting drowned out as the volume has increased. As anyone familiar with defect analysis and software engineering knows, coding goes on at a furious pace while properly applied software engineering lags far behind. It will require a significant shift in philosophy before this changes.

Perhaps becoming more liable for defective software will raise the 'standards'. *sigh* Of course that opens up another gigantic can of worms... stifling creativity, only corporations can afford it, more laws, lawsuits, how to appropriate damages and accountability. I'm getting depressed, back to programming.


