Kuro5hin.org: technology and culture, from the trenches

ACCU Spring Conference 2001 Roundup

By codemonkey_uk in Technology
Sat Apr 07, 2001 at 05:25:57 AM EST
Tags: Software (all tags)
Software

The recent Association of C and C++ Users spring conference held at Christ Church College in Oxford, England had something for everyone. There were a number of language agnostic seminars that covered various topics relevant to all developers including discussions on current software development practices and how to improve them as well as various aspects of the Open Source phenomenon and how developers could best partake of this steadily rising tide.

There were also a number of technical seminars, mostly focused on C++, which covered a number of subjects such as Policy classes, advanced template techniques and namespaces. Also various potential changes, defects and extensions to the C++ Standard were discussed.

Below are a number of short but detailed descriptions of each of the seminars in the 3 day event.


    Quick Index:
  1. [Minimalism]
  2. [Improving Reuse in C++ through Policy Classes]
  3. [Expressing Constraints a la Standard ML Signatures]
  4. [Namespaces and the Interface Principle]
  5. [Template Techniques]
  6. [13 Pitfalls of the Software Industry]
  7. [Optimising "plain" and "multi threaded" C++]
  8. [C++ Standard Library : Changes, Corrections, and Extensions]
  9. [Open Source]
  10. [Meta Programming in C++]
  11. [Introducing Boost]

Thursday 29th March

Keynote: Minimalism - Kevlin Henney

Kevlin Henney opened the conference with a "less is more" keynote speech in which he tried to convince the programmers in the audience to "write less code". His focus was not on minimalism in the feature set, but rather on the need for developers to "omit needless code" and "remove to improve". "There is too much code, and not enough software" he said, pointing out that "there is no code faster than no code" and that "less code, equals less bugs", since bugs per line of code are roughly constant across the industry, irrespective of the size of the project.

His advice also extended to documentation, saying "if you document everything then nothing is significant", alluding to those people who would comment: "i++; //increment i".

He berated both Java strings (length or size?) and C++ strings ([i] or .at(i)?) for their inconsistencies, and criticised STL allocators for being complex and "fragile", with very little practical use.

He concluded (more or less) that design is about embracing constraints, and that we should consider constraints and affordances, specifically, that unwanted affordances should be reduced.

Improving Reuse in C++ through Policy Classes - Andrei Alexandrescu

Andrei Alexandrescu, Development Manager at Real Networks Inc, presented the first heavyweight lecture of the conference, describing policy classes (a technique that should be familiar in concept to those who have read my "An introduction to C++ Traits" article), using the implementation of the ultimate Smart Pointer class as an example.

"Design is Choice" he told us, introducing the policy class idiom as a solution to the "evil multiplicity" of real world situations. He showed that large interfaces undermine design by compromising size, efficiency and clarity, and went on to demonstrate the limitations of a Multiple Inheritance solution, which "juxtaposes names, features and vtables", and of a Template based solution, which could not specialise structure, only behaviour, and did not scale. He also claimed that the template-only solution placed an unreasonable burden on the user.

Policy based design, he explained, is a way of constructing classes with complex behaviour by combining simpler classes called policies. Policies are a low-fat variant of interface based design and of the Template Method pattern, where each policy establishes an interface that a policy class must implement.

The policy classes idiom is too large a subject to go into here; those of you interested might want to check out Andrei Alexandrescu's book "Modern C++ Design: Generic Programming and Design Patterns Applied" for a more in-depth look at the topic.

Expressing Constraints a la Standard ML Signatures - Gabriel Dos Reis

ML is a high-level functional programming language with which Gabriel Dos Reis has worked extensively. The essence of his talk was that C++ could benefit from a technique used in ML to constrain types.

He started by talking about compiler error messages: when you get something wrong and templates are involved the compiler will nearly always report an error (good) but the messages you get can be extremely verbose and unhelpful (bad). Often this is because you have more than one template function or template specialisation for a particular name and the compiler does not have sufficient type information to pick the "right" one.

His proposed solution to this problem was to extend the C++ language to allow the definition of signatures. A signature declaration looks like a class declaration except that it starts with __signature__ instead of class. When you declare a class or template class you can specify that it is associated with a particular signature. The compiler uses the signature information to enforce extra constraints on the class or template class (at instantiation time in the case of templates) and thus finds any ambiguity or mismatch problems at an early stage. For example, a method or operator specified in the signature must also appear in the class or template class being declared.

Gabriel Dos Reis is currently implementing a proof of concept of signatures using the gcc codebase and plans to submit a proposal to extend the C++ language later this year.

Namespaces and the Interface Principle - Herb Sutter

In his first lecture, Herb Sutter, CTO of PeerDirect Inc, and secretary of the ISO/ANSI C++ standards committee, addressed the question "Why namespaces?" and introduced the audience to the "Interface Principle".

Namespaces were added to the standard after it became clear that the C++ equivalent of the common C practice of prefixing a library's identifiers with the library name (enclosing whole libraries in a single class) was cumbersome and intrusive to development. We were advised that by using namespaces we could "avoid the pollution of the global namespace" and "avoid accidental name clashes" (such as those that can happen when using libraries from multiple vendors).

The Interface Principle addresses the question: What is 'part of' a class? Consider the following example:

class X { ... };
void foo( const X& ... ) { ... }

Is foo part of X? What about now:

class X{
   ...
   void foo() const { ... };
};

What's the difference? Herb showed that, for X, all functions that mention X and ship with X, including free functions, are part of X. This doesn't just apply to C++: consider FILE in C and its associated functions.

He then went on to talk about Koenig Lookup in great detail, which I don't have space to go into here, but it is something that all C++ programmers should make themselves familiar with.

Namespaces, and the Interface Principle are addressed in detail in Herb Sutter's book "Exceptional C++: 47 Engineering Puzzles, Programming Problems, and Solutions". Herb Sutter's article for Dr. Dobb's Journal "Migrating to Namespaces" is also available online, along with his Guru Of The Week #53 : "Migrating to Namespaces".

Template Techniques - Nicolai Josuttis

In a session billed as "intermediate / advanced", Nicolai Josuttis, C++ Standard Committee member, gave a solid if uninspiring lecture on C++ template techniques. Opening by establishing key terminology (notably "Function Template", not "Template Function", and "Class Template", not "Template Class"), he went on to discuss the problems surrounding header dependencies, and the problem with "export" not working.

He discussed the difference between declaration and definition and suggested that one solution to compile-time problems is to #include a header containing only the template declaration, and to instantiate the template for specific types in a separate cpp file. He observed that this technique only works when you know in advance which types will be used, requires you to avoid inline functions, and brings an inevitable proliferation of files.

Debugging issues and compiler support were discussed, and Nicolai observed that with some implementations of the STL a map of strings to strings can result in symbols tens of thousands of characters long!

Nicolai Josuttis's talk finished on a light note, with a limerick which appears in the C++ ISO Standard:

when writing a specialization,
be careful about its location,
or to make it compile,
will be such a trial,
as to kindle its self-immolation!
So who says the C++ Standard Committee doesn't have a sense of humour!

Friday 30th March

Keynote: 13 Pitfalls of the Software Industry - Herb Sutter

Subtitled "Miss Conceptions", Herb Sutter opened the second day of the conference by outlining the 13 main mistakes that the software development industry makes, which in his opinion were:

  1. Assuming that Communication equals Community
    Herb asked the audience to consider the question: Does all the communication technology we have at our disposal draw us together as a community?
  2. Communications Failure
    Highlighting his point by "crashing" the slide show, he addressed the issues of people failing to "hear" what is being said, and people failing to raise problems. He also pointed out that there is a tendency for people in the software industry to assume that people are criticising them, and asked us to ask ourselves if "ignorance is bliss".
  3. Failure to allow Failure
    Pointing out that both high reliability systems, and backup systems are good things, he asked "how good is good enough?"
  4. Language Wars
    "Use the best language!" he told us. Sounds obvious, but ask yourself: which is the best language? Best for what? Best for whom? He gave the example of Garbage Collection (as a language feature), stating that it is 'best', unless you're writing a hard real-time system, unless you're not using dynamic memory (for example, in an embedded system), unless there are other reasons not to. Note also that Garbage Collection does not necessarily imply Java.
  5. Truthtelling & Naysaying
    "It's not a bug, it's a 'fee-chur'" he said, condemning marketing speak, then going on to "documentation that isn't" (Question: Which documentation is right? Hint: Read the source) and "the little comments that couldn't", referring to the ubiquitous "i++; //increment i".
  6. Geekspeak & Technobabble
    "Mean what you say! Say what you mean!" he told us, going on to condemn the use of sloppy terms such as "start", "friend", "real-time", "undefined behaviour", "thread safe containers" and so on.
  7. Automation and Dependency Inversion
    Should we question our dependency on electricity and computers? EMP, hacking, viruses are all a risk. Ask yourself, would you rather ICBM firing mechanisms were manual, or computer controlled?
  8. Size Doesn't Matter
  9. Mega Mania
    Is bigger really better? In a world where mega-bucks have taken over from mega-bytes, and things are valued over people, (why) are the dot-com billionaires (un)happy?
  10. Silver Bullet Syndrome
    In the software industry we have our werewolves - but there are no silver bullets. There is no one "silver bullet" methodology, language or framework, so why do we expect each new technology that comes along to fix everything?
  11. Know Your Limits
    Not only should developers know their own limits, they should know the limits of their development tools, their compiler, their code, and their space / performance constraints. "Software is not Magic".
  12. Insane Programming
    Knocking out code at a rate of knots and beating tight deadlines is an addictive head-rush; extreme programming may be 'cool', but it does not focus on design. Insane Programming is worse: what happens to trust when there is a lack of sanity checking? How does this impact robustness and error recovery?
  13. Neuroses & Phobias
    "Just Ship It" and "We Can't Tell Them That (even if it's true)" mentalities. Group think.

He concluded that it all comes down to a Failure To Communicate.

Optimising "plain" and "multi threaded" C++ - Herb Sutter

Herb Sutter began his third talk by asking the audience the question he ultimately aimed to answer: How do you optimise a C++ program?

He then went on to explain that it all depends on what you mean by "optimise". You could be optimising for space efficiency, in which case, would that be the space taken up by the program code? The data? The working set? Or perhaps you want to optimise for speed / time efficiency, in which case, would that be start up time? Worst case time? Best case time? Average time? Which operation? On what platform or processor? Or perhaps you wanted to optimise stability, or compile time efficiency, or time to market, or your profit margin! The point being, when you say that you will, or you are asked to, optimise a program, there are more questions that need to be answered before you continue.

That said, Herb went on to say that today he would be looking at micro optimisation (the small stuff that you should do automatically, at no cost to code clarity), improving build times, the inline keyword, and the Copy On Write 'optimisation' in a multi threaded environment.

Having forewarned the audience that optimisation is the last thing you do ("make it clear, make it right, cut the fat"), and even then only when you have to ("1st Law of optimisation: Don't optimise. 2nd Law of optimisation: Don't optimise yet."); that by using the STL containers and algorithms you get most of what he was about to explain for free; and that there is no substitute for profiling, he went on to focus on the standard techniques for avoiding temporary objects in C++: prefer "++i" to "i++", pass by reference, and avoid implicit conversions (use explicit!).

He then moved on to the inline keyword which, he explained, can do almost anything, including making your code faster, slower, bigger, and smaller, among other things.

Following that Herb explored build time optimisations, the key points of which were:

  • Remove unnecessary #includes
    This may seem obvious, but reconsider what is necessary.
  • Avoid "using" in headers
  • Prefer #include <iosfwd> over #include <ostream> in headers
    (see also virtual print idiom)
  • Forward declare what can be forward declared
    Forward declare template instantiations, see also "used in name only" principle.
  • Compiler Firewall / Pointer To Implementation (pimpl) idiom
    (note: comes with runtime overhead)
  • Prefer composition to private inheritance
    (for those using the pimpl idiom)

The discussion then moved on to the issues surrounding optimising multi threaded apps, and Copy On Write. There isn't space to go into it here, but Herb Sutter's article "Optimizations That Aren't (In a Multithreaded World)" is available online.

C++ Standard Library : Changes, Corrections, and Extensions

With Dietmar Kuhl, Nicolai Josuttis, Beman Dawes, et al.

The panel of C++ Standard committee members introduced this session by explaining that the C++ Standard is split into two parts, "core" and "library", and that this talk would focus on "library". It should be noted that there is a possibility that core and library, which are both currently covered by one ISO standard could be split into two separate standards in the future.

There were 300 issues submitted to the standard committee, of which 125 have been accepted as defects in the standard. Only 2 or 3 of these defects are real problems; most are typos and errors in examples. 16 issues were duplicates, and 58 were judged not to be defects. For example, compiler vendors sometimes submit a defect report stating that a feature of the language is "not implementable" when it has actually been implemented elsewhere. There are (at the time of writing) 41 new issues that have not yet been considered by the committee, and 25 open issues, that is, ones that have been looked at and need further consideration. It should be noted that missing components are not considered to be defects.

Of the defects in the standard, two stand out, and these were given significant discussion time by the panel. The first revolves around std::vector<bool>, the second around valarray.

The subject of std::vector<bool> is a controversial one. There are several problems with this part of the library, compounded by the fact that it is in real world use (a show of hands in the room revealed one developer out of about 50). The main problems with std::vector<bool> are that it has a different interface to std::vector<...>, that it violates the STL container requirements, and that for "std::vector<bool> v;", "v[i].flip();" is undefined. This situation came about because the STL was a relatively late addition to the standard: while Stepanov had intended to include proxy reference types, there was no time and they were removed, all except the one used by std::vector<bool>.

The problems with valarray are less controversial: it simply does not work. The solution is more complex; the committee has to decide whether to fix it or deprecate it. The problem with fixing it is that it may be broken at the conceptual level, and even were the standard to be fixed, it would be very hard to get vendors to support it fully and to regain developers' trust in it. The problem with deprecating it is that deprecated features stay broken, and never really go away. In addition, the possible inclusion of the restrict keyword from C99, in combination with the OO Numerics / Blitz++ library, would solve most of the problems that valarray tried to.

The panel closed with a discussion with the audience on the subject of future extensions to the standard library. Those of you interested in this should note that the Boost library (described later in this document) is going to be the focus for change in the future. Other libraries of interest are ACE for multi-threading, and the SGI container extensions to the STL.

Saturday 31st March

Keynote: Open Source - Alan Lenton

Alan Lenton, Technical and Creative Director for ibgames, and Linux user since v1.0, gave the final keynote speech of the conference on the subject of Open Source, or, to be more accurate, he addressed the much asked question: Is your (company's) product suitable for open source?

Before addressing this question directly, he separated "Open Source" into three conceptual parts:

  1. open source Licences, such as the GPL,
  2. open source as a Development Model, and finally,
  3. open source as a business model.
Which poses the killer question: How do you make money out of open source? A question that is especially relevant in the post dot-com bubble economy. Alan suggests that support services are a valid revenue model, and cited ibgames as a successful example.

He then went on to list what he considers to be the strengths and weaknesses of open source development. As examples of projects that work, he listed the production of development tools, and applications that have been done before and are clearly defined; rapid correction of bugs was cited as another benefit, although he claimed that in practice having the source does not necessarily mean that you can fix bugs yourself.

Making a joke about flame-proof pants, he went on to cite what he considered to be the weaknesses of open source, listing choice, innovation and standards as the key points. The problem with "choice", he explained, is that for companies who want to get into open source, the development "community" has a choice of its own, and can choose not to become involved with your project. Innovation within open source was criticised, and Alan suggested that this problem might stem from a clash of egos within the community, where innovative and original ideas might be ignored due to "not invented here" syndrome. The problem with "standards", he said, is multifold: first is the motivation to comply with standards at all, citing POSIX conformance in Linux as an example; second is the need for some companies to meet ISO 9000, which involves documenting process, which may be difficult with an open source project; and third is the problem of software patents, and the subversion of standards by third parties.

The final topic covered in the session was that of "Managing your Open Source Project". His main points were getting it out, getting it noticed (SourceForge hosts 18000+ projects), getting people to work with (not for) you, code ownership, managing bug fixes, collective decisions, revision control, negative productivity, and feature creep.

Alan Lenton has posted the document "Aspects of Open Source Software Development", which is based on this talk, online.

Meta Programming in C++ - Gabriel Dos Reis

Template metaprogramming is the use of templates to perform operations at compile time: this can improve both the efficiency and the type-safety of a program. Unfortunately template metaprograms are extremely difficult to write and understand; this is because they were not designed into C++, but are only possible because of an accidental combination of C++ features, i.e. template partial specialisation, typedefs, and enums with expressions using compile-time constants. In 1995 Todd Veldhuizen published an article, "Template Metaprograms", in the May issue of C++ Report, and thus the art was born.

Gabriel Dos Reis had an unusual approach to the subject; he started with a program written in Scheme and showed how it could be implemented as a template metaprogram. Scheme is a functional language derived from Lisp and because looping and assignment are not available to template metaprogram writers it turns out that the sort of techniques used to write functional programs are the same as you would use for writing template metaprograms.

For example, a conditional (introduced by if in most programming languages) in Scheme is implemented using a special form called cond, which is followed by one or more condition-expression pairs; the conditions are evaluated in order and the first one which is "true" (non-nil) is selected and the value of the whole conditional is the value of the true condition's expression.

The corresponding construct in template metaprogramming is partial specialisation: you define the template first of all for the general case using a standard (unspecialised) template definition and this contains the code for the "else" branch of your conditional. Then you define one or more specialisations of the template for particular types or integers (depending on what is being passed to the template as parameters) and these correspond to the "if" and "else if" branches of the conditional.

Similarly, template metaprograms follow the functional paradigm in that they use recursion instead of looping constructs such as for and while. Variables - at least variables whose value can be changed - are not available either.

Unfortunately, those of you who have no experience of defining template classes and functions will be completely lost by now, and there is insufficient space here to go into more detail. I will just reiterate the recommendation of Andrei Alexandrescu's book, which is the best introduction I've found to template metaprogramming.

Introducing Boost - Beman Dawes

Boost, Beman Dawes (C++ standard library group committee member and boost.org webmaster) told us, is a "repository of free peer reviewed portable C++ source libraries which work well with the C++ standard library".

In addition to that, Boost is a website, a mailing list, a public CVS service, a public and private FTP service, and a community of volunteers comprising individuals, universities, businesses and other organisations. The people behind Boost, we were told, have tried to engineer a culture of positivity and inclusion; for example, commercial and proprietary developers are 'included' by placing the libraries under non-restrictive licences. Boost is also inclusive of developers in other languages: a C++ to Python bindings library is one of the most popular downloads.

The talk then went on to outline the key points potential users of the library may want to consider. These points were:

  • Boost is a library in development.
    This means that it is subject to change.
  • Boost uses the latest C++ techniques.
    This means that it stresses compilers, sometimes to breaking point.
So how do you cope with these issues? Beman suggests:
  • Freeze on one version of the library.
    Don't automatically upgrade, instead only update at points in your development cycle when change is tolerable, and you will have time to deal with broken code.
  • Test Boost on your system before putting it into production use
    There is a compiler status page on the boost website that may be of help.

One often asked question about free libraries is: What do I (my company) have to gain by contributing to this library? Beman answered this by saying that the boost community acts as a free QA department to companies that submit a library under a suitable licence.

Boost is worth checking out: it is the future of C++, and many of its components will be up for inclusion in future revisions of the C++ library.


By Thad (codemonkey_uk),
Introduced by Carnage4Life
With thanks to digger for contributing write ups of Gabriel Dos Reis' "Expressing Constraints a la Standard ML Signatures" & "Meta Programming in C++" sessions.


Related Links
o Association of C and C++ Users
o Minimalism
o Improving Reuse in C++ through Policy Classes
o Expressing Constraints a la Standard ML Signatures
o Namespaces and the Interface Principle
o Template Techniques
o 13 Pitfalls of the Software Industry
o Optimising "plain" and "multi threaded" C++
o C++ Standard Library : Changes, Corrections, and Extensions
o Open Source
o Meta Programming in C++
o Introducing Boost
o Kevlin Henney
o Real Networks Inc
o An introduction to C++ Traits
o Modern C++ Design
o Gabriel Dos Reis
o ML
o Herb Sutter
o PeerDirect Inc
o The Interface Principle
o Koenig Lookup
o Exceptional C++
o Migrating to Namespaces
o Guru Of The Week #53
o Nicolai Josuttis
o Optimizations That Aren't
o Nicolai Josuttis
o Beman Dawes
o OO Numerics / Blitz++
o ACE
o Alan Lenton
o ibgames
o subversion of standards by third parties
o SourceForge
o Aspects of Open Source Software Development
o Gabriel Dos Reis [2]
o Template Metaprograms
o Scheme
o Andrei Alexandrescu's book
o Boost
o Beman Dawes
o C++ to Python bindings library
o compiler status page
o Thad
o codemonkey_uk
o Carnage4Life
o digger
o Also by codemonkey_uk


ACCU Spring Conference 2001 Roundup | 39 comments (25 topical, 14 editorial, 0 hidden)
Open Source as a Business Model (4.66 / 6) (#10)
by Carnage4Life on Fri Apr 06, 2001 at 06:51:28 PM EST

Which poses the killer question: How do you make money out of open source? A question that is especially relevant in the post dot-com bubble economy. Alan suggests that support services are a valid revenue model, and cited ibgames as a successful example.

I've heard this mentioned several times but I have yet to see this idea scale past a few developers working as consultants in a small business setting.

Currently Open Source shops are either collapsing, on the verge of collapse, tanking stockwise, facing shareholder lawsuits or are going proprietary. The only people who are actually making any money from Open Source projects are either those who incorporate BSDL code into closed projects (e.g. Microsoft) or those that sell hardware at a premium and thus do not care if the software is free (e.g. IBM).

In a recent interview with Slashdot, Microsoft exec Doug Miller stated that he doesn't think it's possible for a company that primarily sells software to be very successful from Open Source and I agree with him. Personally I think all the companies that are trying to profit from Open Source as a business model and tout "services" as a viable revenue stream need their heads examined. The most interesting addition to these ranks is Eazel whose supposed value added services which are supposed to bring in revenue are available for free from other companies. Here's a critique of Eazel's business plans I wrote a while ago in a comment on Slashdot.

Quite frankly, the Microsoft model of charging for the main product and giving the fringe products and add-ons away (Internet Explorer, MSN Messenger, Windows Update, Windows Media Player, the Microsoft SDK, etc.) is easier for users to understand and accept than the Open Source way of giving the main product away for free, then trying to charge for the fringe products, which people then take for granted should also be free.

Open Source works best when the software is a hobby produced by talented and dedicated developers who are uninterested in financial rewards (Can I get a whoop for Debian!!!) as opposed to when developers have to constantly invent ways to make money from the software they are writing.

disagree (2.00 / 2) (#13)
by alprazolam on Fri Apr 06, 2001 at 07:52:55 PM EST

open source works best when the code is not the main product. you cite service business models, but i think more appropriate are models like zope's or sun's.

[ Parent ]
It seems you are agreeing with me. (4.00 / 1) (#15)
by Carnage4Life on Fri Apr 06, 2001 at 08:08:15 PM EST

open source works best when the code is not the main product.

That's what I said and I gave an example of IBM.

you cite service business models, but i think more appropriate are models like zope's or sun's.

Sun's model is just like IBM's: their primary product is overpriced servers, and they give away a lot of stuff to make sure you buy their servers, including an entire programming language and massive standard library (Java™). By the way, what Open Source product does Sun rely on or provide?

Digital Creations(the creator of Zope) is a small consulting company which is one of few ways I mentioned that it is possible to make money from Open Source. If you are a contractor or a consultant then it doesn't matter if the tools you use are free or not because you are getting paid for your time.

[ Parent ]
Open Source Software (5.00 / 1) (#32)
by alan on Mon Apr 09, 2001 at 08:44:14 AM EST

In the paper I wrote up from my notes, I did, in fact, change the example. Following discussions with some of the delegates after the talk, I thought that Digital Creations was a better example.

I agree that making money from Open Source is difficult - nearly as difficult as bending the culture of a traditionally closed source company to make an Open Source product work in the first place.

My talk was really aimed at people who are going to be faced with this sort of problem, and to give them some pointers on how to start, and what to look at.

I suspect - but I don't know - that making money out of Open Source is at base a matter of figuring out how to use it to increase the size of the pie so that even with a smaller slice you are better off.

One thing I would point out though is that directly making money is not the only reason for a commercial entity to embrace Open Source. Apart from the tactical advantages mentioned by other people, it is an excellent way to establish de-facto standards, and if you can do it in collaboration with other companies in the same niche, can be well nigh unstoppable.

Alan

[ Parent ]
Re: Open Source as a Business Model (4.00 / 1) (#14)
by sigwinch on Fri Apr 06, 2001 at 07:56:55 PM EST

[The Microsoft way is] easier for users to understand and accept than the Open Source way of giving the main product away for free then trying to charge for the fringe products which people then take for granted should also be free.
Who is trying to make money from OSS that way? My favorite approach is selling integrated software, a la IBM, Red Hat, SuSE, Ximian, and so forth. If IBM sold S/390 kits with software scattered across thousands of ftp servers, few businesses would buy them. People buy 390s because they want an integrated solution. And since new software is released all the time, the revenue from selling the latest distribution continues forever.

--
I don't want the world, I just want your half.
[ Parent ]

Cygnus (none / 0) (#31)
by Per Abrahamsen on Mon Apr 09, 2001 at 08:36:45 AM EST

>> Alan suggests that support services are a valid revenue model, and cited ibgames as a successful example.

> I've heard this mentioned several times but I am yet to see this idea scale past a few developers working as consultants in a small business setting.

Cygnus Solutions always made money, and I believe they grew to a couple of hundred people. They made most of their money on development contracts from hardware and software companies, and on support contracts. It made me sad to see them swallowed by Red Hat; I had a lot more faith in Cygnus's ability to earn money than I do in Red Hat's.

[ Parent ]
Compilers: Roadblocks in the Advancement of C/C++ (4.40 / 5) (#11)
by Carnage4Life on Fri Apr 06, 2001 at 07:04:36 PM EST

It is quite sad that although the most recent C and C++ standards are 2 and 4 years old respectively, a large number of features are not only unimplemented by various compilers but look unlikely to be implemented in the near future. This is similar to how standards like CSS, DOM and XSL are still either unimplemented or implemented poorly in a large number of browsers, despite these standards being a few years old.

One begins to wonder who to blame for this, though. Is it the fault of the standards bodies, for trying to please everybody only to end up designing broken libraries such as std::vector< bool >? Or is it the compiler writers, who would rather create platform-specific features (most notoriously Microsoft) while claiming that some features are unimplementable, only to have them implemented by other developers?

it'll come (5.00 / 1) (#20)
by mikpos on Sat Apr 07, 2001 at 12:16:53 PM EST

I suspect most C compilers were not entirely ANSI-compliant in 1991, either. In fact you *still* see people doing strange things in C, such as casting the return value of malloc(), because pre-ANSI compilers are around (or have been recently). It took probably five years before people were writing ANSI C code without gnawings in the backs of their heads about whether it would work on platform X.

GCC and one proprietary cathedral-style compiler (whose name escapes me) are I think leading the charge for C99 compliance. You can chart GCC's status: it's slow going (they have bigger worries right now, i.e. gcc 3.0), but I don't think it's unreasonable to assume they'll be effectively C99-compliant in a couple years.

On the point of the standards committee expecting too much, there is one sticky thing in C99. C99's behaviour for snprintf() differs from the traditional Unix snprintf(), specifically in the return value when the buffer is too short. glibc2.0 does the old behaviour; glibc2.1 does the new behaviour; BSD's libc follows the old behaviour (I think). There may be platforms which never switch over to the C99 behaviour, which is unfortunate.

[ Parent ]

But Will It Come In Time? (4.00 / 1) (#25)
by Carnage4Life on Sat Apr 07, 2001 at 10:56:26 PM EST

GCC and one proprietary cathedral-style compiler (whose name escapes me) are I think leading the charge for C99 compliance. You can chart GCC's status: it's slow going (they have bigger worries right now, i.e. gcc 3.0), but I don't think it's unreasonable to assume they'll be effectively C99-compliant in a couple years.

The changes proposed in C99 are relatively straightforward, and I expect GCC to be fully standards compliant in a year or so. I was more concerned about the changes to the C++ standard, which some compiler writers have claimed are unimplementable and are thus unwilling to support.

I am particularly worried by the fact that even though most compilers are at best semi-compliant with the standard, parts of it are already being deprecated by the standards committee while more additions are being suggested. At this rate we won't have standard C++ which is portable across platforms for quite a while.

[ Parent ]
export (none / 0) (#33)
by Per Abrahamsen on Mon Apr 09, 2001 at 09:01:33 AM EST

It is mostly "export" that is deemed hard to implement; I believe the rest of the language will soon be reasonably portable.


[ Parent ]
Not that long at all... (3.00 / 1) (#21)
by ttfkam on Sat Apr 07, 2001 at 05:10:56 PM EST

It took browser makers 6+ years to get where they are now. XSLT was only ratified as a W3C recommendation in November of 1999. XSL is still only a "Candidate Recommendation" as of this writing. CSS and DOM are a bit older, but support for them in the newest versions of the various browsers is not that bad. It's all of the older browsers that are still in use that prevent their wholesale adoption.

When I think that the ISO C++ Standard was ratified in 1998 and we've actually come this far so quickly, I am amazed. Sure, I'm impatient and want it now. Who doesn't? However, something that I think is not given enough credence is that maybe they aren't going too slowly - maybe everyone else simply isn't planning for the long haul.

Having worked on rushed code and products with unreasonable deadlines (haven't we all?), wouldn't it be nice to have something take its time and get it right? Not just workable. Done right.

When citing the faults(?) of the standards bodies, considering the volume of information regarding C++, isn't it incredible that only two items slipped through the cracks?

Devil's advocate for the proprietary vendors (including Microsoft): What were they supposed to do before the standards were released? Sit on their hands and start implementing when given the "go!" signal? The standard solves real problems. The problems also existed before the standard was released. To my knowledge, Microsoft did not implement a new feature in conflict with the standard after the standard was released. The ATL instead of the STL? I can guarantee you that MS started designing and implementing that long before 1998. And what would you expect of them now? "Oh sorry. We know you used this other library that we made, but because of the standard, we're getting rid of it and all of your old programs will be broken until you rewrite them according to the standard. Hello? Hello?"

<Rant>
If anyone is to blame (though I'm not convinced that blame is necessarily appropriate in this case), it is the developers. Not the tool vendors who, like it or not, are bound by their customers' wishes. Not the end user, who couldn't care less whether their favorite program was written in C++, IA32 assembly, or Eiffel. Developers dictate what tools they use. It is they who must decide, while a tool is still rough around the edges, to commit for the long haul rather than for the immediate Pavlovian reward (get a neat new tool, watch programmer drool, repeat). If we as developers are not ready to make that commitment, then all the standards in the world won't make any difference.
</Rant>

Thank god for the GCC development group. Those individuals have been the common ground that we've needed for so many years. Developers may have to make the hard decisions when creating, but at least we have help.

If I'm made in God's image then God needs to lay off the corn chips and onion dip. Get some exercise, God! - Tatarigami
[ Parent ]
Microsoft Embracing and Extending C++ (4.33 / 3) (#24)
by Carnage4Life on Sat Apr 07, 2001 at 10:44:04 PM EST

When citing the faults(?) of the standards bodies, considering the volume of information regarding C++, isn't it incredible that only two items slipped through the cracks?

From his article I saw that there have been 300 issues which have been submitted to the C++ Standards committee of which 125 are admitted defects. The article just highlighted the two that were major problems.

Devil's advocate for the proprietary vendors (including Microsoft): What were they supposed to do before the standards were released? Sit on their hands and start implementing when given the "go!" signal? The standard solves real problems. The problems also existed before the standard was released. To my knowledge, Microsoft did not implement a new feature in conflict with the standard after the standard was released. The ATL instead of the STL? I can guarantee you that MS started designing and implementing that long before 1998. And what would you expect of them now? "Oh sorry. We know you used this other library that we made, but because of the standard, we're getting rid of it and all of your old programs will be broken until you rewrite them according to the standard. Hello? Hello?"

No one is suggesting that Microsoft break compatibility with their old libraries. The truth of the matter is that Microsoft could provide bridges between their libraries and the STL, but does not plan to do so. Heck, I've interviewed with them and was told they don't use the STL in their internal code.

Microsoft has the resources to be completely compatible with the newest standard, but they aren't. Instead they are embracing and extending the language with their managed extensions to C++ and their new focus on .NET. Writing C++ on Windows platforms looks set to become another Java™ fiasco, where the language is supposed to be platform neutral but isn't, due to MSFT-specific extensions.

[ Parent ]
Important note (none / 0) (#29)
by codemonkey_uk on Mon Apr 09, 2001 at 04:24:12 AM EST

From his article I saw that there have been 300 issues which have been submitted to the C++ Standards committee of which 125 are admitted defects. The article just highlighted the two that were major problems.
The C++ ISO Standard Committee is a *volunteer* group.

Q) What's the best way to improve the quality of industry standards?
A) Get involved.

This, I think, is an extremely relevant point - the C++ standard is not one about which people are really in a position to complain, because, had they cared enough to get involved, they could have done something about it...
---
Thad
"The most savage controversies are those about matters as to which there is no good evidence either way." - Bertrand Russell
[ Parent ]

Only 2 or 3 *real* problems. (none / 0) (#34)
by Per Abrahamsen on Mon Apr 09, 2001 at 09:14:01 AM EST

From his article I saw that there have been 300 issues which have been submitted to the C++ Standards committee of which 125 are admitted defects. The article just highlighted the two that were major problems.
He didn't write *major* problems, but *real* problems. There is a big difference. 300 typos in a work that size is very little. From the article: Only 2 or 3 of these defects are real problems, most are typos and errors in examples.

[ Parent ]
Silver bullets, werewolves, and the real world (3.75 / 4) (#22)
by yosemite on Sat Apr 07, 2001 at 10:06:11 PM EST

I have never, not once, seen anyone call Object Oriented programming (which seems to be the favorite whipping boy in such debates) a "silver bullet". What I have seen is people making the argument "OOP (or whatever) is not a silver bullet, therefore it is useless". Gimme a break. Does anyone (OK, anyone outside a marketing department) really think OOP (or any other new programming strategy) is ever going to be a silver bullet?

As a professional software developer, I say we can use anything that will truly make our lives easier and more productive, even if it isn't the solution to all our woes. We would all do well to remember that software development is a discipline barely 50 years old, and as such it should come as no surprise that we're still learning how to do it. And it should come as no surprise that when we do learn something new and useful, it's not the whole answer to everyone's problems.

-y



--
[Signature redacted]

Silver bullets (4.00 / 2) (#27)
by ucblockhead on Sun Apr 08, 2001 at 10:03:38 AM EST

Recently, no, but ten years ago, when C++ was first hitting the mainstream? Oh, yeah, all the time.

I was personally on a project that got killed (despite the fact that the software was working!) after code auditors reported that it was "only 35% object-oriented".
-----------------------
This is k5. We're all tools - duxup
[ Parent ]

Java (3.00 / 1) (#30)
by codemonkey_uk on Mon Apr 09, 2001 at 04:58:19 AM EST

"Write once, run anywhere" (or was it everywhere?) - well, that sounds like a silver bullet statement to me. And IIRC large portions of the industry lapped it up.
---
Thad
"The most savage controversies are those about matters as to which there is no good evidence either way." - Bertrand Russell
[ Parent ]
Brad Cox: There is a silver bullet (4.00 / 1) (#35)
by Per Abrahamsen on Mon Apr 09, 2001 at 09:33:49 AM EST

Brad Cox of Objective-C fame claims there is a silver bullet.

[ Parent ]
Open source as tactic (4.75 / 4) (#23)
by slaytanic killer on Sat Apr 07, 2001 at 10:28:53 PM EST

I think that using Open Source as an overall business strategy is just like using the idea of "selling stuff" as a strategy. Works fine for the first person to ever do it, but it really leaves a lot of questions unanswered.

But Open Source certainly has a tactical place:
1) To gain a foothold
2) To destroy someone else's foothold

From there, everything you gain should put distance between you and your threats. AT&T does this; their business model can be stated in one sentence: "Go into non-commodity areas which require high amounts of capital, where we can expect to be the #1 or #2 company." There is also an account at Joel on Software describing Excel's uphill 1991 battle against Lotus 123, where the Excel team looked very carefully at all the barriers against switching to their product, and broke each of them. Business is not a matter of having an overpowering strategy that leads companies down Penny Lane. It is more like driving a car than pointing a gun: making a number of strong decisions, depending on what's happening at the moment and where you want to end up.

What footholds can Open Source destroy? Well, using the GPL can ruin the foothold of someone using a proprietary solution to dictate standards. Suddenly much of the infrastructure built around that proprietary solution is destroyed. Costly to either reorganize or ignore; people with different skillsets will need to be brought in, since the skills needed to maintain a protected, profitable company are different from those needed for running a company like a startup.

This is the crux of the fight between the music companies vs. Napster. Napster carries the threat of destroying infrastructure that took decades to build.

What can be gained by Open Source? For a small company, it allows them to build an infrastructure rapidly 'n cheap. If they develop open source products or take a strong role in existing products, they gain a sizable advantage in consulting/service. Does it scale to vast heights? No, since at some point a company needs lock-in to retain customers easily. But it can bring a company to the point where they can use the AT&T strategy of getting into capital-intensive environments. While the growth of a specialist consulting company may be bounded, it still can be very profitable and bring a fledgling company to the next level.

A large company can use Open Source to destabilize smaller, newer companies. Microsoft had ample resources to release Internet Explorer for free, destroying a Netscape that tried going at max velocity. IBM uses alphaWorks in a similar way, incubating other peoples' efforts to grow the market share IBM wishes to dip into.

The moral of all this is probably that there is no point in loving or hating Open Source as if it were a fad. If an entire industry dismisses Open Source, that is good news for those who can use it tactically. The point is not to follow the herd blindly. A company which does has no weapons to speak of.

MetaProgramming is a weak compiler simulation (3.00 / 2) (#26)
by exa on Sun Apr 08, 2001 at 09:27:31 AM EST

I understood that when I implemented parallel arrays using expression templates. Avoiding temporaries is easily solved, because it is actually an easy algorithm (simply do lazy evaluation on expressions), but doing even the most trivial parallel optimization would be impossible, simply because C++ wasn't designed for that. [Theoretically, you can compute anything at compile time, but the code explosion would make most interesting implementations intractable.]

MetaProgramming doesn't really give you metaclasses or compiler writing abilities, only a shallow imitation of what a compiler might be like. It's just a very sophisticated way to delve into C++ internals and do tricks that seem impossible with ordinary coding. But, it certainly isn't the ultimate solution to high level programming, because it's just a kludge. Nothing less, nothing more.

What I'd like to see is a rich language like Haskell catching up on performance with C++, then we would be getting somewhere. Or else, I'll be writing my own lang., I've had enough of C/C++ kludgery!!

Regards,
__
exa a.k.a Eray Ozkural
There is no perfect circle.

ocaml (5.00 / 1) (#38)
by Puchitao on Mon Apr 09, 2001 at 05:45:57 PM EST

I've been hearing good stuff about OCaml's efficiency lately. The syntax isn't as nice as Haskell's, and it's strict rather than lazy (although I think there's a "lazy" module, IIRC), but it's still a helluva lot closer to Haskell than C++ is.

I haven't made the switch myself -- I like writing in a lazy idiom and don't really need screaming speed -- but maybe it'll do for you. It seems to have won a loyal following, who love to roll out some new OCaml benchmark every time the strict/lazy flamewar erupts.

Perhaps we can do *snappy fun* with you everytime! -- Orz
[ Parent ]

OCaml (none / 0) (#39)
by Wodin on Tue Apr 10, 2001 at 02:53:57 AM EST

There's some interesting work going on with OCaml -- it's probably one of the most developed functional programming languages right now, and it has a lot going for it.

You can download an implementation from the OCaml website for just about any platform. It's fun stuff.

As for syntax, well, Haskell vs. OCaml is purely a matter of what you learned first. I originally learned SML, so the fact that ++, ::, and : in Haskell mean @, :, and :: (I think, it's been a little while) in ML makes a big difference. I will agree that programming in a strict language can foster some very different habits than a lazy language, but again it's purely a matter of preference. One thing that FPLs are really good at is compiler/interpreter design, and it's really showing now.

[ Parent ]

C/C++ OOPs I did it again. (2.50 / 4) (#28)
by jsburch on Sun Apr 08, 2001 at 11:32:22 PM EST

I started programming in 1983 in Dibol and from there proceeded into Cobol, IDMS, DB/2, CICS, etc. In other words all the popular procedural languages of the day. I have written and supported large customer systems that handled 1 to 3 million customers. All without the benefit of OOP or C/C++. I have never been involved with a large scale OOP system.

With that said, I evaluated Turbo C v. 1.0 in 1990 and wrote a few utilities with it, but decided in the end that it made a pretty sorry application language. It seemed well suited as a substitute for Assembler where control over hardware was needed, but IMO it lacks the ease of use of the higher level languages it strove to replace. Powerful, yes, but dangerously so. I see no logical reason for it to have caught on the way it did. How did something with high development and maintenance cost push better solutions out the door?

As for C++, it seems like a language the Twilight Zone would produce. I can imagine a cigarette-smoking Rod Serling looking over your shoulder, puffing away as you code. {shiver} It may be powerful, but I see no advantage to the OOP development model over modular techniques. OOP seems to obscure elements of the program, which makes it more difficult to share components between teams. At least for the examples I have seen. Why is a class method better than a subroutine? Does class inheritance promote code bloat? If I need one method from a class that contains 40, I have to inherit the entire class to access it. Seems inefficient. Can anyone relate moving from the procedural/modular world into the object world? If so, is it really better, or just better in the classroom? Or is it just another way of looking at a problem?

For programming applications on the PC, my language of choice would be Object Pascal. Well, actually more Pascal than Object, because it has better language design. It has great features and execution speed. Compiles in a fraction of the time C takes (C is dirty, so the compiler has to work harder). And generally keeps you from making sloppy mistakes. If it were Microsoft pushing Delphi, I think that is what we would all be using today.
--Scott 8-}

Clarification (none / 0) (#36)
by codemonkey_uk on Mon Apr 09, 2001 at 12:12:56 PM EST

It should be made clear that this article covers only some of the 28 sessions at the ACCU Spring Conference.

There were usually 3 separate sessions going on at any one time, covering other subjects not mentioned here such as Java, C#, embedded programming, Python and language-neutral design.
---
Thad
"The most savage controversies are those about matters as to which there is no good evidence either way." - Bertrand Russell

Language Design vs. Library Design (4.00 / 3) (#37)
by exa on Mon Apr 09, 2001 at 01:49:49 PM EST

Please accept this as a very short synopsis of what might come. ;) If you remember your "A Book On C", you will remember a saying which emphasizes the strong relation between library design and language design. (Now, you tell me if the reference is correct, I don't own the book ;)

That remark underlines the importance of "pragmatics" in programming languages. Since every programming language out there is Turing complete, we distinguish them by their "use" for certain tasks. The more effective a language is across a wider array of tasks, the more we claim it is a "general purpose" programming language. That is C. Of course, relational algebra is Turing complete, but few of us try to write, say, a graphical file manager in a database query language.

Nevertheless, there lies the promise of new programming paradigms. And there is the importance of the standard library. C was successful because it has a quite reasonable standard library which supports a certain programming "style" very well. That's when a language is of more use.

Then I look down at my beloved SGI STL manual on the web, and I observe that the whole iterator concept is very inflexible, very difficult to use, notoriously unpleasant to read and write, thoroughly error-prone, and superficial at best. Why is that, I wonder; and when I look at a more coherent standard library like Common LISP's, or even Python's, I don't see why C++ wasn't developed in conjunction with its standard library. Or has that process been too complicated? There have been many iterations of both language and library, but the standard library hardly has the ubiquitous quality that the C library had.

Then I look at one of the things that Mr. Stroustrup claims the new C++ boasts: generic object-based procedural programming. I buy the argument: genericity is obviously very important. Then I get down and code stuff using templates, and I see my code get uglier and twisted as hell; it compiles really slowly, almost killing my poor computer, and in the end produces very unmanageable code... Then I think to myself, how many people actually know how to use the standard library? Not many. Perhaps because it's not that standard after all, or not as effective and useful as one might like it to be. There I am, writing generic algorithms using templates, hmmm... well, nobody else seems to be using all this nice functional programming support in the standard lib. And, you wonder, _why_ is that? Perhaps it's because there are _large_ gaps between the practice of the library designers and the language designers. Just a hunch.

Thanks,

__
exa a.k.a Eray Ozkural
There is no perfect circle.

ACCU Spring Conference 2001 Roundup | 39 comments (25 topical, 14 editorial, 0 hidden)