Graphical Dataflow Programming: LabVIEW and Other Tools

By unDees in Technology
Tue Feb 04, 2003 at 01:47:04 PM EST
Tags: Software

These days, a number of software developers create their software by drawing it. These engineers do not merely hand-translate a paper flowchart into text source code, or wield software tools to transform a UML diagram into C++. With graphical programming, the diagram is the source code, depicted as an arrangement of nodes connected by wires. Each piece of data flows through the wires, to be consumed by nodes that transform the data mathematically or perform some action such as I/O.

The concept of a dataflow diagram (which, unlike a flowchart, shows the movement of data rather than the flow of control) is nothing new. In fact, even the idea of letting a dataflow diagram be the sole input to a compiler or interpreter has been put into practice for years. A number of graphical programming tools are available today, each tailored to a particular industry. Developers use these tools to boost productivity and express their ideas more clearly, with fewer of the artificialities of programming getting in the way of the task at hand. This article will focus on one such tool, LabVIEW, and will give a brief overview of similar development environments.
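
To make the concept concrete for readers who think in text, here is a minimal sketch of the dataflow idea in C. Everything in it (the Wire type, the node functions) is invented for illustration; a real dataflow tool schedules nodes automatically the moment their inputs arrive, rather than being called in a fixed order like this:

#include <stdio.h>

/* A "wire" carries one value; a node fires only when its inputs are ready. */
typedef struct { double value; int has_value; } Wire;

void node_add(Wire *a, Wire *b, Wire *out) {
    if (a->has_value && b->has_value) {
        out->value = a->value + b->value;
        out->has_value = 1;
    }
}

void node_scale(Wire *in, double factor, Wire *out) {
    if (in->has_value) {
        out->value = in->value * factor;
        out->has_value = 1;
    }
}

int main(void) {
    Wire a = {3.0, 1}, b = {4.0, 1}, sum = {0.0, 0}, result = {0.0, 0};
    node_add(&a, &b, &sum);          /* data flows from a and b into sum */
    node_scale(&sum, 10.0, &result); /* ...and from sum into result */
    printf("%f\n", result.value);    /* prints 70.000000 */
    return 0;
}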


Full disclosure

I chose to focus on LabVIEW because, of these three tools, it's the one with which I have the most experience. I'm not a National Instruments employee; quite the contrary, money usually flows the other direction, from the company where I work to NI. It should be noted that I work for a company that competes with Agilent in the hardware market, but not in the software market. Even being an Agilent competitor does not stop us from buying their hardware and software when we need them to get the job done. In any case, I have endeavored to deal with the differences among LabVIEW, VEE, and Sanscript evenhandedly.

LabVIEW

Laboratory Virtual Instrument Engineering Workbench, or LabVIEW, was developed by Jeff Kodosky and others at National Instruments in 1986. Originally available only for the Macintosh, LabVIEW is used today on MacOS, Windows, Linux, Solaris, embedded systems, and FPGAs. This software development tool consists of an application development environment used to create graphical source code in a programming language called "G."

Since National Instruments specializes in test and measurement hardware, LabVIEW programs are based on the concept of a Virtual Instrument, or VI. Each VI can be seen as both a program that can run by itself and a function that can be called by other VIs. (Java developers will no doubt be familiar with this dual role, as a Java class can be run either on its own by calling its main method, or as a component used by other classes.) When a VI runs as a standalone program, a user interacts with a GUI called the front panel (Screenshot 1). The source code behind the scenes is drawn on the block diagram (Screenshot 2). Any VI can be dropped onto another VI's block diagram as a subVI (Screenshot 3) and connected to the rest of the dataflow structure. The programmer-defined connector pane (Screenshot 4), like a C function signature, defines how the outside world will pass parameters into the subVI.
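
For the textually inclined, the dual role is loosely analogous to this C sketch (the names are invented for illustration): the connector pane corresponds to the function signature, while the front panel corresponds to running the same unit as a program in its own right.

#include <stdio.h>

/* The "connector pane": inputs and outputs expressed as a signature. */
double celsius_to_fahrenheit(double celsius) {
    return celsius * 9.0 / 5.0 + 32.0;
}

/* The "front panel": the same unit run as a standalone program. */
int main(void) {
    double c;
    printf("Celsius: ");
    if (scanf("%lf", &c) == 1)
        printf("Fahrenheit: %f\n", celsius_to_fahrenheit(c));
    return 0;
}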

It should be emphasized that the graphical representation of the block diagram is the VI's source code. Before the VI runs, LabVIEW compiles it to native machine code. In fact, once that step is complete, an executable version of the VI can be saved with the front panel and block diagram removed (except for any front panels the user will actually see) for embedding into a standalone program or shared library/DLL.

LabVIEW was originally designed to create GUI front ends for test and measurement systems, hence its heavy emphasis on the many different I/O buses used in that industry. Over time, however, it has evolved into more of a general-purpose language (or more than a general purpose language, as some might say). True, it would be difficult to write an operating system in LabVIEW, but there are plenty of non-measurement programs written in G, including video games. Programmers love the way the graphical language takes care of multithreading automatically (any time a wire splits in two, or two blocks of code are not chained together in sequence, the runtime may run these independent chunks of code in separate threads). Temporary variables with useless names have been supplanted by the anonymity of wires, which show the data's purpose and destination far better than names like "temp," "foo," and "dummy."
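
A rough C rendering of what the runtime does on its own appears below; in G none of this is spelled out, because the absence of a wire between two chunks of code is itself the license to run them in parallel. The function names are invented for illustration:

#include <pthread.h>
#include <stdio.h>

void *acquire_data(void *arg) {
    (void)arg;
    puts("acquiring data...");   /* one independent chunk of the diagram */
    return NULL;
}

void *update_display(void *arg) {
    (void)arg;
    puts("updating display..."); /* another chunk, unconnected to the first */
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, acquire_data, NULL);
    pthread_create(&t2, NULL, update_display, NULL);
    pthread_join(t1, NULL);      /* downstream nodes wait on both results */
    pthread_join(t2, NULL);
    return 0;
}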

The LabVIEW programming environment is not without its drawbacks. First, although there are many open source projects written using LabVIEW, the development environment itself is expensive (the student edition is only $93, but the professional version with all the bells and whistles is nearly $3500), closed-source, and controlled by one vendor. It should be noted, however, that National Instruments has placed the source code in escrow (contact NI for the exact terms) to protect future development in G. Second, the language is only now evolving to contain native support for event-driven user interface programming; it is easy to create an attractive GUI in LabVIEW, but it takes some skill to create a responsive one. The G programming language is also lacking in true object-oriented features. While OOP is of course not a panacea and in fact slightly breaks the dataflow paradigm, many developers have encountered situations where true OOP would save programming time, and where the "OOP lite" GOOP Toolkit is not powerful enough.

VEE

Agilent's Visual Engineering Environment, or VEE, uses many of the same concepts as LabVIEW. (Before Agilent spun off from Hewlett-Packard, the software was called "HP VEE," which sounds rather unfortunately like the abbreviation for the unpleasant human papilloma virus.) The two programming tools have been in something of an arms race over the years, as each has added features to match its competitor's progress. VEE takes a slightly looser interpretation of the dataflow paradigm. Information does not necessarily flow down every visible wire exactly once as in LabVIEW. Instead, there are some outputs called sequence pins that can repeatedly cause nodes to execute at given intervals. Similarly, a chunk of data takes only one exit from an If/Then/Else object in VEE.
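
In textual terms, the effect of a sequence pin is roughly that of a timed loop, as in the following sketch (an analogy only, not VEE's actual mechanism; the names are invented):

#include <stdio.h>
#include <unistd.h>

/* The node body that the sequence pin re-triggers. */
void take_reading(int i) {
    printf("reading %d\n", i);
}

int main(void) {
    int i;
    for (i = 0; i < 5; i++) {
        take_reading(i); /* fires on every tick, not once per datum */
        sleep(1);        /* one-second interval between firings */
    }
    return 0;
}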

VEE's diagrams are also its source code, though their disk file representation is a textual format that looks vaguely LISP-like and is easier to use with source code control programs than LabVIEW's binary format. Diagrams can be built into intermediate bytecode to be executed by the VEE runtime (license required), in contrast to LabVIEW's fully compiled executables that use the free LabVIEW runtime.

At $1050 for the professional version, VEE is less than a third of the price of the corresponding LabVIEW version. Many users report that VEE is faster to learn, since a few subtle ramifications of the dataflow model are simplified. Like LabVIEW, VEE sports an extensive library of software components called instrument drivers to help users communicate with their test equipment (these instrument drivers are basically an API for instrument control and are usually somewhat easier to develop and use than hardware device drivers). Developers can get help on Agilent's VEE forum and from a worldwide community of VEE users.

On the downside, VEE's latest version is only available for Win32, cannot build DLLs, and has fewer support options (i.e., several application notes, but no extensive example code and tutorial library). VEE's user interface controls by default have a clunky Motif-esque appearance, whereas LabVIEW offers three choices (a snazzy 3D look, a simpler flat style, and a set of platform-native controls). Another point on the subject of GUIs is that VEE offers much less runtime programmatic control over the interface elements.

Sanscript

Sanscript from Northwoods Software is marketed to Windows users who want to put together a quick program with something a little more intuitive and powerful than a batch file or Visual Basic script. Like LabVIEW and VEE, Sanscript allows developers to wire nodes together into programs and then in turn use these programs as subcomponents of larger applications or build them into executables. Since it is targeted at a different market, Sanscript lacks the measurement and advanced analysis features of its big-company counterparts, but has a few advantages of its own. It is much cheaper than the competition, coming in a free-of-charge entry-level flavor as well as a more advanced $130 professional version.

Sanscript's palette of built-in operations mainly consists of simple logical, file, and arithmetic operations. The professional version contains more extensive support for scripting other applications via COM. Users who want sophisticated math or complicated string parsing will have to build their own toolkits from the supplied primitives. Beginners are likely to find it easy to code in Sanscript: most objects can be configured by double-clicking the appropriate icon and filling in fields in a dialog box. Regular users may feel frustration at the need to keep invoking configuration dialogs instead of having better in-place editing options.

In short, Sanscript shares both the advantages and drawbacks of the traditional batch files and scripts it is meant to complement. While it is simple and inexpensive, it offers a far less sophisticated set of language primitives and built-in libraries for the power user.

Other tools

The three programming environments surveyed here are just a few of the many dataflow development tools available. Other programs pointed out by alert readers include:

  • An audio signal processing toolkit called PD, and competitors Max/MSP and jMax (thanks, celeriac)
  • ProGraph, a Win32 object-oriented dataflow language (link supplied by Pac)
  • Simulink, a dataflow companion to MATLAB (thanks to subversion for pointing this one out)
  • Mindstorms, the LEGO robot programming environment
  • NFW's Juice, another language for controlling robots

Conclusion

Like any other approach to coding, graphical programming is not a panacea that meets all software needs. Besides the obviously more expensive hardware required to create and view dataflow diagrams, there are far fewer cheap or free software tools available. Despite their ability to be compiled, graphical programs still rely on hefty runtime libraries that may slow performance. Additionally, the dataflow model proves unsettling and unproductive for some coders and inappropriate for some jobs.

Many developers, however, have joined the ranks of dataflow programmers with enthusiasm. They find the visual presentation of their ideas direct and refreshing. The ability to prototype rapidly and call on a wide range of industry-specific libraries leads many to boast of a productivity increase for certain tasks. In the best case, one can think entirely in the problem domain, forget distracting notions of programming and code, and just have fun getting the job done.

Poll
What graphical programming technique or tool do you use most?
o LabVIEW 10%
o VEE 2%
o Sanscript 0%
o Diagram-based C++/Java tools like Rational Rose or VisualAge 11%
o A napkin 25%
o My imagination 28%
o Graphical programming is for wusses! I think in hex! 19%
o Other; see comment 2%

Votes: 78
Results | Other Polls



Graphical Dataflow Programming: LabVIEW and Other Tools | 161 comments (122 topical, 39 editorial, 0 hidden)
Dr. Dobb's article on Sanscript... (none / 0) (#3)
by SaintPort on Mon Feb 03, 2003 at 04:50:07 PM EST

http://www.ddj.com/documents/s=900/ddj9908b/9908b.htm

I have not tried it, but might.

--
Search the Scriptures
Start with some cheap grace...Got Life?

That article... (none / 0) (#5)
by unDees on Mon Feb 03, 2003 at 05:39:28 PM EST

...in the dead-tree version of DDJ is how I found out about Sanscript. That's actually a great issue; they also mention Mindstorms, which is a kit you use to build LEGO robots controlled by a much simpler pseudo-dialect of LabVIEW.

The same DDJ issue also discusses the Formulate visual programming language, which appears to be dedicated to visual representations of data structures.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

Prograph and ToonTalk (5.00 / 1) (#7)
by Pac on Mon Feb 03, 2003 at 06:55:13 PM EST

Pictorius Prograph is also somewhat popular.

And ToonTalk should probably be mentioned too, at least for its fun potential.

Evolution doesn't take prisoners


Very interesting. (none / 0) (#10)
by guidoreichstadter on Mon Feb 03, 2003 at 07:58:27 PM EST

Some questions:

What kind of expensive hardware is needed to create and view dataflow diagrams?

Can you mention which (if any) free or open source graphical programming software is available?


you are human:
no masters,
no slaves.

Answers... (5.00 / 1) (#13)
by unDees on Mon Feb 03, 2003 at 08:17:51 PM EST

What kind of expensive hardware is needed to create and view dataflow diagrams?

Well, no specific fancy video hardware is mentioned by name in the vendor documentation. But LabVIEW, for example, has ways of finding hitherto-undetected bugs in video drivers, as many users will attest, so a lot of developers spring for the high-end cards. Also, the disk and memory requirements (150-225 MB disk required and 64 MB memory recommended, depending on the platform) are considerably beefier than those for, say, gcc.

The intent behind my original statement was to emphasize that, to fix a program, you can't just telnet into a 486, change a few lines of code, and recompile. The more hardware you can throw at dataflow programming, the smoother your experience will be.

Can you mention which (if any) free or open source graphical programming software is available?

Well, none of the three I surveyed are open source. I suspect that's the case with the most commonly-used tools, with a few niche programs like RobotFlow and OpenDX here or there.

However, note that plenty of open-source projects are written in dataflow languages, even using non-open source IDEs.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Clarification... (none / 0) (#15)
by unDees on Mon Feb 03, 2003 at 08:21:08 PM EST

When I said: I suspect that's the case with the most commonly-used tools, with a few niche programs like RobotFlow and OpenDX here or there, I left out two words:

...with a few open-source niche programs....

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
I wrote a free one. (5.00 / 2) (#21)
by NFW on Mon Feb 03, 2003 at 11:59:44 PM EST

Last year I wrote a 3D robot workbench just for grins. It includes a graph-based dataflow diagram system to create simple control systems for the robots. "Simple" in the sense that they basically amount to clockwork, but the diagrams can get pretty hairy if you're trying to get a biped to walk, or to get a hexapod to walk and pirouette smoothly under joystick control.

It's free, but the source is not available. It probably will be in a few months or so, but I'm not making any promises just yet.


--
Got birds?


[ Parent ]

Actually, two. (none / 0) (#67)
by NFW on Tue Feb 04, 2003 at 04:32:44 PM EST

But this one is ancient (Win16, if you can imagine) and though 'flat' code worked, subroutines had issues... I'm not even sure I still have copies of the source or binaries. Binaries might be fun, if they still run on Win32. Source would be ugly, since I barely knew what I was doing at the time.

It was called GLIDE, for Graphical Language, Integrated Development Environment. You could step through the code, which was kind of neat.


--
Got birds?


[ Parent ]

You missed a major one: (5.00 / 4) (#18)
by subversion on Mon Feb 03, 2003 at 11:03:30 PM EST

Simulink.  It's basically an add-on for MATLAB (probably the most common numerical analysis and engineering program anywhere I've worked, especially in the auto industry) that lets you work in MATLAB via graphical programming.  It tends to be aimed more towards doing math than programming, but can be interfaced to external instruments just like the three environments you mention.

If you disagree, reply, don't moderate.
I used Labview in 1995 (5.00 / 1) (#26)
by ggeens on Tue Feb 04, 2003 at 04:04:33 AM EST

I worked with Labview in 1995. I had to create a measurement system for my master's thesis.

The system used an electrical motor to position an optical sensor. At each position, it took a measurement from the sensor.

The motor wasn't supported by Labview, so I had to program the GPIB bus directly. (I had the data sheets with all the commands.) The optical sensor (also on the GPIB bus) was known, so I only had to take the VIs from the library.

The only thing I wasn't happy about were the initialisation routines: they stand outside of the measurement loop, and it wasn't easy to have them executed before the rest. I ended up with a series of VIs all connected with a dummy data flow. (Each VI would transmit a 1 to the next one.)

The version I used wasn't very advanced, and it was difficult to get the loop controls right. (You had to draw the loop before the contents.) Later versions fixed that.

L'enfer, c'est les huîtres.


data sheets with all the commands (none / 0) (#28)
by wiredog on Tue Feb 04, 2003 at 07:49:42 AM EST

Ouch. "outportb(0x340,0x2a)" etc, etc. I've done that sort of machine code programming. Once.

Really gives you an appreciation for high-level languages like C.

Wilford Brimley scares my chickens.
Phil the Canuck

[ Parent ]

Well, to be fair... (none / 0) (#39)
by unDees on Tue Feb 04, 2003 at 10:54:59 AM EST

Command-based data sheets for GPIB control are usually text-based, so instead of

outportb(0x340, 0x2a);

you'd have

ibwrt(specAn, "CF 20 MHZ\n", 10);

Still a pain, but much easier to remember, and anyway you can usually create a quick API for your instrument and never worry about it again.
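
For instance, a minimal wrapper might look like this sketch (specan_set_center_freq and specAn are invented names; ibwrt is the GPIB call from above, declared by hand here to keep the sketch self-contained):

#include <stdio.h>

/* from the GPIB library, as in the example above */
extern int ibwrt(int device, const void *buffer, long count);

/* One home for the command string; callers never type "CF ... MHZ" again. */
int specan_set_center_freq(int specAn, double mhz) {
    char cmd[64];
    int len = snprintf(cmd, sizeof cmd, "CF %g MHZ\n", mhz);
    return ibwrt(specAn, cmd, len);
}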



Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
All the commands (5.00 / 1) (#150)
by phygjh on Fri Feb 07, 2003 at 12:17:45 PM EST

Agilent VEE isn't so bad. If you have a manual to hand, or have used a similar instrument before, then you can generally guess the command.

"LAS:OUT ON" would switch a laser on, "MEAS:FREQ" or "MEAS:WAVE" are similarly obvious.

I've been reading this article and comments with interest, as I've only used VEE, not Labview. I have dabbled with C in the past, and want to get more into VB/ActiveX to integrate my test-system with Excel/Access.

Cheers,

[ Parent ]

LabVIEW and instrument control (none / 0) (#40)
by unDees on Tue Feb 04, 2003 at 10:57:28 AM EST

It's a pain to have to program GPIB commands directly, but you can always create a library of your most commonly used commands and build your own instrument driver. And you can pass error information into and out of each driver routine, so you get true dataflow instead of having to fake it at the top level. It may not have been worth going through all that, though, if it was a simple device like a motor.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Old LabView/LabWindows (none / 0) (#106)
by richc on Wed Feb 05, 2003 at 09:55:00 AM EST

I used LabView and its parallel semi-programming equivalent LabWindows back in 1995-ish (back in DOS/WFWG days). I can't remember how LabView worked exactly, but LabWindows was mainly programmed in a weird form of BASIC (of all things) that was then converted into bad C and compiled. I thought LabView was similar (i.e., used a C compiler somewhere inside its build process), but it may well have changed a lot since then.

These control systems were really handy: once you'd done the programming part, all you had to do was pop into the lab in the morning, set up your programs to do the experiments for the day, then relax and leave the computer to control all the work. Pop back into the lab in the afternoon to pick up the results, and that's it for the day.

[ Parent ]

LabView & GPIB (none / 0) (#151)
by superflex on Fri Feb 07, 2003 at 02:47:09 PM EST

I had a similar situation in a job a couple years ago, having to manually control an old-ass 3-phase wattmeter over GPIB. I found NI's support/driver website to be really quite excellent, but their support for older instrumentation was somewhat lacking. Hence, I figured I'd start making driver VIs myself.

Kind of funny, actually. It was my first time working with GPIB. I had all the NI documentation on it, where they liked to refer to it as IEEE 488.2, and all the meter documentation, which called it "HPIB", which is what it was originally called back in the 70's when HP first developed the protocol. Took me a couple hours to figure out that they were all talking about the exact same friggin thing.

[ Parent ]

IEEE 488 (none / 0) (#161)
by lukme on Wed Mar 26, 2003 at 12:08:49 AM EST

I had the same experience back in 1992 with LabView. I was controlling a triple monochromator and a CCD through HPIB. I found using the HPIB part straightforward, whereas getting the motors to perform the way they should was difficult.

It was less difficult when the monochromator company replaced the motion control hardware with less defective hardware.


-----------------------------------
It's awfully hard to fly with eagles when you're a turkey.
[ Parent ]
MindRover and stuff (4.00 / 1) (#27)
by carbon on Tue Feb 04, 2003 at 04:28:22 AM EST

Have you ever played the game MindRover? It's a robot programming game, originally built for Windows by Cognitoy, but later ported by Loki. You should still be able to find the Loki demo around somewhere...

MindRover has a graphical wiring interface which it compiles to a textual event-based language called ICE, which is then interpreted inside the simulation (where your programming must guide the bot without your direct help.) The graphical wiring interface seems to model the data flow system you were talking about, where input data from a given source would be piped through output ports to the input port of another device. These other devices could in turn generate further events, triggering more devices.

This system had many of the same features as the one you describe; it had VIs in the form of "logical devices", and the automated control flow splitting when two output wires came from a single source. However, it had no restrictions on how many times data could flow down a given wire; it was possible (though harder to do by accident than it would seem) to build infinite recursive loops and the like. A given input would always propagate completely before the next was handled.

For example, it was possible to build a rotating sensor dish using a metronome device (which generates an output signal containing the number y every x seconds), piped into the first column of an Add logical device, overwriting whatever was previously there. Changing the first column of an Add caused it to produce an output signal containing the sum of both columns. This first Add was connected to the first column of a second Add, which was connected to both the second column of the first Add and the angle input of the dish.

The biggest problem with the interface by far was that of race conditions; without a visible procedural main loop it was impossible to tell which of several inputs would be handled first, or which path of a device with two outputs would be called first. At first glance, it would seem as though the systems you show exhibit the same problem. How does LabVIEW get around this?

The other problem was that, generally, it was just a big pain to do any sort of assignment logic involving data stored between inputs. Getting the data back out of a variable could only be done by prodding it somehow in order to force it to produce an output signal. However, this often did not occur at the same time as assignment. Furthermore, it would be impossible to have shared variables between different wire subsystems; even if there was a prod input, it would not be possible to select between several different prods depending on the source. You ended up needing to do hacks with mode switching before a prod. How does LabVIEW deal with this problem?


Wasn't Dr. Claus the bad guy on Inspector Gadget? - dirvish
Similar... (none / 0) (#37)
by unDees on Tue Feb 04, 2003 at 10:50:45 AM EST

I've used the Mindstorms environment to control LEGO robots before. It was loads of fun watching a robot crash haphazardly into the walls of the course. The language looked a lot like LabVIEW (and that's no coincidence!), but lacked a lot of sophisticated looping and control structures. The result was a quick learning curve, but lots of straight-line code from beginning to end.

However, it had no restrictions on how many times data could flow down a given wire; it was possible (though harder to do by accident than it would seem) to build infinite recursive loops and the like. A given input would always propagate completely before the next was handled.

In LabVIEW, data flows down every wire exactly once when that part of the diagram becomes visible (the True or False case of a control structure is activated, or a For loop triggers another iteration). You can have While loops that iterate indefinitely. You can also do recursion, but it's not trivial. LabVIEW doesn't let you drop a VI onto its own diagram, but you can tell it, "Open a reference to MyProgram.vi and run it," which amounts to the same thing. As with recursion in C, you can keep nesting calls like this until you run out of memory.

The biggest problem with the interface by far was that of race conditions; without a visible procedural main loop it was impossible to tell which of several inputs would be handled first, or which path of a device with two outputs would be called first. At first glance, it would seem as though the systems you show exhibit the same problem. How does LabVIEW get around this?

In LabVIEW, race conditions generally only happen if you use global variables shared by VIs, or if you read and write a front-panel control from more than one place on your diagram (using a construct rather inappropriately named "local variables"). You can avoid problems with the former by using subVIs instead of global variables; if you don't mark a VI as reentrant, all calls to it are serialized. As far as local variables (i.e., aliases for front-panel controls) go, you should never need them in a computation VI. Data should flow from inputs to outputs. The only place where locals are really appropriate is in GUI code, when you want to change the value of something the user just typed. This kind of race condition is shared by any GUI programming environment, and the best defense is to code carefully.

The other problem was that, generally, it was just a big pain to do any sort of assignment logic involving data stored between inputs. Getting the data back out of a variable could only be done by prodding it somehow in order to force it to produce an output signal.... How does LabVIEW get around this?

LabVIEW doesn't really have this problem. If you use a subVI instead of a global variable, you just drop it on your diagram anywhere you want to read the value. Plus, there are lots of other ways to share data and synchronize operations between VIs, including queues of any datatype, notifiers, semaphores, and others.
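
In C terms, a non-reentrant subVI standing in for a global behaves something like this sketch (the names are invented; LabVIEW does the serialization for you):

#include <pthread.h>

static double stored_value;
static pthread_mutex_t value_lock = PTHREAD_MUTEX_INITIALIZER;

/* Every caller funnels through here, so reads and writes are serialized,
   much like calls to a subVI that is not marked reentrant. */
double shared_value(int write, double new_value) {
    double v;
    pthread_mutex_lock(&value_lock);
    if (write)
        stored_value = new_value;
    v = stored_value;
    pthread_mutex_unlock(&value_lock);
    return v;
}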

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

SoftWIRE (4.00 / 1) (#29)
by SaintPort on Tue Feb 04, 2003 at 08:37:58 AM EST

http://www.softwiretechnology.com/

probably falls in this category.

It allows graphical linking of ActiveX controls and meshes with VB6.0 or VB.NET.  Trial and student versions are available.

--
Search the Scriptures
Start with some cheap grace...Got Life?

I'm vaguely familiar with SoftWIRE.... (none / 0) (#34)
by unDees on Tue Feb 04, 2003 at 10:29:00 AM EST

Someone posted an announcement to Info-LabVIEW once about SoftWIRE, so I headed over to the site and checked it out a few months back. It's definitely an interesting concept, and the tight integration with Visual Studio is a plus.

I chose not to evaluate it for this article, because it appears that there is still a "traditional" programming language (C# / VB) behind the scenes, and I wanted to concentrate mainly on environments where the diagram is the language. Please correct me if I've misunderstood how SoftWIRE works; I reached this conclusion by going through the online demos and by reading sentences like the following one from their site:

Icons are powerful .NET components and controls that contain functions written in C#.


Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
How well do these things scale ? (5.00 / 2) (#50)
by Simon Kinahan on Tue Feb 04, 2003 at 11:43:52 AM EST

I know of attempts from the past to create graphical languages, and, of course, any Rational salesman worth his bonuses will tell you code never works unless you've also got a finely detailed series of UML diagrams to go with it. My concern with such ideas has always been how well they scale up to big projects.

A UML diagram showing every detail of a system's data structures with more than 20 classes on it becomes unmanageable, and 20 classes is really small. I'd imagine that if you have a box for every function call, rather than a box for every class, it becomes unmanageable all the quicker. How do these tools handle this problem?

Simon

If you disagree, post, don't moderate

Scalability (4.00 / 1) (#54)
by unDees on Tue Feb 04, 2003 at 01:34:25 PM EST

LabVIEW scales like any procedural language would. You don't have one diagram for the entire application with every single function call in your program represented by a box. You have one diagram for your top-level VI (like a C main() function), and each function call in that top-level routine is one box. Double-click on a subVI call to open a new window with the diagram that implements that behavior, just like looking at the source to a C function.

So it's not like a diagram in Rational that just shows the classes and their relationships (please correct me if this is not how Rose and its ilk are used). Rather, it's the graphical representation of the behavior of all those modules.

It scales reasonably well. A medium-sized project in LabVIEW might have 1,000-2,000 VIs (that's 1.000-2.000 if you're in Europe :), and 30,000 nodes total (kind of like KLOC, but not directly comparable). And there are plenty of developers who deal with much larger projects daily. At those levels, the biggest issues become those shared by all large software projects: software updates and synchronization.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

Another question ... (none / 0) (#55)
by Simon Kinahan on Tue Feb 04, 2003 at 01:48:38 PM EST

What about revision control?

Simon

If you disagree, post, don't moderate
[ Parent ]
Binary files and revision control (none / 0) (#56)
by unDees on Tue Feb 04, 2003 at 02:06:50 PM EST

Each VI is a separate binary file, usually only a couple hundred kilobytes. We deal with VIs the same way you might place any other binary file (such as a Word document) in a revision control system. Our source code is archived in a CVS repository, and we use the wonderful TortoiseCVS graphical front end.
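
(With command-line CVS, for example, adding a VI via "cvs add -kb MyProgram.vi" marks it as binary, so CVS skips keyword expansion and won't attempt text merges on it. MyProgram.vi is just a placeholder name.)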

Of course, if two developers make concurrent changes to the same file (analogous to two C developers modifying not just the same file, but the same function), most revision control tools can't automatically merge binary files. LabVIEW offers the next best thing: if you buy the professional version, you can use its graphical diff feature to pull up the new and old revisions side by side and flash red circles around each change as you select it from a list.

The pricier versions of LabVIEW will integrate with some Win32 tools that implement the Windows SCC API, including Visual SourceSafe and the cross-platform Perforce. If you don't use any of these, LabVIEW also has a very rudimentary revision control system built in, with which you can add comments and create a timestamped, archived revision every time you save a VI.

The hardest thing about source code control with LabVIEW is that sometimes a VI will change and need re-saving simply because one of its subVIs has had a major change made (such as altering the connector pane, or moving to a different directory). Of course, you can either not save VIs you know have not changed, or put up with the fact that VIs will occasionally need re-checking in, even if you haven't made manual edits.

As a side note, VEE uses text files for its disk representation, so at least two facets of revision control (repository size and checkin/checkout speed) are smoother for VEE developers.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

Prograph (5.00 / 2) (#51)
by substrate on Tue Feb 04, 2003 at 12:37:58 PM EST

In grad school I extended a tool written in Prograph. The concept itself was interesting, but the implementation was painful. Maybe it was the fault of the original author, but it was very difficult to follow the code because it appeared as a rat's nest of interconnections.

More about your project? (none / 0) (#58)
by unDees on Tue Feb 04, 2003 at 02:35:46 PM EST

I'd like to hear more. Another K5er (Pac) mentioned Prograph as well. What was the nature of your project? Did Prograph provide any manual way to untangle the wires, or is the routing fixed?

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Prograph (5.00 / 2) (#78)
by substrate on Tue Feb 04, 2003 at 08:47:05 PM EST

The project took a mathematical description of an algorithm and generated a layout that could be used as a module in an integrated circuit. I think part of the problem was that the code was originally written in sort of a stream of consciousness mode. As a result there were these huge "subroutines" that probably should have been broken into 5 or 6 smaller routines.

It's been a long while, but I think there was some limited auto-untangling ability as well as manual, but I think this particular code may have been too far gone. I picked up Prograph some time later and had more luck with it, but I started from scratch and kept things manageable.

I've also used LabView, and for its purposes I thought it was a very decent tool.

[ Parent ]

Other music tools (5.00 / 1) (#52)
by fluffy grue on Tue Feb 04, 2003 at 12:50:43 PM EST

There are actually a lot of music composition tools which use dataflow for major parts of their setup. Jeskola Buzz, GNU Octave, and many others I can't name off the top of my head use dataflow diagrams for effects chains, and there's some tools out there which use dataflow for the composition itself (though I'm drawing a blank on their names right now).

Also, I remember playing with a dataflow-based visualization language made by IBM in the early/mid 90s, though unfortunately, I don't remember its name either... I also remember a dataflow-based image processing package for OS/2 (conceptually similar to Photoshop "effect layers," except that it could take multiple inputs, rather than just the compositing of the previous layer). Which, again, I can't remember the name of.
--
"Ain't proper English" ain't proper English.
"Is not a quine" is not a quine.

[ Hug Your Trikuare ]

some? try many or even most (none / 0) (#88)
by jjayson on Wed Feb 05, 2003 at 01:01:53 AM EST

Anything that tried to mimic an effect rack, MIDI synth, or modular synth uses dataflow. That includes the big programs too... like Reason and Reaktor. These programs even use the box/wire metaphor. A tool like Sound Forge might not have the boxes and wires, but it still operates on the concept of a generator and filters.
_______
Smile =)
* bt krav magas kitten THE FUCK UP
<bt> Eat Kung Jew, bitch.

[ Parent ]
Not most... (5.00 / 1) (#91)
by fluffy grue on Wed Feb 05, 2003 at 02:59:20 AM EST

Many, sure, but not most. Most of the ones you know, yes. Most of the ones in existence, no.
--
"Ain't proper English" ain't proper English.
"Is not a quine" is not a quine.

[ Hug Your Trikuare ]
[ Parent ]

OS/2 and Graphics (none / 0) (#136)
by Anonymous 242 on Thu Feb 06, 2003 at 12:44:31 AM EST

That wouldn't have been Colorworks, would it?

If nothing else, those folks knew how to write a manual. 300+ pages, hardbound, full color ...

[ Parent ]

Maybe (none / 0) (#137)
by fluffy grue on Thu Feb 06, 2003 at 01:21:55 AM EST

I forget. It was on the shelf in my office back when I ran Hobbes. I remember flipping through the manual (which I don't recall as being color or hardbound).
--
"Ain't proper English" ain't proper English.
"Is not a quine" is not a quine.

[ Hug Your Trikuare ]
[ Parent ]

Instrumentation Software (none / 0) (#59)
by gmol on Tue Feb 04, 2003 at 02:44:19 PM EST

I am amazed that the kind of instrumentation software you are talking about exists, and wonder why it isn't used more.

In my lab we have one computer for each stupid instrument. The software for each instrument is very different and quite often sucks (with the exception of a new robot liquid handler we got from TECAN, whose software is actually quite slick).  This leads to huge amounts of lost productivity: moving data files around, various control scripts, higher learning curves (for what are actually the same abstractions)....

In short, instrumentation software sucks; it should be more uniform and less flaky.

Do people outside the chem/bio field feel the same?

Virtual instrumentation (none / 0) (#61)
by unDees on Tue Feb 04, 2003 at 03:13:54 PM EST

That sounds painful. How do the PCs communicate with the instruments in your lab? Serial, Ethernet, etc.? LabVIEW is designed to let you quickly grab data from all your devices onto one PC, maybe do a little numerical analysis, and pop it into a fancy graph or three. I'm pretty sure there are several companies in the biochem field who use it. In other industries, too, there are still plenty of difficult-to-use instruments and antiquated, platform-specific control programs, even (gasp!) where I work.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Most communicate through serial ports... (5.00 / 1) (#66)
by gmol on Tue Feb 04, 2003 at 04:31:01 PM EST

The day I have my own lab, I am going to make sure I have instruments (I'll custom-make them if I have to!) which all communicate through a wireless standard interface, so that anybody with a computer can run any instrument from anywhere in the lab....
I can't imagine how much energy is wasted by keeping those computers on most of the time when they are not being used....

[ Parent ]
This is totally dumb. (1.83 / 12) (#60)
by tkatchev on Tue Feb 04, 2003 at 03:11:31 PM EST

The only reason why somebody would want to use a graphical programming tool is for creating complex parallel programs. And even then the individual code blocks are still programmed in a textual language.

In spite of the useless novelty of "graphical programming" designed to impress dumb Americans, humans are still predominantly text-based creatures.

We think in terms of natural language, not in terms of pictures.

In short, get a life and find a more productive cause to hype.

   -- Signed, Lev Andropoff, cosmonaut.

You're entitled to your opinion, of course... (5.00 / 4) (#62)
by unDees on Tue Feb 04, 2003 at 03:27:09 PM EST

In many cases, the parallelism you cite is exactly why people use dataflow languages. And no, it's not necessary to have text at some level. You can create graphical code down to the function level and beyond that takes advantage of multithreading or in some cases multiprocessing.

In spite of the useless novelty of "graphical programming" designed to impress dumb Americans, humans are still predominantly text-based creatures.

Visual programming impresses lots of "dumb people" not only from the Americas, but from Europe, Asia, Africa, and Australia as well. Western Europe, particularly Germany and Scandinavia, has a thriving community of graphical developers. And the concept has been around for a couple of decades, so it's hardly a novelty any more.

We think in terms of natural language, not in terms of pictures.

Are you really qualified to make that blanket statement for all of humanity? Anyway, if you're viewing a dataflow diagram, it isn't necessary to think through the design--you see it. Creating the diagrams is, of course, another story. Even then, are you really sure every graphical programmer is busy spewing out text narrative in his head? ("Hmmm, first I create an Add node here, next I drag this wire....")

In short, get a life and find a more productive cause to hype.

This programming style is no hyped-up cause; it's simply a topic for discussion I thought a few people on K5 might find interesting. If you're not one of those people, hey--that's fine. Enough people find graphical programming an easier way to visualize the system they're creating to make the topic relevant to programming, whether or not you or I personally happen to like it.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

processing time (5.00 / 3) (#69)
by Rhodes on Tue Feb 04, 2003 at 05:24:19 PM EST

If you took the word "stop" from a white-bordered red octagon, would you still slow your car down? Text takes longer to process than simple signs. Pictionary points out that verbs and abstract concepts are difficult to represent pictorially.

[ Parent ]
No such "fact" (4.00 / 2) (#127)
by Estanislao Martínez on Wed Feb 05, 2003 at 05:25:36 PM EST

Text takes longer to process than simple signs.

This statement is vague enough to be meaningless. It depends on context; you can't generalize over all instances of text and/or sign recognition. For example, in your stop sign example, presumably in the context of the stop sign the word STOP can be recognized more quickly. There is also the fact that stop signs occur in predictable contexts as you drive around towns (i.e. intersections without signals).

--em
[ Parent ]

Context (none / 0) (#129)
by unDees on Wed Feb 05, 2003 at 05:45:56 PM EST

I agree that context is important. If, however, I happen to be driving in a foreign country that also happens to use red octagons, and the text in the sign says, "FRGMSDFFFFL," I might indeed guess that it's a stop sign. (Of course, assumptions like that can get one into trouble.)

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
And... (4.00 / 3) (#130)
by Estanislao Martínez on Wed Feb 05, 2003 at 06:06:28 PM EST

If, however, I happen to be driving in a foreign country that also happens to use red octagons, and the text in the sign says, "FRGMSDFFFFL," I might indeed guess that it's a stop sign.

And if a native of that country is driving by an intersection where the sign is misspelled as "FRGMSDFFFL", do you think she'll notice? She'll just see a FRGMSDFFFFL sign.

The point is that the important factor is the overall situation; the components determine that situation redundantly, and taking in the situation does not require attention to all the visual components of the situation. This is why being at an intersection with a red octagonal sign can speed up the recognition of the word "STOP" in the first place. (And conversely, why it is much more difficult to read a list of color words when the words are printed in a color different from the one they name.)

--em
[ Parent ]

Dude, (none / 0) (#132)
by it certainly is on Wed Feb 05, 2003 at 08:50:12 PM EST

call me when you can put up a board with the next 5 towns and their distances using only pictograms.

BTW: Stop signs are deliberately shaped like that so you can tell them apart from every other sign, even when they're covered in snow.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

It's called a map. [nt] (none / 0) (#160)
by NFW on Sat Feb 22, 2003 at 05:15:12 PM EST




--
Got birds?


[ Parent ]

It isn't just me, you're bitter with everyone! (5.00 / 1) (#71)
by subversion on Tue Feb 04, 2003 at 06:30:00 PM EST

Let me put it to you this way.

Take a complex control system, involving not only pure PID control, but also (let's just use some examples here) deadbeat control, feedforward, and maybe some other random digital processes.

Implement it in C++ code.

Now implement it in Simulink, a graphical environment.

Tell me which you got done faster.

I've done both, as a comparison for coursework, and I can tell you that for some things graphical programming is faster and easier to debug.  

Why use graphical programming?  Hell, why use high-level languages?  Why don't we go back to machine code?  Because working in high-level languages is generally more efficient and productive.  For some tasks, graphical programming is more efficient and more productive.

For others, I'll go back to text programming.

It's just another tool.

If you disagree, reply, don't moderate.
[ Parent ]

But that's my point. (none / 0) (#99)
by tkatchev on Wed Feb 05, 2003 at 04:11:36 AM EST

Graphical programming only works when you need to implement something with a high degree of parallelism.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

And this is a bad thing? (4.00 / 1) (#107)
by DaChesserCat on Wed Feb 05, 2003 at 10:54:37 AM EST

Personally, I see Ubiquitous Computing being the paradigm of the future. The idea is that, instead of having one machine which you interact with at a time, which runs all of your applications, there are a variety of machines, interconnected with some kind of networking (a considerable amount of it wireless). Consequently, when you use a program, you aren't necessarily running one monolithic program on one machine; you're often running parts of the program on a variety of different machines, and the mix is not necessarily homogeneous.

You can use things like PCAnywhere or VNC to allow one machine to access another's apps (I can use VNC, on a Linux machine, to reach a Windows machine and run Windows apps), but that is decidedly NOT optimal. Consequently, we WANT to write programs which are basically collections of small, communicating modules. If the user interface modules are written in, say, Java, I can use a larger number of systems for the user interface, even if the back-end modules require a more specific hardware/OS platform.

Chances are, you're using such a set of interconnected modules at this very moment. I'm not sure what hardware/OS this website is hosted on, but it doesn't HAVE TO be the same as the workstation on which you're reading this. Graphical programming can make it easier to visualize the interconnections and the boundaries between the various modules.

People are currently writing web-based apps, with different network interfaces (ODBC, HTML, RPC, etc.) and different modules. Writing stuff in server-side JSP, HTML, and client-side JavaScript (all text-based) gets tedious after a while. Adding Java applets, with their own interfaces to various network services, makes it more difficult. Then, of course, there are EJBs, or M$'s .NET platform with its Web Services (which may provide services to clients OR to other services). Having a well-planned UML diagram to work from helps, but having a graphical layout of the actual development (because the graphical layout IS the development) would be nice.

If it eases the development of parallelized, pipelined applications, then it's probably a good thing, because that's where we're headed, near term and longer term.

Oh, and for those who are still using these applications in a monolithic "entire application running on one machine" mode, parallelized, pipelined applications will still work. I mean, I use X-Windows apps on my Linux machine. X is inherently client-server, but both tasks just happen to run on the same machine. The modularity, though, gives me flexibility which a single, monolithic app just can't touch.

We're already there, and the future is more so. In the future, people who code exclusively in text-mode will be about as rare as people today who code exclusively in machine language. Such people still exist, but they are definitely a minority.

Trains stop at train stations. Busses stop at bus stations. A windows workstation . . .
[ Parent ]
Yes, but you miss the main point. (none / 0) (#140)
by tkatchev on Thu Feb 06, 2003 at 12:47:25 PM EST

The actual code will still be written using a text editor.

Once you write, say, four or five extremely high-level code blocks (each block being something like "process a customer request" or "make a database query"), then you can combine these high-level blocks into a parallelized graphical tree.

Using more fine-grained graphical programming is a very bad idea, I think.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

I don't think that's a valid point... (none / 0) (#145)
by roiem on Thu Feb 06, 2003 at 03:21:50 PM EST

...because in any high-level language, the language "primitives" are implemented in some lower-level language. C-language primitives are written in assembler; even higher-level languages like Perl have primitives written in C. So a graphical language with primitives written in a text-based (and therefore lower-level) language is not automatically inferior due to this.
90% of all projects out there are basically glorified interfaces to relational databases.
[ Parent ]
Control systems aren't parallel. (none / 0) (#119)
by subversion on Wed Feb 05, 2003 at 04:11:48 PM EST

At all.  Generally they're SISO.

Next!

If you disagree, reply, don't moderate.
[ Parent ]

A little more detailed. (4.00 / 1) (#134)
by subversion on Wed Feb 05, 2003 at 10:29:46 PM EST

And, to reply to myself; yes, Multiple Input Multiple Output controls exist, but the most common controls in practical use are Single Input Single Output (SISO) and don't lend themselves well to parallelization.

Pretty much any problem in the discipline generally known as Systems Engineering (comprising mainly control systems, communication systems, and signal processing) is easier to implement in a graphical-mathematic programming environment than in a straight-code environment.  The best tool I've ever found for any of these problems is MATLAB/Simulink.  

One common feature of all of these is the fact that generally they're most easily represented as block diagrams.  Block diagrams are very easy to translate into graphical programming.

I don't write OSen.  I don't write compilers.  I don't write word processors.  I don't write GUIDEs.  For all of these, I would probably assume that a graphical solution is not the best one.

For some types of programming, a graphical solution is FAR more efficient.  I happen to work with those types, mostly.  I've implemented most of them in both C (procedural) and MATLAB/Simulink (graphical) and categorically they were easier to implement in M/S.

If you don't like a style of programming, you don't have to use it.  But don't claim it's worthless when it's extremely useful.

If you disagree, reply, don't moderate.
[ Parent ]

parallelization (none / 0) (#152)
by superflex on Fri Feb 07, 2003 at 02:59:44 PM EST

wouldn't the application of the principle of superposition be effective in parallelizing computation for linear or linearized systems?

[ Parent ]
I'm not sure if I understand you (3.00 / 1) (#155)
by subversion on Sat Feb 08, 2003 at 03:20:49 PM EST

But if I do, not really, no.

Superposition means I can add waveforms in a linear system.  I add them once and perform the processing once, and it's more efficient than keeping them separate and processing them individually.
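
(To spell out the linearity property: L(a·x1 + b·x2) = a·L(x1) + b·L(x2), so processing the summed waveform once gives the same answer as processing each piece separately and then summing.)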

Linearized systems are even stranger, and generally you try to minimize use of them because they may mask problems in the original non-linear system; if you do numerical analysis, you generally don't linearize, and instead let the computer brute-force it if you can, to provide the best accuracy.


If you disagree, reply, don't moderate.
[ Parent ]

Wrong (4.50 / 2) (#73)
by bodrius on Tue Feb 04, 2003 at 06:56:03 PM EST

I'm not a graphical programmer, but it seems to me most of what you say may be true for you but is technically false, in the sense that you're extrapolating your preferences to everyone else:

I - There are many reasons why "somebody" would want to use a graphical programming tool.
    I do agree the main reason to program exclusively on such a tool is for complex parallel programs; the ability to "see" the threads in the source code instead of parsing the thread-related commands and tracing the function/method calls on you mind is quite appealing.
    However, any single-threaded programs with lots of object-messages/function-calls can benefit from a graphical view. I think sequence diagrams, for example, are a big part of what sells the idea of UML to a lot of people, they certainly were for me. Technically, there's no reason for these diagrams not to be the "source code" (i.e. we could reduce all implementation information, down to arithmetic operators, to message sequences) rather than an intermediate template, and sometimes that view even makes sense. The problem is that, for most applications, it's not an efficient way to code. You have to know what constitutes "interesting behavior" to document, and what's better documented by a simple for statement.
    I get the impression your objection applies rather to the idea of programming exclusively in a graphical way. In that sense, I share your skepticism. Graphical expression is clearer and more concise for a limited set of problems, and it would be overkill to apply it to most algorithmic tasks. However, a mixture of graphic and textual "source code" is already used through automation tools, and since these tools seem to be able to communicate with "actual code" it seems a nice way to approach problems which can be tedious, annoying and error-prone to code textually, much as coding graphically certain algorithms would seem to be.

II - Most humans are most definitely not text-based creatures. They do not think in terms of natural language most of the time.
     It's easier for each human to memorize, process and then understand concepts when expressed in the kind of stimuli their brain processes best in the long term. Some people work better with text, others with sound, others with visual forms, and they try to translate everything into their preferred form if they can.
     For most people this happens to be visual images, it seems. The current fashion of using pretty pictures on everything is supported by considerable research. It's not a matter of being "dumb"; it's a matter of people thinking differently (mainly because their memory works differently), which presents more than a few communication problems.
     Personally I work better with text too, but I'm in the minority. The idea of people understanding mathematics better through sound/music, or arithmetic through geometry, baffles me, but apparently it's quite common.
Freedom is the freedom to say 2+2=4, everything else follows...
[ Parent ]

Two points. (5.00 / 1) (#98)
by tkatchev on Wed Feb 05, 2003 at 04:10:33 AM EST

1. UML is used for documentation, not as a graphical programming tool. Using UML for graphical programming qualifies you as a grade-A dumbass, I think.

2. Virtually all humans on this planet subvocalize to some degree when they think.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Two answers (5.00 / 1) (#113)
by bodrius on Wed Feb 05, 2003 at 01:31:03 PM EST

1. I never said UML was a graphical programming language. You seem to have problems parsing text here.
   I said that a mixture of graphical views of certain program logic (UML, for example) and textual source code is used to express and implement a program through automation tools, and that there is TECHNICALLY no reason not to use UML diagrams exclusively (let the automation tools generate all the code) and consider these "the source code". UML can be, and frequently is, used to "program" pretty much as the GUI builders in IDEs are used to "program user interfaces". It could be used more radically than that.
    It would be idiotic, down to documenting every operator use as a function call, but it could be done. My point is that the fact that it would be idiotic to use for everything doesn't mean it's idiotic to use for what it's good for, and the same could be said about graphical programming languages.
    Now, if you think there's a semantic difference between UML because it's used to "document", and programming languages because they're used to "implement", it seems to me you're the one confused about the nature of programming languages: they're all for "documentation", or we would be coding in assembly.
    The only practical difference depends on the existence of a compiler/interpreter. If the "documenting language" is sufficiently formal (as some subsets of UML are), a code-generation tool fulfills such a role.

2. No they don't.
   All humans who learn speech subvocalize when processing language (text, speech), because that's how the speech center works. But not all cognitive processes are language-based, and not all thoughts are related to speech.
   If you subvocalize while drawing pictures, identifying objects or solving spatial problems, for example, you're definitely in the minority.
   You seem confused about what "to think" means.
Freedom is the freedom to say 2+2=4, everything else follows...
[ Parent ]

Maybe. (none / 0) (#142)
by tkatchev on Thu Feb 06, 2003 at 12:52:36 PM EST

I don't consider graphics design or spatial orientation to be "thoughts".

Also, there is a very clear difference between documentation and implementation -- documentation is what you do before and after you implement your project.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

I see (none / 0) (#147)
by bodrius on Thu Feb 06, 2003 at 05:42:14 PM EST

If you believe "thoughts"==text, then of course you're right and there's no other way to "think", according to your definition.

But by considering geometry, pattern recognition, etc. not to be part of "thinking", you're not just opposing practically all authoritative and popular definitions of that concept, but also seriously limiting what you can do with it. According to your definition, solving math problems geometrically, exploring a tree in a graph problem, solving a Rubik's Cube, or building physical tools does not involve "thinking".

On the other point:

From a theoretical point of view, your distinction is as meaningless as the SAT's definition for aptitude ("Aptitude is what the SAT tests") and invites the same kind of circularity ("SAT is a test that tests aptitude").

From a practical point of view, if you only document before and after implementation, and never during it, I guess I'm lucky not to work with your code, which would then surely be composed of function and variable names like doZ(), x1, tm2, foo(), ngh(), poit().

I suspect, however, that you don't make the language and the compiler's existence irrelevant in such a way. Or perhaps you do code in your platform's instruction set, and forget there are people who prefer to dumb down their code to readable fragments that say what you're doing, like "sortDescending(array)" or something like that.

Source code is documentation in the same sense that calculations are documentation in a paper: both may, and usually do, require clarifications, descriptions, notes on the intent and the assumptions to facilitate comprehension or to let other parties skip them if they trust the results, but without them there is no document, no proof of what exactly you're doing, and correcting or duplicating the program/proof/experiment is difficult at best.
Freedom is the freedom to say 2+2=4, everything else follows...
[ Parent ]

UML (none / 0) (#114)
by unDees on Wed Feb 05, 2003 at 01:31:24 PM EST

The fact remains that companies market UML as a code-generating, pseudo-programming tool, regardless of whether or not it's appropriate for this task. And a big part of why I didn't include it in this article is that I, like you, consider it more of a documentation tool than anything else.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Yeah. (none / 0) (#141)
by tkatchev on Thu Feb 06, 2003 at 12:50:27 PM EST

My mailbox markets Tibetan herbal remedies as a valid penis-increasing tool. So?

I think this is shoddy marketing, personally. As far as I know, UML code generation is in such a larval stage as to be basically broken.

(Unless something changed in this half-year or so?)

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

You're not smart enough to troll (5.00 / 1) (#74)
by Big Sexxy Joe on Tue Feb 04, 2003 at 07:13:39 PM EST

Trolling is not the same as pissing and moaning like a bitch. These things are beyond your grasp, though.

I'm like Jesus, only better.
Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
[ Parent ]
Uh. (4.00 / 1) (#97)
by tkatchev on Wed Feb 05, 2003 at 04:08:30 AM EST

OK, OK, you win. You are the best troll in the universe.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

I'm not saying I'm a good troll (none / 0) (#108)
by Big Sexxy Joe on Wed Feb 05, 2003 at 12:19:52 PM EST

Indeed, I rarely troll at all, aside from when I troll you. You just happen to be a very poor troll. If it makes you feel any better, you are a wonderful troll victim.

I'm like Jesus, only better.
Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
[ Parent ]
Yes indeed. (none / 0) (#143)
by tkatchev on Thu Feb 06, 2003 at 12:53:15 PM EST

IHBT. I guess I should go kill myself now.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

humans are still predominantly text-based creatures (5.00 / 3) (#75)
by michaelp on Tue Feb 04, 2003 at 07:18:21 PM EST


"still" no doubt because we evolved from creatures who read the lables to tell whether the fruit was ripe rather than looking at the color of it.

We think in terms of natural language, not in terms of pictures.

Which is no doubt why all human languages started with written alphabets and only later devolved into pictograms.


"Every gun that is made, every warship launched, every rocket fired, signifies in the final sense a theft from those who hunger and are not fed, those who are cold and are not clothed."

[ Parent ]
You, sir, are a grade-A dumbass. (1.75 / 4) (#96)
by tkatchev on Wed Feb 05, 2003 at 04:06:35 AM EST

Tell me, do you think in pictures?

I'll bet you 1000 to 1 that you subvocalize rather than paint a diagram in your head.

The rest of your comment is irrelevant trash.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Maybe he's autistic? (none / 0) (#103)
by derek3000 on Wed Feb 05, 2003 at 08:29:30 AM EST

You know, like Dustin Hoffman. Or something like that.

-----------
Not too political, nothing too clever!--Liars
[ Parent ]

Who tkatchev? (none / 0) (#135)
by Big Sexxy Joe on Thu Feb 06, 2003 at 12:08:45 AM EST

Yeah, that's the impression I get.

I'm like Jesus, only better.
Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
[ Parent ]
As a couple people have said before... (none / 0) (#139)
by derek3000 on Thu Feb 06, 2003 at 08:39:15 AM EST

pls die thx beep beep boop boop

-----------
Not too political, nothing too clever!--Liars
[ Parent ]

I feel sorry for you (none / 0) (#116)
by michaelp on Wed Feb 05, 2003 at 02:47:47 PM EST


that you never learned to read without subvocalizing, there are many courses you can take that would help you with that handicap of yours:

If you have trouble with grouping words, check to see if you are subvocalizing -- reading word by word because you are reading with your lips instead of your eyes. Put your finger on your lips as you read -- if your lips move, you are subvocalizing. Also, listen to your thoughts as you read -- can you "hear" each word articulated rather than seeing pictures without even noticing actual words? If you do find you are subvocalizing, break the habit by keeping a finger on your lips whenever you read, for about three weeks. That should do the trick.

Plus people won't think you are quite such a nut if you stop moving your lips when you read.

As for thinking in pictures: if you are only capable of thinking in text, you are missing half your potential -- thinking with half a brain, literally. Of course, one could rather easily deduce that from your typical posting style on k5 ;-).


"Every gun that is made, every warship launched, every rocket fired, signifies in the final sense a theft from those who hunger and are not fed, those who are cold and are not clothed."

[ Parent ]
Dude, (none / 0) (#131)
by it certainly is on Wed Feb 05, 2003 at 08:43:30 PM EST

your lips don't move with subvocalising. That's "vocalising". Subvocalising uses the voice(s) in your head.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

They don't talk as fast as I read either (none / 0) (#133)
by michaelp on Wed Feb 05, 2003 at 08:50:46 PM EST


But however you stop it, subvocalizing must slow you down while reading -- it says so right here on K5!


"Every gun that is made, every warship launched, every rocket fired, signifies in the final sense a theft from those who hunger and are not fed, those who are cold and are not clothed."

[ Parent ]
I do. (none / 0) (#158)
by subversion on Sun Feb 09, 2003 at 09:53:22 AM EST

The first thing I do when I need to explain something to someone (in my field of electrical engineering) is grab a piece of paper and start drawing diagrams.  Because that's how I think about it.

If you disagree, reply, don't moderate.
[ Parent ]
Scripts (4.00 / 1) (#125)
by Estanislao Martínez on Wed Feb 05, 2003 at 05:20:12 PM EST

Which is no doubt why all human languages started with written alphabets and only later devolved into pictograms.

You want to qualify this statement. E.g. I'm not aware of Sanskrit ever being written in an ideographic script.

To make a long story short, most human languages are not written, and most of the ones that have been written never had an ideographic script; but then again most scripts are an adaptation of some other script. The correct statement is that the oldest scripts we are aware of are ideographic.

--em
[ Parent ]

But notice I said "pictograms" (none / 0) (#138)
by michaelp on Thu Feb 06, 2003 at 01:49:05 AM EST

like the ones from Lascaux.

Which tkatchev will no doubt tell you were painted by "predominantly text-based creatures."

After all, since tkatchev thinks in text now, and tkatchev is a people, it is only logical that people have always carefully spelled out their thoughts in text while doing a task. Probably they had some logically unassailable reason for only writing down their ancient texts on materials that would not last as long as the stone walls of their caves.

Maybe they were all using palm pilots made of charcoal and skin?


"Every gun that is made, every warship launched, every rocket fired, signifies in the final sense a theft from those who hunger and are not fed, those who are cold and are not clothed."

[ Parent ]
While I agree with you (none / 0) (#76)
by tzanger on Tue Feb 04, 2003 at 07:33:22 PM EST

You got a 2 from me because you slammed the guy unnecessarily.

I've used LabVIEW and others before, and it's a pain in the ass. Drawing lines and hooking the shit up is fine the first time 'round, but it gets nasty if you ever have to edit someone else's work. Even with a graphical diff, it's a pain in the ass to try to see what exactly was going on and why the change does what it's supposed to.

It's a great tool for getting things up fast but it's a nightmare to maintain.



[ Parent ]
Maintenance... (4.00 / 2) (#85)
by unDees on Tue Feb 04, 2003 at 11:41:57 PM EST

...has only been a headache for me when the code was bad. This has been true of every project I've inherited, from embedded assembler programs to straight-C Win16 code (yuck!) to LabVIEW diagrams. I don't find the problem to be any worse on the graphical side. Spaghetti code is spaghetti code; only with LabVIEW can it finally look like spaghetti in the hands of a sufficiently twisted craftsman. :)

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
True true true (4.00 / 1) (#87)
by it certainly is on Wed Feb 05, 2003 at 12:25:20 AM EST

Being forced to use SDL diagrams at work (because the ITU, in their infinite wisdom, have specified the V5 protocol as a set of SDL diagrams), I completely agree with tkatchev. Graphical languages are stupid.
  • It's an absolute bastard tracing back from a program counter address of a failure to the lines and boxes responsible in the "source code". At least I have the benefit of seeing the intermediate C code that was generated for the diagram.
  • Doing a "diff" to find out what changes a person has committed in an update to shared SDL diagrams is completely impossible without the textual "sdtdiff" tool to tell me the filename/model/block/page/xy-coords of each change.
  • It takes roughly 4 boxes and lines to do a simple "loop" in SDL, compared with just one line of C.
  • Most of my time was spent tracing the path of signals through the model... from the main diagram, into a block, then into a sub-block, then into a sub-sub-block (like Russian dolls), until finally we get to an actual "procedure" to handle the signal. Of course, it then emits another signal, so we have to trace, trace, trace. With C or C++ code, I can launch Source Navigator and trace execution flow through XREF graphs in a couple of seconds.
While I admit that a picture showing a data structure or a block-diagram overview can be helpful, it's utterly untrue that it's useful to program directly in pictures of block-diagram overviews. It's a pain in the backside.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

Yeah. (none / 0) (#95)
by tkatchev on Wed Feb 05, 2003 at 04:04:38 AM EST

SDL sucks. (Besides, I thought it was already pretty much supplanted by UML? The various state and activity diagrams seem to do what SDL does, except in a neater and more rational way...)


   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Supplanted? (none / 0) (#104)
by it certainly is on Wed Feb 05, 2003 at 08:47:22 AM EST

Perhaps UML replaces SDL in some ideological or religious way, but the ITU still specify their state machines in SDL, so no.

kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.
[ Parent ]

That's too bad. (none / 0) (#144)
by tkatchev on Thu Feb 06, 2003 at 01:00:14 PM EST

'Cause I seem to have an allergy towards SDL for some reason.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

It has its uses (5.00 / 2) (#92)
by kerinsky on Wed Feb 05, 2003 at 03:20:50 AM EST

I'm a computer science student and intern at a defense contractor.  My major task has been to write test code to control $1.5 million or so worth of test equipment for quality and functional testing of satellite components.  When I started out they handed me VEE and told me to learn it.  I grumbled that I'd rather use C or C++, but this idea was nixed.  Previous projects had been done in C, but the engineer who wrote the code left, and the rest of the EEs couldn't make heads or tails of his code when they needed to modify it.  I did end up writing a small portion of the program, which had to deal with large amounts of binary data, as a shared library in C for the sake of efficiency.

When I was done, two EEs who worked on the project reviewed my code to make sure they could understand and modify it if the need arose.  Both have had formal classes in C and C++, but neither is very interested in coding.  They spent more time trying to understand the C code that took me 5 hours to write and debug than the VEE code that I had put more than a hundred hours into.

You could say this is an indication that the VEE is less efficient to write in, but I am certain I would have spent more time writing in C, that I would have had more bugs and that the EEs wouldn't have been able to grok the code nearly as well.  In fact the EEs understood the VEE code well enought to see a bug just by looking at the code while they didn't even understand how I was inverting bits in an integer using exclusive or in C.  Once they learned the magic of the ?: operator in VEE there was only one thing in the code that they didn't understand (and that code did something that I don't even think is possible in compiled languages), the C section is still almost black magic to them.

VEE is clearly not a catch-all solution, but it does a good job of providing a graphical interface for controlling test and measurement equipment.  It is also wonderful to print out and show to manager types.  When they have a question, I can usually just point to a box and say that this box fulfills that function; if I handed them a printout of C code, their eyes would glaze over, but they would continue to ask silly questions.

-=-
A conclusion is simply the place where you got tired of thinking.
[ Parent ]

Your complaints have nothing... (2.66 / 3) (#94)
by tkatchev on Wed Feb 05, 2003 at 04:00:22 AM EST

...to do with graphical programming or C++.

Problems like that are solved with proper human resource management and a tight corporate documentation standard.

   -- Signed, Lev Andropoff, cosmonaut.
[ Parent ]

Human resource management (5.00 / 1) (#124)
by kerinsky on Wed Feb 05, 2003 at 05:19:05 PM EST

Any time over the next couple of years, the testing requirements could change for a variety of reasons. As it stands, the electrical engineers working on the project will be much more likely to be able to modify or add code in VEE than they would in C. If the code were in C, the project would need to borrow a software engineer, probably stealing one from another project. Then the software engineer would have to be brought up to speed on how the relevant parts of the hardware work so that they can be tested properly.

You do have a good point in that for any complicated change or large addition they are almost certain to need a full-time coder, but such large changes are not too likely. Ideally they would have an EE who knows how to program well working in the lab. They did in the past, but ran into problems understanding his C code when he left. When a piece of test equipment dies and is replaced with a different model, they have no idea where to even start and may have several days of downtime. The EEs could probably make such modifications to VEE code in an afternoon.

In the end, we're living in a complicated world. It's much easier to make a decision to use VEE versus C than it is to change how the HR department works or the documentation standards at a large company. You said that there is only one reason why anyone would want to use a graphical programming language. My boss had a couple of other reasons as well, and in the end I think they're perfectly valid.

-=-
A conclusion is simply the place where you got tired of thinking.
[ Parent ]

pictures and words. (none / 0) (#156)
by Michael Moser on Sun Feb 09, 2003 at 02:20:10 AM EST

> We think in terms of natural language, not in terms of pictures.

But then why does everyone visualize a state machine as a picture of a graph? (I tend to prefer tables.)
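
For what it's worth, the table version is only a few lines in, say, Python (a toy two-state machine, with names invented for illustration):

    # A state machine as a table: (state, event) -> next state.
    TABLE = {
        ('off', 'press'): 'on',
        ('on',  'press'): 'off',
    }

    def step(state, event):
        return TABLE.get((state, event), state)   # unknown events: stay put

    state = 'off'
    for event in ('press', 'press', 'press'):
        state = step(state, event)
    print(state)   # -> 'on'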

[ Parent ]
Ha Ha! i found the counterexample (none / 0) (#159)
by Michael Moser on Mon Feb 17, 2003 at 05:04:53 AM EST

>We think in terms of natural language, not in
>terms of pictures.
>
>In short, get a life and find a more productive
>cause to hype.

Ever seen the Microsoft BizTalk mapping editor?

The idea is that there is input and output in XML format, and you want to produce a transformation between the two hierarchies.

The UI has two tree controls: one for the input, the other for the output.

What do you do? Drag links from left to right.

What is the output? An XSLT transformation. Now try to construct that XSLT transformation by hand: it's much more complex than dragging links around.
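
To get a feel for what each dragged link saves you, here is a rough Python sketch of a single hierarchy-to-hierarchy mapping written out by hand (element names invented for illustration; the real mapper emits XSLT rather than code like this):

    # Hand-coding one "link" between two XML hierarchies:
    import xml.etree.ElementTree as ET

    src = ET.fromstring('<order><customer><name>Ada</name></customer></order>')

    dst = ET.Element('invoice')
    bill_to = ET.SubElement(dst, 'billTo')
    bill_to.text = src.findtext('customer/name')   # one dragged link, by hand

    print(ET.tostring(dst, encoding='unicode'))
    # -> <invoice><billTo>Ada</billTo></invoice>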

[ Parent ]

I don't like LabView (and similar) (5.00 / 5) (#64)
by ishark on Tue Feb 04, 2003 at 03:44:21 PM EST

Let me say something negative about this LabView/graphic programming thing. I've used HP VEE (very little), and tried more seriously to learn and use LabView. I failed miserably. Many explanations are possible; the most likely one is that (as correctly noted in the article) "The ability to prototype rapidly and call on a wide range of industry-specific libraries leads many to boast of a productivity increase for certain tasks". The magic part is "certain tasks", and it's very likely that mine did not fall among them. Still, after some effort at trying to use it, I've decided that I'll never use it for any task. The reasons are easily explained:
  • data flow is nice, but my brain is not dataflow. I can follow a single "thread" of execution; when things start to split up massively, I'm completely lost. And I know that I'm not the only one: the LabView manual itself suggests organizing the flow from left to right and minimizing "jumps". Basically, if you organize execution from left to right and put a label on every "wire", what you have looks very similar to a sequence of function calls with variables moving the data through them (see the sketch after this list). Where is the advantage of dataflow if the best way of using it is not to use it?
  • Again on "threads". The dataflow thing means that different VIs can be executed at the same time (if there are no wires connecting them, for example). This is nice; basically you get multithreading for free (and since there's no shared data -- it all travels on the "wires" -- there are no lock/mutex/deadlock nightmares). Now, multithreading is nice when it's useful, but being forced to use it is not: debugging a multithreaded program is much messier than debugging a single-threaded one. And having the source shown as a nice diagram does not necessarily make your life easier. I'm sure it's useful, but my problems tend to be very single-threaded (measure something, decide, change stuff, measure... etc.). I'd rather debug them single-threaded, thank you.
  • The "size" of a VI is basically limited to a screenful. Go beyond that and it becomes a game of scrolling around to see what's up. This may be seen as good, since it forces the programmer to split the program into small sub-units, but when you are tracing/debugging stuff, or simply want to understand what a program does, it turns into a nightmare of a million open windows. You can easily end up five or ten levels deep in "entering the box" to see what a sub-VI is doing.
  • flow control constructs are messy: in a true/false if, LabView will not show the two branches at the same time. It's a single box where you can "switch" between the two states and see the diagram associated with each. Again, this sounds nice: set your combination of false/true and you see the flow associated with that combination. But if you want to see what happens, you're forced to flip back and forth between the different cases all the time.
  • Wire/VI placement aesthetics. If you throw your VIs around and connect them quickly, the whole diagram becomes a total mess. Just as indenting code correctly increases readability, here it is important to align the wires and make sure they don't cross too many times, or you'll be lost in the diagram. This takes a LOT of time. An example: VIs have their inputs and outputs at specific locations. This means that to avoid crossings you must be careful to place the outputs of the early VIs in a reasonable position relative to the inputs of the following VIs. This involves a lot of moving stuff up and down to find the best combination. I already hate designing graphical interfaces and "correctly positioning" stuff on the screen... I'd rather not be forced to do it ALSO for the code.
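
To illustrate the first point, here is roughly what such a labelled, left-to-right flow reads like as text; a rough Python sketch (the instrument functions are invented for illustration):

    # Each labelled "wire" is just a named intermediate value between calls:
    def measure():       return 3.3            # stand-in for reading an instrument
    def correct(volts):  return volts * 0.98   # stand-in for a calibration step
    def log(volts):      print("reading:", volts)

    raw       = measure()      # wire 1
    corrected = correct(raw)   # wire 2
    log(corrected)
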
Overall, I have no doubt that it can be very effective for quickly developing small things that rely heavily on pre-existing functions, but I would never use it for anything involving math or complex decisions. Right now I'm using C++ to control my instruments, and I'm happier and a lot more productive... (even if MSVC 6.0 makes me want to kill whoever wrote it, at times).

Excellent points, all of them. (none / 0) (#65)
by unDees on Tue Feb 04, 2003 at 04:14:58 PM EST

I think you said it best in your introductory paragraph: LabVIEW and its kin are not right for every programming job. But let me address your points in detail as well, as they point out a number of real challenges (pessimists read: shortcomings) to the graphical approach:
  • Basically, if you organize execution from left to right and put a label onto any "wire" what you have looks very similar to a sequence of function calls with variables moving the data through them. Where is the advantage of data-flow if the best way of using it is not to use it?

    Because you can have several parallel threads running left-to-right, and stack those threads top-to-bottom. That's a weak answer to your point, but there are other issues as well. Organizing your dataflow carefully and occasionally labeling wires is like adding comments to text code: they're hints around non-obvious portions of the code. And a complex but well-implemented dataflow calculation often looks nothing like a simple left-to-right sequence.

  • Now, multithreading is nice when it's useful, but being forced to use it is not nice: debugging a multithreaded program is much more messy than a single-tread one.

    You're never forced to use it: if you want or need to specify straight-line code for portions of your program, enforce it using dataflow--pass error information or partial results through a chain of VIs, and the chain will execute in one thread (see the sketch after this list).

  • The "size" of a VI is basically limited to a screenful. Go beyond that and it starts being a game of scrolling around to see what's up.

    The "size" of a single function in a text program had better not be more than a screenful or two, or you'll have the same problems. I have certainly seen projects with nested VI upon nested VI and so on, 7, layers deep. There is a happy medium between one gigantic diagram and 10 layers of nested VIs. Anyway, you're supposed to unit-test the low-level VIs graphically, and then be done with them. To be honest, I almost never have to step into the bottom, communications-based layer of most instrument control programs.

  • in a true-false if, LabView will not show the two branches at the same time. It's a single box where you can "switch" between the two states and see the two diagrams

    This is not always true; there is also a selector, which takes a Boolean input plus two wires of any data type and outputs the first value if the Boolean is true, or the second otherwise. You can see the alternatives for both the True and False choices flow into the selector as the choice is made. However, if the code has side effects, you're back to putting it inside a Case structure. I've found these to be more of a help than a hindrance, as they also work on enumerated data types, like a switch statement in C. I should point out that VEE shows more information than LabVIEW for If/Then/Else nodes without needing a mouse click.

  • If you put your VIs around and connect them quickly, the whole diagram becomes a total mess.

    If you type your text code in quickly, the whole function becomes a total mess. Just as you pointed out, making code easy to read takes discipline. I find it no worse for graphical programming than I do for text programming. And many people hate neither GUI design nor aesthetic positioning of graphical code.
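
To make that error-chain idiom concrete, here's a rough Python analogue (the function names are invented; in LabVIEW the chained value would be the error cluster wired from VI to VI):

    # Each step takes and returns an error value, so chaining them
    # fixes the execution order and keeps everything in one thread:
    def configure(error):
        if error: return error
        print("configure instrument")
        return None

    def acquire(error):
        if error: return error
        print("acquire data")
        return None

    error = None
    error = configure(error)   # the chain guarantees configure()...
    error = acquire(error)     # ...runs before acquire()
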
But it all comes down to personal preference, in the end. You expressed yours eloquently and reasonably, and I applaud you for it.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Ok on "personal preference" (5.00 / 1) (#100)
by ishark on Wed Feb 05, 2003 at 05:03:47 AM EST

Thanks for the answer; you make some good points. It is possible that LabView has improved (I'm thinking about the if/else thing). The version I have experience with is 4.something (0? 1?).

I think that you are right when you say "But it all comes down to personal preference, in the end". By comparing my language preferences with other people's I'm starting to think that there are basically two criteria.

  • Objective: a language may provide primitives to deal with some problem, and this automatically makes it more suitable for that problem (think regexp in perl, instrument drivers for LabView, etc. etc.).
  • Subjective: whether the language matches the "mental process" of the writer. For example, I have two friends (both professional programmers): one hates perl, the other loves it. They have the same background in "good programming"; they have the same ideas about good coding guidelines, structure, you name it. Still, one would never use it, while the other uses it a lot.
I started programming in "text-oriented" languages very early, so for me it is completely natural to think in those terms, and using a graphical language requires a conversion in my mind from text to graphics. If I had started with graphical programming, maybe I'd be thinking the opposite now....
Overall, the personal preference/"mental process" thing probably explains a lot of the religious wars and flame wars surrounding programming languages :)

[ Parent ]
My own experience... (none / 0) (#112)
by unDees on Wed Feb 05, 2003 at 01:27:45 PM EST

I was brought into this department as "the C++ guy," and I figured, "Well, this LabVIEW thing looks like a nice toy, but I'll probably spend most of my time in a text editor." Graphical programming did indeed grow on me, as I found that it appealed to some visual, aesthetic part of me, just as text programming still appeals to my linguistic side. It's a nice tool in my arsenal now, and it's appropriate for about half of my projects.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Pipelining (4.00 / 2) (#68)
by Rhodes on Tue Feb 04, 2003 at 05:20:55 PM EST

Pipeline Pilot

Your description sounds a lot like the concept of pipelining. Scitegic focuses more on the informatics market, and has chemical enumeration, and database handling plugins. Very slick, quite expensive.

13 years ago. (4.00 / 1) (#70)
by a boy and his bike on Tue Feb 04, 2003 at 06:24:25 PM EST

A nerd sits in his parents' basement with his Amiga 3000. He just bought a copy of the AmigaVision authoring system. It lets you write software by dragging icons that represent various 'objects', such as displaying data or getting information from the keyboard, into a timeline. It's great; nerd thinks this is a very good way of doing things.

Everything old is new again?

Which came first? (5.00 / 1) (#80)
by fluffy grue on Tue Feb 04, 2003 at 08:54:27 PM EST

I'm pretty sure LabVIEW and dataflow programming in general predate the Amiga 1000. ;)
--
"Ain't proper English" ain't proper English.
"Is not a quine" is not a quine.

[ Hug Your Trikuare ]
[ Parent ]

Magic and other stuff... (5.00 / 2) (#72)
by Maniac on Tue Feb 04, 2003 at 06:30:52 PM EST

Another example, not publicly sold, with some "lessons learned" from using graphical tools such as LabVIEW.

In the mid-'80s, when Symbolics was still in business selling Lisp Machines, we developed an application named "Model Builder". The concept was similar: draw dataflow diagrams on the screen, and then generate code (FORTRAN) using the back-end processor. There were about 20 basic blocks (e.g., +, *, integrators, clippers), and one of them was the "model block", which allowed models to be nested. Our target application was simulators for aircraft (for training, primarily maintenance crews but some flight crews as well).

The "execution model" for the Model Builder generated code was based on what we called "partial order". Blocks, connected directly to an input had a partial order of 1. The blocks connected to another block was assigned a partial order of (original block PO + 1) unless it was already assigned a partial order. This continued until each block had a partial order. Feedback occurred in the "next cycle" (we usually ran the models at 10 Hz). A block would be executed only if one or more inputs changed.

What I described above is radically different from the usual design of a real time simulator. Most simulators run at a fixed frame rate, doing the same models in the same order each time. Model Builder generated code would change the order based on which inputs were changed, generally minimizing system latency across the board.
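
In modern terms, that partial-order assignment is essentially a breadth-first level numbering; here is a rough Python sketch (the block graph is invented for illustration):

    # Assign partial orders: blocks fed by an input get PO 1; a block fed
    # by another block gets that block's PO + 1, unless already assigned.
    from collections import deque

    wires = {'input': ['sum'], 'sum': ['clip'], 'clip': ['output'], 'output': []}

    po = {'input': 0}              # treat the input itself as level 0
    queue = deque(['input'])
    while queue:
        block = queue.popleft()
        for downstream in wires[block]:
            if downstream not in po:         # keep the first assignment
                po[downstream] = po[block] + 1
                queue.append(downstream)

    print(po)   # -> {'input': 0, 'sum': 1, 'clip': 2, 'output': 3}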

What we found from using the tool was...

  • a developer could do the same amount of work for about 2/3rds the amount of effort
  • the generated code was larger than hand coded, but usually ran VERY FAST
  • it took 6-12 months to get a new person comfortable using the tools
  • we had some good programmers that were never good Model Builder users
The main reason the development cost was less was zero compilation errors, interfaces that were enforced, and a more logical transition from an illustration of how a system was connected to the "code" that modelled that system. We lost a few good people because of that last problem, but in general, the benefits far outweighed the costs.

We used this tool (with improvements) for about 3-4 years before we decided to produce a complementary tool. We had also noticed that:

  • about 1/3rd of the models were written to call other models (model blocks and connections only)
  • it was hard to do state transitions and collections of data (we had a "bus" concept, but it was error prone)
  • we had no tools to split a large model into pieces (to run on more than one computer)
  • we did not have a good place to capture design information (data dictionary)
  • we did not have an easy way to do stubs
The new tool, named "MAGIC" (an acronym I have forgotten) was built to address these problems.

We used a commercial program, Excelerator RTS to do the drawings. We used a "Ward Mellor" (ESML) notation for the drawings. The back end was a C program (again generating FORTRAN) that took the diagrams, the data dictionary, and an allocation table (which blocks into what CPU, which ones are stubs) as input to generate the code. What we got out was a set of main programs that would implement those diagrams, route the data between CPU's, and call Model Builder models at the lowest level.

With Magic, we saw similar improvements in productivity. The learning curve increased a little bit more (two tools, not one). However, the generated code was much better and still fast. The one new drawback we had was the turnaround to

  • find bug
  • fix bug
  • run tool
  • run compiler
  • test fix
which took over a day - mainly due to the size of the generated code and the speed of the FORTRAN compiler (feeding it over 40000 lines of code took a while...). A revision in the code generator allowed us to fix that and get the cycle time down to a few hours (not great, but OK).

This was a very scalable solution. The main simulation was a 40 Mbyte data dictionary and about 2000 diagrams in Excelerator. We were able to have over a dozen developers work on their parts of the system and then put it together for integration and test.

We considered the tool set a competitive advantage, both for initial development and retaining the updates. This did not come cheap - I estimate we spent over $1M on developing the tools but they paid for themselves many times over.

I am not sure if those tools are still in use today. They were in use about 10 years ago, but that part of the company was sold off about 5 years ago and I haven't kept up to date with it. With the demise of Symbolics, it's not likely.

  --Maniac


FOR GODSAKES DON'T USE LABVIEW (3.28 / 7) (#77)
by wmg62541 on Tue Feb 04, 2003 at 07:48:44 PM EST

Please don't use LabVIEW -- or any other graphical programming tool. Here's why:

It is much, Much, MUCH easier to transcribe algorithms into text (i.e. a regular programming language) than into a GUI (i.e. LabVIEW).

If you have pseudocode floating around in your head or on a piece of paper, it is rather straightforward to put it into C/C++/Java/C#/VB/Perl/Python; all you have to do is know the syntax of the language that you are using. Usually (but not always), there is a one-to-one correspondence between your pseudocode and your source code.

For example:

    # flux capacitor calculation (pseudocode)
    PROCEDURE fc_calc(Integer flux, Integer cap, Float time)

        IF flux < 0 THEN
            RETURN 100
        END IF

        IF cap < 0 THEN
            RETURN 50
        END IF

        IF cap > 0 THEN
            RETURN 75 + time
        END IF

    END PROCEDURE

Here is how I would write it (roughly) in, say, Python:

    def fc_calc(flux, cap, time):

        if flux < 0:
            return 100

        if cap < 0:
            return 50

        if cap > 0:
            return 75 + time

Obviously this is a contrived example, but you get the idea. The source code will look very similar in other languages, and it will all look more or less like the pseudocode. Once you have the pseudocode written, the source is more or less trivial.

But now imagine transcribing the pseudocode into LabVIEW instead. Since I can't show you pictures, let me walk you through the steps involved:

    1. create a new VI
    2. create an input wire for the flux value
    3. create an input wire for the cap value
    4. create an input wire for the time value
    5. create an output wire for the fc value

    (Okay that was the easy part.)

    6. Now test the flux value. Insert a '0' constant and a less-than gate.

    7. Feed the wire from #6 into a case wire. Careful now, make sure all your wires match up.

    8. Create a circuit in this case wire for the case where the boolean (flux < 0) is a true value.

    9. Now create a circuit in this case wire for the case where the boolean (flux < 0) is false.

    10. Make sure all your wires match up.

    11. Create a wire inside the first case wire that will test the cap value.

    So you create two more circuits, each similar to steps 6-10:

    12. ...
    13. ...
    14. ...
    15. ...
    16. ...
    17. ...
    18. ...
    19. ...
    20. ...
    21. ...

    22. Make all your wires match up.

    [ An hour and 45 minutes later, you are done. ]

    Forgot what your circuit was supposed to do? Bounce around like a schmuck switching case values, etc., etc., ad infinitum, ad nauseam.

And don't you forget that this is trivial code. God help you if you have to write something more complex.

No, stick with regular programming languages, always.

(This goes for everyone. At first I used to tell people that maybe LabVIEW was good for scientists and engineers who really didn't know how to program, since it is easier to point and click. Now I realize that it is a bad choice for them too, as they are left struggling with stupid things like connecting wires to little blocks that have no semantic connection to what they want to do.)

That's a pretty big assumption there (5.00 / 4) (#79)
by fluffy grue on Tue Feb 04, 2003 at 08:52:28 PM EST

What makes you so sure that all algorithms are thought of in terms of pseudocode, and that the people programming in LabVIEW think of their algorithms as imperative code rather than a flow of data between components?
--
"Ain't proper English" ain't proper English.
"Is not a quine" is not a quine.

[ Hug Your Trikuare ]
[ Parent ]

I'm with you. (4.00 / 1) (#83)
by bjlhct on Tue Feb 04, 2003 at 11:24:58 PM EST

I used LabVIEW for a while. The big time drain was, guess what, debugging. Anyway, it isn't hard to translate an algorithm into blocks. At least it wasn't for me.

Sometimes it was hard to know how to do some things (usually interfacing with another application or an instrument), but now ya can just insert actual C of your own as a block if it's easier to do it in code. Damn, I wish I used a Mac so I coulda used AppleScript called from within LabVIEW.

*
[kur0(or)5hin http://www.kuro5hin.org/intelligence] - drowning your sorrows in intellectualism
[ Parent ]

Just to add to fluffy_grue's remarks... (none / 0) (#84)
by unDees on Tue Feb 04, 2003 at 11:33:07 PM EST

The steps for implementing this algorithm in LabVIEW are likely to be quicker than you suggest. For one thing, you don't need any Case structures; you can do it all with Select nodes (which are like non-short-circuiting if/then clauses):
  1. Hit Ctrl-N to create a new VI.
  2. Drop a numeric on the front panel and name it Flux.
  3. Right-click on the diagram, which simultaneously brings it forward and lets you choose a Select node.
  4. Drop a "<0" node between Flux and the Select, and you'll notice they wire themselves quite nicely.
  5. Select the whole diagram and Ctrl-drag it to copy it, and name the new control Cap (it is not necessary to toggle to the front panel to do this)
  6. Drop a 100 constant and a 50 constant at the True inputs of the two selectors, and they'll wire themselves.
  7. Wire the output of the second Select into the False input of the first.
  8. Copy the Flux numeric again to make the time input.
  9. Drop an Add on the diagram and let its output auto-wire to the second selector's False input.
  10. Wire Time to the Add and use Create Constant to put an already-wired 75 as the Add's other input.
  11. Right-click the second Select and choose Create Indicator to create an output called FC Result or whatever you want (it doesn't have to be a nameless return value as in C).

Even these few steps may sound like a lot, but I'll bet there are thousands of LabVIEW users who could do them all in less than a minute. Hell, in a minute I haven't even finished waiting for Tcl's man page at dev.scriptics.com to load; God help me if I've forgotten some obscure point of syntax or the name of a particular library function. (I'm not picking on Tcl specifically; it's just an example!)

A lot of people think their algorithms out in terms of pictures, as fluffy_grue so kindly pointed out. And even if you're trying to implement a well-known algorithm whose text representation is widely recognized, LabVIEW offers several ways to do that, from built-in expression and formula nodes to integration with MATLAB, HiQ, and other scripting environments. Sometimes a textual representation of an algorithm is definitely more expressive, but for the applications I develop, those cases are by far the minority.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

In VEE (5.00 / 1) (#89)
by kerinsky on Wed Feb 05, 2003 at 02:41:37 AM EST

In VEE I would probably just make a formula box and type in "(flux < 0 ? 100 : ( cap < 0 ? 50 : 75 + time) )".  I declare all my local variables, so I only have to worry about one wire in and one out at most.  Formula boxes are what make VEE usable to me.  Pretty much any single C statement that isn't used for flow control can easily be inserted into a formula box.  A multi-line formula box could also be used if you don't like the above; just declare a variable such as output and write:

output = (flux < 0 ? 100 : output);
output = ( cap < 0 ?  50 : output);
output = ( cap > 0 ?  75 + time : output);

It's somewhat clunky, but still only four boxes, three of which are variable declarations.

As a CS student I didn't like VEE at all when I was first forced to use it as an intern, but it has grown on me like a fungus.  It seems like every week I discover something new that resolves one of my issues with the language.  It is also great to print out and show to program managers.

PS - My first example also makes sure that a value (specifically 75+time) is returned when flux is greater than or equal to zero and cap is zero, my second example will just pass through the previous value of the variable named output.

-=-
A conclusion is simply the place where you got tired of thinking.
[ Parent ]

VEE simplicity (none / 0) (#110)
by unDees on Wed Feb 05, 2003 at 01:18:18 PM EST

Thanks for bringing that up--I should have mentioned it in my post. The VEE approach does indeed need only one node for the computation. LabVIEW has a similar set of tools (formula and expression nodes), but to be honest, most people tend to think in terms of raw Add nodes and so on, which makes VEE a bit simpler here.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
FOR GODSAKES DON'T USE PROCEDURAL LANGUAGES (3.00 / 1) (#157)
by subversion on Sun Feb 09, 2003 at 09:41:06 AM EST

Please don't use C - or any other procedural language.  Here's why:

It is much, much, much, much easier to transcribe block diagrams to a graphical diagram program than it is to evaluate them in a procedural language.

Different methods for different tasks.  Is that really so hard to understand?

If you disagree, reply, don't moderate.
[ Parent ]

LabVIEW is great (5.00 / 1) (#81)
by germ on Tue Feb 04, 2003 at 08:56:04 PM EST

The people who posted negative comments on LabVIEW are entitled to their own opinion. I have an opinion too, and that is that LabVIEW is great. It may not be the solution to ALL your programming needs, but for measurement setups, data acquisition, etc., it is by far the BEST.

Let me try to substantiate my opinion with some reasons -- something none of the detractors of LV managed to do convincingly, IMHO.

The advantages of LV:

  • rapid application development (YES!)
  • great GUI
  • emphasizes modularity
  • emphasizes program organization and structure
  • complete system (analysis tools, Active X, etc.)
  • great debugging tools
  • widely supported by many people/organizations
  • many instrument drivers available
  • sharing (via HTTP by checking a checkbox!)
  • cross-platform

There are some disadvantages:

  • price
  • not suitable for purely numeric procedures (but see below)

First, a little on my background: I have a PhD in solid-state physics. I have been setting up measurement systems for ages. I have tried most of the programming languages typically used in these applications, starting with HP Basic, then HT Basic, LabWindows, Visual crap, LabVIEW. LV is by far the best and most suitable for the task, IMO.

The only slightly cumbersome thing is translating a purely numerical procedure. But for this you CAN embed binary code and DLLs in LV, so it's really a non-issue. The rather stupid example brought up by wmg62541 can easily be solved with a formula node. Generally speaking, I find that LV makes me think more clearly about the structure of my programs. Perhaps it's this aspect, a little harder than in procedural languages, that puts some people off. Once you get used to it, you will think more clearly and your programs will be better structured, too.

One other great thing about LV is that it is a complete system, it has lots of functionality for data analysis built right in (non-linear fitting, etc.), it supports a lot of technologies (interfaces, DLLs, binary code, ActiveX, etc.).

To the guy lamenting that LV program size is limited by screen space: you don't get it. Make your programs modular! LV is great for that. Your programs will be easier to debug as well...

To those who think LV is only good for "little things": I have quite complex applications for reliability testing. They run for MONTHS at a time with no incident.

I would be hard pressed to maintain my productivity without a tool like LV. (Note that I am not related to NI in any way, just a happy customer.)



Both Sides (5.00 / 4) (#86)
by bugmaster on Wed Feb 05, 2003 at 12:05:58 AM EST

It seems that people here are divided into two camps: "LabView is god!" and "Graphical programming is for losers". Both camps are wrong. LabView is good for some tasks (I haven't personally used it, but I have used similar tools), and regular programming is good for others.

Graphical programming is great for things like automating instrumentation. The data being processed is relatively simple (sine waves, square waves, voltage, current), but the degree of parallelism is high. Thus, it is really easy to envision the system as being composed of blocks and wires. It is also easy to hook everything up, because the inputs and outputs are relatively few. Yes, you can do the same thing with regular programming, but most of your code would be spent making function headers -- in the GUI, you would just draw a wire. And I'm not even going to get into the race conditions...

On the other hand, graphical programming fails if what you need to do is some ye olde symbolic manipulation. For example, consider a program that asks the user for his password, checks it against the database, and then creates a Web page of the user's chosen news sources, sorted by preference and release date. This would be about a page of Perl/PHP/whatever code if done the old-fashioned way. I can't even imagine how large and complicated the GUI diagram for this program would be -- especially considering that there's probably no SQL interface written for LabView.

Basically, each approach is good for some things and bad for others. There is no catch-all programming technique that's good for everything -- at least, not until we have Strong AI worked out.
>|<*:=

General-purpose tasks (5.00 / 1) (#102)
by unDees on Wed Feb 05, 2003 at 08:08:35 AM EST

You're exactly right that LabVIEW is not for every task. However, in the specific examples you pointed out, let me mention that there are multiple ways to access SQL databases from LabVIEW, at least on Win32. The same goes for HTML; I've used the latter toolkit many times before. Sometimes, admittedly, it is a little annoying if you're making a large document. In the best case, though, your diagram ends up looking like the document tree itself, which is at least reasonably descriptive.

Anyway, the password / database / Web page example would indeed probably be larger in LabVIEW than in PHP or Perl (in the latter, it'd no doubt be one line of asterisks, dollar signs, and underscores :). But it's not intractable, and if you're bolting something like that onto an existing measurement application, it may be worth the extra effort to keep the source in the same language. Or you could write the HTML-generation portion in Perl and call your Perl script from LabVIEW. You've got plenty of choices.

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]

Suggestion (2.50 / 2) (#93)
by e8johan on Wed Feb 05, 2003 at 03:50:10 AM EST

To all LabView-slaves out there: I would like to suggest that you take a look at TestPoint (http://www.quatech.com/shopquatech/products/prod529.asp), which I find easier to use, more object-oriented, and simply better.

Slaves? :) (none / 0) (#101)
by unDees on Wed Feb 05, 2003 at 07:57:57 AM EST

I've seen a TestPoint demo before. I tend to get a little suspicious when a sales rep tells me I don't have to do anything at all--no wires, no typing, nothing. Does it read my mind, then?

Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
[ Parent ]
Sales reps... (none / 0) (#121)
by Pac on Wed Feb 05, 2003 at 04:54:48 PM EST

Microsoft sales reps will demonstrate to you how secure and cheap their products are. Oracle sales reps will go on for hours about their product's ease of use. Before they were bought by IBM, Rational sales reps would tell you how cheap their suite was compared to the potential productivity gains you would soon be enjoying.

The main problem with sales reps is that they actually believe what marketing tells them. And almost every one of them either despises or is afraid of the techs...

Evolution doesn't take prisoners


[ Parent ]
In this day and age (none / 0) (#122)
by Pac on Wed Feb 05, 2003 at 04:57:36 PM EST

I would have to be very, very interested to contact a company that fails to clearly state the price of its products on its site.

Evolution doesn't take prisoners


[ Parent ]
And I am sorry about that... (none / 0) (#123)
by Pac on Wed Feb 05, 2003 at 04:59:23 PM EST

This is an answer to another comment... :(

Evolution doesn't take prisoners


[ Parent ]
A blast from the past (4.80 / 5) (#105)
by epepke on Wed Feb 05, 2003 at 09:40:21 AM EST

I was heavily involved in scientific visualization during its heyday, which was roughly 1985 to 1995 and was spurred by the U.S. Federal Government's investing heavily in the NSF supercomputing centers. I worked at the only 100% non-classified DOE supercomputing center: the Supercomputing Computations Research Institute at Florida State University. This is now dead, and although some of the NSF sites seem to hobble along, things aren't what they used to be.

Back then, the chief tools for doing scientific visualization used a dataflow paradigm. The best known was AVS. IBM had an offering that was called (I think) Data Explorer. There was also APE, a free dataflow visualization package out of Ohio.

All of these were based on a dataflow paradigm. (We bucked the trend with SciAn, which was more of an O-O tool with an interface based on direct manipulation.) A visualization was built up by drawing boxes and drawing wires between them. The individual modules were relatively sophisticated, compared to something like LabView.

Over time, a number of problems with this approach became apparent:

1. Visual programming is still too difficult
  The apparent elegance of a visual diagram proved only to be an illusion of simplicity. It was still too complex to be acceptable to most of the scientists who wanted to do visualization. Therefore, a large programming/consulting staff was still required.
2. Projects did not scale well
  What might be called the "grokkability" of a visual model goes down very quickly after a certain level of scale is reached. Theoretically, this could be ameliorated by having a good hierarchical decomposition mechanism. Practically, however, the way systems evolved just resulted in more and more spaghetti.
3. It used too much memory
  The easiest way to put together a dataflow system is to use separate processes and interprocess communication. However, this approach, which none of the systems ever really got over, resulted in a lot of duplication of data. When one is dealing with the kinds of data sets that are used in, say, the atmospheric sciences, this can be a really big problem. To be sure, memory is much cheaper now, but scientific computation always pushes the limits and grows to exceed the available space.
4. It was too slow
  Even using shared memory, and even given a lot of memory, the sheer overhead of negotiating the data connections and transferring the data became a major factor in the computation.

The truth may be out there, but lies are inside your head.--Terry Pratchett


    Things have changed somewhat.... (none / 0) (#111)
    by unDees on Wed Feb 05, 2003 at 01:23:09 PM EST

    The memory requirements are still higher than they might be for text languages, but for running user apps, LabVIEW compiles down to native machine code, and one hopes that their compiler makes some reasonable optimizations. Actually, many an NI developer loves to boast of specific shortcuts in the code, such as knowing, when a wire divides, whether or not the data actually need to be copied (much like copy-on-write in text languages' string classes).
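
    (For the curious, the branch-without-copy trick is similar in spirit to this C sketch; it's my own illustration, certainly not NI's actual code: "branching a wire" just bumps a reference count, and a real copy happens only when a shared buffer is written.)

        /* Copy-on-write sketch: share on branch, copy on first write. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        typedef struct {
            double *data;
            size_t  len;
            int    *refs;                /* shared reference count */
        } Array;

        Array array_new(size_t len) {
            Array a = { calloc(len, sizeof(double)), len, malloc(sizeof(int)) };
            *a.refs = 1;
            return a;
        }

        Array array_branch(Array a) {    /* wire branch: share, no copy */
            (*a.refs)++;
            return a;
        }

        void array_write(Array *a, size_t i, double x) {
            if (*a->refs > 1) {          /* shared: copy before writing */
                (*a->refs)--;
                double *copy = malloc(a->len * sizeof(double));
                memcpy(copy, a->data, a->len * sizeof(double));
                a->data = copy;
                a->refs = malloc(sizeof(int));
                *a->refs = 1;
            }
            a->data[i] = x;
        }

        int main(void) {
            Array a = array_new(4);
            Array b = array_branch(a);   /* no copy yet */
            array_write(&b, 0, 42.0);    /* b now gets a private buffer */
            printf("%g %g\n", a.data[0], b.data[0]);   /* prints: 0 42 */
            return 0;
        }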

    It's entirely possible to make a "grokkable," reasonably-performing application of non-trivial size, but it takes some design up front, just as with any programming language.

    Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
    [ Parent ]

    visual programming links (5.00 / 2) (#109)
    by zzkt on Wed Feb 05, 2003 at 01:16:28 PM EST

    while the dataflow systems mentioned use a visual/graphical notation, there are a few further graphical models for programming which are worth following up on.

    the languages derived from pd (pure data) behave in a similar way to labview, but are mostly used around music/multimedia and some machine control. they include max/msp and jmax (via ircam), as mentioned at the end of the article. pd and jmax are both free software, with active development and quite a range of extensions.

    some other interesting methods of programming break with the dataflow model and include modeling programs as petri-nets (see: http://www.daimi.au.dk/PetriNets/) and graph rewriting. graph rewriting tends to be more academic in its approach, with a few usable tools, e.g. progress for programming using graph rewriting, and HOPS, a graphically interactive program development and program transformation system based on acyclic term graphs.
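
    for a feel of the petri-net idea, here is a toy stepper in c (my own sketch, not taken from any of the tools above): a transition fires when all of its input places hold tokens, consuming one token from each input and adding one to each output.

        /* toy petri net: places hold token counts; a transition fires
           when every one of its input places has a token */
        #include <stdio.h>

        #define MAXP 8
        typedef struct { int in[MAXP], n_in, out[MAXP], n_out; } Transition;

        int fire(int tokens[], Transition *t) {
            for (int i = 0; i < t->n_in; i++)
                if (tokens[t->in[i]] == 0) return 0;   /* not enabled */
            for (int i = 0; i < t->n_in; i++)  tokens[t->in[i]]--;
            for (int i = 0; i < t->n_out; i++) tokens[t->out[i]]++;
            return 1;
        }

        int main(void) {
            int tokens[3] = { 1, 1, 0 };            /* places p0, p1, p2 */
            Transition t = { {0, 1}, 2, {2}, 1 };   /* p0 + p1 -> p2 */
            printf("fired: %d, p2 holds %d\n", fire(tokens, &t), tokens[2]);
            return 0;
        }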

    tangents


    ...in 3D (5.00 / 1) (#115)
    by zzkt on Wed Feb 05, 2003 at 02:13:32 PM EST

    one of the few programming languages that can be implemented as a potentially playable quake mod: CUBE

    [ Parent ]
    Cube (none / 0) (#120)
    by Pac on Wed Feb 05, 2003 at 04:14:24 PM EST

    I forgot that one in my comment about Prograph. I found it during recent research for a simple "in-game" programming language we are designing.

    It is cute, but unfortunately it is just an idea, not an implementation. I actually wrote the author about his thoughts on making the code open source and letting the four or five people around the world interested in such things play with it.

    Evolution doesn't take prisoners


    [ Parent ]
    cube (none / 0) (#128)
    by zzkt on Wed Feb 05, 2003 at 05:33:17 PM EST

    I found it during recent research for a simple "in-game" programming language we are designing.

    maybe collision based graph rewriting automata bots? could lead to some scary frag//debug sessions. ..

    did u get any response from the cubist?

    [ Parent ]
    My experiences with multimedia visual programming (4.50 / 2) (#149)
    by jdigital on Fri Feb 07, 2003 at 10:52:34 AM EST

    When I was first exposed to the Max/PD/Reaktor way of doing things I was pretty impressed. As a coder, I've often found myself sitting down for long stretches of coding, wishing that I could simply wire things around in a modular fashion whilst testing out new effects or algorithms. Sure, a nice modular coding style helps, but doing it visually seemed like the perfect paradigm.

    My initial suspicion was that those environments would be quite limited in their ability to perform as well as hand-coded equivalents, but given the nature of DSP, most time is spent in tight inner loops, which are abstracted in the aforementioned environments as well-coded objects. Once I got over being amazed at the speed, I found that the real limiting factor was that to actually do anything complex, a lot of time needed to be spent dragging and dropping wires between boxes. I've always been less than partial towards using a mouse, and when you have hundreds of objects on the screen (and in environments like Reaktor, where screen space can be limited), this becomes a real time-consuming pain.

    The second thing that I found disappointing, but somewhat educational was that abstracting away the lower level of code implementation took away a fair bit of the fun of coding. I think it's the fact that I'm quite comfortable with the pace of textual coding, and that from finalising an algorithm in your head to getting it working, you are given a good chance to really think things through both before and during the coding process; or possibly it's a fair bit of ego-stroking, knowing that you slaved away for N hours/lines of code to get something happening in the end.

    [ Parent ]
    appropriate visual coding techniques (4.00 / 1) (#154)
    by zzkt on Sat Feb 08, 2003 at 07:42:53 AM EST

    I found that the real limiting factor was that to actually do anything complex, a lot of time needed to be spent dragging and dropping wires between boxes

    like any programming paradigm, the right level of abstraction can help. more complex stuff generally requires a bit of planning + organising patches in ways which are conducive to less spaghettification. subpatches can help by abstracting a collection of boxes + wires into a single box at a higher level, and wireless communication can also help ('send' and 'receive' objects in max/pd).

    The second thing that I found disappointing, but somewhat educational was that abstracting away the lower level of code implementation took away a fair bit of the fun of coding.

    with both max + pd, you can get some of the 'fun' of lower-level implementation by coding externals in c. if that's too low level, then they are both stack-based languages, so you can edit patches in a text editor. while dsp coding in assembler can be fun too, other times it's fun to have things 'just work' with a minimum of effort, not worrying about memory management, bitshifting, endianness and all that other coder machismo stuff...
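
    for a taste of what coding an external looks like, here is roughly the classic 'counter' example from the pd externals documentation, a bang-driven counter (treat it as a sketch: build flags and the location of m_pd.h depend on your pd install).

        /* minimal pd external: emits and increments its count
           each time it receives a bang */
        #include "m_pd.h"

        static t_class *counter_class;

        typedef struct _counter {
            t_object x_obj;
            t_float count;
        } t_counter;

        static void counter_bang(t_counter *x) {
            outlet_float(x->x_obj.ob_outlet, x->count);
            x->count += 1;
        }

        static void *counter_new(void) {
            t_counter *x = (t_counter *)pd_new(counter_class);
            x->count = 0;
            outlet_new(&x->x_obj, &s_float);
            return (void *)x;
        }

        void counter_setup(void) {
            counter_class = class_new(gensym("counter"),
                                      (t_newmethod)counter_new, 0,
                                      sizeof(t_counter), CLASS_DEFAULT, 0);
            class_addbang(counter_class, (t_method)counter_bang);
        }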

    [ Parent ]
    computer graphics (4.00 / 1) (#117)
    by odds on Wed Feb 05, 2003 at 03:02:56 PM EST

    As with musical software, many 3D graphics packages allow a dataflow model of programming.

    Probably the most powerful in this respect is Side Effects Software's Houdini program, which allows very flexible control over multiple data items (e.g., particle effects), or over the modeling pipeline (create object, extrude vertices, transform vertices, etc.) Side Effects calls this procedural modeling.

    Alias | Wavefront's Maya also has a "dependency graph" (DG) view, which is a dataflow diagram of sorts, but it's so fine-grained that it's not very useful for regular users, although it can be very useful for plug-in programmers. Things get more confusing since Maya actually has some odd semantics in its graphs, with the notion of "parent" defining a 3D transformation hierarchy within the DG.

    All told, I'm a fan of this model. You can take some building blocks and build quick prototypes, or play around with code at runtime. As a graphics coder, I'll never rely on it exclusively, but I always have the option of going under the hood and building a new "node" in C++ for the dataflow diagram. Ya can't lose.
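
    For a rough idea of the shape such a node takes, here's a bare-bones sketch in C (a generic illustration only, not the actual Houdini or Maya plug-in API): a node is a compute function plus pointers to its upstream inputs, evaluated demand-driven.

        /* generic dataflow "node": pull-based evaluation */
        #include <stdio.h>

        typedef struct Node Node;
        struct Node {
            const char *name;
            Node   *inputs[4];             /* upstream nodes */
            int     n_inputs;
            double  value;                 /* constant or cached output */
            double (*compute)(Node *self);
        };

        double eval(Node *n) { return n->compute(n); }

        double const_compute(Node *self) { return self->value; }

        double add_compute(Node *self) {
            double sum = 0;
            for (int i = 0; i < self->n_inputs; i++)
                sum += eval(self->inputs[i]);
            return sum;
        }

        int main(void) {
            Node five  = { "five",  {0}, 0, 5.0, const_compute };
            Node seven = { "seven", {0}, 0, 7.0, const_compute };
            Node one   = { "one",   {0}, 0, 1.0, const_compute };
            Node add   = { "add", { &five, &seven, &one }, 3, 0, add_compute };
            printf("%g\n", eval(&add));    /* prints 13 */
            return 0;
        }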

    - David

    Academic background (5.00 / 3) (#118)
    by norge on Wed Feb 05, 2003 at 03:43:35 PM EST

    For those who are interested in the academic side of computer science:

    Gilles Kahn is responsible for much of the formalism that exists in the dataflow world today.  He invented and studied Kahn process networks (with Dave MacQueen) back in the 70's.  Since then many computer scientists have invented a whole universe of dataflow models, many of which are based on Kahn's early work.
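
    To make the idea concrete, here is a degenerate toy version in C (my sketch, not Kahn's formalism in full): stages communicate only through FIFO channels and never through shared state, which is what makes the network's output independent of how the stages are scheduled.

        /* toy Kahn-style network: source -> doubler -> sink over FIFOs */
        #include <stdio.h>

        #define CAP 64
        typedef struct { int buf[CAP], head, tail; } Chan;

        void put(Chan *c, int v) { c->buf[c->tail++ % CAP] = v; }
        int  nonempty(Chan *c)   { return c->head != c->tail; }
        int  get(Chan *c)        { return c->buf[c->head++ % CAP]; }

        int main(void) {
            Chan a = { {0}, 0, 0 }, b = { {0}, 0, 0 };
            for (int i = 1; i <= 5; i++) put(&a, i);       /* source  */
            while (nonempty(&a)) put(&b, get(&a) * 2);     /* doubler */
            while (nonempty(&b)) printf("%d ", get(&b));   /* sink    */
            printf("\n");                  /* prints: 2 4 6 8 10 */
            return 0;
        }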

    In the late 80's Edward Lee (http://ptolemy.eecs.berkeley.edu/~eal/) began studying dataflow models and has published some fundamental papers in the field.  His research group is responsible for the Ptolemy project which is a graphical programming environment that permits programming in a mixture of dataflow-ish models.

    Related to dataflow programming is the work on the pi calculus, begun by Robin Milner (the inventor of ML) in the early 80's.  The goal of pi calculus research is to come up with a simple unifying model of concurrent programming, similar to the lambda calculus for sequential programming.  Many pi calculus related links have been assembled at http://lamp.epfl.ch/mobility/.

    Many people believe that there is very little room left to accelerate the execution of popular sequential assembly languages (x86, Sparc, MIPS, etc.).  To benefit from the scaling of silicon technology that is expected to continue for at least another 10 years, we are going to have to start writing programs that are both more easily parallelized and more explicitly parallel.  I expect both the popular graphical programming systems summarized in this article and the formal models for concurrent programming I mentioned in this post will become increasingly important for programming computer architectures that are on the horizon today.

    Cheers,
    Benjamin


    More programs on this bent. (none / 0) (#146)
    by Apuleius on Thu Feb 06, 2003 at 04:47:39 PM EST

    Open Data Explorer (OpenDX) and SCIRun both use a dataflow paradigm. The former is freely available for those of you who want to tinker with such things.


    There is a time and a place for everything, and it's called college. (The South Park chef)
    Just another functional language (4.00 / 2) (#148)
    by pattern on Fri Feb 07, 2003 at 10:36:55 AM EST

    Or maybe you can tell me how connecting a five, a seven, and a one to an addition operator differs in function from the lisp expression

    (+ 5 7 1)

    The pretty pictures just make it impossible for me to grep for something when I need to find it in a huge project.

    I really don't see any advantage.  I strongly suspect that the people who think that graphical programming is great just really like functional languages and haven't seriously tried a textual one.

    LabVIEW has a graphical grep... (none / 0) (#153)
    by unDees on Fri Feb 07, 2003 at 03:18:41 PM EST

    ...that works quite well in large projects. I'm sure many of the other dataflow environments do as well.

    I do think there are some similarities between dataflow and functional programming, in that there are fewer invocations of imperative code that has side effects. LabVIEW doesn't have the concept of passing the entire state of the world around as a parameter everywhere, but it at least does away with creating, naming, writing, and destroying temporary variables, as do functional languages.

    Your account balance is $0.02; to continue receiving our quality opinions, please remit payment as soon as possible.
    [ Parent ]
