Kuro5hin.org: technology and culture, from the trenches

Perl: it's not just for breakfast

By eann in News
Wed Jul 05, 2000 at 12:42:58 PM EST
Tags: Software (all tags)
Software

I've been given the opportunity to put on my Advocacy hat and write a short article about how useful Perl is for things other than web-related programs (CGI, mod_perl, etc.) and automating system administration tasks. I doubt it'll actually convince any of the PHBs around here, but if they keep seeing the word 'Perl' enough we might be able to make them believe it's a buzzword. My trouble is, other than some data conversions, I've never really used Perl for anything else. So, I'm asking for examples of Perl success stories that don't involve web or sysadmin stuff.


I know this isn't necessarily the best forum for this, but I'm more likely to get a wide variety of responses here than anywhere else. I'm cross-posting this at use Perl and a couple of mailing lists, and I'll probably submit the draft article here for review sometime early next week.

Perl: it's not just for breakfast | 40 comments (39 topical, 1 editorial, 0 hidden)
It's only sort of sysadmin (4.00 / 1) (#1)
by Rasputin on Wed Jul 05, 2000 at 12:01:51 PM EST

Well, in my previous employment I replaced a largish shell script with a Perl script. It monitored a variety of servers and processes spread across a number of boxes and locations. If it hit something I needed to worry about it would send an e-mail and a page (via e-mail) to the on-call engineer. It monitored a collection of system processes, database processes, did some basic data integrity checks, checked network connections, etc. There was actually a fair amount I could do with Perl that I couldn't with the shell script.

Let me know if you want/need more information.
Even if you win the rat race, you're still a rat.

Re: It's only sort of sysadmin (none / 0) (#14)
by zapman on Wed Jul 05, 2000 at 04:00:23 PM EST

Like which? I've been contemplating learning perl. But all I'd really use it for is sysadmin stuff, and 80% of that revolves around executing other programs. If every other line of the perl script is system(.....), then why bother?

(Granted, I understand the power of assoc arrays, and other complex datatypes. If I ever need something like that, I'd use a real language)

--Jason
-- The request of a friend in need, is done by a friend in deed.
[ Parent ]
Re: It's only sort of sysadmin (none / 0) (#29)
by Rasputin on Thu Jul 06, 2000 at 09:30:51 AM EST

The biggest problem that I solved using Perl was to keep a heart-beat on an application server. It allowed me to not only 'ping' the server but actually launch a query at a specific port and view the response. I couldn't do that in a shell script without coding a custom app for it to call.
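[Ed: a minimal sketch of that kind of heartbeat, assuming a line-oriented service; the host, port, and probe string below are invented for illustration.]

```perl
#!/usr/bin/perl
# Sketch of a service heartbeat: connect to the port, send a probe, and
# read one line of reply. Host, port, and probe text are all invented.
use strict;
use warnings;
use IO::Socket::INET;

# Returns the first reply line, or undef if the service is unreachable.
sub heartbeat {
    my ($host, $port, $probe) = @_;
    my $sock = IO::Socket::INET->new(
        PeerAddr => $host,
        PeerPort => $port,
        Proto    => 'tcp',
        Timeout  => 5,
    ) or return undef;          # couldn't even connect: service is down
    print $sock $probe;
    my $reply = <$sock>;        # one response line is enough to judge health
    close $sock;
    return $reply;
}
```

Run from cron, a wrapper would page the on-call engineer whenever something like heartbeat('appserver', 7001, "STATUS\n") comes back undef or with an unexpected reply.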

Also, given that there were a total of 30 servers/services that had to be monitored, flow-control was rather essential. I don't know if you've tried to do complex flow-control in a Bourne shell script, but it isn't pretty ;) As a bonus, formatting the hourly report into an easy to scan message was trivial.

You might want to look and see what those other programs are doing. If your system() calls include a lot of things like grep, wc, sort, etc., Perl is a reasonable substitute. Also, I found it hugely useful to feed the output from a tail -f into a perl script to watch for specific messages. A lot better than scanning the entire file every once in a while for messages, especially when the logs you're looking at get large (> 10 Meg) and you can't truncate them because you need the complete information set. Perl is very effective at text processing, which is why it is the Pathologically Eclectic Rubbish Lister.
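[Ed: the tail -f trick can be sketched in a few lines. The alert patterns below are invented; real ones would match whatever your servers actually log.]

```perl
#!/usr/bin/perl
# Sketch of the "tail -f | perl" log watcher described above. The alert
# patterns are invented for illustration.
use strict;
use warnings;

# Decide whether a single log line deserves an alert.
sub is_alert {
    my ($line) = @_;
    return $line =~ /ERROR|FATAL|connection refused/ ? 1 : 0;
}

# Driver, run as:  tail -f /var/log/app.log | ./watch.pl
#   while (defined(my $line = <STDIN>)) {
#       print "ALERT: $line" if is_alert($line);  # or mail/page the on-call
#   }
```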

Everything I've done in Perl could have been done either as a shell script or in a compiled language (C is my preferred choice). However, it took about a week of actual effort to write the system monitoring script in Perl; it had taken another person about 3 weeks to create the shell script I replaced, and a C version would have taken about 3 weeks to code and compile. Also, since it was used in a support role, I had to train my replacement to support it before I left. Teaching Perl isn't that tough. I would not have enjoyed trying to teach enough C to support it.
Even if you win the rat race, you're still a rat.
[ Parent ]

Another use: embedding (none / 0) (#3)
by mind21_98 on Wed Jul 05, 2000 at 01:02:17 PM EST

There's another use for Perl: embedding it into C programs. In fact, I made a sort of 'inetd' that loads Perl code in at startup and executes it for every connection that comes in. The performance is at least 2x that of inetd alone.

Embedded perl is also what mod_perl uses by the way.

--
mind21_98 - http://www.translator.cx/
"Ask not if the article is utter BS, but what BS can be exposed in said article."

Re: Another use: embedding (none / 0) (#12)
by Paradox on Wed Jul 05, 2000 at 03:18:54 PM EST

I'd love to see the code for that if you don't mind.
Sounds like a neat trick.
Dave "Paradox" Fayram

print print join q( ), split(q,q,,reverse qq;#qsti
qq)\;qlre;.q.pqevolqiqdog.);#1 reason to grin at Perl
print "\n";
[ Parent ]
Re: Another use: embedding (none / 0) (#18)
by mind21_98 on Wed Jul 05, 2000 at 04:36:14 PM EST

Here it is:

http://msalem.translator.cx/pinetd.tar.gz

Right now it's only pre-alpha, but hopefully I'll release this onto Freshmeat sometime soon.

--
mind21_98 - http://www.translator.cx/
"Ask not if the article is utter BS, but what BS can be exposed in said article."
[ Parent ]

Large bugfix replacement (3.00 / 1) (#4)
by Anonymous Hero on Wed Jul 05, 2000 at 01:08:22 PM EST

We have an ancient, creaky chunk of code that's supposed to create bulk datafiles for some of our customers. Problem is that it sometimes creates incorrectly formatted messages with spurious characters in it, which causes the client's parser to break. The code is too nasty for anyone to touch without causing new problems.

After multiple nights of support people trying to manually fix these giant (20+ MB) files using emacs, I wrote a validating parser in Perl -- it checks that the file is structurally complete, and is aware of all of the various record types it might contain (8 variants), so it checks that each record is correct, given its stated type. Finally, it checks the fields of each record for bad characters. Bad records are removed and put in an error file.

It's been postprocessing these files for well over a year now without a hitch. The client who was threatening legal action has been quiet as well =).

It's the most complex job I've attempted in Perl, and I'm pleased it's come out so well (and it's pretty fast to boot). 200 lines of code, but 1000 lines of data structures describing the record types (including a list-of-lists-of-lists =). The C programmers were claiming months to write something similar; this took a week and change, including bugfixes. A hack, to be sure, but a valuable hack.
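[Ed: the data-structure-driven approach scales down nicely. A toy version of the idea, with two invented record layouts (the real one had 8 variants and far richer specs):]

```perl
#!/usr/bin/perl
# Toy data-driven record validator in the spirit of the one described above.
# Both record layouts here are invented for illustration.
use strict;
use warnings;

my %record_spec = (
    # type letter => list of [field name, pattern the field must match]
    H => [ [ date => qr/^\d{8}$/ ], [ site   => qr/^[A-Z]{3}$/   ] ],
    D => [ [ id   => qr/^\d+$/   ], [ amount => qr/^\d+\.\d{2}$/ ] ],
);

# A record is "TYPE|field|field|...". Returns 1 if valid, 0 otherwise.
sub valid_record {
    my ($line) = @_;
    $line =~ s/\s+$//;
    my ($type, @fields) = split /\|/, $line;
    my $spec = $record_spec{$type} or return 0;    # unknown record type
    return 0 unless @fields == @$spec;             # wrong field count
    for my $i (0 .. $#fields) {
        return 0 if $fields[$i] =~ /[^\x20-\x7e]/; # spurious characters
        return 0 unless $fields[$i] =~ $spec->[$i][1];
    }
    return 1;
}
```

The payoff of the table-driven design: supporting a ninth record type means adding a table entry, not code.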

Re: Large bugfix replacement (none / 0) (#31)
by Notromda on Thu Jul 06, 2000 at 12:03:43 PM EST

A hack, to be sure, but a valuable hack.

This is Perl's greatest strength. It lets one hack out solutions for little (or big) problems quickly, allowing one to get on with the "important" stuff, whatever that may be. Sure, a similar program in C may be slightly faster, more elegant, more professional... pick your buzzword... but the bottom line is that you were able to move on to other things quickly.

This may not be a good attitude for all software (in fact, it isn't) but it certainly has its place.

[ Parent ]

Compiler (2.00 / 1) (#5)
by royh on Wed Jul 05, 2000 at 01:26:59 PM EST

I am probably insane for doing this, but I am using perl to write a translator/executor to bootstrap a compiler for a language I'm making. The open-ended object-oriented features of perl really help, because the language will be similarly open-ended.

I started out in C++, but decided to use perl for the extremely simple string handling and the eval-like features (closures etc.) and because C++ templates, while awesome in theory, are not as flexible as they should be. Perl can handle most of the uses of templates fairly well though...

I could probably have done it in C, but perl is sooo fun :)

ps. It's not a success story yet, but I'm working on it.

Astronomical uses (5.00 / 1) (#6)
by CanSpice on Wed Jul 05, 2000 at 01:32:33 PM EST

When I was working at a major astronomical facility, there were a number of Perl geeks there. Two things I worked on that involved Perl:

1. A simple little script that would report on telescope usage. Uptime, downtime, time lost to weather, etc. Fairly simple parsing of textfiles.

2. A major suite of programs/scripts/templates that would be used, in conjunction with previously written FORTRAN programs, to reduce astronomical data as it came off the telescope. A perfect example of Perl being used as glue, it had to do all the file handling, config file parsing, and major object handling, and fire up FORTRAN programs that would do the actual mathematics/data reduction. More information on the ORAC (Observatory Reduction and Acquisition Control) project can be found here.

Oh, and Frossie Economou, one of the coolest people working at JACH, had a licence plate that read "PERL5". Awwwwyeeeeh.
--- I don't have a sig.
Used Perl to control batch processing of GPS data (none / 0) (#7)
by Anonymous Hero on Wed Jul 05, 2000 at 01:33:28 PM EST

The batch processing of geodetic quality GPS survey data was a mixture of art and science a few years back. You'd pass the data through one set of code, make some decisions based on the output, send it to the next piece, sometimes iterate, sometimes not -- lots of judgement calls and branching. I managed to automate the process by using a mixture of Perl and shell scripts and cribbing from other similar efforts. My opinion is that such interpreted languages are no good for nested algorithmic stuff (fortran still holds sway there, honest!) but they're great for tying various bits of compiled code together, process control, etc. GPS batch processing is pretty well automated now, by folks who wanted to do this seriously, but I'd bet big money (well, maybe not) that it's compiled fortran/c code tied together with Perl scripts.

I used the project partly as a way of teaching myself Perl and mainly as a way of taking the tedium out of routine but fiddly work. I've forgotten most of my Perl now; still, I reckon it's not unrelated to riding a bike, so I think that, with a manual to hand, I could scrape along OK (having many manuals within easy reach was pretty much how I worked when I was doing real programming, and so did many colleagues, so I reckon it ain't lame).

Not another mp3 player! (none / 0) (#8)
by kbob on Wed Jul 05, 2000 at 01:44:36 PM EST

I'm listening to an mp3 right now, aren't you? (-:

My mp3 player is a front end written in Perl and Perl/Tk which calls amp to do the actual decoding. The front end is, of course, graphical, and has album/track browsing facilities.

I also wrote several Perl scripts to rip and index my CD collection.

K<bob>

sockets... (none / 0) (#9)
by Anonymous Hero on Wed Jul 05, 2000 at 01:45:54 PM EST

This isn't really a success story, but Perl made my task last night so easy it was unbelievable. I was trying to write a program that would allow me to do remote execution of certain commands through a web interface, and display the output from these commands in the web interface again. The main problem was that my web server and my main code-execution machine are different, and I'm running the web server as a router to the internal machine; plus, our main firewall dismisses all telnet, ssh, and ftp connections, so I was left with http. I wrote a very small (like 40 lines) server program on the computer I was trying to access, and it did everything I wanted it to do. It acted as the server to the complementary client program, which was a CGI script; it parsed all the information sent to it, decided which pieces were dangerous and didn't attempt to execute them, and sent perfectly legible output back to the client. I think by pulling a few tricks I will be able to sit in an internet cafe in the place I'm going, connect to the server using https, and continue developing the program I am working on.

Perhaps this doesn't sound marvelous to most programmers, but I had no clue about sockets when I started, and it took an hour to complete both the server and the client. Of course I still don't know anything about sockets, and it took me half an hour to search the Camel book for a legitimate solution to a trivial problem, a solution that didn't exist -- but hey, one hour and it's working to suit my needs. And the way I ended up using sockets was no different than normal filehandles, which I had some experience with. Perl just simplified the task amazingly.



No PHBs here, only us, the thinking machines... (none / 0) (#10)
by Pac on Wed Jul 05, 2000 at 02:45:36 PM EST

I am not so sure there are too many PHBs in kuro5hin right now. There aren't many in /. either. We techies tend to gravitate around these sites and forget that "normal" people will usually be found in CNN(f).com, ESPN.com and Playboy.com.

So, as k5 is a kinder, gentler site than /., you will get a somewhat informed discussion on non-web uses of Perl. Months from now you will get a raging flame-war on Perl vs. Python vs. PHP. :)


Evolution doesn't take prisoners


PHB Fodder (4.70 / 3) (#11)
by Anonymous Hero on Wed Jul 05, 2000 at 03:01:05 PM EST

I do quite a bit of work around employment systems. These systems tend to be the stepchildren in Human Resources because the data they contain is derived from resumes or job applications. This information is self-declared by applicants, unverifiable due to cost restrictions, and highly subjective.

Of all the things that should attract a PHB's focus for optimization, these knuckleheads invariably want to pour all of their resources into eliminating re-keying of data.

Forget that applicant data is not definitive enough to be used for cutting paychecks or populating benefits applications. Forget that few employers actually maintain application-type data in their personnel systems. PHBs are willing to spend the equivalent of 50 full-time administrative temps to eliminate the step of keypunching basic payroll/benefits/first-day-of-work data. Of course, re-keying costs the same or less than verifying and correcting data pulled from an applicant system.

But the data must crossover.

My carping about the wisdom of pushing applicant data into HRIS systems aside, going the other direction--getting position and department information from the HRIS system into the recruiting system--is more than just useful.

The bigger vendors on both the application and HRIS sides produce their own glue software for integrating among themselves. This software costs a great deal to license, even more to install, and even more than that to troubleshoot to the point of verifying that it does not work as advertised.

Enter Perl. With its ability to talk to nearly any DBMS through DBI, its ability to programmatically send and grab files across systems, its elegantly full-featured data structures, and its free availability, in the hands of a minimally competent programmer it puts the big-ticket interfaces to shame.

Perl is stable, deterministically reliable, and infinitely malleable. The typical licensing price of interface software components would buy two full months' worth of consulting at $150/hr. A Perl-based alternative would only take from a few hours to five days to develop, plus the three-to-ten days it takes to get processes hammered out and to get access to participating systems.

Note, please, that these gross estimates favor Perl even more than that since the requirements effort is expended even with commercial interface software. The installation, configuration, and testing of the commercial interfaces can often exceed the time it takes to develop and test the Perl-based solution.

Factor in the difficult-to-quantify benefit of being able to meet the desired process rather than mold the process to meet the limitations of the tools, and the Perl-based solution is a clear winner.

Summarization for PHBs:

1. Perl is well documented
2. Perl is well supported
3. Perl is widely available
4. Perl is portable
5. Perl is reliable
6. Perl is stable
7. Perl is inexpensive
8. Perl can be made to do what you want it to do
9. If the major gripe is that you can't sue Perl, then tell me, when is the last time you ever sued anybody for non-performance? If you have big-fivers running amok in your corporate household, then performance and delivery were probably never a concern anyhow. How's that for a boundary condition?





Remedy Macro Editor (3.00 / 1) (#13)
by slycer on Wed Jul 05, 2000 at 03:54:07 PM EST

I am brand new to Perl (bought the Learning Perl book ~a month ago) and have never programmed (OK, a bit of BASIC when I was 12), so this probably isn't the greatest of feats, but I was damn proud :-)

We use Remedy for a ticketing system here. One of the things we have enabled is a process to relate tickets to a main ticket when we have a major problem. The problem with this is that it takes much longer to create a ticket that is related to a main ticket (~5 minutes -- slow server). So I wrote a perl script that grabs user input and creates a Remedy macro that can be used by the group to quickly do this (about 30 seconds to create the macro). This was about a week of work (most of that was bugfixes -- like I said, I'm new). Anyone who's used Remedy may know that there is a macro recorder included with the client, but this is much faster.

We have now lost our Macro recording functions completely (had to move back to older client) so I extended the script to allow us to still create them. (just more input fields).

Maybe not the best of examples, but it works really well for our needs, which is after all why you create a program to begin with.

Octel voicemail management (none / 0) (#15)
by Vila on Wed Jul 05, 2000 at 04:15:47 PM EST

A previous employer had an Octel voicemail system [can't recall the model, sorry] which was managed via a telnet interface which, to the standard tech support technician, was highly cryptic and unfriendly. There was no other way into the box. I used Perl to provide a point-and-click reporting system. It did simple stuff like show all users on a distribution list, and all lists a user was subscribed to. Within a couple of days they'd found and removed /hundreds/ of dead lists, unwanted subscriptions, orphan sublists and the like. I didn't get around to fixing it to write the changes as well as read 'em, but it would have been fairly easy, I think.

Hope this helps... I was fired /for using Perl/ (and Apache) against corporate policy... so PHB-ammunition is close to my heart ;)

Re: Octel voicemail management (5.00 / 1) (#16)
by genehack on Wed Jul 05, 2000 at 04:19:51 PM EST

I was fired /for using Perl/ (and Apache) against corporate policy

Care to tell that story? Semi-on-topic because it's sort of the reverse of the question, and could provide useful points, or something...

john,
rationalizing to stay still...

[ Parent ]

Discussion at use Perl... (4.00 / 3) (#17)
by genehack on Wed Jul 05, 2000 at 04:29:19 PM EST

...has started, if anybody's interested.

john,
deep-linking for fun and profit.

uses of Perl (none / 0) (#19)
by Anonymous 242 on Wed Jul 05, 2000 at 05:17:36 PM EST

On the large vertical telco product I work on we use Perl in a number of spots.

The reporting subsystem is built almost entirely in Perl. Not the scheduler or viewer portions, but the jobs that actually collect data and create the actual report files to be displayed are written in Perl.

Perl is also used extensively for file utilities. We have a whole slew of in-house utilities to work with the industrial strength implementation of SCCS we use for version control. One utility was re-written in Perl (previously written using korn shell and egrep/fgrep) and halved the amount of time it took to run.

Sys admin tasks on Windows NT. Not your typical sys admin use for Perl. We have Perl scripts that set the environments for building and compiling the client portion of the project.

We also use Perl extensively (but not as extensively as korn shell) to write wrappers for some of our binary programs.



Perl gave me new Hope! (none / 0) (#20)
by Anonymous Hero on Wed Jul 05, 2000 at 06:21:24 PM EST

After wallowing in the mire of VB, being baffled by the complexities of C++, disgusted by the number of brackets on a line of LISP, i found Perl.

And now i use it for just about everything. I use it to produce Flash output for a graphing app i am writing, i built my career as a web programmer around it, i use it to process 3D data for rendering, i use Perl/Tk to create GUIs with ease.

I love Perl. Perl has been more influential in my computing life than anything else, except maybe Newtek Lightwave3D. I'd also like to thank the authors of O'Reilly's 'Programming Perl' for making such a wonderful reference. Kudos to ActiveState for making Perl usable on Win95/NT (back in the day, i thought Windows was a good OS).


-IkeKrull


used perl to delete duplicate files (2.00 / 2) (#21)
by Anonymous Hero on Wed Jul 05, 2000 at 07:12:15 PM EST

My boss gave me a list of files that they knew were in two places on the server. He wanted me to use Find and type in each file name individually, then check the last-modified date in the file properties (using Win95), then delete one of the files.

Using functions like:

if (compare($file1, $file2) == 0) {    # compare() is from File::Compare

and

DeleteFile( $cleanpath ) or die "Can't delete the damn file: $!\n";

I did the odious task a lot quicker and got on to better things, like reading kuro5hin.
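[Ed: fleshed out, the whole job fits in a short sub. A sketch assuming pairs of candidate paths, with Perl's built-in unlink standing in for the Win32 DeleteFile call the poster used:]

```perl
#!/usr/bin/perl
# Sketch of the duplicate-pruning task above: given two paths that should
# be identical, delete the older copy. compare() comes with core Perl
# (File::Compare); unlink stands in for the Win32 DeleteFile call.
use strict;
use warnings;
use File::Compare qw(compare);

# Returns the path it deleted, or undef if the files actually differ.
sub prune_duplicate {
    my ($file1, $file2) = @_;
    return undef unless compare($file1, $file2) == 0;  # only true duplicates
    # -M is age in days since last modification; bigger means older.
    my $older = (-M $file1 > -M $file2) ? $file1 : $file2;
    unlink $older or die "Can't delete the damn file $older: $!\n";
    return $older;
}
```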



Re: used perl to delete duplicate files (none / 0) (#23)
by Anonymous Hero on Wed Jul 05, 2000 at 09:25:00 PM EST

"wow! perl sure is powerful!"

Tell me why this can't just as easily be done in another language.

[ Parent ]
Re: used perl to delete duplicate files (none / 0) (#26)
by Cryptnotic on Thu Jul 06, 2000 at 01:41:31 AM EST

It can be done in many other languages, although usually not as easily.

One of perl's mottos is "Make the easy things easy; make the hard things possible".

[ Parent ]

Re: used perl to delete duplicate files (none / 0) (#32)
by hoss10 on Thu Jul 06, 2000 at 01:33:51 PM EST

Don't forget Python!
I recently got sick of checking shares of mp3's on 3 different computers in work to see if any new ones had been added. A few lines of python later I had them all listed (2000+) in a html file in order of date created.
The quicksort code was most of it.

#!c:\program files\python\python.exe
from os import listdir,stat
from stat import S_ISDIR
from time import asctime,localtime
from string import split

#0 - fullpath
#1 - createtime (seconds since 1970)
#2 - songname

def isgt(x,y):
    return x[1]<y[1] # change 1 to 2 for alphabetical order

def partition(list, start, end):
    pivot = list[end]                  # Partition around the last value
    bottom = start-1                   # Start outside the area to be partitioned
    top = end                          # Ditto

    done = 0
    while not done:                    # Until all elements are partitioned...

        while not done:                # Until we find an out of place element...
            bottom = bottom+1          # ... move the bottom up.

            if bottom == top:          # If we hit the top...
                done = 1               # ... we are done.
                break

            if isgt(list[bottom],pivot):   # Is the bottom out of place?
                list[top] = list[bottom]   # Then put at the top...
                break                      # ... and start searching from the top.

        while not done:                # Until we find an out of place element...
            top = top-1                # ... move the top down.

            if top == bottom:          # If we hit the bottom...
                done = 1               # ... we are done.
                break

            if isgt(pivot,list[top]):      # Is the top out of place?
                list[bottom] = list[top]   # Then put it at the bottom...
                break                      # ...and start searching from the bottom.

    list[top] = pivot                  # Put the pivot in its place.
    return top

def quicksort(thing,start=0,end=0):
    if start < end:
        split = partition(thing,start,end)
        quicksort(thing, start, split-1)   # ... and sort both halves.
        quicksort(thing, split+1, end)
    return

# the main prog

files=[]

def godir(path):
    fn=listdir(path)
    for x in fn:
        np=path+"\\"+x
        t=stat(np)[0]
        if S_ISDIR(t):
            godir(np)
        else:
            ctime = stat(np)[9]
            sn=split(np,"\\")
            sn=sn[len(sn)-1]
            tup= np,ctime,sn
            files.append(tup)

godir("\\\\machine1\\mp3")
godir("\\\\machine2\\mp3")
godir("\\\\machine3\\mp3")

quicksort(files,0,len(files)-1)
print "<html><body><table>"
for x in files:
    print "<tr><td><a href=\"file:" + x[0] + "\">" + x[2] + "</a></td><td>" + asctime(localtime(x[1])) + "</td></tr>"
print "</table></body></html>"


[ Parent ]
Re: used perl to delete duplicate files (none / 0) (#33)
by Notromda on Thu Jul 06, 2000 at 07:02:59 PM EST

Of course, with perl code one can safely cut and paste into a web page, but python, with its whitespace syntax, encourages confusion with posts like this....

(Hey, not trying to be flamebait... just pointing out a real problem... :)

[ Parent ]

Re: used perl to delete duplicate files (none / 0) (#35)
by hoss10 on Fri Jul 07, 2000 at 06:19:35 AM EST

> with the whitespace syntax, encourages confusion

you're right. All the whitespace is lost in the HTML.
But all languages should go like Python!
ie. Curly braces {} are assumed from the indenting

Does this count as the flamebait you were on about?

[ Parent ]
Re: used perl to delete duplicate files (none / 0) (#37)
by Anonymous Hero on Sat Jul 08, 2000 at 11:04:52 AM EST

Wow! Any special reason you didn't just use files.sort(isgt)? And os.path.walk to find the files?

[ Parent ]
Re: used perl to delete duplicate files (none / 0) (#40)
by hoss10 on Mon Jul 10, 2000 at 10:39:22 AM EST

because files is a list of tuples and I want to sort by the SECOND thing in the tuple

anyway, i tried files.sort(isgt) and even though it didn't complain it sorted by the first thing in the tuple (filename, not creation date)

it's declared as list.sort() and I hoped that passing an argument was an undocumented way to sort by the results of it (like perl, AFAIR) but it didn't work

I suppose os.path.walk would have been handy though

PS: that sorting thing I wrote is pretty generic, don't you think? I'm surprised it isn't in Python (and every bloody language ever!)
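[Ed: the Perl idiom the poster half-remembers does exist -- sort takes a comparison block, so sorting the same path/ctime/name tuples by their second element is one line. The three records below are invented.]

```perl
#!/usr/bin/perl
# The sort-by-second-element idiom in Perl, for comparison with the
# Python above. The three records are invented for illustration.
use strict;
use warnings;

my @files = ( ['a.mp3', 30, 'a'], ['b.mp3', 10, 'b'], ['c.mp3', 20, 'c'] );
my @by_ctime = sort { $a->[1] <=> $b->[1] } @files;   # oldest ctime first
print join(' ', map { $_->[0] } @by_ctime), "\n";     # b.mp3 c.mp3 a.mp3
```

(The Python 2 behaviour the poster hit, incidentally, is because list.sort expects a comparison function returning negative/zero/positive, so a boolean-returning function like isgt sorts incorrectly rather than raising an error.)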


[ Parent ]
Re: used perl to delete duplicate files (none / 0) (#34)
by Anonymous Hero on Thu Jul 06, 2000 at 09:52:36 PM EST

I think for such a trivial example, it would be hard to find a self-respecting scripting language (or even C) that could not handle this very easily. I don't like perl, but I realize it can make some things quick & easy. But this example was just stupid. (Then again, the harder stuff does *not* need to rely on the little tricks "to make the easy stuff easy", of which perl has many, you cannot deny. But then, why perl? If all you are doing is sequencing function calls and gluing objects together, all that counts there is a big module/component library, and there are other languages that have the same (no, not just python; in a way, I like perl more than python).)

I also don't understand that motto: a language that *could not*, by definition, do "the hard things" would be truly broken. So why must this be a "feature" of perl? I mean, everything in perl, I can do in C. Hard, but possible.


[ Parent ]
Why Perl is useful (none / 0) (#22)
by AbMan on Wed Jul 05, 2000 at 07:39:34 PM EST

I originally started learning Perl because I needed to do some complex-but-not-really-so stuff on Win NT at a customer site. You say you don't want to hear about sysadmin stuff, but the point is that Windows' native tools are absolutely hopeless for any serious scripting -- try doing anything which requires anything but the most basic flow control in command script (aka DOS batch files) and you'll see what I mean. Unless you really want to write a little Delphi or VB program every time you need a small utility, you need something like ActivePerl.

Secondly, Perl IMHO comes close to Java in terms of portability, and in some ways surpasses it. Using DBI, I can write an entire database-driven web application on Windows and move it onto a Unix-based web server just by changing the DBD driver, or vice versa. I got really annoyed with Java when I was trying to build an applet which would interface with a piece of third party software whose only interface was via non-blocking socket I/O, which Java does not support (Sun says it does, but then they define 'non-blocking socket I/O' differently to everyone else in the world). I can do this in Perl on both Unix and Windows (though I had to steal the value of **NONBIO out of winsock.h to make it work on Windows).

Perl is really good for little niggly text manipulation jobs, like getting reports output by closed source programs in electronic text format, stripping out all the header and blank lines and producing formatted data which then can be fed into other programs. Powerful tools for reformatting data.

Also, things like big complex database conversion scripts where constraints need to be dropped and records deleted in a certain order because of referential integrity, and then reapplied in reverse order are a snap with a few simple Perl scripts; once you get the deletion order right, just apply the *reverse* function on the script, and tweak the SQL in Perl, and the reinsertion script is ready.
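[Ed: the reverse trick above in miniature, with invented table names.]

```perl
#!/usr/bin/perl
# The deletion-order trick described above: the safe re-insertion order is
# just the deletion order reversed. Table names are invented.
use strict;
use warnings;

my @delete_order = qw(order_items orders customers);  # children before parents
my @insert_order = reverse @delete_order;             # parents before children
print "@insert_order\n";   # customers orders order_items
```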

The other advantage of Perl is the huge number of modules written for all sorts of purposes; Tk, CGI and the XML libraries being the most obvious that come to mind.

And let's not forget that all this stuff is free, as in beer as well as speech - that ought to have a favorable effect on any bottom-line watching boss.


perl for everything (none / 0) (#24)
by dustacio on Wed Jul 05, 2000 at 09:25:07 PM EST

When I changed jobs in May, Perl was the first thing that went on my computer. Not what most M.E.s would install first. I use perl for everything that I might have to do again -- mostly parsing machine logs for statistical analysis, plus a few cgi web pages to make my life easier. Perl's importance to me came to life a few days ago when a coworker, who is skilled in Visual Basic and Excel, told me he would have to learn Perl so he could keep up with the data we were processing. Seems he hit his limit of 65k lines in Excel and would have to wait for me to parse the data for him. I lent him the Camel book. A Mechanical Engineer's uses for Perl: parsing data and reports (obviously), generating graphical (using Tk) wafer maps, and modifying said maps (much easier than by hand).

good advocacy stories (none / 0) (#25)
by Yzorderex on Thu Jul 06, 2000 at 01:36:04 AM EST

o British Telecom, Call Management Information System
o Motorola
o neat XML stuff

perl for everything! (none / 0) (#27)
by orabidoo on Thu Jul 06, 2000 at 07:17:52 AM EST

At work we use perl for everything we can. Our website runs mod_perl with a custom-written templating/appserving backend comparable to HTML::Mason (right now in the process of cleaning up for opensourcing); we also have an html-chat daemon in perl, which interfaces (using the irc server-to-server protocol) to a standard ircd, so it sees the java client users too. It's also coming out as open source one of these days :)

One use of perl that impressed me is the whole Mandrake installer: it does everything from showing the installation GUI (with perl-GTK) to resizing a FAT32 partition, all in perl.

The Human Genome Project (none / 0) (#28)
by Anonymous Hero on Thu Jul 06, 2000 at 08:42:15 AM EST

Perl is heavily used in my lab and many other genetics labs these days -- we simply have too much data not to use something like Perl, and all our data is text, so it makes sense.....

See, for example, Lincoln Stein's article How Perl Saved the Human Genome Project

Sysadmin and Accounting (none / 0) (#30)
by mr on Thu Jul 06, 2000 at 11:40:01 AM EST

I've got a shell and perl process that takes one binary and a bunch of text files and imports billable hours into my accounting system.

Why perl? Because I would be parsing text.

And I have a series of shell scripts that watch the e-mail flow on a box I do adminning for and then send commands to the firewall to cause it to change the way it routes the incoming SPAM e-mail. The spam is rejected with a redirect message.

Why Perl? Because the log files are text.

Perl/Tk is great! (none / 0) (#36)
by thycotic on Fri Jul 07, 2000 at 01:00:11 PM EST

I wrote a simple Project Manager (ala IDE) to be used in conjunction with any editor in Perl/Tk. I was contracting for an all Perl shop at the time and it seemed logical that any tools we developed for our own use may as well be in Perl too. I was very impressed by the ease with which I could create this simple application in Perl/Tk with no experience of Tk.

I guess Perl is really only limited by your imagination! :)

Anybody catching heat for using Perl? (none / 0) (#38)
by jordan on Sat Jul 08, 2000 at 04:04:43 PM EST

I'm a little surprised that you have to justify Perl to bosses nowadays.

Perl seems pretty mainstream to me. Most bosses have heard about it and know that it can really improve your productivity, especially for Web/CGI and Sysadmin stuff.

As a Consultant working for a Great Metropolitan Newspaper, uh... Computer Company, my boss is actively encouraging my getting more and more involved in Perl. Paying for trips to Perl Conferences as self-improvement, buying books, having me work with others on their Perl education.

I suspect that there are some bosses, especially those who used to be technical (or imagine that they used to be technical :-)) who might throw up objections to it as being slow or unsupported or some other nonsense, but I don't see that.

I've worked in one shop where Linux and BSD are strictly forbidden, but Perl is more than encouraged.



Corpus processing (none / 0) (#39)
by Digambaranath on Sun Jul 09, 2000 at 03:03:40 PM EST

I recently wrote my first perl script and found it surprisingly easy, given that my only previous programming experience was a few shell scripts and a bit of BASIC and Pascal programming back in the '80s. Basically, having collected a corpus of student writing, I needed to do things like word frequency counts, pattern matching and so on, so perl was perfect. OK, there's plenty of software out there which does the same thing, but all the programs I could find were either commercial (and over-priced for what they do) or written for the wrong OS (with the exception of "hum", the original UNIX concordancing program, which came with such confusing documentation that I reckoned it would be easier to write my own program!). Perl is excellent for anything involving text matching and manipulation -- kind of grep, awk and sed rolled into one.
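[Ed: the word-frequency half of that job really is tiny in Perl -- a hash and one regex. A sketch with a deliberately naive tokenizer:]

```perl
#!/usr/bin/perl
# Naive word-frequency counter in the spirit of the corpus work above.
# The tokenizer (letters and apostrophes only) is a simplification.
use strict;
use warnings;

# Returns a hash of lowercased word => count.
sub word_freq {
    my ($text) = @_;
    my %count;
    $count{lc $1}++ while $text =~ /([A-Za-z']+)/g;
    return %count;
}

my %freq = word_freq("The cat sat on the mat");
print "$_ $freq{$_}\n" for sort keys %freq;
```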
