Newsgroups: comp.lang.apl
Path: watmath!watserv1!utgpu!news-server.csri.toronto.edu!rpi!usc!elroy.jpl.nasa.gov!jato!csi!sam
From: sam@csi.jpl.nasa.gov (Sam Sirlin)
Subject: Re: APL2 question
Message-ID: <1992Mar17.185047.8403@csi.jpl.nasa.gov>
Originator: sam@kalessin
Sender: usenet@csi.jpl.nasa.gov (Network Noise Transfer Service)
Nntp-Posting-Host: kalessin
Organization: Jet Propulsion Laboratory, Pasadena, CA
References: <9203152248.AA24898@bottom.magnus.acs.ohio-state.edu> <728@kepler1.rentec.com> <1992Mar16.173450.1067@csi.jpl.nasa.gov> <730@kepler1.rentec.com>
Date: Tue, 17 Mar 1992 18:50:47 GMT
Lines: 75


In article <730@kepler1.rentec.com>, andrew@rentec.com (Andrew Mullhaupt) writes:
|> There are not many FORTRAN derivatives that are not called FORTRAN, but then
|> perhaps you mean Algol derivatives. 

I thought Fortran came before Algol, but I could easily be wrong. I
meant languages like C, Pascal, Basic, PL/I, etc. 

|> To me, readability is not a matter of how 
|> fast one can comprehend something, but of how easy it is. 

These sound equivalent to me. If something is hard to understand it
usually takes me longer. I've read other people's Fortran and other
people's APL, and I still say that one can be just as bad as the
other. 

|> Life is too short
|> to spend _any_ time in that characteristic APL fog that separates seeing
|> how to do the problem and then trying to trick that damn fool language into
|> doing it without the hideous excesses of the easy to see approaches, especially
|> if it also means trying to reverse engineer the interpreter at the same
|> time. 

I agree. The APL code I write is usually very simple, since I want to
write it fast. I used to worry a lot about avoiding loops, but now I
mostly just use them and worry about speed later if the code is slow.
I now tend to view APL as having the capabilities of other languages
(I can write loops, etc.), but with the option to do many array
operations easily. You don't have to program any differently than in
Fortran. Using this style it's always faster to write code in APL, and
simpler to debug. But the code may run slowly. This is not usually a
problem for me, as I don't write commercial code, but code that I will
use myself for engineering analysis. If speed is too much of a
problem, I can always use Fortran. This is the niche that the Matlab
(and Mathematica) crowd fits into as well, for exactly the same reasons. 

|> Then for your pains the code you end up with is practical enough, except
|> if at some future date you have need of figuring out what it does. By the
|> time appropriate comments are included, it takes up the same page or two as 
|> those Brand X languages.

Possibly. Comments can expand to fill any amount of space if you want.
On the other hand, I still think a+b is more readable (faster and
easier) than 
   do i=1,n
     do j=1,m
       c(i,j) = a(i,j) + b(i,j)
     end do
   end do
and I'm less prone to typos if I type less. Even 
   call add(c,a,b,n,m)
is really much too complex (and not general).

|> >Well, you can always compile APL. 
|> 
|> Well how much good compiling APL does depends on how 'high level' your code
|> is, but I'll bite. How fast can you find the bandwidth of a square band matrix
|> (stored as a dense matrix) using compiled APL? We assume that the bandwidth
|> of the input matrices will be uniformly distributed over the possible values.

Interesting problem. For random matrices, the bandwidth ought to be
generically full. I don't immediately see any algorithm that would use
any of the extra features of APL. Hence I'd expect the code should be
just the same speed or slower, a worst case example. I haven't tried
it, but will if I have something to compare to, as it sounds like a 
good worst case test for a compiler. An easier test with the same
character is the eigenvalue problem. I have APL code for that but I
don't think the compiler I have can handle it yet. Eventually...
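For the record, the brute-force array-style version is a one-liner
(untested, and written assuming the matrix is in A):

```apl
      N←1↑⍴A
      BW←⌈/0,,(0≠A)×|(⍳N)∘.-⍳N    ⍝ max |i-j| over nonzero A[i;j]
```

But this always looks at all N*2 elements, while a Fortran loop that
scans the outermost diagonals first can quit at the first nonzero it
hits, so with the bandwidths uniformly distributed it does much less
work on average. That early exit is exactly the part I wouldn't expect
a compiler to recover from the array expression.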

|> The best I've seen interpreted APL do is at least five times slower than 
|> FORTRAN or C.

Any references?
-- 
Sam Sirlin
Jet Propulsion Laboratory         sam@kalessin.jpl.nasa.gov

