[Shootout-list] Organisation for numerical tests

Sebastien Loisel <sloisel@gmail.com>
Wed, 4 May 2005 13:24:04 +0200


> course. I'd like to see non-numerical (e.g. combinatorial optimization,

I think that's a good idea, but I'm going to focus on the numerical
stuff, which isn't to say that you and the others can't do a little
spring cleaning elsewhere.

> These are exactly the kinds of things that I'd farm out to an existing library

For reasons previously discussed, and outside the scope of what I'm
interested in here, we have to make sure we can implement these things
in a few lines in languages that don't have the luxury of such
libraries.

> Perhaps the discrete wavelet transform would be better than the FFT as it is

I just forgot to put it in the list. DWT is a good idea.
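For the record, a single level of the Haar DWT (the simplest case) does fit in a few lines. A sketch in plain Python, no libraries:

```python
def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Splits x (even length) into pairwise averages (approximation)
    and differences (detail), each scaled by 1/sqrt(2) so the
    transform is orthonormal.
    """
    s = 2 ** -0.5
    approx = [s * (x[2 * i] + x[2 * i + 1]) for i in range(len(x) // 2)]
    detail = [s * (x[2 * i] - x[2 * i + 1]) for i in range(len(x) // 2)]
    return approx, detail
```

A full multilevel transform just recurses on the approximation coefficients.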

> thing. Maybe we compute a useful physical quantity from an atomic simulation?

I know nothing about physics, but if we can do it in 10 lines of
matlab, why not.

> >    5. LU - computes the LU factorization of a dense N x N matrix
> Again, I have the practical concern that nobody would want to implement these
> I have the theoretical concern that the performance measurements from these

I do it all the time. It's not always easy to find an LU code that
works on interval arithmetic. The performance is meaningful.
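Since the point is precisely that these fit in a few lines, here is a minimal sketch in Python of LU without pivoting (Doolittle), generic over the element type, so in principle an interval type supporting +, -, *, / could be dropped in:

```python
def lu(A):
    """In-place LU factorization without pivoting (Doolittle).

    Overwrites the square matrix A (list of lists) so that the strict
    lower triangle holds L (with an implied unit diagonal) and the
    upper triangle holds U. Assumes nonzero pivots; works for any
    element type supporting +, -, *, / (floats, intervals, ...).
    """
    n = len(A)
    for k in range(n):
        for i in range(k + 1, n):
            A[i][k] /= A[k][k]            # multiplier l_ik
            for j in range(k + 1, n):
                A[i][j] -= A[i][k] * A[k][j]
    return A
```

A production code would of course add partial pivoting; this is only a sketch of the few-lines claim.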

> > * Conjugate Gradient
>
> It would be interesting to see this coded in different languages but, in terms
> of performance, you'd either be measuring the time taken to evaluate the
> function(s) or measuring the difference in performance of different
> algorithms.

I meant as a linear solver. All you need is fast matrix-vector product
and vector operations. These are used mainly for solving linear PDEs
like the Laplace problem above.
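As a sketch of "all you need is a fast matrix-vector product": a minimal CG in plain Python, taking the product as a callback. It assumes A is symmetric positive definite; the tolerance and iteration cap are illustrative defaults:

```python
def cg(matvec, b, tol=1e-10, maxiter=1000):
    """Conjugate gradient for A x = b, where A is symmetric positive
    definite and available only through `matvec`, a function that
    maps a vector to A times that vector. Starts from x0 = 0.
    """
    n = len(b)
    x = [0.0] * n
    r = list(b)                          # residual b - A x0
    p = list(r)                          # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

For a discrete Laplacian the matvec is a stencil application, so no matrix is ever stored.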

> > * Newton iteration
> I think this could be interesting but I'd like to see it as part of something

Yes, you are right. This is just a checklist.

> > * Automatic differentiation
> What exactly do you mean by this?

http://www.autodiff.org/
http://www.math.mcgill.ca/loisel/ad-matlab.pdf

It's a way of computing derivatives of programs without modifying the
program much. There are three ways of using it: to compute the
high-order derivatives of a function R->R, to compute the tangent of
R->R^n (the "forward mode"), and to compute the gradient of R^n->R
(the "reverse mode").

The reverse mode is important when you have a function of 10000
variables which takes a minute to compute and you want its gradient
(say, for gradient descent). Using automatic differentiation, it'll
take 3 minutes to compute the gradient. Using one-sided finite
differences, you'd need 10001 function evaluations, so it would take
10001 minutes. That is, if it takes time T to evaluate f, it takes
time 3T (or so) to evaluate $\nabla f$ using the reverse mode. The
computation is exact up to roundoff, although it isn't generally
considered symbolic. In particular, a computation using intervals will
yield a very tight conservative interval around the actual derivative.

If a function f takes time T and has an m by n Jacobian, using the
reverse mode will take mT time to compute the Jacobian, and using the
forward mode will take nT.
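To make the idea concrete, here is a minimal sketch of the forward mode in Python using dual numbers, with only + and * implemented; the reverse mode needs a tape of the computation and is longer:

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    carry a (value, derivative) pair through every arithmetic
    operation, applying the chain rule as you go. The result is
    exact up to roundoff, with no symbolic manipulation."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__


def derivative(f, x):
    """df/dx at x: seed the derivative part with 1 and run f."""
    return f(Dual(x, 1.0)).dot
```

For example, `derivative(lambda x: x * x * x + 2 * x, 3.0)` gives 29, i.e. 3*3^2 + 2, without the program being rewritten, only run on a richer number type.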

Sebastien Loisel