[Shootout-list] Java benchmarks

Dave davejf@frontiernet.net
Tue, 29 Mar 2005 19:44:13 -0600


<thump><thump>Up on the soapbox</thump></thump>

I agree that maybe some of the tests should run a bit longer, but I don't 
manage the alioth environment, so I don't know what kind of external 
constraints there may be. There are ~67 languages and ~30 tests that each 
run 3 times, and a single run of some tests takes something like 331.42 
seconds (PHP's Ackermann, for example). I don't know whether they do a 
partial or a full run when code is added or modified, but even a partial 
run can add up to a large chunk of CPU time.
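
As a back-of-envelope illustration, here's a tiny sketch using the numbers 
above; the 20-second average per run is purely my own guess, not a measured 
figure:

    // Back-of-envelope CPU-time estimate for one full pass, using the
    // numbers above. AVG_SECONDS_PER_RUN is my own guess, not measured.
    public class RunBudget {
        public static void main(String[] args) {
            final int LANGUAGES = 67;              // ~67 languages
            final int TESTS = 30;                  // ~30 tests
            final int RUNS_PER_TEST = 3;           // each test runs 3 times
            final double AVG_SECONDS_PER_RUN = 20; // assumed average (PHP Ackermann alone is ~331 s)

            double total = LANGUAGES * TESTS * RUNS_PER_TEST * AVG_SECONDS_PER_RUN;
            System.out.printf("~%d runs, roughly %.1f CPU-hours per full pass%n",
                              LANGUAGES * TESTS * RUNS_PER_TEST, total / 3600);
        }
    }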

From my understanding, the Shootout is intended to give a good overall 
snapshot of performance with an emphasis on systems-type programming, which 
makes up most software. I think it does a pretty good job of that as far as 
benchmarks go. The tests, and the duration of most of them, seem to mimic 
real-world software better than most benchmark suites do. (How often do most 
programmers write code that spins through a few preloaded floating-point 
arrays for 10 minutes? Or, for that matter, a recursive function that spins 
for more than 5 seconds? Those are the exceptions that prove the rule.)
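
Ackermann, mentioned above, is exactly that kind of recursive spinner. For 
illustration only, here's a minimal Java sketch of the shape of that 
function; it's my own sketch, not the shootout's actual Java entry:

    // Minimal sketch of the Ackermann-style recursive micro-benchmark
    // discussed above. My own illustration, not the shootout's Java entry.
    public class Ack {
        static int ack(int m, int n) {
            if (m == 0) return n + 1;
            if (n == 0) return ack(m - 1, 1);
            return ack(m - 1, ack(m, n - 1));
        }

        public static void main(String[] args) {
            int n = (args.length > 0) ? Integer.parseInt(args[0]) : 7;
            System.out.println("Ack(3," + n + "): " + ack(3, n));
        }
    }

Small N finishes in well under a second; it takes a fairly large N before 
the computation, rather than process startup, dominates the wall time.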

For example, the Word Count test does something real-worldish and uses the 
equivalent of a 15 MB file for the last run, which is actually far larger 
than most text files, on my system at least. When I use a utility like wc, I 
often use it in a loop in a script that gets called in a loop by another 
script, on files that average a lot smaller than 15 MB. Startup time 
counts. Even in server 'daemon' environments where the code takes different 
branches with different data, the "profiling" time of the HotSpot optimizer 
can be an issue as optimized executable code is constantly flushed in and 
out of cache.
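
To be concrete about the kind of wc-style loop I mean, here's a minimal 
sketch (my own, not the shootout's actual Word Count entry); on small files, 
JVM startup is a visible chunk of its total wall time:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    // Minimal wc-style sketch: count lines, words and characters on stdin.
    // My own illustration, not the shootout's Word Count entry.
    public class Wc {
        public static void main(String[] args) throws IOException {
            BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
            long lines = 0, words = 0, chars = 0;
            String line;
            while ((line = in.readLine()) != null) {
                lines++;
                chars += line.length() + 1;   // +1 for the newline readLine() strips
                for (String w : line.trim().split("\\s+")) {
                    if (w.length() > 0) words++;
                }
            }
            System.out.println(lines + "\t" + words + "\t" + chars);
        }
    }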

All this to say that a) I think there are probably logistical concerns, b) 
the Shootout is done pretty well over a large swath of tests and even more 
implementations, and c) maybe some of the tests are already too long rather 
than too short <g>.

My point in all this basically comes down to: if HotSpot doesn't perform 
well under these conditions (-server is a pig and -client doesn't optimize 
well enough), then that may give people some pretty decent insight into how 
it will perform for a lot of GP / SP programming chores.
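
To make the warm-up point concrete, here's a small sketch (again my own, not 
a shootout program) that times the same work a few times in one JVM; the 
early passes usually carry the JIT/profiling cost, which is exactly the cost 
a fraction-of-a-second benchmark never earns back, and -server tends to pay 
more of it up front than -client:

    // Small warm-up illustration (my own sketch, not a shootout program):
    // time the same work several times in one JVM and compare passes.
    public class Warmup {
        static long work(int n) {
            long sum = 0;
            for (int i = 0; i < n; i++) {
                sum += (i ^ (i << 3)) % 97;   // arbitrary busywork
            }
            return sum;
        }

        public static void main(String[] args) {
            for (int pass = 1; pass <= 5; pass++) {
                long t0 = System.currentTimeMillis();
                long result = work(20000000);
                long t1 = System.currentTimeMillis();
                System.out.println("pass " + pass + ": " + (t1 - t0)
                                   + " ms (checksum " + result + ")");
            }
        }
    }

Run it once with -client and once with -server and compare the first pass 
against the later ones.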

It doesn't make sense to change the way a benchmark is run because of how a 
runtime performs - it should be the other way around ;)

<thump><thump>Off the soapbox</thump></thump>

- Dave

----- Original Message ----- 
From: "James McIlree" <ovrskeek@mac.com>
To: "Shootout ML" <shootout-list@lists.alioth.debian.org>
Sent: Tuesday, March 29, 2005 1:00 AM
Subject: Re: [Shootout-list] Java benchmarks


> Isaac,
>
> The central problem here is that the shootout benchmarks
> are very small and short.
>
> In general, the server compiler generates higher-performance
> code than the client compiler. It uses a lot of resources to do so,
> though. In a benchmark that runs only a few seconds, or, as with many of
> the shootout benchmarks, a fraction of a second, it just doesn't pay.
>
> To me, the best fix would be to have longer running benchmarks.
>
> Some of the C benchmarks can complete in less time than it takes
> Java to reach main(). If the computing task at hand is really that
> short, it just doesn't matter what language you write it in. Even
> something 100x slower is fast enough.
>
> As a second choice, the IBM jdk would do nicely.
>
> As a third choice, run both server and client versions as
> separate languages. You've already got a large batch of java(s)
> in the shootout; this would do nicely to confuse the matter :-).
>
> James M
>
> On Mar 28, 2005, at 10:44 PM, Isaac Gouy wrote:
>
>> Could you have this conversation with Keith Lea, and then get back to
>> us? He beat us up for using the client compiler.
>>
>> http://kano.net/javabench/
>>
>>
>> --- James McIlree <ovrskeek@mac.com> wrote:
>>>
>>> I'd like to raise an issue about the Java benchmarks. It looks to
>>> me as if they are all being run with the "-server" option.
>>>
>>> Would it be possible to switch to the client compiler? I believe
>>> most of the benchmarks would actually complete quicker using the
>>> client
>>> compiler.
>>>
>>> The shootout benchmarks are all so small and quick, the server
>>> compiler simply does not have time to pay for itself.
>>>
>>> As a concrete example, here are timings for knucleotide:
>>>
>>> IBM java
>>> 1.205u 0.073s 0:01.17 108.5%    0+0k 0+0io 0pf+0w
>>>
>>> Sun java (-client)
>>> 1.291u 0.053s 0:01.38 97.1%     0+0k 0+0io 1pf+0w
>>>
>>> Sun java (-server)
>>> 1.785u 0.061s 0:01.45 126.8%    0+0k 0+0io 1pf+0w
>>>
>>> As you can see, it would be really nice to have the IBM jdk
>>> available for testing as well. It seems to be generally faster than
>>> the Sun jdk.
>>>
>>> http://www-128.ibm.com/developerworks/java/jdk/linux140/