[Pkg-octave-devel] Bug#783624: Bad plotted data against time with long arrays

Mike Miller mtmiller at debian.org
Thu Apr 30 00:12:42 UTC 2015


Control: forwarded -1 https://savannah.gnu.org/bugs/?32980

On Wed, Apr 29, 2015 at 20:31:35 +0200, Rafael Laboissiere wrote:
> At any rate, this is an upstream issue and I am hereby tagging this bug
> report accordingly.  It is related to the specific numerical condition of
> your data.  This is the reason why the problem disappears when floor(ttr(1))
> is subtracted from ttr.
> 
> I will report this upstream, eventually.

Done!

This is a known upstream bug: the OpenGL plotting toolkits only support
single-precision values. If your x or y data values are large enough
that consecutive samples differ by less than the resolution of single
precision, a run of sequential values ends up identical, and the plot
looks like what you've seen.
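
For illustration, here is a minimal sketch at the Octave prompt (the
sample values are hypothetical, not taken from your data):

  t = datenum (2015, 4, 29) + (0:4)' / 86400;  # five times, 1 second apart
  diff (single (t))                            # all zeros: the values collapse

Date numbers around 2015 are roughly 7.4e5, where single precision can
only resolve steps of about 0.06 days (roughly 90 minutes).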

One workaround is to use gnuplot, as Rafael has shown.
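
For example, switching the toolkit before plotting (y here is just a
placeholder name for your data series):

  graphics_toolkit ("gnuplot")   # switch away from the OpenGL-based toolkits
  plot (ttr, y)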

Another workaround would be to offset or rescale your x data series so
that consecutive values remain distinguishable in single precision. For
example, if you subtract datenum(2010,1,1) from your ttr vector, the
plot comes out ok.
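
Something like the following sketch (again, y is a placeholder for your
data series):

  t0 = datenum (2010, 1, 1);   # fixed reference date
  plot (ttr - t0, y)           # offsets are small enough for single precision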

-- 
mike


