[Pkg-octave-devel] Bug#532656: Bug#532656: octave3.2_3.2.0-1(mips/unstable): FTBFS on mips. Segfault in regression test.

Rafael Laboissiere rafael at debian.org
Sun Jun 14 10:25:21 UTC 2009


* Rafael Laboissiere <rafael at debian.org> [2009-06-13 09:15]:

> So, even if log2 returns a wrong value, the bug happens elsewhere. At any
> rate, the following works as expected:
> 
>     complex (NaN, NaN)
>     complex (0, NaN)
>     complex (NaN, Inf)
> 
> I am puzzled with this problem and I cannot really debug it, since I am
> not versed in GDB.  The bug arises at line 1442 of pr-output.cc, in
> function pr_complex():
> 
>     pr_imag_float (os, i, i_fw); 


I think I found the culprit, but I am stuck with the debugging (see
below). If I set a breakpoint at line 703 of pr-output.cc, I see the
following when I run "complex(NaN,0)" at the octave prompt under gdb:

    Breakpoint 4, set_complex_format (x_max=2147483647, x_min=0, r_x=2147483647, inf_or_nan=true, int_only=0, 
        r_fw=@0x7facdd70, i_fw=@0x7facdd74) at pr-output.cc:703
    703       int prec = Voutput_precision;

On my amd64 system, I see the following instead:

    Breakpoint 6, set_complex_format (x_max=0, x_min=-2147483648, r_x=-2147483648, inf_or_nan=true, int_only=0, 
        r_fw=@0x7fffd1dd6afc, i_fw=@0x7fffd1dd6af8) at pr-output.cc:703
    703       int prec = Voutput_precision;

I think that the different values of x_max and x_min explain the bug on
the mips system.  I suspect they come from the following lines in
pr-output.cc (function set_format): log10 of a NaN argument returns NaN,
and converting a NaN double to int is undefined behavior in C++, so each
architecture is free to produce a different result (2147483647 on mips,
-2147483648 on amd64, as seen above):

      int x_max = max_abs == 0.0
        ? 0 : static_cast<int> (floor (log10 (max_abs) + 1.0));

      int x_min = min_abs == 0.0
        ? 0 : static_cast<int> (floor (log10 (min_abs) + 1.0));
 
However, I cannot debug the problem any further, because the variables
are not visible from gdb and many things seem to be inlined.

Would it be possible to write a simple C++ test program that would expose
the problem on mips?

-- 
Rafael
