Comment 4 for bug 2061291

Christophe Rhodes (csr21-cantab) wrote :

On process startup, the ABI says that the floating point control register will be in a certain state. For x86-64, that is in the System V ABI, available (for example) at https://raw.githubusercontent.com/wiki/hjl-tools/x86-psABI/x86-64-psABI-1.0.pdf -- summarizing, the process is set up so that various exceptional circumstances (e.g. divide by zero, overflow, and so on) do not trigger floating point exceptions.

However, SBCL (mostly) attempts to provide information to the user about these exceptional circumstances; rather than return infinities, not-a-number, or similar, it modifies the floating point control register at startup to turn on those floating point exceptions. So while in C,

#include <stdio.h>

float div(float a, float b) { return a / b; }

int main(void) {
  printf("%f\n", div(1.0, 0.0));
  return 0;
}

prints "inf" (C's rendering of positive infinity), in SBCL

(/ 1.0 0.0)

signals a DIVISION-BY-ZERO condition.

I don't know what GLUT is doing, but it seems to assume that the control word is set so that overflow and invalid operations are masked, i.e. do not trigger a floating point exception. I don't think that assumption is valid: if the GLUT library requires a particular state, it should establish that state itself and restore the caller's state on return (the floating point control word is "callee-saved").

Other Lisp implementations, or other software in general, may choose to enable a different set of floating point exceptions, or none at all, which would explain the differences in behaviour you are seeing. Your test C program does not enable any floating point exceptions, so it is not surprising that it does not see any.