#include <math.h>       /* floor() */
#include <epicsTypes.h> /* epicsInt64 */

static epicsInt64
scaleCoefficient(double x)
{
    epicsInt64 v;
    int neg = 0;
    if (x < 0) {
        neg = 1;
        x = -x;
    }
    /* Clamp the magnitude before scaling */
    if (x > 2) x = 2;
    /* Round to nearest, with 33 fractional bits (scale by 2^33) */
    v = floor(x * (1ULL << 33) + 0.5);
    if (neg) {
        v = -v;
    }
    else {
        /* Saturate positive results at 2^34 - 1 */
        if (v >= (2ULL << 33)) v = (2ULL << 33) - 1;
    }
    return v;
}
To my surprise, this failed to compile. The problem is that this test in epicsTypes.h fails:
#if __STDC_VERSION__ >= 199901L
“Hmmmm,” I said, “I wonder how that macro expands”...
238> touch a.c
239> gcc -E -dM a.c|grep STDC
#define __STDC_HOSTED__ 1
#define __STDC__ 1
240>
!!!!!! How can __STDC_VERSION__ be undefined? I thought it was part of the language.
242> type gcc
gcc is hashed (/usr/bin/gcc)
243> gcc --version
gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-16)
And indeed, it *is* part of the language — if you explicitly request a more recent standard.
245> gcc -std=c99 -E -dM a.c|grep STDC
#define __STDC_HOSTED__ 1
#define __STDC_VERSION__ 199901L
#define __GNUC_STDC_INLINE__ 1
#define __STDC__ 1
wenorum@xildev4 246>
My questions, after that long-winded introduction, are: “Given that the default gcc options preclude the use of 64-bit values, what -std value should be used? Should the appropriate -std=xxxx be added to CONFIG.gnuCommon?”
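Purely as a sketch of the kind of change I have in mind (whether CONFIG.gnuCommon is the right home, and whether gnu99 or strict c99 is the right dialect, is exactly what I am asking; the variable name here is illustrative):

```make
# Sketch only: request C99 plus GNU extensions for GCC builds.
# gnu99 rather than strict c99 keeps GNU extensions (e.g. the asm
# keyword) that existing code may rely on, while still defining
# __STDC_VERSION__ as 199901L.
USR_CFLAGS += -std=gnu99
```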