Sat Apr 12, 2014 9:40 am by Taron
Oh, since this is a Dev Diary, I should say something useful, too, shouldn't I...especially if I have reason to!
After many weeks of being shell-shocked by incompatibilities between my NVidia card and pretty much all others, I finally tracked down the problem after investigating the code on my new Cintiq Companion (which I still don't like). A compile log on the simplest shader of them all revealed the source of all evil:
A variable I called "buffer".
Yes, as wicked as it sounds, the word "buffer" is apparently treated as something official by all cards except NVidia. And there's a reason: "buffer" is a reserved keyword in newer GLSL versions (it's used to declare shader storage buffers), and stricter compilers refuse it as a variable name. The Intel HD 4000 was somehow convinced it had to be part of some OpenGL GLSL version that it simply doesn't support, so it went: "The keyword buffer does not exist in the supported GLSL versions..." (or something to that effect).
I renamed it "dbuffer" and everything worked... everything! Apparently I had used "buffer" a bunch of times throughout my shaders, including the glass, the normal generator and a few more tiny ones.
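Just to illustrate the kind of thing that trips the stricter compilers, here's a minimal sketch (the uniform names and the shader itself are made up for the example, not taken from my actual code):

```glsl
// Rejected by stricter drivers (e.g. the Intel HD 4000):
// "buffer" is a reserved keyword in GLSL 4.30+, where it declares
// shader storage buffers, and some compilers refuse it as an
// identifier even when targeting older GLSL versions.
// uniform sampler2D buffer;

// Safe: any non-reserved name compiles everywhere.
uniform sampler2D dbuffer;

void main()
{
    // Simply sample the texture; the shader content doesn't matter,
    // the variable name alone is what breaks the compile.
    gl_FragColor = texture2D(dbuffer, gl_TexCoord[0].st);
}
```

NVidia's compiler happily accepts the commented-out version, which is exactly why the bug never showed up on my own machine.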
THUS: Be careful what labels you choose for your variables. Best stay away from anything that sounds official, because it can run you into sneaky nonsense that could drive you nuts without you ever finding out why. I mean, who's concerned about a four-line shader that claims to be perfectly fine on your NVidia card or in GPU ShaderAnalyzer?
Live and learn...I guess.