GLSL problems on Mac (really broken now)


GLSL problems on Mac (really broken now)

Postby ds01 » Wed Mar 29, 2006 6:28 am

[Note: added external link to log output]

In the 151 release candidates and LordHavoc's engine snapshots on Icculus, GLSL is completely broken in Nexuiz on Mac OS X.

Previously, in both 1.5 and 1.2.x (and in nexuizengine snapshots from February 2006 and earlier), only the shadows were broken when GLSL was enabled in Nexuiz: half of the realtime shadows coming from a dlight (a weapon effect, rocket flare, etc.) were not rendered, so the shadows looked like "broken glass".

Now, enabling GLSL produces nothing but the skybox background and weapon fire (shotgun pellets, for example). No textures, no models (players or weapons), etc. - just a black screen with the skybox background and the projectiles from weapons :(

To be specific, these are the options:
Code: Select all
r_shadow_glsl "1"
r_shadow_realtime_dlight "1"
r_shadow_realtime_world_dlightshadows "1"
r_shadow_realtime_world "0"
r_shadow_realtime_world_shadows "0"


The world shadows are too slow for my setup, and the dlights are too slow without GLSL - it is really, really needed for the dlights to be usable; otherwise things like the Quad drag the FPS to 50% of what it should be.

So to reiterate: previously I disabled dlights and dlightshadows and enabled GLSL, because the dlightshadows looked broken (like shards of glass). This is no longer enough in the 151 rc's and the latest engine snapshots, because even with dlights and dlightshadows disabled, enabling GLSL makes the entire screen black, with no textures and no models being rendered (only the skybox and weapon fire).

To me this is just unacceptable. The latest engine is very, very fast compared to the previous one - I can see a LOT of work has gone into it - but it's sad that this problem exists on Mac OS X, and I want to give as much feedback as possible in the hope of seeing it corrected.

System specs:
Mac OS X 10.4.5, all updates
Dual-core G5 @ 2 GHz
NVIDIA 7800 GT 256 MB PCI-Express
2 GB DDR2 (667) RAM

I have noticed there is no GL_EXT_texture3D implementation in the NVIDIA drivers on OS X; I thought this was pretty standard (well, standard for Linux/Windows/Mesa/etc.) and was surprised it was missing.
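
(For anyone who wants to verify this on their own setup, here's a minimal sketch; it assumes a current OpenGL context, uses the Mac OS X header path, and check_texture3d is just a name I made up for the example.)
Code: Select all
#include <stdio.h>
#include <string.h>
#include <OpenGL/gl.h> /* Mac OS X header path; <GL/gl.h> elsewhere */

/* requires a current OpenGL context when called */
void check_texture3d(void)
{
	const char *ext = (const char *)glGetString(GL_EXTENSIONS);
	if (ext && strstr(ext, "GL_EXT_texture3D"))
		printf("GL_EXT_texture3D is advertised\n");
	else
		printf("GL_EXT_texture3D is NOT in the extension string\n");
}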

The console output gives errors finding shaders, and one about "DrawQ_ResetClipArea: not in 2d". Here's a link to the console log with the developer option enabled:
qconsole.log.gz

Is anyone on OS X with an ATI video card having the same issues, or is this NVIDIA-specific?

TYIA
Last edited by ds01 on Wed Mar 29, 2006 9:07 pm, edited 1 time in total.

Postby LordHavoc » Wed Mar 29, 2006 7:43 am

Could you run with -developer on the commandline and post the log somewhere (a nopaste site would be fine) and give me the link (it would be a bit big for a forum post)? By default it does not print the reasons for the warnings/errors from the shader compilation process, so I need you to enable -developer mode to see them.

Thanks.

P.S. In the meantime you can play with r_glsl 0, or -nofragmentshader on the commandline, but as you know this is slower.

developer log output

Postby ds01 » Wed Mar 29, 2006 8:41 am

[edit: requested log added to parent post]

Not sure if you have a Mac accessible to you in any fashion, but the "OpenGL Shader Builder" that comes with the Developer Tools would probably be of help if more insight is needed.

If there's something else needed let me know.

Thanks.

Postby LordHavoc » Thu Mar 30, 2006 1:53 am

Yikes.

That driver is completely broken: it is not honoring the #ifdef directives that determine which part of the file to compile as the vertex shader (geometry processing) and which part as the fragment shader (pixel processing), so it tries to compile both parts of the file into the vertex shader, which causes internal conflicts.

The shaders use a LOT of #ifdef directives to reduce maintenance (otherwise it would take, literally, 3000 different shader scripts, or 30 or so which would run much slower than necessary), so they simply won't work with a broken driver like that!
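
For illustration, here is a minimal sketch of the technique (not the engine's actual shader - the define names VERTEX_SHADER, FRAGMENT_SHADER, and USEFOG are made up for this example):
Code: Select all
/* one shared source string; the prepended #define selects which part each
   compile sees, so each stage ends up with exactly one main() */
const char *sharedshader =
"#ifdef VERTEX_SHADER\n"
"void main(void) { gl_Position = ftransform(); }\n"
"#endif\n"
"#ifdef FRAGMENT_SHADER\n"
"void main(void)\n"
"{\n"
"	vec4 color = vec4(1.0);\n"
"#ifdef USEFOG\n"
"	color.rgb *= 0.5; /* one optional feature = one more #define */\n"
"#endif\n"
"	gl_FragColor = color;\n"
"}\n"
"#endif\n";
const char *vertexstrings[2] = { "#define VERTEX_SHADER\n", sharedshader };
const char *fragmentstrings[2] = { "#define FRAGMENT_SHADER\n", sharedshader };

Each optional feature define multiplies the number of possible permutations, which is why one maintained file beats thousands of generated ones - but it only works if the driver's preprocessor does its job.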

I have only two options here: one is modifying the engine to do the work the driver should be doing (the #ifdef handling); the other is leaving it broken.

I bought the OpenGL Shading Language book, and it is quite clear on the full-featured nature of the OpenGL Shading Language; this driver is broken.

Please notify Apple of this issue; it's not my fault :(

P.S. It works fine with other NVIDIA drivers (Windows x86, Windows x64, Linux x86, Linux x86_64, and FreeBSD x86 are known to work). The Mac OS NVIDIA drivers are maintained primarily by Apple, so the ball is in their court, not NVIDIA's.

Postby ds01 » Thu Mar 30, 2006 3:14 am

Do you know if it is in the ARB spec that the driver must honor these conditionals? I would assume it is, but honestly if I am going to file a bug report that has any chance I need to know some things.

I don't blame you for this problem - you may be interested to know that since owning this machine (only a few months, btw) this is most certainly not the first bug I have come across, including a rather dumb regression causing an endless driver restart (until the pre-defined reload limit), after which you are greeted with the "Black Menu of Death". Anyone who has seen this knows what I am talking about, but that program was the SolarWinds screensaver port to OS X (not Nexuiz!).

So, to get back on track: do you happen to know if it's in the spec to support that? What I need to submit is pretty much covered in this mail from an Apple engineer: http://lists.apple.com/archives/Mac-opengl/2004/Nov/msg00088.html

Would you be so kind as to post a very basic code snippet I may use as an example of what fails, please?

Also, how would you describe this error in technical terms from your point of view (being an OpenGL developer)? For example, I would describe this issue as the NVIDIA OpenGL driver on OS X not respecting the conditional directives in a C project using GLSL code: the fragment shaders get clobbered (interpreted as vertex routines) regardless of the conditionals in the C code. Then I'd insert a small code example. Is this an accurate description in your opinion?

FYI, I actually wanted an ATI card, because I knew about the lack of direct involvement by NVIDIA with the Apple GL drivers for their cards (among other reasons). Unfortunately, there were, and still are, no ATI video cards for PCI-Express-based Apple G5 PowerMacs.

Thanks for your time, I really appreciate it!

Postby LordHavoc » Thu Mar 30, 2006 6:21 am

ds01 wrote:Do you know if it is in the ARB spec that the driver must honor these conditionals? I would assume it is, but honestly if I am going to file a bug report that has any chance I need to know some things.


http://oss.sgi.com/projects/ogl-sample/registry/ARB/GLSLangSpec.Full.1.10.59.pdf - Section 3.3 describes the preprocessor directives supported in GLSL and their exact behavior; DarkPlaces uses only a few of them.

ds01 wrote:I don't blame you for this problem - you may be interested to know that since owning this machine (only a few months, btw) this is most certainly not the first bug I have come across, including a rather dumb regression causing an endless driver restart (until the pre-defined reload limit), after which you are greeted with the "Black Menu of Death". Anyone who has seen this knows what I am talking about, but that program was the SolarWinds screensaver port to OS X (not Nexuiz!).


Strange.

ds01 wrote:So, to get back on track: do you happen to know if it's in the spec to support that? What I need to submit is pretty much covered in this mail from an Apple engineer: http://lists.apple.com/archives/Mac-opengl/2004/Nov/msg00088.html


Yes it's in the spec.

ds01 wrote:Would you be so kind as to post a very basic code snippet I may use as an example of what fails, please?


Code: Select all
/* minimal test: TESTING1 is defined, TESTING2 is not, so a conforming
   preprocessor compiles exactly one main() into the vertex shader */
const char *shaderstring =
"#define TESTING1\n"
"#ifdef TESTING1\n"
"void main(void) {}\n"
"#endif\n"
"#ifdef TESTING2\n"
"void main(void) {}\n"
"#endif\n";
GLhandleARB vertexshaderobject = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
glShaderSourceARB(vertexshaderobject, 1, &shaderstring, NULL);
glCompileShaderARB(vertexshaderobject);
GLint vertexshadercompiled;
glGetObjectParameterivARB(vertexshaderobject, GL_OBJECT_COMPILE_STATUS_ARB, &vertexshadercompiled);
/* fetch and print the driver's compile log */
char compilelog[65536];
glGetInfoLogARB(vertexshaderobject, sizeof(compilelog), NULL, compilelog);
printf("vertex shader compile log:\n%s\n", compilelog);


If the preprocessor works correctly, the first main() should be compiled into the vertex shader and the second main() should be ignored; the observed behavior, however, is that BOTH are compiled into the vertex shader, causing errors in the compile log.

ds01 wrote:Also, how would you describe this error in technical terms from your point of view (being an OpenGL developer)? For example, I would describe this issue as the NVIDIA OpenGL driver on OS X not respecting the conditional directives in a C project using GLSL code: the fragment shaders get clobbered (interpreted as vertex routines) regardless of the conditionals in the C code. Then I'd insert a small code example. Is this an accurate description in your opinion?


Not quite an accurate interpretation: the conditional directives are preprocessor directives in the GLSL code, not in the C code. GLSL is a very, very C-like language.

ds01 wrote:FYI, I actually wanted an ATI card, because I knew about the lack of direct involvement by NVIDIA with the Apple GL drivers for their cards (among other reasons). Unfortunately, there were, and still are, no ATI video cards for PCI-Express-based Apple G5 PowerMacs.


Funny how that works: on the PC, NVIDIA is respected for their superior drivers (ATI's are buggy, putting increased strain on developer relations), while on the Mac, ATI is respected because Apple writes their Mac drivers.

ds01 wrote:Thanks for your time, I really appreciate it!


No problem.

Postby obi_wan » Thu Mar 30, 2006 7:35 am

ds01 wrote:Unfortunately, there were, and still are, no ATI video cards for PCI-Express-based Apple G5 PowerMacs.

Isn't there any way to flash a PC ATI card so that it can work with a Mac?
There are ATI cards for AGP-based G5s (I've got one), so perhaps... :)

shader dump

Postby ds01 » Fri Mar 31, 2006 3:30 am

Havoc,

Is there a way to dump the shaders (to the console, and thus the log) before they're fed to the backend compiler, so I can verify that the defines that are supposed to be fed into them actually are? That way I can tell whether the driver is ignoring the conditionals in the GLSL string, or whether something else is going on (like the defines not being placed into the shaders before being compiled).

Thanks.

flash card

Postby ds01 » Fri Mar 31, 2006 3:33 am

lol Obi Wan, I think I'll wait until they EOL the G5 PowerMacs and see if ATI ever releases a card for them before I resort to such drastic measures ;)

Re: shader dump

Postby LordHavoc » Fri Mar 31, 2006 3:40 am

ds01 wrote:Havoc,

Is there a way to dump the shaders (to the console, and thus the log) before they're fed to the backend compiler, so I can verify that the defines that are supposed to be fed into them actually are? That way I can tell whether the driver is ignoring the conditionals in the GLSL string, or whether something else is going on (like the defines not being placed into the shaders before being compiled).

Thanks.


Sure.

In GL_Backend_CompileProgram, which is at gl_backend.c:715, add something like this:
Code: Select all
/* dump the exact strings handed to the GLSL compiler when developer >= 100 */
if (developer.integer >= 100)
{
	int i;
	Con_Printf("Compiling shader:\n");
	if (vertexstrings_count)
	{
		Con_Printf("------ VERTEX SHADER ------\n");
		for (i = 0;i < vertexstrings_count;i++)
			Con_Print(vertexstrings_list[i]);
		Con_Print("\n");
	}
	if (fragmentstrings_count)
	{
		Con_Printf("------ FRAGMENT SHADER ------\n");
		for (i = 0;i < fragmentstrings_count;i++)
			Con_Print(fragmentstrings_list[i]);
		Con_Print("\n");
	}
}


(I've just now committed this to the DarkPlaces engine CVS.)

Note that -developer on the commandline sets developer to 100, so the same commandline should now print this to the log.

Also note that GLSL takes a whole list of strings to compile, so DarkPlaces sends the #define lines as the first strings, and then the actual shader source as the last string.
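
As a sketch of that mechanism (the define name USEFOG and the variable names here are illustrative, not the engine's actual ones):
Code: Select all
/* glShaderSourceARB takes an array of strings which the driver concatenates
   before compiling, so permutation #defines can be prepended as separate
   strings without modifying the shader file contents */
const char *shaderfile = "#ifdef USEFOG\n/* fog code */\n#endif\nvoid main(void) {}\n";
const char *strings[2] = { "#define USEFOG\n", shaderfile };
GLhandleARB object = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
glShaderSourceARB(object, 2, strings, NULL);
glCompileShaderARB(object);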
