smithpd on 7/5/2007 at 14:12
Quote Posted by Bikerdude
So I never got around to putting the 8800 in the second slot - what's the point of running the 8800 in an x4 slot...?!
Sorry for your frustration. There may be no point for you, but it would be quite interesting to me to know whether it works at all with the cards in either position. You could try the ATI first in slot 1, then the Nvidia in slot 2. Or you could install each card in the first slot individually and then move them around. The BIOS may then recognize the hardware, and Windows may then assign the existing software to it correctly. I don't know, I am just speculating.
bikerdude on 7/5/2007 at 15:06
Hi Smithy
When I said x4 I was referring to the speed of the PCI-e slot. The 1st slot runs @ x16, so just for the sake of compatibility you would be running the 8800 in the 2nd slot, which is 4x slower.
I did partly attempt the swap, but the ATI card did not like being put back in the 2nd slot after the drivers were installed with it in the 1st slot. All I would get was Windows redetecting the card and then BSODing.
So it was a non-starter from the get-go. If I come across another (different model) cheaper ATI card I will try again. But the first card I'm gonna look for is an FX5900 or FX5950-Ultra.
biker
Sneak on 7/5/2007 at 15:53
I have looked around for a 5900-series card in PCI-e but have not seen any. If you find one, let me know. I see them around in AGP, but not many. I already have an AGP version.
Nice try on the ATI. I for sure would not put my 8800 in an x4 slot. That card needs all the octane it can get. :thumb:
bikerdude on 8/5/2007 at 01:36
Quote Posted by Sneak
I have looked around for a 5900-series card in PCI-e but have not seen any. If you find one, let me know. I see them around in AGP, but not many. I already have an AGP version.
I used to have one about 3 months ago, but got rid of it, grrrr, so I will keep hunting for one. BTW, the 'nVidia Quadro FX 3000' is the same card, so I imagine it could be BIOS-flashed back to a 5900.
biker
Sneak on 8/5/2007 at 03:48
I am wondering. We really don't know yet what is causing this problem. Do you think it is possible it might be a shader thing? I am trying to remember what version of DirectX or Direct3D was current when T2 came out in 2000, and what Shader Model it used. We have been through Shader Model 1, 1.1, 1.2, 1.3, 1.4, 2, 2+, 2.1 and up from there. The shaders have about everything to do with rendering, from what I understand.
Might new cards with new GPUs be changing how they implement the shaders? I wonder if there is any way to force the 8800 to use a particular shader model in Thief to see if that would do anything. Like make it use 1.4.
Or am I off base here? I don't technically know how they work.
bikerdude on 8/5/2007 at 07:07
Quote Posted by Sneak
Might new cards with new GPUs be changing how they implement the shaders? I wonder if there is any way to force the 8800 to use a particular shader model in Thief to see if that would do anything. Like make it use 1.4. Or am I off base here? I don't technically know how they work.
If that's the case, then RivaTuner would be of use, as it can force a shader version.
biker
Update: it doesn't make any difference forcing 1.1, 1.3, 1.4 or 2.0..
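For anyone curious what "forcing a shader version" actually toggles: Direct3D packs the pixel shader version a device reports into a single DWORD (a marker in the high word, major version in bits 8-15, minor in bits 0-7), and a tweaker like RivaTuner just changes the number the driver hands to the game. Here's a toy sketch of that packing, modelled on the D3DPS_VERSION macro from the Direct3D 9 headers (the helper names are mine, not from RivaTuner or the SDK):

```python
def ps_version(major, minor):
    # Mirrors the D3DPS_VERSION macro from d3d9caps.h: the high word is a
    # pixel-shader marker, major/minor live in the low word.
    return 0xFFFF0000 | (major << 8) | minor

def decode(version):
    # Pull the major/minor pair back out of the packed DWORD.
    return ((version >> 8) & 0xFF, version & 0xFF)

caps = ps_version(3, 0)      # roughly what an 8800-class card would report
forced = ps_version(1, 4)    # what a tweaker would cap it to
print(decode(caps), decode(forced))  # → (3, 0) (1, 4)
```

If that's all the forcing does, the game just sees a smaller number in the device caps while the hardware path is unchanged, which might be why it made no difference here. (Speculation on my part.)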
Sneak on 8/5/2007 at 14:56
Quote Posted by Bikerdude
If that's the case, then RivaTuner would be of use, as it can force a shader version.
biker
Update: it doesn't make any difference forcing 1.1, 1.3, 1.4 or 2.0..
Thanks for checking, Biker. It is looking more and more like a hardware thing.
smithpd on 8/5/2007 at 15:36
Quote Posted by Sneak
Thanks for checking, Biker. It is looking more and more like a hardware thing.
I think it must be software. That is evident from what happened with driver behavior at around the 56xx drivers. Nobody would design a video card that is hardware-limited to 8-bit rendering. 24 bits (= 3 bytes), maybe. The problem is evidently some lack of communication between the Thief software and the Nvidia software. It's too bad Nvidia will not step up to the plate.
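Just to spell out the 8-bit vs 24-bit arithmetic (back-of-the-envelope numbers only, nothing read back from the 8800 itself):

```python
# An 8-bit framebuffer indexes a 256-entry palette; 24-bit true colour
# stores one byte each for red, green and blue. Generic figures, not
# anything specific to this card or driver.
colors_8bit = 2 ** 8                 # 256 distinct colours
colors_24bit = 2 ** 24               # 16,777,216 distinct colours
bytes_per_pixel_24 = (8 * 3) // 8    # 24 bits = 3 bytes, as noted above
print(colors_8bit, colors_24bit, bytes_per_pixel_24)
```

Which is why an 8-bit ceiling would be such a strange thing to bake into 2007-era hardware.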
Sneak on 8/5/2007 at 17:06
Quote Posted by smithpd
I think it must be software. That is evident from what happened with driver behavior at around the 56xx drivers. Nobody would design a video card that is hardware-limited to 8-bit rendering. 24 bits (= 3 bytes), maybe. The problem is evidently some lack of communication between the Thief software and the Nvidia software. It's too bad Nvidia will not step up to the plate.
I have posed the question of why the 8800 might be behaving this way over at Guru3D and the Hard Forums. Not much in the way of responses. If it is a driver thing, Guru could certainly tweak it. But would they for this particular game? If it were widespread across a bunch of old games, there would probably be more response.
smithpd on 9/5/2007 at 00:22
Quote Posted by Sneak
I have posed the question of why the 8800 might be behaving this way over at Guru3D and the Hard Forums. Not much in the way of responses. If it is a driver thing, Guru could certainly tweak it. But would they for this particular game? If it were widespread across a bunch of old games, there would probably be more response.
Tweaking means changing default settings, registry hacks, adding INF files, etc. That will probably not help. It is not the same as programming, which means going into the C code and removing the bug.