well, Oblivion is crashing and performing like crap for me{EDIT-FIX'D} - by Fig455
dracflamloc on 22/3/2006 at 22:34
Quote Posted by io organic industrialism
there is no way to "fix" it. shader model 2.0 just means it is using "pixel shaders" (a type of graphics rendering that renders shadows and textures more realistically) that require directx 9.0 generation hardware. geforce 4 series is only a directx 8 generation card. since it came out before directx 9 was released, it doesn't physically have the hardware capable of performing some of the operations, such as SM 2.0, that some directx 9 games require... including oblivion.
so why do half life 2 and doom 3 work? they are able to detect that you have an older card and render through the directx8 shader path instead of dx9. oblivion doesn't support that. so you will need to get a new graphics card if you're going to play it. sorry :(
That's incorrect; there's a tool called 3D Analyze or something along those lines that can actually CPU-emulate various versions of pixel shaders and card capabilities. Not sure how the performance would be on a game that's this intensive, though. Depends on your CPU.
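The "detection" io describes above is just a capabilities query at game startup. Here's a rough C sketch of the standard Direct3D 9 caps check (illustrative only -- Oblivion's actual startup code isn't public, and the printed messages are made up):

#include <d3d9.h>
#include <stdio.h>

int main(void)
{
    /* Ask the driver what the installed card can do. */
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    /* A GeForce 4 Ti reports pixel shader version 1.3 here, which is
       why a game that requires SM 2.0 refuses to run on it. */
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("SM 2.0 available: use the DX9 shader path\n");
    else
        printf("no SM 2.0: need a DX8 fallback path (Oblivion has none)\n");

    IDirect3D9_Release(d3d);
    return 0;
}

On a GeForce 4, Half Life 2 and Doom 3 hit the second branch and fall back to their DX8 render paths; Oblivion has no such path, hence the advice to upgrade.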
ziltron on 22/3/2006 at 23:13
I'm running a geforce 6600 gt agp 128 mb and it's running smooth (so far) at 800x600. You can pick one up for around $100 er so. (If you wanna go the cheap route.)
RyushiBlade on 22/3/2006 at 23:19
My own problems continue. I still get random, inexplicable lag. Restarting the game solves this, but I don't want to exit out and back in again every time. I just updated my drivers, which made things WORSE: my screen was covered in artifacts in-game. Reinstalling the older drivers didn't work, so I'm currently in the middle of a System Restore.
I'm afraid I might look into a Geforce 7800GS OC, too. After spending $400 on my card, I really hate to go and spend so much on yet another. Perhaps I could sell my Radeon x850xt PE. *sighs* Why is it everything I own eventually turns to crap?
jstnomega on 23/3/2006 at 00:50
allyourframeratesarebelongtorefreshrateexceptthosenotedotherwise
(http://accelenation.com/)
2006.03.20 : 05:49
Vsync Confusion /// Thomas Monk
Let’s get one thing straight right away: disabling Vsync does not improve the visible smoothness of a game by permitting the frame rate to exceed the display rate. This is because it is impossible to exceed the display rate! I don’t care if your benchmark does read 300 fps; if your display rate is 80Hz then you’re only ever going to see the image change 80 times per second! End of story!
In the above case what you would actually see is an image rate of 80 per second, but each of these images would be made from a collage of successive frames. But why on earth would anyone want to look at a mismatched composition of successive frames?
So to reiterate: there is absolutely no additional smoothness to be had by disabling Vsync if the frame rate exceeds the display rate — all you get is a broken image (image tearing).
However, if the frame rate is lower than the display rate, then disabling Vsync may produce slightly smoother gameplay. The reason for this is difficult to explain, but it derives from the fact that the frame rate is averaged within individual screens, rather than over successive screens. Take for example a game running at 60fps displayed on a monitor at 90Hz. With Vsync enabled the monitor would display the same frame twice on every other frame (1,1,2,3,3,4,5,5,6 etc). So for half the time the frame rate is only 45fps over the entire screen. With Vsync disabled at least one half of the image is guaranteed to change on successive frames; i.e. one half of the image is always running at 90fps (but never the same half twice). This arrangement appears slightly smoother to the human eye, but at the cost of image tearing between the upper and lower halves of the screen.
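To make the arithmetic concrete, here is a toy C loop that reproduces the 1,1,2,3,3,4,5,5,6 pattern above (pure arithmetic, no graphics API -- just a model of "each refresh shows the newest finished frame"):

#include <stdio.h>

int main(void)
{
    /* A 60 fps game on a 90 Hz display with Vsync enabled: each
       refresh scans out the newest frame that has finished rendering. */
    const int fps = 60, hz = 90;

    for (int k = 0; k < 9; k++) {
        /* frames finished by the time refresh k starts; integer
           division keeps the arithmetic exact */
        int frame = 1 + (k * fps) / hz;
        printf("refresh %d shows frame %d\n", k + 1, frame);
    }
    /* prints frames 1,1,2,3,3,4,5,5,6: every other frame is held for
       two refreshes, so half the time the whole screen updates at
       only 45 fps, which is the stutter described above. */
    return 0;
}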
Personally I find image tearing unacceptable in an age where visual quality is King. For this reason I almost always run with Vsync enabled, and try to configure the in-game settings to deliver a consistent frame rate. Frame rate consistency is the most effective trick for delivering fluid gameplay without unsightly image tearing.
Think on
spamsk8r on 23/3/2006 at 01:21
I'm very pleasantly surprised at how well it runs on my computer. I have a last-gen card (x800 pro) and a decent processor (3500+ @ 2.5GHz), and I am running it at 1280x960 with pretty much everything maxed, using 4xAA and no bloom because bloom sucks (it sucked in Deus Ex: IW, it sucked in Thief 3, and it sucks in Oblivion). I'm getting really decent frame rates. I was expecting to have to run it at 1024x768 with no AA, but it's been great so far. They really optimized the engine compared to Morrowind (which ran like a dead dog when it came out). The loading is insanely fast as well. I do have 2 gigs of RAM, so that might help, but if you have a mid- to high-end PC you will not see long load times at all.
Slither on 23/3/2006 at 01:34
Quote Posted by dracflamloc
That's incorrect; there's a tool called 3D Analyze or something along those lines that can actually CPU-emulate various versions of pixel shaders and card capabilities. Not sure how the performance would be on a game that's this intensive, though. Depends on your CPU.
I've never heard of this utility and dkessler is certainly welcome to dig this program up -- once its real name is discovered -- but I wouldn't put a lot of hope in it. There's more at play than just the shader model limitation; the card's GPU will simply gag under the full strain of Oblivion's needs, and the pipelines of his Ti will be jammed up like an LA freeway at rush hour.
dkessler, you're much better off getting a new video card. Check the prices of any DX 9 cards. You have a kick-ass CPU and Oblivion is highly reliant on GPUs (video card processors) -- so get the best card you can afford. Putting an old Ti in your current machine is akin to putting bicycle tires on your Corvette, anyway.
Dirty_Brute on 23/3/2006 at 02:34
Quote Posted by Zaphod
$340. I've found it on Pricewatch for $330, but it was definitely worth the extra $10 to get it today.
I have an eVGA 7800 GS. I read that these are very overclockable so don't hesitate to get those extra frames.
I currently have the GPU overclocked at 450MHz and the memory at 1350MHz via Coolbits. Half Life 2 and Fear have been running stable so far on these settings. I previously had it clocked much higher, but I was getting small blinking textures during Fear, so I went down a bit. Make sure you have decent cooling in your case if you decide to OC.
io organic industrialism on 23/3/2006 at 03:00
Quote Posted by dracflamloc
That's incorrect; there's a tool called 3D Analyze or something along those lines that can actually CPU-emulate various versions of pixel shaders and card capabilities. Not sure how the performance would be on a game that's this intensive, though. Depends on your CPU.
wow.
i can see this getting 5fps maximum, probably with the lowest resolution and detail settings
io organic industrialism on 23/3/2006 at 03:05
Quote Posted by Dirty_Brute
I have an eVGA 7800 GS. I read that these are very overclockable so don't hesitate to get those extra frames.
I currently have the GPU overclocked at 450MHz and the memory at 1035MHz via Coolbits. Half Life 2 and Fear have been running stable so far on these settings. I previously had it clocked much higher, but I was getting small blinking textures during Fear, so I went down a bit. Make sure you have decent cooling in your case if you decide to OC.
memory at 1035????? the STOCK clock for the memory is 1200 ... i also have an evga and mine is currently clocked at 489 / 1380... no problems
i am disappointed in the performance in oblivion though ... i am getting about 25fps average, indoors and outdoors. i just installed the new nvidia drivers so i'm not quite sure what the problem is. obviously my athlon xp 3000+ is bottlenecking my gpu somewhat, but other games run just great (FEAR for example). :confused: the game detected "ultra high" settings... so at first i was pumped, but i was definitely disappointed at the FPS. i tried turning down the textures and some other settings, but it didn't seem to affect it much :-\
Dirty_Brute on 23/3/2006 at 03:28
Oops, sorry! I fixed the typo regarding the memory. I actually have my 7800 GS memory set to 1350MHz.