Hobo

Graphic engine


Is there any visual difference between the HW T&L engine and normal Direct3D? Maybe I am blind, but I cannot see much difference even between the 16-bit and 32-bit image in OFP. What differences should I concentrate on? I am sorry, but I know nothing about these graphics issues. Thank you.


T&L does affect the look, I think, but it's mainly about letting the graphics card take a heavier share of the burden than normal, freeing up the CPU.

Hm, I never even got T&L to work in OFP. I have a GeForce DDR and forced T&L on, yet it says it isn't enabled... and I can't enable it in-game. But is it forced anyway? Anyone know of a way to check in-game?


The HW T&L setting still uses Direct3D to generate the graphics.

Normally, transform and lighting calculations are performed by the computer's processor. "Transform" refers to the conversion of three-dimensional geometry data into the 2D version seen on screen (since a monitor is a 2D display, this step is required).

Graphics cards that have T&L processors can handle these calculations themselves, which frees up the processor to do other tasks (animation, physics simulation, etc.).
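
As a rough illustration of what the "transform" part means, this is the kind of per-vertex calculation that gets moved from the CPU to the card (a minimal sketch, not OFP's or Direct3D's actual code; all the names here are made up for illustration):

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Project a 3D point to 2D screen coordinates. Assumes a camera at the
// origin looking down +z; 'focal' sets the field of view and (cx, cy)
// is the screen centre. Hardware T&L does this (plus lighting) for
// every vertex so the CPU doesn't have to.
void projectToScreen(Vec3 p, float focal, float cx, float cy,
                     float *sx, float *sy) {
    *sx = cx + focal * p.x / p.z;  // divide by depth: farther points
    *sy = cy - focal * p.y / p.z;  // crowd toward the screen centre
}

int main(void) {
    Vec3 p = {1.0f, 2.0f, 10.0f};
    float sx, sy;
    projectToScreen(p, 400.0f, 400.0f, 300.0f, &sx, &sy);
    std::printf("screen position: (%.1f, %.1f)\n", sx, sy);
    return 0;
}
```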

The only visual improvement you could get from enabling T&L is if you originally had some graphics options turned down because your framerate was too low.

By enabling T&L, you can turn up more graphics options because your processor can handle more information.

But other than being able to render more objects on screen, there's really not much of an improvement. T&L doesn't improve textures or model complexity.


Quote (R. Gerschwarzenge @ Mar. 04 2002,21:19): "You must set the T&L in OFP Preferences. It can't be enabled while the game is running."

*dawdler starts preferences*

DOH!

:D


I haven't noticed any difference when using T&L in OFP; I just leave it turned on for psychological reasons, lol


Well, thanks. And what about the difference between 16-bit and 32-bit colour mode? I still have a low-end machine, so is OFP in 32-bit colour depth worth upgrading my hardware for? I tried to evaluate the visual differences, but the game was so choppy in 32-bit that I could not (sorry for my English :))


I hear it's better all around in 32-bit. What's your system configuration?


Quote (Bloodmixer @ Mar. 05 2002,11:16): "Personally I noticed a great improvement from upgrading from 128MB to 256MB."

Yes, me too :D

When I had 128 MB, the game got bumpy after a while; after restarting, everything was fine again. Going from 128 to 384 MB, the game runs really smooth, even on my brother's slow 600 MHz computer (I have a 1400). But my new mainboard didn't work with my old memory, so now I only have 256 MB :(


16bit is faster

32bit looks much better (shadows, colours)


Quote (Czacha @ Mar. 05 2002,12:55): "16bit is faster. 32bit looks much better (shadows, colours)."

Does it really look better? Really?

What is it, 65,536 colours compared to about 16.7 million? (16-bit gives 2^16 colours; 32-bit actually uses 24 bits for colour, so 2^24.) The point still stands: can your eye really see all those colours? And if you can, will you really notice it during gameplay (rather than sitting and staring at the screen hunting for colour number 404212 on a single pixel)? It's one of those things that is totally overdone on computers.

One should run 16-bit in any case where the computer can't handle 32-bit. But most modern graphics cards and games aren't that much faster in 16-bit, so it's often worth going to 32-bit.
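
For the record, the counts work out like this (a small sketch assuming the common 16-bit RGB565 and 32-bit XRGB8888 layouts; actual cards may differ):

```cpp
#include <cstdio>

int main(void) {
    // 16-bit colour is usually RGB565: 5 bits red, 6 green, 5 blue,
    // so 2^16 = 65,536 possible colours (only 32-64 shades per channel).
    unsigned long colors16 = 1ul << 16;

    // 32-bit colour is usually XRGB8888: 8 bits per channel plus 8
    // unused/alpha bits, so 2^24 = roughly 16.7 million colours.
    unsigned long colors32 = 1ul << 24;

    std::printf("16-bit: %lu colours, 32-bit: %lu colours\n",
                colors16, colors32);
    return 0;
}
```

The banding you sometimes see in skies and smoke under 16-bit comes from those coarse per-channel steps, not from the total colour count.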


32bpp should be avoided like the plague when using a GF2MX or lower. It simply doesn't have enough memory bandwidth for smooth 32-bit operation :/ Only about 3 GB/sec, compared to more like 5 or 6 GB/sec for the higher-end GeForce 2s. Of course, with a GF3 or better you're fine :)
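
To put rough numbers on that (a back-of-envelope sketch; the overdraw and per-pixel traffic figures here are assumptions, not measured OFP numbers): one 800x600 frame at 32bpp is 800 x 600 x 4 bytes, about 1.9 MB, just to write the colour buffer once. Add a z-buffer read and write plus texture fetches for every pixel, multiply by 2-3x overdraw, and a single frame can easily cost 15-20 MB of memory traffic. At 60 fps that is already around 1 GB/sec, which eats a far bigger slice of a ~3 GB/sec card's budget than of a 5-6 GB/sec one.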

The only difference (in OFP) I can see between 16 and 32bpp is that in 16-bit you can sometimes see through objects, because of z-buffer problems under 16-bit :( Not a big price to pay for smooth running.
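
That see-through effect makes sense: on cards of this era the z-buffer typically drops to 16 bits along with the colour depth, and a 16-bit z-buffer only has 65,536 depth steps, spent mostly near the camera. Here's a little sketch of the idea (the near/far plane values are made up, just chosen to resemble a big outdoor terrain):

```cpp
#include <cstdio>

// Map a view-space depth z to a quantized z-buffer value using the
// standard perspective depth formula. Most precision lands near the
// camera, so distant surfaces get crowded into very few buckets.
unsigned quantizeDepth(double z, double n, double f, unsigned bits) {
    double zndc = (f / (f - n)) * (1.0 - n / z);
    return (unsigned)(zndc * (double)((1ul << bits) - 1));
}

int main(void) {
    double n = 0.5, f = 2000.0;  // near/far planes (illustrative values)

    // Two surfaces half a metre apart, 800 m from the camera:
    unsigned a16 = quantizeDepth(800.0, n, f, 16);
    unsigned b16 = quantizeDepth(800.5, n, f, 16);
    unsigned a24 = quantizeDepth(800.0, n, f, 24);
    unsigned b24 = quantizeDepth(800.5, n, f, 24);

    std::printf("16-bit: %u vs %u -> %s\n", a16, b16,
                a16 == b16 ? "same value, one shows through the other"
                           : "distinct");
    std::printf("24-bit: %u vs %u -> %s\n", a24, b24,
                a24 == b24 ? "same value" : "distinct");
    return 0;
}
```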


I find that 32-bit colour runs fine with my GeForce DDR, PIII 750, and 256 MB RAM. But when I enable HW T&L it looks a little better (nicer lighting), yet it runs slower and jerks a lot. Odd, really; I'd have thought it would be better...


Quote (Dawdler @ Mar. 05 2002,13:24): "Does it really look better? Really? ... can your eye really see all those colours? And if you can, will you really notice it during gameplay?"

Yes, it looks better... really... in any game. I'm not sure whether you have games like UT or Q3, but give it a try: first go to 16-bit, then 32, and you will notice the difference. And once you're used to it, you won't ever want to go back.


Quote: "I find that 32 bit colour runs fine with my GeForce DDR and PIII 750 256 MB RAM."

I guess the GeForce DDR has much greater memory bandwidth thanks to the double data rate RAM, which is why 32-bit is smooth with your setup.


One of the great things about the Kyro2 chipset is that switching between 16-bit and 32-bit results in only a tiny reduction in FPS :)


That's because it natively renders everything in 32bpp in the first place.


16-bit messes up shadows in my game; 32-bit is a little more demanding, but I still use it. I prefer to use T&L because it takes the load off my computer's CPU and puts it onto the graphics card, but it creates some sort of input lag whenever I use scopes/ironsights; the stronger the zoom, the more input lag...

PC specs: PIII 500 MHz, 256 MB SDRAM, GeForce2 V7100 MX400 64 MB and a 12.9 GB HD...


I have a PIII 450 MHz and a GeForce2 MX, but I'm considering upgrading to a Celeron 1.2 GHz (Tualatin) + i815EPT Pro board + GeForce2 MX + 256 MB RAM, since I was told this combination offers the best stability and price/performance ratio. Do you think I will be able to play at 800x600x32 without any lag, even in missions like "battlefields"? I checked both 16-bit and 32-bit colour depths on a friend's computer, and I have to admit I did not notice any substantial difference in visual quality. While Quake III does look different in 16-bit vs 32-bit colour, I did not see this in OFP. And, by the way, which version of the Detonators is best for a GeForce2? I considered downloading the latest XP Detonators to improve the frame rate, but there are reports on discussion boards that they are not very stable and don't offer much improvement for the GeForce2 (they are designed mainly for the GeForce3). So which version is best for a GF2?


Quote (Hobo @ Mar. 05 2002,20:33): "I'm considering upgrading to a Celeron 1.2 GHz (Tualatin)... Do you think I will be able to play at 800x600x32 without any lag, even in missions like 'battlefields'? ... Which version of the Detonators is best for a GeForce2?"

Get a Tualatin PENTIUM III 1200 like I did, if your mobo can handle it (I guess it can, since it handles Tualatin Celerons) :D It's much faster! And much more expensive :/

But seriously, I own a Pentium III Tualatin 1200 + Asus i815 board + GeForce3 Ti200 (200€/170 USD) and it rocks! Q3 at 1600x1200@32-bit gets about 100 up to 300 fps, and Flashpoint runs fantastically too at 1600x1200@32-bit!!

Just get the latest Detonators; there's close to no difference between versions, plus bugs get fixed and effects get optimized.


Quote (Hobo @ Mar. 05 2002,20:33): "Do you think I will be able to play at 800x600x32 without any lag, even in missions like 'battlefields'? And, by the way, which version of the Detonators is best for a GeForce2?"

Yes, and mine is an old PC:

PII450@512

ASUS P3CE i820

256MB x1 + 128MBx1 (Corsair PC133 Cas2-2-2)

Leadtek Winfast GeForce2 Pro 32Mb

DirectX 8.1

WinMe

Battlefield Mission @800x600@32bit? Nooo problem.

Detonator: 4.13.01.1241

Regards,

Icabola


I have:

P3 866 MHz

128 MB PC133 SDRAM

GeForce2 MX400 64 MB

40 GB HD

I run at the highest resolution and 32-bit colour, with no slowdown at all. I recently upgraded my graphics card from a 16 MB Nvidia Vanta and noticed a huge difference. I used to play OFP at 800x600@16-bit, and sometimes even lower, and I still had choppy frame rates at times. I recommend that anyone with a system similar to mine (but a lesser video card) get at least a GF2 MX400 if you can afford it. It will drastically improve performance.

