
Nvidia launches GeForce 6 series


PCI Express has 16 lanes to communicate over, right? Compared to AGP 8x's single shared bus at the moment.

The bandwidth is supposed to be up at 4.0 GB/s, compared to the present ~2.1 GB/s of AGP 8x.
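For reference, a quick back-of-the-envelope check (a sketch in Python; the 250 MB/s-per-lane figure reflects first-generation PCIe's 2.5 GT/s signalling minus 8b/10b encoding overhead, and AGP 8x is a 32-bit bus at an effective 533 MHz):

    # First-generation PCI Express: 2.5 GT/s per lane, 8b/10b encoding,
    # so 80% of the raw bits are payload -> 250 MB/s per lane, per direction.
    pcie_lane_mb_s = 2.5e9 * (8 / 10) / 8 / 1e6   # 250 MB/s
    pcie_x16_gb_s = pcie_lane_mb_s * 16 / 1000    # 4.0 GB/s each direction

    # AGP 8x: 32-bit (4-byte) bus at an effective 533 MHz (66 MHz base, 8x data rate).
    agp_8x_gb_s = 4 * 533e6 / 1e9                 # ~2.1 GB/s, one direction

    print(f"PCIe x16: {pcie_x16_gb_s:.1f} GB/s vs AGP 8x: {agp_8x_gb_s:.1f} GB/s")

So the headline 4.0 GB/s figure checks out, and unlike AGP it applies in each direction at once.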


PCI-X will only make a difference in high-res video editing applications. As Shadow said, that's not where the bottleneck is. AGP already has huge bandwidth, more than enough for today's applications.


Lol, are those Nvidia cards always that big? That card takes up another PCI slot.

PCI-X will only make a difference in high-res video editing applications. As Shadow said, that's not where the bottleneck is. AGP already has huge bandwidth, more than enough for today's applications.

Well, by the time it's released and we poor people can actually afford one, 'tomorrow's' games like DOOM 3 will be out. Won't those require cards and bandwidth like this?

Will the PCI thing need a new motherboard?

Yes, if you are talking about PCI Express.


One thing, people: it's PCI-E, as in PCI Express... PCI-X is not the same thing!


Here's what John Carmack wrote about DOOM 3 and the GF6800:

"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The NV40 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance."

Here's what John Carmack wrote about DOOM 3 and the GF6800:

Who cares what John has to say; I wanna know what Marek Spanel has to say about this card.

Here's what John Carmack wrote about DOOM 3 and the GF6800:

Who cares what John has to say; I wanna know what Marek Spanel has to say about this card.

LOL

Well, now that you mention it, let's hear it, Marek.


Flashpoint will eat this card for breakfast, as it has every other new card, though with this one I'm sure you'll be able to set the FSAA a bit higher and get a prettier picture.

Before grabbing this card for OFP, remember all the people who rushed out to buy a Radeon 9800 expecting to be able to set their view distance to 5000 in OFP.


Besides, OFP taxes the CPU much more than the GPU.
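A crude way to see this for yourself (a hypothetical sketch, not anything OFP ships with): drop the rendering resolution and watch the frame rate. If it barely moves, the GPU wasn't the limiting factor.

    # Toy heuristic: a GPU-bound game speeds up a lot at lower resolution;
    # a CPU-bound game (like OFP with long view distances) barely changes.
    # The FPS numbers below are invented for illustration.
    def likely_cpu_bound(fps_high_res: float, fps_low_res: float,
                         tolerance: float = 0.10) -> bool:
        """True if the frame rate is roughly flat across resolutions."""
        return (fps_low_res - fps_high_res) / fps_high_res < tolerance

    print(likely_cpu_bound(fps_high_res=38.0, fps_low_res=41.0))  # True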


Damn, I am so sick and tired of these rapid developments. I just bought a new top-of-the-line computer and I already have to worry about whether I will be able to play games in 12 months!

It is just going too fast... I don't remember which article it was, but I recently read one in which the interviewee said that for their game under development, the Radeon 9800 Pro would only be able to run the game at 800x600 with all the special features disabled! grrrrrr


Bah, Nvidia only released the FX series so they would have a DX9 competitor to ATI. As far as GPU performance goes, the FX series didn't seem to improve much on the GF4 Ti series anyway; I would rather have a Ti4600 than an FX5200, for sure.

But they did make them very expensive, and now they throw this out of nowhere.

If I had changed graphics cards with every new series, the FX still wouldn't have run any new games well, so I'm glad I didn't upgrade.

Yeah, you've got problems? I've got a GeForce4 MX 440!

Lol... same here. I run OFP with a 2000+ m view distance at 40+ FPS, no problem. After a good look through some of the benchmarks and reviews, it seems the main benefits come at higher resolutions (1280x768 and above). It seems a waste of money to buy a card that can do double the frame rate of any other card at 1600x1200 if your monitor doesn't support it.
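Just counting pixels shows why the gap only opens up at the top end (a rough sketch; real scaling also depends on shader and memory load):

    # Fill-rate demand grows roughly linearly with pixels per frame, so a
    # card that's fast enough at 1024x768 has little left to prove there.
    for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
        print(f"{w}x{h}: {w * h / 1e6:.2f} Mpixels per frame")

1600x1200 pushes four times the pixels of 800x600, which is where the extra fill rate actually gets used.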


Damn... I just bought my girlfriend a PC that gets an OFP CPU benchmark of 8500+, while mine gets a lousy 4500. I reckon I'll get her this new card and finally move in with her.

(Needless to say, I'm a poor bastard, but her aunt subsidises her PC.)


I'm still on my 2½-year-old Ti4600... it runs okay with OFP.

//Edit

Oh, and I never had any problems with GeForce... first a GF2 MX, then a GF4 Ti4600... I never experienced any problem running any game whatsoever; my next card will be Nvidia for sure.


I had the GeForce 256 for two weeks before upgrading to the GeForce 256 DDR, then the GF3, and finally a GF4 Ti4600. Now I'm looking at the GF6800U, simply because I have only had positive experiences with Nvidia and their driver support.

Okay, so I won't see an improvement in OFP, but what about LOMAC and Far Cry?

I'm sick and tired of getting a slightly blurry image in the above-mentioned games, running them at 1024x768 on a 1280x1024 LCD (which makes the image quality a little "soft around the edges").

I've never had an ATI card, and it doesn't look like I'll end up with one this time either. Let's face it, the X800 series won't be 2x faster than the 6800U.
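On the blurry-LCD point, the arithmetic backs him up (a quick sketch):

    # 1024x768 stretched onto a 1280x1024 panel: neither axis scales by an
    # integer factor, and the two factors don't even match (4:3 signal on a
    # 5:4 panel), so every source pixel smears across fractional screen pixels.
    panel_w, panel_h = 1280, 1024
    src_w, src_h = 1024, 768
    print(f"horizontal scale: {panel_w / src_w:.3f}")  # 1.250
    print(f"vertical scale:   {panel_h / src_h:.3f}")  # 1.333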

Maybe when these come out we can finally play OFP with "Very High" terrain detail selected!

Nope. OFP is limited by your bus and CPU, I'm afraid.

But BIS is working on this for OFP2, so we shouldn't see the same limitations there.

Now I'm looking at the GF6800U, simply because I have only had positive experiences with Nvidia and their driver support.

Still living in the past? Get over it; ATI's driver support is now at least as good as Nvidia's, if not better.

I've never had an ATI card, and it doesn't look like I'll end up with one this time either. Let's face it, the X800 series won't be 2x faster than the 6800U.

Why would it have to be 2x faster than the 6800U? I would choose the 6800U if they turn out to be pretty much equal in performance, because the 6800U supports SM 3.0. But you would not pick the X800XT unless it was twice as fast as the 6800U?! SM 3.0 will not make THAT big a difference, you know.
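For context, the headline SM 3.0 feature is dynamic branching in pixel shaders. A toy cost model (pure illustration in Python, not shader code; the instruction counts and the 30% figure are invented) shows what it buys in principle, though early hardware branch overheads ate into this in practice:

    # SM 2.0-style shaders effectively evaluate both sides of a branch and
    # select a result; SM 3.0 dynamic branching only pays for the taken path.
    CHEAP, EXPENSIVE = 4, 40        # hypothetical instruction counts per path
    expensive_fraction = 0.3        # hypothetical share of pixels taking it

    sm2_cost = CHEAP + EXPENSIVE    # both paths, every pixel
    sm3_cost = (1 - expensive_fraction) * CHEAP + expensive_fraction * EXPENSIVE

    print(f"SM 2.0-style: {sm2_cost} instructions/pixel")      # 44
    print(f"SM 3.0-style: {sm3_cost:.1f} instructions/pixel")  # 14.8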

Still living in the past? Get over it; ATI's driver support is now at least as good as Nvidia's, if not better.

Why fix it if it ain't broken?

Why would it have to be 2x faster than the 6800U? I would choose the 6800U if they turn out to be pretty much equal in performance, because the 6800U supports SM 3.0. But you would not pick the X800XT unless it was twice as fast as the 6800U?! SM 3.0 will not make THAT big a difference, you know.

And don't take me literally.

