SpongeBob

NVIDIA launches GeForce 6 Series


NVIDIA Launches GeForce 6 Series

Biggest Performance and Feature Leap in Company History

Quote:

http://biz.yahoo.com/prnews/040414/sfw069_1.html

Wednesday April 14, 9:00 am ET

3D Gaming Performance, Ultra-Realistic Graphics and On-Chip Video Processor Distinguish New Desktop GPUs

SANTA CLARA, Calif., April 14 /PRNewswire-FirstCall/ -- NVIDIA Corporation (Nasdaq: NVDA - News), the worldwide leader in visual processing solutions, introduced today the NVIDIA® GeForce 6800 models of graphics processing units (GPUs) for high-performance desktop computers. The NVIDIA GeForce 6 Series, which includes the flagship GeForce 6800 Ultra and GeForce 6800, is designed to deliver:

- Industry-leading 3D performance -- a new superscalar 16-pipe architecture delivers more than twice the performance of NVIDIA's current industry-leading GPUs

- New features, including the Microsoft DirectX® 9.0 Shader Model 3.0 feature set -- for ultra-realistic cinematic effects

- An unprecedented on-chip video processing engine -- enabling high-definition video and DVD playback

"This is the biggest generation-to-generation performance leap that we have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO of NVIDIA. "In addition to the raw performance increase, we had two fundamental strategies with the 6800 models. First was to take programmability to the next level with the industry's only GPU with Shader Model 3.0. Second was to extend the reach of GPUs to the consumer electronics market with a powerful and fully programmable video processor capable of multiple video formats and 'prosumer' level image processing."

"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The NV40 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance," said John Carmack, president and technical director of id Software

Revolutionary New GeForce 6800 Architecture

The innovative 3D graphics architecture of the GeForce 6800 GPUs introduces many key new technologies that will drive the image quality and performance in 3D gaming, including:

- Blazingly Fast Graphics Performance: With a revolutionary new 16-pipeline superscalar architecture and the world's fastest GDDR3 memory, GeForce 6800 GPUs offer a staggering amount of graphics horsepower. Gaming performance is supercharged with eight times the floating-point shader power, four times the shadow-processing power, and double the vertex-processing power of last generation's award-winning GeForce FX architecture.

- Revolutionary New Feature Set: Built with full Microsoft® DirectX® 9.0 Shader Model 3.0 support, the GeForce 6 Series was designed to deliver ultra-realistic next-generation DirectX 9 titles. The GeForce 6800 GPUs give developers unlimited programmability, 32-bit shader precision, displacement mapping, geometry instancing, and other features unique to the GeForce 6 Series. In addition, the new GPUs deliver full floating-point support throughout the entire pipeline, vastly improving the quality of images in motion with floating-point filtering. The new rotated-grid antialiasing feature removes jagged edges for incredible image quality, and a new 16x anisotropic filtering mode adds clarity to textures.

"The NVIDIA GeForce 6 Series is a great leap forward in PC graphics, bringing next-generation Microsoft DirectX 9 rendering performance to new heights while continuing forward with the high-definition floating-point rendering, driver performance, and stability NVIDIA is famous for," stated Tim Sweeney, Founder EPIC Games. "As the first GPU supporting the new Pixel Shader 3.0 programming model, the GeForce 6 Series enables games to use entirely new rendering approaches and obtain higher performance per-pixel lighting, shadowing, and special effects than previously possible."

"We appreciate NVIDIA"s continued support of Microsoft DirectX," said Chris Donahue, Lead Technical Evangelist of the Windows Graphics and Gaming Team at Microsoft. "We are eager to experience the hardware implementation NVIDIA® GeForce 6 Series promises to bring to the market."

World's first on-chip video processor: GeForce 6800 GPUs have a programmable video processing engine that delivers stunning high-definition video playback. Featuring support for MPEG encode and decode, as well as support for Windows Media Video 9 (WMV9), the GeForce 6800 delivers an amazing video experience when used with the Microsoft® Windows® Media Center Edition operating system. The GeForce 6800 also enables high-quality video playback at any window size, and has an integrated TV encoder for direct-to-TV playback.

"Adobe is very excited about the programmable video features in these new NVIDIA GPUs. Many Adobe customers have already been enjoying the power of NVIDIA GPUs to create sophisticated 3D animations for the past year. Our future products will take even more advantage of the new capabilities being unveiled in these GPUs," said David Trescot, senior director, Digital Video Products at Adobe.

GeForce 6800 GPUs benefit from the renowned compatibility and reliability of the NVIDIA® ForceWare software featuring the NVIDIA Unified Driver Architecture (UDA). In addition, NVIDIA ForceWare delivers industry-leading software features such as custom game profiles, HDTV output, overclocking, multi-display support, and much more.

Availability

The first GPUs based on the NVIDIA GeForce 6 Series, the GeForce 6800 Ultra and GeForce 6800 models, are manufactured using IBM's high-volume 0.13-micron process technology and are currently shipping to leading add-in-card partners, OEMs, system builders, and game developers.

Retail graphics boards based on the GeForce 6800 models are slated for release in the next 45 days.

This might be the card I'll be playing OFP2 on. :D

Now make sure you ATI users argue about how much ATIs are better than the GF6. :D


Cool, but I always buy my video cards a generation behind, because new-generation cards are quite expensive and are overshadowed relatively quickly by the next generation, so they become cheap fast. And though the dollar is low at the moment, my strong euro lets me import US products cheaper than ever.


Sweeet... not about the card, though; I'll probably never get one of those. What I'm excited about is the sudden price drop we're going to see on other cards like the Radeon 9800 Pro when it hits the shelves. ;)

Quote: Now make sure you ATI users argue about how much ATIs are better than the GF6. :D

There is nothing to argue about yet. We have to wait a couple of weeks until reviews of the X800 Pro/XT come up. It wasn't a very big surprise that this card would beat all the cards on the market today, since it's a completely new generation of graphics card, but I am surprised that it was this good. They fixed the poor AA that the FX series had, and the shader performance is greatly improved. It looks like a great card!

Will it beat the X800XT? Who knows... The X800XT will have a 16x1 architecture and a core speed somewhere between 500 and 600 MHz (the GF 6800 "only" runs at 400 MHz), with a 1200 MHz memory speed.

The GF has one advantage though: it supports Shader Model 3.0 (PS3.0/VS3.0). If the rumors (not sure if it's confirmed yet) are true, then the ATI card will not. I think NVIDIA will push for the use of SM3.0; Far Cry already supports it with the new patch.
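For the curious, here's a minimal sketch of how a DirectX 9 game can detect SM3.0 support at runtime, using the standard D3D9 caps query (error handling omitted; this is just the detection step, not a full render path):

[code]
// Minimal sketch: detect Shader Model 3.0 support via the standard
// Direct3D 9 caps query (link against d3d9.lib).
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // PS3.0/VS3.0 hardware like the GeForce 6800 reports version 3.0 here.
    if (caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
        caps.VertexShaderVersion >= D3DVS_VERSION(3, 0))
        printf("Shader Model 3.0 path available\n");
    else
        printf("Falling back to SM2.0 shaders\n");

    d3d->Release();
    return 0;
}
[/code]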


Hey boys, I am pretty sure the next ATI card doesn't use a 16x1 pipeline but 8x2, as NVIDIA only leaked the info that their card was 16x1 when it would have been too late for ATI to adopt that in their own card.

You guys should check the benchmarks in games already; this card is going to be hard to beat... It is insane. I am so going back to NVIDIA!
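To put the 16x1 vs. 8x2 debate into numbers, here's some back-of-the-envelope fillrate arithmetic. Only the 6800 Ultra's 400 MHz clock is confirmed; the 500 MHz 8x2 part is purely hypothetical:

[code]
// Rough fillrate arithmetic for the 16x1 vs. 8x2 pipeline debate.
//   pixel fillrate = core clock x pixel pipelines
//   texel fillrate = pixel fillrate x texture units per pipe
#include <cstdio>

int main() {
    const double nv40 = 400e6;   // GeForce 6800 Ultra core clock, 16x1
    const double r420 = 500e6;   // assumed clock for a rumored 8x2 design

    printf("16x1 @ 400 MHz: %.1f Gpixels/s, %.1f Gtexels/s\n",
           nv40 * 16 / 1e9, nv40 * 16 * 1 / 1e9);
    printf(" 8x2 @ 500 MHz: %.1f Gpixels/s, %.1f Gtexels/s\n",
           r420 * 8 / 1e9, r420 * 8 * 2 / 1e9);
    return 0;
}
// At these clocks, 16x1 wins on raw pixels per second (6.4 vs 4.0) while
// 8x2 leads on multitextured fillrate (6.4 vs 8.0) -- the layout matters.
[/code]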


Sure, sure, but the price must be ridiculous. I'll wait for the price of the top FX cards to drop, and then I may consider upgrading my poor old Ti4200...


Nice! Not that I could afford one when it comes out, but it will at least make the lower-spec cards cheaper.

Guest

I wonder how hot the thing is...


Well, I don't think it is going to be as hot. First, it is not clocked as high as its predecessors. Second, if it were that hot, the reviewers would have said so, as they did last time.


Yeah, pretty impressive looking at these benchmarks:

[benchmark chart: sc1600.gif]

The GF6800 really kicks the other cards' arses at high resolutions.

[benchmark charts: tr1600.gif, lomac1600.gif, halo1280.gif, q3high1600.gif]

Quote:

Conclusion

So far, NVIDIA's GeForce 6800 Ultra is shaping up to be a real winner. NVIDIA is the first to adopt Shader Model 3.0 with the GeForce 6800 Ultra, and has already lined up the following titles:

Lord of the Rings, Battle For Middle-earth

STALKER: Shadows of Chernobyl

Vampire: Bloodlines

Splinter Cell X

Tiger Woods 2005

Madden 2005

Driver 3

Grafan

Painkiller

FarCry

It remains to be seen how far some of the developers of these titles will go with their 3.0 implementations, but it's an interesting development nonetheless, as the first Shader Model 3.0 titles will hit the market much sooner than those of previous shader models did. Technically, the first title, Far Cry, is already shipping (although Microsoft hasn't released DX9.0c yet).

NVIDIA has also addressed its anti-aliasing quality by adding a new rotated-grid anti-aliasing sampling pattern. We definitely saw the IQ improvements in our AA testing.

Clearly, when it comes to performance, the GeForce 6800 Ultra is in a class of its own. In many cases the GeForce 6800 Ultra outperforms its nearest competitor by a factor of nearly two, and keep in mind that this is with early drivers; performance will only get better from here.

There is, however, reason to reserve judgment. ATI has yet to step up to the plate with R420. Given their rather remarkable silence over the past few months, a silence not unlike the one preceding the Radeon 9700 line, it may be that they can surprise us. Perhaps they're holding off on final specifications until they see these GeForce 6800 Ultra numbers and know what they have to beat, or perhaps they're not sure if R420 can deliver. We'll soon find out, maybe even as early as today, which marks ATI's second Shader Day event.

Another factor that may play against GeForce 6800 Ultra is its extraordinary power requirements. With its two Molex power connector requirement, and NVIDIA’s recommendation of a 480-watt power supply, many enthusiasts will have to upgrade their power supply unit in order to run GeForce 6800 Ultra. This could make a whole lot of money for PSU manufacturers like Antec and Enermax, or it could lead enthusiasts to settle for GeForce 6800. We’ll just have to wait and see how that part plays out.

You won’t have to wait much longer, fortunately. The first GeForce 6800 and 6800 Ultra boards should be hitting retail a little over a month from now. Expect to see us cover the first wave of GeForce 6800 boards as soon as they’re available!


Hehe, who said NVIDIA was the only one playing dirty? :P

I have a feeling the Radeon X800 series will either not have PS3.0 at all or will have poor PS3.0 performance...

Here's a leak from an internal ATI note, quoted from Tom's Hardware:

While NVIDIA is loudly and proudly advertising the new shader model, ATI is attempting to downplay it. A wonderful and quite humorous example of this tactic is a developer presentation titled "Save the Nanosecond" by ATI's Richard Huddy, which was accidentally leaked on the internet. The trouble was, the presentation still sported some personal notes which made it clear that the presenter was trying to convince developers to stay away from flow control in PS3.0, as it incurs a very tangible performance hit.

Quote: "Steer people away from flow control in ps3.0 because we expect it to hurt badly. [Also it's the main extra feature on NV40 vs R420 so let's discourage people from using it until R5xx shows up with decent performance...]"
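The performance hit has a simple mechanical explanation: GPUs shade pixels in fixed-size batches, and a branch taken by any pixel in a batch is effectively paid by the whole batch. Here's a toy model of that effect; the batch size and instruction costs are invented purely for illustration:

[code]
// Toy model of why dynamic flow control can hurt on early SM3.0 hardware:
// if any pixel in a batch takes the expensive branch, the whole batch
// pays for it. All numbers below are made up for illustration.
#include <cstdio>
#include <cmath>

int main() {
    const int    BATCH  = 64;     // pixels shaded together (illustrative)
    const double CHEAP  = 10.0;   // instructions on the cheap path
    const double COSTLY = 100.0;  // instructions on the expensive path
    const double p      = 0.10;   // fraction of pixels needing the costly path

    // Ideal per-pixel cost if each pixel could branch independently:
    double ideal = p * COSTLY + (1.0 - p) * CHEAP;

    // With batch-granularity branching, a batch runs the costly path if
    // any of its pixels needs it: probability 1 - (1-p)^BATCH.
    double hit    = 1.0 - pow(1.0 - p, BATCH);
    double actual = hit * COSTLY + (1.0 - hit) * CHEAP;

    // ~19.0 vs ~100.0: scattered costly pixels make the branch nearly useless.
    printf("ideal: %.1f instr/pixel, batched: %.1f instr/pixel\n",
           ideal, actual);
    return 0;
}
[/code]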


Ach, an external power supply, 3dfx-style, would have been a better solution than wasting two Molex connectors and requiring a 480 W power supply. o_O

Then again, I just went GF4 Ti, so I'm not going to see that card anytime soon. :P


Well, it seems the odd GeForce generations suck and the even ones are good, so the GF6 could be a winner. :P

But I will wait to see how the ATI compares, most probably buy a now-cheap 9800, and think about those cards again in a year or two.


Seeing as I only change video cards every two years or so, I want one that can live that long. I've had the GF4 Ti4600 for two or three years (can't exactly remember); it's about time I upgraded. The way I see it, an R9800XT or FX5950U will not be "enough" for games two years from now.

I'll probably get a 6800U or X800XT in two to four months. It depends on price/performance and availability for me.


Looks interesting. I hope the Radeon X800 will be up to the challenge. I was planning to buy a new graphics card; looks like I will wait a month or two.


The people over at Beyond3D are certain that it (the X800XT) will have a 16x1 architecture, and they are usually right. I would not decide on a new card just yet; if it has a core speed of 600 MHz, it could very well turn out to be faster than the 6800U. We will see in a couple of weeks...


Pure core speed doesn't mean that much anymore. It's how much you can do per clock cycle.

Look at Intel and AMD.

I expect the X800 to be similar to the 6800. Competition has always been tight between NVIDIA and ATI.

The question, I suppose, is which will be cheaper.


Looking to be a great card! But I will probably hold off on upgrading or buying a new computer till sometime after the summer.

And the two-Molex-connector thing is very silly indeed. An external power supply would be much better, in my opinion.

Quote: Pure core speed doesn't mean that much anymore. It's how much you can do per clock cycle.

True, but with a 16x1 architecture I don't believe the X800XT will be that much slower clock for clock... but who knows.


Wait for the PCI Express version to come out. That shouldn't need two Molex connectors on it, as it can pull far more power from the slot than AGP.

Think I'll wait till the new boards come out and prices settle, then I'll buy a whole new setup. PCI Express does look nice though, and is long overdue.

Quote: PCI Express does look nice though, and is long overdue.

And that's not where the bottleneck is.

I can't remember where I read it, but PCI Express will not make much (if any) difference with the current cards.

There was something about PCI Express not being useful until full 64-bit systems were a reality.
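For what it's worth, here's the nominal peak bandwidth arithmetic; these are rough paper figures, counting only 8b/10b coding overhead on the PCIe side:

[code]
// Nominal peak bus bandwidth, back of the envelope.
#include <cstdio>

int main() {
    // AGP 8x: 66 MHz base clock, 8x data rate, 32-bit (4-byte) bus, shared.
    double agp8x = 66.67e6 * 8 * 4;            // ~2.1 GB/s total

    // PCIe 1.0 x16: 2.5 Gbit/s per lane, 8b/10b coding (80% efficient),
    // 16 lanes, full duplex (same rate in each direction).
    double pcie16 = 2.5e9 * 0.8 / 8.0 * 16;    // ~4.0 GB/s per direction

    printf("AGP 8x:   %.1f GB/s\n", agp8x  / 1e9);
    printf("PCIe x16: %.1f GB/s each way\n", pcie16 / 1e9);
    return 0;
}
// On paper PCIe roughly doubles AGP 8x, but if the bus isn't the
// bottleneck for current cards, the extra headroom goes unused.
[/code]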

