multimedia 0 Posted March 6, 2002 I have read several reviews on the subject, and they all state that although it offers raw power, the GF4 MX is just a boosted GF2. It doesn't have the nfiniteFX engine and it doesn't fully support DirectX 8. "If you're content with DirectX7 gaming, the GeForce4 MX will certainly get the job done, but a GeForce3 Ti board will end up having longer legs." That's one comment on the matter... Now I'm puzzled: what exactly will I miss if I play a DirectX 8 game on a GF4 MX compared to a GF3 Ti 200 (which gives approximately the same frame rates)? Anyone?
Gorgi Knootewoot 0 Posted March 6, 2002 I read in an article that the GF4 MX was somewhat slower than the GF3 Ti500. It also doesn't have the nfiniteFX II engine, which the GF4 Ti versions will have. It's just like the GF2 MX: they suck. I am waiting for the GF4 Ti 4600.
JAP 2 Posted March 6, 2002 GF3 Ti all the way. Or wait for a GF4 Pro or something; don't spend your money on a GF4 now! Wait a month or two.
Gorgi Knootewoot 0 Posted March 6, 2002 Quote (JAP @ Mar. 06 2002,12:20): "GF3 Ti all the way. Or wait for a GF4 Pro or something; don't spend your money on a GF4 now! Wait a month or two." The GF4 Ti series are very fast and good. They will be released this month, and the fastest is the Ti4600. I'm buying it this month. First I wanted the GF3 Ti500, but I was willing to wait a few more months. I cannot wait any longer. If I wait 2-3 more months, yes, better cards will appear. But that happens almost every month.
JAP 2 Posted March 6, 2002 Quote (Gorgi Knootewoot @ Mar. 06 2002,12:54): "The GF4 Ti series are very fast and good. They will be released this month, and the fastest is the Ti4600. I'm buying it this month. First I wanted the GF3 Ti500, but I was willing to wait a few more months. I cannot wait any longer. If I wait 2-3 more months, yes, better cards will appear. But that happens almost every month." The waiting is for the price to come down a little! A GF (like everything) is overpriced for the first two months.
multimedia 0 Posted March 6, 2002 I could get an MSI GF4 MX 440 for 170 Euros by mail order, and my local reseller offered it for 180 Euros. I could already get any of the Ti series, but the price is 400 to 600 Euros, which is too much for me. I've been waiting for GF3 prices to come down, but because of that difference between the GF4 MX and the GF3 they haven't, and I don't expect them to.
DodgeME 0 Posted March 6, 2002 Well, GF3 Ti for sure. The GF4 MX cards are GF2-class cards based on the NV17, with very high clock speeds and some new features. Also, the GF4 MX is not DX8 compatible. A GF3 Ti500 would blow away any GF4 MX. I'd recommend the GF4 Ti 4200 when it arrives.
multimedia 0 Posted March 6, 2002 ..so nobody actually knows the difference between the GF4 MX and the GF3/GF4 Ti if we're not talking about frame rates? Is it better graphics? If so, what exactly is better? Are there graphical features you'll miss with the GF4 MX? Does anybody know a web page with screenshots that would clarify the situation? ..and YES, I would buy a "GeForce 4 Ti 46220000 12800Mb VIVO Golden Sample ExtraOrdinaire AGP+ DDR SDRAM" if I had the money, but I DON'T!
Gorgi Knootewoot 0 Posted March 6, 2002 The GF4 Ti4200 and Ti4400 will be better than the GF3 Ti500, and cheaper. Only the Ti4600 will be more expensive. I will buy that one, and not wait for the price to drop. I must play JKII: Jedi Outcast with this baby.
GeForce4 Ti 4600 - Vertices per Second: 136 Million; Fill Rate: 4.8 Billion AA Samples/Sec.; Operations per Second: 1.23 Trillion; Memory Bandwidth: 10.4GB/Sec.; Maximum Memory: 128MB
GeForce4 Ti 4400 - Vertices per Second: 125 Million; Fill Rate: 4.4 Billion AA Samples/Sec.; Operations per Second: 1.12 Trillion; Memory Bandwidth: 8.8GB/Sec.; Maximum Memory: 128MB
GeForce4 MX 460 - Fill Rate: 1.2 Billion Texels/Sec.; Triangles per Second: 38 Million; Memory Bandwidth: 8.8GB/Sec.; Maximum Memory: 64MB
GeForce4 MX 440 - Fill Rate: 1.1 Billion Texels/Sec.; Triangles per Second: 34 Million; Memory Bandwidth: 6.4GB/Sec.; Maximum Memory: 64MB
GeForce4 MX 420 - Fill Rate: 1 Billion Texels/Sec.; Triangles per Second: 31 Million; Memory Bandwidth: 2.7GB/Sec.; Maximum Memory: 64MB
GeForce3 Ti 500 - Graphics Core: 256-bit; Memory: 64MB; Fill Rate: 3.84 Billion AA Samples/Sec.; Operations per Second: 960 Billion; Memory Bandwidth: 8.0GB/Sec.
R. Gerschwarzenge 0 Posted March 6, 2002 Quote (multimedia @ Mar. 06 2002,15:38): "..and YES, I would buy a "GeForce 4 Ti 46220000 12800Mb VIVO Golden Sample ExtraOrdinaire AGP+ DDR SDRAM" if I had the money, but I DON'T!" What? No DVI?
DodgeME 0 Posted March 6, 2002 Well, GF3 prices went way down, mate. As I have said, the GF3 has vertex shaders and DX8/8.1 support; the two cards will perform the same in DX7 games, but the GF3 will pull ahead in DX8 games. Overall better rendering power. I would easily say that a GF3 card looks better than anything based on the NV17. The GF4 MX is a GF2 card with higher clock settings and a new FSAA mode that works better than the GF3's. There are NO vertex shaders; if you plan on buying Unreal 2, you will be missing out. There is also the new Lightspeed Memory Architecture II, which they say saves more memory bandwidth. To cut it short: go for a GF3!!!
Gorgi Knootewoot 0 Posted March 6, 2002 He should dodge you. The MX version doesn't have these features, but the GF4 Ti version does. He should go for the GF4 Ti4400 or the Ti4600, both of which have vertex shaders.
Gorgi Knootewoot 0 Posted March 6, 2002 http://www.pcw.co.uk/Products/Hardware/1129011 Pity those who, five months ago, shelled out for a GeForce3 TI500. Having faced tougher than expected competition from ATI's Radeon 8500, it's already yesterday's chip. With almost indecent haste, nVidia has a new flagship product, and one that's destined to stop ATI crowing. The GeForce4 TI4600 is, without any doubt, the most ludicrously, unbelievably speedy Graphics Processing Unit (GPU) in all creation. In fact, if big numbers are all that matter you can save yourself some time right now - just look at the graphs and prepare to extend your overdraft. Our Geforce4 reference board produced staggering results on our test system. At any resolution, with or without Full-Scene Anti-Aliasing (FSAA), where jagged edges in games are smoothed, making scenes look more realistic, the Geforce4 TI4600 whips any other card going. In fact, if you want 1,600 x 1,200 with 4x FSAA switched on, it's the only board that can handle the job. That's just one benefit of having a whacking 128Mb of DDR memory onboard. Some of this performance increase can be put down to clock and memory speed increases, with the former now set at 300MHz and the memory clocked at 325MHz. Because it's DDR memory, this effectively runs at 650MHz. However, the Geforce architecture hasn't been left untouched. First, the NfiniteFX 3D engine has been overhauled, with a new dual Vertex Shader pipeline and an enhanced Pixel Shader, resulting in a theoretical threefold increase in geometry performance and a 50 per cent speed increase in Pixel Shader operations. In most current 3D applications you won't see quite that much difference but, in future applications that make heavy use of Pixel and Vertex Shaders, you should see NfiniteFX II pulling the TI4600 even further away from the competition.
Meanwhile, an improved version of the GeForce3's Lightspeed Memory Architecture means that bandwidth between the 300MHz GPU and the 650MHz DDR memory chips reaches a staggering 10.4GB/sec - a significant move up from the TI500's 8GB/sec. Not only is the pipeline fatter, but it's also running more efficiently thanks to improved Z-Culling techniques (where the card doesn't bother drawing what you can't see) and a system of independent caches managing the flow of the 3D data. Now, all of this isn't too exciting if it just means a jump from 70fps (frames per second) to 100fps when Unreal 2 is launched, but nVidia is stressing that the sheer amount of power on offer makes previously unavailable effects - realistic fur, for example - very possible. Whether furry varmints actually start making an appearance on our desktops next year is, of course, a matter for developers of 3D games. Luckily, there is one improvement that will make a tangible difference now: Accuview Anti-Aliasing. Previously, if you turned anti-aliasing on at high resolutions, your game slowed to a crawl. With the TI4600, you still suffer a performance hit when switching FSAA on - Quake III Team Arena drops 30fps at 1,280 x 1,024 with 2x FSAA - but it's a manageable one, and pushing it up to 4x FSAA still keeps you within playable levels. In fact, you can run Return to Castle Wolfenstein at 1,600 x 1,200 with 4x FSAA switched on and still play at 50fps. And yes, it looks gorgeous! Sadly, we were unable to test the new, high-quality 4XS mode, which promises to sharpen the image further with finer colour gradations combined with high-detail anisotropic filtering, but the final drivers should make that possible. Dual-monitor technology is also improved and nVidia is now finally up to Matrox's high standards in this department. The desktop manager is now smart enough to pop up dialogue boxes where it's told, rather than in the centre of the two monitors.
Our preview card arrived with three connectors on it: the standard D-Sub, an S-Video TV-out, and a digital DVI connector. So, one to rush out and buy? Well, if you're putting together the ultimate games machine, then yes. Otherwise, be patient. Even a year after the release of the GeForce3 there are very few titles that really demand that level of performance, while Pixel and Vertex Shaders have yet to make any serious impact on games. All this good stuff is coming, but at the moment you'll be paying around £350 to watch some spectacular demos run. For now, waiting seems the sensible option. Contact: Nvidia www.nvidia.com This article will appear in the April issue of Personal Computer World, along with a review of nVidia's new £100 graphics card, the Geforce 4 MX. The April issue is available from newsagents on 28 February
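The bandwidth figures the review quotes fall straight out of the memory clocks: peak bandwidth is the effective (double-data-rate) clock multiplied by the bus width in bytes. A minimal sketch of that arithmetic, assuming the 128-bit memory bus both chips use and a 250MHz (500MHz effective) memory clock for the Ti500, which is not stated in the article itself:

```python
# Sanity-check of the memory bandwidth figures quoted in the review.
# Peak bandwidth = effective memory clock (transfers/sec) * bus width (bytes).
# 1 GB is taken as 10**9 bytes, as the marketing numbers count it.

def bandwidth_gb_per_s(effective_clock_mhz: float, bus_width_bits: int = 128) -> float:
    """Peak memory bandwidth in GB/s for a given effective DDR clock and bus width."""
    bytes_per_transfer = bus_width_bits / 8          # 128-bit bus -> 16 bytes per transfer
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

# GeForce4 Ti4600: 325MHz DDR memory -> 650MHz effective
print(bandwidth_gb_per_s(650))   # 10.4 GB/s, matching the article
# GeForce3 Ti500: assumed 250MHz DDR -> 500MHz effective
print(bandwidth_gb_per_s(500))   # 8.0 GB/s, matching the Ti500 figure
```

The same formula explains the MX 440's 6.4GB/s (400MHz effective across 128 bits), so the MX cards' deficit is a memory-clock difference, not a narrower bus.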
Gorgi Knootewoot 0 Posted March 6, 2002 Our first simple graph shows Quake III performance. As one would expect, the GeForce3 Ti200 with its slow standard core and memory clock speeds of 175/400 does not give such a good showing for itself, while the GeForce4 MX 460, with core and memory clocks of 300/550, shows how dependent everything is on clock speed. It must be remembered that the GeForce3 Ti200 can be amply overclocked for far better performance. Our second simple graph shows Unreal Tournament performance. Here, on the other hand, the tables are turned and the GeForce4 MX 460 shows its lack of processing power. It's good to see the GeForce3 Ti200 performing much more as you would expect. Thoughts: the GeForce4 Titanium range shows a sufficient performance difference over the GeForce3 Titanium range, but one can hardly be impressed by the performance difference between the GeF4 Ti4600 and the GeF4 Ti4400. Considering the GeF4 Ti4600 will probably cost about $100 more, it hardly seems worth the extra over the Ti4400. As for the GeF4 MX 460: no matter how cheap this is, we could in all fairness not recommend it. Much better to buy either a GeForce3 Titanium card or a GeForce4 Ti4400; you would always be kicking yourself otherwise. One is reminded just how far graphics processors have progressed in only a few years when you think back to the initial GeForce cards. What will they be like in a few years? From this you can see that the GF3 is better than the GF4 MX series, but the GF4 Ti series are the better ones.
Ex-RoNiN 0 Posted March 6, 2002 I will wait for the GeForce 4-Ti range to come out, then I'll grab myself a then-cheap GeForce 3-Ti 200 and overclock it
multimedia 0 Posted March 7, 2002 Thanks Gorgi and others.... Quote: "So, one to rush out and buy? Well, if you're putting together the ultimate games machine, then yes. Otherwise, be patient. Even a year after the release of the GeForce3 there are very few titles that really demand that level of performance, while Pixel and Vertex Shaders have yet to make any serious impact on games. All this good stuff is coming, but at the moment you'll be paying around £350 to watch some spectacular demos run. For now, waiting seems the sensible option." So I bought a GF4 MX, and reading the quote above, it now makes sense. (Thanks again..) So within a year or two I'll be forced to upgrade my GF4 MX to a GF4 Ti (GF3 Ti500), and in the meantime I won't be missing a thing! And when the time comes, the GF4 Ti series will sell for 200-400 Euros. Well, how did my new card perform? I tried GR, as I don't have 3DMark. With every feature on high detail except shadows, at a resolution of 1150 x 840 x 32, the game ran smoooothhhh....... (You must remember that previously I had a GF 256 DDR)
Icabola 0 Posted March 7, 2002 For your reference (Video Cards and Graphics), lots of discussions there: http://discussions.hardwarecentral.com/cgi-bin....tLogin= Regards, Icabola