GeForce 3 series
{{Short description|Series of GPUs by Nvidia}}
{{for|GeForce cards with a model number of 3X0|GeForce 300 series}}
{{Use mdy dates|date=October 2018}}
{{Infobox GPU
| name = GeForce 3 series
| image = File:Nvidia GeForce 3 Series Logo.png
| codename = NV20
| created = {{start date and age|February 27, 2001}}
| model1 = GeForce 3 series
| model2 = GeForce 3 Ti series
| midrange = Ti 200
| highend = GeForce 3 (original), Ti 500
| d3dversion = Direct3D 8.0 <br />Vertex Shader 1.1 <br />Pixel Shader 1.1
| openglversion = OpenGL 1.3
|predecessor = GeForce 2 (NV15)
|successor = GeForce 4 Ti (NV25)
| support status = Unsupported
|architecture=Kelvin
}}
The GeForce 3 series (NV20) is the third generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in February 2001,{{Cite web |last=Witheiler |first=Matthew |date=July 25, 2001 |title=NVIDIA GeForce3 Roundup - July 2001 |url=https://www.anandtech.com/show/802 |website=Anandtech}} it advanced the GeForce architecture by adding programmable pixel and vertex shaders and multisample anti-aliasing, and by improving the overall efficiency of the rendering process.
The GeForce 3 was unveiled during the 2001 Macworld Conference & Expo/Tokyo 2001 at the Makuhari Messe, where it powered real-time demos of Pixar's Junior Lamp and id Software's Doom 3. Apple later announced that it had secured launch rights for the chip in its new line of computers.
The GeForce 3 family comprises three consumer models: the GeForce 3, the GeForce 3 Ti200, and the GeForce 3 Ti500. A separate professional version, with a feature set tailored for computer-aided design, was sold as the Quadro DCC. A derivative of the GeForce 3, known as the NV2A, is used in the Microsoft Xbox game console.
==Architecture==
The GeForce 3 was introduced three months after Nvidia acquired the assets of 3dfx. Its programmable shaders were marketed as the "nFinite FX Engine", and it was the first Microsoft Direct3D 8.0-compliant 3D card. Its programmable shader architecture enabled applications to execute custom visual-effects programs written in Microsoft shader language 1.1. It is believed that the fixed-function T&L hardware from the GeForce 2 was still included on the chip for use with Direct3D 7.0 applications, as the single vertex shader was not yet fast enough to emulate it.{{Cite web |last=McKesson |first=Jason |title=Programming at Last |url=https://paroj.github.io/gltut/History%20Radeon8500.html}} In terms of pure pixel and texel throughput, the GeForce 3 has four pixel pipelines, each of which can sample two textures per clock. This is the same configuration as the GeForce 2, excluding the slower GeForce 2 MX line.
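As a worked example, the 4×2 configuration translates directly into the theoretical fillrates listed in the specifications table below. The following minimal sketch is illustrative arithmetic only; the clock speeds are the published figures for each board:
<syntaxhighlight lang="python">
# Theoretical fillrates of a 4x2 design: 4 pixel pipelines,
# each sampling 2 textures per clock.
PIPELINES = 4
TEXTURES_PER_PIPE = 2

for model, core_mhz in [("GeForce3 Ti200", 175),
                        ("GeForce3", 200),
                        ("GeForce3 Ti500", 240)]:
    pixels = PIPELINES * core_mhz                      # MPixels/s
    texels = PIPELINES * TEXTURES_PER_PIPE * core_mhz  # MTexels/s
    print(f"{model}: {pixels} MPixels/s, {texels} MTexels/s")
# -> 700/1400, 800/1600 and 960/1920, matching the table below.
</syntaxhighlight>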
To make better use of available memory performance, the GeForce 3 has a memory subsystem dubbed Lightspeed Memory Architecture (LMA). This is composed of several mechanisms that reduce overdraw, conserve memory bandwidth by compressing the z-buffer (depth buffer), and manage interaction with the DRAM more efficiently.
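Nvidia did not publish LMA's internal algorithms, and the real hardware operates on tiles of compressed depth data. As a loose sketch of just one of the ideas involved (rejecting hidden fragments before any shading or texture fetches are performed, so no memory bandwidth is spent on them), with all names invented for illustration:
<syntaxhighlight lang="python">
# Simplified per-pixel illustration of early depth rejection, one idea
# behind overdraw reduction in schemes like LMA.
def draw_fragments(fragments, depth_buffer, frame_buffer, shade):
    """fragments: iterable of (x, y, depth, data); smaller depth = closer."""
    shaded = 0
    for x, y, z, data in fragments:
        if z >= depth_buffer.get((x, y), float("inf")):
            continue  # occluded: rejected before any shading or texture work
        depth_buffer[(x, y)] = z
        frame_buffer[(x, y)] = shade(data)
        shaded += 1
    return shaded  # fragments actually shaded, versus those submitted

depth, frame = {}, {}
# Near fragment drawn first: the far fragment at the same pixel is rejected.
n = draw_fragments([(0, 0, 0.5, "near"), (0, 0, 0.9, "far")], depth, frame, str.upper)
assert n == 1 and frame[(0, 0)] == "NEAR"
</syntaxhighlight>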
Other architectural changes include EMBM support{{Cite web|url=http://ixbtlabs.com/articles/digest3d/0402/itogi-video-gf3.html|title=April 2002 3Digest - NVIDIA GeForce3|first=iXBT|last=Labs|website=iXBT Labs}}{{Cite web|url=https://www.nvidia.com/object/geforce3_faq.html|title=GeForce3 FAQ}} (first introduced by Matrox in 1999) and improvements to anti-aliasing functionality. Previous GeForce chips could perform only super-sampled anti-aliasing (SSAA), a demanding process that renders the image at a larger internal size and then scales it down to the final output resolution. The GeForce 3 adds multi-sample anti-aliasing (MSAA) and a Quincunx anti-aliasing method, both of which perform significantly better than super-sampling at the expense of some image quality. With multi-sampling, the render output units super-sample only the Z-buffers and stencil buffers, using that extra geometric detail to determine whether a pixel covers more than one polygonal object. This spares the pixel/fragment shader from rendering multiple fragments for pixels in which the same object covers all of the sub-pixels. The method fails with texture maps that have varying transparency (e.g. a texture map representing a chain-link fence). Quincunx anti-aliasing is a blur filter that shifts the rendered image a half-pixel up and a half-pixel left in order to create sub-pixels, which are then averaged together in a diagonal cross pattern; this softens jagged edges but also destroys some overall image detail. Finally, the GeForce 3's texture-sampling units were upgraded to support 8-tap anisotropic filtering, compared with the previous 2-tap limit of the GeForce 2. With 8-tap anisotropic filtering enabled, distant textures appear noticeably sharper.
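In hardware the blending operates on sub-pixel samples rather than finished pixels, and Nvidia did not document the exact weighting; the result is commonly described as a 5-tap quincunx pattern that weights the centre sample by ½ and each of the four diagonal neighbours by ⅛. A minimal post-filter sketch of that shape (the function name and the per-pixel formulation are illustrative simplifications):
<syntaxhighlight lang="python">
# Quincunx-shaped blur over a finished grayscale image. The weights
# (1/2 centre, 1/8 per diagonal neighbour) sum to 1, so flat areas are
# unchanged while single-pixel detail is smeared diagonally.
def quincunx_filter(image):
    """image: rectangular 2-D list of brightness values."""
    h, w = len(image), len(image[0])

    def px(x, y):  # clamp coordinates at the image borders
        return image[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    return [[0.5 * px(x, y)
             + 0.125 * (px(x - 1, y - 1) + px(x + 1, y - 1)
                        + px(x - 1, y + 1) + px(x + 1, y + 1))
             for x in range(w)]
            for y in range(h)]
</syntaxhighlight>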
A derivative of the GeForce 3, known as the NV2A, is used in the Microsoft Xbox game console. It is clocked the same as the original GeForce 3 but features an additional vertex shader.{{Cite web |last=Parker |first=Sam |date=May 22, 2001 |title=Inside the Xbox GPU |url=https://www.gamespot.com/articles/inside-the-xbox-gpu/1100-2764090/ |access-date=2025-02-13 |website=GameSpot |language=en-US}}{{cite web |url=https://www.eurogamer.net/articles/article_30419 |title=Microsoft clarify NV2A |website=Eurogamer.net |date=March 28, 2001 |access-date=October 10, 2024 |archive-date=March 20, 2020 |archive-url=https://web.archive.org/web/20200320214850/https://www.eurogamer.net/articles/article_30419 |url-status=live }}[https://web.archive.org/web/20010331050522/http://cube.ign.com/news/32458.html Graphics Processor Specifications], IGN, 2001{{cite web |url=http://www.anandtech.com/show/853/2 |title=Anandtech Microsoft's Xbox |publisher=Anandtech.com |access-date=2024-10-10 |archive-url=https://web.archive.org/web/20101104081735/http://www.anandtech.com/show/853/2 |archive-date=2010-11-04 |url-status=live }}{{cite news |last1=Smith |first1=Tony |title=TSMC starts fabbing Nvidia Xbox chips |url=https://www.theregister.co.uk/2001/02/14/tsmc_starts_fabbing_nvidia_xbox/ |access-date=10 October 2024 |work=The Register |date=14 February 2001}}
==Performance==
The GeForce 3 GPU (NV20) has the same theoretical pixel and texel throughput per clock as the GeForce 2 (NV15). The GeForce 2 Ultra is clocked 25% faster than the original GeForce 3 and 43% faster than the Ti200; this means that in select cases, such as Direct3D 7 T&L benchmarks, the GeForce 2 Ultra, and sometimes even the GTS, can outperform the GeForce 3 and Ti200, because the newer GPUs use the same fixed-function T&L unit but are clocked lower.{{cite web|url=https://techreport.com/review/2515/nvidia-geforce3-graphics-processor/11|title=NVIDIA's GeForce3 graphics processor|website=techreport.com|date=June 26, 2001|access-date=June 25, 2017}} The GeForce 2 Ultra also has considerable raw memory bandwidth, matched only by the GeForce 3 Ti500. However, when comparing anti-aliasing performance the GeForce 3 is clearly superior, thanks to its MSAA support and its more efficient management of memory bandwidth and fillrate.
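The percentages follow from the core clocks: 250 MHz for the GeForce 2 Ultra against 200 MHz for the GeForce 3 and 175 MHz for the Ti200. The snippet below is purely illustrative arithmetic:
<syntaxhighlight lang="python">
gf2_ultra, gf3, gf3_ti200 = 250, 200, 175  # core clocks in MHz
print(f"GeForce 2 Ultra vs GeForce 3:       {gf2_ultra / gf3 - 1:.0%}")        # -> 25%
print(f"GeForce 2 Ultra vs GeForce 3 Ti200: {gf2_ultra / gf3_ti200 - 1:.0%}")  # -> 43%
</syntaxhighlight>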
When comparing the shading capabilities to the Radeon 8500, reviewers noted superior precision with the ATI card.{{cite web|url=https://techreport.com/review/3266/ati-radeon-8500-off-the-beaten-path/5|title=ATI's Radeon 8500: Off the beaten path|website=techreport.com|date=December 31, 2001|access-date=June 25, 2017}}
==Product positioning==
Nvidia refreshed the lineup in October 2001 with the release of the GeForce 3 Ti200 and Ti500. This coincided with ATI's releases of the Radeon 7500 and Radeon 8500. The Ti500 has higher core and memory clocks (240 MHz core/250 MHz RAM) than the original GeForce 3 (200 MHz/230 MHz), and generally matches the Radeon 8500 in performance. The Ti200 is clocked lower (175 MHz/200 MHz), making it the lowest-priced GeForce 3 release; it still surpasses the Radeon 7500 in speed and feature set, although it lacks the Radeon 7500's dual-monitor support.
The original GeForce3 and the Ti500 were released only in 64 MiB configurations, while the Ti200 was also available in a 128 MiB version.
The GeForce 4 Ti (NV25), introduced in April 2002, was a revision of the GeForce 3 architecture.{{Cite web|url=https://techreport.com/review/3379/a-look-at-nvidia-geforce4-chips/2|title = A look at NVIDIA's GeForce4 chips| work=The Tech Report |date = February 6, 2002}}{{Cite web|url=http://www.activewin.com/reviews/hardware/graphics/nvidia/gf4ti4600/gf2.shtml|title=ActiveWin.Com: NVIDIA GeForce 4 Ti 4600 - Review|website=www.activewin.com}} The GeForce 4 Ti was very similar to the GeForce 3; the main differences were higher core and memory speeds, a revised memory controller, improved vertex and pixel shaders, and hardware anti-aliasing and DVD playback. Proper dual-monitor support was also brought over from the GeForce 2 MX. With the GeForce 4 Ti 4600 as the new flagship product, this marked the beginning of the end for the GeForce 3 Ti 500, which was already difficult to produce because of poor yields; it was later replaced entirely by the much cheaper but similarly performing GeForce 4 Ti 4200.{{Cite web|url=https://www.tomshardware.com/reviews/pc-graphics-xbox,423-19.html|title=PC Graphics Beyond XBOX - NVIDIA Introduces GeForce4|author1=Thomas Pabst|date=February 6, 2002|website=Tom's Hardware}} Announced at the same time was the GeForce 4 MX (NV17), which despite the name was closer in architecture and feature set to the GeForce 2 (NV11 and NV15).{{cite web |author=Lal Shimpi, Anand |url=http://www.anandtech.com/show/875 |title=Nvidia GeForce4 - NV17 and NV25 Come to Life |publisher=AnandTech |date=February 6, 2002 |access-date=October 1, 2024}} The GeForce 3 Ti200 was kept in production for a short while, as it occupied a niche between the (delayed) GeForce 4 Ti4200 and the GeForce 4 MX460: its performance was equivalent to that of the DirectX 7.0-compliant MX460 while also offering full DirectX 8.0 support, although it lacked dual-monitor support.{{Cite web|url=https://www.tomshardware.com/reviews/pc-graphics-xbox,423-18.html|title=PC Graphics Beyond XBOX - NVIDIA Introduces GeForce4|author1=Thomas Pabst|date=February 6, 2002|website=Tom's Hardware}} However, ATI released the Radeon 8500LE (a lower-clocked version of the 8500), which outperformed both the Ti200 and the MX460. ATI's move in turn compelled Nvidia to roll out the Ti4200 earlier than planned, at a similar price to the MX460, and the Ti200 was discontinued by summer 2002, partly because of naming confusion with the GeForce 4 MX and Ti lines. Even so, the GeForce 3 Ti200 outperformed the Radeon 9000 (RV250) introduced around the time of the Ti200's discontinuation: unlike the 8500LE, which was simply a lower-clocked 8500, the 9000 was a major redesign intended to reduce production cost and power usage, and its performance was equivalent to that of the GeForce 4 MX440.{{cite web |author=Worobyew, Andrew. |author2=Medvedev, Alexander |url=https://www.pricenfees.com/digit-life-archives/nvidia-geforce4-ti-4400-geforce4-ti-4600-nv25-review |title=Nvidia GeForce4 Ti 4400 and GeForce4 Ti 4600 (NV25) Review |publisher=Pricenfees |access-date=October 1, 2024 |archive-date=October 12, 2018 |archive-url=https://web.archive.org/web/20181012014518/https://extensivelyreviewed.com/nvidia-geforce4-ti-4400-geforce4-ti-4600-nv25-review.html |url-status=dead }}
==Specifications==
{{main|Comparison of Nvidia graphics processing units}}
{{Further|Kelvin (microarchitecture)}}
- All models are made on the TSMC 150 nm fabrication process
- All models support Direct3D 8.0 and OpenGL 1.3
- All models support 3D Textures, Lightspeed Memory Architecture (LMA), the nFiniteFX Engine, and Shadow Buffers
{{Row hover highlight}}
class="mw-datatable wikitable sortable" style="font-size:85%; text-align:center;" |
rowspan="2" style="vertical-align: bottom"|Model
! rowspan="2" style="vertical-align: bottom"|Launch ! rowspan="2" {{Vertical header|Code name}} ! rowspan="2" {{Vertical header|Transistors (million)}} ! rowspan="2" {{Vertical header|Die size (mm2)}} ! rowspan="2" {{Vertical header|Bus interface}} ! rowspan="2" {{Vertical header|Core clock (MHz)}} ! rowspan="2" {{Vertical header|Memory clock (MHz)}} ! rowspan="2" {{Vertical header|Core config{{efn|name=geforce 3 1|Pixel shaders: vertex shaders: texture mapping units: render output units}}}} ! colspan="4" |Fillrate ! colspan="4" |Memory ! rowspan="2" {{Vertical header|Performance (GFLOPS ! rowspan="2" {{Vertical header|TDP (Watts)}} |
---|
{{Vertical header|MOperations/s}}
!{{Vertical header|MPixels/s}} !{{Vertical header|MTexels/s}} !{{Vertical header|MVertices/s}} !{{Vertical header|Size (MB)}} !{{Vertical header|Bandwidth (GB/s)}} !{{Vertical header|Bus type}} !{{Vertical header|Bus width (bit)}} |
style="text-align:left" |GeForce3 Ti200
|October 1, 2001 | rowspan="3" |NV20 | rowspan="3" |57 | rowspan="3" |128 | rowspan="3" |{{nowrap|AGP 4x}}, PCI |175 |200 | rowspan="3" |4:1:8:4 |700 |700 |1400 |43.75 |64 |6.4 |rowspan="3" |DDR |rowspan="3" |128 |8.750 |? |
style="text-align:left" |GeForce3
|February 27, 2001 |200 |230 |800 |800 |1600 |50 |64 |7.36 |10.00 |? |
style="text-align:left" |GeForce3 Ti500
|October 1, 2001 |240 |250 |960 |960 |1920 |60 |64 |8.0 |12.00 |29 |
{{notelist}}
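The bandwidth column in the table follows directly from the memory clock, the double data rate, and the 128-bit bus width. A minimal verification sketch, using only figures from the table:
<syntaxhighlight lang="python">
# GB/s = memory clock (MHz) x 2 transfers/clock (DDR) x 16 bytes (128-bit bus) / 1000
for model, mem_mhz in [("GeForce3 Ti200", 200), ("GeForce3", 230), ("GeForce3 Ti500", 250)]:
    print(f"{model}: {mem_mhz * 2 * 16 / 1000:.2f} GB/s")
# -> 6.40, 7.36 and 8.00 GB/s, matching the table above.
</syntaxhighlight>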
==Discontinued support==
Nvidia has ceased driver support for the GeForce 3 series.
===Final drivers===
- Windows 9x & Windows Me: 81.98 released on December 21, 2005; [http://www.nvidia.com/object/win9x_81.98.html Download]
:[http://www.nvidia.com/object/81.98_9x_supported.html Product Support List Windows 95/98/Me – 81.98].
:* Driver version 81.98 was the last driver Nvidia ever released for Windows 9x/Me.
- Windows 2000, 32-bit Windows XP & Media Center Edition: 93.71 released on November 2, 2006; [https://www.nvidia.com/download/driverResults.aspx/5/ Download].
- Also available: 93.81 (beta) released on November 28, 2006; [http://www.nvidia.com/object/nzone_downloads_winxp_2k_32bit_93.81.html Download].
- Linux 32-bit: 96.43.23 released on September 14, 2012; [https://www.nvidia.com/Download/driverResults.aspx/48996/en-us/ Download].
The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.
Note: Despite claims in the documentation that 94.24 (released on May 17, 2006) supports the GeForce 3 series, it does not; 94.24 actually supports only the GeForce 6 and GeForce 7 series.{{Cite web|url=https://www.nvidia.com/en-us/drivers/details/60/|title=Driver Details|website=NVIDIA}}
:(The supported-products list is also available on this page.)
- [http://www.nvidia.com/object/win9x_archive.html Windows 95/98/Me Driver Archive]
- [http://www.nvidia.com/object/winxp-2k_archive.html Windows XP/2000 Driver Archive]
- [http://www.nvidia.com/object/unix.html Unix Driver Archive]
==See also==
==References==
{{Reflist}}
==External links==
{{Commons category|Nvidia GeForce 3 series video cards|GeForce 3 series}}
- [http://www.nvidia.com/page/geforce3.html Nvidia: GeForce3 - The Infinite Effects GPU]
- [http://www.nvidia.com/object/win9x_81.98.html ForceWare 81.98 drivers, Final Windows 9x/ME driver release]
- [http://www.nvidia.com/object/winxp_2k_93.71_2.html ForceWare 93.71 drivers, Final Windows XP driver release]
- [http://www.anandtech.com/showdoc.aspx?i=1442 Anandtech: Nvidia GeForce3]
- [http://www.anandtech.com/showdoc.aspx?i=1539 Anandtech: Nvidia's Fall Product Line: GeForce3 Titanium]
- [http://www.techpowerup.com/gpudb techPowerUp! GPU Database]
{{Nvidia}}
{{DEFAULTSORT:Geforce 3 series}}