GeForce 256
{{Use mdy dates|date=October 2018}}
{{Short description|GPU by Nvidia}}
{{Infobox GPU
| name = GeForce 256
| image = [[File:Geforce256logo.jpg|300px]]
| caption = Top: Logo<br />Bottom: Nvidia GeForce 256
| codename = NV10
| date = {{start date and age|October 11, 1999}} (SDR)
{{start date and age|December 13, 1999}}{{cite web|url=http://pc.ign.com/news/13162.html|title=News Briefs|author=IGN staff|date=December 13, 1999|archive-url=https://web.archive.org/web/20000901012320/http://pc.ign.com/news/13162.html|archive-date=September 1, 2000|url-status=dead|access-date=October 1, 2020}} (DDR)
| fab = TSMC
| entry =
| midrange = GeForce 256 SDR
| highend = GeForce 256 DDR
| openglversion = OpenGL 1.2.1 (T&L)
| d3dversion = Direct3D 7.0
| predecessor = RIVA TNT2
| successor = GeForce 2 series
| support status = Unsupported
| architecture = Celsius
}}
The GeForce 256 is the original release in Nvidia's "GeForce" product line. Announced on August 31, 1999, and released on October 11, 1999, the GeForce 256 improves on its predecessor (RIVA TNT2) by increasing the number of fixed pixel pipelines, offloading host geometry calculations to a hardware transform and lighting (T&L) engine, and adding hardware motion compensation for MPEG-2 video. It offered a notable leap in 3D PC gaming performance and was the first fully Direct3D 7-compliant 3D accelerator.
== Architecture ==
[[File:KL NVIDIA Geforce 256.jpg|thumb|GeForce 256 (NV10) chip]]
GeForce 256 was marketed as "the world's first 'GPU', or Graphics Processing Unit", a term Nvidia defined at the time as "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second".{{cite web |title=nVidia unveils new computer graphics accelerator |url=https://money.cnn.com/1999/08/31/technology/nvidia/ |website=CNNfn |date=August 31, 1999}}{{Cite web |url=http://www.nvidia.com/object/gpu.html |title=Graphics Processing Unit (GPU) |website=www.nvidia.com |date=December 17, 2009 |access-date=March 24, 2016}}
The "256" in its name stems from the "256-bit QuadPipe Rendering Engine", a term describing the four 64-bit pixel pipelines of the NV10 chip. In single-textured games NV10 could put out 4 pixels per cycle, while a two-textured scenario would limit this to 2 multitextured pixels per cycle, as the chip still had only one TMU per pipeline, just as TNT2.{{Cite web|url=https://www.anandtech.com/show/391|title=NVIDIA GeForce 256 Part 1: To buy or not to buy|first=Anand Lal|last=Shimpi|website=www.anandtech.com}} In terms of rendering features, GeForce 256 also added support for cube environment mapping and dot-product (Dot3) bump mapping.{{Cite web|url=https://www.tomshardware.com/reviews/high,294.html|title=High-Tech And Vertex Juggling – NVIDIA's New GeForce3 GPU|first=Thomas Pabst 27|last=February 2001|website=Tom's Hardware|date=February 27, 2001 }}
The integration of the transform and lighting hardware into the GPU itself set the GeForce 256 apart from older 3D accelerators that relied on the CPU to perform these calculations (also known as software transform and lighting). Integrating T&L reduced the complexity, and therefore the cost, of 3D graphics hardware, bringing the feature to inexpensive consumer graphics cards rather than leaving it confined to the expensive, professionally oriented cards designed for computer-aided design (CAD). NV10's T&L engine also allowed Nvidia to enter the CAD market with dedicated cards for the first time, with a product called Quadro. The Quadro line uses the same silicon chips as the GeForce cards, but has different driver support and certifications tailored to the unique requirements of CAD applications.{{cite web |url=http://www.nvidia.com/page/workstation.html |title=Nvidia Workstation Products |publisher=Nvidia.com |access-date=October 2, 2007}}
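The following is a minimal sketch (illustrative Python, not Nvidia code; the function and variable names are invented for the example) of the per-vertex transform and diffuse-lighting work that a software T&L path performs on the CPU, and that the NV10's fixed-function T&L engine instead performs in hardware:
<syntaxhighlight lang="python">
import numpy as np

def transform_and_light(vertices, normals, mvp, light_dir, light_color):
    """Transform object-space vertices to clip space and compute simple
    per-vertex (Lambertian) diffuse lighting, one vertex at a time."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    clip_positions, lit_colors = [], []
    for position, normal in zip(vertices, normals):
        # 4x4 model-view-projection transform of the homogeneous position.
        clip_positions.append(mvp @ np.append(position, 1.0))
        # Diffuse term: N . L, clamped to zero when the light is behind the surface.
        diffuse = max(float(np.dot(normal, light_dir)), 0.0)
        lit_colors.append(diffuse * light_color)
    return np.array(clip_positions), np.array(lit_colors)

# Example: one triangle lit by a directional light pointing along +Z.
verts = np.array([[0.0, 0.0, -2.0], [1.0, 0.0, -2.0], [0.0, 1.0, -2.0]])
norms = np.array([[0.0, 0.0, 1.0]] * 3)
mvp = np.eye(4)  # identity stands in for a real projection matrix
positions, colors = transform_and_light(verts, norms, mvp,
                                        np.array([0.0, 0.0, 1.0]),
                                        np.array([1.0, 1.0, 1.0]))
</syntaxhighlight>
Because work of this kind runs for every vertex in every frame, moving it off the CPU freed processing time in geometry-heavy scenes.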
The chip was manufactured by TSMC using its 220 nm CMOS process.{{cite web |last1=Singer |first1=Graham |title=History of the Modern Graphics Processor, Part 2 |url=https://www.techspot.com/article/653-history-of-the-gpu-part-2/ |website=TechSpot |date=April 3, 2013 |access-date=21 July 2019}}
Only one GeForce 256 chip configuration was released, whereas succeeding GeForce products would be offered at varying chip speeds. There were, however, two memory configurations, with the SDR version released in October 1999 and the DDR version released in mid-December 1999{{snd}}each with a different type of SDRAM memory. The SDR version uses SDR SDRAM memory from Samsung Electronics,{{cite web |title=NVIDIA GeForce 256 SDR |url=https://videocardz.net/nvidia-geforce-256-sdr/ |website=VideoCardz.net |access-date=10 July 2019}}{{cite web |title=K4S161622D Datasheet |url=http://www.datasheetcatalog.com/datasheets_pdf/K/4/S/1/K4S161622D.shtml |publisher=Samsung Electronics |access-date=10 July 2019}} while the later DDR version uses DDR SDRAM memory from Hyundai Electronics (now SK Hynix).{{cite web |title=NVIDIA GeForce 256 DDR |url=https://videocardz.net/nvidia-geforce-256-ddr-64mb/ |website=VideoCardz.net |access-date=10 July 2019}}{{cite web |title=HY5DV651622 Datasheet |url=http://www.ic72.com/pdf_file/h/169210.pdf |publisher=Hynix |access-date=10 July 2019}}
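The practical difference between the two memory configurations is peak memory bandwidth across the 128-bit bus; as a worked illustration using the clock figures listed in the specifications below:
<math display="block">\text{SDR: } 166\ \text{MHz} \times \tfrac{128\ \text{bit}}{8} = 2.656\ \text{GB/s}</math>
<math display="block">\text{DDR: } 150\ \text{MHz} \times 2 \times \tfrac{128\ \text{bit}}{8} = 4.8\ \text{GB/s}</math>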
== Product comparisons ==
Compared to previous high-end 3D game accelerators, such as the 3dfx Voodoo3 3500 and Nvidia RIVA TNT2 Ultra, the GeForce 256 provided up to a 50% or greater improvement in frame rate in some games (those specifically written to take advantage of hardware T&L) when coupled with a very-low-budget CPU. The later release and widespread adoption of GeForce 2 MX/4 MX cards with the same feature set meant unusually long support for the GeForce 256, until approximately 2006, in games such as ''Star Wars: Empire at War'' or ''Half-Life 2'', the latter of which featured a Direct3D 7-compatible path that used a subset of Direct3D 9 to target the fixed-function pipeline of these GPUs.
Without broad application support at the time, critics pointed out that the T&L technology had little real-world value. Initially, it was only somewhat beneficial in certain situations in a few OpenGL-based first-person shooters, most notably ''Quake III Arena''. Benchmarks using low-budget CPUs like the Celeron 300A gave favorable results for the GeForce 256, but benchmarks with some CPUs such as the Pentium II 300 gave better results with some older graphics cards like the 3dfx Voodoo 2. 3dfx and other competing graphics-card companies pointed out that a fast CPU could more than make up for the lack of a T&L unit. Software support for hardware T&L was not commonplace until several years after the release of the first GeForce. Early drivers were buggy and slow, while 3dfx cards enjoyed efficient, high-speed, mature Glide API and MiniGL support for the majority of games. Only after the GeForce 256 was replaced by the GeForce 2, and ATI's T&L-equipped Radeon was also on the market, did hardware T&L become a widely used feature in games.
The GeForce 256 was also quite expensive for the time and did not offer tangible advantages over competitors' products outside of 3D acceleration. For example, its GUI and video-playback acceleration were not significantly better than those offered by the competition or even by older Nvidia products. Additionally, some GeForce cards were plagued by poor analog signal circuitry, which caused display output to be blurry.{{Citation needed|date=August 2010}}
As CPUs became faster, the GeForce 256 demonstrated the disadvantage of hardware T&L: if a CPU is fast enough, it can perform T&L calculations faster than the GPU, making the GPU a hindrance to rendering performance. This changed the way the graphics market functioned, encouraging shorter graphics-card lifetimes and placing less emphasis on the CPU for gaming.
=== Motion compensation ===
The GeForce 256 introduced{{Cite web|url=http://www.activewin.com/reviews/hardware/graphics/nvidia/gf4ti4600/gf3.shtml|title=ActiveWin.Com: NVIDIA GeForce 4 Ti 4600 – Review|website=www.activewin.com|access-date=July 2, 2013|archive-date=May 21, 2013|archive-url=https://web.archive.org/web/20130521142750/http://www.activewin.com/reviews/hardware/graphics/nvidia/gf4ti4600/gf3.shtml|url-status=dead}} motion compensation as a functional unit of the NV10 chip.{{cite web |url= http://www.orpheuscomputing.com/downloads2/GeForce_HDVP_brief.pdf|title= Technology brief|website=www.orpheuscomputing.com |access-date=2020-09-21}}{{Cite web|url=https://www.techspot.com/article/653-history-of-the-gpu-part-2/|title=History of the Modern Graphics Processor, Part 2|website=TechSpot}} This first-generation unit was succeeded by Nvidia's HDVP (High-Definition Video Processor) in the GeForce 2 GTS.
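As an illustration of what such a unit accelerates, the following sketch (assumed example code, not Nvidia's implementation; the function name is invented, and the 16×16 block size follows the MPEG-2 macroblock convention) reconstructs one block by copying a motion-vector-displaced block from a reference frame and adding the decoded residual:
<syntaxhighlight lang="python">
import numpy as np

def motion_compensate(reference, residual, block_pos, motion_vector):
    """Reconstruct one 16x16 macroblock.

    reference     -- previously decoded frame (2-D uint8 array)
    residual      -- decoded 16x16 residual block (IDCT output, int16)
    block_pos     -- (row, col) of the macroblock in the current frame
    motion_vector -- (dy, dx) displacement into the reference frame
    """
    r, c = block_pos
    dy, dx = motion_vector
    # Prediction: the displaced block fetched from the reference frame.
    predicted = reference[r + dy : r + dy + 16, c + dx : c + dx + 16]
    # Add the residual and clamp to the valid 8-bit pixel range.
    return np.clip(predicted.astype(np.int16) + residual, 0, 255).astype(np.uint8)

# Example: predict a block shifted by (2, -3) pixels with a zero residual.
ref_frame = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
block = motion_compensate(ref_frame, np.zeros((16, 16), dtype=np.int16), (16, 16), (2, -3))
</syntaxhighlight>
Performing this per-macroblock copy-and-add in hardware relieves the CPU of that work during MPEG-2 playback.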
== Specifications ==
{{Main|Comparison of Nvidia graphics processing units}}
{{Further|Celsius (microarchitecture)}}
{{Row hover highlight}}
class="mw-datatable wikitable sortable" style="font-size:85%; text-align:center;" |
rowspan="2" style="vertical-align: bottom"|Model
! rowspan="2" style="vertical-align: bottom"|Launch ! rowspan="2" {{Vert header|Code name}} ! rowspan="2" {{Vert header|Transistors (million)}} ! rowspan="2" {{Vert header|Die size (mm2)}} ! rowspan="2" {{Vert header|Bus interface}} ! rowspan="2" {{Vert header|Core clock (MHz)}} ! rowspan="2" {{Vert header|Memory clock (MHz)}} ! rowspan="2" {{Vert header|Core config{{efn|name=geforce 256 1|Pixel pipelines: texture mapping units: render output units}}}} ! colspan="4" |Fillrate ! colspan="4" |Memory ! rowspan="2" {{Vert header|Performance (MFLOPS ! rowspan="2" {{Vert header|TDP (Watts)}} |
---|
{{Vert header|MOperations/s}}
!{{Vert header|MPixels/s}} !{{Vert header|MTexels/s}} !{{Vert header|MVertices/s}} !{{Vert header|Size (MB)}} !{{Vert header|Bandwidth (GB/s)}} !{{Vert header|Bus type}} !{{Vert header|Bus width (bit)}} |
style="text-align:left" |GeForce 256 SDR{{Cite web|title=4x AGP GeForce 256 Graphics Accelerator|url=http://vgamuseum.info/images/doc/nvidia/gf256/geforce256_graphics.pdf|access-date=August 30, 2024|website=vgamuseum.info|archive-date=7 January 2024|archive-url=https://web.archive.org/web/20240107050758/http://vgamuseum.info/images/doc/nvidia/gf256/geforce256_graphics.pdf|url-status=live}}
|{{Dts|1999|October|11|format=mdy|abbr=on}} | rowspan="2" |NV10 | rowspan="2" |17 | rowspan="2" |139 | rowspan="2" |{{nowrap|AGP 4x}}, PCI | rowspan="2" |120 |166 | rowspan="2" |4:4:4 | rowspan="2" |480 | rowspan="2" |480 | rowspan="2" |480 | rowspan="2" |0 | rowspan="2" |32 |2.656 |SDR | rowspan="2" |128 | rowspan="2" |960 |13 |
style="text-align:left" |GeForce 256 DDR{{cite web|title=NVIDIA GeForce 256 DDR Specs|url=https://www.techpowerup.com/gpu-specs/geforce-256-ddr.c734|access-date=August 30, 2024|website=TechPowerUp|language=en}}
|{{Dts|1999|December|13|format=mdy|abbr=on}} |150 |4.800 |DDR |12 |
{{notelist}}
== Discontinued support ==
Nvidia has ceased driver support for the GeForce 256 series.
=== Final drivers ===
* Windows 9x & Windows Me: 71.84 released on March 11, 2005; [http://www.nvidia.com/object/win9x_71.84.html Download];
*: [http://www.nvidia.com/object/71.84_9x_supported.html Product Support List Windows 95/98/Me – 71.84].
* Windows 2000 & 32-bit Windows XP: 71.89 released on April 14, 2005; [http://www.nvidia.com/object/winxp_2k_71.89.html Download].
*: [http://www.nvidia.com/object/71.84_geforcetnt2quadro_supported.html Product Support List Windows XP/2000 – 71.84].
The drivers for Windows 2000/XP may be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.
== Competitors ==
== See also ==
== References ==
{{Reflist}}
== External links ==
{{Commons category|Nvidia GeForce 256 series video cards|GeForce 256 series}}
* [https://web.archive.org/web/20040414145655/http://www.nvidia.com/page/geforce256.html NVIDIA: GeForce 256 – The World's First GPU from web archive]
* [http://www.nvidia.com/object/win9x_71.84.html ForceWare 71.84 drivers, Final Windows 9x/ME driver release]
* [http://www.nvidia.com/object/winxp_2k_71.89.html ForceWare 71.89 drivers, Final Windows XP driver release]
* [http://www.techpowerup.com/gpudb techPowerUp! GPU Database]
{{Nvidia}}
{{DEFAULTSORT:Geforce 256}}
[[Category:Computer-related introductions in 1999]]