GeForce 8 series
{{Short description|Series of GPUs by Nvidia}}
{{for|GeForce cards with a model number of 8X0M|GeForce 800M series}}
{{Use mdy dates|date=October 2018}}
{{Infobox GPU
| name = GeForce 8 series
| image =
| caption = GeForce 8800 Ultra, released in May 2007; the series' flagship unit.
| codename = G8x
| architecture = Tesla
| created = {{start date and age|November 8, 2006}}
| model1 = GeForce GS series
| model2 = GeForce GT series
| model3 = GeForce GTS series
| model4 = GeForce GTX series
| model5 = GeForce Ultra series
| entry = 8100
8200
8300
8400
8500
| midrange = 8600 GS/GT/GTS
| highend = 8800 GS/GT/GTS
| enthusiast = 8800 GTX/Ultra
| d3dversion = Direct3D 10.0
Shader Model 4.0
| openglversion = OpenGL 3.3
|predecessor = GeForce 7 series
|successor = GeForce 9 series
| support status = Unsupported
}}
The GeForce 8 series is the eighth generation of Nvidia's GeForce line of graphics processing units. The third major GPU architecture developed by Nvidia, Tesla represents the company's first unified shader architecture.[https://archive.today/20120524105931/http://phx.corporate-ir.net/phoenix.zhtml?p=irol-eventDetails&c=116466&eventID=1411995 Q3 2007 NVIDIA Corporation Earnings Conference]. NVIDIA.com. November 9, 2006.{{Cite web|title=NVIDIA's GeForce 8800 graphics processor|url=http://www.techreport.com/reviews/2006q4/geforce-8800/index.x?pg=1|url-status=dead|archive-url=https://web.archive.org/web/20070715124131/http://www.techreport.com/reviews/2006q4/geforce-8800/index.x?pg=1|archive-date=2007-07-15}}
Overview
All GeForce 8 series products are based on the Tesla microarchitecture, named after Nikola Tesla.
As with many GPU generations, a larger model number does not guarantee superior performance over a lower-numbered card from the previous generation. For example, the entry-level GeForce 8300 and 8400 fall short of the previous GeForce 7200 and 7300 cards in performance, while at the high end the GeForce 8800 GTX far outperforms the previous GeForce 7800 GTX despite the similar numbering.
=Max resolution=
Dual dual-link DVI support:
Able to drive two flat-panel displays up to 2560×1600 resolution. Available on select GeForce 8800 and 8600 GPUs.
One dual-link DVI support:
Able to drive one flat-panel display up to 2560×1600 resolution. Available on select GeForce 8500 GPUs and GeForce 8400 GS cards based on the G98.
One single-link DVI support:
Able to drive one flat-panel display up to 1920×1200 resolution. Available on select GeForce 8400 GPUs.[http://www.nvidia.com/object/geforce_8600_features.html GeForce 8600 – Features and Benefits] GeForce 8400 GS cards based on the G86 only support single-link DVI.
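The single- versus dual-link distinction above comes down to the TMDS pixel-clock budget: one DVI link is capped at a 165 MHz pixel clock, and dual-link doubles that. As a rough sanity check (an illustrative sketch, not from the source; the raster totals are approximate CVT reduced-blanking figures):

```python
# Pixel-clock budget for DVI links (illustrative sketch).
# A single TMDS link is limited to a 165 MHz pixel clock; dual-link doubles it.
# Raster totals below are approximate CVT reduced-blanking timings (assumed).

def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock in MHz for a full raster (active + blanking) at a refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

for name, h, v in [("1920x1200", 2080, 1235), ("2560x1600", 2720, 1646)]:
    clk = pixel_clock_mhz(h, v)
    link = "single-link OK" if clk <= 165 else "dual-link required"
    print(f"{name}: ~{clk:.0f} MHz -> {link}")
```

This is why 1920×1200 is the practical ceiling for the single-link-only G86 cards, while 2560×1600 needs a dual-link connector.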
=Display capabilities=
The GeForce 8 series supports 10-bit per channel display output, up from 8-bit on previous Nvidia cards. This potentially allows higher fidelity color representation and separation on capable displays. The GeForce 8 series, like its recent predecessors, also supports Scalable Link Interface (SLI) for multiple installed cards to act as one via an SLI Bridge, so long as they are of similar architecture.
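The gain from 8-bit to 10-bit per channel is simple arithmetic; this illustrative sketch shows the increase in distinct levels per channel and total color combinations:

```python
# Levels per channel and total displayable colors, 8-bit vs 10-bit
# (illustrative arithmetic only).

def color_counts(bits_per_channel):
    """Return (levels per channel, total R*G*B color combinations)."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

for bits in (8, 10):
    levels, colors = color_counts(bits)
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} total colors")
# 8-bit: 256 levels/channel, 16,777,216 total colors
# 10-bit: 1024 levels/channel, 1,073,741,824 total colors
```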
NVIDIA's PureVideo HD video rendering technology is an improved version of the original PureVideo introduced with GeForce 6. It now includes GPU-based hardware acceleration for decoding HD movie formats, post-processing of HD video for enhanced images, and optional High-bandwidth Digital Content Protection (HDCP) support at the card level.Shrout, Ryan. [http://www.pcper.com/article.php?aid=319&type=expert&pid=5 NVIDIA GeForce 8800 GTX Review - DX10 and Unified Architecture] {{Webarchive|url=https://web.archive.org/web/20070701001604/http://www.pcper.com/article.php?aid=319&type=expert&pid=5 |date=July 1, 2007 }}, PC Perspective, November 8, 2006.
GeForce 8300 and 8400 series
[[File:ASUS NVIDIA GeForce EN8400GS 512MB.jpg|thumb]]
[[File:NVidia EVGA GeForce 8400 GS 1 GB.jpg|thumb]]
In the summer of 2007 Nvidia released the entry-level GeForce 8300 GS and 8400 GS graphics cards, based on the G86 core. The GeForce 8300 was only available in the OEM market, and was also available in integrated motherboard GPU form as the GeForce 8300 mGPU. The GeForce 8300 series was only available in PCI Express, with the GeForce 8400 series using either PCI Express or PCI. The first version of the 8400 GS is sometimes called "GeForce 8400 GS Rev. 1".
Being entry-level cards, they are usually less powerful than mid-range and high-end cards. Because of their reduced graphics performance, these cards are not suitable for intense 3D applications such as fast, high-resolution video games; however, they could still run most games at lower resolutions and settings, making them (in particular the 8400 series) popular among casual gamers and HTPC (Media Center) builders, especially those without a PCI Express or AGP slot on the motherboard.
The GeForce 8300 and 8400 series were originally designed to replace the low-cost GeForce 7200 series and the entry-level GeForce 7300 series; however, they failed to do so because of their aforementioned inferior gaming performance.
At the end of 2007 Nvidia released a new GeForce 8400 GS based on the G98 (D8M) chip.{{cite web| url=http://en.expreview.com/2007/12/04/born-for-hd-first-review-of-g98-8400gs/77.html | title=Born for HD: first review of G98-8400GS | archive-url=https://web.archive.org/web/20100417154050/http://en.expreview.com/2007/12/04/born-for-hd-first-review-of-g98-8400gs/77.html | archive-date=2010-04-17}}, Expreview.com, G98-8400GS review, accessed March 29, 2010. It is quite different from the G86 used for the "first" 8400 GS: the G98 decodes VC-1 and MPEG-2 video entirely in hardware, consumes less power, offers lower 3D performance, and is built on a smaller fabrication process. The G98 also features dual-link DVI support and PCI Express 2.0. G86 and G98 cards were both sold as "8400 GS", the difference showing only in the technical specifications. This card is sometimes referred to as "GeForce 8400 GS Rev. 2".
During mid-2010 Nvidia released another revision of the GeForce 8400 GS, based on the GT218 chip.[http://nouveau.freedesktop.org/wiki/CodeNames/ nouveau/CodeNames], freedesktop.org, accessed February 28, 2014 It has more RAM but significantly reduced 3D performance, and supports DirectX 10.1, OpenGL 3.3 and Shader Model 4.1. This card is also known as "GeForce 8400 GS Rev. 3".
GeForce 8500 and 8600 series
On April 17, 2007, Nvidia released the GeForce 8500 GT for the entry-level market, and the GeForce 8600 GT and 8600 GTS for the mid-range market. The GeForce 8600 GS was also available. They are based on the G84 core. This series came in PCI Express configurations, with some cards in PCI.
As mid-range cards, the 8600 series provided more power than entry-level cards such as the 8400 and 8500 series, but not as much as high-end cards such as the 8800 series. They delivered adequate performance in most games at moderate resolutions and settings, but could struggle in some games at higher resolutions.
Nvidia introduced second-generation PureVideo with this series. As the first major update to PureVideo since the GeForce 6's launch, it offered much-improved hardware decoding of H.264.
GeForce 8800 series
{{Redirect|G80|the expressway in China|G80 Guangzhou–Kunming Expressway}}
[[File:Evgageforce8800gtxtopleft.jpg|thumb|GeForce 8800 GTX]]
[[File:Evgageforce8800gtxunder2.jpg|thumb]]
The 8800 series, codenamed G80, was launched on November 8, 2006, with the release of the GeForce 8800 GTX and GTS for the high-end market. A 320 MB GTS was released on February 12 and the Ultra was released on May 2, 2007. The cards are larger than their predecessors, with the 8800 GTX measuring 10.6 in (~26.9 cm) in length and the 8800 GTS measuring 9 in (~23 cm). Both cards have two dual-link DVI connectors and an HDTV/S-Video out connector. The 8800 GTX requires 2 PCIe power inputs to keep within the PCIe standard, while the GTS requires just one.
=8800 GS=
The 8800 GS is a trimmed-down 8800 GT with 96 stream processors and either 384 or 768 MB of RAM on a 192-bit bus.[http://www.trustedreviews.com/graphics/news/2008/01/03/nVidia-GeForce-8800-GS-Uncovered/p1 8800 GS Uncovered], "TrustedReviews", November 9, 2008. In May 2008, it was rebranded as the 9600 GSO in an attempt to spur sales.
The early 2008 iMac models featured an 8800 GS GPU that is actually a modified version of the 8800M GTS (a laptop-specific GPU normally found in high-end notebooks) with a slightly higher clock speed, rebranded as an 8800 GS.{{cite web|url=https://notebookcheck.net/NVIDIA-GeForce-8800M-GTX.8836.0.html|title=NVIDIA GeForce 8800M GTX|year=2007|work=Notebookcheck|access-date=25 June 2024}} The updated models with the rebranded 8800 GS GPU were announced by Apple on April 28, 2008.{{cite web | title=Apple Updates iMac | url=https://www.apple.com/pr/library/2008/04/28imac.html | publisher=Apple Inc| date=2008-04-28}} It uses 512 MB of GDDR3 video memory clocked at 800 MHz, 64 unified stream processors, a 500 MHz core clock, a 256-bit memory bus, and a 1250 MHz shader clock. These specifications closely match those of the 8800M GTS, on which the iMac's 8800 GS is based.{{cite web | title=NVIDIA GeForce 8800M GTS | url=http://www.notebookcheck.net/NVIDIA-GeForce-8800M-GTS.6934.0.html | website=www.NotebookCheck.net| date=2008-05-25}}
=8800 GTX / 8800 Ultra=
[[File:NVIDIA NVIO-1-A3 RAMDAC.jpg|thumb]]
The 8800 GTX is equipped with 768 MB GDDR3 RAM. The 8800 series replaced the GeForce 7900 series as Nvidia's top-performing consumer GPU. GeForce 8800 GTX and GTS use identical GPU cores, but the GTS model disables parts of the GPU and reduces RAM size and bus width to lower production cost.
At the time, the G80 was the largest commercial GPU ever constructed, consisting of 681 million transistors on a 480 mm² die built on a 90 nm process. (The complete G80 design totals roughly 686 million transistors, but process and yield limitations led Nvidia to split it into two chips: the main shader core, at 681 million transistors, and the NV I/O display chip, at roughly 5 million.)
A minor manufacturing defect related to a resistor of improper value caused a recall of the 8800 GTX models just two days before the product launch, though the launch itself was unaffected."Visionary". [http://www.vr-zone.com/?i=4253 All 8800 GTX Cards Being Recalled] {{Webarchive|url=https://web.archive.org/web/20070927043611/http://www.vr-zone.com/?i=4253 |date=September 27, 2007 }}, VR-Zone.com, November 6, 2006.
The GeForce 8800 GTX was by far the fastest GPU when first released, and 13 months after its initial debut it still remained one of the fastest. The GTX has 128 stream processors clocked at 1.35 GHz, a core clock of 575 MHz, and 768 MB of 384-bit GDDR3 memory at 1.8 GHz, giving it a memory bandwidth of 86.4 GB/s. The card performs faster than a single Radeon HD 2900 XT, and faster than two Radeon X1950 XTXs in CrossFire or two GeForce 7900 GTXs in SLI. The 8800 GTX also supports HDCP, but one major flaw is its older PureVideo processor, which leaves more decoding work to the CPU. Originally retailing for around US$600, prices came down to under US$400 before it was discontinued. The 8800 GTX was also very power-hungry for its time, demanding up to 185 watts and requiring two 6-pin PCIe power connectors to operate. It also has two SLI connector ports, allowing it to support Nvidia 3-way SLI for users who run demanding games at extreme resolutions such as 2560×1600.
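The quoted 86.4 GB/s follows directly from the bus width and effective memory data rate; a minimal sketch of the arithmetic (illustrative only, not Nvidia tooling):

```python
# Memory bandwidth from bus width and effective data rate
# (illustrative; values from the paragraph above).

def bandwidth_gb_s(bus_width_bits, effective_rate_ghz):
    """Bytes moved per second: (bus width / 8 bits per byte) * data rate."""
    return bus_width_bits / 8 * effective_rate_ghz

# GeForce 8800 GTX: 384-bit bus, 1.8 GHz effective GDDR3 rate
print(round(bandwidth_gb_s(384, 1.8), 1), "GB/s")   # 86.4 GB/s
```

The same formula yields the 8800 Ultra's 103.7 GB/s from its 2.16 GHz effective rate on the same 384-bit bus.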
The 8800 Ultra, retailing at a higher price than the GTX, is architecturally identical but features higher-clocked shaders, core and memory. Nvidia told the media in May 2007 that the 8800 Ultra used a new stepping of the G80 that produced less heat, allowing higher clocks. Originally retailing from $800 to $1,000, the card was widely regarded as poor value, offering only about 10% more performance than the GTX while costing hundreds of dollars more. Prices dropped to as low as $200 before it was discontinued on January 23, 2008. The Ultra's core runs at 612 MHz, its shaders at 1.5 GHz, and its memory at 2.16 GHz, giving it a theoretical memory bandwidth of 103.7 GB/s. It has two SLI connector ports, allowing it to support Nvidia 3-way SLI. An updated dual-slot cooler also allowed quieter and cooler operation at the higher clock speeds."Visionary". [http://techreport.com/articles.x/12379/1 8800 Ultra Reviewed], Techreport.com, November 9, 2008.
=8800 GT=
{{Redirect|G92|the expressway in China|G92 Hangzhou Bay Ring Expressway}}
The 8800 GT, codenamed G92, was released on October 29, 2007. It was the first card to transition to the 65 nm process, and it supports PCI Express 2.0.[http://www.vr-zone.com/?i=5092 GeForce 8800GT 65nm and PCI-E 2.0 support] {{Webarchive|url=https://web.archive.org/web/20071013223039/http://vr-zone.com/?i=5092 |date=October 13, 2007 }}, VR-Zone.com, accessed October 7, 2007. It has a single-slot cooler, as opposed to the dual-slot cooler of the 8800 GTS and GTX, and uses less power than either thanks to the 65 nm process. While its core processing power is comparable to that of the GTX, the 256-bit memory interface and 512 MB of GDDR3 memory often hinder performance at very high resolutions and graphics settings. The 8800 GT, unlike other 8800 cards, is equipped with the PureVideo HD VP2 engine for GPU-assisted decoding of the H.264 and VC-1 codecs.
The release of this card presented an odd dynamic to the graphics processing industry. With an initial projected street price of around $300, it outperformed ATI's flagship HD 2900 XT in most situations, and even Nvidia's own 8800 GTS 640 MB (previously priced at an MSRP of $400). Only marginally slower than the 8800 GTX in synthetic and gaming benchmarks, the card also took much of the value away from Nvidia's own high-end offering.
Performance benchmarks at stock speeds place it above the 8800 GTS (640 MB and 320 MB versions) and slightly below the 8800 GTX. A 256 MB version of the 8800 GT with lower stock memory speeds (1.4 GHz as opposed to 1.8 GHz) but the same core is also available. Benchmarks have shown that the 256 MB version suffers a considerable performance disadvantage compared with its 512 MB counterpart, especially in newer games such as Crysis. Some manufacturers also made models with 1 GB of memory, which show a significant benchmark advantage at large resolutions and with big textures. These models are more likely to occupy two slots, as they use dual-slot coolers instead of the single-slot cooler found on other models.
The performance (at the time) and popularity of this card is demonstrated by the fact that even as late as 2014, the 8800 GT was often listed as the minimum requirement for modern games developed for much more powerful hardware.
=8800 GTS=
[[File:PNY XLR8 8800GTS 640MB.jpg|thumb|PNY XLR8 8800 GTS 640 MB]]
The first releases of the 8800 GTS line, in November 2006, came in 640 MB and 320 MB configurations of GDDR3 RAM and utilized Nvidia's G80 GPU.[http://www.nvidia.com/object/IO_37234.html GeForce 8800 Press Release], NVIDIA.com, accessed November 9, 2006. While the 8800 GTX has 128 stream processors and a 384-bit memory bus, these versions of 8800 GTS feature 96 stream processors and a 320-bit bus. With respect to features, however, they are identical because they use the same GPU.Wasson, Scott. [http://www.techreport.com/articles.x/11211 Nvidia's GeForce 8800 graphics processor], Tech Report, November 8, 2006.
Around the same release date as the 8800 GT, Nvidia released a new 640 MB version of the 8800 GTS. While still based on the 90 nm G80 core, this version has 7 of the 8 clusters of 16 stream processors enabled (as opposed to 6 of 8 on the older GTS cards), giving it a total of 112 stream processors instead of 96. Most other aspects of the card remain unchanged. However, because the only two add-in partners producing this card (BFG and EVGA) decided to overclock it, this version of the 8800 GTS actually ran slightly faster than a stock GTX in most scenarios, especially at higher resolutions, thanks to the increased clock speeds.{{cite web| url=http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/3173-evga-8800gts-640mb-w-112sps-ssc-edition-review.html | title=EVGA 8800GTS 640MB (w/112SPs) SSC Edition Review | publisher=Hardware Canucks |date=2007-11-10}}
Nvidia released a new 8800 GTS 512 MB based on the 65 nm G92 GPU on December 10, 2007.Wasson, Scott. [http://www.techreport.com/articles.x/13772 Nvidia's GeForce 8800 GTS 512 graphics card], Tech Report, December 11, 2007. This 8800 GTS has 128 stream processors, compared to the 96 of the original GTS models, and is equipped with 512 MB of GDDR3 on a 256-bit bus. Combined with a 650 MHz core clock and architectural enhancements, this gives the card raw GPU performance exceeding that of the 8800 GTX, but it is constrained by the narrower 256-bit memory bus. Its performance can match the 8800 GTX in some situations, and it outperforms the older GTS cards in all situations.
=Compatibility issues with PCI Express 1.0a on GeForce 8800 GT/8800 GTS 512 MB cards=
Shortly after their release, an incompatibility issue with older PCI Express 1.0a motherboards surfaced. When using the PCI Express 2.0 compliant 8800 GT or 8800 GTS 512 in some motherboards with PCI Express 1.0a slots, the card would not produce any display image, but the computer would often boot (with the fan on the video card spinning at a constant 100%). The incompatibility has been confirmed on motherboards with VIA PT880Pro/Ultra,{{Cite web |url=http://hardforum.com/showpost.php?p=1031631970&postcount=119 |title=HardForum – View Single Post – Asrock 775 Dual VSTA Incompatible with Nvidia 8800 series |access-date=October 27, 2011 |archive-date=April 3, 2012 |archive-url=https://web.archive.org/web/20120403221039/http://hardforum.com/showpost.php?p=1031631970&postcount=119 |url-status=dead }} Intel 925[http://news.softpedia.com/news/ATI-to-Dismiss-PCI-Express-2-0-Compatibility-Issues-77282.shtml ATI to Dismiss PCI-Express 2.0 Compatibility Issues - Softpedia] and Intel 5000P[http://communities.intel.com/thread/25240 S5000PSL with recent nVidia graphics such as GTX550] PCI Express 1.0a chipsets.
Some graphics cards had a workaround of re-flashing the card's BIOS with an older Gen 1 BIOS. However, this effectively turned it into a PCI Express 1.0 card, unable to use PCI Express 2.0 features. This was arguably a non-issue, since the card could not saturate even a regular PCI Express 1.0 slot, so there was no noticeable reduction in performance. Flashing the video card's BIOS, though, usually voided the warranties of most (if not all) video card manufacturers, making it a less-than-optimal way of getting the card to work properly. The proper workaround is to flash the motherboard's BIOS to the latest version, which, depending on the manufacturer, may contain a fix.
In relation to this, the high numbers of cards reported as dead on arrival (as much as 13–15%) were believed to be inaccurate. When it was revealed that the G92 8800 GT and 8800 GTS 512 MB would use PCI Express 2.0 connections, Nvidia claimed that all cards would be fully backwards compatible, but failed to mention that this was only true for PCI Express 1.1 motherboards. The BIOS-flash workaround did not come from Nvidia or any of its partners, but from ASRock, a motherboard producer, who mentioned the fix in one of its motherboard FAQs. ASUSTeK, which sells the 8800 GT under its own brand, posted a newer version of its 8800 GT BIOS on its website, but did not mention that it fixed this issue. EVGA also posted a new BIOS to fix the issue.{{Cite web |url=http://www.evga.com/forums/tm.asp?m=176929 |title=VGA 8800 GT BIOS (Updated 1/14/08) |access-date=April 22, 2008 |archive-date=February 23, 2008 |archive-url=https://web.archive.org/web/20080223124231/http://www.evga.com/forums/tm.asp?m=176929 |url-status=dead }}
Technical summary
- Direct3D 10 and OpenGL 3.3 support
- ¹ Unified shaders : texture mapping units : render output units
- ² The full G80 contains 32 texture address units and 64 texture filtering units, unlike the G92, which contains 64 texture address units and 64 texture filtering units{{cite web |url=http://www.anandtech.com/show/2549/3 |title=Lots More Compute, a Leetle More Texturing - Nvidia's 1.4 Billion Transistor GPU: GT200 Arrives as the GeForce GTX 280 & 260 |website=Anandtech.com |date=June 16, 2008 |first1=Anand|last1=Lal Shimpi|first2=Derek|last2=Wilson|access-date=December 11, 2015}}{{cite web |url=http://www.anandtech.com/show/2116/6 |title=Digging deeper into the shader core - Nvidia's GeForce 8800 (G80): GPUs Re-architected for Direct3D 10 |website=Anandtech.com |date=November 8, 2006 |first1=Anand|last1=Lal Shimpi|first2=Derek|last2=Wilson|access-date=December 11, 2015}}
- ³ To calculate the processing power, see Performance.
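The processing-power figures in the table below are consistent with the usual convention for Tesla-generation parts: stream processors × shader clock × 2 FLOPs per clock (one multiply-add per processor per cycle). A hedged sketch of that arithmetic:

```python
# Single-precision throughput convention matching the table's GFLOPS column
# (hedged: assumes 2 FLOPs per stream processor per shader clock, i.e. one MAD).

def gflops(stream_processors, shader_clock_mhz, flops_per_clock=2):
    return stream_processors * shader_clock_mhz / 1000 * flops_per_clock

print(round(gflops(128, 1350), 1))   # 8800 GTX -> 345.6
print(round(gflops(112, 1500), 1))   # 8800 GT  -> 336.0
```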
class="wikitable" style="font-size: 85%; text-align: center;" |
rowspan="2" | Model
! rowspan="2" | Launch ! rowspan="2" | Codename ! rowspan="2" | Fab (nm) ! rowspan="2" | Transistors (million) ! rowspan="2" | Die size (mm2) ! rowspan="2" | Core config1 ! colspan="3" | Clock rate ! colspan="2" | Fillrate ! colspan="4" | Memory ! Processing power (GFLOPS)3 ! rowspan="2" | TDP(Watts) ! rowspan="2" | Comments |
---|
Core
(MHz) ! Shader (MHz) ! Memory (MHz) ! Pixel (GP/s) ! Texture (GT/s) ! Size (MB) ! Bandwidth (GB/s) ! Bus type ! Bus width (bit) |
style="text-align:left;" | GeForce 8100 mGPU{{cite web |url=http://www.tomshardware.com/reviews/amd-nvidia-chipset,1972-14.html |title=AMD and Nvidia Platforms Do Battle |website=Tomshardware.com |date=July 18, 2008 |access-date=December 11, 2015}}
| 2008 | MCP78 | 80 | {{unk}} | {{unk}} | rowspan="3" | PCIe 2.0 ×16 | 8:8:4 | 500 | 1200 | 400 RAM) | 2 | 4 | Up to 512 of system RAM | 6.4 | DDR2 | 64 | 28.8 | {{unk}} | The block of decoding of HD-video PureVideo HD is disconnected |
style="text-align:left;" | GeForce 8200 mGPU
| 2008 | MCP78 | 80 | {{unk}} | {{unk}} | 8:8:4 | 500 | 1200 | 400 RAM) | 2 | 4 |gt | 6.4 | DDR2 | 64 | 28.8 | {{unk}} | PureVideo 3 with VP3 |
style="text-align:left;" | GeForce 8300 mGPU
| 2008 | MCP78 | 80 | {{unk}} | {{unk}} | 8:8:4 | 500 | 1500 | 400 RAM) | 2 | 4 | Up to 512 of system RAM | 6.4 | DDR2 | 64 | 36 | {{unk}} | PureVideo 3 with VP3 |
style="text-align:left;" | GeForce 8300 GS{{cite web|url=https://www.theinquirer.net/inquirer/news/1000911/nvidia-gf8600-8500-8300-details-revealed|archive-url=https://web.archive.org/web/20100702143617/http://www.theinquirer.net/inquirer/news/1000911/nvidia-gf8600-8500-8300-details-revealed|url-status=unfit|archive-date=July 2, 2010|date=April 12, 2007|access-date=September 25, 2007|title=Nvidia GF8600/8500/8300 details revealed|website=The Inquirer|first=Theo|last=Valich}}
| July 2007 | G86 | 80 | 210 | 127 | PCIe 1.0 ×16 | 8:8:4 | 450 | 900 | 400 | 1.8 | 3.6 | 128 | 6.4 | DDR2 | 64 | 14.4 | 40 | OEM only |
style="text-align:left;" | GeForce 8400 GS
| June 15, 2007 | G86 | 80 | 210 | 127 | PCIe 1.0 ×16 | 16:8:4 | 450 | 900 | 400 | 1.8 | 3.6 | 128 | 6.4 | DDR2 | 64 | 28.8 | 40 | |
style="text-align:left;" | GeForce 8400 GS rev.2
| December 10, 2007 | G98 | 65 | 210 | 86 | PCIe 2.0 ×16 | 8:8:4 | 567 | 1400 | 400 | 2.268 | 4.536 | 128 | 6.4 | DDR2 | 64 | 22.4 | 25 | |
style="text-align:left;" | GeForce 8400 GS rev.3
| April 26, 2009 | GT218 | 40 | 260 | 57 | PCIe 2.0 ×16 | 8:4:4 | 520 | 1230 | 600 | 2.08 | 2.08 | 512 | 4.8 | DDR3 | 32 | 19.7 | 25 | |
style="text-align:left;" | GeForce 8500 GT
| April 17, 2007 | G86 | 80 | 210 | 127 | PCIe 1.0 ×16 | 16:8:4 | 450 | 900 | 400 | 1.8 | 3.6 | 256 | 12.8 | DDR2 | 128 | 28.8 | 45 | |
style="text-align:left;" | GeForce 8600 GS
| April 2007 | G84 | 80 | 289 | 169 | PCIe 1.0 ×16 | 16:8:8 | 540 | 1180 | 400 | 4.32 | 4.32 | 256 | 12.8 | DDR2 | 128 | 75.5 | 47 | OEM only |
style="text-align:left;" | GeForce 8600 GT
| April 17, 2007 | G84 | 80 | 289 | 169 | PCIe 1.0 ×16 | 32:16:8 | 540 | 1188 | 400 | 4.32 | 8.64 | 256 | 12.8 | DDR2 | 128 | 76 | 47 | |
style="text-align:left;" | GeForce 8600 GTS
| April 17, 2007 | G84 | 80 | 289 | 169 | PCIe 1.0 ×16 | 32:16:8 | 675 | 1450 | 1000 | 5.4 | 10.8 | 256 | 32 | GDDR3 | 128 | 92.8 | 71 | |
style="text-align:left;" | GeForce 8800 GS
| January 2008 | G92 | 65 | 754 | 324 | PCIe 2.0 ×16 | 96:48:12 | 550 | 1375 | 800 | 6.6 | 26.4 | 384 | 38.4 | GDDR3 | 192 | 264 | 105 | |
style="text-align:left;" | GeForce 8800 GTS (G80)
| February 12, 2007 (320) | G80 | 90 | 681 | 484 | rowspan="2" | PCIe 1.0 ×16 | 96:24:20 | 513 | 1188 | 800 | 10.3 | 24.6 | 320 | 64 | GDDR3 | 320 | 228 | 146 | |
style="text-align:left;" | GeForce 8800 GTS 112 (G80)
| November 19, 2007 | G80 | 90 | 681 | 484 | 112:282:20 | 500 | 1200 | 800 | 10 | 24 | 640 | 64 | GDDR3 | 320 | 268.8 | 150 | only XFX, EVGA and BFG models, |
style="text-align:left;" | GeForce 8800 GT
| October 29, 2007 (512) | G92 | 65 | 754 | 324 | rowspan="2" | PCIe 2.0 ×16 | 112:56:16 | 600 | 1500 | 700 (256) | 9.6 | 33.6 | 256 | 57.6 | GDDR3 | 256 | 336 | 125 | |
style="text-align:left;" | GeForce 8800 GTS (G92)
| December 11, 2007 | G92 | 65 | 754 | 324 | 128:64:16 | 650 | 1625 | 970 | 10.4 | 41.6 | 512 | 62.1 | GDDR3 | 256 | 416 | 135 | |
style="text-align:left;" | GeForce 8800 GTX
| November 8, 2006 | G80 | 90 | 681 | 484 | rowspan="2" | PCIe 1.0 ×16 | 128:322:24 | 575 | 1350 | 900 | 13.8 | 36.8 | 768 | 86.4 | GDDR3 | 384 | 345.6 | 145 | |
style="text-align:left;" | GeForce 8800 Ultra
| May 2, 2007 | G80 | 90 | 681 | 484 | 128:322:24 | 612 | 1500 | 1080 | 14.7 | 39.2 | 768 | 103.7 | GDDR3 | 384 | 384 | 175 | |
rowspan="2" | Model
! rowspan="2" | Launch ! rowspan="2" | Codename ! rowspan="2" | Fab (nm) ! rowspan="2" | Transistors (million) ! rowspan="2" | Die size (mm2) ! rowspan="2" | Core config1 ! Core (MHz) ! Shader (MHz) ! Memory (MHz) ! Pixel (GP/s) ! Texture (GT/s) ! Size (MB) ! Bandwidth (GB/s) ! Bus type ! Bus width (bit) ! rowspan="2" | TDP (Watts) ! rowspan="2" | Comments |
colspan="3" | Clock rate
! colspan="2" | Fillrate ! colspan="4" | Memory ! Processing power (GFLOPS)3 |
=Features=
- Compute Capability 1.1: adds support for atomic functions, which are used to write thread-safe programs.
- Compute Capability 1.2: for details, see CUDA.
class="wikitable" style="font-size: 85%; text-align: center;" |
rowspan="2" | Model
! colspan="9" | Features |
---|
Scalable Link Interface (SLI) ! 3-Way ! PureVideo HD ! PureVideo 2 with VP2, BSP Engine, and AES128 Engine ! PureVideo 3 with VP3, BSP Engine, and AES128 Engine ! PureVideo 4 with VP4 ! Compute |
style="text-align:left;" | GeForce 8300 GS (G86)
| {{no}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8400 GS Rev. 2 (G98)
| {{no}} | {{no}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8400 GS Rev. 3 (GT218)
| {{no}} | {{no}} | {{no}} | {{no}} | {{no}} | {{yes}} | {{yes|1.2}} |
style="text-align:left;" | GeForce 8500 GT
| {{yes}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8600 GT
| {{yes}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8600 GTS
| {{yes}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8800 GS (G92)
| {{yes}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8800 GTS (G80)
| {{yes}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{no}} | {{no|1.0}} |
style="text-align:left;" | GeForce 8800 GTS Rev. 2 (G80)
| {{yes}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{no}} | {{no|1.0}} |
style="text-align:left;" | GeForce 8800 GT (G92)
| {{yes}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8800 GTS (G92)
| {{yes}} | {{no}} | {{no}} | {{yes}} | {{no}} | {{no}} | {{yes|1.1}} |
style="text-align:left;" | GeForce 8800 GTX
| {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{no}} | {{no|1.0}} |
style="text-align:left;" | GeForce 8800 Ultra
| {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{no}} | {{no|1.0}} |
GeForce 8M series
On May 10, 2007, Nvidia announced the availability of their GeForce 8 notebook GPUs through select OEMs. The lineup consists of the 8200M, 8400M, 8600M, 8700M and 8800M series chips.[http://www.nvidia.com/object/geforce_8m.html NVIDIA GeForce 8M Series], nvidia.com, May 10, 2007.
Nvidia announced that some of its graphics chips had a higher-than-expected failure rate due to overheating when used in particular notebook configurations. Some major laptop manufacturers adjusted fan settings and issued firmware updates to help delay the occurrence of any potential GPU failure. In late July 2008, Dell released a set of BIOS updates that made the laptop fans spin more frequently.[http://direct2dell.com/one2one/archive/2008/07/25/nvidia-gpu-update-for-dell-laptop-owners.aspx NVIDIA GPU Update for Dell Laptop Owners Fri. 25 Jul. 2008 ] As of mid-August 2008, Nvidia had not published any further details publicly, though it was heavily rumored that most, if not all, 8400 and 8600 chips had this issue.[https://www.engadget.com/2008/07/10/all-nvidia-8400m-8600m-chips-faulty/ "All NVIDIA 8400M 8600M chips faulty"]
=GeForce 8200M series=
The GeForce 8200M is an entry-level series of GeForce 8M GPUs. It can be found in some entry-level to mid-range laptops as an alternative to integrated graphics. The GeForce 8200M G is the only GPU in this series.
Its GPU core was based on the GeForce 9200M/9300M GS GPUs. This series was not designed for gaming, but rather for viewing high-definition video content. It could still play older games well, but might struggle with then-current games even at low settings.[https://www.notebookcheck.net/NVIDIA-GeForce-8200M-G.11360.0.html Notebookcheck: NVIDIA GeForce 8200M G]
Some HP Pavilion, Compaq Presario, and Asus laptops have GeForce 8200M G GPUs.
=GeForce 8400M series=
The GeForce 8400M is the entry-level series for the GeForce 8M chipset. Normally found on mid-range laptops as an alternative solution to integrated graphics, the 8400M was designed for watching high-definition video content rather than gaming.
Versions include the 8400M G, 8400M GS, and 8400M GT. As noted above, these were meant for non-gaming tasks such as high-definition video playback rather than gaming; however, the GDDR3-equipped 8400M GT could handle most games of its time at medium settings and was suitable for occasional gaming.[http://www.notebookcheck.net/NVidia-GeForce-8400M-GT.3708.0.html Notebookcheck: NVidia GeForce 8400M GT] The rest of the 8400M series handled older games quite well, but could only run then-current games at low settings.
Some ASUS and Acer laptops featured 8400M G GPUs. Some Acer Aspire models, some HP Pavilion dv2000, dv6000, dv9000 models, some Dell Vostro 1500 and 1700 models, the Dell XPS M1330, and some Sony VAIO models featured 8400M GS GPUs. Various Acer Aspire and Sony VAIO laptop models featured 8400M GT GPUs.
=GeForce 8600M series=
The GeForce 8600M was offered in mid-range laptops as a mid-range performance solution for enthusiasts who wanted to watch high-definition content such as Blu-ray Disc and HD DVD movies and play then-current and some future games at decent settings.
Versions include the 8600M GS and 8600M GT, the GT being the more powerful of the two. Thanks to the GDDR3 memory used in the higher-end 8600M models, they provided decent performance in then-current games.
It is available on the Dell XPS M1530 portable, some Dell Inspiron 1720 models, HP Pavilion dv9000 models, Asus G1S, Sony VAIO VGN-FZ21Z, select Lenovo IdeaPad models, some models of the Acer Aspire 5920, Acer Aspire 9920G and BenQ Joybook S41, the Mid 2007 to Late 2008 MacBook Pro, and some models of Fujitsu Siemens.
The common failure of this chip in, among others, MacBook Pros purchased between May 2007 and September 2008 was part of a class-action suit against Nvidia, which resulted in Apple providing an extended four-year warranty related to the issue{{cite web|url=http://support.apple.com/kb/TS2377 |title=MacBook Pro: Distorted video or no video issues |publisher=Apple Support |access-date=June 15, 2013 |url-status=dead |archive-url=https://web.archive.org/web/20100105062655/http://support.apple.com/kb/TS2377 |archive-date=January 5, 2010 }} after confirming that the issue was caused by the Nvidia chip itself.{{cite web | url=http://appleinsider.com/articles/08/10/10/apple_says_some_macbook_pros_affected_by_faulty_nvidia_chips.html | title=Apple says some MacBook Pros affected by faulty Nvidia chips | publisher=appleinsider | date=October 10, 2008 | access-date=June 15, 2013}} This warranty replacement service was expected to cost Nvidia around $150 to $200 million,{{cite web|url=http://edgar.secdatabase.com/1388/119312508145974/filing-main.htm |title=NVIDIA CORP, Form 8-K, Current Report, Filing Date Jul 2, 2008 |publisher=secdatabase.com |access-date =May 15, 2018}} and the company lost over $3 billion in market capitalisation after being sued by its own shareholders for attempting to cover the issue up.{{cite web | url=http://www.tomshardware.com/news/Nvidia-lawsuit-chip,6345.html | title=Nvidia's Own Shareholders Sue for Coverup | publisher=tomshardware.com | author=Jane McEntegart | date=September 10, 2008 |access-date=June 15, 2013}}
=GeForce 8700M series=
The GeForce 8700M was developed for the mid-range market. The 8700M GT is the only GPU in this series.
This chipset is available on high-end laptops such as the Dell XPS M1730, Sager NP5793, and Toshiba Satellite X205.
While this card is considered by most in the field to be a decent mid-range card, it is hard to classify the 8700M GT as a high-end card: its 128-bit memory bus is a limitation, and it is essentially an overclocked 8600M GT GDDR3 mid-range card.[http://www.reghardware.co.uk/2007/06/13/nvidia_clocks_up_8600m_as_8700m/ Nvidia clocks up GeForce 8600M for 8700M GT] {{Webarchive|url=https://web.archive.org/web/20080414183519/http://www.reghardware.co.uk/2007/06/13/nvidia_clocks_up_8600m_as_8700m/ |date=April 14, 2008 }}, Register Hardware, June 13, 2007. However, it shows strong performance in a dual-card SLI configuration, and provides decent gaming performance in a single-card configuration.[http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html Mobile Graphics Cards - Benchmark List], Notebookcheck.
=GeForce 8800M series=
The GeForce 8800M was developed to succeed the 8700M in the high-end market, and can be found in high-end gaming notebook computers.
Versions include the 8800M GTS and 8800M GTX. These were released as the first truly high-end mobile GeForce 8 series GPUs, each with a 256-bit memory bus and a standard 512 megabytes of GDDR3 memory, and provided high-end gaming performance equivalent to many desktop GPUs. In SLI, they could produce 3DMark06 results in the high thousands.
Laptop models which included 8800M GPUs were the Sager NP5793, Sager NP9262, Alienware m15x and m17x, HP HDX9000, and Dell XPS M1730. Clevo also manufactured similar laptop models for CyberPower, Rock, and Sager (among others), all with the 8800M GTX, while the Gateway P-6831 FX and P-6860 FX models included the 8800M GTS.
The 8800M GTS was used in modified form as the GeForce 8800 GS in the early 2008 iMac models.
=Technical summary=
{| class="wikitable" style="font-size: 85%; text-align: center; width: 100%;"
|-
! rowspan=2 style="width:12em" | Model
! rowspan=2 | Release Date
! rowspan=2 | Codename
! rowspan=2 | Fabrication process (nm)
! rowspan=2 | Core clock max (MHz)
! colspan=3 | Peak fillrate
! colspan=2 | Shaders
! colspan=5 | Memory
! rowspan=2 | Power Consumption (Watts)
! rowspan=2 | Transistor Count (Millions)
! rowspan=2 | Theoretical Shader Processing Rate (Gigaflops)
|-
! billion pixel/s
! billion texel/s
! billion bilinear FP16 texel/s
! Stream Processors
! Clock (MHz)
! Bandwidth max (GB/s)
! DRAM type
! Bus width (bit)
! Megabytes
! Effective DDR Clock (MHz)
|- valign="top"
! style="text-align:left;" | GeForce 8200M G
| June 2008 || MCP77MV MCP79MV || 80 || 350/500 || 3 || ? || ? || 8 || 1200 || ? || DDR2 || 64 || 256 || ? || ? || ? || 19
|- valign="top"
! style="text-align:left;" | GeForce 8400M G
| May 10, 2007 || G86M || 80 || 400 || 3.2 || 3.2 || 1.6 || 8 || 800 || 6.4 || GDDR3 || 64 || 128/256 || 1200 || 15 || 210 || 19.2
|- valign="top"
! style="text-align:left;" | GeForce 8400M GS
| May 10, 2007 || G86M || 80 || 400 || 3.2 || 3.2 || 1.6 || 16 || 800 || 6.4 || GDDR2/GDDR3 || 64 || 64/128/256 || 1200 || 15 || 210 || 38.4
|- valign="top"
! style="text-align:left;" | GeForce 8400M GT
| May 10, 2007 || G86M || 80 || 450 || 3.6 || 3.6 || 1.8 || 16 || 900 || 19.2 || GDDR3 || 128 || 128/256/512 || 1200 || 17 || 210 || 43.2
|- valign="top"
! style="text-align:left;" | GeForce 8600M GS
| May 10, 2007 || G84M || 80 || 600 || 4.8 || 4.8 || 2.4 || 16 || 1200 || 12.8/22.4 || DDR2/GDDR3 || 128 || 128/256/512 || 800/1400 || 19 || 210 || 57.6
|- valign="top"
! style="text-align:left;" | GeForce 8600M GT
| May 10, 2007 || G84M || 80 || 475 || 3.8 || 7.6 || 3.8 || 32 || 950 || 12.8/22.4 || DDR2/GDDR3 || 128 || 128/256/512 || 800/1400 || 22 || 289 || 91.2
|- valign="top"
! style="text-align:left;" | GeForce 8700M GT
| June 12, 2007 || G84M || 80 || 625 || 5.0 || 10.0 || 5.0 || 32 || 1250 || 25.6 || GDDR3 || 128 || 256/512 || 1600 || 29 || 289 || 120.0
|- valign="top"
! style="text-align:left;" | GeForce 8800M GTS[http://www.nvidia.com/object/geforce_8800m.html NVIDIA GeForce 8800M], NVIDIA.com, November 19, 2007.
| November 19, 2007 || G92M || 65 || 500 || 8.0 || 16.0 || 8.0 || 64 || 1250 || 51.2 || GDDR3 || 256 || 512 || 1600 || 35 || 754 || 240.0
|- valign="top"
! style="text-align:left;" | GeForce 8800M GTX[http://www.tt-hardware.com/modules.php?name=News&file=article&sid=11006 NVIDIA GeForce 8800GTX Next week], tt-hardware.com, November 14, 2007. || November 19, 2007 || G92M || 65 || 500 || 12.0 || 24.0 || 12.0 || 96 || 1250 || 51.2 || GDDR3 || 256 || 512 || 1600 || 37 || 754 || 360.0
|}
The series was succeeded by the GeForce 9 series (which in turn was succeeded by the GeForce 200 series). The GeForce 8400M GS was the only exception, as it was not rebranded into either the GeForce 9 or the GeForce 200 series.
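The derived columns in the table above follow from simple arithmetic on the listed specifications. A minimal sketch (the 3-ops-per-clock factor is the commonly cited MAD+MUL assumption for this GPU generation, not something stated in the table itself):

```python
def memory_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective DDR clock times bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

def shader_gflops(stream_processors: int, shader_clock_mhz: float,
                  ops_per_clock: int = 3) -> float:
    """Theoretical shader rate in GFLOPS, assuming MAD + MUL = 3 ops per clock."""
    return stream_processors * shader_clock_mhz * ops_per_clock / 1000

# GeForce 8800M GTS: 1600 MHz effective GDDR3 on a 256-bit bus, 64 SPs at 1250 MHz
print(memory_bandwidth_gbs(1600, 256))  # 51.2, matching the table
print(shader_gflops(64, 1250))          # 240.0, matching the table
```

The same formulas reproduce the other rows, e.g. the 8700M GT's 25.6 GB/s (1600 MHz on 128 bits) and 120.0 GFLOPS (32 SPs at 1250 MHz).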
Problems
Some chips of the GeForce 8 series (specifically those from the G84 [for example, G84-600-A2] and G86 series) suffer from an overheating problem. Nvidia states that the issue should not affect many chips,[https://arstechnica.com/news.ars/post/20080716-nvidia-denies-rumors-of-mass-gpu-failures.html arstechnica.com] whereas others assert that all chips in these series are potentially affected. Nvidia CEO Jen-Hsun Huang and CFO Marvin Burkett were involved in a lawsuit filed on September 9, 2008, alleging their knowledge of the flaw and their intent to hide it.{{Cite web |url=http://www.infoworld.com/article/08/09/10/Lawsuit_claims_Nvidia_hid_serious_flaw_in_graphics_chips_1.html |title=Info World Article |access-date=September 11, 2008 |archive-url=https://web.archive.org/web/20080912214026/http://www.infoworld.com/article/08/09/10/Lawsuit_claims_Nvidia_hid_serious_flaw_in_graphics_chips_1.html |archive-date=September 12, 2008 |url-status=dead }}
End-of-life driver support
Nvidia ceased Windows driver support for the GeForce 8 series on April 1, 2016.[http://nvidia.custhelp.com/app/answers/detail/a_id/3473 EOL driver support for legacy products]
- Windows XP 32-bit & Media Center Edition: version 340.52 released on July 29, 2014; [http://www.nvidia.com/download/driverResults.aspx/77225/en-us Download]
- Windows XP 64-bit: version 340.52 released on July 29, 2014; [http://www.nvidia.com/download/driverResults.aspx/77226/en-us Download]
- Windows Vista, 7, 8, 8.1 32-bit: version 342.01 (WHQL) released on December 14, 2016; [http://www.nvidia.com/download/driverResults.aspx/112593/en-us Download]
- Windows Vista, 7, 8, 8.1 64-bit: version 342.01 (WHQL) released on December 14, 2016; [http://www.nvidia.com/download/driverResults.aspx/112594/en-us Download]
- Windows 10, 32-bit: version 342.01 (WHQL) released on December 14, 2016; [http://www.nvidia.com/download/driverResults.aspx/112595/en-us Download]
- Windows 10, 64-bit: version 342.01 (WHQL) released on December 14, 2016; [http://www.nvidia.com/download/driverResults.aspx/112596/en-us Download]
See also
- Comparison of Nvidia graphics processing units
- GeForce 7 series
- GeForce 9 series
- GeForce 100 series
- GeForce 200 series
- GeForce 300 series
- Nvidia Quadro - Nvidia workstation graphics system
- Nvidia Tesla - Nvidia's first dedicated general-purpose GPU (GPGPU) product line
References
{{reflist|2}}
External links
{{Commons category|Nvidia GeForce 8 series video cards|GeForce 8 series}}
- [http://www.nvidia.com/page/geforce8.html NVIDIA's GeForce 8 series page]
- [http://www.nvidia.com/page/geforce_8800.html Nvidia GeForce 8800 Series]
- [http://www.nvidia.com/object/geforce_8600.html Nvidia GeForce 8600 Series]
- [http://www.nvidia.com/object/geforce_8500.html Nvidia GeForce 8500 Series]
- [http://www.nvidia.com/object/geforce_8400.html Nvidia GeForce 8400 Series]
- [http://www.nvidia.com/object/geforce_8800M.html Nvidia GeForce 8800M Series]
- [http://www.nvidia.com/object/geforce_8600M.html Nvidia GeForce 8600M Series]
- [http://www.nvidia.com/object/geforce_8400M.html Nvidia GeForce 8400M Series]
- [http://developer.nvidia.com/nvidia-nsight-visual-studio-edition Nvidia Nsight]
- [http://www.nvidia.com/download/driverResults.aspx/77225/en-us Nvidia GeForce Drivers for the GeForce 8x00 series (v. 340.52)]
- [http://www.nvidia.com/object/IO_37100.html NVIDIA GeForce 8800 GPU Architecture Overview] - a somewhat longer and more detailed document about the new 8800 features
- [http://developer.download.nvidia.com/opengl/specs/g80specs.pdf OpenGL Extension Specifications for the G8x]
{{Nvidia}}
{{DEFAULTSORT:Geforce 8 series}}