Nvidia was founded in 1993, but it wasn't until 1995 that the company released its first graphics product - the NV1.

Things have changed a lot since then and we're now spoiled with ray tracing, DLSS and other wonderful technologies that make our gaming experiences even more enjoyable.

We're taking a look back through the history of Nvidia graphics cards and what these devices have looked like over the years.

Nvidia NV1

The Nvidia NV1 launched in 1995 and was able to handle both 2D and 3D graphics. It was sold by Diamond Multimedia as the "Diamond Edge 3D" and even sported a Sega Saturn-compatible joypad port.

Several Sega Saturn games were ported to PC, including Panzer Dragoon and Virtua Fighter Remix. Those features alone weren't enough to appeal to the market, though, as the Saturn was struggling to compete with the original PlayStation.

The NV1 was off to a rough start, and things were made worse by the release of Microsoft DirectX, which was built around triangle polygons rather than the quadratic surfaces the NV1 rendered, meaning many games simply would not run on the card.

Nvidia RIVA 128

In 1997, Nvidia released the NV3, aka the RIVA 128; RIVA stood for "Real-time Interactive Video and Animation". This graphics card offered both 2D and 3D acceleration along with polygon texture mapping.

At the time, Nvidia's rival 3dfx was dominating the market, but the NV3 had a 100MHz core/memory clock and essentially doubled the specs of 3dfx's Voodoo 1.

There were two variants of the NV3: the RIVA 128 and the RIVA 128ZX. The latter was more capable, with 8MB of VRAM and a 250MHz RAMDAC.

The NV3 was far more successful than the company's first GPU and helped Nvidia gain widespread popularity.

Nvidia NV4

In 1998, Nvidia unleashed the NV4, aka the RIVA TNT. This graphics card improved on previous models by adding support for 32-bit true colour. The NV4 also carried more memory, 16MB of SDR SDRAM, which meant it offered greater performance too.

It was around this time that Nvidia started regularly updating its graphics drivers to ensure good performance and compatibility for the end user, something the company still does to this day.

Nvidia's RIVA TNT was more affordable at the time than 3dfx's Voodoo2, if a little slower in terms of performance. That driver support was key to the NV4's success.

Nvidia NV5

In 1999, Nvidia followed up the NV4 with the RIVA TNT2. The NV5, as it was codenamed, brought a number of updates, including 32-bit Z-buffer/stencil support, up to 32MB of VRAM and 2048 x 2048 texture support.

More importantly, this graphics card had improved clock speeds (up to 150MHz and beyond), which gave it as much as a 17 per cent performance boost over the previous model.

This card was a direct competitor to the 3dfx Voodoo3 and both were incredibly popular.

Nvidia GeForce 256

Late in 1999, Nvidia released what it pitched as the "world's first GPU" in the form of the Nvidia GeForce 256.

This was a clever marketing move from Nvidia and the beginning of a love affair with GeForce-branded graphics cards for years to come.

It improved upon previous RIVA cards by increasing the pixel pipelines and adding hardware transform and lighting (T&L) - the feature behind that "first GPU" pitch - and it offered a big leap in performance for PC gaming.

This card supported up to 64MB of DDR SDRAM, with memory running at up to 166MHz. As such, it was up to 50 per cent faster than the NV5.

More importantly, the GeForce 256 also fully supported Direct3D 7, which meant it could power many of the best PC games available at the time.

Not long after this launch, 3dfx went bankrupt and ATI became Nvidia's main competitor.

Nvidia GeForce2

Nvidia followed up the world's first GPU with the aptly named GeForce2.

This GPU came in several different variants, including Ti, Pro, GTS and Ultra models (essentially the NV11, NV15 and NV16 chips). Releases throughout 2000 and 2001 brought more pipelines and higher clock rates.

What made the GeForce2 particularly interesting was the start of support for multi-monitor setups.

It was also around this time that Nvidia acquired 3dfx.

Nvidia GeForce3

Shortly after GeForce2 came GeForce3. Codenamed NV20, this series of graphics cards represented Nvidia's first DirectX 8 compatible GPUs.

There were three versions - the GeForce3, GeForce3 Ti 200 and GeForce3 Ti 500. This next GeForce GPU added programmable pixel and vertex shaders and multisample anti-aliasing into the mix.

A derivative of the GeForce3, referred to as the NV2A, was used in the original Xbox console.

Nvidia GeForce FX series

Leap forward a couple of generations to 2003 and the release of Nvidia's GeForce FX series.

These were the fifth generation of GeForce graphics cards and supported Direct3D 9 as well as a number of new memory technologies, including DDR2, GDDR2 and GDDR3, alongside Nvidia's first attempt at a memory data bus wider than 128 bits.

Meanwhile, the GeForce FX 5800 made waves for being the first GPU equipped with an unusually large cooler, one that generated so much fan noise the card was often called the "dustbuster".

Nvidia GeForce 6 series

Shortly after the release of the GeForce FX series came the 6 series (aka NV40). The GeForce 6 series was the start of Nvidia pushing its SLI technology, which allowed people to combine more than one graphics card in a single system for more power.

The flagship of this range was the GeForce 6800 Ultra, a graphics card which boasted 222 million transistors, 16 superscalar pixel pipelines and six vertex shaders. It supported Shader Model 3.0 and was compliant with both Microsoft DirectX 9.0c and OpenGL 2.0.

The series also featured Nvidia PureVideo technology and was able to decode H.264, VC-1, WMV and MPEG-2 videos with reduced CPU use.

As a result of all this, the GeForce 6 series was highly successful.

Nvidia GeForce 7 series

In 2005 came the GeForce 7 series, highlighted by the 7800 GTX.

That card alone was a real powerhouse for its time and, with some clever cooling, Nvidia was able to push the clock speed to 550MHz. At the same time, the company also managed to reduce memory latency and double memory capacity to 512MB on the top-end model.

Interestingly, a version of the 7 series was crafted into the RSX Reality Synthesizer, the proprietary GPU co-developed by Nvidia and Sony for the PlayStation 3.

Nvidia GeForce 8 series

In 2006, Nvidia released the GeForce 8 series and unveiled its Tesla microarchitecture. This was the company's first unified shader design, and it would become one of its most widely used architectures in future cards too.

The much-loved Nvidia GeForce 8800 GTX was the flagship of the range and was incredibly popular. It boasted 681 million transistors, had 768MB of GDDR3 memory and came with 128 stream processors, with a core clock of 575MHz.

Most significantly, this graphics card could run Crysis, which was all that PC gamers wanted at the time.

Things were also heating up in 2006 as AMD purchased ATI for $5.4 billion, becoming a thorn in Nvidia's side for years to come.

Nvidia GeForce 9 series

At the start of 2008, Nvidia released the GeForce 9 series. These cards continued to use the Tesla architecture but also added PCIe 2.0 support along with improved colour and z-compression.

By this time, Nvidia was pushing performance up a notch, with clock speeds hitting up to 675MHz, while power consumption came down as well.

Nvidia GeForce 400 series

The next significant update came in 2010, when Nvidia launched the GeForce 400 series. This was when Nvidia revealed Fermi, the company's next major microarchitecture.

This series also saw support for OpenGL 4.0 and Direct3D 11, and it was a direct competitor to the Radeon HD 5000 series.

However, the series was let down by high running temperatures and power consumption which caused a lot of grumbles from users.

Nvidia GeForce 600 series

The GeForce 600 series, which arrived in 2012, introduced Nvidia's Kepler architecture, designed to increase performance per watt while also improving upon the performance of the previous Fermi microarchitecture.

With Kepler, Nvidia managed to push the effective memory clock to 6GHz. It also added GPU Boost, which guaranteed the GPU could always run at a minimum clock speed but would also boost performance when needed until it hit a pre-defined power target.
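As a rough illustration of how that works, here's a minimal Python sketch of a GPU Boost-style scheme; it's not Nvidia's actual algorithm, and the clock and power figures are hypothetical:

```python
# Minimal sketch of the GPU Boost idea (illustrative only; the real
# algorithm is Nvidia's own, and these clock/power figures are made up).
BASE_CLOCK_MHZ = 1006   # guaranteed minimum clock (hypothetical)
MAX_BOOST_MHZ = 1110    # upper boost limit (hypothetical)
POWER_TARGET_W = 170    # pre-defined power target (hypothetical)

def boost_clock(board_power_w: float) -> int:
    """Return a clock at or above base: boost while measured board
    power is under the power target, otherwise hold the base clock."""
    if board_power_w >= POWER_TARGET_W:
        return BASE_CLOCK_MHZ
    # Convert the remaining power headroom into extra clock speed.
    headroom = 1.0 - board_power_w / POWER_TARGET_W
    return min(MAX_BOOST_MHZ,
               BASE_CLOCK_MHZ + round(headroom * (MAX_BOOST_MHZ - BASE_CLOCK_MHZ)))

print(boost_clock(120))  # light load: clock boosts above base
print(boost_clock(175))  # at/over target: clock held at base
```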

This range also supported both Direct3D 11 and Direct3D 12, and it introduced a new anti-aliasing method known as TXAA.

Nvidia GeForce 700 series

In 2013, Nvidia kicked things up a notch with the 700 series, which was topped off by the insanely high-end enthusiast GTX Titan card.

This series was a refresh of the Kepler microarchitecture, though some later cards were based on the Fermi and Maxwell architectures.

The GTX Titan boasted 2,688 CUDA cores, 224 TMUs and 6GB of RAM. By this time, Nvidia was managing to squeeze as many as seven billion transistors into its GPUs.

The GeForce 700 series was designed to maximise energy efficiency, but it also included other features such as hardware H.264 encoding, a PCI Express 3.0 interface, support for DisplayPort 1.2 and HDMI 1.4a 4K x 2K video output, as well as GPU Boost 2.0.

Nvidia GeForce 900 series

In 2014, Nvidia skipped the 800 series on desktop (the name was reserved for laptop GPUs) and went straight to the GeForce 900 series. This was the introduction of the Maxwell microarchitecture, which offered improved graphics capabilities as well as better energy efficiency.

This series had the GeForce GTX 980 as its initial flagship, later followed by the 980 Ti and GeForce GTX TITAN X at the extreme end.

NVENC was also improved for this series, and driver support eventually arrived for Direct3D feature level 12_1, OpenGL 4.6, OpenCL 3.0 and Vulkan 1.3.

Nvidia GeForce 10 series

In 2016, Nvidia revealed the GeForce 10 series, based on the Pascal microarchitecture. The GTX 1080 Ti is perhaps the best-known GPU of the series and cemented itself in history as one of the most significant cards Nvidia has released.

This graphics card dominated the market, offering such superb performance and energy efficiency for the money that it became a frequent point of comparison for future GPUs.

The 10 series also included new features such as GPU Boost 3.0, a dynamic load balancing scheduling system, triple buffering and support for both DisplayPort 1.4 and HDMI 2.0b.

Nvidia GeForce 20 series

The GeForce 20 series was released in 2018 and introduced the Turing microarchitecture to the world.

Notably, these graphics cards were the first generation of RTX cards, and they saw Nvidia pushing real-time ray tracing as the main selling point.

The use of dedicated RT and Tensor Cores and other enhancements helped these cards deliver a massive leap in graphical prowess, resulting in realistic lighting and convincing reflection effects in-game.

Improvements came at a cost though, with the RTX 2080 Ti retailing for an eye-watering $1,199 (compared to the $699 of the 1080 Ti).

Nvidia GeForce 30 series

The GeForce 30 series succeeded the 20 series in 2020 and unfortunately became best known for simply being impossible to get hold of due to the silicon shortage.

Still, the Nvidia RTX 3080 was meant to retail for $699, making it much more affordable than the previous generation's flagship, and the series brought significant improvements too.

The flagship GeForce RTX 3090 Ti, announced in March 2022, packed 10,752 CUDA cores and was rated for 78 RT-TFLOPs, 40 Shader-TFLOPs and 320 Tensor-TFLOPs of power. This powerful beast was also notoriously power-hungry, requiring at least 850 watts, meaning you'd likely need a 1,000-watt PSU to run it.

Nvidia GeForce 40 series

Nvidia has constantly been trying to push the boundaries with each iteration. With the RTX 40-series, one user proved it was possible to run games at 13K resolution.

This was the first series of GPUs to use the 12VHPWR connector, which led to some well-publicised problems with melting connectors, but the cards otherwise mostly earned glowing reviews.

The flagship card of this range, the RTX 4090, had 24GB of GDDR6X, 16,384 CUDA cores and some serious processing power. With these cards, Nvidia also introduced DLSS 3.