GPUs with 16+ GB VRAM

Browse 38 GPUs with at least 16 GB of VRAM that can run LLM models locally. Compare VRAM, memory bandwidth, and AI performance.


Which GPU Do You Need for AI?

The amount of VRAM is the most important specification for running LLMs locally: the model's weights must fit in memory. Most 7B-parameter models require 4–8 GB of VRAM at common quantization levels, while 70B models need 24–48 GB. Memory bandwidth determines how fast the model generates tokens, because each generated token requires reading the full set of weights; higher bandwidth means faster responses.
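The sizing rules above can be sketched in a few lines. This is a rough back-of-the-envelope estimate, not vendor data: the `overhead` factor for KV cache and activations is an assumption, and the token-rate figure is only a memory-bound ceiling that real inference stacks approach but do not reach.

```python
def model_vram_gb(params_b: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Estimate VRAM in GB: weights (params × bits / 8) plus an assumed
    ~20% overhead for KV cache and activations."""
    return params_b * bits_per_weight / 8 * overhead

def max_tokens_per_s(model_gb: float, bandwidth_gb_s: float) -> float:
    """Decoding is memory-bound: each token reads every weight once,
    so bandwidth divided by model size bounds tokens per second."""
    return bandwidth_gb_s / model_gb

# A 7B model at 4-bit quantization:
size = model_vram_gb(7, 4)              # ≈ 4.2 GB, fits a 16 GB card easily
rate = max_tokens_per_s(size, 1008.0)   # ceiling on an RTX 4090-class card
```

This is why a 24 GB card with ~1 TB/s of bandwidth feels dramatically faster than a 16 GB card with ~300 GB/s, even when both can hold the same quantized model.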

GPU List

AMD Instinct MI210

AMD · CDNA 2

64 GB
1638.4 GB/s · 6,656 SP · 300W TDP

AMD Instinct MI250X

AMD · CDNA 2

128 GB
3276.8 GB/s · 14,080 SP · 560W TDP

AMD Instinct MI300X

AMD · CDNA 3

192 GB
5300.0 GB/s · 19,456 SP · 750W TDP

AMD Radeon PRO W7800

AMD · RDNA 3

32 GB
576.0 GB/s · 4,480 SP · 260W TDP · $2,499

AMD Radeon PRO W7900

AMD · RDNA 3

48 GB
864.0 GB/s · 6,144 SP · 295W TDP · $3,999

AMD Radeon RX 6800

AMD · RDNA 2

16 GB
512.0 GB/s · 3,840 SP · 250W TDP · $579

AMD Radeon RX 6800 XT

AMD · RDNA 2

16 GB
512.0 GB/s · 4,608 SP · 300W TDP · $649

AMD Radeon RX 6900 XT

AMD · RDNA 2

16 GB
512.0 GB/s · 5,120 SP · 300W TDP · $999

AMD Radeon RX 7800 XT

AMD · RDNA 3

16 GB
624.0 GB/s · 3,840 SP · 263W TDP · $499

AMD Radeon RX 7900 XT

AMD · RDNA 3

20 GB
800.0 GB/s · 5,376 SP · 315W TDP · $899

AMD Radeon RX 7900 XTX

AMD · RDNA 3

24 GB
960.0 GB/s · 6,144 SP · 355W TDP · $999

Intel Arc A770 16GB

Intel · Alchemist

16 GB
560.0 GB/s · 225W TDP · $349

NVIDIA A100 40GB PCIe

NVIDIA · Ampere

40 GB
1555.0 GB/s · 6,912 CUDA · 250W TDP

NVIDIA A100 80GB SXM

NVIDIA · Ampere

80 GB
2039.0 GB/s · 6,912 CUDA · 400W TDP

NVIDIA A40

NVIDIA · Ampere

48 GB
696.0 GB/s · 10,752 CUDA · 300W TDP

NVIDIA GeForce RTX 3090

NVIDIA · Ampere

24 GB
936.2 GB/s · 10,496 CUDA · 350W TDP · $1,499

NVIDIA GeForce RTX 3090 Ti

NVIDIA · Ampere

24 GB
1008.0 GB/s · 10,752 CUDA · 450W TDP · $1,999

NVIDIA GeForce RTX 4060 Ti 16GB

NVIDIA · Ada Lovelace

16 GB
288.0 GB/s · 4,352 CUDA · 165W TDP · $499

NVIDIA GeForce RTX 4070 Ti SUPER

NVIDIA · Ada Lovelace

16 GB
672.0 GB/s · 8,448 CUDA · 285W TDP · $799

NVIDIA GeForce RTX 4080

NVIDIA · Ada Lovelace

16 GB
716.8 GB/s · 9,728 CUDA · 320W TDP · $1,199

NVIDIA GeForce RTX 4080 SUPER

NVIDIA · Ada Lovelace

16 GB
736.0 GB/s · 10,240 CUDA · 320W TDP · $999

NVIDIA GeForce RTX 4090

NVIDIA · Ada Lovelace

24 GB
1008.0 GB/s · 16,384 CUDA · 450W TDP · $1,599

NVIDIA GeForce RTX 5070 Ti

NVIDIA · Blackwell

16 GB
896.0 GB/s · 8,960 CUDA · 300W TDP · $749

NVIDIA GeForce RTX 5080

NVIDIA · Blackwell

16 GB
960.0 GB/s · 10,752 CUDA · 360W TDP · $999

NVIDIA GeForce RTX 5090

NVIDIA · Blackwell

32 GB
1792.0 GB/s · 21,760 CUDA · 575W TDP · $1,999

NVIDIA H100 PCIe

NVIDIA · Hopper

80 GB
2039.0 GB/s · 14,592 CUDA · 350W TDP

NVIDIA H100 SXM

NVIDIA · Hopper

80 GB
3352.0 GB/s · 16,896 CUDA · 700W TDP

NVIDIA L4

NVIDIA · Ada Lovelace

24 GB
300.0 GB/s · 7,424 CUDA · 72W TDP

NVIDIA L40

NVIDIA · Ada Lovelace

48 GB
864.0 GB/s · 18,176 CUDA · 300W TDP

NVIDIA L40S

NVIDIA · Ada Lovelace

48 GB
864.0 GB/s · 18,176 CUDA · 350W TDP