# Cloud GPU Pricing Index
Per-GPU-hour pricing for H100, H200, B200, A100, and L40S across 12 cloud providers. On-demand and spot rates in USD, verified monthly. H100 on-demand range: $1.87–$6.15/hr, verified 2026-02-28.
## What Is the Cheapest Cloud Provider for H100 GPUs?
| Provider | Instance | On-Demand $/GPU-hr | Spot $/GPU-hr | Min GPUs | Region | Verified |
|---|---|---|---|---|---|---|
| Vast.ai | Marketplace | $1.87–$3.50 | — | 1 | Various | 2026-02-28 |
| GMI Cloud | H100 SXM Container | $2.10 | — | 1 | US | 2026-02-28 |
| RunPod | Community Cloud | $2.49 | $1.89 | 1 | Various | 2026-02-28 |
| Lambda Labs | On-demand | $2.99 | — | 8 | us-west | 2026-02-28 |
| Google Cloud (A3-High) | a3-highgpu-8g | $3.67 | $2.25 | 8 | us-central1 | 2026-02-28 |
| AWS (P5) | p5.48xlarge | $3.93 | $2.50 | 8 | us-east-1 | 2026-02-28 |
| Azure ND H100 v5 | Standard_ND96isr_H100_v5 | $3.50–$5.00 | — | 8 | East US | 2026-02-28 |
| CoreWeave | HGX H100 | $6.15 | — | 8 | US-East | 2026-02-28 |
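The spread in the table above compounds quickly over a multi-day job. A minimal Python sketch of the arithmetic, using the on-demand rates from the table (hardcoded here; verify current pricing on each provider's page before committing):

```python
# Estimate total job cost: rate ($/GPU-hr) x GPU count x wall-clock hours.
# Rates are the on-demand H100 figures from the table above.
H100_ON_DEMAND = {
    "RunPod": 2.49,
    "Lambda Labs": 2.99,
    "Google Cloud": 3.67,
    "AWS": 3.93,
    "CoreWeave": 6.15,
}

def job_cost(rate_per_gpu_hr: float, gpus: int, hours: float) -> float:
    """Total cost in USD for a job running on `gpus` GPUs for `hours` hours."""
    return rate_per_gpu_hr * gpus * hours

# Example: a 72-hour fine-tune on 8 GPUs, cheapest provider first.
for provider, rate in sorted(H100_ON_DEMAND.items(), key=lambda kv: kv[1]):
    print(f"{provider:14s} ${job_cost(rate, 8, 72):,.2f}")
```

At these rates the same 8×H100, 72-hour job ranges from roughly $1,400 (RunPod) to over $3,500 (CoreWeave).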
## NVIDIA H200 141GB — Cloud Pricing
Depending on provider, H200 pricing runs from roughly 3% (CoreWeave) to 60% (GMI Cloud) above H100, and only about 10% at Lambda Labs, yet the H200 offers 1.76× the memory (141 GB vs 80 GB) and 1.43× the memory bandwidth, making it almost always better value for 70B+ parameter models.
| Provider | Instance | On-Demand $/GPU-hr | Min GPUs | Verified |
|---|---|---|---|---|
| Lambda Labs | H200 On-demand | $3.29 | 1 | 2026-02-28 |
| GMI Cloud | H200 Container | $3.35 | 1 | 2026-02-28 |
| RunPod | Community Cloud | $3.59 | 1 | 2026-02-28 |
| CoreWeave | HGX H200 | $6.31 | 8 | 2026-02-28 |
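One way to make the value claim concrete is to normalize hourly price by GPU memory. A short sketch using the Lambda Labs rates from the tables above and the published memory sizes (80 GB H100 SXM, 141 GB H200):

```python
# Normalize hourly price by GPU memory to compare H100 vs H200 value.
# Prices are Lambda Labs on-demand rates from the tables above.
GPUS = {
    "H100": {"price_hr": 2.99, "mem_gb": 80},
    "H200": {"price_hr": 3.29, "mem_gb": 141},
}

for name, g in GPUS.items():
    per_gb = g["price_hr"] / g["mem_gb"]
    print(f"{name}: ${per_gb:.4f}/GB-hr")
# The H200 works out cheaper per GB of HBM despite the higher sticker price.
```

For memory-bound inference workloads, $/GB-hr is often a better comparison metric than raw $/GPU-hr.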
## NVIDIA A100 80GB — Cloud Pricing
| Provider | Instance | On-Demand $/GPU-hr | Spot $/GPU-hr | Min GPUs | Verified |
|---|---|---|---|---|---|
| Vast.ai | Marketplace | $0.80–$1.50 | — | 1 | 2026-02-28 |
| RunPod | A100 SXM | $1.39 | $0.79 | 1 | 2026-02-28 |
| Lambda Labs | A100 On-demand | $1.79 | — | 8 | 2026-02-28 |
| AWS (P4d) | p4d.24xlarge | $2.75 | $1.40 | 8 | 2026-02-28 |
## Frequently Asked Questions About Cloud GPU Pricing
### What is the cheapest H100 cloud provider?
The cheapest H100 cloud provider is Vast.ai at $1.87–$3.50/hr on its marketplace, where pricing and availability vary by host. For reliable on-demand capacity, RunPod ($2.49/hr) and Lambda Labs ($2.99/hr) are the next cheapest. Data verified February 2026.
### How much does AWS charge for H100 GPUs?
AWS charges $3.93/GPU-hour on-demand for H100 (P5, us-east-1). AWS cut H100 pricing by 44% in June 2025. Spot: ~$2.50/hr. 1-year reservation: $1.90–$2.10/hr.
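The gap between on-demand and reserved rates implies a utilization break-even: a reservation is billed for every hour of the term whether you use it or not. A rough sketch, assuming the rates quoted above and taking $2.00/hr as the midpoint of the reserved range:

```python
# Break-even utilization between AWS H100 on-demand and a 1-year reservation.
# Assumes the reservation is billed for all 8,760 hours of the year,
# while on-demand is billed only for hours actually used.
ON_DEMAND = 3.93   # $/GPU-hr, P5 us-east-1
RESERVED = 2.00    # $/GPU-hr, midpoint of the quoted $1.90-$2.10 range
HOURS_PER_YEAR = 8760

breakeven_hours = RESERVED * HOURS_PER_YEAR / ON_DEMAND
print(f"Reservation pays off above {breakeven_hours:.0f} hrs/yr "
      f"({breakeven_hours / HOURS_PER_YEAR:.0%} utilization)")
```

In other words, if you expect the GPUs to run more than about half the year, the 1-year reservation is the cheaper option at these rates.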
### Should I use H100 or H200 for cloud inference?
For models above 50B parameters, the H200 is almost always better value. Lambda Labs charges $3.29/hr for H200 vs $2.99/hr for H100: a ~10% premium for 76% more memory (141 GB vs 80 GB). A single H200 fits the FP16 weights of Llama 3 70B (~140 GB), where two H100s would be required, though KV cache and activations leave little headroom on one card.
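The "fits on one GPU" arithmetic above is just bytes per parameter. A minimal sketch of the weights-only check (FP16 = 2 bytes/parameter; real deployments need extra headroom for KV cache and activations):

```python
import math

# Rough check of whether a model's FP16 weights fit in GPU memory.
# Weights only: KV cache and activations need additional headroom, so a
# borderline fit (like 70B on one 141 GB H200) is tight in practice.
def fp16_weight_gb(params_billions: float) -> float:
    """FP16 weight footprint in GB: 2 bytes per parameter."""
    return params_billions * 2.0

def gpus_needed(params_billions: float, mem_gb: float) -> int:
    """Minimum GPU count to hold the FP16 weights alone."""
    return math.ceil(fp16_weight_gb(params_billions) / mem_gb)

print(gpus_needed(70, 141))  # H200 141 GB -> 1
print(gpus_needed(70, 80))   # H100 80 GB  -> 2
```

The same function shows why the A100 80GB and H100 80GB have identical sharding requirements for weight storage: memory capacity, not compute, sets the minimum GPU count.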