distilbert-base-uncased-finetuned-sst-2-english
by distilbert
67M params · text-classification · 885 likes · 3.4M downloads
distilbert-base-uncased-finetuned-sst-2-english is a 67M parameter model. At Q4 quantization its weights occupy roughly 0.03GB of VRAM (effectively 0GB), so memory is not a constraint on any listed device. It runs comfortably on GeForce RTX 4090 (19610 tok/s), GeForce RTX 5090 (29407 tok/s), and M4 Max 128GB (7176 tok/s).
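The VRAM figure follows from simple arithmetic: at Q4 each parameter takes 4 bits (half a byte), so 67M parameters need only tens of megabytes for weights. A minimal sketch of that estimate (the `params` count is taken from the header above; activation and KV-cache overhead is ignored here):

```python
# Back-of-envelope weight memory at Q4 quantization (4 bits per parameter).
params = 67_000_000          # parameter count from the model header
bits_per_param = 4           # Q4 quantization
weight_bytes = params * bits_per_param / 8
weight_gb = weight_bytes / 1024**3

print(f"{weight_gb:.3f} GB")  # roughly 0.031 GB, i.e. it rounds to 0GB
```

This is why the Q4 VRAM requirement rounds to 0GB: the quantized weights alone are about 33 MB, far below the capacity of every GPU in the compatibility table.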
Inference providers
| Provider | $/1M in | $/1M out | Throughput |
|---|---|---|---|
| HF Inference | n/a | n/a | n/a |
GPU compatibility
| GPU | VRAM | Q4 Decode | Verdict |
|---|---|---|---|
| GeForce RTX 4090 | 24GB | 19610 tok/s | comfortable |
| GeForce RTX 5090 | 32GB | 29407 tok/s | comfortable |
| M4 Max 128GB | 128GB | 7176 tok/s | comfortable |
| M4 Pro 48GB | 48GB | 3588 tok/s | comfortable |
| M4 Pro 24GB | 24GB | 3588 tok/s | comfortable |
| A100 PCIe 80 GB | 80GB | 35928 tok/s | comfortable |
| H100 SXM5 80 GB | 80GB | 72263 tok/s | comfortable |
| GeForce RTX 3090 | 24GB | 17338 tok/s | comfortable |
| Radeon RX 7900 XTX | 24GB | 14337 tok/s | comfortable |
| GeForce RTX 4080 | 16GB | 13917 tok/s | comfortable |