# Cloudflare Products Reference
Use this card to identify which Cloudflare products fit your architecture. You can combine as many as you like; the more meaningfully your design uses Cloudflare products, the stronger your score.
Products marked with an ★ AI badge are particularly relevant for AI use cases.
## Compute & Serverless
- Workers: Serverless JavaScript, TypeScript, Python, or Rust functions that run at the edge across Cloudflare's global network.
- Containers: Run Docker containers on Cloudflare's network, managed through Durable Objects.
- Durable Objects: Single-instance, stateful serverless objects with their own built-in SQLite storage.
- Workers for Platforms: Deploy isolated Workers on behalf of your own customers.
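To make the compute model concrete, here is a minimal sketch of a Workers-style `fetch` handler. The `/hello` route and the empty `Env` type are illustrative, not part of any real deployment; the handler shape (`fetch(request, env)`) is the standard Workers module format.

```typescript
// Minimal sketch of a Workers fetch handler.
// Env would normally declare bindings (KV, D1, R2, ...); none are needed here.
type Env = Record<string, never>;

const worker = {
  async fetch(request: Request, _env: Env): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/hello") {
      // Respond with JSON from the edge, no origin server involved.
      return new Response(JSON.stringify({ message: "hello from the edge" }), {
        headers: { "content-type": "application/json" },
      });
    }
    return new Response("not found", { status: 404 });
  },
};

export default worker;
```

Because the handler only uses Web-standard `Request`/`Response` objects, you can exercise it locally by calling `worker.fetch(new Request("https://example.com/hello"), {})` before deploying.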
## Storage & Databases
- Workers KV: Globally distributed, eventually consistent key-value store optimized for extremely fast reads.
- D1: Serverless SQLite database running at the edge.
- R2: S3-compatible object storage with zero egress fees.
- Queues: Reliable message queue for asynchronous communication between Workers.
- Workflows: Stateful, durable execution engine for multi-step processes.
- Hyperdrive: Connects Cloudflare Workers to existing Postgres databases efficiently.
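A common pattern across these storage products is a KV read-through cache: serve hot config from Workers KV and fall back to a slower origin on a miss. The sketch below mocks the KV binding with an in-memory map so it runs anywhere; `KVLike` mirrors only the `get`/`put` subset of the real binding, and the 300-second TTL is an arbitrary example.

```typescript
// Subset of the Workers KV binding surface used by this sketch.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

async function getConfig(
  kv: KVLike,
  key: string,
  loadFromOrigin: () => Promise<string>,
): Promise<string> {
  const cached = await kv.get(key);
  if (cached !== null) return cached;               // fast path: edge read
  const fresh = await loadFromOrigin();             // slow path: origin fetch
  await kv.put(key, fresh, { expirationTtl: 300 }); // cache for 5 minutes
  return fresh;
}

// In-memory stand-in for the KV binding, for local experimentation.
function memoryKV(): KVLike {
  const store = new Map<string, string>();
  return {
    async get(k) { return store.get(k) ?? null; },
    async put(k, v) { store.set(k, v); },
  };
}
```

In a deployed Worker you would pass the real KV namespace binding from `env` instead of `memoryKV()`; the calling code stays identical.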
## AI & Intelligence
- Workers AI ★ AI: Run machine learning models directly on Cloudflare's GPU-enabled global network.
- Custom Model Hosting — LoRA Fine-Tunes ★ AI: Upload a LoRA adapter trained on your own data.
- Custom Model Hosting — Containers ★ AI: Package any model as a Docker container.
- AI Gateway ★ AI: A proxy layer sitting in front of any AI provider (OpenAI, Anthropic, etc.).
- Replicate ★ AI: A platform for running, deploying, and fine-tuning open-source AI models via a simple API.
- Vectorize ★ AI: Globally distributed vector database.
- Agents SDK ★ AI: Framework for building AI agents — autonomous systems running on Workers.
- MCP Servers ★ AI: Expose your internal tools as MCP servers — a standard protocol for AI agents to call services.
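As a sketch of what calling Workers AI looks like from inside a Worker: the binding exposes a `run(model, input)` call. `AiLike` below mirrors only that shape, and the model id and prompt are illustrative examples, not a recommendation.

```typescript
// Subset of the Workers AI binding surface used by this sketch.
interface AiLike {
  run(model: string, input: { prompt: string }): Promise<{ response?: string }>;
}

// Ask a text-generation model for a one-sentence summary.
async function summarize(ai: AiLike, text: string): Promise<string> {
  const result = await ai.run("@cf/meta/llama-3.1-8b-instruct", {
    prompt: `Summarize in one sentence: ${text}`,
  });
  return result.response ?? "";
}
```

In production the `ai` argument would be the `AI` binding from `env`; because the model runs on Cloudflare's own GPUs, no external API key is involved, which is the point of the "without external API keys" row in the quick-pick guide.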
## AI Security
- Firewall for AI ★ AI: A detection layer specifically for LLM-powered applications.
- AI Security for Apps ★ AI: Enterprise-grade security for AI-powered applications.
- Zero Trust Access for AI Tools ★ AI: Control which employees can access external AI tools.
## Developer Experience
- Pages: Deploy full-stack web applications with CI/CD from Git.
- Analytics Engine: Time-series analytics database built directly into Workers.
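To illustrate the Analytics Engine write path: a Worker records a data point per event via `writeDataPoint`. `AnalyticsLike` mirrors that single call; the field choices (path, latency, user id) are illustrative assumptions.

```typescript
// Subset of the Analytics Engine binding surface used by this sketch.
interface AnalyticsLike {
  writeDataPoint(point: {
    blobs?: string[];
    doubles?: number[];
    indexes?: string[];
  }): void;
}

// Record one request's metrics as a single data point.
function recordRequest(
  analytics: AnalyticsLike,
  path: string,
  latencyMs: number,
  userId: string,
): void {
  analytics.writeDataPoint({
    blobs: [path],      // string dimensions
    doubles: [latencyMs], // numeric measurements
    indexes: [userId],  // sampling key
  });
}
```

The write is fire-and-forget from the Worker's perspective; the data is queried later with SQL over the Analytics Engine API.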
## Quick-pick Guide
| I need to... | Consider... |
|---|---|
| Run serverless logic at the edge | Workers |
| Run a Docker container / custom model | Containers |
| Maintain stateful real-time connections | Durable Objects |
| Store structured data with SQL | D1 |
| Store files, media, model weights | R2 |
| Cache frequently-read config or flags | Workers KV |
| Run background jobs reliably | Queues + Workflows |
| Run AI models without external API keys | Workers AI |
| Use a specialised / niche open-source model | Replicate |
| Fine-tune a model on my own data | Workers AI LoRA fine-tunes |
| Host my own proprietary model | Containers |
| Add semantic search to my app | Vectorize |
| Observe and control all AI API calls | AI Gateway |
| Build an autonomous AI agent | Agents SDK + Durable Objects |
| Connect AI agents to my internal systems | MCP Servers |
| Give users custom compute inside my SaaS | Workers for Platforms |
| Protect my AI endpoint from attacks | Firewall for AI |
| Connect Workers to my existing Postgres DB | Hyperdrive |
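The "add semantic search" row typically combines two products from this card: Workers AI produces an embedding for the query, and Vectorize finds the nearest stored vectors. The sketch below mocks both bindings with minimal interfaces; the embedding model id, index shape, and return fields are illustrative assumptions about the subset of each API used here.

```typescript
// Minimal slices of the Workers AI (embeddings) and Vectorize binding surfaces.
interface EmbedderLike {
  run(model: string, input: { text: string[] }): Promise<{ data: number[][] }>;
}
interface VectorIndexLike {
  query(
    vector: number[],
    opts: { topK: number },
  ): Promise<{ matches: { id: string; score: number }[] }>;
}

// Embed the query text, then return the ids of the nearest stored documents.
async function semanticSearch(
  ai: EmbedderLike,
  index: VectorIndexLike,
  query: string,
  topK = 3,
): Promise<string[]> {
  const embedding = await ai.run("@cf/baai/bge-base-en-v1.5", { text: [query] });
  const { matches } = await index.query(embedding.data[0], { topK });
  return matches.map((m) => m.id);
}
```

In a deployed Worker, `ai` and `index` would be the `AI` and Vectorize bindings from `env`; documents would have been embedded with the same model at ingest time so query and corpus share a vector space.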