Agumbe AI Gateway is designed to fit into different application architectures. You can start with a simple SDK call, use direct HTTP requests, or build a more controlled backend integration for production workloads. This page helps you choose the right path.

Recommended path

For most production applications, call Agumbe AI Gateway from your backend service. Your frontend or client application should call your own backend, and your backend should call Agumbe using a Gateway API key. This keeps your API key private, gives you one place to add retries and logging, and lets you control which app, model, and guardrail policy are used for each request.

Use this path when:
- You are building a production application
- You need to keep credentials server-side
- You want consistent guardrails across user requests
- You want to attach internal metadata such as workspace, operation, or request ID
- You want better control over logging, error handling, and retries
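The backend-calls-gateway pattern above can be sketched in Python using only the standard library. The base URL, endpoint path, and payload shape below are illustrative assumptions; substitute the values shown in your Agumbe Console.

```python
import json
import os
import urllib.request

# Hypothetical gateway URL for illustration; use the one from your Console.
GATEWAY_URL = "https://gateway.agumbe.example/v1/chat/completions"

def build_gateway_request(prompt: str) -> urllib.request.Request:
    """Build the server-side request your backend sends to the gateway.

    The Gateway API key is read from the environment, so it stays
    server-side and never reaches the frontend.
    """
    body = json.dumps({
        "model": "smart-default",
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {os.environ['AGUMBE_API_KEY']}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Because only the backend holds AGUMBE_API_KEY, this is also the single place to add retries, logging, and request metadata.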
Path 1: SDK integration
Use an SDK integration when you want the fastest path from setup to a working request. Agumbe exposes a provider-compatible API shape, so you can use a familiar LLM SDK and point it to the Agumbe base URL.

Use this path when:
- You want to integrate quickly
- Your team already uses an SDK-style workflow
- You want minimal custom HTTP code
- You are testing locally or building a backend service
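As a sketch of the SDK path, the example below points the OpenAI Python package at an Agumbe base URL, assuming the gateway's provider-compatible shape matches that SDK. The base URL is a placeholder; check your Console for the actual base URL and supported SDKs.

```python
import os

# Assumed base URL for illustration; use the one shown in your Console.
AGUMBE_BASE_URL = "https://gateway.agumbe.example/v1"

def make_client():
    # Imported lazily so this sketch only needs the SDK when it is used.
    from openai import OpenAI  # pip install openai
    return OpenAI(
        base_url=AGUMBE_BASE_URL,              # point the SDK at the gateway
        api_key=os.environ["AGUMBE_API_KEY"],  # Gateway key, not a provider key
    )

def ask(prompt: str) -> str:
    client = make_client()
    resp = client.chat.completions.create(
        model="smart-default",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

Swapping the base URL and API key is the only change from a direct-provider integration, which is what makes this the fastest path to a working request.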
Path 2: Direct HTTP integration
Use direct HTTP when you want full control over request construction or you are integrating from a system where an SDK is not available.

Use this path when:
- You are integrating from any language or runtime
- You want simple cURL-based testing
- You need low-level control over headers, retries, or transport
- You are building your own client wrapper
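The low-level control this path offers can be sketched as a small client wrapper with explicit headers and a retry loop. The host, path, and backoff policy are illustrative assumptions, not documented gateway behavior.

```python
import http.client
import json
import os
import time

# Hypothetical host and path for illustration; use your Console's values.
HOST = "gateway.agumbe.example"
PATH = "/v1/chat/completions"

def build_headers() -> dict:
    """Explicit header construction, the main benefit of the direct HTTP path."""
    return {
        "Authorization": f"Bearer {os.environ['AGUMBE_API_KEY']}",
        "Content-Type": "application/json",
    }

def post_with_retries(payload: dict, retries: int = 3) -> bytes:
    """POST to the gateway, retrying 5xx responses with exponential backoff."""
    body = json.dumps(payload)
    for attempt in range(retries):
        conn = http.client.HTTPSConnection(HOST, timeout=30)
        try:
            conn.request("POST", PATH, body=body, headers=build_headers())
            resp = conn.getresponse()
            if resp.status < 500:
                return resp.read()
        except OSError:
            pass  # transient network error; fall through to retry
        finally:
            conn.close()
        time.sleep(2 ** attempt)
    raise RuntimeError("gateway request failed after retries")
```

The same request shape works from cURL or any other HTTP client; only the transport code changes.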
Path 3: Tenant-scoped API key
Use a tenant-scoped API key when one backend service needs to call the gateway on behalf of multiple apps or workflows. With this path, each request selects the app's guardrail policy by passing agumbe_guardrails_app_id.
Use this path when:
- One service handles multiple AI workflows
- You want to choose the app policy per request
- You have shared infrastructure calling the gateway for many product surfaces
- You want central routing with flexible app-level guardrails
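Per-request app selection can be sketched as a small payload builder. The field name agumbe_guardrails_app_id comes from this page; the app IDs below are hypothetical placeholders.

```python
def build_payload(prompt: str, app_id: str) -> dict:
    """Build a request body that selects an app guardrail policy per request."""
    return {
        "model": "smart-default",
        "agumbe_guardrails_app_id": app_id,  # chooses which app policy applies
        "messages": [{"role": "user", "content": prompt}],
    }

# One shared service, two workflows, two different guardrail policies:
support_req = build_payload("Summarize this ticket", "app_support")
billing_req = build_payload("Explain this invoice", "app_billing")
```

Everything else about the request stays the same; only the app ID varies per workflow.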
Path 4: App-scoped API key
Use an app-scoped API key when a key should always use one app’s guardrail policy. With this path, you do not need to pass agumbe_guardrails_app_id. The gateway applies the bound app’s guardrails automatically.
Use this path when:
- A workload belongs to one app
- You want tighter credential boundaries
- You want staging, production, or sensitive workflows to have separate keys
- You want to avoid accidentally applying the wrong guardrail policy
If a request made with an app-scoped key passes an agumbe_guardrails_app_id that does not match the bound app, the gateway rejects it with app_mismatch.
Path 5: Console and playground
Use the Agumbe Console when you want to test prompts, models, credentials, and guardrails before writing or changing application code.

Use this path when:
- You are exploring supported models
- You want to test a guardrail policy
- You want to compare request behavior before deploying
- You want to inspect usage, latency, or errors from recent runs
How to choose

- Choose SDK integration if you want the fastest backend setup.
- Choose direct HTTP if you want language-neutral control.
- Choose a tenant-scoped key if one service needs to select different app policies per request.
- Choose an app-scoped key if the workload should always use one app policy.
- Choose the Console playground if you are testing prompts, models, credentials, or guardrails before production.
Unless noted otherwise, examples on this page assume smart-default as the model alias, and app-level guardrails configured in the Console.