Run code, not containers.
The sandbox tears down after every call, then resumes where you left off in under 20ms, as if it never stopped. No zombies, no idle billing.
$ curl -X POST https://api.baponi.ai/v1/execute \
  -H "Authorization: Bearer sk-..." \
  -d '{
    "language": "python",
    "code": "print(\'Hello World!\')"
  }'

# Response
{ "stdout": "Hello World!\n", "exit_code": 0 }

MCP client configuration:

{
  "mcpServers": {
    "baponi": {
      "url": "https://api.baponi.ai/mcp",
      "headers": {
        "Authorization": "Bearer sk-..."
      }
    }
  }
}

Works with Claude Desktop, Cursor, Windsurf, and any MCP client.
No create(). No close(). Just execute().
Other sandbox SDKs make you manage container lifecycle. With Baponi, your API key already knows the image, resources, storage, and credentials. You just send code.
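That "just send code" flow is a single HTTP POST. A minimal sketch, using only the request shape shown in the curl example above (language and code in, stdout and exit_code out) — the helper name is ours, not part of any SDK:

```python
import json

API_URL = "https://api.baponi.ai/v1/execute"

def build_execute_request(api_key: str, code: str, language: str = "python") -> dict:
    """Assemble the one HTTP call Baponi needs: no create(), no close().

    The payload mirrors the curl example; send it with any HTTP client,
    e.g. requests.post(req["url"], headers=req["headers"], data=req["body"]).
    """
    return {
        "url": API_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"language": language, "code": code}),
    }

req = build_execute_request("sk-...", "print('Hello World!')")
```

Everything else — image, resources, storage, credentials — is resolved from the API key, so the request never has to describe the environment.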
# Typical sandbox SDK
sandbox = provider.create() # boot container
sandbox.upload("data.csv", "/tmp/") # move data in
sandbox.run("pip install pandas") # install deps
result = sandbox.run(code) # your code
files = sandbox.download("/output/") # move data out
sandbox.close() # cleanup
# Forgot close()? Zombie container.
# Timeout? State is gone.
# LLM thinking? Still billing.

With Baponi, one call does it:

$ curl -X POST https://api.baponi.ai/v1/execute \
  -H "Authorization: Bearer sk-..." \
  -d '{"code": "print(42)"}'

result = baponi.execute(code)
# LangChain, DeepAgent, Anthropic, OpenAI,
# Gemini, CrewAI, and any MCP client.

Stateful when you need it.
Add a thread_id to persist the sandbox environment between calls. Installed packages and working files are saved automatically.
- Zero idle cost. Nothing runs until you call again. No billing between calls.
- Configurable retention. 24 hours, 7 days, 30 days, or forever.
# Call 1: set up the environment
client.execute(
code="pip install pandas scikit-learn",
language="bash",
thread_id="analysis-x8k2",
)
# Hours later, zero cost in between
client.execute(
code="import pandas; print(pandas.__version__)",
language="python",
thread_id="analysis-x8k2",
)
# Nothing ran between calls. Same environment.

Mount data, don't copy it.
Other sandboxes hold your data hostage: use their volumes or build your own sync pipeline. With Baponi, mount your S3 or GCS bucket as a local directory. Every file is there on spin-up. Writes go straight to your bucket. Read-only or read-write, your rules.
- Single source of truth. Your bucket is THE storage. No duplicated data.
- Read-only enforcement. Enforced at the kernel level, not an application permission check.
- Sub-path scoping. Mount users/user-123/ for per-tenant isolation.
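An illustrative sketch (not Baponi's implementation) of what sub-path scoping buys you: every path inside the sandbox resolves to a key under the tenant's prefix, and nothing can escape it. The mount root /mnt/data and the users/<id>/ bucket layout are assumptions for the example.

```python
import posixpath

# Assumed layout for illustration: bucket prefix per tenant, one mount root.
TENANT_PREFIX = "users/user-123/"
MOUNT_ROOT = "/mnt/data"

def bucket_key(sandbox_path: str) -> str:
    """Map a path inside the sandbox to the bucket key it resolves to."""
    rel = posixpath.relpath(sandbox_path, MOUNT_ROOT)
    if rel == ".." or rel.startswith("../"):
        # Anything outside the mount root never reaches the bucket.
        raise ValueError("path escapes the tenant mount")
    return TENANT_PREFIX + rel

print(bucket_key("/mnt/data/report.csv"))  # users/user-123/report.csv
```

Sandbox code sees ordinary files; the scoping happens below it, so there is no per-tenant credential juggling in the agent's code.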
Bring Your Own Bucket
Unlimited
Mount your own S3, GCS, or Azure Blob. Data never leaves your cloud. Sub-path scoping for multi-tenant isolation. Read-only mode available.
Managed Volumes
10 GB free
Baponi-hosted persistent storage. Attach volumes to your agents for datasets, models, and output files. No cloud credentials needed.
Everything your agents need. Nothing they don't.
Sub-20ms. Always.
Under 20ms on every single call. No VM boot. No pod spin-up. No warm-node lottery. Not just when conditions are right.
Network Control
Block all outbound access by default, or open it up when your agent needs internet. Per-sandbox network policies, enforced at the kernel level.
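From inside a sandbox under a block-all policy, every outbound connect fails fast rather than hanging. A small illustrative probe an agent could run — the host and port are placeholders, and this is plain stdlib Python, not a Baponi API:

```python
import socket

def outbound_allowed(host: str, port: int, timeout: float = 2.0) -> bool:
    """Probe whether the sandbox's network policy permits outbound
    traffic to host:port; under a block-all policy the connect
    attempt raises OSError and we report False."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

An agent can branch on a probe like this instead of stalling on a blocked request.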
Custom Runtimes
Bring your own container image pre-loaded with your company's packages and internal tools. Your AI agents work with your stack, not a generic sandbox.
Stateful Without Running
State persists between calls, but nothing runs. No idle billing, no timeouts, no orphans. Pick up where you left off, minutes or days later.
MCP + REST + Python SDK
Full MCP support. Direct REST API. Or pip install. Two lines to run code from any AI agent.
Your Cloud. Your VPC.
The entire platform deploys inside your infrastructure. Built from day one for self-hosted enterprise. Or start with our managed cloud and move when you're ready.
One place to manage which agents access what
Configure credentials once, assign per agent. CLI tools like psql, git, and bq, and any code that needs those credentials, just work. No more custom tools.
Databases
Cloud Storage
APIs & SCM
# Credentials injected transparently. CLI tools and code just work.
$ psql -c "SELECT count(*) FROM orders"
count
-------
42891
$ git clone https://github.com/acme/private-repo.git
Cloning into 'private-repo'... done.
$ python3 -c "
import psycopg2
conn = psycopg2.connect('dbname=analytics')
cur = conn.cursor()
cur.execute('SELECT sum(revenue) FROM sales')
print(cur.fetchone()[0])"
1284930

Your agents already know how to use these tools.
LLMs have seen millions of examples of psql, git, and bq in their training data. Baponi makes it work in practice by handling the credentials.
- 01 Configure connector credentials in the console
- 02 Associate connectors with an API key
- 03 Agent runs code. Credentials injected as native config files.
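"Native config files" means the standard locations each tool already reads on its own. The paths below are those tools' documented defaults (libpq's password file, git's credential-store file, gcloud's application default credentials), not Baponi-specific paths:

```python
import os

# Standard per-tool credential files (the tools' own documented defaults).
# Writing these is enough for psql, git, and bq to authenticate with no
# flags, environment changes, or code changes.
NATIVE_CREDENTIAL_FILES = {
    "psql": "~/.pgpass",  # host:port:database:user:password
    "git": "~/.git-credentials",  # https://user:token@host (credential-store helper)
    "bq": "~/.config/gcloud/application_default_credentials.json",
}

for tool, path in NATIVE_CREDENTIAL_FILES.items():
    print(f"{tool}: {os.path.expanduser(path)}")
```

Because these are the files the tools check by default, code the LLM has already seen millions of times works unmodified.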
Ready to get started?
1,000 free credits/month. Unlimited seats. No credit card required.