/containers

Manage OpenAI code interpreter containers (sessions) for executing code in isolated environments.

| Feature | Supported |
|---------|-----------|
| Cost Tracking | ✅ |
| Logging | ✅ (Full request/response logging) |
| Load Balancing | ✅ |
| Proxy Server Support | ✅ Full proxy integration with virtual keys |
| Spend Management | ✅ Budget tracking and rate limiting |
| Supported Providers | openai |

Supported Providers:

- OpenAI

Quick Start

Containers provide isolated execution environments for code interpreter sessions. You can create, list, retrieve, and delete containers.

SDK, PROXY, and OpenAI Client

Create a Container

import litellm
import os

# setup env
os.environ["OPENAI_API_KEY"] = "sk-.."

container = litellm.create_container(
    name="My Code Interpreter Container",
    custom_llm_provider="openai",
    expires_after={
        "anchor": "last_active_at",
        "minutes": 20
    }
)

print(f"Container ID: {container.id}")
print(f"Container Name: {container.name}")

### ASYNC USAGE ###
# container = await litellm.acreate_container(
#     name="My Code Interpreter Container",
#     custom_llm_provider="openai",
#     expires_after={
#         "anchor": "last_active_at",
#         "minutes": 20
#     }
# )

List Containers

from litellm import list_containers, alist_containers
import os

os.environ["OPENAI_API_KEY"] = "sk-.."

containers = list_containers(
    custom_llm_provider="openai",
    limit=20,
    order="desc"
)

print(f"Found {len(containers.data)} containers")
for container in containers.data:
    print(f" - {container.id}: {container.name}")

### ASYNC USAGE ###
# containers = await alist_containers(
#     custom_llm_provider="openai",
#     limit=20,
#     order="desc"
# )

Retrieve a Container

from litellm import retrieve_container, aretrieve_container
import os

os.environ["OPENAI_API_KEY"] = "sk-.."

container = retrieve_container(
    container_id="cntr_123...",
    custom_llm_provider="openai"
)

print(f"Container: {container.name}")
print(f"Status: {container.status}")
print(f"Created: {container.created_at}")

### ASYNC USAGE ###
# container = await aretrieve_container(
#     container_id="cntr_123...",
#     custom_llm_provider="openai"
# )

Delete a Container

from litellm import delete_container, adelete_container
import os

os.environ["OPENAI_API_KEY"] = "sk-.."

result = delete_container(
    container_id="cntr_123...",
    custom_llm_provider="openai"
)

print(f"Deleted: {result.deleted}")
print(f"Container ID: {result.id}")

### ASYNC USAGE ###
# result = await adelete_container(
#     container_id="cntr_123...",
#     custom_llm_provider="openai"
# )
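
Use via LiteLLM Proxy (OpenAI Client)

The same endpoints are available through the LiteLLM Proxy, so an OpenAI client can be pointed at it directly. This is a minimal sketch: the base URL (http://localhost:4000) and virtual key (sk-1234) are placeholders for your own deployment, and it assumes your OpenAI Python SDK version exposes the containers resource.

from openai import OpenAI

client = OpenAI(
    api_key="sk-1234",                  # LiteLLM Proxy virtual key (placeholder)
    base_url="http://localhost:4000"    # LiteLLM Proxy URL (placeholder)
)

# Create a container through the proxy
container = client.containers.create(
    name="My Code Interpreter Container",
    expires_after={"anchor": "last_active_at", "minutes": 20}
)
print(f"Container ID: {container.id}")

# List containers through the same client
for c in client.containers.list(limit=20).data:
    print(f" - {c.id}: {c.name}")

# Clean up
client.containers.delete(container.id)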

Container Parameters

Create Container Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| name | string | Yes | Name of the container |
| expires_after | object | No | Container expiration settings |
| expires_after.anchor | string | No | Anchor point for expiration (e.g., "last_active_at") |
| expires_after.minutes | integer | No | Minutes until expiration from anchor |
| file_ids | array | No | List of file IDs to include in the container |
| custom_llm_provider | string | No | LLM provider to use (default: "openai") |
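
As a reference for the table above, here is a small sketch that passes every documented create parameter in one call. The file IDs are placeholders; you would substitute IDs returned by your own file uploads.

import litellm

container = litellm.create_container(
    name="Analysis Container",
    custom_llm_provider="openai",
    expires_after={
        "anchor": "last_active_at",   # expiration measured from last activity
        "minutes": 20                 # expire 20 minutes after that anchor
    },
    file_ids=["file-abc...", "file-def..."]   # placeholder file IDs
)

print(f"Container ID: {container.id}")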

List Container Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| after | string | No | Cursor for pagination |
| limit | integer | No | Number of items to return (1-100, default: 20) |
| order | string | No | Sort order: "asc" or "desc" (default: "desc") |
| custom_llm_provider | string | No | LLM provider to use (default: "openai") |
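
The after cursor pairs with the has_more and last_id fields of the list response (see ContainerListResponse below). A sketch of paging through all containers, assuming after is simply omitted on the first request:

import litellm

all_containers = []
cursor = None

while True:
    params = {"custom_llm_provider": "openai", "limit": 100, "order": "desc"}
    if cursor:
        params["after"] = cursor   # resume after the last item of the previous page
    page = litellm.list_containers(**params)

    all_containers.extend(page.data)
    if not page.has_more:
        break
    cursor = page.last_id

print(f"Total containers: {len(all_containers)}")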

Retrieve/Delete Container Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| container_id | string | Yes | ID of the container to retrieve/delete |
| custom_llm_provider | string | No | LLM provider to use (default: "openai") |

Response Objects

ContainerObject

{
  "id": "cntr_123...",
  "object": "container",
  "created_at": 1234567890,
  "name": "My Container",
  "status": "active",
  "last_active_at": 1234567890,
  "expires_at": 1234569090,
  "file_ids": []
}
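
The created_at, last_active_at, and expires_at values appear to be Unix timestamps in seconds; under that assumption, here is a small sketch for checking how long a retrieved container has left before it expires:

from datetime import datetime, timezone
import litellm

container = litellm.retrieve_container(
    container_id="cntr_123...",
    custom_llm_provider="openai"
)

# Assumes expires_at is a Unix timestamp in seconds
expires = datetime.fromtimestamp(container.expires_at, tz=timezone.utc)
remaining = expires - datetime.now(timezone.utc)
print(f"Status: {container.status}")
print(f"Expires at {expires.isoformat()} ({remaining.total_seconds() / 60:.1f} minutes from now)")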

ContainerListResponse

{
  "object": "list",
  "data": [
    {
      "id": "cntr_123...",
      "object": "container",
      "created_at": 1234567890,
      "name": "My Container",
      "status": "active"
    }
  ],
  "first_id": "cntr_123...",
  "last_id": "cntr_456...",
  "has_more": false
}

DeleteContainerResult

{
  "id": "cntr_123...",
  "object": "container.deleted",
  "deleted": true
}
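
Combining list and delete gives a simple cleanup pass. This sketch deletes every container the list call returns and checks the deleted flag on each result, so run it only against containers you actually want to remove.

import litellm

containers = litellm.list_containers(custom_llm_provider="openai", limit=100)

for container in containers.data:
    result = litellm.delete_container(
        container_id=container.id,
        custom_llm_provider="openai"
    )
    print(f"{result.id}: deleted={result.deleted}")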