One Interface. Every LLM. Zero Complexity.
Simple, unified interface to all leading Generative AI providers.
import aisuite as ai

client = ai.Client()

response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
Quick Install: pip install aisuite
Everything You Need, Nothing You Don't
AISuite provides a clean, consistent interface to work with any LLM provider, so you can focus on building great applications.
Unified Interface
OpenAI-compatible API works seamlessly with all supported providers
Multi-Language Support
Native libraries for Python, JavaScript, and TypeScript
All Major Providers
OpenAI, Anthropic, Google, AWS, Azure, Mistral, and many more
Agent-Ready Tool Calling
Build autonomous agents with automatic tool execution
Streaming Support
Real-time streaming responses with a consistent API (see the sketch after this list)
Coming Soon: ASR & TTS
Audio capabilities across providers
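For the streaming bullet above, here is a minimal sketch. It is not taken from the official docs; it assumes aisuite mirrors the OpenAI streaming convention (stream=True plus delta-style chunks), which fits the OpenAI-compatible interface described above, so check the documentation for the exact chunk shape on each provider.
import aisuite as ai

client = ai.Client()

# Assumption: OpenAI-style streaming, i.e. pass stream=True and iterate over
# chunks that each carry an incremental delta of the assistant message.
stream = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=[{"role": "user", "content": "Write a haiku about unified APIs."}],
    stream=True
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
The same loop should work unchanged when you swap the provider prefix in the model string.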
Build Autonomous Agents with Any LLM
Transform any LLM into an intelligent agent that can plan, use tools, and solve complex problems autonomously.
import aisuite as ai

def search_web(query: str):
    """Search for information online"""
    return f"Results for: {query}"

def analyze_data(data: str):
    """Analyze and summarize data"""
    return f"Analysis complete: {data[:100]}..."

client = ai.Client()

# Create an autonomous research agent
response = client.chat.completions.create(
    model="openai:gpt-4o",  # or any other provider
    messages=[{
        "role": "user",
        "content": "Research the latest AI trends and create a summary"
    }],
    tools=[search_web, analyze_data],
    max_turns=5  # Agent can use tools up to 5 times
)

print(response.choices[0].message.content)
One API, Infinite Possibilities
With the max_turns parameter, your LLM becomes an autonomous agent capable of complex reasoning and multi-step problem solving.
Plan & Execute
Agents break down complex tasks and execute them step-by-step
Tool Orchestration
Automatically coordinate multiple tools to achieve goals
Self-Improvement
Agents can evaluate results and iterate until success
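To make the evaluate-and-iterate idea concrete, here is a sketch that reuses the tools and max_turns interface from the example above. The check_report tool is hypothetical; the point is that a tool returning feedback gives the agent something to act on in later turns.
import aisuite as ai

def check_report(text: str):
    """Review a draft report and return feedback or approval."""
    # Hypothetical checker: request a revision until the draft cites a source.
    if "source:" not in text.lower():
        return "Needs revision: cite at least one source (prefix it with 'Source:')."
    return "Approved."

client = ai.Client()

# The agent can draft, submit to check_report, read the feedback, and revise,
# repeating until the draft is approved or the max_turns budget runs out.
response = client.chat.completions.create(
    model="anthropic:claude-3-5-sonnet-20240620",
    messages=[{"role": "user", "content": "Write a two-sentence report on LLM tool calling."}],
    tools=[check_report],
    max_turns=5
)
print(response.choices[0].message.content)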
Write Once, Run Everywhere
Same code, different providers. Switch between LLMs with a single line change.
pip install aisuite
import aisuite as ai

client = ai.Client()

models = ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20240620"]

messages = [
    {"role": "system", "content": "Respond in Pirate English."},
    {"role": "user", "content": "Tell me a joke."},
]

for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
    print(f"{model}: {response.choices[0].message.content}")
Works With Your Favorite Providers
Connect to any LLM provider with a unified interface. More providers are being added regularly; a short credentials sketch follows the grid below.
OpenAI
GPT-4, GPT-3.5
Anthropic
Claude 3, Claude 2
Google
Gemini Pro, PaLM
AWS Bedrock
Multiple
Azure
OpenAI Models
Mistral
Mistral Large
Groq
LLaMA, Mixtral
HuggingFace
Various
Ollama
Local Models
Cohere
Command
Cerebras
CS Models
DeepSeek
DeepSeek
Fireworks
Various
Together
Various
SambaNova
Samba
Watsonx
IBM Models
xAI
Grok
Nebius
Various
LMStudio
Local
Inception
Custom
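One thing the grid does not show is where credentials come from. As a minimal sketch, the example below assumes each provider key is supplied through an environment variable (OPENAI_API_KEY, ANTHROPIC_API_KEY), the pattern shown in the aisuite README; other providers follow the same idea with their own variable names.
import os
import aisuite as ai

# Assumption: each provider adapter reads its key from the environment,
# so export the relevant variables before creating the client.
for var in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY"):
    if not os.getenv(var):
        raise RuntimeError(f"Set {var} before running this example")

client = ai.Client()

# The "provider:model" prefix in the model string selects the backend.
response = client.chat.completions.create(
    model="anthropic:claude-3-5-sonnet-20240620",
    messages=[{"role": "user", "content": "Say hello in one sentence."}]
)
print(response.choices[0].message.content)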
Want to Add Your Provider?
Contributing a new provider is easy. Check out our documentation to get started.
View Contribution Guide
Try It Live
Test different providers with the same code. See how easy it is to switch between LLMs.
response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=[{"role": "user", "content": "Tell me a fun fact about AI"}]
)
Fun fact: The term "Artificial Intelligence" was first coined in 1956 at the Dartmouth Conference, where computer scientists gathered to discuss the possibility of creating machines that could think!
* This is a demo interface. In production, you would connect to actual LLM providers.
Ready to Build Autonomous AI Agents?
Join thousands of developers using AISuite to build agentic applications with any LLM provider.
pip install aisuite