# CrewAI Integration

Using Forge with CrewAI.
CrewAI's multi-agent framework works with Forge as the LLM backend. Each agent in your crew gets Forge's intelligent routing, memory, and security automatically.
## Setup

```bash
pip install crewai
```
## Configure with Forge
```python
import os

from crewai import Agent, Task, Crew

# Point the OpenAI-compatible client at Forge.
os.environ["OPENAI_API_KEY"] = "forge_sk_your_key"
os.environ["OPENAI_API_BASE"] = "https://api.optima-forge.com/v1"

# llm="auto" lets Forge's Quality Router pick the model for each call.
researcher = Agent(
    role="Senior Research Analyst",
    goal="Produce thorough research reports on given topics",
    backstory="You are an experienced research analyst with expertise in technology.",
    verbose=True,
    llm="auto",
)

writer = Agent(
    role="Technical Writer",
    goal="Transform research into clear, engaging content",
    backstory="You are a skilled technical writer.",
    verbose=True,
    llm="auto",
)

research_task = Task(
    description="Research the current state of AI agent frameworks",
    agent=researcher,
    expected_output="A detailed research report",
)

writing_task = Task(
    description="Write a blog post based on the research",
    agent=writer,
    expected_output="A 1000-word blog post",
)

# Tasks run in order by default: research first, then writing.
crew = Crew(agents=[researcher, writer], tasks=[research_task, writing_task])
result = crew.kickoff()
print(result)
```
## Benefits
Each CrewAI agent's LLM calls route through Forge, so different agents can use different models based on their task complexity. The researcher might get routed to GPT-4o for deep analysis while the writer gets Claude for prose quality — all determined automatically by Forge's Quality Router.
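To build intuition for what the Quality Router is doing, here is a minimal conceptual sketch. This is not Forge's actual algorithm: the model names and the complexity heuristic are illustrative assumptions, reducing "route by task complexity" to a toy keyword check.

```python
# Conceptual sketch of quality routing -- NOT Forge's actual implementation.
# Model ids ("gpt-4o", "claude-sonnet") and the heuristic are assumptions.

def route(task_description: str) -> str:
    """Pick a model id based on the task's leading verb."""
    analytical_verbs = {"research", "analyze", "investigate", "debug"}
    first_word = task_description.lower().split()[0]
    if first_word in analytical_verbs:
        return "gpt-4o"        # deeper reasoning for analytical tasks
    return "claude-sonnet"     # strong prose for writing tasks

print(route("Research the current state of AI agent frameworks"))  # gpt-4o
print(route("Write a blog post based on the research"))            # claude-sonnet
```

In the crew above, this is roughly why the researcher and writer can end up on different models even though both are configured with `llm="auto"`.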