Created: March 31st 2026
Last updated: March 31st 2026
Categories: Artificial Intelligence (AI), IT Development, Laravel, Laravel Package
Author: Milos Jevtic

Getting Started with the Laravel AI SDK: Agents, Tools & Structured Output

Introduction: There's Now a Laravel Way to Do AI

In the first post of this series, we
connected OpenAI and Claude directly using Laravel's HTTP client. Raw API calls,
custom service classes, manual headers. It works well and teaches you exactly what
happens under the hood.

But Laravel being Laravel, the framework now has an official, first-party answer to
AI integration: the Laravel AI SDK. Released in early 2026, it brings the same
clean, expressive syntax you expect from Laravel — but for AI providers.

Instead of writing HTTP requests and parsing JSON responses yourself, you create
dedicated agent classes, run Artisan commands, and call ->prompt(). The SDK handles
OpenAI, Anthropic (Claude), Gemini, and more through one unified interface.

In this guide we'll install the SDK, understand how Agents work, build some real
features, and cover what makes this approach better than the raw HTTP approach for
most projects.

What Is the Laravel AI SDK?

The Laravel AI SDK (laravel/ai) is an official first-party package that gives you a
consistent way to work with multiple AI providers without having to handle each
provider's individual API differences yourself.

Key things it handles for you:

  • Agents - Dedicated PHP classes that encapsulate instructions, conversation history, tools, and output schema for an AI model
  • Provider switching - Use OpenAI, Anthropic, Gemini, Groq, and others through the same code
  • Failover - Automatically switch to a backup provider if the primary goes down or hits rate limits
  • Structured output - Define a schema and get back typed, predictable data instead of raw strings
  • Queuing and streaming - Run AI tasks in the background or stream responses token by token
  • Testing - Fake agents in tests so you never hit a real API during your test suite
  • Extras - Image generation, text-to-speech, transcription, embeddings, and vector search all in one package

Installation

Install the package via Composer:

composer require laravel/ai

Publish the config and migration files:

php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"

Run the migrations. The SDK needs two tables to store conversation history:

php artisan migrate

This creates agent_conversations and agent_conversation_messages tables automatically.

Adding Your API Keys

Add whichever providers you plan to use in your .env file:

OPENAI_API_KEY=sk-your-openai-key-here
ANTHROPIC_API_KEY=sk-ant-your-claude-key-here
GEMINI_API_KEY=your-gemini-key-here

The SDK reads these automatically through the published config/ai.php file. You don't
need to wire them up manually like we did in part one — the package takes care of it.
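If you're curious how the keys get wired up, the published config file maps each provider to its environment variable. The shape below is an illustrative sketch, not the package's exact file — your published config/ai.php is the source of truth:

```php
<?php

// config/ai.php (illustrative sketch — the actual published file may differ)
return [

    // The provider used when an agent doesn't specify one via #[Provider(...)]
    'default' => env('AI_PROVIDER', 'openai'),

    // Each provider entry simply reads the key you added to .env
    'providers' => [
        'openai'    => ['key' => env('OPENAI_API_KEY')],
        'anthropic' => ['key' => env('ANTHROPIC_API_KEY')],
        'gemini'    => ['key' => env('GEMINI_API_KEY')],
    ],

];
```

Because everything flows through env(), you can swap keys per environment (local, staging, production) without touching code.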

Understanding Agents

The core concept in the Laravel AI SDK is the Agent. Instead of writing AI logic
scattered across controllers and service classes, you create a dedicated PHP class
for each AI task in your app.

Think of an agent as a specialized assistant you configure once:

  • Instructions - What the AI should do and how it should behave (the system prompt)
  • Messages - Previous conversation history for multi-turn chats
  • Tools - Extra capabilities the AI can call (like searching your database)
  • Schema - If you want structured JSON output instead of plain text

Generate your first agent with Artisan:

php artisan make:agent SupportBot

This creates app/Ai/Agents/SupportBot.php. Here's what a basic agent looks like:

// app/Ai/Agents/SupportBot.php

namespace App\Ai\Agents;

use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Promptable;
use Stringable;

class SupportBot implements Agent
{
    use Promptable;

    /**
     * Instructions the AI should follow.
     */
    public function instructions(): Stringable|string
    {
        return 'You are a friendly customer support assistant for AcmeCorp.
                Only answer questions about our products and services.
                Be concise and helpful. If you cannot help, say so politely.';
    }
}

Prompt it anywhere in your app:

$response = (new SupportBot)->prompt('How do I reset my password?');

return (string) $response;

That's it. No HTTP clients, no headers, no JSON parsing. The Promptable trait handles all of it.

Configuring Your Agent

You can control which provider and model an agent uses through PHP attributes placed directly on the class:

use Laravel\Ai\Attributes\MaxTokens;
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Attributes\Temperature;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Enums\Lab;
use Laravel\Ai\Promptable;

#[Provider(Lab::Anthropic)]
#[Model('claude-haiku-4-5-20251001')]
#[MaxTokens(500)]
#[Temperature(0.7)]
class SupportBot implements Agent
{
    use Promptable;

    public function instructions(): string
    {
        return 'You are a helpful support assistant.';
    }
}

The available attributes:

  • Provider - Which AI provider to use (Lab::OpenAI, Lab::Anthropic, Lab::Gemini, etc.)
  • Model - The specific model name to use
  • MaxTokens - Maximum response length
  • Temperature - Creativity level, from 0.0 (focused) to 1.0 (creative)
  • MaxSteps - How many tool-use steps the agent can take before stopping
  • Timeout - HTTP timeout in seconds (default is 60)
  • UseCheapestModel - Automatically use the cheapest available model for the provider
  • UseSmartestModel - Automatically use the most capable model for the provider

If you don't set a provider, the SDK uses the default configured in config/ai.php.

Real-World Scenarios

Let's build features you'd actually ship.

Scenario 1: Summarizing Text

A simple agent that summarizes any article or document a user pastes in:

// app/Ai/Agents/Summarizer.php

namespace App\Ai\Agents;

use Laravel\Ai\Attributes\MaxTokens;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Enums\Lab;
use Laravel\Ai\Promptable;

#[Provider(Lab::Anthropic)]
#[MaxTokens(300)]
class Summarizer implements Agent
{
    use Promptable;

    public function instructions(): string
    {
        return 'You are a summarization assistant. When given text, summarize it
                in 3 clear bullet points. Be concise and focus on key takeaways.';
    }
}

Use it in a controller:

public function summarize(Request $request)
{
    $request->validate(['text' => 'required|string|max:10000']);

    $summary = (new Summarizer)->prompt($request->text);

    return response()->json(['summary' => (string) $summary]);
}

Scenario 2: Structured Output — Classifying Support Tickets

Instead of getting back a raw string, you can ask the SDK to return structured data
that matches a schema you define. No more parsing messy text responses.

Generate a structured agent:

php artisan make:agent TicketClassifier --structured

This creates a class that implements HasStructuredOutput:

// app/Ai/Agents/TicketClassifier.php

namespace App\Ai\Agents;

use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Promptable;

class TicketClassifier implements Agent, HasStructuredOutput
{
    use Promptable;

    public function instructions(): string
    {
        return 'You are a support ticket classifier. Given a ticket message,
                determine its category and priority level.';
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'category' => $schema->string()
                ->enum(['billing', 'technical', 'account', 'general'])
                ->required(),
            'priority' => $schema->string()
                ->enum(['low', 'medium', 'high'])
                ->required(),
            'summary'  => $schema->string()->required(),
        ];
    }
}

Access the result like an array — no string parsing required:

$result = (new TicketClassifier)->prompt($ticket->body);

$ticket->update([
    'category' => $result['category'],
    'priority' => $result['priority'],
    'summary'  => $result['summary'],
]);

Scenario 3: Conversational Agent With Memory

For a chatbot that remembers previous messages, use the RemembersConversations
trait. The SDK stores and retrieves conversation history from the database automatically
— no manual message management needed.

// app/Ai/Agents/ChatAssistant.php

namespace App\Ai\Agents;

use Laravel\Ai\Concerns\RemembersConversations;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\Conversational;
use Laravel\Ai\Promptable;

class ChatAssistant implements Agent, Conversational
{
    use Promptable, RemembersConversations;

    public function instructions(): string
    {
        return 'You are a helpful assistant. Be friendly and concise.';
    }
}

Start a conversation for a user and get back a conversation ID to continue it later:

// Start a new conversation
$response = (new ChatAssistant)->forUser($user)->prompt('Hello!');
$conversationId = $response->conversationId;

// Continue the same conversation later
$response = (new ChatAssistant)
    ->continue($conversationId, as: $user)
    ->prompt('Can you remind me what we talked about?');

Scenario 4: Background Processing With Queues

AI API calls can take several seconds. For anything non-interactive — like processing
a batch of uploaded documents — queue the work so your HTTP response stays fast:

Route::post('/documents/analyze', function (Request $request) {
    $document = Document::create([
        'path'    => $request->file('doc')->store('documents'),
        'content' => $request->file('doc')->get(),
    ]);

    (new Summarizer)
        ->queue("Summarize the document with ID {$document->id}: {$document->content}")
        ->then(function ($response) use ($document) {
            $document->update(['summary' => (string) $response]);
        })
        ->catch(function (\Throwable $e) use ($document) {
            logger()->error('Summarization failed', [
                'document_id' => $document->id,
                'error'       => $e->getMessage(),
            ]);
        });

    return response()->json(['message' => 'Document queued for analysis.']);
});

Scenario 5: Failover Between Providers

Pass an array of providers and the SDK will automatically switch to the next one if
the first is unavailable or rate-limited:

use Laravel\Ai\Enums\Lab;

$response = (new SupportBot)->prompt(
    'Help me understand my invoice.',
    provider: [Lab::OpenAI, Lab::Anthropic],
);

If OpenAI is down or returns a rate limit error, the request automatically retries
with Anthropic. No fallback logic, no try/catch, no extra code on your end.

Scenario 6: Giving an Agent Tools

Tools let an agent call your application code while generating a response. For example,
an agent that can look up a user's order status in your database:

php artisan make:tool GetOrderStatus

This creates a tool class with a description, a handler, and an input schema:

// app/Ai/Tools/GetOrderStatus.php

namespace App\Ai\Tools;

use App\Models\Order;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Tool;
use Laravel\Ai\Tools\Request;
use Stringable;

class GetOrderStatus implements Tool
{
    public function description(): Stringable|string
    {
        return 'Look up the status of an order by order number.';
    }

    public function handle(Request $request): Stringable|string
    {
        $order = Order::where('number', $request['order_number'])->first();

        if (! $order) {
            return 'Order not found.';
        }

        return "Order {$order->number}: {$order->status}. Expected delivery: {$order->delivery_date}.";
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'order_number' => $schema->string()->required(),
        ];
    }
}

Register the tool in your agent's tools() method:

use App\Ai\Tools\GetOrderStatus;
use Laravel\Ai\Contracts\HasTools;

class SupportBot implements Agent, HasTools
{
    use Promptable;

    public function instructions(): string
    {
        return 'You are a support assistant. Use the GetOrderStatus tool
                when a customer asks about their order.';
    }

    public function tools(): iterable
    {
        return [
            new GetOrderStatus,
        ];
    }
}

When a user asks "What's the status of order #1234?", the agent automatically calls
your tool, gets the real data from the database, and includes it in the response.
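From the calling side, nothing changes — the tool round-trip happens inside the SDK. A minimal sketch, assuming the SupportBot and GetOrderStatus classes defined above:

```php
<?php

use App\Ai\Agents\SupportBot;

// The agent decides on its own whether to invoke GetOrderStatus;
// your calling code is identical to a plain prompt.
$response = (new SupportBot)->prompt("What's the status of order #1234?");

// Behind the scenes the SDK executed the tool, fed its return value
// back to the model, and the final text already includes real order data.
return (string) $response;
```

This is the main appeal of tools: the orchestration (model asks for a tool, tool runs, result goes back to the model) is entirely handled for you.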

Testing Your Agents

One of the biggest advantages of using the SDK is testability. You never want real
API calls hitting OpenAI or Anthropic in your test suite — it's slow, costs money,
and is unreliable.

The SDK makes faking agents straightforward:

use App\Ai\Agents\TicketClassifier;

it('classifies tickets correctly', function () {
    // Fake the agent — no real API call is made
    TicketClassifier::fake([
        [
            'category' => 'billing',
            'priority' => 'high',
            'summary'  => 'Customer cannot process payment.',
        ],
    ]);

    $result = (new TicketClassifier)->prompt('My credit card keeps getting declined');

    expect($result['category'])->toBe('billing');
    expect($result['priority'])->toBe('high');

    // Assert the agent was actually called
    TicketClassifier::assertPrompted('My credit card keeps getting declined');
});

For tests where you want to make sure no unexpected agent calls happen:

TicketClassifier::fake()->preventStrayPrompts();
// Now any unmatched prompt will throw an exception

Common Mistakes to Avoid

1. Not Setting a Provider or Default Model

If you don't configure a default provider in config/ai.php and don't set one via
attribute, the SDK might not know which provider to use.

Bad:

// No provider set anywhere, and config/ai.php defaults point to nothing
class SupportBot implements Agent
{
    use Promptable;

    // instructions() omitted for brevity
}

Good:

#[Provider(Lab::Anthropic)]
class SupportBot implements Agent
{
    use Promptable;

    // instructions() omitted for brevity
}

Or set a global default in config/ai.php so all agents use it unless overridden.

2. Putting AI Logic Directly in Controllers

The whole point of agents is to keep AI logic organized. Don't bypass the pattern.

Bad:

// Messy, untestable, hard to reuse
public function summarize(Request $request)
{
    $response = agent(instructions: 'Summarize this.')->prompt($request->text);
    return (string) $response;
}

Good:

// Clean, testable, reusable
public function summarize(Request $request)
{
    return (string) (new Summarizer)->prompt($request->text);
}

3. Running AI Calls Synchronously for Slow Tasks

Calls to AI providers can take 5–30 seconds for large inputs. Running them synchronously
blocks your server and frustrates users.

Bad:

// Blocks the request for potentially 20+ seconds
$response = (new DocumentAnalyzer)->prompt($longDocument);

Good:

// Non-blocking, processed in the background
(new DocumentAnalyzer)
    ->queue($longDocument)
    ->then(fn ($response) => $this->saveResult($response));

4. Skipping Fakes in Tests

If you don't fake your agents in tests, every test run makes real API calls.

Bad:

it('saves the summary', function () {
    // This hits the real OpenAI API every time 🚫
    $result = (new Summarizer)->prompt('Some long article...');
    expect($result)->not->toBeEmpty();
});

Good:

it('saves the summary', function () {
    Summarizer::fake(['Here are 3 key takeaways...']);

    $result = (new Summarizer)->prompt('Some long article...');
    expect((string) $result)->toBe('Here are 3 key takeaways...');
});

5. Using Anonymous Agents When a Class Makes More Sense

The SDK supports quick anonymous agents with the agent() helper. These are fine for
one-off scripts or quick prototyping, but don't use them for features you'll reuse
or test — a named class is always easier to maintain.

OK for quick scripts:

use function Laravel\Ai\{agent};

$response = agent(instructions: 'You are a helpful assistant.')
    ->prompt('What is Laravel?');

Better for real features:

// Dedicated class: testable, configurable, reusable
(new SupportBot)->prompt('What is Laravel?');

Quick Reference

Artisan Commands

# Create a basic agent
php artisan make:agent AgentName

# Create an agent with structured output
php artisan make:agent AgentName --structured

# Create a tool
php artisan make:tool ToolName

# Create agent middleware
php artisan make:agent-middleware MiddlewareName

Prompting an Agent

// Basic prompt
$response = (new MyAgent)->prompt('Your message here');
return (string) $response;

// Override provider at runtime
$response = (new MyAgent)->prompt('Message', provider: Lab::Anthropic);

// Failover between providers
$response = (new MyAgent)->prompt('Message', provider: [Lab::OpenAI, Lab::Anthropic]);

// Queue the prompt
(new MyAgent)->queue('Message')->then(fn ($r) => ...)->catch(fn ($e) => ...);

Agent Attributes

#[Provider(Lab::Anthropic)]       // Set provider
#[Model('claude-haiku-4-5-20251001')] // Set model
#[MaxTokens(500)]                  // Limit response length
#[Temperature(0.7)]                // Set creativity (0.0–1.0)
#[MaxSteps(10)]                    // Max tool-use iterations
#[Timeout(120)]                    // HTTP timeout in seconds
#[UseCheapestModel]                // Auto-select cheapest model
#[UseSmartestModel]                // Auto-select most capable model

Interfaces Your Agent Can Implement

Agent              // Required. Base interface for all agents
Conversational     // Agent has conversation history
HasStructuredOutput // Agent returns data matching a defined schema instead of plain text
HasTools           // Agent can call tools
HasMiddleware      // Agent has middleware pipeline
HasProviderOptions // Agent passes provider-specific options
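These interfaces compose — one agent can implement several at once. A hedged sketch combining tools and structured output, reusing the GetOrderStatus tool from earlier (the boolean() schema method is assumed here by analogy with string(); check the SDK docs for the exact schema builder API):

```php
<?php

namespace App\Ai\Agents;

use App\Ai\Tools\GetOrderStatus;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Contracts\HasTools;
use Laravel\Ai\Promptable;

// One agent, several capabilities: it can call tools while producing
// a response that conforms to a fixed schema.
class OrderTriage implements Agent, HasStructuredOutput, HasTools
{
    use Promptable;

    public function instructions(): string
    {
        return 'Look up the order if needed, then classify the request.';
    }

    public function tools(): iterable
    {
        return [new GetOrderStatus];
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'resolved' => $schema->boolean()->required(),
            'reply'    => $schema->string()->required(),
        ];
    }
}
```

Each interface simply adds one method to implement, so capabilities stay declarative and easy to test in isolation.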

Testing Fakes

// Fake with fixed response
MyAgent::fake(['Response text']);

// Fake with dynamic response
MyAgent::fake(fn ($prompt) => 'Response to: ' . $prompt->prompt);

// Prevent real API calls from slipping through
MyAgent::fake()->preventStrayPrompts();

// Assertions
MyAgent::assertPrompted('Expected text');
MyAgent::assertNeverPrompted();

Raw HTTP vs Laravel AI SDK — When to Use Which

  • Use raw HTTP if you're learning how AI APIs work, building a one-off script, or integrating a provider the SDK doesn't support yet
  • Use the Laravel AI SDK for any real application feature — you get organized code, built-in testing, failover, queueing, and structured output without writing any of it yourself

For most Laravel projects started today, the SDK is the right default.

Conclusion

The Laravel AI SDK brings AI into Laravel properly — not as a bolted-on HTTP call,
but as a first-class feature with agents, Artisan commands, queues, testing helpers,
and a consistent interface across providers.

Key takeaways:

  • Install with composer require laravel/ai, publish config, and run migrations
  • Agents are PHP classes — generate them with php artisan make:agent
  • Use PHP attributes (#[Provider], #[Model], #[MaxTokens]) to configure each agent
  • Implement HasStructuredOutput to get typed schema responses instead of raw strings
  • Use RemembersConversations to automatically persist and load chat history
  • Queue slow tasks with ->queue() to keep your app responsive
  • Pass an array of providers for automatic failover with zero extra code
  • Always fake agents in tests with MyAgent::fake() — never hit a real API in your test suite

If you haven't read part one of this series, it covers how to call the OpenAI and
Claude APIs directly using Laravel's HTTP client — a good foundation for understanding
what the SDK is doing for you behind the scenes.