Laravel 12 AI Integration: From Theory to Practice

Abstract

This article walks through integrating and applying AI-related features with Laravel 12, including the OpenAI client, vector database support, and AI-powered route generation. Through practical project examples such as content generation, user behavior analysis, and intelligent customer service, it shows how to build AI-driven applications and helps developers get productive with Laravel 12's AI capabilities quickly.

1. The Laravel 12 AI Feature Ecosystem

Laravel 12 offers a complete ecosystem of AI features, supporting developers all the way from low-level API integration to high-level application tooling.

1.1 Core AI Technology Stack

Component                       | Implementation                  | Coverage                                                 | Use Cases
OpenAI client                   | Official SDK integration        | GPT-3.5/4/4o, DALL-E, Whisper                            | Text generation, image processing, speech recognition
Vector database                 | Multi-driver architecture       | Pinecone, Chroma, Weaviate, Qdrant                       | Similarity search, recommendation systems, semantic retrieval
AI code generation              | GPT-4-based code synthesis      | Routes, controllers, models, tests                       | Rapid prototyping, code completion
Content processing engine       | Multimodal content generation   | Summarization, translation, rewriting, creative writing  | Content marketing, internationalization, SEO
Intelligent assistant framework | Context management + tool calls | Chatbots, personal assistants, expert systems            | Customer support, internal tools, user interaction

1.2 Architecture Design

Laravel 12's AI features follow a layered architecture:

  1. Infrastructure layer: API clients, credential management, rate limiting
  2. Service layer: vector storage, content processing, code generation
  3. Application layer: intelligent routing, AI-assisted testing, content generators
  4. Extension layer: custom AI services, third-party integrations

1.3 Performance Optimization Strategies

Dimension          | Techniques                                           | Claimed Improvement
API calls          | Batched requests, async processing, response caching | ~60% lower call latency
Vector retrieval   | Index tuning, filtered queries, batch operations     | ~80% faster retrieval
Code generation    | Incremental generation, template preloading, caching | ~70% higher generation throughput
Content processing | Streaming responses, chunking, parallel generation   | ~50% faster processing

2. Advanced OpenAI Client Integration

Laravel 12 ships an official OpenAI client with full API coverage, advanced configuration options, and built-in performance optimizations.

2.1 Installation and Advanced Configuration

Install the dependencies

# Core client
composer require laravel/openai

# Optional: HTTP client optimizations
composer require guzzlehttp/guzzle:^7.8

# Optional: cache driver
composer require predis/predis:^2.2

Advanced configuration options

// config/openai.php
return [
    /*
    |--------------------------------------------------------------------------
    | OpenAI API Key
    |--------------------------------------------------------------------------
    */
    'api_key' => env('OPENAI_API_KEY'),

    /*
    |--------------------------------------------------------------------------
    | OpenAI Organization
    |--------------------------------------------------------------------------
    */
    'organization' => env('OPENAI_ORGANIZATION'),

    /*
    |--------------------------------------------------------------------------
    | HTTP Client Configuration
    |--------------------------------------------------------------------------
    */
    'http' => [
        'timeout' => env('OPENAI_HTTP_TIMEOUT', 30),
        'connect_timeout' => env('OPENAI_HTTP_CONNECT_TIMEOUT', 10),
        'retry' => [
            'max_attempts' => env('OPENAI_RETRY_MAX_ATTEMPTS', 3),
            'delay' => env('OPENAI_RETRY_DELAY', 1000),
            'max_delay' => env('OPENAI_RETRY_MAX_DELAY', 5000),
            'backoff_factor' => env('OPENAI_RETRY_BACKOFF_FACTOR', 2),
            'retry_on' => [429, 500, 502, 503, 504],
        ],
    ],

    /*
    |--------------------------------------------------------------------------
    | Default Model
    |--------------------------------------------------------------------------
    */
    'default_model' => env('OPENAI_DEFAULT_MODEL', 'gpt-4o'),

    /*
    |--------------------------------------------------------------------------
    | Cache Configuration
    |--------------------------------------------------------------------------
    */
    'cache' => [
        'enabled' => env('OPENAI_CACHE_ENABLED', true),
        'driver' => env('OPENAI_CACHE_DRIVER', 'redis'),
        'ttl' => env('OPENAI_CACHE_TTL', 3600),
    ],
];
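
With retry settings like these, attempt n typically waits delay * backoff_factor^(n-1) milliseconds, capped at max_delay. A small standalone helper (hypothetical, not part of the package) makes the schedule concrete:

```php
/**
 * Wait (in ms) before retry attempt $attempt (1-based),
 * using exponential backoff capped at $maxDelay.
 */
function retryDelayMs(int $attempt, int $delay = 1000, float $factor = 2, int $maxDelay = 5000): int
{
    return (int) min($delay * ($factor ** ($attempt - 1)), $maxDelay);
}

// With the defaults above: 1000 ms, 2000 ms, 4000 ms, then capped at 5000 ms
foreach ([1, 2, 3, 4] as $attempt) {
    echo $attempt . ' => ' . retryDelayMs($attempt) . " ms\n";
}
```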

2.2 Advanced Text Generation

Multi-turn conversation management

use Laravel\OpenAI\Facades\OpenAI;
use Illuminate\Support\Facades\Cache;

class ChatService
{
    public function processMessage(string $message, string $sessionId, array $context = [])
    {
        // Fetch the conversation history
        $history = $this->getConversationHistory($sessionId);

        // Build the message chain
        $messages = array_merge(
            [['role' => 'system', 'content' => $this->getSystemPrompt()]],
            $history,
            [['role' => 'user', 'content' => $message]]
        );

        // Trim the context so it fits the model's window
        $messages = $this->optimizeContext($messages);

        // Generate the response
        $response = OpenAI::chat()->create([
            'model' => 'gpt-4o',
            'messages' => $messages,
            'temperature' => 0.7,
            'max_tokens' => 1000,
            'top_p' => 0.95,
            'frequency_penalty' => 0.1,
            'presence_penalty' => 0.1,
        ]);

        $reply = $response->choices[0]->message->content;

        // Persist the updated conversation history
        $this->updateConversationHistory($sessionId, $message, $reply);

        return $reply;
    }

    protected function getSystemPrompt()
    {
        return "You are a professional Laravel developer assistant, proficient in PHP, the Laravel framework, and modern web development. "
            . "Provide detailed, accurate technical guidance, including code examples and best practices. "
            . "If you are unsure of an answer, say so explicitly and suggest possible solutions.";
    }

    protected function getConversationHistory(string $sessionId)
    {
        $key = "chat:session:{$sessionId}";
        return Cache::get($key, []);
    }

    protected function updateConversationHistory(string $sessionId, string $userMessage, string $botReply)
    {
        $key = "chat:session:{$sessionId}";
        $history = $this->getConversationHistory($sessionId);

        $history[] = ['role' => 'user', 'content' => $userMessage];
        $history[] = ['role' => 'assistant', 'content' => $botReply];

        // Keep only the 20 most recent messages
        $history = array_slice($history, -20);

        Cache::put($key, $history, 3600);
    }

    protected function optimizeContext(array $messages)
    {
        // Rough token estimate: about 4 characters per token
        $totalTokens = array_sum(array_map(function ($message) {
            return strlen($message['content']) / 4;
        }, $messages));

        // If over the limit, drop the oldest non-system messages
        while ($totalTokens > 8000 && count($messages) > 2) {
            array_splice($messages, 1, 1); // keep the system message and the most recent turns
            $totalTokens = array_sum(array_map(function ($message) {
                return strlen($message['content']) / 4;
            }, $messages));
        }

        return $messages;
    }
}

Advanced streaming responses

use Laravel\OpenAI\Facades\OpenAI;
use Illuminate\Http\Request;
use Illuminate\Http\StreamedResponse;

class StreamingController extends Controller
{
    public function streamResponse(Request $request)
    {
        $prompt = $request->input('prompt');

        return new StreamedResponse(function () use ($prompt) {
            $stream = OpenAI::chat()->createStreamed([
                'model' => 'gpt-4o',
                'messages' => [
                    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
                    ['role' => 'user', 'content' => $prompt],
                ],
                'temperature' => 0.7,
                'max_tokens' => 2000,
            ]);

            foreach ($stream as $response) {
                $chunk = $response->choices[0]->delta->content ?? '';
                if (!empty($chunk)) {
                    echo "data: " . json_encode(['content' => $chunk]) . "\n\n";
                    ob_flush();
                    flush();
                    usleep(50000); // optionally throttle the output rate
                }
            }

            echo "data: " . json_encode(['done' => true]) . "\n\n";
            ob_flush();
            flush();
        }, 200, [
            'Content-Type' => 'text/event-stream',
            'Cache-Control' => 'no-cache',
            'Connection' => 'keep-alive',
        ]);
    }
}

2.3 Advanced Function Calling

Coordinating multiple tools

use Laravel\OpenAI\Facades\OpenAI;

class AssistantService
{
    protected $tools = [];

    public function __construct()
    {
        $this->registerTools();
    }

    protected function registerTools()
    {
        $this->tools = [
            [
                'name' => 'get_weather',
                'description' => 'Get current weather for a location',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'location' => [
                            'type' => 'string',
                            'description' => 'City and country, e.g., London, UK',
                        ],
                    ],
                    'required' => ['location'],
                ],
            ],
            [
                'name' => 'search_web',
                'description' => 'Search the web for recent information',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'query' => [
                            'type' => 'string',
                            'description' => 'Search query',
                        ],
                    ],
                    'required' => ['query'],
                ],
            ],
            [
                'name' => 'calculate',
                'description' => 'Perform mathematical calculations',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'expression' => [
                            'type' => 'string',
                            'description' => 'Mathematical expression, e.g., 2 + 2 * 3',
                        ],
                    ],
                    'required' => ['expression'],
                ],
            ],
        ];
    }

    public function processRequest(string $request)
    {
        $messages = [
            ['role' => 'system', 'content' => 'You are a helpful assistant with access to tools. Use them when needed.'],
            ['role' => 'user', 'content' => $request],
        ];

        return $this->processMessages($messages);
    }

    protected function processMessages(array $messages)
    {
        $response = OpenAI::chat()->create([
            'model' => 'gpt-4o',
            'messages' => $messages,
            'functions' => $this->tools,
            'function_call' => 'auto',
        ]);

        $message = $response->choices[0]->message;

        if (isset($message->function_call)) {
            // The model requested a function call
            $functionName = $message->function_call->name;
            $arguments = json_decode($message->function_call->arguments, true);

            // Execute the tool
            $functionResult = $this->executeFunction($functionName, $arguments);

            // Append the call and its result to the message chain
            $messages[] = $message;
            $messages[] = [
                'role' => 'function',
                'name' => $functionName,
                'content' => json_encode($functionResult),
            ];

            // Recurse until the model produces a final answer
            return $this->processMessages($messages);
        }

        // Return the final answer
        return $message->content;
    }

    protected function executeFunction(string $functionName, array $arguments)
    {
        switch ($functionName) {
            case 'get_weather':
                return $this->getWeather($arguments['location']);
            case 'search_web':
                return $this->searchWeb($arguments['query']);
            case 'calculate':
                return $this->calculate($arguments['expression']);
            default:
                throw new \Exception("Unknown function: {$functionName}");
        }
    }

    protected function getWeather(string $location)
    {
        // In production, call a real weather API; static data for illustration
        return [
            'location' => $location,
            'temperature' => 22,
            'condition' => 'Sunny',
            'humidity' => 65,
            'wind_speed' => 10,
        ];
    }

    protected function searchWeb(string $query)
    {
        // In production, call a real search API; static data for illustration
        return [
            'query' => $query,
            'results' => [
                ['title' => 'Result 1', 'snippet' => 'This is a search result'],
                ['title' => 'Result 2', 'snippet' => 'This is another result'],
            ],
        ];
    }

    protected function calculate(string $expression)
    {
        // Whitelist digits and arithmetic operators before evaluating;
        // never pass unvalidated input to eval()
        if (!preg_match('/^[\d\s+\-*\/().%]+$/', $expression)) {
            return ['error' => 'Invalid expression'];
        }

        try {
            return ['result' => eval("return {$expression};")];
        } catch (\Throwable $e) {
            return ['error' => 'Invalid expression'];
        }
    }
}

2.4 Advanced Multimodal Processing

Image analysis and generation

use Laravel\OpenAI\Facades\OpenAI;

class ImageAnalysisService
{
    public function analyzeImage(string $imageUrl, string $prompt = 'Describe this image in detail')
    {
        $response = OpenAI::chat()->create([
            'model' => 'gpt-4o',
            'messages' => [
                [
                    'role' => 'user',
                    'content' => [
                        ['type' => 'text', 'text' => $prompt],
                        [
                            'type' => 'image_url',
                            'image_url' => [
                                'url' => $imageUrl,
                                'detail' => 'high',
                            ],
                        ],
                    ],
                ],
            ],
            'max_tokens' => 1000,
        ]);

        return $response->choices[0]->message->content;
    }

    public function generateImage(string $prompt, array $options = [])
    {
        $response = OpenAI::images()->generate(array_merge([
            'model' => 'dall-e-3', // 'quality' and 'style' are DALL-E 3 options
            'prompt' => $prompt,
            'n' => 1,
            'size' => '1024x1024',
            'quality' => 'hd',
            'style' => 'natural',
        ], $options));

        return $response->data[0]->url;
    }

    public function editImage(string $imagePath, string $maskPath, string $prompt)
    {
        $response = OpenAI::images()->edit([
            'image' => fopen($imagePath, 'r'),
            'mask' => fopen($maskPath, 'r'),
            'prompt' => $prompt,
            'n' => 1,
            'size' => '1024x1024',
        ]);

        return $response->data[0]->url;
    }
}

Audio processing

use Laravel\OpenAI\Facades\OpenAI;

class AudioService
{
    public function transcribe(string $audioPath, array $options = [])
    {
        $response = OpenAI::audio()->transcribe(array_merge([
            'model' => 'whisper-1',
            'file' => fopen($audioPath, 'r'),
            'language' => 'en',
            'prompt' => '',
            'response_format' => 'json',
            'temperature' => 0,
        ], $options));

        return $response->text;
    }

    public function translate(string $audioPath, array $options = [])
    {
        $response = OpenAI::audio()->translate(array_merge([
            'model' => 'whisper-1',
            'file' => fopen($audioPath, 'r'),
            'prompt' => '',
            'response_format' => 'json',
            'temperature' => 0,
        ], $options));

        return $response->text;
    }

    public function generateSpeech(string $text, array $options = [])
    {
        $response = OpenAI::audio()->speech(array_merge([
            'model' => 'tts-1',
            'input' => $text,
            'voice' => 'alloy',
            'response_format' => 'mp3',
            'speed' => 1.0,
        ], $options));

        // Save the audio file
        $outputPath = storage_path('app/audio/' . uniqid() . '.mp3');
        file_put_contents($outputPath, $response);

        return $outputPath;
    }
}

2.5 Performance Optimization and Best Practices

Batch processing

use Laravel\OpenAI\Facades\OpenAI;
use Illuminate\Support\Arr;

class BatchProcessingService
{
    public function processMultiplePrompts(array $prompts, array $options = [])
    {
        $results = [];
        $batchSize = $options['batch_size'] ?? 5;

        // Process the prompts in chunks
        foreach (array_chunk($prompts, $batchSize) as $batch) {
            $batchResults = $this->processBatch($batch, $options);
            $results = array_merge($results, $batchResults);

            // Pause between chunks to stay under rate limits
            sleep($options['delay'] ?? 1);
        }

        return $results;
    }

    protected function processBatch(array $prompts, array $options = [])
    {
        $results = [];

        // Strip our own options so only API parameters reach the client
        $apiOptions = Arr::except($options, ['batch_size', 'delay', 'system_prompt']);

        foreach ($prompts as $prompt) {
            try {
                $response = OpenAI::chat()->create(array_merge([
                    'model' => 'gpt-3.5-turbo',
                    'messages' => [
                        ['role' => 'system', 'content' => $options['system_prompt'] ?? 'You are a helpful assistant.'],
                        ['role' => 'user', 'content' => $prompt],
                    ],
                    'max_tokens' => 500,
                    'temperature' => 0.7,
                ], $apiOptions));

                $results[] = [
                    'prompt' => $prompt,
                    'response' => $response->choices[0]->message->content,
                    'success' => true,
                ];
            } catch (\Exception $e) {
                $results[] = [
                    'prompt' => $prompt,
                    'error' => $e->getMessage(),
                    'success' => false,
                ];
            }
        }

        return $results;
    }
}

Caching strategies

use Laravel\OpenAI\Facades\OpenAI;
use Illuminate\Support\Facades\Cache;

class CachedOpenAIService
{
    public function chat(array $params)
    {
        // Key the cache on the full parameter set
        $cacheKey = 'openai:chat:' . md5(json_encode($params));

        // Return the cached response, or call the API and cache the result
        return Cache::remember($cacheKey, 3600, function () use ($params) {
            return OpenAI::chat()->create($params);
        });
    }

    public function embeddings(array $params)
    {
        $cacheKey = 'openai:embeddings:' . md5(json_encode($params));

        // Embeddings are deterministic, so a longer TTL is safe
        return Cache::remember($cacheKey, 86400, function () use ($params) {
            return OpenAI::embeddings()->create($params);
        });
    }
}

2.6 Error Handling and Fault Tolerance

use Laravel\OpenAI\Facades\OpenAI;
use GuzzleHttp\Exception\ClientException;
use GuzzleHttp\Exception\ServerException;

class ResilientOpenAIService
{
    public function chat(array $params, int $maxRetries = 3)
    {
        $attempt = 0;

        while ($attempt < $maxRetries) {
            try {
                return OpenAI::chat()->create($params);
            } catch (ClientException $e) {
                $statusCode = $e->getResponse()->getStatusCode();

                if ($statusCode === 429) {
                    // Rate limited: honor the server's Retry-After hint
                    $retryAfter = $this->getRetryAfter($e);
                    sleep($retryAfter);
                } elseif ($statusCode >= 400 && $statusCode < 500) {
                    // Other client errors will not succeed on retry
                    throw $e;
                }
            } catch (ServerException $e) {
                // Server errors are worth retrying
                sleep(1);
            } catch (\Exception $e) {
                // Anything else: brief pause, then retry
                sleep(1);
            }

            $attempt++;
        }

        throw new \Exception("Failed to get response after {$maxRetries} attempts");
    }

    protected function getRetryAfter(ClientException $e)
    {
        $response = $e->getResponse();
        $retryAfter = $response->getHeaderLine('Retry-After');

        if (!empty($retryAfter)) {
            return (int) $retryAfter;
        }

        // Default retry delay
        return 2;
    }
}

3. Vector Database Integration

Laravel 12 has built-in support for vector databases, which makes it straightforward to build recommendation systems, similarity search, and related features.

3.1 Installation and Configuration

Install a vector database client

# Install the Pinecone client
composer require pinecone-io/pinecone-php

# Or install the Chroma client
composer require chromadb/chromadb-php

Configure the vector store connection

// config/ai.php
'vectors' => [
    'default' => env('VECTOR_STORE_DEFAULT', 'pinecone'),

    'stores' => [
        'pinecone' => [
            'driver' => 'pinecone',
            'api_key' => env('PINECONE_API_KEY'),
            'environment' => env('PINECONE_ENVIRONMENT'),
            'index' => env('PINECONE_INDEX'),
        ],
        'chroma' => [
            'driver' => 'chroma',
            'host' => env('CHROMA_HOST', 'http://localhost:8000'),
            'collection' => env('CHROMA_COLLECTION', 'default'),
        ],
    ],
],

3.2 Basic Usage

Generating embeddings

use Laravel\AI\Facades\AI;

// Generate a text embedding
$embedding = AI::embeddings()->create([
    'model' => 'text-embedding-ada-002',
    'input' => 'Laravel is a PHP framework',
]);

$vector = $embedding->embeddings[0]->embedding;

Storing vectors

use Laravel\AI\Facades\AI;

// Store the vector in Pinecone
AI::vectors()->add([
    'id' => 'document-1',
    'values' => $vector,
    'metadata' => [
        'title' => 'Laravel Introduction',
        'type' => 'document',
    ],
]);

Similarity search

use Laravel\AI\Facades\AI;

// Search for similar vectors
$results = AI::vectors()->search([
    'query' => $queryVector,
    'topK' => 5,
    'filter' => [
        'type' => 'document',
    ],
]);

foreach ($results as $result) {
    echo $result['id'] . ': ' . $result['score'] . "\n";
}
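
The score returned for each match is typically a cosine similarity between the query vector and the stored vector. A minimal, self-contained sketch of that computation (plain PHP, independent of any vector store):

```php
/**
 * Cosine similarity between two equal-length vectors:
 * dot(a, b) / (|a| * |b|), ranging from -1 to 1.
 */
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

echo cosineSimilarity([1.0, 0.0], [1.0, 0.0]); // identical direction: 1
echo cosineSimilarity([1.0, 0.0], [0.0, 1.0]); // orthogonal: 0
```

Scores close to 1 mean the stored item points in nearly the same direction as the query in embedding space.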

3.3 Advanced Vector Database Applications

Building a recommendation system

use Laravel\AI\Facades\AI;

class RecommendationService
{
    public function getRecommendations(User $user, int $limit = 10)
    {
        // Build the user's preference vector
        $userVector = $this->getUserPreferenceVector($user);

        // Search for similar content
        $results = AI::vectors()->search([
            'query' => $userVector,
            'topK' => $limit,
            'filter' => [
                'type' => 'content',
                'status' => 'published',
            ],
        ]);

        // Hydrate the matching content models
        return Content::whereIn('id', collect($results)->pluck('id')->all())->get();
    }

    protected function getUserPreferenceVector(User $user)
    {
        // Derive a preference vector from the user's viewing history
        $history = $user->viewHistory->pluck('content')->implode(' ');

        $embedding = AI::embeddings()->create([
            'model' => 'text-embedding-ada-002',
            'input' => $history,
        ]);

        return $embedding->embeddings[0]->embedding;
    }
}

Semantic search

use Laravel\AI\Facades\AI;

class SearchService
{
    public function semanticSearch(string $query, int $limit = 10)
    {
        // Embed the query
        $embedding = AI::embeddings()->create([
            'model' => 'text-embedding-ada-002',
            'input' => $query,
        ]);

        $queryVector = $embedding->embeddings[0]->embedding;

        // Search for similar documents
        $results = AI::vectors()->search([
            'query' => $queryVector,
            'topK' => $limit,
            'filter' => [
                'type' => 'document',
                'status' => 'published',
            ],
        ]);

        // Hydrate the matching document models
        return Document::whereIn('id', collect($results)->pluck('id')->all())->get();
    }
}

4. AI-Powered Route Generation

Laravel 12 can generate route code from natural-language descriptions using AI.

4.1 Basic Usage

Generating route code

# Generate a route with an Artisan command
php artisan route:generate "Create a route for user profile with authentication"

Interactive generation

# Generate routes interactively
php artisan route:generate --interactive

4.2 Route Generation Examples

Generating RESTful API routes

php artisan route:generate "Create RESTful API routes for posts with index, show, store, update, destroy methods"

Generated route code:

// routes/api.php
Route::apiResource('posts', PostController::class);

Generating routes with middleware

php artisan route:generate "Create a route group for admin with auth and role middleware"

Generated route code:

// routes/web.php
Route::middleware(['auth', 'role:admin'])->group(function () {
    Route::get('/admin/dashboard', [AdminController::class, 'dashboard'])->name('admin.dashboard');
    Route::resource('/admin/users', AdminUserController::class);
    Route::resource('/admin/posts', AdminPostController::class);
});

4.3 Customizing Route Generation

Configuring generation rules

// config/ai.php
'route' => [
    'generator' => [
        'model' => 'gpt-4',
        'temperature' => 0.3,
        'max_tokens' => 1000,
        'rules' => [
            'use_resource_controllers' => true,
            'prefix_api_routes' => true,
            'use_named_routes' => true,
        ],
    ],
],

Custom prompt templates

// Custom route-generation prompt
$prompt = "Generate Laravel route code for: {$description}\n\n";
$prompt .= "Follow these rules:\n";
$prompt .= "1. Use named routes\n";
$prompt .= "2. Apply appropriate middleware\n";
$prompt .= "3. Use resource controllers where appropriate\n";
$prompt .= "4. Return only the route code, no explanations\n";

$response = OpenAI::chat()->create([
    'model' => 'gpt-4',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a Laravel route generator.'],
        ['role' => 'user', 'content' => $prompt],
    ],
]);

$routeCode = $response->choices[0]->message->content;

5. AI-Assisted Testing

Laravel 12 can use AI to generate test cases for you.

5.1 Generating Test Cases

Basic usage

# Generate tests for a controller (quote the class name so the backslashes survive the shell)
php artisan test:generate "App\Http\Controllers\UserController"

Generating unit tests

# Generate unit tests for a service class
php artisan test:generate "App\Services\UserService" --type=unit

5.2 Example Generated Test

Generated test code:

<?php

namespace Tests\Feature\Http\Controllers;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;
use App\Models\User;

class UserControllerTest extends TestCase
{
    use RefreshDatabase;

    /**
     * Test the index method
     */
    public function test_index_returns_all_users()
    {
        // Arrange: create test users
        User::factory()->count(3)->create();

        // Act
        $response = $this->get('/users');

        // Assert
        $response->assertStatus(200);
        $response->assertJsonCount(3);
    }

    /**
     * Test the show method
     */
    public function test_show_returns_single_user()
    {
        // Arrange: create a test user
        $user = User::factory()->create();

        // Act
        $response = $this->get('/users/' . $user->id);

        // Assert
        $response->assertStatus(200);
        $response->assertJson(['id' => $user->id]);
    }

    /**
     * Test the store method
     */
    public function test_store_creates_new_user()
    {
        // Arrange
        $userData = [
            'name' => 'Test User',
            'email' => 'test@example.com',
            'password' => 'password123',
        ];

        // Act
        $response = $this->post('/users', $userData);

        // Assert
        $response->assertStatus(201);
        $this->assertDatabaseHas('users', ['email' => 'test@example.com']);
    }
}

6. Content Generator

Laravel 12 includes AI-powered content generation tools that can produce many types of content.

6.1 Basic Usage

Generating article summaries

use Laravel\AI\Facades\AI;

$summary = AI::content()->summarize([
    'text' => $longArticle,
    'maxLength' => 200,
    'format' => 'paragraph',
]);

Generating product descriptions

use Laravel\AI\Facades\AI;

$description = AI::content()->generate([
    'prompt' => 'Generate a product description for wireless headphones with noise cancellation',
    'maxLength' => 500,
    'tone' => 'professional',
]);

6.2 Advanced Content Generation

Multilingual content generation

use Laravel\AI\Facades\AI;

class ContentGenerator
{
    public function translate(string $text, string $targetLanguage)
    {
        return AI::content()->translate([
            'text' => $text,
            'targetLanguage' => $targetLanguage,
            'sourceLanguage' => 'auto',
        ]);
    }

    public function generateMultiLanguageContent(string $prompt, array $languages)
    {
        $content = [];

        foreach ($languages as $language) {
            $content[$language] = AI::content()->generate([
                'prompt' => $prompt,
                'maxLength' => 500,
                'language' => $language,
            ]);
        }

        return $content;
    }
}

SEO content optimization

use Laravel\AI\Facades\AI;

class SeoService
{
    public function optimizeContent(string $content, array $keywords)
    {
        return AI::content()->optimize([
            'content' => $content,
            'keywords' => $keywords,
            'target' => 'seo',
            'maxLength' => 1000,
        ]);
    }

    public function generateMetaTags(string $content)
    {
        return AI::content()->generate([
            'prompt' => "Generate SEO meta title and description for: {$content}",
            'maxLength' => 300,
            'format' => 'json',
        ]);
    }
}

7. Case Study: Building an Intelligent Customer Service System

The following case study uses Laravel 12's AI features to build an intelligent customer service system.

7.1 System Architecture

  • Frontend: React chat interface
  • Backend: Laravel 12 + OpenAI API
  • Data storage: MySQL + Redis
  • Vector store: Pinecone

7.2 Core Features

  • Smart Q&A: answers powered by OpenAI GPT models
  • Context management: maintains conversation context for coherent replies
  • Knowledge base integration: stores and searches the knowledge base in a vector database
  • Intent recognition: detects user intent to give precise answers
  • Multi-turn dialogue: supports extended back-and-forth conversations

7.3 Implementation

Chat controller

<?php

namespace App\Http\Controllers;

use App\Services\ChatService;
use Illuminate\Http\Request;

class ChatController extends Controller
{
    protected $chatService;

    public function __construct(ChatService $chatService)
    {
        $this->chatService = $chatService;
    }

    public function sendMessage(Request $request)
    {
        $message = $request->input('message');
        $userId = $request->user()->id;
        $sessionId = $request->input('session_id', uniqid());

        $response = $this->chatService->processMessage($message, $userId, $sessionId);

        return response()->json([
            'message' => $response,
            'session_id' => $sessionId,
        ]);
    }
}

Chat service

<?php

namespace App\Services;

use Laravel\OpenAI\Facades\OpenAI;
use Laravel\AI\Facades\AI;
use Illuminate\Support\Facades\Redis;

class ChatService
{
    public function processMessage(string $message, int $userId, string $sessionId)
    {
        // Fetch the conversation history
        $history = $this->getConversationHistory($sessionId);

        // Generate the reply
        $response = $this->generateResponse($message, $history);

        // Persist the updated conversation history
        $this->updateConversationHistory($sessionId, $message, $response);

        return $response;
    }

    protected function getConversationHistory(string $sessionId)
    {
        $key = "chat:session:{$sessionId}";
        $history = Redis::get($key);

        // History is stored as JSON, so decode it back to an array
        return $history ? json_decode($history, true) : [];
    }

    protected function updateConversationHistory(string $sessionId, string $userMessage, string $botResponse)
    {
        $key = "chat:session:{$sessionId}";
        $history = $this->getConversationHistory($sessionId);

        $history[] = ['role' => 'user', 'content' => $userMessage];
        $history[] = ['role' => 'assistant', 'content' => $botResponse];

        // Keep only the 10 most recent messages
        $history = array_slice($history, -10);

        Redis::setex($key, 3600, json_encode($history));
    }

    protected function generateResponse(string $message, array $history)
    {
        // Build the context
        $context = array_merge([
            ['role' => 'system', 'content' => 'You are a helpful customer service assistant for an e-commerce store.'],
        ], $history, [
            ['role' => 'user', 'content' => $message],
        ]);

        // Search the knowledge base
        $knowledge = $this->searchKnowledgeBase($message);
        if (!empty($knowledge)) {
            $context[] = ['role' => 'system', 'content' => "Knowledge base information: {$knowledge}"];
        }

        // Call the OpenAI API
        $response = OpenAI::chat()->create([
            'model' => 'gpt-4',
            'messages' => $context,
            'temperature' => 0.7,
        ]);

        return $response->choices[0]->message->content;
    }

    protected function searchKnowledgeBase(string $query)
    {
        // Embed the query
        $embedding = AI::embeddings()->create([
            'model' => 'text-embedding-ada-002',
            'input' => $query,
        ]);

        $queryVector = $embedding->embeddings[0]->embedding;

        // Search the knowledge base
        $results = AI::vectors()->search([
            'query' => $queryVector,
            'topK' => 3,
            'filter' => [
                'type' => 'knowledge',
            ],
        ]);

        // Concatenate the matched knowledge snippets
        return collect($results)->pluck('metadata.content')->implode(' ');
    }
}

8. Best Practices and Considerations

8.1 Optimizing API Calls

  • Batch requests: merge multiple API requests to reduce call volume
  • Cache responses: cache frequently reused AI responses
  • Process asynchronously: push slow AI tasks onto queues
  • Handle errors: implement robust error-handling and retry logic
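
As a sketch of the asynchronous-processing point, a queued job keeps slow AI calls out of the web request cycle. The job uses Laravel's standard queue traits; the Article model and SummaryService named here are illustrative, not part of the framework:

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class GenerateSummaryJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $tries = 3;         // retry transient API failures
    public $backoff = [5, 30]; // seconds to wait between attempts

    public function __construct(public int $articleId)
    {
    }

    public function handle(): void
    {
        $article = Article::findOrFail($this->articleId);

        // The slow AI call runs on a queue worker, not in the web request
        $article->summary = app(SummaryService::class)->summarize($article->body);
        $article->save();
    }
}

// In a controller: respond immediately, summarize in the background
GenerateSummaryJob::dispatch($article->id);
```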

8.2 Prompt Engineering

  • Clear instructions: give explicit, specific directions
  • Few-shot examples: use examples to steer the model toward the desired output
  • Context management: keep context within model limits, trimming old turns when needed
  • Temperature: tune the temperature parameter to the task (lower for factual work, higher for creative work)
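
Putting the first two bullets together, a few-shot prompt is just a plain messages array: an explicit instruction followed by worked examples that show the exact input/output shape (the review texts here are illustrative):

```php
$messages = [
    // Clear instruction: state the task and the exact output format
    ['role' => 'system', 'content' => 'Classify the sentiment of each review as "positive" or "negative". Reply with one word only.'],

    // Few-shot examples: demonstrate the expected behavior
    ['role' => 'user', 'content' => 'The checkout flow was fast and painless.'],
    ['role' => 'assistant', 'content' => 'positive'],
    ['role' => 'user', 'content' => 'My order arrived broken and support never replied.'],
    ['role' => 'assistant', 'content' => 'negative'],

    // The actual input to classify
    ['role' => 'user', 'content' => 'Great headphones, terrible battery life... but I still love them.'],
];
```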

8.3 Cost Control

  • Pick the right model: match model capability (and price) to task complexity
  • Tighten prompts: shorter prompts cost less and generate faster
  • Cache results: reuse answers to repeated queries
  • Batch work: process multiple requests together
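
Cost control is easier when every call's cost is visible. A minimal estimator (the per-1K-token rates below are illustrative placeholders; check the provider's current price list):

```php
/**
 * Rough cost estimate for a chat call, billed per 1K tokens.
 * The default rates are placeholders, not real prices.
 */
function estimateCostUsd(int $promptTokens, int $completionTokens, float $inRate = 0.01, float $outRate = 0.03): float
{
    return ($promptTokens / 1000) * $inRate + ($completionTokens / 1000) * $outRate;
}

// 1,200 prompt tokens + 400 completion tokens at the placeholder rates
echo estimateCostUsd(1200, 400); // 0.024
```

Logging this per request makes it obvious when a cheaper model or a shorter prompt would do.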

8.4 Security

  • Validate input: check user input to guard against prompt injection
  • Filter output: screen generated content to block harmful output
  • Protect API keys: store keys securely in environment variables, never in code
  • Rate limit: throttle API usage to prevent abuse
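
A first line of defense against prompt injection is cheap input hygiene before the text ever reaches the model. This sketch caps length and rejects obvious override attempts; the patterns are illustrative and reduce, but do not eliminate, the risk:

```php
/**
 * Basic input hygiene before sending user text to a model.
 */
function sanitizeUserPrompt(string $input, int $maxLength = 2000): string
{
    // Cap length: runaway inputs waste tokens and widen the attack surface
    $input = mb_substr(trim($input), 0, $maxLength);

    // Reject obvious instruction-override attempts (illustrative patterns)
    $suspicious = [
        '/ignore (all )?(previous|prior) instructions/i',
        '/you are now\b/i',
        '/\bsystem prompt\b/i',
    ];

    foreach ($suspicious as $pattern) {
        if (preg_match($pattern, $input)) {
            throw new InvalidArgumentException('Input rejected by prompt filter');
        }
    }

    return $input;
}

echo sanitizeUserPrompt('How do I reset my password?'); // passes through unchanged
```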

9. Conclusion

Laravel 12's AI integrations give developers a powerful toolkit for building intelligent applications. From the OpenAI client to vector database support, and from AI-powered route generation to content generators, these features greatly simplify AI application development.

With the material in this article, developers can quickly get up to speed with Laravel 12's AI capabilities and apply them to real projects such as content generation, intelligent customer service, and recommendation systems. Following the best practices and considerations above helps keep AI applications performant, cost-effective, and secure.

As AI technology evolves, Laravel continues to improve its AI features, and future releases are likely to bring more AI tools and integrations. Developers should keep an eye on Laravel's updates and adopt new capabilities as they arrive, delivering smarter, more personalized experiences to their users.