Use the AiFoundation module
This document describes how to integrate and use the AiFoundation module to interact with various AI providers in your Spryker application. The AiFoundation module provides a unified interface for working with multiple AI providers, such as OpenAI, Anthropic Claude, AWS Bedrock, and others.
About the NeuronAI framework
The AiFoundation module uses the NeuronAI PHP agentic framework under the hood. NeuronAI provides the foundational infrastructure for AI provider integrations.
The Spryker AiFoundation client is designed for simple use cases where you need to send prompts to AI providers and receive responses. This covers most common AI integration scenarios in e-commerce applications.
For advanced agentic solutions that require complex workflows, multi-agent systems, or custom AI behaviors, you can use the NeuronAI framework directly in your project code. However, note that Spryker does not officially support direct usage of NeuronAI APIs outside of the AiFoundation module. If you choose to use NeuronAI directly, you are responsible for maintenance and compatibility with future versions.
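For orientation only, the following is a rough sketch of what direct NeuronAI usage can look like, based on the framework's public examples. The MyAgent class is hypothetical, and the provider class name, named constructor arguments, and chat API are assumptions that may differ across NeuronAI versions; verify them against the NeuronAI documentation before relying on them.

<?php

use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;

// Hypothetical custom agent, maintained by your project, not by Spryker.
class MyAgent extends Agent
{
    protected function provider(): AIProviderInterface
    {
        // Provider and model are examples; configure them to your needs.
        return new Anthropic(
            key: getenv('ANTHROPIC_API_KEY') ?: '',
            model: 'claude-sonnet-4-20250514',
        );
    }
}

// Send a single message and read the reply.
$response = MyAgent::make()->chat(new UserMessage('Hi!'));
echo $response->getContent();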
Install the AiFoundation module
- Require the package:

  composer require spryker/ai-foundation

- Generate transfers:

  console transfer:generate
Configure AI providers
Configure AI providers in a dedicated configuration file. The module uses the AI_CONFIGURATIONS constant to define one or more AI configurations.
Create a separate configuration file for AI settings to keep your configuration organized and maintainable.
Store API keys as environment variables, not in configuration files. For Spryker Cloud, use the parameter store to manage sensitive credentials. For details, see Add variables in the parameter store.
- Create a new configuration file config/Shared/config_ai.php:

  <?php

  use Spryker\Shared\AiFoundation\AiFoundationConstants;

  // AI provider configurations
  $config[AiFoundationConstants::AI_CONFIGURATIONS] = [
      // Your AI configurations will be defined here
  ];

- Include the AI configuration file in your main configuration file (for example, config/Shared/config_default.php):

  <?php

  require 'config_ai.php';
Alternatively, you can define AI configurations directly in config/Shared/config_default.php if you prefer a single configuration file approach.
Configuration structure
Each AI configuration supports the following keys:
- provider_name: The AI provider identifier (required)
- provider_config: Provider-specific configuration (required)
- system_prompt: Default system prompt for the AI provider (optional for most providers; required for AWS Bedrock)
Default configuration
The module automatically falls back to the configuration named AI_CONFIGURATION_DEFAULT when the PromptRequest does not specify a configuration name. Define at least this default configuration:
<?php
use Spryker\Shared\AiFoundation\AiFoundationConstants;
$config[AiFoundationConstants::AI_CONFIGURATIONS] = [
AiFoundationConstants::AI_CONFIGURATION_DEFAULT => [
'provider_name' => AiFoundationConstants::PROVIDER_OPENAI,
'provider_config' => [
'key' => getenv('OPENAI_API_KEY'),
'model' => 'gpt-4o',
],
'system_prompt' => 'You are a helpful assistant.',
],
];
Provider configuration examples
OpenAI
'openai-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_OPENAI,
'provider_config' => [
'key' => getenv('OPENAI_API_KEY'), // required
'model' => 'gpt-4o', // required
'parameters' => [], // optional
'httpOptions' => [ // optional
'timeout' => 60,
'connectTimeout' => 5,
'headers' => [],
],
],
'system_prompt' => 'You are a helpful assistant.', // optional
],
Anthropic Claude
'anthropic-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_ANTHROPIC,
'provider_config' => [
'key' => getenv('ANTHROPIC_API_KEY'), // required
'model' => 'claude-sonnet-4-20250514', // required
'version' => '2023-06-01', // optional
'max_tokens' => 8192, // optional
'parameters' => [], // optional
'httpOptions' => [ // optional
'timeout' => 60,
],
],
],
AWS Bedrock
AWS Bedrock requires a system_prompt configuration. AWS credentials are automatically loaded from environment variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN.
'bedrock-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_BEDROCK,
'provider_config' => [
'model' => 'eu.anthropic.claude-sonnet-4-20250514-v1:0', // required
'bedrockRuntimeClient' => [ // required
'region' => 'eu-west-1', // required
'version' => 'latest', // optional
],
],
'system_prompt' => 'You are a helpful assistant.', // required for Bedrock
],
Ollama (local/self-hosted)
If Ollama runs outside the Docker SDK on macOS, use http://host.docker.internal:11434/api as the URL to access the host machine from within Docker containers.
'ollama-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_OLLAMA,
'provider_config' => [
'url' => 'http://host.docker.internal:11434/api', // required - use host.docker.internal for Mac when Ollama runs outside Docker
'model' => 'llama3.2', // required
'parameters' => [], // optional
'httpOptions' => [ // optional
'timeout' => 60,
'connectTimeout' => 5,
],
],
],
Run Ollama with Docker SDK
To run Ollama as a service within the Spryker Docker SDK:
- Create an ollama.yml file in your project root:

  version: '3.8'
  services:
    ollama:
      image: ollama/ollama:latest
      environment:
        OLLAMA_HOST: "0.0.0.0:11435"
      volumes:
        - ./data/tmp/ollama_data:/root/.ollama
      networks:
        - private
        - public

- Reference the Ollama compose file in your deploy.dev.yml:

  compose:
    yamls: ['./ollama.yml']

- Update your AI configuration to use the Ollama service URL:

  'ollama-config' => [
      'provider_name' => AiFoundationConstants::PROVIDER_OLLAMA,
      'provider_config' => [
          'url' => 'http://ollama:11435/api', // use service name when running inside Docker SDK
          'model' => 'llama3.2',
      ],
  ],

- Deploy the changes:

  docker/sdk boot deploy.dev.yml
  docker/sdk up

- Pull the required Ollama model:

  docker/sdk cli exec -c ollama ollama pull llama3.2
The Ollama data is stored in the ./data/tmp/ollama_data directory; exclude it from version control and the Docker build context by adding it to .gitignore and .dockerignore.
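For example, the ignore entry could look like this in both files:

# local Ollama model data; do not commit
data/tmp/ollama_data/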
Google Gemini
'gemini-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_GEMINI,
'provider_config' => [
'key' => getenv('GEMINI_API_KEY'), // required
'model' => 'gemini-2.0-flash', // required
'parameters' => [], // optional
],
],
Deepseek
'deepseek-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_DEEPSEEK,
'provider_config' => [
'key' => getenv('DEEPSEEK_API_KEY'), // required
'model' => 'deepseek-chat', // required
'parameters' => [], // optional
],
],
HuggingFace
'huggingface-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_HUGGINGFACE,
'provider_config' => [
'key' => getenv('HUGGINGFACE_API_KEY'), // required
'model' => 'meta-llama/Llama-3.3-70B-Instruct', // required
'parameters' => [], // optional
],
],
Mistral AI
'mistral-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_MISTRAL,
'provider_config' => [
'key' => getenv('MISTRAL_API_KEY'), // required
'model' => 'mistral-large-latest', // required
'parameters' => [], // optional
],
],
xAI Grok
'grok-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_GROK,
'provider_config' => [
'key' => getenv('XAI_API_KEY'), // required
'model' => 'grok-2-latest', // required
'parameters' => [], // optional
],
],
Azure OpenAI
'azure-openai-config' => [
'provider_name' => AiFoundationConstants::PROVIDER_AZURE_OPEN_AI,
'provider_config' => [
'key' => getenv('AZURE_OPENAI_API_KEY'), // required
'endpoint' => 'https://your-resource.openai.azure.com', // required
'model' => 'your-deployment-name', // required
'version' => '2024-02-01', // optional
'parameters' => [], // optional
],
],
Use the AiFoundation client
Basic usage
<?php
namespace Pyz\Zed\YourModule\Business;
use Generated\Shared\Transfer\PromptMessageTransfer;
use Generated\Shared\Transfer\PromptRequestTransfer;
use Spryker\Client\AiFoundation\AiFoundationClientInterface;
class YourBusinessModel
{
public function __construct(
protected AiFoundationClientInterface $aiFoundationClient
) {
}
public function generateContent(string $userMessage): string
{
$promptRequest = (new PromptRequestTransfer())
->setPromptMessage(
(new PromptMessageTransfer())->setContent($userMessage)
);
$response = $this->aiFoundationClient->prompt($promptRequest);
return $response->getMessage()->getContent();
}
}
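The business model above expects the client to be injected. As a minimal sketch of the wiring, assuming the standard Spryker dependency provider pattern (YourModule and the constant name are placeholders):

<?php

namespace Pyz\Zed\YourModule;

use Spryker\Zed\Kernel\AbstractBundleDependencyProvider;
use Spryker\Zed\Kernel\Container;

class YourModuleDependencyProvider extends AbstractBundleDependencyProvider
{
    public const CLIENT_AI_FOUNDATION = 'CLIENT_AI_FOUNDATION';

    public function provideBusinessLayerDependencies(Container $container): Container
    {
        // Register the AiFoundation client so the business factory can inject it.
        $container->set(static::CLIENT_AI_FOUNDATION, function (Container $container) {
            return $container->getLocator()->aiFoundation()->client();
        });

        return $container;
    }
}

Your module's business factory can then fetch the client with getProvidedDependency() and pass it to the business model's constructor.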
Using a specific configuration
Specify a configuration name to use a configuration other than the default:
$promptRequest = (new PromptRequestTransfer())
->setAiConfigurationName('anthropic-config')
->setPromptMessage(
(new PromptMessageTransfer())->setContent('Explain Spryker modules')
);
$response = $this->aiFoundationClient->prompt($promptRequest);
Multiple configurations example
Configure multiple AI providers for different use cases in your application:
<?php
use Spryker\Shared\AiFoundation\AiFoundationConstants;
$config[AiFoundationConstants::AI_CONFIGURATIONS] = [
AiFoundationConstants::AI_CONFIGURATION_DEFAULT => [
'provider_name' => AiFoundationConstants::PROVIDER_OPENAI,
'provider_config' => [
'key' => getenv('OPENAI_API_KEY'),
'model' => 'gpt-4o',
],
],
'fast-responses' => [
'provider_name' => AiFoundationConstants::PROVIDER_OPENAI,
'provider_config' => [
'key' => getenv('OPENAI_API_KEY'),
'model' => 'gpt-4o-mini',
],
'system_prompt' => 'Provide concise, brief responses.',
],
];
Available provider constants
The module provides the following provider constants in AiFoundationConstants:
- PROVIDER_OPENAI - OpenAI (ChatGPT)
- PROVIDER_ANTHROPIC - Anthropic Claude
- PROVIDER_BEDROCK - AWS Bedrock Runtime
- PROVIDER_GEMINI - Google Gemini
- PROVIDER_DEEPSEEK - Deepseek AI
- PROVIDER_HUGGINGFACE - HuggingFace
- PROVIDER_MISTRAL - Mistral AI
- PROVIDER_OLLAMA - Ollama (local/self-hosted)
- PROVIDER_GROK - xAI Grok
- PROVIDER_AZURE_OPEN_AI - Azure OpenAI
Transfer objects
PromptRequest
This transfer contains the request data for AI interaction:
- promptMessage (PromptMessage, required): The message to send to the AI
- aiConfigurationName (string, optional): The configuration name to use. If not provided, uses AI_CONFIGURATION_DEFAULT
PromptMessage
This transfer represents a message in the conversation:
- content (string): The text content of the message
- contentData (array, optional): Additional structured data
- attachments (Attachment[], optional): File or image attachments
PromptResponse
This transfer contains the AI response:
- message (PromptMessage): The AI's response message
Attachment
This transfer represents a file or image attachment:
- type (string): Type of attachment (use AiFoundationConstants::ATTACHMENT_TYPE_IMAGE or ATTACHMENT_TYPE_DOCUMENT)
- content (string): The content (URL or Base64-encoded data)
- contentType (string): Content type format (use AiFoundationConstants::ATTACHMENT_CONTENT_TYPE_URL or ATTACHMENT_CONTENT_TYPE_BASE64)
- mediaType (string): MIME type (for example, image/png, application/pdf)
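As an illustration of these fields, the following sketch sends an image referenced by URL together with a prompt. It assumes the generated transfer exposes the usual Spryker collection adder (addAttachment()) for the attachments property; check the generated transfer classes in your project.

<?php

use Generated\Shared\Transfer\AttachmentTransfer;
use Generated\Shared\Transfer\PromptMessageTransfer;
use Generated\Shared\Transfer\PromptRequestTransfer;
use Spryker\Shared\AiFoundation\AiFoundationConstants;

// Describe an image attachment referenced by URL.
$attachmentTransfer = (new AttachmentTransfer())
    ->setType(AiFoundationConstants::ATTACHMENT_TYPE_IMAGE)
    ->setContent('https://example.com/product.png')
    ->setContentType(AiFoundationConstants::ATTACHMENT_CONTENT_TYPE_URL)
    ->setMediaType('image/png');

// Attach it to the prompt message; addAttachment() is the assumed adder name.
$promptRequest = (new PromptRequestTransfer())
    ->setPromptMessage(
        (new PromptMessageTransfer())
            ->setContent('Describe this product image.')
            ->addAttachment($attachmentTransfer)
    );

$response = $this->aiFoundationClient->prompt($promptRequest);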
Roadmap
The following capabilities are planned for future releases of the AiFoundation module:
Structured response
Support for requesting and parsing structured responses from AI providers. This will enable AI models to return data in predefined formats (for example, JSON schemas), making it easier to integrate AI responses directly into application workflows and business logic.
Tool call
Implementation of function calling capabilities, allowing AI models to invoke specific tools or functions during the response generation. This enables more interactive and dynamic AI interactions where the model can query external systems, perform calculations, or execute custom business logic.
Chat history capabilities
Support for maintaining conversation context across multiple interactions. This will enable multi-turn conversations where the AI can reference previous messages, maintain state, and provide more contextually relevant responses throughout an extended dialogue.