Package Data | |
---|---|
Maintainer Username: | cloudstudio |
Maintainer Contact: | hello@cloudstudio.es (Toni Soriano) |
Package Create Date: | 2023-11-19 |
Package Last Update: | 2024-08-05 |
Home Page: | |
Language: | PHP |
License: | MIT |
Last Refreshed: | 2024-11-17 03:09:19 |
Package Statistics | |
---|---|
Total Downloads: | 10,463 |
Monthly Downloads: | 2,438 |
Daily Downloads: | 34 |
Total Stars: | 314 |
Total Watchers: | 7 |
Total Forks: | 24 |
Total Open Issues: | 3 |
Ollama-Laravel is a Laravel package that provides seamless integration with the Ollama API. It includes functionality for model management, prompt generation, format setting, and more, and is aimed at developers who want to use Ollama-hosted models from their Laravel applications.
https://github.com/cloudstudio/ollama-laravel/releases/tag/v1.0.5
Install the package via Composer:

```bash
composer require cloudstudio/ollama-laravel
```
Publish the config file:

```bash
php artisan vendor:publish --tag="ollama-laravel-config"
```

Published config file:
```php
return [
    'model' => env('OLLAMA_MODEL', 'llama2'),
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),
    'default_prompt' => env('OLLAMA_DEFAULT_PROMPT', 'Hello, how can I assist you today?'),
    'connection' => [
        'timeout' => env('OLLAMA_CONNECTION_TIMEOUT', 300),
    ],
];
```
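Each entry in the published config reads from the environment, so the defaults can be overridden from `.env`. For example (the model name `mistral` is just an illustration; use whichever model your Ollama server has pulled):

```env
OLLAMA_MODEL=mistral
OLLAMA_URL=http://127.0.0.1:11434
OLLAMA_CONNECTION_TIMEOUT=300
```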
Basic usage:

```php
use Cloudstudio\Ollama\Facades\Ollama;

$response = Ollama::agent('You are a weather expert...')
    ->prompt('Why is the sky blue?')
    ->model('llama2')
    ->options(['temperature' => 0.8])
    ->stream(false)
    ->ask();
```
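The example above keeps the raw response. As a sketch only, assuming `ask()` returns the decoded payload from Ollama's `/api/generate` endpoint, the generated text would sit under the `response` key (verify against the package's actual return type):

```php
<?php
// Hypothetical payload, mirroring the shape of Ollama's /api/generate response.
$response = ['model' => 'llama2', 'response' => 'The sky appears blue because...'];

// Pull out the generated text, falling back to an empty string.
$text = $response['response'] ?? '';
```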
Vision example with a multimodal model:

```php
$response = Ollama::model('llava:13b')
    ->prompt('What is in this picture?')
    ->image(public_path('images/example.jpg'))
    ->ask();

// "The image features a close-up of a person's hand, wearing bright pink fingernail polish and blue nail polish. In addition to the colorful nails, the hand has two tattoos – one is a cross and the other is an eye."
```
Chat completion:

```php
$messages = [
    ['role' => 'user', 'content' => 'My name is Toni Soriano and I live in Spain'],
    ['role' => 'assistant', 'content' => 'Nice to meet you, Toni Soriano'],
    ['role' => 'user', 'content' => 'Where do I live?'],
];

$response = Ollama::agent('You know me really well!')
    ->model('llama2')
    ->chat($messages);

// "You mentioned that you live in Spain."
```
Model management and embeddings:

```php
// Show information about a model
$response = Ollama::model('Llama2')->show();

// Copy a model
Ollama::model('Llama2')->copy('NewModel');

// Delete a model
Ollama::model('Llama2')->delete();

// Generate embeddings for a prompt
$embeddings = Ollama::model('Llama2')->embeddings('Your prompt here');
```
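Embedding vectors become useful once compared. The helper below is a minimal, hypothetical sketch (not part of the package) that scores two vectors by cosine similarity; the commented usage assumes the `embeddings()` call returns Ollama's decoded payload with the vector under an `'embedding'` key, which should be verified against the package's actual return value:

```php
<?php
// Hypothetical helper: cosine similarity between two equal-length vectors.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];  // accumulate dot product
        $normA += $value * $value;  // squared magnitude of $a
        $normB += $b[$i] * $b[$i];  // squared magnitude of $b
    }
    return $dot / (sqrt($normA) * sqrt($normB));
}

// Assumed response shape: ['embedding' => [0.12, -0.03, ...]]
// $first  = Ollama::model('Llama2')->embeddings('The sky is blue');
// $second = Ollama::model('Llama2')->embeddings('Why is the sky blue?');
// $score  = cosineSimilarity($first['embedding'], $second['embedding']);
```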
Run the package test suite with Pest:

```bash
pest
```