# Filaforge Opensource Chat
A powerful Filament plugin that integrates various open-source AI chat capabilities directly into your admin panel.
## Features
- Multi-Model Support: Chat with various open-source AI models
- Conversation Management: Save, organize, and continue chat conversations
- Model Selection: Choose from available open-source AI models
- Customizable Settings: Configure API endpoints, models, and chat parameters
- Real-time Chat: Live chat experience with streaming responses
- Conversation History: Keep track of all your AI conversations
- Export Conversations: Save and share chat transcripts
- Role-based Access: Configurable user permissions and access control
- Context Awareness: Maintain conversation context across sessions
- Local Deployment: Support for self-hosted AI models
## Installation

### 1. Install via Composer

```bash
composer require filaforge/opensource-chat
```

### 2. Publish & Migrate

```bash
# Publish provider groups (config, views, migrations)
php artisan vendor:publish --provider="Filaforge\\OpensourceChat\\Providers\\OpensourceChatServiceProvider"

# Run migrations
php artisan migrate
```
### 3. Register Plugin

Add the plugin to your Filament panel provider:

```php
use Filament\Panel;

public function panel(Panel $panel): Panel
{
    return $panel
        // ... other configuration
        ->plugin(\Filaforge\OpensourceChat\OpensourceChatPlugin::make());
}
```
## Setup

### Configuration

The plugin will automatically:

- Publish configuration files to `config/opensource-chat.php`
- Publish view files to `resources/views/vendor/opensource-chat/`
- Publish migration files to `database/migrations/`
- Register necessary routes and middleware
### Open Source AI Configuration

Configure your open-source AI endpoints in the published config file:

```php
// config/opensource-chat.php
return [
    'default_provider' => env('OS_CHAT_PROVIDER', 'local'),

    'providers' => [
        'local' => [
            'base_url' => env('OS_CHAT_LOCAL_URL', 'http://localhost:8000'),
            'api_key' => env('OS_CHAT_LOCAL_KEY', ''),
            'model' => env('OS_CHAT_LOCAL_MODEL', 'llama3'),
        ],
        'fireworks' => [
            'base_url' => env('OS_CHAT_FIREWORKS_URL', 'https://api.fireworks.ai'),
            'api_key' => env('OS_CHAT_FIREWORKS_KEY', ''),
            'model' => env('OS_CHAT_FIREWORKS_MODEL', 'llama-v2-7b-chat'),
        ],
        'together' => [
            'base_url' => env('OS_CHAT_TOGETHER_URL', 'https://api.together.xyz'),
            'api_key' => env('OS_CHAT_TOGETHER_KEY', ''),
            'model' => env('OS_CHAT_TOGETHER_MODEL', 'meta-llama/Llama-2-7b-chat-hf'),
        ],
    ],

    'max_tokens' => env('OS_CHAT_MAX_TOKENS', 4096),
    'temperature' => env('OS_CHAT_TEMPERATURE', 0.7),
    'stream' => env('OS_CHAT_STREAM', true),
    'timeout' => env('OS_CHAT_TIMEOUT', 60),
];
```
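Since every entry in `providers` follows the same shape, any server that exposes an OpenAI-compatible chat API can in principle be added as another entry. The `groq` key, URL, and model name below are illustrative only, not defaults shipped with the plugin:

```php
// config/opensource-chat.php — hypothetical extra provider entry
'providers' => [
    // ... existing providers ...
    'groq' => [
        'base_url' => env('OS_CHAT_GROQ_URL', 'https://api.groq.com/openai'),
        'api_key'  => env('OS_CHAT_GROQ_KEY', ''),
        'model'    => env('OS_CHAT_GROQ_MODEL', 'llama3-8b-8192'),
    ],
],
```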
### Environment Variables

Add these to your `.env` file:

```env
OS_CHAT_PROVIDER=local
OS_CHAT_LOCAL_URL=http://localhost:8000
OS_CHAT_LOCAL_KEY=your_local_api_key_here
OS_CHAT_LOCAL_MODEL=llama3

OS_CHAT_FIREWORKS_URL=https://api.fireworks.ai
OS_CHAT_FIREWORKS_KEY=your_fireworks_api_key_here
OS_CHAT_FIREWORKS_MODEL=llama-v2-7b-chat

OS_CHAT_TOGETHER_URL=https://api.together.xyz
OS_CHAT_TOGETHER_KEY=your_together_api_key_here
OS_CHAT_TOGETHER_MODEL=meta-llama/Llama-2-7b-chat-hf

OS_CHAT_MAX_TOKENS=4096
OS_CHAT_TEMPERATURE=0.7
OS_CHAT_STREAM=true
OS_CHAT_TIMEOUT=60
```
### Getting API Keys

#### Fireworks AI

- Visit Fireworks AI
- Create an account and navigate to API keys
- Generate a new API key
- Copy the key to your `.env` file

#### Together AI

- Visit Together AI
- Sign up and go to the API keys section
- Create a new API key
- Copy the key to your `.env` file
#### Local Models

For local deployment, you can use:

- Ollama: local model serving
- LM Studio: desktop AI model interface
- Custom endpoints: your own AI model servers
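As one concrete local setup (using Ollama's own defaults, which differ from this plugin's `http://localhost:8000` default), Ollama serves an OpenAI-compatible API on port 11434:

```bash
# Pull a model and start the Ollama server (listens on port 11434 by default)
ollama pull llama3
ollama serve
```

Then point the local provider at it in `.env`:

```env
OS_CHAT_LOCAL_URL=http://localhost:11434
OS_CHAT_LOCAL_MODEL=llama3
```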
## Usage

### Accessing Opensource Chat
- Navigate to your Filament admin panel
- Look for the "Opensource Chat" menu item
- Start chatting with open-source AI models
### Starting a Conversation
- Select Provider: Choose from available AI providers
- Select Model: Choose the specific AI model to use
- Type Your Message: Enter your question or prompt
- Send Message: Submit your message to the AI
- View Response: See the AI's response in real-time
- Continue Chat: Keep the conversation going
### Managing Conversations
- New Chat: Start a fresh conversation
- Save Chat: Automatically save important conversations
- Load Chat: Resume previous conversations
- Export Chat: Download conversation transcripts
- Delete Chat: Remove unwanted conversations
### Advanced Features
- Provider Switching: Switch between different AI providers
- Model Selection: Choose from available models per provider
- Parameter Tuning: Adjust temperature, max tokens, and other settings
- Context Management: Maintain conversation context across sessions
- Streaming Responses: Real-time AI responses for better user experience
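To make the tunable parameters concrete, this is roughly the OpenAI-compatible request shape that settings such as `model`, `max_tokens`, and `temperature` translate into. It is a sketch using Laravel's HTTP client, not the plugin's internal implementation:

```php
use Illuminate\Support\Facades\Http;

// Illustrative only: how the configured parameters map onto an
// OpenAI-compatible /v1/chat/completions request.
$config   = config('opensource-chat');
$provider = $config['providers'][$config['default_provider']];

$response = Http::withToken($provider['api_key'])
    ->timeout((int) $config['timeout'])
    ->post($provider['base_url'].'/v1/chat/completions', [
        'model'       => $provider['model'],
        'messages'    => [['role' => 'user', 'content' => 'Hello!']],
        'max_tokens'  => (int) $config['max_tokens'],
        'temperature' => (float) $config['temperature'],
        'stream'      => false, // true enables token-by-token streaming
    ]);

$reply = $response->json('choices.0.message.content');
```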
## Troubleshooting

### Common Issues
- API key errors: Verify your API keys are correct and have sufficient credits
- Connection failures: Check if the AI service endpoints are accessible
- Model not available: Ensure the selected model is available in your plan
- Rate limiting: Check your API rate limits and usage
### Debug Steps

- Check the plugin configuration:

```bash
php artisan config:show opensource-chat
```

- Verify routes are registered:

```bash
php artisan route:list | grep opensource-chat
```

- Test API connectivity:

```bash
# Test local endpoint
curl http://localhost:8000/health

# Test external endpoints
curl -H "Authorization: Bearer YOUR_API_KEY" https://api.fireworks.ai/v1/models
```

- Check environment variables:

```bash
php artisan tinker
# then, at the tinker prompt:
>>> env('OS_CHAT_PROVIDER')
>>> env('OS_CHAT_LOCAL_URL')
```

- Clear caches:

```bash
php artisan optimize:clear
```

- Check logs for errors:

```bash
tail -f storage/logs/laravel.log
```
### Provider-Specific Issues

#### Local Models
- Service not running: Ensure your local AI service is started
- Port conflicts: Check if the required ports are available
- Model not loaded: Verify the model is properly loaded in your service
#### Fireworks AI
- Authentication errors: Check API key and permissions
- Model availability: Ensure the model is available in your plan
- Rate limits: Monitor your API usage and limits
#### Together AI
- API key issues: Verify your Together AI API key
- Model access: Check if you have access to the selected model
- Service status: Check Together AI service status
## Security Considerations

### Access Control
- Role-based permissions: Restrict access to authorized users only
- API key security: Never expose API keys in client-side code
- User isolation: Ensure users can only access their own conversations
- Audit logging: Track all chat activities and API usage
### Best Practices
- Use environment variables for API keys
- Implement proper user authentication
- Monitor API usage and costs
- Regularly rotate API keys
- Set appropriate rate limits
- Use HTTPS for external API calls
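For example, role-based access can be enforced with a standard Laravel gate. The `use-opensource-chat` ability name and the `is_admin` attribute below are illustrative, not part of the plugin:

```php
// app/Providers/AppServiceProvider.php
use Illuminate\Support\Facades\Gate;

public function boot(): void
{
    // Illustrative gate: restrict the chat feature to admin users.
    Gate::define('use-opensource-chat', fn ($user) => $user->is_admin);
}
```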
## Performance Optimization

### Local Deployment
- Resource allocation: Ensure sufficient RAM and CPU for models
- Model optimization: Use quantized models for better performance
- Caching: Implement response caching for common queries
- Load balancing: Use multiple model instances if needed
### External APIs
- Connection pooling: Reuse HTTP connections when possible
- Request batching: Batch multiple requests when feasible
- Response caching: Cache responses to reduce API calls
- Fallback strategies: Implement fallback to local models
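Response caching for repeated prompts can be sketched with Laravel's cache. The key scheme, one-hour TTL, and `$client->complete(...)` call are all illustrative assumptions, not plugin APIs:

```php
use Illuminate\Support\Facades\Cache;

// Cache deterministic responses (e.g. temperature = 0) to avoid repeat API calls.
$reply = Cache::remember(
    'os-chat:'.md5($provider.'|'.$model.'|'.$prompt),
    now()->addHour(),
    fn () => $client->complete($provider, $model, $prompt) // hypothetical client call
);
```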
## Uninstall

### 1. Remove Plugin Registration

Remove the plugin from your panel provider:

```php
// remove ->plugin(\Filaforge\OpensourceChat\OpensourceChatPlugin::make())
```

### 2. Roll Back Migrations (Optional)

```bash
php artisan migrate:rollback
# or roll back specific published files if needed
```

### 3. Remove Published Assets (Optional)

```bash
rm -f config/opensource-chat.php
rm -rf resources/views/vendor/opensource-chat
```

### 4. Remove Package and Clear Caches

```bash
composer remove filaforge/opensource-chat
php artisan optimize:clear
```
### 5. Clean Up Environment Variables

Remove these from your `.env` file:

```env
OS_CHAT_PROVIDER=local
OS_CHAT_LOCAL_URL=http://localhost:8000
OS_CHAT_LOCAL_KEY=your_local_api_key_here
OS_CHAT_LOCAL_MODEL=llama3
OS_CHAT_FIREWORKS_URL=https://api.fireworks.ai
OS_CHAT_FIREWORKS_KEY=your_fireworks_api_key_here
OS_CHAT_FIREWORKS_MODEL=llama-v2-7b-chat
OS_CHAT_TOGETHER_URL=https://api.together.xyz
OS_CHAT_TOGETHER_KEY=your_together_api_key_here
OS_CHAT_TOGETHER_MODEL=meta-llama/Llama-2-7b-chat-hf
OS_CHAT_MAX_TOKENS=4096
OS_CHAT_TEMPERATURE=0.7
OS_CHAT_STREAM=true
OS_CHAT_TIMEOUT=60
```
## Support

### Contributing

We welcome contributions! Please see our Contributing Guide for details.

### License

This plugin is open-sourced software licensed under the MIT license.

---

Made with ❤️ by the Filaforge Team