Ollama Integration#
The LIT Platform integrates with Ollama, a popular tool for running large language models locally. The integration lets you use Ollama's model management and serving while keeping all of the LIT Platform's features.
Benefits of Ollama Integration#
- Model Management: Easily download, update, and manage models
- Efficient Serving: Optimized inference performance
- Simplified Setup: One-click configuration
- Wide Model Support: Access to Ollama's growing library of models
Setup Instructions#
Prerequisites#
- Ollama installed on your system or a remote server
- Network connectivity between LIT Platform and Ollama
Configuration Steps#
1. Navigate to Settings > Integrations > Ollama
2. Enter the Ollama server URL (default: `http://localhost:11434`)
3. Click "Test Connection" to verify connectivity
4. Select "Enable Ollama Integration"
5. Save the settings
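You can also confirm connectivity from the command line before (or after) clicking "Test Connection"; this sketch assumes the default server URL from step 2:

```shell
# Query Ollama's version endpoint to confirm the server is reachable.
# Adjust the URL if your server runs elsewhere.
curl -s http://localhost:11434/api/version
```

A healthy server responds with a small JSON object containing its version; a connection error here means LIT will not be able to reach Ollama either.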
Using Ollama Models in LIT#
Once configured, Ollama-served models will appear alongside native LIT models in:
- Model selection dropdowns
- Workflow canvas components
- Chat interfaces
- Python scripts
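In a Python script, one way to reach an Ollama-served model is through Ollama's HTTP API directly (the LIT-specific Python bindings are not shown here; this is a minimal sketch assuming the default server URL, and `llama3` is an example model name):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default server URL from the setup steps


def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a one-shot prompt to an Ollama-served model and return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


# Example (requires a running Ollama server with the model pulled):
# print(generate("llama3", "Summarize what Ollama does in one sentence."))
```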
Model Management#
The LIT Platform provides an interface to manage your Ollama models:
Navigate to Models > Ollama Models, where you can:
- View available models
- Download new models
- Update existing models
- View model details and parameters
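The same operations are also available from the Ollama CLI on the server itself (the model name below is an example):

```shell
ollama list          # view the models already downloaded
ollama pull llama3   # download a new model; re-running updates an existing one
ollama show llama3   # view a model's details and parameters
```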
Advanced Configuration#
Custom Model Tags#
You can create custom model tags in Ollama and use them in LIT:
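For example, you might tag a customized variant of a base model using a Modelfile (the base model, tag name, and parameter values below are illustrative):

```shell
# Define a custom variant of a base model in a Modelfile
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER temperature 0.3
SYSTEM "You are a concise technical assistant."
EOF

# Build it under a custom tag, then confirm it is registered
ollama create my-assistant:v1 -f Modelfile
ollama list
```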
These tagged models will be available in the LIT Platform with their respective tags.
Server Configuration#
For production environments, consider:
- Configuring Ollama on a dedicated GPU server
- Setting up proper authentication and network security
- Adjusting Ollama's resource limits in the configuration file
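As a sketch, several of these settings can be supplied through Ollama's environment variables before starting the server (the values below are illustrative starting points, not recommendations):

```shell
export OLLAMA_HOST=0.0.0.0:11434   # listen beyond localhost so LIT can connect remotely
export OLLAMA_MAX_LOADED_MODELS=2  # cap how many models stay resident in memory
export OLLAMA_NUM_PARALLEL=4       # concurrent requests served per model
export OLLAMA_KEEP_ALIVE=10m       # how long an idle model stays loaded
ollama serve
```

On a server managed by systemd, the same variables would typically go in the Ollama service's unit file instead of a shell session.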
Troubleshooting#
If you encounter issues with the Ollama integration:
- Verify Ollama is running (`ollama ps`)
- Check network connectivity between LIT and Ollama
- Ensure models are properly downloaded in Ollama
- Check Ollama logs for specific errors
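The checks above can be run from a shell on the Ollama host; the last line assumes a Linux install managed by systemd:

```shell
ollama ps                                # is the server up, and which models are loaded?
curl -s http://localhost:11434/api/tags  # is the API reachable? lists the pulled models
journalctl -u ollama -e                  # recent Ollama service logs
```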
For detailed information on Ollama itself, refer to the official Ollama documentation.