# Ollama Plugin

This agent interacts with a language model running locally or remotely through the Ollama API. Before using this agent locally, you need to have Ollama installed and running.

## Prerequisites for using the agent locally

- Install [Ollama](https://github.com/ollama/ollama) (on Windows, for example: `winget install Ollama.Ollama`)
- Start the Ollama API server: `ollama serve`
- Pull the model you want to use, for example: `ollama pull phi3`
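If you want to confirm the server is up before configuring the agent, a quick check against the endpoint works. The snippet below is a minimal sketch in Python, and it assumes the default local endpoint shown in the configuration example further down:

```python
# Minimal sketch: verify that a local Ollama server is reachable before
# pointing the agent at it. Assumes the default endpoint from the
# configuration example below (http://localhost:11434).
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434", timeout=5) as resp:
        # Ollama typically answers its root URL with a short status message.
        print(resp.status, resp.read().decode())
except OSError as err:
    print(f"Ollama does not appear to be running: {err}")
```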

## Configuration

To configure the agent, run `/agent config ollama` to open the settings file in your default editor, and then update the file based on the following example.

```jsonc
{
    // To use the Ollama API service:
    // 1. Install Ollama: `winget install Ollama.Ollama`
    // 2. Start the Ollama API server: `ollama serve`
    // 3. Install an Ollama model: `ollama pull phi3`

    // Declare predefined model configurations
    "Presets": [
        {
            "Name": "PowerShell Expert",
            "Description": "An Ollama agent with expertise in PowerShell scripting and command-line utilities.",
            "ModelName": "phi3",
            "SystemPrompt": "You are a helpful and friendly assistant with expertise in PowerShell scripting and command line."
        }
    ],

    // Declare the Ollama endpoint
    "Endpoint": "http://localhost:11434",
    // Enable or disable response streaming
    "Stream": false,
    // Specify the default preset to use
    "DefaultPreset": "PowerShell Expert"
}
```
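For reference, the preset fields map roughly onto a direct call to Ollama's REST API. The sketch below is for illustration only, not the agent's internal code; it uses Ollama's public `/api/generate` route with the endpoint, model name, and system prompt from the example above, and a hypothetical prompt:

```python
# Sketch only: shows how the preset fields relate to a direct request
# against the Ollama REST API. The agent handles this for you.
import json
import urllib.request

payload = {
    "model": "phi3",                  # "ModelName" from the preset
    "system": (
        "You are a helpful and friendly assistant with expertise in "
        "PowerShell scripting and command line."
    ),                                # "SystemPrompt" from the preset
    "prompt": "How do I list the five largest files in the current directory?",
    "stream": False,                  # mirrors the "Stream" setting
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # "Endpoint" plus Ollama's generate route
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req, timeout=120) as resp:
    # With "stream": false, Ollama returns a single JSON object whose
    # "response" field contains the full completion.
    print(json.loads(resp.read())["response"])
```

When streaming is enabled, the server instead returns a sequence of JSON objects with partial responses, which is the behavior the `Stream` setting controls.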