Cogito is a powerful Go library for building intelligent, cooperative agentic software and LLM-powered workflows. It focuses on improving results with small, open-source language models while scaling to any LLM.
🧪 Tested on Small Models! Our test suite runs against a 0.6B Qwen model (not fine-tuned), demonstrating effectiveness even with minimal resources.
Working on Official Paper
I am currently working on the official academic/white paper for Cogito. The paper will provide detailed theoretical foundations, experimental results, and comprehensive analysis of the framework's capabilities.
Cogito is the result of the experience gained building LocalAI, LocalAGI, and LocalOperator (yet to be released).
Cogito uses an internal pipeline that first forces the LLM to reason about the task at hand and then extracts exact data structures from the model's output using BNF grammars. This pipeline is applied to every primitive exposed by the framework.
It provides a comprehensive framework for creating conversational AI systems with advanced reasoning, tool execution, goal-oriented planning, and iterative content refinement, plus seamless integration with external tools, including via the Model Context Protocol (MCP).
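To make the reason-then-extract pipeline concrete, here is a minimal, purely illustrative sketch of the pattern. The `LLM` interface and its `Complete`/`CompleteWithGrammar` methods are hypothetical placeholders invented for this sketch; they are not part of the cogito API, which is shown in the examples below.

package pipeline

import "context"

// LLM is a hypothetical abstraction used only in this sketch; cogito's
// actual LLM type is created with cogito.NewOpenAILLM (see the quickstart).
type LLM interface {
	// Complete returns free-form text for a prompt.
	Complete(ctx context.Context, prompt string) (string, error)
	// CompleteWithGrammar constrains decoding with a BNF grammar so the
	// reply is guaranteed to match a machine-readable structure.
	CompleteWithGrammar(ctx context.Context, prompt, grammar string) (string, error)
}

// reasonThenExtract first asks the model to reason about the task in free
// text, then asks it to emit only the structured result, constrained by a
// BNF grammar and grounded in its own reasoning.
func reasonThenExtract(ctx context.Context, llm LLM, task, grammar string) (string, error) {
	reasoning, err := llm.Complete(ctx, "Reason step by step about the following task:\n"+task)
	if err != nil {
		return "", err
	}
	return llm.CompleteWithGrammar(ctx,
		"Task:\n"+task+"\n\nReasoning:\n"+reasoning+"\n\nReturn only the structured result.",
		grammar)
}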
🧠 Composable Primitives
Cogito primitives can be combined to form more complex pipelines, enabling sophisticated AI workflows.
go get github.com/mudler/cogito
package main

import (
	"context"
	"fmt"

	"github.com/mudler/cogito"
)

func main() {
	// Create an LLM client
	llm := cogito.NewOpenAILLM("your-model", "api-key", "https://api.openai.com/")

	// Create a conversation fragment
	fragment := cogito.NewEmptyFragment().
		AddMessage("user", "Tell me about artificial intelligence")

	// Get a response
	newFragment, err := llm.Ask(context.Background(), fragment)
	if err != nil {
		panic(err)
	}

	fmt.Println(newFragment.LastMessage().Content)
}
// Create a fragment with user input
fragment := cogito.NewFragment(openai.ChatCompletionMessage{
	Role:    "user",
	Content: "What's the weather in San Francisco?",
})

// Execute with tools
result, err := cogito.ExecuteTools(llm, fragment,
	cogito.WithTools(&WeatherTool{}))
if err != nil {
	panic(err)
}

// result.Status.ToolsCalled will contain all the tools that were called
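For reference, `WeatherTool` above is a user-provided tool. The snippet below is only a rough, self-contained sketch of what such a tool could look like; the `Tool` interface and method names here are hypothetical and do not reflect cogito's actual tool contract (see the search example under examples/internal/search for a real implementation).

package tools

import "context"

// Tool is a hypothetical interface defined only for this sketch; the real
// contract lives in the cogito package.
type Tool interface {
	Name() string
	Description() string
	Run(ctx context.Context, args map[string]any) (string, error)
}

// WeatherTool is a purely illustrative implementation.
type WeatherTool struct{}

func (w *WeatherTool) Name() string        { return "weather" }
func (w *WeatherTool) Description() string { return "Returns the current weather for a city" }

// Run would call a real weather API; here it just returns a canned answer.
func (w *WeatherTool) Run(ctx context.Context, args map[string]any) (string, error) {
	city, _ := args["city"].(string)
	return "Sunny and 21°C in " + city, nil
}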
Guidelines provide a powerful way to define conditional rules for tool usage. The LLM intelligently selects which guidelines are relevant based on the conversation context, enabling dynamic and context-aware tool selection.
// Define guidelines with conditions and associated tools
guidelines := cogito.Guidelines{
	cogito.Guideline{
		Condition: "User asks about information or facts",
		Action:    "Use the search tool to find information",
		Tools: cogito.Tools{
			&SearchTool{},
		},
	},
	cogito.Guideline{
		Condition: "User asks for the weather in a city",
		Action:    "Use the weather tool to find the weather",
		Tools: cogito.Tools{
			&WeatherTool{},
		},
	},
}
// Get relevant guidelines for the current conversation
fragment := cogito.NewEmptyFragment().
	AddMessage("user", "When was Isaac Asimov born?")

// Execute tools with guidelines
result, err := cogito.ExecuteTools(llm, fragment,
	cogito.WithGuidelines(guidelines),
	cogito.EnableStrictGuidelines) // Only use tools from relevant guidelines
if err != nil {
	panic(err)
}
Goals can be extracted from a conversation, turned into a plan, and executed with the available tools:

// Extract a goal from the conversation
goal, err := cogito.ExtractGoal(llm, fragment)
if err != nil {
	panic(err)
}

// Create a plan to achieve the goal
plan, err := cogito.ExtractPlan(llm, fragment, goal,
	cogito.WithTools(&SearchTool{}))
if err != nil {
	panic(err)
}

// Execute the plan
result, err := cogito.ExecutePlan(llm, fragment, plan, goal,
	cogito.WithTools(&SearchTool{}))
if err != nil {
	panic(err)
}
// Refine content through iterative improvement
refined, err := cogito.ContentReview(llm, fragment,
	cogito.WithIterations(3),
	cogito.WithTools(&SearchTool{}))
if err != nil {
	panic(err)
}
An example of how to iteratively improve content using two separate models:
llm := cogito.NewOpenAILLM("your-model", "api-key", "https://api.openai.com/")
reviewerLLM := cogito.NewOpenAILLM("your-reviewer-model", "api-key", "https://api.openai.com/")

ctx := context.Background()

// Create content to review
initial := cogito.NewEmptyFragment().
	AddMessage("user", "Write about climate change")

response, _ := llm.Ask(ctx, initial)

// Iteratively improve with tool support
improvedResponse, _ := cogito.ContentReview(reviewerLLM, response,
	cogito.WithIterations(3),
	cogito.WithTools(&SearchTool{}, &FactCheckTool{}),
	cogito.EnableToolReasoner)
Cogito supports the Model Context Protocol (MCP) for seamless integration with external tools and services. MCP allows you to connect to remote tool providers and use their capabilities directly within your Cogito workflows.
import (
	"context"
	"os/exec"

	"github.com/modelcontextprotocol/go-sdk/mcp"
)

// Create an MCP client session backed by a containerized tool server
command := exec.Command("docker", "run", "-i", "--rm", "ghcr.io/mudler/mcps/weather:master")
transport := &mcp.CommandTransport{Command: command}

client := mcp.NewClient(&mcp.Implementation{Name: "test", Version: "v1.0.0"}, nil)
mcpSession, _ := client.Connect(context.Background(), transport, nil)

// Use MCP tools in your workflows
result, _ := cogito.ExecuteTools(llm, fragment,
	cogito.WithMCPs(mcpSession))
// Define guidelines that include MCP tools
guidelines := cogito.Guidelines{
	cogito.Guideline{
		Condition: "User asks about information or facts",
		Action:    "Use the MCP search tool to find information",
	},
}

// Execute with MCP tools and guidelines
result, err := cogito.ExecuteTools(llm, fragment,
	cogito.WithMCPs(searchSession),
	cogito.WithGuidelines(guidelines),
	cogito.EnableStrictGuidelines)
The prompts used internally by the primitives can be customized; for example, the prompt used for tool selection:

customPrompt := cogito.NewPrompt(`Your custom prompt template with {{.Context}}`)

result, err := cogito.ExecuteTools(llm, fragment,
	cogito.WithPrompt(cogito.ToolSelectorType, customPrompt))
# Run the example chat application
make example-chat
This starts an interactive chat session with tool support, including web search. See examples/internal/search/search.go for a complete example of implementing a DuckDuckGo search tool.
The library includes comprehensive test coverage using Ginkgo and Gomega. Tests use containerized LocalAI for integration testing.
# Run all tests
make test
# Run with specific log level
LOG_LEVEL=debug make test
# Run with custom arguments
GINKGO_ARGS="--focus=Fragment" make test
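As a rough illustration of how such a spec might look (the project's actual test layout may differ), here is a minimal Ginkgo/Gomega test that exercises only Fragment calls already shown in the examples above:

package cogito_test

import (
	"testing"

	"github.com/mudler/cogito"
	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

func TestCogito(t *testing.T) {
	RegisterFailHandler(Fail)
	RunSpecs(t, "Cogito Suite")
}

var _ = Describe("Fragment", func() {
	It("keeps track of the last message added", func() {
		fragment := cogito.NewEmptyFragment().
			AddMessage("user", "hello")
		Expect(fragment.LastMessage().Content).To(Equal("hello"))
	})
})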
Ettore Di Giacinto 2025-now. Cogito is released under the Apache 2.0 License.
If you use Cogito in your research or academic work, please cite our paper:
@article{cogito2025,
  title={Cogito: A Framework for Building Intelligent Agentic Software with LLM-Powered Workflows},
  author={Ettore Di Giacinto <mudler@localai.io>},
  journal={https://github.com/mudler/cogito},
  year={2025},
  note={}
}