The Agentic Local Stack: Mastering Ollama with the ADK-Rust Model

Build private, high-performance AI agents using the adk-model crate. Learn how to orchestrate local LLMs like Llama 3.3 and DeepSeek-R1 via Ollama and the ADK-Rust ecosystem.

Posted on: 2026-04-14 by AI Assistant


The dream of a fully private, high-performance AI agent running entirely on your own hardware is now a reality. With the ADK-Rust ecosystem and the adk-model crate, developers can seamlessly switch between cloud giants like Gemini and local powerhouses like Ollama.

In this guide, we’ll explore how to set up a local agentic stack using Ollama as the inference engine for your Rust-based agents.

Why Use ADK-Rust with Ollama?

The adk-model crate provides a unified Llm trait that abstracts away the complexities of different providers. Whether you’re using gemini-3.1-pro in production or llama3.3 locally for development, your agent logic remains identical.
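To make the abstraction concrete, here is a minimal standalone sketch of the trait-object pattern this design relies on. Note that the trait, method names, and structs below are hypothetical stand-ins for illustration only, not the actual adk-model API:

```rust
use std::sync::Arc;

// Hypothetical stand-in for the provider-agnostic trait.
trait Llm {
    fn name(&self) -> String;
    fn complete(&self, prompt: &str) -> String;
}

struct LocalModel { tag: String }
struct CloudModel { model_id: String }

impl Llm for LocalModel {
    fn name(&self) -> String { format!("ollama/{}", self.tag) }
    fn complete(&self, prompt: &str) -> String {
        format!("[{} would answer: {}]", self.name(), prompt)
    }
}

impl Llm for CloudModel {
    fn name(&self) -> String { format!("gemini/{}", self.model_id) }
    fn complete(&self, prompt: &str) -> String {
        format!("[{} would answer: {}]", self.name(), prompt)
    }
}

// Agent logic depends only on the trait, so swapping providers
// never touches this function.
fn run_agent(model: Arc<dyn Llm>, prompt: &str) -> String {
    model.complete(prompt)
}

fn main() {
    let local: Arc<dyn Llm> = Arc::new(LocalModel { tag: "llama3.2".into() });
    let cloud: Arc<dyn Llm> = Arc::new(CloudModel { model_id: "gemini-3.1-pro".into() });
    println!("{}", run_agent(local, "hello"));
    println!("{}", run_agent(cloud, "hello"));
}
```

The key design point is that run_agent accepts Arc<dyn Llm>, so the concrete provider is chosen once at construction time and the rest of the agent never sees it.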

Key Benefits:

- Privacy: with Ollama, prompts and data never leave your machine.
- Portability: the same agent code runs against gemini-3.1-pro in production and llama3.3 in local development.
- Flexibility: providers are opt-in Cargo features, so you compile only the backends you actually use.

Architecture: The adk-model Pattern

The adk-model crate is designed for modularity. You enable the providers you need via Cargo features and use the standard initialization patterns.

1. Cargo Configuration

Add adk-model to your Cargo.toml and enable the ollama feature:

[dependencies]
adk-model = { version = "0.1", features = ["ollama"] }
adk-core = "0.1"
tokio = { version = "1", features = ["full"] }

2. Initializing the Ollama Model

Using Ollama with ADK-Rust is straightforward. First, ensure ollama serve is running locally with your desired model pulled (e.g., llama3.2).
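Before wiring up the Rust side, pull the model and verify the server is reachable. The commands below use the standard Ollama CLI and its default port (11434):

```shell
# Download the model weights locally (one-time)
ollama pull llama3.2

# Start the inference server if it isn't already running
ollama serve &

# Verify the server is up and the model is listed
curl http://localhost:11434/api/tags
```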

use adk_model::ollama::{OllamaModel, OllamaConfig};
use adk_model::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // 1. Initialize the Ollama model with a specific local tag
    let config = OllamaConfig::new("llama3.2");
    let model = OllamaModel::new(config)?;

    // 2. Wrap in Arc to share with the agent
    let model_ref = Arc::new(model);

    // 3. Build the agent with the local model
    let agent = LlmAgentBuilder::new("system-assistant")
        .model(model_ref)
        .build()?;

    println!("Local Agent initialized with Ollama (llama3.2)");
    
    // Your agentic logic here...
    
    Ok(())
}

Advanced Configuration: Switching to Gemini

One of the strongest features of the ADK-Rust model layer is how easy it is to switch back to cloud models like Gemini 3.1 for more complex reasoning tasks.

use adk_model::gemini::GeminiModel;

// Simply swap the model implementation
let api_key = std::env::var("GOOGLE_API_KEY")?;
let model = GeminiModel::new(&api_key, "gemini-3.1-pro-preview")?;

Conclusion

The combination of Ollama and ADK-Rust (adk-model) provides an enterprise-grade foundation for private AI. By leveraging the Llm trait, you build agents that are portable, secure, and future-proof.

The local agentic revolution is here—powered by Rust and open weights.