Building a ThaiLLM Agent in Rust: A Step-by-Step Guide

Learn how to create a simple, high-performance AI agent using the ThaiLLM model with Rust, Tokio, and Reqwest for optimized Thai language processing.

Posted on: 2026-04-22 by AI Assistant


This tutorial will guide you through creating a powerful, tool-enabled AI agent using Rust, the adk-rust SDK, and the ThaiLLM API. Our agent will be able to check the weather and manage files within a safe workspace.

Prerequisites

Before you start, make sure you have:

  1. Rust 1.85 or newer (the project uses the 2024 edition).
  2. A ThaiLLM API key.
  3. An internet connection (the weather tool calls wttr.in).

Project Structure

rust_openai/
├── Cargo.toml             # Project configuration
├── .env                   # API keys (DO NOT COMMIT)
├── src/
│   ├── main.rs            # Agent initialization
│   ├── weather_tool.rs    # Weather lookup logic
│   └── filesystem_tool.rs # File management logic
└── workspace/             # Sandboxed area for the agent
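One way to scaffold this layout by hand (the directory and file names match the tree above; you can equally start from `cargo new rust_openai`):

```shell
# Create the project skeleton shown above.
mkdir -p rust_openai/src rust_openai/workspace
touch rust_openai/Cargo.toml rust_openai/.env
touch rust_openai/src/main.rs rust_openai/src/weather_tool.rs rust_openai/src/filesystem_tool.rs
```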

1. Project Setup

Add the following dependencies to your Cargo.toml:

[package]
name = "rust_openai"
version = "0.1.0"
edition = "2024"

[dependencies]
adk-rust = "0.6.0"
adk-tool = "0.6.0"
tokio = { version = "1", features = ["full"] }
dotenvy = "0.15"
anyhow = "1"
reqwest = { version = "0.12", features = ["json"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
schemars = "1.2.1"
async-trait = "0.1"
urlencoding = "2"

Create a .env file with your ThaiLLM API key:

THAILLM_API_KEY=your_key_here

2. Creating Custom Tools

Tools allow the AI to interact with the real world. We use the #[tool] attribute from adk-tool.

Weather Tool (src/weather_tool.rs)

This tool fetches live data from wttr.in.

use std::sync::Arc;
use adk_rust::serde::Deserialize;
use adk_tool::{AdkError, Tool, tool};
use schemars::JsonSchema;
use serde_json::{json, Value};

#[derive(Deserialize, JsonSchema)]
struct WeatherArgs {
    /// The city to look up
    city: String,
}

#[tool]
async fn get_weather(args: WeatherArgs) -> Result<Value, AdkError> {
    let url = format!("https://wttr.in/{}?format=j1", urlencoding::encode(&args.city));
    let response = reqwest::get(&url).await.map_err(|e| AdkError::tool(e.to_string()))?;
    let body: Value = response.json().await.map_err(|e| AdkError::tool(e.to_string()))?;

    // Extract relevant data...
    let current = &body["current_condition"][0];
    Ok(json!({
        "city": args.city,
        "temp_c": current["temp_C"],
        "description": current["weatherDesc"][0]["value"]
    }))
}

pub fn weather_tools() -> Vec<Arc<dyn Tool>> {
    vec![Arc::new(GetWeather)]
}

Filesystem Tool (src/filesystem_tool.rs)

To keep things safe, we restrict the agent to a workspace/ directory.

// ... imports and sandbox logic ...

#[tool]
async fn read_file(args: PathArgs) -> Result<Value, AdkError> {
    let path = sandbox(&args.path).await?;
    let content = tokio::fs::read_to_string(&path).await.map_err(|e| AdkError::tool(e.to_string()))?;
    Ok(json!({ "content": content }))
}
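The `sandbox` helper is elided above. As a minimal synchronous sketch of the idea (the tutorial's real helper is async and returns `AdkError`), it rejects absolute paths and `..` components before joining the request onto `workspace/`:

```rust
use std::path::{Component, Path, PathBuf};

/// Resolve a user-supplied path inside the workspace directory.
/// Illustrative only: rejects absolute paths and any `..` component
/// so the agent cannot read or write outside the sandbox.
fn sandbox(workspace: &Path, requested: &str) -> Result<PathBuf, String> {
    let rel = Path::new(requested);
    if rel.is_absolute() {
        return Err(format!("absolute paths are not allowed: {requested}"));
    }
    if rel.components().any(|c| matches!(c, Component::ParentDir)) {
        return Err(format!("path escapes the workspace: {requested}"));
    }
    Ok(workspace.join(rel))
}
```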

3. The Main Agent Loop (src/main.rs)

The main.rs file ties everything together:

  1. Loads environment variables.
  2. Configures the OpenAIClient for ThaiLLM.
  3. Builds the agent with its description, instructions, and tools.
  4. Launches the interactive session.
mod filesystem_tool;
mod weather_tool;

use std::sync::Arc;
// Also bring OpenAIClient, OpenAIConfig, LlmAgentBuilder, and Launcher into
// scope here; the exact `use` paths depend on your adk-rust version.

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    dotenvy::dotenv().ok();
    let api_key = std::env::var("THAILLM_API_KEY")?;
    
    // 1. Configure for ThaiLLM
    let config = OpenAIConfig::compatible(
        &api_key,
        "https://thaillm.or.th/api/v1",
        "typhoon-s-thaillm-8b-instruct"
    );
    let model = OpenAIClient::new(config)?;

    // 2. Build the Agent
    let mut builder = LlmAgentBuilder::new("rust_openai")
        .description("A helpful AI assistant")
        .instruction("You are a friendly assistant. Be concise and helpful.")
        .model(Arc::new(model));

    // 3. Register Tools
    for t in filesystem_tool::filesystem_tools() { builder = builder.tool(t); }
    for t in weather_tool::weather_tools() { builder = builder.tool(t); }

    let agent = builder.build()?;

    // 4. Run the interactive Launcher
    Launcher::new(Arc::new(agent)).run().await?;
    Ok(())
}

4. Running the Agent

Simply run:

cargo run

You can now ask the agent things like:

  1. "What's the weather in Bangkok right now?"
  2. "Create a file called notes.txt in the workspace."
  3. "Read notes.txt back to me."

Summary

You’ve built a Rust-based AI agent that:

  1. Talks to the ThaiLLM model through an OpenAI-compatible API.
  2. Fetches live weather data from wttr.in.
  3. Manages files inside a sandboxed workspace/ directory.

GitHub : https://github.com/anoochit/thaillm-example/tree/master/rust_openai