
Building a Streaming Chat App with Flutter and OpenClaw

Learn how to build a modern, high-performance chat application with Flutter and OpenClaw, featuring real-time response streaming (SSE) and beautiful Markdown rendering.

Posted on: 2026-03-09 by AI Assistant


In this tutorial, we will build a modern chat application called OpenClaw Connect. The app streams responses in real time using Server-Sent Events (SSE) and renders them with rich Markdown formatting.

System Architecture

The Mermaid diagram below shows how a chat request flows from the Flutter app through the OpenClaw Gateway to the model and back:

sequenceDiagram
    participant App as Flutter App
    participant GW as OpenClaw Gateway (Port 18789)
    participant Agent as OpenClaw Agent Runner
    participant LLM as LLM Provider (OpenAI/Local)

    Note over App, GW: HTTP POST /v1/chat/completions
    App->>GW: Request (Prompt + Auth Token)
    
    GW->>GW: Validate Gateway Token
    GW->>Agent: Route to Agent (e.g., "openclaw:main")
    
    Note right of Agent: Agent loads Skills & Memory
    Agent->>LLM: Chat Completion Request
    LLM-->>Agent: Model Response (Text/Tool Call)
    
    alt Tool Call Detected
        Agent->>Agent: Execute Local Skill (e.g., File Write)
        Agent->>LLM: Send Tool Output
        LLM-->>Agent: Final Refined Response
    end

    Agent-->>GW: Response tokens
    GW-->>App: OpenAI-compatible SSE stream

Prerequisites

Before you start, make sure you have:

  1. The Flutter SDK installed (flutter doctor reports no issues).
  2. A running OpenClaw gateway (this tutorial assumes the default port 18789) and its auth token.
  3. Basic familiarity with Dart and Flutter widgets.

Step 1: Project Setup

Create a new Flutter project and add the necessary dependencies.

flutter create openclaw_connect
cd openclaw_connect
flutter pub add http markdown_widget

Your pubspec.yaml should include:

dependencies:
  flutter:
    sdk: flutter
  http: ^1.6.0
  markdown_widget: ^2.3.2+8

Step 2: Configuration

Create a file named lib/config.dart to store your API settings. A central config file makes it easy to switch environments. Hard-coding the token is fine for local development, but don't ship it in a production client.

// lib/config.dart
const openClawEndpoint = 'http://10.0.2.2:18789/v1/chat/completions'; // Android emulator localhost
const openClawAuthToken = 'YOUR-OPENCLAW-TOKEN';
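
The constant above only works on the Android emulator, where 10.0.2.2 is an alias for the host machine's localhost. If you also target the iOS simulator or desktop, where localhost works directly, a platform-aware variant is one option (a sketch; the host selection logic is my assumption, not part of OpenClaw):

// lib/config.dart — platform-aware variant (a sketch)
import 'package:flutter/foundation.dart'
    show kIsWeb, defaultTargetPlatform, TargetPlatform;

String get openClawEndpoint {
  // 10.0.2.2 maps to the host machine's localhost inside the Android emulator.
  final host = (!kIsWeb && defaultTargetPlatform == TargetPlatform.android)
      ? '10.0.2.2'
      : 'localhost';
  return 'http://$host:18789/v1/chat/completions';
}

const openClawAuthToken = 'YOUR-OPENCLAW-TOKEN';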

Step 3: Defining Data Models

When streaming is enabled, the OpenClaw API sends the response as a series of SSE events, each carrying a JSON payload. We need a model to parse these payloads. Create lib/models/chat_model.dart.

import 'dart:convert';

ChatResponse chatResponseFromJson(String str) =>
    ChatResponse.fromJson(json.decode(str));

class ChatResponse {
  final List<Choice> choices;

  ChatResponse({required this.choices});

  factory ChatResponse.fromJson(Map<String, dynamic> json) => ChatResponse(
    choices: List<Choice>.from(json["choices"].map((x) => Choice.fromJson(x))),
  );
}

class Choice {
  final Delta delta;

  Choice({required this.delta});

  factory Choice.fromJson(Map<String, dynamic> json) => Choice(
    delta: Delta.fromJson(json["delta"]),
  );
}

class Delta {
  final String? content;

  Delta({this.content});

  factory Delta.fromJson(Map<String, dynamic> json) => Delta(
    content: json["content"],
  );
}
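
As a quick sanity check, here is how the models above parse a typical streaming delta payload (assuming the openclaw_connect package name from Step 1):

import 'package:openclaw_connect/models/chat_model.dart';

void main() {
  // A typical streaming delta payload from an OpenAI-compatible API.
  const chunk = '{"choices":[{"delta":{"content":"Hello"}}]}';
  final parsed = chatResponseFromJson(chunk);
  print(parsed.choices.first.delta.content); // Hello
}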

Step 4: Building the API Service

We use a Dart Stream to handle the incoming SSE data from the server, which lets the UI update as soon as each new token is received. Create lib/services/openclaw_service.dart.

import 'package:http/http.dart' as http;
import 'dart:convert';
import 'package:openclaw_connect/config.dart';
import 'package:openclaw_connect/models/chat_model.dart';

class OpenclawService {
  Stream<String> getResponseStream(List<Map<String, String>> messages) async* {
    final client = http.Client();
    try {
      final request = http.Request('POST', Uri.parse(openClawEndpoint));
      request.headers.addAll({
        'Authorization': 'Bearer $openClawAuthToken',
        'Content-Type': 'application/json',
      });
      request.body = jsonEncode({
        "model": "openclaw",
        "stream": true,
        "messages": messages,
      });

      final response = await client.send(request);
      if (response.statusCode != 200) {
        throw Exception('OpenClaw request failed: ${response.statusCode}');
      }

      await for (var line in response.stream
          .transform(utf8.decoder)
          .transform(const LineSplitter())) {
        if (line.startsWith('data: ')) {
          final data = line.substring(6).trim();
          if (data == '[DONE]') break;
          try {
            final json = chatResponseFromJson(data);
            if (json.choices.isNotEmpty) {
              final content = json.choices[0].delta.content;
              if (content != null) yield content;
            }
          } catch (e) { /* Skip malformed chunks */ }
        }
      }
    } finally {
      client.close();
    }
  }
}
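
Before wiring up the UI, you can smoke-test the service from a plain Dart entry point (this assumes the gateway from Step 2 is running and reachable):

import 'dart:io';
import 'package:openclaw_connect/services/openclaw_service.dart';

Future<void> main() async {
  final service = OpenclawService();
  // Print each chunk as it arrives, without waiting for the full response.
  await for (final chunk in service.getResponseStream([
    {"role": "user", "content": "Say hello in one sentence."},
  ])) {
    stdout.write(chunk);
  }
}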

Step 5: Creating the Chat UI

The chat screen keeps the message history and renders the chat bubbles. We use MarkdownWidget for rich text and a ScrollController to keep the view scrolled to the newest message.
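
One way to render an assistant bubble with the markdown_widget package looks like this (a sketch; the widget and its data/shrinkWrap parameters come from the package, the layout choices are assumptions):

import 'package:flutter/material.dart';
import 'package:markdown_widget/markdown_widget.dart';

// A minimal assistant bubble that renders Markdown content (a sketch).
class AssistantBubble extends StatelessWidget {
  final String content;
  const AssistantBubble({super.key, required this.content});

  @override
  Widget build(BuildContext context) {
    return Container(
      margin: const EdgeInsets.all(8),
      padding: const EdgeInsets.all(12),
      decoration: BoxDecoration(
        color: Theme.of(context).colorScheme.surfaceContainerHighest,
        borderRadius: BorderRadius.circular(12),
      ),
      // shrinkWrap sizes the widget to its content inside a scrollable list.
      child: MarkdownWidget(data: content, shrinkWrap: true),
    );
  }
}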

Create lib/screens/chat.dart. (Key concepts: ListView.builder, TextEditingController, and Stream handling).
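
The message logic below assumes a simple message model and screen state along these lines (a sketch; the field names are assumptions chosen to match the snippet that follows):

// A mutable message model so streamed chunks can be appended in place.
class ChatMessage {
  String content;
  final bool isUser;
  ChatMessage({required this.content, required this.isUser});
}

// State fields on the ChatScreen's State class:
final List<ChatMessage> _messages = [];
final TextEditingController _controller = TextEditingController();
final ScrollController _scrollController = ScrollController();
final OpenclawService _openclawService = OpenclawService();
bool _isLoading = false;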

The Message Logic

Future<void> _sendMessage() async {
  final text = _controller.text.trim();
  if (text.isEmpty) return;

  setState(() {
    _messages.add(ChatMessage(content: text, isUser: true));
    _isLoading = true;
    _controller.clear();
  });

  // Build the history to send from all user and assistant turns so far.
  final history = _messages
      .map((m) => {
            "role": m.isUser ? "user" : "assistant",
            "content": m.content,
          })
      .toList();

  // Add an empty assistant message that the stream will fill in.
  final assistantMessage = ChatMessage(content: "", isUser: false);
  setState(() => _messages.add(assistantMessage));

  final stream = _openclawService.getResponseStream(history);
  await for (final chunk in stream) {
    setState(() => assistantMessage.content += chunk);
    _scrollToBottom();
  }

  setState(() => _isLoading = false);
}
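
The _scrollToBottom helper is not shown above; one possible implementation, assuming the _scrollController on the state class, waits for the current frame to render before animating:

void _scrollToBottom() {
  // Defer until after the frame so maxScrollExtent reflects the new content.
  WidgetsBinding.instance.addPostFrameCallback((_) {
    if (_scrollController.hasClients) {
      _scrollController.animateTo(
        _scrollController.position.maxScrollExtent,
        duration: const Duration(milliseconds: 200),
        curve: Curves.easeOut,
      );
    }
  });
}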

Step 6: Main Entry Point

Finally, set up your lib/main.dart with a Material 3 theme.

import 'package:flutter/material.dart';
import 'package:openclaw_connect/screens/chat.dart';

void main() => runApp(const MyApp());

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      theme: ThemeData(
        colorScheme: ColorScheme.fromSeed(seedColor: Colors.deepPurple),
        useMaterial3: true,
      ),
      home: const ChatScreen(),
    );
  }
}

Summary

You’ve built a functional AI chat client that:

  1. Streams responses word-by-word for a smooth experience.
  2. Maintains context by sending message history.
  3. Renders Markdown for code blocks, bold text, and lists.

Happy coding!

See the full source code on GitHub: anoochit/openclaw_connect