BAEM1N.DEV — AI, RAG, LLMOps Development Blog

DeepCoWork #2: Tauri 2 + Python Sidecar -- The Skeleton of a Desktop AI App

TL;DR: Tauri 2 runs a PyInstaller-bundled Python server as a sidecar, so users never need Python installed.


Why Tauri + Python

An AI agent app needs two runtimes:

| Role | Best Language | Reason |
| --- | --- | --- |
| Desktop UI | Rust (Tauri) / JS | Native window, small binary |
| Agent Logic | Python | LangChain, DeepAgents SDK, AI ecosystem |

Electron bundles Chromium at 150MB+, while Tauri reuses the OS webview at around 10MB. Rust managing the Python process is stable and memory-efficient. The Tauri sidecar documentation covers how to embed external binaries in detail.

Process Architecture

[User launches app]
     |
     v
  Tauri (Rust)
     |
     +-- 1. Find available port (8008-8107)
     +-- 2. Spawn Python server (pass port)
     +-- 3. Poll /health (up to 30 seconds)
     +-- 4. Emit server_ready -> frontend
     +-- 5. Periodic health check (every 10s)
            |
            v (on crash)
       Emit agent_crashed -> UI error display

Core Code: spawn_python

Development and production use different Python launch strategies:

use tauri_plugin_shell::{process::CommandChild, ShellExt};

fn spawn_python(app: &tauri::AppHandle, port: u16) -> Result<CommandChild, String> {
    // Dev: uv run python main.py
    #[cfg(debug_assertions)]
    {
        // Resolve the agent/ directory relative to the working directory
        // (adjust to your repository layout)
        let agent_dir = std::env::current_dir()
            .map_err(|e| e.to_string())?
            .join("../agent");
        let (_, child) = app.shell()
            .command("uv")
            .args(["run", "python", "main.py"])
            .env("PORT", port.to_string())
            .current_dir(agent_dir)
            .spawn()
            .map_err(|e| format!("uv run failed: {e}"))?;
        Ok(child)
    }

    // Production: run PyInstaller binary as sidecar
    #[cfg(not(debug_assertions))]
    {
        let (_, child) = app.shell()
            .sidecar("agent-server")  // binaries/agent-server-{target}
            .map_err(|e| format!("sidecar not found: {e}"))?
            .args(["--port", &port.to_string()])
            .spawn()
            .map_err(|e| format!("sidecar launch failed: {e}"))?;
        Ok(child)
    }
}

The #[cfg(debug_assertions)] attribute makes this a compile-time branch: debug builds shell out to uv run, while release builds launch the bundled sidecar binary. Only one branch exists in any given build.

Port Discovery

If port 8008 is taken, try the next one:

fn find_available_port(preferred: u16) -> u16 {
    for port in preferred..preferred + 100 {
        if std::net::TcpListener::bind(format!("127.0.0.1:{}", port)).is_ok() {
            return port;
        }
    }
    preferred
}

Up to 100 ports are scanned. The chosen port is passed to Python via the PORT env var (dev) or the --port argument (production sidecar).
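The Python side has to honor both delivery mechanisms. A minimal sketch of how main.py might resolve the port (hypothetical helper; the post does not show this file):

```python
# Hypothetical sketch of the receiving side in agent/main.py:
# --port wins over the PORT env var, which wins over the default 8008.
import argparse
import os

DEFAULT_PORT = 8008

def resolve_port(argv=None, environ=None):
    """Resolve the port Tauri chose: --port > PORT env var > default."""
    environ = os.environ if environ is None else environ
    parser = argparse.ArgumentParser()
    parser.add_argument("--port", type=int, default=None)
    args, _ = parser.parse_known_args(argv)
    if args.port is not None:
        return args.port
    return int(environ.get("PORT", DEFAULT_PORT))

if __name__ == "__main__":
    port = resolve_port()
    # uvicorn.run("main:app", host="127.0.0.1", port=port)  # real server start
```

parse_known_args is used so unrelated flags don't crash the sidecar at startup.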

Health Check

Poll until the server is ready:

async fn wait_for_health(port: u16, max_retries: u32) -> Result<(), String> {
    let client = reqwest::Client::new();
    let url = format!("http://127.0.0.1:{}/health", port);
    for _ in 0..max_retries {
        if let Ok(res) = client.get(&url).send().await {
            if res.status().is_success() {
                return Ok(());
            }
        }
        tokio::time::sleep(Duration::from_millis(500)).await;
    }
    Err("Server did not respond".into())
}
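The same readiness poll is easy to reproduce when testing the agent server outside Tauri. A stdlib-only Python sketch (assumed helper, not part of the post's code):

```python
# Poll http://127.0.0.1:<port>/health until it answers 200, mirroring the
# Rust wait_for_health above; handy for testing the server standalone.
import time
import urllib.error
import urllib.request

def wait_for_health(port, max_retries=60, delay=0.5):
    url = f"http://127.0.0.1:{port}/health"
    for _ in range(max_retries):
        try:
            with urllib.request.urlopen(url, timeout=5) as res:
                if res.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet; retry after a short sleep
        time.sleep(delay)
    return False
```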

Once ready, emit server_ready to the frontend:

let _ = handle.emit("server_ready", port);

The frontend receives this event, sets the API base URL port, and activates the UI.

Crash Detection

After the server is ready, health checks continue every 10 seconds:

loop {
    tokio::time::sleep(Duration::from_secs(10)).await;
    match client.get(&url).timeout(Duration::from_secs(5)).send().await {
        Ok(res) if res.status().is_success() => {} // healthy
        _ => {
            let _ = handle.emit("agent_crashed", "Agent server is not responding");
            break;
        }
    }
}

If Python crashes, the frontend displays an error message.

Graceful Shutdown

On app exit, clean up the Python process safely:

.run(move |_app_handle, event| {
    if let tauri::RunEvent::ExitRequested { .. } = event {
        if let Some(child) = child_arc.lock().unwrap().take() {
            // 1. Send SIGTERM (graceful) and give FastAPI time to clean up
            #[cfg(unix)]
            {
                let pid = child.pid();
                let _ = Command::new("kill").args(["-15", &pid.to_string()]).output();
                std::thread::sleep(Duration::from_millis(2000));
            }
            // 2. Force-kill: SIGKILL on Unix, TerminateProcess on Windows
            let _ = child.kill();
        }
    }
});

SIGTERM triggers FastAPI’s lifespan shutdown (DB cleanup, queue drain). After 2 seconds, SIGKILL ensures termination.
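From Python's side, uvicorn's own SIGTERM handler is what triggers the lifespan shutdown hook. For a plain script without uvicorn, the equivalent wiring looks like this stdlib sketch (cleanup() is a placeholder):

```python
# Stdlib sketch of the SIGTERM side of the handshake. Under uvicorn this
# wiring is built in; shown here to make the 2-second deadline concrete.
import signal
import sys

def cleanup():
    # placeholder: checkpoint SQLite WAL, drain queues, close connections
    pass

def handle_sigterm(signum, frame):
    cleanup()
    sys.exit(0)  # exit cleanly before the 2-second SIGKILL deadline

signal.signal(signal.SIGTERM, handle_sigterm)
```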

PyInstaller Bundling

For distribution, bundle all of Python into a single binary:

# agent/bundle.py
cmd = [
    sys.executable, "-m", "PyInstaller",
    "--onefile",
    "--name", f"agent-server-{target_triple}",
    "--collect-submodules", "deepagents",
    "--collect-submodules", "langgraph",
    "--collect-submodules", "langchain_core",
    "main.py",
]

PyInstaller --onefile mode packs all dependencies into a single executable. Copy the output to app/src-tauri/binaries/ and Tauri bundles it:

// tauri.conf.json
{
  "bundle": {
    "externalBin": ["binaries/agent-server"]
  }
}

Tauri automatically selects agent-server-{target-triple} matching the current platform.
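The {target-triple} suffix has to match what Rust reports for the host. One way bundle.py might derive it from Python's platform info (hypothetical sketch; a real script could instead parse the output of rustc -Vv):

```python
# Map Python's platform info onto Rust target triples for sidecar naming.
# Hypothetical helper; Tauri expects e.g. agent-server-aarch64-apple-darwin.
import platform

_ARCH = {"x86_64": "x86_64", "AMD64": "x86_64",
         "arm64": "aarch64", "aarch64": "aarch64"}

def target_triple():
    arch = _ARCH.get(platform.machine(), platform.machine())
    system = platform.system()
    if system == "Darwin":
        return f"{arch}-apple-darwin"
    if system == "Windows":
        return f"{arch}-pc-windows-msvc"
    return f"{arch}-unknown-linux-gnu"
```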

CI Build Flow

GitHub Actions (per OS runner)
  |
  +-- 1. pip install -e . + pyinstaller
  +-- 2. python bundle.py -> binaries/agent-server-{target}
  +-- 3. npm ci
  +-- 4. tauri build -> .dmg / .msi / .deb (sidecar included)

Benchmark

| Metric | macOS (arm64) | Linux (x64) | Windows (x64) |
| --- | --- | --- | --- |
| PyInstaller binary size | ~95MB | ~88MB | ~102MB |
| Cold start (sidecar to /health response) | ~3.8s | ~4.5s | ~5.2s |
| Hot reload restart (dev mode) | ~1.2s | ~1.5s | ~1.8s |
| Idle memory (Python process) | ~135MB | ~120MB | ~145MB |
| Average health check polls | 7 (3.5s) | 9 (4.5s) | 10 (5.0s) |

FAQ

How large is the PyInstaller binary?

About 80-120MB including DeepAgents + LangChain + FastAPI. It is large, but the tradeoff of not requiring Python installation is well worth it.

Does SIGTERM work on Windows?

Windows has no SIGTERM. child.kill() calls TerminateProcess() for immediate termination. FastAPI’s lifespan shutdown may not run, but SQLite WAL mode guarantees data integrity.

What if there is a port conflict?

With 100 ports scanned sequentially, conflicts are virtually impossible. If all are in use, the default port (8008) is attempted and the Python server will error on startup.


Series

  1. DeepCoWork: I Built an AI Agent Desktop App
  2. [This post] Tauri 2 + Python Sidecar
  3. DeepAgents SDK Internals
  4. System Prompt Design per Mode
  5. SSE Streaming Pipeline
  6. HITL Approval Flow
  7. Multi-Agent ACP Mode
  8. Agent Memory 4 Layers
  9. Skills System
  10. LLM Provider Integration
  11. Security Checklist
  12. GitHub Actions Cross-Platform Build

AI-assisted content
