Go SDK

The Go SDK lets you instrument AI agents built with Go. It supports Go 1.21 and above.

Installation

bash
go get github.com/nodeloom/nodeloom-sdk-go

Quick Start

main.go
package main

import (
    "fmt"
    nodeloom "github.com/nodeloom/nodeloom-sdk-go"
)

func main() {
    client := nodeloom.New("sdk_...")
    defer client.Close()

    // Start a trace
    trace := client.Trace("customer-support-agent",
        nodeloom.WithInput(map[string]any{"query": "How do I reset my password?"}),
    )

    // Track an LLM call
    span := trace.Span("openai-call", nodeloom.SpanTypeLLM)
    // ... call your LLM ...
    span.SetOutput(map[string]any{"response": "You can reset your password..."})
    span.SetTokenUsage(150, 200, "gpt-4o") // prompt tokens, completion tokens, model
    span.End()

    // Track a tool call
    toolSpan := trace.Span("lookup-user", nodeloom.SpanTypeTool)
    toolSpan.SetInput(map[string]any{"email": "[email protected]"})
    // ... call your tool ...
    toolSpan.SetOutput(map[string]any{"userId": "u123"})
    toolSpan.End()

    // End the trace
    trace.End(nodeloom.StatusSuccess,
        nodeloom.WithOutput(map[string]any{"answer": "You can reset your password from..."}),
    )
}

Client Configuration

The client uses the functional options pattern:

| Option | Type | Default | Description |
|---|---|---|---|
| apiKey (1st arg) | string | (required) | Your SDK token (starts with sdk_) |
| WithEndpoint() | string | "https://api.nodeloom.io" | NodeLoom API URL |
| WithBatchSize() | int | 100 | Maximum events per batch |
| WithFlushInterval() | time.Duration | 5s | Duration between automatic flushes |
| WithMaxQueueSize() | int | 10000 | Maximum events held in memory before dropping |
go
client := nodeloom.New("sdk_...",
    nodeloom.WithBatchSize(50),
    nodeloom.WithFlushInterval(3 * time.Second),
    nodeloom.WithMaxQueueSize(5000),
)
defer client.Close()

Traces

A trace represents one run of your agent. Use defer client.Close() to ensure all events are flushed on exit.

go
// Basic trace
trace := client.Trace("my-agent")
// ... agent work ...
trace.End(nodeloom.StatusSuccess)

// Trace with options
trace := client.Trace("my-agent",
    nodeloom.WithInput(map[string]any{"query": "Tell me about AI"}),
    nodeloom.WithMetadata(map[string]any{"userId": "u123"}),
    nodeloom.WithEnvironment("production"),
)
// ... agent work ...
trace.End(nodeloom.StatusSuccess,
    nodeloom.WithOutput(map[string]any{"answer": "AI is..."}),
)

// Trace with error
trace := client.Trace("my-agent",
    nodeloom.WithInput(map[string]any{"query": "..."}),
)
result, err := runAgent()
if err != nil {
    trace.EndWithError(err)
} else {
    trace.End(nodeloom.StatusSuccess,
        nodeloom.WithOutput(result),
    )
}

Spans

Spans track individual operations within a trace. Always call End() on each span.

go
// Create a span
span := trace.Span("openai-call", nodeloom.SpanTypeLLM)
span.SetInput(map[string]any{"prompt": "Hello"})
result, err := callLLM("Hello")
if err != nil {
    span.EndWithError(err)
} else {
    span.SetOutput(map[string]any{"response": result})
    span.End()
}

// Span with token usage
span := trace.Span("llm-call", nodeloom.SpanTypeLLM)
// ... call your LLM ...
span.SetTokenUsage(150, 200, "gpt-4o") // prompt tokens, completion tokens, model
span.SetOutput(map[string]any{"response": result})
span.End()

Span Types

| Constant | Value | Use Case |
|---|---|---|
| SpanTypeLLM | llm | LLM API calls |
| SpanTypeTool | tool | Tool or function calls |
| SpanTypeRetrieval | retrieval | Vector search, document retrieval |
| SpanTypeAgent | agent | Sub-agent invocations |
| SpanTypeChain | chain | Chain or pipeline steps |
| SpanTypeCustom | custom | Any other operation |
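For instance, a document-retrieval step ahead of an LLM call can be tracked with SpanTypeRetrieval. This is a sketch: the searchDocs helper is hypothetical, not part of the SDK.

```go
// Track a vector-search step as a retrieval span.
retrieval := trace.Span("vector-search", nodeloom.SpanTypeRetrieval)
retrieval.SetInput(map[string]any{"query": "password reset"})

docs, err := searchDocs("password reset") // hypothetical helper
if err != nil {
    retrieval.EndWithError(err)
} else {
    retrieval.SetOutput(map[string]any{"documents": docs})
    retrieval.End()
}
```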

Custom Metrics

Record custom numeric metrics to track performance indicators like latency, cost, or quality scores.

go
client.Metric("response_latency", 1.23, nodeloom.MetricOpts{
    Unit: "seconds",
    Tags: map[string]string{"model": "gpt-4o"},
})
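A common pattern is to time an operation yourself and record the elapsed seconds (a sketch; the metric name and tag values are illustrative):

```go
start := time.Now()
// ... call your LLM ...
elapsed := time.Since(start).Seconds()

client.Metric("response_latency", elapsed, nodeloom.MetricOpts{
    Unit: "seconds",
    Tags: map[string]string{"model": "gpt-4o"},
})
```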

Feedback

Attach user feedback to a trace for evaluation and fine-tuning workflows.

go
// args: trace ID, rating, comment, tags
client.Feedback(trace.TraceID, 5, "Great response", []string{"accurate"})

Session Tracking

Group related traces into a session to track multi-turn conversations or long-running interactions.

go
trace := client.Trace("support-agent",
    nodeloom.WithSessionID("conv-123"),
    nodeloom.WithInput(map[string]any{"query": "Hello"}),
)
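Later turns in the same conversation pass the same session ID, so their traces are grouped together (a sketch):

```go
// Turn 1
t1 := client.Trace("support-agent",
    nodeloom.WithSessionID("conv-123"),
    nodeloom.WithInput(map[string]any{"query": "Hello"}),
)
t1.End(nodeloom.StatusSuccess)

// Turn 2: the same session ID groups this trace with the first.
t2 := client.Trace("support-agent",
    nodeloom.WithSessionID("conv-123"),
    nodeloom.WithInput(map[string]any{"query": "How do I reset my password?"}),
)
t2.End(nodeloom.StatusSuccess)
```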

Prompt Templates

Manage versioned prompt templates and associate them with spans for prompt tracking and iteration.

go
client.SetPrompt("system-prompt", nodeloom.PromptOpts{
    Content:   "You are a helpful assistant for {{company}}.",
    Variables: []string{"company"},
    ModelHint: "gpt-4o",
})
span.SetPromptInfo("system-prompt", 2) // template name, version

Callback URL

Register a callback URL for your agent to receive webhook notifications from NodeLoom.

go
client.RegisterCallback("my-agent", "https://my-agent.example.com/callback")

Guardrail Config

Fetch the guardrail configuration for your agent at runtime so your code can enforce the rules defined in the NodeLoom UI.

go
config, err := client.GetGuardrailConfig("my-agent")
if err != nil {
    // handle the fetch failure, e.g. fall back to safe defaults
}

Read-only

Guardrails are configured in the NodeLoom UI. The SDK provides read-only access to the configuration.

Shutdown

Always close the client to flush remaining events. The idiomatic Go pattern is to use defer:

go
client := nodeloom.New("sdk_...")
defer client.Close()  // flushes remaining events on exit

Context-aware shutdown

For servers with graceful shutdown, pass a context to CloseWithContext(ctx) to respect cancellation deadlines.
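For example, the final flush can be bounded by a shutdown deadline (a sketch; the 10-second timeout and the logging are assumptions):

```go
// Give the final flush at most 10 seconds before giving up.
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()

if err := client.CloseWithContext(ctx); err != nil {
    log.Printf("nodeloom: final flush incomplete: %v", err)
}
```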