Java SDK

The Java SDK lets you instrument AI agents built with Spring AI, LangChain4j, or custom Java code. It supports Java 11 and above.

Installation

Maven

pom.xml
<dependency>
  <groupId>io.nodeloom</groupId>
  <artifactId>nodeloom-sdk</artifactId>
  <version>0.6.0</version>
</dependency>

Gradle

build.gradle
implementation 'io.nodeloom:nodeloom-sdk:0.6.0'

Quick Start

QuickStart.java
import io.nodeloom.sdk.NodeLoom;
import io.nodeloom.sdk.Trace;
import io.nodeloom.sdk.Span;
import io.nodeloom.sdk.SpanType;

import java.util.Map;

public class QuickStart {
    public static void main(String[] args) {
        NodeLoom client = NodeLoom.builder()
            .apiKey("sdk_...")
            .build();

        // Using try-with-resources for automatic cleanup
        try (Trace trace = client.trace("customer-support-agent")
                .input(Map.of("query", "How do I reset my password?"))
                .start()) {

            // Track an LLM call
            try (Span span = trace.span("openai-call", SpanType.LLM)) {
                // ... call your LLM ...
                span.setOutput(Map.of("response", "You can reset your password..."));
                span.setTokenUsage(150, 200, "gpt-4o");
            }

            // Track a tool call
            try (Span span = trace.span("lookup-user", SpanType.TOOL)) {
                span.setInput(Map.of("email", "[email protected]"));
                // ... call your tool ...
                span.setOutput(Map.of("userId", "u123"));
            }

            // Trace ends successfully when try-with-resources closes
            trace.setOutput(Map.of("answer", "You can reset your password from..."));
        }

        // Always close before exit
        client.close();
    }
}

Client Configuration

The client uses a builder pattern for configuration:

Method              Type    Default                     Description
.apiKey()           String  (required)                  Your SDK token (starts with sdk_)
.endpoint()         String  "https://api.nodeloom.io"   NodeLoom API URL
.batchSize()        int     100                         Maximum events per batch
.flushIntervalMs()  long    5000                        Milliseconds between automatic flushes
.maxQueueSize()     int     10000                       Maximum events held in memory before dropping
java
NodeLoom client = NodeLoom.builder()
    .apiKey("sdk_...")
    .batchSize(50)
    .flushIntervalMs(3000)
    .maxQueueSize(5000)
    .build();

Traces

Traces implement AutoCloseable, so you can use try-with-resources for automatic cleanup. When the trace closes, it sends a trace_end event with a success status. If an exception propagates, it records the error instead.

java
// Try-with-resources (recommended)
try (Trace trace = client.trace("my-agent")
        .input(Map.of("query", "..."))
        .environment("production")
        .metadata(Map.of("userId", "u123"))
        .start()) {

    // ... agent work ...
    trace.setOutput(Map.of("answer", "..."));
}

// Manual management
Trace trace = client.trace("my-agent")
    .input(Map.of("query", "..."))
    .start();
try {
    // ... agent work ...
    trace.end("success", Map.of("answer", "..."));
} catch (Exception e) {
    trace.end("error", e.getMessage());
}

Spans

Spans also implement AutoCloseable. They record timing automatically.

java
// Try-with-resources (recommended)
try (Span span = trace.span("openai-call", SpanType.LLM)) {
    span.setInput(Map.of("prompt", "Hello"));
    String result = callLlm("Hello");
    span.setOutput(Map.of("response", result));
    span.setTokenUsage(150, 200, "gpt-4o");
}

// If an exception occurs inside the try block, the span records the error

// Manual management
Span span = trace.span("tool-call", SpanType.TOOL);
span.setInput(Map.of("query", "search term"));
try {
    Object result = runTool();
    span.setOutput(Map.of("result", result));
    span.end();
} catch (Exception e) {
    span.endWithError(e.getMessage());
}

Span Types

Constant            Value      Use Case
SpanType.LLM        llm        LLM API calls
SpanType.TOOL       tool       Tool or function calls
SpanType.RETRIEVAL  retrieval  Vector search, document retrieval
SpanType.AGENT      agent      Sub-agent invocations
SpanType.CHAIN      chain      Chain or pipeline steps
SpanType.CUSTOM     custom     Any other operation
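For example, a retrieval-augmented step can combine several of these types by recording one span per stage under the same trace. This is a sketch using the API shown in the Quick Start above; names and outputs are illustrative:

```java
try (Trace trace = client.trace("rag-agent")
        .input(Map.of("query", "What is our refund policy?"))
        .start()) {

    // Vector search runs before the LLM call
    try (Span span = trace.span("search-docs", SpanType.RETRIEVAL)) {
        span.setInput(Map.of("query", "refund policy"));
        span.setOutput(Map.of("docIds", List.of("d1", "d2")));
    }

    // Answer generation over the retrieved documents
    try (Span span = trace.span("generate-answer", SpanType.LLM)) {
        span.setTokenUsage(320, 80, "gpt-4o");
    }

    trace.setOutput(Map.of("answer", "Refunds are available within 30 days..."));
}
```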

Token Usage

java
try (Span span = trace.span("llm-call", SpanType.LLM)) {
    // ... call your LLM ...
    // Arguments: promptTokens, completionTokens, model
    span.setTokenUsage(150, 200, "gpt-4o");
}

Custom Metrics

Record custom numeric metrics to track performance indicators like latency, cost, or quality scores.

java
client.metric("response_latency", 1.23, "seconds", Map.of("model", "gpt-4o"));

Feedback

Attach user feedback to a trace for evaluation and fine-tuning workflows.

java
client.feedback(trace.getTraceId(), 5, "Great response", List.of("accurate"));

Session Tracking

Group related traces into a session to track multi-turn conversations or long-running interactions.

java
var trace = client.trace("support-agent", TraceOptions.builder()
    .sessionId("conv-123")
    .input(Map.of("query", "Hello"))
    .build());

Prompt Templates

Manage versioned prompt templates and associate them with spans for prompt tracking and iteration.

java
client.setPrompt("system-prompt", PromptOptions.builder()
    .content("You are a helpful assistant for {{company}}.")
    .variables(List.of("company"))
    .modelHint("gpt-4o")
    .build());
// Associate the span with template "system-prompt", version 2
span.setPromptInfo("system-prompt", 2);

Callback URL

Register a callback URL for your agent to receive webhook notifications from NodeLoom.

java
client.registerCallback("my-agent", "https://my-agent.example.com/callback");

Guardrail Config

Fetch the guardrail configuration for your agent at runtime so your code can enforce the rules defined in the NodeLoom UI.

java
Map<String, Object> config = client.getGuardrailConfig("my-agent");

Read-only

Guardrails are configured in the NodeLoom UI. The SDK provides read-only access to the configuration.
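The shape of the returned map depends on what you configure in the UI, so enforcement logic lives in your code. As a hedged sketch, assuming a hypothetical maxInputChars rule (the key name is illustrative, not part of the SDK), a check might look like:

```java
import java.util.Map;

public class GuardrailEnforcer {
    // Returns true when the input passes the (hypothetical) maxInputChars rule.
    // The "maxInputChars" key is illustrative; use whatever keys your UI defines.
    public static boolean withinInputLimit(Map<String, Object> config, String input) {
        Object limit = config.get("maxInputChars");
        if (limit instanceof Number) {
            return input.length() <= ((Number) limit).intValue();
        }
        return true; // no rule configured: allow
    }
}
```

You would pass the map returned by client.getGuardrailConfig("my-agent") into a helper like this before running the agent.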

Shutdown

Always call client.close() before your application exits. This flushes remaining events and shuts down the internal thread pool.

java
// Blocks until all events are flushed (up to 10s timeout)
client.close();

// With custom timeout
client.close(30, TimeUnit.SECONDS);

Spring Boot integration

If you register the NodeLoom client as a Spring bean, make sure close() runs at shutdown: declare the bean with @Bean(destroyMethod = "close"), call close() from a @PreDestroy method in your own component, or implement DisposableBean.
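A minimal wiring sketch using the destroyMethod approach; Spring invokes close() automatically when the application context shuts down (the NODELOOM_API_KEY environment variable is an assumption, not an SDK convention):

```java
import io.nodeloom.sdk.NodeLoom;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class NodeLoomConfig {
    // Spring calls close() on this bean during context shutdown,
    // flushing any buffered events before the JVM exits.
    @Bean(destroyMethod = "close")
    public NodeLoom nodeLoom() {
        return NodeLoom.builder()
            .apiKey(System.getenv("NODELOOM_API_KEY"))
            .build();
    }
}
```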