SDK Auto-Discovery
Agents instrumented with the NodeLoom SDK are automatically added to your inventory — no configuration needed.
How It Works
When your SDK sends a trace_start event, NodeLoom extracts the agent name, framework, language, and SDK version. It creates or updates the agent in your inventory automatically.
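To make the extraction concrete, here is an illustrative sketch of that step. The event shape, field names, and version value below are assumptions for illustration only, not NodeLoom's actual wire format:

```python
# Hypothetical trace_start payload (field names and values are
# illustrative assumptions, not the SDK's real wire format).
trace_start_event = {
    "event": "trace_start",
    "agent_name": "customer-support-agent",
    "framework": "LangChain",
    "language": "Python",
    "sdk_version": "1.4.2",  # placeholder version
}

# The inventory record is created or updated from these fields.
agent_record = {
    k: trace_start_event[k]
    for k in ("agent_name", "framework", "language", "sdk_version")
}
```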
As spans arrive, NodeLoom also tracks dependencies:
- LLM spans → model name and provider (e.g., gpt-4o / OpenAI)
- Tool spans → tool name (e.g., knowledge-base-search)
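The mapping above can be sketched as a small function. The span dictionaries and field names here are illustrative assumptions, not NodeLoom internals:

```python
# Sketch: map incoming spans to dependency records.
# (Span shape and field names are assumptions for illustration.)
def extract_dependency(span):
    if span["type"] == "LLM":
        return {"kind": "model", "name": span["model"]}
    if span["type"] == "TOOL":
        return {"kind": "tool", "name": span["name"]}
    return None  # other span types carry no dependency info

deps = [
    extract_dependency(s)
    for s in [
        {"type": "LLM", "name": "gpt-4o-chat", "model": "gpt-4o"},
        {"type": "TOOL", "name": "knowledge-base-search"},
    ]
]
```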
Framework Detection
The SDK automatically detects which AI framework is installed in your environment and includes it in every trace. Supported frameworks:
- LangChain — Python, TypeScript
- CrewAI — Python
- AutoGen — Python
- LlamaIndex — Python
- LangChain4j — Java
- Custom — any framework not in the list above
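One plausible way such detection can work is probing which framework packages are importable at initialization. This is a minimal sketch under that assumption; the package names, probe order, and fallback are illustrative, not the SDK's actual logic:

```python
import importlib.util

# Assumed mapping from importable package name to framework label.
FRAMEWORK_PACKAGES = {
    "langchain": "LangChain",
    "crewai": "CrewAI",
    "autogen": "AutoGen",
    "llama_index": "LlamaIndex",
}

def detect_framework(packages=FRAMEWORK_PACKAGES):
    # Return the first framework whose package is installed,
    # falling back to "Custom" when none is found.
    for package, framework in packages.items():
        if importlib.util.find_spec(package) is not None:
            return framework
    return "Custom"
```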
No configuration needed
Framework detection happens at SDK initialization. You don't need to pass it manually.
Example
A standard SDK trace automatically populates the inventory:
Python

```python
from nodeloom import NodeLoom, SpanType

client = NodeLoom(api_key="sdk_...")

# This trace auto-registers the agent in your inventory
with client.trace("customer-support-agent", agent_version="2.0") as trace:
    with trace.span("gpt-4o-chat", type=SpanType.LLM) as span:
        # Your LLM call
        span.set_token_usage(prompt=200, completion=150, model="gpt-4o")

    with trace.span("knowledge-base-search", type=SpanType.TOOL) as span:
        # Your tool call
        span.set_output({"results": [...]})

client.shutdown()
```

After this runs, your inventory will show:
- Agent: customer-support-agent (LangChain / Python)
- Model dependency: gpt-4o (OpenAI)
- Tool dependency: knowledge-base-search
- Status: MONITORED
Provider Inference
NodeLoom infers the AI provider from the model name automatically:
- gpt-4o, o1, o3 → OpenAI
- claude-3.5-sonnet → Anthropic
- gemini-1.5-pro → Google
- mistral-large → Mistral
- command-r → Cohere
- llama-3 → Meta
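One common way to implement this kind of inference is prefix matching on the model name. The sketch below assumes that approach; the exact rules and the `"Unknown"` fallback are illustrative, not NodeLoom's actual logic:

```python
# Assumed prefix rules, derived from the mapping listed above.
MODEL_PREFIXES = {
    "gpt-": "OpenAI",
    "o1": "OpenAI",
    "o3": "OpenAI",
    "claude-": "Anthropic",
    "gemini-": "Google",
    "mistral-": "Mistral",
    "command-": "Cohere",
    "llama-": "Meta",
}

def infer_provider(model_name):
    # Return the provider for the first matching prefix.
    for prefix, provider in MODEL_PREFIXES.items():
        if model_name.startswith(prefix):
            return provider
    return "Unknown"  # hypothetical fallback for unrecognized models
```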