
LangChain Adapter

The LangChainAdapter connects Edictum to LangChain and LangGraph agents with three integration methods.

AI Assistance

Right page if: you are wiring Edictum into a LangChain or LangGraph agent using ToolNode. Wrong page if: you need to compare all adapters side-by-side -- see https://docs.edictum.ai/docs/guides/adapter-comparison. For the common constructor API, see https://docs.edictum.ai/docs/adapters/overview. Gotcha: as_middleware() crashes when an asyncio loop is already running (Jupyter, FastAPI). Use as_tool_wrapper() (auto-bridges) or as_async_tool_wrapper() (native await) instead.

Every tool call passes through the Edictum pipeline before and after execution -- contracts are enforced, findings are produced, and audit events are emitted.

Getting Started

Install

pip install "edictum[yaml,langchain]"

This installs langchain-core >= 0.3. The yaml extra is needed to load YAML contract bundles.

Create adapter

from edictum import Edictum, Principal
from edictum.adapters.langchain import LangChainAdapter
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent, ToolNode

# Load contracts
guard = Edictum.from_yaml("contracts.yaml")

# Create adapter
adapter = LangChainAdapter(
    guard=guard,
    session_id="session-01",
    principal=Principal(user_id="alice", role="analyst"),
)

Wire into agent

# Wrap tools with contract enforcement
tool_node = ToolNode(
    tools=[search_tool, file_reader, calculator],
    wrap_tool_call=adapter.as_tool_wrapper(),
)

# Build agent -- contracts enforced on every tool call
agent = create_react_agent(model=ChatOpenAI(model="gpt-4o"), tools=tool_node)
result = agent.invoke({"messages": [("user", "Summarize the Q3 report")]})

Every tool call now passes through the pipeline. If a contract denies the call, the agent receives a ToolMessage with "DENIED: <reason>" and can try a different approach. Postconditions scan output after execution.
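The pipeline described above can be sketched in plain Python. This is a conceptual model only, not Edictum's actual implementation; `make_wrapper`, `check_pre`, `run_tool`, and `scan_post` are illustrative stand-ins:

```python
def make_wrapper(check_pre, run_tool, scan_post, on_warn=None):
    """Conceptual model of the enforcement pipeline around one tool call."""
    def wrapper(tool_name, tool_input):
        denied, reason = check_pre(tool_name, tool_input)
        if denied:
            # The agent receives this as a ToolMessage and can try another approach
            return f"DENIED: {reason}"
        result = run_tool(tool_name, tool_input)
        findings = scan_post(tool_name, result)
        if findings and on_warn is not None:
            # The callback's return value replaces the result sent to the agent
            result = on_warn(result, findings)
        return result
    return wrapper
```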

Which Method to Use

| Method | Context | Event loop handling | Recommendation |
| --- | --- | --- | --- |
| as_tool_wrapper() | Sync or mixed | Detects running loops, bridges via ThreadPoolExecutor | Use this by default |
| as_async_tool_wrapper() | Fully async (FastAPI, async LangGraph) | Native await -- no bridging | Use in async code |
| as_middleware() | Sync only | run_until_complete() -- breaks if loop is running | Legacy; avoid in new code |

as_tool_wrapper() (Default)

Returns a callable for ToolNode(wrap_tool_call=...). It handles nested event loops gracefully -- if an asyncio loop is already running (Jupyter, FastAPI), it bridges via ThreadPoolExecutor instead of crashing.

wrapper = adapter.as_tool_wrapper()
tool_node = ToolNode(tools=tools, wrap_tool_call=wrapper)
agent = create_react_agent(model, tools=tool_node)
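The loop-bridging behavior is conceptually similar to the following sketch (illustrative only; `run_coro_safely` is not part of Edictum's API):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def run_coro_safely(coro):
    """Run a coroutine from sync code, even if an event loop is already running."""
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        # No loop running: safe to run directly
        return asyncio.run(coro)
    # A loop is already running (Jupyter, FastAPI): run the coroutine
    # on a fresh loop in a worker thread instead of crashing
    with ThreadPoolExecutor(max_workers=1) as pool:
        return pool.submit(asyncio.run, coro).result()
```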

as_async_tool_wrapper() (Async Contexts)

Returns an async callable. No sync-to-async bridging -- the cleanest option for fully async code.

async_wrapper = adapter.as_async_tool_wrapper()
tool_node = ToolNode(tools=tools, wrap_tool_call=async_wrapper)
agent = create_react_agent(model, tools=tool_node)

Use this when your application is already async (FastAPI endpoints, async LangGraph workflows) and you want to avoid the thread pool overhead.

as_middleware() (Alternative)

Returns a @wrap_tool_call decorated function for tool_call_middleware. This uses asyncio.get_event_loop().run_until_complete() internally, which raises RuntimeError if an event loop is already running.

middleware = adapter.as_middleware()
agent = create_react_agent(model, tools=tools, tool_call_middleware=[middleware])
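The failure mode is plain asyncio behavior and can be reproduced without Edictum at all:

```python
import asyncio

async def guarded_call():
    return "ok"

def middleware_style_invoke():
    # What as_middleware() does internally -- fine in plain scripts,
    # but raises RuntimeError if a loop is already running
    return asyncio.get_event_loop().run_until_complete(guarded_call())

async def inside_running_loop():
    # Simulates calling the middleware from Jupyter or a FastAPI handler
    try:
        middleware_style_invoke()
        return "no error"
    except RuntimeError as exc:
        return str(exc)
```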

If you hit event loop errors with as_middleware(), switch to as_tool_wrapper() or as_async_tool_wrapper(). If you must use as_middleware() in an async context, apply nest_asyncio:

import nest_asyncio
nest_asyncio.apply()

Constructor Parameters

adapter = LangChainAdapter(
    guard=guard,
    session_id="session-01",
    principal=Principal(user_id="alice", role="analyst"),
    principal_resolver=lambda tool_name, tool_input: resolve(tool_name),
)

| Parameter | Type | Description |
| --- | --- | --- |
| guard | Edictum | An Edictum instance with loaded contracts |
| session_id | str \| None | Session identifier for session contracts. Auto-generated UUID if omitted |
| principal | Principal \| None | Static principal attached to every tool call |
| principal_resolver | Callable \| None | (tool_name, tool_input) -> Principal for dynamic resolution. Overrides the static principal when set |
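The precedence between principal and principal_resolver can be sketched as follows (illustrative only; `effective_principal` is not a real Edictum function, and plain dicts stand in for Principal objects):

```python
def effective_principal(static_principal, resolver, tool_name, tool_input):
    # When a resolver is set, it overrides the static principal per call
    if resolver is not None:
        return resolver(tool_name, tool_input)
    return static_principal
```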

Dynamic Principal Resolution

When different tools need different identity context -- for example, a tool that operates on behalf of the end user vs. an internal service tool:

def resolve_principal(tool_name: str, tool_input: dict) -> Principal:
    # current_user comes from your own request context (e.g. auth middleware)
    if tool_name in ("query_user_data", "update_profile"):
        return Principal(user_id=current_user.id, role="end_user")
    return Principal(service_id="internal-agent", role="service")

adapter = LangChainAdapter(
    guard=guard,
    principal_resolver=resolve_principal,
)

See mutable principal for the full pattern.

Mid-Session Principal Changes

Update the principal mid-session with set_principal(). Subsequent tool calls use the new principal.

adapter.set_principal(Principal(user_id="alice", role="admin"))

Postcondition Callbacks

All three methods accept on_postcondition_warn. The callback receives the tool result (already modified by redact or deny if applicable) and a list of findings. Its return value replaces the tool result sent to the agent.

import re

def redact_pii(result, findings):
    text = str(result)
    for f in findings:
        if f.type == "pii_detected":
            text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN REDACTED]", text)
            text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b", "[EMAIL REDACTED]", text)
    return text

wrapper = adapter.as_tool_wrapper(on_postcondition_warn=redact_pii)
tool_node = ToolNode(tools=tools, wrap_tool_call=wrapper)

With the LangChain adapter, the callback's return value replaces the result. See the adapter comparison for which adapters support result replacement and which treat the callback as side-effect-only.

Lifecycle Callbacks

Register on_deny and on_allow callbacks on the Edictum instance to react to every governance decision:

def log_denial(envelope, reason, contract_name):
    logger.warning(f"Denied {envelope.tool_name}: {reason} ({contract_name})")

def log_allow(envelope):
    logger.info(f"Allowed {envelope.tool_name}")

guard = Edictum.from_yaml(
    "contracts.yaml",
    on_deny=log_denial,
    on_allow=log_allow,
)

These fire regardless of which integration method you use.

Observe Mode

Deploy contracts in observe mode to see what would be denied without actually denying any tool calls:

guard = Edictum.from_yaml("contracts.yaml", mode="observe")
adapter = LangChainAdapter(guard=guard)
wrapper = adapter.as_tool_wrapper()

The wrapper allows all tool calls through. CALL_WOULD_DENY audit events are emitted for calls that would have been denied, so you can review enforcement behavior before enabling it. See observe mode for the full workflow.
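The difference between the two modes amounts to the following conceptual sketch (not Edictum internals; only the CALL_WOULD_DENY event name comes from the description above):

```python
def decide(mode, contract_denies):
    """Map a contract verdict to an (outcome, audit event) pair per mode."""
    if not contract_denies:
        return ("allow", None)
    if mode == "observe":
        # The call still goes through, but the would-be denial is
        # recorded as a CALL_WOULD_DENY audit event for later review
        return ("allow", "CALL_WOULD_DENY")
    # Enforce mode actually blocks the call
    return ("deny", None)
```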

Server Connection

Connect to Edictum Console for centralized contract management, hot-reload, and production approvals:

guard = Edictum.from_server(
    url="https://console.example.com",
    api_key="ek_...",
)

adapter = LangChainAdapter(guard=guard)
wrapper = adapter.as_tool_wrapper()

The agent receives contract updates via SSE and applies them without restart. See connecting agents for setup details.

Full Working Example

A complete example with contracts, principal, session limits, PII redaction, and lifecycle logging:

from edictum import Edictum, Principal
from edictum.adapters.langchain import LangChainAdapter
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent, ToolNode
import re

# Load contracts with lifecycle callbacks
guard = Edictum.from_yaml(
    "contracts.yaml",
    environment="production",
    on_deny=lambda env, reason, name: print(f"DENIED: {env.tool_name} -- {reason}"),
)

# Create adapter with identity and session tracking
adapter = LangChainAdapter(
    guard=guard,
    session_id="research-session-01",
    principal=Principal(user_id="researcher", role="analyst"),
)

# PII redaction callback
def redact_pii(result, findings):
    text = str(result)
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "***-**-****", text)
    return text

# Build the agent
tools = [search_tool, calculator_tool, file_reader_tool]
tool_node = ToolNode(
    tools=tools,
    wrap_tool_call=adapter.as_tool_wrapper(on_postcondition_warn=redact_pii),
)
agent = create_react_agent(model=ChatOpenAI(model="gpt-4o"), tools=tool_node)

# Run -- all contracts enforced
result = agent.invoke({"messages": [("user", "Summarize the Q3 report")]})
