
Quick Start

Add Configure to an existing agent. By the end, your agent will have persistent user memory, access to connected tools, and automatic memory extraction.

Prerequisites

Backend-only agents

No frontend? See Server-Side Users for an alternative authentication path. Note: this creates unfederated profiles with limited cross-agent capabilities.

1. Get the user's profile

profile.get() returns everything Configure knows about the user. .format() converts it to a string you can inject into your system prompt.

```typescript
import { ConfigureClient, CONFIGURE_TOOLS } from 'configure';

const client = new ConfigureClient('sk_...');

// token and userId come from your authentication flow
const profile = await client.profile.get(token, userId);
const systemPrompt = `You are TravelBot, a travel assistant.\n\n${profile.format()}`;
```

Your prompt stays yours; Configure doesn't touch it. .format() returns the context string, including best practices for handling personal data: grounding, transparency, and connection errors.

2. Pass tools to your LLM

CONFIGURE_TOOLS gives the LLM access to profile reads, memory saves, email search, calendar, drive, notes, and web search. Pass them alongside your system prompt.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic();

// Keep the running conversation history; the tool loop in step 3 appends to it
const messages: Anthropic.MessageParam[] = [{ role: 'user', content: userMessage }];

let response = await anthropic.messages.create({
  model: 'claude-sonnet-4-20250514',
  max_tokens: 4096,
  system: systemPrompt,
  tools: CONFIGURE_TOOLS,
  messages,
});
```

TIP

Tools are in Anthropic's format by default. For OpenAI or other providers, use toOpenAIFunctions(CONFIGURE_TOOLS).
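Under the hood, that conversion wraps each Anthropic-style tool (name, description, input_schema) in OpenAI's function-tool envelope; both formats describe parameters with JSON Schema. The sketch below is an illustrative reimplementation, not the SDK's actual toOpenAIFunctions:

```typescript
interface AnthropicTool {
  name: string;
  description: string;
  input_schema: Record<string, any>;
}

// Illustrative sketch of the Anthropic-to-OpenAI tool-format mapping
function toOpenAITools(tools: AnthropicTool[]) {
  return tools.map((t) => ({
    type: 'function' as const,
    function: {
      name: t.name,
      description: t.description,
      parameters: t.input_schema, // both formats use JSON Schema here
    },
  }));
}
```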

3. Handle tool calls

When the LLM calls a Configure tool, route it to the SDK. Run a loop until the model stops calling tools.

```typescript
while (response.stop_reason === 'tool_use') {
  // Collect every tool result from this turn into a single user message —
  // the Messages API expects alternating assistant/user roles
  const toolResults: Anthropic.ToolResultBlockParam[] = [];

  for (const block of response.content) {
    if (block.type === 'tool_use') {
      const result = await handleConfigureTool(
        client, token, userId, block.name, block.input as Record<string, any>,
      );
      toolResults.push({ type: 'tool_result', tool_use_id: block.id, content: result });
    }
  }

  messages.push({ role: 'assistant', content: response.content });
  messages.push({ role: 'user', content: toolResults });

  response = await anthropic.messages.create({
    model: 'claude-sonnet-4-20250514',
    max_tokens: 4096,
    system: systemPrompt,
    tools: CONFIGURE_TOOLS,
    messages,
  });
}

const assistantResponse = response.content
  .filter((b): b is Anthropic.TextBlock => b.type === 'text')
  .map(b => b.text)
  .join('');
```
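The loop above runs until the model stops requesting tools. In production you may want to bound it; a minimal sketch, with an arbitrary assumed limit (not a Configure setting):

```typescript
// Sketch: bound the agentic loop so a misbehaving model can't spin forever.
// The default of 10 rounds is an assumption; tune it for your agent.
function makeLoopGuard(maxRounds = 10) {
  let rounds = 0;
  return (stopReason: string | null): boolean =>
    stopReason === 'tool_use' && ++rounds <= maxRounds;
}
```

With `const shouldContinue = makeLoopGuard();`, the while condition becomes `shouldContinue(response.stop_reason)`.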

handleConfigureTool routes each tool name to the appropriate SDK method. See Tool Calling for the full implementation.
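At its core that routing is a name-to-handler dispatch. A minimal sketch — the handler names below are illustrative stubs, not the real SDK methods; see Tool Calling for the actual mapping:

```typescript
type ToolHandler = (input: Record<string, any>) => Promise<string>;

// Dispatch a tool call by name. Unknown tools are reported back to the
// model as a result string rather than thrown, so the loop keeps going.
function buildToolRouter(handlers: Record<string, ToolHandler>) {
  return async (name: string, input: Record<string, any>): Promise<string> => {
    const handler = handlers[name];
    if (!handler) return `Unknown tool: ${name}`;
    return handler(input);
  };
}

// Illustrative stubs; real handlers would call the Configure SDK
const route = buildToolRouter({
  get_profile: async () => 'profile context',
  save_memory: async (input) => `Saved: ${input.memory}`,
});
```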

4. Extract memories

After each turn, call profile.ingest() to extract and store memories. Skip the await and it's fire-and-forget: extraction runs in the background without delaying your response.

```typescript
// No await: extraction runs in the background, but don't drop errors silently
client.profile.ingest(token, userId, [
  { role: 'user', content: userMessage },
  { role: 'assistant', content: assistantResponse },
]).catch(console.error);
```

Pass memoryCriteria to control what gets extracted (e.g., "travel preferences, dietary restrictions"). On the next session, those memories are part of the profile.

That's it

Four steps: get profile, pass tools, handle calls, extract memories. Your agent now has persistent memory across sessions.

Profiles start empty

Your first user will get a generic greeting — their profile has nothing in it yet. Seed the profile to get personalized responses from the first message. Connect Gmail and your agent knows the user's name, occupation, and interests within 10 seconds.
