A Hook is a TypeScript function that runs at a specific point in the execution pipeline. This page introduces Hooks through their most extreme form: handling a request entirely in code, with no LLM involved.
## How Hooks Work
Create `src/index.ts` in your assistant directory and export `Create` and/or `Next` functions:
```typescript
// assistants/my-assistant/src/index.ts
import { agent } from "@yao/runtime";
export function Create(
  ctx: agent.Context,
  messages: agent.Message[]
): agent.Create {
  // runs before the LLM
  return { messages };
}

export function Next(
  ctx: agent.Context,
  payload: agent.Payload
): agent.Next | null {
  // runs after the LLM
  return null;
}
```
Return `null` from either function to use default behavior. Both are optional.
## Skipping the LLM
The LLM is called only when `prompts.yml` exists or MCP servers are configured. Without either, the `Create` Hook owns the entire response. Use `ctx.Send()` to send output directly:
```typescript
export function Create(
  ctx: agent.Context,
  messages: agent.Message[]
): agent.Create {
  ctx.Send("I received your message.");
  return { messages }; // LLM is skipped because there is no prompts.yml and no MCP — not because of this return
}
```
`package.yao` for a Pure Hook agent — no prompts, no MCP:
```json
{
  "name": "My Pure Hook Agent",
  "connector": "$ENV.DEFAULT_CONNECTOR"
}
```
Even with `connector` set, the LLM will not be called as long as `prompts.yml` is absent and no MCP is configured.
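Putting the pieces together, a Pure Hook agent's directory needs only the manifest and the Hook module (names here follow the paths used on this page):

```
assistants/my-assistant/
├── package.yao    # manifest — no prompts.yml, no MCP configuration
└── src/
    └── index.ts   # Create and/or Next Hooks
```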
## Example: Menu Router
From `yaobots/assistants/tests/messages/src/index.ts` — routes user input to different handlers without any LLM:
```typescript
export function Create(
  ctx: agent.Context,
  messages: agent.Message[]
): agent.Create {
  const input = messages[messages.length - 1]?.content?.toLowerCase() || "";

  if (input.includes("help")) {
    const msgId = ctx.SendStream({ type: "text", props: { content: "" } });
    ctx.Append(msgId, "# Available commands\n\n");
    ctx.Append(msgId, "- **hello** — greet\n");
    ctx.Append(msgId, "- **time** — current time\n");
    ctx.Append(msgId, "- **help** — this menu\n");
    ctx.End(msgId);
    return { messages };
  }

  if (input.includes("time")) {
    ctx.Send(`Current time: ${new Date().toISOString()}`);
    return { messages };
  }

  if (input.includes("hello")) {
    ctx.Send("Hello! How can I help you?");
    return { messages };
  }

  // Fall through — with prompts.yml or MCP configured, the LLM handles it
  return { messages };
}
```
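The last-message lookup at the top of this Hook is easy to get subtly wrong (empty history, missing `content`). A small standalone helper keeps it in one place; this sketch uses a minimal local message shape rather than the real `agent.Message` type, which may carry more fields:

```typescript
// Minimal message shape for illustration only — not the real agent.Message.
interface Msg {
  role: string;
  content?: string;
}

// Lower-cased content of the last message, or "" when the history is
// empty or the last message has no content.
function lastUserInput(messages: Msg[]): string {
  return messages[messages.length - 1]?.content?.toLowerCase() ?? "";
}
```

With this in place, the router body starts with `const input = lastUserInput(messages);` and the optional-chaining details stay out of the Hook itself.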
## Example: Mixed Mode
Handle specific cases in code, let the LLM handle everything else:
```typescript
export function Create(
  ctx: agent.Context,
  messages: agent.Message[]
): agent.Create {
  const input = messages[messages.length - 1]?.content || "";

  // Handle status checks without LLM
  if (input.toLowerCase() === "status") {
    const status = getSystemStatus(); // your function
    ctx.Send(`System status: ${status}`);
    return { messages };
  }

  // Everything else goes to the LLM
  return { messages };
}
```
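As the number of code-handled cases grows, the if-chain can be replaced by a lookup table. A hypothetical sketch (the handler names and replies are illustrative, not part of the runtime):

```typescript
// Map exact commands to handlers that produce the reply text.
type Handler = () => string;

const handlers: Record<string, Handler> = {
  status: () => "System status: ok", // stand-in for a real status check
  ping: () => "pong",
};

// Returns the handled reply, or null meaning "send this to the LLM".
function route(input: string): string | null {
  const handler = handlers[input.toLowerCase().trim()];
  return handler ? handler() : null;
}
```

In the Hook, a non-null result is passed to `ctx.Send()` before returning; a null result falls through to the LLM as in the example above.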
## Streaming Output
For longer responses, stream chunks instead of sending all at once:
```typescript
const msgId = ctx.SendStream({ type: "text", props: { content: "" } });
for (const chunk of generateChunks()) {
  ctx.Append(msgId, chunk);
}
ctx.End(msgId);
```
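`generateChunks()` above stands in for whatever produces your output incrementally. One hypothetical implementation (the name and chunk size are illustrative, not part of the runtime API) splits an already-built string into fixed-size pieces:

```typescript
// Yield `text` in pieces of at most `size` characters.
// Illustrative only — any iterable of strings works with ctx.Append.
function* chunksOf(text: string, size = 64): Generator<string> {
  for (let i = 0; i < text.length; i += size) {
    yield text.slice(i, i + size);
  }
}
```

In practice the chunks would more often come from an async source (a file read, an upstream stream), but the append-then-end pattern around `msgId` stays the same.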
## What's Next
You've seen how Hooks intercept execution. Next, learn how to use the `Create` Hook with an LLM running behind it.
→ **[Create Hook](./hook-create)**