Part II: Building Applications with LLM APIs
Structured outputs, tool calling, sampling, and prompt engineering techniques that hold up in production.
The Tool Calling Loop
Step 1: Request with Tools
The developer sends a prompt to the model, including a list of available `tools`, each defined by a JSON Schema.
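A minimal sketch of this step and the loop it kicks off, assuming an OpenAI-style `chat.completions` API; the `get_weather` tool, its schema, and the model name are illustrative, not part of the original text.

```python
# Minimal sketch of the tool calling loop, assuming an OpenAI-style
# chat.completions API. The get_weather tool and model name are illustrative.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}]

def get_weather(city: str, unit: str = "celsius") -> dict:
    # Stand-in implementation; a real app would call a weather service here.
    return {"city": city, "temperature": 21, "unit": unit}

messages = [{"role": "user", "content": "What's the weather in Lisbon?"}]

while True:
    # Step 1: send the prompt plus the list of available tools.
    response = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=tools
    )
    msg = response.choices[0].message

    if not msg.tool_calls:
        print(msg.content)      # model answered directly; the loop is done
        break

    # The model requested tool calls: run them and return the results
    # as `tool` messages so the model can compose its final answer.
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = get_weather(**args)   # in general, dispatch on call.function.name
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": json.dumps(result),
        })
```

The model either answers directly or returns `tool_calls`; executing those calls and appending the results as `tool` messages is what closes the loop.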
Structured Outputs
- Enable JSON mode if available; validate and retry on error.
- Define function/tool schemas with required fields and enums.
- Prefer small, composable functions with typed arguments (see the sketch after this list).
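A sketch of the validate-and-retry pattern under two assumptions: Pydantic is used for the schema and validation, and `call_model` stands in for whatever JSON-mode chat call the application makes.

```python
# Sketch: one typed, composable action plus validate-and-retry on its JSON.
# Assumes Pydantic; call_model is a placeholder for a JSON-mode chat call.
from typing import Literal
from pydantic import BaseModel, ValidationError

class CreateTicket(BaseModel):
    # Small, composable action with typed arguments.
    title: str
    priority: Literal["low", "medium", "high"]   # becomes an enum in the JSON Schema
    assignee: str | None = None

# The same model can supply the function/tool parameter schema.
TICKET_SCHEMA = CreateTicket.model_json_schema()

def call_model(prompt: str) -> str:
    raise NotImplementedError  # placeholder: JSON-mode chat call returning raw text

def parse_ticket(prompt: str, max_retries: int = 3) -> CreateTicket:
    feedback = ""
    for _ in range(max_retries):
        raw = call_model(prompt + feedback)
        try:
            return CreateTicket.model_validate_json(raw)   # parse and validate
        except ValidationError as err:
            # Feed the validation errors back so the next attempt can self-correct.
            feedback = f"\nThe previous output failed validation:\n{err}"
    raise RuntimeError("model did not produce valid JSON within the retry budget")
```

Returning the validation errors to the model on retry usually converges in one or two attempts and keeps the schema as the single source of truth for both the tool definition and the runtime check.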
Sampling & Prompting
- Temperature, top-k, and top-p trade diversity against determinism.
- Defensive prompts (refusal guidance, explicit constraints) improve safety.
- Tool-first prompting steers the model toward schema-backed function calls instead of free-form answers (see the sketch after this list).
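A sketch of how these settings typically appear in a request, again assuming an OpenAI-style API (which exposes `temperature` and `top_p` but not top-k; providers that support top-k accept it as a sibling parameter). The system prompt and model name are illustrative.

```python
# Sketch: sampling controls plus a defensive, tool-first system prompt,
# assuming an OpenAI-style chat.completions API.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a billing support assistant. Use the provided tools for any "
    "account or invoice lookup instead of guessing. If a request falls "
    "outside billing support or asks for another customer's data, refuse "
    "briefly and say why."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Why was my last invoice higher than usual?"},
    ],
    # In practice, pass tools=[...] with the schemas from the earlier sketch
    # so the "tool-first" instruction has something to call.
    temperature=0.2,   # low temperature: near-deterministic, good for tool arguments
    top_p=0.9,         # nucleus sampling: keep only the top 90% probability mass
)
print(response.choices[0].message.content)
```

Low temperature keeps tool arguments and structured output stable; higher values suit drafting or brainstorming tasks.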