Getting Started
With prompting fundamentals in place, you are ready to connect to LLM providers programmatically. This step is where things start feeling like real agent development: your code sends prompts, receives responses, and orchestrates tool calls in a loop.
Install the official client libraries to get started:
pip install anthropic openai
Both the Anthropic and OpenAI APIs follow a similar pattern. You create a client, build a messages array, and send it:
from anthropic import Anthropic

client = Anthropic()  # reads ANTHROPIC_API_KEY from env

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Explain async/await in Python in 3 sentences."}
    ],
)

print(response.content[0].text)
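The OpenAI equivalent follows the same client/messages shape, just through the chat completions endpoint. A hedged sketch (the model name is an assumption, and the call is guarded on the API key so the snippet is runnable offline):

```python
import os

messages = [{"role": "user", "content": "Explain async/await in Python in 3 sentences."}]

# Only hit the network when a key is configured; "gpt-4o" is an
# illustrative model name, not a recommendation
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from env
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    print(response.choices[0].message.content)
```

The request payload is structurally identical; only the response shape differs (`response.choices[0].message.content` versus `response.content[0].text`).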
Key Concepts
Function calling (also called tool use) is the mechanism that transforms a chatbot into an agent. You define tools as structured schemas, and the model decides when to invoke them based on the conversation:
tools = [
    {
        "name": "calculate",
        "description": "Evaluate a mathematical expression",
        "input_schema": {
            "type": "object",
            "properties": {
                "expression": {
                    "type": "string",
                    "description": "Math expression to evaluate, e.g. '2 + 2'"
                }
            },
            "required": ["expression"]
        }
    }
]

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    tools=tools,
    messages=[{"role": "user", "content": "What is 1847 * 293?"}]
)
When the model decides to use a tool, the response contains a tool_use content block instead of (or alongside) text. Your code must execute the tool and return the result:
import ast
import operator

# ast.literal_eval only accepts literals, so walk the parsed AST and
# apply a whitelist of arithmetic operators instead
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(node):
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.BinOp) and type(node.op) in OPS:
        return OPS[type(node.op)](safe_eval(node.left), safe_eval(node.right))
    raise ValueError("unsupported expression")

# Check whether the model wants to use a tool
for block in response.content:
    if block.type == "tool_use":
        result = safe_eval(ast.parse(block.input["expression"], mode="eval").body)
        # Send the result back to continue the conversation
Streaming is important for user-facing applications. Instead of waiting for the full response, you can process tokens as they arrive:
with client.messages.stream(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a short poem about APIs."}]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
Hands-On Practice
Build your chatbot incrementally. Start with a simple text-in, text-out loop, then add one tool at a time. The critical pattern to internalize is the tool use loop:
1. Send the user message along with the tool definitions
2. Receive the response (it may contain tool calls)
3. If it contains a tool call: execute the tool, append the result, and go to step 1
4. If it is a text response: display it to the user
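The steps above can be sketched as a small driver function. Here `call_model` and `run_tool` are hypothetical placeholders for the API call (with tool definitions attached) and your tool implementations, and responses are modeled as plain dicts:

```python
# Minimal sketch of the tool-use loop; `call_model` sends the messages
# to the API and `run_tool` executes one tool -- both are placeholders
def agent_loop(call_model, run_tool, messages, max_turns=10):
    for _ in range(max_turns):
        reply = call_model(messages)
        tool_calls = [b for b in reply["content"] if b["type"] == "tool_use"]
        if not tool_calls:
            # Plain text response: hand it to the user and stop
            return reply["content"][0]["text"]
        # Record the assistant turn, then append one result per tool call
        messages.append({"role": "assistant", "content": reply["content"]})
        results = [{"type": "tool_result", "tool_use_id": b["id"],
                    "content": run_tool(b["name"], b["input"])}
                   for b in tool_calls]
        messages.append({"role": "user", "content": results})
    raise RuntimeError("agent exceeded max_turns without finishing")
```

The `max_turns` cap is worth keeping even in production: it turns a runaway tool loop into a clean error instead of an unbounded API bill.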
Handle errors at every stage. APIs return rate-limit errors (HTTP 429), server errors (HTTP 5xx), and occasionally malformed responses. Implement exponential backoff for retries, and always set reasonable timeouts. These patterns become essential once your agents run autonomously.
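One way to sketch the retry pattern is a small wrapper: retry only on retryable status codes, double the delay each attempt, and add jitter so many clients do not retry in lockstep. The status-code list and bounds here are illustrative defaults, not values from any particular SDK:

```python
import random
import time

# Retry helper with exponential backoff and jitter; assumes the raised
# exception exposes a `status_code` attribute (most SDK errors do)
def with_retries(fn, max_attempts=5, base_delay=1.0, max_delay=30.0):
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as exc:
            status = getattr(exc, "status_code", None)
            # Re-raise immediately on non-retryable errors or final attempt
            if status not in (429, 500, 502, 503) or attempt == max_attempts - 1:
                raise
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(delay * random.uniform(0.5, 1.0))  # jitter
```

You would wrap each API call, e.g. `with_retries(lambda: client.messages.create(...))`, and pair it with a client-level timeout so a hung connection also surfaces as a retryable error.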