## Overview
This cookbook demonstrates:

- Making a basic chat completion request to OpenAI
- Capturing AI request and response events
- Viewing and validating captured data
## Repository

## Files
### Application Code

`app.py`:
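The file's contents are not reproduced here; below is a minimal sketch of what a basic chat-completion `app.py` might look like, assuming the official `openai` Python client (the model name and prompt are placeholders):

```python
def build_messages(prompt: str) -> list:
    """Single-turn message list for the Chat Completions API."""
    return [{"role": "user", "content": prompt}]

def main() -> None:
    # Requires `pip install openai` and OPENAI_API_KEY in the environment.
    from openai import OpenAI
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat model works
        messages=build_messages("Say hello in one sentence."),
    )
    print(response.choices[0].message.content)

# main()  # uncomment to make a live request
```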
## Running

### With Docker Compose
- OISP Sensor starts and loads eBPF programs
- The Python app runs and makes an OpenAI API call
- The sensor captures SSL traffic and extracts events
- Events are written to `output/events.jsonl`
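Once the run completes, the captured events can be inspected directly. A small sketch of counting events by type, assuming each line of `events.jsonl` is a JSON object with a `type` field (field names are illustrative):

```python
import json

def count_event_types(lines):
    """Tally JSONL event lines by their (assumed) "type" field."""
    counts = {}
    for line in lines:
        event = json.loads(line)
        counts[event["type"]] = counts.get(event["type"], 0) + 1
    return counts

# Synthetic lines standing in for output/events.jsonl:
sample = [
    '{"type": "ai.request", "model": "gpt-4o-mini"}',
    '{"type": "ai.response", "usage": {"total_tokens": 42}}',
]
print(count_event_types(sample))  # {'ai.request': 1, 'ai.response': 1}
```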
### Without Docker
## Expected Output

### Application Output

### Captured Events

### Event Details
`ai.request` event:

## Validation
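One way to validate a capture is to check that both AI event types are present and that requests carry the expected fields. A sketch, again assuming illustrative field names:

```python
import json

REQUIRED_TYPES = {"ai.request", "ai.response"}

def validate_events(lines):
    """Return a list of problems found in the captured JSONL events."""
    problems = []
    seen = set()
    for i, line in enumerate(lines, start=1):
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            problems.append(f"line {i}: not valid JSON")
            continue
        seen.add(event.get("type"))
        if event.get("type") == "ai.request" and "model" not in event:
            problems.append(f"line {i}: ai.request missing 'model'")
    for missing in sorted(REQUIRED_TYPES - seen):
        problems.append(f"no {missing} event captured")
    return problems

sample = ['{"type": "ai.request", "model": "gpt-4o-mini"}']
print(validate_events(sample))  # ['no ai.response event captured']
```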
## What Gets Captured
| Event | What's Captured |
|---|---|
| `ai.request` | Model, messages, parameters, timestamp |
| `ai.response` | Response content, token usage, timing |
| `network.connect` | Connection to api.openai.com:443 |
| `process.exec` | python3 execution |
## Analysis
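A simple analysis is to total token usage across responses. A sketch assuming `ai.response` events carry a `usage.total_tokens` field (the exact schema may differ):

```python
import json

def total_tokens(lines):
    """Sum total_tokens across ai.response events (schema assumed)."""
    total = 0
    for line in lines:
        event = json.loads(line)
        if event.get("type") == "ai.response":
            total += event.get("usage", {}).get("total_tokens", 0)
    return total

sample = [
    '{"type": "ai.request", "model": "gpt-4o-mini"}',
    '{"type": "ai.response", "usage": {"total_tokens": 42}}',
]
print(total_tokens(sample))  # 42
```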
## Try It Yourself
Modify `app.py` to:

- Use different models
- Add a system prompt
- Make multiple requests
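The three modifications above only change how the request is built. A sketch of the request payloads, with model names and prompts as placeholders:

```python
def chat_request(model, prompt, system=None):
    """Build the kwargs for client.chat.completions.create(...)."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return {"model": model, "messages": messages}

# Use a different model:
req = chat_request("gpt-4o", "Hello!")
# Add a system prompt:
req = chat_request("gpt-4o-mini", "Hello!", system="Answer in one sentence.")
# Make multiple requests (each produces its own ai.request/ai.response pair):
requests = [chat_request("gpt-4o-mini", p) for p in ["First", "Second"]]
```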
## Next Steps
- LiteLLM - Use multiple AI providers
- LangChain Agent - Build agents with tools
- FastAPI Service - Production API service