# Chat Completions API

> **IMPORTANT:** Starting a new project? We recommend trying Responses to take advantage of the latest Octagon platform features. Compare Chat Completions with Responses.

> **Note:** Streaming is required for all Octagon agents; non-streaming requests are not supported.
The Chat Completions API provides a simple, OpenAI-compatible interface for interacting with Octagon's specialized agents. This guide explains how to use the API effectively.
## Overview of the Chat Completions API
Chat Completions is a straightforward way to integrate with Octagon's financial agents using the familiar OpenAI-compatible format. It offers:
- Simple message-based format: Easy to implement with a standard chat interface
- OpenAI compatibility: Works with existing OpenAI client libraries
- Structured citations: Provides source references in a consistent format
- Multi-turn conversations: Supports context from previous messages
## Installation

Install the OpenAI SDK:

```bash
# Python
pip install openai
```

```bash
# JavaScript
npm install openai
```
## Request Parameters

| Parameter | Type | Required | Description |
|---|---|---|---|
| `messages` | array | Yes | An array of message objects representing the conversation so far. |
| `model` | string | Yes | The ID of the agent model to use. See Available Agents. |
| `stream` | boolean | Yes | Must be set to `true` for all agent requests. |
Each message object in the `messages` array should have the following structure:

| Field | Type | Description |
|---|---|---|
| `role` | string | The role of the message sender. Can be `"user"` or `"assistant"`. |
| `content` | string | The content of the query or response. |
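For multi-turn conversations, earlier turns are simply replayed in the `messages` array. A minimal sketch of such an array (the assistant content is a placeholder standing in for whatever the agent actually returned on the previous turn):

```python
# Multi-turn context: replay earlier turns so the agent can resolve
# follow-up questions. The assistant content below is a placeholder for
# the agent's real earlier answer, not actual data.
messages = [
    {"role": "user", "content": "What was Apple's revenue growth rate in Q3 2023?"},
    {"role": "assistant", "content": "<the agent's earlier answer>"},
    {"role": "user", "content": "How does that compare to the prior quarter?"},
]
```

Passing this array to `chat.completions.create` lets the agent interpret "that" in the final question using the earlier exchange.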
## Basic Usage Example

### Python
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-octagon-api-key",
    base_url="https://api.octagonagents.com/v1"
)

response = client.chat.completions.create(
    model="octagon-sec-agent",
    messages=[
        {"role": "user", "content": "What was Apple's revenue growth rate in Q3 2023 compared to Q3 2022?"}
    ],
    stream=True
)

full_analysis = ""
citations = []

# Process each chunk of the streaming response
for chunk in response:
    if not chunk.choices:
        continue
    choice = chunk.choices[0]
    if choice.delta.content:
        content = choice.delta.content
        full_analysis += content
        print(content, end="")

    # The final chunk may also carry citation metadata
    chunk_citations = getattr(choice, "citations", None)
    if chunk_citations:
        citations = chunk_citations

print("\n\nSOURCES:")
for citation in citations:
    print(f"{citation.order}. {citation.name}: {citation.url}")
```
### JavaScript

```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your-octagon-api-key',
  baseURL: 'https://api.octagonagents.com/v1',
});

async function getAnalysis() {
  const stream = await openai.chat.completions.create({
    model: 'octagon-sec-agent',
    messages: [
      { role: 'user', content: "What was Apple's revenue growth rate in Q3 2023 compared to Q3 2022?" }
    ],
    stream: true
  });

  let fullAnalysis = '';
  let citations = [];

  for await (const chunk of stream) {
    // Accumulate content
    const content = chunk.choices[0]?.delta?.content || '';
    if (content) {
      fullAnalysis += content;
      process.stdout.write(content);
    }

    // Check for citations in the final chunk
    if (chunk.choices[0]?.finish_reason === 'stop' && chunk.choices[0]?.citations) {
      citations = chunk.choices[0].citations;
    }
  }

  console.log('\n\nSOURCES:');
  citations.forEach(citation => {
    console.log(`${citation.order}. ${citation.name}: ${citation.url}`);
  });
}

getAnalysis();
```
### cURL

```bash
curl -X POST https://api.octagonagents.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-octagon-api-key" \
  -d '{
    "model": "octagon-sec-agent",
    "messages": [
      {
        "role": "user",
        "content": "What was Apple'\''s revenue growth rate in Q3 2023 compared to Q3 2022?"
      }
    ],
    "stream": true
  }' \
  --no-buffer

# Note: handling the streamed response with cURL requires additional
# processing to extract the deltas and citations.
```
## Understanding the Streaming Format

The API returns a stream of data chunks in OpenAI-compatible format. Each chunk is a JSON object prefixed with `data: ` and followed by two newline characters (`\n\n`). The stream is terminated by a final `data: [DONE]\n\n` sentinel.
Each chunk has the following structure:

```json
{
  "id": "response_id",
  "object": "chat.completion.chunk",
  "created": 1700000000,
  "model": "octagon-sec-agent",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": "Part of the content..."
      },
      "finish_reason": null
    }
  ]
}
```
The final chunk has a `finish_reason` of `"stop"` and may include additional metadata, such as citations to sources.
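If you are not using an SDK (for example, when consuming the cURL stream above), this framing can be parsed by hand. Below is a minimal sketch of such a parser using only the Python standard library; the two synthetic chunks follow the structure shown above and are illustrative, not real API output:

```python
import json

def parse_sse_lines(lines):
    """Yield parsed chunk dicts from raw SSE lines, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank separator lines and any non-data lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        yield json.loads(payload)

# Two synthetic chunks in the format described above:
raw = [
    'data: {"choices": [{"delta": {"content": "Part one "}, "finish_reason": null}]}',
    "",
    'data: {"choices": [{"delta": {"content": "part two."}, "finish_reason": "stop"}]}',
    "",
    "data: [DONE]",
]
text = "".join(
    chunk["choices"][0]["delta"].get("content", "") for chunk in parse_sse_lines(raw)
)
print(text)  # Part one part two.
```

In practice you would feed this function the lines of the HTTP response body as they arrive, rather than a prebuilt list.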
## Handling Citations in Chat Completions

For research agents that provide citations (such as the SEC and earnings-transcript agents), the final chunk in the stream includes a `citations` array listing the sources used to generate the analysis:
### Python

```python
from openai import OpenAI

client = OpenAI(
    api_key="your-octagon-api-key",
    base_url="https://api.octagonagents.com/v1"
)

response = client.chat.completions.create(
    model="octagon-sec-agent",
    messages=[
        {"role": "user", "content": "What was Apple's revenue growth rate in Q3 2023 compared to Q3 2022?"}
    ],
    stream=True
)

full_analysis = ""
citations = []

# Process each chunk of the streaming response
for chunk in response:
    if not chunk.choices:
        continue
    choice = chunk.choices[0]
    if choice.delta.content:
        content = choice.delta.content
        full_analysis += content
        print(content, end="")

    # The final chunk may also carry citation metadata
    chunk_citations = getattr(choice, "citations", None)
    if chunk_citations:
        citations = chunk_citations

print("\n\nSOURCES:")
for citation in citations:
    print(f"{citation.order}. {citation.name}: {citation.url}")
```
### JavaScript

```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your-octagon-api-key',
  baseURL: 'https://api.octagonagents.com/v1',
});

async function getAnalysisWithCitations() {
  const stream = await openai.chat.completions.create({
    model: 'octagon-sec-agent',
    messages: [
      { role: 'user', content: "What was Apple's revenue growth rate in Q3 2023 compared to Q3 2022?" }
    ],
    stream: true
  });

  let fullAnalysis = '';
  let citations = [];

  for await (const chunk of stream) {
    // Accumulate content
    const content = chunk.choices[0]?.delta?.content || '';
    if (content) {
      fullAnalysis += content;
      process.stdout.write(content);
    }

    // Check for citations in the final chunk
    if (chunk.choices[0]?.finish_reason === 'stop' && chunk.choices[0]?.citations) {
      citations = chunk.choices[0].citations;
    }
  }

  console.log('\n\nSOURCES:');
  citations.forEach(citation => {
    console.log(`${citation.order}. ${citation.name}: ${citation.url}`);
  });
}

getAnalysisWithCitations();
```
### cURL

```bash
# With cURL you'll need to parse the stream manually; this example shows
# only the request. Parsing the chunks requires additional tooling.
curl -X POST https://api.octagonagents.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-octagon-api-key" \
  -d '{
    "model": "octagon-sec-agent",
    "messages": [
      {
        "role": "user",
        "content": "What was Apple'\''s revenue growth rate in Q3 2023 compared to Q3 2022?"
      }
    ],
    "stream": true
  }' \
  --no-buffer

# The response is a stream of chunks:
# - extract the content from each chunk's "delta" field
# - check for citations in the final chunk (finish_reason: "stop")
```
## Best Practices

- Provide Clear Queries: Be specific in your messages to get more accurate and relevant responses.
- Use Conversation Context: Include previous messages in the `messages` array to maintain context in multi-turn interactions.
- Handle Streaming Efficiently: Update your UI incrementally as new content arrives rather than waiting for the full response.
- Display Citations: When available, display citations to show where the information was sourced.
- Implement Error Handling: Build in robust handling for network disruptions during streaming.
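The error-handling advice can be sketched as a retry wrapper around the streaming call. This is an illustrative pattern, not part of the Octagon API: the retry count, backoff, and exception types are assumptions, and because a retry restarts the stream from the beginning, any partial output already shown to the user should be discarded first.

```python
import time

def consume_stream_with_retry(create_stream, max_retries=3, backoff=1.0):
    """Start the stream via create_stream() and collect its chunks,
    restarting from scratch on transient connection failures.

    create_stream is a zero-argument callable that begins a new streaming
    request, e.g. lambda: client.chat.completions.create(..., stream=True).
    The exception types caught here are illustrative; match them to the
    errors your HTTP client actually raises.
    """
    for attempt in range(max_retries):
        try:
            # Consuming inside the try means mid-stream failures retry too
            return [chunk for chunk in create_stream()]
        except (ConnectionError, TimeoutError):
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(backoff * (attempt + 1))  # linear backoff, then retry
```

Collecting chunks into a list trades incremental display for simplicity; a UI-driven variant would instead re-render from an accumulator that is reset on each retry.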
## Next Steps

- Explore the Responses API for more advanced features
- View available agent models for detailed capabilities
- Learn about authentication for secure access