# Export to CSV

Capturing odds-tick history is a common need — backtesting, model training, retrospective analysis. This recipe writes one CSV row per `odds:changed` event.
## Node — append to a file

```ts
import { createClient } from 'realtimeodds'
import { createWriteStream, existsSync, statSync } from 'node:fs'

const client = createClient({
  url: 'wss://api.realtimeodds.xyz',
  apiKey: process.env.REALTIMEODDS_API_KEY!
})

// Header — only write if the file is new (or still empty)
const isNew = !existsSync('odds.csv') || statSync('odds.csv').size === 0
const out = createWriteStream('odds.csv', { flags: 'a' })
if (isNew) {
  out.write('receivedAt,bookmaker,sportEventId,marketId,selectionId,price,size\n')
}

client.on('odds:changed', ({ bookmaker, sportEventId, marketId, selectionId, quote, receivedAt }) => {
  out.write(
    [
      receivedAt,
      bookmaker,
      escapeCsv(sportEventId),
      escapeCsv(marketId),
      escapeCsv(selectionId),
      quote.price,
      quote.size ?? ''
    ].join(',') + '\n'
  )
})

function escapeCsv(s: string): string {
  if (/[",\n]/.test(s)) return `"${s.replace(/"/g, '""')}"`
  return s
}

await client.connect()
```

`createWriteStream(..., { flags: 'a' })` appends rather than truncates, so you can resume capture across restarts.
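The `escapeCsv` helper follows RFC 4180 quoting: a field is wrapped in double quotes only when it contains a comma, a quote, or a newline, and embedded quotes are doubled. Repeated here standalone with its behaviour spelled out:

```ts
// Minimal RFC 4180-style field escaper (same as in the recipe above).
function escapeCsv(s: string): string {
  if (/[",\n]/.test(s)) return `"${s.replace(/"/g, '""')}"`
  return s
}

escapeCsv('plain')     // → plain          (no special characters, left as-is)
escapeCsv('a,b')       // → "a,b"          (comma forces quoting)
escapeCsv('say "hi"')  // → "say ""hi"""   (quotes doubled, then wrapped)
```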
## Add denormalised columns

The id columns above are compact but not human-readable. Add the bookmaker, sport, competition, and team names if you’ll be reading the file by hand:

```ts
client.on('odds:changed', payload => {
  const { sportEventId, selectionId, quote, receivedAt, bookmaker } = payload
  const ev = client.getSportEvent(sportEventId)
  if (!ev) return

  const market = ev.getMarket(payload.marketId)
  const selection = market?.getSelection(selectionId)

  out.write(
    [
      new Date(receivedAt).toISOString(),
      bookmaker,
      ev.sport,
      escapeCsv(ev.competition),
      escapeCsv(ev.name),
      market?.kind ?? '',
      selection?.result ?? '',
      quote.price,
      quote.size ?? ''
    ].join(',') + '\n'
  )
})
```

A representative row:
```
2026-05-07T14:30:01.234Z,ps3838,basketball,comp:basketball.nba,Los Angeles Lakers vs Boston Celtics,market:basketball_match.moneyline,home,1.92,1500
```

## Rotate by day

For long-running captures, rotate the file daily so you don’t end up with a single 50 GB file:
```ts
import { createWriteStream, type WriteStream } from 'node:fs'

let currentDate = ''
let stream: WriteStream | null = null

function getStream(): WriteStream {
  const today = new Date().toISOString().slice(0, 10) // e.g. '2026-05-07'
  if (today !== currentDate) {
    stream?.end() // flush and close the previous day's file
    stream = createWriteStream(`odds-${today}.csv`, { flags: 'a' })
    currentDate = today
  }
  return stream!
}

client.on('odds:changed', payload => {
  getStream().write(formatRow(payload))
})
```

Here `formatRow` stands for whichever row formatter you chose above; it isn’t defined in this snippet.
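One thing the rotation above does not do is write a header into each new day’s file. Below is a sketch of a variant that does, assuming the id-based column list from the first recipe; `openDailyStream` and `HEADER` are illustrative names, not part of the SDK:

```ts
// Sketch: open (or reopen) the day's file, writing the CSV header only
// when the file is brand new or empty. HEADER matches the id-based
// columns from the first recipe; adjust it to your row formatter.
import { createWriteStream, existsSync, statSync, type WriteStream } from 'node:fs'

const HEADER = 'receivedAt,bookmaker,sportEventId,marketId,selectionId,price,size\n'

function openDailyStream(date: Date, dir = '.'): WriteStream {
  const day = date.toISOString().slice(0, 10)                 // e.g. '2026-05-07'
  const path = `${dir}/odds-${day}.csv`
  const isNew = !existsSync(path) || statSync(path).size === 0
  const stream = createWriteStream(path, { flags: 'a' })
  if (isNew) stream.write(HEADER)                             // header once per fresh file
  return stream
}
```

Swapping the `createWriteStream(...)` call inside `getStream` for `openDailyStream(new Date())` gives the same rotation with headers.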
## Capture snapshots too

`odds:changed` only fires on changes. To capture the full state at boot, walk the snapshot once after `connect()`:
```ts
await client.connect()

const snapshotStream = createWriteStream(`snapshot-${Date.now()}.csv`)
snapshotStream.write('bookmaker,sport,competition,event,market,result,price,size,timestamp\n')

for (const ev of client.snapshot().sportEvents.values()) {
  for (const market of ev.markets.values()) {
    for (const sel of market.selections.values()) {
      if (!sel.quote) continue
      snapshotStream.write([
        ev.bookmaker,
        ev.sport,
        escapeCsv(ev.competition),
        escapeCsv(ev.name),
        market.kind,
        sel.result,
        sel.quote.price,
        sel.quote.size ?? '',
        sel.quote.timestamp
      ].join(',') + '\n')
    }
  }
}

snapshotStream.end()
```

This gives you a baseline; subsequent `odds:changed` events fill in the deltas.
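Replaying such a capture is then just "last row wins": the snapshot provides the baseline, and each later tick overwrites the price for its selection. A sketch of that replay against the id-based columns from the first recipe (`latestPrices` is illustrative, and it assumes the id columns never contain commas, so naive `split(',')` is safe):

```ts
// Sketch: fold a captured CSV (snapshot baseline + deltas, in order)
// down to the latest price per bookmaker + selection.
// Columns assumed: receivedAt,bookmaker,sportEventId,marketId,selectionId,price,size
function latestPrices(csv: string): Map<string, number> {
  const latest = new Map<string, number>()
  const [, ...lines] = csv.trim().split('\n')                  // drop the header row
  for (const line of lines) {
    const [, bookmaker, , , selectionId, price] = line.split(',')
    latest.set(`${bookmaker}:${selectionId}`, Number(price))   // later rows win
  }
  return latest
}
```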
## Browser — download as CSV

In a browser, you can’t append to a file, but you can buffer rows in memory and trigger a download:

```ts
const rows: string[] = ['receivedAt,bookmaker,selectionId,price\n']

client.on('odds:changed', ({ bookmaker, selectionId, quote, receivedAt }) => {
  rows.push(`${receivedAt},${bookmaker},${selectionId},${quote.price}\n`)
})

document.getElementById('download')!.addEventListener('click', () => {
  const blob = new Blob(rows, { type: 'text/csv' })
  const url = URL.createObjectURL(blob)
  const a = document.createElement('a')
  a.href = url
  a.download = 'odds.csv'
  a.click()
  URL.revokeObjectURL(url)
})
```
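Note that `rows` grows for as long as the tab stays open. If a capture might run for hours, one option is a bounded buffer that keeps the header plus only the newest rows; `makeBuffer` and the cap below are illustrative, not part of the SDK:

```ts
// Sketch: a capped row buffer. When full, the oldest data rows are
// dropped; the header at index 0 is always kept.
function makeBuffer(header: string, maxRows: number) {
  const rows: string[] = [header]
  return {
    rows,
    push(line: string) {
      rows.push(line)
      if (rows.length > maxRows + 1) {
        rows.splice(1, rows.length - 1 - maxRows)   // trim oldest, keep header
      }
    }
  }
}
```

Create it with `makeBuffer('receivedAt,bookmaker,selectionId,price\n', 100_000)`, call `push(...)` in the handler, and pass `buf.rows` to `new Blob(...)` on download.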
## Backpressure

`fs.WriteStream.write` returns `false` when its internal buffer is full. For high-volume captures (hundreds of ticks per second sustained), respect the signal and resume on `'drain'`:
```ts
import { once } from 'node:events'
import type { WriteStream } from 'node:fs'

async function safeWrite(stream: WriteStream, line: string) {
  if (!stream.write(line)) {
    await once(stream, 'drain') // wait until the buffer empties before writing more
  }
}
```

For most alpha workloads (tens of ticks per second), the default buffer is plenty.
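Because the `odds:changed` handler is synchronous, independent `safeWrite` calls would still pile writes into a full buffer while an earlier call is awaiting `'drain'`. Chaining them through a single promise tail makes each write wait for the previous one; `makeQueue` is an illustrative helper, not an SDK API:

```ts
// Sketch: serialise async writes issued from a synchronous event handler.
// Each queued write starts only after the previous one has resolved,
// so 'drain' pauses are actually respected.
type AsyncWriter = (line: string) => Promise<void>

function makeQueue(write: AsyncWriter) {
  let tail: Promise<void> = Promise.resolve()
  return function enqueue(line: string): Promise<void> {
    tail = tail.then(() => write(line))
    return tail
  }
}
```

Wire it up as `const enqueue = makeQueue(line => safeWrite(out, line))` and call `enqueue(formatRow(payload))` from the handler.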