Next.js

Where to integrate

  • App Router:
      • Route Handlers under app/api/*/route.ts
      • Server Actions in Server Components
  • Avoid client components for provider calls; keep API keys server‑side.
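The Server Action path follows the same shape as the Route Handler below: wrap the provider call in the tracking helper and keep keys in server-side env vars. A minimal sketch of that pattern — the `track` helper and the provider call are stubbed inline here so the shape is clear without the real SDK; a real action file would start with 'use server' and import OpenAI and RunForge exactly as in the Route Handler example:

```typescript
// Sketch only: stand-ins for the RunForge SDK and the provider call.
type ChatResult = { choices: Array<{ message: { content: string } }> }

// Stub for rf.track: runs the wrapped call; the real SDK would report
// usage metadata (never prompts or outputs) to RunForge here.
async function track<T>(meta: { model: string }, fn: () => Promise<T>): Promise<T> {
  const out = await fn()
  // real SDK: emit { model: meta.model, usage: out.usage } here
  return out
}

// The Server Action itself (in a real app: a 'use server' file, and the
// stubbed call below becomes openai.chat.completions.create(...)).
export async function generate(prompt: string): Promise<string> {
  const out = await track<ChatResult>({ model: 'gpt-4o-mini' }, async () => ({
    choices: [{ message: { content: `echo: ${prompt}` } }], // stubbed provider call
  }))
  return out.choices[0].message.content
}
```

The action runs only on the server, so provider and RunForge keys never reach the client bundle.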

Example (Route Handler)

// app/api/llm/route.ts
import { NextResponse } from 'next/server'
import OpenAI from 'openai'
import { RunForge } from '../../sdk-ts/index'

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY! })
const rf = new RunForge({
  apiKey: process.env.RUNFORGE_API_KEY!,
  endpoint: process.env.RUNFORGE_ENDPOINT,
  projectId: process.env.RUNFORGE_PROJECT_ID,
})

export async function POST(req: Request) {
  const body = await req.json()
  const model = body.model || 'gpt-4o-mini'
  // rf.track wraps the provider call and reports usage metadata to RunForge.
  const out = await rf.track({ model }, () =>
    openai.chat.completions.create({ model, messages: body.messages })
  )
  return NextResponse.json({ ok: true, choices: out.choices })
}

Notes

  • Privacy: the SDK sends only usage metadata to RunForge, never prompts or outputs.
  • Streaming: for OpenAI, set stream_options: { include_usage: true } so the final stream chunk includes usage.
  • Retries: provide a stable runId so retried requests deduplicate to a single usage event.
  • Troubleshooting: ensure RUNFORGE_API_KEY and RUNFORGE_PROJECT_ID are set; keep RUNFORGE_ENDPOINT server‑only.
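On the streaming note: with OpenAI, when the request sets stream_options: { include_usage: true }, usage arrives only on the final chunk; earlier chunks carry usage: null. A sketch of collecting both text and usage from such a stream (chunk shape abbreviated to the fields used):

```typescript
// Abbreviated shape of an OpenAI streaming chunk.
type StreamChunk = {
  choices: Array<{ delta: { content?: string } }>
  usage: { prompt_tokens: number; completion_tokens: number; total_tokens: number } | null
}

// Accumulate streamed text; keep the usage object from the final chunk.
export async function collect(stream: AsyncIterable<StreamChunk>) {
  let text = ''
  let usage: StreamChunk['usage'] = null
  for await (const chunk of stream) {
    for (const c of chunk.choices) text += c.delta.content ?? ''
    if (chunk.usage) usage = chunk.usage // only the last chunk carries usage
  }
  return { text, usage }
}
```

The collected usage object is what the SDK would report to RunForge once the stream completes.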
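On the retries note: how the stable runId is derived is up to the caller. One option (an assumption, not something the SDK prescribes) is to hash the request payload, so a retry of the same request yields the same id:

```typescript
import { createHash } from 'node:crypto'

// Derive a deterministic runId from the request payload; retried requests
// produce the same id, so RunForge can deduplicate their usage events.
export function runIdFor(payload: unknown): string {
  return createHash('sha256').update(JSON.stringify(payload)).digest('hex').slice(0, 32)
}
```

Note that JSON.stringify is not key-order stable across differently constructed objects, so this works best when the payload is built the same way on each retry.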