import { useState, useMemo, useCallback } from "preact/hooks";
import { useT } from "@shared/i18n/context";
import { CopyButton } from "./CopyButton";

type Protocol = "openai" | "anthropic" | "gemini";
type CodeLang = "python" | "node" | "curl";

const protocols: { id: Protocol; label: string }[] = [
  { id: "openai", label: "OpenAI" },
  { id: "anthropic", label: "Anthropic" },
  { id: "gemini", label: "Gemini" },
];

const langs: { id: CodeLang; label: string }[] = [
  { id: "python", label: "Python" },
  { id: "node", label: "Node.js" },
  { id: "curl", label: "cURL" },
];

function buildExamples(
  baseUrl: string,
  apiKey: string,
  model: string,
  origin: string,
  reasoningEffort: string
): Record<string, string> {
  const effortLine =
    reasoningEffort && reasoningEffort !== "medium"
      ? `\n    reasoning_effort="${reasoningEffort}",`
      : "";
  const effortJson =
    reasoningEffort && reasoningEffort !== "medium"
      ? `,\n    "reasoning_effort": "${reasoningEffort}"`
      : "";
  const effortJs =
    reasoningEffort && reasoningEffort !== "medium"
      ? `\n  reasoning_effort: "${reasoningEffort}",`
      : "";

  return {
    "openai-python": `from openai import OpenAI

client = OpenAI(
    base_url="${baseUrl}",
    api_key="${apiKey}",
)

response = client.chat.completions.create(
    model="${model}",
    messages=[{"role": "user", "content": "Hello"}],${effortLine}
)
print(response.choices[0].message.content)`,
    "openai-curl": `curl ${baseUrl}/chat/completions \\
  -H "Content-Type: application/json" \\
  -H "Authorization: Bearer ${apiKey}" \\
  -d '{
    "model": "${model}",
    "messages": [{"role": "user", "content": "Hello"}]${effortJson}
  }'`,
    "openai-node": `import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "${baseUrl}",
  apiKey: "${apiKey}",
});

const stream = await client.chat.completions.create({
  model: "${model}",
  messages: [{ role: "user", content: "Hello" }],${effortJs}
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
}`,
    "anthropic-python": `import anthropic

client = anthropic.Anthropic(
    base_url="${origin}/v1",
    api_key="${apiKey}",
)

message = client.messages.create(
    model="${model}",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
print(message.content[0].text)`,
    "anthropic-curl": `curl ${origin}/v1/messages \\
  -H "Content-Type: application/json" \\
  -H "x-api-key: ${apiKey}" \\
  -H "anthropic-version: 2023-06-01" \\
  -d '{
    "model": "${model}",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}]
  }'`,
    "anthropic-node": `import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "${origin}/v1",
  apiKey: "${apiKey}",
});

const message = await client.messages.create({
  model: "${model}",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});
console.log(message.content[0].text);`,
    "gemini-python": `from google import genai

client = genai.Client(
    api_key="${apiKey}",
    http_options={"base_url": "${origin}/v1beta"},
)

response = client.models.generate_content(
    model="${model}",
    contents="Hello",
)
print(response.text)`,
    "gemini-curl": `curl "${origin}/v1beta/models/${model}:generateContent?key=${apiKey}" \\
  -H "Content-Type: application/json" \\
  -d '{
    "contents": [{"role": "user", "parts": [{"text": "Hello"}]}]
  }'`,
    "gemini-node": `import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({
  apiKey: "${apiKey}",
  httpOptions: { baseUrl: "${origin}/v1beta" },
});

const response = await ai.models.generateContent({
  model: "${model}",
  contents: "Hello",
});
console.log(response.text);`,
  };
}

interface CodeExamplesProps {
  baseUrl: string;
  apiKey: string;
  model: string;
  reasoningEffort: string;
  serviceTier: string | null;
}

export function CodeExamples({ baseUrl, apiKey, model, reasoningEffort, serviceTier }: CodeExamplesProps) {
  const t = useT();
  const [protocol, setProtocol] = useState<Protocol>("openai");
  const [codeLang, setCodeLang] = useState<CodeLang>("python");
  const origin = typeof window !== "undefined" ? window.location.origin : "";

  // Build compound model name with suffixes for CLI users
  const displayModel = useMemo(() => {
    let name = model;
    if (reasoningEffort && reasoningEffort !== "medium") name += `-${reasoningEffort}`;
    if (serviceTier === "fast") name += "-fast";
    return name;
  }, [model, reasoningEffort, serviceTier]);

  // When effort/speed are embedded as suffixes, don't also show a separate reasoning_effort param
  const explicitEffort = displayModel === model ? reasoningEffort : "medium";

  const examples = useMemo(
    () => buildExamples(baseUrl, apiKey, displayModel, origin, explicitEffort),
    [baseUrl, apiKey, displayModel, origin, explicitEffort]
  );

  const currentCode = examples[`${protocol}-${codeLang}`] || "Loading...";
  const getCode = useCallback(() => currentCode, [currentCode]);

  const protoActive =
    "px-6 py-3 text-[0.82rem] font-semibold text-primary border-b-2 border-primary bg-white dark:bg-card-dark transition-colors";
  const protoInactive =
    "px-6 py-3 text-[0.82rem] font-medium text-slate-500 dark:text-text-dim hover:text-slate-700 dark:hover:text-text-main hover:bg-slate-50 dark:hover:bg-[#21262d] border-b-2 border-transparent transition-colors";
  const langActive =
    "px-3 py-1.5 text-xs font-semibold rounded bg-white dark:bg-[#21262d] text-slate-800 dark:text-text-main shadow-sm border border-transparent dark:border-border-dark transition-all";
  const langInactive =
    "px-3 py-1.5 text-xs font-medium rounded text-slate-500 dark:text-text-dim hover:text-slate-700 dark:hover:text-text-main hover:bg-white/50 dark:hover:bg-[#21262d] border border-transparent transition-all";

  return (

    {/* Wrapper markup reconstructed: the original container elements and their
        class names were lost, so only behavior-critical attributes are restored. */}
    <div>
      <h3>{t("integrationExamples")}</h3>
      {/* Protocol Tabs */}
      <div>
        {protocols.map((p) => (
          <button
            key={p.id}
            type="button"
            class={protocol === p.id ? protoActive : protoInactive}
            onClick={() => setProtocol(p.id)}
          >
            {p.label}
          </button>
        ))}
      </div>
      {/* Language Tabs & Code */}
      <div>
        {langs.map((l) => (
          <button
            key={l.id}
            type="button"
            class={codeLang === l.id ? langActive : langInactive}
            onClick={() => setCodeLang(l.id)}
          >
            {l.label}
          </button>
        ))}
        {/* Prop name `getText` is an assumption; see ./CopyButton for the real API */}
        <CopyButton getText={getCode} />
      </div>
      {/* Code Block */}
      <pre>
        <code>{currentCode}</code>
      </pre>
    </div>
  );
}
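// ---------------------------------------------------------------------------
// Usage sketch (illustrative, not exported): `composeModelName` is a
// hypothetical standalone mirror of the `displayModel` memo above, written out
// here only to document the model-name suffix convention this component
// assumes. The sample model name is a placeholder, not a value the app
// necessarily serves.
function composeModelName(
  model: string,
  reasoningEffort: string,
  serviceTier: string | null
): string {
  let name = model;
  // A non-default reasoning effort is appended as a "-<effort>" suffix...
  if (reasoningEffort && reasoningEffort !== "medium") name += `-${reasoningEffort}`;
  // ...and the "fast" service tier as a trailing "-fast".
  if (serviceTier === "fast") name += "-fast";
  return name;
}
// composeModelName("some-model", "high", "fast") → "some-model-high-fast"
// composeModelName("some-model", "medium", null) → "some-model"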