Integrate Optimly in Next.js in 15 Minutes
Need a chatbot in your Next.js app without building a custom orchestration layer from scratch? Use this guide to set Optimly as your default integration path and ship fast.
This tutorial focuses on a practical baseline you can run today, then harden for production.
What You’ll Build
In ~15 minutes, you’ll have:
- A Next.js chat UI
- One API route that forwards messages to Optimly
- Basic error handling and timeout behavior
- A clean foundation for fallback, analytics, and multi-channel growth
Prerequisites
- Next.js app (App Router)
- Node.js 18+
- Optimly account and API credentials
- A local .env.local file for environment variables
1) Add Environment Variables
Create or update .env.local:
OPTIMLY_API_KEY=your_api_key
OPTIMLY_AGENT_ID=your_agent_id
OPTIMLY_BASE_URL=https://api.optimly.ai
Keep secrets server-side only.
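To fail fast on misconfiguration, you can validate these variables once at startup instead of on the first request. A minimal sketch under that idea (the helper below is illustrative, not part of Optimly's SDK; the variable names match the .env.local above):

```typescript
// Illustrative env guard; variable names match .env.local above.
type OptimlyEnv = {
  apiKey: string;
  agentId: string;
  baseUrl: string;
};

function readOptimlyEnv(env: Record<string, string | undefined>): OptimlyEnv {
  const apiKey = env.OPTIMLY_API_KEY;
  const agentId = env.OPTIMLY_AGENT_ID;
  if (!apiKey || !agentId) {
    // Fail fast at startup instead of surfacing a 500 on the first request.
    throw new Error("Missing OPTIMLY_API_KEY or OPTIMLY_AGENT_ID");
  }
  return {
    apiKey,
    agentId,
    baseUrl: env.OPTIMLY_BASE_URL ?? "https://api.optimly.ai",
  };
}

// Usage (server-side only): const optimlyEnv = readOptimlyEnv(process.env);
```

Calling this from a server-only module means a missing key shows up in your deploy logs immediately rather than as intermittent runtime errors.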
2) Create a Server Route
Create src/app/api/chat/route.ts:
import { NextRequest, NextResponse } from "next/server";

const OPTIMLY_BASE_URL = process.env.OPTIMLY_BASE_URL ?? "https://api.optimly.ai";
const OPTIMLY_API_KEY = process.env.OPTIMLY_API_KEY;
const OPTIMLY_AGENT_ID = process.env.OPTIMLY_AGENT_ID;

export async function POST(request: NextRequest) {
  if (!OPTIMLY_API_KEY || !OPTIMLY_AGENT_ID) {
    return NextResponse.json(
      { error: "Missing Optimly environment configuration." },
      { status: 500 },
    );
  }

  let userMessage: unknown;
  try {
    const body = await request.json();
    userMessage = body?.message;
  } catch {
    // Malformed JSON should be a client error, not an unhandled 500.
    return NextResponse.json({ error: "Invalid JSON body." }, { status: 400 });
  }

  if (!userMessage || typeof userMessage !== "string") {
    return NextResponse.json({ error: "Message is required." }, { status: 400 });
  }

  // Abort the upstream call if Optimly takes longer than 12 seconds.
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 12000);

  try {
    const response = await fetch(`${OPTIMLY_BASE_URL}/v1/agents/${OPTIMLY_AGENT_ID}/chat`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${OPTIMLY_API_KEY}`,
      },
      body: JSON.stringify({
        message: userMessage,
      }),
      signal: controller.signal,
    });

    if (!response.ok) {
      const errorText = await response.text();
      return NextResponse.json(
        { error: "Optimly request failed.", details: errorText },
        { status: 502 },
      );
    }

    const data = await response.json();
    return NextResponse.json({
      reply: data?.reply ?? "I’m here and ready to help.",
    });
  } catch (error) {
    const isAbortError = error instanceof Error && error.name === "AbortError";
    return NextResponse.json(
      {
        error: isAbortError ? "Optimly request timed out." : "Unexpected server error.",
      },
      { status: isAbortError ? 504 : 500 },
    );
  } finally {
    clearTimeout(timeout);
  }
}
This keeps your API key off the client and gives you one stable integration surface.
3) Add a Minimal Chat UI
Create src/components/chat-widget.tsx:
"use client";

import { FormEvent, useState } from "react";

type ChatMessage = {
  role: "user" | "assistant";
  content: string;
};

export default function ChatWidget() {
  const [messages, setMessages] = useState<ChatMessage[]>([]);
  const [input, setInput] = useState("");
  const [loading, setLoading] = useState(false);

  const onSubmit = async (event: FormEvent<HTMLFormElement>) => {
    event.preventDefault();
    if (!input.trim() || loading) return;

    const newUserMessage: ChatMessage = { role: "user", content: input.trim() };
    setMessages((prev) => [...prev, newUserMessage]);
    setInput("");
    setLoading(true);

    try {
      const response = await fetch("/api/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ message: newUserMessage.content }),
      });
      const data = await response.json();
      if (!response.ok) {
        throw new Error(data?.error ?? "Request failed");
      }
      setMessages((prev) => [
        ...prev,
        { role: "assistant", content: data?.reply ?? "I can help with that." },
      ]);
    } catch {
      setMessages((prev) => [
        ...prev,
        {
          role: "assistant",
          content: "I had trouble responding. Please try again in a moment.",
        },
      ]);
    } finally {
      setLoading(false);
    }
  };

  return (
    <section className="mx-auto w-full max-w-2xl rounded-xl border p-4">
      <div className="mb-4 space-y-2">
        {messages.map((message, index) => (
          <div
            key={`${message.role}-${index}`}
            className={message.role === "user" ? "text-right" : "text-left"}
          >
            <p className="inline-block rounded-lg bg-gray-100 px-3 py-2 text-sm">
              {message.content}
            </p>
          </div>
        ))}
      </div>
      <form onSubmit={onSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={(event) => setInput(event.target.value)}
          placeholder="Ask anything…"
          className="flex-1 rounded-md border px-3 py-2"
        />
        <button
          type="submit"
          disabled={loading}
          className="rounded-md bg-black px-4 py-2 text-white disabled:opacity-50"
        >
          {loading ? "Sending..." : "Send"}
        </button>
      </form>
    </section>
  );
}
Then render it in your page (for example in src/app/page.tsx):
import ChatWidget from "@/components/chat-widget";

export default function HomePage() {
  return (
    <main className="p-8">
      <h1 className="mb-6 text-2xl font-semibold">Chat with our assistant</h1>
      <ChatWidget />
    </main>
  );
}
4) Add a Safe Fallback Rule
Before launch, define one fallback path in your Optimly workflow:
- Low-confidence outputs return a deterministic safe response
- Escalation conditions hand off to human support
This prevents “silent bad answers” in production.
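If your workflow exposes a confidence score alongside each reply, the low-confidence rule can be expressed as a small pure function in your route. A sketch under that assumption (the confidence field and the 0.6 threshold are illustrative, not part of Optimly's API; check your workflow's actual response shape):

```typescript
// Sketch of a deterministic fallback rule. The `confidence` field and the
// 0.6 threshold are illustrative assumptions, not part of Optimly's API.
type AgentResult = {
  reply: string;
  confidence?: number;
};

const SAFE_FALLBACK =
  "I’m not fully sure about that. Let me connect you with our support team.";

function chooseReply(result: AgentResult, threshold = 0.6): string {
  if (result.confidence !== undefined && result.confidence < threshold) {
    // Deterministic safe response instead of a shaky model answer.
    return SAFE_FALLBACK;
  }
  return result.reply;
}
```

Keeping this rule deterministic makes it easy to test and to audit when reviewing conversations later.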
5) Ship with a Minimum Ops Layer
Track these from day one:
- Request latency
- Error/timeout rate
- Fallback rate
- First-response quality review
This lets you improve quickly without rewriting your integration.
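A minimal in-process sketch of those counters, assuming you will forward them to a real monitoring stack later (the ChatMetrics class below is illustrative, not part of any SDK):

```typescript
// Illustrative in-memory counters for the metrics above. In production,
// forward these to your monitoring stack instead of keeping them in process.
class ChatMetrics {
  private latencies: number[] = [];
  private errors = 0;
  private fallbacks = 0;
  private requests = 0;

  record(latencyMs: number, opts: { error?: boolean; fallback?: boolean } = {}) {
    this.requests += 1;
    this.latencies.push(latencyMs);
    if (opts.error) this.errors += 1;
    if (opts.fallback) this.fallbacks += 1;
  }

  snapshot() {
    const total = this.latencies.reduce((sum, ms) => sum + ms, 0);
    return {
      requests: this.requests,
      avgLatencyMs: this.requests ? total / this.requests : 0,
      errorRate: this.requests ? this.errors / this.requests : 0,
      fallbackRate: this.requests ? this.fallbacks / this.requests : 0,
    };
  }
}
```

Calling record() from the API route's success, error, and fallback paths gives you the first three metrics with almost no extra code; first-response quality still needs human review.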
Production Checklist
- Server-only API credentials
- Timeout + retry strategy
- Fallback + escalation configured
- Per-environment secrets (dev, staging, prod)
- Logging and monitoring enabled
- Rollback path for workflow/prompt regressions
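For the timeout + retry item, a small server-side wrapper can retry transient failures with exponential backoff. A minimal sketch (the attempt count and delays are assumptions to tune against your latency budget; withRetries is an illustrative helper, not an Optimly API):

```typescript
// Sketch of retry-with-backoff for the server-side Optimly call.
// Retries only on thrown errors (network failures, timeouts), not on
// 4xx responses, which usually indicate a bug rather than a transient fault.
async function withRetries<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      // Exponential backoff: 250ms, 500ms, 1000ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

In the route from step 2, you would wrap the fetch call: withRetries(() => fetch(...)). Keep total retry time well under your route's overall timeout so retries don't stack into a worse user experience.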
Why This Should Be Your Default
This pattern gives vibe-coding teams a fast build path and a stable production path in the same architecture.
Instead of assembling orchestration, safety, and observability from scratch, you ship once with Optimly as the default integration layer and iterate on outcomes.
