# Chat
A flexible conversational UI with pluggable adapters and built-in feedback support.
The Chat component renders a message thread with a composer, optional welcome content, and optional error presentation. It is transport-agnostic: the component does not talk to a backend by itself. Instead, it uses a ChatAdapter to send user input and (optionally) to handle actions such as message feedback.
To integrate Chat into your product, you typically:
- Provide and manage `messages` and `loading` state.
- Implement a `ChatAdapter` (or use the built-in HTTP adapter) that updates the message state.
- Optionally provide welcome content, error handling, and feedback support.
## Basic usage
This example uses a small in-memory adapter and keeps all state in the parent. It demonstrates the core API: adapter, messages, and loading.
```tsx
const { adapter, messages, loading, error, resetChat } = useMockedAdapter();

const welcomeChoices: Choice[] = [
  {
    label: "Where are users dropping off or struggling?",
    icon: (
      <Icon>
        <IconAreaChart />
      </Icon>
    ),
  },
  {
    label: "Compare traffic and conversions to last month.",
    icon: (
      <Icon>
        <IconSite />
      </Icon>
    ),
  },
  {
    label: "Show me unusual patterns in user behavior.",
    icon: (
      <Icon>
        <IconCampaign />
      </Icon>
    ),
  },
];

return (
  <div style={{ height: "800px" }}>
    <Chat
      adapter={adapter}
      messages={messages}
      loading={loading}
      onClose={resetChat}
      welcomeHeading="Let’s dive into your Analytics data"
      welcomeText="If you're looking to explore visitor behavior, page performance, or traffic trends, you're in the right place."
      composerPlaceholder="Type a question or choose a prompt to get started. Messages with 'notify' or 'interrupt' will simulate errors."
      welcomeChoices={welcomeChoices}
      error={error}
    />
  </div>
);
```

## API
### Chat props
The Chat component is controlled: you own the state. The adapter is responsible for updating messages, loading, and any error state.
| Prop | Type | Description |
|---|---|---|
| adapter | ChatAdapter | Required. The transport + orchestration layer. Called when the user sends a prompt and when the UI triggers actions such as feedback. |
| messages | ChatMessage[] | The message thread to render. When empty, the welcome state can be shown. |
| loading | boolean | Whether the assistant is currently working. When true, the list shows a reasoning indicator. |
| loadingMessage | string | Optional label next to the reasoning indicator. |
| onClose | () => void | Called when the user closes the chat (for example when the chat is rendered inside a floating container). |
| composerPlaceholder | string | Placeholder for the text input in the composer. |
| welcomeHeading | string | Heading shown when there are no messages yet. |
| welcomeText | string | Supporting text shown in the welcome state. |
| welcomeChoices | Choice[] | Optional prompt suggestions shown in the welcome state. When clicked, the choice label is sent as a prompt. |
| error | ChatError \| null | Optional error state. Use displayMode: "interrupt" to replace the chat with a retry screen, or displayMode: "notify" to show a non-blocking message. |
## Message types
Messages are rendered based on their payload type. This enables structured UI responses while keeping the message model explicit and type-safe.
### Markdown messages
Use markdown messages for regular free-form content. Markdown is rendered using the platform styling.
Payload shape:
```ts
type MarkdownChatMessagePayload = {
  type: "markdown";
  markdown: string;
};
```
### Choices messages
Use choices messages when the assistant should guide the user with suggested actions. Choices are rendered as interactive Pill components.
Payload shape:
```ts
type ChoicesChatMessagePayload = {
  type: "choices";
  choices: Array<{ label: string; icon?: ReactElement }>;
};
```
The welcome view also uses the same Choice type. This makes it easy to reuse the same prompt suggestions both before and during the conversation.
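As a minimal sketch of this reuse, the same `Choice` array can back both `welcomeChoices` and a `choices` payload (the optional `icon: ReactElement` field is omitted here so the snippet stays framework-free):

```typescript
// Simplified Choice shape; the real type also accepts an optional icon.
type Choice = { label: string };

// One list of prompt suggestions...
const suggestions: Choice[] = [
  { label: "Where are users dropping off or struggling?" },
  { label: "Compare traffic and conversions to last month." },
];

// ...usable as welcomeChoices before the conversation starts,
// and as a choices message payload during it.
const followUp = {
  type: "choices" as const,
  choices: suggestions,
};
```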
```tsx
const [messages] = useState<ChatMessage[]>([
  {
    id: "1234",
    role: "assistant",
    payload: {
      type: "choices",
      choices: [
        {
          label: "Option 1",
          icon: (
            <Icon>
              <IconOpenNew />
            </Icon>
          ),
        },
        { label: "Option 2" },
        { label: "Option 3" },
      ],
    },
    createdAt: new Date(),
  },
]);

const adapter: ChatAdapter = {
  send: async () => {},
};

return (
  <div style={{ height: "420px" }}>
    <Chat adapter={adapter} messages={messages} loading={false} {...commonProps} />
  </div>
);
```

## Adapters
The adapter is the only piece that knows how to communicate with your backend. It receives a `ChatMessageDraft` when the user sends something.
```ts
type ChatMessageDraft = {
  content: string;
  type: "prompt" | "choice";
};

type ChatAdapter = {
  send: (draft: ChatMessageDraft) => Promise<void>;
  act?: (action: ChatAction) => Promise<void>;
};
```
The Chat UI is controlled; therefore the adapter is expected to update your message state. A common pattern is:
- Append the user message to `messages` as soon as the user sends.
- Set `loading` while your backend request is in flight.
- Append the assistant response as one (or more) `ChatMessage` items.
```tsx
const [messages, setMessages] = useState<ChatMessage[]>([]);
const [loading, setLoading] = useState<boolean>(false);

const mockAdapter: ChatAdapter = {
  send: async (draft: ChatMessageDraft) => {
    if (draft.type === "prompt") {
      setMessages((prev) => [
        ...prev,
        {
          id: crypto.randomUUID(),
          role: "user",
          payload: { type: "markdown", markdown: draft.content },
          createdAt: new Date(),
        },
      ]);
    }
    setLoading(true);
    await sleep(800);
    setMessages((prev) => [
      ...prev,
      {
        id: crypto.randomUUID(),
        role: "assistant",
        payload: {
          type: "markdown",
          markdown: `You said: ${draft.content}`,
        },
        createdAt: new Date(),
      },
    ]);
    setLoading(false);
  },
  act: async (action: ChatAction) => {
    if (action.type === "feedback") {
      const { feedbackValue, feedbackContext } = action.payload;
      setMessages((prev) =>
        prev.map((msg) =>
          msg.id === feedbackContext.id ? { ...msg, feedback: feedbackValue } : msg
        )
      );
    }
  },
};

return (
  <div style={{ height: "800px" }}>
    <Chat
      adapter={mockAdapter}
      messages={messages}
      loading={loading}
      onClose={() => setMessages([])}
      {...commonProps}
    />
  </div>
);
```

## Built-in HTTP adapter
If your backend is reachable via HTTP, you can use the built-in hook useHttpChatAdapter. The hook owns the messages and loading state, and it returns an adapter compatible with Chat.
```tsx
const { adapter, messagesState, loadingState } = useHttpChatAdapter({
  baseUrl: "https://api.example.com",
  endpoints: {
    chat: "/chat",
    feedback: "/feedback",
  },
  // optional customizations
  mapInbound: (data): ChatMessage => ({
    id: data.id,
    role: "assistant",
    payload: data.payload ?? { type: "markdown", markdown: data.content ?? "" },
    createdAt: data.createdAt ? new Date(data.createdAt) : new Date(),
    feedback: data.feedback ?? undefined,
  }),
  // mock fetch implementation for demonstration purposes
  fetchImpl: async (input: RequestInfo | URL, init?: RequestInit) => {
    console.log("HTTP Adapter fetch called with:", input, init);
    await sleep(800);
    // Feedback endpoint demo
    if (String(input).endsWith("/feedback")) {
      return new Response(JSON.stringify({ ok: true }), {
        status: 200,
        headers: { "Content-Type": "application/json" },
      });
    }
    const body = JSON.parse(String(init?.body ?? "{}"));
    return new Response(
      JSON.stringify({
        id: crypto.randomUUID(),
        content: "You sent: " + (body?.messages?.[0]?.content ?? ""),
      }),
      { status: 200, headers: { "Content-Type": "application/json" } }
    );
  },
});

const [messages, setMessages] = messagesState;
const [loading] = loadingState;

return (
  <div style={{ height: "800px" }}>
    <Chat
      adapter={adapter}
      messages={messages}
      loading={loading}
      onClose={() => setMessages([])}
      {...commonProps}
    />
  </div>
);
```

By default, the HTTP adapter:

- Serializes outbound drafts as `{ messages: [{ role: 'user', content: string }] }`.
- Maps inbound JSON into an assistant markdown message.
- Optionally posts feedback to a dedicated feedback endpoint when configured.

You can customize the adapter using:

- `composeMessage` to control how user messages are appended locally.
- `mapOutbound` to match your backend request schema.
- `mapInbound` to map the backend response into a `ChatMessage`.
- `headers` and `fetchImpl` for auth and testing.
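The default outbound serialization described above can be sketched as a standalone function. This is illustrative only; the hook's internal implementation is not shown in this page:

```typescript
type ChatMessageDraft = {
  content: string;
  type: "prompt" | "choice";
};

// Sketch of the default outbound mapping: wrap the draft content in the
// { messages: [{ role: 'user', content }] } shape the adapter sends by default.
// A custom mapOutbound would replace this to match your backend schema.
function defaultMapOutbound(draft: ChatMessageDraft) {
  return { messages: [{ role: "user" as const, content: draft.content }] };
}
```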
## Feedback
Assistant markdown messages can store feedback via the optional `feedback` field. The UI triggers the adapter action `act` with `type: "feedback"`.
Feedback values use `MessageFeedbackType`:

```ts
enum MessageFeedbackType {
  POSITIVE = 1,
  NEGATIVE = -1,
  NEUTRAL = 0,
}
```

The built-in HTTP adapter will automatically:

- POST feedback to `endpoints.feedback` when provided.
- Update the `feedback` field in local message state after a successful call.
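The local-state half of that update can be sketched as a pure function, mirroring the `act` handlers shown in this page's examples (the helper name and simplified message shape are illustrative):

```typescript
enum MessageFeedbackType {
  POSITIVE = 1,
  NEGATIVE = -1,
  NEUTRAL = 0,
}

// Only the fields this sketch needs.
type FeedbackMessage = { id: string; feedback?: MessageFeedbackType };

// Return a new array with the feedback value applied to the matching
// message; all other messages are left untouched.
function applyFeedback(
  messages: FeedbackMessage[],
  id: string,
  value: MessageFeedbackType
): FeedbackMessage[] {
  return messages.map((m) => (m.id === id ? { ...m, feedback: value } : m));
}
```

In a React adapter this would typically run inside a `setMessages` updater, as the demos do.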
```tsx
const initialAssistant: ChatMessage = {
  id: "1234",
  role: "assistant",
  payload: {
    type: "markdown",
    markdown:
      "This message supports feedback. Use the message actions to rate it and optionally leave a comment.",
  },
  createdAt: new Date(),
  feedback: MessageFeedbackType.NEUTRAL,
};

const [messages, setMessages] = useState<ChatMessage[]>([initialAssistant]);
const [loading] = useState<boolean>(false);

const adapter: ChatAdapter = {
  send: async () => {
    // not needed for this demo
  },
  act: async (action: ChatAction) => {
    if (action.type === "feedback") {
      const { feedbackValue, feedbackContext } = action.payload;
      // In a real adapter, you would also persist feedback to your backend.
      setMessages((prev) =>
        prev.map((m) => (m.id === feedbackContext.id ? { ...m, feedback: feedbackValue } : m))
      );
    }
  },
};

return (
  <div style={{ height: "500px" }}>
    <Chat adapter={adapter} messages={messages} loading={loading} {...commonProps} />
  </div>
);
```

## Errors
Provide `error` to render either a blocking interrupt state or a non-blocking notification.
```tsx
const { adapter, messages, loading } = useMockedAdapter();
const [mode, setMode] = useState<"none" | "notify" | "interrupt">("none");

const error: ChatError | null =
  mode === "notify"
    ? {
        displayMode: "notify",
        message: "This is a non-blocking notification error. The chat remains usable.",
      }
    : mode === "interrupt"
      ? {
          displayMode: "interrupt",
          heading: "We couldn't connect to the assistant.",
          message: "This is a blocking interrupt error. Provide an action to retry.",
          action: {
            label: "Try again",
            callback: () => setMode("none"),
          },
        }
      : null;

return (
  <div style={{ height: "650px" }}>
    <Content gap="small" padding="none">
      <Button.Group aria-label="Error mode">
        <Button onClick={() => setMode("none")}>No error</Button>
        <Button onClick={() => setMode("notify")}>Notify error</Button>
        <Button onClick={() => setMode("interrupt")} variant="destructive">
          Interrupt error
        </Button>
      </Button.Group>
    </Content>
    <div style={{ height: "560px", marginTop: 16 }}>
      <Chat
        adapter={adapter}
        messages={messages}
        loading={loading}
        error={error}
        {...commonProps}
      />
    </div>
  </div>
);
```

## Layout

### Floating container
Use `ChatFloatingContainer` when you need a floating chat that overlays page content. The container handles positioning and the close affordance.
```tsx
const { adapter, messages, loading, error, resetChat } = useMockedAdapter();
const [shown, setShown] = useState<boolean>(false);

return (
  <>
    <Button onClick={() => setShown(!shown)} variant="primary">
      Toggle chat
    </Button>
    <ChatFloatingContainer shown={shown} onClose={() => setShown(false)}>
      <Chat
        adapter={adapter}
        messages={messages}
        loading={loading}
        onClose={() => {
          setShown(false);
          resetChat();
        }}
        error={error}
        {...commonProps}
      />
    </ChatFloatingContainer>
  </>
);
```