Herman Stander
Core team developer and marketing
2025-04-22
RedwoodSDK introduces a powerful feature: React Server Function Streams. This allows developers to stream partial responses from the server to the client, enabling real-time updates and improved user experiences.
Server Function Streams enable the server to send data to the client in chunks as it becomes available. This is particularly useful when dealing with long-running processes or external APIs that support streaming, such as AI models.
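To build intuition for the underlying primitive, the sketch below (plain web-standard APIs, nothing RedwoodSDK-specific) creates a `ReadableStream` that emits text in chunks and a reader that consumes them incrementally, the way a streaming UI would:

```typescript
// Minimal sketch of the streaming primitive: a producer enqueues chunks,
// a consumer reads them as they arrive. Uses only web-standard APIs.
const encoder = new TextEncoder();
const decoder = new TextDecoder();

// Produces a stream that emits three chunks, then closes.
function makeStream(): ReadableStream<Uint8Array> {
  return new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of ["Hello, ", "streaming ", "world!"]) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
}

// Reads the stream chunk by chunk; in a real UI you would update
// component state on each read instead of concatenating into a string.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  let result = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    result += decoder.decode(value, { stream: true });
  }
  return result;
}

readAll(makeStream()).then((text) => console.log(text)); // "Hello, streaming world!"
```

Server Function Streams apply this same pattern across the network boundary: the server function returns the stream, and the client consumes it chunk by chunk.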
Before implementing streaming responses, ensure your `wrangler.json` file includes the AI binding:

```jsonc
{
  // ... other configurations
  "ai": {
    "binding": "AI"
  }
}
```
A typical streaming AI chat implementation in RedwoodSDK follows this structure:

```
src/
  app/
    pages/
      Chat/
        Chat.tsx      # Client-side component
        functions.ts  # Server functions
```
Let's walk through creating a simple chat interface that streams AI-generated responses using RedwoodSDK.
First, define a server function that initiates a streaming response from an AI model. Note how we use the `env.AI` binding to access the Cloudflare AI models:
```typescript
// app/pages/Chat/functions.ts
"use server";

import { env } from "cloudflare:workers";

export async function sendMessage(prompt: string) {
  console.log("Running AI with Prompt:", prompt);

  const response = await env.AI.run("@cf/meta/llama-4-scout-17b-16e-instruct", {
    prompt,
    stream: true,
  });

  return response as unknown as ReadableStream;
}
```
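The stream returned here carries server-sent-event text: each decoded chunk contains lines like `data: {"response":"..."}`, terminated by a `data: [DONE]` sentinel, which the client code below keys off. A small sketch of parsing one such line, assuming that payload shape:

```typescript
// Illustrative only: parses one decoded SSE line of the shape Workers AI
// emits when streaming ("data: <json>" or the "data: [DONE]" sentinel).
// The exact payload fields may vary by model.
function parseAiLine(line: string): string | null {
  if (!line.startsWith("data: ")) return null; // not a data line
  const payload = line.slice("data: ".length);
  if (payload === "[DONE]") return null; // end-of-stream sentinel
  // Each JSON payload carries the next text fragment in `response`.
  return (JSON.parse(payload) as { response: string }).response;
}

console.log(parseAiLine('data: {"response":"Hello"}')); // Hello
console.log(parseAiLine("data: [DONE]")); // null
```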
This function uses Cloudflare's AI service to generate a response based on the provided prompt, returning a `ReadableStream` that emits chunks of data as they are generated.

On the client, use the `consumeEventStream` function to handle incoming data chunks and update the UI accordingly:
```tsx
// app/pages/Chat/Chat.tsx
"use client";

import { sendMessage } from "./functions";
import { useState } from "react";
import { consumeEventStream } from "rwsdk/client";

export function Chat() {
  const [message, setMessage] = useState("");
  const [reply, setReply] = useState("");
  const [isLoading, setIsLoading] = useState(false);

  const onSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    setIsLoading(true);
    setReply("");

    (await sendMessage(message)).pipeTo(
      consumeEventStream({
        onChunk: (event) => {
          setReply((prev) => {
            if (event.data === "[DONE]") {
              setIsLoading(false);
              return prev;
            }
            return prev + JSON.parse(event.data).response;
          });
        },
      }),
    );
  };

  return (
    <div>
      <div>{reply}</div>
      <form onSubmit={onSubmit}>
        <input
          type="text"
          value={message}
          placeholder="Type a message..."
          onChange={(e) => setMessage(e.target.value)}
        />
        <button type="submit" disabled={message.length === 0 || isLoading}>
          {isLoading ? "Sending..." : "Send"}
        </button>
      </form>
    </div>
  );
}
```
In this component, submitting the form calls the `sendMessage` function and pipes the resulting stream to `consumeEventStream`. Each chunk received is parsed and appended to the `reply` state, updating the UI in real time.
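For intuition, a consumer like `consumeEventStream` can be thought of as a `WritableStream` that parses server-sent-event `data:` lines out of the byte stream and invokes a callback per event. The sketch below is illustrative only and assumes a simplified SSE format; rwsdk's actual implementation may differ:

```typescript
// Hedged sketch of an event-stream sink: accumulates decoded bytes,
// splits on newlines, and fires onChunk for every "data: " line.
type ChunkHandler = { onChunk: (event: { data: string }) => void };

function makeEventStreamSink({ onChunk }: ChunkHandler): WritableStream<Uint8Array> {
  const decoder = new TextDecoder();
  let buffer = "";
  return new WritableStream<Uint8Array>({
    write(chunk) {
      buffer += decoder.decode(chunk, { stream: true });
      // SSE payload lines start with "data: "; blank lines separate events.
      let idx: number;
      while ((idx = buffer.indexOf("\n")) !== -1) {
        const line = buffer.slice(0, idx).trimEnd();
        buffer = buffer.slice(idx + 1);
        if (line.startsWith("data: ")) {
          onChunk({ data: line.slice("data: ".length) });
        }
      }
    },
  });
}
```

Piping a server function's stream into such a sink (`stream.pipeTo(makeEventStreamSink({ onChunk }))`) mirrors the `pipeTo(consumeEventStream(...))` call in the component above.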
RedwoodSDK's Server Function Streams provide a powerful foundation for building real-time applications.
The combination of server-side streaming and client-side consumption creates smooth, responsive experiences that can significantly enhance your application's user experience.