Generative UI in React Native
Žiga Patačko Koderman
Posted on April 11, 2024
Let's build a generative UI Weather Chatbot!
This article is inspired by Vercel's Generative UI. It expands on OpenAI's functions/tools by letting the model visualize data for the user.
Let's get straight into it.
Create an Expo Typescript app 📱
npx create-expo-app -t expo-template-blank-typescript demo
cd demo
Install 📦
yarn add react-native-gen-ui zod
This adds the react-native-gen-ui package, which:
- exposes the useChat hook for easy access to OpenAI's completions API,
- supports streaming,
- enables Generative UI via tools, and
- is fully type-safe.
The package was initially developed by us at zerodays.dev and was open-sourced for everyone to use.
We also install zod here to validate tool parameters.
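As a quick illustration, a zod schema both validates the arguments the model sends back and gives us their TypeScript type. The schema below is only an example, not the one we will define later:

import { z } from "zod";

const params = z.object({ location: z.string() });
// parse() throws if the model returns malformed arguments
const args = params.parse({ location: "Ljubljana" });
type Params = z.infer<typeof params>; // { location: string }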
Import dependencies
At the top of App.tsx, add:
import { z } from "zod";
import { OpenAI, isReactElement, useChat } from 'react-native-gen-ui';
// React Native components used in the snippets below
import { View, Text, TextInput, Button, ActivityIndicator } from 'react-native';
Initialize OpenAI
Below the imports, write:
const openAi = new OpenAI({
apiKey: process.env.EXPO_PUBLIC_OPENAI_API_KEY!,
model: 'gpt-4',
});
Then make sure to add EXPO_PUBLIC_OPENAI_API_KEY to your environment or to a .env file. You can obtain an API key on the OpenAI Platform.
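For reference, a minimal .env file in the project root could look like the line below (the value is a placeholder for your own key; Expo exposes variables prefixed with EXPO_PUBLIC_ to the app, and you may need to restart the dev server after changing it):

EXPO_PUBLIC_OPENAI_API_KEY=sk-...your-key-here...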
Utilize the useChat hook
At the top of the App function, let's use the useChat hook:
export default function App() {
const { input, onInputChange, messages, isLoading, handleSubmit } = useChat({
openAi,
initialMessages: [
{ content: "You are a nice little weather chatbot.", role: "system" },
],
});
return <View/>;
}
Here we pass the initial messages into the useChat hook, which then manages the message state, streams content from OpenAI, and updates the UI.
But we are still missing a way to render this - make the function return the snippet below:
return (
<View
style={{
flex: 1,
backgroundColor: "#fff",
alignItems: "center",
justifyContent: "center",
}}
>
{/* Iterate over all messages and render them */}
{messages.map((msg, index) => {
// Message can be either a React component or a string
if (isReactElement(msg)) {
return msg;
}
switch (msg.role) {
// Render user messages in blue
case "user":
return (
<Text style={{ color: "blue" }} key={index}>
{msg.content?.toString()}
</Text>
);
case "assistant":
// Render assistant messages
return <Text key={index}>{msg.content?.toString()}</Text>;
default:
// This includes tool calls, tool results and system messages
// Those are visible to the model, but here we hide them to the user
return null;
}
})}
{/* Text input for chatting with the model */}
<TextInput
style={{
borderColor: "gray",
borderWidth: 1,
textAlign: "center",
width: "100%",
}}
autoFocus={true}
value={input}
onChangeText={onInputChange}
/>
<Button
onPress={() => handleSubmit(input)}
title="Send"
disabled={isLoading}
/>
</View>
);
Then run the app using npx expo start. More details about running an Expo app can be found here.
Voilà - we can have a basic chat with 🤖 now!
Add a tool
Now the real magic begins 🪄. A tool is defined by:
- its name,
- a description (a simple string for the model to understand what this tool does),
- parameters (a type-safe zod schema of what this tool accepts as arguments), and
- a render function that controls what both the user and the model see when this tool is called.
Let's add a "get weather" tool to tell us what the weather is like at a certain location:
const { ... } = useChat({
...
tools: {
getWeather: {
description: "Get weather for a location",
// Parameters for the tool
parameters: z.object({
location: z.string(),
}),
// Render component for weather - can yield loading state
render: async function* (args) {
// Fetch the weather data
const data = await getWeatherData(args.location);
// Return the final result
return {
// The data will be seen by the model
data,
// The component will be rendered to the user
component: (
<View
key={args.location}
style={{
padding: 20,
borderRadius: 40,
backgroundColor: "rgba(20, 20, 20, 0.05)",
}}
>
<Text style={{ fontSize: 20 }}>
{data.weather === "sunny" ? "☀️" : "🌧️"}
</Text>
</View>
),
};
},
},
},
});
This allows the model to call the getWeather function. The model receives the data and can comment upon it, while the user is presented with an emoji representation of the weather (either ☀️ or 🌧️).
The getWeatherData function is still missing, so let's add it:
async function getWeatherData(location: string) {
// Wait 3 seconds to simulate fetching the data from an API
await new Promise((resolve) => setTimeout(resolve, 3000));
// Randomly return either `sunny` or `rainy`
return {
weather: Math.random() > 0.5 ? "sunny" : "rainy",
};
}
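This stub just simulates a slow network call. In a real app you would replace it with an actual weather API; here is a rough sketch assuming Open-Meteo's public geocoding and forecast endpoints (the response fields used below follow Open-Meteo's documented format, but verify them against whichever API you choose):

async function getWeatherData(location: string) {
  // Resolve the location name to coordinates (Open-Meteo geocoding API)
  const geoRes = await fetch(
    `https://geocoding-api.open-meteo.com/v1/search?name=${encodeURIComponent(location)}&count=1`,
  );
  const geo = await geoRes.json();
  const place = geo.results?.[0];
  if (!place) {
    // Fall back to a neutral answer when the location is not found
    return { weather: "unknown" };
  }
  // Fetch the current weather for those coordinates
  const weatherRes = await fetch(
    `https://api.open-meteo.com/v1/forecast?latitude=${place.latitude}&longitude=${place.longitude}&current_weather=true`,
  );
  const forecast = await weatherRes.json();
  // WMO weather codes 0-3 are clear/partly cloudy; treat everything else as rainy
  const code = forecast.current_weather?.weathercode ?? 0;
  return { weather: code <= 3 ? "sunny" : "rainy" };
}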
The result is a chat interface that goes beyond just words and can display custom components (even though, in our example, those are just emojis).
Show loading states
The react-native-gen-ui package allows us to yield zero or more components before returning the actual data and the final component. This can be used to display progress to the user until the data is fetched.
Add the following code at the beginning of render:
render: async function* (args) {
// Yield a loading indicator
yield <ActivityIndicator key="activity-indicator" />;
// Fetch the weather data
...
The render function is in fact a generator - this is denoted by the * after the function keyword. The interface always displays the last component that was either yielded or returned, allowing us to swap what the user sees over time.
This is especially useful if the render function takes a long time and has multiple steps: the user can stay informed of what is happening in the background. For example, one could pre-render partial information before the entire data pipeline completes, as in the sketch below.
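Here is a rough sketch of such a multi-step render function; resolveCoordinates and fetchForecast are hypothetical helpers standing in for your own data-fetching steps:

render: async function* (args) {
  // Step 1: tell the user we are locating the place
  yield <Text key="step-1">Looking up {args.location}...</Text>;
  const coordinates = await resolveCoordinates(args.location); // hypothetical helper

  // Step 2: swap the status while the forecast loads
  yield <Text key="step-2">Fetching the forecast...</Text>;
  const data = await fetchForecast(coordinates); // hypothetical helper

  // The final result replaces whatever was yielded last
  return {
    data,
    component: (
      <Text key="result">{data.weather === "sunny" ? "☀️" : "🌧️"}</Text>
    ),
  };
},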
References
- 📦 react-native-gen-ui
- A minimal example like the one in this post.
- A complete weather bot example from the GIF at the top.
This blog post and the react-native-gen-ui package were made by the awesome team at zerodays.dev.