
How does Nuxt 4 integrate Vercel AI SDK for streaming AI chat interfaces?

December 4, 2025


Nuxt 4 pairs the Vercel AI SDK's useChat composable with a Nitro server route for real-time streaming responses. The server route processes incoming messages with streamText and returns the result as a streaming HTTP response, so clients receive tokens as they are generated, with full TypeScript support and conversation history managed on the client.

Step-by-Step Implementation:

Step 1: Install Dependencies

Code

npm install ai @ai-sdk/openai @ai-sdk/vue
      

Step 2: Configure Runtime API Key

Code

// nuxt.config.ts
export default defineNuxtConfig({
  runtimeConfig: {
    // Keys outside `public` are server-only and never shipped to the client
    aiApiKey: process.env.OPENAI_API_KEY
  }
})
      
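The key above is read from the environment when Nuxt starts. A minimal `.env` sketch for local development (the variable name follows the config above; Nuxt can also override the key at runtime through a `NUXT_AI_API_KEY` environment variable, since runtime config keys map to `NUXT_`-prefixed variables):

```shell
# .env (local development only — never commit real keys)
OPENAI_API_KEY=sk-...
```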

Step 3: Create Server Action

Code

// server/api/chat.post.ts
import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)
  const config = useRuntimeConfig()

  // Pass the runtime key explicitly instead of relying on process.env
  const openai = createOpenAI({ apiKey: config.aiApiKey })

  // streamText (not generateText) streams tokens as they are produced
  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages
  })

  // AI SDK v4 helper; v5 renames this to toUIMessageStreamResponse()
  return result.toDataStreamResponse()
})
      
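On the wire, useChat's default transport is the AI SDK data stream protocol, where each streamed text part arrives as a line prefixed with `0:` followed by a JSON-encoded string. As a rough illustration only (useChat does this decoding for you; the helper below is hypothetical and assumes the v4 protocol framing):

```typescript
// Sketch: pull plain text out of AI SDK data-stream chunks.
// Assumes the v4 data stream protocol, where text parts look like: 0:"Hello"
function extractText(chunk: string): string {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('0:'))            // keep only text parts
    .map((line) => JSON.parse(line.slice(2)) as string) // decode the JSON payload
    .join('')
}

// Example: two streamed text parts followed by a finish event
const sample = '0:"Hel"\n0:"lo"\nd:{"finishReason":"stop"}\n'
console.log(extractText(sample)) // "Hello"
```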

Step 4: Build Chat Component

Code

<!-- components/ChatBox.vue -->
<template>
  <div class="chat">
    <div v-for="message in messages" :key="message.id" class="message">
      {{ message.role }}: {{ message.content }}
    </div>
    <form @submit.prevent="handleSubmit" class="input-form">
      <input v-model="input" placeholder="Type message..." />
      <button type="submit" :disabled="isLoading">Send</button>
    </form>
  </div>
</template>

<script setup lang="ts">
import { useChat } from '@ai-sdk/vue'

const { messages, input, handleSubmit, isLoading } = useChat({
  api: '/api/chat',
  initialMessages: [
    { id: 'welcome', role: 'assistant', content: 'Hello! How can I help?' }
  ]
})
</script>

Step 5: Use in Page

Code

<!-- pages/chat.vue -->
<template>
  <div class="chat-page">
    <ChatBox />
  </div>
</template>
      