
Checking the operation of the Next.js OpenAI Starter

Next.js

Published: 2024-02-01

In the previous article, I ran the sample code as is. This time, I would like to look at the code included in the sample and confirm how it works.

Checking the sample code

Last time I simply started the app and confirmed that it works, but the sample actually consists of several pages. Everything listed below except api is an individual page.

vercelaisample01.png

It would take too long to introduce all of them, so I will pick up a few and review them so that you can understand the basic operation.

useChat

The code running on the home page is very simple.

TypeScript
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map(m => (
        <div key={m.id} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content}
        </div>
      ))}

      <form onSubmit={handleSubmit}>
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
      </form>
    </div>
  );
}

This is simple code: messages.map renders the area where the chat text is displayed, and a form is provided for input. The following line takes care of the form input and submission.

TypeScript
  const { messages, input, handleInputChange, handleSubmit } = useChat();

useChat sends the input text to the API and returns the result in messages. The following page describes how it works.

In a standard call, /api/chat is used as the API endpoint, which corresponds to app/api/chat/route.ts in the sample code. In this route.ts, the flow is to query ChatGPT and receive the answer. The procedure for connecting this calling part to Azure OpenAI is introduced on the following page.

This procedure is outside the main theme of this article and will be presented in another article.
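
For reference, the chat route handler follows the standard Vercel AI SDK pattern: receive the messages from useChat, call the OpenAI API with streaming enabled, and return the stream to the client. The following is a rough sketch of that pattern, not necessarily the exact contents of the sample's route.ts (the model name and options are assumptions).

TypeScript
// Sketch of an app/api/chat/route.ts-style handler (model and options are assumptions)
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export const runtime = 'edge';

export async function POST(req: Request) {
  // messages is the chat history sent by useChat
  const { messages } = await req.json();

  // Ask OpenAI for a streamed chat completion
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages,
  });

  // Convert the OpenAI stream into a streaming response that useChat can consume
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}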

useCompletion

Next, if you add completion to the URL and access it, you will see the following screen.

vercelaisample02.png

Looking at the code, you can see that, unlike the sample above, this page uses useCompletion.

TypeScript
  const { completion, input, handleInputChange, handleSubmit, error, data } = useCompletion();

With this hook, the answer is generated as a text completion rather than as a chat exchange. More information is available on the following page.

By default, the API provided in app/api/completion/route.ts is called.
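
This route follows the same pattern as the chat route, except that it receives a single prompt rather than a message history. A rough sketch of that shape (the model and options are assumptions, not necessarily the exact sample code):

TypeScript
// Sketch of an app/api/completion/route.ts-style handler (model and options are assumptions)
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export const runtime = 'edge';

export async function POST(req: Request) {
  // useCompletion sends a single prompt string instead of a message history
  const { prompt } = await req.json();

  const response = await openai.completions.create({
    model: 'gpt-3.5-turbo-instruct',
    stream: true,
    prompt,
  });

  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}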

There is another sample that makes use of useCompletion: the spell-check page. Looking at its code, the call is as follows.

TypeScript
  const { complete } = useCompletion({
    api: '/api/spell-check',
  });

You can see that a different endpoint is specified as the api. If you open the page, you will see the following screen.

vercelaisample03.png

To try it out, let's enter the following sentence.

vercelaisample04.png

When you click the Publish button, a dialog box appears asking whether you want to continue because there may be spelling errors.

vercelaisample05.png

So it is not just for chatting; it can also be used for purposes like this.
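
To illustrate, wiring the complete helper into a publish handler could look roughly like the following. This is a sketch with assumed names and prompt handling, not the exact code of the spell-check sample.

TypeScript
// Sketch (assumed names): running a spell check via complete() before publishing
const checkAndPublish = async (text: string) => {
  // complete() sends the text to /api/spell-check and resolves with the model's answer
  const result = await complete(text);
  if (!result) {
    alert('Spell check failed.');
    return;
  }

  // Here the API is assumed to answer with a description of suspected typos
  const confirmed = window.confirm(
    `Possible spelling errors were found:\n\n${result}\n\nPublish anyway?`,
  );
  if (confirmed) {
    alert('Published!');
  }
};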

experimental_StreamData

Another distinctive sample is stream-react-response. In the app/stream-react-response/chat.tsx file, useChat is called with a handler instead of an API path, as follows.

TypeScript
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: handler,
  });

The entire page.tsx file is presented in the following code.

TypeScript
import { handler } from './action';
import { Chat } from './chat';

export default function Page() {
  return <Chat handler={handler} />;
}

export const runtime = 'edge';

The chat.tsx file is rendered as the Chat component, and the handler defined in action.tsx is passed to it as the handler prop.

TypeScript
export async function handler({ messages }: { messages: Message[] }) {
  const data = new experimental_StreamData();

This is where experimental_StreamData is used. It makes it possible to attach custom data to the response streamed back from OpenAI. In this sample, the user asks about the weather and an answer is returned.
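
To illustrate the pattern, the rest of such a handler typically appends custom data to the stream and closes it when the model response finishes. The following is a rough sketch of that flow; the appended payload and other details are assumptions, not the sample's actual code.

TypeScript
// Sketch (assumptions): a handler that attaches custom data to the streamed response
import OpenAI from 'openai';
import {
  OpenAIStream,
  experimental_StreamData,
  experimental_StreamingReactResponse,
  type Message,
} from 'ai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function handler({ messages }: { messages: Message[] }) {
  const data = new experimental_StreamData();
  // Any JSON payload can be attached to this response (example payload is an assumption)
  data.append({ weather: 'sunny' });

  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages,
  });

  const stream = OpenAIStream(response, {
    experimental_streamData: true,
    onFinal() {
      // The data stream must be closed once the model response is complete
      data.close();
    },
  });

  return new experimental_StreamingReactResponse(stream, { data });
}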

I would like to make use of this mechanism to handle custom data in the next article.

Summary

I have only taken a quick look at the Next.js OpenAI Starter sample, but it already gives some hints on how to implement things. I would like to organize this in the next article.
