How to integrate ASI:One models with Next.js App Router

ENGLISH

FROM ASI ALLIANCE LATAM COMMUNITY

How to integrate ASI:One models with Next.js App Router

This article is a step-by-step tutorial on how to create an AI chatbot in Next.js App Router using ASI:One models.

You can find the project's GitHub repository here


Requirements

  • pnpm
  • ASI:One API Key
  • Next.js v15 (App Router)

You can generate your ASI:One API Key here.


1. Create the project

Start by creating the Next.js project (the --yes flag accepts the default answer for every prompt), install the dependencies, and start the dev server:

pnpm create next-app@latest asi-one-x-nextjs --yes

cd asi-one-x-nextjs

pnpm install

pnpm add @ai-sdk/openai @ai-sdk/react ai clsx lucide-react react-markdown remark-gfm tailwind-merge zod

pnpm dev

2. Initial setup

  1. In /public/images/logos/, add the asi-logo.png image (the components below reference it as /images/logos/asi-logo.png).

  2. In /src/lib create the file utils.ts with the following code, which helps apply conditional styles in TailwindCSS:

    import { clsx, type ClassValue } from "clsx"
    import { twMerge } from "tailwind-merge"
    
    export function cn(...inputs: ClassValue[]) {
      return twMerge(clsx(inputs))
    }
    
  3. In /src/components/ create the component spinner.tsx (it is imported later as @/components/spinner):

    export const Spinner = () => {
        return (
            <div role="status">
                <svg aria-hidden="true" className="w-6 h-6 text-gray-200 animate-spin dark:text-gray-600 fill-[#111111]" viewBox="0 0 100 101" fill="none" xmlns="http://www.w3.org/2000/svg">
                    <path d="M100 50.5908C100 78.2051 77.6142 100.591 50 100.591C22.3858 100.591 0 78.2051 0 50.5908C0 22.9766 22.3858 0.59082 50 0.59082C77.6142 0.59082 100 22.9766 100 50.5908ZM9.08144 50.5908C9.08144 73.1895 27.4013 91.5094 50 91.5094C72.5987 91.5094 90.9186 73.1895 90.9186 50.5908C90.9186 27.9921 72.5987 9.67226 50 9.67226C27.4013 9.67226 9.08144 27.9921 9.08144 50.5908Z" fill="currentColor" />
                    <path d="M93.9676 39.0409C96.393 38.4038 97.8624 35.9116 97.0079 33.5539C95.2932 28.8227 92.871 24.3692 89.8167 20.348C85.8452 15.1192 80.8826 10.7238 75.2124 7.41289C69.5422 4.10194 63.2754 1.94025 56.7698 1.05124C51.7666 0.367541 46.6976 0.446843 41.7345 1.27873C39.2613 1.69328 37.813 4.19778 38.4501 6.62326C39.0873 9.04874 41.5694 10.4717 44.0505 10.1071C47.8511 9.54855 51.7191 9.52689 55.5402 10.0491C60.8642 10.7766 65.9928 12.5457 70.6331 15.2552C75.2735 17.9648 79.3347 21.5619 82.5849 25.841C84.9175 28.9121 86.7997 32.2913 88.1811 35.8758C89.083 38.2158 91.5421 39.6781 93.9676 39.0409Z" fill="currentFill" />
                </svg>
                <span className="sr-only">Loading...</span>
            </div>
        )
    }
    

3. Create the ASI:One provider

The official ASI:One documentation confirms full compatibility with OpenAI’s chat completions API.

For this project we’ll use the Vercel AI SDK with the OpenAI provider, creating a custom provider for ASI:One.

👉 More information in the Vercel AI SDK docs.

  1. Create the file /src/lib/ai/asi-provider.ts with the following code:

    import {
      OpenAIProvider,
      createOpenAI,
      openai
    } from '@ai-sdk/openai';
    import { customProvider } from 'ai';
    
    export const ASI_ONE_MODELS = [
      'asi1-mini',
      'asi1-fast',
      'asi1-extended',
      'asi1-agentic',
      'asi1-fast-agentic',
      'asi1-extended-agentic',
      'asi1-graph',
    ] as const;
    
    export type AsiOneModelId = (typeof ASI_ONE_MODELS)[number];
    
    const createAsi = (
      options: {
        apiKey?: string;
        baseURL?: string;
      } = {},
    ): OpenAIProvider => {
      return createOpenAI({
        baseURL: options.baseURL ?? 'https://api.asi1.ai/v1',
        apiKey: options.apiKey ?? process.env.ASI1_API_KEY,
      });
    };
    
    const asi = createAsi();
    
    export const asiProvider = customProvider({
      languageModels: {
        'asi1-mini': asi.chat('asi1-mini'),
        'asi1-fast': asi.chat('asi1-fast'),
        'asi1-extended': asi.chat('asi1-extended'),
        'asi1-agentic': asi.chat('asi1-agentic'),
        'asi1-fast-agentic': asi.chat('asi1-fast-agentic'),
        'asi1-extended-agentic': asi.chat('asi1-extended-agentic'),
        'asi1-graph': asi.chat('asi1-graph'),
      },
    
      fallbackProvider: openai,
    });
    
  2. At the root of the project, create the .env.local file with your API Key:

    ASI1_API_KEY=your_api_key_here
    

    You can get it from the ASI:One dashboard.
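Since the provider reads ASI1_API_KEY when the module loads, a missing key only surfaces when the first chat request fails. If you prefer to fail fast at startup, you could add a small guard near the top of asi-provider.ts. This is a sketch, not part of the official setup:

```typescript
// Fail fast when a required environment variable is missing, instead of
// letting the first chat request error out with an opaque 401 response.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage inside createAsi (sketch):
//   apiKey: options.apiKey ?? requireEnv('ASI1_API_KEY'),
```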


4. Create the chat endpoint

In /src/app/api/chat/route.ts add the following:

import { asiProvider } from '@/lib/ai/asi-provider';
import { streamText, UIMessage, convertToModelMessages } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: asiProvider.languageModel('asi1-mini'),
    messages: convertToModelMessages(messages),
  });

  return result.toUIMessageStreamResponse();
}
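The route above hardcodes asi1-mini. If you later want the client to choose the model, send the id in the request body and validate it before passing it to asiProvider.languageModel. Here is a minimal, dependency-free validation sketch; the fallback to asi1-mini is my assumption, not part of the original tutorial:

```typescript
// Allowed model ids, mirroring ASI_ONE_MODELS from asi-provider.ts
// (duplicated here so the snippet is self-contained).
const ALLOWED_MODELS = new Set([
  'asi1-mini',
  'asi1-fast',
  'asi1-extended',
  'asi1-agentic',
  'asi1-fast-agentic',
  'asi1-extended-agentic',
  'asi1-graph',
]);

// Resolve an untrusted value from the request body to a safe model id,
// falling back to 'asi1-mini' when the client sends nothing usable.
function resolveModel(candidate: unknown): string {
  return typeof candidate === 'string' && ALLOWED_MODELS.has(candidate)
    ? candidate
    : 'asi1-mini';
}

// In the route handler (sketch):
//   const { messages, model } = await req.json();
//   const result = streamText({
//     model: asiProvider.languageModel(resolveModel(model)),
//     messages: convertToModelMessages(messages),
//   });
```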

5. Create the main page

In /src/app/page.tsx replace everything with this code:

'use client';

import { useChat } from '@ai-sdk/react';
import Image from 'next/image';
import { useState } from 'react';
import Markdown from 'react-markdown'
import remarkGfm from 'remark-gfm'
import { User, Send, Square, Trash } from 'lucide-react';
import { cn } from '@/lib/utils';
import { Spinner } from '@/components/spinner';

export default function Chat() {
  const [input, setInput] = useState('');
  const { messages, sendMessage, status, stop, setMessages } = useChat();
  return (
    <div className="flex flex-col justify-between w-full max-w-3xl pt-12 pb-4 px-5 mx-auto stretch min-h-screen">
      <div>
        <h1 className='text-2xl font-semibold mb-5'>ASI1 x NextJS x Vercel AI SDK</h1>

        <div className='w-full max-h-[calc(100vh-200px)] overflow-y-auto'>
          <div className="whitespace-pre-wrap flex w-full justify-start items-start gap-3 mb-5">
            <Image
              src={'/images/logos/asi-logo.png'}
              alt='ASI Logo'
              title='ASI Logo'
              width={225}
              height={255}
              className='rounded-md w-9 h-auto'
            />

            <p>Hello, I am an ASI:One Assistant. How may I assist you today?</p>
          </div>

          <div className='flex flex-col justify-start items-center gap-7 w-full'>
            {messages.map(({ id, parts, role }) => (
              <div key={id} className={cn("whitespace-pre-wrap flex w-full justify-start items-start gap-3", role === 'user' ? 'flex-row-reverse' : '')}>
                {
                  role === 'user' ? (
                    <User />
                  ) : (
                    <Image
                      src={'/images/logos/asi-logo.png'}
                      alt='ASI Logo'
                      title='ASI Logo'
                      width={225}
                      height={255}
                      className='rounded-md w-9 h-auto'
                    />
                  )
                }

                <div>
                  {parts.map((part, i) => {
                    switch (part.type) {
                      case 'text':
                        return <Markdown remarkPlugins={[remarkGfm]} components={{
                          p(props) {
                            return <p className="block" {...props} />
                          },
                          a(props) {
                            return <a target="_blank" rel="noopener noreferrer" className="font-semibold cursor-pointer text-blue-600 underline" {...props} />
                          },
                          ul(props) {
                            return <ul className="flex flex-col justify-center items-start gap-4" {...props} />
                          },
                          ol(props) {
                            return <ol className="flex flex-col justify-center items-start gap-4" {...props} />
                          },
                        }} key={`${id}-${i}`}>{part.text}</Markdown>;
                    }
                  })}
                </div>
              </div>
            ))}
          </div>

          {
            status === 'submitted' ? (
              <div className="whitespace-pre-wrap flex w-full justify-start items-center gap-3 mb-5">
                <Image
                  src={'/images/logos/asi-logo.png'}
                  alt='ASI Logo'
                  title='ASI Logo'
                  width={225}
                  height={255}
                  className='rounded-md w-9 h-auto'
                />

                <div className="flex justify-start items-center gap-3">
                  <Spinner />
                  <p>Loading...</p>
                </div>
              </div>
            ) : ""
          }

        </div>
      </div>

      <form
        onSubmit={e => {
          e.preventDefault();
          if (status === 'streaming' || status === 'submitted') {
            stop();
          } else {
            if (!input.trim()) return;

            sendMessage({ text: input });
            setInput('');
          }
        }}
        className='mt-6 flex gap-3 items-center w-full'
      >
        <input
          className="w-full max-w-3xl p-2 border rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={e => setInput(e.currentTarget.value)}
          disabled={status === 'submitted' || status === 'streaming'}
        />
        <button
          className='bg-[#111111] hover:opacity-80 transition-all cursor-pointer text-white font-semibold px-4 py-2 rounded shadow-lg flex items-center' type='submit'
        >
          {
            status === 'submitted' || status === 'streaming' ? (
              <Square fill='currentColor' />
            ) : (
              <Send />
            )
          }
        </button>
        <button
          disabled={status === 'submitted' || status === 'streaming'}
          onClick={() => {
            setInput('')
            setMessages([])
          }}
          className='bg-[#111111] hover:opacity-80 transition-all cursor-pointer text-white font-semibold px-4 py-2 rounded shadow-lg flex items-center' type='button'
        >
          <Trash />
        </button>
      </form>
    </div>
  );
}
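The form's behavior hinges on useChat's status: while a request is 'submitted' or 'streaming', the input is disabled and the submit button renders the stop icon. That condition appears several times in the JSX above, so you may prefer to factor it into one helper (a sketch, assuming the AI SDK's four status values):

```typescript
// The useChat statuses this page cares about.
type ChatStatus = 'submitted' | 'streaming' | 'ready' | 'error';

// True while a request is in flight: the input should be disabled and the
// submit button should render the stop (Square) icon instead of Send.
function isBusy(status: ChatStatus): boolean {
  return status === 'submitted' || status === 'streaming';
}
```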

6. Conclusion

That’s it! Now, every time you ask a question in the chat, you’ll get a response directly from the ASI:One model you’ve configured.

Thanks for following this tutorial 🚀




Great contribution from @ASI_LatAm (ASI_LatAm Community R&D Guild Lead), amazing work adding value to the ASI stack in both Spanish and English!

Great job @leandrogavidia, Godspeed GOAT!
The Latin Power of ASI Alliance in da house! Go LatAm


Thank you so much, bro.

This is just the beginning


Yes indeed, and much more ahead, LFB @ASI_LatAm!

Excellent contribution!


Thanks so much, Mario!