LlamaIndex
LlamaIndex is a data framework for LLM applications: it helps you ingest, structure, and access your private or domain-specific data. LlamaIndex.TS brings the core features of the Python library to JavaScript and TypeScript.
Supported Environments
- Node.js (officially supported)
- Vercel Edge Functions (experimental)
- Deno (experimental)
Quick Example: Completion
Let's see it in action! Here's how to combine the AI SDK and LlamaIndex.TS in a route handler that streams a completion.
```ts
// api/completion/route.ts
import { OpenAI, SimpleChatEngine } from 'llamaindex';
import { LlamaIndexAdapter } from 'ai';

export const maxDuration = 60;

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // Use a LlamaIndex chat engine backed by an OpenAI model
  const llm = new OpenAI({ model: 'gpt-4o' });
  const chatEngine = new SimpleChatEngine({ llm });

  // Stream the response from LlamaIndex
  const stream = await chatEngine.chat({
    message: prompt,
    stream: true,
  });

  // Convert the LlamaIndex stream into an AI SDK data stream response
  return LlamaIndexAdapter.toDataStreamResponse(stream);
}
```
On the frontend, use the AI SDK's useCompletion hook to send the prompt to this route and render the streamed completion, as shown below.
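Here is a minimal sketch of such a page, assuming a Next.js App Router client component and that useCompletion is imported from @ai-sdk/react (in older AI SDK versions it is exported from ai/react):

```tsx
'use client';

import { useCompletion } from '@ai-sdk/react';

export default function Page() {
  // By default, useCompletion POSTs { prompt } to /api/completion
  // and streams the response into `completion`, matching the route above.
  const { completion, input, handleInputChange, handleSubmit } = useCompletion();

  return (
    <form onSubmit={handleSubmit}>
      <input
        name="prompt"
        value={input}
        onChange={handleInputChange}
        placeholder="Enter a prompt..."
      />
      <button type="submit">Submit</button>
      <div>{completion}</div>
    </form>
  );
}
```

Because the hook's default endpoint is /api/completion, no extra configuration is needed to connect it to the route handler above; pass an `api` option to useCompletion if your route lives elsewhere.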