Router Query Engine
In this tutorial, we define a custom router query engine that selects one out of several candidate query engines to execute a query.
Setup
First, we need to install the llamaindex package and import the necessary modules:
pnpm i llamaindex
import {
OpenAI,
RouterQueryEngine,
SimpleDirectoryReader,
SimpleNodeParser,
SummaryIndex,
VectorStoreIndex,
Settings,
} from "llamaindex";
Loading Data
Next, we need to load some data. We will use the SimpleDirectoryReader to load documents from a directory:
const documents = await new SimpleDirectoryReader().loadData({
directoryPath: "node_modules/llamaindex/examples",
});
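As an optional sanity check (not part of the original tutorial; adjust the directoryPath to wherever your own documents live), you can log how many documents were loaded before building any indices:
console.log(`Loaded ${documents.length} document(s)`);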
Settings
Next, we need to configure the pipeline and parse the documents into nodes. We will use the SimpleNodeParser to split the documents into nodes and the global Settings object to set options such as the LLM and chunk size:
Settings.llm = new OpenAI();
Settings.nodeParser = new SimpleNodeParser({
chunkSize: 1024,
});
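By default, the OpenAI class reads your API key from the OPENAI_API_KEY environment variable. If you want to pin a specific model or pass the key explicitly, the constructor accepts options; a minimal sketch (the model name below is illustrative, not part of the original tutorial):
Settings.llm = new OpenAI({
  model: "gpt-4-turbo", // illustrative model choice
  temperature: 0,
  apiKey: process.env.OPENAI_API_KEY, // explicit key instead of relying on the env default
});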
Creating Indices
Next, we need to create some indices. We will create a VectorStoreIndex and a SummaryIndex:
const vectorIndex = await VectorStoreIndex.fromDocuments(documents);
const summaryIndex = await SummaryIndex.fromDocuments(documents);
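Each index can expose its own query engine via asQueryEngine(); these become the candidate engines that the RouterQueryEngine selects between later in the tutorial. A minimal sketch (the variable names here are ours):
// Query engine over the vector index, suited to specific questions about the documents.
const vectorQueryEngine = vectorIndex.asQueryEngine();
// Query engine over the summary index, suited to summarization-style questions.
const summaryQueryEngine = summaryIndex.asQueryEngine();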