# Customize OAK

## Configuration Options

The `config.ts` file in the root of the repository allows you to customize various aspects of the OAK core. Below are the key configuration options available:
### Name

- Type: `string`
- Description: Set a descriptive name for your configuration.
### Models

- Type: `LanguageModelV1[]` (an array of language models following the Vercel AI SDK Language Model V1 specification)
- Description: Define the array of language models used by the application. You can specify models from different providers or versions, such as `openai("gpt-4o")`, `google("gemini-2.0-flash-001")`, or `anthropic("claude-3-5-sonnet-20240620")`, to suit your needs.
### Embedding

- Type: `object` (optional)
- Description: Configure the embedding model and its parameters to optimize text processing.
  - Model:
    - Type: `EmbeddingModel<string>`
    - Description: Choose the embedding model, such as `openai.embedding("text-embedding-3-small")`, to handle text embeddings. For more details on embeddings, refer to the Vercel AI SDK Embeddings documentation.
  - ChunkSize:
    - Type: `number` (optional, default: 1000)
    - Description: Adjust the size of text chunks for processing, which can impact performance and memory usage. The `chunkSize` is measured in tokens.
  - Overlap:
    - Type: `number` (optional, default: 200)
    - Description: Set the overlap between adjacent text chunks to maintain context across large texts. The `overlap` is measured in tokens.
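To see how `chunkSize` and `overlap` interact, here is a minimal sketch (not OAK's actual chunking code) of how overlapping token windows could be derived from those two parameters:

```typescript
// Illustrative only: compute the [start, end) token ranges that a sliding
// window with the configured chunkSize and overlap would produce.
function chunkRanges(
  totalTokens: number,
  chunkSize = 1000, // default chunkSize from the config above
  overlap = 200     // default overlap from the config above
): [number, number][] {
  const stride = chunkSize - overlap; // tokens advanced between chunk starts
  if (stride <= 0) throw new Error("overlap must be smaller than chunkSize");
  const ranges: [number, number][] = [];
  for (let start = 0; start < totalTokens; start += stride) {
    ranges.push([start, Math.min(start + chunkSize, totalTokens)]);
    if (start + chunkSize >= totalTokens) break; // last chunk reached the end
  }
  return ranges;
}

// A 2,500-token document with the defaults yields three chunks,
// each sharing 200 tokens with its neighbor:
console.log(chunkRanges(2500));
// → [ [0, 1000], [800, 1800], [1600, 2500] ]
```

A larger `overlap` preserves more context across chunk boundaries at the cost of more chunks (and therefore more embedding calls) per document.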
### Customization Tips

- Adding models: Expand the `models` array with additional language models to enhance capabilities.
- Tuning embeddings: Modify the `chunkSize` and `overlap` parameters to better handle specific text processing requirements.
Here's a sample configuration:

```ts
// Providers are imported from their own Vercel AI SDK packages,
// not from the core `ai` package.
import { openai } from '@ai-sdk/openai';
import { google } from '@ai-sdk/google';
import { anthropic } from '@ai-sdk/anthropic';

export default {
  name: "Project Name",
  models: [
    openai("gpt-4o"),
    google("gemini-2.0-flash-001"),
    anthropic("claude-3-5-sonnet-20240620")
  ],
  embedding: {
    model: openai.embedding("text-embedding-3-small"),
    chunkSize: 1000,
    overlap: 200
  }
} satisfies OAKConfig;
```