Class: OllamaEmbedding
OllamaEmbedding is an alias for Ollama that implements the BaseEmbedding interface.
Hierarchy
- Ollama
  ↳ OllamaEmbedding
Implements
- BaseEmbedding
Constructors
constructor
• new OllamaEmbedding(init): OllamaEmbedding
Parameters
Name | Type |
---|---|
init | Partial<Ollama> & { model: string; modelMetadata?: Partial<LLMMetadata> } |
Returns
OllamaEmbedding
Inherited from
Ollama.constructor
packages/core/src/llm/ollama.ts:39
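The init object accepts any subset of Ollama's fields plus a required model. A minimal sketch of that shape, using hypothetical stand-in declarations (LLMMetadata and OllamaFields are simplified here, not the library's actual types):

```typescript
// Simplified stand-ins for the library's LLMMetadata and Ollama field types.
interface LLMMetadata {
  contextWindow: number;
  temperature: number;
}

interface OllamaFields {
  baseURL: string;
  temperature: number;
  topP: number;
  contextWindow: number;
  requestTimeout: number;
}

// Everything inherited from Ollama is optional; only model is required,
// and modelMetadata may itself be a partial override.
type OllamaEmbeddingInit = Partial<OllamaFields> & {
  model: string;
  modelMetadata?: Partial<LLMMetadata>;
};

const init: OllamaEmbeddingInit = {
  model: "nomic-embed-text", // assumed model name; any model pulled into Ollama works
  baseURL: "http://127.0.0.1:11434",
};
```

Because the intersection keeps model non-optional, omitting it is a compile-time error even though every Ollama field is optional.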
Properties
additionalChatOptions
• Optional additionalChatOptions: Record<string, unknown>
Inherited from
Ollama.additionalChatOptions
Defined in
packages/core/src/llm/ollama.ts:35
baseURL
• baseURL: string = "http://127.0.0.1:11434"
Inherited from
Ollama.baseURL
Defined in
packages/core/src/llm/ollama.ts:30
contextWindow
• contextWindow: number = 4096
Inherited from
Ollama.contextWindow
Defined in
packages/core/src/llm/ollama.ts:33
embedBatchSize
• embedBatchSize: number = DEFAULT_EMBED_BATCH_SIZE
Implementation of
BaseEmbedding.embedBatchSize
Inherited from
Ollama.embedBatchSize
Defined in
packages/core/src/embeddings/types.ts:9
hasStreaming
• Readonly hasStreaming: true
Inherited from
Ollama.hasStreaming
Defined in
packages/core/src/llm/ollama.ts:26
model
• model: string
Inherited from
Ollama.model
Defined in
packages/core/src/llm/ollama.ts:29
modelMetadata
• Protected modelMetadata: Partial<LLMMetadata>
Inherited from
Ollama.modelMetadata
Defined in
packages/core/src/llm/ollama.ts:37
requestTimeout
• requestTimeout: number
Inherited from
Ollama.requestTimeout
Defined in
packages/core/src/llm/ollama.ts:34
temperature
• temperature: number = 0.7
Inherited from
Ollama.temperature
Defined in
packages/core/src/llm/ollama.ts:31
topP
• topP: number = 0.9
Inherited from
Ollama.topP
Defined in
packages/core/src/llm/ollama.ts:32
Accessors
metadata
• get metadata(): LLMMetadata
Returns
LLMMetadata
Inherited from
Ollama.metadata
Defined in
packages/core/src/llm/ollama.ts:52
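Given the modelMetadata property above, the accessor presumably merges the instance's default fields with any user-supplied partial overrides. An illustrative sketch of that getter pattern, using a stand-in class (OllamaLike), not the library's actual implementation:

```typescript
// Simplified stand-in for the library's LLMMetadata type.
interface LLMMetadata {
  model: string;
  temperature: number;
  topP: number;
  contextWindow: number;
}

// Illustration only: defaults mirror the documented property values.
class OllamaLike {
  model: string;
  temperature = 0.7;
  topP = 0.9;
  contextWindow = 4096;
  protected modelMetadata: Partial<LLMMetadata>;

  constructor(init: { model: string; modelMetadata?: Partial<LLMMetadata> }) {
    this.model = init.model;
    this.modelMetadata = init.modelMetadata ?? {};
  }

  get metadata(): LLMMetadata {
    // Spread instance defaults first, then explicit overrides last,
    // so modelMetadata wins on any conflicting key.
    return {
      model: this.model,
      temperature: this.temperature,
      topP: this.topP,
      contextWindow: this.contextWindow,
      ...this.modelMetadata,
    };
  }
}
```

With this shape, `new OllamaLike({ model: "llama3", modelMetadata: { contextWindow: 8192 } }).metadata` reports the overridden context window while keeping the default temperature and topP.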
Methods
chat
▸ chat(params): Promise<AsyncIterable<ChatResponseChunk>>
Get a chat response from the LLM.
Parameters
Name | Type |
---|---|
params | LLMChatParamsStreaming |
Returns
Promise<AsyncIterable<ChatResponseChunk>>
Inherited from
Ollama.chat
Defined in
packages/core/src/llm/ollama.ts:64
▸ chat(params): Promise<ChatResponse>
Parameters
Name | Type |
---|---|
params | LLMChatParamsNonStreaming |
Returns
Promise<ChatResponse>
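The two overloads above pick the return type from the params type. A self-contained sketch of this streaming/non-streaming overload pattern, with simplified stand-in types (ChatMessage, ChatResponse, the params interfaces) rather than the library's real definitions:

```typescript
// Simplified stand-ins for the library's chat types.
type ChatMessage = { role: "user" | "assistant"; content: string };
type ChatResponse = { message: ChatMessage };
type ChatResponseChunk = { delta: string };

interface LLMChatParamsNonStreaming { messages: ChatMessage[]; stream?: false }
interface LLMChatParamsStreaming { messages: ChatMessage[]; stream: true }

// Overloads: the streaming params type yields an async iterable of chunks,
// the non-streaming type yields a single complete response.
function chat(params: LLMChatParamsStreaming): Promise<AsyncIterable<ChatResponseChunk>>;
function chat(params: LLMChatParamsNonStreaming): Promise<ChatResponse>;
async function chat(
  params: LLMChatParamsStreaming | LLMChatParamsNonStreaming,
): Promise<AsyncIterable<ChatResponseChunk> | ChatResponse> {
  // Toy "LLM": echo the last user message instead of calling a server.
  const text = "echo: " + params.messages[params.messages.length - 1].content;
  if (params.stream) {
    // Streaming path: yield the reply one word at a time.
    async function* chunks(): AsyncIterable<ChatResponseChunk> {
      for (const word of text.split(" ")) yield { delta: word + " " };
    }
    return chunks();
  }
  return { message: { role: "assistant", content: text } };
}
```

Callers passing `stream: true` get a typed chunk stream to `for await` over, while plain calls resolve to a finished `ChatResponse`; TypeScript enforces the distinction at compile time.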