# Class: Ollama

Unified language model interface

## Hierarchy

- `BaseEmbedding`

  ↳ **`Ollama`**

## Implements

- `LLM`
## Constructors

### constructor

• **new Ollama**(`init`): `Ollama`

#### Parameters

| Name | Type |
| :--- | :--- |
| `init` | `Partial<Ollama> & { model: string; modelMetadata?: Partial<LLMMetadata> }` |

#### Returns

`Ollama`

#### Overrides

BaseEmbedding.constructor

#### Defined in

packages/core/src/llm/ollama.ts:39
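A minimal usage sketch, assuming the class is exported from the `llamaindex` package and an Ollama server is reachable at the default `baseURL`; only `model` is required, and the remaining fields fall back to the defaults listed under Properties below:

```typescript
import { Ollama } from "llamaindex";

// `model` is the only required field; the rest override the defaults
// (baseURL "http://127.0.0.1:11434", temperature 0.7, topP 0.9, ...).
const llm = new Ollama({
  model: "llama2",
  baseURL: "http://127.0.0.1:11434",
  temperature: 0.7,
  topP: 0.9,
  contextWindow: 4096,
});
```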
## Properties

### additionalChatOptions

• `Optional` **additionalChatOptions**: `Record<string, unknown>`

#### Defined in

packages/core/src/llm/ollama.ts:35

### baseURL

• **baseURL**: `string` = `"http://127.0.0.1:11434"`

#### Defined in

packages/core/src/llm/ollama.ts:30

### contextWindow

• **contextWindow**: `number` = `4096`

#### Defined in

packages/core/src/llm/ollama.ts:33

### embedBatchSize

• **embedBatchSize**: `number` = `DEFAULT_EMBED_BATCH_SIZE`

#### Inherited from

BaseEmbedding.embedBatchSize

#### Defined in

packages/core/src/embeddings/types.ts:9

### hasStreaming

• `Readonly` **hasStreaming**: `true`

#### Defined in

packages/core/src/llm/ollama.ts:26

### model

• **model**: `string`

#### Defined in

packages/core/src/llm/ollama.ts:29

### modelMetadata

• `Protected` **modelMetadata**: `Partial<LLMMetadata>`

#### Defined in

packages/core/src/llm/ollama.ts:37

### requestTimeout

• **requestTimeout**: `number`

#### Defined in

packages/core/src/llm/ollama.ts:34

### temperature

• **temperature**: `number` = `0.7`

#### Defined in

packages/core/src/llm/ollama.ts:31

### topP

• **topP**: `number` = `0.9`

#### Defined in

packages/core/src/llm/ollama.ts:32
## Accessors

### metadata

• `get` **metadata**(): `LLMMetadata`

#### Returns

`LLMMetadata`

#### Implementation of

LLM.metadata

#### Defined in

packages/core/src/llm/ollama.ts:52
## Methods

### chat

▸ **chat**(`params`): `Promise<AsyncIterable<ChatResponseChunk>>`

Get a chat response from the LLM

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsStreaming` |

#### Returns

`Promise<AsyncIterable<ChatResponseChunk>>`

#### Implementation of

LLM.chat

#### Defined in

packages/core/src/llm/ollama.ts:64

▸ **chat**(`params`): `Promise<ChatResponse>`

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsNonStreaming` |

#### Returns

`Promise<ChatResponse>`

#### Implementation of

LLM.chat

#### Defined in

packages/core/src/llm/ollama.ts:67
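The two overloads are selected by the `stream` flag on the params object. A hedged sketch (field names such as `message.content` and `delta` follow the response types as this author understands them; a running local Ollama server and the `llamaindex` package are assumed):

```typescript
import { Ollama } from "llamaindex";

const llm = new Ollama({ model: "llama2" });

// Non-streaming overload: resolves to a single ChatResponse.
const response = await llm.chat({
  messages: [{ role: "user", content: "Why is the sky blue?" }],
});
console.log(response.message.content);

// Streaming overload: `stream: true` yields an AsyncIterable of chunks.
const stream = await llm.chat({
  messages: [{ role: "user", content: "Why is the sky blue?" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.delta);
}
```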
### complete

▸ **complete**(`params`): `Promise<AsyncIterable<CompletionResponse>>`

Get a prompt completion from the LLM

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMCompletionParamsStreaming` |

#### Returns

`Promise<AsyncIterable<CompletionResponse>>`

#### Implementation of

LLM.complete

#### Defined in

packages/core/src/llm/ollama.ts:139

▸ **complete**(`params`): `Promise<CompletionResponse>`

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMCompletionParamsNonStreaming` |

#### Returns

`Promise<CompletionResponse>`

#### Implementation of

LLM.complete

#### Defined in

packages/core/src/llm/ollama.ts:142
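As with `chat`, the overload is chosen by the `stream` flag. A sketch under the same assumptions (the `text` field on `CompletionResponse`, a local server, and the `llamaindex` package):

```typescript
import { Ollama } from "llamaindex";

const llm = new Ollama({ model: "llama2" });

// Non-streaming: one CompletionResponse for the whole prompt.
const completion = await llm.complete({ prompt: "Roses are red," });
console.log(completion.text);

// Streaming: each iteration carries the next slice of generated text.
const stream = await llm.complete({ prompt: "Roses are red,", stream: true });
for await (const chunk of stream) {
  process.stdout.write(chunk.text);
}
```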
### getEmbedding

▸ **getEmbedding**(`prompt`): `Promise<number[]>`

#### Parameters

| Name | Type |
| :--- | :--- |
| `prompt` | `string` |

#### Returns

`Promise<number[]>`

#### Defined in

packages/core/src/llm/ollama.ts:182
### getQueryEmbedding

▸ **getQueryEmbedding**(`query`): `Promise<number[]>`

#### Parameters

| Name | Type |
| :--- | :--- |
| `query` | `string` |

#### Returns

`Promise<number[]>`

#### Overrides

BaseEmbedding.getQueryEmbedding

#### Defined in

packages/core/src/llm/ollama.ts:209
### getTextEmbedding

▸ **getTextEmbedding**(`text`): `Promise<number[]>`

#### Parameters

| Name | Type |
| :--- | :--- |
| `text` | `string` |

#### Returns

`Promise<number[]>`

#### Overrides

BaseEmbedding.getTextEmbedding

#### Defined in

packages/core/src/llm/ollama.ts:205
### getTextEmbeddings

▸ **getTextEmbeddings**(`texts`): `Promise<number[][]>`

Optionally override this method to retrieve multiple embeddings in a single request

#### Parameters

| Name | Type |
| :--- | :--- |
| `texts` | `string[]` |

#### Returns

`Promise<number[][]>`

#### Inherited from

BaseEmbedding.getTextEmbeddings

#### Defined in

packages/core/src/embeddings/types.ts:26
### getTextEmbeddingsBatch

▸ **getTextEmbeddingsBatch**(`texts`, `options?`): `Promise<number[][]>`

Get embeddings for a batch of texts

#### Parameters

| Name | Type |
| :--- | :--- |
| `texts` | `string[]` |
| `options?` | `Object` |
| `options.logProgress?` | `boolean` |

#### Returns

`Promise<number[][]>`

#### Inherited from

BaseEmbedding.getTextEmbeddingsBatch

#### Defined in

packages/core/src/embeddings/types.ts:42
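Because `Ollama` also extends `BaseEmbedding`, the same instance can produce embeddings directly. A sketch assuming a local server and the `llamaindex` package (the batch is split internally into groups of `embedBatchSize`):

```typescript
import { Ollama } from "llamaindex";

const llm = new Ollama({ model: "llama2" });

// Single text and single query embeddings.
const textVector = await llm.getTextEmbedding("The quick brown fox");
const queryVector = await llm.getQueryEmbedding("fox speed");

// Batch embedding, chunked internally by embedBatchSize;
// logProgress prints progress per batch.
const vectors = await llm.getTextEmbeddingsBatch(
  ["first passage", "second passage", "third passage"],
  { logProgress: true },
);
console.log(vectors.length); // one vector per input text
```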
### similarity

▸ **similarity**(`embedding1`, `embedding2`, `mode?`): `number`

#### Parameters

| Name | Type | Default value |
| :--- | :--- | :--- |
| `embedding1` | `number[]` | `undefined` |
| `embedding2` | `number[]` | `undefined` |
| `mode` | `SimilarityType` | `SimilarityType.DEFAULT` |

#### Returns

`number`

#### Inherited from

BaseEmbedding.similarity

#### Defined in

packages/core/src/embeddings/types.ts:11
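Assuming `SimilarityType.DEFAULT` corresponds to cosine similarity (as in LlamaIndex's Python counterpart), the computation can be sketched as a plain function; this is an illustration, not the inherited `BaseEmbedding` implementation:

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|).
// Returns 1 for parallel vectors, 0 for orthogonal ones.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error("embeddings must have the same dimension");
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```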
### streamChat

▸ **streamChat**<`T`>(`stream`, `accessor`): `AsyncIterable<T>`

#### Type parameters

| Name |
| :--- |
| `T` |

#### Parameters

| Name | Type |
| :--- | :--- |
| `stream` | `ReadableStream<Uint8Array>` |
| `accessor` | `(data: any) => T` |

#### Returns

`AsyncIterable<T>`

#### Defined in

packages/core/src/llm/ollama.ts:112
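The general technique behind a helper with this signature is decoding newline-delimited JSON from a byte stream and mapping each parsed record through the `accessor`. A self-contained sketch of that pattern — the name and details here are illustrative, not the actual ollama.ts implementation:

```typescript
// Decode a ReadableStream of UTF-8 bytes as newline-delimited JSON,
// yielding accessor(record) for each complete line. Partial lines are
// buffered until the next chunk completes them.
async function* streamJsonLines<T>(
  stream: ReadableStream<Uint8Array>,
  accessor: (data: any) => T,
): AsyncIterable<T> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep trailing partial line for next chunk
    for (const line of lines) {
      if (line.trim().length > 0) yield accessor(JSON.parse(line));
    }
  }
  if (buffer.trim().length > 0) yield accessor(JSON.parse(buffer));
}
```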
### transform

▸ **transform**(`nodes`, `_options?`): `Promise<BaseNode<Metadata>[]>`

#### Parameters

| Name | Type |
| :--- | :--- |
| `nodes` | `BaseNode<Metadata>[]` |
| `_options?` | `any` |