Class: ContextChatEngine
ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, ideally allowing the appropriate context to be surfaced for each query.
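For orientation, a minimal usage sketch (assumptions: the standard `llamaindex` package exports, a configured default LLM, and `VectorStoreIndex`/`Document` helpers, none of which are documented on this page):

```typescript
import { Document, VectorStoreIndex, ContextChatEngine } from "llamaindex";

// Build a small index; the engine's retriever pulls per-query context from it.
const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "LlamaIndex.TS is a framework for building LLM apps." }),
]);

// retriever is the only required constructor option.
const chatEngine = new ContextChatEngine({ retriever: index.asRetriever() });

// Retrieved context is placed into the system prompt before the LLM call.
const response = await chatEngine.chat({ message: "What is LlamaIndex.TS?" });
console.log(response.toString());
```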
Extends
PromptMixin
Implements
ChatEngine
Constructors
new ContextChatEngine()
new ContextChatEngine(init): ContextChatEngine
Parameters
• init
• init.chatHistory?: ChatMessage[]
• init.chatModel?: LLM<object, object>
• init.contextRole?: MessageType
• init.contextSystemPrompt?: ContextSystemPrompt
• init.nodePostprocessors?: BaseNodePostprocessor[]
• init.retriever: BaseRetriever
• init.systemPrompt?: string
Returns
ContextChatEngine
Overrides
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:46
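A sketch of the optional constructor settings (field names come from the parameter list above; the index setup and the exact effect of each option are assumptions about the wider library, not guaranteed by this page):

```typescript
import { Document, VectorStoreIndex, ContextChatEngine } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Sample document text." }),
]);

const chatEngine = new ContextChatEngine({
  // Required: where the per-query context comes from.
  retriever: index.asRetriever(),
  // Optional: a plain-string system prompt, sent in addition to the
  // generated context message.
  systemPrompt: "Answer only from the provided context.",
  // Optional: which message role carries the retrieved context.
  contextRole: "system",
});
```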
Properties
chatModel
chatModel: LLM<object, object>
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:37
contextGenerator
contextGenerator: ContextGenerator & PromptMixin
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:39
memory
memory: BaseMemory<object>
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:38
systemPrompt?
optional systemPrompt: string
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:40
Accessors
chatHistory
get chatHistory(): ChatMessage<object>[] | Promise<ChatMessage<object>[]>
Returns
ChatMessage<object>[] | Promise<ChatMessage<object>[]>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:42
Methods
_getPromptModules()
protected _getPromptModules(): ModuleRecord
Returns a dictionary of the sub-modules within the current module that also implement PromptMixin, so that their prompts can also be retrieved and set.
Can be empty if there are no sub-modules.
Returns
ModuleRecord
Overrides
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:80
_getPrompts()
protected _getPrompts(): PromptsRecord
Returns
PromptsRecord
Overrides
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:68
_updatePrompts()
protected _updatePrompts(prompts): void
Parameters
• prompts
• prompts.contextSystemPrompt: ContextSystemPrompt
Returns
void
Overrides
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:74
chat()
chat(params)
chat(params): Promise<EngineResponse>
Parameters
• params: NonStreamingChatEngineParams<object>
Returns
Promise<EngineResponse>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:86
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse, any, any>>
Parameters
• params: StreamingChatEngineParams<object>
Returns
Promise<AsyncIterable<EngineResponse, any, any>>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:87
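The two `chat()` overloads are selected by the params type: a streaming params object resolves to an AsyncIterable of response chunks rather than a single EngineResponse. A sketch, assuming `stream: true` is the field that selects StreamingChatEngineParams and that each chunk's `toString()` yields the text delta (both are assumptions about the wider library):

```typescript
import { Document, VectorStoreIndex, ContextChatEngine } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Sample document text." }),
]);
const chatEngine = new ContextChatEngine({ retriever: index.asRetriever() });

// stream: true selects the StreamingChatEngineParams overload.
const stream = await chatEngine.chat({
  message: "Summarize the document.",
  stream: true,
});

// Consume the AsyncIterable<EngineResponse> chunk by chunk.
for await (const chunk of stream) {
  process.stdout.write(chunk.toString());
}
```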
getPrompts()
getPrompts(): PromptsRecord
Returns
PromptsRecord
Inherited from
Defined in
packages/core/prompts/dist/prompts/index.d.ts:58
reset()
reset(): void
Returns
void
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:131
updatePrompts()
updatePrompts(prompts): void
Parameters
• prompts: PromptsRecord
Returns
void
Inherited from
Defined in
packages/core/prompts/dist/prompts/index.d.ts:59
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Parameters
• promptsDict: PromptsRecord
• moduleDict: ModuleRecord
Returns
void
Inherited from
Defined in
packages/core/prompts/dist/prompts/index.d.ts:57