Add response metadata to Custom LLM that is delivered through to the client
Theodore
I need to add trace_ids to LLM responses for feedback annotation purposes in my web application. It is not currently possible to attach any extra information to Custom LLM responses. See https://dev.hume.ai/docs/empathic-voice-interface-evi/guides/custom-language-model#clm-outgoing-message-data-format
I propose adding metadata to the assistant_message responses that would be passed through to the client.
type OutgoingCLMMessage = AssistantInputMessage | AssistantEndMessage;

interface AssistantInputMessage {
  type: "assistant_input";
  text: string;
  metadata?: { [key: string]: any }; // 👈🏼 Add
}

interface AssistantEndMessage {
  type: "assistant_end";
  metadata?: { [key: string]: any }; // 👈🏼 Add
}
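As a sketch of how a CLM server might populate the proposed field, assuming the shape above (the `metadata` key and the `trace_id` name are part of this proposal, not an existing API):

```typescript
// Sketch only: builds an assistant_input message carrying a trace_id
// in the proposed (not yet existing) metadata field.
interface AssistantInputMessage {
  type: "assistant_input";
  text: string;
  metadata?: { [key: string]: any }; // proposed addition
}

function buildAssistantInput(text: string, traceId: string): AssistantInputMessage {
  return {
    type: "assistant_input",
    text,
    metadata: { trace_id: traceId }, // would be passed through to the client
  };
}

console.log(JSON.stringify(buildAssistantInput("Hello!", "trace-123")));
```

The client could then read `metadata.trace_id` off each assistant message and attach it to feedback events.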
Richard Marmorstein
Discussed in office hours. The use case driving this: @.deuscapturus wants to attach "thumbs up" / "thumbs down" to the assistant_messages that come through to the EVI chat, and he wants those thumbs up / thumbs down clicks to be routeable back to the CLM inputs that caused those assistant messages.
For this, we need more than just custom_session_id to establish a connection between the client and CLM for a particular session; we need message-level attribution between CLM messages and messages within the EVI chat.
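One way to picture that message-level attribution on the client side (every name here is hypothetical; none of this exists in the current API):

```typescript
// Hypothetical client-side bookkeeping: if assistant messages carried a
// metadata.trace_id, thumbs feedback could be routed back to the exact
// CLM input that produced each message.
const traceByMessage = new Map<string, string>();

// Called when an assistant message arrives with the proposed metadata.
function onAssistantMessage(messageId: string, metadata?: { [k: string]: any }): void {
  if (metadata?.trace_id) traceByMessage.set(messageId, metadata.trace_id);
}

// Called when the user clicks thumbs up / thumbs down on a message.
function onThumbFeedback(messageId: string, verdict: "up" | "down") {
  const traceId = traceByMessage.get(messageId);
  // Without a trace_id there is no way to attribute the feedback
  // to a CLM input; with it, this object can go to a feedback backend.
  return { traceId, verdict };
}
```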
Theodore
Richard Marmorstein For clarity: Noah Jackson's request is about sending custom data from Client -> Custom LLM. This request is about sending custom data from Custom LLM -> Client.
Richard Marmorstein
Merged in a post:
Allow Metadata Object in Custom LLM Endpoint
Noah Jackson
Currently, only customSessionID can be customized when hitting my custom LLM endpoint, forcing me to concatenate multiple pieces of information into this single field (e.g., user_id|hume_chat_group_id). I plan to include even more data in the future, which will quickly become unwieldy. Please support an explicit metadata object alongside requests to manage this more cleanly.
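The concatenation workaround Noah describes can be sketched like this (the delimiter and field names are illustrative, not part of any Hume API):

```typescript
// Packing multiple values into the single customSessionId field,
// as described above. Brittle: every new field means re-parsing,
// and any value containing the delimiter breaks the scheme.
function packSessionId(userId: string, chatGroupId: string): string {
  return `${userId}|${chatGroupId}`;
}

function unpackSessionId(customSessionId: string): { userId: string; chatGroupId: string } {
  const [userId, chatGroupId] = customSessionId.split("|");
  return { userId, chatGroupId };
}
```

An explicit metadata object would make this packing/unpacking unnecessary.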
Theodore
I'm able to do this currently:
const sessionSettings: Hume.empathicVoice.SessionSettings = {
  type: 'session_settings',
  customSessionId: '...',
  metadata: {
    mode_name,
    subgraph_name,
  },
}

sendSessionSettings(sessionSettings)
Noah Jackson
Theodore does that info hit your custom LLM endpoint though?
Theodore
Noah Jackson it does not. That metadata option in SessionSettings is meant to fool us into believing we can send metadata to our custom LLM; in fact, the data goes nowhere.