Add response metadata to Custom LLM that is delivered through to the client
Theodore
I need to add trace_ids to LLM responses for feedback-annotation purposes in my web application. It is not currently possible to attach any extra information to Custom LLM responses. See https://dev.hume.ai/docs/empathic-voice-interface-evi/guides/custom-language-model#clm-outgoing-message-data-format
I propose adding metadata to the assistant_message responses that would be passed through to the client.
type OutgoingCLMMessage = AssistantInputMessage | AssistantEndMessage;

interface AssistantInputMessage {
  type: "assistant_input";
  text: string;
  metadata?: { [key: string]: any }; // 👈🏼 Add
}

interface AssistantEndMessage {
  type: "assistant_end";
  metadata?: { [key: string]: any }; // 👈🏼 Add
}
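As an illustrative sketch of how this would be used (the helper function and the trace_id key are my own naming, not part of any Hume API), a Custom LLM server could attach a trace ID under the proposed metadata field:

```typescript
// Proposed shape from above, with the optional metadata field added.
interface AssistantInputMessage {
  type: "assistant_input";
  text: string;
  metadata?: { [key: string]: any };
}

// Hypothetical helper: wraps generated text in an assistant_input
// message carrying a trace_id, so the client can annotate feedback
// against the exact generation that produced it.
function makeAssistantInput(text: string, traceId: string): AssistantInputMessage {
  return {
    type: "assistant_input",
    text,
    metadata: { trace_id: traceId },
  };
}

const msg = makeAssistantInput("Hello!", "trace-123");
```

The client would then read msg.metadata.trace_id off the delivered assistant message and include it when posting feedback.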
Theodore
Richard Marmorstein For clarity: the request from Noah Jackson wants to send custom data from Client -> Custom LLM. This request is about sending custom data from Custom LLM -> Client.
Richard Marmorstein
Merged in a post:
Allow Metadata Object in Custom LLM Endpoint
Noah Jackson
Currently, only customSessionID can be customized when hitting my custom LLM endpoint, forcing me to concatenate multiple pieces of information into this single field (e.g., user_id|hume_chat_group_id). I plan to include even more data in the future, which will quickly become unwieldy. Please support an explicit metadata object alongside requests to manage this more cleanly.
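For reference, the workaround described above amounts to delimiter-packing values into the one available field. A minimal sketch, with hypothetical helper names (the "|" delimiter comes from the user_id|hume_chat_group_id example):

```typescript
// Pack multiple values into the single customSessionId field,
// and unpack them again on the Custom LLM endpoint side.
function packSessionId(fields: string[]): string {
  return fields.join("|");
}

function unpackSessionId(customSessionId: string): string[] {
  return customSessionId.split("|");
}

const packed = packSessionId(["user_42", "group_7"]);
const [userId, chatGroupId] = unpackSessionId(packed);
```

This breaks as soon as any field can itself contain the delimiter, which is exactly why a structured metadata object would be cleaner.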
Theodore
I'm able to do this currently:
const sessionSettings: Hume.empathicVoice.SessionSettings = {
  type: 'session_settings',
  customSessionId: '...',
  metadata: {
    mode_name,
    subgraph_name,
  },
};
sendSessionSettings(sessionSettings);
Noah Jackson
Theodore does that data make it into your custom LLM endpoint though?