Allow backend API to send session_settings messages to keep prompts confidential
Roman Kupkovic
Currently, the settings of a live session/connection can only be updated through the frontend client, which makes the prompts we use visible to the user.
If we use sensitive, business-logic-related prompts and actions to steer the Hume conversation, we might want to apply them without the user's client knowing.
This could be solved with some form of hook or proxy, I am not sure. But it would greatly benefit bigger production projects that rely on their business logic not being easy to instantly copy.
Adam Tzagournis, CPA
This is mission critical for my use case as well, especially given that Hume is now passing supplemental LLM costs onto the user.
Roman Kupkovic
Adam Tzagournis, CPA
Maybe they could draw the line for this at Custom LLM vs. Hume-side LLM.
If you use a Custom LLM you can already do backend context injection as much as you want, but the Hume-side feature is still cool and would be nice to be able to use.
Roman Kupkovic
UPDATE:
This could be made possible by having the session_settings message type accept an object (referencing a stored Hume prompt ID) for the system prompt, instead of just a string, the same way the config definition does.
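A hypothetical sketch of what such a message could look like (the object form for system_prompt does not exist today; the field names here are assumptions modeled on how stored prompts are referenced by ID and version in config definitions):

```json
{
  "type": "session_settings",
  "system_prompt": {
    "id": "your-stored-prompt-id",
    "version": 2
  }
}
```

Because the message would only carry a reference, the backend could send it over the session without the actual prompt text ever reaching the user's client.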