

Default functions are pre-built actions provided by SubVerse AI and the underlying LLM provider. They are always available to the LLM during an active session — no webhook or custom code required. Simply enable the ones you need and instruct the agent when to use them.
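The exact configuration format is not shown on this page; as an illustrative sketch only, enabling a set of default functions for an agent might look something like the following. All field names here are hypothetical, not SubVerse AI's actual schema — only the function names come from the tables below.

```python
# Hypothetical agent configuration. Field names ("llm",
# "default_functions", "system_prompt") are illustrative only;
# the function names match the tables on this page.
agent_config = {
    "llm": "claude",
    "default_functions": [
        "end_session",
        "query_knowledge_base",
        "web_search",
    ],
    "system_prompt": (
        "When the customer says goodbye or their issue is resolved, "
        "call end_session."
    ),
}

# Only the functions enabled here are exposed to the LLM during a session;
# everything else stays unavailable until switched on.
enabled = set(agent_config["default_functions"])
```

The key point this sketch illustrates: no webhook or custom code is attached to any of these entries — enabling the name is enough, and the system prompt tells the agent when to call each one.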

Available Default Functions

Core Platform Functions

These functions are available regardless of which LLM you use.
Function                  Description
end_session               End the session when the conversation is complete.
schedule_communication    Schedule a follow-up call, message, or email for a future time.
query_knowledge_base      Fetch relevant information from the agent’s connected knowledge base.
fetch_memory              Retrieve previously stored memory for this customer or session.
update_memory             Write new information to memory for use in future sessions.

OpenAI LLM Functions

Available when your agent uses an OpenAI model.
Function            Description
web_search          Search the web for up-to-date information.
code_interpreter    Execute Python code in a sandboxed environment for calculations, data analysis, or file generation.

Claude LLM Functions

Available when your agent uses an Anthropic Claude model.
Function          Description
web_search        Search the web for up-to-date information.
web_fetch         Retrieve the contents of a specific URL.
code_execution    Run code in a sandboxed environment.
bash              Execute bash commands in a sandboxed environment.

Google Gemini LLM Functions

Available when your agent uses a Google Gemini model.
Function          Description
google_search     Search Google for up-to-date information.
url_context       Retrieve and reason over the contents of a URL.
google_maps       Look up locations, directions, and place information.
code_execution    Run code in a sandboxed environment.

When to Use Default Functions

Default functions cover the most common actions an agent needs during a session. You do not need to create or configure them — just enable the ones relevant to your use case and instruct the agent in its system prompt when to call them. Common patterns:
  • End the session automatically once the user’s query is resolved.
  • Search the web when the agent needs real-time or external information.
  • Store customer preferences in memory to personalise future sessions.
  • Query the knowledge base for product details or policy documents.
Be explicit in the agent’s system prompt about when to call each function. For example: “When the customer says goodbye or their issue is resolved, call end_session,” or “Use web_search only when the knowledge base does not contain a definitive answer.”
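The “knowledge base first, web search as fallback” pattern above can be sketched as a small decision rule. This is illustrative only: the real routing happens inside the LLM based on your system prompt, and the helper below (`answer`, with a plain dict standing in for `query_knowledge_base`) is hypothetical.

```python
# Illustrative sketch of the routing the system prompt asks for:
# prefer the knowledge base, fall back to web_search only when the
# knowledge base has no definitive answer. The dict lookup stands in
# for query_knowledge_base; the string result stands in for web_search.

def answer(question: str, knowledge_base: dict) -> str:
    """Prefer the knowledge base; fall back to a web search."""
    result = knowledge_base.get(question)  # query_knowledge_base stand-in
    if result is not None:
        return result
    return f"web_search: {question}"       # web_search fallback

kb = {"return policy": "30 days with receipt"}
print(answer("return policy", kb))    # → 30 days with receipt
print(answer("today's weather", kb))  # → web_search: today's weather
```

In practice you express this rule in prose in the system prompt rather than in code; the LLM itself decides which enabled function to call.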

Availability

Phase             Available
Pre-Session       No
During-Session    Yes
Post-Session      No

Next Steps

Agent Handoff

Route conversations to a specialized agent

Custom Functions

Add your own API calls for the LLM to execute