The `LLMProfileStore` class provides a centralized way to manage LLM configurations: define a profile once and reuse it everywhere, across scripts, sessions, and even machines.

The store manages a directory of JSON profile files. By default it uses `~/.openhands/profiles`, but you can point it at any directory.
```python
from openhands.sdk import LLMProfileStore

# Default location: ~/.openhands/profiles
store = LLMProfileStore()

# Or bring your own directory
store = LLMProfileStore(base_dir="./my-profiles")
```
API keys are excluded by default for security. Pass `include_secrets=True` to the `save` method if you want to persist them; otherwise they are read from the environment at load time.
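To make that default concrete, here is a minimal stdlib-only sketch of the save/load round trip. The on-disk field names and the environment-fallback logic are assumptions for illustration, not the library's actual schema:

```python
import json
import os
import tempfile


def save_profile(path, profile, include_secrets=False):
    """Write a profile as JSON, stripping the secret unless asked to keep it."""
    data = dict(profile)
    if not include_secrets:
        data.pop("api_key", None)  # secrets are excluded by default
    with open(path, "w") as f:
        json.dump(data, f, indent=2)


def load_profile(path):
    """Read a profile back; fall back to the environment for a missing key."""
    with open(path) as f:
        data = json.load(f)
    if "api_key" not in data:
        data["api_key"] = os.getenv("LLM_API_KEY")
    return data


# Illustrative profile; the field names mirror an LLM config but are assumptions.
profile = {
    "model": "anthropic/claude-sonnet-4-5-20250929",
    "temperature": 0.0,
    "api_key": "sk-example",
}

path = os.path.join(tempfile.mkdtemp(), "fast.json")
save_profile(path, profile)                 # key is stripped on disk
print("api_key" in json.load(open(path)))   # False
```

The real store handles serialization of the full `LLM` model for you; the point here is only that the secret never touches disk unless you opt in.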
"""Example: Using LLMProfileStore to save and reuse LLM configurations.LLMProfileStore persists LLM configurations as JSON files, so you can definea profile once and reload it across sessions without repeating setup code."""import osimport tempfilefrom pydantic import SecretStrfrom openhands.sdk import LLM, LLMProfileStore# Use a temporary directory so this example doesn't pollute your home folder.# In real usage you can omit base_dir to use the default (~/.openhands/profiles).store = LLMProfileStore(base_dir=tempfile.mkdtemp())# 1. Create two LLM profiles with different usageapi_key = os.getenv("LLM_API_KEY")assert api_key is not None, "LLM_API_KEY environment variable is not set."base_url = os.getenv("LLM_BASE_URL")model = os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929")fast_llm = LLM( usage_id="fast", model=model, api_key=SecretStr(api_key), base_url=base_url, temperature=0.0,)creative_llm = LLM( usage_id="creative", model=model, api_key=SecretStr(api_key), base_url=base_url, temperature=0.9,)# 2. Save profiles# Note that secrets are excluded by default for safety.store.save("fast", fast_llm)store.save("creative", creative_llm)# To persist the API key as well, pass `include_secrets=True`:# store.save("fast", fast_llm, include_secrets=True)# 3. List available persisted profilesprint(f"Stored profiles: {store.list()}")# 4. Load a profileloaded = store.load("fast")assert isinstance(loaded, LLM)print( "Loaded profile. " f"usage:{loaded.usage_id}, " f"model: {loaded.model}, " f"temperature: {loaded.temperature}.")# 5. Delete a profilestore.delete("creative")print(f"After deletion: {store.list()}")print("EXAMPLE_COST: 0")
- You can run the example code as-is.
- The model name should follow the LiteLLM convention: `provider/model_name` (e.g., `anthropic/claude-sonnet-4-5-20250929`, `openai/gpt-4o`).
- `LLM_API_KEY` should be the API key for your chosen provider.
- ChatGPT Plus/Pro subscribers: you can use `LLM.subscription_login()` to authenticate with your ChatGPT account and access Codex models without consuming API credits. See the LLM Subscriptions guide for details.
You can use a saved profile to switch the active model on a running conversation between turns. This is useful when you want to start with one model, then switch to another for later user messages while keeping the same conversation history and combined usage metrics.
"""Mid-conversation model switching.Usage: uv run examples/01_standalone_sdk/44_model_switching_in_convo.py"""import osfrom openhands.sdk import LLM, Agent, LocalConversation, Toolfrom openhands.sdk.llm.llm_profile_store import LLMProfileStorefrom openhands.tools.terminal import TerminalToolLLM_API_KEY = os.getenv("LLM_API_KEY")store = LLMProfileStore()store.save( "gpt", LLM(model="openhands/gpt-5.2", api_key=LLM_API_KEY), include_secrets=True,)agent = Agent( llm=LLM( model=os.getenv("LLM_MODEL", "openhands/claude-sonnet-4-5-20250929"), api_key=LLM_API_KEY, ), tools=[Tool(name=TerminalTool.name)],)conversation = LocalConversation(agent=agent, workspace=os.getcwd())# Send a message with the default modelconversation.send_message("Say hello in one sentence.")conversation.run()# Switch to a different model and send another messageconversation.switch_profile("gpt")print(f"Switched to: {conversation.agent.llm.model}")conversation.send_message("Say goodbye in one sentence.")conversation.run()# Print metrics per modelfor usage_id, metrics in conversation.state.stats.usage_to_metrics.items(): print(f" [{usage_id}] cost=${metrics.accumulated_cost:.6f}")combined = conversation.state.stats.get_combined_metrics()print(f"Total cost: ${combined.accumulated_cost:.6f}")print(f"EXAMPLE_COST: {combined.accumulated_cost}")store.delete("gpt")