fix(llm/frameworks): point users to LangChain when DefaultFramework has no base_url #1865
Conversation
When a 0.21 config with a bare LangChain-only engine (anthropic, cohere, vertexai, deepseek, ...) lands on `DefaultFramework` with no env var set, `_resolve_base_url` raises with a hint that only points to `parameters.base_url`. That fix path is wrong for engines whose API is not OpenAI-compatible. Update the error to name both fix paths:

- `parameters.base_url` for OpenAI-compatible endpoints,
- `NEMOGUARDRAILS_LLM_FRAMEWORK=langchain` plus the matching `langchain-<provider>` package for everything else.

This aligns the runtime hint with the migration guide, so users hitting the error don't have to search the docs to know what the runtime is asking of them.
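Concretely, the first fix path might look like this in a guardrails config (the engine, model, and URL values below are illustrative placeholders, not taken from the PR):

```yaml
# Fix path 1: the endpoint is OpenAI-compatible -- set parameters.base_url.
# Engine, model, and URL are illustrative; use your provider's actual values.
models:
  - type: main
    engine: deepseek
    model: deepseek-chat
    parameters:
      base_url: https://api.example.com/v1  # your OpenAI-compatible endpoint
```

For engines whose API is not OpenAI-compatible, the second fix path applies instead: `export NEMOGUARDRAILS_LLM_FRAMEWORK=langchain` and install the matching `langchain-<provider>` package (e.g. `pip install langchain-anthropic` for the anthropic engine).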
Greptile Summary

This PR improves the user-facing error message raised by `_resolve_base_url` when no base URL can be determined for a provider.
| Filename | Overview |
|---|---|
| `nemoguardrails/llm/frameworks/default.py` | Error message in `_resolve_base_url` updated to include both the OpenAI-compatible fix path and the LangChain framework path; no logic changes. |
| `tests/llm/clients/test_client_config.py` | New async test `test_unknown_provider_error_lists_both_fix_paths` verifies all four expected strings in the updated error message; assertions are correctly placed outside the `with pytest.raises` block. |
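The assertion pattern the new test is described as using — trigger the error for an unknown provider, then check the expected strings outside the raising block — can be sketched as below. The resolver here is a local stand-in, not the real nemoguardrails code, and the real test uses `pytest.raises` rather than try/except:

```python
def _resolve_base_url(provider_name):
    # Stand-in for the real resolver; the message text is quoted from the diff.
    raise ValueError(
        f"No known base_url for provider '{provider_name}'. "
        "If your endpoint is OpenAI-compatible, set parameters.base_url. "
        "Otherwise, set NEMOGUARDRAILS_LLM_FRAMEWORK=langchain and install "
        "the matching langchain-<provider> package (see migration guide)."
    )

def test_unknown_provider_error_lists_both_fix_paths():
    try:
        _resolve_base_url("anthropic")
    except ValueError as err:
        msg = str(err)
    else:
        raise AssertionError("expected ValueError")
    # Assertions belong OUTSIDE the raising block: any code placed after the
    # raising call inside the block would never execute.
    assert "parameters.base_url" in msg
    assert "NEMOGUARDRAILS_LLM_FRAMEWORK=langchain" in msg
    assert "langchain-<provider>" in msg
    assert "migration guide" in msg
```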
Flowchart
```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
A["create_model(model_name, provider_name, kwargs)"] --> B{provider_name in\n_providers?}
B -- Yes --> C[Call registered provider class]
B -- No --> D{base_url in\nkwargs?}
D -- Yes --> E[Use provided base_url]
D -- No --> F["_resolve_base_url(provider_name)"]
F --> G{provider_name in\n_DEFAULT_BASE_URLS?}
G -- Yes --> H[Return known URL\nopenai / nim / ollama]
G -- No --> I["Raise ValueError\n(UPDATED MESSAGE)\n- parameters.base_url\n- NEMOGUARDRAILS_LLM_FRAMEWORK=langchain\n- langchain-provider package"]
E --> J[_get_or_create_client]
H --> J
J --> K[OpenAIChatModel]
```
Reviews (1): Last reviewed commit: "fix(llm/frameworks): point users to Lang..."
Codecov Report: ✅ All modified and coverable lines are covered by tests.
```diff
- "Set it explicitly in model parameters: parameters.base_url"
+ "If your endpoint is OpenAI-compatible, set parameters.base_url. "
+ "Otherwise, set NEMOGUARDRAILS_LLM_FRAMEWORK=langchain and install "
+ "the matching langchain-<provider> package (see migration guide)."
```
nit: Could you add a link to the migration guide here (pointing to the docs)?
Description
When a 0.21 config with a bare LangChain-only engine (anthropic, cohere, vertexai, deepseek, ...) lands on `DefaultFramework` with no env var set, `_resolve_base_url` raises with a hint that only points to `parameters.base_url`. That fix path is wrong for engines whose API is not OpenAI-compatible. Update the error to name both fix paths:

- `parameters.base_url` for OpenAI-compatible endpoints,
- `NEMOGUARDRAILS_LLM_FRAMEWORK=langchain` plus the matching `langchain-<provider>` package for everything else.
Aligns the runtime hint with the migration guide so users hitting the error don't have to search the docs to know what the runtime is asking of them.