
Generic LLM Configuration Support#147

Open
spichen wants to merge 4 commits into oracle:main from spichen:generic-llmconfig

Conversation

@spichen spichen commented Mar 24, 2026

Implementation of RFC #114

@spichen spichen requested a review from a team March 24, 2026 20:18
@oracle-contributor-agreement oracle-contributor-agreement bot added the OCA Verified All contributors have signed the Oracle Contributor Agreement. label Mar 24, 2026
@spichen spichen force-pushed the generic-llmconfig branch 2 times, most recently from 48115b7 to 86e4440 on March 24, 2026 20:22
@dhilloulinoracle
Contributor

Thank you @spichen for following up.
@cesarebernardis please have a look, as this is a very useful addition

api_provider: Optional[str]
api_type: Optional[str]
url: Optional[str]
api_key: Optional[str]
Member


Please mark the api_key as sensitive

Suggested change
api_key: Optional[str]
api_key: SensitiveField[Optional[str]]

There's also a table at the end of this file listing all the sensitive fields, please update that too
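For illustration, here is a minimal sketch of what a sensitive-value wrapper can buy you in plain Python: masking the secret in `repr()`/`str()` so it never leaks into logs or tracebacks. This is an assumption about intent, not the project's actual `SensitiveField` implementation, which is presumably a richer Pydantic construct with serialization support.

```python
class Secret:
    """Hypothetical sensitive-value wrapper (sketch only; the project's
    SensitiveField may differ). The raw value is reachable only through
    an explicit accessor, never through repr() or str()."""

    def __init__(self, value: str):
        self._value = value

    def get_secret_value(self) -> str:
        # Explicit accessor: makes every read of the raw secret deliberate.
        return self._value

    def __repr__(self) -> str:
        return "Secret('**********')"

    __str__ = __repr__


api_key = Secret("sk-test-123")
```

Pydantic ships the same idea out of the box as `SecretStr`, which may be what `SensitiveField` builds on.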

model_id: str
provider: Optional[str]
api_provider: Optional[str]
api_type: Optional[str]
Member


Some fields are missing, please align the definition with the one in the spec

The ``provider`` field is optional and identifies the model provider (e.g. ``"openai"``, ``"meta"``, ``"anthropic"``, ``"cohere"``).
The ``api_provider`` field is optional and identifies the API provider serving the model (e.g. ``"openai"``, ``"oci"``, ``"vllm"``, ``"ollama"``, ``"aws_bedrock"``, ``"vertex_ai"``).
The ``api_type`` field is optional and identifies the wire protocol to use (e.g. ``"chat_completions"``, ``"responses"``).
The ``url`` field is optional and specifies the URL of the API endpoint (e.g. ``"https://api.openai.com/v1"``).
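To make the field set concrete, here is a stdlib-only sketch mirroring the spec's fields, with all four descriptor fields optional. The real `LlmConfig` is a Pydantic component with validation and additional fields (e.g. `name`, `api_key`), so this dataclass is illustrative only.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of the spec's field set (field names taken from the spec text
# above; the real LlmConfig is a validated Pydantic model, not a dataclass).
@dataclass
class LlmConfigSketch:
    model_id: str
    provider: Optional[str] = None       # model provider, e.g. "meta"
    api_provider: Optional[str] = None   # API serving the model, e.g. "ollama"
    api_type: Optional[str] = None       # wire protocol, e.g. "chat_completions"
    url: Optional[str] = None            # endpoint URL, None -> provider default


# A Llama model served locally via Ollama over the chat-completions protocol.
cfg = LlmConfigSketch(
    model_id="llama3.1",
    provider="meta",
    api_provider="ollama",
    api_type="chat_completions",
    url="http://localhost:11434/v1",
)
```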
Member


I would maybe specify that, if null, the default API URL of the API provider (if any) should be used
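The suggested null-handling could look something like the following. The defaults table, helper name, and Ollama URL are assumptions for illustration; only the OpenAI base URL comes from the spec text above.

```python
from typing import Optional

# Hypothetical per-provider defaults table; entries are illustrative,
# not the library's actual configuration.
DEFAULT_API_URLS = {
    "openai": "https://api.openai.com/v1",
    "ollama": "http://localhost:11434/v1",
}


def resolve_url(url: Optional[str], api_provider: Optional[str]) -> Optional[str]:
    """Return the explicit URL if set; otherwise fall back to the
    API provider's default URL, if one is known."""
    if url is not None:
        return url
    return DEFAULT_API_URLS.get(api_provider)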

all_subclasses = cls._get_all_subclasses(only_core_components=only_core_components)
adapter = TypeAdapter(Union[all_subclasses]) # type: ignore
json_schema = adapter.json_schema(by_alias=by_alias, mode=mode)
elif getattr(cls, "_include_subclasses_in_schema", False):
Member


Could you help me understand why you need this new logic?

Author

@spichen spichen Mar 27, 2026


model_json_schema() used to hit the cls._is_abstract branch, which builds a Union[all_subclasses]. Now that the class is concrete, it falls through to the else branch and produces only LlmConfig's own schema; the subclasses are missing entirely. _include_subclasses_in_schema adds a third path that builds Union[cls, *all_subclasses], so the full hierarchy still appears in the output.
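The difference between the two union shapes can be shown with a stdlib-only sketch (class names from the discussion; the recursive subclass walk is an assumption about what `_get_all_subclasses` does):

```python
from typing import Union, get_args

# Toy class tree mirroring the one discussed in this thread.
class LlmConfig: ...
class OpenAiCompatibleConfig(LlmConfig): ...
class OllamaConfig(OpenAiCompatibleConfig): ...


def _get_all_subclasses(cls):
    # Assumed behavior: walk the subclass tree recursively.
    subs = []
    for sub in cls.__subclasses__():
        subs.append(sub)
        subs.extend(_get_all_subclasses(sub))
    return subs


# Abstract path: Union[all_subclasses] — the base class itself is absent.
abstract_union = Union[tuple(_get_all_subclasses(LlmConfig))]

# Concrete-with-subclasses path: Union[cls, *all_subclasses] — the base
# class appears alongside its subclasses, so its own schema is kept.
concrete_union = Union[(LlmConfig, *_get_all_subclasses(LlmConfig))]
```

In the real code, the chosen union is handed to Pydantic's `TypeAdapter` to produce a JSON schema whose `$defs` cover the whole hierarchy.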


When a concrete class also serves as a base (e.g. ``LlmConfig``), Pydantic
only generates the parent schema. Phase 1 (abstract resolution) skips it
because it is not abstract, so subclass schemas are never added to ``$defs``.
Member


Could you double check this? I think we already have this pattern in the class tree (e.g., OllamaConfig is a child of OpenAiCompatibleConfig, which is not abstract, and it shows up in the JSON schema).

Maybe I misunderstood the issue?

Maybe the problem arises when the starting class from which we call model_json_schema is not abstract?

Author


Yes, it is because the starting class is now not abstract. The OpenAiCompatibleConfig case worked earlier because the starting class was the abstract LlmConfig. I added a test case to cover this.

Do you have a better approach to solve this?

return LlmConfig(name="test", model_id="some-model", api_provider="unsupported_provider")


class TestOpenAiAgentsDispatch:
Member


For tests that are adapter-specific we have ad-hoc folders in tests/adapters, and tests in those folders are skipped automatically if the required extra dependencies are not installed.

Please split these tests into the respective adapter folders.
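The skip-if-missing-dependency pattern the reviewer describes can be sketched with the stdlib alone (the helper name and the `langgraph` module check are illustrative; the repo's tests/adapters machinery, likely pytest-based, may differ):

```python
import importlib.util
import unittest


def requires(module_name: str):
    """Decorator that skips a test class/function unless the optional
    adapter dependency is importable. Hypothetical helper; with pytest
    the equivalent would be pytest.importorskip(module_name)."""
    return unittest.skipUnless(
        importlib.util.find_spec(module_name) is not None,
        f"{module_name} not installed",
    )


@requires("langgraph")
class TestLangGraphDispatch(unittest.TestCase):
    def test_basic_dispatch(self):
        pass  # adapter-specific assertions would go here
```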

spichen added 3 commits March 27, 2026 23:25
LlmConfig is no longer abstract. It can be used directly with model_id,
provider, api_provider, api_type, url, and api_key to describe any LLM
without a dedicated subclass. Existing subclasses remain unchanged.

All framework adapters dispatch bare LlmConfig instances via api_provider.
Schema generation handles concrete-with-subclasses via
_include_subclasses_in_schema on Component. Documentation, JSON spec,
and language spec updated accordingly.
- Mark api_key as SensitiveField in spec and add to sensitive fields table
- Add missing url and api_key fields to reference sheet
- Clarify url field defaults to API provider's default URL when null
- Split adapter-specific tests into respective tests/adapters/ folders
Add test that validates a bare LlmConfig embedded in an Agent schema,
ensuring subclass schemas are correctly populated in $defs when schema
generation starts from a parent component (not LlmConfig itself).
@spichen spichen force-pushed the generic-llmconfig branch from 94a7354 to 3f365c3 on March 27, 2026 19:25
…ispatch

Cover openai provider basic dispatch and url+api_key passthrough
for both adapters, matching the coverage already present in
openaiagents and langgraph adapter tests.
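The api_provider-based dispatch the commits describe might be shaped roughly like this table-lookup sketch (handler names and the factory are assumptions, not the adapters' actual API; the error case matches the "unsupported_provider" test seen earlier in the diff):

```python
from typing import Callable, Dict


def make_dispatcher(handlers: Dict[str, Callable]) -> Callable:
    """Build a dispatcher that routes a bare LlmConfig's api_provider
    string to the matching adapter handler. Illustrative sketch only."""

    def dispatch(api_provider: str):
        try:
            return handlers[api_provider]
        except KeyError:
            # Unknown providers fail loudly rather than silently falling
            # through to a default adapter.
            raise ValueError(f"Unsupported api_provider: {api_provider!r}")

    return dispatch


# Hypothetical handler table; real adapters would construct clients here.
dispatch = make_dispatcher({
    "openai": lambda: "openai-client",
    "ollama": lambda: "ollama-client",
})
```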

Labels

OCA Verified All contributors have signed the Oracle Contributor Agreement.



3 participants