
Conversation

@dee-walia20

Problem

When using allowed_openai_params with the SAP provider, extra parameters
(like Gemini's thinking_config) are filtered out in transform_request.

response = litellm.completion(
    model="sap/gemini-2.5-flash",
    messages=[...],
    thinking_config={"thinking_budget": 0},
    allowed_openai_params=["thinking_config"],  # Ignored!
)

Solution

Extend supported_params with any extra params that passed get_optional_params()
validation (via allowed_openai_params).

This is a generic fix: any new provider-specific params can be passed through
without further code changes.
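
For illustration only, here is a minimal self-contained sketch of the filtering
idea (filter_request_params and its signature are hypothetical and not the
actual transform_request code):

from typing import Any, Dict, List, Optional

def filter_request_params(
    optional_params: Dict[str, Any],
    supported_params: List[str],
    allowed_openai_params: Optional[List[str]] = None,
) -> Dict[str, Any]:
    # Keep params the provider supports, plus any the caller explicitly
    # allowed via allowed_openai_params, instead of silently dropping them.
    allowed = set(supported_params) | set(allowed_openai_params or [])
    return {k: v for k, v in optional_params.items() if k in allowed}

# thinking_config survives because it was explicitly allowed:
params = filter_request_params(
    optional_params={"temperature": 0.2, "thinking_config": {"thinking_budget": 0}},
    supported_params=["temperature", "max_tokens"],
    allowed_openai_params=["thinking_config"],
)
assert "thinking_config" in params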

Changes

  • litellm/llms/sap/chat/transformation.py: 2 lines added in transform_request

Fixes #18412

@vercel

vercel bot commented Dec 25, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Review                   | Updated (UTC)
litellm | Ready      | Ready (Preview, Comment) | Dec 25, 2025 4:10pm

@CLAassistant

CLAassistant commented Dec 25, 2025

CLA assistant check
All committers have signed the CLA.



Development

Successfully merging this pull request may close these issues.

[Bug]: unable to provide thinking_config with new provider SAP for gemini models
