Add LLM provider response headers to Responses API #16091
      
        
  
    
  
    
Title
Add LLM provider response headers to Responses API
Relevant issues
Fixes LIT-1153
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
I have Added testing in the
tests/litellm/directory, Adding at least 1 test is a hard requirement - see detailsI have added a screenshot of my new test passing locally
My PR passes all unit tests on
make test-unitMy PR's scope is as isolated as possible, it only solves 1 specific problem
Type
🆕 New Feature
Changes
Summary
Added support for including LLM provider response headers in the Responses API, matching the behavior already implemented for chat completions.
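As a rough usage sketch (not taken from this PR's diff), the example below assumes the Responses API mirrors the chat-completions convention of exposing provider headers under `_hidden_params["additional_headers"]` with an `llm_provider-` key prefix; the exact attribute names and the `return_response_headers` opt-in are assumptions here.

```python
import litellm

# Assumption: this flag opts in to surfacing raw provider response headers,
# as it does for chat completions.
litellm.return_response_headers = True

response = litellm.responses(
    model="gpt-4o-mini",
    input="Say hello",
)

# Assumption: with this change, provider headers ride along on the response's
# hidden params, keyed with an "llm_provider-" prefix.
headers = (getattr(response, "_hidden_params", None) or {}).get("additional_headers", {})
print(headers.get("llm_provider-x-ratelimit-remaining-requests"))
```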
What Changed
- llm_http_handler.py: Updated the Responses API handler to capture and process response headers from LLM providers
- openai/responses/transformation.py: Added header processing to the transform_response_api_response() method
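For context, here is a minimal sketch of the kind of header handling the transformation step could perform. The helper name `_attach_provider_headers` is hypothetical, not the code added in this PR, and the `llm_provider-` prefix is assumed from the existing chat-completions behavior.

```python
from typing import Any, Dict


def _attach_provider_headers(response_obj: Any, raw_headers: Dict[str, str]) -> Any:
    """Hypothetical helper: copy the provider's HTTP response headers onto the
    returned object's hidden params so callers can read e.g. rate-limit info."""
    hidden = getattr(response_obj, "_hidden_params", None) or {}
    # Prefix each header key so provider metadata is distinguishable from
    # LiteLLM's own hidden params (mirrors the chat-completions convention).
    hidden["additional_headers"] = {
        f"llm_provider-{key}": value for key, value in raw_headers.items()
    }
    response_obj._hidden_params = hidden
    return response_obj
```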