Can we use Gemini as the main LLM? #2277
              
                
                  
                  
Answered by dd36
The-BlackSmith872 asked this question in Q&A
In the supported LLMs section, Gemini's status is "coming soon", but the environment variables table includes an ENABLE_GEMINI row and an LLM_KEY row indicating that gemini-pro and gemini-flash are currently available. Could someone clarify whether Gemini can be used or not?
      
      
Replies: 1 comment

dd36 answered on May 2, 2025:
You can. Same way as OpenAI.
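
For anyone landing here later, a minimal sketch of what the configuration might look like, based only on the ENABLE_GEMINI and LLM_KEY variables named in the question. The exact LLM_KEY value format and the API-key variable name are assumptions; check the project's environment-variable docs for the authoritative names.

```env
# Enable the Gemini provider (variable named in the docs, per the question above)
ENABLE_GEMINI=true

# Select a Gemini model; gemini-pro and gemini-flash are listed as available.
# The exact value format (casing, any prefix) is an assumption to verify.
LLM_KEY=gemini-pro

# Assumed variable name for the Gemini API key; the real name may differ.
GEMINI_API_KEY=<your-gemini-api-key>
```

As the answer notes, the setup mirrors the OpenAI configuration: enable the provider, pick the model via LLM_KEY, and supply the corresponding API key.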
                  
Answer selected by The-BlackSmith872