Closed
Labels
⚙ Done: Bug fix, enhancement, FR that's completed pending release
🐞 Bug: Something isn't working
Description
crawl4ai version
0.6.1
Expected Behavior
When I define GROQ_API_KEY in the .llm.env file and run the container with --env-file .llm.env, I expect Crawl4AI to use Groq as the LLM provider (llama3) instead of OpenAI.
Current Behavior
Even though I correctly set GROQ_API_KEY=sk-xxx in the .llm.env file, Crawl4AI still logs that it is using openai as the provider with model gpt-4o-mini. This indicates that Groq is not being picked up or used at all.
Logs show:
LiteLLM completion() model= gpt-4o-mini; provider = openai
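For context, the fallback that log line implies can be sketched as a simple env check (the model strings and the selection logic here are my assumptions for illustration, not Crawl4AI's actual code):

```shell
# Hypothetical sketch, NOT Crawl4AI's implementation: choose a Groq
# LiteLLM "provider/model" string only when GROQ_API_KEY is set in the
# process environment, otherwise fall back to the OpenAI default that
# appears in the logs above.
if [ -n "${GROQ_API_KEY:-}" ]; then
  echo "groq/llama3-70b-8192"   # exact Groq model name is an assumption
else
  echo "openai/gpt-4o-mini"     # the default the logs show
fi
```

The bug report amounts to: the first branch is never taken even though the variable is passed to the container.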
I confirmed that the .llm.env file is mounted correctly and the container is started using:
docker run -d -p 11235:11235 \
  --name crawl4ai \
  --env-file .llm.env \
  --shm-size=1g \
  unclecode/crawl4ai:latest
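One thing worth ruling out: docker's --env-file expects bare KEY=value lines, with no `export` prefix and no quotes around the value. A quick format check, sketched here against a throwaway file (substitute your real .llm.env):

```shell
# Verify the env file contains a well-formed GROQ_API_KEY line.
# Using a temp file so the sketch is self-contained; point grep at
# .llm.env in practice.
tmp=$(mktemp)
printf 'GROQ_API_KEY=sk-xxx\n' > "$tmp"
grep -c '^GROQ_API_KEY=' "$tmp"   # prints 1 when the line is well-formed
rm -f "$tmp"
```

Once the container is running, `docker exec crawl4ai printenv GROQ_API_KEY` (assuming the container name from the command above) confirms whether the value actually reached the container environment.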
Is this reproducible?
Yes
Inputs Causing the Bug
Steps to Reproduce
Code snippets
OS
Windows WSL
Python version
3.10.12
Browser
No response
Browser version
No response
Error logs & Screenshots (if applicable)
No response
RobinBially and hexdecimal16