
[Bug]: Crawl4AI always defaults to OpenAI instead of using GROQ_API_KEY from .llm.env #1291

@Wassim-Laabidi

Description


crawl4ai version

0.6.1

Expected Behavior

When I define GROQ_API_KEY in the .llm.env file and run the container with --env-file .llm.env, I expect Crawl4AI to use Groq (llama3) as the LLM provider instead of OpenAI.
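For illustration, the fallback the report expects could be sketched like this. This is not Crawl4AI's actual selection code; the model id and default provider string are assumptions chosen to match the log output below:

```python
import os

# Assumed default, matching the "gpt-4o-mini / openai" seen in the logs.
DEFAULT_PROVIDER = "openai/gpt-4o-mini"

def pick_provider(env=os.environ):
    """Hypothetical provider selection: prefer Groq when its key is set."""
    if env.get("GROQ_API_KEY"):
        return "groq/llama3-70b-8192"  # hypothetical Groq model id
    return DEFAULT_PROVIDER
```

The bug, in these terms, is that the container behaves as if GROQ_API_KEY were unset and always takes the default branch.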

Current Behavior

Even though I correctly set GROQ_API_KEY=sk-xxx in the .llm.env file, Crawl4AI still logs openai as the provider with model gpt-4o-mini, which indicates that Groq is not being picked up or used at all.

Logs show:

LiteLLM completion() model= gpt-4o-mini; provider = openai

I confirmed that the .llm.env file is mounted correctly and the container is started using:

docker run -d -p 11235:11235 \
  --name crawl4ai \
  --env-file .llm.env \
  --shm-size=1g \
  unclecode/crawl4ai:latest
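As a possible workaround while the env-var detection is broken, the provider could presumably be forced per request instead of being inferred from the environment. The payload below is only a sketch: the field names ("extraction_config", "provider", "api_token") and the "env:GROQ_API_KEY" reference syntax are assumptions based on LiteLLM-style "provider/model" strings, not the confirmed Docker API:

```python
import json

# Hypothetical /crawl request body forcing the Groq provider explicitly.
payload = {
    "urls": ["https://example.com"],
    "extraction_config": {
        "type": "llm",
        "params": {
            "provider": "groq/llama3-70b-8192",  # hypothetical model id
            "api_token": "env:GROQ_API_KEY",     # assumed env reference syntax
        },
    },
}
print(json.dumps(payload, indent=2))
```

If an explicit provider is honored but the env-file path is not, that would narrow the bug to the default-provider resolution inside the container.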

Is this reproducible?

Yes

Inputs Causing the Bug

Steps to Reproduce

Code snippets

OS

Windows WSL

Python version

3.10.12

Browser

No response

Browser version

No response

Error logs & Screenshots (if applicable)

No response


Labels

⚙ Done: Bug fix, enhancement, FR that's completed pending release
🐞 Bug: Something isn't working
