Example URL(s)
https://developers.cloudflare.com/workers-ai/features/batch-api/
Actual Behavior
The list of supported models for AI batch processing includes @cf/meta/llama-3.3-70b-instruct-fp8-fast.
However, attempting to use this model for a batch request via the REST API returns an error:
[errors] => Array
    (
        [0] => Array
            (
                [message] => AiError: Ai: This model does not support request queuing (f76e541f-d076-4869-9130-ee14913a9159)
                [code] => 8007
            )
    )
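For reference, a minimal reproduction sketch of the failing call. This assumes the request shape described on the Batch API docs page (the ?queueRequest=true query parameter on the /ai/run/{model} endpoint and a "requests" array in the body); the account ID, API token, and prompt are placeholders.

```python
import os
import requests

# Placeholders: set these to your own account ID and API token.
ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]
API_TOKEN = os.environ["CF_API_TOKEN"]
MODEL = "@cf/meta/llama-3.3-70b-instruct-fp8-fast"

# Assumed endpoint shape per the Batch API docs page linked above.
url = (
    f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}"
    f"/ai/run/{MODEL}?queueRequest=true"
)
payload = {
    "requests": [
        {"prompt": "Say hello."},  # placeholder batch item
    ]
}

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=payload,
)
print(resp.status_code)
# Expected: an async queue acknowledgement with a request ID.
# Actual: error code 8007, "This model does not support request queuing".
print(resp.json())
```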
Expected Behavior
Models listed on this page should support batch processing.
Additional information
No response