Misc. bug: -ts doesn't support more than ? devices #13293

Open
@justinjja

Description

Name and Version

build: 5272 (3e959f0)

Compiled with #define GGML_SCHED_MAX_BACKENDS 64

Error:
Device 14: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
Device 15: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
error while handling argument "-ts": got 16 input configs, but system only has 16 devices

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

llama-server

Command line

./llama-server ... -ts 10,11,11,11,11,11,11,11,11,11,11,9,9,9,9,9

Problem description & steps to reproduce

Build llama.cpp for more than 16 devices, then try to use -ts with one split value per device.

First Bad Commit

No response

Relevant log output

Metadata

    Labels

    bug (Something isn't working)
