llama.cpp full-cuda-b6795 (Public, Latest)

Install from the command line
$ docker pull ghcr.io/ngxson/llama.cpp:full-cuda-b6795
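Once pulled, the image can be run with GPU access. The following is a minimal sketch, assuming this image follows the upstream llama.cpp "full" image conventions (a tools entrypoint that accepts a --run mode) and that the NVIDIA Container Toolkit is installed on the host; the mounted model path, model file, and prompt are placeholders.

$ docker run --gpus all -v /path/to/models:/models \
    ghcr.io/ngxson/llama.cpp:full-cuda-b6795 \
    --run -m /models/model.gguf -p "Hello" -n 64 -ngl 99

Here -ngl 99 offloads all model layers to the GPU; adjust it (and the mount path) to match the local setup.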

Recent tagged image versions

  • Published about 21 hours ago · Digest sha256:455325dc58c9e44b4f8df6b0e9738026e44e0d0794282cfbafee9ee4d97a0aeb · 0 version downloads
  • Published about 21 hours ago · Digest sha256:e1cdbee60f46e31d7cc7708d987dfe504e7f37e96b4cf4154cbc3cbc7c321df2 · 0 version downloads
  • Published about 21 hours ago · Digest sha256:15cf386a71c7c9e8bb3ff91f800b73b5eb95d798349a9aea68656a8572d465e7 · 0 version downloads
  • Published about 21 hours ago · Digest sha256:61fe1b2ef23a0a5b3b11818ef7358eed456e3d0b321d88e8a4f98f7605b866b5 · 0 version downloads
  • Published about 21 hours ago · Digest sha256:3b676eda272422d215c21a674df11328c8c26d56e887f2501622306e61ac9d0e · 0 version downloads

Details

  • Last published: 21 hours ago
  • Total downloads: 63K