Description
Git commit
C:\llama-cpp-intel-oneapi\llama.cpp>git rev-parse HEAD
e046430
Operating systems
Windows
GGML backends
AMX
Problem description & steps to reproduce
I am trying to compile llama.cpp with Intel oneAPI (MKL) for AMX support on Windows Server 2022 and get errors. I have Visual Studio 2022 installed with the "C++ CMake Tools for Windows", "Git for Windows", "C++ Clang Compiler for Windows", and "MSBuild support for LLVM (clang) toolset" components enabled, as well as the newest Intel® oneAPI Base Toolkit (version 2025.0.1). I start from the "Intel oneAPI command prompt for Intel 64 for Visual Studio 2022", and the exact command I used is:
cmake -B build -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=Intel10_64lp -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_NATIVE=ON -DGGML_CCACHE=OFF
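One thing I noticed in the log: despite passing -DCMAKE_C_COMPILER=icx and -DCMAKE_CXX_COMPILER=icpx, CMake selects the "Visual Studio 17 2022" generator and identifies both compilers as MSVC (cl.exe), so the Intel compilers do not seem to be used at all. A variant I have not verified yet (assuming Ninja is available in the oneAPI prompt) would be to force the Ninja generator so the requested compilers are actually picked up:
cmake -B build -G Ninja -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=Intel10_64lp -DGGML_NATIVE=ON -DGGML_CCACHE=OFF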
The configure step fails with the CMake error "Could NOT find PkgConfig (missing: PKG_CONFIG_EXECUTABLE)", raised from ggml/src/ggml-blas/CMakeLists.txt:25; the full log is in the "Relevant log output" section below.
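Since the failure is about a missing pkg-config executable, one workaround I could try (untested on my side; the exact Chocolatey package name is an assumption) would be to install a Windows pkg-config build and then re-run the configure step:
choco install pkgconfiglite
cmake -B build -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=Intel10_64lp -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_NATIVE=ON -DGGML_CCACHE=OFF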
First Bad Commit
No response
Compile command
cmake -B build -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=Intel10_64lp -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_NATIVE=ON -DGGML_CCACHE=OFF
Relevant log output
:: initializing oneAPI environment...
Initializing Visual Studio command-line environment...
Visual Studio version 17.13.4 environment configured.
"C:\Program Files\Microsoft Visual Studio\2022\Community\"
Visual Studio command-line environment initialized for: 'x64'
: advisor -- latest
: compiler -- latest
: dal -- latest
: debugger -- latest
: dev-utilities -- latest
: dnnl -- latest
: dpcpp-ct -- latest
: dpl -- latest
: ipp -- latest
: ippcp -- latest
: mkl -- latest
: ocloc -- latest
: pti -- latest
: tbb -- latest
: umf -- latest
: vtune -- latest
:: oneAPI environment initialized ::
C:\Program Files (x86)\Intel\oneAPI>cd C:\llama-cpp-intel-oneapi
C:\llama-cpp-intel-oneapi>git clone https://github.com/ggml-org/llama.cpp
Cloning into 'llama.cpp'...
remote: Enumerating objects: 46796, done.
remote: Counting objects: 100% (212/212), done.
remote: Compressing objects: 100% (144/144), done.
remote: Total 46796 (delta 154), reused 75 (delta 68), pack-reused 46584 (from 3)
Receiving objects: 100% (46796/46796), 99.64 MiB | 14.79 MiB/s, done.
Resolving deltas: 100% (33519/33519), done.
C:\llama-cpp-intel-oneapi>cd llama.cpp
C:\llama-cpp-intel-oneapi\llama.cpp>cmake -B build -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=Intel10_64lp -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_NATIVE=ON -DGGML_CCACHE=OFF
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.20348.
-- The C compiler identification is MSVC 19.43.34809.0
-- The CXX compiler identification is MSVC 19.43.34809.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.47.1.windows.2")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM:
-- Including CPU backend
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Success
-- Adding CPU backend variant ggml-cpu: /arch:AVX512 GGML_AVX512
-- Looking for sgemm_
-- Looking for sgemm_ - found
-- Found BLAS: C:/Program Files (x86)/Intel/oneAPI/mkl/latest/lib/mkl_intel_lp64_dll.lib;C:/Program Files (x86)/Intel/oneAPI/compiler/latest/lib/libiomp5md.lib;C:/Program Files (x86)/Intel/oneAPI/mkl/latest/lib/mkl_intel_thread_dll.lib;C:/Program Files (x86)/Intel/oneAPI/mkl/latest/lib/mkl_core_dll.lib
-- BLAS found, Libraries: C:/Program Files (x86)/Intel/oneAPI/mkl/latest/lib/mkl_intel_lp64_dll.lib;C:/Program Files (x86)/Intel/oneAPI/compiler/latest/lib/libiomp5md.lib;C:/Program Files (x86)/Intel/oneAPI/mkl/latest/lib/mkl_intel_thread_dll.lib;C:/Program Files (x86)/Intel/oneAPI/mkl/latest/lib/mkl_core_dll.lib
CMake Error at C:/Program Files/Microsoft Visual Studio/2022/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.30/Modules/FindPackageHandleStandardArgs.cmake:233 (message):
Could NOT find PkgConfig (missing: PKG_CONFIG_EXECUTABLE)
Call Stack (most recent call first):
C:/Program Files/Microsoft Visual Studio/2022/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.30/Modules/FindPackageHandleStandardArgs.cmake:603 (_FPHSA_FAILURE_MESSAGE)
C:/Program Files/Microsoft Visual Studio/2022/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.30/Modules/FindPkgConfig.cmake:114 (find_package_handle_standard_args)
ggml/src/ggml-blas/CMakeLists.txt:25 (find_package)
-- Configuring incomplete, errors occurred!
C:\llama-cpp-intel-oneapi\llama.cpp>