oneAPI Math Kernel Library (oneMKL) Interfaces

Contents

  • Introduction
  • Supported Usage Models
  • Supported Configurations
  • Support and Requirements
  • Build Setup
  • Building with Conan
  • Building with CMake
  • Project Cleanup
  • FAQs

Introduction

oneMKL interfaces are an open-source implementation of the oneMKL Data Parallel C++ (DPC++) interface according to the oneMKL specification. It works with multiple devices (backends) using device-specific libraries underneath.

User Application | oneMKL Layer    | Third-Party Library                               | Hardware Backend
oneMKL interface | oneMKL selector | Intel(R) oneAPI Math Kernel Library for Intel CPU | Intel CPU
                 |                 | Intel(R) oneAPI Math Kernel Library for Intel GPU | Intel GPU
                 |                 | NVIDIA cuBLAS for NVIDIA GPU                      | NVIDIA GPU

Supported Usage Models:

There are two oneMKL selector layer implementations:

  • Run-time dispatching: The application is linked with the oneMKL library and the required backend is loaded at run-time based on device vendor (all libraries should be dynamic).

Example of app.cpp with run-time dispatching:

include "onemkl/onemkl.hpp"

...
cl::sycl::device cpu_dev = cl::sycl::device(cl::sycl::cpu_selector());
cl::sycl::device gpu_dev = cl::sycl::device(cl::sycl::gpu_selector());

cl::sycl::queue cpu_queue(cpu_dev);
cl::sycl::queue gpu_queue(gpu_dev);

onemkl::blas::gemm(cpu_queue, transA, transB, m, ...);
onemkl::blas::gemm(gpu_queue, transA, transB, m, ...);

How to build an application with run-time dispatching:

$> clang++ -fsycl -c -I$ONEMKL/include app.cpp
$> clang++ -fsycl app.o -L$ONEMKL/lib -lonemkl

A fuller, self-contained sketch of this usage model appears after this list.
  • Compile-time dispatching: The application uses a templated API where the template parameters specify the required backends and third-party libraries and the application is linked with the required oneMKL backend wrapper libraries (libraries can be static or dynamic).

Example of app.cpp with compile-time dispatching:

include "onemkl/onemkl.hpp"

...
cl::sycl::device cpu_dev = cl::sycl::device(cl::sycl::cpu_selector());
cl::sycl::device gpu_dev = cl::sycl::device(cl::sycl::gpu_selector());

cl::sycl::queue cpu_queue(cpu_dev);
cl::sycl::queue gpu_queue(gpu_dev);

onemkl::blas::gemm<intelcpu,intelmkl>(cpu_queue, transA, transB, m, ...);
onemkl::blas::gemm<nvidiagpu,cublas>(gpu_queue, transA, transB, m, ...);

How to build an application with compile-time dispatching:

$> clang++ -fsycl -c -I$ONEMKL/include app.cpp
$> clang++ -fsycl app.o -L$ONEMKL/lib -lonemkl_blas_mklcpu -lonemkl_blas_cublas
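The snippets above are fragments and omit the GEMM arguments. Below is a fuller, self-contained sketch of the run-time dispatching model with concrete sizes and SYCL buffers. It is illustrative rather than project-provided code: the matrix data is arbitrary, and the onemkl::transpose enum and the column-major, buffer-based gemm argument order are taken from the oneMKL specification, so verify them against the headers you build against. For compile-time dispatching, only the gemm call site changes (template parameters select the backend, as in the snippet above) and you link the matching backend wrapper libraries instead of -lonemkl.

// app.cpp -- illustrative sketch only; not taken from the project sources.
#include <cstdint>
#include <vector>

#include <CL/sycl.hpp>
#include "onemkl/onemkl.hpp"

int main() {
    const std::int64_t m = 64, n = 64, k = 64;  // arbitrary sizes
    const float alpha = 1.0f, beta = 0.0f;

    // Column-major host data: A is m x k, B is k x n, C is m x n.
    std::vector<float> A(m * k, 1.0f), B(k * n, 1.0f), C(m * n, 0.0f);

    // With run-time dispatching, the backend library is chosen from the queue's device.
    cl::sycl::device cpu_dev = cl::sycl::device(cl::sycl::cpu_selector());
    cl::sycl::queue cpu_queue(cpu_dev);

    {
        // Buffers make the host data visible to the device for the duration of the call.
        cl::sycl::buffer<float, 1> a_buf(A.data(), cl::sycl::range<1>(A.size()));
        cl::sycl::buffer<float, 1> b_buf(B.data(), cl::sycl::range<1>(B.size()));
        cl::sycl::buffer<float, 1> c_buf(C.data(), cl::sycl::range<1>(C.size()));

        // C = alpha * op(A) * op(B) + beta * C, with both matrices non-transposed here.
        onemkl::blas::gemm(cpu_queue,
                           onemkl::transpose::nontrans, onemkl::transpose::nontrans,
                           m, n, k,
                           alpha, a_buf, m,   // lda
                           b_buf, k,          // ldb
                           beta, c_buf, m);   // ldc
    }  // Buffer destruction waits for the computation and copies C back to the host vector.

    return 0;
}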

Supported Configurations:

Supported domains: BLAS

Linux*

Backend    | Library                             | Supported Link Type
Intel CPU  | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
Intel GPU  | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
NVIDIA GPU | NVIDIA cuBLAS                       | Dynamic, Static

Windows*

Backend   | Library                             | Supported Link Type
Intel CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
Intel GPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static

Support and Requirements

Hardware Platform Support

  • CPU
    • Intel Atom(R) Processors
    • Intel(R) Core(TM) Processor Family
    • Intel(R) Xeon(R) Processor Family
  • Accelerators
    • Intel(R) Processor Graphics GEN9
    • NVIDIA(R) TITAN RTX(TM) (Linux* only. Not tested with other NVIDIA GPU families and products.)

Supported Operating Systems

Linux*

Operating System                  | CPU Host/Target | Integrated Graphics from Intel (Intel GPU) | NVIDIA GPU
Ubuntu                            | 18.04.3, 19.04  | 18.04.3, 19.10                             | 18.04.3
SUSE Linux Enterprise Server*     | 15              | Not supported                              | Not supported
Red Hat Enterprise Linux* (RHEL*) | 8               | Not supported                              | Not supported
Linux* kernel                     | N/A             | 4.11 or higher                             | N/A

Windows*

Operating System          | CPU Host/Target          | Integrated Graphics from Intel (Intel GPU)
Microsoft Windows*        | 10 (64-bit version only) | 10 (64-bit version only)
Microsoft Windows* Server | 2016, 2019               | Not supported

Software Requirements

What should I download?

General:

Using Conan               | Using CMake Directly
                          | Functional Testing    | Build Only | Documentation
Linux* : GNU* GCC 5.1 or higher / Windows* : MSVS* 2017 or MSVS* 2019 (version 16.5 or newer)
Python 3.6 or higher      | CMake
                          | Ninja (optional)
Conan C++ package manager | GNU* FORTRAN Compiler | -          | Sphinx
NETLIB LAPACK             | -                     | -          | -

Hardware and OS Specific:

Operating System | Device     | Package                                                               | Installed by Conan
Linux*/Windows*  | Intel CPU  | Intel(R) oneAPI DPC++ Compiler or Intel project for LLVM* technology | No
                 |            | Intel(R) oneAPI Math Kernel Library                                  | Yes
                 | Intel GPU  | Intel(R) oneAPI DPC++ Compiler                                       | No
                 |            | Intel GPU driver                                                     | No
                 |            | Intel(R) oneAPI Math Kernel Library                                  | Yes
Linux* only      | NVIDIA GPU | Intel project for LLVM* technology                                   | No

If building with Conan, the packages above marked as "No" must be installed manually.

If building with CMake directly, all of the above packages must be installed manually.

Notice for Use of Conan Package Manager

LEGAL NOTICE: By downloading and using this container or script as applicable (the “Software Package”) and the included software or software made available for download, you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software (together, the “Agreements”) included in this README file.

If the Software Package is installed through a silent install, your download and use of the Software Package indicates your acceptance of the Agreements.

Product and Version Information:

Product                                                   | Supported Version                                 | Installed by Conan   | Conan Package Source | Package Install Location on Linux*             | License
Python                                                    | 3.6 or higher                                     | No                   | N/A                  | Pre-installed or installed by user             | PSF
Conan C++ Package Manager                                 | 1.24 or higher                                    | No                   | N/A                  | Installed by user                              | MIT
CMake                                                     | 3.13 or higher                                    | Yes (3.15 or higher) | conan-center         | ~/.conan/data or $CONAN_USER_HOME/.conan/data  | The OSI-approved BSD 3-clause License
Ninja                                                     | 1.10.0                                            | Yes                  | conan-center         | ~/.conan/data or $CONAN_USER_HOME/.conan/data  | Apache License v2.0
GNU* FORTRAN Compiler                                     | 7.4.0 or higher                                   | Yes                  | apt                  | /usr/bin                                       | GNU General Public License, version 3
Intel(R) oneAPI DPC++ Compiler                            | latest                                            | No                   | N/A                  | Installed by user                              | End User License Agreement for the Intel(R) Software Development Products
Intel project for LLVM* technology binary for Intel CPU   | Daily builds (experimental), tested with 20200331 | No                   | N/A                  | Installed by user                              | Apache License v2
Intel project for LLVM* technology source for NVIDIA GPU  | Daily source releases, tested with 20200421       | No                   | N/A                  | Installed by user                              | Apache License v2
Intel(R) oneAPI Math Kernel Library                       | latest                                            | Yes                  | apt                  | /opt/intel/inteloneapi/mkl                     | Intel Simplified Software License
NVIDIA CUDA SDK                                           | 10.2                                              | No                   | N/A                  | Installed by user                              | End User License Agreement
NETLIB LAPACK                                             | 3.7.1                                             | Yes                  | conan-community      | ~/.conan/data or $CONAN_USER_HOME/.conan/data  | BSD like license
Sphinx                                                    | 2.4.4                                             | Yes                  | pip                  | ~/.local/bin (or similar user local directory) | BSD License

conan-center: https://api.bintray.com/conan/conan/conan-center

conan-community: https://api.bintray.com/conan/conan-community/conan


Build Setup

  1. Install Intel(R) oneAPI DPC++ Compiler (select variant as per requirement).

  2. Clone this project to <path to onemkl>, where <path to onemkl> is the root directory of this repository.

  3. You can Build with Conan to automate the process of getting dependencies or you can download and install the required dependencies manually and Build with CMake directly.

Note: The Conan package manager automates the process of getting the required packages, so that you do not have to go to different web locations and follow separate instructions to install them.


Building with Conan

** This method currently works on Linux* only **

** Make sure you have completed Build Setup. **

Note: To understand how dependencies are resolved, refer to the Product and Version Information section. For details about Conan package manager, refer to Conan Documentation.

Getting Conan

Conan can be installed from pip:

pip3 install conan

Setting up Conan

Conan Default Directory

Conan stores all files and data in ~/.conan. If you are fine with this behavior, you can skip to the Conan Profiles section.

To change this behavior, set the environment variable CONAN_USER_HOME to a path of your choice. A .conan/ directory will be created in that path, and future Conan commands will use it to find configuration files and to download dependent packages. Packages will be downloaded into $CONAN_USER_HOME/data. To change the "/data" part of this directory, refer to the [storage] section of the conan.conf file.
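For illustration, the relevant part of conan.conf could look like the excerpt below; the [storage] section is part of Conan 1.x's standard configuration, and the path shown is only a placeholder:

[storage]
# Download packages here instead of the default "data" directory.
path = /usr/local/my_workspace/conan_cache/packages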

To make this setting persistent across terminal sessions, you can add the line below to your ~/.bashrc or a custom run script. Refer to the Conan Documentation for more details.

export CONAN_USER_HOME=/usr/local/my_workspace/conan_cache

Conan Profiles

Profiles are a way for Conan to determine a basic environment to use for building a project. This project ships with profiles for:

  • Intel(R) oneAPI DPC++ Compiler for Intel CPU and Intel GPU backend: inteldpcpp_lnx
  1. Open the profile you wish to use from <path to onemkl>/conan/profiles/ and set COMPILER_PREFIX to the path of the compiler's root folder. The root folder is the one that contains the bin and lib directories. For example, the Intel(R) oneAPI DPC++ Compiler root folder for a default installation on Linux is /opt/intel/inteloneapi/compiler/<version>/linux. You can also install the compiler to a custom path.
COMPILER_PREFIX=<path to Intel(R) oneAPI DPC++ Compiler>
  2. You can customize the [env] section of the profile based on individual requirements.

  3. Install the configurations for this project:

# Inside <path to onemkl>
$ conan config install conan/

This command installs all contents of <path to onemkl>/conan/, most importantly the profiles, to the Conan default directory.

Note: If you change the profile, you must re-run the above command before you can use the new profile.

Building

  1. Out-of-source build
# Inside <path to onemkl>
mkdir build && cd build
  2. If you choose to build backends with the Intel(R) oneAPI Math Kernel Library, install the GPG key as described at https://software.intel.com/en-us/articles/oneapi-repo-instructions#aptpkg

  3. Install dependencies

conan install .. --profile <profile_name> --build missing [-o <option1>=<value1>] [-o <option2>=<value2>]

The conan install command downloads and installs all requirements for the oneMKL DPC++ Interfaces project, as defined in <path to onemkl>/conanfile.py, based on the options passed. It also creates the conanbuildinfo.cmake file, which contains information about all dependencies and their directories; this file is used by the top-level CMakeLists.txt.

-pr | --profile <profile_name> Defines a profile for Conan to use for building the project.

-b | --build <package_name|missing> Tells Conan to build or re-build a specific package. If missing is passed as a value, all missing packages are built. This option is recommended when you build the project for the first time, because it caches required packages. You can skip this option for later use of this command.

  4. Build Project
conan build .. [--configure] [--build] [--test]  # Default is all

The conan build command executes the build() procedure from <path to onemkl>/conanfile.py. Since this project uses CMake, you can choose to configure, build, or test individually, or perform all steps by passing no optional arguments.

  5. Optionally, you can also install the package. This is similar to cmake --install . --prefix <install_dir>.
conan package .. --build-folder . --install-folder <install_dir>

-bf | --build-folder Tells Conan where to find the built project.

-if | --install-folder Tells Conan where to install the package. It is similar to specifying CMAKE_INSTALL_PREFIX.

Note: For a detailed list of commands and options, refer to the Conan Command Reference.

Conan Build Options

Backend-related Options

The following options are available to pass on conan install when building the oneMKL library:

  • build_shared_libs=[True | False]. Setting it to True enables building dynamic libraries. The default value is True.
  • enable_mklcpu_backend=[True | False]. Setting it to True enables building the mklcpu backend (Intel(R) oneAPI Math Kernel Library for Intel CPU). The default value is True.
  • enable_mklgpu_backend=[True | False]. Setting it to True enables building the mklgpu backend (Intel(R) oneAPI Math Kernel Library for Intel GPU). The default value is True.
  • enable_mklcpu_thread_tbb=[True | False]. Setting it to True enables TBB threading for oneMKL on the CPU instead of sequential execution. The default value is True.

Testing-related Options

  • build_functional_tests=[True | False]. Setting it to True enables the building of functional tests. The default value is True.

Documentation

  • build_doc=[True | False]. Setting it to True enables building the HTML documentation from the rST sources. The default value is False.

Note: For a mapping between Conan and CMake options, refer to build options under the CMake section.

Example

Build oneMKL as a static library with the mklcpu and mklgpu backends:

# Inside <path to onemkl>
mkdir build && cd build
conan install .. --build missing --profile inteldpcpp_lnx -o build_shared_libs=False
conan build ..

Building with CMake

  1. Make sure you have completed Build Setup.

  2. Build and install all required dependencies.

Then:

  • On Linux*
# Inside <path to onemkl>
mkdir build && cd build
export CXX=<path_to_dpcpp_compiler>/bin/dpcpp;
cmake .. [-DMKL_ROOT=<mkl_install_prefix>] \               # required only if the environment variable MKLROOT is not set
         [-DREF_BLAS_ROOT=<reference_blas_install_prefix>] # required only for testing
cmake --build .
ctest
cmake --install . --prefix <path_to_install_dir>
  • On Windows*
# Inside <path to onemkl>
md build && cd build
cmake .. -G Ninja -DCMAKE_TOOLCHAIN_FILE="..\cmake\toolchain\intel_clang-cl-toolchain.cmake" ^
         [-DMKL_ROOT=<mkl_install_prefix>] ^                     # required only if the environment variable MKLROOT is not set
         [-DREF_BLAS_ROOT=<reference_blas_install_prefix>]       # required only for testing

ninja 
ctest
cmake --install . --prefix <path_to_install_dir>

Build Options

All options specified in the Conan section are available to CMake. You can specify these options using -D<cmake_option>=<value>.
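For example, a configuration step that builds static libraries, enables the cuBLAS backend, and skips the functional tests could look like the command below (the option names are taken from the table that follows; combine them to match your target):

cmake .. -DBUILD_SHARED_LIBS=False -DENABLE_CUBLAS_BACKEND=True -DBUILD_FUNCTIONAL_TESTS=False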

The following table provides a detailed mapping of options between Conan and CMake.

Conan Option             | CMake Option             | Supported Values | Default Value
build_shared_libs        | BUILD_SHARED_LIBS        | True, False      | True
enable_mklcpu_backend    | ENABLE_MKLCPU_BACKEND    | True, False      | True
enable_mklgpu_backend    | ENABLE_MKLGPU_BACKEND    | True, False      | True
Not supported            | ENABLE_CUBLAS_BACKEND    | True, False      | False
enable_mklcpu_thread_tbb | ENABLE_MKLCPU_THREAD_TBB | True, False      | True
build_functional_tests   | BUILD_FUNCTIONAL_TESTS   | True, False      | True
build_doc                | BUILD_DOC                | True, False      | False

Project Cleanup

Most use cases involve building the project without the need to clean up the build directory. However, if you wish to clean up the build directory, you can delete the build folder and create a new one. If you wish to clean up the build files but retain the build configuration, the following commands will help you do so. They apply to both the Conan and CMake methods of building this project.

# If you use "GNU/Unix Makefiles" for building,
make clean

# If you use "Ninja" for building
ninja -t clean

FAQs

oneMKL

  1. What is the difference between the oneMKL specification, the oneMKL Interfaces project, and the Intel(R) oneAPI Math Kernel Library product?

Answer:

  • The oneAPI Specification for oneMKL defines the DPC++ interfaces for performance math library functions. The oneMKL specification can evolve faster and more frequently than implementations of the specification.

  • The oneAPI Math Kernel Library (oneMKL) Interfaces Project is an open source implementation of a subset of the specification. The project goal is to demonstrate how the DPC++ interfaces documented in the oneMKL specification can be implemented for any math library and work for any target hardware. We encourage the community to contribute to this project and help to extend support to multiple hardware targets and other math libraries.

  • The Intel(R) oneAPI Math Kernel Library (oneMKL) product is the Intel product implementation of the specification (with DPC++ interfaces) as well as similar functionality with C and Fortran interfaces, and is provided as part of Intel® oneAPI Base Toolkit. It is highly optimized for Intel CPU and Intel GPU hardware.

Conan

  1. I am behind a proxy. How can Conan download dependencies from an external network?

    • ~/.conan/conan.conf has a [proxies] section where you can add the list of proxies; a short example excerpt appears after this FAQ list. For details, refer to Conan proxy settings.
  2. I get an error while installing packages via APT through Conan.

    dpkg: warning: failed to open configuration file '~/.dpkg.cfg' for reading: Permission denied
    Setting up intel-oneapi-mkl-devel (2021.1-408.beta07) ...
    E: Sub-process /usr/bin/dpkg returned an error code (1)
    
    • Although your user session has permissions to install packages via sudo apt, it does not have permissions to update the Debian package configuration, which returns error code 1 and causes the conan install command to fail.
    • The package is most likely installed correctly and can be verified by:
      1. Running the conan install command again.
      2. Checking /opt/intel/inteloneapi for mkl and/or tbb directories.
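For the proxy question above, a minimal excerpt of ~/.conan/conan.conf might look like the following; the [proxies] section is standard Conan 1.x configuration, and the addresses are placeholders:

[proxies]
# Replace with your organization's proxy settings.
http = http://proxy.example.com:8080
https = http://proxy.example.com:8080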

