FB (Facebook) + GEMM (General Matrix-Matrix Multiplication) - https://code.fb.com/ml-applications/fbgemm/


FBGEMM


FBGEMM (Facebook GEneral Matrix Multiplication) is a low-precision, high-performance matrix-matrix multiplication and convolution library for server-side inference.

The library provides efficient low-precision general matrix multiplication for small batch sizes, along with support for accuracy-loss-minimizing techniques such as row-wise quantization and outlier-aware quantization. FBGEMM also exploits fusion opportunities to overcome the unique challenges of matrix multiplication at lower precision with bandwidth-bound operations.
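To make the row-wise quantization idea concrete, here is a minimal, self-contained Python sketch of the general technique (this illustrates the concept only; it is not FBGEMM's actual API). Giving each row its own scale means a row of small values is not forced to share a quantization scale with a row containing large outliers:

```python
# Illustrative row-wise int8 quantization: each matrix row gets its own
# scale factor, preserving precision for rows with small magnitudes.
# This is a conceptual sketch, not FBGEMM's implementation.

def quantize_row(row, num_bits=8):
    """Quantize one row to signed integers with a per-row scale."""
    qmax = 2 ** (num_bits - 1) - 1            # 127 for int8
    max_abs = max(abs(v) for v in row) or 1.0  # avoid divide-by-zero
    scale = max_abs / qmax
    q = [max(-qmax - 1, min(qmax, round(v / scale))) for v in row]
    return q, scale

def dequantize_row(q, scale):
    """Recover approximate float values from quantized ints."""
    return [v * scale for v in q]

matrix = [
    [0.01, -0.02, 0.015],  # small-magnitude row keeps a fine-grained scale
    [5.0, -3.0, 4.5],      # large-magnitude row gets a coarser scale
]
quantized = [quantize_row(row) for row in matrix]
restored = [dequantize_row(q, s) for q, s in quantized]
```

With a single per-tensor scale, the first row would be crushed into a handful of integer levels sized for the second row's range; per-row scales keep the relative error small for both.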

FBGEMM is used as a backend of Caffe2 and PyTorch quantized operators for x86 machines.

See the full Documentation for more information on building, installing, and developing with FBGEMM, as well as the most up-to-date support matrix and API documentation for this library.

Citation

For a high-level overview, design philosophy and brief descriptions of various parts of FBGEMM please see our blog post.

For those looking for the appropriate article to cite regarding FBGEMM, we recommend citing our paper:

@article{fbgemm,
  title={FBGEMM: Enabling High-Performance Low-Precision Deep Learning Inference},
  author={Khudia, Daya and Huang, Jianyu and Basu, Protonu and Deng, Summer and Liu, Haixin and Park, Jongsoo and Smelyanskiy, Mikhail},
  journal={arXiv preprint arXiv:2101.05615},
  year={2021}
}

Join the FBGEMM community

For questions, support, news updates, or feature requests, please feel free to reach out.

For contributions, please see the CONTRIBUTING file for ways to help out.

License

FBGEMM is BSD licensed, as found in the LICENSE file.
