hardware-aware-transformers: [ACL 2020] HAT source code
HAT: Hardware-Aware Transformers for Efficient Natural Language Processing

@inproceedings{hanruiwang2020hat,
  title     = {HAT: Hardware-Aware Transformers for Efficient Natural Language Processing},
  author    = {Wang, Hanrui and Wu, Zhanghao and Liu, Zhijian and Cai, Han and Zhu, Ligeng and Gan, Chuang and Han, Song},
  booktitle = {Annual Conference of the Association for Computational Linguistics},
  year      = {2020}
}

Overview: We release HAT.
File list
hardware-aware-transformers: [ACL 2020] HAT
(estimated 213 files in total)
File                               Size
libbleu.cpp                        3KB
module.cpp                         791B
token_block_utils_fast.c           1.04MB
data_utils_fast.cpp                928KB
cuda_utils.cu                      6KB
LICENSE                            3KB
wmt19ende_gpu_titanxp_all.csv      288KB
wmt14ende_gpu_titanxp_all.csv      287KB
wmt14ende_cpu_xeon_all.csv         289KB
iwslt14deen_gpu_titanxp_all.csv    288KB
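The CSV files in the listing appear to be hardware latency measurement datasets (GPU Titan Xp and CPU Xeon) for the WMT'14 En-De, WMT'19 En-De, and IWSLT'14 De-En tasks. Below is a minimal, hedged sketch for inspecting one of them with pandas; the "latency_dataset/" path and the idea of peeking at the columns are assumptions for illustration, not the repository's documented workflow, and the actual column names are not specified here.

    # Minimal sketch: load and inspect one latency CSV listed above.
    # Assumptions: the file lives under a local "latency_dataset/" directory
    # (hypothetical path) and has a header row.
    import pandas as pd

    df = pd.read_csv("latency_dataset/wmt14ende_gpu_titanxp_all.csv")

    print(df.shape)             # number of recorded samples x number of fields
    print(df.columns.tolist())  # see which architecture/latency fields exist
    print(df.head())            # peek at the first few records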