Niki Parmar
Co-Founder at Essential AI
Verified email at essential.ai
Title · Cited by · Year
Attention is all you need
A Vaswani
Advances in Neural Information Processing Systems, 2017
146590 · 2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez
Advances in neural information processing systems 30 (1), 261-272, 2017
6734* · 2017
Conformer: Convolution-augmented transformer for speech recognition
A Gulati, J Qin, CC Chiu, N Parmar, Y Zhang, J Yu, W Han, S Wang, ...
arXiv preprint arXiv:2005.08100, 2020
3329 · 2020
Image transformer
N Parmar, A Vaswani, J Uszkoreit, L Kaiser, N Shazeer, A Ku, D Tran
International conference on machine learning, 4055-4064, 2018
2068 · 2018
Advances in neural information processing systems
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Attention is all you need, 2017
1999 · 2017
Stand-alone self-attention in vision models
P Ramachandran, N Parmar, A Vaswani, I Bello, A Levskaya, J Shlens
Advances in neural information processing systems 32, 2019
1376 · 2019
Bottleneck transformers for visual recognition
A Srinivas, TY Lin, N Parmar, J Shlens, P Abbeel, A Vaswani
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2021
1296 · 2021
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 2017
875 · 2017
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
650 · 2018
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
538 · 2018
Scaling local self-attention for parameter efficient visual backbones
A Vaswani, P Ramachandran, A Srinivas, N Parmar, B Hechtman, ...
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2021
481 · 2021
Mesh-tensorflow: Deep learning for supercomputers
N Shazeer, Y Cheng, N Parmar, D Tran, A Vaswani, P Koanantakool, ...
Advances in neural information processing systems 31, 2018
425 · 2018
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
400 · 2017
Attention is all you need
CoRR abs/1706.03762, 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
327 · 2017
Proceedings of the 31st international conference on neural information processing systems
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Curran Associates Inc., Red Hook, NY, USA, 2017
269 · 2017
Purity homophily in social networks.
M Dehghani, K Johnson, J Hoover, E Sagi, J Garten, NJ Parmar, S Vaisey, ...
Journal of Experimental Psychology: General 145 (3), 366, 2016
230 · 2016
Stand-alone self-attention in vision models
N Parmar, P Ramachandran, A Vaswani, I Bello, A Levskaya, J Shlens
170 · 2019
Corpora generation for grammatical error correction
J Lichtarge, C Alberti, S Kumar, N Shazeer, N Parmar, S Tong
arXiv preprint arXiv:1904.05780, 2019
164 · 2019
Fast decoding in sequence models using discrete latent variables
L Kaiser, S Bengio, A Roy, A Vaswani, N Parmar, J Uszkoreit, N Shazeer
International Conference on Machine Learning, 2390-2399, 2018
142 · 2018
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
133 · 2017