Erin Grant
Senior Research Fellow, University College London
Verified email at ucl.ac.uk
Title · Cited by · Year
Recasting gradient-based meta-learning as hierarchical Bayes
E Grant, C Finn, S Levine, T Darrell, TL Griffiths
International Conference on Learning Representations (ICLR), 2018
Cited by 613 · 2018
Are convolutional neural networks or transformers more like human vision?
S Tuli, I Dasgupta, E Grant, TL Griffiths
Annual Meeting of the Cognitive Science Society (CogSci), 2021
Cited by 194 · 2021
Reconciling meta-learning and continual learning with online mixtures of tasks
G Jerfel*, E Grant*, TL Griffiths, K Heller
Advances in Neural Information Processing Systems (NeurIPS), 2019
Cited by 143* · 2019
Doing more with less: Meta-reasoning and meta-learning in humans and machines
TL Griffiths, F Callaway, MB Chang, E Grant, PM Krueger, F Lieder
Current Opinion in Behavioral Sciences 29, 24-30, 2019
Cited by 129 · 2019
Evaluating theory of mind in question answering
A Nematzadeh, K Burns, E Grant, A Gopnik, TL Griffiths
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018
Cited by 78 · 2018
Universal linguistic inductive biases via meta-learning
RT McCoy, E Grant, P Smolensky, TL Griffiths, T Linzen
Annual Meeting of the Cognitive Science Society (CogSci), 2020
Cited by 28 · 2020
Getting aligned on representational alignment
I Sucholutsky, L Muttenthaler, A Weller, A Peng, A Bobu, B Kim, BC Love, ...
arXiv preprint arXiv:2310.13018, 2023
Cited by 22 · 2023
How can memory-augmented neural networks pass a false-belief task?
E Grant, A Nematzadeh, TL Griffiths
Annual Meeting of the Cognitive Science Society (CogSci), 2017
Cited by 19 · 2017
Passive attention in artificial neural networks predicts human visual selectivity
TA Langlois, HC Zhao, E Grant, I Dasgupta, TL Griffiths, N Jacoby
Advances in Neural Information Processing Systems (NeurIPS), 2021
Cited by 17 · 2021
The transient nature of emergent in-context learning in transformers
AK Singh, SCY Chan, T Moskovitz, E Grant, AM Saxe, F Hill
Advances in Neural Information Processing Systems (NeurIPS), 2023
Cited by 14 · 2023
Distinguishing rule- and exemplar-based generalization in learning systems
I Dasgupta*, E Grant*, TL Griffiths
International Conference on Machine Learning (ICML), 2022
Cited by 8 · 2022
Exploiting attention to reveal shortcomings in memory models
K Burns, A Nematzadeh, E Grant, A Gopnik, TL Griffiths
EMNLP Workshop on BlackboxNLP: Analyzing and Interpreting Neural Networks …, 2018
Cited by 8 · 2018
Gaussian process surrogate models for neural networks
MY Li, E Grant, TL Griffiths
Conference on Uncertainty in Artificial Intelligence (UAI), 2023
Cited by 6* · 2023
Learning deep taxonomic priors for concept learning from few positive examples
E Grant, JC Peterson, TL Griffiths
Annual Meeting of the Cognitive Science Society (CogSci), 2019
Cited by 6 · 2019
A computational cognitive model of novel word generalization
A Nematzadeh, E Grant, S Stevenson
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2015
Cited by 6 · 2015
Predicting generalization with degrees of freedom in neural networks
E Grant, Y Wu
ICML 2022 2nd AI for Science Workshop, 2022
Cited by 4 · 2022
The emergence of gender associations in child language development
B Prystawski, E Grant, A Nematzadeh, SWS Lee, S Stevenson, Y Xu
Cognitive Science, 2022
Cited by 4 · 2022
Bayes in the age of intelligent machines
TL Griffiths, JQ Zhu, E Grant, RT McCoy
arXiv preprint arXiv:2311.10206, 2023
Cited by 3 · 2023
The interaction of memory and attention in novel word generalization: A computational investigation
E Grant, A Nematzadeh, S Stevenson
Annual Meeting of the Cognitive Science Society (CogSci), 2016
Cited by 2 · 2016
Tracing the emergence of gendered language in childhood
B Prystawski, E Grant, A Nematzadeh, SWS Lee, S Stevenson, Y Xu
Annual Meeting of the Cognitive Science Society (CogSci), 2020
Cited by 1 · 2020