Marten van Schijndel
fMRI reveals language-specific predictive coding during naturalistic sentence comprehension
C Shain, IA Blank, M van Schijndel, W Schuler, E Fedorenko
Neuropsychologia 138, 107307, 2020
Cited by 85
Quantity doesn't buy quality syntax with neural language models
M van Schijndel, A Mueller, T Linzen
Proceedings of EMNLP 2019, 2019
Cited by 64
A model of language processing as hierarchic sequential prediction
M van Schijndel, A Exley, W Schuler
Topics in Cognitive Science 5 (3), 522-540, 2013
Cited by 60
A Neural Model of Adaptation in Reading
M van Schijndel, T Linzen
Proceedings of EMNLP 2018, 2018
Cited by 59
Salience and attention in surprisal-based accounts of language processing
A Zarcone, M van Schijndel, J Vogels, V Demberg
Frontiers in Psychology 7, 844, 2016
Cited by 52
Modeling garden path effects without explicit hierarchical syntax
M van Schijndel, T Linzen
Proceedings of CogSci 2018, 2018
Cited by 51
Hierarchic syntax improves reading time prediction
M van Schijndel, W Schuler
Proceedings of the 2015 conference of the North American chapter of the …, 2015
Cited by 43
Memory access during incremental sentence processing causes reading time latency
C Shain, M van Schijndel, R Futrell, E Gibson, W Schuler
Proceedings of the workshop on computational linguistics for linguistic …, 2016
Cited by 41
Accurate Unbounded Dependency Recovery using Generalized Categorial Grammars
L Nguyen, M van Schijndel, W Schuler
Proceedings of COLING, 2125-2140, 2012
Cited by 41
Investigating locality effects and surprisal in written English syntactic choice phenomena
R Rajkumar, M van Schijndel, M White, W Schuler
Cognition 155, 204-232, 2016
Cited by 37
Using priming to uncover the organization of syntactic representations in neural language models
G Prasad, M van Schijndel, T Linzen
Proceedings of the 23rd Conference on Computational Natural Language Learning, 2019
Cited by 34
An Analysis of Frequency- and Memory-Based Processing Costs
M van Schijndel, W Schuler
Proceedings of NAACL-HLT, 95-105, 2013
Cited by 29
Single‐stage prediction models do not explain the magnitude of syntactic disambiguation difficulty
M van Schijndel, T Linzen
Cognitive Science 45 (6), e12988, 2021
Cited by 24
All bark and no bite: Rogue dimensions in transformer language models obscure representational quality
W Timkey, M van Schijndel
arXiv preprint arXiv:2109.04404, 2021
Cited by 23
Recurrent neural network language models always learn English-like relative clause attachment
F Davis, M van Schijndel
Proceedings of ACL 2020, 2020
Cited by 17
Frequency effects in the processing of unbounded dependencies
M van Schijndel, W Schuler, PW Culicover
Annual Meeting of the Cognitive Science Society (CogSci), 2014
Cited by 15
An analysis of memory-based processing costs using incremental deep syntactic dependency parsing
M van Schijndel, L Nguyen, W Schuler
Workshop on Cognitive Modeling and Computational Linguistics (CMCL), 2013
Cited by 15
Discourse structure interacts with reference but not syntax in neural language models
F Davis, M van Schijndel
arXiv preprint arXiv:2010.04887, 2020
Cited by 11
Evidence of syntactic working memory usage in MEG data
M van Schijndel, B Murphy, W Schuler
Workshop on Cognitive Modeling and Computational Linguistics (CMCL), 2015
Cited by 9
Can entropy explain successor surprisal effects in reading?
M van Schijndel, T Linzen
arXiv preprint arXiv:1810.11481, 2018
Cited by 8