
Thursday, September 28, 2023

Prof. Himanshu Yadav: Potsdam -> IIT Kanpur

Himanshu Yadav ended his 3+ year stay in my lab today and will start a tenure-track assistant professorship in the Department of Cognitive Science at the Indian Institute of Technology, Kanpur, India. He is the eighth person from my lab to become a professor.


 

Tuesday, March 21, 2023

Himanshu Yadav, PhD

Today Himanshu defended his dissertation, which consists of three published papers.

1. Himanshu Yadav, Garrett Smith, Sebastian Reich, and Shravan Vasishth. Number feature distortion modulates cue-based retrieval in reading. Journal of Memory and Language, 129, 2023.

2. Himanshu Yadav, Dario Paape, Garrett Smith, Brian W. Dillon, and Shravan Vasishth. Individual differences in cue weighting in sentence comprehension: An evaluation using Approximate Bayesian Computation. Open Mind, 2022.

3. Himanshu Yadav, Garrett Smith, Daniela Mertzen, Ralf Engbert, and Shravan Vasishth. Proceedings of the Annual Meeting of the Cognitive Science Society, 44, 2022.

Congratulations to Himanshu for his truly outstanding work!



Saturday, March 11, 2023

New paper: SEAM: An Integrated Activation-Coupled Model of Sentence Processing and Eye Movements in Reading.

Michael Betancourt, a giant in the field of Bayesian statistical modeling, once indirectly pointed out to me (in a podcast interview) that one should not try to model latent cognitive processes in reading by computing summary statistics like the mean difference between conditions and then fitting the model on those summary statistics. But that is exactly what we do in psycholinguistics. Most models (including those from my lab) evaluate model performance on summary statistics from the data (usually, a mean difference), abstracting away quite dramatically from the complex processes that resulted in those reading times and regressive eye movements. 
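To make concrete what "fitting on summary statistics" looks like, here is a toy sketch in Python; the data and the model prediction are entirely made up and are not from any paper discussed here. Trial-level reading times are collapsed into a single mean difference between two conditions, and a model is then judged by whether its predicted difference matches that one number.

```python
# Toy illustration (hypothetical data) of the summary-statistic approach:
# collapse trial-level reading times into one mean difference and compare
# that single number against a model's predicted difference.
import numpy as np

rng = np.random.default_rng(1)

# hypothetical trial-level reading times (ms) for two conditions
rt_high_interference = rng.lognormal(mean=5.9, sigma=0.3, size=200)
rt_low_interference = rng.lognormal(mean=5.8, sigma=0.3, size=200)

observed_effect = rt_high_interference.mean() - rt_low_interference.mean()

# a model's predicted effect (here just a placeholder number) is compared
# against the observed mean difference, discarding all trial-level
# structure (fixation sequences, regressions, etc.)
predicted_effect = 30.0  # ms, hypothetical model prediction
print(f"observed effect: {observed_effect:.1f} ms, "
      f"predicted effect: {predicted_effect:.1f} ms")
```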

What Michael wanted instead was a detailed process model of how the observed fixations and eye movement patterns arise. Obviously, such a model would be extremely complicated, because one would have to specify the full details of oculomotor processes and their impact on eye movements, as well as a model of language comprehension, and then specify how these components interact to produce eye movements at the single-trial level. Such a model quickly becomes computationally intractable if one tries to estimate its parameters from data, and that is a major barrier to building it.

Interestingly, both eye movement control models and models of sentence comprehension exist. But these live in parallel universes. Psychologists have almost always focused on eye movement control, ignoring the impact of sentence comprehension processes (I once heard a talk by a psychologist who publicly called out psycholinguists, labeling them as "crazy" for studying language processing in reading :). Similarly, most psycholinguists simply ignore the lower-level processes unfolding in reading, and assume that language processing events are responsible for differences in fixation durations or in leftward eye movements (regressions). The most that psycholinguists like me are willing to do is add word frequency etc. as a co-predictor of reading time or other dependent measures when investigating reading. But in most cases even that would go too far :).

What is missing is a model that brings these two lines of work together into one integrated reading model, in which oculomotor and sentence comprehension processes jointly determine where we move our eyes and for how long.

Max Rabe, who is wrapping up his PhD work in psychology at Potsdam in Germany, demonstrates how this could be done: he takes a fully specified model of eye movement control in reading (SWIFT) and integrates into it linguistic dependency completion processes, following the principles of the cognitive architecture ACT-R. One key achievement is that the activation of a word being read is co-determined by both oculomotor processes, as specified in SWIFT, and cue-based retrieval processes, as specified in the activation-based model of retrieval. Another is to show how regressive eye movements are triggered when sentence processing difficulty (here, similarity-based interference) arises during reading.
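To give a rough flavour of the cue-based retrieval component, here is a toy sketch using the standard ACT-R activation equation (base-level activation plus spreading activation from matching retrieval cues). The way it is combined with a lexical, frequency-driven term below is a made-up simplification for illustration only, not SEAM's actual coupling of SWIFT and retrieval processes.

```python
# Toy sketch of ACT-R-style cue-based retrieval activation; the coupling
# with a lexical term is hypothetical and much simpler than in SEAM.
import math

def retrieval_activation(base_level, cue_weights, cue_strengths):
    """ACT-R-style activation: A_i = B_i + sum_j W_j * S_ji."""
    return base_level + sum(w * s for w, s in zip(cue_weights, cue_strengths))

def toy_word_activation(log_frequency, base_level, cue_weights, cue_strengths):
    # made-up combination of a frequency-driven lexical term and the
    # retrieval activation (SEAM's actual coupling is more complex)
    return log_frequency + retrieval_activation(base_level, cue_weights, cue_strengths)

# the target matches both retrieval cues; the distractor matches only one,
# so it receives less spreading activation but still competes for
# retrieval (similarity-based interference)
target = toy_word_activation(log_frequency=2.0, base_level=0.5,
                             cue_weights=[1.0, 1.0], cue_strengths=[1.5, 1.5])
distractor = toy_word_activation(log_frequency=2.5, base_level=0.3,
                                 cue_weights=[1.0, 1.0], cue_strengths=[1.5, 0.0])

def retrieval_latency(activation, latency_factor=0.2):
    """ACT-R retrieval latency T = F * exp(-A): higher activation, faster retrieval."""
    return latency_factor * math.exp(-activation)

print(f"target:     activation {target:.2f}, latency {retrieval_latency(target):.3f} s")
print(f"distractor: activation {distractor:.2f}, latency {retrieval_latency(distractor):.3f} s")
```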

What made the model fitting possible was Bayesian parameter estimation: Max Rabe shows in an earlier (2021) Psychological Review paper (preprint here) how parameter estimation can be carried out in complex models where the likelihood function may not be easy to work out.
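To illustrate the general idea of likelihood-free Bayesian parameter estimation, here is a minimal rejection-ABC sketch; the 2021 paper's actual machinery is different and far more sophisticated, and the simulator below is a made-up stand-in for a complex model whose likelihood is not available in closed form.

```python
# Generic rejection-ABC sketch (not the method used in the paper):
# simulate from the model and keep parameter values whose simulated data
# lie close to the observed data.
import numpy as np

rng = np.random.default_rng(7)

def simulate_reading_times(mu, n=100):
    # hypothetical stand-in for a complex simulator (e.g., an
    # eye-movement model) with no closed-form likelihood
    return rng.lognormal(mean=mu, sigma=0.3, size=n)

observed = simulate_reading_times(mu=5.9)   # pretend these are the data
obs_summary = np.mean(np.log(observed))

posterior_samples = []
for _ in range(20000):
    mu_candidate = rng.uniform(5.0, 7.0)    # draw from the prior
    simulated = simulate_reading_times(mu_candidate)
    sim_summary = np.mean(np.log(simulated))
    if abs(sim_summary - obs_summary) < 0.01:   # accept if close enough
        posterior_samples.append(mu_candidate)

print(f"accepted {len(posterior_samples)} draws; "
      f"posterior mean of mu approx. {np.mean(posterior_samples):.2f}")
```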

 Download the paper from arXiv.




Tuesday, March 07, 2023

Job opening: Postdoc position, starting 1 Oct 2023 (Vasishth lab, University of Potsdam, Germany)

I am looking for a postdoc to work on sentence processing (psycholinguistics); the position is at the TV-L 13 salary level. This is a teaching + research position in my lab (vasishth.github.io) at the University of Potsdam, Germany. The planned start date is 1 October 2023, and the initial appointment (following a six-month probationary period) is for three years; this can be extended following a positive evaluation.

Principal tasks:

- Teaching two 90-minute classes to undergraduate and graduate students every semester. We teach courses on frequentist and Bayesian statistics, the foundations of mathematics for non-STEM students entering an MSc program in the linguistics department, psycholinguistics (reviews of current research), and introductions to psycholinguistics and experimental methodology.

- Carrying out and publishing research on sentence processing (computational modeling and/or experimental work, e.g., eye-tracking, ERP, self-paced reading). For examples of our research, see: https://vasishth.github.io/publications.html.

- Participation in lab discussions and research collaborations.

Qualifications that you should have:

- A PhD in linguistics, psychology, or some related discipline. In exceptional circumstances, I will consider a prospective PhD student (with a full postdoc salary) who is willing to teach as well as do a PhD with me.

- Published scientific work.

- A background in sentence comprehension research (modeling or experimental or both).

- A solid quantitative background (basic fluency in mathematics and statistical computing at the level needed for statistical modeling and data analysis in psycholinguistics).

An ability to teach in German is desirable but not necessary. A high level of English fluency is expected, especially in writing.

The University and the linguistics department:

The University of Potsdam's linguistics department is located in Golm, a suburb of the city of Potsdam, which can be reached from Berlin in about 40 minutes by direct train. The department has a broad focus covering almost all areas of linguistics (syntax, morphology, semantics, phonetics/phonology, language acquisition, sentence comprehension and production, computational linguistics). The research is highly interdisciplinary, involving collaborations with psychology and mathematics, among other areas. We are a well-funded lab, with projects in a collaborative research center grant (SFB 1287) on variability, as well as individual grants.

The research focus of my lab:

Our lab currently consists of six postdocs (see here) and three guest professors who work closely with lab members. We work mostly on models of sentence comprehension, developing implemented computational models and carrying out experimental work to evaluate them.

Historically, our postdocs have been very successful in getting professorships: Lena Jäger, Sol Lago, Titus von der Malsburg, Daniel Schad, João Veríssimo, Samar Husain, Bruno Nicenboim. One of our graduates, Felix Engelmann, has his own start-up in Berlin.

For representative recent work from our lab, see:

Himanshu Yadav, Garrett Smith, Sebastian Reich, and Shravan Vasishth. Number feature distortion modulates cue-based retrieval in reading. Journal of Memory and Language, 129, 2023.

Shravan Vasishth and Felix Engelmann. Sentence Comprehension as a Cognitive Process: A Computational Approach. Cambridge University Press, Cambridge, UK, 2022.

Dario Paape and Shravan Vasishth. Estimating the true cost of garden-pathing: A computational model of latent cognitive processes. Cognitive Science, 46:e13186, 2022.

Daniel J. Schad, Bruno Nicenboim, Paul-Christian Bürkner, Michael Betancourt, and Shravan Vasishth. Workflow Techniques for the Robust Use of Bayes Factors. Psychological Methods, 2022.

Bruno Nicenboim, Shravan Vasishth, and Frank Rösler. Are words pre-activated probabilistically during sentence comprehension? Evidence from new data and a Bayesian random-effects meta-analysis using publicly available data. Neuropsychologia, 142, 2020.

How to apply:

To apply, please send me an email (vasishth@uni-potsdam.de) with the subject line "Postdoc position 2023", attaching a CV, a one-page statement of interest (research and teaching), copies of any publications (including the dissertation), and the names of two or three referees whom I can contact. The position remains open until filled, but I hope to make a decision by the end of July 2023 at the latest.


Monday, January 30, 2023

Introduction to Bayesian Data Analysis: Video lectures now available on YouTube

These recordings are part of a set of videos from the free four-week online course Introduction to Bayesian Data Analysis, taught on the openhpi.de portal.

Tuesday, October 04, 2022

Applications open: The Seventh Summer School on Statistical Methods for Linguistics and Psychology, 11-15 September 2023

Applications are open (till 1st April 2023) for the seventh summer school on statistical methods for linguistics and psychology, to be held in Potsdam, Germany.

Summer school website: https://vasishth.github.io/smlp2023/

Some of the highlights:

1. Four parallel courses on frequentist and Bayesian methods (introductory/intermediate and advanced)

2. A special short course on Bayesian meta-analysis by Dr. Robert Grant of bayescamp.

3. You can also do this free, completely online four-week course on Introduction to Bayesian Data Analysis (starts Jan 2023): https://open.hpi.de/courses/bayesian-statistics2023