Colloquium credits

Presentation Master's thesis - Nitya Shah - Psychological Research Methods

Last modified on 12-07-2024 13:59
Social Reasoning in Humans and Language-and-Vision Models (VLMs)
Start date
15-07-2024 09:30
End date
15-07-2024 10:00
Location

Roeterseilandcampus - Building A, Street: Roetersstraat 11, Room: A2.06

State-of-the-art large language models (LLMs) appear to have formal linguistic competence (i.e., command of the rules of language and grammar), but lack functional linguistic competence, which includes, among other abilities, formal and social reasoning. Functional linguistic competence is necessary for using language and engaging with the world. Since human interaction takes place in multimodal settings (i.e., with visual, auditory and textual input), computational models that leverage a key modality of human perception, vision, could be advantageous. Given the lack of research in this field, we examine whether language-and-vision models (VLMs) exhibit social reasoning. In particular, we focus on Theory of Mind (ToM), which we test using the Edinburgh Social Cognition Test (ESCoT) (Baksh et al., 2018). Both human participants and the state-of-the-art VLM LLaVA were tested on ToM using the ESCoT. To examine which modalities facilitate social reasoning, we placed human participants in three conditions: (1) an image-only version of the ESCoT task, (2) image plus textual description, and (3) textual description only. LLaVA received the task under the same three conditions. The results of this experiment and their implications for our understanding of human cognition and of AI's cognitive capabilities will be discussed.
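
As a purely illustrative sketch of how the three presentation conditions could be posed to an off-the-shelf LLaVA checkpoint, the Python snippet below uses the Hugging Face transformers API. The model ID, prompt wording, example item and helper name are assumptions made for illustration and do not reflect the authors' actual ESCoT materials or evaluation pipeline.

from typing import Optional
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

# Illustrative only: an assumed public checkpoint, not necessarily the one used in the thesis.
MODEL_ID = "llava-hf/llava-1.5-7b-hf"
processor = AutoProcessor.from_pretrained(MODEL_ID)
model = LlavaForConditionalGeneration.from_pretrained(MODEL_ID)

def ask_llava(question: str,
              image: Optional[Image.Image] = None,
              description: str = "") -> str:
    """Query LLaVA under one of three conditions:
    (1) image only, (2) image + textual description, (3) text only."""
    if image is not None:
        prompt = f"USER: <image>\n{description}\n{question} ASSISTANT:"
        inputs = processor(text=prompt, images=image, return_tensors="pt")
    else:
        prompt = f"USER: {description}\n{question} ASSISTANT:"
        inputs = processor(text=prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=128)
    return processor.decode(output_ids[0], skip_special_tokens=True)

# Hypothetical ESCoT-style item (not an actual test item):
question = "Why did the person act this way, and was it socially appropriate?"
scene = Image.open("escot_scene.png")  # placeholder image file
text = "A commuter pretends not to notice an elderly passenger left standing."

answer_image_only = ask_llava(question, image=scene)                    # condition 1
answer_image_text = ask_llava(question, image=scene, description=text)  # condition 2
answer_text_only = ask_llava(question, description=text)                # condition 3

Each call differs only in which inputs (image, textual description, or both) accompany the same ToM question, mirroring the three-condition manipulation described in the abstract.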