I am an MSc student in Neuroscience at Bilkent University investigating computational approaches to neural representation and visual processing. My research spans computational neuroscience and human-robot interaction, with a focus on machine learning for neural data and embodied AI.
My work in human-robot interaction (HRI) focuses on LLM-powered social robots for real-time, multimodal interaction. I study how task-level mechanistic transparency, in which the robot displays its live reasoning on its tablet, affects trust and perceived reliability. I also investigate how embodiment factors such as physical presence, vision, and gesture shape trust in competitive interactions with the Pepper robot.
My computational neuroscience research focuses on how the brain represents and processes information, with emphasis on visual perception, semantic encoding, action representation, and natural stimulus reconstruction. I use high-dimensional fMRI analysis and machine learning methods such as representational similarity analysis (RSA) and multivariate pattern analysis (MVPA) to study category representations, action signals, and perceptual similarity in visual cortex.
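To give a flavor of the RSA approach mentioned above, here is a minimal illustrative sketch using synthetic toy data (the data, region names, and parameters are hypothetical, not from my actual analysis pipeline): each region's responses are summarized as a representational dissimilarity matrix (RDM) of pairwise correlation distances between stimulus patterns, and two regions' geometries are compared by rank-correlating their RDMs.

```python
# Illustrative RSA sketch on synthetic data; all values are toy assumptions.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Toy "voxel" responses: 8 stimuli x 50 voxels for two hypothetical regions.
region_a = rng.normal(size=(8, 50))
region_b = region_a + rng.normal(scale=0.5, size=(8, 50))  # noisy copy of A

# An RDM is the vector of pairwise correlation distances between
# the stimulus response patterns within one region.
rdm_a = pdist(region_a, metric="correlation")
rdm_b = pdist(region_b, metric="correlation")

# Representational geometries are compared with a rank correlation,
# which is insensitive to monotonic differences in distance scale.
rho, _ = spearmanr(rdm_a, rdm_b)
print(f"RDM similarity (Spearman rho): {rho:.2f}")
```

Because only the rank order of dissimilarities matters, this comparison works across measurement modalities (fMRI, model activations, behavior) without assuming a common distance scale.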
For the last 2 weeks, I’ve been trying to gather and synthetically generate data to use in fine-tuning and implementing...
As the admissions period in Türkiye draws to a close, I would like to share my own 2 cents on...