Music and cross-modal associations (FRAMES)

Exploring how people with different cultural and musical backgrounds associate music with images, shapes, colours, emotions, and words.
[Image: painted piano keyboard]
Core fields of research
Languages, culture and society
Research areas
Mind, body and emotions
Department
Department of Music, Art and Culture Studies
Faculty
Faculty of Humanities and Social Sciences
Funding
Research Council of Finland
This project is part of the Research Council of Finland’s Centre of Excellence in Music, Mind, Body & Brain (2022–2029) and the FRAMES project (2025–2029: Fine-gRaining Audiovisual cross-Modal corrESpondences).

Project description

In this line of research, we explore the intriguing world of cross-modal associations involving auditory stimuli by examining how people from different cultures perceive connections between music and images, shapes, colours, emotions, and words. Participants from across the world will help us understand how musical features, such as timbre and melodic contour, are associated with non-auditory features.

This research could provide insights into how universal elements of music perception, as well as cultural and musical background, shape our experience of music. By delving into the intricate relationships between sound, images, and language, the project seeks to enhance our understanding of human perception and cultural diversity in the realm of music.

In June 2025, the Research Council of Finland funded FRAMES: Fine-gRaining Audiovisual cross-Modal corrESpondences, a project based on this research.

From Music and Cross-Modal Associations to FRAMES

A New Chapter Begins!

The work carried out in Music and Cross-Modal Associations laid the foundation for a new project funded by the Research Council of Finland. Building on the same research themes and experimental approaches, the new four-year project - FRAMES (Fine-gRaining Audiovisual cross-Modal corrESpondences; 9/2025–8/2029; PI: Dr Alessandro Ansani; funds granted: €443,717) - has been awarded an Academy Research Fellowship grant.

FRAMES will investigate cross-modal correspondences, namely, associations between stimuli or features across different sensory modalities, such as sight and sound. It will combine online psychophysical studies (WP1) with lab-based eye-tracking experiments (WP2) to gather data. FRAMES will generate substantial datasets, which will be openly shared to support transparency and reproducibility. To enhance reliability and accuracy, and to communicate the results more intuitively, FRAMES will rely on Bayesian statistical methods and inference. The project also aims to develop an R package to help other researchers analyse similar data more easily (WP3). The ambitious goal is to create a detailed map of how different sensory features are connected (WP4), which will help us understand the underlying mechanisms of these associations and their evolutionary significance. The expected results include a more systematic categorisation of cross-modal correspondences, insights into implicit and explicit attentional mechanisms, and the development of tools to facilitate research in this field.

FRAMES benefits from two invaluable collaborations: one with the Italian National Research Council’s Institute of Cognitive Sciences and Technologies (ISTC-CNR), through Dr Nicola Di Stefano, and another with the University of Oxford, through Prof Charles Spence.

Interested in joining FRAMES?

Upcoming Positions

FRAMES is exploring the recruitment of up to two new team members. We outline below four potential profiles that would be valuable for the project. Candidates who believe they fit one or more of these descriptions are encouraged to get in touch with the PI to discuss potential opportunities. Although the project anticipates roles suitable for two postdoctoral researchers (one year each), the exact roles and contract types remain flexible and will depend on candidates’ expertise, project needs, and institutional hiring procedures.

Profile A (Technical Profile: WP1-2 - Online Experiment Implementation): The first profile is expected to start in 2026. The main responsibility will be the implementation of the experimental paradigms for WP1–2. The ideal candidate has advanced expertise in JavaScript, including experience with jsPsych, JATOS, or comparable frameworks for developing browser‑based behavioural experiments. Their skill set should cover the full workflow from experiment programming (HTML/CSS/JavaScript) to online data collection, JSON‑based data handling, and basic preprocessing. Experience with Python or R for preliminary data cleaning is considered an asset. All experimental tasks involve the randomised presentation of both static and dynamic stimuli, including images, videos, and musical excerpts.
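
To give a flavour of the kind of workflow this profile covers, here is a minimal, self-contained sketch of randomised stimulus ordering and JSON-based data handling in plain JavaScript. It is purely illustrative, not project code: the stimulus IDs and file paths are hypothetical, and an actual FRAMES experiment would be built with jsPsych or a comparable framework hosted on JATOS.

```javascript
// Illustrative sketch only: hypothetical stimulus list mixing static
// and dynamic items (images, videos, musical excerpts).
const stimuli = [
  { id: "img_01", type: "image", file: "shapes/circle.png" },
  { id: "vid_01", type: "video", file: "clips/contour_rise.mp4" },
  { id: "aud_01", type: "audio", file: "excerpts/timbre_flute.mp3" },
];

// Fisher-Yates shuffle: returns a new array in which every
// presentation order is equally likely.
function shuffle(items) {
  const a = items.slice();
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}

// Attach a 1-based presentation index to each trial and serialise
// the result as JSON, mirroring the JSON-based data handling that
// online platforms typically return after a session.
function buildTrialRecord(order) {
  return JSON.stringify(order.map((stim, idx) => ({ trial: idx + 1, ...stim })));
}

const record = buildTrialRecord(shuffle(stimuli));
console.log(record);
```

In a real experiment the shuffled timeline would be handed to the presentation framework, and the serialised record would be uploaded to the server alongside participants’ responses for later preprocessing in Python or R.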

Profile B (Technical Profile: WP2 - Eye-tracking): The second profile will focus on WP2, with primary responsibility for the setup, execution, and management of an eye‑tracking data‑collection pipeline. The project includes the acquisition of a table‑based eye‑tracking system, for which the researcher will oversee hardware and software setup, calibration procedures, experimental integration, and participant management. The role involves running eye‑tracking sessions, ensuring high‑quality data acquisition, and conducting post‑collection data organisation and preprocessing (but not statistical analysis). Prior experience with eye‑tracking systems, stimulus presentation software, and data‑quality monitoring is required.

Profile C (Technical Profile: WP3 - R package implementation): The third position is expected to start in 2027–2028 to cover the development phase of WP3. The ideal candidate has advanced expertise in R, particularly in package development, including proficiency in designing efficient, modular, and user-friendly code. A strong background in multivariate statistical methods and data visualisation is essential, with a proven ability to translate complex analyses into clear and interpretable outputs. Familiarity with version control systems (e.g., Git), open-source software practices, and reproducibility standards is required. Additionally, experience in software testing, documentation, and dissemination, as well as excellent communication and collaborative skills, will be critical.

Profile D (Psychology Profile: WP4 - Literature Review): The fourth position will start in 2028 to take care of the literature review phase (WP4), which will include both qualitative and quantitative approaches. The ideal candidate should have a solid background in cognitive psychology, with a specific focus on cross-modal correspondences (CCs). Prior experience in conducting and publishing reviews (such as systematic reviews, meta-analyses, or theoretical papers) on related topics will be especially valuable. A broader interdisciplinary perspective - encompassing philosophical dimensions of perception and cognition - would be a significant asset. The candidate must exhibit strong analytical and organisational skills, excellence in academic writing, and the ability to synthesise complex, multifaceted bodies of literature into coherent and impactful reviews.

Publications & Forthcoming Papers

Read more about our research and upcoming work!

The first publication related to this research (pre-FRAMES) can be found here (open access):

Di Stefano, N.*, Ansani, A., Schiavio, A.*, & Spence, C. (2024). Prokofiev was (almost) right: A cross-cultural investigation of auditory-conceptual associations in Peter and the Wolf. Psychonomic Bulletin & Review, 31, 1735–1744. https://doi.org/10.3758/s13423-023-02435-7

The second publication can be found here (open access):

Di Stefano, N., Ansani, A., Schiavio, A., Saarikallio, S., & Spence, C. (2025). Audiovisual Associations in Saint-Saëns’ Carnival of the Animals: A Cross-Cultural Investigation on the Role of Timbre. Empirical Studies of the Arts, 43(2), 1162–1180. https://doi.org/10.1177/02762374241308810

And here's the third (open access):

Di Stefano, N., Ansani, A., Focaroli, V., Borsella, R., Formenti, G., Velardi, A., Schiavio, A., & Spence, C. (2026). Auditory-conceptual associations in Peter and the Wolf and Carnival of the Animals: Evidence from 6–9-year-old children. Psychonomic Bulletin & Review, 33(1). https://doi.org/10.3758/s13423-025-02804-4

We have also published (open access) in collaboration with Dr. R. D. Wanke (Centre for Interdisciplinary Studies; University of Coimbra, Portugal):

Wanke, R., Ansani, A., Di Stefano, N., & Spence, C. (2025). Exploring auditory morphodynamics: Audiovisual associations in sound-based music. i-Perception, 16(4). https://doi.org/10.1177/20416695251338718  

Furthermore, four articles are currently under review. In the first, we attempted to cross-culturally map the 12 musical intervals onto several extra-auditory dimensions (e.g., gustatory, spatial and physical, emotional/affective, tactile, colour, and shape attributes). In the second, we explored the cross-modal associations of the same compositions in children with Williams Syndrome. In the third, physical properties of static stimuli were successfully mapped onto auditory features. Lastly, in the fourth, we explore associations between trait creativity and increased association scores in an audiovisual task.

  • Ansani, A.*, Di Stefano, N.*, Schiavio, A., Saarikallio, S., Toiviainen, P., Brattico, E., & Spence, C. (in review). Consonance and dissonance shape the multisensory and emotional mappings of musical intervals across Western and Eastern cultures.
  • Di Stefano, N., Ansani, A., Alfieri, P., Schiavio, A., & Spence, C. (in review). Auditory-conceptual associations in individuals with Williams Syndrome.
  • Ansani, A.*, & Wanke, R.* (in review). Audiovisual Correspondences Through Morphodynamics: Mapping Auditory Spectrotemporal Patterns to Material Properties.
  • Schiavio, A., Ansani, A., Di Stefano, N., Kempf, A., Spence, C., & Benedek, M. (in review). Higher creativity is linked to increased association scores in an audiovisual task.

Note. Equal contributions are indicated with an asterisk (*).

Research group

Project team