Dr Jon Weinel


Programme Leader - Games Design; Senior Lecturer in Games Design
BA Hons, BSc Hons, MRes, PhD, MBCS, CITP, SFHEA

Academic and research departments

Music and Media.

About

Areas of specialism

Video game development (Unity/Unreal); Computer sound/music; Audio-visual composition and performance; Altered states of consciousness; Programming (C#); Practice-led research

Affiliations and memberships

Chartered IT Professional (CITP)
I am a Full Professional Member (MBCS) of the British Computer Society with Chartered IT Professional (CITP) status.

Supervision

Postgraduate research supervision

Publications

Jonathan P. Bowen, Jonathan Weinel, Ann Borda, Graham Diprose (2024) Proceedings of EVA London 2024, In: Proceedings of EVA London 2024: Electronic Visualisation and the Arts. BCS Learning and Development Ltd. - The Chartered Institute for IT

The Electronic Visualisation and the Arts London 2024 Conference (EVA London 2024) is co-sponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. As in recent years, EVA London 2024 is a physical and online "hybrid" conference. We continue to publish the proceedings both online, with open access via ScienceOpen, and in our traditional printed form, in full colour.

Jonathan P. Bowen, Jonathan Weinel, Graham Diprose (2023) Proceedings of EVA London 2023, In: Proceedings of EVA London 2023 (Electronic Visualisation and the Arts), BCS London, 10-14 July 2023. BCS Learning and Development Ltd.

The Electronic Visualisation and the Arts London 2023 Conference (EVA London 2023) is co-sponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. As in 2022, the EVA London 2023 Conference is a physical and online "hybrid" conference. We continue to publish the proceedings both online, with open access via ScienceOpen, and in our traditional printed form, in full colour. The main conference presentations run during 10-13 July 2023, with workshops and other activities, especially for students, on 14 July 2023.

Jonathan P. Bowen, Jonathan Weinel, Ann Borda, Graham Diprose (2022) EVA London 2022: Electronic Visualisation and the Arts. BCS Learning and Development Ltd.

The Electronic Visualisation and the Arts London 2022 Conference (EVA London 2022) is co-sponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. This has, of course, been a difficult time for all conferences, with the Covid-19 pandemic. For the first time since 2019, the EVA London 2022 Conference is a physical conference. It is also an online conference, as it was in the previous two years. We continue to publish the proceedings both online, with open access via ScienceOpen, and in our traditional printed form, for the second year in full colour.

Over recent decades, the EVA London Conference on Electronic Visualisation and the Arts has established itself as one of the United Kingdom's most innovative and interdisciplinary conferences. It brings together a wide range of research domains to celebrate a diverse set of interests, with a specialised focus on visualisation. The long and short papers in this volume cover varied topics concerning the arts, visualisations, and IT, including 3D graphics, animation, artificial intelligence, creativity, culture, design, digital art, ethics, heritage, literature, museums, music, philosophy, politics, publishing, social media, and virtual reality, as well as other related interdisciplinary areas.

The EVA London 2022 proceedings present a wide spectrum of papers, demonstrations, Research Workshop contributions, other workshops, and for the seventh year, the EVA London Symposium, in the form of an opening morning session with three invited contributors. The conference includes a number of other associated evening events, including ones organised by the Computer Arts Society, Art in Flux, and EVA International. As in previous years, there are Research Workshop contributions in this volume, aimed at encouraging participation by postgraduate students and early-career artists, accepted either through the peer-review process or directly by the Research Workshop chair. The Research Workshop contributors are offered bursaries to aid participation. In particular, EVA London liaises with Art in Flux, a London-based group of digital artists.

The EVA London 2022 proceedings include long papers and short "poster" papers from international researchers inside and outside academia, from graduate artists, PhD students, industry professionals, established scholars, and senior researchers, who value EVA London for its interdisciplinary community. The conference also features keynote talks. A special feature this year is support for Ukrainian culture, following the invasion of Ukraine earlier in the year. This publication has resulted from a selective peer-review process, fitting as many excellent submissions as possible into the proceedings. This year, submission numbers were lower than in previous years, most likely due to the pandemic and a new requirement to submit drafts of long papers for review as well as abstracts. It is still pleasing to have so many good proposals from which to select the papers that have been included.

EVA London is part of a larger network of EVA international conferences. EVA events have been held in Athens, Beijing, Berlin, Brussels, California, Cambridge (both UK and USA), Canberra, Copenhagen, Dallas, Delhi, Edinburgh, Florence, Gifu (Japan), Glasgow, Harvard, Jerusalem, Kiev, Laval, London, Madrid, Montreal, Moscow, New York, Paris, Prague, St Petersburg, Thessaloniki, and Warsaw. Further venues for EVA conferences are very much encouraged by the EVA community. As noted earlier, this volume is a record of accepted submissions to EVA London 2022. Associated online presentations are in general recorded and made available online after the conference.

Jonathan Weinel, Jonathan P. Bowen, Ann Borda, Graham Diprose (2021) EVA London (Electronic Visualisation and the Arts): Proceedings of EVA London 2021, In: AI and the Arts: Artificial Imagination. Proceedings of EVA London 2021 (EVA 2021), 5th-9th July 2021. British Computer Society

The Electronic Visualisation and the Arts London 2021 Conference (EVA London 2021) is co-sponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. This is, of course, a difficult time for all conferences, with the Covid-19 pandemic. As a result, the EVA London 2021 Conference is an online conference, as it was in the previous year. We continue to publish the proceedings both online, with open access via ScienceOpen, and in our traditional printed form.

Over recent decades, the EVA London Conference on Electronic Visualisation and the Arts has established itself as one of the United Kingdom's most innovative and interdisciplinary conferences. It brings together a wide range of research domains to celebrate a diverse set of interests, with a specialised focus on visualisation. The long and short papers in this volume cover varied topics concerning the arts, visualisations, and IT, including 3D graphics, animation, artificial intelligence, creativity, culture, design, digital art, ethics, heritage, literature, museums, music, philosophy, politics, publishing, social media, and virtual reality, as well as other related interdisciplinary areas.

The EVA London 2021 proceedings present a wide spectrum of papers, demonstrations, Research Workshop contributions, other workshops, and for the sixth year, the EVA London Symposium, in the form of an evening panel session with invited contributors. The conference includes a number of other associated online evening events, including ones organised by the Computer Arts Society, Art in Flux, and the Lumen Prize. A feature of EVA London, started in 2018, has been a Research in Education Day, immediately after the main conference, bringing together students and associated staff from universities in the London area to enable presentations and networking. In the circumstances, we now aim to hold this in conjunction with EVA London 2022. As in previous years, there are Research Workshop contributions in these proceedings, aimed at encouraging participation by postgraduate students and early-career artists, accepted either through the peer-review process or directly by the Research Workshop chair. The Research Workshop contributors are offered bursaries to aid participation. In particular, EVA London liaises with Art in Flux, a London-based group of digital artists.

The EVA London 2021 proceedings include long papers and short "poster" papers from international researchers inside and outside academia, from graduate artists, PhD students, industry professionals, established scholars, and senior researchers, who value EVA London for its interdisciplinary community. The conference also features keynote talks. This publication has resulted from a selective peer-review process, fitting as many excellent submissions as possible into the proceedings. This year, submission numbers were lower than in previous years, most likely due to the pandemic forcing the conference online. It is still pleasing to have so many good proposals from which to select the papers that have been included.

EVA London is part of a larger network of EVA international conferences. EVA events have been held in Athens, Beijing, Berlin, Brussels, California, Cambridge (both UK and USA), Canberra, Copenhagen, Dallas, Delhi, Edinburgh, Florence, Gifu (Japan), Glasgow, Harvard, Jerusalem, Kiev, Laval, London, Madrid, Montreal, Moscow, New York, Paris, Prague, St Petersburg, Thessaloniki, and Warsaw. Further venues for EVA conferences are very much encouraged by the EVA community.

Jonathan Weinel, Jonathan P. Bowen, Graham Diprose, Nick Lambert (2019) EVA London 2019: Electronic Visualisation & the Arts. Proceedings of a conference held in London, 8th-11th July 2019. British Computer Society (BCS), The Chartered Institute for IT

The Electronic Visualisation and the Arts London 2019 Conference (EVA London 2019) is co-sponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. 2019 marks the 50th anniversary of Event One, an early digital art exhibition held at the Royal College of Art in 1969.

Jonathan P. Bowen, Jonathan Weinel, Graham Diprose, Nick Lambert (2018) EVA London (Electronic Visualisation and the Arts): Proceedings of EVA London 2018. British Computer Society (BCS), The Chartered Institute for IT

The Electronic Visualisation and the Arts London 2018 Conference (EVA London 2018) is co-sponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. 2018 marks the 50th anniversary of CAS, which is celebrated as part of EVA London 2018.

Tyler Howard McIntosh, Jonathan Weinel, Stuart Cunningham (2022) Lundheim: exploring affective audio techniques in an action-adventure video game, In: ACM Conference Committee (eds.), AM ’22: Proceedings of the 17th International Audio Mostly Conference, St. Pölten, Austria, 6-9 September 2022. Association for Computing Machinery (ACM)

This paper discusses Lundheim, a video game prototype made in Unity that incorporates interactive mechanisms based on affective computing techniques, which are used to control audio-visual aspects of the game. The project is based on a fictitious Old Norse realm named 'Lundheim', a place where emotions are woven into the fabric of reality. The game utilises Russell's circumplex model of affect, providing four runes which correspond with different sections of the circumplex model. The player must activate each rune by entering the corresponding emotion state, which is captured using a consumer-grade Interaxon Muse electroencephalograph (EEG) headband. Activating each emotion triggers particle effects and corresponding sonic materials including interactive music, which are implemented with the Wwise video game audio middleware software. The project thereby demonstrates a novel implementation of affective technologies and sound in a video game, contributing towards discourses in this area of research.
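
The paper itself does not publish source code, but the quadrant mapping it describes can be illustrated with a short C# sketch. The rune names, the value ranges, and the assumption that EEG readings have already been reduced to normalised valence and arousal values are hypothetical, included here only to show how a reading on Russell's circumplex model might select one of four runes.

```csharp
using System;

// Illustrative sketch only, not the Lundheim implementation. Assumes valence and
// arousal have been normalised to [-1, 1], e.g. derived from EEG band power.
public enum Rune { Joy, Anger, Sadness, Calm }   // hypothetical rune names

public static class CircumplexMapper
{
    // Map a (valence, arousal) reading to one quadrant of Russell's circumplex model.
    public static Rune ToRune(float valence, float arousal)
    {
        if (valence >= 0f && arousal >= 0f) return Rune.Joy;     // positive valence, high arousal
        if (valence <  0f && arousal >= 0f) return Rune.Anger;   // negative valence, high arousal
        if (valence <  0f && arousal <  0f) return Rune.Sadness; // negative valence, low arousal
        return Rune.Calm;                                        // positive valence, low arousal
    }

    public static void Main()
    {
        Console.WriteLine(ToRune(0.6f, 0.3f));   // Joy
        Console.WriteLine(ToRune(-0.4f, -0.7f)); // Sadness
    }
}
```

In the game as described above, the active quadrant would then drive the particle effects and the interactive music implemented in Wwise.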

Anne Tsjornaja, Jonathan Weinel, Martyn Broadhead (2022) Art(NET)work: visualising interconnected artwork data in VR, In: Proceedings of EVA London 2022 (Electronic Visualisation & the Arts), 4th-8th July 2022. BCS Learning and Development Ltd.

As technology progresses, VR is becoming more widespread and finding uses outside of gaming. One of these applications is virtual museums (Alatrash et al. 2021). As it has been shown that active participation with art pieces plays an important role in audience experience (Passebois Ducros & Euzéby 2021), virtual museums are gaining popularity all over the world. This makes VR a great tool that can potentially increase audience engagement with art and extend its outreach. Museums all over the world have launched virtual tours, allowing audiences to wander through representations of gallery spaces and take a closer look at famous art pieces. In addition to digital representations of real museums, there are also completely digital museums that are accessible only through VR. However, these completely virtual museums rarely take advantage of their freedom from the restrictions of the physical world. Harnessing the potential of VR in this domain, we present Art(NET)work: an application able to visualise data in an interactive and immersive manner that can provide a much richer and more intuitive understanding of the data.

Jonathan Weinel (2022) Book launch - Explosions in the mind: composing psychedelic sounds and visualisations, In: Proceedings of EVA London 2022 (Electronic Visualisation & the Arts), 4th-8th July 2022. BCS Learning and Development Ltd.

Explosions in the Mind: Composing Psychedelic Sounds and Visualisations (Weinel, 2021) is a new book exploring more than a decade of the author's practice-led research composing sound and visualisations based on altered states of consciousness such as hallucinations and experiences of synaesthesia (Figure 1). The book is part of the Palgrave Studies in Sound series, edited by Prof. Mark Grimshaw-Aagaard.

Gabriela Maria Pyjas, Jonathan Weinel, Martyn Broadhead (2022) Storytelling and VR: inducing emotions through AI characters, In: Proceedings of EVA London 2022 (Electronic Visualisation & the Arts), 4th-8th July 2022. BCS Learning and Development Ltd.

Recent forms of virtual reality (VR) have changed the way people interact with moving images in the entertainment and games industries, as well as the way content is created. Technological advances in VR have given an opportunity to create simulated environments that users can immerse themselves in, and sense almost as a real experience, by combining film techniques and interactive media approaches. Storytelling in VR presents various challenges due to the spatial properties of the medium. Research suggests that engaging Non-Player Characters (NPCs) enhance storytelling and can do so by communicating emotions. Most VR war experiences use the concept of morale and emotions applied to a group of soldiers or individual characters. To address the need for more believable AI characters in VR, this project will investigate how emotions can be communicated more effectively in a VR war application. VR companies are increasingly using Artificial Intelligence (AI) and cloud technologies to develop a stronger ecosystem for NPCs. However, there are still a significant number of limitations in terms of technology and immersive storytelling for VR, with characters and props playing a significant role in creating convincing VR experiences. This project will therefore aim to enhance storytelling in VR by inducing emotions through AI characters in a war environment inspired by realistic events from WWII.

Jonathan Weinel (2021) Worship the penguin: adventures with sprites, chiptunes, and lasers, In: EVA London (Electronic Visualisation and the Arts): Proceedings of EVA London 2021. British Computer Society (BCS)

This paper provides a review of recent projects developed through the author's creative practice and activities across multiple computing and games technology platforms. These include: a 2D game project made in Unity; an Arduino-based laser puzzle; chiptune breakbeat music made on a Commodore 64; the archival of a collection of Amiga demoscene disks; PETSCII graphics; a controller adapter for the Amiga; and a DJ/VJ performance. While playfully exploring new trajectories, these projects broadly reflect ongoing themes present in the author's previous work, such as explorations of the aesthetic paradigms presented by vintage computers, 1990s rave culture, and synaesthesia. The paper will address the various challenges and methodologies used to realise these projects; pedagogical considerations; and the pandemic context in which they have been created and presented.

Jonathan Weinel, Jonathan P. Bowen, G. Diprose, N. Lambert (2019) Virtual hallucinations: projects in VJing, virtual reality and cyberculture, In: EVA London 2019 (Electronic Visualisation and the Arts). BCS, The Chartered Institute for IT

This paper discusses a variety of the author's artistic projects exploring altered states of consciousness and computer art. First, the paper will provide a brief overview of previous creative works, which include compositions of electroacoustic music, interactive visualisations, and visual music films. These previous works use the concept of altered states of consciousness as a compositional principle, as explored in the author's book Inner Sound: Altered States of Consciousness in Electronic Music and Audio-Visual Media (OUP 2018). Following this, a variety of the author's recent creative work produced from 2016-2019 will be discussed. These works include: a series of paintings that incorporate computer graphics animations when viewed in augmented reality; VJ performances constructed using direct animation on 8mm film, computer graphics animations generated from code and audio-reactive effects; and Cyberdream VR, a virtual reality experience. These interrelated projects continue to develop the author's artistic investigations into altered states, while also referencing work such as demo scene videos; cyberdelic imagery of the type seen on fliers from the 1990s rave-era; and the recent Internet-borne subculture vaporwave, which recontextualises the aesthetics of 1980s and 1990s ambient corporate music and utopian computer graphics to construct surrealistic dystopias.

Jonathan P. Bowen, Tula Giannini, Gareth Polmeer, Carla Gannis, Jeremy Gardiner, Jonathan Kearney, Bruce Wands, Jonathan Weinel (2018) States of being: art and identity in digital space and time, In: Jonathan P. Bowen, Jonathan Weinel, G. Diprose, N. Lambert (eds.), EVA London 2018: Electronic Visualisation in Arts and Culture. BCS, The Chartered Institute for IT

This one-day Symposium explored themes of personhood, modernity and digital art, bringing together speakers from a range of disciplines to consider technology, artistic practice and society. It seeks a renewed consideration of the role of art in illuminating human identity in a positive relation with technology, and its transformative effects upon space and time. The concerns for the role of art amidst the forces of a post-modern world are influenced by important legacies of the past, by which ideas about human identity and difference have been made meaningful in the relation of history and technology. In the frequently transient and conflicting forces of humanness and forces of modernity, the digital world of the arts emerges as a means by which new ideas of space and time can be considered, with new perspectives of human identity seen as states of being, towards the possibilities of experience, technology, individuality and society.

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel (2016) An interactive music playlist generator that responds to user emotion and context, In: Jonathan P. Bowen, Graham Diprose, Nick Lambert (eds.), Electronic Visualisation and the Arts (EVA 2016). BCS, The Chartered Institute for IT

This paper aims to demonstrate the mechanisms of a music recommendation system, and accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual's emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon 'how the user is feeling' and 'what the user is doing' by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature in conjunction with ambient light, temperature and global positioning satellite (GPS) data could be used to a degree to infer one's current situation and corresponding mood. At present, this interactive music playlist generator has the ability to conceptually demonstrate how a playlist can be formed in accordance with such physiological and contextual parameters. In particular, the affective aspect of the interface is visually represented as a two-dimensional arousal-valence space based upon Russell's circumplex model of affect (1980). Context refers to environmental, locomotion and activity concepts, which are visually represented in the interface as sliders. These affective and contextual components are discussed in more detail in Sections 2 and 3, respectively. Section 4 will demonstrate how an affective and contextual music playlist can be formed by interacting with the GUI parameters. For a comprehensive discussion of the development of this research, refer to Griffiths et al. (2013a, 2013b, 2015). Moreover, refer to Teng et al. (2013) and Yang et al. (2008) for related work in these broader research areas.
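
As a rough illustration of the playlist mechanism described above (a sketch, not the authors' implementation), the following C# snippet ranks songs by their distance from a target point in the arousal-valence plane. The song data, field names, and the use of a simple squared Euclidean distance are assumptions made for the example.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal sketch: selects a playlist by ranking songs on their distance from a
// target point in the arousal-valence plane. Both axes are assumed to lie in [-1, 1].
public record Song(string Title, float Valence, float Arousal);

public static class PlaylistDemo
{
    public static List<Song> BuildPlaylist(IEnumerable<Song> library,
                                           float targetValence, float targetArousal, int count)
    {
        return library
            .OrderBy(s => Math.Pow(s.Valence - targetValence, 2) +
                          Math.Pow(s.Arousal - targetArousal, 2)) // squared Euclidean distance
            .Take(count)
            .ToList();
    }

    public static void Main()
    {
        var library = new List<Song>
        {
            new("Calm Piano", 0.4f, -0.6f),
            new("Upbeat Breakbeat", 0.7f, 0.8f),
            new("Dark Ambient", -0.5f, -0.3f),
        };
        // Target mood: relaxed and positive (high valence, low arousal).
        foreach (var s in BuildPlaylist(library, 0.5f, -0.5f, 2))
            Console.WriteLine(s.Title);
    }
}
```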

Jonathan Weinel (2015) Representing altered states of consciousness in computer arts, In: Kia Ng, Jonathan P. Bowen, Nick Lambert (eds.), Electronic Visualisation and the Arts (EVA 2015). BCS, The Chartered Institute for IT

It has been proposed that among the earliest known artworks produced by humans may have been representations of altered states of consciousness (ASCs). With the advent of modern computer technology that enables the creation of almost any sound or image imaginable, the possibility of representing the subjective visual and aural components of hallucinatory experiences with increased realism emerges. In order to consider how these representations could be created, this paper provides a discussion of existing work that represents ASCs. I commence by providing an overview of ASCs and a brief history of their use in culture. This provides the necessary background through which we may then consider the variety of art and music that represents ASCs, including: shamanic art and music, modern visual art, popular music, film and video games. Through discussion of the ways in which these examples represent ASCs, a concept of 'ASC Simulation' is proposed, which emphasises realistic representations of ASCs. The paper concludes with a brief summary of several creative projects in computer music and arts that explore this area.

Jonathan Weinel (2015) Optical research: a curated visual music collection, In: Kia Ng, Jonathan P. Bowen, Nick Lambert (eds.), Electronic Visualisation and the Arts (EVA 2015). BCS, The Chartered Institute for IT

Optical Research is a curated collection of visual music by a group of 12 international artists, which has recently been presented as a DVD, a gallery installation, and will be presented at the British Computer Society Electronic Visualisation and the Arts: EVA London 2015.

Jonathan Weinel (2013) Quake Delirium revisited: system for video game ASC simulations, In: Proceedings of the Fifth International Conference on Internet Technologies and Applications (ITA 13). North East Wales Institute

This paper reviews the conceptual model devised for a previous project, where Max/MSP was used to modify the game Quake to create a more psychedelic experience for the player. In this original proof of concept, various available game parameters were animated in order to imitate perceptual distortions of the type produced by hallucinogenic drugs. A MIDI mixing console was used to 'remix' these perceptual distortions in real-time, and devise pre-defined sequences of hallucination. The control parameters were also used to manipulate a corresponding soundtrack, which was intended to reflect the hallucinatory game experience through electroacoustic sound. This paper outlines the existing proof-of-concept, and considers the development of this model to create a more sophisticated system for use in game engines such as Unity. Through the use of Hobson's 'Activation, Input, Modulation' (AIM) model of consciousness, I will propose a cohesive system for creating ASC simulations in video games.
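
To give a flavour of the kind of parameter animation described above, here is a hedged Unity C# sketch in which a single intensity value modulates the camera field of view and scene fog over time. The class name, parameters, and ranges are illustrative assumptions rather than the published Quake Delirium system.

```csharp
using UnityEngine;

// Hedged sketch of the kind of parameter animation described above, written for Unity;
// it is not the published Quake Delirium system. A single 'intensity' value drives slow
// oscillations of the camera field of view and scene fog, imitating perceptual drift.
public class AscParameterDriver : MonoBehaviour
{
    [Range(0f, 1f)] public float intensity = 0.5f; // could be sequenced, or mapped to a MIDI controller
    public Camera targetCamera;
    public float baseFov = 60f;

    void Start()
    {
        RenderSettings.fog = true; // ensure fog is rendered so density changes are visible
    }

    void Update()
    {
        float t = Time.time;
        // Widen and narrow the field of view more strongly as intensity rises.
        targetCamera.fieldOfView = baseFov + Mathf.Sin(t * 0.5f) * 20f * intensity;
        // Slowly drift the fog density to suggest shifting depth perception.
        RenderSettings.fogDensity = 0.01f + (Mathf.Sin(t * 0.2f) * 0.5f + 0.5f) * 0.05f * intensity;
    }
}
```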

Ross Rodney, Jonathan Weinel, Martyn Broadhead (2023) Retro stylistic transformations in games, In: Jonathan P. Bowen, Jonathan Weinel, Graham Diprose (eds.), Proceedings of EVA London 2023, BCS London, 10th-14th July 2023. EVA London Electronic Visualisation and the Arts - BCS Learning and Development Ltd.

Retro aesthetics describe a nostalgic approach to replicating or paying homage to a bygone age by imitating old cultures and styles from the past. However, Guffey (2006, p.9) describes the word 'retro' as "a word with many meanings". For example, he discusses one interpretation of the word as a synonym for old-fashioned, and another as a view on life that "cleaves to the values and mores of the past" (p.10). Following this interpretation, retro is closely related to nostalgia, which Garda (2013) describes as pervasive in a culture "obsessed with its own history" (p.1). Although there are games that include examples of retro stylistic transformations, there are numerous possibilities for how these transitions can occur that have yet to be explored. The purpose of this project is therefore to investigate the design of retro stylistic transformations in video games; explore specific techniques for their design and development through a prototype; and evaluate how effectively the prototype transforms seamlessly between retro and modern art styles.

Raheem A. Lawal, Jonathan Weinel, Darrenlloyd Gent (2023) Representing amphibian perspectives in a 3D game engine, In: Proceedings of EVA London 2023 (EVA 2023). BCS Learning and Development Ltd

First Person Shooters (FPS) have dominated the gaming landscape, as evidenced by successful titles such as Call of Duty: Black Ops (Treyarch 2005-present) selling more than 30 million units (Clement 2022). Although the first-person perspective is particularly associated with FPS games, this viewpoint is found in a wide variety of genres, from horror and survival games to puzzle platformers. For instance, the first-person perspective can be found in simulation games such as PowerWash Simulator (FuturLab 2003-present) and House Flipper (Frozen District 2014-present). Exploring this area, this project investigates recreating an amphibian's sensory experience from a first-person point of view using a 3D game engine. Amphibians, and in particular the Texas blind salamander, were chosen as they have unusual sensory perceptions that differ from those of humans. They do not experience the world the same way that we do, which creates an interesting challenge when representing them in an audio-visual game engine. For the purposes of this project, this will be achieved by looking at what already exists in the gaming market; identifying possible approaches to non-human representation; and focusing on recreating these experiences through the design and development of a prototype in the Unity game engine.

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel (2013) Automatic music playlist generation using affective technologies, In: Rich Picking, Stuart Cunningham, Nigel Houlden, Denise Oram, Vic Grout, Julie Mayers, Nathan Clarke, Carlos Guerrero, Raed A. Abd-Alhameed, Susan Liggett (eds.), Internet Technologies and Applications (ITA), 2013. North East Wales Institute

This paper discusses how human emotion could be quantified using contextual and physiological information gathered from a range of sensors, and how this data could then be used to automatically generate music playlists. I begin by discussing existing affective systems that automatically generate playlists based on human emotion. I then consider the current work in audio description analysis. A system is proposed that measures human emotion based on contextual and physiological data using a range of sensors. The sensors discussed for capturing such contextual characteristics range from temperature and light to EDA (electrodermal activity) and ECG (electrocardiogram). The concluding section describes the progress achieved so far, which includes defining datasets using a conceptual design, microprocessor electronics and data acquisition using MATLAB. Lastly, there is a brief discussion of future plans to develop this research.

Jonathan Weinel, Darryl Griffiths, Stuart Cunningham (2014) Easter eggs: hidden tracks and messages in musical mediums, In: Karolos Papoulias (ed.), International Computer Music Conference Proceedings, 2014. International Computer Music Association

'Easter eggs' are hidden components that can be found in computer software and various other media including music. In this paper the concept is explained, and various examples are discussed from a variety of mediums including analogue and digital audio formats. Through this discussion, the purpose of including easter eggs in musical mediums is considered. We propose that easter eggs can serve to provide comic amusement within a work, but can also serve to support the artistic message of the artwork. Concealing easter eggs in music is partly dependent on the properties of the chosen medium; vinyl records may use techniques such as double grooves, while digital formats such as CD may feature hidden tracks that follow long periods of empty space. Approaches such as these and others are discussed. Lastly, we discuss some software components we have developed ourselves in Max/MSP, which facilitate the production of easter eggs by performing certain sequences of notes, or as a result of time-based events. We therefore argue that computer music performances present unique opportunities for the incorporation of easter eggs. These may occur to the surprise of audiences, performers and composers, and may support the artistic purpose of compositions as a whole.
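
The authors' easter-egg components were built in Max/MSP; as a loose C# illustration of the note-sequence trigger idea, the sketch below fires a hidden action once a secret sequence of MIDI note numbers has been played in order. The class name, event, and example note numbers are assumptions made for the demonstration.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: a simple detector that fires a hidden "easter egg" action
// once a secret sequence of MIDI note numbers has been performed in order.
public class NoteSequenceTrigger
{
    private readonly int[] secret;
    private readonly Queue<int> recent = new();
    public event Action? EasterEggTriggered;

    public NoteSequenceTrigger(int[] secretSequence) => secret = secretSequence;

    public void OnNoteOn(int noteNumber)
    {
        recent.Enqueue(noteNumber);
        while (recent.Count > secret.Length) recent.Dequeue(); // keep a sliding window
        if (recent.Count == secret.Length && recent.SequenceEqual(secret))
            EasterEggTriggered?.Invoke();
    }
}

public static class Demo
{
    public static void Main()
    {
        var trigger = new NoteSequenceTrigger(new[] { 60, 62, 64 }); // C4, D4, E4
        trigger.EasterEggTriggered += () => Console.WriteLine("Hidden track unlocked!");
        foreach (var note in new[] { 55, 60, 62, 64 }) trigger.OnNoteOn(note);
    }
}
```

A time-based variant, as mentioned in the abstract, could instead check a clock or performance timeline before invoking the hidden action.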

Marianne Markowski, Jonathan Weinel, Marcantonio Gagliardi (2024) A place to tinker and transform: our vision for the XR lab for health, well-being and education, In: Jonathan P. Bowen, Jonathan Weinel, Ann Borda, Graham Diprose (eds.), Proceedings of EVA London 2024: Electronic Visualisation and the Arts. BCS, The Chartered Institute for IT

This short paper presents our vision for our Extended Reality (XR) lab, a cross-faculty initiative for which we secured internal capital investment funding (70K) from the University of Greenwich. The faculties involved are the Faculty of Engineering and Science (FES) and the Faculty of Education, Health and Human Sciences (FEHHS), which will be working closely with the Faculty of Liberal Arts and Sciences (FLAS).

Jonathan Weinel (2024) The Senses Beyond: New Directions in Game Engine Experiences, In: Tula Giannini, Jonathan P. Bowen (eds.), The Arts and Computational Culture: Real and Virtual Worlds, pp. 499-520. Springer Nature Switzerland

Video games are a significant part of audio-visual culture, and game engines are widely used not only for the construction of video games, but also arts projects, virtual reality experiences, and serious or educational games. This chapter presents a series of short summaries which describe student projects produced on the games and digital media BSc degree programmes at the University of Greenwich. These projects indicate interesting new directions in game engine experiences, and contribute towards emerging strands of practically-focused research in video games. Each project has involved background research into related literature and artefacts, which then informs the design, development and evaluation of a prototype product. The projects discussed here specifically explore themes related to adaptive music; representation of animal species; retro aesthetics; the design of artificially intelligent game characters; and the representation of hallucinations or intoxicated states in games. As a collection there are some common themes, and the projects collectively point towards novel ways in which game engines may generate experiences that activate the visual and auditory senses in ways that elicit empathy, emotions, memories, and awareness of unusual sensory states and systems.

Jonathan Weinel (2020) Visualising rave music in Virtual Reality: symbolic and interactive approaches, In: Jonathan Weinel, Jonathan P. Bowen, G. Diprose, N. Lambert (eds.), EVA London 2020 (Electronic Visualisation and the Arts). British Computer Society (BCS)

In 2019 the virtual reality experience Cyberdream VR was presented at various events including the Event Two exhibition of computer art held at the Royal College of Art. This short demo for Oculus Gear VR provided a c.5 minute sonic journey, in which the user moves through a series of symbolic environments based on the futuristic techno-utopian or dystopian imagery of 90s rave fliers. These environments accompanied an original soundtrack of rave music and vaporwave, allowing users to enjoy the music whilst feeling as though they are inside synaesthetic virtual spaces related to the symbolic imagery of rave culture. This paper will discuss the subsequent development of this project, which is now being adapted for the Oculus Quest VR headset. Rajmil Fischman's concept of 'Music in the Holodeck'; the fictional 'holophonor' instrument from the TV series Futurama; or the video game Rez Infinite, are among those that suggest paradigms for performing sound and visuals in VR. Drawing upon these ideas, the current iteration of Cyberdream allows the user to trigger 'sound instruments' with the Oculus Quest Touch controllers. This is conceived as a means through which the user can 'paint with sound' in 3D space, improvising with the music, whilst also generating synaesthetic imagery. The design of these sound instruments presents interesting challenges, in order for sound to be generated in correspondence with visual imagery, where these materials may complement both the music and spatial environments. Drawing on the author's VJ work, at the macro level of compositional structure, the latest version of the project also seeks ways to provide a more continuous experience analogous to a DJ or VJ set, through blending of music tracks between scenes. This paper will discuss on-going work on this project, advancing the discourse regarding the visualisation of hardcore rave music in virtual reality.
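
As an illustration of the 'sound instrument' idea (a sketch only, not the Cyberdream implementation), the following Unity C# script plays a spatialised one-shot sound and spawns a particle burst at the controller position when a trigger button is pressed. The input mapping, field names, and prefab are assumptions, since the real project targets Oculus Quest Touch controllers.

```csharp
using UnityEngine;

// Minimal sketch of a controller-triggered 'sound instrument': when the player squeezes
// the trigger, a one-shot sound plays at the controller's position and a particle burst
// is spawned there, so sound and synaesthetic imagery appear together in 3D space.
public class SoundInstrument : MonoBehaviour
{
    public AudioClip clip;                 // the instrument's sound
    public ParticleSystem burstPrefab;     // visual counterpart of the sound
    public Transform controllerTransform;  // tracked VR controller (assumed to be assigned)

    void Update()
    {
        // "Fire1" stands in for the VR trigger; the real mapping depends on the XR input system used.
        if (Input.GetButtonDown("Fire1"))
            Trigger(controllerTransform.position);
    }

    void Trigger(Vector3 position)
    {
        AudioSource.PlayClipAtPoint(clip, position);                    // spatialised one-shot audio
        Instantiate(burstPrefab, position, Quaternion.identity).Play(); // matching particle burst
    }
}
```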

Jon Weinel (2020) Cyberdreams: Visualizing Music in Extended Reality, In: Rae Earnshaw, Susan Liggett, Peter Excell, Daniel Thalmann (eds.), Springer Series on Cultural Computing, pp. 209-227. Springer Cham

From the visual music films of the twentieth century to the Video Jockey (VJ) performances seen at the latest electronic dance music festivals, there is an extensive body of artistic work that seeks to visualize sound and music. The form that these visualizations take has been shaped significantly by the capabilities of available technologies; thus, we have seen a transition from paint to film; from hand-drawn animations to motion-graphics; and from analog to digital projection systems. In the twenty-first century, visualizations of music are now possible with extended reality (XR) technologies such as virtual reality (VR), augmented/mixed reality (AR/MR), and related forms of multi-projection environment such as fulldome. However, the successful design of visual music and VJ performances using XR technologies requires us to consider the compositional approaches that can be used by artists and designers. To investigate this area, this chapter will begin with an analysis of existing work that visualizes music using XR technologies. This will allow us to consider the spectrum of existing design approaches, and provide a commentary on the possibilities and limitations of the respective technologies. Following this, the chapter will provide an in-depth discussion of Weinel’s practice-led research, which extends from work exhibited at the Carbon Meets Silicon exhibitions held at Wrexham Glyndŵr University (2015, 2017), and includes AR paintings, VJ performances, and a VR application: Cyberdream VR. Through the discussion of these works, the chapter will demonstrate possible compositional principles for visualizing music across media ranging from paint to XR, enabling the realization of work that reinforces the conceptual meanings associated with music.

Jonathan Weinel (2013) Nausea: an approach to sonic arts composition based on ASC, In: Internet Technologies and Applications (ITA), 2013. North East Wales Institute

This paper concerns research in the field of compositional methods for electroacoustic music. I discuss the compositional approach used for creating 'Nausea': a large-scale work of electroacoustic music presented in surround sound. The piece is part of a larger body of creative work in sonic arts carried out as part of the author's PhD research. These works explore the use of altered states of consciousness (ASC) as a basis for the design of sonic materials and structure. Sounds are created to reflect aspects of a hypothetical psychedelic experience, such as visual patterning effects or hallucinated entities. These sounds are then arranged in a manner that suitably reflects the progression of a typical psychedelic experience. Through discussion of the compositional methodology used, it is intended to demonstrate how ASC can be used to inform the design of sonic artworks. It is anticipated that this research will also contribute more generally to knowledge of possible approaches for the design of digital artworks that represent ASC. The emphasis of this paper is on the compositional process, and does not attempt to measure audience response to the music. Similarly, the process described should be seen as appropriate, but not absolute; implementations of this method involving slightly different subjective artistic judgements would be possible within the general framework discussed.

Jonathan Weinel (2011) Tiny jungle: psychedelic techniques in audio-visual composition, In: Monty Adkins, Ben Isaacs (eds.), International Computer Music Conference Proceedings, 2011. Michigan Publishing; International Computer Music Association (ICMA)

Tiny Jungle is a psychedelic audio-visual montage, with a soundtrack based upon electroacoustic music and late 1990s drum and bass. This work forms part of my research regarding 'Altered states of consciousness (ASCs) as an adaptive principle for composing electroacoustic music'. According to Heinrich Klüver and Timothy Leary, ASCs such as those induced by mescaline may commonly result in the perception of visual patterns of hallucination, cryptic symbols, perception of cellular or atomic forms, and strange journeys or narratives. Tiny Jungle uses these ASC features as a basis for the composition of sound and video. Visual material for this piece was created through a montage of animated hand-drawn artwork, 3D graphics and material created using a specially designed Max/MSP/Jitter tool that produces stroboscopic visual material in real-time. The combination of music and moving images is intended to construct a narrative that conveys psychedelic experiences. In this paper I will discuss the development of Tiny Jungle from concept to realisation, with emphasis on the development of visual material and a brief overview of the soundtrack. It is hoped to thereby stimulate thought for audio-visual practitioners on compositional techniques based upon ASCs and audio-visual montage techniques involving hand-produced artwork and real-time methods.

Andrei Copaceanu, Jonathan Weinel, Stuart Cunningham (2024) Using voice input to control and interact with a narrative video game, In: Jonathan P. Bowen, Jonathan Weinel, Ann Borda, Graham Diprose (eds.), Proceedings of EVA London 2024: Electronic Visualisation and the Arts. BCS, The Chartered Institute for IT

With the advancement of artificial intelligence (AI) over recent years, especially the breakthrough in technology that OpenAI achieved with the natural language generative model of ChatGPT, virtual assistants and voice interactive devices such as Amazon's Alexa or Apple's Siri, have become popular with the general public. This is due to their ease of use, accessibility, and ability to be used without physical interaction. When it comes to the video games industry, there have been attempts to implement voice input as a core mechanic, with various levels of success. Ultimately, voice input has been mostly used as a separate mechanic or as an alternative to traditional input methods. This project will investigate different methods of using voice input to control and interact with a narrative video game. The research will analyse which method is most effective in facilitating player control of the game and identify challenges related to implementation. This paper also includes a work-in-progress demonstration of a voice-activated game made in Unreal Engine.

Christopher Folorunso, Jonathan Weinel, Nuno Otero (2024) Horror in modern and retro 3D games, In: Jonathan P. Bowen, Jonathan Weinel, Ann Borda, Graham Diprose (eds.), Proceedings of EVA London 2024: Electronic Visualisation and the Arts. BCS, The Chartered Institute for IT

This research explores the evolution of horror games from their pixelated past to the present day. In recent times, various horror titles have sought to revisit the styles of retro horror games; however, this project questions whether modern games' attempts to evoke nostalgia through direct scares and visual filters maintain the appeal of original 90s horror games like Resident Evil (Capcom 1996), Parasite Eve (Square 1998) or Silent Hill (Team Silent 1999). Exploring this area, this project combines elements of retro and modern horror games in order to produce a game demo product.

Jagunmolu Bamidele Oke, Jonathan Weinel, Nuno Otero (2024) Enhanced squad behaviour in tactical action games, In: Jonathan P. Bowen, Jonathan Weinel, Ann Borda, Graham Diprose (eds.), Proceedings of EVA London 2024: Electronic Visualisation and the Arts. BCS, The Chartered Institute for IT

This project will delve into background research on AI techniques involving spatial analysis, role assignment, NPC behaviour, and tactical techniques for squad-based combat scenarios. The investigation will involve the design, development and evaluation of a prototype. The goal of this project is to uncover methods for designing and developing squad-based NPCs that can assess situations, have group objectives, and recognise individual team members' strengths and weaknesses. It will also explore how to identify the best team member for unique roles and find the optimal position for engaging, recovering, flanking, and other related manoeuvres, based on the line of fire and other environmental factors, all within the context of a game engine.

Jonathan Weinel, Jonathan P. Bowen, Graham Diprose, Nick Lambert (2020) Proceedings of EVA London 2020 (EVA 2020). AI and the Arts: Artificial Imagination. British Computer Society (BCS), The Chartered Institute for IT

The Electronic Visualisation and the Arts London 2020 Conference (EVA London 2020) is cosponsored by the Computer Arts Society (CAS) and BCS, the Chartered Institute for IT, of which the CAS is a Specialist Group. 2020 marks the 30th anniversary of EVA London, which was first held at Imperial College London in 1990.

Robert Ratcliffe, Jonathan Weinel, Zubin Kanga (2011) Mutations (megamix): exploring notions of the 'DJ set', 'mashup' and 'remix' through live piano-based performance, In: eContact! Online Journal for Electroacoustic Practices 13(2). Communauté électroacoustique canadienne / Canadian Electroacoustic Community

Mutations (megamix) is an interactive work for piano and electronics, which explores notions of the "DJ set", "mashup" and "remix" through live piano-based performance. (The term "mashup" is used within popular music to denote a specific compositional technique in which two or more existing records are superimposed to create a new track; this often involves the use of vocal a cappella material, which is layered against other recordings.)

Jonathan Weinel (2010) Bass drum, saxophone and laptop: real-time psychedelic performance software, In: eContact! Online Journal for Electroacoustic Practices 12(4). Communauté électroacoustique canadienne / Canadian Electroacoustic Community

Taking a performance by Z’EV and John Zorn as an inspirational starting point, Bass Drum, Sax & Laptop is a piece of software designed with Max/MSP which facilitates improvisational real-time performance for live instruments and electronics. This software and the music produced with it are a continuation of my research regarding compositional techniques that elicit altered states of consciousness. DSP effects are incorporated which process the live instruments, while a sampling module, the "atomizer", produces sound which is mimetic of visual patterns of hallucination. An integral feature of the software is the ability to automate control parameters temporally so that they respond to the live performance. This facilitates a system of interactivity in which the performers respond to the software and vice-versa. The resulting spontaneous interactions and temporally shifting effects are intended to create an analogy between the sounds produced and the complex biological processes which produce dreams and hallucinations. In this article I will discuss the development of the software and its realisation in performance with Sol Nte on saxophone and myself on bass drum.

Jonathan Weinel (2011) Quake Delirium: remixing psychedelic video games, In: Sonic Ideas/Ideas Sonicas 3(2). Centro Mexicano para la Música y las Artes Sonoras (CMMAS)

Quake Delirium is a specially designed software patch which enhances the video game Quake in order to create a uniquely modified version, with graphics, audio and game parameters altered towards a more individual aesthetic for each user. The project is part of my wider research regarding compositional techniques to elicit altered states of consciousness (ASCs). For the purposes of this article, 'altered states of consciousness' refers contextually to enhancing audio and graphical parameters of a video game in such a way as to reflect perceptual features of dreamlike, intoxicated or hallucinogenic experiences. The goal is to add a new level of interaction to existing video games and virtual worlds, by facilitating digital hallucinations and simulated temporal shifts of consciousness. The conceived ideal realisation of this project was to construct a 'Universal Game Remix Device' that would be suitable for use with many different video games, as a way to 'remix' the virtual experiences towards altered consciousness aesthetics. Technical limitations regarding the feasibility of creating a Universal Game Remix Device led to the development of Quake Delirium, a Max/MSP patch which works only with the game Quake while running the Fitzquake modification. Quake Delirium can be considered as a video game 'hack' which demonstrates the proof of concept. In the course of this article I shall demonstrate how altered states of consciousness can be portrayed in video games, either in unique cases or across multiple games by creating the hypothesised Universal Game Remix Device. More broadly, through this discussion I hope to stimulate thought on ways in which interactive artworks and video games could be remixed, and how the potential for projects of this kind could be furthered substantially by the development of a new software protocol which allows artists using Max/MSP/Jitter and Pure Data to exchange video game information data and signals.

Jonathan Weinel (2021) Explosions in the Mind: Composing Psychedelic Sounds and Visualisations. Springer Singapore

This book explores how to compose sounds and visualisations that represent psychedelic hallucinations and experiences of synaesthesia. Through a detailed discussion regarding compositional methodologies and technical approaches, the book aims to educate students, practitioners, and researchers working in related areas. It weaves together sound, visual design, and code across a range of media, providing conceptual approaches, theoretical insights, and practical strategies, which unlock new design frameworks for composing psychedelic sounds and visualisations.

Jonathan Weinel, Stuart Cunningham (2019) Designing game audio based on avatar-centred subjectivity, In: Michael Filimowicz (ed.), Foundations in Sound Design for Interactive Media. Routledge (Taylor & Francis Group)

This chapter explores a selection of practical approaches for designing video game audio based on the subjective perception of a player avatar. The authors discuss several prototype video game systems developed as part of their practice-led research, which provide interactive audio systems that represent the aural experience of a virtual avatar undergoing an altered state of consciousness. Through the discussion of these prototypes, the authors expose a variety of possible approaches for sound design in order to represent the subjective perceptual experiences of a player avatar.

Jonathan Weinel (2024) Auditory hallucinations and altered states of consciousness in video games, In: William Gibbons, Mark Grimshaw-Aagaard (eds.), The Oxford Handbook of Video Game Music and Sound. Oxford University Press

Sequences that depict states of intoxication, hallucination, or psychosis have become increasingly common in video games over the last few decades. In this chapter, I shall explore the design and function of these sequences by drawing together existing research and examples in this area. First, I will outline how ASCs may shape subjective experience in various ways, focusing in particular on sensory aspects of these experiences. This will allow us to consider how these may provide a practical basis for the design of corresponding sequences in video games. Following this, I will provide an overview of existing games that represent hallucinations, allowing us to see what techniques are used already. Next, I discuss several of my own projects which represent auditory hallucinations and ASCs in sound and video games. This discussion will provide the necessary backdrop through which we can then consider the concept of "avatar-centered subjectivity," whereby the design of graphics, sounds, and controls is modified to represent the subjective experience of a player avatar from a first-person perspective. Through an exploration of examples and the concept of avatar-centered subjectivity, I will provide the conceptual tools that readers need to understand how ASCs can be effectively represented in games, allowing them to undertake their own analyses and practical work in this specialist area of sound design and games development.

Jonathan Weinel (2022) Review: Ludified, edited by Marko Ciciliani, Barbara Lüneburg, and Andreas Pirchner, In: Journal of Sound and Music in Games 3(2-3). University of California Press

Review of Ludified, edited by Marko Ciciliani, Barbara Lüneburg, and Andreas Pirchner (The Green Box, 2021, 432 pp, €38.00).

Jonathan Weinel, Stuart Cunningham, Nathan Roberts, Shaun Roberts, Darryl Griffiths (2015) EEG as a controller for psychedelic visual music in an immersive dome environment, In: Sonic Ideas/Ideas Sonicas 7(14). Centro Mexicano para la Música y las Artes Sonoras (CMMAS)

Altered States of Consciousness (ASC), and hallucinogenic experiences in particular, have formed the basis for many works of art, literature and music. In my compositional practices I have explored the use of visual patterns of hallucination in particular as a basis for the design of electroacoustic music, and electroacoustic audio-visual or 'visual music' compositions (Weinel 2012). While many existing works of psychedelic art and visual music exist in fixed mediums such as film, we may conceive of interactive audio-visual experiences of this type. Such interactive artworks may be facilitated with computers, utilising video game engines or real-time sound and graphics software, together with an appropriate controller. The long-term goal of research in this area is to devise machines that are capable of transferring the sounds and visuals of dreams and hallucinations from the human brain into digital technologies. The proper realisation of this is beyond our capability with current technology, existing only in science fiction movies like Paprika (2006). Nonetheless, this article discusses an approximation of such a system. For Psych Dome (Weinel 2013a), we used a consumer-grade electroencephalograph (EEG) headset so that brainwaves could be used to provide real-time control over a visual music artwork based upon visual patterns of hallucination. In doing so we are able to provide a system that conceptually links the human brain to generative hallucinatory forms in digital media. In this article, I will discuss aesthetic and technical aspects of this project as used in our initial trial, where the artwork was presented in an immersive dome projection environment. Additionally, some testing of human participants was carried out, enabling us to provide some general comment on the usefulness of a consumer-grade EEG headset in the context of real-time visual music installations.

Jonathan Weinel (2024) Trackers and Breakbeats: celebrating Brain Records' underground revolution, In: Trackers and Breakbeats: Celebrating Brain Records' Underground Revolution

Trackers and Breakbeats showcases the critical contributions of Bizzy B, whose innovative use of the Commodore Amiga and OctaMED software pushed the boundaries of music production. Despite his significant impact, Bizzy B has often been left out of the mainstream narrative around breakbeat and jungle. This exhibition aims to correct that by acknowledging and celebrating his legacy and those of his collaborators. It also explores the broader demoscene, where early software was used creatively to challenge and redefine cultural boundaries. Curated by Rendezvous Projects with Bizzy B, DJ Dlux and Jon Weinel.

Jonathan Weinel (2018) Inner Sound: Altered States of Consciousness in Electronic Music and Audio-Visual Media. Oxford University Press

Over the last century, developments in electronic music and art have enabled new possibilities for creating audio and audio-visual artworks. With this new potential has come the possibility for representing subjective internal conscious states, such as the experience of hallucinations, using digital technology. Combined with immersive technologies such as virtual reality goggles and high-quality loudspeakers, the potential for accurate simulations of conscious encounters such as Altered States of Consciousness (ASCs) is rapidly advancing. In Inner Sound, author Jonathan Weinel traverses the creative influence of ASCs, from Amazonian chicha festivals to the synaesthetic assaults of neon raves; and from an immersive outdoor electroacoustic performance on an Athenian hilltop to a mushroom trip on a tropical island in virtual reality. Beginning with a discussion of consciousness, the book explores how our subjective realities may change during states of dream, psychedelic experience, meditation, and trance. Taking a broad view across a wide range of genres, Inner Sound draws connections between shamanic art and music, and the modern technoshamanism of psychedelic rock, electronic dance music, and electroacoustic music. Going beyond the sonic into the visual, the book also examines the role of altered states in film, visual music, VJ performances, interactive video games, and virtual reality applications. Through the analysis of these examples, Weinel uncovers common mechanisms, and ultimately proposes a conceptual model for Altered States of Consciousness Simulations (ASCSs). This theoretical model describes how sound can be used to simulate various subjective states of consciousness from a first-person perspective, in an interactive context. Throughout the book, the ethical issues regarding altered states of consciousness in electronic music and audio-visual media are also examined, ultimately allowing the reader not only to consider the design of ASCSs, but also the implications of their use for digital society.

Stuart Cunningham, Jonathan Weinel, Richard Picking (2016)In-Game Intoxication: Demonstrating the Evaluation of the Audio Experience of Games with a Focus on Altered States of Consciousness, In: Miguel Angel Garcia-Ruiz (eds.), Games User Research: A Case Study Approach, July CRC Press - Taylor and Francis

In this chapter, we consider a particular method of specifically evaluating the user experience of game audio. To provide a domain of game audio to evaluate, we focus on an increasingly common phenomenon in games: that of the altered state of consciousness. Our approach seeks to evaluate the user experience of game audio from normal gameplay and gameplay that features altered states. As such, a brief background to person-centered approaches to user experience evaluation is presented, and then we provide a detailed description of the method that has been adopted in this chapter: the use of personal construct theory via repertory grid interviews.

Jonathan Weinel (2024)Visualizing music with video game technologies, In: William James Gibbons, Mark Grimshaw-Aagaard (eds.), The Oxford Handbook of Video Game Music and Sound, August Oxford University Press

Video game engines provide a variety of new ways to visualize and experience music. In this chapter I will discuss how these technologies are giving rise to new forms of music visualization. First, I will provide a brief tour through a spectrum of work related to music visualization, from early visual-music paintings and films to the latest VJ performances, music video games, and VR experiences. Following this, I will discuss one of my own projects: Cyberdream (2019-2021), a VR music visualization, which takes the user on a psychedelic flight across synaesthetic landscapes in which the user can "paint with sound" while listening to a soundtrack of 1990s-style rave music. Lastly, I will discuss a selection of student projects that explore sound design in the context of video game engines. These projects indicate new forms that may emerge when students studying programmes not directly related to games development are brought into contact with these technologies. This will also provide some pedagogical insights that will be useful for educators wanting to support growth in this area. By tracing the trajectory of music visualizations from past to present, and observing nascent forms in this area, we shall see how a new paradigm of gamified music visualizations is gradually emerging. This paradigm is currently amorphous, as new possibilities are rapidly being explored by practitioners working in this area, but broadly consists of audio-visual projects created with game engines, which visualize music, yet may incorporate traces of the video games through aspects such as control systems, use of game characters or game-like virtual worlds. In this chapter I will indicate some of the new forms that are emerging in this area, whilst also discussing possible strategies for cultivating further innovation through education.

Jonathan Weinel, Stuart Cunningham (2021)Practice-led and interdisciplinary research investigating affective sound design, In: Michael Filimowicz (eds.), Doing Research in Sound Design, November Routledge - Focal Press

This chapter discusses practice-led and interdisciplinary methods in sound design research. Sound design can be undertaken as a creative activity in combination with theoretical investigation, so that theory informs practice, while conversely, the generation of new practical approaches provides new theoretical insights. The authors discuss how this may provide a suitable methodology for research in sound design, where sound artifacts themselves may transmit and advance knowledge in the field, but may also be complemented and disseminated through means of papers and publications. Furthermore, the authors also argue that practice-led methods can be used in combination with other empirical approaches, which may help to reinforce the outcomes. For instance, it is possible to develop artifacts of sound design or prototype systems, which are informed by, or provide a basis for, quantitative, qualitative, or mixed methods evaluations. How such practice-led and interdisciplinary studies may be formulated, is specifically explored in this chapter with reference to the authors' previous research in affective sound design, which considers aspects of human emotion and perception as they relate to sound. Through the discussion of various projects, the chapter explores how practice-led and interdisciplinary approaches may lead to positive research outcomes, and may also have an enriching effect on the skillsets, knowledge and attributes of individual researchers and their teams.

Jonathan Weinel, Stuart Cunningham (2014)Digitized direct animation: creating materials for electroacoustic visual music using 8mm film, In: eContact! Online Journal for Electroacoustic Practices15(4) Communauté électroacoustique canadienne / Canadian Electroacoustic Community

"Direct animation" (also called "drawn-on animation" or "camera-less animation") involves the direct application of paints and other artistic materials to celluloid film in order to construct animated visual material. This article provides a brief overview regarding the historical use of this technique in the work of other experimental film-makers, before discussing the use of this process to create Mezcal Animations #1-3: a piece of psychedelic visual music with electroacoustic sound. To compose this piece, reels of 8 mm film were first bleached before applying inks to create the visual materials for the piece. The resulting visual materials contain various desirable æsthetic properties, resulting from the use of the analogue medium, when magnified under projection. However, in order to combine these materials with electroacoustic sound, and provide a more reliable format for performances, it is desirable to digitize the materials. For this piece, the film was digitized using a DSLR camera; while this may not be the ideal method of digitization, it succeeds in providing a quick, low-cost and convenient solution. Once the film has been digitized, a soundtrack can be created using typical approaches to the composition of electroacoustic music. The æsthetic qualities of direct animation create some interesting challenges for composing an electroacoustic soundtrack; these are briefly discussed with regards to Mezcal Animations #1-3. Through the course of this article, we therefore demonstrate a novel, low-budget compositional approach for creating visual music compositions in a digital context, using electroacoustic sound. The strategies discussed may also be of interest to other composers wishing to experiment with this technique, or for other composers who explore psychedelic æsthetics in their work.

Jonathan Weinel (2014)Shamanic diffusions: a technoshamanic philosophy of electroacoustic music, In: Sonic Ideas/Ideas Sonicas6(12) Centro Mexicano para la Musica y las Artas Sonoras (CMMAS)

Electroacoustic music affords the possibility of creating journeys through non- realistic or illusory spaces, through the use of sonic materials. This article proposes the application of the concept of 'technoshamanism' as a principle for composing and performing electroacoustic works of this type. I shall commence by examining the use of the term 'technoshaman' in relation to psy-trance culture. Through consideration of Rouget's (1985) definitions of ecstasy and trance, I will discuss the relationship of psy-trance culture to Rouget's definition of trance. From this position I shall then propose the use of electroacoustic music in relation to Rouget's definition of ecstasy. This will enable me to define 'shamanic diffusions' as an opposing technoshamanic approach to that which is used in psy-trance. Under this discussion, electroacoustic music will be considered as an ecstatic technology. I shall then conclude with some comments and speculation regarding how this concept may be useful as an approach for the composition and performance of electroacoustic music. For example, in various composed works I have used altered states of consciousness and hallucinations, as a principle for the design of sonic materials and musical structure. Through the course of this article then, I will describe a conceptual model through which to consider electroacoustic composition and performance.

Stuart Cunningham, John Henry, Jon Weinel (2020)Augmenting Virtual Spaces: Affective Feedback in Computer Games, In: Rae Earnshaw, Susan Liggett, Peter Excell, Daniel Thalmann (eds.), Technology, Design and the Arts - Opportunities and Challengespp. 229-247 Springer Nature Link

Computer games can be considered a form of art insomuch as they are critiqued, revered and collected for their aesthetics in addition to their ludic qualities. Perhaps most significantly, computer games incite a plethora of emotional responses in their players as a deliberate and defining mechanism. However, unlike other forms of traditional media and art, another key feature of games is their intrinsic interactivity, reliance upon technology and non-linearity. These traits make them particularly noteworthy if one wishes to consider how art forms might respond and adapt to their audience’s emotions. The field of affective computing has been developing for several decades and many of its applications have been in the analysis and modelling of emotional responses to forms of media, such as music and film. In gaming, recent developments have led to an increasing number of consumer-grade biofeedback devices which are available on the market, some of which are explicitly sold as ‘gaming controllers’, giving rise to greater opportunity for affective feedback to be incorporated. In this chapter, a review is provided of the affective gaming field. Specifically, it is proposed that these developments give rise to interesting opportunities whereby virtual environments can be augmented with player affective and contextual information. An overview is provided of affective computing fundamentals and their manifestation in developments relating specifically to games. The chapter considers the impact this biometric information has upon games players, in terms of their experience of the game and the social connections between competitors. A number of associated practical and technological challenges are highlighted along with areas for future research and development activities. It is hoped that by exploring these developments in gaming that the longer established forms of art and media might be inspired to further embrace the possibilities offered by utilising affective feedback.

Jon Weinel, Stuart Cunningham, Darryl Griffiths, Shaun Roberts, Richard Picking (2014)Affective Audio, In: Leonardo music journal24(24)pp. 17-20
Jonathan Weinel (2013)Visual patterns of hallucination as a basis for sonic arts composition, In: Proceedings of the 8th Audio Mostly Conference12pp. 1-7 ACM

Visual patterns of hallucination, such as pin-point dot patterns of light arranged in spiral or funnel structures, are often perceived in hallucinogenic experiences such as those produced by mescaline. This article discusses the use of altered states of consciousness (ASC) and visual patterns of hallucination, as principles upon which to base the design of musical compositions and related audio-visual works. I provide some background information regarding visual patterns of hallucination (or 'entoptic phenomena'), with reference to studies by Klüver and Strassman. I then proceed to discuss a process for using visual patterns of hallucination as a basis for designing sonic and visual material, using a purpose-built piece of software: the Atomizer Live Patch. The implementation of this sonic material in the context of 'ASC compositions' is discussed with regards to Entoptic Phenomena, a fixed electroacoustic composition, and Tiny Jungle, an audio-visual work. These pieces form part of a larger body of work completed as part of my PhD research, where ASC was used as a principle for the design of electroacoustic music and work in related mediums. Through the discussion of these works I will demonstrate an approach for using ASC, and visual patterns of hallucination in particular, as a basis for the design of sonic artworks and visual music. This research therefore contributes to the field of compositional methods for electroacoustic music, while more broadly indicating approaches for creating digital artworks that reflect ASC.

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel (2013)A discussion of musical features for automatic music playlist generation using affective technologies, In: Proceedings of the 8th Audio Mostly Conference13pp. 1-4 ACM

This paper discusses how human emotion could be quantified using contextual and physiological information that has been gathered from a range of sensors, and how this data could then be used to automatically generate music playlists. The work is very much in progress and this paper details what has been done so far and plans for experiments and feature mapping to validate the concept in real-world scenarios. We begin by discussing existing affective systems that automatically generate playlists based on human emotion. We then consider the current work in audio description analysis. A system is proposed that measures human emotion based on contextual and physiological data using a range of sensors. The sensors discussed to invoke such contextual characteristics range from temperature and light to EDA (electro dermal activity) and ECG (electrocardiogram). The concluding section describes the progress achieved so far, which includes defining datasets using a conceptual design, microprocessor electronics and data acquisition using Matlab. Lastly, there is brief discussion of future plans to develop this research.

Jonathan Weinel, Stuart Cunningham (2017)Simulating Auditory Hallucinations in a Video Game, In: Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences131930a18pp. 1-7 ACM

In previous work the authors have proposed the concept of 'ASC Simulations': including audio-visual installations and experiences, as well as interactive video game systems, which simulate altered states of consciousness (ASCs) such as dreams and hallucinations. Building on the discussion of the authors' previous paper, where a large-scale qualitative study explored the changes to auditory perception that users of various intoxicating substances report, here the authors present three prototype audio mechanisms for simulating hallucinations in a video game. These were designed in the Unity video game engine as an early proof-of-concept. The first mechanism simulates 'selective auditory attention' to different sound sources, by attenuating the amplitude of unattended sources. The second simulates 'enhanced sounds', by adjusting perceived brightness through filtering. The third simulates 'spatial disruptions' to perception, by dislocating sound sources from their virtual acoustic origin in 3D space, causing them to move in oscillations around a central location. In terms of programming structure, these mechanisms are designed using scripts that are attached to the collection of assets that make up the player character, and in future developments of this type of work we foresee a more advanced, standardised interface that models the senses, emotions and state of consciousness of player avatars.
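To illustrate the kind of mechanism described above, the sketch below shows how the 'spatial disruption' effect might be approximated as a Unity C# script attached to a sound-emitting object. This is not the authors' published code; the class name, field names and ranges are illustrative assumptions.

    using UnityEngine;

    // Illustrative sketch only: drifts an AudioSource around its original position
    // to approximate the 'spatial disruption' mechanism described in the abstract.
    [RequireComponent(typeof(AudioSource))]
    public class SpatialDisruption : MonoBehaviour
    {
        [Range(0f, 5f)] public float radius = 1.5f;   // how far the source drifts (metres)
        [Range(0f, 5f)] public float rate = 0.5f;     // oscillation rate (cycles per second)
        [Range(0f, 1f)] public float intensity = 0f;  // 0 = no effect, 1 = full effect

        private Vector3 origin;

        void Start()
        {
            origin = transform.position;              // remember the true acoustic origin
        }

        void Update()
        {
            // Circular oscillation around the origin, scaled by the current intensity.
            float phase = 2f * Mathf.PI * rate * Time.time;
            Vector3 offset = new Vector3(Mathf.Cos(phase), 0f, Mathf.Sin(phase));
            transform.position = origin + offset * (radius * intensity);
        }
    }

Comparable scripts could attenuate an AudioSource's volume for unattended sources, or drive an AudioLowPassFilter's cutoff frequency to adjust perceived brightness, corresponding to the first and second mechanisms.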

Jonathan Weinel, Richard Picking, Stuart Cunningham, Lyall Williams (2015)Holophonor: Designing the Visual Music Instruments of the Future, In: Recent Advances in Ambient Intelligence and Context-Aware Computingpp. 143-154 Igi Global

This chapter considers the technological feasibility of the Holophonor, a fictional audio-visual instrument from the science fiction cartoon Futurama. Through an extended discussion of the progression of visual music towards interactive models, it is proposed that the Holophonor is an example of an ideal visual music instrument and could be constructed in the near future. This chapter recapitulates the key features of the fictional instrument. An evaluation of the technological feasibility of building a real-world version of the Holophonor is then given, with reference to existing technologies. In particular, it is proposed that the Holophonor's ability to respond to the emotional state of the performer may be facilitated by drawing on approaches from HCI and affective computing. Following this, a possible architecture for the Holophonor is proposed.

Jonathan Weinel (2019)Cyberdream VR: Visualizing Rave Music and Vaporwave in Virtual Reality, In: Proceedings of the 14th International Audio Mostly Conference: A Journey in Soundpp. 277-281 Assoc Computing Machinery

Cyberdream VR is a short artistic virtual reality (VR) experience, which is based on the concept of visualizing the imaginative worlds suggested by rave music and vaporwave as symbolic, spatial, virtual environments. The piece is conceptualized as a virtual hallucination through the broken techno-utopias of cyberspace. Aesthetically, the work adapts the forms of 1990s VJ performance, demoscene animations, and the visual language of rave fliers from this era, constructing these forms as virtual spaces that the user can enter into through VR. Drawing upon the Internet-borne music subculture vaporwave, the piece also deconstructs the visual language of 1990s techno-utopian computer culture. By transporting the user into the imaginative worlds suggested by rave music and vaporwave, Cyberdream VR more broadly indicates a possible approach to visualizing music that prioritizes symbolic representation. In the future, this approach could be applied for other types of music and yield new transformative approaches to music visualization that may eventually be automated.

Stuart Cunningham, Jonathan Weinel, Shaun Roberts, Vic Grout, Darryl Griffiths (2013)Initial objective & subjective evaluation of a similarity-based audio compression technique, In: Proceedings of the 8th Audio Mostly Conference1pp. 1-6 ACM

In this paper, we undertake an initial evaluation of a recently developed audio compression approach: Audio Compression Exploiting Repetition (ACER). This is a novel compression method that employs dictionary-based techniques to encode repetitive musical sequences that naturally occur within musical audio. As such, it is a lossy compression technique that exploits human perception to achieve data reduction. To evaluate the output from the ACER approach, we conduct a pilot evaluation of the ACER coded audio, employing both objective and subjective testing to validate the approach. Results show that the ACER approach is capable of producing compressed audio whose subjective and objective quality grades vary in line with the amount of compression desired, configured by setting a similarity threshold value. Several lessons are learned and suggestions given as to how a larger, enhanced series of listening tests will be taken forward in future, as a direct result of the work presented in this paper.

Stuart Cunningham, Jonathan Weinel, Richard Picking (2018)High-Level Analysis of Audio Features for Identifying Emotional Valence in Human Singing, In: Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotiona37pp. 1-4 Assoc Computing Machinery

Emotional analysis continues to be a topic that receives much attention in the audio and music community. The potential to link together human affective state and the emotional content or intention of musical audio has a variety of application areas in fields such as improving user experience of digital music libraries and music therapy. Less work has been directed into the emotional analysis of human a cappella singing. Recently, the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) was released, which includes emotionally validated human singing samples. In this work, we apply established audio analysis features to determine if these can be used to detect underlying emotional valence in human singing. Results indicate that the short-term audio features of: energy; spectral centroid (mean); spectral centroid (spread); spectral entropy; spectral flux; spectral rolloff; and fundamental frequency can be useful predictors of emotion, although their efficacy is not consistent across positive and negative emotions.
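As a concrete illustration of one of the short-term features listed above, the following C# helper sketches how a spectral centroid could be computed for a single analysis frame from FFT magnitudes. This is a generic textbook formulation under stated assumptions, not the feature extraction code used in the study.

    // Hypothetical helper: spectral centroid of one frame, i.e. the
    // magnitude-weighted mean of the FFT bin frequencies.
    static double SpectralCentroid(double[] magnitudes, double sampleRate, int fftSize)
    {
        double weightedSum = 0.0, total = 0.0;
        for (int k = 0; k < magnitudes.Length; k++)
        {
            double binFrequency = k * sampleRate / fftSize;   // centre frequency of bin k (Hz)
            weightedSum += binFrequency * magnitudes[k];
            total += magnitudes[k];
        }
        return total > 0.0 ? weightedSum / total : 0.0;       // centroid in Hz; 0 for a silent frame
    }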

Stuart Cunningham, Harrison Ridley, Jonathan Weinel, Richard Picking (2021)Supervised machine learning for audio emotion recognition Enhancing film sound design using audio features, regression models and artificial neural networks, In: Personal and ubiquitous computing25(4)pp. 637-650 Springer Nature

The field of Music Emotion Recognition has become an established research sub-domain of Music Information Retrieval. Less attention has been directed towards the counterpart domain of Audio Emotion Recognition, which focuses upon detection of emotional stimuli resulting from non-musical sound. By better understanding how sounds provoke emotional responses in an audience, it may be possible to enhance the work of sound designers. The work in this paper uses the International Affective Digital Sounds set. A total of 76 features are extracted from the sounds, spanning the time and frequency domains. The features are then subjected to an initial analysis to determine what level of similarity exists between pairs of features, measured using Pearson's r correlation coefficient, before being used as inputs to a multiple regression model to determine their weighting and relative importance. The features are then used as the input to two machine learning approaches, regression modelling and artificial neural networks, in order to determine their ability to predict the emotional dimensions of arousal and valence. It was found that a small number of strong correlations exist between the features, and that a greater number of features contribute significantly to the predictive power of emotional valence, rather than arousal. Shallow neural networks perform significantly better than a range of regression models, and the best performing networks were able to account for 64.4% of the variance in prediction of arousal and 65.4% in the case of valence. These findings are a major improvement over those encountered in the literature. Several extensions of this research are discussed, including work related to improving data sets as well as the modelling processes.
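The pairwise feature-similarity step mentioned above relies on Pearson's r, which could be computed along the lines of the C# sketch below; this is a standard formulation rather than code taken from the paper.

    // Pearson's r correlation between two equal-length feature vectors.
    static double PearsonR(double[] x, double[] y)
    {
        int n = x.Length;
        double meanX = 0.0, meanY = 0.0;
        for (int i = 0; i < n; i++) { meanX += x[i]; meanY += y[i]; }
        meanX /= n; meanY /= n;

        double covariance = 0.0, varX = 0.0, varY = 0.0;
        for (int i = 0; i < n; i++)
        {
            double dx = x[i] - meanX, dy = y[i] - meanY;
            covariance += dx * dy;
            varX += dx * dx;
            varY += dy * dy;
        }
        // r close to +/-1 indicates a strongly redundant feature pair.
        return covariance / System.Math.Sqrt(varX * varY);
    }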

Jonathan Weinel, Richard Picking, Stuart Cunningham, Lyall Williams (2015)Holophonor: On the Future Technology of Visual Music, In: Recent Advances in Ambient Intelligence and Context-Aware Computingpp. 248-261 Igi Global

This chapter discusses the progression of visual music and related audio-visual artworks through the 20th Century and considers the next steps for this field of research. The principles of visual music are described, with reference to the films of early pioneers such as John Whitney. A further exploration of the wider spectrum of subsequent work in various audio-visual art forms is then given. These include visualisations, light synthesizers, VJ performances, digital audio-visual artworks, projection mapping artworks, and interactive visual music artworks. Through consideration of visual music as a continuum of related work, the authors consider the Holophonor, a fictional audio-visual instrument, as an example of the ideal visual music instrument of the future. They conclude by proposing that a device such as the Holophonor could be constructed in the near future by utilising inter-disciplinary approaches from the fields of HCI and affective computing.

Jonathan Weinel (2021)Synaesthetic Audio-Visual Sound Toys in Virtual Reality, In: Proceedings of the 16th International Audio Mostly Conferencepp. 135-138 ACM

This paper discusses the design of audio-visual sound toys in Cyberdream, a virtual reality music visualization. While an earlier version of this project for Oculus GearVR provided a journey through audio-visual environments related to 1990s rave culture, the most recent iteration for Oculus Quest provides the addition of three audio-visual sound toys, the discussion of which is the main focus of this paper. In the latest version, the user flies through synaesthetic environments, while using the interactive controllers to manipulate the audio-visual sound toys and 'paint with sound'. These toys allow the user to playfully manipulate sound and image in a way that is complementary to, and interfaces with, the audio-visual backdrop provided by the VR music visualization. Through the discussion of novel approaches to design, the project informs new strategies in the field of VR music visualizations.

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel (2015)A self-report study that gauges perceived and induced emotion with music, In: 2015 Internet Technologies and Applications (ITA)7317402pp. 239-244 IEEE

This paper discusses a particular study that gauges emotion with respect to music by means of an online self-report survey. This is part of the ongoing construction of an intelligent mobile music player application that will assess one's activity, environmental context, and physiological state. The study was structured so as to acquire emotional information by giving each participant a discrete affect word, including 'happy', 'excited', 'angry', 'afraid', 'miserable', 'sad', 'tired', and 'relaxed'. Each affect word that was selected for a given song was corroborated by a degree of intensity using a 5-point scale. The fundamental objective of this study was to measure how each song could be described emotionally, and how each song made participants feel emotionally. Pearson's Chi-Squared concluded that 95% of the ratings for both 'described' and 'induced' emotions were statistically significant. The corresponding results were scaled to the proposed circular-based emotional model based on Russell's Circumplex model of emotion, by converting both sets of emotional ratings to polar coordinates. Further analysis of the data subsequently showed that the affect words that represent music could be given some granularity around the perimeter of the circle by expanding upon particular properties of Russell's circular ordering of affect words. Lastly, this paper concludes with a section that outlines the future work for this research.
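The conversion step mentioned above, placing a rating on Russell's circumplex via polar coordinates, can be illustrated with the following C# sketch. The scaling to [-1, 1] and the function name are assumptions made for illustration rather than the study's actual implementation.

    // Hypothetical illustration: convert a (valence, arousal) pair, assumed to be
    // scaled to [-1, 1] with the circumplex centred on (0, 0), to polar coordinates.
    static (double angleRadians, double radius) ToCircumplexPolar(double valence, double arousal)
    {
        double radius = System.Math.Sqrt(valence * valence + arousal * arousal);  // rating intensity
        double angle = System.Math.Atan2(arousal, valence);  // anticlockwise from the positive valence axis
        return (angle, radius);
    }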

Jonathan Weinel (2019)Augmented Unreality, In: Mark Grimshaw-Aagaard, Mads Walther-Hansen, Martin Knakkergaard (eds.), The Oxford Handbook of Sound and Imagination, Volume 2 Oxford University Press

Jonathan Weinel deals with the representation of hallucinations within audiovisual media. Weinel forms his discussion around the concept of augmented unreality, providing examples from films, VJ performances, video games, and other audiovisual media to show how sounds are used to represent hallucinations. His focus is on the material design of the representations of hallucinations, and he discusses how the form of visual and auditory hallucinations may serve as the basis for audiovisual artworks. In conclusion, Weinel provides a set of structural norms that define representations of psychedelic hallucinations, and he hypothesizes that, given improvement of digital technologies, the boundaries between external reality and synthetic unreality might gradually dissolve.

Jonathan Weinel, Stuart Cunningham, Nathan Roberts, Darryl Griffiths, Shaun Roberts (2015)Quake Delirium EEG, In: 2015 Internet Technologies and Applications (ITA)7317420pp. 335-338 IEEE

Altered states of consciousness (ASC) can be represented in video games through appropriate use of sound and computer graphics. Our research seeks to establish systematic methods for simulating ASC using computer sound and graphics, to improve the realism of ASC representations in video game engines. Quake Delirium is a prototype `ASC Simulation' that we have created by modifying the video game Quake. Through automation of various graphical parameters that represent the conscious state of the game character, hallucinatory ASC are represented. While the initial version of Quake Delirium utilised a pre-determined automation path to produce these changes, we propose that immersion may be improved by providing the user with a `passive' method of control, using a brain-computer interface (BCI). In this initial trial, we explore the use of a consumer-grade electroencephalograph (EEG) headset for this purpose.
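The 'passive' control idea described above amounts to mapping a slowly varying EEG reading onto the parameters that drive the hallucination effects. A minimal, engine-agnostic C# sketch of such a mapping is shown below; note that the actual project modified Quake, and the smoothing scheme, value range and names here are assumptions for illustration only.

    // Maps a consumer-EEG reading (e.g. an 'attention' value in the range 0-100)
    // onto a hallucination intensity in [0, 1], with exponential smoothing so the
    // graphical parameters change gradually rather than jittering.
    class EegParameterMapper
    {
        private double smoothed = 0.5;
        private readonly double alpha;   // smoothing factor per sample, 0..1

        public EegParameterMapper(double alpha = 0.05) { this.alpha = alpha; }

        public double Update(double eegValue0to100)
        {
            double normalised = System.Math.Min(1.0, System.Math.Max(0.0, eegValue0to100 / 100.0));
            smoothed = (1.0 - alpha) * smoothed + alpha * normalised;
            return smoothed;   // could drive fog density, field of view, colour shifts, etc.
        }
    }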

Stuart Cunningham, Iain McGregor, Jonathan Weinel, John Darby, Tony Stockman (2023)Towards a Framework of Aesthetics in Sonic Interaction, In: Proceedings of the 18th International Audio Mostly Conferencepp. 109-115 ACM

As interaction design has advanced, increased attention has been directed to the role that aesthetics play in shaping factors of user experience. Historically stemming from philosophy and the arts, aesthetics in interaction design has gravitated towards visual aspects of interface design thus far, with sonic aesthetics being underrepresented. This article defines and describes key dimensions of sonic aesthetics by drawing upon the literature and the authors’ experiences as practitioners and researchers. A framework is presented for discussion and evaluation, which incorporates aspects of classical and expressive aesthetics. These aspects of aesthetics are linked to low-level audio features, contextual factors, and user-centred experiences. It is intended that this initial framework will serve as a lens for the design, and appraisal, of sounds in interaction scenarios and that it can be iterated upon in the future through experience and empirical research.

Stuart Cunningham, Jonathan Weinel, Darryl Griffiths (2014)ACERemix, In: Proceedings of the 9th Audio Mostly: A Conference on Interaction With Sound2014-(October)7pp. 1-7 ACM

In this paper we discuss the use of a recently developed audio compression approach, Audio Compression Exploiting Repetition (ACER), as a compositional tool for glitch composition and remixing. ACER functions by repeating similar sections of audio where they occur in a file and discarding the repetitive data. Thresholds for similarity can be defined using this approach, allowing for various degrees of (dis)similarity between materials identified as 'repetitive'. Through our initial subjective evaluation of ACER, we unexpectedly discovered that the compression method produced musically interesting results on some materials with higher levels of compression. Whilst listeners found this level of loss of fidelity to be unacceptable for the purposes of compression, it shows potential as a performance or production tool. When applied to pop songs the predictable form of the music was disrupted, introducing moments of novelty, while retaining the songs' quantized rhythmic structure. In this paper we propose the use of ACER as a suitable method for producing sonic materials for 'glitch' composition. We present the use of ACER for this purpose with regards to a variety of materials that may be suitable for glitch or electroacoustic composition, and using ACER in several different ways to process and reproduce musical audio.

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel, Richard Picking (2021)A multi-genre model for music emotion recognition using linear regressors, In: Journal of new music research50(4)pp. 355-372 Routledge

Making the link between human emotion and music is challenging. Our aim was to produce an efficient system that emotionally rates songs from multiple genres. To achieve this, we employed a series of online self-report studies, utilising Russell's circumplex model. The first study (n = 44) identified audio features that map to arousal and valence for 20 songs. From this, we constructed a set of linear regressors. The second study (n = 158) measured the efficacy of our system, utilising 40 new songs to create a ground truth. Results show our approach may be effective at emotionally rating music, particularly in the prediction of valence.

Stuart Cunningham, Harrison Ridley, Jonathan Weinel, Richard Picking (2019)Audio Emotion Recognition using Machine Learning to support Sound Design, In: Proceedings of the 14th International Audio Mostly Conference: A Journey in Soundpp. 116-123 Assoc Computing Machinery

In recent years, the field of Music Emotion Recognition has become established. Less attention has been directed towards the counterpart domain of Audio Emotion Recognition, which focuses upon detection of emotional stimuli resulting from non-musical sound. By better understanding how sounds provoke emotional responses in an audience it may be possible to enhance the work of sound designers. The work in this paper uses the International Affective Digital Sounds set. Audio features are extracted and used as the input to two machine-learning approaches: regression modelling and artificial neural networks, in order to predict the emotional dimensions of arousal and valence. It is found that shallow neural networks perform better than a range of regression models. Consistent with existing research in emotion recognition, prediction of arousal is more reliable than that of valence. Several extensions of this research are discussed, including work related to improving data sets as well as the modelling processes.

Jonathan Weinel, Stuart Cunningham, Darryl Griffiths (2014)Sound through the rabbit hole, In: Proceedings of the 9th Audio Mostly: A Conference on Interaction With Sound2014-(October)3pp. 1-8 ACM

As video game developers seek to provide increasing levels of realism and sophistication, there is a need for game characters to be able to exhibit psychological states, including 'altered states of consciousness' (ASC), realistically. 'Auditory hallucination' (AH) is a feature of ASC in which an individual may perceive distortions to auditory perception, or hear sounds with no apparent acoustic origin. Appropriate use of game sound may enable realistic representations of these sounds in video games. However, achieving this requires rigorous approaches informed by research. This paper seeks to inform the process of designing sounds based on auditory hallucination, by reporting the outcomes of analysing nearly 2000 experience reports that describe drug-induced intoxication. Many of these reports include descriptions of auditory hallucination. Through analysis of these reports, our research establishes a classification system, which we propose can be used for designing sounds based on auditory hallucination.

Jonathan Weinel, Darryl Griffiths, Nathan Roberts, Stuart Cunningham, Shaun Roberts (2014)EEG as a Controller for Psychedelic Visual Music in an Immersive Dome Environment, In: Electronic workshops in computing (Online) BCS, The Chartered Institute for IT

Electronic Visualisation and the Arts (EVA 2014) - Jonathan Weinel, Stuart Cunningham, Nathan Roberts, Shaun Roberts and Darryl Griffiths - Psych Dome (Weinel 2013a) is a short interactive piece of visual music, first presented in an immersive 'full dome' environment, which forms part of the authors' ongoing research regarding Altered States of Consciousness (ASC) as a basis for the design of computer-based artworks.

Jonathan Weinel (2016)Entoptic Phenomena in Audio: Categories of Psychedelic Electroacoustic Composition, In: Contemporary music review35(2)pp. 202-223 Taylor & Francis

Altered states of consciousness (ASCs) are perceptual states, such as dream, delirium, or hallucination, that fall beyond a commonly accepted normal waking consciousness. This article discusses Entoptic Phenomena in Audio, a collection of four electroacoustic compositions that are based upon the author's research regarding these states. The compositional process utilised involves consideration of the typical features and structure of hallucinatory experiences, as described by participants in psychological studies and other available literature. Typical features of hallucination are then used to indicate the design of corresponding sonic materials, and the structure of the composition. This compositional process is described in detail, leading to a generalised structural approach for creating electroacoustic compositions based on ASCs, with several possible variations. In addition, the decision to present the works on 12-inch vinyl is also discussed, as are the ways in which this project interfaces with electronic dance music culture.

Stuart Cunningham, Jonathan Weinel (2016)The Sound of the Smell (and taste) of my Shoes too, In: Proceedings of the Audio Mostly 20164-06-2986456pp. 28-33 ACM

This work discusses the basic human senses (sight, sound, touch, taste and smell) and the way in which it may be possible to compensate for the lack of one or more of these by explicitly representing stimuli using the remaining senses. There may be many situations or scenarios where not all five of these base senses are being stimulated, either because of an optional restriction or deficit, or because of a physical or sensory impairment such as loss of sight or touch sensation. Related to this, there are other scenarios where sensory matching problems may occur. For example: a user immersed in a virtual environment may have a sense of smell from the real world that is unconnected to the virtual world. In particular, this paper is concerned with how sound can be used to compensate for the lack of other sensory stimulation and vice-versa. As a link is well established already between the visual, touch, and auditory systems, more attention is given to taste and smell, and their relationship with sound. This work presents theoretical concepts, largely oriented around mapping other sensory qualities to sound, based upon existing work in the literature and emerging technologies, to discuss where particular gaps currently exist, how emotion could be a medium for cross-modal representations, and how these might be addressed in future research. It is postulated that descriptive qualities, such as timbre or emotion, are currently the most viable routes for further study and that this may be later integrated with the wider body of research into sensory augmentation.

Jonathan Weinel, Stuart Cunningham, Jennifer Pickles (2018)Deep Subjectivity and Empathy in Virtual Reality: A Case Study on the Autism TMI Virtual Reality Experience, In: M Filimowicz, Tzankova (eds.), New Directions in Third Wave Human-Computer Interaction: Volume 1 - Technologies183pp. 183-203 Springer Nature

The Autism Too Much Information (TMI) Virtual Reality Experience is a virtual reality (VR) application produced by The National Autistic Society (NAS) as part of an awareness campaign. The design of the application creates a short narrative simulation from a first-person perspective, which conveys aspects of what it may be like for a child on the autistic spectrum to experience a stressful situation precipitated by environments with 'too much information'. The application is part of a recent trend in VR and 360-degree video to create simulations of subjective experience as a means to generate empathy. Yet the success of such tools depends significantly on how well sound and graphics can be used to communicate such experiences in a meaningful way. In this article, we provide a case study of the Autism TMI Virtual Reality Experience, as a means to unpack design issues for these simulations. Through an expert analysis and pilot study of user experience, we propose three distinct forms of subjective first-person simulation that may be produced in virtual reality. We argue that the Autism TMI Virtual Reality Experience exemplifies the third of these, 'deep subjectivity', which may lead to an improved sense of empathy by representing various aspects of multimodal perception and emotion. However, our study also suggests that VR may offer limited benefits over 360-degree video for generating a sense of empathy.

Jonathan Weinel (2021)Cyberdream: An Interactive Rave Music Visualization in Virtual Reality, In: Shuzo John Shiota, Ayumi Kimura, Christian Sandor, Maki Sugimoto (eds.), SIGGRAPH Asia 2021 XR3487609pp. 1-2 ACM

Virtual reality (VR) provides new opportunities for the design of interactive music visualizations. Exploring this area, Cyberdream is a prototype VR application realized through the author's practice-led research, which provides a journey through audio-visual environments based on the aesthetics of 1990s rave music. The project provides three audio-visual 'sound toys', which allow the user to interactively 'paint with sound', thereby facilitating creative play. Through its structural form and audio-visual sound toys, Cyberdream indicates new approaches for the design of music visualizations that harness the spatial properties of VR.

Jonathan Weinel, Stuart Cunningham (2015)Second Screen comes to the Silver Screen, In: 2015 Internet Technologies and Applications (ITA)7317381pp. 121-124 IEEE

'Second screen' technology is the use of mobile technologies and other computing devices to provide an additional screen that supports or augments the consumption of visually based media. The 'Second Screen comes to the Silver Screen' project sought to explore the feasibility of using second screen technology within a real-time cinema context, in order to augment and enhance the audience experience. The project included a review of technological feasibility, a consumer survey and two case studies. The results of the consumer survey and case studies are presented in this paper, the conclusions of which indicate a significant level of audience concern regarding the potential distraction that would be caused by second screen technology within the cinema. Conversely, the study does indicate some appetite for the use of mobile technologies to improve the cinema experience more generally, indicating a possible basis for further investigation regarding how this can be achieved appropriately.

Stuart Cunningham, Jonathan Weinel (2015)Second screen comes to the silver screen: A technology feasibility study regarding mobile technologies in the cinema, In: 2015 Internet Technologies and Applications (ITA)7317400pp. 228-232 IEEE

The act of `second screening' involves the use of an additional media screen such as provided by a mobile phone or tablet, to consume content alongside a primary screen such as a TV. In this paper we present findings of a study that explored the feasibility of using second screen technology in a cinema environment. A proposal for the application of second screening within the cinema context is discussed. Following this, the feasibility of the proposed design is considered in relation to available technologies, a consumer study and a market survey.

Jon Weinel (2025)Students want more authentic activity and less research-led teaching, In: Compass : the journal of learning and teaching at the University of Greenwich17(2)

This opinion piece argues that authentic activity which emphasises real-world practices may help to improve the student experience and employability. Through a discussion of five specific strategies used for enhancing authentic activity on a games development degree programme, the author argues that authentic activity may, in some situations, be a more appropriate guiding principle for designing teaching activities than research-led teaching.