The following is a selection of my music-related works that have a large electronic component.
Painted Resonance is a multimedia performance for ukulele and computer that uses programming in Max/MSP for real-time interactive audiovisual effects. It premiered at The Fidget Space in Philadelphia, PA in December 2024.
Painted Resonance was programmed in Max/MSP, a visual dataflow programming language that supports real-time signal processing and OpenGL graphics. The program involves user interfaces, real-time signal processing, trigonometry for animation, visual processing, significant data management, GPU/CPU optimization, and hierarchical abstractions. Max/MSP allowed me to build a completely generative system: the only upper limit on how many audiovisual loops you can record and how many visual bubbles you can generate is the point at which your processor can no longer keep up (which takes quite a while to reach).
The minimum hardware necessary during performance is a laptop, laptop charger, audio interface, microphone with mono cable, 2 TRS cables, stereo speakers, and 2 IEC cables for the speakers.
As seen in the video, a performer can play music and have the background color light up as they pluck notes. When they choose to record some notes into an audiovisual loop, the program analyzes the recording and generates visual bubbles associated with the notes they played. These bubbles travel across the screen as the associated loop's audio progresses, then replay from the beginning along with the audio. When an audiovisual loop is recorded, its visual bubbles are placed at a random location on the screen, and the loop's audio is panned to follow that visual location. The camera slowly rotates around the scene, creating visual interest that evolves over time.
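The audio-follows-visuals panning described above could be sketched outside Max/MSP as a mapping from a bubble's horizontal position to equal-power stereo gains. This is an illustrative Python sketch, not the actual patch; the function name and position range are assumptions:

```python
import math

def pan_gains(x_position: float) -> tuple[float, float]:
    """Map a horizontal screen position in [-1.0, 1.0] (left to right)
    to equal-power left/right gains, so a loop's audio follows its bubbles."""
    # Convert position to a pan angle between 0 (hard left) and pi/2 (hard right).
    angle = (x_position + 1.0) / 2.0 * (math.pi / 2.0)
    return math.cos(angle), math.sin(angle)

# A bubble placed at the center of the screen is heard equally in both speakers.
left, right = pan_gains(0.0)
```

An equal-power curve keeps perceived loudness roughly constant as a loop's bubbles sit anywhere across the screen.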
As the performance progresses, the performer has the option to start audiovisual noise. This overdrives their live ukulele playing and slightly distorts the visuals, guided by the sound of the overdriven ukulele. Musically, this decision lets the performer's soloing stand out timbrally against the loops, and provides an alternate sound that renews interest and engagement with the texture after the attention fatigue of hearing the same types of sounds and seeing the same types of images for a few minutes.
The performance ends with a third option the performer selects on the computer. This selection slowly increases the reverb so the audio dissolves into a wash of sound, locks the visual distortion into a feedback loop that no longer resets (creating noisy visuals), and slowly lowers the gain of the piece while the performer plays an ending. The performer can then lower the gain further as needed using the computer or their interface.
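The ending behavior described above amounts to two parameter ramps running against a clock. A minimal sketch of that idea, assuming a linear curve and a hypothetical 30-second duration (the actual patch's timing and curve may differ):

```python
def ending_automation(t: float, duration: float = 30.0) -> tuple[float, float]:
    """Given elapsed time t in seconds, ramp the reverb wet level from 0 to 1
    while ramping master gain from 1 toward 0. Duration and linearity are
    assumptions for illustration."""
    progress = min(max(t / duration, 0.0), 1.0)  # clamp to [0, 1]
    reverb_wet = progress          # audio dissolves into a wash of sound
    master_gain = 1.0 - progress   # piece slowly fades as the performer ends
    return reverb_wet, master_gain

wet, gain = ending_automation(15.0)  # halfway through the ending
```

Clamping the progress value means the ramps hold their final values if the performer keeps playing past the nominal duration.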
For further information on the project, click here
Codeposer was a semester-long project for the class Music & Technology 2 at Rensselaer Polytechnic Institute. Codeposer is a software program that generates computer-written music based on principles of algorithmic composition. It has two modes. The first allows you to write a piece of music in collaboration with the computer: you write a few notes, the computer responds and continues the phrase, then you respond and write more notes, and so on. The second mode has the computer write music by itself.
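The call-and-response mode described above can be illustrated with a toy Python sketch. This is not Codeposer's actual algorithm; the scale, step rule, and function names are hypothetical stand-ins for the kind of rule-based continuation an algorithmic composer builds on:

```python
import random

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI pitches, one octave of C major

def respond(user_notes: list[int], length: int = 4, seed: int = 0) -> list[int]:
    """Continue a phrase: start near the user's last note and take small
    steps within the scale."""
    rng = random.Random(seed)
    # Snap the last user note to the nearest scale degree.
    current = min(C_MAJOR, key=lambda p: abs(p - user_notes[-1]))
    phrase = []
    for _ in range(length):
        degree = C_MAJOR.index(current)
        # Move up or down by at most one scale step, staying in range.
        degree = max(0, min(len(C_MAJOR) - 1, degree + rng.choice([-1, 0, 1])))
        current = C_MAJOR[degree]
        phrase.append(current)
    return phrase

reply = respond([60, 64, 67])  # computer answers a C-major arpeggio
```

Seeding the random generator makes a given response reproducible, which is useful when alternating turns with a human collaborator.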
The code was later put to practical use in the composition Earnest Dishonesty, which mixes computer-generated sections with sections I wrote by hand to create an electronically generated piece that can be performed acoustically.
Codeposer was made in Python and Pure Data.
The video shown demonstrates a version of the project running. The audio is the piece Earnest Dishonesty.
This project runs in Max/MSP and uses Mari Kimura's Mugic motion sensor hardware to create live rhythmic visualizations alongside a ukulele performer, making for a more immersive musical experience. With the motion sensor attached to the performer's right hand, the project uses Max/MSP patches to process the motion sensor data and trigger the visuals when a new note is plucked. Future work on this project focuses on smoothing the visuals and processing the audio with effects based on motion thresholds.
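The pluck-triggering idea above boils down to rising-edge detection on the sensor's motion data. A minimal Python sketch of that logic, with hypothetical acceleration values and threshold (the actual Max/MSP patch may process the data differently):

```python
def detect_plucks(accel: list[float], threshold: float = 2.5) -> list[int]:
    """Return the sample indices where acceleration magnitude first rises
    above the threshold; each rising edge would fire one visual trigger."""
    triggers = []
    above = False
    for i, a in enumerate(accel):
        if a >= threshold and not above:
            triggers.append(i)  # new pluck detected: trigger a visual
        above = a >= threshold
    return triggers

hits = detect_plucks([0.1, 0.3, 3.0, 2.8, 0.2, 2.6, 0.1])
```

Tracking whether the signal is already above the threshold prevents one sustained hand motion from firing a burst of triggers.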
The following videos demonstrate the project:
An interest in using music and sound in novel ways led to this project. Its aim is to improve scientific accessibility by sonifying data, making it easier to understand when traditional scientific pedagogy fails. The project used Ruby and SonicPi to generate the audio. The Google Drive link shows the whole project.
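The core of data sonification is mapping data values onto audible parameters. The project itself used Ruby and SonicPi; this Python sketch just illustrates the basic mapping idea, with a hypothetical function name and pitch range:

```python
def sonify(values: list[float], low_midi: int = 48, high_midi: int = 84) -> list[int]:
    """Linearly map a data series onto a MIDI pitch range, so that trends
    in the data can be heard as rising or falling melodies."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    return [round(low_midi + (v - lo) / span * (high_midi - low_midi))
            for v in values]

pitches = sonify([0.0, 5.0, 10.0])  # rising data becomes a rising melody
```

Pitch is only one possible target; the same mapping could drive loudness, tempo, or timbre depending on which feature of the data should be foregrounded.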
In a collaboration between the Empire State Youth Orchestra and the Rensselaer Orchestra, select movements of Mussorgsky's Pictures at an Exhibition were played in concert with visualizations programmed for each movement. Working on this collaboration, I learned the Music:Eyes software and programmed the visualization for Movement 3, Bydlo.
Playing this in concert involved manually scrolling the software in time with the conductor & orchestra by using a hand crank. I do not have video of the concert, but the visualization with a recording of Bydlo can be seen at this link https://app.musiceyes.org/p/MmG6EvWhPVOHym/nushagarwal or in the video to the right.
Using Ableton Live and a contact microphone, I employed live processing on a ukulele in this performance. The processing was interactive, with automation that changed in response to an external foot pedal controller. Automated effects included delay, compression, feedback, and spectral resonance.
This is an electronic piece of music created using the DAWs Reaper and Beepbox. The piece was commissioned as the track for a video game mod for Rivals of Aether. It does not have a fixed duration and is separated into segments that loop based on the player's gameplay.
The top video is the track by itself and the bottom video shows gameplay of the mod.
The game mod is on Steam here.
Dysfunction is a piece of chamber music written for interactivity with live electronics. The live electronics run as a Pure Data patch, and the score includes directions for an audio engineer to perform the patch alongside the musicians. The piece was recorded in ProTools.
This is an electronic piece of music created in the DAW Beepbox. It was made to score an animated comic called Love Letter from a Scientist.
This was a piece of music I wrote and arranged for 12 ukuleles; lacking other ukulele players, I recorded all 12 parts myself and mixed them in ProTools.