Projects + Streams

Below are the amazing projects for ChangeUp's first jam!



Untitled Habitat

Project Lead(s): David Psutka & Xuan Ye

Description: An examination of the replicative nature of the language protocols of electronic music and digital performance. Electronic music/digital performance - despite typically being a realm of solo/individual expression - can be expanded, shared, re-expressed and multiplied across various formats, people and locations because of the properties of CV/gate, MIDI, DMX, etc. What is the specific nature of this translation? How does the initial idea evolve across platforms? Can ideas be expressed more powerfully with a dramatically cross-platform approach? Emerging from these technological parameters, the performance will construct an immersive environment that addresses the interconnectivity of sound and body, nature and machine.
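
To make this cross-protocol replication concrete (a purely illustrative sketch, not part of the piece), the same normalized gesture value could be fanned out as a MIDI control change via the mido library and as an 8-bit DMX channel level, with CV handled downstream by a MIDI-to-CV converter. The DMX sender below is a placeholder, not a real interface.

```python
# Illustrative sketch: one normalized control value re-expressed over two protocols.
# mido handles MIDI; send_dmx() is a stand-in for whatever DMX interface is used.
import mido

def send_dmx(channel, level):
    """Stand-in for a real DMX output (e.g. a USB-DMX interface); here it just logs."""
    print(f"DMX channel {channel} -> {level}")

def broadcast(value, midi_port, dmx_channel=1):
    """Express the same 0..1 value as MIDI CC 1 and as an 8-bit DMX level."""
    value = max(0.0, min(1.0, value))
    midi_port.send(mido.Message('control_change', control=1, value=int(value * 127)))
    send_dmx(dmx_channel, int(value * 255))    # same gesture, different resolution and medium

with mido.open_output() as port:               # a MIDI-to-CV box can translate this further
    broadcast(0.7, port)
```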

Xuan and David are looking to include some coders and synth/sound developers to help articulate this idea.

Music by Xuan + David as ACT! / Digital environments by Xuan / Programming + coding + tech integration by ???

Check out the artists' links below for more of their work! Xuan Ye: http://a.pureapparat.us/g-a-r-d-e-n
ACT!: https://actactact.bandcamp.com

Out of sight, out of mind

Project Lead(s): Julia Romanowski & Monica Bialobrzeski

Description: Audio-sensory experience exploring the "out of sight, out of mind" attitude of plastic consumption in Toronto.

Jammers: Didi Psomopoulos, Denis Titov

Writing shaders in the browser

Project Lead(s): Xavier Snelgrove

Description: Xavier will teach a workshop on how to program shaders. Shaders are computer programs that run on the graphics card to create real-time images and animations. Sites like http://thebookofshaders.com and https://www.shadertoy.com/ show some of the creative work people are doing with these powerful programming languages today. During the jam, participants will create shaders with these technologies that will be projected in an immersive space.
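
For anyone new to the idea, a fragment shader is a small program that the GPU runs once per pixel, every frame, in parallel. Real shaders are written in GLSL, but as a rough mental model (an illustration added here, not workshop material), the same per-pixel idea can be sketched on the CPU in Python with numpy:

```python
# CPU sketch of the per-pixel idea behind a fragment shader.
# Real shaders are written in GLSL and run on the GPU; this is only a mental model.
import numpy as np

WIDTH, HEIGHT = 320, 240

def pixel_color(x, y, t):
    """What a fragment shader does: given pixel coordinates (and time), return a colour."""
    u, v = x / WIDTH, y / HEIGHT              # normalized coordinates, like gl_FragCoord / resolution
    r = 0.5 + 0.5 * np.sin(t + u * 6.2831)    # red channel animated over time
    g = v                                     # green follows vertical position
    b = 0.5                                   # constant blue
    return r, g, b

def render(t):
    """Evaluate the 'shader' for every pixel of one frame (a GPU does this in parallel)."""
    ys, xs = np.mgrid[0:HEIGHT, 0:WIDTH]      # pixel coordinates for the whole frame
    r, g, b = pixel_color(xs, ys, t)
    frame = np.zeros((HEIGHT, WIDTH, 3))
    frame[..., 0], frame[..., 1], frame[..., 2] = r, g, b
    return frame

frame = render(t=0.0)   # a HEIGHT x WIDTH x 3 array of colours in [0, 1]
```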

Cortex

Project Lead(s): Peter Rahul, Projoy Roy, Karl Skene

Description: This project aims to show how the values of the surveillance society we live in, surrounded by algorithms we do not understand, interact with us on a conscious and unconscious level. The aim is to set up an automated visual system that runs with minimal human intervention - however, the system depends on the presence of humans to be meaningfully activated. The project presents light in several forms, including projection mapping, on-screen video, and laser light.

A Kinect sensor tracks the spatial position of the viewer. This data is used to control the speed and direction of one or more motors holding custom-made 3D objects. These objects are subjected to real-time dynamic projection mapping, meaning the projections follow their movement. Each limb of the viewer could be mapped to sound synthesis, allowing them to perform the audio/visual instrument with their body. This responsive object acts as a 'Big Brother', constantly watching and counteracting the audience's movements.
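
A rough sketch of that control loop, with the Kinect feed and motor driver stubbed out (they stand in for whatever SDK and hardware the team actually uses, and are assumptions of this write-up): the viewer's left/right offset from the room's centre sets the motor's speed and direction.

```python
# Hypothetical sketch of the Kinect-to-motor loop described above. The Kinect feed
# and motor driver are stubbed out; they are not part of the project as written.
import time

ROOM_WIDTH_M = 4.0        # assumed width of the tracked area, in metres

def get_viewer_position():
    """Stand-in for the Kinect skeleton feed: the viewer's x position in metres."""
    return 3.1            # a fixed fake reading; the real feed updates every frame

def set_motor(speed):
    """Stand-in for the motor driver: sign = direction, magnitude = speed (0..1)."""
    print(f"motor speed -> {speed:+.2f}")

def control_loop(frames=3):
    for _ in range(frames):                    # the installation would loop forever
        x = get_viewer_position()
        # Map the viewer's left/right offset from the room centre to motor speed:
        # standing in the middle stops the object, stepping sideways spins it.
        offset = (x - ROOM_WIDTH_M / 2) / (ROOM_WIDTH_M / 2)    # -1 .. +1
        set_motor(max(-1.0, min(1.0, offset)))
        time.sleep(1 / 30)                     # roughly the Kinect's frame rate

control_loop()
```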

Several surveillance cameras are set up in the room, recording the audience. The footage from these cameras is fed through video mixers and synthesizers. The resulting visuals (projected or displayed on CRT monitors) create symbiotic imagery that blends live camera feeds with algorithmic video textures generated in real time via hardware.

Finally, a Muse brain-sensing headband is set up in the space for participants to wear. The Muse headband measures brain activity via EEG; this activity is related to calmness/excitement, as well as concentration levels. This data is converted into MIDI and CV signals to generate additional visuals/audio, or to modulate the visuals/audio for the systems outlined above. These brainwaves can also be visualized on a modified Vectrex monitor.
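
As an illustration of the EEG-to-MIDI step (not the team's actual patch), a calm/concentration score in the range 0 to 1 could be turned into a MIDI control-change message with the mido library; a MIDI-to-CV interface downstream would then handle the CV side.

```python
# Minimal sketch of the EEG-to-MIDI conversion described above, using mido.
# How the calm/concentration score is extracted from the Muse's EEG band powers
# is up to the artists; a 0..1 value is assumed here.
import mido

def score_to_cc(score, control=1, channel=0):
    """Map a 0..1 calm/concentration score onto a 0..127 MIDI control-change value."""
    value = int(max(0.0, min(1.0, score)) * 127)
    return mido.Message('control_change', channel=channel, control=control, value=value)

with mido.open_output() as port:       # default MIDI output; a MIDI-to-CV box can sit behind it
    port.send(score_to_cc(0.42))       # e.g. a moderately calm reading modulates the visuals/audio
```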

Space permitting, all these signals can also be sent into a laser (provided by the artists) via ILDA control to visualize waveforms and generate 'rotoscope'-type effects using laser light.

Team Members: Emily Curran, Anna Pavlova, Janica, Olga Kolotylo, Lily Tse, Ali Sermol, Chanelle Hartwig

Party Party Party Jam

Project Lead(s): Dames Making Games

Description: This stream will be dedicated to creating small, engaging multiplayer interactive experiences expressly for exhibition at parties. We encourage alternative game interactions and welcome people of all levels of game-making experience. We will be running a game jam for the weekend of ChangeUp, spending it making games that invite participants to face off in quick interactions. As an organization we are tool-agnostic, but our members have experience in physical games, VR, and 2D and 3D game making. Feel free to suggest a project you'd like to build a team around, bring a team to create something you want to get a kickstart on, or come Friday evening to find a team looking for someone to join up!
Check out https://dmg.to for more info about Dames Making Games!

Jam the Jam

Project Lead(s): Trevor Blumas, Jonas Osmann

Description: The goal of this project is to gamify the entire jam event through an open-source, network-linked multimedia hub that can interact with, alter, re-code, modify, copy, distort, etc. all the other projects, in a way that forces improvisation, adaptability, change and malleability on the development and operations of the other projects and on the overall user experience of the ChangeUp jam weekend.

Team: Chris Marlow, Greg Wilson, Tim Etchells, Mark Fernández

By the end of the jam:

  • A localized server architecture and network framework will be built.
  • An access point will be created: when a user logs onto the network, a personalized webpage is assigned to them, built from data extracted from their initial login (a minimal sketch follows after this list).
  • When users click anywhere on their page, they are randomly connected to another user's home page.
  • A graph showing the network and each user node/webpage will be added, which can be used to navigate through the network.
  • New nodes are added in real time as new users log on and new pages are generated.
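
As a purely illustrative sketch of the access-point idea above (assuming Flask; the team has not named a stack), each visitor could be handed a page generated from their login data, and any click on that page jumps them to a random other user's page:

```python
# Minimal sketch of the "personalized page + random hop" idea, assuming Flask.
# The real hub's stack, data model, and graph view are up to the team.
import random
from flask import Flask, redirect, request, url_for

app = Flask(__name__)
users = {}   # user id -> data scraped at login (here: just their user agent)

@app.route('/')
def login():
    """Treat hitting the access point as a login and assign the visitor a page."""
    uid = request.remote_addr                  # crude user id for the sketch
    users[uid] = {'agent': request.headers.get('User-Agent', 'unknown')}
    return redirect(url_for('page', uid=uid))

@app.route('/u/<uid>')
def page(uid):
    """A personalized page; clicking anywhere jumps to a random other user's page."""
    data = users.get(uid, {})
    hop_url = url_for('hop', uid=uid)
    return (f"<html><body onclick=\"location.href='{hop_url}'\">"
            f"<h1>Node {uid}</h1><p>{data.get('agent', '')}</p>"
            f"<p>{len(users)} nodes on the network. Click anywhere to hop.</p>"
            "</body></html>")

@app.route('/u/<uid>/hop')
def hop(uid):
    """Pick a random other user node to connect to (or stay put if alone)."""
    others = [u for u in users if u != uid] or [uid]
    return redirect(url_for('page', uid=random.choice(others)))

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)   # serve on the local jam network
```

Nothing here persists between restarts; the graph view from the last bullet could be layered on top by exposing the users dictionary as JSON and drawing it in the browser.
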
Space in Sound

Project Lead(s): Dustin Good

Team: BFS (Curtis Clark, Sofie Mikhaylova, Clint Lee, David Sutherland)

Description: A music workshop on composing and improvising, with a focus on the use of sparsity and the power of simple relationships to define and fill empty space.

Introduce one object and dwell on the space it creates.

Allow it to move - imply tonality, colour, rhythm.

The introduction of a second object can solidify or drastically alter the implied space that was established by the listener's ear. The composer here can begin to assert more direction over the piece's entirety.

The objects can evolve with changes in timbre, rhythm, and most importantly, by interaction with one another, changing to suit a natural progression guided by the hand of the composer.

It is here that we can look to outside influences to guide the composition. Most of the sound design will be done on Modular Synthesizers.

Non-synth-wielding participants who can facilitate interactive systems that output MIDI are encouraged to join and help create a piece that visitors can interact with and influence.

Participants with compact synthesizer setups (modular, keyboard, computers) are encouraged to help create the patch and contribute their own objects to become part of the composition.

This is, in essence, a deeper look into the foundations of harmony, rhythm, and motif: seeking influence from a simple system and guiding the composition by listening, imagining, and facilitating the relationships and movement that are already hanging in the air, in the empty space.

The result will be a sonic composition with a focus on space, where individual parts in motion create harmony. These sounds and events will be controlled and influenced by external inputs - gestures and thoughts - converted to MIDI and then CV.

Toronto Love Notes

Project Lead(s): Yasmine Hay

Team Members: Lily Tse, Cat Bluemke, Jonathan Carroll

Description: "Imagine you are walking home from work after a not-so-great day and you get an alert on your phone: 'Pssst... a love note awaits you around the corner :)'. You open up the Love Note-Toronto app to find out where the AR love-note creature is hanging out today and you realize it really is just around the corner! You head to the location and aim your phone to find it. The love-note creature is waiting there smiling and prompts you to give it a hug by pressing the button on your phone app. Once you do, it relays a message an anonymous fellow Torontonian left: 'You light up my life and I love you.'"

This interactive AR experience will encourage people to share anonymous and non-targeted 'love notes' that anyone can access to make their day better or make them feel loved. Big cities need more ways to express love. By the end of the jam we hope to be able to present an experience video and a mock-up on an AR platform for jammers to try :)

Reverberate

Project Lead(s): Betty Zhang, Kvesche

Description: An intimate room installation which invites the audience to enter and experience an immersive environment. The room responds to the audience moving through the space and generates organic sound and visuals. The user will be able to modulate sonic texture and directionality with movement. This sonic landscape will be conditioned so as to harmonize with the visuals. The overall experience is an ambient room for people to spend some time in, reflecting on vulnerability.

Sound Space

Project Lead(s): Valentin Tsatskin, Jina Kim, Owen Lyons, Projoy Roy

Description: By the end of the Jam, the team will be able to generate music based on inputs of points moving in a two dimensional space. The points will come from a webcam feed, where a computer vision system will extract where people are in a room from a top-down perspective.

Taking this further: mapping movement into sound. Establish a looping musical form for each input point; if there are 3 points, there will be 3 separate looping musical forms. As different points interact, each point's speed of movement and the points' positioning in relation to each other will determine increases or decreases in the tempo and volume of that particular melody, thereby creating a kind of "orchestra of sounds" through dance.
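
One way to prototype that mapping (an assumption of this write-up, not the team's stated plan) is to run OpenCV background subtraction on the top-down webcam feed to get one centroid per person, then derive each point's tempo from its speed and its volume from its distance to the nearest other point:

```python
# Sketch of the top-down tracking and movement-to-music mapping described above.
# OpenCV handles the webcam and background subtraction; the tempo/volume values
# would then drive whatever looping musical forms the team builds (assumed here).
import math
import cv2

cap = cv2.VideoCapture(0)                          # top-down webcam
subtractor = cv2.createBackgroundSubtractorMOG2()  # separates moving people from the floor
previous = {}                                      # point index -> last centroid

def centroids(mask):
    """Centroid of each sufficiently large blob (i.e. each person seen from above)."""
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)   # drop shadow pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    points = []
    for c in contours:
        m = cv2.moments(c)
        if m['m00'] > 500:                         # ignore small noise blobs
            points.append((m['m10'] / m['m00'], m['m01'] / m['m00']))
    return points

while True:
    ok, frame = cap.read()
    if not ok:
        break
    points = centroids(subtractor.apply(frame))
    for i, (x, y) in enumerate(points):
        px, py = previous.get(i, (x, y))
        speed = math.hypot(x - px, y - py)         # pixels moved since the last frame
        tempo = 60 + min(speed, 60) * 2            # faster movement -> faster loop (60..180 BPM)
        nearest = min((math.hypot(x - ox, y - oy)
                       for j, (ox, oy) in enumerate(points) if j != i), default=0.0)
        volume = max(0.0, 1.0 - nearest / 500)     # closer to another dancer -> louder
        previous[i] = (x, y)
        # tempo and volume would be sent to the i-th looping musical form here
```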

Fluidity

Project Lead(s): Luc Palombo, Yuma Yanagisawa

Description: We will be making a Unity demo that showcases a photorealistic environment that can be manipulated using basic human interactions. Our intention is to manipulate the environment in a uniquely digital yet organic way. We intend to use stock assets from the Unity asset store.

By the end of the jam, we will have a working Unity project that lets users manipulate a photorealistic environment using basic human computer interaction.

I want a super power

Project Lead(s): Yang Sui, Sebastien Hart

Description: This project aims to map sound pitches to specific colours and display those colours on a computer screen, simulating the effects of synesthesia.
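
One simple mapping (an illustrative assumption, not necessarily the team's chosen scheme) wraps every pitch onto a single octave and spreads that octave around the hue circle, so a note and its octaves always get the same colour:

```python
# Illustrative pitch-to-colour mapping: one octave wrapped around the hue circle.
# How pitch is detected from the microphone, and the final palette, are up to the team.
import colorsys
import math

def frequency_to_rgb(freq_hz, reference_hz=261.63):
    """Map a frequency to an RGB colour; notes an octave apart share a hue."""
    octave_position = math.log2(freq_hz / reference_hz) % 1.0   # 0..1 within the octave (ref = middle C)
    r, g, b = colorsys.hsv_to_rgb(octave_position, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

for note, freq in [('C4', 261.63), ('E4', 329.63), ('G4', 392.00), ('C5', 523.25)]:
    print(note, frequency_to_rgb(freq))   # C4 and C5 print the same colour
```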

Moov

Project Lead(s): James Essex, Michael Landry

Description: We plan to make a playable experience for two people in which their co-movement generates effects. The core experience is already complete, but we would like to add more effects.

We also want to add screens to make it a standalone installation, ready to play during the pitch.