An experimental workflow crafted for an event representing the cutting edge of hard techno, featuring 999999999, MCMLXXXV, and more.
Using a small cluster of custom media servers, a new type of real time content creation pipeline was tested in public for the very first time. Low level optimizations to existing generative AI pipelines enabled rich and dynamic composites of pre-rendered content, generative visuals, and real time AI created imagery, all synthesized on the fly. Real time musical and audio analysis guided the AI generation on both the micro and macro scales, leading to a cohesive, dynamic show that continuously adapted to the music, rather than simple AI generated loops. The show fluidly shifted between using AI generated content as hero elements, virtual backgrounds, set extensions, and abstract imagery, seamlessly transitioning to and from abstract generative content that treated the LED wall as a complex lighting fixture.
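The micro-and-macro audio analysis described above can be sketched in miniature. A minimal illustration in Python, assuming numpy and a mono sample buffer; the function name, frame size, and window length are illustrative stand-ins, not the production pipeline:

```python
import numpy as np

def analysis_envelopes(samples, sr, frame=1024, macro_seconds=2.0):
    """Per-frame RMS loudness (micro scale) plus a slow rolling average
    of that envelope (macro scale) for section-level dynamics."""
    n_frames = len(samples) // frame
    frames = samples[:n_frames * frame].reshape(n_frames, frame)
    micro = np.sqrt(np.mean(frames ** 2, axis=1))    # fast, beat-level envelope
    win = max(1, int(macro_seconds * sr / frame))    # frames per macro window
    macro = np.convolve(micro, np.ones(win) / win, mode="same")
    return micro, macro

# Example: a quiet passage followed by a loud one
sr = 44100
t = np.arange(sr * 4) / sr
audio = np.sin(2 * np.pi * 220 * t)
audio[: len(audio) // 2] *= 0.1                      # quiet first half
micro, macro = analysis_envelopes(audio, sr)
```

The micro envelope jumps frame to frame while the macro envelope drifts slowly, which is one simple way to expose both timescales to downstream generation parameters.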
Location:
Knockdown Center, Brooklyn, NY
Lineup:
999999999
MCMLXXXV
Annie Lew
ÆON
Date:
01.27.24
Images Courtesy of Off Brand Project
In 2023 Human Person reached out to commission visuals for Calvin Harris’ long awaited return to Coachella’s main stage.
The selected concept was simple: Take the spectral data hidden within the audio of the song, and create something beautiful.
The target infrastructure was also simple yet tangibly difficult: Target an LED wall resolution greater than 10,000 pixels wide.
To handle the oversized canvas resolution, I created an optimized pipeline with custom shaders in TouchDesigner. This enabled real time editing at native resolution with lowered detail, to streamline musical automation of parameters, while still supporting slower-than-realtime exports at full quality. This pipeline allowed for repeated full resolution exports for client review on a single high powered computer, without the need for expensive cluster rendering typically associated with resolutions above 8K.
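The core idea of pulling a song's hidden spectral data can be sketched with numpy. A minimal example, assuming a mono sample buffer; the band layout and frame size are illustrative, not the shaders used in the actual piece:

```python
import numpy as np

def spectral_bands(samples, sr, n_bands=16, frame=2048):
    """FFT each windowed frame, then sum bin magnitudes into
    roughly log-spaced bands: one small control vector per frame."""
    n_frames = len(samples) // frame
    frames = samples[:n_frames * frame].reshape(n_frames, frame)
    mags = np.abs(np.fft.rfft(frames * np.hanning(frame), axis=1))
    # log-spaced band edges over the positive-frequency bins
    edges = np.unique(np.geomspace(1, mags.shape[1] - 1, n_bands + 1).astype(int))
    return np.add.reduceat(mags, edges[:-1], axis=1)

sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)   # a single 440 Hz partial
bands = spectral_bands(tone, sr)
```

Each frame yields a compact vector of band energies, which is the kind of signal that can drive shader parameters without hauling the full spectrum around.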
The final piece ultimately debuted to more than 100,000 concertgoers during Calvin Harris’ Returning To The Desert set on April 15th, 2023.
The Delta LaGuardia Terminal C hall features a digital media installation with 34 LED panels showcasing different scenes reacting to arriving and departing flights.
Working with ESI Design, an NBBJ studio, I co-designed and implemented a show control system to manage and render real time and prerendered content. This was implemented as a complex system within TouchDesigner. The software enabled seamless transitions between multiple generative scenes with a large canvas size.
Featured on Business Insider.
Image courtesy of Pavel Bendov / ArchExplorer.
The Brooklyn Mirage is an open air music venue located in Brooklyn, NY, featuring a 200ft long curved LED wall.
As one third of GGGlue, I co-designed the cutting edge system architecture for the live video pipeline - one of the very first electronic music event spaces in the United States to run a video infrastructure built on SMPTE ST 2110 (uncompressed video over IP networks). The Panasonic Kairos Platform was a key part of the design, enabling a single stable compositing source for the massive 14,976 x 1920 pixel canvas.
I frequently contributed my own visuals to shows, bringing in generative and real time visuals from programs like TouchDesigner, Notch, Wire, and Resolume Arena.
In addition, I frequently walked VJs, artists, and brands through the technical steps of adapting preexisting live shows and video content to the unusual ~64:9 aspect ratio, developing technical pipelines to smooth out this often intimidating process and reassure clients along the way.
Prior to 2022, I helped design visual systems and led visual operations for the Brooklyn Mirage’s previous 16 projector courtyard design.
Photo Credits:
PRISMS is a digital media installation featuring six 260-square-foot LED boxes suspended from the ceiling of HSBC Place, giving it the honor of being Edmonton’s first digital art installation.
Working with ESI Design, I designed and implemented a show control system to manage and render real time and prerendered content for PRISMS, programmed as a multi-GPU solution using TouchDesigner. The software enabled seamless transitions between multiple generative scenes with an extremely large canvas size.
From ESI Design’s release:
Like jewels in a jewel box, the installation, titled Prisms, mixes hyperlocal imagery with abstract art. It shows scenes inspired by Edmonton, the nighttime sky, and the ocean, and incorporates live data including time of day, season and weather.
As the Visual Team lead for Avant Gardner, I designed and implemented software for generative visuals in the Great Hall, for a show featuring exclusively live electronic musical acts.
Working with an approximately 4K resolution canvas size, I used this event as an opportunity to experiment with chaining multiple GPUs together in real time. As an example, the first GPU would render a smoke simulation, pass it to the second GPU to add a simulated layer of water refraction, and then pass it to the third GPU for visual effects and the final output remapping.
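The GPU chaining described above can be approximated as a staged pipeline, with each stage pulling frames from the previous one. A toy sketch using threads and queues in place of real GPUs (the stage functions are hypothetical stand-ins for the actual render passes):

```python
import queue, threading

def stage(fn, inbox, outbox):
    """One 'GPU' stage: pull a frame, transform it, pass it downstream."""
    def run():
        while True:
            frame = inbox.get()
            if frame is None:          # shutdown signal: propagate and exit
                outbox.put(None)
                return
            outbox.put(fn(frame))
    threading.Thread(target=run, daemon=True).start()

# Hypothetical stand-ins for the real render passes
simulate_smoke = lambda f: f + ["smoke"]
refract_water  = lambda f: f + ["water refraction"]
final_fx       = lambda f: f + ["fx + output remap"]

q0, q1, q2, q3 = (queue.Queue() for _ in range(4))
stage(simulate_smoke, q0, q1)
stage(refract_water, q1, q2)
stage(final_fx, q2, q3)

q0.put(["frame 0"])
q0.put(None)
out = q3.get()
print(out)  # ['frame 0', 'smoke', 'water refraction', 'fx + output remap']
```

The appeal of this shape is that each stage can run concurrently on its own device: once the pipeline is full, every GPU is busy on a different frame at the same time.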
This effort incorporated generative content from myself, digital media artists 404.zero, and various artists from the Shadertoy community.
Special thanks to Eric Chang (additional visual operation) and Brendan Zoltowski (visual footage capture)
The Charter Spectrum HQ in Stamford, CT features three 26 foot tall LED walls that display brand messaging, real time network statistics, and office communications.
Working with ESI Design, an NBBJ studio, I co-designed and implemented a show control system to manage and render real time and prerendered content. The software enabled seamless transitions between multiple generative scenes with a large canvas size, programmed as a complex solution with multithreaded Python alongside TouchDesigner.
In collaboration with Magnetic, I programmed two installations for Netflix's elaborate 'For Your Consideration' events, FYSEE. The premiere event in the space had over 50 celebrities attend, and was covered by many publications, including the NY Times, Variety, and Vanity Fair.
The first installation was an openFrameworks system that controlled 18 PCs, iPads, and iPhones, synchronizing them in a presentation of FYSEE nominees. It adhered to Netflix's 'card branding' in a generative way, so that new shows and their stills could shuffle in whatever combination was required for the talent present at the various FYSEE events. <Video Coming Soon>
The second was a TouchDesigner installation for Black Mirror featuring a high density LED wall hidden behind a 30 ft semitransparent dark mirrored surface. As attendees passed through a corridor, sensors detected their motion, playing media from the title sequence of the show and rendering a generative crack that followed attendees in the direction they were walking. <Video coming soon>
http://weremagnetic.com/awards-news/netflixs-fysee-campaign-makes-headlines/
https://www.nytimes.com/2017/06/19/business/media/for-your-consideration-an-increasingly-lavish-emmy-campaign-season.html?_r=1
http://variety.com/2017/digital/news/netflix-emmy-voters-fysee-experiential-space-la-1202402931/
https://www.vanityfair.com/hollywood/2017/05/netflix-goes-all-out-for-emmys-gold
The Brooklyn Mirage is an open air music venue with panoramic video projections lighting the courtyard walls.
In 2018 I became the visual lead for the space, leading all visual operations for multiple shows every week with up to 5,000 attendees each.
In 2019, I designed and rewrote the visual systems from scratch, creating a modular multi-GPU VJ app in TouchDesigner. This allows operators to work with computationally expensive generative visuals, like smoke and fluid simulations, in real time at 60fps over a 21,180 x 1080 resolution canvas spread across 16 projectors. I worked with visual artists 404.zero to integrate their generative TouchDesigner pieces in a custom plugin format, and extended this idea to support integration of Notch content.
I directly vj’ed for acts like Bonobo, Carl Cox, Hardwell, Jamie Jones, Guy Gerber, Seth Troxler, Tale of Us. I also created custom visual content and systems for clients like HBO, CityFox, Rezz, Anjunadeep, and Octave One. I frequently incorporated my own generative content, primarily consisting of GLSL shaders to simulate things like liquid, or biological systems.
Initial Visual System Design & Integration (2017) : Volvox Labs
Visual Systems Partner (2018) : Landon.VFX
Visual Operation & TouchDesigner Content Partner (2018-2020) : George Gleixner
Guest Artists (2019-2020): 404.zero
Visual Operation & Notch Content Partner (2019-2020): Eric Chang
Working with Magnetic Collaborative, I programmed a system of 7 TouchDesigner rigs, each playing back HAP Q 4K content on Sony's new flagship OLED displays for their launch party debut, all synchronized within a fractal mirror room designed by the Japanese artist Kaz Shirane.
http://weremagnetic.com/work/sony-tv-launch-event/
http://kaz-shirane.net/2017/05/04/sony_bravia-oled-evolve/
2018 | 4:18
Director: Jonathan Thompson | @pointshader
Music: Hyper Real by iTal Tek | @italtek
What worlds are hidden within the data of the music we listen to? Initially conceived as a technical experiment to visualize spectral data extracted from music, the project changed course when early prototypes began producing beautiful and intricate imagery: the idea arose to construct a world solely from the spectral data of a single song. The intricate lattice of intertwined frequencies contained within iTal Tek’s frantic arpeggiated melodies and footwork inspired rhythms forms the crystalline structure of an entire visual world.
Hyperreal premiered in Los Angeles at HYPERSPECTIVE, a forward facing dome film showcase created by META and RYOT.
During NYC Fashion Week 2018, YSL commissioned a pop-up at Beauté NYC to celebrate the photographer Kenneth Willardt. As guests walked down the runway suspended over water, my TouchDesigner systems used laser scanning sensors to track their movements, allowing LEDs in the walkway to light their steps. The same system controlled moving head lights to put the guests in the literal spotlight.
Designed by Joel Fitzpatrick Studios
A custom AV and laser set specifically designed for the Brooklyn Mirage, using Ableton, TouchDesigner, and Notch for a real-time AV workflow.
Special thanks to Alex Van Roon for lighting, and George Gleixner for FOH visual operation support.
As the Visual Team lead for Avant Gardner, I designed and implemented software for visual control of two major stage installations.
In the King’s Hall, I set up a system to individually projection map 76 skulls. Each skull was able to display a mix of mapped video content, and real time rendering of artificial lighting and materials.
In the Great Hall, I used Raymapper, my custom multi-gpu vj software, to mix videos and real time simulation of fluids and cellular systems over a cathedral-esque canvas that spanned 3 massive led wall assemblies.
Both installations relied on a mix of rendered content by Sila Sveta and my own generative content.
Additional Great Hall visual operators included George Gleixner and Eric Chang.
- Page Under Construction -
At Panorama 2018, I used a TouchDesigner system to project movies warped to guests’ faces in real time.
Kiss
By Guillermo Calderón
Company: Yale Repertory Theater
Venue: Yale Repertory Theater
Dates: April 27 – May 19th, 2018
I worked directly with Projection Designer Wladimiro Woyno to develop a custom multicamera video switcher and effects system in TouchDesigner.
https://www.yalerep.org/productions-and-programs/production/kiss
A standing double date in Damascus quickly escalates into farce as four friends unburden their hearts and reveal their secret passions. But as civil war wages outside, nothing is really what it seems to be. Kiss is a politically charged and emotionally resonant exploration of what gets lost in translation: the unfathomable human toll of a nation in chaos.
A custom AV and laser set specifically designed for the Brooklyn Mirage, using Ableton, TouchDesigner, and Notch for a real-time AV workflow. Visual highlights include a real time fluid system reacting to the architecture of the venue, and smoke simulations based on mocap of lyrics in the set.
A huge thanks to George Gleixner for contributing visuals for portions of this set.
For Squnto's 2018 'Strikes Back' 40 stop US bus tour, I created a completely automated lighting and visual control system. With a complex system architecture spanning Ableton, Resolume, Max, JACK, and multiple instances of TouchDesigner, the system allowed Squnto's lighting and visuals to reflect the songs being mixed on stage. Extracted musical data controlled macro lighting parameters in custom systems running at 240 FPS; combined with custom calibration methods, this allowed the lighting to be controlled with sub-5-millisecond accuracy in concert. On the largest stops, custom TouchDesigner systems with Etherdream DACs were also used to control lasers with the same level of timing accuracy.
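One common way to reach millisecond-level cue timing is to schedule against absolute beat timestamps and busy-wait the last stretch before each deadline. A minimal sketch of that general approach (not the production system, whose calibration methods aren't shown here):

```python
import time

def beat_times(bpm, start, n):
    """Absolute timestamps for n beats against a shared clock."""
    beat = 60.0 / bpm
    return [start + i * beat for i in range(n)]

def wait_until(deadline, coarse=0.002):
    """Sleep until ~2 ms before the deadline, then busy-wait the
    remainder: tighter than relying on a bare sleep() alone."""
    while True:
        remaining = deadline - time.perf_counter()
        if remaining <= 0:
            return
        if remaining > coarse:
            time.sleep(remaining - coarse)
        # else: spin until the deadline passes

cues = beat_times(120, 0.0, 4)
print(cues)  # [0.0, 0.5, 1.0, 1.5]
```

Scheduling cues by absolute timestamp (rather than firing on message arrival) keeps jitter from accumulating across a long set, since every cue is measured from the same clock origin.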
Lighting & Laser Control System - Jonathan Thompson
Audio & Visual Sequencing - Eric Roth (https://www.facebook.com/squnto)
Film & Additional Light Sequencing - George Gleixner (http://www.georgegleixner.com/)
LSO & Laser Prototyping Assistance - Grady Nouis
LSO - Jack Harding
Lighting & Visual Production by Pinnacle Production (http://www.pinnacleprodj.com/)
Photo Credit - Angel Park Photography
July 2016
An audiovisual performance created as Cyber Illusions’ Artist in Residence in Colombo, Sri Lanka.
I created a 35 minute mix of electronic music, then designed a story told by a core set of 3D visuals rendered in Blender using my render cluster pipeline established during Dreamwalker. These were augmented by generative visuals in TouchDesigner, as well as precise lighting, all running in an updated version of my ABSTRAKTION TouchDesigner system developed for MIDNIGHT. Additionally, landscape and silhouette footage was filmed in Sri Lanka.
Jonathan Thompson – Concept/Design/Audio/Lighting/Visuals
Lalindra Amarasekara – Design/Production
Anjalie Karunathilake - Stationery/Promotion
Taryn Gehman - Silhouette Model
Opening performance by Geve – Jambutek Recordings
Produced by:
Cyber Illusions
Event Productions
February 2016
Eleven minute short created as a case study for Blender techniques such as particle systems, physical rendering and shaders in Cycles, and procedural modeling.
During development, I established a multi-node render cluster pipeline on AWS. The render took approximately 300 hours over several batches on a 50 GPU cluster.
The audio is an excerpt of my mix, 'Voyagers Have To Say Goodbye'.
August 2016
Pettah Interchange is a celebration of alternative music, art and culture that has taken place annually since 2012 in abandoned and neglected urban spaces in Colombo, Sri Lanka.
This year's venue was the Transworks House (built in 1908), which originally housed the Ceylon Public Works Department and later served as the City Traffic Police headquarters. It is one of the last remaining buildings of its kind in the area left in a neglected state, though it will soon be renovated to serve as the entrance and lobby of an international hotel.
As Artist-in-residence with Cyber Illusions, I helped design the lighting for much of the Transworks Building, including the bottom floor (the 'Office'), and the projection mapped Dome area. I used a hybrid Ableton Live & TouchDesigner system so I could rhythmically tap out melodic and harmonic musical elements as lighting gestures and loop them. This let me perform lighting in the 'Office' for over 7 hours of sets that included Sri Lankan artists Geve, Asvajit, and Nigel Perera, as well as NYMA from Berlin.
2018 | 1:01
Music: George Gleixner | @gleix
Visual Programming: Jonathan Thompson | @pointshader
"In Light Of" is a post-minimalist ambient album recorded live on 8/29/2018, by musician George Gleixner. During the recording, I created a visual in TouchDesigner, focusing on a geometry shader allowing delicate strands to weave in complex patterns, gently responding to the spectrum of George’s music in 4k resolution, realtime. A refined version of this project was featured as the release video, and on the cover art for the CD and Cassette physical release on Lagom.
May 2016
Projection Mapping for the Sri Lankan punk band Sakwala Chakraya (which translates roughly from Sinhalese as 'Universe Portal'). The visuals I designed focused on creating the illusion of a 3D space behind the 16 windows used for the projection mapping. All visuals rendered in real time with TouchDesigner.
November 2015
In July 2015, I finished the most intricate electronic mix I had ever made. It included artists ranging from Machinedrum to Philip Glass to Daft Punk, combined with synthesizer recordings, lecture fragments, and field recordings for an elaborate journey over 75 minutes.
For the next 4 months, I pieced together MIDNIGHT as the complete audiovisual version of the mix. I designed a system in TouchDesigner, later named 'ABSTRAKTION', that builds on the Ableton / TouchDesigner Sync environment provided by Derivative. I aimed to create an immersive space for the experience with an LED ceiling net, traditional LED lighting fixtures, and projected visuals, all running in TouchDesigner. Most of the visuals were created and rendered in real time within TouchDesigner, though I also used Blender and SpaceEngine.
April 2013
Eleven minute drum and bass mix used as a technical demo for Max for Live lighting and visual patches developed as a recipient of the University of Virginia Undergraduate Award for Arts. Features a custom built LED helmet, LED wall, and other LED lighting fixtures.
Featured on cdm.link (previously CreateDigitalMotion)