EboSuite 2.0 Lets You Treat Video Like Audio In Ableton Live

[embedded content]

Superbooth 2022: The developers of EboSuite, a collection of video plug-ins for Ableton Live, have introduced EboSuite 2.0 – a major update that adds live video recording, opening up new ways of performing and working with video live.

Since EboSuite was introduced, the developers have expanded the number of plug-ins it offers from 6 to 54. They describe the collection as “the most advanced software to generate, play, manipulate and mix visuals in Ableton Live.”

“If you know Ableton Live, you know EboSuite,” they say. “It’s that simple.”

Features:

  • Treat video like audio
    • Play video in Session View
    • Edit video clips using Warp Markers, Loop, Start/End, etc.
    • Use Live’s audio mixer as a video mixer with blend modes
    • Mix video using Live’s crossfader with visual crossfade effects
    • Position, scale and rotate video tracks to make a visual arrangement
  • Get creative with video
    • Trigger video live or with MIDI clips
    • Pitch and tune video with different tuning modes
    • Simulate video scratch
    • Realtime time-stretching
    • Slicing, Velocity to opacity, audiovisual ADSR
    • One-shot, Loop, PreRoll, and more …
    • Fully automatable
  • Create your own visuals
    • ISF support, more than 1500 ISF effects and generators available for virtually unlimited visual possibilities
    • 34 image control, utility and creative distortion effects, incl. Luma/Chroma key, Mask and Blur
    • Organize effects in groups
    • Parameter smoothing
    • Easy to combine with Live’s audio effects, fully automatable
  • Other features:
    • Connect multiple webcams or iPhone and iPad cameras
    • Support for many video codecs (HAP, H.264, MPEG-4, PJPG, etc.)
    • Convert video to the HAP video codec within Live
    • Support for grouping, racks, linking
    • Create polyscopic compositions
    • Compile MIDI clips to a movie file within Live

Pricing and Availability

EboSuite 2.0 is available now for macOS for 159.00.

New Silhouette Eins Synthesizer Blurs The Line Between Sound & Image (Sneak Preview)

[embedded content]

Developer Johannes Pit Przygodda shared this sneak preview of the Silhouette Eins optical soundtrack synthesizer.

The Silhouette Eins follows in the tradition of instruments like Evgeny Murzin’s ANS synthesizer and Daphne Oram’s Oramics Machine – devices that directly synthesized music from graphical scores.

“I dreamt I was surfing inside a wave of sound,” says Przygodda. “Since 1992, I’ve been looking for the way to get there. Since 2014, I’ve been building the instrument. Now it is here.”

The Silhouette Eins synthesizer is designed to bridge the worlds of visuals and sound. It features a camera and light table, which can be used to capture images that are translated into sound. It can also use photographs and videos as sources.

The source visuals are shown on a display, where you can select the area that is used to generate audio waves. You can manipulate and modulate the visual to change the resulting sound, and use tools like LFOs and effects to create more complex sounds.
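The underlying optical-soundtrack idea – reading image brightness as an audio signal – can be illustrated in a few lines. Here is a minimal sketch (not the Silhouette Eins engine) that scans one horizontal strip of an image and loops it as a single-cycle waveform; the file name, row position and pitch are illustrative assumptions.

```python
# Minimal optical-soundtrack-style sketch: brightness values along a selected
# strip of an image are read out as an audio waveform. Assumes NumPy and Pillow.
import numpy as np
from PIL import Image
import wave

SAMPLE_RATE = 44100

def strip_to_wave(image_path, row=240, freq=110.0, seconds=2.0):
    """Read one horizontal strip of an image and loop it as a single-cycle waveform."""
    img = Image.open(image_path).convert("L")            # greyscale: brightness only
    row = min(row, img.height - 1)                       # keep the row inside the image
    strip = np.asarray(img)[row].astype(np.float32)      # one pixel row, left to right
    cycle = strip - strip.mean()                         # centre around zero
    cycle /= (np.abs(cycle).max() + 1e-9)                # normalise to -1..1

    # Resample the strip so one pass through it lasts one period of `freq`.
    period = int(SAMPLE_RATE / freq)
    phase = np.linspace(0, len(cycle), period, endpoint=False)
    one_cycle = np.interp(phase, np.arange(len(cycle)), cycle)

    # Tile the single cycle to the requested duration.
    return np.tile(one_cycle, int(seconds * freq))

def write_wav(path, samples):
    data = (np.clip(samples, -1, 1) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(data.tobytes())

write_wav("strip.wav", strip_to_wave("photo.png"))
```

Selecting a different row (or, in the Silhouette Eins, a different on-screen area) changes the waveform, which is why manipulating the visual changes the resulting sound.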

The Silhouette Eins also lets you output the display, so you can project the visuals that are generating the sound.

Features:

  • Live camera and light table
  • 1 optical soundtrack oscillator
  • 4 voices
  • 5 modulation waves
  • 7 modulation driver sources (LFOs, EGs, etc.)
  • 18 modulation addresses
  • Screen record
  • Reverb
  • Users can add their own pictures and movies as content
  • Trackpad for moving the selection area and for edit functions
  • 37-key full-size keyboard
  • Stereo output linkable to area position
  • Video output for audiovisual performance
  • External connections via MIDI

Audio Demos:

The Silhouette Eins is in development, and the demo video was created with prototype #1. Przygodda plans to debut the synthesizer at Superbooth 2021, scheduled for Sept 15-18 in Berlin. See the Silhouette site for more information.

Imaginando Intros MIDI-Controllable VS – Visual Synthesizer

[embedded content]

Imaginando has introduced VS – Visual Synthesizer, a new application for iOS, Mac & Windows that lets you use both audio and MIDI to control visual creations.

VS uses dynamic ‘materials’: a variety of graphical elements, from glowing particle effects to warping geometric planes, each with its own set of attributes. These materials can be layered on top of each other, then triggered and manipulated by modulating their parameters, providing an interactive way to create complex compositions.

Using MIDI, VS can be ‘played’ just like a regular synth, including polyphony. Multiple simultaneous notes translate into multiple visual voices.

VS’ extensive parameter and modulation functionality gives you detailed control over your visuals. Use any combination of LFOs, envelope generators, audio-reactive modulators and MIDI data to transform your creations.
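To make the polyphony idea concrete, here is a rough sketch – not Imaginando’s implementation – of how held MIDI notes might map to per-note visual voices, with an LFO nudging each voice’s parameters. The class names and the opacity/scale parameters are illustrative assumptions.

```python
# Illustrative sketch of polyphonic visual voices: each held MIDI note owns a
# voice whose layer parameters are offset by an LFO. Names are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class LFO:
    rate_hz: float = 0.5
    depth: float = 0.3
    def value(self, t):
        return self.depth * math.sin(2 * math.pi * self.rate_hz * t)

@dataclass
class VisualVoice:
    note: int
    velocity: int
    def params(self, t, lfo):
        # Velocity sets base opacity; the LFO gently modulates scale over time.
        return {
            "opacity": self.velocity / 127.0,
            "scale": 1.0 + (self.note - 60) / 24.0 + lfo.value(t),
        }

class VisualLayer:
    """Allocates one visual voice per held note, up to a fixed polyphony."""
    def __init__(self, max_voices=4):
        self.max_voices = max_voices
        self.voices = {}          # note -> VisualVoice
        self.lfo = LFO()

    def note_on(self, note, velocity):
        if len(self.voices) < self.max_voices or note in self.voices:
            self.voices[note] = VisualVoice(note, velocity)

    def note_off(self, note):
        self.voices.pop(note, None)

    def render_params(self, t):
        return [v.params(t, self.lfo) for v in self.voices.values()]

layer = VisualLayer()
layer.note_on(60, 100)   # two simultaneous notes -> two visual voices
layer.note_on(67, 80)
print(layer.render_params(t=0.25))
```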

VS is also available as an audio plugin, bringing the world of creative visualization into your DAW.

Features:

  • 8 Polyphonic visual layers
  • 1 background layer with solid color/image/video (no modulations)
  • 4 voices per layer
  • 50 built-in materials
  • Exclusive factory presets by @Perplex On
  • 4 LFOs
  • 2 EGs
  • 4 Dual-mode Audio Modulators (peak and band)

Pricing and Availability

VS – Visual Synthesizer is available for Windows, Mac and iOS, with the following intro pricing:

VS – Visual Synthesizer Desktop (Windows/MacOS):
Standard price: 99 EUR – Launch price: 49.5 EUR!

VS – Visual Synthesizer Mobile (iOS/iPadOS):
Standard price: 21.99 EUR – Launch price: 10.99 EUR!

VS is also available via Rent-To-Own for 9.99 EUR per month for 12 months.

Videosync Lets You Treat Video Like Audio In Ableton Live

[embedded content]

Dutch software developer Showsync shared this introduction to Videosync, described as ‘a visual engine for Ableton Live’.

Videosync enables you to treat video as audio inside Ableton Live. You can create visuals using Warp Markers, Racks, Macros, Simpler, Automation, Modulation and more.

Videosync’s instruments generate content intended as a source for mixing, blending, keying or displacement. Simpler rhythmically triggers your video content, and the External In instrument fetches input from Syphon or any external video source.

Other features include:

  • Multiple outputs – Use Live’s Return & Master channels as Syphon outputs, allowing up to 13 channels of video output into other applications.
  • Networked playback – Increase redundancy and spread workload in your live shows by running Videosync on a separate computer.
  • ISF shaders – Support for Interactive Shader Format, and ships with a plugin SDK to easily develop your own video plugins.
  • Native video playback – Uses macOS’s high-performance video playback frameworks to render video at the highest resolutions and frame rates that your system supports.
  • Format support – Support for all standard video formats, as well as HAP.
  • Native Apple Silicon support – Videosync ships as a universal binary for Intel and Apple Silicon.

Here’s an introduction to getting started with Videosync:

[embedded content]

Pricing and Availability:

Videosync is available now, with pricing starting at $99 USD. A demo version is also available.

Live Immersive Gestural Electronic Music

[embedded content]

Reader Carlos Martorell (Shoeg) shared this live performance, shot inside a kerosene tank on the island of Tenerife for the Keroxen festival.

Martorell’s performance translates gestures into MIDI data, controlling both the audio and audio-reactive visuals.

Here’s what he has to say about the technical details:

“Live set sound is completely generated with a Waldorf Kyra, and MIDI is generated with a couple of motion sensors with my hands. I also brought my own sound reactive visual stuff for the ultrapanoramical screen.”
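As a rough illustration of the gesture-to-MIDI idea (the actual sensor rig isn’t specified beyond the quote above), here is a minimal sketch that scales a hypothetical hand-tilt reading into a 7-bit value and packs it as a raw MIDI Control Change message.

```python
# Hypothetical gesture-to-MIDI mapping: scale a sensor reading to 0-127 and
# build the raw 3-byte Control Change message (status byte 0xB0 | channel).
def scale_to_cc(value, lo, hi):
    """Map a sensor reading in [lo, hi] to a 0-127 MIDI CC value."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 127)

def control_change(channel, controller, value):
    """Raw MIDI bytes for a Control Change message."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# e.g. a hand tilt of -90..+90 degrees driving CC 1 (mod wheel) on channel 1
tilt_degrees = 32.0                      # hypothetical sensor reading
msg = control_change(0, 1, scale_to_cc(tilt_degrees, -90.0, 90.0))
print(msg.hex(" "))                      # -> "b0 01 56"
```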

Secrets Of Weirdcore & The Visuals Of Aphex Twin

[embedded content]

This video, via It’s Nice That, captures a presentation by designer Weirdcore that offers a rare look into his visual art for Aphex Twin.

Using live generated elements – combined with trippy and intense pop culture imagery, manipulated in real time – Weirdcore creates visuals that pair perfectly with the music of Aphex Twin.

Here’s a short example of his work:

[embedded content]

via Torley_

The Basics Of Analog Video Synthesis

[embedded content]

In the latest Perfect Circuit video, LA-based artist Alex Pelly demonstrates the basics of analog video synthesis, from thoughts about planning a modular video synth system to the functions of individual modules.

Pelly discusses modules by LZX Industries and Erogenous Tones in depth, showing how they can be used as the core of a video-oriented modular system. And by providing an extended tour of her personal system and a detailed patch walkthrough, she demonstrates some tactics for harnessing feedback, audio reactivity, and hands-on control to create video that dynamically evolves along with music.

iPad IDM Jam With Live Visuals

[embedded content]

Sunday Synth Jam: This video captures a live iPad IDM performance by Perplex On.

Here’s what they shared about the technical details:

“A little glitchy idmesque iPad jam with #shockwavesynth, #isem and #zeeonsynth for melody and #playbeat, #ruismakerfm, #fractalbits for drums. Played with #kb1 through #steppolyarpunit and live-glitched with #koalafx. Embedded in audioreactive shattered visuals #madewithnotch.”