Acid Cam is an OpenGL-based live-visuals instrument controlled with a computer keyboard, built for streaming online with OBS.

Acid Cam is a project I started developing in 2011 as an augmented reality hallucination simulator. The project has grown through various iterations over the years, including the v1 series, the v2 series, a Qt app, a command-line tool, and an OpenGL visualization instrument. Different variations of the project run on Windows, Linux, and macOS. The most popular version is the v2 OSX video editor, with over 2,000 CPU-based filters.
Acid Cam filters are the puzzle pieces of a visual language: by mixing them together in different orders, they bend and alter data to create abstract visualizations. It's a gift.
Also available is a collection of free stock footage I have created, hosted on my Google Drive. You can download it for free and use it in any of your projects or videos, and evolve and expand it. No credit or payment is needed. I want to share my love with you all.
The software is free, and the code is open source. You can download it on my GitHub or my website here: https://lostsidedead.biz
How the program works:
When you load the program from the command line, you select a camera, a video file, or the desktop as input, and you choose whether to use a shader, a filter, or both. You move between the filters and shaders with the arrow keys: the left and right arrows step through the filters, the up and down arrows shift through the shaders, and the space bar toggles the filters on and off. Pressing the keys in different orders produces different combinations of visuals, so there are thousands of possible combinations. You add your own shaders by listing each shader's name in the index.txt file inside the path you pass to the program when you start it. You can use shaders and filters on your desktop as well as on cameras and video files. You can also pass the program a playlist of filters, shuffle it while the program is running, or have the program randomize the playlist at a desired beats per minute, and you can create custom filter stacks and add them to playlists.
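As a sketch of how a session might be set up: the index.txt convention and the idea of a shader path come from the text, but the directory location, the shader names, and the `acidcam` binary name below are all assumptions for illustration.

```shell
# Create a shader directory; index.txt inside it lists one shader
# name per line, in the order you can step through them.
mkdir -p "$HOME/acidcam-shaders"
cat > "$HOME/acidcam-shaders/index.txt" <<'EOF'
gradient-flow
color-shift
EOF

# Hypothetical launch (binary name assumed). Once running, use the
# left/right arrows for filters, up/down for shaders, and the space
# bar to toggle filters on and off:
# acidcam -p "$HOME/acidcam-shaders"
```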
Playing the instrument:
You decide whether you want to do a live session on live video or on a video file, and whether to record by piping to FFmpeg. If you pipe to FFmpeg, use the -4 or -5 switch for x264 or x265 and -m for the CRF compression level. Then, as the video processes, you press the keys in different orders and play the instrument to produce different effects: shift up and down between the shaders and toggle the filters on and off. You can write your own shaders and filters in GLSL and C++. Shaders written in GLSL need to be added to the index.txt file in the path passed at startup with the -p switch, or in the SHADER_PATH environment variable. Example C++ filters are shown in the plugin folder in the project's source code.
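A sketch of a recording session: the -p, -4, -5, and -m switches and the SHADER_PATH variable come from the text, while the `acidcam` binary name, the directory, and the CRF value are assumptions.

```shell
# The shader path can come from -p or from the SHADER_PATH
# environment variable (both mentioned above).
export SHADER_PATH="$HOME/acidcam-shaders"

# Hypothetical recording run (binary name assumed):
#   -4      encode with x264 (-5 would select x265 instead)
#   -m 23   CRF compression level (lower = higher quality)
# acidcam -p "$SHADER_PATH" -4 -m 23
```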
Live streaming with OBS is easy on macOS: compile with static macOS libraries on x86_64, then use the -Y command-line argument to output as a Syphon server and use a Syphon input in OBS. Otherwise, you can use a window capture input in OBS. Use a Cam Link 4K to get video input from HDMI devices such as a PlayStation 4, NES Classic, TG16 Mini, or Sega Genesis Mini.
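The macOS streaming flow above can be summarized as follows; only the -Y switch comes from the text, and the binary name, shader path, and build details are assumptions.

```shell
# Hypothetical macOS streaming setup:
# 1. Compile against static macOS libraries on x86_64.
# 2. Run with -Y so frames are published as a Syphon server:
#      acidcam -p "$HOME/acidcam-shaders" -Y
# 3. In OBS, add a Syphon input and select the Acid Cam server
#    (or fall back to a window capture input instead).
```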