These are projects that I have selected for inclusion on the Portfolio Project page. That page is used as a quick overview of the variety of creative and technical projects in my history. Not every project will be listed here.
Over the course of 2020 and 2021, I was not very consistent about posting new content to my WordPress site (the site you are reading now). I was very active on YouTube, with a number of live streams, web tutorials, and audio-visual experiments, but a large number of them were not linked in any way to my web site.
After working with someone to do an overhaul on this site (new template, restructuring content), I wanted to make posts to go with all of those YouTube videos. All I needed were three pieces of info from my YouTube channel:
Video title
Post date
Video URL
Gathering these basic pieces of data manually was not scalable (I tried). I knew there had to be a better way. A few web searches later and I was well on my way to creating a web scraper using the YouTube API and the Google Sheets API.
Once I had everything in a spreadsheet, it was easy to navigate all of the videos, copy/paste where needed into WordPress, and quickly get everything in sync.
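The extraction step of that scraper can be sketched roughly like this. This is a minimal illustration, assuming the YouTube Data API v3 "playlistItems" response shape; the `extract_rows` helper and the sample response are mine for demonstration, not the actual script I used.

```python
# Sketch of the scraper's extraction step, assuming the shape of a
# YouTube Data API v3 playlistItems.list response. The sample response
# below is illustrative, not real channel data.

def extract_rows(response):
    """Pull (title, post date, video URL) tuples from an API response."""
    rows = []
    for item in response.get("items", []):
        snippet = item["snippet"]
        video_id = snippet["resourceId"]["videoId"]
        rows.append((
            snippet["title"],
            snippet["publishedAt"][:10],  # keep just YYYY-MM-DD
            f"https://www.youtube.com/watch?v={video_id}",
        ))
    return rows

sample = {
    "items": [
        {"snippet": {
            "title": "Ambient Live Stream #1",
            "publishedAt": "2021-02-14T20:00:00Z",
            "resourceId": {"videoId": "abc123"},
        }},
    ]
}

print(extract_rows(sample))
```

Each resulting row can then be appended to a spreadsheet via the Google Sheets API.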
I don’t really call myself a programmer… but I always seem to come back to programming as a skill that is good to have. In February of 2021 I was studying a variety of things related to graphics generation and came across Pygame.
This was my first real exploration of the Pygame library. It was a fun exercise, and helped me tackle future Python projects.
One fateful night in the early 90s, at an open mic in Bloomsburg, PA, I met Tom Dennehy. His mix of Dr. Demento-style originals and Weird Al Yankovic covers immediately endeared him to both me and my future wife, Audra.
Life was never the same after meeting Tom. His combination of humor, musical smarts, and global vision has been a pleasure to observe throughout our friendship. Every interaction has been an education (as well as a lot of fun).
Careers always took us in completely different physical directions. But whether Tom was writing to me from India while taking khayal singing lessons, or (more recently) chatting on Zoom, we’ve always tried to stay in touch. Our last major collaboration was in 2003-04 with tabla player Bulu Rahman in a short-lived fusion experiment called Moonlight Masala. Somewhere in WVIA’s archives is a Homegrown Music performance of that group.
For many years, Tom has lived on the West Coast, adopting the moniker Breakfast. He also plays with improv groups The Wyatt Act and Mission Delirium. He’s a thoughtful multi-lingual wordsmith, multi-instrumentalist, and also teaches music and English.
During lockdown, Tom wrote 10 new instrumentals in his living room on his laptop (in Garageband no less, though you wouldn’t know it). At the same time, I was studying Touchdesigner and various visual techniques. I was starting to build what eventually became my JDRenderEngine, and I really wanted to put it through its paces. While I also do music, I wanted to collaborate with someone else on this experiment. So we decided to join forces – his music, my visuals. The result is Yellowcake, now available on Bandcamp and YouTube.
The brilliant, complex music, centered around saxophone, electronics, and odd time signatures, stands on its own. The visuals add another layer of strangeness that we both had fun creating. Check out the playlist below and enjoy!
I came to Touchdesigner from timeline-based video editing in programs like iMovie, Adobe Premiere and Davinci Resolve. For a few years I also built patches in Magic Music Visualizer. I was always frustrated by the UI of iMovie and similar programs. I never found their UX very good coming from the audio world. I also disliked how they managed project data, creating a whole new project with every iteration of a piece.
I liked the scene creation features of Magic Music Visualizer, though these too were quirky. As primarily an audio artist, I wanted a quick way to render content I was creating in Touchdesigner. Out of the box, Touchdesigner didn’t have anything that checked the boxes for me, so I decided to build something.
As I dove deeper into Touchdesigner during a 6-month intensive study under lockdown, I realized that I could mostly replace iMovie or similar tools. I probably wouldn’t go back to Magic (though I still think it’s a brilliant program, and I still like its scene creation/logic features).
Of course, all of these tools could still be used in various combinations depending on the creative need, but I was looking for a “one stop shop”.
Thus my work began, in January 2021, on what I would call the “Render Engine”, specifically to solve certain problems I was encountering for content development. By March 2021 I had a working proof of concept, and opened up testing with a fellow audio artist (Breakfast). By working on visual content for him, I was able to refine the tool further.
For full details about the tool, visit my GitHub page. There you will find demo videos as well.
Important info for event artists
Created 4/22/2020 – Updated to v4.5 on 12/9/2020
Migrated from my .net to .com site 10/1/21
Pending review for updates.
NOTE: THIS GUIDE ALSO APPLIES TO EVENTS OF A SIMILAR NATURE
BEYOND COSMIC STREAM FEST
In early April 2020, I served as “stream consultant” for the virtual video/audio version of Cosmic Streamfest. The event was originally scheduled for June 13, 2020 at a Unitarian Church in NJ. COVID and social distancing needs moved this to a virtual event, where I hosted/MC’d and each artist broadcast from their home studio. Over a two-month period, this was basically a part-time job for me (20–30 hrs/week).
What follows is a combination of advice, tutorial, and commentary about setting up live streaming from your home studio. The notes below represent hours of testing, both solo and with the community. The focus here is on a solo performer or groups that are colocated (not social distancing). Tom Bruce and Karl Fury have been working out some things with JamKazam for remote collab. I will not be covering JamKazam here. If your group is using JamKazam because it’s impossible to meet in person, then you’ll want to check out Tom’s great guide (also the result of hours of testing on his part).
Please do your own testing before contacting me with questions. Also, as part of our post mortem for the event (which went very well overall), we may approach things a bit differently next time.
Streaming online – though very mature in many respects – is still developing. The usability (UX) and design of the available tools is seriously lacking. Everything is not always going to be obvious, or easy. You will hit a wall sooner or later with some aspect of your setup, whether that’s your computer, your audio interface, a software issue or other hardware problem. Get used to it. Take breaks. Go for a walk.
Patience and persistence here pays off. The best thing you can do is test, test, test on your own as much as possible WAY BEFORE the day of the gig. The configurations covered below are not something you can do the day of, or even the day before a streaming performance. In some cases, they need to be done weeks ahead of time.
Audio Interface: Usually a box that connects to your computer via USB, Firewire or Thunderbolt. This takes connections from mics, guitars, keyboards and other hardware and allows them to be captured by the computer and digitized. Some audio interfaces include a hardware mixer, which is great. Many do not, and all routing is under the hood. Learn about them here.
Capture Device: I’m using this term to refer to boxes and gadgets that bring video into your computer. This is not to be confused with an audio interface. A capture device for video is secondary and not required to stream in most cases.
Mixer: Will refer to a piece of hardware that you plug gear into (mics, synths, noisemakers)
Virtual mixer: Will refer to a piece of software used by your audio interface to route signals.
Latency: Delay caused by digital processing. (This is the less technical definition of course).
Loopback: Blanket term for routing audio output back into an input so that software (like OBS) can capture it.
OBS: Open Broadcaster Software – Manages your incoming content (video/audio). Note: You do not have to use OBS, but it is one of the most prevalent, free, open source options. Any support that I provide is for OBS Studio.
Restream: A service for simulcasting to multiple destinations. At this time, we will not be using Restream for the event, but you may run into the terminology. It is a slightly more advanced topic that you shouldn’t approach until after you have single-destination streaming down.
Minimum tech requirements for artists:
1. Computer with a working audio interface (not a “built-in” audio chip). Some people use a separate machine for streaming, but this can add unnecessary complexity and is only needed in extreme situations. Most people should be able to stream with one machine. It’s best if that machine is on a wired Ethernet (no WiFi) Internet connection. More on this below.
2. Successful solo test session(s) via OBS using the OBS feature that allows you to save to a file. Your broadcast quality starts with the quality of your local configuration. The best way to test that is to record yourself and watch/listen to the playback for issues.
3. Successful stream test with stereo audio/video broadcast to a streaming platform (YouTube, Facebook, Twitch).
4. Ability to communicate “back stage” via SMS or Facebook Chat. This is critical for coordination during the event.
It is pointless to attempt #3 without first getting a solid result from #1 and #2.
These will differ slightly based on your approach to performing. There are many possibilities:
Ambient mics + preamp + mixer + audio interface + 2 channel output to stream software (OBS) (not using a DAW). This is a common approach for acoustic-only acts and will not be a focus of this tutorial, though much still applies. When working with microphones, note that any noise in the room is going to pass through as well – this includes chair squeaks, etc.
Hardware synths and other gear + mixer (optional) + audio interface + DAW + 2 channel output to stream software (OBS). This relies first on a full understanding of how your audio interface works with your DAW of choice, and how to set your input/output channels. Loopback software (available free or paid) may be needed to capture the signal for streaming.
Hardware synths and other gear + mixer (optional) + audio interface + 2 channel output to streaming software (OBS). DAW running in background and used as simply another audio source in OBS, but not used as a mixer for any gear that is plugged in. Can be done without loopback software (this is my config).
Collab artists (using JamKazam or Ninjam) + signal for audience simulcast to streaming software (OBS). By itself, online collab is nothing new. However, incorporating online collab into a multi-artist stream event is still rather new. The last person in the collaboration chain is typically the person who sends a feed out through OBS. Great progress has been made in this area, so check out Tom’s guide.
People are streaming on Macs and PCs, so I won’t make any comments about pros/cons of either platform. Most of my direct experience has been on Mac, and my notes will reflect that. However, OBS Studio is cross-platform and all the same concepts apply. At this time I am not covering streaming via mobile with a USB/Lightning interface (though it can be done).
Think for a moment about how you typically perform live at a physical venue. Then think of what you’ll need to get the best experience sent out on a stream. Think about what you have available to capture your video/visuals as well as the audio – these may (and often do) require separate approaches.
Take an inventory of the multimedia gear you have on hand already. This might include: Tablets, Laptop/Desktop (Mac/PC), Video Camera, Audio Interface (USB, Firewire, Thunderbolt), Hardware mixer, DAW or other software, Outboard gear (synths, mics, etc)
For example, the setup I am using consists of:
Macbook Pro (circa 2019)
Model Identifier: MacBookPro11,4
Processor Name: Intel Core i7
Processor Speed: 2.8 GHz
Number of Processors: 1
Total Number of Cores: 4
L2 Cache (per Core): 256 KB
L3 Cache: 6 MB
Memory: 16 GB
Antelope Audio Orion 32+ audio interface over Thunderbolt
Behringer MXB1002 hardware mixer
Powered USB Hub with 10 ports (for connecting USB cameras and MIDI controllers). Hub has its own power supply, not USB powered.
Shure Beta 58A (for voice capture)
Cameras: Facetime camera (built into laptop), Logitech USB Webcam, Canon VIXIA HFR200
Black Magic Intensity Shuttle (to bring in the Canon via HDMI)
Ableton, Reason, VCV Rack, VLC audio player
Various synths and noise makers.
Now, please do not get on Sweetwater.com and start ordering gear! I arrived at this configuration based on what I had on hand, which just happened to be a full studio’s worth of gear. Even so, it required many hours of testing offline, then live on Facebook, YouTube, and Twitch, and then in live performance.
Note that the cameras handle only video. All audio is handled by the Orion 32+ and (for now anyway) I am not running anything through a DAW unless I’m just doing two-track playback. This has more to do with the construction of my songs than anything else.
Your setup will very likely be very different from mine. That’s fine. Use what you have to meet the requirements outlined here to get the best results.
Potential failure points (PFP)
Before we get into the weeds on further setup I want to outline the potential failure points in streaming setups so that you can be on the lookout for them as you continue:
Input to computer – Cabling, mixer settings (hardware or virtual)
Audio Routing to OBS
Route to stream platform – requires proper URL and stream key
Internet traffic – after the stream leaves your location, you have no control over it.
Server load on streaming platform (YouTube will often downgrade content).
Audience Internet connection – you will have no control over the quality of the listener’s connection. Best you can do is provide them with the best source material.
There may be more, but this demonstrates the likelihood of hitting a problem. It’s a fact of streaming!
Now, take a deep breath and use this moment to appreciate the cuteness of this hedgehog before you read further.
Open Broadcaster Software (OBS)
This document assumes that the broadcasting artist/performer is using OBS Studio or similar software. I will heavily reference the 64-bit version of OBS Studio (currently at version 25.0.8).
Much here could also apply to Streamlabs OBS with slight modifications. I’m not sure if anything here applies to OBS Live. VMIX is another tool, similar to OBS, that can work very well. Streamlabs OBS is not officially supported in this document, though the concepts should be about the same. If there is time, I will provide a few tips on Streamlabs OBS at the end of this document.
We will not cover the browser plugin method of streaming to Twitch. It theoretically could replace OBS for some users, but at this time is untested by our group. You can read more about that option here: Stream To Twitch & YouTube Using Your Web Browser With Infiniscene! Infiniscene: https://www.infiniscene.com/
Below is an example screen from OBS Studio
It should be obvious, but folks should avoid using an onboard laptop mic or similar approach if at all possible. Onboard mics capture your room noise and potentially conflict with your speakers, causing all kinds of feedback problems that will make the rest of your experience that much worse.
An audio interface should be used to get the best quality audio (before it is potentially mangled by a stream). As they say, garbage in – garbage out. However, a laptop mic can be a valuable tool for testing and can serve as a proof of concept to get you started with streaming using more complex configurations.
Get to know your make/model of interface, whether or not it has loop-back capability, and how the virtual mixer works. Some interfaces will provide a loop-back feature to configure audio for streaming. Many interfaces will not offer a built-in loop-back feature. This is where OBS tutorials come in. Plan to spend a couple hours on your own configuring before you join a multi-stream event.
A note about Focusrite interfaces…
Over the years (even before the streaming craze) I have had a number of people come to me with issues getting their Focusrite interfaces to work properly with a variety of software. This has continued into the world of streaming.
It’s not my intention to slam this manufacturer. I have a couple years of experience with their Scarlett 18i20 (1st gen) and know from experience that it was a lot to wrap one’s head around (particularly how Focusrite designed their virtual mixers). However, even today – with their products in their 3rd generation – I continue to see people struggle significantly. This situation is made even more difficult by the inability to perform on-site troubleshooting during social distancing. It’s difficult to troubleshoot basic computer issues remotely under the best conditions – never mind something as complex and layered as live streaming.
Short of saying “don’t use Focusrite” I must say that your mileage will definitely vary. I continue to see people struggle on the OBS support forums and elsewhere. Some folks that I’ve talked to have even claimed that the 3rd generation interfaces were not designed for streaming, which I find very hard to believe. A digital audio interface should work with any program.
I’m holding off on publishing my private testing documentation – based on hours of testing – until I see how many more people run into issues. For now, here are some links to Focusrite resources and other helpful material.
There’s no sense in reinventing the wheel when it comes to OBS setup, so I have gathered some of the most helpful links that I’ve used so far. Each combination of gear is different, so these are meant as starting points.
Soundflower (this is the link that is buried and hard to find): a signed Soundflower. Note that the Soundflower extension is signed, but the installer is not! You will have to hold the Control key down to open the Soundflower.pkg installer for the first time. https://github.com/mattingalls/Soundflower/releases/tag/2.0b2
Please watch the videos below to get a quick overview of settings within OBS. These are not comprehensive tutorials showing all features. These videos are focused on what artists will need for our event. They are designed to get you up and running quickly.
OBS Quick and Dirty Tour
Overview of sources in OBS
Profiles & Scene Collections:
Optimizing Your OBS Stream Settings
This will be covered in the section on Twitch below.
All performers are expected to already have an understanding of gain staging (aka “gain structure”) and how to set it within their own studio environment before broadcasting a stream. If you’ve done any amount of home recording, you should already be familiar with gain staging. If this concept is new, there are plenty of resources elsewhere. Just do a search for “gain staging audio” on YouTube or at your favorite music magazine, and you’ll find plenty of help.
You don’t need anything special to do this right. Gain staging is necessary whether or not you are streaming. Once you have a handle on what your levels will be going into the stream, you can add a compressor and/or limiter within OBS to tame any rogue transients. You shouldn’t have to rely on OBS for the overall mix to sound good though.
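As a side note, the dB readings you watch while gain staging are just logarithms of the signal level. Here is a small sketch of that math (my own illustration; the `-6 dBFS` target is a common rule of thumb, not anything mandated by OBS), handy for sanity-checking headroom:

```python
import math

# The math behind the meters: convert a linear peak level to dBFS and
# check headroom. Illustrative only -- OBS's own meters do this for you.

def to_dbfs(peak_amplitude):
    """Convert a linear peak (0.0 to 1.0 full scale) to dBFS."""
    if peak_amplitude <= 0:
        return float("-inf")  # digital silence
    return 20 * math.log10(peak_amplitude)

def headroom_ok(peak_amplitude, target_dbfs=-6.0):
    """True if the peak leaves at least |target_dbfs| of headroom."""
    return to_dbfs(peak_amplitude) <= target_dbfs

print(round(to_dbfs(0.5), 1))  # a half-scale peak sits around -6.0 dBFS
print(headroom_ok(0.5))
```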
Here’s a brief look at my settings…
Twitch is a popular streaming platform used by gamers, musicians, cooks, priests and humorists. Yes, I said priests. At the moment, we are planning to broadcast Cosmic Stream Fest 2020 on Twitch. Why?
We needed a place to do video and stereo audio.
We didn’t want listeners to have to log in, as they do with Facebook.
It has been used for other events in our community with success.
Twitch is a known quantity, with an established infrastructure.
We know it works with the majority of streaming tools (OBS, VMIX, etc)
As noted above, we will not cover the browser plugin method of streaming to Twitch (Infiniscene); it theoretically could replace OBS for some users, but at this time it is untested by our group.
Regardless of how you plan to stream to Twitch, please open a free Twitch account today to begin your testing. Follow their guides on basic setup, and be sure to add a profile picture, channel descriptions, etc. (see Neil Alexander’s site below for an example).
IMPORTANT: Locate your stream key – you’ll need that for testing in OBS. Your stream key will remain the same once you set it, and can be found at this address:
You’ll need your Twitch stream key for testing your own broadcasts to Twitch. Your testing will be done on your own time, way ahead of the event.
Ironically enough, you will not need your Twitch channel for the actual event. The Cosmic Stream Fest team will provide a special Twitch stream key to plug into OBS for the Cosmic Streamfest 2020 channel at the time of the event (and during testing sessions one month and one week before the event).
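Under the hood, all OBS really does with that key is append it to the platform’s RTMP ingest URL. A tiny illustration of the relationship (the helper function and example key are hypothetical; the ingest path shown is Twitch’s well-known default, and in practice OBS gives you separate fields for server and key so you never build this by hand):

```python
# Hypothetical helper showing what a stream key is for: it gets appended
# to the platform's RTMP ingest URL to form the full broadcast endpoint.

def rtmp_endpoint(stream_key, ingest="rtmp://live.twitch.tv/app"):
    """Build the full RTMP endpoint a broadcast is pushed to."""
    if not stream_key:
        raise ValueError("missing stream key -- copy it from your Twitch dashboard")
    return f"{ingest}/{stream_key}"

# Example with a made-up key:
print(rtmp_endpoint("live_123456_fakeKey"))
```

Swapping in the event’s stream key (as described above) simply points the same pipeline at the event channel instead of your own.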
Now, take a moment to relax and admire this adorable otter doing cute human-like things with his/her hands while it revels in the joy of his/her watery abode.
There are several places you – or someone else – can capture a recording of your performance for posterity. There are some misconceptions out there about how this works, so I wanted to touch on this briefly.
To capture your video/audio for later use you can:
1. Record everything locally, before it hits the stream, probably using the built-in record function in OBS. You have control over whether this happens automatically or not. Depending on how powerful your machine is, it may or may not be a good idea to record at the same time you stream. But recording at this point gives the best quality, because your sound/video has not yet been downgraded by the stream.
2. You can record after your content hits the stream. YouTube, Facebook and Twitch all have options for how recording behaves. Since we’re using Twitch, it’s important to understand the behavior there.
Per the screen shot below from the Twitch channel settings page, there is an option for saving broadcasts. This will be on by default, but can be disabled on your Twitch channel.
Note that recordings are kept for 14 days unless you have a paid plan. And even if you have one of those other plans, the limit is 60 days.
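Those retention limits are easy to trip over, so here is a quick date-arithmetic sketch (my own illustration, using the 14-day and 60-day limits described above) of when a past broadcast disappears:

```python
from datetime import date, timedelta

# Retention check based on the limits above: past broadcasts are kept
# 14 days on a free Twitch channel, up to 60 days on paid plans.

def vod_expiry(broadcast_date, paid_plan=False):
    """Return the date a Twitch past broadcast is deleted."""
    return broadcast_date + timedelta(days=60 if paid_plan else 14)

print(vod_expiry(date(2020, 6, 13)))        # free plan: 2020-06-27
print(vod_expiry(date(2020, 6, 13), True))  # paid plan: 2020-08-12
```

The practical takeaway: download anything you care about well before the window closes.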
This same setting holds true for any event channel that you broadcast to. If you broadcast to another channel with a different stream key, that channel will have control over this record setting. They will also need to get copies of the recordings to you. Now some are going to point to various utilities that skim video off of any site, often used by YouTube enthusiasts who like to grab content for personal use. That is always an option, but there is no native way built into Twitch to download recordings from a channel that you do not control.
3. You could also – theoretically – capture your performance as it is streaming by grabbing video/audio from Twitch (or whatever destination). This rather lo-fi approach might be used for creative purposes, but provides the least quality. If you recall the Potential Points of Failure above, the stream has to be encoded and then decoded. Poor Internet conditions on either of those can degrade the end result, introducing errors.
For Cosmic Streamfest, please consider visuals as secondary when you are starting out. If you have a single camera that works reasonably well, that’s all you need. If you have nailed your audio setup and want to venture into visual territory, there are a lot of options. Many people are having a lot of fun with this, and it really is an art form within itself that provides nice ways to connect with your audience.
At this time, any visuals that you supply will fall into two categories:
Pre-rendered: MP4, MOV or similar that you (or a friend) create and then bring into OBS.
Live via cameras, with or without OBS Filters: Video captures can be shown as-is without any processing in OBS, or you can get creative with filters. Simple is generally best.
We’re exploring some more advanced options for collaborative live visuals, but more testing needs to be done. This is where a visual artist would process your video feed from a different location (as they would at a live gig). The unique challenges of working remotely require far more testing and thought for this to work out.
A few basic techniques that have worked as I’ve created my own visuals include:
Cameras – Most folks will have at least one camera from their laptop or desktop webcam, and that’s a great start. Got a 10-yr old webcam sitting in a box? Get it out! Have a Canon or other brand of handy-cam from the 90s? Break that out and see what it can do! You will likely need to fuss with the connections, and maybe get a capture device, but it could be rewarding. I’ve even repurposed two wireless IP cameras that were previously used for security systems (using VLC plugin in OBS).
Pre-rendered videos (many of mine are made with Magic Music Visualizer). Running DAWs or digital art programs simultaneously can be a challenge, so pre-render whenever possible.
Text effects – OBS offers many ways to manipulate text both for practical use to communicate with your audience and for creative effects.
OBS + Ableton (used for playback only – no audio routed through it) = OK performance
OBS + Reason (used for playback only) = OK performance
OBS + VCV Rack = poor performance. Some patches are OK, but larger patches will make audio choppy – even with all the performance recommendations applied.
Still images – OBS offers options for displaying single images or scrolling through a folder of images.
Filters – OBS has a whole host of features for image manipulation, called Filters. They are a bit buried, but worth exploring. Just keep in mind that any processing those filters do in real time will also tax your CPU.
Plugins – OBS is open source, so a lot of people have made add-ons for different purposes. These can be a challenge to install at times, but worth checking out if you are a power user.
When using a cell phone camera, it is recommended that you broadcast in landscape mode for the best viewer experience.
In an ideal world, I’d run live visuals from a separate laptop, then feed that signal to OBS as a video source. Likewise for VCV Rack which is CPU heavy for both visuals and audio.
For some additional techniques for visuals within OBS, see my YouTube video and special feature here:
Capture Devices (Advanced topic)
Many artists will be just fine with a laptop or web camera. For those venturing into capturing HDMI, Component, Composite, or VGA – you’ll likely need a capture device. When it came time for me to buy one, I did some research and will post it here as a starting point. This is absolutely not comprehensive, and some new devices have just hit the market. As always, buyer beware!
Magewell USB Capture DVI Plus (DVI, VGA, HDMI, Composite, and Component) MAC & WINDOWS $419 !! No thanks!
StarTech.com USB 3.0 Video Capture Device – HDMI / DVI / VGA / Component HD. EXPLICITLY STATES WINDOWS ONLY
ClearClick HD Capture Box Platinum – Capture Video from HDMI, RCA, AV, VGA, YpbPr, VHS, VCR, DVD, Camcorders, Hi8. Record, Passthrough, NOT STREAMING
StarTech.com 6 ft. (1.8 m) VGA to USB-C Video Capture Device (USBC2VGCAPRO) – Thunderbolt 3 compatible; VGA to USB-C or USB-A recorder. Includes capture software for Windows and macOS; captures 1080p video at 60fps over USB 3.0; works with any DirectShow-compatible software; live streaming (Twitch and other platforms) supported on Windows.
Blackmagic Design Intensity Shuttle for USB 3.0 – MAC OK – $199.00. Would require converting VGA to component first.
BEWARE: During my studies, one site noted that OBS Studio would not work over a USB-A port, and claimed you must use USB-C. This turned out to be false. I am running the Blackmagic Intensity Shuttle over USB-A (3.0) and it’s flawless!
About Go Pro
SECTION SUMMARY: If I had it to do over again, I probably would not have purchased another GoPro.
7/8/2020 – GoPro announced a GoPro Webcam feature that changes some of the notes below. Sort of. See below.
GoPros are great cameras in a lot of ways, but the company has missed the mark on making them easily usable for musicians. If you are planning to use a GoPro camera for any part of your streaming, please read on…
My wife bought me a Hero 5 Session a few years back. Unfortunately, it has no capability to connect to streams (even through the GoPro app) and no HDMI out. You cannot connect a GoPro via USB and expect to use it in OBS. The battery life has also started to suffer, and it is barely usable except for very specific applications.
So I got a Hero 8 Black. Big disappointment. While you can stream directly from it (via the GoPro app), you can NOT connect this camera out of the box to OBS and manage it alongside other devices and content. Hero 8 streaming options:
Stream directly to Facebook or Youtube or RTMP using the GoPro mobile app. There are several issues with this.
You must tie up your cell phone or other mobile device running GoPro’s app.
You must use your wireless network.
You are stuck with the so-so audio quality of the GoPro, and you’re limited to “ambient” recording of your room rather than a direct line out of your studio rig.
You can’t go to Twitch directly. You’d need to use the RTMP option and reroute to Twitch from there with Restream or similar service. This adds an unnecessary layer of complexity.
You could buy the $80 Media Mod to get HDMI and work the camera into OBS via a capture device. This will be overly complex for many people, but it is your only option. For more on this, visit this guy’s great video. Some reviewers report the audio quality on the Media Mod is worse than the native mic of the Hero 8!
May 1 Update – Three new options have come to my attention, but I have not verified them and can’t make any recommendations because they still involve too many hoops.
Link GoPro to iPhone/iPad, then connect iPhone/iPad to Mac, and OBS allegedly detects camera. Assumes you have an extra iPhone/iPad around.
7/8/2020 Update: The GoPro Webcam feature (released in beta) is currently only available for the Hero 8 Black, on macOS only, and I had to get a special build from GoPro support to get it working on High Sierra. But I did get a proof of concept working, and it was much better than any of the quirky methods above.
Was I excited to see this? Sure. At least it avoids messing with questionable third-party software, crazy IP configs and jumping through other hoops. But I am still disappointed that GoPro put this out as a half-baked beta to turn their users into unpaid testing guinea pigs. As of this writing:
No native High Sierra support.
There is no Windows version.
The webcam firmware is not compatible with GoPro Labs firmware.
GoPro Webcam mode uses a new protocol called GoPro Connect. The Quik Desktop app uses MTP. You can re-enable MTP on the camera by going to Preferences -> Connections -> USB Connection and switching from GoPro Connect to MTP. This disables the GoPro Webcam.
Webcam mode is not going to work for earlier models, like my Hero 5 Session.
Audio from the GoPro is not functional in webcam mode at the time of the beta. However, this would not typically be an issue for me – I would never rely on GoPro audio anyway, opting to use my audio interface.
In retrospect, I would have been better off with the Hero 7 Black, which has native HDMI output and doesn’t require the media mod. A capture device would still be required, but I got one of those for another purpose anyway.
If I had it to do over again, I probably would not have purchased another GoPro. The only upside was that I already had a collection of proprietary GoPro mounts, so the cost I had sunk into those a few years back is paying off more now. Buyer beware: there are some much better (and cheaper) camera options out there for streaming musicians, like the Zoom Q2n, which has native HDMI out. Unless you need a camera with the “action” specs of a GoPro, look elsewhere. All of the image stabilization that GoPro flaunts is really useless to most electronic musicians, because we’re using these cameras in a stationary setup.
While writing this up, I went to my Hero 8 to do some additional testing and grab some screenshots, and it was dead. This is a common issue: even when powered off, GoPros draw current. So unless you are really good about keeping them charged, they can be a hassle.
The example at right shows an extreme case of the number of visual sources you can potentially bring into OBS. It also shows a combination of camera types.
1 = Canon Handycam connected via HDMI to a Blackmagic Intensity Shuttle over USB 3.0
2 = MacBook FaceTime camera
3 = Logitech webcam via USB
4 & 5 = IP Security cameras
We all want the best viewing and listening experience for our audience. We also want artists to have fun and (hopefully) make the stream technology transparent during the day of the event. With that goal in mind, we are planning some tech rehearsals to prepare for Cosmic Streamfest 2020. This is in addition to your own testing at home.
Preparing to validate your stream quality with the Cosmic Streamfest team:
1. Finalize your OBS setup and follow the Pre-show and Pre-flight checklist provided here.
2. Join the chat group for the test.
3. Broadcast to Twitch.
4. Await feedback on your sound and video quality (with priority on the audio).
5. Review the post-test notes and suggestions.
We will conduct ad hoc testing for individual artists, but we will also need group testing to smooth out transitions between artists.
December 2020 Update
What has not changed:
– You should definitely have tested by making a local recording out of OBS.
– You should definitely have been testing on your own Twitch account.
What is optional, but useful for educational purposes (and for your use beyond this event):
– Set up *your own* (free) Restream account and learn how to use it from OBS.
– This is also a way to test without sending anything out to Twitch. See example image:
What has changed:
In the event you are asked to broadcast to someone else’s Restream account (not your own Restream)…
Go to the OBS Stream page.
Set OBS to use Restream and enter the Restream key provided to you. Several artists have been confused by this because the event is on Twitch (the audience never goes to Restream). You do not need to have a Restream account; even if you do, you will not be using it. Restream allows a venue to handle the multicast connection to Twitch, YouTube, and Facebook.
What also has not changed:
Keep Twitch open on your phone for both chat and feed-checking purposes. Note that there is a delay, so don’t panic. AVOID opening Twitch on the same device you are streaming from, as you will get feedback (unless you mute the audio on the Twitch site).
Best Practices Leading up to show:
Everyone tests their broadcasting setup on their own time, several days (or weeks) before any performance.
Testing should include saving the test to a file so you can play it back to confirm things look good, but more importantly, that they sound good.
If all of your audio is coming from only one side (Left or Right): Stop and examine your configuration.
If instruments are hard-panned to left/right (without you consciously putting them there for creative reasons): Stop and examine your configuration.
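If you want to go beyond listening by ear, a quick way to catch the hard-panned or one-sided audio described above is to compare per-channel RMS levels in your test recording. Here is a minimal Python sketch, assuming you have exported the audio of your OBS test as a 16-bit stereo WAV file (OBS records video containers, so you may need to extract the audio track first); the function names and the 5% threshold are my own choices, not part of any OBS tooling.

```python
import array
import wave

def channel_rms(path):
    """Return per-channel RMS levels for a 16-bit PCM WAV file."""
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "expects 16-bit samples"
        n_ch = w.getnchannels()
        frames = w.readframes(w.getnframes())
    samples = array.array("h", frames)  # interleaved signed 16-bit samples
    rms = []
    for ch in range(n_ch):
        chan = samples[ch::n_ch]  # de-interleave one channel
        rms.append((sum(s * s for s in chan) / max(len(chan), 1)) ** 0.5)
    return rms

def looks_one_sided(path, ratio=0.05):
    """Flag a stereo file where one channel carries almost no signal."""
    rms = channel_rms(path)
    if len(rms) != 2:
        return False  # only meaningful for stereo
    lo, hi = sorted(rms)
    return hi > 0 and lo / hi < ratio
```

If `looks_one_sided()` returns `True` on your test file, stop and examine your configuration before going any further.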
Turn off all Facebook and other social media audio alerts on your broadcasting machine. Otherwise, those sounds (like backstage planning from our chats) will cause interference.
Turn off all audible social media notifications!
Facebook shown here:
Best Practices for the Night of the Show:
Each artist should do a preflight check of all audio connections, cameras, etc.
Everyone logs on 30 min before show time to establish chat communication.
Make sure everyone has the correct stream info.
Brief stream tests are helpful, but by now everyone should have this down.
Artists should mute system sounds for chat apps (Facebook, etc.; see above).
Artist Pre-flight Checklist
Start computer (fresh boot)
Confirm you are using your wired network (if available)
Launch audio interface control panels (if in use)
Turn on hardware (example: off-board mixers, effects, synths)
Confirm stream settings (always double check this). You will not be streaming to your own Twitch account. You must use the key we’ve supplied.
Open your audio programs.
Turn on cameras as needed.
Set lighting in your studio.
Arrange your scene in OBS as needed (camera, graphics, video, text, etc)
Humming With The Gods is a 4-song Limited Edition EP born as a result of some different approaches I was taking to recording/writing in late 2017. I was working with some very evocative instruments, and suddenly found myself creating a lot of material in a particular universe of sound. Originally, I envisioned 10-12 songs in this universe, with a loose story line running through and lots of space references. Unconsciously, I was returning to the universe of “Children of Light” (from Chaos Rise Up in 2010) which always stood out for its electronic elements, and offered a positive outlook on the future.
Stylistically the music and lyrics have elements of what I was doing pre-2010, before joining the electro-music.com community. It’s also been informed by the experiences I’ve had performing and interacting with electronic musicians for the last 10 years.
This material has sustained me in unexpected ways during a very trying time in our lives. Friends and family know the gory details, and it’s not appropriate to rehash them here. Let’s just say – and I know this is cheesy – that looking to the stars and imagining other worlds really can help you deal with reality. Working on Humming With The Gods was perhaps the best expression of what I’ve been feeling over these last few years, and I think in some small way it expresses some important themes of our time (among them, moving forward after trauma, and on a larger scale, life in times of accelerating scientific discoveries amidst accelerated social concerns).
There’s yet more recorded material waiting in the wings, but before I can put that stuff out, I really felt that Humming With The Gods had to be finalized.
Teensy is a popular microcontroller ecosystem that I’ve highlighted on my site before. There are already a number of breakout shields on the market for most of the Teensy models. I’ve tried several, and written about them here.
The breakout boards that I’ve tried were either overbuilt with way more functionality than I required, or too basic, not offering what I needed for my projects. After considering a few ways to approach this, I made a physical prototype for connecting some sensors.
A few months prior to this, my father was transitioning from acid-etched circuit boards to computer-designed boards using Eagle. He hadn’t sent his board off for manufacturing yet, so this was a good time to combine efforts and save on shipping if I could get my own board designed. I thought it would be good to use the same program so we could learn about it together. Unfortunately, I couldn’t get Eagle to run on my laptop due to a video issue, so I opted for Circuit Maker.
For about two weeks, I learned how to pull off my design successfully and pass the QC process within Circuit Maker and the online manufacturer I selected (PCBWay in China). On one level the process is very much like desktop publishing in that you have many layers to manage and very fine tolerances down to the fraction of a millimeter. On another level this was a completely different world for me and I definitely learned a lot.
As usual, taking breaks was a good idea throughout the process. I often found myself having an idea about how to make an improvement after some time away. One frustrating part was that PCBWay doesn’t give you all errors at once when you go through their QC process. They tell you about one error at a time, and with the time delay between here and China, that can be somewhat grueling. But with ample help from YouTube and some forums I got through the major hurdles, and in the end I was able to go with a lot of default settings in Circuit Maker. I know a lot of people swear by Eagle, but I am pretty used to Circuit Maker now and will likely use it again.
I used Otherplan as a supplemental check to see if the build files would translate properly. My dad was also able to check my design in a different board viewer. All of the activity on my board project prompted my dad to finish his design, and we were able to ship everything together, saving on shipping. Then we waited… to make sure we didn’t just create a bunch of funky coasters!
After about two weeks, our order came. I couldn’t have been more pleased. The board looks great, and functions electrically the way it should. Like anything you do for the first time, there are always some improvements to make, so I am sketching out V2.0. I’ve worked the board into my project for now, and will be selling the rest on Tindie.com if anyone is interested.
This was a very fun process, and I already have ideas for a completely different board I’d like to make in addition to a revision of this one. My dad’s board is awaiting testing, and we’ll know before long how his design went. Hopefully it was just as successful and I can post an update soon!
At the end of August I built the SparkPunk Sound Kit, and used an EHX 8-Step to trigger the kit via control voltage. Not long after this, Sparkfun released their own sequencer, which I promptly ordered, hoping it would arrive before the next electro-music festival.
The sequencer kit arrived well before the festival, but still required assembly. I spent the better part of an evening building this, for use in a gig three days later!