With three cameras running on board, I do a quick cherry drying flight and make two very different videos from it.
This week, I’m releasing two videos from the same flight.
Helicopter Cherry Drying Flight w/Tailcam Views
Hop on board for a standard cockpit POV view of a cherry drying flight, from start to finish. In this video, I take off from a landing zone in Malaga, WA, fly to an orchard up Squilchuck Canyon in Wenatchee, and settle down over the treetops to dry 3.3 acres of Rainier cherry trees. Along the way, I talk on the intercom and radio to point out things of interest, explain what I’m doing, and give position calls. On some of the new tailcam views, you can see the trees shaking like crazy as I fly past. When I’m finished, I fly back to my landing zone in Malaga.
360° Helicopter Cherry Drying Flight
Hop on board — virtually — for a 360° view of a cherry drying flight, from start to finish. In this interactive video, I take off from a landing zone in Malaga, WA, fly to an orchard up Squilchuck Canyon in Wenatchee, and settle down over the treetops to dry 3.3 acres of Rainier cherry trees. By using two cameras to record video, I was able to get an audio track of me talking on the intercom and radio to point out things of interest, explain what I’m doing, and give position calls. If you look out the back windows while I’m flying low over the trees, you should see them shake like crazy as I fly past. When I’m finished, I fly back to my landing zone in Malaga.
To experience the 360° features of this video, you need to watch it one of three ways:
- On a computer in a Web browser, use your mouse to change the view by clicking and dragging. You may also be able to zoom in or out.
- On a mobile device IN THE YOUTUBE APP (not a mobile browser), tilt your device or swipe the screen to change the view.
- On a smart TV, while viewing my YouTube channel, you should be able to use your remote’s arrow keys to change the view.
Behind the Scenes
How did I make these two videos? Here’s what went on behind the scenes.
Cameras
I started with three GoPro cameras:
- Cockpit Cam Instrument View (or just InstrumentCam) – a GoPro Hero7 mounted on a pole on the bar between my front seats. This provides a view out the front cockpit window, along with the helicopter’s instrument stack and cyclic. I show off this modified camera mount in “Behind the Scenes: Cockpit Cam Instrument View Setup” on YouTube; you can read more about it in the description for that video. This camera is also hooked up to my helicopter’s intercom, so it provides the cockpit audio.
- TailCam – a GoPro Hero7 mounted at the very end of the helicopter’s stinger, in this case pointing forward and down. This provides a view of the lower half of the back of the helicopter and the ground between the camera and the helicopter. It also records ambient audio: engine, rotor, and wind. (I did not use this audio track.)
- 360° Cam – a GoPro Fusion mounted inverted from the tiny window over the passenger seat. This camera records everything a front seat passenger might see in flight, even if he moved around to look out various windows, straight down (provided he was invisible), and straight up. This camera also records ambient audio: engine and rotor.
Audio
If you read this blog regularly, you might know that I’ve been struggling with one-channel audio being recorded through the NFlightCam audio cable I rely on for cockpit audio. There is nothing NFlightCam can do to “fix” the cable, so I have to “fix” the audio by combining the live left channel and the nearly dead right channel into a single mono track. Filmora Pro, which I switched to, does not include a feature to do this. I explained what I do in “The Video Editing Audio Workaround” on this blog.
The net result: for both of these videos, I used the post-processed (mono) audio track from the InstrumentCam.
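For the technically inclined, here’s a minimal sketch of what that “fix” boils down to, written in Python using only the standard library. It assumes the cockpit audio has been exported as a 16-bit stereo WAV; the filenames are placeholders, not part of my actual workflow.

```python
# A minimal sketch of the "bad stereo to good mono" fix.
# Assumes a 16-bit stereo WAV export; filenames are placeholders.
import struct
import wave

with wave.open("cockpit_stereo.wav", "rb") as src:
    assert src.getnchannels() == 2 and src.getsampwidth() == 2
    rate = src.getframerate()
    frames = src.readframes(src.getnframes())

# Samples are interleaved 16-bit integers: L, R, L, R, ...
# Keep only the live left channel and discard the nearly dead right one.
samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
left = samples[0::2]

with wave.open("cockpit_mono.wav", "wb") as dst:
    dst.setnchannels(1)
    dst.setsampwidth(2)
    dst.setframerate(rate)
    dst.writeframes(struct.pack("<%dh" % len(left), *left))
```

Keeping just the live left channel (or averaging it with the right one) produces a clean mono file that can be imported back into the editor in place of the original track.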
Since most viewers like to hear the engine and rotors, I also used the audio track from the 360° Cam — although I could have used the audio from the TailCam. The difference between them? The TailCam includes wind noise, which I don’t think we needed.
But it’s important to keep in mind that for both videos the audio tracks were completely detached from the video tracks, which meant I had to sync them with the video and keep them synced.
360° Video
I made this video first, using Filmora Pro. I obtained Filmora specifically to edit 360° video; I later discovered that other, possibly cheaper packages might also do the job. 🙄 I also discovered that Filmora doesn’t export the edited 360° video in a format that is recognizable as 360° on YouTube or in apps. Fortunately, I found an excellent YouTube video that explains all the steps I needed to edit and format for 360°: “How to edit 360 video in Filmora – Fail-proof method.” It required an additional software tool that, happily, is free and available for Macs.
To make the video, here’s what I did:
- Import the video from the 360° Cam into Filmora Pro, allowing it to change the resolution as necessary.
- Change the rotation of the camera view by 180° (the camera had been mounted inverted).
- Detach the audio track from the video track but keep the two tracks synced.
- Dial down the audio track by 8 dB.
- Import the InstrumentCam video.
- Detach and discard the InstrumentCam audio track. (Remember, it’s left channel only.)
- Import the processed audio from the InstrumentCam and line it up exactly with the video track so it’s synced.
- Lock the two tracks together to keep them synced.
- Resize the InstrumentCam video so I can see it and the 360° Cam video at the same time.
- In the timeline, move the InstrumentCam audio/video tracks so they sync with a specific motion captured by both cams. If I’d been smart about this, I would have used my clapper when I recorded the video, but I didn’t. 🙄 Instead, I had to rely on a specific movement — I think it was when I adjusted my headset. This, by the way, is the most time consuming and tedious part of making any of my videos — syncing two cameras — and it’s considerably easier when I just use the damn clapper. (There’s a rough sketch of one way to automate finding the offset right after this list.)
- Trim the beginning and end of all audio and video clips to cut out what’s not going to appear in the final video.
- Hide the video track from the InstrumentCam. (I hid it rather than delete it so I could use the same video editing file for the other video, as you’ll learn below.)
- Add beginning titles, Member Wall, etc.
- Add to the render queue, adjust settings for 360° video, and render. The rendering process for this 15-minute video took just over one hour.
- Run the resulting video through the Spatial Media Metadata Injector. (I’ll let you watch the video I linked to above for details if you want to know what the heck that is and why it’s important.)
- Test the video with GoPro VR Player to make sure it looks and works OK.
- Discover that the opening titles and Member Wall are completely FUBAR. Go back to the step where I inserted them, fix them, and repeat remaining steps.
- Repeat previous step. (So yes, I rendered the video 3 times.)
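A side note on that tedious syncing step: when both cameras pick up the same sounds (a clap, the engine starting), the offset between them can be estimated automatically by cross-correlating their audio tracks. That isn’t what I did in Filmora; it’s just a rough sketch of the idea in Python, assuming each track has been exported as a mono 16-bit WAV at the same sample rate, with NumPy and SciPy installed. The filenames are placeholders.

```python
# Rough sketch: estimate the offset between two cameras from their audio.
# Only useful when both cameras hear the same events (a clap, engine start).
import wave

import numpy as np
from scipy import signal

def load_mono_wav(path):
    """Read a mono 16-bit WAV file into a float array plus its sample rate."""
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        data = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    return rate, data.astype(np.float32)

rate_a, a = load_mono_wav("instrumentcam_audio.wav")   # placeholder filenames
rate_b, b = load_mono_wav("360cam_audio.wav")
assert rate_a == rate_b, "resample one track first if the rates differ"

# The first couple of minutes are plenty for finding the offset.
a = a[: rate_a * 120]
b = b[: rate_b * 120]

# FFT-based cross-correlation; the peak marks the best alignment.
corr = signal.correlate(a, b, mode="full", method="fft")
lags = signal.correlation_lags(len(a), len(b), mode="full")
offset = lags[np.argmax(corr)] / rate_a

# A positive offset means the shared sound shows up later in the first track,
# i.e. that camera started recording earlier by roughly that many seconds.
print("estimated offset: %.3f seconds" % offset)
```

Shift one track by the reported offset and the rest of the syncing is just fine-tuning by a frame or two.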
You might say, Wait a minute, M. Doesn’t GoPro offer software for editing that video?
Yes, it does, but it’s extremely limited. My channel viewers want cockpit audio and the GoPro solution does not enable me to combine multiple audio tracks. So this is what I have to do.
Is it time-consuming? Hell yes. This video is about 15 minutes long and took a total of at least 3 hours of my time plus 3 hours of rendering time to produce. And that doesn’t include the hour of pre-editing stitching time, the bad-stereo-to-good-mono audio processing time, or the 30 minutes of YouTube upload time.
But I’m very pleased with the results. I just wish the final video started facing forward instead of backwards. (More experimentation is obviously needed to fix that in future videos.)
InstrumentCam Video w/TailCam Views
When the 360° video was done and uploaded for certain tiers of channel members, I took a day off and then got back to editing. My goal was to have both videos ready for Sunday public release.
This one was considerably easier because I already had some of the tracks prepared. Here’s what I did.
- Duplicate the 360° Cam video’s editing file and rename it.
- Delete the 360° video (but not audio) track. Its audio track is already synced with the InstrumentCam video.
- Adjust the project’s video resolution to 2704 x 1520. (I shoot in 2.7K these days but will likely start working with 4K video soon.)
- Unhide the InstrumentCam video track and resize it to fill the video screen.
- Import the TailCam video.
- Detach the audio track and delete it.
- Following pretty much the same procedure I outlined in a step above, sync the TailCam video with the InstrumentCam video. I used the moment the helicopter lifted from the ground, although I might be off by a few frames.
- Using a variety of cuts and transitions, place the TailCam video in a picture-in-picture (PIP) inset where it makes sense to show it: takeoff, movement over the trees, landing. Note that I masked parts of the video I didn’t need to show, depending on where I showed it. (For the curious, there’s a sketch of how a similar inset could be scripted right after this list.)
- Fix titles for new video resolution.
- Redo and add the Member Wall. (I had one new member since I made the previous video.)
- Add to the render queue, adjust render settings, and render. Rendering still took about an hour.
- Test the video in QuickTime Player to make sure it looks good. Fortunately, I nailed it on the first try.
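For anyone who wants to script a similar picture-in-picture inset outside of a video editor, here’s a rough sketch using ffmpeg’s overlay filter, driven from Python. It’s not how I did it (Filmora handles the inset and transitions with a few clicks), and the filenames, inset size, and position are placeholders; it also assumes ffmpeg is installed and on your PATH.

```python
# Rough sketch of a picture-in-picture inset using ffmpeg's overlay filter.
# Filenames, inset size, and position are placeholders.
import subprocess

main_clip = "instrumentcam.mp4"
inset_clip = "tailcam.mp4"

# Scale the TailCam clip down and pin it in the top-right corner of the
# InstrumentCam frame; keep the main clip's audio as-is.
filtergraph = (
    "[1:v]scale=640:-1[pip];"
    "[0:v][pip]overlay=main_w-overlay_w-40:40[out]"
)

subprocess.run(
    [
        "ffmpeg",
        "-i", main_clip,
        "-i", inset_clip,
        "-filter_complex", filtergraph,
        "-map", "[out]",
        "-map", "0:a?",
        "-c:a", "copy",
        "tailcam_pip.mp4",
    ],
    check=True,
)
```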
I uploaded this one on Friday for certain tiers of channel members.
Results
It’s a lot of work, but I think the results speak for themselves. I’m not a professional video editor. I’m a professional pilot. I’m fortunate that I have excellent computer skills from my previous career. But that doesn’t mean I have professional video editing skills. It just means that I can use software to produce decent (but not perfect) results.
I hope my viewers appreciate the time that goes into making these videos. They can show their appreciation by sharing my videos, subscribing to my channel, and considering membership.