Secret Genius for Spotify • An FCPX Workflow

April 16, 2018

In this exclusive interview, FCPWORKS chats with Ben Jehoshua from Brian Graden Media. Ben recently directed the debut season of Secret Genius, a Spotify documentary series about singer-songwriters. The project was also the first piece of original video content for Spotify and was produced using Final Cut Pro X and Lumaforge’s Jellyfish. 

Describe your overall responsibilities.

I’m the Senior Vice President of Development at Brian Graden Media. I run our internal production studio called BGM Studio and work on sizzle reels, presentations and pilots. Over the last couple of years, we’ve developed well over 150 projects. I also directed a feature film, and I’m working with two teams right now to write new projects, some of it slow-burn, long-term kind of stuff. We’re also representing one of my personal projects, which is a suspense thriller feature film.

What’s your production background?

I’ve been filming since I was very young, growing up in Israel. I was also teaching younger kids how to shoot video and edit. Then I came to Los Angeles to attend film school at Columbia College in Hollywood and later started my career as an editor. I worked on TV shows like Unsolved Mysteries, Disney’s Shark Attack and Intimate Escapes for TLC. I was an editor for years and then started my own production company with two other partners, one of whom is my brother, Judah Jehoshua. We did a lot of work for Mattel toys like Barbie and Hot Wheels, plus Bratz and tons of kids’ commercials. We also did corporate video for tech companies like Intel, Microsoft and IBM, and car companies like Honda, Hyundai and Toyota. That was our bread and butter for years until we started developing a project called Geek My Pad.

And I showed it around to my contacts and they said wow, you’re really good at this presentation thing. One thing led to another and I worked on a few projects in the industry that did well. I was directing, producing and editing presentations for a while until Brian Graden and I worked together and then we started this internal division at Brian Graden Media.

Was Secret Genius something you originally pitched to Spotify?

It came from one of our SVPs, Jeffrey Wank, and it was his passion project. For years he’s been going to these songwriter conventions and learning about the stories behind the songs. So it was a project we pitched in other places around town. And strangely enough, when we went to Spotify, they already wanted to do something very songwriter-focused and pay homage to all these amazing songwriters featured on their platform. So it was good timing. We looked at the budget and the resources they were giving us, and it just made sense to do it internally with my team. I created a lot of presentations that went back and forth until we settled on the creative with Spotify.

Was the length of each episode predetermined?

It came out organically, because other than the storytelling, the format also includes an unplugged performance where the songwriters perform their own music at the end of the episode. We knew that would be three to five minutes depending on the song. And then we sort of reverse engineered it from there and wanted to keep it between five and eight minutes for the story portion. But one of the challenges was how do you have one person in a very intimate storytelling format? And also how do you integrate the photos from their past and names of the songs and lyrics? That was the most interesting part of this to work on creatively.

Were the guest subjects mostly established or brand new artists?

Very well established, like Justin Tranter, Priscilla Renea and also Poo Bear, who writes for Justin Bieber. We were always pleasantly surprised by how, as you’d talk to somebody, you’d realize how prolific and amazing their work is. Our showrunner Georgi Goldman really did a deep dive into their stories, and I’m very proud to have been able to work with her.

Did Spotify want to start with just a pilot or shoot the entire season at once?

They went for the entire season from the start, 10 episodes. One of the challenges Georgi and her research team had to overcome was scheduling because these songwriters are ultra-successful people with very packed schedules. So it was a little bit like herding cats to bring everybody onto the soundstage. We shot two episodes a day over a one week shoot on a soundstage. Everything had to be very tight and scheduled correctly.

Describe the production.

Everything was filmed in 4K on four Sony FS7’s. And the interesting part was that Spotify came to us about five days before the shoot and said, we might just broadcast this at a 9:16 aspect ratio on our app on the phone and even if we don’t, we would like everything to be formatted so it’s both landscape and portrait, so make sure nothing hits outside of the assigned 9:16 area for portrait viewing on a phone.

So that threw a very challenging wrench into our production, and we decided to mark all the monitors on set. Every single shot was carefully planned to not stray outside of the 9:16 portrait area on our monitors. The creative called for lots of camera motion and also moves in post; we wanted it to be constantly moving and zooming, even if only digitally. The interesting part of the creative is that one of our cameras was outside of a stage window that we built, and the intention was always to track graphics and images from the songwriter’s past onto that window, whether it was a performance, childhood pictures or whatever the creative called for. That was a challenge because we also knew we needed to show these images in landscape and make them work in portrait.
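For anyone planning a similar dual-format shoot, the safe-area math is easy to script. This is a rough sketch (not from the production; it assumes a 3840×2160 UHD frame and a portrait crop taken at the full height of the frame):

```python
def portrait_safe_zone(frame_w, frame_h, ratio_w=9, ratio_h=16):
    """Width and left offset of a centered 9:16 portrait crop
    taken at the full height of a landscape frame."""
    safe_w = frame_h * ratio_w / ratio_h
    x_offset = (frame_w - safe_w) / 2
    return safe_w, x_offset

# On a UHD frame, the portrait-safe strip is 1215 px wide,
# starting 1312.5 px in from the left edge.
print(portrait_safe_zone(3840, 2160))
```

Tape marks at those pixel offsets on the on-set monitors give the crew the same visual guide described above.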

And where did you shoot?

We were in Glendale on a soundstage for five days. We had a couple of prep days and a breakdown day at the end. The set had this massive chandelier and an enormous crystal disco ball. Just mounting that was a challenge, because we really tried to go for a specific look.

What was the timeframe from completing production to delivering the entire season? 

We actually staggered the delivery because there were so many people approving the material, both internally and externally at Spotify. Episodes one, two and three were released together first, and then the rest were released in clusters of two or three. Spotify also commissioned a format from us that we referred to as a living playlist. It’s a 30-minute audio playlist that incorporates our footage. Whenever the songwriter mentions a particular song, that song starts playing, and then a few other songs from that songwriter follow. So it’s almost like a premium vlog by that songwriter, direct to camera.

Which tools did you use in post-production? 

Our internal team has been working on Final Cut Pro X since version one. Our editors got really fast on it and we were talking about using it because we’d developed a sensibility with our editors and we love them. They didn’t want to move to Avid and get bogged down.

We wanted to use the LUT our DP Neal Brown created on set and do some moves digitally, all in 4K. We also brought on two additional Avid editors. They were very accomplished on big shows, and I was kind of dreading the conversation with them about working with Final Cut Pro X. But they were actually familiar with it; they had just never dived into it on a professional project like this, with deadlines and lots of people touching media.

After that, we needed to iron out the kinks in our building, because we were initially on an Avid ISIS media server and that was not fun. Then Lumaforge came in with the Jellyfish and solved our issues, and the editors got so addicted to it, just flying on the system. It was kind of a joy to see Avid editors meld into Final Cut Pro X.

We had three story producers, our showrunner and two graphic artists. In total, there were eight people working simultaneously off of the Jellyfish, four stations on 10 gig and four stations on 1 gig, which was seamless and flawless. The capacity of the Jellyfish was 36 terabytes, but we only used about 18-20 terabytes. We cut everything inside of Final Cut Pro X and did the animation in After Effects.
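The 10-gig/1-gig split holds up to rough bandwidth math. As an illustration only, using ballpark ProRes data rates for UHD at around 30 fps (approximate figures, not measurements from this production):

```python
# Rough ProRes data rates for 3840x2160 @ ~30 fps, in Mb/s (approximate).
PRORES_UHD_MBPS = {"proxy": 180, "lt": 410, "422": 590, "hq": 880}

def streams_per_link(link_gbps, codec, efficiency=0.7):
    """Concurrent streams a network link can sustain, assuming
    roughly 70% of the raw line rate is usable after overhead."""
    usable_mbps = link_gbps * 1000 * efficiency
    return int(usable_mbps // PRORES_UHD_MBPS[codec])

print(streams_per_link(10, "hq"))    # a 10-gig station handles several HQ streams
print(streams_per_link(1, "proxy"))  # a 1-gig station is fine for lighter codecs
```

In other words, the heavy editorial stations get the 10-gig links, while stations doing lighter work sit comfortably on 1 gig.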

How did you first hear about Lumaforge? 

We did a little demo back when they were in Culver City. We went over to check them out because we heard they had a shared server optimized for Final Cut Pro X. So we bought version one for our internal development team. Honestly, I turned it on once and then about a year and a half later I realized I had never turned it off but it was just working. Then we rented another Jellyfish for Secret Genius and ended up buying that one as well. Sam Mestman and everyone over at Lumaforge has been amazing, some of the best support I’ve ever had.

What was the final delivery for Spotify?

We finished as much as we could inside Final Cut Pro X including the initial color correction. With Neal’s LUT, the main work was to just make sure the levels are all set within the waveform. We did our final sound mix at a post house in New York.

How does Spotify measure the overall success of a show?

The digital world is ever-shifting, and people are really trying to find the meaning of success, especially in a subscription-based service like Spotify. They haven’t told us what the viewership is, and it’s also still very fresh; new episodes are still coming online at this moment. I do know that their number one initiative is Secret Genius songwriter outreach. It’s very important to them culturally as a company to reach out to songwriters. We get new pieces of information every week, and we’re certainly proud to be a part of their first push into media.

How would you compare delivering a project for a streaming subscription service to a more traditional broadcast workflow? 

Brian Graden Media has been at the forefront of digital production for a few years now. We haven’t watered down our delivery process, and one of the key reasons is that a lot of the linear people have been migrating over to digital. So they bring with them the expectations for high quality and expect top-notch color correction and sound mix. The key differences are that the air date can get a little flexible, and so can the running times, because we don’t have to put commercial breaks in between the content or adhere to a specific length.

If Spotify requested a second season, would you change your approach?

We enjoyed the process so much, and everybody got emotional at the end of the week because it was such a great subject to get immersed in. The crew was spectacular, so I wouldn’t change a thing when it comes to the shoot. Honestly, I don’t even know how we would’ve finished this show without the magnetic timeline in Final Cut Pro X. We were changing things like crazy and just flowing so nimbly and quickly on cuts, and that was really cool. It’s also just a powerful workflow.

Is producing for streaming really popular now?

It’s not quite the Golden Age of streaming just yet. For us, it’s definitely still starting because we have our development meetings and we’re excited about digital and anything that’s cutting-edge and new. I think everybody’s trying to figure it out every week and often when we look at our digital networks we’re surprised to find that one of them folded or another one has popped up and the network needs are so vastly different. I think it’s a great time to do what we’re doing and it feels like the ground is shifting a lot.

Is there anything, in particular, you’re keeping an eye out for in terms of industry trends?

I’m always fascinated with workflow, for example, we recently installed Transcriptive from Digital Anarchy which does automated transcriptions. I’m always kind of guilty of adopting things a little early. I’m not an excitement junkie or anything, I just love the technology and I’m always trying to be tuned in. I love to see what’s new with editing and who’s forging forward and creating new workflows.

DJI Inspire 1 Pro Raw Review

April 5, 2017

In the last decade, DJI has revolutionized the aerial photography and video industry. When they introduced the Phantom, they opened up aerial video and photography to an entirely new market.

We had been working with the Phantom 2 with a Zenmuse X3 gimbal since 2014, but we rarely used it, as the images and operability of the drone just didn’t live up to the level of production we hoped for. In 2016, though, we at Jamestown Films took the step to purchase the Inspire 1 X5R. Based on some previous experience with it, we were confident we could reach the quality we needed for our productions.

DJI Inspire Pro 1


When the Inspire 1 was introduced in November of 2014 with the X3 camera, we were immediately excited about the capabilities the drone could offer its pilots. When they upgraded the camera in early 2016 to the X5 and X5R, it made us wonder just how competitive the Inspire 1 X5R would be with larger drones that can fly a full camera system. We have seen a long-time drone operator switch from full-camera-system commercial drones to the Inspire 1 X5R; he told us it was the best decision he ever made.

First Impressions:

Let’s take a look at the DJI Inspire 1 X5R. A pilot coming from a commercial drone like the Matrice will first notice the compact case the Inspire 1 comes in. A pilot coming from a Phantom will notice the huge briefcase they now have to lug around. The Phantom is more portable, but lacks the image quality and operability that the Inspire 1 gives you.

The Inspire 1 is more competitive with the Matrice and should be compared to that system, yet the Inspire 1 X5R requires far less equipment to operate. We loved that we didn’t have to drag separate cases for the gimbal, drone, camera and lenses around to obtain a high-end aerial image.

Inspire 1


The Inspire 1 X5R includes one battery, two remotes, an Inspire 1, and the X5R camera, which comes in a separate hard case within the Inspire case. To truly operate and shoot for a day, you will need more than one battery. Each battery gives about 35 minutes of flight time. I like to have at least three batteries for straightforward shoots where I won’t need multiple takes. But if the shots are more complicated and need multiple takes, I would prefer to have five batteries or more.
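That battery rule of thumb is easy to sanity-check. A quick sketch, assuming the roughly 35-minute rated flight time above and landing with about 20% left in reserve:

```python
import math

def batteries_needed(shoot_minutes, rated_min=35, reserve=0.2, takes_factor=1.0):
    """Batteries for a day of flying, assuming each battery is only
    flown to (1 - reserve) of its rated time for a safe landing margin."""
    usable = rated_min * (1 - reserve)
    return math.ceil(shoot_minutes * takes_factor / usable)

print(batteries_needed(80))                     # straightforward day: 3 batteries
print(batteries_needed(80, takes_factor=1.75))  # lots of repeated takes: 5 batteries
```

The 80-minute figure and the take multiplier are illustrative; plug in your own shot list.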

The main controller operates the drone with the DJI GO app, available on iOS or Android. DJI has made an excellent app that turns your phone or iPad into a first-person view (FPV) monitor. Your iPad or iPhone isn’t just a monitor: you can set focus and exposure, check the histogram and get all of the information you could ever want right at your fingertips. This app makes it a pleasure to be both the pilot and camera operator.

When operating in dual-remote mode, where one person is the pilot and the other is the camera operator, the camera operator’s remote piggybacks off of the pilot’s remote, receiving a video signal from that controller. Both the pilot and camera operator share a single camera view. This can become a problem in two ways: 1) if the operator is far away from the pilot, the FPV feed starts breaking up and makes controlling the drone difficult; 2) as the camera operator moves the camera around, you as the pilot lose all sense of direction, often making getting the shot near impossible.

The Inspire 1 generally operates with intuitive ease, though, making it a pleasure to fly as a pilot in single-operator mode. The dual-operator mode needs some major improvements, which have been addressed by the recently announced Inspire 2. For example, it adds a dedicated flight camera just for the pilot’s FPV, with the ability to view the shot in a smaller window, plus more obstacle avoidance sensors.

DJI Inspire 1


The X5R is capable of capturing incredible images for such a small camera. The specs for this little camera are impressive, with 12 stops of dynamic range and a 4K raw .DNG recording option. The quality is absolutely amazing. While recording raw, it also records an .mp4 reference file for reviewing the shot or for quick access.

Color correcting and grading the raw images produced by this sensor is, well, Inspire-ing. The image’s ability to bend and not break lets it match many high-end cameras. We’ve seen this footage cut in with RED Scarlet-W and Alexa Mini footage with ease.

One drawback to be aware of is that this is not a low-light camera. Its sensor is Micro Four Thirds, and noise can become significant if you push the ISO too far or need to lift the exposure too much on an underexposed image. The great thing, though, is that with the raw data in the .DNG files the shadows, though grainy, can be recovered. We’ve had good results with the Neat Video plugin de-noising the image for an excellent low-light end result, but it adds more steps and time to post production.

Zenmuse X3

Examples from a couple of projects we’ve worked on recently where we used the Inspire 1 Pro RAW:

MOUNTAIN TRAILS FINE ART | JEFF HAM (quite a few shots at the beginning):

FIBER FIX (shots scattered throughout):



As great as the benefits of raw are, there is one big drawback: the post-production workflow. For one thing, raw files take up a lot more drive space, so plan accordingly. After a shoot, you remove the proprietary DJI SSD from the X5R base and insert it into a DJI dock. You then have to open DJI Cinelight (for Mac) or DJI Camera Exporter (for Windows) and from there either transcode to whatever flavor of ProRes you would like, or export the raw footage to folders of raw .DNG files. This process can take a long, long time. Hours. Unfortunately, the files cannot be accessed through the file browser before processing through Cinelight, so there is no drag-and-drop option with the Inspire 1. This has been addressed with the Inspire 2, and we’re hoping that functionality comes to the Inspire 1 as well via a firmware update. This is the biggest drawback of the X5R: it adds so much time to the offloading workflow that shooting all day in raw requires multiple, expensive DJI SSDs.
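To put a number on “plan accordingly”: DJI has quoted a raw bit rate of up to roughly 2.4 Gb/s for the X5R, so a back-of-the-envelope storage estimate looks like this (the bit rate is an assumption from published specs; real usage varies with frame rate and scene):

```python
def raw_storage_gb(minutes, bitrate_gbps=2.4):
    """Approximate CinemaDNG storage, assuming a ~2.4 Gb/s raw bit rate."""
    return minutes * 60 * bitrate_gbps / 8  # gigabits to gigabytes

# One full ~35-minute battery of continuous raw is on the order of 630 GB,
# which is why a single SSD won't survive a full day of raw shooting.
print(raw_storage_gb(35))
```

Even if you roll for only a fraction of each flight, the drive math explains the multiple-SSD problem above.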


We prefer to get ProRes 4444 XQ out of DJI Cinelight for three reasons. One, ProRes is a great editing codec: it has great playback performance and renders quickly. Two, 4444 XQ maintains near-raw quality; many feature films are shot directly to 4444 XQ. Three, FCP X unfortunately does not have true image-sequence support (importing a sequence as a single clip). There’s a simple enough compound-clip workaround, but if you have many shots that gets annoying. If you are on Windows prepping footage for someone on FCP X, output to .DNG sequences; the editor can then run those through Compressor or DaVinci Resolve to create ProRes 4444 XQ clips. Luckily, the Inspire 2 RAW has addressed and simplified the whole workflow and can now record directly to ProRes!


Closing thoughts:

The Inspire 1 X5R is an excellent drone system that spans the market gap between low- and high-budget video, overlapping into the high-end commercial drone world. The affordability and portability of the Inspire 1 make it, for most people, a first choice when considering which drone to purchase. This drone’s ability to fly at high speeds and with precision makes it a great pleasure to operate, but the lack of a pilot FPV makes dual operation difficult. We absolutely love operating this drone and believe that it is a unique and boundary-breaking tool for the media world.



This guest blog post is from Braden Storrs and Patrick Newman.

About Braden Storrs:

I’m a Utah-based editor and video creative. As post-production manager at Jamestown Films, I’ve had the chance to work with some great people on great projects and continually improve our techniques and workflows in an ever-changing digital landscape. I am also the creator of the Final Cut Pro X Editors Facebook group and can be found at @thefcpeditor on Twitter. I love trying to elevate each project a little more than the last. In the end, I’m a storyteller.

Braden Storrs


About Patrick Newman:

I am a DP located in Salt Lake City, UT. Over the past year I worked at Jamestown Films with their wonderfully creative team. “The Story Teller” has taught me about the storytelling side of cinematography and the role it plays in creating an excellent story. Jamestown gave me the chance to grow rapidly as a DP in the commercial world, and to work with just about every camera in the industry, helping me perfect my technique as a DP. Recently I have moved on to DP a few feature documentaries over the summer of 2017. Find me on Instagram @patricknewman170.

Patrick Newman

5K Retina iMac for FCPX

October 16, 2014

Apple finally brought in a key missing piece of its 4K editing workflow and Thunderbolt 2 Mac pipeline. On the high end, you’ve got the Mac Pro with dual GPUs, Thunderbolt 2 buses and the ability to run up to three 4K displays. On the mobile end, you’ve got the MacBook Pro, which has a 4K HDMI out.

Now, for your everyday editor, you’ve got the new Retina iMac, which has a built-in 5K screen and Thunderbolt 2 capability. At 5120-by-2880, that leaves enough screen real estate for 4K at 100% with space left over for the UI. For your average FCPX editor, this is an amazing sweet spot. Check out the full tech specs here: http://www.apple.com/imac-with-retina/specs/
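The screen real-estate claim is just arithmetic; subtracting a UHD (3840×2160) viewer from the 5K panel shows what’s left for the interface:

```python
panel = (5120, 2880)   # 5K Retina iMac
viewer = (3840, 2160)  # 4K UHD footage at 100%

spare_w = panel[0] - viewer[0]
spare_h = panel[1] - viewer[1]
print(spare_w, spare_h)  # 1280 px left for browser/inspector, 720 px for the timeline
```

(DCI 4K at 4096 px wide fits too, with a little less room on the sides.)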

You’re going to be able to easily monitor pixel-for-pixel 4K footage on a solid display and in many ways, the new iMac represents the missing piece for 4K workflow: a very high quality, affordable 4K workstation to watch/edit all the 4K material that the new cameras from BMD, AJA, Sony, RED, GoPro, Panasonic, and pretty much everyone else are now recording to by default.

The main challenge up to this point was that even though you were recording to these formats, it was hard to actually monitor the resolution. Now, your average editor is going to be able to do that, and I think that this iMac release is further confirmation that it’s time to get your 4K workflow together.

The price point for the 5K Retina iMac is also really astounding at $2,499. 4K editing is here; let’s all get back to making movies now. My real takeaway from the 5K iMac is that we’ve finally got a machine with a proper, standard 4K-ready panel that makes it easy to view and edit in a great form factor.

As usual, Apple took something that everything else was making really difficult (4K display/editing) and put it all together into a computer that makes the workflow a lot more straightforward and makes you wonder why no one else is offering something like that already. (No doubt Apple’s scale of manufacturing and ability to source exclusive vendor arrangements helps a lot here).

4K is here. Time to upgrade if you haven’t already… and if you need an iMac-based Final Cut Pro X package, FCPWORKS is a full Apple reseller with unique workflows and tons of experience with FCPX. We’re ready to get you up and running now. (One hint: you’ll probably want the AMD Radeon R9 M295X 4GB GPU option, not the stock configuration.)

TED Goes FCPX with the help of FCPWORKS

October 10, 2014

In what may be one of those Cold Mountain moments for Final Cut Pro X, TED reveals it has made the complete move to FCPX for all of its editorial work. Our own Sam Mestman was privileged to be a big part of TED’s transition plan as he worked directly with Michael Glass, TED’s Director of Film + Video. For the full story, please head over to Studio Daily. Some nice excerpts:

TED made the official transition to FCPX on September 1. “There really was a long runway before officially switching,” says Glass, “but we made that the date at which we would never open FCP7 again in order to edit a talk from scratch. If we need to go back into a previous edit, instead of trying to translate from one to the other, we’ll edit in the old software.” Assisted by Sam Mestman from FCPWORKS, the TED editorial team spent six months training in the new version, starting with tutorials on Lynda.com and Ripple Training and moving to one-on-one tutoring with Mestman, especially at the beginning and again during the week-long “marathon” leading up to the official transition.

“This whole process really started almost two years ago,” says Glass. “We knew Final Cut Pro X was there, but we also looked at Premiere, Avid, even Smoke at one point. We narrowed it down to Premiere and FCPX, and once we had the lay of the land from the press and what we could read about it, we took two-to-three-day intro courses offsite in both applications. That gave us a good handle on where the problems would be, whatever we took on, and also what the advantages would be.

People were starting to come around to the idea of FCPX before Sam showed up for a week here, rotating through two-hour sessions one-on-one with each of the editors and assistant editors. I think having someone be able to walk you through the nuts and bolts of how it all worked, as well as give the context of how to think about this new-ish approach to editing made the editors finally feel safe and excited about taking on the new technology.


But to make that leap you still need someone to encourage you along the way that you’ll land safely. That’s when everyone came around. They went from saying, half-heartedly, ‘OK, I can use this,’ to ‘Wow, this will actually improve my workflow, and even makes editing kind of fun again.’ That was better than even I expected.

The meme has been repeated over the years that Final Cut Pro X isn’t a professional tool… and yet a company as intelligent, world-renowned and respected as TED is now using it as its primary editorial platform. It appears that the game has changed…

4K Goes Mainstream with the GoPro Hero 4

September 29, 2014

Hey guys,

Whether you’re shooting 4K already or not yet, the arguments against its coming arrival as the default new delivery spec are officially moot, with the release of the $499, 4K-capable GoPro Hero 4:


When you’ve got a $500 camera doing 4K at 30p, the argument against 4K delivery falls apart. Whether or not your camera does that may be open for debate, but the idea that 4K and beyond is where everything is going isn’t.

It’s become so possible now to shoot and edit in 4K that there’s no reason clients shouldn’t be asking you to finish at that resolution as well. The good news is that, at least in the short term, it will be a nice premium deliverable you can charge for (or be paid extra for by distributors), as there is currently a shortage of quality 4K content being delivered… and there are all these TVs being manufactured that need content.

I believe we’ll look back at this GoPro announcement as pretty much the final nail in the coffin of the HD vs. 4K argument. 4K is not going anywhere… not at that price point. 4K will not be the next 3D; it’s now to HD what HD once was to SD: the format you’ll have in your camera and the workflow you’ll want to learn for post. We’ve been in these waters for a while now and are happy to help, so drop us a line.


Sam Mestman

Sam Mestman, FCPWORKS.

This blog post contains the personal musings of FCPWORKS’ Workflow Architect, Sam Mestman. Sam’s also a regular writer for fcp.co and MovieMaker Magazine, teaches post workflow at RED’s REDucation classes, and is the founder and CEO of We Make Movies, a film collective in Los Angeles and Toronto which is dedicated to making the movie industry not suck. If you’ve got any FCP X questions or need some help putting together a system, drop him an email at workflow@fcpworks.com and you can follow him on Facebook or Twitter at @FCPWORKS.

Canon DP-V3010 4K Reference Display

September 24, 2014

Had a chance to check out the Canon DP-V3010 4K Reference Display last night. It was, hands down, the nicest small-screen monitor I’ve ever seen. If you’re one of the people who say “4K resolution doesn’t matter, blah, blah, blah…” I offer this monitor as proof that it does matter, regardless of display size, provided the panel on the display is nice enough.

Anyway, it’s a 30-inch monitor that does 4K DCI (4096×2160) as well as Quad HD resolutions (plus a high-quality upscale for 2K and 1080). Color is beautiful… and it’s both DCI-P3 accurate and Rec. 709 accurate.

It’s just an amazing-looking monitor… and if you’ve been confused about what the deal is with 4K and why it matters, well, I think this is the starting point for a convincing argument.

Anyway, there’s only one tiny problem with it… it’s $30,000.

A slightly smaller problem is that it doesn’t take HDMI… you’re going to need to run a Quad SDI signal to get 4K into it… meaning you’ll need either the AJA Io 4K or the BMD UltraStudio 4K to really work with it.

So… basically… it’s expensive and still a bit complicated… but it also looks awesome.

As soon as the price point for a monitor like this becomes manageable (let’s say 5 grand), and HDMI becomes a legit option, 4K will officially become the standard we’re all working in.

Regular HD monitors started this way too… the good ones were really overpriced and complicated at first… so this is nothing new. It’s just a matter of time.

Anyway, for more info on the monitor, go here:


It’s worth checking out if you ever get the chance to see one in the wild.


Sam Mestman

Sam Mestman, FCPWORKS.


Alexa 65: Camera competition explodes

September 23, 2014

No price or availability yet, and it looks like this is going to be a rental item only… but even so, it would seem to be officially “Game on” in the camera business with Arri’s upcoming Alexa 65:


Looks like Arri is doing 6K, and the RED Dragon officially has its first bit of 6K competition.

And if you’re wondering why you should be shooting 6K… well, in my opinion, you certainly don’t need to deliver in 6K, but for the same reasons 4K gave you a lot of flexibility to punch in for a 1080 delivery, it’s basically the same deal going from 6K down to 4K. It’s all about flexibility.

ARRI Alexa 65 6K
As far as I’m concerned, the Dragon is still my first choice on the high end if I had to shoot a feature… but, well, when we know a little more about this camera (price/ship date), that may change a bit.

It seems like every day some crazy new camera comes out that changes the whole game. Whether it’s the Panasonic GH4, Sony A7S, RED Dragon, the AJA CION, all the BMD cameras (I can’t keep track of what’s current), or the Alexa 65… I guess all it really means is that we as filmmakers have a lot of great new options… and, as editors, a lot of different formats and styles we’ve got to keep track of.


Sam Mestman

Sam Mestman, FCPWORKS.


No more RED proxies necessary?

September 11, 2014

Sam here…

I’m noticing RED proxies may no longer be needed in FCPX. So… way back when 10.1 was released, there was a little feature in the release notes that’s actually a big deal, but no one really talks about it… and I’m not even sure anyone really noticed it:

If you have transcoded RED RAW files to ProRes through a third-party application, you can relink to the original RED files within Final Cut Pro.

For me the Proxy workflow with RED stuff always worked fine… but last week, I did a little test. Basically, I brought some RED files into FCPX, did a quick batch rename, some prep, etc. Then, I went and transcoded out a 1:1 5k ProRes LT file from the Epic footage in REDcine-X. I went back into FCPX to relink from the R3D to the RCX ProRes file… relinked with no problem.

It would seem that you don’t need proxies anymore to be offline/online with RED footage… you can import your RED files right into FCPX, get prepped, etc… in the meantime, you can be transcoding that same footage through RCX to whatever codec you want (I’d typically recommend ProRes LT for offline)… and then when you’re done transcoding, just relink to your transcodes, edit away, and when you’re done, relink back to your RED files and finish. There should be no downtime, and your relink should be almost as fast as flipping from proxy, except that you won’t be stuck with the ProRes Proxy codec for your offline, and you can work with other non-RED formats in the same timeline in optimized/original mode largely without issue. Kind of awesome.
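Before attempting the relink, it’s worth sanity-checking that every R3D clip actually has a matching transcode. A throwaway script like this (purely my own illustrative helper, not part of FCPX or any RED tooling, and assuming your transcoder kept the original clip names) can flag any gaps:

```python
from pathlib import Path

def find_missing_transcodes(r3d_names, prores_names):
    """Return the stems of R3D clips that have no matching ProRes
    transcode. Matching is by filename stem, which assumes the
    transcodes kept the original clip names."""
    prores_stems = {Path(name).stem for name in prores_names}
    return sorted(
        Path(name).stem
        for name in r3d_names
        if Path(name).stem not in prores_stems
    )

# Example: one clip was never transcoded
r3d = ["A001_C001.R3D", "A001_C002.R3D", "A001_C003.R3D"]
prores = ["A001_C001.mov", "A001_C003.mov"]
print(find_missing_transcodes(r3d, prores))  # ['A001_C002']
```

In practice you’d feed it the contents of your R3D and transcode folders; anything it flags gets re-queued in RCX before you try to relink.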

One small caveat – when you’re transcoding your RED footage, make sure your timecode setting matches the timecode displayed in FCPX. I did a test that had the timecode set to Edgecode for some reason, and it caused some relink issues until I noticed that the timecodes for the ProRes transcodes weren’t matching the timecodes for the RED files in FCPX. Once I was on the right timecode setting, I was able to relink without issue.
RED proxies
Anyway, for smaller RED-only projects, I’m still going to use the Proxy workflow, only because it’s so easy and I can transcode in the background right in FCPX and flip modes as necessary… however, for longer-form work where I know I’ll be working offline for an extended period of time, the flexibility of being able to easily relink to RCX transcodes is great.

Now… here’s the million dollar question that I haven’t tried… will this relinking business work with non-RED formats? If anyone has a chance to check, let me know in the comments…

ARRI Amira now does 4K UHD

September 10, 2014

Sam here…

Looks like ARRI is finally getting their act together and supporting 4K capture with the Amira:


While I’d love it if it supported full 4K DCI (4096×2160), this is still a big step in the right direction for them.

ARRI Amira Workflow Diagram

What’s really nice about this is that it’s going to shoot out ProRes, and not some ridiculous, impractical RAW format that’s hard to work with… although it seems from the diagram like they’re not quite supporting ProRes XQ, which is a bit of a drag. That’s the capture format I’d love to see for feature/high-end work.

Regardless, a huge step in the right direction.

Whether you all like the way the ARRI Amira is set up or not… you need to start thinking about how you’re going to shoot and master in 4K. It’s going to become the new standard.

PS- and don’t forget to tune into FCPWORKS’ Sam Mestman appearing on the Final Cut Virtual User’s Group at http://hazu.herokuapp.com/pixelcorps/fcvug-3 at 1:00 PM PST today!

4K Monitoring Options: BMD or AJA

August 22, 2014

Sam here… finishing up 4K week on FCPWORKS Workflow Central with one more post.

So… BMD or AJA… the eternal debate. Right now, we’re centering this around the 4K monitoring & I/O products, AJA’s IO 4K or the BMD Ultrastudio 4K. Basically, it comes down to this… Do you use DaVinci Resolve for Color Correction? If the answer is yes, you’re going to need to go with Blackmagic. Case closed. Blackmagic Devices are the only ones that will work with Resolve.

However, if the answer to that question is no, and you’re doing your color work primarily in FCPX or another program that isn’t Resolve (Scratch, Baselight, Smoke, to name a few), the discussion becomes a lot more complicated. Additionally, in the case of the BMD Ultrastudio 4K… it can be loud. The AJA IO 4K is quiet and considerably smaller. If you’re keeping the product in a room with a new Mac Pro as your primary computer, you really start to hear the Ultrastudio 4K when it’s on… and if you’re doing serious sound mixing, the noise makes a big difference.

Additionally, and this is a little-known fact, but the AJA devices support more monitoring formats for FCPX as well. For whatever reason, your monitoring formats in the Blackmagic preferences (the System Preferences pane on the Mac) are considerably more limited than what you get when using your Ultrastudio 4K in Resolve.

HOWEVER, at the end of the day, price is also a factor and the AJA products are almost universally more expensive than their BMD counterparts. So price vs. performance is definitely a consideration. In my opinion, if you’re more interested in specific features, go AJA. If you’re budget conscious or a heavy Resolve user, go BMD.

P.S. – another little-known fact: the HDMI out on the back of the new Mac Pro can be used as an 8-bit A/V out in FCPX, and if you’re using weird sequence settings and just need to send out a 1:1 output over HDMI, it’s FAR more configurable than what’s available through your BMD or AJA device.