RED Raw For Your Colorist

August 14, 2014

Sam here… we’re going to talk RED RAW today, because there’s no reason for this to be so hard and complicated.  Mostly, this is a public service to DPs everywhere, many of whom seem confused by how all of this works.  In my experience, a lot of DPs try to capture their LOOK on set… even though they’re shooting RAW.  Mostly, this comes from a fear (often justified) that post will screw up their look later if they don’t lock it in now.  Unfortunately, this approach is counterintuitive to how the camera is designed to work, and doing things this way will often lead to a lot of finger pointing, anger, and inflated post budgets once the film hits the finishing stage.

The bottom line is that if you’ve ever heard your DP say the following words… show them this post:

“The RED is a noisy camera… I always have to add noise reduction to my footage in post.  Also… I always like to save LUTs and looks when I shoot RED.”

With the RED, LUTs are stupid.  Sorry.  Someone needs to say it.  You’re just going to go back to REDlogfilm when you hit the finish line anyway… or you should be using the controls in REDcine-X to manipulate the RAW the way you want it AFTER YOU’VE SHOT IT.

If exposed and lit correctly, you should NEVER want/need a LUT when you hit the color room.  Use the standard REDcolor/Gamma settings while you’re shooting as a baseline, and then tweak later in REDcine-X.  When it comes to RED, probably the worst thing you can do is try and dial your look in while you’re on set.  It defeats the whole purpose of shooting RAW.

The truth is that shooting RAW is not a cure-all.  While it provides greater flexibility than traditional codecs, you need to do certain things correctly, and understand a couple of concepts, in order to get good results.

Fortunately, there isn’t all that much that you need to know.  In fact, if you do the following, you’re pretty much guaranteed good results with your Scarlet/Epic/Dragon:

  1. Shoot at 800 ISO – The RED sensor is rated to be shot at this ISO.  Start here while on set.  While you can shoot at other ISOs, you shouldn’t unless you absolutely have to.  Play with that stuff later in REDcine-X.  Shoot and light it for 800.
  2. Don’t clip – Look at your histogram.  Make sure everything you’re shooting is between the “goal posts”.  If it’s not… do a better job with your lighting, or accept certain realities in post.  Also, keep in mind you always have HDRX available to you in extreme cases.
  3. Expose your skin tones correctly – For the love of God, don’t underexpose your skin tones.  Seriously… just don’t.  It’s the number one reason people end up unhappy with their RED footage and why things turn out noisy: they find they need to brighten their skin tones in the color room.  To make sure your skin tones are exposed properly, use the False Color mode and make sure your skin tones are “pink”.  If they are, you’re good to go.  You can always make things darker later… rarely, however, can you make things brighter without introducing unwanted noise.  Even if you want things “moody”, EXPOSE YOUR SKIN TONES PROPERLY.
  4. The smaller your resolution, the grainier your footage – Basically, if you shoot with the Dragon/Epic at 4k, 3k, or 2k… you’re using less and less of your sensor, and less and less information is being captured.  Many complain that their 2k stuff looks worse than their 4k and 5k stuff… that’s because it does.  You’re only using part of your sensor, and depending on your compression rate, you may start to see a lot of problems, noise, and grain introduced… especially when you shoot at 2k.
  5. Lower your compression ratio if you’re going to reframe – For the same reasons discussed in #4, the higher the compression ratio you shoot at, the more noise you’re going to see in your punch-ins in post.  Once you get past 7:1 compression or so, expect the quality loss in your punch-ins to become far more noticeable.  While there’s no reason to shoot RED RAW uncompressed (there’s not even really a good reason to go below 5:1), keep in mind that the higher you go, the more noise will be introduced, and this noise is compounded when you reframe/punch in during the edit.  Even though you shot at 4k, that doesn’t mean all punch-ins are created equal when you come down to 1080.
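To put rough numbers on points 4 and 5, here’s a quick back-of-the-envelope calculation (plain Python; the 5120-photosite sensor width is an assumed round number based on a 5K sensor, and real-world results also depend on aspect ratio and debayering):

```python
# Rough math behind points 4 and 5 (sensor width is an assumed round number).
sensor_width = 5120    # approximate full sensor width in photosites (5K)
delivery_width = 1920  # 1080p delivery timeline

# Point 4: lower capture resolutions crop the sensor, using fewer photosites.
for capture_width in (4096, 3072, 2048):
    fraction = (capture_width / sensor_width) ** 2
    print(f"{capture_width}-wide capture uses ~{fraction:.0%} of the sensor area")

# Point 5: how far you can punch in from a 4K capture on a 1080p timeline
# before you're upscaling (and magnifying compression noise).
print(f"Max punch-in from 4K before upscaling: ~{4096 / delivery_width:.2f}x")
```

In other words, a 2k capture on a 5K sensor only uses about 16% of the photosites, and a 4K capture gives you roughly a 2x punch-in before you start paying for it.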

Seriously, those five things are all you really need to know in order to make you and your colorist happy when you reach the finish line.  Why people make this so hard, I’ll never understand.

For more info on some of the RED exposure tools and how all this works, read these articles –

FCPWORKS’ Noah Kadner on FCVUG

August 13, 2014

FCPWORKS’ Noah Kadner will be doing a round table discussion this Thursday evening on the Final Cut Virtual Users Group. Be sure to tune in, as they’ll be answering your questions live. Fellow FCPX whiz kids will include: Mike Matzdorff, Chris Fenwick, Mark Spencer, and Steve Martin. Tune in live at 6:00 PM PST on Thursday, August 14th, 2014 at

No More Pro Video Support

August 13, 2014

While, selfishly, I’m kind of okay with this because it’s good for my business:

It’s still a bit of a bummer… because what it really means for your average person is that they’ll now think the only option for real, immediate, pro A/V support with FCPX is to go to the Apple Store and talk to an Apple Genius… and I think we all know how that is going to go for them.

*** Shameless self promotion… if you need help with FCPX and the related ecosystem, FCPWORKS is a way better option for you than the Apple store ***

Sales pitch over.

Anyway, what this news also means is that it’s just more ammunition for Apple critics who want to rail about how Apple has abandoned the Pro Video market… and I was really hoping I was done with that debate.

Here’s my point of view on it, though, for what it’s worth.  I don’t think this is a sign that Apple isn’t committed to the pro market.  I think it’s more a sign that they have bigger problems to solve, and that maybe the support infrastructure itself has changed a bit from when Apple’s Pro Video Support was necessary.

The truth is that most people find a lot of the answers they need through google, blogs, and videos nowadays.  I know I do.

And for more specialized, higher end cases, they probably weren’t going to be Apple Support clients anyway… those guys are all going to third party specialized consultants who are doing the real work day in and day out.

I think what this move symbolizes more is probably the fact that Apple woke up one day and realized that a $799/year support service wasn’t really in line at all with what their average customer needed from them… so, I think that’s why they killed it.

I mean, honestly… would you/were you paying $799 a year (more than twice as much as it costs for a license of FCPX) for Apple’s Pro Video Support?

I know I wasn’t.

To be honest, until I read Alex’s post… I didn’t even know this was still around… which probably says more about why they got rid of it than anything.

However, if you do find that you need this kind of support… well, we do that here at FCPWORKS, and we’ll have you covered.  Feel free to reach out to with any questions.

Why Aren’t You Using Motion?

August 12, 2014

So… the title of this blog is a bit of a trick question.  If you’re using any of the built-in FCPX effects, titles, or generators, or you’re using most of the 3rd party effects or templates, you’re actually using Motion without even being aware of it.

However, because of the new integration between FCPX and Motion and the loss of the “Send To Motion” command, I tend to feel like Motion has become the forgotten app in the Apple Pro Apps ecosystem.  I feel a bit bad about this, because Motion is actually awesome, especially if you’re an editor like me who has no interest in becoming an After Effects/Nuke genius.

No one has the time to know and be good at everything.  I always gravitated towards the edit and color correction ends of the business.  While I had an understanding of and interest in essential GFX and audio techniques, for me, if I couldn’t get something done quickly on that end, it just wasn’t going to happen… and I never had the time or natural inclination to become an After Effects or Pro Tools master.  For whatever reason, those apps just never made sense to my brain the way Final Cut, Color, and Resolve did.  I liked to stay in-app/as integrated as possible when it came to GFX and audio, which is why I bothered to learn Motion and Soundtrack Pro back in the FCP7 days.

The truth is that, even though GFX may not be your thing, especially if you’re a one man band, your clients are still going to expect you to be able to do high quality lower thirds, titles, and other common GFX tasks… especially for lower budget corporate, commercial, and internet projects.  Just about any Youtube video you make for a paying client is going to require some kind of endtag.  Many of these tasks end up being repetitive, and in most cases are best suited to a template you can work with quickly right in your NLE.

This is where, as an editor, getting to know Motion was my best friend.

The reason is that things you make in Motion show up automatically as titles, generators, or effects in FCPX, and understanding the rigging and publishing concepts inherent in Motion can save you RIDICULOUS amounts of time in your edits.  It’s a bit of a different way of working, which may be why a lot of people haven’t gravitated to it… but for 85% of the common tasks asked of an editor, the combination of FCPX and Motion will save you a ton of time, and often produce higher quality work (because of the time saved and the simplicity of the workflow) than you’d get banging your head against the wall with After Effects renders.

Also, Motion becomes a lot more powerful the deeper you get into it.

The good news is that Mark Spencer from Ripple Training created some of the best tutorials for Motion that I think anyone has made for any kind of app.  More specifically, his tutorials for Rigging and Publishing for Final Cut Pro X, Mastering Replicators, Mastering The Camera, and my personal favorite, Mastering Shapes, Paint Strokes & Masks, are just awesome… especially for the average editor who just wants to be a solid B when it comes to GFX, and wants to know how to make something that looks professional quickly.

Anyway, if you want to get up to speed quickly on Motion, I can’t recommend Mark’s Motion 5: The Complete Series (which has all the tutorials I just mentioned) highly enough.

And if you’re looking for some great free Motion tutorials, you need to stop what you’re doing and check out Simon Ubsdell’s Youtube Channel.

Seriously, even though no one ever talks about it… for a lot of people, Motion is really worth learning.

FCPX SAN Workflow Dos and Don’ts

August 11, 2014

Hey guys,

Sam here again… contrary to what a lot of people believe, SAN workflow in FCPX is actually very simple and straightforward, especially with the newly released version (10.1.2). While the Avid Unity workflow is still a little more robust with its bin-locking features, working from a SAN/network in FCPX is very practical. Also, in terms of price vs. performance, I’d say FCPX is the way to go, as Unity setups tend to be slow and extremely overpriced. The truth is that you can have multiple editors easily accessing the same media and passing projects to each other seamlessly. All you really need to know are a few things:


Keep all your media centralized on the SAN – For me, best practice is to put your original Camera negatives into a “Media” folder on your SAN. Within that, create a new folder based on your project, and then a “media” folder within that and place your camera/sound originals in there accordingly in their original directory structure.

Keep your media outside of the library – When you make a new library, first make sure that “leave files in place” is selected in your preferences. Then, select your library, go to “modify settings” in the inspector, and set your media to be imported to a folder that is outside your library (this can be on the SAN). When you import media now, it will be added to this new folder as “sym links” (similar to aliases) pointing back to the original media that also lives on the SAN (see above). This will come in handy for archiving and media management later (see below for why).

Keep your Libraries on an internal drive – If you’ve worked off a network in FCPX and experienced slow-opening libraries and generally slow performance, it’s probably because you have your libraries on the SAN itself. The reason is that SANs are designed to handle large chunks of media, not the small database files that FCPX creates. If you’re on a network, run a test: take a large library that has its media outside the bundle (so it should be a lightweight file) and copy that file from the SAN to your desktop. You’ll notice that the copy time is probably far longer than it should be. Now, copy a regular media file over. If you’re on a decently fast network, this copy should be MUCH faster than the library copy was, even though the library was much smaller in size. This illustrates the issue. For this reason, best practice is to keep your database on local storage (especially if you have a new Mac Pro, which has an extremely fast internal SSD) or an external hard drive. You will see a significant increase in speed and application startup times doing things this way.

Make sure all Editors’ Libraries are pointing to the same place – The best way to do this is to make a master editorial Library for your primary editor using the import steps described above, and then duplicate that library and hand it off to each new editor. If you’ve kept your media outside your bundle, this will now be very similar to the standard FCP7 approach most of you are used to… and basically all you’re doing is passing each editor a duplicated “project file” that is making sure that your “capture scratch” is all being set to the same place for easy reconnecting later. This way, any time an editor imports media, you’re guaranteed that it’s going to the right place in the database, and that this database is the same as what your other editors are working from.

Use Xfer Libraries and keep those on the SAN/Network – Because two editors can’t work from the same library at the same time, you should have a centralized Xfer library that editors open and close when they want to pass new edits to each other (this can also live in a dropbox/google drive). If an editor needs to pass media or an edit to another editor, they should consolidate their library first to the network to make sure all media they’re referencing lives in the correct place (see below for the reason why), and THEN pass their project(s)/event(s) into the Xfer library to distribute to other editors. They should then close the Xfer library so others can access it.

Use consolidate commands and hard links to seamlessly and non-destructively consolidate media in FCPX – Ok, so here’s something really cool that not a lot of people are aware of. If you are using the FCPX media structure in the correct way, then because of the way FCPX takes advantage of hard links (for the record, I have no idea what the real definition of these is… just what they do), you can have multiple copies of the same file on a drive/SAN/network, and those files will only take up the space of a single copy of that file. To illustrate, here’s a simple test you can run:

  1. Create a new library set up the way I described above, and make sure the media folder in your library is set to the same drive your Camera Originals/files you want to import are stored on.
  2. Check and see how much space is still left on the drive. Write this number down.
  3. Take a relatively large file (minimum 5+ GB), import it into the library, and confirm that sym links show up in your original media folder.
  4. Now, use the consolidate media command new in 10.1.2 to have that media you just imported copied over to the new library.
  5. Ctrl-click the file, select “reveal in finder”, and then look in the original media folder and confirm that the file you imported no longer has an arrow next to it (meaning that this file is now an actual copy and no longer a sym link).
  6. Check the total disk space on the drive/SAN/network. It should still be identical to the number that you wrote down.
  7. This means that you have two “copies” of the file on the same drive, but you are only losing space for one of them because of “Hard Links.”

The reason is that because FCPX uses hard links for your media, it keeps track of the files you have on a drive, and if you use the FCPX commands to manage your media, you can have files living in more than one place on your drive without being penalized in terms of disk space.
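If you want to see this behavior for yourself outside of FCPX, here’s a minimal sketch in plain Python (the file names are hypothetical) showing that a hard link creates a second directory entry without consuming additional disk space:

```python
import os
import tempfile

# Create a 1 MB stand-in for a camera original in a temp directory.
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "original.mov")
linked = os.path.join(workdir, "copy.mov")

with open(original, "wb") as f:
    f.write(b"\0" * 1024 * 1024)

# A hard link is a second name for the same file on disk, not a copy.
os.link(original, linked)

# Both names point at the same inode, so only one copy's worth of space is used.
print(os.stat(original).st_ino == os.stat(linked).st_ino)  # True
print(os.stat(original).st_nlink)                          # 2
```

Deleting one of the two names leaves the other intact, which is why this kind of media management is non-destructive.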

What this means is that in terms of archiving and importing new media, if all your editors have their libraries pointing to the same media folder, and you are using the consolidate media commands correctly, you now have the best of both worlds: you can copy your media onto the network and have that directory structure remain untouched by FCPX… but you can also ensure that all of your editors have access to the same media, because everyone is consolidating to the same place when they import new things, and you never lose disk space because of it. Not only that, but when you’re done, everything is simple to archive, because everything you need for a project will live within a single directory that you can archive whenever you need to, and you can be sure you’re not missing anything. Hard links are great.

So, to recap… here are some don’ts:

  1. Don’t keep your media inside the library on the network. This will make your libraries far less portable.
  2. Don’t let your editors have their library Media folders pointing to different places.
  3. Don’t keep your library media folders on different drives/networks/SANs than your original negatives if you can avoid it.
  4. Don’t keep your libraries on the SAN/Network – except for a single Xfer Library for people to pass edits and events to and from (but even then, you should probably put this Xfer library in a dropbox).
  5. Don’t try a group workflow on a large project without first running tests on your network (especially disk permissions, which can do all kinds of weird things to you).
  6. Don’t let your editors start cutting without explaining the workflow to them ahead of time and making sure they understand why they’re doing what they’re doing.

Also, for some more information about how media management works in FCPX, check out this awesome video by Dustin Hoye:

For an expanded understanding of working with FCPX on a SAN, check out this great resource posted recently on

DCP through FCPX/Compressor

August 8, 2014

Hey guys,

Sam here… some of you know about this, and some of you don’t, but you can actually make and view your own DCPs using Compressor (or in FCPX), and it’s ridiculously easy.

We demoed this for folks who visited the FCPWORKS suite at NAB, and I even had one of my We Make Movies shorts (Agnes) screen at the NAB StudioXperience 4k Filmmakers Showcase. The only reason I was able to get them what they needed (a 4k DCP) was because of the Wraptor Plugin/DCP Player combo. Given my timeline and how quickly I needed to turn it around, I just wouldn’t have bothered with the other solutions due to their complexity and inability to easily check/preview the DCP on my Mac. Honestly, the workflow for this is so easy, I kind of felt like I was cheating or something. In my mind, DCP creation was supposed to be hard. That’s no longer the case. Thanks Quvis.

Anyway, in order to make a DCP through Quvis Wraptor in Compressor, here’s what you need to do:

  • Buy the Wraptor 3.1 for Apple Compressor ($699)… you can also try the watermarked version for free.
  • Buy the DCP Player ($699 to own) or rent it ($60 for 30 days, $360 for the year)
  • Download and install the plugin in Compressor
  • Export a master file of your movie (Prores XQ, 4444, or HQ are your best bets), with your audio channels laid out according to your DCP requirements
  • Drag the file into compressor
  • Apply the Wraptor plugin, configure for resolution (2k or 4k), frame rate, and number of audio channels
  • Set your destination
  • Export
  • Check it using the DCP Player Software
  • Bring it to the theater or upload to a server

You can also set up a custom Compressor setting that you can use right in FCPX from your timeline.

When it’s done exporting, you’ll have a DCP folder that you can preview right on your Mac using the DCP Player software. It’s going to automatically interpret the color space of your DCP file to display on your Mac pretty much the way you’ll see it in the theater.

In terms of quality, there was no difference between what we were able to see from the Quvis DCP in the theater and the very same file encoded by the studio’s post house.

On a new Mac Pro, with the recent Quvis 3.1 upgrade, you should see near-realtime encoding for 2k DCPs (it will take longer for 4k).

The main differences between what Quvis does and the free OpenDCP software are ease of use, render time, and the better signal-to-noise ratio of the DCPs you’re generating. Bottom line: if you find yourself needing to deliver DCPs regularly, the Wraptor/DCP Player combo gives you the best bang for your buck.

One small thing to note… encrypted DCPs are not supported yet… so if you find that you need that, you’ll need additional 3rd party software to encrypt the DCP.

Anyway, for you FCPWORKS customers out there, if you find yourself running into issues, hit us up at and we’ll help you out.

Eizo CG-277 Workflow Review

August 7, 2014

The Eizo CG-277: The most versatile 2nd display monitor for FCPX

Sam here… wanted to take a minute to talk about a monitor I really love that I think is particularly well suited to the needs of FCPX users. It’s the 27” Eizo CG-277, and as far as I’m concerned it’s the best monitor on the market when you factor in size, price, performance, ease of use, quality, and flexibility. Here are some reasons why:

  • Can be used either as a desktop display or as a grading monitor with a native resolution of 2560×1440.
  • Built-in, incredibly easy calibration tools (no external probe necessary) using Eizo’s ColorNavigator NX software. Through the built-in probe, the monitor will automatically compensate for ambient light in the room to ensure a color-accurate signal.
  • Direct attach to Mac through Mini Displayport or through HDMI
  • Support for 10 bit HDMI (using either AJA or BMD video I/O)
  • 100% Rec 709 coverage or 91% DCI-P3 coverage
  • Multiple color spaces built in that you can easily switch to, and it’s very easy to create your own custom ones.
  • Excellent blacks and contrast ratio
  • MSRP is only $2499
Here’s how I’m using it:

I’m primarily using it as my 2nd display/grading monitor in FCPX. For the most part, I’m running through Mini DisplayPort as a 2nd desktop display. When it’s time to do color grading, I’ll either quickly switch over to the built-in 8-bit A/V out on the new Mac Pro through the HDMI out if I just need to know what I’m looking at, or, for more high-end projects, I’ll go through the A/V out on my Ultrastudio 4K for 10-bit grading (I’ll typically only do this if I’m in Resolve; grading in 8-bit off the HDMI is a bit more flexible in FCPX, and I don’t really need to go the 10-bit route for most projects… 8-bit is fine for the color tools in FCPX).

The real advantage of this monitor is its flexibility and how easy it is to set up, calibrate, and switch between color spaces and signals.

A couple of gotchas:

  • Out of the box, your REC-709 profile may not look right. Using ColorNavigator to recalibrate will quickly correct this.
  • In desktop mode (Mini DisplayPort) there’s a slight gamma shift through the Mac OS vs. running through the HDMI out on the Mac/Macbook Pro… I think it’s because OSX is adding something to the display profile. It’s not terribly noticeable, but if you’re doing sensitive color work, you’ll want to do it over the A/V out from the Mac Pro, or through the HDMI out from your AJA/BMD box.
  • There’s no SDI in… I don’t really care about this, but it’s important to a lot of people for some reason. If you understand how to get your HDMI signal in properly (and there’s not much to know), there aren’t really any advantages to SDI… in fact, I think HDMI is far more manageable and flexible for the average person.

Anyway, if you’re budget conscious but don’t want to sacrifice quality, the CG-277 is the way to go. I’ve been using it since it came out, and I really don’t have any complaints. In terms of price/performance and what the average FCPX user needs out of a 2nd color-accurate display… there are almost no drawbacks, and it’s a bit of a no-brainer.

At FCPWORKS, we only sell the products we would use if we were our clients… and we’re a proud reseller of Eizo products. So, if you’re interested in buying one, get it through us (at the same price you’d find elsewhere), and we can walk you through any tech issues you might run into and get you up to speed, calibrated, and ready to edit.

Resolve Workflow Test Results

August 6, 2014

Hey guys,

Sam here. I did some pretty comprehensive FCPX roundtrip tests today using a whole bunch of different types of media in the new Resolve 11 Beta (#2)… tests were done in a 1080p 23.98 timeline, and so far the results are really promising. At this time, there are only two major issues with the roundtrip:

  1. Synchronized clips were not working at all.
  2. Synchronized clips created through Sync N Link were also not working (might be part of the same problem).

If you’re using synchronized clips, the workaround right now is to use “Break Apart Clip Items” in your FCPX timeline and then send to Resolve. The only real issue with this is that you’ll need to reapply keyframes and transforms in your timeline afterward if you want those to come through into Resolve, as they get wiped away when you break apart your sync clips. As always, before getting into the conform stage, duplicate your master timeline so that you can reapply anything you need later using Paste Attributes.

Anyway, while there was some other weirdness, this is pretty much ready for primetime on big projects for roundtripping as far as I can tell… and there are some MAJOR XML roundtrip improvements in the new version of Resolve 11. Here’s a quick rundown of what is and isn’t working:

Spatial Conform – Correct (although it doesn’t appear so right away, as you have to set the renders back to Fill when you roundtrip into FCPX)
Markers – Don’t roundtrip
Multicam – Works
Transform – Works, even with keyframes
Crop – Ken Burns does not work… only exports with the “end” animation
Primary Color – Sort of works… Color tab mostly correct except for a slight highlights adjustment, Exposure tab works, Saturation tab not accurate at all
Secondary Color – Secondaries and masks don’t work
Effects – Didn’t translate/roundtrip correctly (settings went back to default on roundtrip)
3rd party plugins – The one I tried from Luca VFX actually works… even bringing over the adjustments I made to it. This was surprising.
Titles – Worked correctly, except that the font didn’t carry over on roundtrip
Compound clips – Kind of worked… the compound clip came in as a single clip, but there were a whole bunch of inconsistencies inside it. In general, these probably aren’t too functional for color correction anyway. You should break them apart before sending to Resolve.
Speed changes – HUGE improvements here over previous versions. Here’s what I found:

Worked on import and roundtrip:

  • Fast speed
  • Slow speed
  • Reverse
  • Normal speed quality
  • Blade speed (variable)
  • Range speed (variable)
  • Custom speed
  • Ramp to 0
  • Ramp from 0

What didn’t work:

  • Frame Blending and Optical Flow didn’t carry over (they reset to normal on roundtrip, and I couldn’t determine what they were set to in Resolve). You should set these after the roundtrip in FCPX.
  • Instant replay (combining a forward and reverse speed adjustment in the same clip)

Overall – the fact that the speed changes worked so well was HUGE, and most of the issues I described above are holdovers from the current XML implementation (although Spatial Conform works considerably better). Really, if the BMD team can get sync clips (as well as Sync N Link versions of those) working, the FCPX-Resolve roundtrip becomes about as good as anyone can reasonably ask for… and I think it’s already better than what any of the other NLEs can do right now. Lots of progress here.

Thunderbolt Bus Mapping on the New Mac Pro

August 5, 2014

Sam here… before we get to the Thunderbolt buses on the new Mac Pro, here are a few things you should probably know about Thunderbolt and modern Macs:

  • Macs that are still Thunderbolt 1 only: Mac Mini, iMac, Macbook Air
  • Thunderbolt 2 supported Macs: Macbook Pro, Mac Pro
  • You can connect up to 6 Thunderbolt devices on a single Thunderbolt bus.  On a Mac Pro, you can do up to 36 Thunderbolt devices.  Note: this is very device-dependent.  In real-world testing, depending on what kinds of devices you’re trying to hook up, you may only get 2-3 per bus before you start having problems.
  • The Mac Pro is the only Mac that has more than one Thunderbolt bus (it has 3, actually).  This means that even if you have more than one Thunderbolt port, you’re not necessarily getting the full 10 or 20 Gb/s to your device from each port.  It’s all pulling from the Thunderbolt bus that the port is connected to.  This means that if you have multiple displays (even Apple Thunderbolt Displays), these devices will affect Thunderbolt performance as you continue to add devices to the bus.
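To get a feel for why displays eat into a bus so quickly, here’s a rough bandwidth estimate (plain Python; the numbers ignore blanking intervals and protocol overhead, so treat them as ballpark figures only):

```python
# Approximate uncompressed video bandwidth a display pulls from a Thunderbolt bus.
def display_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
    """Raw pixel bandwidth in gigabits per second (ignores blanking/overhead)."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

TB2_BUS_GBPS = 20  # a Thunderbolt 2 bus, shared by every device on it

print(f"2560x1440 @ 60Hz: ~{display_gbps(2560, 1440):.1f} Gb/s")
print(f"3840x2160 @ 60Hz: ~{display_gbps(3840, 2160):.1f} Gb/s")

# Two 1440p desktop displays alone take roughly half of a Thunderbolt 2 bus,
# which is why storage hanging off the same bus starts to underperform.
two_displays = 2 * display_gbps(2560, 1440)
print(f"Two 1440p displays: ~{two_displays:.1f} of {TB2_BUS_GBPS} Gb/s")
```

The point isn’t the exact figures; it’s that display bandwidth comes straight out of the same 20 Gb/s pool your drives are using.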

How you attach your devices to the new Mac Pro is really important, as it will affect performance across all of your devices.  Here are some tips for mapping out your Thunderbolt devices across the individual buses.

  • Do not attach more than 2 displays to a Thunderbolt bus.  If you do, expect to see problems.
  • You can connect up to 6 Thunderbolt/Mini DisplayPort displays (2 on each bus) to the new Mac Pro.
  • You can connect up to 3 4K displays: 1 each on buses #1 and #2, and a third through the HDMI port, which connects to the third Thunderbolt bus.
  • On my setup, I have my ports configured this way: my two desktop monitors are on bus #1.  My Promise R8 and some additional Thunderbolt storage are on bus #2.  My Ultrastudio 4K for video I/O is on bus #0, and I’ll connect additional drives/peripherals to this bus when necessary.

When you take full advantage of the Thunderbolt mapping on the new Mac Pro, you can do something like what we did at the RED booth at NAB.  We had two LG 21:9 Ultrawide monitors (3440×1440) hooked into buses 1 and 2, as well as a large 4K Sharp HDTV (I think it was the 70”… but I could be wrong) hooked up to the HDMI port.  We were able to get realtime playback of 4K Prores (playback set to High Quality) in FCPX off a Promise Pegasus while simultaneously getting realtime 6k Dragon playback in Resolve off the internal Mac Pro SSD (yes, both programs were open and playing back at the same time).  We had FCPX on one of the LGs with the AV out to the Sharp, while Resolve was open on the other LG monitor.

And while you would never actually have a reason to do this in a real-world workflow… the fact is that you can, if you want to and you know how to map your Thunderbolt ports correctly on the new Mac Pro.  We are living in interesting times.

Welcome To FCPWORKS – Workflow Central

August 4, 2014

Hey guys,

Welcome to FCPWORKS Workflow Central.  I’m Sam Mestman, this is our blog, and what we’re really hoping to create here is a place you can come to for tips you can really use in your professional Final Cut Pro X workflows.  The FCPWORKS team will be covering a wide range of topics, all about ways to get your work done faster and make sure that more of your time ends up on the screen and less on troubleshooting and busywork.

Whether it’s tips for FCPX, Motion, Compressor, or Resolve, or thoughts on other apps, plugins, and products we like here at FCPWORKS, the goal is to take bits and pieces of things we’re finding in our projects and share them with you, so that hopefully you’ll be able to turn your edits around a little bit faster.

And if you ever have any questions, comments, or things you’d like to see us cover, absolutely shoot those over. Also, we’d love to know if you’re finding apps or products that you think other people in the FCPX community should be using too.  At the end of the day, we want to help create a better, smarter FCPX community… and because FCPX workflow is so different from what most professionals are used to, we felt that creating a place where people can come for some real-world tips was really important.

Anyway, we’re going to try to get a little something up here for you regularly, and if you want to stay current on everything we’re doing, go ahead and sign up for our newsletter using the form on this page.

If you’re ever curious about what kind of system you should buy, or you feel like you’d like to buy your gear from people who can actually answer your workflow questions… well, that’s what we do here at FCPWORKS.



Sam Mestman
Workflow Architect – FCPWORKS