Panasonic GH4 Tips and Tricks

August 20, 2014

Those who know me well know that I’m a huge fan of both FCP X and the Panasonic GH series of cameras. Currently, I own and shoot with the GH2, GH3, and yes, even the almighty 4K-capable Panasonic GH4. FCP X handles the different codecs, bit rates, and frame sizes like a champ. In addition, I always have the option to transcode the footage to ensure that everything runs smoothly on, say, an older laptop.

I decided to keep the older cameras for a variety of reasons. The most practical of those reasons is that they’re both paid off! However, choosing to shoot with all three cameras at the same time presents some problems.

Chiefly, how do I effectively balance the distinct looks of each camera when those looks are baked into the image? Does FCP X have the tools I need to get the job done? How do I keep my workflow simple without having to do a major color correction at the end of the job? It turns out the answers are quite simple.

My personal holy grail for camera matching comes in the form of the DSC OneShot chart. I make absolutely sure to bring it with me on multicam shoots, and I allot a few extra minutes to white balance and to shoot the chart. The back side of the chart is split between white and gray. Frankly, I wish they had chosen either white or gray rather than splitting the two, since it’s hard to fill your camera’s frame with the space given to each when performing an auto white balance.

The OneShot chart was developed by DSC and Art Adams. Art has a blog post that fully explains the chart here. I won’t go into too much techno talk about it, but it’s a brilliant chart with all the basics you need for proper luma and chroma balancing: true black, white, gray, skin tones, plus primary and secondary broadcast colors.

Before I got the chart, I would balance the shots manually using a waveform monitor and vectorscope. The great news is that Resolve 11 can actually understand the color information on the chart using the built-in Color Match feature. In the video below, I show you how I send the shots from FCP X to Resolve, match them, and then send the LUTs back to FCP X.

To summarize, once each camera is balanced in Resolve, I export a 3D LUT of each camera. I name the LUT based on the camera and name of the shoot. Of course, you could add any additional info that you deem necessary, such as scene number.

The next trick is getting all this back into FCP X. I purchased a great plugin from Denver Riddle’s Color Grading Central called LUTutility. This program can actually read the LUTs you export from Resolve and attach them to your shots inside of FCP X.

All you need to do is drag and drop the .cube files into LUTutility’s preference pane, located in System Preferences. The LUTs are then installed and accessible inside of FCP X.

Inside FCP X, it’s simply a matter of adding the LUTutility effect to the footage and choosing the correct LUT for your camera from the pulldown menu in the inspector. The beauty of this workflow is that all you need to do is import the shots of the DSC OneShot chart into Resolve; there’s no need to render anything out of Resolve. All the color grading info is stored in the LUTs and read by the LUTutility effect inside FCP X. I’ll then apply the appropriate LUT to all shots within a scene and perform the final grade inside of FCP X.
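
If you’re curious what’s actually inside one of those .cube files, here’s a rough sketch in Python of how a 3D LUT can be read and applied to a single pixel. This is not how LUTutility or Resolve actually implement it (a real plugin interpolates between LUT entries rather than snapping to the nearest one), and the filename is just a stand-in for whatever you exported, but it shows the idea: the LUT is simply a big lookup table mapping input RGB values to corrected ones.

    # Minimal sketch: read a Resolve-exported .cube 3D LUT and apply it to one pixel.
    # Real plugins interpolate between entries; this snaps to the nearest one just to
    # show what the file contains. The filename below is hypothetical.
    import numpy as np

    def load_cube(path):
        size, rows = None, []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith('#') or line.startswith('TITLE'):
                    continue
                if line.startswith('LUT_3D_SIZE'):
                    size = int(line.split()[1])
                elif line[0].isdigit() or line[0] == '-':
                    rows.append([float(v) for v in line.split()])
        # .cube data is ordered with red varying fastest, then green, then blue
        return np.array(rows).reshape(size, size, size, 3), size

    def apply_lut(rgb, lut, size):
        # rgb is an (r, g, b) tuple of floats in the 0..1 range
        r, g, b = (min(int(round(c * (size - 1))), size - 1) for c in rgb)
        return lut[b, g, r]  # indexed blue-major because red varies fastest in the file

    lut, size = load_cube('GH3_InterviewShoot.cube')  # hypothetical LUT exported from Resolve
    print(apply_lut((0.5, 0.4, 0.35), lut, size))     # roughly a skin-tone-ish patch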

One side note to all of this: after I apply the LUTs, I’ll usually add some minor color correction tweaks, as nothing is ever 100% perfect. But even with those minor tweaks, this process takes so much of the work out of balancing different cameras, especially DSLRs where the look is baked in.

As long as the cameras are white balanced off the same source and are generally shot at the same ISO, the tweaks are very minor compared to having to entirely match by eye. Frankly, I find it amazing that this is now all possible. It speaks to the exciting development that is going on in the FCP X ecosystem.

I hope this tip helps. Now go shoot and edit something awesome!

ABOUT

Michael Garber

Guest Blogger Michael Garber from Garbershop.

Michael Garber is a post production professional with over 14 years of experience. He started his company, 5th Wall, in 2004 and has worked with clients such as Discovery Agency, Huell Howser Productions, Automat Pictures, FuelTV, PBS and more. In addition to editorial work, Michael produces corporate documentaries for a Fortune 500 company. When not editing or shooting, Michael is more than likely talking about editing and shooting on his blog, GARBERSHOP.

About 4K Monitoring

August 19, 2014

There’s a lot of talk and a whole lot of hype when it comes to 4K, and I’m certainly guilty of contributing to it. However, most people know very little about 4K and are pretty intimidated by the subject. Here are some quick hits when it comes to working with it.

First thing you need to know is that there are two flavors of 4K delivery resolutions (there’s some quick math right after these two bullets):
4K UHD – This is the spec for 4K broadcast and the home. Resolution is 3840×2160, a 16:9 aspect ratio (1.78:1), which is just double standard HD (1920×1080) in each dimension. Most 4K displays and televisions will be 4K UHD.
4K DCI – This is the cinema spec. Resolution is 4096×2160, an aspect ratio of roughly 1.9:1. Like 4K UHD, it’s just double the standard 2K spec (2048×1080) in each dimension. You’ll only really see the 4K DCI spec in play if you’re watching a movie in a theater.
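
If the numbers above feel abstract, the quick math below (a throwaway Python snippet, nothing more) shows why “double the resolution” really means four times the pixels, and where the DCI aspect ratio lands.

    # Quick math behind the two 4K specs, using the dimensions listed above.
    formats = {
        'HD 1080': (1920, 1080),
        '4K UHD':  (3840, 2160),
        '2K DCI':  (2048, 1080),
        '4K DCI':  (4096, 2160),
    }
    for name, (w, h) in formats.items():
        print(f'{name:8} {w}x{h}  aspect {w / h:.2f}:1  {w * h / 1e6:.1f} Mpx')
    # 4K UHD is twice HD in each dimension, so four times the pixel count;
    # 4K DCI works out to roughly 1.90:1, a bit wider than the 1.85:1 "flat" ratio.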

From a traditional viewing distance, 4K only really becomes noticeable once the screen hits 84 inches. However, once you hit that size, and if shot and projected properly, the results are pretty stunning.

As of this writing, most of the 4K TVs being sold are not worth buying. Either the panels are really cheap and the image quality is not great, or the price just isn’t worth it. If you need something that can monitor at 4K and you’re working on a budget, get one of the cheaper panels and use it only to judge sharpness and resolution, not color and contrast. In many ways, what’s happening now is like what happened when HD first appeared: sets were really expensive and only for high-end pros or people with money to blow. Wait a while and you’ll start to see more affordable options appear.

Lastly, you don’t need to monitor at 4K while you’re doing color correction. I’d recommend using an HD broadcast monitor while doing color (with your video I/O set to 1080). An affordable 4K grading monitor is pretty much impossible to find right now, and grading at 4K won’t make any difference to your color decisions; color correcting your scaled-down 4K images at 1080 is still the way to go. At the moment, I think the only useful thing a 4K monitor is really good for is checking the overall sharpness of your image at 4K resolution when you’re mastering. Everything else is not ready for prime time yet… at least in my opinion.

The Easiest Way to do a 4K Screening

August 18, 2014

Sam here… So… Believe it or not, it’s actually easier for the average person, given access to the right projector, to put on a higher-resolution screening than what they typically see when they go out to the theater.

When you go watch a movie at the typical multiplex, you’re almost universally watching a movie that was made from a 2K master… even if the projector is 4K, the movie itself was up-rezzed from a 2K file to fill the screen.

The main reason for this is that Hollywood hasn’t really figured out the whole 4K pipeline thing… especially on the VFX side. It’s far simpler and more practical for them to finish in 2K.

What this means is that if you have a Dragon, Epic, GH4, or 4K BMCC, it’s a pretty straightforward process for you to shoot, finish, and screen at a much higher level than the big guys do… especially if your VFX pipeline is simple.

In fact, if you somehow managed to have access to a nice 4K Projector with an HDMI port on it, you can put on a higher quality screening in your living room than you’ll currently see in the multiplex.

Why? Well, both the Mac Pro and MacBook Pro will put out a 4K signal through their HDMI ports.

Those 4K HDMI ports will also carry a 5.1 audio signal.

A 4K 5.1 screening is now a pretty straightforward process if you’ve got the right home theater and you know how to plug in an HDMI cable and export a 4K ProRes.

It’s now easy to shoot 4K, post 4K, and then screen it right from your laptop.

I have no idea why film festivals make things so hard for filmmakers with their DCP, Blu-ray, or tape requirements.

Filmmakers should be able to just hand over/dropbox a QuickTime movie and get on with their lives. For some reason, everyone loves to make things complicated.

With my film collective, We Make Movies, we do our annual WMM Fest of our community’s work in LA, and we run all of the screenings (there were 5 this year) right from my laptop. In fact, every screening we’ve ever done has been run through QuickTime or Final Cut, in 1080 ProRes, playing our filmmakers’ QuickTime master files straight from a laptop. It’s just easier.

The only reason we’re not doing 4K screenings is that most filmmakers are still mastering at 1080, and 4K projectors are still way too expensive. Both of these things will be changing in the not-too-distant future.

If we had the right files and the right gear, though, our process would still not change at all. ProRes is still ProRes, and we’re still just playing it out of an HDMI port to a projector.

Our screenings look better, sound better, and leave almost no room for technical issues because we do things this way. We work from the masters and leave as few things to chance as humanly possible. As long as the projector is calibrated, we’re good to go.

And while I explained in our blog here that it’s a lot easier for filmmakers to make DCPs these days… it’s still a very difficult format for the average person to implement on their own, and screening one for an audience is far from a user-friendly experience.

Both the DCP and Blu-ray formats were designed from day one to be difficult to create and hard to pirate. Essentially, as most high-end technologies typically are, they were designed to keep people from understanding them, to keep them proprietary, and to maintain established business models… in this case, preserving the studio multiplex and home digital distribution businesses.

Fortunately, there’s a pretty easy way around all of this nonsense… which is good news for the independent filmmaker who isn’t tethered to this process and can figure out how to make and distribute their own content.

Right now, I look at DCPs as a necessary evil, but the truth of the matter is that the safest and easiest way to screen a movie for an audience is to just run your QuickTime master out of the HDMI port on your Mac.

Why do people feel the need to make things so hard?

Why I started to use FCPX

August 15, 2014

Sam here… so, over the years, I’ve gotten a lot of raised eyebrows when I run into people I used to work with, or editors and people outside my circle, and tell them I cut everything I do with FCPX and that it’s the best thing out there. Usually, I get back some garbled version of “really? I heard it sucked…” or “I tried it a long time ago and couldn’t get into it…”

We then have a 10 minute conversation about why they switched to Premiere and why I didn’t… and who, in fact, the crazy person really is in this equation.

And when I look back and really think about why I switched to FCPX… I realize that my circumstances were different from pretty much anyone else’s when it came to switching, so it shouldn’t be surprising that my viewpoint on the program is much different from everyone else’s.

Long story short… I downloaded the program day one like everyone else. There were things I liked, and a lot of things I didn’t. Unlike most, I kept playing with it, and cutting small projects, trying to figure out why Apple had done what they had done… and if, in fact, there was something I wasn’t getting with all of this. I was doing all of this on my off days while I worked at my regular freelance gig still using FCP7 and being pretty content with that workflow.

Somewhere along the way, I got invited to come out and work with the Final Cut team and got to ask some of my questions in person… and I got some answers… when I was finished, I came back to LA, and my perspective had changed a bit. I’d been shown a different way of looking at editing, and sort of realized I couldn’t go back to what I was doing and still be happy with that. I had found I liked editing again (I’d become a bit of a robot with FCP7)… and for the first time in a long time, I felt like there was something new and interesting for me to explore.

So… I sort of made the decision that I was just going to run with FCPX, start my own post house, not tell my clients I was cutting with X (I’d just say Final Cut and let them assume I meant FCP7), and see just how far I could get with what I was doing before I ran out of money.

I haven’t run out of money yet.

In fact, I made more. You see, I was still charging what I would normally charge, but I was able to deliver in half the time… time equaled money. So even though I lost a few customers at first, the ones I did keep I was able to take better care of.

That one decision to go out on my own led to a big old giant chain reaction in my career that is still snowballing. It’s been weird, frustrating, cool, and consistently surprising. At the end of the day, it’s been fun. I have a lot more fun than most editors I know, and a lot more control over the projects I choose to do… which is mostly all I ever cared about.

And when I compare it to cutting the same old piece every single day at my old freelance job in the same tired workflow… well, there really is no comparison. You literally couldn’t pay me to go back to that. People have tried.

So what’s the lesson here? The person who was bored with editing at his cushy freelance gig (me before FCPX) had stopped learning and had stopped getting better. I was starting to become less curious, and editing itself had become just a transaction I would do for money. And when that happens, when you stop caring about what you do, and you stop learning, it makes you more likely to want to preserve the status quo and keep collecting checks. Change becomes threatening and learning becomes difficult. Your job becomes less about doing something cool, and it becomes more about protecting your territory from outsiders. It becomes easy to dismiss new ways of working. Eventually, you become the flatbed film editor who wakes up one day to realize their gigs are gone and everyone is editing on video. You blame the world and get really angry and bitter. No one cares that you are angry and bitter. You get more angry and bitter.

If I had stayed that way, I’d be well on my way to being one of those crusty old editors who love to tell everyone else how dumb and unprofessional their workflow is. “Get off my lawn!”

The truth is that you don’t know what you don’t know. I got lucky enough to have some people show me, and it changed the way I looked at what I do. It’s made me a better and more efficient editor, and it has prepared me for the next ten years in this business in a way that many people can’t even see.

At the end of the day, it makes no difference to me what editing platform you cut with. You should use what works for you… but as an editor, it’s part of your job to know enough to know the difference between the different tools, and to continue to adapt to the changing world around you.

I guess my only piece of advice might be that, before you go ahead and dismiss a different idea entirely, decide for yourself, and be willing to occasionally go down the rabbit hole. Don’t stop being curious. Sometimes going down the rabbit hole can change your perspective on things completely. It did for me. It’s why I’m only cutting with FCPX now.

I’m always looking for the next rabbit hole, though.

RED Raw For Your Colorist

August 14, 2014

Sam here… we’re going to talk RED RAW today, because there’s no reason for this to be so hard and complicated. Mostly, this is a public service to DPs everywhere, many of whom seem to be confused by how all of this works. It’s been my experience that a lot of DPs try to capture their LOOK on set… even though they’re shooting RAW. Mostly, this is because of a fear (often justified) that post will screw it up later if they don’t lock in their look now. Unfortunately, this approach runs counter to how the camera is designed to work, and doing things this way will often lead to a lot of finger pointing, anger, and inflated post budgets once the film hits the finishing stage.

The bottom line is that if you’ve ever heard your DP say the following words… show them this post:

“The RED is a noisy camera… I’ve always got to add noise reduction to my footage in post. Also… I always like to save LUTs and looks when I shoot RED.”

With the RED, LUTs are stupid. Sorry. Someone needs to say it. You’re just going to go back to REDlogfilm when you hit the finish line anyway… or you should be using the controls in REDCINE-X to manipulate the RAW the way you want it AFTER YOU’VE SHOT IT.

If exposed and lit correctly, you should NEVER want or need a LUT when you hit the color room. Use the standard REDcolor/REDgamma settings as a baseline while you’re shooting, and then tweak later in REDCINE-X. When it comes to RED, probably the worst thing you can do is try to dial in your look while you’re on set. It defeats the whole purpose of shooting RAW.

The truth is that shooting RAW is not a cure-all. While it provides greater flexibility than traditional codecs, you need to do certain things correctly and understand a couple of things in order to get good results.

Fortunately, there isn’t all that much that you need to know.  In fact, if you do the following, you’re pretty much guaranteed good results with your Scarlet/Epic/Dragon:

  1. Shoot at 800 ISO – The RED sensor is rated to be shot at this ISO. Start here while on set. While you can shoot at other ISOs, you shouldn’t unless you absolutely have to. Play with that stuff later in REDCINE-X. Shoot and light it for 800.
  2. Don’t clip – Look at your histogram. Make sure everything you’re shooting is between the “goal posts”. If it’s not… do a better job with your lighting, or accept certain realities in post. Also, keep in mind you always have HDRX available to you in extreme cases.
  3. Expose your skin tones correctly – For the love of God, don’t underexpose your skin tones. Seriously… just don’t. It’s the number one reason people end up unhappy with their RED footage and why things turn out noisy: they find they want to brighten up their skin tones in the color room. To make sure your skin tones are exposed properly, use the false color mode and make sure your skin tones read “pink”. If they do, you’re good to go. You can always make things darker later… rarely, however, can you make things brighter without introducing unwanted noise. Even if you want things “moody”, EXPOSE YOUR SKIN TONES PROPERLY.
  4. The smaller your resolution, the grainier your footage – Basically, if you shoot with the Dragon/Epic at 4K, 3K, or 2K… you’re using less and less of your sensor, and less and less information is being captured (there’s a rough sketch of the math right after this list). Many complain that their 2K stuff looks worse than their 4K and 5K stuff… that’s because it does. You’re only using part of your sensor, and depending on your compression rate, you may start to see a lot of problems, noise, and grain introduced… especially when you shoot at 2K.
  5. Lower your compression ratio if you’re going to reframe – For the same reasons discussed in #4, the higher the compression ratio you shoot at, the more noise you’re going to see from your punch-ins in post. Once you get past 7:1 compression or so, expect the quality of your punch-ins to decrease and become far more noticeable. While there’s no reason to shoot RED RAW uncompressed (there’s not even really a good reason to go below 5:1), keep in mind that the higher you go, the more noise will be introduced, and this noise will be compounded when you reframe/punch in during the edit. Even though you shot at 4K, it doesn’t necessarily mean that all punch-ins are created equal when you come down to 1080.
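
To put some rough numbers on point #4, here’s a quick back-of-the-envelope sketch. The 6144×3160 figure is an assumption on my part (approximately the full Dragon sensor), and exact photosite counts vary by camera, so treat this as illustration, not spec.

    # Rough illustration of point #4: windowed capture uses only part of the sensor.
    # Full-sensor dimensions below are assumed (approximate Dragon 6K values).
    full_w, full_h = 6144, 3160
    for label, w, h in [('5K', 5120, 2700), ('4K', 4096, 2160),
                        ('3K', 3072, 1620), ('2K', 2048, 1080)]:
        area_pct = 100 * (w * h) / (full_w * full_h)
        print(f'{label}: {w}x{h} window = {area_pct:.0f}% of the sensor area')
    # 2K ends up using roughly a tenth of the sensor area, which is why it looks
    # rougher than 4K or 5K material downscaled to the same delivery size.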

Seriously, those five things are all you really need to know in order to make you and your colorist happy when you reach the finish line.  Why people make this so hard, I’ll never understand.

For more info on some of the RED exposure tools and how all this works, read these articles:
http://www.red.com/learn/red-101/exposure-with-red-cameras
http://www.red.com/learn/red-101/red-camera-exposure-tools
http://www.red.com/learn/red-101/exposure-false-color-zebra-tools

FCPWORKS’ Noah Kadner on FCVUG

August 13, 2014

FCPWORKS’ Noah Kadner will be doing a round table discussion this Thursday evening on the Final Cut Virtual Users Group. Be sure to tune in, as they’ll be answering your questions live. Fellow FCPX whiz kids will include Mike Matzdorff, Chris Fenwick, Mark Spencer, and Steve Martin. Tune in live at 6:00 PM Pacific on Thursday, August 14th, 2014 at http://www.hazu.io/pixelcorps/fcvug-2

No More Pro Video Support

August 13, 2014

While, selfishly, I’m kind of okay with this because it’s good for my business:

http://alex4d.com/notes/item/apple-discontinues-applecare-provideo-support

It’s still a bit of a bummer… because what it really means for your average person is that they’ll now think the only option for real, immediate, pro A/V support with FCPX is to go to the Apple Store and talk to an Apple Genius… and I think we all know how that is going to go for them.

*** Shameless self promotion… if you need help with FCPX and the related ecosystem, FCPWORKS is a way better option for you than the Apple store ***

Sales pitch over.

Anyway, what this news also means is that it’s just more ammunition for Apple critics who want to rail about how Apple has abandoned the pro video market… and I was really hoping I was done with that debate.

Here’s my point of view on it, though, for what it’s worth.  I don’t think this is a sign that Apple isn’t committed to the pro market.  I think it’s more a sign that they have bigger problems to solve, and that maybe the support infrastructure itself has changed a bit from when Apple’s Pro Video Support was necessary.

The truth is that most people find a lot of the answers they need through Google, blogs, and videos nowadays. I know I do.

And for more specialized, higher end cases, they probably weren’t going to be Apple Support clients anyway… those guys are all going to third party specialized consultants who are doing the real work day in and day out.

I think what this move symbolizes more is probably the fact that Apple woke up one day and realized that a $799/year support service wasn’t really in line at all with what their average customer needed from them… so, I think that’s why they killed it.

I mean, honestly… would you/were you paying $799 a year (more than twice as much as it costs for a license of FCPX) for Apple’s Pro Video Support?

I know I wasn’t.

To be honest, until I read Alex’s post… I didn’t even know this was still around… which probably says more about why they got rid of it than anything.

However, if you do find that you need this kind of support… well, we do that here at FCPWORKS, and we’ll have you covered.  Feel free to reach out to workflow@fcpworks.com with any questions.

Why Aren’t You Using Motion?

August 12, 2014

So… the title of this blog is a bit of a trick question. If you’re using any of the built-in FCPX effects, titles, or generators, or you’re using most of the 3rd party effects or templates, you’re actually using Motion without even being aware of it.

However, because of the way FCPX now integrates with Motion and the loss of the “Send To Motion” command, I tend to feel like Motion has become the forgotten app in the Apple Pro Apps ecosystem. I sort of feel bad about this, because Motion is actually awesome, especially if you’re an editor like me who has no interest in becoming an After Effects/Nuke genius.

No one has the time to know and be good at everything. I always gravitated towards the edit and color correction ends of the business. While I had an understanding of and interest in essential GFX and audio techniques, if I couldn’t get something done quickly on that end, it just wasn’t going to happen… and I never had the time or natural inclination to become an After Effects or Pro Tools master. For whatever reason, those apps just never made sense to my brain the way Final Cut, Color, and Resolve did. I liked to stay in-app and as integrated as possible when it came to GFX and audio, which is why I bothered to learn Motion and Soundtrack Pro back in the FCP7 days.

The truth is that, even though GFX may not be your thing, especially if you’re a one-man band, your clients are still going to expect you to be able to do high-quality lower thirds, titles, and other common GFX tasks… especially for lower-budget corporate, commercial, and internet projects. Just about any YouTube video you make for a paying client is going to require you to know how to make some kind of end tag for it. Many of these tasks end up being repetitive, and in most cases they’re best suited to a template you can work with quickly right in your NLE.

This is where, as an editor, getting to know Motion was my best friend.

The reason is that things you make in Motion show up automatically as titles, generators, or effects in FCPX, and understanding the rigging and publishing concepts inherent in Motion can save you RIDICULOUS amounts of time in your edits. It’s a bit of a different way of working, which may be why a lot of people haven’t gravitated to it… but for 85% of the common tasks asked of an editor, working between FCPX and Motion will save you a ton of time, and often produce a higher-quality result because of the time saved and the simplicity of the workflow, compared to banging your head against the wall with After Effects renders.

Also, Motion becomes a lot more powerful the deeper you get into it.

The good news is that Mark Spencer from Ripple Training has created some of the best tutorials for Motion that I think anyone has made for any kind of app. More specifically, his tutorials on Rigging and Publishing for Final Cut Pro X, Mastering Replicators, Mastering The Camera, and my personal favorite, Mastering Shapes, Paint Strokes & Masks, are just awesome… especially for the average editor who just wants to be a solid B when it comes to GFX and wants to know how to make something that looks professional quickly.

Anyway, if you want to get up to speed quickly on Motion, I can’t recommend Mark’s Motion 5: The Complete Series (which has all the tutorials I just mentioned) highly enough.

And if you’re looking for some great free Motion tutorials, you need to stop what you’re doing and check out Simon Ubsdell’s Youtube Channel.

Seriously, even though no one ever talks about it… for a lot of people, Motion is really worth learning.

FCPX SAN Workflow Do’s and Don’ts

August 11, 2014

Hey guys,

Sam here again… so, contrary to what a lot of people believe, SAN workflow in FCPX is actually very simple and straightforward, especially with the newly released version (10.1.2). While the Avid Unity workflow is still a little more robust with its bin-locking features, working from a SAN/network in FCPX is still very practical. Also, in terms of price vs. performance, I’d say FCPX is the way to go, as Unity setups tend to be slow and extremely overpriced. The truth is that you can have multiple editors easily accessing the same media and passing projects to each other seamlessly. All you really need to know are a few things:

Do’s:

Keep all your media centralized on the SAN – For me, best practice is to put your original camera negatives into a “Media” folder on your SAN. Within that, create a new folder named for your project, and then a “media” folder within that, and place your camera/sound originals in there in their original directory structure.

Keep your media outside of the library – When you make a new library, first make sure that “leave files in place” is selected in your preferences. Then select your library, go to “modify settings” in the inspector, and set your media to be imported to a folder that is outside your library (this can be on the SAN). When you import media now, it will be added to this new folder, but as “sym links” (similar to aliases) that point back to the original media also living on the SAN (see above). This will come in handy for archiving and media management later (see below for why).
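
If you want to see what those sym links actually are, here’s a tiny sketch you can run in a scratch folder. The folder and file names are stand-ins, and FCPX creates the real links itself on import; this just mimics the behavior so you can poke at it.

    # Sketch of what a "leave files in place" import leaves behind: the entry in the
    # designated media folder is just a symlink pointing back at the camera original.
    # All names here are stand-ins for illustration.
    import os

    os.makedirs('camera_originals', exist_ok=True)   # stand-in for the SAN media folder
    os.makedirs('fcpx_media', exist_ok=True)         # stand-in for the library's media folder

    original = 'camera_originals/A001_clip.mov'
    open(original, 'wb').close()                     # empty stand-in "clip"

    imported = 'fcpx_media/A001_clip.mov'
    os.symlink(os.path.abspath(original), imported)  # roughly what the import step does

    print(os.path.islink(imported))    # True - it's a pointer, not a copy
    print(os.path.realpath(imported))  # resolves back to the camera original
    print(os.lstat(imported).st_size)  # the link itself is only a few bytes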

Keep your Libraries on an internal drive – If you’ve worked off a network in FCPX and experienced slow-opening libraries and generally slow performance, it’s probably because you have your libraries on the SAN itself. The reason is that SANs are designed to handle large chunks of media, not the small database files that FCPX creates. If you’re on a network, run a test: take a large library that has its media outside the bundle (so it should be a lightweight file) and copy it from the SAN to your desktop. You’ll notice that the copy time is probably far longer than it should be. Now copy a regular media file over. If you’re on a decently fast network, this copy should be MUCH faster than the library copy was, even though the library was much smaller in size. That illustrates the issue. For this reason, best practice is to keep your database on local storage (especially if you have a new Mac Pro, which has an extremely fast internal SSD) or an external hard drive. You’ll see a significant increase in speed and application startup times doing things this way.
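
If you want to reproduce that test in a more controlled way, the sketch below times copying a pile of small files (which is effectively what a library database is) against one media-sized file of the same total size. The SAN path is a placeholder; point it at your own network volume, and expect your exact numbers to vary with your setup.

    # Rough reproduction of the copy test above: many small files vs. one big file.
    # SAN_PATH is a placeholder - point it at a folder on your network volume.
    import os, time, shutil

    SAN_PATH = '/Volumes/SAN/speed_test'
    os.makedirs(f'{SAN_PATH}/many_small', exist_ok=True)

    for i in range(2000):                               # ~2000 x 0.5 MB = ~1 GB of small files
        with open(f'{SAN_PATH}/many_small/f{i:04}.db', 'wb') as f:
            f.write(os.urandom(512 * 1024))
    with open(f'{SAN_PATH}/one_big.mov', 'wb') as f:    # one 1 GB file, written in 1 MB chunks
        for _ in range(1024):
            f.write(os.urandom(1024 * 1024))

    t0 = time.time()
    shutil.copytree(f'{SAN_PATH}/many_small', 'local_small')
    t1 = time.time()
    shutil.copy(f'{SAN_PATH}/one_big.mov', 'local_big.mov')
    t2 = time.time()
    print(f'many small files: {t1 - t0:.1f}s   one big file: {t2 - t1:.1f}s')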

Make sure all Editors’ Libraries are pointing to the same place – The best way to do this is to make a master editorial library for your primary editor using the import steps described above, and then duplicate that library and hand it off to each new editor. If you’ve kept your media outside the bundle, this ends up very similar to the standard FCP7 approach most of you are used to: you’re essentially passing each editor a duplicated “project file” whose “capture scratch” is set to the same place for easy reconnecting later. This way, any time an editor imports media, you’re guaranteed that it’s going to the right place in the database, and that their database matches what your other editors are working from.

Use Xfer Libraries and keep those on the SAN/Network – Because two editors can’t work from the same library at the same time, you should have a centralized Xfer library that editors open and close when they want to pass new edits to each other (this can also live in a Dropbox/Google Drive). If an editor needs to pass media or an edit to another editor, they should first consolidate their library to the network to make sure all the media they’re referencing lives in the correct place (see below for why), and THEN pass their project(s)/event(s) into the Xfer library to distribute to other editors. They should then close the Xfer library so others can access it.

Use consolidate commands and hard links to seamlessly and non-destructively consolidate media in FCPX – OK, so here’s something really cool that not a lot of people are aware of. If you are using the FCPX media structure in the correct way, then because of the way FCPX takes advantage of hard links (for the record, I have no idea what the real definition of these is… just what they do), you can have multiple copies of the same file on a drive/SAN/network, and those files will only take up the space of a single copy. To illustrate, here’s a simple test you can run:

  1. Create a new library set up the way I described above, and make sure the media folder for your library is set to the same drive your camera originals/files you want to import are stored on.
  2. Check how much free space is left on the drive. Write this number down.
  3. Take a relatively large file (5+ GB minimum), import it into the library, and confirm that sym links show up in your original media folder.
  4. Now use the consolidate media command (new in 10.1.2) to have the media you just imported copied over to the new library.
  5. Ctrl-click the file, select “Reveal in Finder”, then look in the original media folder and confirm that the file you imported no longer has an arrow next to it (meaning that this file is now an actual copy and no longer a sym link).
  6. Check the free space on the drive/SAN/network again. It should still be identical to the number that you wrote down.
  7. This means that you have two “copies” of the file on the same drive, but you are only losing space for one of them, because of hard links.

The reason is that, because it’s using hard links for your media, FCPX keeps track of the files you have on a drive; if you use the FCPX commands to manage your media, you can have files living in more than one place on your drive without being penalized in terms of disk space.

What this means in terms of archiving and importing new media is that if all your editors have their libraries pointing to the same media folder, and you are using the consolidate media commands correctly, you now have the best of both worlds: you can copy your media onto the network and have that directory structure remain untouched by FCPX, but you can also ensure that every editor has access to the same media, because everyone is consolidating to the same place when they import new things, and you never lose disk space because of it. Not only that, but when you’re done, everything is simple to archive, because everything you need for a project lives within a single directory that you can archive whenever you need to, and you can be sure you’re not missing anything. Hard links are great.
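
If you want to see the hard-link behavior outside of FCPX entirely, here’s a small self-contained sketch. The file names are stand-ins; the point is that the second “copy” is just another directory entry for the same data, so free space doesn’t move.

    # Quick demo of hard links: two names, one set of data blocks on disk.
    # File and folder names are stand-ins for illustration.
    import os, shutil

    os.makedirs('camera_originals', exist_ok=True)
    os.makedirs('library_media', exist_ok=True)

    src = 'camera_originals/A001_clip.mov'
    with open(src, 'wb') as f:
        f.write(os.urandom(50 * 1024 * 1024))        # stand-in 50 MB "clip"

    free_before = shutil.disk_usage('.').free
    os.link(src, 'library_media/A001_clip.mov')      # hard link: new name, same data
    free_after = shutil.disk_usage('.').free

    print(os.stat(src).st_ino == os.stat('library_media/A001_clip.mov').st_ino)  # True - same inode
    print(os.stat(src).st_nlink)                     # 2 - two names for one file
    print(free_before - free_after)                  # ~0 - no extra space consumed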

So, to recap… here are some don’ts:

  1. Don’t keep your media inside the library on the network. This will make your libraries far less portable.
  2. Don’t let your editors have their library Media folders pointing to different places.
  3. Don’t keep your Library Media folders on different drives/networks/SANs than your original negatives if you can avoid it.
  4. Don’t keep your libraries on the SAN/Network – except for a single Xfer Library for people to pass edits and events to and from (but even then, you should probably put this Xfer library in a dropbox).
  5. Don’t try a group workflow on a large project without first running tests on your network (especially disk permissions, which can do all kinds of weird things to you).
  6. Don’t let your editors start cutting without explaining the workflow to them ahead of time and making sure they understand why they’re doing what they’re doing.

Also, for some more information about how media management works in FCPX, check out this awesome video by Dustin Hoye:

For an expanded understanding of working with FCPX on a SAN, check out this great resource posted recently on fcp.co:

http://www.fcp.co/final-cut-pro/articles/1467-free-pdf-fcpx-in-a-shared-environment-updated-for-10-1-2

DCP through FCPX/Compressor

August 8, 2014

Hey guys,

Sam here… some of you know about this and some of you don’t, but you can actually make and view your own DCPs using Compressor (or right from FCPX), and it’s ridiculously easy.

We demoed this for folks who visited the FCPWORKS suite at NAB, and I even had one of my We Make Movies shorts (Agnes) screen at the NAB StudioXperience 4K Filmmakers Showcase. The only reason I was able to get them what they needed (a 4K DCP) was the Wraptor plugin/DCP Player combo. Given my timeline and how quickly I needed to turn it around, I just wouldn’t have bothered with the other solutions due to their complexity and the inability to easily check/preview the DCP on my Mac. Honestly, the workflow for this is so easy, I kind of felt like I was cheating or something. In my mind, DCP creation was supposed to be hard. That’s no longer the case. Thanks, Quvis.

Anyway, in order to make a DCP through Quvis Wraptor in Compressor, here’s what you need to do:

  • Buy the Wraptor 3.1 for Apple Compressor ($699)… you can also try the watermarked version for free.
  • Buy the DCP Player ($699 to own) or rent it ($60 for 30 days, $360 for the year)
  • Download and install the plugin in Compressor
  • Export a master file of your movie (ProRes 4444 XQ, 4444, or 422 HQ are your best bets), with your audio channels laid out according to your DCP requirements
  • Drag the file into Compressor
  • Apply the Wraptor plugin, configure for resolution (2k or 4k), frame rate, and number of audio channels
  • Set your destination
  • Export
  • Check it using the DCP Player Software
  • Bring it to the theater or upload to a server

You can also set up a custom Compressor setting that you can use right in FCPX from your timeline.

When it’s done exporting, you’ll have a DCP folder that you can preview right on your Mac using the DCP Player software. It’s going to automatically interpret the color space of your DCP file to display on your Mac pretty much the way you’ll see it in the theater.
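
If you want a quick sanity check before you walk the drive into a theater, a finished DCP folder should contain an asset map, a volume index, a CPL and a PKL (both XML), and the picture/sound essence as MXF files. Exact file naming varies by encoder, and the folder path below is hypothetical, so treat this as a rough sketch rather than a validator.

    # Rough sanity check of a DCP folder's contents. Folder path is hypothetical,
    # and naming conventions vary by encoder - this is a sketch, not a validator.
    import os

    dcp = '/Volumes/Delivery/AGNES_4K_DCP'
    names = os.listdir(dcp)

    has_assetmap = any(n.startswith('ASSETMAP') for n in names)
    has_volindex = any(n.startswith('VOLINDEX') for n in names)
    has_cpl = any(n.lower().endswith('.xml') and 'cpl' in n.lower() for n in names)
    has_pkl = any(n.lower().endswith('.xml') and 'pkl' in n.lower() for n in names)
    mxf_count = sum(n.lower().endswith('.mxf') for n in names)

    print(f'ASSETMAP: {has_assetmap}  VOLINDEX: {has_volindex}  '
          f'CPL: {has_cpl}  PKL: {has_pkl}  MXF files: {mxf_count}')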

In terms of quality, there’s no difference between what we were able to see on the Quvis DCP in the theater vs. the very same file encoded by the Studio’s post house.

On a new Mac Pro, with the recent Quvis 3.1 upgrade, you should see near-real-time encoding for 2K DCPs (4K will take longer).

The main difference between what Quvis does and the free OpenDCP software is ease of use, render time, and a better signal-to-noise ratio in the DCPs you’re generating. The bottom line is that if you find yourself needing to deliver DCPs regularly, the Wraptor/DCP Player combo gives you the best bang for your buck.

One small thing to note: encrypted DCPs are not supported yet… so if you find that you need that, you’ll need additional 3rd party software to encrypt the DCP.

Anyway, for you FCPWORKS customers out there, if you find yourself running into issues, hit us up at workflow@fcpworks.com and we’ll help you out.