Cineflare Kinetic Badges from FX Factory

August 29, 2014

Sam here…

It’s becoming increasingly obvious to me that we’re entering a “do-it-all-in-one-app” world for most things.

Bouncing out to After Effects or Motion, or having a dedicated GFX person who “handles all that stuff,” is becoming far less common in how Producers budget for jobs.

Turnaround times for videos are faster, and editors with “Jack of All Trades” skill sets are becoming almost mandatory (please don’t get upset with me about that… I’m just the messenger).

Clients asking “Can’t you just do it yourself?” is becoming the norm.

And even if they’re not asking that, being able to turn around GFX quickly and having them still look professional is a HUGE value add you can provide for your clients as an editor.

The main problem is that most people don’t have time to learn everything. When it comes to things like motion GFX, especially for corporate/branded work where you’re expected to do everything yourself, having some great-looking templates and design elements in your arsenal becomes the difference between a profitable job and one that’s too time-consuming to be worth it.

For FCPX users, plugins like XEffects Snapshots, Cineflare’s Kinetic Badges, and Ripple Callouts (recently updated) are part of the solution.

I’ve actually done a video about Ripple Callouts in its previous incarnation, but it just got a great new update (free), and the video on the link above will tell you more about what can be done with it than I can. In terms of creating quick, professional looking “callouts” for things going on within a frame, there simply isn’t a better plugin package on the market.

If you need something that’s going to add a bunch of style to your typical freeze frame, you’re going to want to look at Snapshots… which is a series of freeze-frame templates (transitions for these are included as well). These plugins are a great way to impress a client or create a package around their branding with minimal effort… or at least far less effort than everyone else is putting in.

And when you use these in conjunction with something like Cineflare’s Kinetic Badges, which is a series of animated and highly customizable vector graphics, chart-type elements, and textures, corporate and branded work becomes far simpler, less tedious, and considerably better looking.

Quick note about all of these packages, and really any package you work with in FX Factory… don’t watch the “demo videos” to learn how to use the plugins. Watch the “(plugin package) in Final Cut Pro X” video that is next to the “demo video” on the product’s info page to get a more in depth understanding of what to do with the plugins.

All of these packages are available through FxFactory and are only available for use in FCPX.

Review disclaimer – Yes, we do sometimes get free products and licenses. No, this does not affect our reviews. We only advocate and sell the products that we use in our own workflows. If we bother to review something, it’s because we use it in our day to day and like it. We also very much admit that we haven’t seen everything… if you think there’s a product out there that we should be talking about, please let us know at workflow@fcpworks.com.

Yanobox Nodes from FX Factory

August 28, 2014

Sam here…

So… if for no other reason than it’s the coolest plugin demo video ever created, go watch this one from FX Factory about the Yanobox Nodes 2 plugin:

http://fxfactory.com/info/nodes2/

To be perfectly honest, I was a bit scared of this one when I first got it. It took me a while to open it up and really dig into it… and even longer before I figured out what I’d actually want to do with it.

Honestly, I’m literally not capable of building one of the standard templates in Nodes on my own… so it’s nice to be able to start from something I could never even begin to understand how to create and then quickly start building on top of it.

Like most things… once you make the decision you’re going to dive in with it no matter what, you start figuring things out, and once I decided I was going to take the time to figure out how Nodes worked, I actually picked it up pretty easily.

Also, the templates were surprisingly responsive on both my Macbook and Mac Pro. I was able to move sliders around and see things update extremely quickly, which I have to admit was a surprise. With the level of complexity in these plugins, I was expecting to see lots of beach balls when I messed with stuff. This has not been the case with Nodes, for the most part.

For me, Nodes is most useful as a place to begin your larger template or as a quick way to throw on a complicated design element on an existing comp you’re doing. Basically, you get a sense of something you want to do, and instead of reaching into the Motion Particle Emitters tab (another underutilized resource that is a bit hard to wrap your head around), grab a Nodes template and get going. You’ll likely end up with something cool a whole lot faster than you would building something from scratch.

Yanobox Nodes

They’re also great as a ready-made, cool-looking particle element you can easily throw a blend mode on (I typically use Soft Light or Overlay for this) to give a texture some life. You can use them in a very similar way to how you’d use a light leak… and the two complement each other nicely if you need something like that.

So often, you get plugins that just replicate stuff you could probably do yourself… or feel very “stock” or hard to customize. Not Yanobox Nodes.

Things are so ridiculously customizable, it’s hard to go into too much detail. I think plugins like this are the future, because they allow you to start from a place you’d have literally no idea how to get to on your own. Like Coremelt’s Slice-X, it makes advanced effects techniques a lot more approachable for the average editor… and I think that’s something all plugin makers should aspire to.

I’ve never seen another plugin package like Nodes 2. It’s definitely not cheap ($299), but you get what you pay for… and then some.

For more information on how to get started with it, go here:

http://www.yanobox.com/NodesOnlineManual/

If you have a need to quickly step up your Motion Graphics game, this plugin should probably be at the top of your list.

Review disclaimer – Yes, we do sometimes get free products and licenses. No, this does not affect our reviews. We only advocate and sell the products that we use in our own workflows. If we bother to review something, it’s because we use it in our day to day and like it. We also very much admit that we haven’t seen everything… if you think there’s a product out there that we should be talking about, please let us know at workflow@fcpworks.com.

Tour De France Workflow

August 26, 2014

Sam here… nothing I write here is going to compare to what Peter Wiggins did over on fcp.co. Go grab a cup of coffee and read this for the next 25 minutes:

http://www.fcp.co/final-cut-pro/articles/1480-editing-the-tour-de-france-on-final-cut-pro-x

What Peter had going over there is exactly the type of setup we’ve been advocating at FCPWORKS, and not radically different from what we had going at our FCPWORKS launch event.

If his case study doesn’t blow the doors off the myth of FCPX being an unprofessional product, I really don’t know what it’s going to take.

Group Workflow (XSAN from Promise). Broadcast Workflow (Softron). Video IO (AJA). High Profile project (Tour De France). What are we missing here exactly?

I have a feeling we’re going to see a ton more stories like this coming. I’m looking forward to the day where this sort of thing isn’t raising eyebrows anymore. I feel like it shouldn’t be.

Congrats Peter!

10 Reasons to Switch to FCPX

August 25, 2014

Our colleagues over at Kingdom Creative (who were profiled recently over at FCP.CO) have come up with a great set of 10 reasons to switch to FCPX. The first five are:

  1. The Skimmer
    We take it for granted now, but I think the skimmer is the most useful tool in our toolbox. To be able to skim along the thumbnails in filmstrip view, especially on long takes, means we can very quickly drill down to the exact clip content we need. We used to wear out the spacebar and J,K,L keys doing this.
  2. Keywording
    Although it is hard to get out of the Final Cut Pro 7 way of thinking of content as living in bins, once you can see the power of keywording, it’s hard to imagine working without it. Being able to keyword individual ranges in a clip, and to apply more than one keyword, is a killer feature.
  3. Working with native formats
    No longer do we spend half our time batch encoding in MPEG Streamclip, we can now take clips from many different formats and mix and match them on the same timeline with no encoding or initial rendering. Once ingested, we can get to work straight away, which means better value for our clients.
  4. Smart Collections
    We have default smart collections to sort by camera, and also default collections that gather favourites with ‘Interview’ keywords, which means there is always a dynamic pool of content in the project as soon as any interview selects are done. That’s a huge timesaver when trying to get the initial edit underway.
  5. The Magnetic Timeline & Clip Connections
    The feature that is hardest to get used to is the magnetic timeline, yet once you have to use another track based NLE again, it suddenly becomes the best feature. Being able to move around entire segments of a cut while the remaining structure remains intact is invaluable and you can do this with reckless abandon. Having clip connections means that this all works exactly as you want it to.

For the rest of the list, check out the Kingdom Creative site at http://kingdom-creative.co.uk/10-reasons-move-final-cut-pro-x/

4K Monitoring Options: BMD or AJA

August 22, 2014

Sam here… finishing up 4K week on FCPWORKS Workflow Central with one more post.

So… BMD or AJA… the eternal debate. Right now, we’re centering this around the 4K monitoring & I/O products, AJA’s IO 4K or the BMD Ultrastudio 4K. Basically, it comes down to this… Do you use DaVinci Resolve for Color Correction? If the answer is yes, you’re going to need to go with Blackmagic. Case closed. Blackmagic Devices are the only ones that will work with Resolve.

However, if the answer to that question is no, and you’re doing your color work primarily in FCPX or another program that isn’t Resolve (Scratch, Baselight, or Smoke, to name a few), the discussion becomes a lot more complicated. Additionally, in the case of the BMD Ultrastudio 4K… it can be loud. The AJA IO 4K is quiet and considerably smaller. If you’re keeping the product in a room with a new Mac Pro as your primary computer, you really start to hear the Ultrastudio 4K when it’s on… and if you’re doing serious sound mixing, the noise makes a big difference.

Additionally, and this is a little-known fact, the AJA cards support more monitoring formats for FCPX as well. For whatever reason, the monitoring formats available in the Blackmagic preference pane (in System Preferences on the Mac) are considerably more limited than what you get when using the Ultrastudio 4K in Resolve.

HOWEVER, at the end of the day, price is also a factor and the AJA products are almost universally more expensive than their BMD counterparts. So price vs. performance is definitely a consideration. In my opinion, if you’re more interested in specific features, go AJA. If you’re budget conscious or a heavy Resolve user, go BMD.

P.S. – Little-known fact, but the HDMI out on the back of the new Mac Pro can be used as an 8-bit A/V out in FCPX, and if you’re using weird sequence settings and just need to send a 1:1 output over HDMI, it is FAR more configurable for video I/O than what’s available through your BMD or AJA device.

4K Isn’t the Problem

August 21, 2014

Sam here… Came across this article and I think it’s a symptom of a larger problem:

http://www.redsharknews.com/business/item/1831-the-peril-of-4k-our-craft-is-being-threatened

Basically, the long and short of it is that the DP is worried that higher resolutions, and the ability to alter images further down the line in post, will take control of their images away from DPs.

On the one hand, I can totally understand where he’s coming from, and he’s totally right. I’ve seen quite a few projects butchered in color correction, and I imagine it must be very difficult to go out and put your heart and soul into shooting/lighting something only to have it completely reworked in a way that’s entirely not what was imagined… and then be credited as if that was how you wanted it. That sucks.

However, this is not the fault of the resolution, RAW, or improvements in technology. The fault lies with the way that departments work together, and it’s my biggest pet peeve in the entire industry.

No one talks to each other.

Departments don’t talk about workflow before the shoot starts. Production rarely asks what post wants. Post rarely checks in with the DP or sound department after the shoot is over. VFX lives on its own island and is expected to push the “make it better” button on whatever production hands them. Everyone is just trying to get through the day, and get through the gig.

There’s no process and no blueprint. There’s no workflow.

Actually… that’s not even really true. There are too many workflows, and every department and individual has their own specific way they think things should be done and delivered to them. Rarely do these different workflows sync up across departments. Even more rarely does one department ask another how it wants to do things before production starts. Usually there’s a list of delivery requirements covering how a vendor wants things, and it’s discovered only after the critical production decisions have been made.

A few examples that illustrate this:

– An anamorphic lens is chosen because Production likes the widescreen look. Post is never consulted. However, no one in post knows how to transcode/desqueeze the anamorphic footage correctly. Footage is processed slightly warped and then edited this way. Conform becomes a nightmare. Also, it turns out the distributor needs a full-frame 1080 master (no bars on it), but in many cases the movie wasn’t framed to live in a 16:9 master. Massive pan-and-scan work needs to be done. Post budgets go up.
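To put rough numbers on that anamorphic scenario (the figures below are hypothetical, purely to illustrate the mismatch; Python is used here just as a calculator):

```python
# Hypothetical 2x anamorphic example: the sensor records a squeezed frame,
# and the desqueezed image is far wider than a 16:9 master can hold.
squeeze = 2.0
rec_w, rec_h = 2880, 2160       # recorded (squeezed) frame, made-up numbers

desq_w = int(rec_w * squeeze)   # desqueezed width: 5760
aspect = desq_w / rec_h         # ~2.67:1 after desqueeze

# A full-frame 16:9 master can only keep part of that width;
# everything outside the crop has to be pan-and-scanned away.
crop_w = rec_h * 16 // 9        # 3840 of the 5760 pixels survive
discarded = desq_w - crop_w     # 1920 pixels of picture lost per frame
```

Every one of those discarded pixels is a framing decision someone has to make, shot by shot, after the fact. That’s where the post budget goes up.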

– LUTs are created for each shot, but no discussion has been had over how these will be applied to the RAW footage when it’s time to do the conform. No one bothered to run this workflow past the editor or colorist, who have no idea how any of this was handled, and the production had a falling out with the DIT who made all of the looks. In the end, LUTs are applied incorrectly or not at all, and no one has any idea what LUT goes with what shot or how to sync all of these LUTs up in a way that isn’t ridiculously time consuming. Post budgets go up.

– RED footage is transcoded down to ProRes at a random resolution, with letterboxing baked into the ProRes. No one in post has any idea how to correctly get back to the original R3Ds with the proper transforms from the edit applied to the RAW footage. Post budgets go up.

– No one asks the VFX department how they want their greenscreen shots done. Tracking marks are not used, and yet the camera is moving during the shot. Post budgets go up.

– VFX works in RED Color3 and delivers DPX plates. The colorist is using REDlogfilm and grading everything from the RAW. Things don’t match shot to shot. Post budgets go up.

– Editors need to deliver their picture to a Sound house. They’ve never delivered to this sound house before, and the Producer picked them because they had the cheapest bid. Lots of ADR work is expected. Post budgets are about to go up.

Anyway, you take things like the above and then throw in the fact that, in most cases, especially on smaller commercial jobs, most of the people involved are working with each other for the first time. Chemistry and trust are non-existent. Bids have gone out to the lowest bidder and not to the most qualified teams. CYA (Cover-Your-A$$) attitude becomes prevalent. Fingers become ready to be pointed. Accountability becomes nonexistent. People get angry. People get fired.

Someday, I want to live in a world where the DP knows the editor and both of them know the colorist. They’ve all worked with the director before. Also, before they’ve shot, each of these people sat down in a meeting with the VFX and sound departments and talked through how the imaging pipeline was going to go from set to edit to VFX to sound to color to mastering. Then, someone would come up with a diagram based on what cameras were being used, how sound was being recorded, what resolution needed to be delivered, and in what color space(s). Then, they’d also write down how metadata would be managed, VFX would be roundtripped, sound would be turned over for the mix, video would be conformed for color, and how, in general, the project would be set up and delivered to the distributor based on pre-agreed upon sound, color, and mastering specs. The departments would then take this diagram home, decide what needed to be changed based on their needs, and then come back and finalize their process, compromising where necessary for the greater good of the project.

This would all be done before a single frame of footage was shot.

A man can dream.

Anyway, until people start working this way and figuring out their process ahead of time, people will continue to write blogs like the one I linked to above and blame things like resolution and RAW for why their footage doesn’t look right in the end.

Departments need to communicate about workflow more. That’s not technology’s fault.

Panasonic GH4 Tips and Tricks

August 20, 2014

Those that know me well know that I’m a huge fan of both FCP X and the Panasonic GH series of cameras. Currently, I own and shoot with the GH2, GH3, and yes, even the almighty 4K-capable Panasonic GH4. FCP X handles the different codecs, bit-rates, and frame sizes like a champ. In addition, I always have the option to transcode the footage to ensure that everything runs smoothly on, say, an older laptop.

I determined that I would keep the older cameras for a variety of reasons. The most practical of those reasons is that they are both paid off! However, choosing to shoot with all three cameras at the same time presents some problems.

Chiefly, how do I effectively balance the distinct looks of each camera when the looks are baked in to the image? Does FCP X have the tools I need to get the job done? How do I keep my workflow simple without having to do a major color correction at the end of the job? It turns out, the answers are quite simple.

My personal holy grail for camera matching comes in the form of the DSC OneShot Chart. I make absolutely sure to bring it with me on multicam shoots, and I allot a few extra minutes to white balance and to shoot the chart. The back side of the chart is split between white and gray. Frankly, I wish they had chosen either white or gray rather than splitting them, since it’s hard to fill your camera’s frame with the space given for each when performing an auto white balance.

The OneShot chart was developed by both DSC and Art Adams. Art has a blog post that fully explains the chart here. I won’t go into too much techno talk about it, but it’s a brilliant chart which has all the basics you need for proper luma and chroma balancing: true black, white, gray, skin tones, plus primary and secondary broadcast colors.

Before I got the chart, I would balance the shots manually using a waveform monitor and vectorscope. The great news is that Resolve 11 can actually understand the color information on the chart using the built-in Color Match feature. In the video below, I show you how I send the shots from FCP X to Resolve, match them, and then send the LUTs back to FCP X.

To summarize, once each camera is balanced in Resolve, I export a 3D LUT of each camera. I name the LUT based on the camera and name of the shoot. Of course, you could add any additional info that you deem necessary, such as scene number.
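For the curious, the 3D LUTs Resolve exports are plain-text .cube files, which is part of why they travel between apps so easily. Here’s a minimal sketch of what’s inside one, assuming the common Resolve/Adobe-style .cube layout (the tiny 2-point LUT below is made up for illustration; real exports are typically 33-point):

```python
def read_cube(text):
    """Parse a minimal 3D .cube LUT: returns (grid size, list of RGB triples)."""
    size, table = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "TITLE")):
            continue  # skip blanks, comments, and the title line
        if line.startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        elif len(line.split()) == 3:
            table.append(tuple(float(v) for v in line.split()))
    return size, table

# A toy 2x2x2 identity LUT: a 3D LUT always has size**3 entries,
# so this one has 8, while a 33-point LUT has 35,937.
toy = """LUT_3D_SIZE 2
0 0 0
1 0 0
0 1 0
1 1 0
0 0 1
1 0 1
0 1 1
1 1 1"""

size, table = read_cube(toy)
```

Since a LUT is just a sampled color transform like this, any app that can interpolate the table can apply the grade, which is what makes the Resolve-to-FCP X handoff possible.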

The next trick is getting all this back into FCP X. I purchased a great plugin from Denver Riddle’s Color Grading Central called LUTutility. This program can actually read the LUTs you export from Resolve and attach them to your shots inside of FCP X.

All you need to do is drag and drop the .cube files into LUTutility’s preference pane, located in System Preferences. The LUTs are then installed and accessible inside of FCP X.

Inside FCP X, it’s simply a matter of adding the LUTutility effect to the footage and choosing the correct LUT based on your camera from the pulldown menu in the inspector. The beauty of this workflow is that all you need to do is import the shots with the DSC One Shot chart into Resolve. There’s no need to render anything out of Resolve. All that color grading info is stored in the LUTs and read by the LUTutility effect inside FCP X. I’ll then apply the necessary LUT to all shots within a scene and perform the final grade inside of FCP X.

One side-note to all of this. After I apply the LUTs, I’ll usually add some minor color correction tweaks as nothing is ever 100% perfect. But even with the minor tweaks, this process takes so much work out of balancing different cameras, especially DSLRs where the look is baked in.

As long as the cameras are white balanced off the same source and are generally shot at the same ISO, the tweaks are very minor compared to having to entirely match by eye. Frankly, I find it amazing that this is now all possible. It speaks to the exciting development that is going on in the FCP X ecosystem.

I hope this tip helps. Now go shoot and edit something awesome!

ABOUT

Michael Garber

Guest Blogger Michael Garber from Garbershop.

Michael Garber is a post production professional with over 14 years of experience. He started his company, 5th Wall, in 2004 and has worked with clients such as Discovery Agency, Huell Howser Productions, Automat Pictures, FuelTV, PBS and more. In addition to editorial work, Michael produces corporate documentaries for a Fortune 500 company. When not editing or shooting, Michael is more than likely talking about editing and shooting on his blog, GARBERSHOP.

About 4K Monitoring

August 19, 2014

There’s a lot of talk and a whole lot of hype when it comes to 4K. I’m certainly guilty of a lot of that hype. However, most people know very little about 4K and are pretty intimidated by the subject. Here are some quick hits when it comes to working with it.

First thing you need to know is that there are two flavors of 4K delivery resolutions:
4K UHD – This is the spec for 4K for broadcast and in the home. Resolution is 3840×2160. It’s a 16:9 aspect ratio (1.78:1), and it’s simply double the width and height of standard HD (1920×1080). Most 4K displays and televisions will be 4K UHD.
4K DCI – This is the cinema spec, and resolution is 4096×2160, an aspect ratio of roughly 1.9:1. Like 4K UHD, this is double the width and height of the standard 2K spec (2048×1080). You’ll only really see the 4K DCI spec in play if you’re watching a movie in a theater.
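A quick back-of-the-envelope check on those two containers (Python used purely as a calculator here):

```python
# The two 4K delivery containers described above.
uhd = (3840, 2160)   # 4K UHD (broadcast / home)
dci = (4096, 2160)   # 4K DCI (cinema)

print(round(uhd[0] / uhd[1], 2))   # 1.78 -- the same 16:9 shape as 1920x1080
print(round(dci[0] / dci[1], 2))   # 1.9  -- slightly wider than UHD

# "Double the resolution" means double in each dimension,
# so four times the pixel count of the HD / 2K equivalents:
print((uhd[0] * uhd[1]) // (1920 * 1080))   # 4
print((dci[0] * dci[1]) // (2048 * 1080))   # 4
```

That 4x pixel count, not the “2x” in the name, is what your storage, render times, and monitoring bandwidth actually have to absorb.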

From a traditional viewing distance, 4K only really becomes noticeable once the screen hits 84 inches. However, once you hit that size, and if shot and projected properly, the results are pretty stunning.

As of this writing, most of the 4K TVs being sold are not worth buying. Either the panels are really cheap and the image quality is not great, or the price is just not worth it. If you need something that can monitor at 4K and you’re working on a budget, get one of the cheaper panels and pay less attention to the color and contrast of the monitor; just watch for sharpness and resolution. In many ways, what’s happening now is like what happened when HD first appeared: sets were really expensive and only for high-end pros or people with money to blow. Wait a while and you’ll start to see more affordable options appear.

Lastly, you don’t need to monitor 4K while you’re doing color correction. I’d recommend using an HD Broadcast monitor while doing color (with your video I/O set to 1080). Buying an affordable 4K grading monitor is pretty much impossible and won’t make any difference to your color decisions. Color correcting your scaled down 4K images at 1080 is still the way to go. Right now, I think the only useful thing a 4K monitor is really capable of is to check the overall sharpness of your image at a 4K resolution when you’re mastering. Everything else is not ready for prime time yet… at least in my opinion.

The Easiest Way to do a 4K Screening

August 18, 2014

Sam here… So… believe it or not, it’s actually easier for the average person, given access to the right projector, to put on a higher resolution screening than they’d typically see when they go out to the theater.

When you go watch a movie at the typical multiplex, you’re almost universally watching a movie that was made from a 2K master… even if the projector is 4K, the movie itself was up-rezzed from a 2K file to fill the screen.

The main reason for this is that Hollywood hasn’t really figured out the whole 4K pipeline thing… especially on the VFX side. It’s far simpler and more practical for them to finish in 2K.

What this means is that if you have a Dragon, Epic, GH4, or 4K BMCC, it’s a pretty straightforward process for you to shoot, finish, and screen at a much higher level than the big guys do… especially if your VFX pipeline is simple.

In fact, if you somehow managed to have access to a nice 4K Projector with an HDMI port on it, you can put on a higher quality screening in your living room than you’ll currently see in the multiplex.

Why? Well, both the Mac Pro and MacBook Pro will output a 4K signal through their HDMI ports.

Those HDMI ports will also carry a 5.1 audio signal.

A 4K 5.1 screening is now a pretty straightforward process if you’ve got the right home theater and you know how to plug in an HDMI cable and export a 4K ProRes.

It’s now easy to Shoot 4K, post 4K, and then screen it right from your laptop.

I have no idea why film festivals make things so hard for Filmmakers with their DCP, Bluray, or Tape requirements.

Filmmakers should be able to just hand over/dropbox a QuickTime movie and get on with their lives. For some reason, everyone loves to make things complicated.

With my film collective, We Make Movies, we do our annual WMM Fest of our community’s work in LA, and we run all of the screenings (there were 5 this year) right from my laptop. In fact, every screening we’ve ever done has been run through QuickTime, in 1080 ProRes, using our filmmakers’ QuickTime master files and playing from a laptop through QuickTime or Final Cut. It’s just easier.

The only reason we’re not doing 4K screenings is because most filmmakers are still mastering at 1080, and 4K projectors are still way too expensive. Both of these things will be changing in the not too distant future.

If we had the right files and the right gear, though, our process would still not change at all. ProRes is still ProRes, and we’re still just playing it out of an HDMI port to a projector.

Our screenings look better, sound better, and we have almost no room for technical issues because we do things this way. We work from the masters, and leave as few things to chance as humanly possible. As long as the projector is calibrated, we’re good to go.

And while I explained in our blog here that it’s a lot easier for filmmakers to make DCPs these days… it’s still a very difficult format for the average person to implement on their own, and far from a user-friendly experience to screen and play for an audience.

Both the DCP and Bluray formats were designed from day one to be difficult to create and hard to pirate. Essentially, as high-end technologies typically are, they were designed to keep people from understanding them, to keep them proprietary, and to maintain established business models… in this case preserving the studio multiplex and home digital distribution businesses.

Fortunately, there’s a pretty easy way around all of this nonsense… which is good news for the independent filmmaker who isn’t tethered to this process and can figure out how to make and distribute their own content.

Right now, I look at DCPs as a necessary evil, but the truth of the matter is that the safest and easiest way to screen a movie for an audience is to just run your QuickTime master out of the HDMI port on your Mac.

Why do people feel the need to make things so hard?

Why I started to use FCPX

August 15, 2014

Sam here… so, over the years, I’ve gotten a lot of raised eyebrows when I run into people I used to work with, or editors and people outside my circle, and I tell them I cut everything I do with FCPX and it’s the best thing out there. Usually, I get back some garbled version of “really? I heard it sucked…” or “I tried it a long time ago and couldn’t get into it…”

We then have a 10 minute conversation about why they switched to Premiere and why I didn’t… and who, in fact, the crazy person really is in this equation.

And when I look back and really think about why I switched to FCPX… I realized that my circumstances were different than pretty much anyone else’s when it came to switching, so it shouldn’t be surprising that my viewpoint on the program is much different than everyone else’s.

Long story short… I downloaded the program day one like everyone else. There were things I liked, and a lot of things I didn’t. Unlike most, I kept playing with it, and cutting small projects, trying to figure out why Apple had done what they had done… and if, in fact, there was something I wasn’t getting with all of this. I was doing all of this on my off days while I worked at my regular freelance gig still using FCP7 and being pretty content with that workflow.

Somewhere along the way, I got invited to come out and work with the Final Cut team and got to ask some of my questions in person… and I got some answers… when I was finished, I came back to LA, and my perspective had changed a bit. I’d been shown a different way of looking at editing, and sort of realized I couldn’t go back to what I was doing and still be happy with that. I had found I liked editing again (I’d become a bit of a robot with FCP7)… and for the first time in a long time, I felt like there was something new and interesting for me to explore.

So… I sort of made the decision that I was just going to run with FCPX, start my own post house, not tell my clients I was cutting with X (I’d just say Final Cut and let them assume I meant FCP7), and see just how far I could get with what I was doing before I ran out of money.

I haven’t run out of money yet.

In fact, I made more. You see, I was still charging what I would normally charge, but I was able to deliver in half the time… time equaled money. So even though I lost a few customers at first, the ones I did keep I was able to take better care of.

That one decision to go out on my own led to a big old giant chain reaction in my career that is still snowballing. It’s been weird, frustrating, cool, and consistently surprising. At the end of the day, it’s been fun. I have a lot more fun than most editors I know, and a lot more control over the projects I choose to do… which is mostly all I ever cared about.

And when I compare it to cutting the same old piece every single day at my old freelance job in the same tired workflow… well, there really is no comparison. You literally couldn’t pay me to go back to that. People have tried.

So what’s the lesson here? The person who was bored with editing at his cushy freelance gig (me before FCPX) had stopped learning and had stopped getting better. I was starting to become less curious, and editing itself had become just a transaction I would do for money. And when that happens, when you stop caring about what you do, and you stop learning, it makes you more likely to want to preserve the status quo and keep collecting checks. Change becomes threatening and learning becomes difficult. Your job becomes less about doing something cool, and it becomes more about protecting your territory from outsiders. It becomes easy to dismiss new ways of working. Eventually, you become the flatbed film editor who wakes up one day to realize their gigs are gone and everyone is editing on video. You blame the world and get really angry and bitter. No one cares that you are angry and bitter. You get more angry and bitter.

If I had stayed that way, I’d be well on my way to being one of those crusty old editors who love to tell everyone else how dumb and unprofessional their workflow is. “Get off my lawn!”

The truth is that you don’t know what you don’t know. I got lucky enough to have some people show me, and it changed the way I looked at what I do. It’s made me a better and more efficient editor, and it has prepared me for the next ten years in this business in a way that many people can’t even see.

At the end of the day, it makes no difference to me what editing platform you cut with. You should use what works for you… but as an editor, it’s part of your job to know enough to know the difference between the different tools, and to continue to adapt to the changing world around you.

I guess my only piece of advice might be that, before you go ahead and dismiss a different idea entirely, decide for yourself, and be willing to occasionally go down the rabbit hole. Don’t stop being curious. Sometimes going down the rabbit hole can change your perspective on things completely. It did for me. It’s why I’m only cutting with FCPX now.

I’m always looking for the next rabbit hole, though.