Resolution, codecs, white balance, and other image controls are critical to capturing not just a good-looking image but also one that can be easily edited and manipulated. Technical settings are key, regardless of what kind of camera you use, so even if you don’t plan on shooting a video yourself, these are the tech specs you’ll need to know.
Resolution
Video is essentially a series of still images, and each frame has a horizontal and vertical dimension in pixels. Common resolutions use abbreviations, including the following:
- Ultra HD (UHD, commonly referred to as 4K): 3840 x 2160
- 1080p: 1920 x 1080
- 720p: 1280 x 720
To visualize those numbers, here’s a diagram:
Figure 2.1: Several common resolutions you might shoot and deliver
While it’s true that 3840 is a little less than 4000 (4K), there’s not much in it. Some cinema purists use a size of 4096 x 2160, one of the official “DCI 4K” sizes, but it’s intended for cinemas and leaves black bars at the top and bottom of a 16:9 display. For the purposes of this discussion, when I say 4K, I’m actually referring to UHD.
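If you want to check the numbers yourself, here's a quick Python sketch (purely illustrative, nothing you need for FCP) comparing the pixel counts and aspect ratios of these sizes:

```python
# Compare common video sizes: UHD is exactly 16:9 (about 1.778:1),
# while DCI 4K is wider, so it letterboxes on a 16:9 display.
resolutions = {
    "UHD 4K": (3840, 2160),
    "DCI 4K": (4096, 2160),
    "1080p": (1920, 1080),
    "720p": (1280, 720),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: ratio {w / h:.3f}, {w * h / 1e6:.1f} megapixels")
# UHD 4K: ratio 1.778, 8.3 megapixels (4x the pixels of 1080p)
# DCI 4K: ratio 1.896, 8.8 megapixels
```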
What should you shoot, then? Today, I always shoot in UHD (or higher!) and recommend that you do the same. But will people notice the extra detail over 1080p? Some will. It’s true that people sit at a fair distance from their TVs, and the limitations of our eyes mean that some of the detail in a 4K image at that distance can be lost. Most phone-based viewers won’t see the extra detail either; small screens, slow networks, and limited resolutions all limit what you can deliver.
Yet, even if most of your audience is using their phones, it’s a mistake to only plan for that lowest common denominator. Computer users with high-resolution monitors — every iMac user and many others — sit close enough to their monitors to see the extra detail easily. Aim higher than the baseline and you’ll make even the fussiest clients happy. This shouldn’t slow you down, either. While some other editing applications struggle to edit 4K without stuttering, a relatively modern iMac running FCP will do just fine.
There’s another big reason to shoot in 4K, though. I shoot 4K for jobs that I know I will be delivering to a client at 1080p, or even 720p. Why? Shooting in a format bigger than you need — called oversampling — lets you safely zoom in to show just a small part of the captured image in the final video. This means that final control over framing passes from the camera operator to the editor, giving you lots of extra options and making every shot more flexible.
Still, if your camera doesn’t shoot 4K well, shooting 1080p can be totally fine. If you plan to deliver 1080p, just don’t zoom (much) during editing and you’ll be OK. A general rule is to never go above 120%.
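To put numbers on that flexibility, here's a minimal Python sketch of the headroom calculation (the 120% figure is just the rule of thumb above, not a hard limit):

```python
def max_zoom_percent(source_width: int, delivery_width: int) -> float:
    """The zoom level at which you run out of captured pixels and
    start upscaling (and therefore softening) the image."""
    return 100 * source_width / delivery_width

print(max_zoom_percent(3840, 1920))  # 200.0 -- shoot UHD, deliver 1080p
print(max_zoom_percent(3840, 1280))  # 300.0 -- shoot UHD, deliver 720p
print(max_zoom_percent(1920, 1920))  # 100.0 -- no oversampling headroom,
# so stay at or below roughly 120%, per the rule of thumb above
```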
Portrait or landscape?
If you only ever plan on delivering to a vertical orientation, then you can shoot that way too. But for 99% of jobs, you should shoot in landscape — especially on a phone. If your client will be shooting footage on their own phone, be sure to ask them to shoot in landscape because most people will shoot in portrait without thinking twice:
Figure 2.2: Hold your phone like it’s a TV!
Computers and TVs use landscape orientation, and if you shoot portrait, your work will only ever look good on a phone. Even in landscape, it's best to step back for a wider shot so that the frame you shoot can be re-cropped and repurposed. Many kinds of commercial work are delivered in multiple aspect ratios, so bear that in mind when you shoot.
One good strategy is to use a common top, where you frame a normal amount of space between your subject’s head and the top of the frame. When you present that footage in different aspect ratios, simply put the top of the clip at the top of the frame and let the other edges of the frame change as they need to.
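To make the common top idea concrete, here's a small Python sketch that computes crop rectangles pinned to the same top edge for a few delivery ratios (the UHD frame size and the ratios here are just examples):

```python
def common_top_crop(src_w: int, src_h: int, ratio: float):
    """Largest crop with the given width:height ratio, pinned to the
    top edge of the frame and centered horizontally."""
    crop_w, crop_h = src_w, round(src_w / ratio)
    if crop_h > src_h:  # target is taller than the source frame,
        crop_w, crop_h = round(src_h * ratio), src_h  # so narrow it instead
    x = (src_w - crop_w) // 2  # center horizontally
    return (x, 0, crop_w, crop_h)  # y = 0 is the shared "common top"

for name, ratio in [("16:9", 16 / 9), ("1:1", 1.0), ("9:16", 9 / 16)]:
    print(name, common_top_crop(3840, 2160, ratio))
# 16:9 (0, 0, 3840, 2160); 1:1 (840, 0, 2160, 2160); 9:16 (1312, 0, 1215, 2160)
```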
In summary, then — take a few steps back to zoom out, frame with a regular gap above the head, shoot in landscape, and record in high resolution, and you can use that footage for any kind of delivery.
Frame rate
If it is at all possible, shoot at the same frame rate that you want to deliver. For a “cinematic” look, you’ll want to shoot at 24 or 25 frames per second (fps), although 29.97 fps is widely used too. Why these specific numbers?
A video image is updated a certain number of times per second, and that number is different for TV signals in different parts of the world, for historical reasons tied to the local electricity supply. Here are some guidelines:
- In 60 Hz countries (which typically use 110–120 V power), such as the US and Canada, 29.97 fps is used.
- In 50 Hz countries (typically 220–240 V), such as most of Europe (including the UK), Australia, and New Zealand, 25 fps is used.
- In the international world of feature films and high-quality TV, 24 fps is the norm, although 23.98 fps is often used as it makes for easier conversion to US TV standards.
Does it matter which one you use if you’re delivering online? Not much. But if you’re delivering to TV or cinema, then it definitely matters, and you’ll need to examine the delivery requirements carefully.
IMPORTANT NOTE
You might also have to deal with interlaced delivery (1080i) rather than progressive (1080p), but interlaced video today is only requested for TV broadcasts. Shoot and deliver in progressive formats unless the client explicitly asks for interlaced delivery.
It’s also possible to record at moderately high frame rates, such as 50 or 60 fps, or even higher. While these frame rates do deliver smoother motion, most viewers find that videos shot in these modes look a little fake, unnatural, or cheap when played back at that speed, and so these higher frame rates are rarely seen outside of sports and gaming videos.
Rather, these higher frame rates are more commonly used to give the option of slow motion, captured at a high speed, but played back at a slower speed:
Figure 2.3: This splash (and, in fact, this whole shoot) was recorded at 50 fps to give slo-mo options
If you record at 60 fps, you can then slow it down on the timeline, showing every frame you shot at a speed of 42% on a 25 fps timeline or 50% on a 29.97 fps timeline. This is referred to as Automatic Speed in FCP, and it’s very handy. But this is just an option — footage shot at a moderately high frame rate doesn’t have to be slowed down. It’s entirely possible to use this footage in real time instead by skipping frames on playback. Many shooters use these moderately high frame rates for B-roll (explained later in this chapter, in Shooting the right shots) to give more options during editing.
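The math behind those percentages is simple: the playback speed is just the timeline frame rate divided by the capture frame rate. A quick Python sketch:

```python
def automatic_speed_percent(shot_fps: float, timeline_fps: float) -> float:
    """Playback speed that shows every captured frame exactly once,
    which is what FCP's Automatic Speed gives you."""
    return 100 * timeline_fps / shot_fps

print(automatic_speed_percent(60, 25))     # 41.7 -> the "42%" above
print(automatic_speed_percent(60, 29.97))  # 49.95, effectively 50%
print(automatic_speed_percent(50, 25))     # 50.0
```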
Be aware that as you increase the frame rate, especially to higher numbers, the camera has to work harder, and you may have to compromise on resolution as a result. Check your camera, because there's often a distinction between “regular” frame rates, up to 60 fps, and “high-speed” frame rates, which can go much higher but at a lower resolution, lower quality, and/or without audio. Whatever frame rate you shoot at, you should be consistent. While it's quite easy to incorporate slow-motion footage shot at any speed, your regular footage should all use the same frame rate — probably 24, 25, or 30 fps. Mixing similar-but-different frame rates (25 fps and 30 fps, say) causes visible stutters as frames are skipped or duplicated, and that's something to avoid if at all possible.
Shutter speed
A separate but related issue is shutter speed: how long each frame is exposed for, expressed as a fraction of a second, such as 1/50 or 1/200. As a rule of thumb, to give your footage a natural motion blur, double your frame rate to determine the “ideal” shutter speed denominator:
- 1/48 or 1/50 for 24 fps
- 1/50 for 25 fps
- 1/60 for 30 fps
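Written as a formula, the rule is simply shutter denominator = 2 × frame rate; in practice, you pick the closest speed your camera actually offers (1/50 for 24 fps on many cameras). A tiny Python sketch:

```python
def ideal_shutter(fps: float) -> str:
    """180-degree shutter rule: expose each frame for half its duration."""
    return f"1/{2 * fps:g}"

for fps in (24, 25, 30, 50, 60):
    print(f"{fps} fps -> {ideal_shutter(fps)}")
# 24 fps -> 1/48, 25 fps -> 1/50, 30 fps -> 1/60,
# 50 fps -> 1/100, 60 fps -> 1/120
```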
If you shoot at a significantly faster shutter speed, such as 1/100 or 1/200, the natural motion blur of 1/50 or 1/60 will be lost, and anything in motion will look a little “choppy” as a result. Conversely, if you shoot at a significantly slower speed, such as 1/25 or 1/30, everything in motion will look a little blurry.
It’s important to note that this rule (known as the 180° shutter rule) does not apply to higher frame rates simply because any objects in motion will barely move between frames — there’s hardly any blur to be had! If there’s very little movement in the shot, there’s not much blur either.
Banding
However, there's another reason why these frame rates (and the shutter speeds that pair with them) are popular, and it goes beyond a natural-looking blur. If your shutter speed doesn't evenly match the cycle of your lighting source, you might record banding: dark lines that continuously move down your image.
Why? The answer gets a little messy, but it’s for a similar reason to frame rates. Here’s an extreme example:
Figure 2.4: This blank wall shows extreme banding when a 1/200 shutter speed is used
Your electricity supply is based on alternating current, which cycles at 60 Hz in North America, South America, and western Japan, and 50 Hz in Europe, Australasia, and eastern Japan. Many light sources flicker in step with the mains frequency in your country, and if your shutter speed doesn't match the frequency of your lighting source, your camera won't capture the light evenly. Instead, it will catch a little more light in one part of a frame than in the rest, leaving a dark band that moves from frame to frame. Sometimes, these problems are subtle, but often they're not, and it's critical to get this right for your camera.
This issue has led many people to conclude that the frame rate must be 25 fps in Europe and 30 fps in North America, but the frame rate can in fact be different. It’s the shutter speed that controls banding, and even in 50 Hz countries, if your main light source is from a projector or large TV refreshing at 60 Hz, you’ll need to adjust your shutter speed to 1/60. (Note that some special lights and some computer monitors refresh at unusual frequencies, and some cameras have special modes to deal with this issue.)
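If you want a mental model for which speeds are safe: many mains-powered lights pulse twice per AC cycle (100 times a second on 50 Hz power, 120 on 60 Hz), and an exposure that lasts a whole number of those pulses gathers the same amount of light in every frame. Here's a Python sketch of that assumption; real lights vary, so always check your monitor:

```python
def is_flicker_safe(shutter_denom: float, mains_hz: float) -> bool:
    """True if the exposure covers a whole number of light pulses.
    Assumes the light pulses at twice the mains frequency (once per
    AC half-cycle) -- true of many, but not all, light sources."""
    pulses = (2 * mains_hz) / shutter_denom  # pulses per exposure
    return pulses >= 1 and pulses == round(pulses)

# On 50 Hz mains: 1/50 and 1/100 are safe, 1/60 is not.
print([is_flicker_safe(d, 50) for d in (50, 100, 60)])  # [True, True, False]
# On 60 Hz mains: 1/60 and 1/120 are safe, 1/50 is not.
print([is_flicker_safe(d, 60) for d in (60, 120, 50)])  # [True, True, False]
```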
Extra care is needed for frame rates above 60 fps. Shooting at extreme frame rates requires a very fast shutter speed, and therefore you'll need a light source that doesn't flicker at all at these exposure times. Some professional lights are suitable, but the great ball of light in the sky always works well.
White balance
Before you shoot in a new location, it’s important to set a custom white balance, ideally with a gray card positioned where your subject will be. Different light sources add different colors to a scene, and if you don’t get this right, your subjects will look “wrong” — too blue, green, orange, or yellow. This can be corrected to some degree in the edit, but it’s not always the easiest task.
IMPORTANT NOTE
Don’t use auto-white balance unless you have no choice! Auto-white balance can drift in the middle of a shot and be very hard to correct, while an incorrect manual white balance will be consistently wrong and is more easily fixed.
The time before a shoot can often be used to set and store a custom white balance, and fancier cameras can store multiple custom white balance settings, such as the 1/2/3/4 setting shown here:
Figure 2.5: The Panasonic GH5 white balance menu
If your camera doesn’t have custom white balance options, you’ll at least be able to choose from a range of presets, including daylight, fluorescent light, tungsten, and more. Choosing one of these will at least give you a consistent place to work from.
Codecs
Almost all cameras record in a compressed format, and some formats are more heavily compressed than others. While it might seem useful to record files that take up less space, the more you compress a video, the lower its quality will be. Finding the sweet spot between a video that takes up too much space and one that falls apart can be tricky, and not all cameras give you many options here.
Most compressed videos today use a compression method (codec) called H.264, although HEVC (also known as H.265) is becoming more popular. Support for HEVC has grown; all Apple Silicon Macs (and Intel Macs with a T2 chip) can decode HEVC easily. Higher-end cameras might offer other options, such as ProRes, ProRes RAW, and Blackmagic RAW. While all of these formats do increase the quality, they take up significantly more space.
For example, a Panasonic GH5 can record at a data rate of 100 megabits per second (Mbps) for 4K at 24/25/30 fps, or 150 Mbps for 4K at 50/60 fps, alongside many other options. Here's what that looks like:
Figure 2.6: So many options — delve into the menus of your camera and test the settings out
The data rate for ProRes at the same resolutions and frame rates ranges from 470 to 589 Mbps, much, much higher than typical H.264 and HEVC codecs. These increased data rates require a much faster and larger recording device, typically an SSD or CF Express card rather than an SD card. You’ll want to find a balance between quality and file size that suits your job’s needs; read reviews, download files, and do the math to figure out how much space you’ll need.
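The math itself is easy: divide megabits by 8 to get megabytes, then scale up by your shooting time. A quick Python sketch:

```python
def storage_gb(data_rate_mbps: float, minutes: float) -> float:
    """Approximate storage for a given data rate, treating 1 GB as 1000 MB."""
    megabytes_per_second = data_rate_mbps / 8
    return megabytes_per_second * 60 * minutes / 1000

print(storage_gb(100, 60))  # 45.0 -- an hour of 100 Mbps H.264/HEVC
print(storage_gb(589, 60))  # ~265 -- an hour of high-rate ProRes
```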
Lastly, it’s very important to know that not all cameras compress video in the same way. H.264 from one camera can be easy to deal with, while the same codec from another camera stutters on playback. If it is at all possible, download some original footage from a camera you’re planning on using to make sure it works well in your workflow. Expect new codecs and workflow changes in the future — standards do shift over time.
Containers
Video data is encoded using a particular codec and is then stored in a container, usually a file with a .mp4 or .mov file extension. A container is not a codec, however; H.264 can be found inside many different types of containers, and a .mp4 file might contain video data in one of many different codecs. Still, you’ll probably be fine; just look at the extension at the end of the filename to see which kind(s) your camera makes.
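If you ever want to see exactly which codec is hiding inside a container, one option (my suggestion, not something FCP requires) is ffprobe, from the free ffmpeg toolkit. A minimal Python wrapper, assuming ffprobe is installed and on your PATH:

```python
import subprocess

def video_codec(path: str) -> str:
    """Ask ffprobe which codec the first video stream in a file uses,
    returning names such as 'h264', 'hevc', or 'prores'."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Both a .mov and a .mp4 can come back as 'h264' -- the extension
# tells you the container, not the codec. (Hypothetical filename.)
print(video_codec("clip.mov"))
```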
However, there are cameras out there that don't produce self-contained video files at all. The AVCHD format, for example, spreads important data across separate files and subfolders, meaning you can't simply copy a file from an SD card and have a single video clip. Instead, the video data needs to be rewrapped into a container format that FCP can use, which happens automatically when you import directly from the SD card. Other cameras do contain their clips in single files (yay!), but restart their file numbering on every card (boo!), leaving you to manage multiple files with identical names.
Where possible, I prefer to avoid AVCHD and other fussy container formats. A standalone video clip with a unique name using a standard codec is the gold standard, and plenty of cameras make files like this. If you’re choosing a camera, don’t bend over backward to support one that makes your life difficult.
Review — getting all the settings right
Let’s combine all of these settings:
- Ideally, you'll be shooting in 4K to H.264 or HEVC in a .mp4 or .mov container, at a data rate of around 100 Mbps or higher.
- You’ll use a gray card to set the white balance for each location you shoot in.
- If you’re in Europe or Australasia, you’ll probably mostly shoot at 25 fps with a 1/50 shutter speed.
- In North and South America and most of Asia, you’ll probably mostly shoot at 24 fps or 30 fps with a 1/50 or 1/60 shutter speed.
- You might choose to use a moderately high frame rate (50 or 60 fps) for B-roll shots, to allow for 50% slow motion.
- You’ll change the shutter speed for high frame rates or if you see banding.
That’s it — you’ve set your camera up for files that FCP will import easily and that will cut well together. Now, you just need to master the aperture, ISO, color profiles, and everything else that your camera offers! As much as I’d like to discuss all of these variables, it’s beyond the scope of the book, and I can only advise you to keep the ISO low to avoid noise, keep your subject correctly exposed, shoot with a low f-stop if you want blurry backgrounds, and if you want to keep things simple, use a color profile that’s close to what you want to deliver. But what kind of camera will give you these controls? Read on!