A Photographer's Intro to Videography: 4 key differences you need to know!

 


 

Some of the links in this post may be affiliate links. This means if you click on the link and purchase qualifying items, I may receive an affiliate commission at no extra cost to you. For more details, check out my disclosures page. Thanks for your support!

Videography and Photography are Different Skill Sets

A common misconception I see is that if you know how to shoot photographs, then shooting high-quality videos will be easy, but that's not really how it works. Videography provides new technical challenges, and photographers can find adjusting to new uses of familiar settings particularly tough.

So, I'm going to focus on four core differences between photography and videography to give you an idea of the areas you may find challenging to adjust to as you move toward becoming more of a hybrid photo and video shooter.

If you're a person who prefers to watch the information (and see visual examples) rather than read it, check out the video above. Otherwise, read on…

Difference #1: Frame Rates

Let's start off with something that you may not have been exposed to at all in photography: frame rates or frames per second.

In photography, the term “frame rate” is something you may have never even heard of. There are certain niches of photography where being able to shoot in burst mode is important to maximize your chance of getting a keeper image or two when there's fast-moving action. But even then, your main concerns are really how fast your camera can fire off those shots and how many images you can capture before the buffer fills up.

In videography, your frame rate permeates everything. It's a core component of video. You can't get around it. Video is essentially a constant stream of still frames recorded and played back multiple times per second that fools your brain into thinking that it's seeing movement. Think of one of those old flipbooks you may have seen as a kid. Similar idea. The number of times per second that an image is displayed is your video frame rate.

It's important to understand that the frame rate you choose can drastically affect the look of your video, and it's closely tied to the shutter speed you use, which I'll cover in a minute.

A good general guideline for thinking about your frame rate is that if you want to shoot video footage that you're going to play back in real-time, shoot it in a standard frame rate like 24 or 30 frames per second (or 25 fps if your camera is set up for use in a PAL region, like Europe). These frame rates are meant to have the footage played back at the same speed it was filmed.

If you want to be able to slow your footage down later when you edit, shoot in a higher frame rate like 60 or 120 fps. If you're in a PAL region, the equivalents would be 50 or 100. Shooting at these higher frame rates gives you enough frames per second to stretch the footage out and play it back at a standard frame rate, thereby slowing it down without it getting choppy.
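
If it helps to see the arithmetic, here's a minimal sketch of how much a clip slows down when you conform high-frame-rate footage to a standard delivery frame rate (the function name is just for illustration, not something from any editing software):

```python
# A minimal sketch: how much slower footage plays when every captured
# frame is shown on a standard-speed delivery timeline.

def slowdown_factor(capture_fps: float, delivery_fps: float) -> float:
    """Return how many times slower the footage plays back."""
    return capture_fps / delivery_fps

print(slowdown_factor(60, 24))   # 2.5x slower
print(slowdown_factor(120, 24))  # 5.0x slower
print(slowdown_factor(120, 30))  # 4.0x slower
```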

The bottom line here is that the frame rate becomes a purposeful choice. More frames per second are not automatically better; in fact, a higher frame rate can often be the worse choice. The video game industry is constantly trying to squeeze out higher frame rates for a better gaming experience, but don't confuse video games with real video. They're very different use cases.

In video, lower frames per second give a more organic and visually pleasing look. Higher frame rates look smoother, but more artificial and jarring unless you're going to slow them down. The standard for a filmic look or what people often call cinematic is 24 fps. This is the frame rate at which we're used to seeing movies delivered.

 

Difference #2: Shutter Speed

I mentioned that frame rates and shutter speed are closely tied together in video. This is the next big difference to understand and the one that might cause you the most heartburn at first, so I want to dedicate a little time to explaining this well.

If you recall from the video/post about using manual mode for photography, we talked about how your shutter speed is not only a creative choice but also one of your primary exposure controls. So you can vary it drastically depending on what you need. When shooting video, though, you are much more restricted in your use of shutter speed - you essentially lose the ability to use shutter speed as an exposure control.

The similarity between photo and video here is that motion blur is affected by your shutter speed, but the difference is that in video, motion blur is such a crucial priority that it essentially negates any control you have over using shutter speed for exposure. And to get the most natural-looking motion blur, you'll want to adhere to something called the 180-degree shutter angle rule.

The 180-degree shutter rule is a standard brought over from the days of analog motion picture cameras, where film was exposed with a circular gate shutter mechanism that could be opened up wider or closed down smaller. The extent to which the shutter gates were open or closed was called the shutter angle.

So if this shutter was set to be open the whole time, it was the full 360 degrees of a circle. If it was set to be open half the time, then that would be a 180-degree shutter angle. If it was open a quarter of the full amount, a 90-degree shutter angle, etc.

How this worked was that these gates would rotate at a consistent speed while the film passed by at a matching consistent speed - the frame rate - and this combination of frame rate and how much of the shutter mechanism was open would control how long each frame of film was exposed. It would also control how much motion was captured in each frame and give a cadence of motion between the frames of film. Think of cadence as how much blur from one frame overlapped to the next.

What the film industry learned over time was that setting their shutter angle to 180 degrees resulted in the most natural-looking motion blur that emulated what we see in real life. We tend to think that we see things nice and crisp with our eyes, but we actually see a significant amount of motion blur. You can test this yourself by just holding your hand up in front of your face and then shaking your fingers back and forth. Notice how your fingers appear blurred? This is what the 180-degree shutter rule emulates. It's essentially deliberately making the video footage less perfect than it could be so that it feels more real.

Now, in our modern digital cameras, we don't have a circular shutter; it's either a physical curtain or an electronic shutter that turns your sensor's photosites on and off. Some cameras, especially cinema cameras, still use the shutter angle convention, but most of us use cameras that work in terms of shutter speed instead.

To leverage the 180-degree shutter rule using shutter speed, we're going to set the denominator of our shutter speed as close to double our frame rate as possible. Meaning, for example, that if I'm shooting at 24 fps, then my shutter speed should be 1/48th of a second. Now, most mirrorless cameras don't offer 1/48th of a second, so you'd use 1/50th instead. But if I were shooting at 30 fps, I'd use a 1/60th of a second shutter speed. If I were shooting at 60 fps, I'd use a 1/120th of a second shutter speed, and so on.
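
If it helps to see that relationship written out, here's a minimal sketch of the shutter-angle math (the function is purely illustrative - it's not a setting from any camera menu or SDK):

```python
# A minimal sketch of the shutter-angle math. At a 180-degree angle,
# each frame is exposed for half the frame interval: 1 / (2 * fps).

def shutter_speed_for(frame_rate: float, shutter_angle: float = 180.0) -> float:
    """Return the shutter speed in seconds for a frame rate and shutter angle."""
    return (shutter_angle / 360.0) / frame_rate

for fps in (24, 30, 60):
    speed = shutter_speed_for(fps)
    print(f"{fps} fps -> 1/{round(1 / speed)} s")
# 24 fps -> 1/48 s (round to 1/50 on most mirrorless cameras)
# 30 fps -> 1/60 s
# 60 fps -> 1/120 s
```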

The reason this “rule” matters is that if you set your shutter speed too slow, you'll have too much motion blur, and if you set it too fast you'll remove too much motion blur which really affects the look of your video.

The second issue in particular, where your shutter speed is too fast, seems to be where most beginners mess up. We're used to controlling photographic exposure with our shutter speed, so we crank that shutter speed up to darken our image and the footage comes out looking all janky. It contributes to a strange, hyper-realistic, jittery, and very distracting look in your footage.

Now there might be legitimate artistic reasons where you may want to break this "rule" and intentionally use a slower or a faster shutter speed. For instance, if you wanted to shoot a trippy dream-like sequence, using a slower shutter speed could be useful. If you wanted to shoot some crisp, jarring real-time action without much motion blur, you might crank that shutter speed up to make things feel frenetic and uncomfortable - think about war scenes like the beginning of Saving Private Ryan or the fight scenes in Gladiator. In those cases, less motion blur is allowing us to see more of what's happening and it doesn't feel as off-putting. Things are moving fast and because of the context, we expect it to be jarring - in that sense you could say it feels more realistic even though it isn't actually realistic. The feeling you get is what's important.

Another example of when you might want to break this "rule" that I usually don't see people talk about is if you want to shoot at a high frame rate and play it back at real-time speed for some reason, as opposed to using it for slow motion. In this case, to emulate realistic motion blur, you'll actually want to use a 360-degree shutter angle, which equates to a shutter speed whose denominator matches the frame rate. So 60 fps would be shot with a 1/60th shutter speed, 120 fps at 1/120th of a second, and so on. This helps your higher frame rate footage feel less jarring than it normally does, though if there's even a chance you'll want to slow that footage down later, you should stick with the 180-degree shutter rule, or it will look really weird and blurry.
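
In terms of the earlier sketch, that just means passing a 360-degree shutter angle instead of the 180-degree default (reusing the illustrative shutter_speed_for function from above):

```python
for fps in (60, 120):
    speed = shutter_speed_for(fps, shutter_angle=360.0)
    print(f"{fps} fps -> 1/{round(1 / speed)} s")
# 60 fps -> 1/60 s
# 120 fps -> 1/120 s
```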

But the point here is that breaking the 180-degree shutter angle "rule" should be an intentional choice. The vast majority of the time, you're going to want to set your shutter speed to double your frame rate.

But regardless of the creative choice you make with your shutter speed, the important thing to remember here is that your shutter speed is ALWAYS set relative to your choice of frame rate to control motion blur. Exposure will change when you change your shutter speed just like you're used to, but now it's a side effect.

 

Difference #3: Controlling Exposure

So, with shutter speed taken off the table to use as an exposure control, this leaves us with a real problem. We've lost one of our most important methods to restrict how much light is coming into our cameras.

What you'll find is that once you set your frame rate and shutter speed to be what you want, you'll often be overexposed. And what happens if you don't have 100% control of the lighting in your scene, which would be the preferable way to deal with this?

Well, you could stop down your aperture - aperture works exactly the same in videography as it does in photography - but you still may not be able to get your image dark enough. Plus, using aperture purely for exposure often means throwing aesthetic considerations out the window. What happens if we want a shallow depth of field for a particular scene and also need less light?

Now, if you're a landscape photographer, you probably already see where I'm going with this - you know that we have other external tools to control our exposure, but most photographers don't need them and may not even know they exist. I'm talking about ND filters, of course.

In video, ND or neutral density filters are basically as necessary as having a lens. In higher-end cinema cameras, they're even built into the camera directly - that's how necessary they are.

For those who have never heard of ND filters, they're a tool that reduces exposure by cutting down the incoming light evenly across the whole image (and, ideally, across all colors). Many people describe them as sunglasses for your camera, and I think that's a pretty good way to describe it. They generally come in two flavors: regular ND filters that reduce the light by a predetermined number of stops, and variable ND filters that allow you to adjust the strength of the ND. Personally, I tend to use VNDs for video (a 2-5 stop variable ND and a 6-9 stop variable ND) and regular ND filters for photography (this magnetic ND filter system is great!).
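
If the stop numbers feel abstract, here's a minimal sketch of what they mean: each stop of ND cuts the light reaching the sensor in half (the function name is just for illustration):

```python
# A minimal sketch: each stop of neutral density halves the light,
# so an N-stop ND filter passes 1 / 2**N of the incoming light.

def light_transmitted(stops: float) -> float:
    """Return the fraction of light an ND filter of the given strength passes."""
    return 1 / (2 ** stops)

for stops in (2, 5, 6, 9):
    print(f"{stops}-stop ND passes {light_transmitted(stops):.2%} of the light")
# 2-stop -> 25%, 5-stop -> ~3.1%, 6-stop -> ~1.6%, 9-stop -> ~0.2%
```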

Generally speaking, the trade-off here is that VNDs are drastically more convenient and quicker to use than regular NDs because you don't have to constantly switch out filters, though VNDs tend to come at the cost of slightly lower image quality and some color casting as a byproduct of how they work (they stack two polarizing filters and rotate one against the other).

That said, this is one area where you get what you pay for - more expensive VNDs are generally much higher quality than cheap variable ND filters. My favorite brand for run-and-gun filming is currently PolarPro: optically very high quality, barely any color cast, and they don't get stuck on the front threads of your lens like other brands.

ND filters usually come either in a screw-on design that threads onto the front of your lens or as drop-in filters that fit into a matte box system. Regardless of which kind of system you ultimately go with, if you want to create pleasing, high-quality video where you can control exposure AND keep creative options over your depth of field, some type of neutral density filter system is going to be necessary.

 

Difference #4: ISO Performance

Something else to be aware of is that ISO may behave differently in video than what you're used to in photography, depending on the settings you choose for your particular camera.

I almost didn't add this section because there is such a wide variety of differences between camera manufacturers here - and even between different camera models from the same manufacturer - that it's impossible to make this anything but nebulous. But I ultimately decided that it's better to at least point you in the right direction about the things you'll need to look out for and research for your own camera.

The overall idea here is that you'll need to be able to adapt to ISO not working how you're used to in photography. There are a few potential reasons for this and I'll describe a couple of them at a high level.

The first has to do with how your camera may record scene luminance and color spaces differently for video than for photo depending on the settings you’re using.

Now, when you first try to shoot video, you're probably just going to be focused on capturing video with the default, out-of-the-box luminance and color settings.

But you'll very quickly start to see some limitations. Video will feel a lot more like shooting JPEG instead of raw. You don't have the same ability to push the image around that you're used to in photography. You'll come to understand just how much more limited you are with video when it comes to editing things like exposure, dynamic range, and color. So you'll quickly start to look for options above and beyond the out-of-the-box settings.

What is LOG?

Now, in video, the vast majority of cameras can't shoot in raw. One big reason is that video requires a much higher data bandwidth, so compromises have to be made in order to shovel as much data onto your memory cards as fast as possible. So instead of getting unprocessed raw data to work with later, some amount of processing will always be baked in.

The next best thing to shooting in raw is something in video called a logarithmic gamma curve. You might hear people refer to this as a “log picture profile” or simply as “shooting in log.” It's not as flexible as shooting in raw, but it gives you a whole lot more flexibility to edit than if you shot in your standard default video mode.

Different camera manufacturers have their own flavors of log and so there are nuances to working with specific types of log, but the basic idea is that the camera pulls the maximum amount of data it can off the sensor and then compresses it into a logarithmic curve before recording it to the memory card to save space. So even though the camera is still processing the footage, these log profiles are focused on preserving a wider range of tonal information for increased highlight and shadow detail.

Log gamma curves also allow you to film in wider color gamuts - think of these as what you've probably heard referred to as color spaces in photography. Shooting in log will allow you to film a wider color range than can be delivered, which will result in your footage appearing very washed out if you don’t color correct and color grade, but this is how you can preserve as much information as possible straight out of camera.

How shooting in LOG can affect your ISO settings

Now putting aside the fact that you'll need to learn how to color correct and color grade your log footage, something that often throws people for a loop when recording log footage is that your ISO settings appear out of whack compared to what you're used to. To give you an example, I filmed the video at the top of this post in one of Sony's log profiles, S-Log3 (see the video for an example of the ungraded footage). The base photographic ISO on the camera I used is 100. Pretty standard, right?

However, Sony specifies that S-Log3's base ISO is always 8 times higher than your base photography ISO. So my base ISO is 800 when shooting in S-Log3 on that camera (the Sony a7IV). On my Sony a7SIII, the base ISO for photos is 80, so my base ISO for S-Log3 is 640. And you're just going to have to research and memorize things like this for your specific camera.

But, you can start to see what I mean, here. ISO is not absolute between photo and video. Values are relative and can change depending on how you shoot. In this specific example where I'm shooting in S-Log3, ISO 800 is exactly the same as the regular ISO 100.
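
If it helps to see the numbers, here's a minimal sketch of the relationship that 8x multiplier implies: the displayed base ISO value shifts up by three stops even though, as I described above, the underlying gain is the same. The multiplier and base values vary by camera and log profile, so treat this as an illustration, not a lookup table.

```python
# A minimal sketch: an 8x jump in the base ISO number is a 3-stop
# offset, since each stop doubles the ISO value (8 = 2**3).
import math

def log_profile_base_iso(photo_base_iso: float, multiplier: float = 8.0) -> float:
    """Return the (illustrative) log-profile base ISO for a given photo base ISO."""
    return photo_base_iso * multiplier

for photo_base in (100, 80):
    log_base = log_profile_base_iso(photo_base)
    stops = math.log2(log_base / photo_base)
    print(f"photo base ISO {photo_base} -> log base ISO {log_base:.0f} ({stops:.0f} stops higher)")
# photo base ISO 100 -> log base ISO 800 (3 stops higher)
# photo base ISO 80  -> log base ISO 640 (3 stops higher)
```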

Dual Native ISO

Another way your ISO may act in ways you're not expecting - or may not have really known about in photography, unless you're in a niche that really cares about this, like astrophotography - is that keeping your ISO as low as possible may not achieve the best results. It's not always publicized, but there are a fair number of cameras out there that have what's called dual gain circuits, or dual native ISO.

This is one reason why, for instance, in the video I made about using manual mode for photography I was so particular about making sure people understand what ISO actually is as opposed to what people commonly say it is: that it's a gain control, not sensor sensitivity or an exposure control.

In cameras that have dual native ISO, the gist is essentially that they have two different circuits that handle different ranges of ISO amplification before the signal is converted from analog to digital. One circuit - the lower-gain circuit used at lower ISOs - typically prioritizes maximum dynamic range, while the second, higher-gain circuit sacrifices some dynamic range (and changes how that dynamic range is distributed) for the sake of an improved signal-to-noise ratio. What this looks like in practice is that you'll move up through the ISO range and then hit a certain ISO where the image suddenly appears cleaner than it did at lower ISOs - which probably goes against everything you've likely been told about ISO.

Now, as I said, most photographers are blissfully unaware of this ISO rabbit hole, but in video, the noise issue can be exacerbated because you can see the noise moving around - it's not static like in a photo. Plus, noise is harder to clean up in video. So you'll tend to find that videographers are typically much more sensitive to the nuances of noise performance than most photography circles. It's just a harder problem to deal with in video.

So my point in bringing up these ISO caveats is really to call attention to the fact that you may need to readjust what you thought you knew about ISO and that you'll need to dig into how your specific camera handles specific situations.

Questions or comments?

Let me know below in the comment section!

=============================

Relevant Links/Stuff Mentioned:

=============================

► My Favorite Variable ND filter set: https://geni.us/polarpro_vnd_set

If you need only one strength of VND, here is a 2-5 stop: https://geni.us/polarpro_vnd_2-5

...and a 6-9 stop: https://geni.us/polarpro_vnd_6-9

► Best step-up rings that don't get stuck (make sure you grab the right size): https://geni.us/brass-step-up

► My favorite "regular" ND system for photography (magnetic): https://geni.us/kase-filters

► PolarPro makes regular, high-quality ND filters, too: https://geni.us/polarpro_quartzline - fingers crossed that they make a magnetic system someday.

=============================

Gear used to make this video:

=============================

► My favorite camera for video: https://geni.us/sony_A7Siii

► My main, daily photo driver and hybrid camera: https://geni.us/sonyA7IV

► My main landscape photo camera: https://geni.us/sony_a7r4

► The best vlogging and tabletop tripod: https://geni.us/vlogging-tripod

► My Super sharp wide-angle lens: https://geni.us/sony_16-35gm

► The mic I used for my talking head: https://geni.us/deity-vmic-d3pro

► Awesome (and tiny!) wireless mics: https://geni.us/rode-go-2

► Incredible vlogging mic: https://geni.us/sony-digital-mic

► Fast, reliable memory cards I use the most: https://geni.us/prograde-cards

► Affordable, yet still high-quality key light: https://geni.us/godox-sl60w

► Softbox for key light: https://geni.us/neewersoftbox35

► My favorite mini RGB light: https://geni.us/aputuremcRGBWW

► My favorite landscape tripod: https://geni.us/favorite_tripod

► My favorite travel tripod: https://geni.us/travel_tripod

► The most versatile tripod head ever: https://geni.us/acratech-gxp

► The awesome camera bag in this video: https://geni.us/shimoda_explorev2

Dan Fox

Lover of coffee and systems, short-form video creator, photographer, writer, facilitator, rider of motorcycles, and all-around adventurer. Based out of Seattle.

https://foxandlens.com/