I know this! I just use H.264 for everything!
Hold on there! Let’s take a step back and break this down. The world of video codecs is vast, filled with myths and outdated opinions. What was true a decade ago is no longer the case. While H.264 seemed to be the standard for distribution a few years back, things have changed. Different codecs are used in production, on web platforms, and by streaming services, and the best choice varies widely.
Starting with an easy one, can I tell what codec a file uses from its file extension?
No, you can’t. The file extension only tells you the type of container, not the codec used to compress the video information inside. Containers like MOV, MP4, MXF, and MKV describe how the information is stored, but they don’t reveal the exciting part of the codec story. Each container has its own limitations and capabilities. For example, you can’t include a timecode track in an MP4 like you can in a MOV. (By the way, MP4 is actually an ISO standard based on MOV.)
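If you want to see this for yourself, ffprobe (part of the free FFmpeg package) can report both the container and the codecs inside it. Here’s a minimal sketch in Python, assuming FFmpeg is installed; the file name is just a placeholder.

```python
# Minimal sketch: ask ffprobe (ships with FFmpeg, assumed installed) what's
# actually inside a file. The extension only hints at the container; the
# codec is reported per stream. "my_clip.mov" is a placeholder name.
import json
import subprocess

def inspect(path: str) -> None:
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-show_format", "-show_streams", "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(result.stdout)
    print("Container:", info["format"]["format_name"])
    for stream in info["streams"]:
        print(stream["codec_type"], "stream:", stream.get("codec_name"))

inspect("my_clip.mov")  # a .mov could hold ProRes, H.264, HEVC and more
```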
What are a few common codecs?
You may have encountered H.264 (or AVC), HEVC, ProRes, and DNxHD. AV1 and VP9 are widely used in distribution but not as much in production. Each codec has its own level of compression, with some being heavily compressed (H.264, HEVC) and others less so (ProRes, DNxHD, DNxHR). It’s also important to note that most heavily compressed codecs have variable bit rates, while less compressed intermediate codecs like ProRes have fixed data rates at a given frame size and frame rate.

Is data rate important?
Absolutely! The data rate determines the file size. Generally, newer compressed codecs are more efficient: at the same data rate, they deliver higher image quality than older codecs, which need more data to reach the same quality. However, it’s worth noting that different encoding apps can yield different results.
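The arithmetic behind that is worth seeing once: file size is just data rate multiplied by duration (divided by eight to turn bits into bytes). A quick sketch, with rates that are only illustrative:

```python
# Quick sketch of the data rate → file size relationship.
# The rates below are illustrative, not recommendations.
def file_size_gb(data_rate_mbps: float, duration_minutes: float) -> float:
    bits = data_rate_mbps * 1_000_000 * duration_minutes * 60
    return bits / 8 / 1_000_000_000  # bits → bytes → gigabytes

for label, mbps in [("H.264 at 20 Mbps", 20),
                    ("HEVC at 10 Mbps", 10),
                    ("ProRes 422 at ~120 Mbps", 120)]:
    print(f"{label}: about {file_size_gb(mbps, 10):.1f} GB for a 10-minute clip")
```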
I’m in production. Why should I prefer one codec over another?
Different types of shoots have different requirements. Event shooters may need moderate file sizes with high quality (H.264 or HEVC). Feature cinematographers prioritize pristine quality at any file size (ProRes 422 HQ or better, camera-native formats, or RAW formats like BRAW, ARRIRAW, or ProRes RAW). Game streamers require high-resolution screen recordings with high frame rates but moderate data rates. Social media content creators often prioritize quick transfers. There’s no one-size-fits-all solution.
Regardless of your shooting preferences, remember to keep post-production in mind. Shooting in a codec that requires conversion before playback adds extra work in post.
So what about post-production?
If you’re an editor, you want to work with high-quality files that are easy to play back. Depending on the shoot, you might be able to use camera-original files like H.264, HEVC, or ProRes if your computer can handle them. However, if your production used a codec that is challenging to work with, you might consider converting to a high-quality intermediate codec (ProRes, DNxHD, DNxHR) or a lower-quality offline proxy format (ProRes Proxy or H.264) to keep storage usage down.
If you’re involved in VFX or color, you likely prioritize the absolute highest quality files, and file size becomes less of a concern. Choose the best flavor of the intermediate codec preferred by your provider. (Interestingly, many VFX pipelines rely on still-image sequences rather than movie files.)
What about video consumers?
Most consumers don’t directly care about codecs, but they do care about the effects codecs have on their experience. The speed of playback and battery life on mobile devices are the primary concerns, with quality a distant third for most. Smaller file sizes reduce the likelihood of interruptions due to slow networks, and codecs that are hardware-accelerated on most devices (like H.264) help save battery life. Providers strive to strike a balance between quality and file size, but they may have to choose a codec based on what’s compatible with their customer base’s devices.
If you want the best possible image quality, consider purchasing content explicitly. Paid streaming services generally offer higher quality than mainstream “free with your plan” streaming, and Blu-ray is usually even better.
So, streaming providers always choose a codec that’s easy for consumers to play back?
Not necessarily. YouTube, for example, primarily uses the VP9 codec for desktop computers, even though it’s not widely hardware-accelerated like H.264 and HEVC are. This choice impacts CPU usage and, when unplugged, battery life. However, YouTube still sends videos to iOS and iPadOS devices in the hardware-accelerated H.264 codec, as they are more likely to be used on the go.
Why not stick with H.264 across the board?
Simply put, H.264 doesn’t offer the same level of quality as more modern codecs at the same data rate. Additionally, H.264 hardware decoders struggle with the 10-bit data required for HDR and with resolutions above 4K. To move things forward, the same standards process that brought us H.264 introduced HEVC (H.265) as its successor, offering the same quality at a lower data rate, and has since followed up with H.266 (VVC), which promises even better compression. However, YouTube doesn’t use either HEVC or VVC.
So why doesn’t YouTube use HEVC?
Some codecs, including HEVC, are patent-encumbered, meaning that the algorithms and methods used to compress video aren’t all free to use. Large companies that use H.264 and HEVC typically join patent pools, paying fees and sharing relevant patents to manage the licensing process. HEVC has multiple patent pools and is more expensive to support than H.264.
Moreover, YouTube is a massive platform. If they were to use HEVC extensively, an unknown patent holder outside of existing pools could potentially make a claim related to some obscure part of the encoding process, leaving YouTube liable for significant financial consequences.
To avoid these issues and maintain similar data rates while increasing resolution, YouTube switched to the open VP9 codec for resolutions higher than 1080p. They also utilize another open codec, AV1, for certain videos and resolutions, although it demands more from playback devices.

So to avoid patent issues and high costs, everyone else moved to VP9 too?
Nope. VP9 is not hardware accelerated on most devices, and to some at least, it’s seen as a Google invention. There are questions about how patent-free it truly is, too. Instead, some (like Apple) use HEVC, pay the fees, and rely on the patent pools to cover themselves. They’ve even created an extension, MV-HEVC, to cleverly store the separate 3D video streams used by Spatial Video.

Others (like Netflix) have skipped over VP9 and moved to AV1, a newer open, royalty-free codec designed to avoid patent issues and minimize costs while compressing data even further. The Alliance for Open Media, creators of AV1, counts many high-profile members, including Google, Netflix, Microsoft, Intel, and (though they joined late) Apple.
Hang on, royalties? Do I have to pay per-video with some codecs?
Codecs like H.264 are only truly free to use when the videos are delivered free to consumers. In theory, if you’re delivering paid training courses or other video, then your delivery platform should probably be paying a royalty on your behalf — but I’m not a lawyer and this is certainly not legal advice.
Royalties are at least part of the reason why so many video games use the ancient-but-royalty-free Bink Video codec instead of something more modern. Standards are great, but they won’t always be widely adopted if they’re not free.
So should we all be moving to AV1?
Probably not yet, because working with complex codecs purely in software can be slow and intensive. Hardware-accelerated decode support for AV1 only just arrived in the latest M3 chips, and there’s no accelerated encode on Macs yet. DaVinci Resolve only added AMD-based hardware encoding of AV1 on PCs over the last year or so, but no other major NLE has followed suit, and playback is still not universal; Windows needs a plug-in for support, for example.
Let the streaming services worry about bleeding-edge codecs for now, and stay with things that are easy to work with. Use a higher data rate to keep quality high, or switch to an encoding app that does a better job with existing codecs. Eventually it’ll become much easier to make and play back AV1, but don’t worry about it until it turns up on a spec sheet with your name on it.
What programs should I use for converting between one codec and another?
Compressor and Adobe Media Encoder are popular and widely used for production tasks, like making proxies or converting to intermediate codecs. Note that some NLEs can do this for you too.
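If you’d rather script those conversions, FFmpeg can do the job too. Here’s a rough sketch using its prores_ks encoder (profile 3 is 422 HQ), assuming FFmpeg is installed; the folder names are placeholders, and Compressor or Media Encoder will give you Apple’s own encoder plus a friendlier interface.

```python
# Rough sketch: batch-convert camera files to ProRes 422 HQ with FFmpeg's
# prores_ks encoder. Assumes ffmpeg is installed; paths are placeholders.
import subprocess
from pathlib import Path

SOURCE = Path("camera_originals")   # hypothetical folder of camera files
DEST = Path("prores_hq")
DEST.mkdir(exist_ok=True)

for clip in SOURCE.glob("*.mp4"):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "3",   # 0=Proxy, 1=LT, 2=422, 3=HQ
        "-pix_fmt", "yuv422p10le",                # 10-bit 4:2:2
        "-c:a", "pcm_s16le",                      # uncompressed audio, typical for intermediates
        str(DEST / (clip.stem + ".mov")),
    ], check=True)
```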
What codecs should I be using for editing?
Whatever your system plays back efficiently, which usually means whatever your system has hardware acceleration for. On a modern Mac, that’s H.264, HEVC, or ProRes; on a PC, your CPU and/or your GPU should accelerate H.264 and HEVC at least. Still, check actual footage against your workflow, because not all files are equal. AV1 decode acceleration just turned up on the M3 Macs, but it’s not really a codec intended for editing.
What codecs should I be using for collaboration?
Start by taking the list of codecs everyone in the pipeline can work with, then figure out what needs to happen to that footage. If it’s copied once, processed in an NLE, then output, camera-native files might be fine, assuming they’re hardware accelerated in that NLE.
But if you need to send files to an unknown third party on a random computer, or if some camera-native files are not easy to work with, converting at least those files into some kind of intermediate or mezzanine codec like ProRes (or DNxHD/DNxHR for Avid) will help keep workflows smooth. If you have the space, use ProRes/DNxHD/DNxHR to replace the camera-original files; if there’s too much footage to allow that, use a lower-quality proxy codec (potentially even H.264) as a smaller stand-in during editing. If you go with proxies, switch back to the original files for final output.
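To make the proxy idea concrete, here’s a minimal sketch of generating small H.264 stand-ins with FFmpeg; the size and quality settings are assumptions rather than recipe values, and most NLEs will generate and relink proxies for you automatically.

```python
# Minimal sketch: make a small H.264 proxy to edit with, then relink to the
# original for the final output. Names and settings are illustrative.
import subprocess

def make_proxy(original: str, proxy: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", original,
        "-vf", "scale=-2:540",                    # shrink to 540p, keeping the aspect ratio
        "-c:v", "libx264", "-crf", "23", "-preset", "fast",
        "-c:a", "aac", "-b:a", "128k",
        proxy,
    ], check=True)

make_proxy("A001_C001.mov", "A001_C001_proxy.mp4")  # keep names matched for easy relinking
```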
Test your workflows, and don’t assume that wisdom from five or ten years ago still applies. Converting everything to ProRes will work smoothly, but the file sizes are too much for some productions, and it might not be necessary.
I need to share files with transparency. Should I still use the Animation codec?
Nope, that won’t work in the Finder, Final Cut Pro, a random client PC, or any mobile device. Instead, use ProRes 4444, which supports transparency with smaller files that play back faster on modern Macs. It will also work anywhere ProRes playback is supported, and since ProRes decoding is an open standard, that should theoretically remain true forever. HEVC can also carry transparency, but if you’re in production, you’ll probably want the higher-quality ProRes 4444 instead.
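If you’re rendering outside an NLE, here’s a rough sketch of a ProRes 4444 export with an alpha channel via FFmpeg’s prores_ks encoder (profile 4 is 4444, and the yuva444p10le pixel format keeps the alpha). The file names are placeholders; your NLE’s own ProRes 4444 export does the same job.

```python
# Rough sketch: encode a clip with transparency to ProRes 4444 using FFmpeg.
# Profile 4 = 4444; the "a" in yuva444p10le is the alpha channel.
# Assumes ffmpeg is installed and the source actually carries an alpha channel.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "title_with_alpha.mov",   # placeholder source
    "-c:v", "prores_ks", "-profile:v", "4",
    "-pix_fmt", "yuva444p10le",
    "titles_prores4444.mov",
], check=True)
```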
OK, what about archiving?
Codec longevity is a concern for archivists, because many older codecs are no longer supported the way they once were. You’ll usually be able to play those clips back somehow, but you might have to track down an older system or a specialist program to access them. For example, Apple deprecated its oldest codecs several years ago, meaning that files encoded with the once-popular Cinepak and Animation codecs don’t work in modern Apple software.

ProRes is a safe archiving choice for high-quality media: ProRes decoding is an SMPTE standard, so it should remain readable well into the future. Creating additional copies in common, industry-supported standards like H.264 and HEVC, using a high data rate, is also a sensible idea.
What codecs should I use for deliveries to web platforms and social media?
H.264 or HEVC for SDR content, HEVC for HDR content, or ProRes for everything if you have the bandwidth. Modern Macs have hardware encoders to accelerate all those deliverables, and PCs with decent GPUs can accelerate exports to most modern compressed formats. Whichever codec you choose, keep the data rate high.
Every web platform will recompress whatever you send them to a much smaller file, so you just need to send them a file that looks great to you, in any codec they support. Platforms usually recommend specific codecs or data rates, but they’ll actually take just about anything. Sometimes official guidelines are far lower than I’d be comfortable delivering, though. For example, if I’m delivering a 24/25/30 fps 1080p SDR H.264 to YouTube, I’d want to use a data rate of 20Mbps, but YouTube asks for just 8Mbps, and will show users around 2.5Mbps. For comparison, a ProRes 422 file requires around 120-140Mbps.
If you want to send the highest possible quality, you can upload ProRes, but keep in mind that those much larger files will take far longer to upload.
What about professional deliverables?
When it comes to professional deliverables, the focus is on quality rather than file size, which means older codecs may still be required if the specifications haven’t changed. Be prepared for a potentially inconvenient and time-consuming process, especially when creating formats like DCP or IMF. Interestingly, DCPs and Netflix IMFs both use JPEG2000 as the basis for compression, despite it being a still-image format.

For specific production deliverables, carefully review the spec sheet and consider using apps like Compressor or Adobe Media Encoder to hit the precise combination of codec, container, and data rate required.
What if I need to make really small files?
Typically, the best way to share files with clients is through platforms like Vimeo, frame.io, or Dropbox Replay, where you can simply send them a link. However, if you need to create highly compressed files for local playback, tools like Handbrake (free and easy) or FFmpeg (free but complex) can be used with codecs like H.264 or HEVC. Keep in mind that encoding to newer codecs may be slower and leave you with less compatible files.
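For the FFmpeg route, a quality-targeted (CRF) encode is the usual approach. Here’s a minimal HEVC sketch; the CRF value and file names are assumptions, and you can swap libx265 for libx264 if compatibility matters more than size.

```python
# Minimal sketch: a small HEVC file via FFmpeg's CRF (constant quality) mode.
# Lower CRF = higher quality and bigger files; ~26 is a rough starting point.
# Assumes ffmpeg is built with libx265; filenames are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "approved_master.mov",
    "-c:v", "libx265", "-crf", "26", "-preset", "medium",
    "-tag:v", "hvc1",                  # helps Apple players recognise the HEVC stream
    "-c:a", "aac", "-b:a", "160k",
    "approved_master_small.mp4",
], check=True)
```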
Can you give me a one-line summary?
No, codecs are complex. However:
- H.264 is still a decent choice for many videos today.
- HEVC is a solid step up in efficiency, and the usual choice for HDR.
- Intermediate codecs like ProRes remain dominant in production.
- Hardware encode and decode support is crucial for widespread codec adoption.
- In the future, many of us may move to open, royalty-free codecs like AV1 for many deliverables, but that day is not today.