Disclosure: As an Amazon Associate I earn from qualifying purchases. This post may contain affiliate links, meaning, at no additional cost to you, I earn from qualifying purchases. Affiliate links are marked with #ad. "I" in this case means the owner of FilmDaft.com. Please read the full disclosure for more info.
When it comes to connecting your video camera for live streaming or to an external on-camera monitor for checking focus, most videographers use SDI as it is the film industry standard.
But what does SDI stand for? What does it mean? And is SDI better than alternatives like HDMI? Let's find out. First, here's a short definition:
SDI stands for 'Serial Digital Interface' and is a family of digital video interface standards for transferring video, audio, timecode, and metadata over a 75-ohm coaxial cable with BNC connectors. Each SDI standard is defined by the speed at which it transports data: for example, 3G-SDI (3 Gbit/s) can carry 1080p video at 60 fps, while 12G-SDI (12 Gbit/s) can carry a 4K UHD signal at 60 fps.
Ok, that was the quick explanation. Let’s dive into this in more detail below and answer some of the frequently asked questions about SDI.
What are the SDI standards commonly used in video and film production today?
SDI is an umbrella term for a series of different standards for transferring video-related data. SDI was first standardized by SMPTE (The Society of Motion Picture and Television Engineers) in 1989.
There are many different standards under the SDI umbrella (broadcast, stereoscopic 3D video, dual-link, quad-link, and so on), but I'm not going to touch upon all of them here. I'm also not going to address interlaced formats.
Instead, I’m going to focus on the standards mostly used for modern digital video cameras to send a signal to an external monitor on top of your camera or to the director on set.
I find that the most commonly used standards on cameras used by small-scale production companies and indie filmmakers are 3G-SDI and 12G-SDI.
3G-SDI has been the most used professional standard for years, but things are about to change, and I see 12G-SDI becoming the new standard.
Each of these is designed for a specific bit rate and video format. In other words, each can transfer an uncompressed video signal at a specific resolution.
Here’s a table for a quick overview:
| Standard | Name | Bit Rate | Video Format | Designed To Carry Up To |
| --- | --- | --- | --- | --- |
| SMPTE 292 | HD-SDI | 1.485 Gbit/s | HD-ready, HD, 720p | 1280×720 (60 fps) |
| SMPTE 424 | 3G-SDI | 2.970 Gbit/s | Full HD, 1080p | 1920×1080 (60 fps) |
| SMPTE 2081 | 6G-SDI | 6 Gbit/s | Full HD, 1080p; UHD, 2160p | 1920×1080 (120 fps) and 3840×2160 (30 fps) |
| SMPTE 2082 | 12G-SDI | 12 Gbit/s | UHD, 4K, 2160p | 3840×2160 (60 fps) |
| SMPTE 2083 | 24G-SDI | 24 Gbit/s | UHD 4K, C4K/4K DCI, 8K | 3840×2160 (120 fps) and 7680×4320 (60 fps) |
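As a rough cross-check of the table above, you can estimate the uncompressed payload of a video format and match it against the nominal SDI bit rates. This is a simplified sketch of my own: it assumes 10-bit 4:2:2 sampling and ignores blanking, audio, and ancillary data, so real payloads are somewhat higher than this naive estimate.

```python
# Nominal SDI bit rates in Gbit/s, from the table above.
SDI_STANDARDS = {
    "HD-SDI": 1.485,
    "3G-SDI": 2.970,
    "6G-SDI": 6.0,
    "12G-SDI": 12.0,
    "24G-SDI": 24.0,
}

def payload_gbps(width, height, fps, bits_per_pixel=20):
    """Rough uncompressed payload; 10-bit 4:2:2 averages 20 bits per pixel."""
    return width * height * fps * bits_per_pixel / 1e9

def smallest_standard(width, height, fps):
    """Return the slowest SDI standard whose nominal rate covers the payload."""
    need = payload_gbps(width, height, fps)
    for name, rate in SDI_STANDARDS.items():  # dicts keep insertion order
        if rate >= need:
            return name
    return None

print(smallest_standard(1920, 1080, 60))  # 3G-SDI
print(smallest_standard(3840, 2160, 60))  # 12G-SDI
```

Note how the numbers line up with the table: 1080p60 needs about 2.5 Gbit/s, which fits inside 3G-SDI's 2.970 Gbit/s, and 4K60 needs about 10 Gbit/s, which only 12G-SDI can carry on a single cable.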
These are the SDI standards you'll most commonly find on modern cinema cameras today.
The higher the resolution and frame rate a camera can send to an external monitor, the faster the SDI connection it needs.
The Panasonic VariCam can output two 6G-SDI signals, but 6G-SDI isn't commonly used.
What SDI output do you need for your on-camera monitor?
Unless you use a 4K monitor or external recorder and your camera can output a 4K signal over SDI, a 3G-SDI connection will often be sufficient. Most 5″ and 7″ external monitors are only Full HD and will downscale a 4K signal to 1080p anyway.

On 5″ and 7″ monitors, 4K doesn't make much sense, because you can't see the difference between Full HD and 4K on screens that small.
But if you were to transmit a 4K signal to a large 4K screen, then 12G-SDI starts to make a lot of sense. Especially if you're into broadcasting and live streaming, since you no longer have to use two or four (dual/quad-link) cables to transmit a 4K signal; you can use a single 12G-SDI cable instead.
That’s also why a camera such as the Blackmagic URSA Mini 12K has a 12G-SDI connection for broadcasting and “only” a 3G-SDI connection for outputting to an external monitor.
If your camera “only” has 12G-SDI, don’t stress. It’s backward compatible, so you can use it with monitors that support slower standards as well.
SDI vs HDMI. What is the difference?
The biggest difference between SDI and HDMI is that SDI only transfers raw data, while HDMI transfers image data.

For example, if you send the RAW sensor output to an Atomos recorder via SDI, you’re only transferring a stream of 1s and 0s. That data then has to be interpreted and encoded within the recorder before you can see an image on its screen.
When you’re using HDMI, you’re instead sending the image data as it is created within the camera itself.
So which connection is best – SDI or HDMI? And what are the pros and cons of each system?
Benefits of using SDI
SDI is the film industry standard because it provides a more stable connection in general.

The reason is that an SDI cable consists of a single conductor that works as a pipeline for serial digital data. This makes it more stable over long cable runs and less prone to interference.
Also, the BNC connectors lock the cable in place, so you don’t have the risk of accidentally pulling out the cable on set.
Limitations of using SDI
Now, SDI also has some limitations.
First of all, SDI is expensive. You pay a lot of money for the coax cable compared to other solutions such as Wi-Fi or fiber.
Second, SDI is heavy – especially compared to fiber-connections.
Third, even though you can use longer SDI cables than HDMI cables, this doesn’t mean that you can roll out 2 km of SDI cable and be fine. And the faster the SDI connection, the shorter the cable you can use for reliable data transfer.
Below, I’ve created a table of the SDI coax cable lengths you can use before you need to boost the signal with a repeater.
As a rule of thumb, you can expect about half the cable run when you upgrade one step to a faster standard.
| SDI Connection | Max Cable Length |
| --- | --- |
| 3G-SDI | ~50 m (longer with reclocking) |
| 6G-SDI | ~25 m |
| 12G-SDI | ~10 m (longer over fiber) |
| 24G-SDI | ~5 m |
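The rule of thumb above can be sketched in a few lines: given a maximum passive run per standard, you can estimate how many reclockers a longer cable run would need. The figures and the helper function are illustrative assumptions of mine, not a substitute for testing your actual cables.

```python
import math

# Approximate max passive coax runs in meters before a reclocker is needed.
# Real-world limits depend heavily on cable quality and interference.
MAX_RUN_M = {"3G-SDI": 50, "6G-SDI": 25, "12G-SDI": 10, "24G-SDI": 5}

def repeaters_needed(standard, distance_m):
    """Estimate how many reclockers a run of distance_m needs."""
    max_run = MAX_RUN_M[standard]
    # Each segment between reclockers can be at most max_run meters long.
    segments = math.ceil(distance_m / max_run)
    return max(segments - 1, 0)

print(repeaters_needed("3G-SDI", 40))   # 0 (fits in a single run)
print(repeaters_needed("12G-SDI", 30))  # 2 (three 10 m segments)
```

This also illustrates the trade-off in the table: the same 30 m run that a 3G-SDI cable handles passively needs two reclockers at 12G.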
If you move to fiber connections, you can run the cables much longer. But this is getting into broadcast territory and is beyond the scope of this article.
So if SDI is the industry standard, why would anyone use HDMI? Let’s find out.
Benefits of using HDMI
HDMI started as a consumer standard for transferring video and audio between devices such as televisions and DVD players.
Because HDMI transfers the image data as it is created in your camera, you can send something like a 4K 10-bit HDR or HLG signal over HDMI.
Another benefit of HDMI is that both devices are able to “talk” to each other. For example, when you switch on your Apple TV, it can switch on your TV at the same time, because the TV is able to respond to the Apple TV and vice versa.

Because HDMI works both ways, i.e., your camera and your monitor talk to each other, you can control features on your camera via HDMI.
For example, if you connect an Atomos recorder to certain cameras from Sony, Panasonic or Canon, you can press ‘record’ on the camera, and the Atomos recorder will automatically start recording as well – and vice versa.
You can’t do this with an SDI cable without an extra cable or a timecode trigger.

For example, on my RED Komodo, I need an extra control cable besides the SDI cable before I’m able to control the camera from my SmallHD 702 Touch. In other words, I need two cables instead of one, which I’m not a huge fan of.
Limitations of using HDMI
Compared to an SDI cable, which is a single thick strand of cable, an HDMI cable consists of several thinner wires that are prone to breaking and to interference.

HDMI doesn’t work over longer cable runs. As a rule of thumb, you need to stay below 5 meters, and probably less if there’s a lot of interference from LCD screens, Wi-Fi, etc.

Also, HDMI plugs don’t have a locking mechanism, so they can easily be pulled out of your camera or monitor unless you buy an extra locking mechanism for your camera cage or monitor.
So that sums it up.
SDI is the film industry standard because it provides a more stable connection that is less prone to interference, especially over long cable runs. The locking BNC connectors hold the cable in place on both camera and monitor, so you don’t accidentally pull it out.

HDMI is the consumer-based standard for transferring video and audio between devices. It’s more prone to breaking, can easily be pulled out of the camera or monitor, and is also more susceptible to interference.

You can get a lot more technical about SDI, and there’s a lot that I’ve left out. But I hope this has provided you with some useful insight into the pros and cons of using SDI and HDMI in video and film production.
As always, if you got any comments or questions, please let me know in the comment section below.
About the author:
Jan Sørup is a videographer and photographer from Denmark. He owns filmdaft.com and the Danish company Apertura, which produces video content for big companies in Denmark and Scandinavia. Jan has a background in music, has drawn webcomics, and is a former lecturer at the University of Copenhagen.