Broadcast equipment, monitors for example, has HDMI inputs/outputs, SDI inputs/outputs, or both.
When a device sports SDI (Serial Digital Interface) connections, it is characterized by a specific standard set by SMPTE, e.g. "3G-SDI".
What does that mean? Well, it's a combination of supported transmission bitrates and video formats.
To simplify, each flavor pairs a maximum bitrate with the video formats it can carry, and all of them transport 10-bit video natively (a quick arithmetic check follows the list):
- SD-SDI supports legacy NTSC 480i and PAL 576i formats at 270 Mbps max.
- HD-SDI supports 1080i and 720p at 1.485 Gbps max.
- 3G-SDI supports 1080p60 at 2.970 Gbps max.
- 6G-SDI supports up to 2160p30 ("4K") at 6 Gbps max.
- 12G-SDI supports 2160p60 at 12 Gbps max.
- 24G-SDI supports 4320p30 ("8K") at 24 Gbps max.
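Those ceilings are not arbitrary: an SDI link serializes the whole raster, blanking included, at 20 bits per pixel for 10-bit 4:2:2 (10 bits of luma plus 10 bits for the alternating chroma sample). Here is a minimal Python sketch of that arithmetic; the 2200x1125 total raster for 1080-line video is the standard SMPTE value, while the function names and the flavor table are just my illustration.

```python
# Back-of-the-envelope check of the list above.
SDI_CAPS_MBPS = {            # nominal single-link rates, in Mbps, from the list above
    "SD-SDI":     270,
    "HD-SDI":   1_485,
    "3G-SDI":   2_970,
    "6G-SDI":   6_000,
    "12G-SDI": 12_000,
    "24G-SDI": 24_000,
}

def raster_bitrate_mbps(total_w, total_h, fps, bits_per_pixel=20):
    """Data rate of a full raster (blanking included) in Mbps."""
    return total_w * total_h * fps * bits_per_pixel / 1e6

def min_sdi(bitrate_mbps):
    """Smallest SDI flavor whose nominal rate covers the signal."""
    for flavor, cap_mbps in SDI_CAPS_MBPS.items():
        if bitrate_mbps <= cap_mbps:
            return flavor
    raise ValueError("no single-link SDI flavor is fast enough")

# 1080p60: the standard total raster is 2200 x 1125 samples.
rate = raster_bitrate_mbps(2200, 1125, 60)
print(f"{rate:.0f} Mbps -> {min_sdi(rate)}")   # 2970 Mbps -> 3G-SDI
```

Note how 1080p60 lands exactly on the 3G-SDI ceiling: the rate was defined around the format, not the other way around.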
Audio is always embedded, and color subsampling varies; it is most often YCbCr 4:2:2, with standardized colorimetry (e.g. Rec. 709).
Now, practically, a 4K monitor needs a 6G-SDI or faster input to accept a 4K signal via SDI. A "lesser G" SDI input will only accept HD signals, so any 4K source will have to be connected to the HDMI 2.0+ input instead.
Example:
This "BM 4KS" line of Lilliput monitors has 3G-SDI, so no 4K signal via SDI; 4K is only compatible with the HDMI input.
This "BM 12G" line and this "Q 12G" line of Lilliput monitors have 12G-SDI, so 4K signals can be routed to either the SDI or the HDMI inputs.
The main differentiation between SDI and HDMI in post is latency: HDMI typically adds more of it, which might or might not be a problem for your application.
More info about HDMI on Wikipedia.
More info about SDI on Wikipedia.