Raspberry Pi 4
Clive Semmens (2335) 3276 posts
Yup. My Pi and my Mac mini both produce 3840×2160 outputs, but the Pi does it at 24Hz and the Mac does it at 25Hz. You’d think the monitor would scarcely notice the difference, but no. “Let me think about that for a minute!” Or maybe what it’s thinking about is the change from the HDMI1 input to the HDMI2 input in response to the click of the remote? And of course yes, the Pi’s issuing 3840×2160 video, but RISC OS might be thinking it’s any of a dozen different modes all the while.
David J. Ruck (33) 1629 posts
Even if they were using the same resolution and refresh rate, there are still several different signal timing options. The monitor may choose to resync when inputs are switched even with identical signals. With Linux I use XRDP or VNC to drive one machine from another without changing the monitor input. That works OK for RISC OS using VNCserver, but if it’s more than a couple of minutes it’s worth switching screens to get a smoother desktop.
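A small aside on those timing options: the same visible resolution can be wrapped in different amounts of horizontal and vertical blanking, so the total line and frame timing (and hence the pixel clock) differ even though the picture is identical. A minimal Python sketch follows; the function name and the blanking figures are made up for illustration, not taken from any particular timing standard:

    # Illustration: the same visible 3840x2160 picture carried with two
    # different (made-up) blanking schemes gives two different pixel clocks.
    def pixel_clock_hz(h_active, h_blank, v_active, v_blank, refresh_hz):
        """Total pixels per frame multiplied by the refresh rate."""
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz

    generous = pixel_clock_hz(3840, 1660, 2160, 90, 24)   # roomy blanking
    reduced  = pixel_clock_hz(3840, 160, 2160, 62, 24)    # reduced blanking
    print(f"{generous / 1e6:.2f} MHz vs {reduced / 1e6:.2f} MHz, same picture")

From the monitor’s point of view those are two different signals to lock onto, which is another reason it may resync when the input is switched.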
Clive Semmens (2335) 3276 posts
The remarkable thing here is how smooth the desktop is on the Pi, whatever resolution I use – but they’re all 3840×2160 to the monitor anyway. The Mac’s perfectly smooth too. I’ve got a choice of nine different modes from 1920×1080 to 3840×2160 (I don’t use anything below 1920×1080) on RISC OS, and they’re all fine.
Rick Murray (539) 13806 posts
Ah, but while you might know the signals are identical¹, the monitor cannot make any such assumption.

¹ Actually, they won’t be identical unless they’re sync and frame locked. The exact same signal running six microseconds later isn’t “identical” from the point of view of the monitor, which will have to wait for a number of frames to derive the timing and start of frame.
Clive Semmens (2335) 3276 posts
I’ve no idea how modern digital video signals are coded, Rick 8~). I knew the old analogue schemes inside out (and back to front and right way round…)
Rick Murray (539) 13806 posts
Not so different, actually. I’m sure you know that video in a computer is created digitally by clocking out data (usually eight bits per channel) to a DAC that would convert that byte into a variable voltage representing the intensity of the on-screen dot for that channel (three in parallel for the RGB) at that point in time. Run this really fast and you literally draw hundreds of lines of varying intensity which make up a frame. Then you do it all over again, something like sixty times a second (for VGA etc).

On the monitor side, the line sync and frame sync tell the monitor how it should be tracing out each line on the screen, and the changing intensities in the red, green, and blue signals directly control the electron guns in the neck of the cathode ray tube; maximum signal is maximum voltage is maximum brightness (more or less).

Now enter flat screen LCD monitors. These give a much more stable picture than cathode ray tubes because they don’t continually draw and redraw the same thing. Instead, the analogue input goes through an ADC to convert the analogue signal into a digital representation that isn’t so different to what the computer generated in the first place. This is buffered until a complete frame is available, and then sent to the LCD panel in blocks (depending on how the panel was created).

So what HDMI etc has done is realise that digital to analogue to digital is wasteful and pointless. The data is nominally three channels (red, green, blue) plus clocking, with the basic protocol being 8-bit sRGB expanded to 10 bits and then fiddled with in order to reduce the transitions between zeros and ones. The flyback periods, while meaningless in digital, persist; these are now called “data islands” and are usually used to provide audio data.

There are, of course, various extensions (deeper colour, different gamuts, compressed audio, rights encryption, blah blah) but at its heart HDMI is pretty much the eight-bit video data and clocking signals as would have once been converted to analogue, only now simply sent directly using differential signalling (less signal corruption on long cables).

Some info here: https://warmcat.com/2015/10/21/hdmi-capture-and-analysis-fpga-project-2.html
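For anyone curious what that “expanded to 10 bits and fiddled with” step looks like, here is a rough Python sketch of the per-channel TMDS coding described in the DVI/HDMI specs: an XOR or XNOR chain chosen to minimise 0/1 transitions, then an optional inversion of the result to keep the running DC balance bounded. It is a paper model for illustration only; the function names and the little demo at the end are not taken from any real implementation.

    def ones(x, bits=8):
        """Count the set bits in the low `bits` bits of x."""
        return bin(x & ((1 << bits) - 1)).count("1")

    def tmds_encode(d, cnt):
        """Turn one 8-bit channel value into a 10-bit TMDS symbol.

        d   -- 8-bit red, green or blue value (0..255)
        cnt -- running disparity carried over from the previous pixel
        Returns (10-bit symbol, updated cnt).
        """
        # Stage 1: XOR or XNOR chain, chosen to minimise 0/1 transitions.
        use_xnor = ones(d) > 4 or (ones(d) == 4 and (d & 1) == 0)
        q_m = d & 1
        for i in range(1, 8):
            prev = (q_m >> (i - 1)) & 1
            bit = (d >> i) & 1
            q_m |= ((prev ^ bit) ^ (1 if use_xnor else 0)) << i
        if not use_xnor:
            q_m |= 1 << 8            # bit 8 records which chain was used

        n1 = ones(q_m)               # ones in the low 8 bits of q_m
        n0 = 8 - n1
        bit8 = (q_m >> 8) & 1

        # Stage 2: optionally invert the low 8 bits to keep the DC balance.
        if cnt == 0 or n1 == n0:
            out = ((bit8 ^ 1) << 9) | (bit8 << 8)
            out |= (q_m & 0xFF) if bit8 else (~q_m & 0xFF)
            cnt += (n1 - n0) if bit8 else (n0 - n1)
        elif (cnt > 0 and n1 > n0) or (cnt < 0 and n0 > n1):
            out = (1 << 9) | (bit8 << 8) | (~q_m & 0xFF)
            cnt += 2 * bit8 + (n0 - n1)
        else:
            out = (bit8 << 8) | (q_m & 0xFF)
            cnt += (n1 - n0) - 2 * (1 - bit8)
        return out, cnt

    # Encode a short run of constant mid-grey on one channel; note how the
    # encoder inverts alternate symbols to keep the running disparity bounded.
    cnt = 0
    for _ in range(6):
        sym, cnt = tmds_encode(0x80, cnt)
        print(f"{sym:010b}  cnt={cnt:+d}")

Sync and blanking get their own fixed 10-bit control codes instead of the above, and the audio-carrying data islands mentioned in the post sit in those blanking intervals.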
Clive Semmens (2335) 3276 posts
Up to that point, it’s all stuff I’ve actually done design work on. From that point on, I’ve guessed how things work, but not actually known. Apparently I guessed pretty much correctly.