Channel: July 2020 – Gough's Tech Zone

Note: Capturing VGA Analog Component Frames with an Oscilloscope


Call me slow, but I’ve only just come to realise that our beloved VGA has finally left us, banished to the depths of “legacy” technology. The push to kill off the port was already underway by 2012, but it lingered on, especially at the low end of the market, where monitors continued to feature the port for backwards compatibility and penny-pinching companies opted not to include anything but a basic VGA cable in the package. It could also often be found in facilities with projectors, as a last-ditch compatibility effort.

For many old computers, VGA is amongst the most common forms of video output available. Sometimes integrated on-board, but also popular on later ISA, PCI and early AGP boards, this port served CRTs and early LCDs well. It was a “simple” analog interface, with a few minor technicalities, but very much dear to me.

Having recently published a video where capturing VGA was important, I realised that doing it well is not as straightforward as it might seem. One of the key issues that stuck out to me was that of uncommon, non-standard modes and refresh rates: the standard offers much latitude in how timings are derived, making it hard to find a capture solution that can handle every mode.

What’s in a VGA Signal?

Put simply, the VGA signal is a type of component analog video signal. The red, green and blue signals are carried in separate unbalanced 75-ohm circuits that run from 0V to 0.7V, along with digital horizontal and vertical sync pulses which are 0V to 5V. Aside from this, there are some extra pins for ID bits which were later reallocated to an I2C bus forming the Display Data Channel (DDC), used to identify the capabilities of an attached display device through the EDID programmed into an EEPROM.

The signals themselves don’t only carry the active video of each line, scanned sequentially. Around the active video periods, there are a front porch and a back porch, which allow time for the repositioning of an electron gun to the start of each line in the case of a CRT monitor. There is also a vertical blanking interval, more “dead” time for the movement of the electron gun back to the top of the frame. This means the signal actually contains a lot of “inactive” periods which are often unseen.
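To put some numbers on these “inactive” periods, the classic 640 x 480 @ 60Hz mode can be worked through from its well-documented timing figures:

```python
# Classic 640x480@60Hz "industry standard" VGA timing figures:
# each line is 800 pixel clocks long, but only 640 carry active video.
pixel_clock_hz = 25.175e6                         # dot clock
h_active, h_fp, h_sync, h_bp = 640, 16, 96, 48    # pixels per line
v_active, v_fp, v_sync, v_bp = 480, 10, 2, 33     # lines per frame

h_total = h_active + h_fp + h_sync + h_bp         # 800 clocks per line
v_total = v_active + v_fp + v_sync + v_bp         # 525 lines per frame

line_rate_hz = pixel_clock_hz / h_total           # ~31.469 kHz
refresh_hz = line_rate_hz / v_total               # ~59.94 Hz

print(f"Line rate: {line_rate_hz/1e3:.3f} kHz, refresh: {refresh_hz:.2f} Hz")
print(f"Horizontal blanking: {100*(h_total-h_active)/h_total:.0f}% of each line")
print(f"Vertical blanking: {100*(v_total-v_active)/v_total:.1f}% of each frame")
```

A fifth of every scanline and almost a tenth of every frame is blanking in this mode – exactly the “dead” time described above.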

The signals themselves also vary in terms of their timings, especially depending on the refresh rate and resolution being displayed. Two major systems used to produce the timings are the Generalised Timing Formula (GTF) and the VESA Coordinated Video Timings (CVT), which result in different amounts of blanking time. Early VGA monitors had very strict limitations on the timings, with out-of-spec operation potentially causing damage to their deflection circuitry or being unable to display a stable picture. Later monitors were “multisync” and capable of displaying a range of timings. In fact, I remember “overclocking” monitors by reducing porch intervals while increasing the scan rate to squeeze a little more refresh rate out of them (e.g. 68Hz instead of 60Hz) using tools such as PowerStrip.

Capturing a VGA Signal with an Oscilloscope

To get up and close with a VGA signal, I needed to get it into an oscilloscope.

Rather handily, I have one of these cables that breaks out the VGA connection into five BNC connectors that carry red, green, blue, horizontal and vertical sync. These cables were intended to allow devices with a DE15HD “VGA” plug to be connected to professional workstation monitors or high-end projectors that featured the five BNC connections. Higher end devices would use BNC connections despite their bulk because they offered much better impedance control and cross-talk management with their coaxial shielded cables.

The one technicality is that the VGA system uses a 75-ohm impedance and these BNC connectors are of the 75-ohm variety. Compared to 50-ohm BNC plugs, they differ subtly with a metal ring skirting the outside of the dielectric, a thinner dielectric overall and supposedly “thicker” pins. I’ve been told that cross-connecting 75-ohm BNC plugs to 50-ohm BNC sockets could cause damage, but I’ve never experienced any ill effects with most sockets I’ve come across being capable of accommodating either version. For a quick experiment, I decided to just take the chance and go with it.

I used my Rohde & Schwarz RTM3004 to capture the signal, using channel 1 for red, channel 2 for green, channel 3 for blue and channel 4 for horizontal sync. The vertical sync would be connected to the external trigger in port, to allow for a trigger to occur at the start of each picture.

I suppose if you had just four channels without an external trigger input, there are a few possibilities. One is simply to ignore it, capture more than one frame’s worth of time and live with a non-aligned image (similar to a rolling TV picture with the vertical hold misconfigured). Another option, if your graphics card supports it, is to use the sync-on-green signal to derive the horizontal sync, ignoring the horizontal sync line and substituting the vertical sync on that channel instead. If you had fewer than four channels (e.g. two), then you’re probably looking at a monochrome or duotone image, taking either one colour component and a sync line, or two colour components (sync-on-green plus either red or blue).
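As a sketch of how the sync-on-green approach could work in software: the composite sync sits roughly 0.3V below the blanking level on the green channel, so a simple threshold detector can recover the sync edges from the captured samples. The `find_sync_edges` helper here is hypothetical, not part of rsvga, and operates on a plain list of voltage samples:

```python
# Minimal sketch: recovering sync from a sync-on-green capture.
# With sync-on-green, the sync pulses sit about 0.3V *below* the blanking
# level, so thresholding below ~-0.15V separates them from active video
# (which occupies 0V to 0.7V).
def find_sync_edges(green, threshold=-0.15):
    """Return sample indices where the green channel falls through the
    sync threshold (the leading edge of each sync pulse)."""
    edges = []
    in_sync = False
    for i, v in enumerate(green):
        if not in_sync and v < threshold:
            edges.append(i)      # falling edge into sync
            in_sync = True
        elif in_sync and v > threshold:
            in_sync = False      # rising edge back out of sync
    return edges

# Synthetic example: two sync pulses amongst video-level samples
samples = [0.3, 0.5, -0.3, -0.3, 0.0, 0.7, 0.2, -0.3, 0.1]
print(find_sync_edges(samples))  # → [2, 7]
```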

To configure the oscilloscope, I set the input impedance for all inputs to 50-ohm. Ideally, the lines should be terminated with 75-ohm, but for a proof of concept this would be close enough. Having the load is important for the attached computer to detect the presence of a display, although the mismatched load is likely to make the absolute intensity values incorrect and may cause some signal reflections. In hindsight, as the sync pulse lines are digital, they probably should have been terminated as 1Mohm instead.
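The size of that intensity error can be estimated with a simple Thevenin model of the output stage – a 75-ohm source impedance with an open-circuit swing chosen to give 0.7V into a matched load. This is an assumption for the sake of the estimate, as real VGA DACs are current-output devices, but it gives a feel for the scale:

```python
# Rough estimate of the amplitude error from terminating a 75-ohm VGA
# source into the scope's 50-ohm input, using a simple Thevenin model.
v_open = 1.4      # open-circuit full-scale swing (V) - model assumption
r_source = 75.0   # assumed source impedance (ohms)
r_load = 50.0     # scope input impedance as configured (ohms)

v_matched = v_open * 75.0 / (r_source + 75.0)     # nominal 0.70 V into 75 ohms
v_actual = v_open * r_load / (r_source + r_load)  # what the scope sees

print(f"Full-scale white: {v_actual:.2f} V instead of {v_matched:.2f} V "
      f"({100*v_actual/v_matched:.0f}% of nominal)")
```

Under this model the scope sees about 80% of the nominal amplitude – wrong in an “absolute” sense, but consistent across all three channels, so the image survives.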

Soon enough, with the trigger set to External, falling edge at 300mV, a stable waveform was obtained. The active video area can be seen, along with the repetition of each frame, while the horizontal sync pulses run continuously. That, in essence, is what a VGA signal is.

The next step is to grab it and store it for some analysis. This was done by taking a single shot and then saving the waveform to an attached USB device. To simplify things, I chose to save all visible channels into a single CSV file. The ~30Mpoint record takes almost 45 minutes to save, which is quite slow, but it is a 1.78GiB file! It would probably have been faster to save the binary values of each channel separately and process those afterwards.
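The file size is about what you’d expect for an ASCII dump of that many samples, assuming roughly a dozen characters per value in scientific notation:

```python
# Back-of-envelope check on the CSV size: ~30 Mpoints, each row holding
# a timestamp plus four channel values as ASCII text.
rows = 30e6
cols = 5                 # time, R, G, B, Hsync
chars_per_value = 12     # e.g. "-1.23456E-03" - a rough assumption
bytes_per_row = cols * chars_per_value + (cols - 1) + 1  # commas + newline
total_bytes = rows * bytes_per_row

print(f"Estimated size: {total_bytes / 2**30:.2f} GiB")
```

That lands within a few percent of the observed 1.78GiB, and illustrates why a binary save would be so much more compact.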

Converting the Waveform to an Image

To convert the waveform to an image, I wrote a quick Python 3 program I call rsvga (for obvious reasons), which reads the whole .csv file into RAM and converts it into a raw 16-bit-per-channel bitmap file with optional sample averaging. The program doesn’t strictly treat the “absolute” 0V to 0.7V voltage range as full scale, instead using -0.15V to 1V to accommodate “noise” spikes in the data and allow transients to be visualised. The horizontal width is tied to the sampling rate of the oscilloscope and the averaging factor. As a result, post-processing of the image (resize, curves, crop) is necessary to form a usable image.

Once a raw file has been produced by the Python program, it can be opened in Photoshop, which is capable of importing raw bitmaps. The settings will depend on the source .csv recording, but three channels, 16-bit, in IBM-PC (little-endian) byte order is required.
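If you don’t have Photoshop handy, the same raw file could be repacked into a binary PPM (type P6 with a maxval of 65535), which many viewers and editors open directly. PPM stores its 16-bit samples big-endian, so each value needs byte-swapping. This is only a sketch – the filenames and dimensions below are hypothetical and must match the actual rsvga output:

```python
import struct

# Repack a little-endian 16-bit interleaved RGB raw file (as produced by
# rsvga) into a big-endian binary PPM with a 16-bit maxval.
def raw_to_ppm(raw_path, ppm_path, width, height):
    with open(raw_path, 'rb') as f:
        data = f.read()
    with open(ppm_path, 'wb') as g:
        g.write(f"P6\n{width} {height}\n65535\n".encode('ascii'))
        # Swap each sample from little-endian (raw) to big-endian (PPM)
        for (v,) in struct.iter_unpack('<H', data):
            g.write(struct.pack('>H', v))

# Hypothetical usage - dimensions come from the rsvga export printout:
# raw_to_ppm('capture.raw', 'capture.ppm', 1112, 1125)
```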

A converted image from an old netbook putting out a 1920 x 1080 @ 60Hz signal with a web browser in kiosk mode looks like this:

The horizontal sync pulse period can be seen on the left, as some cross-talk appears to have occurred. The porches are visible on either side of the active image area, along with the vertical blanking between successive repeats of the image. As the capture is started by the vertical sync, alignment at the top is enforced, but the oscilloscope does not detect the “end” of the image.

With the right image manipulations, you can turn it into an image like the one above. It’s surprisingly sharp and clear given that it was just “hacked” together – it’s not easy to tell it was analog at one stage.

I tested my other laptop, a Lenovo E431, outputting 2560 x 1600 at 60Hz for a real challenge. Using a one-pixel checkerboard pattern, we can definitely see some analog effects –

This includes the effect of the oscilloscope’s sampling not being “aligned” with the pixel clock, leading to alternating periods of high and low contrast, made worse by analog “smearing” or “ghosting” effects.

Here, for example, the X for closing the window can be seen to be “ghosted”, along with the white of the window bleeding into the porch area. Part of the reason may be the impedance mismatch due to the use of 50-ohm termination, but the VGA connector or the quality of the motherboard traces could also contribute. There is also noticeable “jitter” in the timing, which makes lines appear “feathery” around the edges. It looks a little worse given that we’re zoomed in.

But even looking at it at normal scale, it is noticeable. I’m sure those who have used very long VGA leads or have seen projectors fed with VGA signals will have seen such artifacts, owing to the analog nature of the transmission.

Conclusion

In the end, a VGA signal is an analog component video signal which can be observed with an oscilloscope. Given a fast-enough oscilloscope, it is possible to capture the signals, record them to a USB stick and convert them back into images. The process is slow and consumes quite a lot of resources, making it practical only for one-off screen grabs, but it should be capable of handling virtually every mode, making it versatile in that regard. The impedance mismatch may degrade the quality of the captured signal, introducing ghosting and causing voltage levels to be incorrect on an “absolute” level.

Appendix: RSVGA Code

The rsvga program is something I coded in half an hour one evening and isn’t very fast or efficient, consuming a lot of RAM (for the longer records, expect to use at least 8GiB). It takes in the .csv “all channels” waveform save from my RTM3004, with an assumed format of a single header row and five columns: time, the red, green and blue amplitudes, and the horizontal sync pulse. From this, it produces a 16-bit-per-component raw bitmap output, with the ability to average several samples together. Each output row starts at the horizontal sync crossing the threshold voltage and runs until the longest row has been output; for rows with fewer samples than the longest row, the remaining pixels have their data taken from the following row. The output values are deliberately not conformant to the strict 0V to 0.7V definition, to allow transient excursions outside these values to be seen, so output images will need to have their levels fixed. The code will accept values from about -0.15V to 1V and does not catch or clip samples outside this range, which will cause an exception to be thrown. No warranties are given for this code – it may not be free of errors, but it did produce the images seen in this post.

# RSVGA by Gough Lui (goughlui.com) - July 2020
# Version 0.2 - Use at Your Own Risk! No Guarantees Provided!
# Python 3 program takes in a .CSV from the RTM3004 with five columns
# (time, R, G, B, Hsync) and interprets them into a 16-bit per channel
# raw bitmap file output with the ability for sample averaging.

import csv
import struct

infile = input("Input Filename: ")
csvarray = []

# Read in whole CSV - Memory Hog(!!!)
with open(infile, newline='') as f:
  reader = csv.reader(f)
  csvheader = next(reader) # Eat the header, but do nothing with it
  for row in reader :
    csvarray.append([float(i) for i in row])

# Compute File Stats
print("File Statistics")
print("Samples: "+str(len(csvarray)))
print("Duration: "+str(csvarray[(len(csvarray)-1)][0]-csvarray[0][0])+" s")
print("Duration after Trigger: "+str(csvarray[(len(csvarray)-1)][0])+" s")
# Look for Sample which is Trigger
for trigt in range(0,len(csvarray)):
  if(csvarray[trigt][0]>=0) :
    break
print("Trigger occurs at sample "+str(trigt))
# Calculate Sample Rate over Whole File
sr = (len(csvarray)-1)/(csvarray[(len(csvarray)-1)][0]-csvarray[0][0])
print("Sample Rate: "+str(sr)+"Hz")
print("")

# Compute Line Stats
print("Line Statistics - After Trigger Only!")
linet = []
hs = 0
for i in range(trigt,len(csvarray)):
  if hs :
    if(csvarray[i][4] > 0.25) : # 0.25V Sync Threshold Detector
      hs=0
  else :
    if(csvarray[i][4] < 0.25) :
      hs=1
      linet.append(i)
print("Detected "+str(len(linet)-1)+" lines.")
linelens = [linet[i+1]-linet[i] for i in range(len(linet)-1)]
sline = min(linelens)
lline = max(linelens)
print("Shortest Line: "+str(sline)+" samples.")
print("Longest Line: "+str(lline)+" samples.")
print("")

# Process Each Line and Export as Raw - 16-bit 3ch interleaved PC order
decfacstr=input("Average n Samples (Integer): ")
decf=int(decfacstr)
outl=(int(lline/decf)+1)*decf
print("Exporting RAW file - 16bpp "+str(int(outl/decf))+" pixels x "+str(len(linet)-1)+" pixels")
outfile = input("Output Filename: ")
g=open(outfile,'wb')
for x in linet : # Each Line in Output
  if(x+outl < len(csvarray)) : # If there's enough data
    for i in range(0,outl,decf) : # For Each Line Sample
      red=0
      green=0
      blue=0
      for k in range(0,decf): # For each decimation point
        red+=csvarray[x+i+k][1]
        green+=csvarray[x+i+k][2]
        blue+=csvarray[x+i+k][3]
      red=red/decf
      green=green/decf
      blue=blue/decf
      # 0.15v offset to avoid negative, 1V full scale.
      # DO NOT let signal get above 0.85V (even transiently)
      # or the code will have an out-of-bounds exception!
      g.write(struct.pack("<H",int((red+0.15)*65535)))
      g.write(struct.pack("<H",int((green+0.15)*65535)))
      g.write(struct.pack("<H",int((blue+0.15)*65535)))
g.close()
print("Done!")
