By Adam Taylor

With the hardware platform built using the Zynq-based Avnet MiniZed dev board, the next step in this adventure is to write the software so we can display images on the 7-inch touch display. To do this, we need to write a bare-metal software application that does the following:

Configure the video timing controller (VTC) to generate the timings required for the 800x480-pixel WVGA (Wide Video Graphics Array) display.

Create three frame buffers within the PS (processing system) DDR SDRAM.

Configure the FLIR Lepton IR camera and store images in the current write frame buffer.

Configure the VDMA to read from the current read frame buffer.

The first step is to configure the VTC to generate video timing signals for the desired resolution. Failing to do this correctly means the AXI-Stream-to-Video-Out block won’t lock to the AXIS video stream.

The VTC is a core component, present in most image-processing pipelines (ISPs). The VTC’s function is not just limited to generating timing signals; it also detects video input timing. This feature allows the VTC to lock its timing generation with input video streams. That’s a key capability if the ISP needs to be agile and if it’s to adapt on the fly to changes in input resolution.

The VTC generator can be configured either from its own registers, which we update by writing to them directly, or from the VTC detector registers. For this exercise, we need to set the VTC generator register sources correctly because we are only using the generator half of the VTC and not the detector half. The VTC’s power-on default is to take configuration data from the detector registers, which is not the mode we want here. To set the VTC register source, we use a variable of the structure type XVtc_SourceSelect in conjunction with the function XVtc_SetSource().
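As a sketch, assuming the standard Xilinx standalone VTC driver (xvtc.h) — note that the device-ID macro name is a placeholder that depends on your hardware design — the source-selection step looks something like this:

```c
#include <string.h>
#include "xvtc.h"          /* Xilinx standalone VTC driver */
#include "xparameters.h"

XVtc Vtc;

void vtc_set_generator_sources(void)
{
	XVtc_Config *Cfg;
	XVtc_SourceSelect SourceSelect;

	/* XPAR_VTC_0_DEVICE_ID is a placeholder; the macro name varies per design */
	Cfg = XVtc_LookupConfig(XPAR_VTC_0_DEVICE_ID);
	XVtc_CfgInitialize(&Vtc, Cfg, Cfg->BaseAddress);

	/* 0 selects the detector registers (the power-on default);
	 * 1 selects the generator's internal registers */
	memset(&SourceSelect, 0, sizeof(SourceSelect));
	SourceSelect.VChromaSrc     = 1;
	SourceSelect.VActiveSrc     = 1;
	SourceSelect.VBackPorchSrc  = 1;
	SourceSelect.VSyncSrc       = 1;
	SourceSelect.VFrontPorchSrc = 1;
	SourceSelect.VTotalSrc      = 1;
	SourceSelect.HActiveSrc     = 1;
	SourceSelect.HBackPorchSrc  = 1;
	SourceSelect.HSyncSrc       = 1;
	SourceSelect.HFrontPorchSrc = 1;
	SourceSelect.HTotalSrc      = 1;

	XVtc_SetSource(&Vtc, &SourceSelect);
}
```

This is hardware-configuration code, so it only runs on the target.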

Together, these lines of code set VTC control-register bits 8 to 26, each of which determines the source for a specific generator register. For example, bit 8 controls the Frame Horizontal Size register: setting this bit to “0” instructs the VTC to use the detector settings, while a “1” instructs it to use the generator’s internal register settings.

Failing to do this results in writes to the generator registers having no effect on the generated video timing, which can be a rather frustrating issue to track down.

With the correct register source set, the next step is to write the timing parameters required for the 7-inch touch display.

These parameters are stored in a variable of the XVtc_Timing type and written into the VTC using the XVtc_SetGeneratorTiming() function.
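The original post presents the timing values in a table that is not reproduced here; the numbers below are illustrative WVGA-style placeholders, not the author’s figures — take the real values from the display’s datasheet. Assuming the standard xvtc.h driver, the step looks roughly like:

```c
#include "xvtc.h"

extern XVtc Vtc;   /* initialised elsewhere with XVtc_CfgInitialize() */

void vtc_set_wvga_timing(void)
{
	XVtc_Timing Timing;

	/* Placeholder 800x480 timing values: consult the panel datasheet */
	Timing.HActiveVideo  = 800;
	Timing.HFrontPorch   = 40;
	Timing.HSyncWidth    = 48;
	Timing.HBackPorch    = 88;
	Timing.HSyncPolarity = 0;
	Timing.VActiveVideo  = 480;
	Timing.V0FrontPorch  = 13;
	Timing.V0SyncWidth   = 3;
	Timing.V0BackPorch   = 32;
	/* Field-1 values only matter for interlaced video; mirror field 0 anyway */
	Timing.V1FrontPorch  = 13;
	Timing.V1SyncWidth   = 3;
	Timing.V1BackPorch   = 32;
	Timing.VSyncPolarity = 0;
	Timing.Interlaced    = 0;

	XVtc_SetGeneratorTiming(&Vtc, &Timing);
	XVtc_RegUpdateEnable(&Vtc);
	XVtc_EnableGenerator(&Vtc);
}
```

Again, this is hardware-configuration code that only runs on the target.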

Of course, the VDMA and the frame buffers must also be aligned with the VTC. The current design uses three frame buffers to store the output images. Each frame buffer is based on the u32 type and declared as a one-dimensional array containing the total number of pixels in the image.

The u32 type is ideal for the frame buffer because each pixel in the 7-inch touch display requires eight-bit Red, Green, and Blue values. Therefore, we need 24 bits per pixel. Each frame buffer has an associated pointer that we’ll use for frame-buffer access. We initialize these pointers just after the program starts.
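A minimal sketch of the frame-buffer declarations follows; the names are mine, not necessarily those in the original code, and on the Zynq the u32 type comes from xil_types.h rather than the stand-in typedef used here:

```c
#include <stdint.h>

typedef uint32_t u32;   /* stand-in for the Xilinx u32 from xil_types.h */

#define DISPLAY_WIDTH    800
#define DISPLAY_HEIGHT   480
#define FRAME_PIXELS     (DISPLAY_WIDTH * DISPLAY_HEIGHT)
#define NUMBER_OF_FRAMES 3

/* Three frame buffers; each u32 pixel holds 8-bit R, G and B in its low three bytes */
u32 FrameBuffer0[FRAME_PIXELS];
u32 FrameBuffer1[FRAME_PIXELS];
u32 FrameBuffer2[FRAME_PIXELS];

/* One access pointer per buffer, initialised just after the program starts */
u32 *Frames[NUMBER_OF_FRAMES];

void init_frame_pointers(void)
{
	Frames[0] = FrameBuffer0;
	Frames[1] = FrameBuffer1;
	Frames[2] = FrameBuffer2;
}
```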

We use the VDMA to display the contents of the frame buffers. The key VDMA configuration parameters are stored in a variable of type XAxiVdma_DmaSetup. Here we define the vertical and horizontal sizes, the stride, and the frame-store addresses. The VDMA is then configured with this data using the XAxiVdma_DmaConfig() and XAxiVdma_DmaSetBufferAddr() functions. One very important thing to remember here is that the horizontal size and stride are specified in bytes. So in this example, they are set to 800 * 4, as each u32 word consists of four bytes.
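Assuming the standard Xilinx standalone VDMA driver (xaxivdma.h) — the device-ID macro and the Frames pointer array are placeholders of my own — the read-channel setup looks roughly like this:

```c
#include "xaxivdma.h"
#include "xparameters.h"

extern u32 *Frames[3];   /* hypothetical frame-buffer pointers */

XAxiVdma Vdma;

int vdma_start_read_channel(void)
{
	XAxiVdma_Config *Cfg;
	XAxiVdma_DmaSetup ReadCfg;
	int i;

	/* XPAR_AXI_VDMA_0_DEVICE_ID is a placeholder macro name */
	Cfg = XAxiVdma_LookupConfig(XPAR_AXI_VDMA_0_DEVICE_ID);
	XAxiVdma_CfgInitialize(&Vdma, Cfg, Cfg->BaseAddress);

	ReadCfg.VertSizeInput       = 480;
	ReadCfg.HoriSizeInput       = 800 * 4;  /* in BYTES: four bytes per u32 pixel */
	ReadCfg.Stride              = 800 * 4;  /* also in bytes */
	ReadCfg.FrameDelay          = 0;
	ReadCfg.EnableCircularBuf   = 1;
	ReadCfg.EnableSync          = 0;
	ReadCfg.PointNum            = 0;
	ReadCfg.EnableFrameCounter  = 0;
	ReadCfg.FixedFrameStoreAddr = 0;

	for (i = 0; i < 3; i++)
		ReadCfg.FrameStoreStartAddr[i] = (UINTPTR)Frames[i];

	if (XAxiVdma_DmaConfig(&Vdma, XAXIVDMA_READ, &ReadCfg) != XST_SUCCESS)
		return XST_FAILURE;
	if (XAxiVdma_DmaSetBufferAddr(&Vdma, XAXIVDMA_READ,
				      ReadCfg.FrameStoreStartAddr) != XST_SUCCESS)
		return XST_FAILURE;

	return XAxiVdma_DmaStart(&Vdma, XAXIVDMA_READ);
}
```

As with the VTC code, this only runs on the target hardware.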

We’ll use code from the previous examples (p1 & p2) to interface with the FLIR Lepton IR camera. This code communicates with the camera over I2C and SPI interfaces. Once an image has been received from the camera, the code copies it into the current write frame buffer. However, to make use of most of the available display area, we apply a simple digital zoom to scale up the 80x60-pixel image from the Lepton 2 camera: replicating each pixel eight times both horizontally and vertically generates a 640x480-pixel image that we position within the 7-inch touch display’s 800x480 pixels. We set the remaining pixels to a constant color. As this is a touch display, the remaining space would be ideal for command buttons and other user-interface elements.
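The pixel-replication zoom can be sketched self-containedly; function and macro names here are mine, and the original code may differ. Each Lepton pixel is written to an 8x8 block of the frame buffer, with the 8-bit intensity placed in what this sketch assumes is the green channel (bits 15:8) of the pixel word:

```c
#include <stdint.h>

typedef uint32_t u32;   /* stand-in for the Xilinx u32 type */

#define CAM_WIDTH   80
#define CAM_HEIGHT  60
#define ZOOM        8            /* 80x60 -> 640x480 */
#define DISP_WIDTH  800
#define DISP_HEIGHT 480
#define FILL_COLOUR 0x00000000u  /* constant colour for the unused border */

/* Copy one Lepton frame into the display frame buffer with an 8x digital zoom */
void copy_with_zoom(const uint8_t cam[CAM_HEIGHT][CAM_WIDTH], u32 *fb)
{
	int x, y;

	/* Paint the whole frame first so the unused 160-pixel-wide strip
	 * on the right is a solid colour */
	for (x = 0; x < DISP_WIDTH * DISP_HEIGHT; x++)
		fb[x] = FILL_COLOUR;

	/* Replicate each camera pixel ZOOM times in both directions,
	 * mapping the 8-bit value into the green channel (bits 15:8) */
	for (y = 0; y < CAM_HEIGHT * ZOOM; y++)
		for (x = 0; x < CAM_WIDTH * ZOOM; x++)
			fb[y * DISP_WIDTH + x] = (u32)cam[y / ZOOM][x / ZOOM] << 8;
}
```

Integer division by ZOOM maps each display coordinate back to its source pixel, which is what makes every source pixel appear as an 8x8 block.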

Putting all this together results in the image below. The green coloring comes from mapping the 8-bit Lepton image data into the green channel of the display.

This combination of the FLIR Lepton camera and the Zynq-based MiniZed dev board results in a very compact and cost-efficient thermal-imaging solution. The next step in our journey is to get the MiniZed’s wireless communications working with PetaLinux so that we can transmit these images over the air.

I have uploaded the initial complete design to GitHub and it is available here.

If you want e-book or hardback versions of previous MicroZed Chronicles blogs, you can get them below.

First Year E-Book here

First Year Hardback here.