New Technologies

Discover the transformative power of VHDL programming

VHDL – or VHSIC Hardware Description Language (HDL) – has great potential for increasing efficiency and performance in engineering. Here we explore how VHDL programming accelerates development, from concept to creation.


What is VHDL?

VHDL (VHSIC Hardware Description Language) is a specialised programming language for describing the structure and behaviour of electronic circuits. It enables engineers to design, simulate, and synthesise complex hardware systems efficiently. Even the smallest and simplest pieces of logical behaviour, such as a D flip-flop (DFF), can be expressed in VHDL.

Diagram and VHDL code example for a D Flip-Flop, showing the entity declaration and architecture behavior with a process triggered by the clock signal. Above: the DFF captures the value of data input (d_i) at each rising edge of the clock (clk). The captured value of the input is then propagated to the output (q_o).
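The DFF described in the caption can be sketched in VHDL as follows. This is a minimal illustration; the port names (clk, d_i, q_o) are taken from the caption, and the rest is an assumed but typical formulation:

```vhdl
-- Minimal D flip-flop sketch: captures d_i on each rising clock edge
-- and propagates the captured value to q_o.
library ieee;
use ieee.std_logic_1164.all;

entity dff is
  port (
    clk : in  std_logic;  -- clock
    d_i : in  std_logic;  -- data input
    q_o : out std_logic   -- registered output
  );
end entity dff;

architecture behavior of dff is
begin
  process (clk)
  begin
    if rising_edge(clk) then
      q_o <= d_i;  -- sample the input at the rising edge
    end if;
  end process;
end architecture behavior;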

A hardware description language is a key tool in digital logic design, providing engineers with the means to express, simulate, and synthesise complex hardware systems more efficiently and effectively. Basic logic elements are connected together to create the intended functionality.

Why should engineers use VHDL programming?

Complexities in the design and delivery stage can make it difficult to fully realise your hardware innovation. Hardware often comes in above budget, and the elegant solution you were striving for might end up a lot larger and more cumbersome than you envisioned.

Off-the-shelf hardware enables engineers to abstract some of this complexity in the design by re-using known architecture and elements to perform the desired function. But these off-the-shelf solutions can create additional complexity and costs when they fall short of the required functionality or performance.

Thankfully, there is a way to turn this around and meet expectations: the field-programmable gate array (FPGA), a type of integrated circuit you can program to perform specific tasks. VHDL can be used to describe the digital systems inside FPGAs, enabling a huge amount of customisation, iteration, rapid prototyping, and performance.

Developing an FPGA design does, however, involve a steep learning curve, and can be complex and risky. As with any programming language, tool, or framework, you need experienced engineers with a deep understanding of the FPGA system and its capabilities. For example, translating a desired function into logical functions expressed in VHDL is difficult and may need multiple iterations in the development phase, adding additional checks and tests.

Here at Zühlke, we’re continually optimising our engineering processes, for example by using automation to run more intensive testing and streamline the development phase. We grow from experience and refine our processes based on continuous learning.

Self-driving cars require vision control, so several cameras must be processed and streamed together. The vehicle needs constant 360° vision as well as close-, medium-, and long-range views of the road to predict its path and avoid obstacles. A similar project we were working on required simultaneous vision and high-speed acquisition from twelve cameras using the MIPI protocol: the FPGAs had to decode the image data from MIPI and aggregate the different streams together. FPGA technology was invaluable in helping us stitch several video streams into a single stream for encoding back into MIPI and sending on to the processing unit, showcasing how FPGA architecture and VHDL programming can help to solve complexity and improve outcomes.

From discovery to delivery: putting the pieces of the FPGA development puzzle together

Left: visualisation of the discovery phase with individual, separate pieces. Middle: the pieces slowly come together. Right: the pieces form a cube, illustrating delivery.

A complex system hardly ever starts with a well-polished list of requirements. That’s why, at Zühlke, we thoroughly explore and validate concepts, providing a tangible platform to assess feasibility and performance. We want to capture, explore, and develop an idea that meets the client’s needs, with a full view of all possibilities and the task ahead.

This process not only fosters creativity and innovation, but also minimises the risks and uncertainties associated with implementing unproven ideas. We continuously iterate on ideas using systems thinking, also known as systems engineering. We explore and establish the system boundaries and the contexts in which actors, whether machines or humans, interact with the device. And we use case scenarios to gain a full and detailed picture of the intended uses, navigating potential misuses and risks relating to safety and cybersecurity.

Block definition diagrams (BDDs) show each component separately, illustrating how each is composed and how many devices are instantiated. An internal block diagram (IBD) maps the links between elements, showing the information flow, transformation, and processing.

These can be viewed as concept diagrams, but they actually help inform the architectural choices and design flow for FPGA development. Each element then starts its own lifecycle: specifying requirements and defining functionality, interactions, inputs, outputs, and, finally, behaviour. The entities are tested separately in a white-box environment and then assembled with other blocks to form new components. Building little by little, keeping the elements together and testing as we integrate, we build up the system behaviour.
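As a hypothetical sketch of this assembly step, two separately verified entities can be instantiated structurally to form a new component. All names here (pipeline_stage, filter, serialiser) are illustrative, not taken from a real project:

```vhdl
-- Structural composition: two entities, each already verified in its own
-- white-box testbench, assembled into a new component.
library ieee;
use ieee.std_logic_1164.all;

entity pipeline_stage is
  port (
    clk      : in  std_logic;
    data_in  : in  std_logic_vector(7 downto 0);
    data_out : out std_logic
  );
end entity pipeline_stage;

architecture structural of pipeline_stage is
  signal filtered : std_logic_vector(7 downto 0);
begin
  -- Hypothetical sub-blocks; only their ports matter at this level.
  u_filter : entity work.filter
    port map (clk => clk, d_i => data_in, d_o => filtered);

  u_serialiser : entity work.serialiser
    port map (clk => clk, d_i => filtered, q_o => data_out);
end architecture structural;
```

The new component is then itself tested as a black box, and the cycle repeats one level up.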

Left: a light bulb visualising an idea. Middle: paper and pen visualising capturing. Right: containers visualising system engineering.

Finally, we ensure a comprehensive cycle, leveraging the synergies of having all capabilities under one roof to deliver seamlessly integrated results. This approach ensures that every aspect of the project is expertly managed, from discovery to delivery.

To return to our MIPI camera project: this is a great example of the versatility that's possible with our approach to FPGA development and VHDL programming. In the discovery phase of the project, eight cameras needed to be aggregated. As the project evolved, we increased this to twelve cameras. Even though the hardware changed between iterations, the core aggregation was generic enough to accommodate any number of cameras, making the evaluation of concepts possible early in the project.
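VHDL generics are the usual mechanism for this kind of flexibility. The following entity declaration is an illustrative sketch, not the project's actual code; the names (camera_aggregator, N_CAMERAS, DATA_WIDTH) are assumptions:

```vhdl
-- A camera-count-agnostic aggregator: only the generic value changes
-- when the hardware grows from 8 to 12 cameras.
library ieee;
use ieee.std_logic_1164.all;

entity camera_aggregator is
  generic (
    N_CAMERAS  : positive := 8;   -- 8 in discovery; later instantiated with 12
    DATA_WIDTH : positive := 16
  );
  port (
    clk     : in  std_logic;
    -- one flattened data bus per camera
    cam_in  : in  std_logic_vector(N_CAMERAS * DATA_WIDTH - 1 downto 0);
    agg_out : out std_logic_vector(DATA_WIDTH - 1 downto 0)
  );
end entity camera_aggregator;
```

The internal architecture can then loop over `N_CAMERAS` with generate statements, so the same verified core scales with the instantiation rather than being rewritten.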

How HDL simulation accelerates development cycles and time to market

Hardware description language simulation is vital in the development and validation of digital hardware designs. Through simulation, engineers can thoroughly test and verify the functionality of their designs before committing them to costly system integration or even fabrication. HDL simulations allow designers to simulate the behaviour of complex digital systems under various conditions, helping identify and rectify potential issues early in the design process. This process not only reduces the risk of errors, but also accelerates the development cycle, enabling faster time to market.

Left: visualisation of RTL simulation with an open tab and load sign. Middle: visualisation of FPGA simulation. Right: visualisation of a hardware simulation.

At Zühlke, we develop around random-stimuli testbenches, which play a crucial role in verifying the robustness and reliability of digital designs. Random-stimuli testbenches generate a diverse range of input signals, mimicking real-world scenarios and stress-testing the design's responsiveness and resilience. In addition, we try to stay up to date with the latest testing frameworks, such as GHDL/VUnit and cocotb, which help supercharge the testing of the design. By using repetitive patterns and random generation, we can uncover corner cases and edge conditions that may not be apparent with traditional test methods.
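A bare-bones random-stimuli testbench for a DFF might look like the sketch below, using the standard `ieee.math_real.uniform` generator; frameworks such as VUnit or cocotb wrap this pattern with richer reporting and CI integration. The DUT name and ports are illustrative:

```vhdl
-- Random-stimuli testbench sketch: drive a DFF with 1000 random bits
-- and check that the output follows the input after each clock edge.
library ieee;
use ieee.std_logic_1164.all;
use ieee.math_real.uniform;

entity tb_dff is
end entity tb_dff;

architecture sim of tb_dff is
  signal clk : std_logic := '0';
  signal d_i, q_o : std_logic;
begin
  clk <= not clk after 5 ns;  -- free-running 100 MHz clock

  dut : entity work.dff
    port map (clk => clk, d_i => d_i, q_o => q_o);

  stimulus : process
    variable seed1, seed2 : positive := 42;
    variable rnd : real;
  begin
    for i in 0 to 999 loop
      uniform(seed1, seed2, rnd);  -- pseudo-random real in (0, 1)
      if rnd > 0.5 then
        d_i <= '1';
      else
        d_i <= '0';
      end if;
      wait until rising_edge(clk);
      wait for 1 ns;               -- let q_o settle past the edge
      assert q_o = d_i report "DFF output mismatch" severity error;
    end loop;
    report "random stimulus done";
    wait;                          -- end of test
  end process;
end architecture sim;
```

Fixed seeds keep failures reproducible; varying them across CI runs widens the explored input space over time.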

Component obsolescence often puts projects in jeopardy – for example, where a core element is no longer produced, or where the switch to a newer and faster component corrupts the behaviour of the system. By testing these new elements in RTL simulation, we can evaluate the response of the system before changing the components – investigating, for instance, whether a faster device disturbs our regulation loop timing or changes its parameters.

How simulation results can increase quality

In software development, we are used to working with many feature branches in a git repository. VHDL projects, by contrast, rely on the vendor's integrated development environment (IDE), where the source files are added and edited, and where the project is built into a bitstream used to program the FPGA. But if every new feature has its own set of project files (its own IDE instance), your workstation could soon be overwhelmed.

At Zühlke, we know a workstation can only handle so many project IDEs. That's why we create an environment that enables engineers to have any number of feature branches. This works because we containerise the environment and the IDE, giving us full control over the build process, just as in software development, and making it suitable for continuous integration/continuous deployment (CI/CD).


What’s more, incorporating HDL simulation into CI/CD pipelines enhances the efficiency and reliability of the hardware development process. By automating simulation runs within CI/CD workflows, engineers can ensure that each code change undergoes rigorous testing, with multiple test cases, including HDL simulation, before integration into the main codebase. The engineers can continue to modify and enhance the design while it is being thoroughly tested.

Finally, post-processing capabilities in HDL simulation enable engineers to analyse simulation results comprehensively. They can extract meaningful insights, identify patterns, and diagnose potential issues through waveform visualisation, statistical analysis, and other analytical techniques. This empowers engineers to iteratively refine their designs, ensuring they meet stringent performance requirements and quality standards before deployment.

Reproducibility and continuous improvement

With every project, we gather and refine techniques that increase efficiency and we continuously incorporate these insights into our custom-made tool: ‘ZAH’ (short for ‘Zühlke And HDL’). This tool has streamlined the process of setting up and running simulations, automating repetitive tasks, and providing intuitive interfaces for result analysis. By centralising key functionalities within a unified platform, our teams are able to focus more on design refinement and integration, ultimately accelerating the development cycle and improving overall productivity.

Advancing hardware innovation and engineering with FPGA and VHDL programming

In this article, we’ve explored the benefits of VHDL programming and how to streamline FPGA development to harness its benefits. This is based on our extensive expertise and experience in software and hardware innovation and engineering. Our philosophy is based on efficiency, precision, and continuous learning to ensure early and ongoing value – even if that means taking additional time and effort to understand what’s really under the hood.

With a proven track record, we combine strategy and hands-on experience supported by processes and methodologies that ensure high-quality and reliable hardware solutions.

Our teams have rich project experience spanning signal processing, data compression, interfaces, DSP algorithms, LabVIEW frameworks, vision systems, transceiver designs, MATLAB modelling, FPGA co-simulation, and VHDL design for tailored hardware systems.

Keen to explore what VHDL could do for your business? Drop us a message to start exploring your best opportunities. We look forward to supporting you!