Roundtable: Getting to grips with the challenges of designing pcbs featuring fpgas


Designing pcbs was at one time a sequential process. But the growth in popularity of fpgas has seen that process become more of a parallel operation.

Where fpgas once provided 'glue logic' and similar functionality, the devices are now central to the operation of many board level products. Because of this, fpgas are often complex, high pin count devices. This level of complexity brings design iterations and each iteration may require the board on which the fpga sits to be redesigned; even seemingly small changes to an fpga's pin out can have dramatic effects on the pcb's layout and even the number of layers.

In a reader survey, New Electronics discovered that 47% of respondents had recently designed a board level product in which an fpga provided the main functionality. Only 20% of those readers said it was the first time they had done so. Of those who had designed an fpga based board before, 58% said their latest design was more complex, as was the design process itself, because of the need to get the fpga and pcb right. More than 50% found fpga design hard and 40% struggled with board design.

Looking to address these issues, New Electronics held a roundtable to discuss fpga centric design.

Adam Clarkson, Northern Europe technical manager with Lattice, pointed out that fpgas cover a wide spectrum. "They're not just 28G beasts," he said. "There is integration and complexity, even at the low end."

Steve Clark, director of marketing and engineering for Arrow Europe, said: "High end mcus and fpgas require more complex voltage rails, with specific start up sequences." Clarkson added: "You can't have 100mV overshoot on a 0.9V supply, so there is the need for accuracy."

Rather than replacing mcus, fpgas are more likely to be used to integrate peripherals, said Clarkson. "It's bringing programmability to the rest of the board. It provides designers with a quick way of getting a board out because they have control over such factors as sequencing.

"High end fpgas will be more of a processor platform. At the low end, micropower fpgas will be used for such functions as offloading sensor management from processors."

Providing the pcb design perspective, Mentor Graphics' product manager Rakesh Jain said: "We see designs spanning all ranges; from big fpgas with lots of functionality to boards where fpgas act as glue logic. But a low volume design is unlikely to justify integrating an mcu and lots of I/O into a big fpga."

Jain sees a change in how pcb design is being undertaken, particularly when fpgas are involved. "Once, it was done by different people or teams who didn't talk to each other. They had conflicting objectives and often found out late in the design cycle that they had been working towards different goals."

So what can go wrong? "FPGA designers allocate pins and hand off to the pcb designers. They change the pin out and throw the changes back over the wall; there are too many iterations," Jain contended. "Both teams must work with each other or use tools which help them to communicate. If you can simplify the pcb/fpga flow, design can happen simultaneously."

Another part of the jigsaw is the signal chain. "Analogue is the largest technology segment in the UK and power management, which represents 55% of that, is the fastest growing sector. So a third of Arrow's UK FAE expertise is focused on this important growth area," said Clark.

Simon Bramble, an FAE with Linear Technology, noted: "Everyone wants smaller parts with more functionality. That means more transistors and lower voltages. In response, signal chain developers are being forced to create smaller dc/dc converters with higher switching frequencies in order to get more into the package."
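To see why higher switching frequencies help, the rough sketch below, which is not taken from the roundtable, applies the textbook buck converter ripple estimates; the rail voltages, inductance and capacitance used are illustrative assumptions only.

    # Illustrative sketch, not from the roundtable: textbook buck converter ripple
    # estimates showing why a higher switching frequency lets the same ripple target
    # be met with smaller passives. All values below are assumptions.
    def buck_ripple(v_in, v_out, f_sw, inductance, capacitance):
        """Return (peak-to-peak inductor ripple current in A, output voltage ripple in V)."""
        duty = v_out / v_in
        delta_i = v_out * (1 - duty) / (inductance * f_sw)   # inductor ripple current
        delta_v = delta_i / (8 * f_sw * capacitance)         # ripple across an ideal output capacitor
        return delta_i, delta_v

    # Same assumed 3.3V-to-1V rail, same 1uH / 47uF filter, two switching frequencies
    for f_sw in (500e3, 2e6):
        di, dv = buck_ripple(v_in=3.3, v_out=1.0, f_sw=f_sw, inductance=1e-6, capacitance=47e-6)
        print(f"f_sw = {f_sw / 1e6:.1f} MHz: delta_I = {di:.2f} A, delta_V = {dv * 1e3:.2f} mV")

With those assumed values, quadrupling the switching frequency cuts the inductor ripple current by four and the output voltage ripple by sixteen; in practice that headroom is traded for physically smaller inductors and capacitors, which is Bramble's point about getting more into the package.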
Lower voltages, Bramble contended, are 'a nightmare' when it comes to power management. "Designers could live with 100mV ripple in the past, but that could take you outside of tolerance, particularly with a 1V supply."

Lower voltages also mean higher currents. "And that brings thermal management problems," he continued. "One way around this is parallel phasing. If you need 20A, then use two 10A supplies in parallel; it divides the current by two and the losses by a factor of four." (A quick check of that arithmetic appears at the end of this article.)

In Clark's opinion, system architects and hardware designers often do not have the skills to design complex power management circuitry. "With fpgas, the power requirements often cannot be defined until the logic design is completed. This means flexible off the shelf solutions are imperative to meet challenging time to market objectives and to allow hardware engineers to focus on system functionality, while finalising power management late in the design cycle."

Avoiding noise and distortion is also important. "Noise is the main problem," Bramble contended. "Sampling noise and digital noise will corrupt analogue signals."

Another issue is the so called 'jaws of death': the input to an a/d converter. "It's a nasty environment," Bramble pointed out. "Noise on the reference will look the same as noise on the signal, so use a lower noise reference."

Other areas to consider include digital noise and power supply noise. "Lots of code transitions will map back and corrupt the data, so randomise the data. And don't ignore the power supply rejection ratio; it will help to remove noise at the input."

Finally, don't forget the need for good pcb layout. "PCBs featuring fpgas tend to have a lot of layers and this helps to keep signal layers separate from ground planes and so on," he said.

With the board, fpga, power supply and signal chain designed, all that is left is to test things. Richard Bloor, business development manager with Rohde & Schwarz, said one of the reasons for problems is complexity. "There are more devices in less space and more things that can go wrong. All of these can be difficult to solve."

A common problem, said Bloor, is signal integrity. "There's a lot of talk about this, but what does it mean? From a digital perspective, the signal should be clean, low noise and with no ringing; you get 1s when you should get 1s."

Other faults which need to be investigated include analogue deviations and timing errors. "A lot of these can be random and infrequent," Bloor said. "Engineers still like to use analogue scopes to investigate these effects because the only delay is the flyback time."

Bloor noted that digital problems are often easier to find using analogue representations. "If you can't see the true analogue signal, you won't see things like amplitude problems and ringing."

Other problems with pcb designs include crosstalk. "High density designs with multiple layers and high speed data mean the problems can be acute; and errors are there because of layout."

"Engineers need to see analogue signals," Bloor concluded, "but digital designers are only interested in protocols."
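As flagged earlier, here is a quick check of Bramble's parallel phasing arithmetic. The sketch below is not from the roundtable and the 5 milliohm path resistance is an assumed figure; it simply works through the I²R conduction loss for one 20A phase versus two paralleled 10A phases.

    # Hypothetical figures: conduction loss scales as I^2 x R, so halving the current
    # through each phase cuts that phase's loss by four. R_PHASE is an assumed value.
    R_PHASE = 0.005  # 5 milliohm conduction path per phase (assumption)

    def conduction_loss(current_a, resistance_ohm=R_PHASE):
        return current_a ** 2 * resistance_ohm

    single = conduction_loss(20.0)      # one converter carrying the full 20A
    per_phase = conduction_loss(10.0)   # each of two paralleled 10A converters
    print(f"Single 20A phase:  {single:.1f} W")         # 2.0 W
    print(f"Each 10A phase:    {per_phase:.1f} W")      # 0.5 W, a quarter of the above
    print(f"Total, two phases: {2 * per_phase:.1f} W")  # 1.0 W, spread across two devices

With these assumed numbers, the loss in each phase does indeed fall by a factor of four; the total dissipation halves and, just as usefully for thermal management, it is spread across two devices.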