
Archive for the 'Dev Tools' Category

Saleae Pro 8 Logic Analyzer Review

[Image: Saleae Logic Pro 8]

When Saleae’s first USB-based logic analyzer burst onto the electronics scene in 2008, it was praised for its ease of use and low cost. For 2014 the company revamped its product line, replacing all its existing models with four new products. Last month the nice folks at Saleae were kind enough to send me a new Logic Pro 8 for review, so I recently had a chance to test the hardware first-hand. In brief, it’s well-polished and good at what it does, though I wish it did more.

But first – what’s a logic analyzer? Much like an oscilloscope, an LA is a tool for examining electrical waveforms in a running circuit – the so-called “device under test” or DUT. But where an oscilloscope is used to view analog waveforms with a continuously varying voltage, a logic analyzer displays digital waveforms whose value is either 0 or 1. Most oscilloscopes have two channels, but typical LAs have at least eight channels, and sometimes 40 or more. An LA also includes powerful software for triggering, decoding, and dissecting the collected data. If you do any kind of digital electronics work, a logic analyzer is indispensable.

In the old days, an LA was a stand-alone tool, like my ancient HP 1631D. Modern LAs such as Saleae’s are more likely to be PC peripherals consisting only of the signal acquisition hardware, with all the display and analysis work handled by a software program on the PC.

 
Specs

The Logic Pro 8 is the second from the top in Saleae’s product lineup, which also includes the Logic 4, Logic 8, and Logic Pro 16. Priced at $399, it’s an 8-channel logic analyzer with a maximum sampling rate of 500 megasamples per second (MS/sec), though when using all 8 channels the maximum rate drops to 100 MS/sec. The Logic Pro 8 uses USB 3.0 to push all that sample data to the PC at high speed.

Saleae recommends a minimum of 4x oversampling when capturing digital signals, so 100 MS/sec is enough to reliably capture data from digital systems with signal speeds up to 25 MHz. With four or fewer channels in use, the full sampling rate of 500 MS/sec is possible, allowing capture of digital signals up to 125 MHz. It’s important to remember that these are signal speeds, not CPU core speeds. The Beaglebone Black may have a 1 GHz processor, but its GPIO signals will normally be changing state at a few tens of megahertz at most. I’ve personally never built a digital system with external signal speeds above 5 MHz. 100 MS/sec will be more than enough for most hobbyist purposes.

[Image: Saleae analog capture example]

Unique among the competition, all the new Saleae models except the Logic 4 feature input channels with dual digital/analog capability. Each channel can be configured as a digital input, an analog input, or both simultaneously. The analog sample rate is limited to 50 MS/sec with up to three channels, with a bandwidth of only 5 MHz, so it’s not going to replace a bench oscilloscope. But as a quick sanity check for what’s happening in the analog domain for low-speed signals, it’s a nice addition.

Unlike some other logic analyzers, which feature dedicated external clock and trigger inputs, the Logic Pro 8’s 8 channels are its only inputs. There’s no support for external clocking, which is a disappointment. None of the current Saleae LA models have external clock inputs, and they’re the only logic analyzers on the market I’m aware of that lack this feature. I hope to see an external clock and trigger added to Saleae’s future products.

The Pro 8 supports logic levels between 1.2 V and 5.5 V for digital signals, with user-selectable threshold voltages. In analog mode, the input voltage range is -10 V to +10 V. Analog signals are captured at 12-bit resolution.

 
Sample Streaming

All of Saleae’s logic analyzers are streaming samplers, an important detail that affects how they perform. A streaming sampler is essentially the opposite of the more familiar buffered sampler design. Buffered logic analyzers contain dedicated high-speed memory for storage of signal data. Typically the memory is enough to hold a few thousand sample points, and when it’s full, signal acquisition stops. The acquired data is then displayed and analyzed as a post process.

In contrast, a streaming sampler has little or no built-in memory. Sample data is streamed over USB in real-time to the connected PC, where it’s stored in RAM or on the hard disk. This enables huge signal captures containing millions of samples, much larger than what’s possible with a buffered sampler. But when using many channels and high sample rates, streaming can overwhelm the PC’s available USB bandwidth, resulting in failures. Faster PCs and a USB 3.0 connection both help. This design explains why streaming samplers generally don’t have more than 8 or 16 channels – there just isn’t enough USB bandwidth to stream more channels in real-time.
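
For a back-of-the-envelope feel for the numbers, here’s the arithmetic as a trivial C program. The figures are my own assumptions – one bit per digital sample, no compression – not Saleae’s published numbers.

#include <stdio.h>

int main(void)
{
    const double sample_rate = 100e6;   /* 100 MS/sec per channel */
    const int    channels    = 8;       /* all digital channels running */

    /* 1 bit per sample per channel, packed 8 samples per byte */
    double mb_per_sec = sample_rate * channels / 8.0 / 1e6;
    printf("Raw stream: %.0f MB/sec\n", mb_per_sec);   /* prints 100 */
    return 0;
}

At 100 MB/sec, the raw stream fits comfortably within USB 3.0’s real-world throughput, but it’s far beyond what USB 2.0 can sustain.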

 
Unboxing and Setup

[Image: Saleae unboxing]

The Logic Pro 8 seems impossibly small – just a two inch square aluminum puck. It comes packed with a USB cable and two flying lead wiring harnesses, containing eight signal wires and eight ground wires. The wires are terminated with a female 0.1 inch connector, which can be plugged directly onto standard male headers, or connected to one of the 16 included IC test hooks. The whole setup packs away into a cushioned nylon carrying case. It’s all quite nice, and the value of the included accessories is something to consider when comparing the Logic Pro 8 to its competition.

The Logic is so small and light that it can get lost on a desk, or be pulled off the desk by the weight of the cables attached to it. This is one instance where Saleae may have too much of a good thing, and a bit of extra size and weight might be welcome.

I was slightly confused by all those ground wires at first. According to Saleae, it’s only necessary to connect one of the ground wires for most applications, but for best results with analog sampling all the ground wires should be connected.

There’s no software CD included with the Logic Pro 8 – only a small quick-start card that advises readers to visit www.saleae.com for instructions and software. No matter; anything included in the box would likely be out of date by the time it was opened anyway.

The client software runs on Windows, Mac OS X, or Linux, and downloading was easy, with no registration or other annoying hoops to jump through. I was surprised by this pop-up, though:

[Image: Saleae beta software pop-up]

You must use a beta version of the client software if you have one of the current Logic models, including the Logic Pro 8. The release/stable version of the client software only supports the older, discontinued models. I’m probably reading too much into the word “beta”, and Google has conditioned us all to be comfortable with software in perpetual beta, but this strikes me as a little strange. Supplying unfinished beta software to customers who’ve paid up to $499 for Saleae’s latest and greatest hardware risks annoying those customers and spoiling goodwill. Fortunately, Saleae’s customers seem to be an understanding bunch.

Once downloaded, the client installs a Saleae USB driver as well as the actual client application. There aren’t any configuration choices to make, so the whole installation process is quick and painless. Within a few minutes of opening the box and downloading the software, everything is set up and ready to use.

 
Working with the Logic

For basic setups, capturing and analyzing data with the Logic Pro 8 is simple. Just connect the ground and signal wires to the device under test, hit the friendly green “start” button, and in a few moments you’ll have a screen full of waveform data. Digital and analog data are displayed on the same screen, with the same time scale. From here you can zoom in and out of the captured waveforms, or pan left and right to view different time periods. The zooming and panning is all very smooth and quick.

Viewing a group of assorted waveforms is all well and good, but the real strength of a logic analyzer comes from the “analyze” part of its name. The Saleae client software includes several built-in protocol analyzers that can extract high-level data from raw waveforms. For example, the async serial protocol analyzer can reconstruct a serial byte stream from raw RS-232 data on a channel. Tell the analyzer which channel to examine, along with the bit rate and parity settings, and it does the rest. The decoded serial data is displayed as an annotation overlay on the raw waveform, and also in a table view. The current version of the software contains over 20 protocol analyzers, including async serial, I2C, SPI, JTAG, MIDI, simple parallel, and many others.
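
To make the decoding step concrete, here’s a rough sketch of what an async serial analyzer has to do: find each start bit, then sample the center of each data bit. This is my own illustration of the general technique – not Saleae’s code – and it assumes 8-N-1 framing, an idle-high line, and a known oversampling ratio.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Decode 8-N-1 async serial from a buffer of 1-bit samples.
   samples_per_bit = sample rate / baud rate (assumed >= 4). */
void decode_serial(const uint8_t *samples, size_t count, size_t samples_per_bit)
{
    size_t i = 0;
    while (i + 1 + 10 * samples_per_bit < count)
    {
        /* A falling edge on an idle-high line marks a start bit. */
        if (!(samples[i] == 1 && samples[i + 1] == 0)) { i++; continue; }

        size_t start = i + 1;   /* first sample of the start bit */
        uint8_t byte = 0;

        /* Sample each data bit at its center, LSB first. */
        for (int bit = 0; bit < 8; bit++)
        {
            size_t pos = start + (bit + 1) * samples_per_bit + samples_per_bit / 2;
            byte |= (uint8_t)(samples[pos] << bit);
        }
        printf("decoded byte: 0x%02X\n", byte);

        /* Jump past the stop bit before hunting for the next start bit. */
        i = start + 10 * samples_per_bit;
    }
}

A real analyzer also has to handle parity, configurable word lengths, framing errors, and sample rates that aren’t an exact multiple of the bit rate, but the core idea is the same.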

[Image: Saleae SPI protocol analyzer example]

Digital and analog sample rates can be adjusted independently, and the total capture length is also selectable. Unused channels can be turned off. In general, the fewer channels that are used, the higher the sampling rate that the Logic Pro 8 can achieve.

For the basic use case of grabbing some signals and eyeballing what’s happening, the Logic Pro 8 is excellent. It’s amazing how smooth and easy the whole process is, especially compared to other logic analyzers with similar specs. Navigating through the waveforms is quick and intuitive, and helpful measurement cursors pop up wherever you place the mouse pointer. It’s really a pleasure to use.

One missing feature I’d really like to see is a state mode, or state table view. With a complex system, I often find it’s easier to view things as a list of consecutive states, with one state per line, rather than as a collection of individual waveforms plotted against a time axis. The Saleae software does have a list panel showing the protocol analyzer results, but there’s not much functionality to it. When I’ve worked with other logic analyzers in the past, I spent almost all my time in the state view, and almost never looked at the waveform time view. Saleae needs to expand the existing list panel into a full-fledged data view screen, with functionality similar to the waveform view. Here’s what state view looks like on my old HP 1631D:

[Image: HP 1631D state mode display]

One minor gripe is the lack of a continuous capture mode. When you press the start button, the client software will capture one buffer’s worth of data, display it, and then stop. Sometimes it’s helpful to use a logic analyzer to do continuous capturing like an oscilloscope, many times per second, constantly updating the results on screen. If you’re capturing the same event over and over, this makes it easy to see if any event is different from the others, or if there’s timing jitter. Unfortunately the Saleae software doesn’t support continuous capturing.

I did experience some software problems during my testing, and the client lived up to its “beta” label by crashing several times. Most of these crashes only required relaunching the client software and submitting a crash report to Saleae. But in one instance, a crash somehow left my USB mouse and keyboard unresponsive; even though the PC was still running, I was forced to do a hard reset.

 
Triggers

What about more complex signal capture scenarios, requiring a trigger? Pressing a button to begin an immediate signal capture is OK if you’re confident the event of interest will be somewhere in the captured data – either because it happens repeatedly, or you can force it to happen when desired. But many times it’s necessary to define a custom trigger to begin signal capture only when a particular event happens – say a specific value appears on a bus, or a rare error condition occurs.

I was disappointed to discover the Saleae software only supports fairly rudimentary triggering. Similar to an oscilloscope, it can trigger on a rising or falling edge on one channel, optionally requiring a specific high/low level on other channels. That’s insufficient for capturing complex or rare events. Most other logic analyzers I’ve seen support a large array of different triggering options, like triggering when a bus or serial value is or isn’t present, or is in a particular range, or a set of conditions happens N times consecutively, or logical and sequential combinations of multiple individual trigger clauses. For example, here are the trigger setup options for the Intronix LA1034 LogicPort, a Saleae competitor:

[Image: Intronix LogicPort trigger setup options]

The lack of triggering options may be a result of Saleae’s streaming sampler design. The acquisition hardware is essentially just a high speed data collection port, and it may lack the necessary smarts to check for trigger conditions in real time. From reading Saleae’s support forums, it appears that triggering is actually performed in software on the PC, rather than in hardware. I would have thought that would make it easy to support complex triggering options, but apparently the software isn’t able to compute complex triggers in real time either. In fact, even simple triggers have major problems on the new Saleae hardware. According to the support forums, if any analog channels are in use, the software can’t keep up with incoming sample data while also checking for a trigger condition. It falls further and further behind, and the client’s memory use balloons until it crashes. The Saleae engineering team is working on a fix, but for now their advice is to turn off analog channels when using triggers.

The absence of robust triggering options might not be too bad if it were possible to search the acquired data for a complex trigger-type event after the fact. Instead of triggering on an error condition, you could capture 10 million sample points, and then search for the error condition in the captured data. The Saleae client software does have a basic search capability, so for example you can look for all instances where the value 0xC8 appeared on the serial port. But there’s no capability to search for multi-byte sequences, or combinations or sequences of conditions, or any of the complex trigger conditions mentioned earlier. In a pinch, the captured data can be exported for searching in Excel or other external tools, but that’s not a great solution if it’s something you’ll need to do regularly.
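
To show what I mean, here’s how little code such a search takes once the data is out of the client. The one-“time,value”-pair-per-line CSV format and hex values are my assumptions about the export, so adjust the parsing to match the real file; the 0xC8 0x01 sequence is just a stand-in for a real query.

#include <stdio.h>

/* Scan an exported CSV of decoded serial bytes for a two-byte sequence. */
int main(void)
{
    FILE *f = fopen("decoded_serial.csv", "r");  /* hypothetical export file */
    if (!f) { perror("fopen"); return 1; }

    double t;
    unsigned value, prev = 0x100;  /* sentinel: no previous byte yet */
    while (fscanf(f, "%lf,%x", &t, &value) == 2)
    {
        if (prev == 0xC8 && value == 0x01)
            printf("sequence found at t = %f seconds\n", t);
        prev = value;
    }
    fclose(f);
    return 0;
}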

 
External Clocks

Earlier I mentioned that the Saleae LAs don’t have external clock inputs. Why should you care about external clocks? The first reason is speed. When using an external clock (the DUT’s own clock), 4x oversampling isn’t necessary, and it’s possible to capture digital signals all the way up to the LA’s max sample rate. A 100 MS/sec logic analyzer using an external clock can capture digital signals at speeds up to 100 MHz. Exactly one sample is taken per clock cycle, on the clock edge. Without an external clock, 4x oversampling is required, and 75% of the LA’s potential performance is effectively thrown away. In practice you might be able to get away with 2x oversampling and only pay a 50% penalty, but the cost is still too high.

The second reason that external clocks are important is correctness of the captured data. With external clocking, samples are taken exactly at the clock edge, and the values captured are those that were seen by synchronous devices at the clock edge. Without an external clock, samples will be taken some unpredictable amount of time before and after the clock edge, and the true value at the clock edge can’t be known.

[Image: clock sampling diagram]

Consider a system with two digital signals, A and B, and a separate clock signal. Imagine that the Logic Pro 8 is sampling these three signals four times per clock period, for 4x oversampling, at the intervals shown by the green vertical lines. A has a transition shortly before the second rising clock edge, and B has a transition shortly after the edge. But when sampled by the Logic and displayed in software, both A and B will appear to transition coincident with the clock edge. What were the actual values of A and B at the clock edge? We can’t tell.

To be fair, the speed penalty described here may not be an issue in most cases. The Logic is fast enough, and the digital signals are slow enough, that a 75% speed penalty isn’t fatal. Likewise the correctness problem may not be an issue in most cases either, as long as signals don’t change values too close to a clock edge, where “too close” means within one sample period. The average person using the Logic Pro 8 to debug a low-speed I2C communication stream won’t have any problems. But when pushing to higher speeds with tighter timing margins, the lack of an external clock is a real handicap.

 
Analog

To test the Logic Pro 8’s analog input capability, I used a microcontroller to generate a square wave at a few different frequencies, and then viewed it as both a digital and an analog signal. The Logic Pro 8 samples analog inputs at 50 MS/sec, with a 5 MHz analog bandwidth. When examining a 2.66 MHz digital square wave as an analog signal, there was a pronounced ringing and smoothing of the displayed analog waveform, and it didn’t look much like a square wave anymore. At 4 MHz, the analog waveform just looked like a sine wave.

[Image: 2.66 MHz digital signal viewed as analog]

 
[Image: 4.0 MHz digital signal viewed as analog]

I had initially assumed the analog inputs would be most useful for checking the signal integrity of digital signals – looking for overshoot, noise, or glitches that might cause problems. But given the very low bandwidth of the analog inputs, it’s just not possible to see useful analog domain details, even with low-speed digital signals. So scratch that idea.

What are the analog inputs good for, then? If you’re working on an audio-related project, the analog inputs are plenty fast enough to handle audio frequency analog signals. Or if you’re interfacing with an analog sensor, such as a light or force sensor or a capacitive touch sensor, the analog inputs will come in handy too. For the vast majority of projects, though, I suspect the analog inputs will go unused.

 
Plugins and Scripting

The Saleae client software offers two ways to extend its functionality: protocol analyzer plugins, and client scripting. Plugins are implemented as C++ shared libraries, and enable the client to be extended to support new protocols, or to add new options to existing protocols. Need decoding of NRZI serial data from a floppy disk? Add it yourself! This capability looks like it’s still a work in progress, though: the support page lists a number of incompatibilities between analyzer SDK versions and client versions, and custom analyzers currently aren’t supported by the 64-bit Windows client.

The scripting API enables users to programmatically configure the client software, and trigger captures. Just open a TCP connection to port 10429 on the client PC, and send text commands to control the running client. Because it’s a text-based protocol using a standard socket interface, the test script can be written in any language. This interface is great for using the logic analyzer as part of an automated test framework.
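
Here’s a minimal sketch of what that looks like in C on a Unix-like system. The command convention – NUL-terminated text commands like "capture", answered with ACK or NAK – is my recollection of Saleae’s Socket API documentation, so verify it against the current reference.

#include <arpa/inet.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(10429);                 /* Saleae scripting port */
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

    if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("connect");
        return 1;
    }

    /* Commands are NUL-terminated text; "capture" starts an acquisition. */
    const char *cmd = "capture";
    send(fd, cmd, strlen(cmd) + 1, 0);

    char reply[64] = {0};
    recv(fd, reply, sizeof(reply) - 1, 0);        /* expect "ACK" or "NAK" */
    printf("reply: %s\n", reply);

    close(fd);
    return 0;
}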

 
Saleae Logic Analyzers, Past and Present

The original 8 channel Saleae Logic was introduced in 2008, and was later followed by the Logic 16. These remained Saleae’s only LA products until 2014, when they were discontinued and replaced with four new models. The new models offer analog sampling and faster sample rates, but at higher prices than the models they replaced.

Model         Price  Availability  Channels  Analog  Sample Rate    Sample Rate
                                                     (3 channels)   (all channels)
Logic         $149   Discontinued  8         No      24 MS/s        24 MS/s
Logic 16      $299   Discontinued  16        No      100 MS/s       12.5 MS/s
Logic 4       $99    Available     4         1 ch    12 MS/s        12 MS/s
Logic 8       $199   Available     8         Yes     100 MS/s       25 MS/s
Logic Pro 8   $399   Available     8         Yes     500 MS/s       100 MS/s
Logic Pro 16  $499   Available     16        Yes     500 MS/s       100 MS/s

 
It’s not clear to me that the current Saleae models are a better value than the discontinued ones. If you don’t care much about analog capability, then the original 8-channel Logic at $149 was probably a better deal than the current Logic 8 at $199. For people who need more channels, the Logic Pro 16 is clearly more capable than the Logic 16 it replaced, but at a 66% higher price it’s only a good deal if you actually need those extra capabilities. For electronics hobbyists working with relatively slow parallel bus-based systems, the old Logic 16 was ideal. The Logic Pro 8 stands out somewhat awkwardly on the price/performance scale: at twice the price of the similar Logic 8, and 80% of the price of the Logic Pro 16, it’s hard to see why anyone would choose it.

If I were Saleae, I would bring back the original Logic and Logic 16 models, selling them alongside the new models. The original models are both good logic analyzers, representing different cost vs performance tradeoffs than those offered by the current models. And I would cut the price of the Logic Pro 8 to $299, to make it more competitive with the rest of the product lineup.

 
Competing Models

How do the Saleae logic analyzers stack up against low-cost LA solutions from other vendors? I haven’t used any of these logic analyzers directly, but I’ve experimented with the client software for each one and dug through their documentation, to get an idea of their capabilities.

USBee SX and ZX – $169 and $495
These are 24 MS/sec 8 channel streaming samplers like the original Saleae Logic. Unlike the Saleae units, the USBee LAs feature external clock and trigger inputs in addition to the 8 data channels. The USBee client software has similar features to the Saleae client, with equally weak triggering options. The software feels fairly clunky and awkward, although it does have a state view mode. As far as I can tell, the ZX is just the SX with more powerful software. My overall impression of both models is not great.

Intronix LA1034 LogicPort – $389
This is a 500 MS/sec 34 channel buffered sampler. It can also do 200 MS/sec state mode with an external clock. Unlike the Saleae LAs, the advertised 500 MS/sec isn’t for just a few channels, but is available when using all 34 channels. The sample buffer holds up to 2K samples, or more if using sample compression. That’s still puny compared to the millions of samples you get with the Saleae units, but the LogicPort makes up for it with powerful triggering options, so you can capture only the specific event of interest. The software isn’t as pretty as the Saleae client, but it’s quite usable and powerful, with a nice state view mode and many other advanced capabilities. The only thing I noticed missing is a search feature, although with only 2048 samples there’s not so much to search. I’d never heard of Intronix before, but the LA1034 is enthusiastically recommended in a couple of electronics forums.

Open Logic Sniffer – $50
The OLS is an open source hardware product, designed by the Gadget Factory and Dangerous Prototypes, and sold by Seeed Studio. It’s a buffered sampler, configurable as a 200 MS/sec 16 channel LA with 8K sample depth, or a 100 MS/sec 32 channel LA with 4K sample depth. The price is a bit misleading, since it’s just a bare circuit board sold without a case or test leads, but a complete setup can be put together for about $75. It supports external clock sources, and has fairly powerful triggering options, but lacks a state view mode. The popular “JaWi client” software is a little funky compared to the polish of Saleae’s software, but for an open source product it’s pretty good. The Achilles’ heel of the OLS is the documentation and setup. Documentation is a confusing tangle of wikis and docs and versions, scattered across three different web sites, with redundant or conflicting information, and frequent links to obsolete information. When you do find what you’re looking for, it’s hard to know if it’s authoritative or current. But for those willing to endure a product that’s rough around the edges, the OLS may be a worthwhile option.

DS Logic – $99
Begun as a Kickstarter project in 2014, the DS Logic is a 16-channel buffered sampler with a hefty 16M sample depth. It samples digital data at rates up to 400 MS/sec, or 100 MS/sec when using all 16 channels, and also includes external clock and trigger inputs. The client software is very similar to Saleae’s – they practically cloned the UI – but it adds a few extra features like advanced trigger options and continuous capture. A $299 deluxe version adds an analog oscilloscope function with 30 MHz bandwidth, and wireless data collection capability.

 
Conclusions

[Image: Saleae conclusions]

Saleae Logic Pro 8 – Likes

  • Hardware quality
  • Ease of setup and use
  • Huge capture sizes
  • Extensible client software

Saleae Logic Pro 8 – Dislikes

  • No external clock input
  • Low analog input bandwidth
  • No state view
  • Limited trigger/search options
  • No continuous capture mode

So does the Saleae Logic Pro 8 get BMOW’s recommendation? Not quite. It’s a nice piece of hardware with well-polished software, but it’s missing some key logic analyzer functions, and doesn’t offer enough extras compared to cheaper LA models to justify its $399 price tag. For basic electronics hobbyist use, I would instead recommend the Saleae Logic 8 at $199, or either of the two discontinued Saleae models if you can still find them. For more complex work requiring higher speeds, more channels, or non-trivial triggers, the LA1034 LogicPort, Open Logic Sniffer, and DS Logic all offer more functionality for the same price or less, although with less polished software. I’m looking forward to Saleae’s future hardware updates and especially their future software improvements, since three of my dislikes could be addressed entirely in software. We live in interesting times, and it’s exciting to see what the tool vendors will dream up next.


Eagle vs. KiCad Revisited

[Image: KiCad vs. Eagle revisited]

Four and a half years ago, I wrote a mini-review of Eagle vs. KiCad, two of the most popular software tools for hobbyists creating custom circuit boards. I concluded that while KiCad had lots of promise, it was too full of quirks and bugs to recommend, and Eagle was the better choice for most people.

This week I had an opportunity to try KiCad again. Although nothing had fundamentally changed, I found my overall impression of the program much more favorable. KiCad still has lots of annoying issues, but frankly so does Eagle. And with 4 1/2 years more design experience, I can now see how some of what I originally took for flaws in KiCad were actually just different design decisions, whose value I’ve come to appreciate.

 
The Software

If you’re not familiar with either of these tools, Eagle is a commercial program created by the German company CadSoft, and according to Wikipedia it’s been in continuous development since 1988. There are several versions of Eagle, but the majority of hobbyists use Eagle Light. This version is free (as in beer), but is limited to a maximum board area of 100 x 80 mm, and may only be used for non-commercial purposes. There’s also a separate non-profit license available, and an inexpensive commercial light license. Eagle is presently the “standard” for open source hardware and hobby projects found on the web, although this is changing.

KiCad is a free program (as in freedom, and beer), developed by a team of volunteers, and more recently with help from CERN. Wikipedia says KiCad has been around since 1992, although I only first heard of it a few years ago. While Eagle is a single monolithic program, KiCad is a loose collection of several different cooperating programs for schematic editing, board layout, and other tasks.

This is not a debate over free software. The nice folks at CadSoft are great for offering the very capable Eagle Light for free, and I have nothing against people charging money for software, since that’s how I’ve made my living for most of my life. :-) My interest is solely in which tool is better for the job.

I tested KiCad build 2014-10-27 and Eagle 6.3.0, both running on Windows 7. This is an older version of Eagle, which isn’t quite fair, but the test was more about how KiCad has changed since I last looked at it.

So which tool is better for hobbyists? I’m going to score them roughly even, but each has its strengths. Most open source hardware projects come with Eagle design files, so if you’re extending an existing project the choice may already be made for you. Eagle is also more scriptable, and may have an easier transition path to professional-level EDA software. But KiCad’s user interface is more intuitive, its board layout tool has some nice extra features, and it can create boards of any size. If forced to pick a favorite, I would say KiCad just edges ahead for the win.

 
KiCad Gripes

In my original review, my biggest complaint with KiCad was the way footprint selection was divorced from component selection. In Eagle you can place a 555 timer in your schematic, switch to the board view, and place it. In KiCad, you can place a 555 timer in your schematic, then you need to run another tool to select which footprint to associate with it. Only then can you place the chip on the board and route its connections. To a beginner this is a turn-off, feeling confusing and cumbersome. But once you’ve been around the block a few times, you’ll appreciate the flexibility this approach offers. This is one feature that grew on me after a while.

The earlier review complained about graphical “droppies” on the screen, and unfortunately this hasn’t gotten better. In the layout tool especially, virtually every time you do anything, you’ll be left with half-redrawn garbage on the screen, broken lines, or other visual artifacts. As annoying as this problem is, I quickly developed the habit of refreshing the view or changing the zoom level after every operation, restoring the screen to normal.

Other problems from my original review seemed at least partially fixed: flawed footprints (didn’t notice any this time), random pieces of text in French or German (still happens), rat’s nest wire drawing problems (seems OK now).

KiCad doesn’t automatically add a board outline. This confused me before, and it confused me again this time. It’s just a minor gripe, though. By making me draw my own board outline, it may even force me to think harder about exactly what shape the board should be, or defer the board outline decisions until the main components have already been placed.

Figuring out how to create new libraries of schematic symbols and footprints was pretty challenging the first time, and I had to search out web tutorials. The process is just as bad with Eagle, though. Once I had my new library created in KiCad, the process of copying, modifying, and creating new symbols and footprints felt substantially easier than with Eagle.

Modifying tracks that have already been placed still seems cumbersome with KiCad. Once a board gets crowded, you often need to introduce extra little bends and angles in already-placed tracks, in order to make room for new tracks. KiCad can do this, but it seemed like 90% of the time it complained about “two collinear segments”, or just moved sections of track in a way other than what I wanted. I found I had to resort to deleting the track and routing it over again more often than I’d like.

One spot where KiCad still lags is the integration between its various sub-tools. With Eagle, changes made in the schematic are automatically reflected in the board layout, and vice versa. With KiCad you have to export a netlist file from the schematic editor, and import it to the board layout editor, every time you make a change.

 
KiCad Likes

The more I used it, the more I appreciated KiCad’s “Google Maps” board layout view, where high zoom levels show each track and pin labeled with a signal name like a street map. Very handy.

In the last review I complained about problems with the design rule checker, the tool that verifies clearance between tracks and neighboring tracks, pads, and vias. This time I had no trouble with the design rules, because they’re automatically enforced as you route the tracks. If a particular track placement would violate a design rule, it just won’t let you put the track there. If routing worked this way before, I don’t remember it. It’s a very nice feature.

I also griped about the autorouter last time. Since then, KiCad has added integration with FreeRouting, an external Java-based autorouter. There seems to be some legal dispute surrounding FreeRouting, and the web-based version of the tool that KiCad links to no longer exists. However, I was able to download a precompiled Windows executable of FreeRouting, which worked fine with the file I exported from KiCad. It successfully routed what I thought was a difficult section of board in only a few seconds, and the result was easy to import back into KiCad. The routed board did have a few crazy tracks that spanned half the board, but if I’d been doing that part myself manually, I probably would have given up before I ever finished it.

 
Final Cut

The reality is that KiCad, Eagle, or any other circuit layout tool has a fairly steep learning curve, and you’ll have to invest many hours of time learning to use it effectively. With the current version of KiCad, I believe it’s worth that investment of time. Those who are already happy with Eagle will probably find little compelling reason to switch, but for new hobbyist engineers, KiCad certainly deserves a close look.


Rigol DS1074Z Oscilloscope Review

Here’s my long-overdue review of the Rigol DS1074Z four-channel oscilloscope, which I purchased a couple of months ago. At around $550, the DS1074Z occupies a unique place in today’s oscilloscope market between the $300 entry-level scopes, and the $850+ higher end scopes like Rigol’s new DS2000 series. It’s also one of the only scopes in this range to offer four channels.

I felt a little unprepared to do a review, since during the time that I’ve had the scope, I’ve really only scratched the surface of its features. If there’s something not covered in the video or something else you’d like me to test, leave me a note in the comments. A quick summary of what’s covered in the video:

Likes – display, menus, measurement features, memory size, SPI decoding
Dislikes – fan noise

If you’re shopping for a new oscilloscope and can stretch your budget beyond the entry-level choices, the DS1074Z is definitely worth a look.


Cortex M3 For Dummies – STM32 Discovery Board

The ARM Cortex M3 has generated lots of buzz lately. Maybe you’ve been working with Arduinos, AVRs, or PICs for a while, and heard about the Cortex M3, but weren’t sure what it was all about or whether it was even relevant to you. This review will try to shed some light on the Cortex M3’s capabilities and development tools, using the STM32VLDiscovery board from ST Microelectronics.

The nice folks at Newark sent me an STM32VLDiscovery Cortex M3 evaluation board for review. This little board packs a big punch for a remarkably low price. It can be found for under $10 if you hunt around for a deal, which is an amazing value. ST is also currently running a promotion in which residents of the USA and Canada can get a free STM32F4Discovery board, which is similar to the board reviewed here.

Although the STM32 Discovery board isn’t marketed as an Arduino competitor, it could be one. Its size, layout, and functionality make it a reasonable replacement for many applications needing a small microcontroller board with lots of I/Os for experiments and mad scientist projects. To help put its specs into context, I’ve selected a few other boards that readers may be familiar with for comparison purposes. All the boards contain a microcontroller along with one or two buttons and LEDs, with I/Os connected to hobbyist-friendly 0.1 inch headers, and can be programmed with a plain USB cable. In addition to the STM32 Discovery, they are the Arduino Uno, Arduino Mega 2560, and Copper AVR32.

                 STM32VL Discovery    Arduino Uno     Arduino Mega 2560   Copper AVR32
Price            $10                  $25             $50                 $38
Processor        STM32F100 Cortex-M3  ATmega328P AVR  ATmega2560 AVR      AT32UC3B1256 AVR
Type             32 bit               8 bit           8 bit               32 bit
Flash (KB)       128                  32              256                 256
EEPROM (KB)      0                    1               4                   0
RAM (KB)         8                    2               8                   32
Max Speed (MHz)  24                   20              16                  60
Voltage (V)      2.0 – 3.6            1.8 – 5.5       1.8 – 5.5           3.0 – 3.6
User I/O Pins    51                   20              70                  28
SPI channels     2                    2               5                   3
I2C channels     2                    1               1                   1
UART channels    3                    1               4                   2
ADC channels     16                   8               16                  6
DAC channels     2                    0               0                   0
USB              no                   no              no                  yes

The table makes it clear that you’re getting a lot of microcontroller for your money. ST is very likely selling these boards at a loss, because their goals differ from those of the other board makers. ST isn’t trying to sell you a prototyping product; rather, they’re trying to get you familiar with their line of STM32 microcontrollers so you’ll go on to incorporate them into a product of your own design. To appreciate this, it’s necessary to understand the ARM Model that gave rise to the Cortex M3.

 

The ARM Model

The Cortex M3 and other ARM processors were designed by ARM Holdings, a British semiconductor company. ARM doesn’t actually manufacture the processors they design, but instead they license the designs to other semiconductor companies, who then turn them into specific chips and sell them under their own brand names. Thus ST’s STM32 line of microcontrollers, Texas Instruments’ Stellaris line, NXP’s LPC1000 line, Atmel’s AT91SAMxx line, and many others are all Cortex M3 microcontrollers. All use the same instruction set, and have very similar features, so you could find a chip in any of those lines that’s a near functional equivalent of the STM32F100RB chip on the Discovery board. The chips aren’t exact clones, however. They differ in the amount of on-chip memory, clock speeds, pin configuration, peripheral units, and other features.

Unfortunately this is the Cortex M3’s biggest obstacle to gaining more traction in the electronics hobbyist community, because there isn’t really a “Cortex M3 community”. There’s a Stellaris community, and an LPCxxxx community, and an STM32 community, and so on. Each one is just different enough from the others to make manufacturer-independent tools and code sharing difficult. This fragmentation also makes it harder for the community to reach the critical mass necessary to catch significant public interest in the way AVR and PIC have.

 

ARM Cortex M3

So what exactly is a Cortex M3? It’s a microcontroller, like an AVR or PIC. That means it has built-in Flash memory and RAM, lots of general-purpose I/O pins that can be controlled individually through software, and built-in peripherals for things like serial communication or analog-to-digital conversion. Where the Cortex M3 differs from 8-bit AVRs and PICs is that it’s a 32-bit processor, capable of running at speeds up to 100 MHz in some versions, and is a cousin of the higher-end ARM processors found in devices like Apple’s iPad and in some PCs. It also offers larger memories than typically found on AVR or PIC microcontrollers. In short, it’s like a beefed-up version of the microcontroller you’re using now.

A few interesting features of the Cortex M3 aren’t found on typical 8-bit microcontrollers, like hardware divide support, an internal PLL for clock synthesis, and two digital to analog converters. The M3 also has a clever remappable pin feature, which lets you choose from among several options for which pins to use for I2C, SPI, USART, ADC, and other hardware units. This provides extra flexibility in board design and port usage.
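
On ST’s parts, for example, remapping is exposed through the AFIO peripheral. Here’s a minimal sketch using ST’s standard peripheral library (my own example, not from the board’s demos) that moves USART1 from its default PA9/PA10 pins to the alternate PB6/PB7:

#include "stm32f10x.h"

void remap_usart1(void)
{
    /* The alternate-function remap logic lives in AFIO, so enable its clock */
    RCC_APB2PeriphClockCmd(RCC_APB2Periph_AFIO, ENABLE);

    /* Route USART1 TX/RX to PB6/PB7 instead of PA9/PA10 */
    GPIO_PinRemapConfig(GPIO_Remap_USART1, ENABLE);
}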

The Cortex M3 has a newer sibling called the Cortex M0. The M0 is geared towards lower cost, lower power applications than the M3, but the two are very similar. Most of what you read about the M3 on the web also applies to the M0.

While it has arguably generated the most buzz lately, the Cortex M3 is by no means the only entry in the 32-bit high performance microcontroller market attempting to supplant 8-bit mcus like the AVR ATmega. Another example is the AVR32 chip, found on the Copper board. This 32-bit microcontroller is also made by Atmel, although it shares little with Atmel’s 8-bit microcontrollers beyond the AVR name.

 

The STM32VLDiscovery Board

ST’s name for their Cortex M3 product line is STM32, and the STM32VLDiscovery board is the smallest and least expensive of their Cortex M3 evaluation boards. Now who is this woman that appears on all of ST’s marketing materials for the STM32 line? Her photoshopped eyes look a bit disturbing. Some of their STM32 materials also use a rainbow-colored butterfly logo, which I like much better.

The STM32VLDiscovery board is actually two boards in one: everything to the left of the vertical line in the above diagram is an ST-Link programming/debug module. It can be used with the Cortex M3 module on the right side of the board, or to program and debug an STM32 mcu on another board in a stand-alone application that lacks a programmer. I believe an earlier version of the ST Discovery board actually had perforations in the PCB so you could break off the ST-Link module and use the two halves separately, but with the STM32VLDiscovery there’s only a line in the silkscreen to remind you there are two functionally separate modules on a single board.

At roughly 3.3 x 1.7 inches, the board is smaller than a standard-size Arduino. Two rows of 0.1 inch header down the sides of the board make it easy to connect other components. The headers are on both the top and the bottom of the board, so you can make connections from either side, which is a nice touch. You could almost drop the ST Discovery right into a breadboard, except for the 6-pin header along its right edge. I’m not sure what ST was thinking here, because when inserted into a breadboard, these six pins will all be tied together in a single row. Depending on the layout of your breadboard, you may be able to insert the ST Discovery such that those six pins hang off the edge, unconnected.

Besides the Cortex M3 itself, the board also contains two user-controllable LEDs, a user push button, and a reset button. That’s similar to what you’ll find on an Arduino board, and is just enough to test things out and make sure it’s all working before connecting more components.

The board is powered over USB, or optionally from an external supply. When first connected to your PC, it immediately runs a demo program that flashes the LEDs in different patterns when you press the user push button. Under Windows it also automatically mounts itself as a read-only USB storage drive, which contains a few files with links to the online documentation. That’s pretty slick.

There’s no printed documentation included with the ST Discovery board, but the online user manual is quite well done. It includes a quick-start guide, block diagram, layout diagram, mechanical drawing, full schematics, explanation of all the jumpers and solder bridges, and a pin description table. The companion software package includes the libraries and header files for ST’s version of the Cortex M3 and for the ST Discovery board peripherals, as well as several example projects. On ST’s web site you’ll also find detailed tutorials for building and running the example projects using the three officially supported development toolchains.

 

ARM Development Tools

There are a bewildering number of development toolchain choices for ARM Cortex M3 development. It’s hard to overstate just how painful this makes the getting started process for a beginner. Worse still, the only officially-supported toolchains (IAR, Keil, and Atollic) are professional tools which are very expensive, and certainly won’t be of interest to any hobby developers. When their web page has a “get a quote” link instead of listing an actual price, that’s the clue to look elsewhere. For reference, the Keil tools cost $4895 for a single license, and the others are similar. Ouch!

The professional tools do all offer a time-limited trial version or a code size-limited version, but few hobbyists will be happy with those as a permanent solution.

If you’re willing to cough up a little money (but not $4895) for a well-made development tool with good support, Rowley Associates Crossworks is well-regarded and is just $150 for a personal license.

After looking at more than a dozen different development tools, I decided to put together my own toolchain based on the CodeSourcery CodeBench g++ Lite GNU command line tools and Eclipse C/C++ IDE. This excellent setup guide for Eclipse and CodeSourcery with STM32 describes the process in detail, so I won’t list all the setup steps here. Depending on your familiarity with other development environments and your tolerance for this sort of job, you may find the process anywhere from slightly tedious to completely impossible. It involves installing Eclipse, the Java runtime, CodeSourcery, GDB server, the STM32 SDK, and the ST-Link programming utility from six different sources, and then configuring them all to work together properly.  It really makes you appreciate the convenience of a tool like Atmel’s AVR Studio, which performs all of the same functions in a single tool with a single download and install process. All together, it took me about 90 minutes to get the STM32 Cortex M3 development tools configured and program one of the example projects onto the ST Discovery board.

 

Cortex M3 Software Development

OK, let’s blink some LEDs. Here’s the simplest of the examples provided by ST:

#include "stm32f10x.h"
#include "STM32vldiscovery.h"
GPIO_InitTypeDef GPIO_InitStructure;
void Delay(__IO uint32_t nCount);
int main(void)
{      
    /* Configure all unused GPIO port pins in Analog Input mode (floating input
      trigger OFF), this will reduce the power consumption and increase the device
      immunity against EMI/EMC *************************************************/
    RCC_APB2PeriphClockCmd(RCC_APB2Periph_GPIOA | RCC_APB2Periph_GPIOB |
    RCC_APB2Periph_GPIOC | RCC_APB2Periph_GPIOD |
    RCC_APB2Periph_GPIOE, ENABLE);
    GPIO_InitStructure.GPIO_Pin = GPIO_Pin_All;
    GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AIN;
    GPIO_Init(GPIOA, &GPIO_InitStructure);
    GPIO_Init(GPIOB, &GPIO_InitStructure);
    GPIO_Init(GPIOC, &GPIO_InitStructure);
    GPIO_Init(GPIOD, &GPIO_InitStructure);
    GPIO_Init(GPIOE, &GPIO_InitStructure);
    RCC_APB2PeriphClockCmd(RCC_APB2Periph_GPIOA | RCC_APB2Periph_GPIOB |
    RCC_APB2Periph_GPIOC | RCC_APB2Periph_GPIOD |
    RCC_APB2Periph_GPIOE, DISABLE);
    /* Initialize Leds LD3 and LD4 mounted on STM32VLDISCOVERY board */
    STM32vldiscovery_LEDInit(LED3);
    STM32vldiscovery_LEDInit(LED4);
    while (1)
    {
        /* Turn on LD3, turn off LD4 */
        STM32vldiscovery_LEDOn(LED3);
        STM32vldiscovery_LEDOff(LED4);
        /* Insert delay */
        Delay(0xAFFFFF);
        /* Turn off LD3, turn on LD4 */
        STM32vldiscovery_LEDOff(LED3);
        STM32vldiscovery_LEDOn(LED4);
        /* Insert delay */
        Delay(0xAFFFFF);
    }
}
void Delay(__IO uint32_t nCount)
{
    for(; nCount != 0; nCount--);
}

If you’ve previously developed software for other microcontrollers, then this probably looks fairly understandable. The first step is to configure all of the I/O pins as inputs, akin to setting the DDR (data direction) register on an AVR or calling pinMode() with the Arduino environment. In this case, the I/Os are configured by passing a struct to a function instead of directly twiddling some bits in a register, although it’s likely that the implementation of GPIO_Init() does something like that under the hood. That RCC_APB2PeriphClockCmd() function looks a little strange, though – it appears the port’s clocks need to be explicitly enabled before configuration, then disabled afterwards.

The rest is just manipulation of the LEDs, but it’s a bit more abstract than most people are probably accustomed to seeing. You don’t normally need to initialize an LED or call a function to turn it on. These functions are provided by the ST Discovery board library to simplify development, but curious minds will want to know what they actually do. Here’s the implementation of STM32vldiscovery_LEDInit() and definitions of LED3 and LED4:

typedef enum
{
 LED3 = 0,
 LED4 = 1
} Led_TypeDef;
#define LEDn                             2
#define LED3_GPIO_CLK                    RCC_APB2Periph_GPIOC
#define LED4_GPIO_CLK                    RCC_APB2Periph_GPIOC
/* GPIO_PIN[] and GPIO_PORT[] (not shown) are similar per-LED lookup tables */
const uint32_t GPIO_CLK[LEDn] = {LED3_GPIO_CLK, LED4_GPIO_CLK};
void STM32vldiscovery_LEDInit(Led_TypeDef Led)
{
    GPIO_InitTypeDef  GPIO_InitStructure;

    /* Enable the GPIO_LED Clock */
    RCC_APB2PeriphClockCmd(GPIO_CLK[Led], ENABLE);
    /* Configure the GPIO_LED pin */
    GPIO_InitStructure.GPIO_Pin = GPIO_PIN[Led];

    GPIO_InitStructure.GPIO_Mode = GPIO_Mode_Out_PP;
    GPIO_InitStructure.GPIO_Speed = GPIO_Speed_50MHz;
    GPIO_Init(GPIO_PORT[Led], &GPIO_InitStructure);
}

So STM32vldiscovery_LEDInit() enables the clock for port C, where the LEDs are connected, and then configures the correct pin in port C as an output. GPIO_Mode_Out_PP specifies a push-pull output, in other words a normal output that’s actively driven high and low. GPIO_Speed_50MHz controls the rise and fall times of the output signal. Selecting a faster speed will produce shorter rise and fall times, but will consume more current and generate more supply voltage swings and electrical noise.

Lastly, let’s take a look at STM32vldiscovery_LEDOn() and STM32vldiscovery_LEDOff():

void STM32vldiscovery_LEDOn(Led_TypeDef Led)
{
    GPIO_PORT[Led]->BSRR = GPIO_PIN[Led];  
}
void STM32vldiscovery_LEDOff(Led_TypeDef Led)
{
    GPIO_PORT[Led]->BRR = GPIO_PIN[Led];  
}

Each port has a separate bit set (BSRR) and reset (BRR) register. Writing a 1 to the corresponding bit position for a pin sets the pin’s value to 1 or 0, depending on which register is written. Alternatively, all 16 pins of a port can be set at once by writing a 16-bit value to the port’s ODR register.
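
To make that concrete, here’s the register-level equivalent of those helpers. The pin assignments (LD3 on PC9, LD4 on PC8) are from my reading of the board schematic, so double-check them against your board revision.

#include "stm32f10x.h"

void led_demo_raw(void)
{
    GPIOC->BSRR = GPIO_Pin_9;               /* write 1 to bit 9: PC9 high, LD3 on */
    GPIOC->BRR  = GPIO_Pin_9;               /* write 1 to bit 9: PC9 low, LD3 off */
    GPIOC->ODR  = GPIO_Pin_8 | GPIO_Pin_9;  /* writing ODR sets all 16 port C pins
                                               at once: PC8 and PC9 high, rest low */
}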

Note that all these GPIO_ APIs are specific to ST’s Cortex M3 library. If you’re using some other flavor of Cortex M3, then your method of controlling I/Os will be different.

Building the code and programming it to the ST Discovery board’s Cortex M3 is straightforward. Within Eclipse, select Project -> Build Project from the menu, and if all goes well you’ll end up with a .bin file. The LED blink example generates a 7K .bin file in the Debug configuration, and 5K in release. That’s a pretty big binary for the few lines of code in main.c, but all those GPIO_ functions and other support libraries bloat the code. Due to the larger size of Cortex M3 binaries, the larger Flash memory of the M3 as compared to an Arduino isn’t as significant an advantage as it first appears.

To program the board, launch the ST-Link utility program, and open the .bin file. Programming doesn’t automatically reset the board or launch the new program, though. After programming is complete, you can open the MCU Core dialog and press the System Reset button. From here you can also do other neat stuff like view all the mcu register contents, or single-step the clock.

 

Debugging

The built-in ST-Link on the ST Discovery board also supports live debugging of the program running on the Cortex M3. If you’re coming from the Arduino, or vanilla AVR development with an AVRISP mkII programmer that lacks debugging capability, this is a quantum leap forward. The people who wrote the Eclipse + Code Sourcery + STM32 setup guide also wrote an excellent hardware debugging setup guide, so I won’t repeat the steps here. Unfortunately, it involves downloading the entire Atollic Lite toolchain (250MB) just to get the ST-Link compatible gdbserver it contains. This also requires registering with Atollic, and getting a (free) activation key. It’s a hassle, but you only need to do it once.

Debugging works just how you’d expect. You can step through running code, set breakpoints, examine variable values, and view the call stack. Stepping through code seems unexpectedly slow, taking 1-2 seconds to step over a single line. The gdbserver also crashed a few times while I was debugging, and the only clue was that attempts to start a new debugging session failed with a generic error. I also found that stepping over a line that branches to itself, like

for(; nCount != 0; nCount--);

put the debugger into a state where the program was suspended, but the debugger thought it was running, forcing me to press the “pause” button to regain control. The overall debugging experience was a bit rough around the edges, but was much better than nothing. A different toolchain might have provided a smoother debugging experience, and I don’t fault the Cortex M3 or the ST Discovery board hardware for the debugging problems I encountered.

 

Beyond the Examples

Running through the examples is a good way to get familiar with the Cortex M3, but to really learn what it’s like to develop for, there’s nothing better than creating your own custom project. Using the LED blink program as a template, I was able to port some code for controlling a Nokia 5110 LCD display, and print the message shown in the photo. The LEDs, transistor, and resistor in the upper right are from an unrelated circuit on my breadboard, and the only connections between the Discovery board and the LCD are the seven wires at the bottom.

My example program shows the code used to print the message on the LCD. There were no real surprises during development, and it only took me about 30 minutes to get the LCD working with the ST Discovery board. The only oddity I encountered was that there doesn’t appear to be a calibrated wait/delay function anywhere in the STM32 libraries. I wrote my own delay_ms() function that’s roughly accurate, assuming a 24 MHz clock and code compiled in the Debug configuration. It’s an imperfect solution, so if there’s not already a suitable delay function somewhere in another ARM library, it will probably be necessary to use a timer to get delays independent of clock speed and compiler options.
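
For reference, here’s what the timer-based approach might look like, using the Cortex M3’s built-in SysTick timer and the CMSIS SysTick_Config() helper. This is a sketch, not the delay_ms() I actually wrote, and it assumes the library startup code has set SystemCoreClock correctly (24 MHz on this board).

#include "stm32f10x.h"

static volatile uint32_t g_ms_ticks;

void SysTick_Handler(void)   /* fires once per millisecond */
{
    g_ms_ticks++;
}

void delay_init(void)
{
    /* Configure SysTick to interrupt 1000 times per second */
    SysTick_Config(SystemCoreClock / 1000);
}

void delay_ms(uint32_t ms)
{
    uint32_t start = g_ms_ticks;
    while ((uint32_t)(g_ms_ticks - start) < ms)
        ;  /* busy-wait; unsigned math makes the comparison wrap-safe */
}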

The GPIO_ library is great for getting projects up and running quickly. For high-speed performance-sensitive designs, however, there may be an unacceptable amount of overhead involved in calling a library function every time an I/O pin needs to be manipulated. In practice, the compiler may inline these functions when building Release configuration code, but I didn’t investigate to confirm it.

When powered by USB, the voltage at the ST Discovery board’s 3V3 pin is only about 2.97V, with no load. That might cause problems for some devices, but the Nokia 5110 LCD worked fine at that voltage.

 

Conclusions

Are the Cortex M3 and STM32 Discovery board what you’ve been looking for? In what kinds of projects do they fit best?

As an alternative to an Arduino or vanilla AVR board, the STM32 Discovery board looks like a clear winner in terms of price and hardware. It has more I/Os, more peripherals, larger memories, a faster core clock, and more performance per MHz due to its 32-bit internal design. All that, and it’s priced at less than half the cost of an Arduino.

What the STM32 Discovery lacks is easy-to-use development tools, and a large community of other users to collaborate with. The method I used to create a toolchain is probably too complex for many beginners, and they won’t be able to afford the commercial alternatives, so they’ll likely be shut out of Cortex M3 development entirely. Even if they do get the toolchain installed successfully, they won’t find the wealth of examples, 3rd-party libraries, or other support materials that exist in the AVR world. The value of that support shouldn’t be underestimated. Performance and memory aren’t everything.

The STM32 Discovery board is better matched for experienced microcontroller developers who’ve already done some work outside the Arduino environment, and are ready to accept additional complexity in exchange for major hardware improvements. The 24 MHz 32-bit Cortex M3 on the STM32 Discovery board is already a nice step up from 8-bit AVRs, but it’s just the beginning of the Cortex M3 line. Higher-end members of the Cortex M3 family can perform complex real-time tasks not possible on an AVR, drive high resolution color displays, run a real-time micro-OS capable of scheduling many programs at once, or even run Linux. For those who need this kind of power, the Cortex M3 is a great choice.


Xilinx vs. Altera Tools For Hobbyists

I used to believe that Altera’s FPGA tools were much more hobbyist-friendly than the comparable Xilinx tools, and I frequently bashed the Xilinx tools whenever the topic came up. But after giving them a head-to-head comparison recently, I think I may have to eat my words. The truth is they’re both pretty rough and clunky, and difficult for a beginner to get the hang of, but the Xilinx tools are definitely superior in some important areas.

My FPGA apprenticeship started out poorly with Xilinx in 2009. Over the couple of years I’ve owned a Xilinx Spartan 3A FPGA starter kit, I’ve learned to really hate it and the confusing Xilinx tools and documentation. Trying to get the DDR2 DRAM working on the Xilinx board was an exercise in futility that occupied several months of my time, and I eventually just gave up, as I couldn’t even get the reference design to work. The Spartan 3A starter kit hardware also seems needlessly complex, like they threw one of every possible component on there just to serve as an example. That makes it a confusing mass of jumpers, options, and shared pins that obscures whatever you’re trying to create. Too often I also found the Xilinx documentation and examples incomprehensible, and their online support ranged from poor to nonexistent. Eventually I gave up on the company entirely, and vowed to only use parts from their competitor Altera in the future.

A year or so later, the opportunity to try Altera hardware and tools came during the development of Tiny CPU. It was a mostly positive experience, although I didn’t attempt anything very complex. When I needed it, I found the Altera documentation to be decent, and the project went forward without hitting any Altera-specific snags. I viewed the result as promising, but not conclusively better than Xilinx.

During the recent development of Plus Too, I’ve finally had an opportunity to try both Xilinx and Altera tools for the same project, and make a direct comparison. I first spent about a week getting things set up on the Xilinx board, which culminated in the “Sad Mac” I wrote about yesterday. Then for the past two days, I’ve been translating the existing design to get it working on the Altera board. I love the Altera DE1 hardware– it’s uncluttered, has SRAM *and* SDRAM, and comes with a nice program that can be used to interactively control the hardware or read/write the on-board memory. When it comes to the tools, however, moving from Xilinx to Altera definitely felt like taking a step backward.

Windows 3.1 called. It wants its interface back.

My first complaint about the Altera tools is the interface, which is a UI gem straight out of 1993. Yes, I know it’s a petty complaint, but it reinforces the feeling of cruftiness that permeates everything else in the Altera tools. Check out a couple of screen shots:

Those message tabs remind me of MSVC 6.0. And the navbar icons use about nine unique colors across the whole set. And what’s with the balloon help menu?

Here are the corresponding sections of the Xilinx interface for comparison:

 

That feature is not licensed and has been disabled

Another gripe about the Altera tools is that so many features have been locked out of the free edition. I understand they need to hold something back for the professional edition of their tool, but some of the things they lock just seem petty. After a full compilation run, the output window will be full of warnings about all these tantalizing features you’re not getting. For example, if you synthesize your design on a computer with a multi-core CPU (pretty much any CPU these days), you’ll get this warning:

WARNING: Parallel compilation is not licensed and has been disabled

Thanks for nothing, Altera. The Xilinx tools happily spin off multiple threads to use all of my CPU cores, and tell me they’re doing it, too.

What simulation?

An essential part of FPGA development is simulating the design, because it’s generally much easier to find mistakes in simulation than in the real hardware. You can view every waveform, step through time, set breakpoints, and do other sorts of things you’d expect when debugging a conventional programming language like C. With the Xilinx tools, I was able to simulate the Plus Too design by switching to the Simulation view and double-clicking Simulate Behavioral Model. The built-in simulation tool ISim started right up, and within moments I was debugging the design, watching the simulated CPU talk to simulated RAM, ROM, and video. Professionals might need something more powerful than ISim, but it was great for my needs.

The Altera simulation experience was a nightmare in comparison. It took me some time to realize that the Altera Quartus II software doesn’t include any built-in simulator, so I wasted quite a while assuming I was doing something wrong when simulation didn’t work. Altera recommends Modelsim Altera Edition, a separate product that must be downloaded and installed on its own. Once that’s done, you need to go back into the Altera software and tell it to use Modelsim as the simulation tool, which involved more poking around in menus that are doubtless familiar to pros but took me a while to discover.

Once I had Modelsim AE launching with my design, I thought I was home free. Instead, I was greeted by a laundry list of errors like “Unresolved defparam reference to ‘altpll_component’ in altpll_component.port_extclk3.” After some more swearing and poking around, I found that Modelsim was relying on some environment variables that weren’t set. Environment variables… OK. I made the necessary environment settings, but it still didn’t work. Modelsim seemed unable to parse the definitions for any Altera megafunctions (their IP blocks), and a few hours of Googling didn’t turn up any obvious solutions. The problem appeared to be related to instantiating megafunctions with VHDL implementations from inside a Verilog file, but all my megafunctions were created by the Altera wizard, so they should be fine, right? Wrong. Eventually I hand-edited the modelsim.ini file to force it to use Verilog implementations of the megafunctions instead of VHDL ones, and that worked. There was probably some simpler way to do it, but I never found it.
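
For anyone hitting the same wall, the edit was in the [Library] section of modelsim.ini, something along these lines. I’m reconstructing this from memory, so treat the exact library names and paths as approximate, since they vary by installation:

[Library]
; remap the megafunction simulation library to the precompiled
; Verilog version instead of the VHDL one (path is an example only)
altera_mf = $MODEL_TECH/../altera/verilog/altera_mf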

You got your VHDL in my Verilog

Once the simulation model finally compiled successfully, I was ready to start debugging, only to be met with the error message “ALTERA version supports only a single HDL”. Huh? The translation is “you can’t simulate designs containing both Verilog and VHDL files unless you buy the commercial version of ModelSim for over a thousand dollars”. Since my design files are Verilog but the TG68 68000 core is VHDL, I was dead in the water. In contrast, Xilinx’s ISim simulated all of this with ease and no complaints.

It’s the software, stupid

For most of us electronics hobbyists who are interested in FPGAs, the choice of what device or board to use isn’t really determined by which has the most 18-bit multipliers or other whiz-bang features. It’s not really determined by cost, either, since generally we’re only buying one device. Instead, it’s determined by how easy the device is to use, and how quickly we can accomplish our goals with the hardware. The best software tools are like a trusty set of wrenches that let us quickly open things up and tinker with them, focusing our attention on the novel parts of the project. Poor tools force you to spend time thinking about them instead of your project, and I wish there were better FPGA tool options for hobbyists. Between the two major FPGA vendors, my nod goes to the Xilinx tools if you care strongly about simulation.

Read 19 comments and join the conversation 

Understanding Verilog Warnings

Those of you who’ve followed the blog for a while know about my many frustrations with Verilog. Because it feels sort of like a procedural programming language, but very definitely isn’t one, I keep expecting to be far more competent at Verilog design than I actually am. While working on Plus Too, the Xilinx synthesis tool reported many, many warnings that I didn’t understand. The warning list grew to at least 100 entries, and was so long that I just stopped reading it. That was dangerous, as most of the warnings were likely real problems that needed to be addressed.

I’ve been writing C and C++ programs for years, and I’m very comfortable with the language, its details, and the compiler warnings and errors produced by various mistakes. I normally  find the warnings easy to understand, because they reference a specific file and line number, and use well-known terminology to describe the problem. Sure, some more obscure errors like “not an lvalue” would probably flummox a beginner, but at least he’d know what line to scrutinize.

Most Verilog warnings I see are non-localized, and do not reference a specific file or line number. They are design-wide warnings, resulting from an analysis of all the modules in all the .v files. This can make it unclear where to even begin looking for the cause of a warning. A typical example is something like:

Xst:647 – Input <vblank> is never used. This port will be preserved and left unconnected if it belongs to a top-level block or it belongs to a sub-block and the hierarchy of this sub-block is preserved.

OK, there’s an unused input named vblank. But where? The vblank signal is routed through half a dozen different modules in the design, so how do I know which one I messed up? The only solution I’ve found is to search the whole project for all references to vblank, and verify each one. I also find that error message much too wordy.
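
To illustrate what triggers it, here’s a contrived module (not actual Plus Too code) that declares an input and never reads it:

// The vblank port is declared but never referenced in the module body,
// which is exactly the situation Xst:647 is complaining about.
module videoCounter(
    input clk,
    input vblank,             // never used below: triggers the warning
    output reg [9:0] count
);
    always @(posedge clk)
        count <= count + 1'b1;
endmodule

The fix is either to use the port or remove it, but first you have to figure out which of the half-dozen modules the warning is talking about.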

Another example:

Xst:646 – Signal <ramAddr<0>> is assigned but never used. This unconnected signal will be trimmed during the optimization process.

This is basically the same as the first example, but has a totally different warning message. Why? Because one is a single combinatorial output, and the other is a bit in a register? Then there’s this:

Xst:2677 – Node <ac0/vt/videoAddr_17> of sequential type is unconnected in block <plusToo_top>

It’s essentially the same issue again, but yet another totally different warning message. This time it gives the name of the offending module, so it should be easier to track down.

The general meaning of all these warnings is fairly clear: some expected signal connections are missing. Find the problem, and either add the missing connection, or suppress the warning if the unconnected signal is intentional. There were two other warnings I saw frequently whose meanings were definitely not clear to me, however:

Xst:2042 – Unit dataController_top: 34 internal tristates are replaced by logic (pull-up yes): cpuData<0>, cpuData<10>, cpuData<11>, cpuData<12>, cpuData<13>, cpuData<14>, cpuData<15>, cpuData<1>, cpuData<2>, cpuData<3>, cpuData<4>, cpuData<5>, cpuData<6>, cpuData<7>, cpuData<8>, cpuData<9>, mouseClk, mouseData, ramData<0>, ramData<10>, ramData<11>, ramData<12>, ramData<13>, ramData<14>, ramData<15>, ramData<1>, ramData<2>, ramData<3>, ramData<4>, ramData<5>, ramData<6>, ramData<7>, ramData<8>, ramData<9>.

Um, what? This meant nothing to me. I wasn’t even sure if replacing internal tristates with logic was good or bad. The Xilinx tool shows each warning as a link you can click to get more info, but sadly it doesn’t work. Clicking the link just opens a web browser and does a search on the Xilinx site for “Xst:2042”, which returns no results. In fact, none of the synthesis warning links work. If a warning doesn’t make sense to you, you’re on your own.

After a lot of searching around on other web sites, I finally found a decent explanation. It seems that some (or all?) Xilinx devices do not support tristate logic (a signal with an output enable) anywhere except on the actual I/O pins. Signals internal to the FPGA cannot be tristate. Tristate logic is typically used to let multiple drivers share a single bus, one at a time. So instead of using internal tristates, you need to add logic that selects which module’s data should appear on the shared internal bus, using a mux or similar method.
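
To make that concrete, here’s a minimal sketch of the mux approach; the module and signal names are hypothetical, not from the actual Plus Too source:

// Shared internal bus without tristates: each would-be driver gets its
// own output, and a mux selects which one appears on the bus.
module busMux(
    input [1:0] busSel,            // selects the active bus driver
    input [15:0] cpuDataOut,       // value the CPU module wants to drive
    input [15:0] ramDataOut,       // value the RAM controller wants to drive
    input [15:0] videoDataOut,     // value the video controller wants to drive
    output reg [15:0] internalBus  // the shared internal bus
);
    always @(*) begin
        case (busSel)
            2'b00:   internalBus = cpuDataOut;
            2'b01:   internalBus = ramDataOut;
            2'b10:   internalBus = videoDataOut;
            default: internalBus = 16'hFFFF;  // idle value, like a pulled-up bus
        endcase
    end
endmodule

The cost is a little extra logic and the loss of the one-driver-at-a-time abstraction, but it maps directly onto what the FPGA fabric can actually build.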

That mostly makes sense, but I’m using the FPGA to simulate a system of separate parts (address controller, data controller, CPU, RAM, etc) that will eventually be physically separate chips communicating with tristate logic on shared busses. I don’t want to rewrite my design to eliminate tristate logic, because tristate logic is what will be used for these chips. For now I’ve left the logic as is, and I’m ignoring the warnings, and it seems to be working OK. I’m unclear exactly what the synthesis tool has substituted for the internal tristates, though– “logic (pull-up yes)”? What is that, and what problems might it cause?

The other confusing warning that’s been plaguing the design is:

Xst:2170 – Unit plusToo_top : the following signal(s) form a combinatorial loop: ramData<0>, ramData<0>LogicTrst20.

Xst:2170 – Unit plusToo_top : the following signal(s) form a combinatorial loop: ramData<1>, ramData<1>LogicTrst20.

…and so on, for every bit of ramData. This stems from my attempt to specify a bidirectional bus driver akin to a 74LS245:

assign ramData = (dataBusDriverEnable == 1'b1 && cpuRWn == 1'b0) ? cpuData : 16'hZZZZ;
assign cpuData = (dataBusDriverEnable == 1'b1 && cpuRWn == 1'b1) ? ramData : 16'hZZZZ;

This driver has ramData on one side, and cpuData on the other. When it’s enabled, it drives data from one side to the other. The direction in which data is driven is determined by the CPU read/write line. So why does this form a combinatorial loop? I’d expect to see that warning for something like:

assign a = b & c;

assign b = a & d;

but my bus driver code looks OK to me. I still haven’t found an explanation for this one, but I think it’s related to the previous issue about internal tristates. The synthesis tool is probably replacing my bidirectional bus driver tristates with some other logic, which then forms a combinatorial loop. I’m not sure how to fix this one without rewriting the design to use a different method than tristates. But again the final project will see ramData and cpuData on I/O pins connected to other chips using tristates, so I don’t want to rewrite the design.
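
Here’s my guess at what the substituted logic effectively looks like after synthesis. This is a reconstruction for illustration, not actual XST output:

// If the Z states are replaced by a pull-up value (the "pull-up yes"
// part of warning Xst:2042, I assume), then each assign always drives
// its target. Now ramData depends on cpuData and cpuData depends on
// ramData, which is a combinatorial loop, even though only one
// direction is ever logically active at a time.
assign ramData = (dataBusDriverEnable == 1'b1 && cpuRWn == 1'b0) ? cpuData : 16'hFFFF;
assign cpuData = (dataBusDriverEnable == 1'b1 && cpuRWn == 1'b1) ? ramData : 16'hFFFF;

Since the two enable conditions are mutually exclusive, the loop can never actually be active, but the loop check is evidently structural rather than logical, so it gets flagged anyway.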

 

Read 7 comments and join the conversation 
