So What’s With This IoT Stuff Anyway?

Yesterday’s announcement by NXP that it had completed its acquisition of Freescale Semiconductor got me thinking about IoT (Internet of Things) again.  I’ve designed several “things”, and Freescale supplied a number of the microprocessors used in them; we have also used several of Freescale’s Kinetis-series ARM-based microcontrollers in industrial products.  We’ve been hearing a lot of marketing hype about IoT for several years now, and IoT is starting to look a lot like the so-called home automation market, which has been next year’s big thing for the last 30 years.

Two things impede the growth of the home automation market: first, there’s no ‘killer app’ for it; second, it’s simply too complicated for the vast majority of people to deal with.  What’s holding back wider IoT adoption is similar: there’s no universal killer app, ROI (return on investment) can be hard to come by, implementation is complex, and there are security concerns.  Of these, the number one stumbling block to wider IoT deployment is surely ROI.  If a business case can be made for applying IoT, it will get done.  Complex implementation and deployment issues are all solvable given enough time and effort, particularly if there are significant savings to be realized once the system is deployed and in operation.

The NXP merger with Freescale and the resultant company’s focus on automotive electronics points to one likely area for widespread deployment of IoT devices in the future.  Cars are getting ever more complicated and difficult to service.  As the mechanical quality of cars has improved in recent years, the growing electronic content of motor vehicles is increasingly where things go wrong.  In our fleet, the first step in troubleshooting any problem these days is typically to reach for the laptop, connect the OBD-2 diagnostic dongle, and see what the car thinks is wrong with itself.  Most mechanics I know would rather pull an engine than troubleshoot an electrical problem in a modern vehicle.

Having vehicles connected to the internet could help maintain them and troubleshoot issues when the need arises.  Widespread collection of operational and fault data would improve reliability and give mechanics guidance for fixing specific problems quickly, which in turn helps the auto companies by reducing warranty-related costs.  Saab, for example, has a detailed troubleshooting procedure for any number of system issues.  The very last step is always to swap out the ECU (engine control unit), and all replaced ECUs must be returned to Saab for analysis.  They found that 97% of the returned ECUs were functioning perfectly.  This points out that most mechanics and technicians don’t actually troubleshoot problems; they just change boxes until the problem goes away.  Continuous monitoring would likely eliminate a lot of this ‘box changing’, particularly if the vehicle, via the internet, could tell the techs exactly what was needed to fix the problem.  Ah, you say, just another excuse for people not to think and not really learn how things work?  Unfortunately people aren’t learning anyway, so the machines need to take matters into their own hands.
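Most of that dongle-and-laptop ritual boils down to reading and decoding diagnostic trouble codes (DTCs).  As a small illustration, not tied to any particular scan tool, here is how the standard two-byte DTC encoding unpacks into the familiar “P0xxx”-style codes:

```python
def decode_dtc(b1: int, b2: int) -> str:
    """Decode a two-byte OBD-2 diagnostic trouble code.

    The top two bits of the first byte select the system letter
    (Powertrain, Chassis, Body, or U for network), the next two bits
    give the first digit (0-3), and the remaining twelve bits are
    three hexadecimal digits.
    """
    system = "PCBU"[(b1 >> 6) & 0x03]
    return f"{system}{(b1 >> 4) & 0x03}{b1 & 0x0F:X}{b2:02X}"

# The byte pair 0x01 0x43 decodes to the code P0143:
print(decode_dtc(0x01, 0x43))  # → P0143
```

A scan tool simply requests the stored codes (mode 03), applies this decoding to each byte pair, and looks the result up in a fault table.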
The reason the robots are taking over is that people aren’t keeping up with the increasing complexities of modern life.

IoT for the general public is simply not going anywhere.  Why does my refrigerator, microwave, or washing machine need to be connected to the internet, or even to my local network at home?  Blimey, refrigerators have been around for almost 100 years now, and they’ve gotten along fine until now without being “connected”.  Where’s the killer application for universal connectivity in the consumer world?  Looking a little more closely, much of the hype appears to be driven by sales and marketing types looking for new ways to get advertising in front of people.  Instrumenting your pantry is being contemplated so that reminders can be pushed to your smart phone that you’re about to run out of laundry detergent.  This is just too gimmicky for the vast majority of consumers, just like home automation in general.

Finding Value in a Troubled EDA Marketplace

When we recently went to evaluate several PCB EDA packages, being interested in the formal practice of value analysis, we tried to discern which packages provided the most “value” for us based on our needs and requirements. The overall function of a PCB EDA software package is to provide a means for creating an electrical schematic and a database for manufacturing a printed circuit board. This overall function is provided by numerous sub-functions in the EDA software package. Value, then, is simply:

Value = Function / Cost

The customer (us) pays for functions that do the things we want done. If functions satisfy customer requirements at minimal cost, they can be said to have good value.  So, does that mean a free and open-source EDA tool such as KiCad has extremely high value? It depends. For a hobbyist with minimal funds, a PCB project to build, and no project deadlines, KiCad could be said to provide good value. For the professional user, with deadlines and budgets to contend with plus the challenges of high speed design, KiCad’s limited capabilities and the time and effort required to use it represent huge drawbacks, so it provides very poor value for such a user.

For us, the things that enhance the value of a particular EDA software package are the provision of required capabilities and a work flow that promotes efficiency; we need to get things done quickly and accurately with a minimum of fuss. The actual price of the tool matters little to us, since a tool that provides good value will rapidly return any up-front investment.  It’s like buying quality woodworking tools: a good tool you pay for only once, a poor tool you pay for every time you use it.  The principal problem with the current state of the EDA marketplace in the USA is that people here tend to buy only on price instead of seeking the best value, and that mentality pervades industry purchasing management as well.  As a result the market players are in a race to the bottom.  We’ve seen these trends play out in other industries, and the end result is seldom pretty.
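The value ratio can be made concrete with a toy comparison.  The scores and costs below are entirely made up for illustration; the point is only that a cheap tool with weak function can come out behind a pricier one that does the job, once the cost of your own time is counted:

```python
def value(function_score: float, total_cost: float) -> float:
    """Value-analysis ratio: function delivered per unit of total cost
    (purchase price plus the time and effort spent using the tool)."""
    return function_score / total_cost

# Hypothetical numbers in arbitrary units -- for illustration only.
tools = {
    "Free tool, weak function, lots of labor": value(3.0, 6.0),
    "Mid-priced tool, decent function":        value(6.0, 4.0),
    "Pricier tool, strong function, fast":     value(9.0, 5.0),
}
for name, v in sorted(tools.items(), key=lambda kv: -kv[1]):
    print(f"{v:.2f}  {name}")
```

On these made-up numbers the “free” option finishes last: zero purchase price doesn’t help when the labor term dominates the denominator.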

After starting FOM Systems 11 years ago, I was at first using whatever EDA tools my customers happened to have on hand, so I got exposure to lots of different tools. Recently another design requiring some high speed work (impedance control, differential pair routing, and flight time alignment, i.e. length matching) has come into the shop. As a small outfit specializing in challenging high speed boards we tend to do everything in house. The old paradigm of engineers making the schematics and simply tossing them over the wall to a PCB designer or service bureau doesn’t work for these kinds of projects. High speed boards are, as RF engineers well know, among the most important components in the system, and to create boards like this efficiently and successfully one needs control over the entire PCB creation process. In addition to creating the schematics I also place, route, and prepare the manufacturing data for the boards I design. When routing I’ll often make schematic changes on the fly to improve the routing or to change components; the person routing the board needs an engineer’s knowledge of the circuit when doing high speed designs. So when it comes to tool selection, as an engineer and business owner, I’m interested in two things: getting work done as easily as possible without the tools getting in my way, and getting it done quickly and accurately to minimize my time on the job.

Over the last several months we have been evaluating a number of PCB EDA packages: Eagle CAD from CadSoft, Altium, KiCad, CADSTAR, and most recently the new release of PADS VX.0 from Mentor. It’s interesting that KiCad (which is open source) seems to have about the same capabilities as Eagle, but KiCad’s interactive router is much more pleasant to use; the interactive router in Eagle about makes me crazy. Both, however, are suitable only for the simplest two- or four-layer boards.
KiCad hit the wall when I tried to create a footprint for a Texas Instruments PowerPAD VQFP package, since it can’t add shapes to copper layers other than pads.  KiCad’s other big limitation is that it has no library manager; that alone makes it virtually useless for serious commercial work. Using Altium is an exercise in frustration: the user interface is simply too complex, the learning curve too steep, and it takes far too long to get anything done. Altium’s new Chinese owners seem to be trying to buy the market by lowballing everyone else, but even if it were free it still wouldn’t represent good value for me.

Then just last week we began evaluating Mentor PADS VX.0. Golly, their sales and marketing folks do a nice job, especially with the web-based evaluations: little “sandboxes” you can launch the tools in and play around with. But there’s still the issue of a confusing integration of multiple disparate tool sets.  This was the big knock I had against PADS 10 years ago, and it looks like not much has changed. Why do they have PADS Logic AND DxDesigner in the same package? There are lots of confusing options for symbol and footprint creation as well, and making library entries was quite cumbersome and confusing. It’s worse than the BoardStation 500 days and the “Package” utility; we used to say that anybody could design with Mentor, but it was Package that separated the men from the boys. Then we tried their high speed tools. It was disappointing how lame the interactive router was, and adjusting lengthening patterns was very frustrating. It was also disappointing that the lengthening patterns used segmented traces rather than rounded traces of constant width. This is bad, folks: it’s the segment corners that give rise to impedance lumps and cause problems with EMC.
But it was after building a complex schematic symbol and its footprint, then having to enter all 144 pad names AGAIN when creating the library entry (having already done so when creating the symbol), that I threw in the towel. It had taken me the better part of a day just to make this one part, even with all the helps they provide. What a mishmash. There’s also some pure stupidity in this tool: they call power pins on components “signal pins”. Doh. Why are PCB footprints called “decals” when everyone else in this universe calls them footprints? And why are symbols called ‘schematic decals’ when, also everywhere else, schematic symbols are called just that? Was this tool created by the government or the military? And why does the install require a 1.7 GB download? There seems to be way too much cruft in there.

So here I am, facing a deadline, needing something that just works, can do high speed, and doesn’t get in my way. About 10 years ago I took on a challenge from a customer in Europe to help them get a new project out. They had a PCB EDA tool called CADSTAR from a company I’d never heard of: Zuken Ltd. It seemed pretty lean; the user interface for the schematic editor wasn’t very fancy, but in a lot of ways it reminded me of the old Mentor Design Architect that I liked so well in the old days. At any rate, back then I was able to go from never having seen or used this tool to having a working, complex eight-layer board in my hands in six weeks. I never had any formal training on CADSTAR; I just hacked at it with the (limited) help of its help files. So I decided to go look at CADSTAR once again.

The new CADSTAR 15 release has a gussied-up user interface, but it looks like most of the old features are still there.  In less than an hour I had completed the complex component that I had spent a DAY on with PADS and never managed to finish. In an afternoon I got more done with CADSTAR than I had in the previous week using PADS. CADSTAR isn’t overly bloated; in fact the installation file is smaller than KiCad’s. It installed with no issues and just worked right out of the box. CADSTAR doesn’t seem to be very demanding of the hardware either, as it runs almost as well on my old Dell laptop as it does on my i7 workstation. But the best part is, it just works, and works fast. Not much mouse clicking is needed either, as the keyboard shortcuts are set up well and are easily customized. I mouse left-handed, so I’ve swapped the positions of all the function keys and mapped a few favourites to F8 through F12. As for high speed routing, CADSTAR has far more options than PADS when it comes to setting up various styles of lengthening.
CADSTAR’s constraint management also works a lot better: it’s far easier to set up hierarchical skew groups for matched length routing. Adjusting previously routed matched length traces with the PREditor XR 5000 HS routing tool is dead easy and super fast; I can really fly with this tool. What’s surprising is how much simpler the whole CADSTAR experience is compared to PADS and Altium, and I can get higher quality work out in a fraction of the time. So where have they been hiding this tool all these years? Everyone in America has heard of OrCAD, Mentor, Altium, Eagle, and some others, but nobody seems to know anything about CADSTAR. Ironically, outside of North America it’s the best-selling PCB EDA package. Could it be that the rest of the world works harder at seeking value instead of just looking at the initial price? And the initial price isn’t that bad either.

Looking for CADSTAR information?  The latest information is here.

OMAP Noisy NTSC/PAL Video Issues.

The Orion project, mentioned in another post here, has video output capabilities like those of the Nokia N900 and other smart phones using the TI OMAP35xx-series processors.  These chips have dual 10-bit D/A converters for generating analog video signals, either in S-Video format (separate luminance and chrominance) or as composite NTSC or PAL signals.  Video out had been tested early on with the first version of the board, but the final production board recently failed FCC Part 15 at the test lab, with the issue traced to the video circuit.  Even with the video output cable properly terminated in 75 ohms there were massive parasitic oscillations on the sync and blanking portions of the signal, and in some instances on the entire video signal over the whole frame.  Based on the very limited information in the TI Technical Reference Manual for the OMAP35xx processor and information from other applications, we had used the feedback circuit pictured below for the composite video output:


OMAP Video Feedback Circuit

Unfortunately, with the latest version of the OMAP silicon on the most recent boards, the video output, even with proper 75 ohm termination, exhibited nasty parasitics as shown below:


Horizontal Interval

Creating a test schematic in Linear Tech’s LTspice IV and running some analysis on it gave an indication of where the trouble might be.  As shown in the plot below, the pole in the resonant circuit was around 12.7 MHz, whereas the parasitics were between 16 MHz and 19.3 MHz.  By manipulating the values of R75 and C144 shown in the schematic above, the pole in the network’s response was moved out to about 20 MHz.  After doing the plots and determining the component values it was time to modify the board.
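As a rough sanity check on component values before refining them in the simulator, the single-pole RC corner frequency is a handy starting point.  This is only a first-order estimate and does not reproduce the actual resonant feedback network, which is why the SPICE plots were needed to land on the final values:

```python
import math

def rc_pole_mhz(r_ohms: float, c_farads: float) -> float:
    """First-order RC corner frequency f = 1 / (2*pi*R*C), in MHz."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads) / 1e6

# First-order estimate for the modified values (1.00 kOhm, 18 pF); the
# full resonant network's pole, per the SPICE analysis, sits higher.
print(f"{rc_pole_mhz(1000.0, 18e-12):.1f} MHz")
```

The gap between this back-of-the-envelope number and the simulated pole is exactly why the SPICE model earns its keep here.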

Spice Plot, Response of Feedback Network (original values)

Spice Plot of Feedback Network, new values.

After modifying the board with the new values for R75 (1.00 kΩ) and C144 (18 pF) we measured the video again.  Note that the baseline amount of feedback increased by about 4 dB and the filter’s peak response narrowed somewhat.  Nonetheless, the video output is now stable once again:

Horizontal Interval (NTSC) After Feedback Network Changes

The TI Technical Reference Manual for the OMAP processor mentions that the peak-to-peak amplitude of the video output signal is low, but claims this is OK.  I beg to differ.  Indeed it is about 600 mV to 700 mV peak to peak, whereas NTSC video, sync tip to peak white, is 1.0 V peak to peak.  Some TVs can probably work with this, but my Panasonic plasma display would not lock onto it: I could see stuff parading across the screen, but the display always complained that there was no signal on the analog input.  So beware.  NTSC composite output is probably of little value these days anyway, except on a system like the Orion that has no other video display method.  But since video output is not a primary use case for this product, the video quality (or lack thereof) isn’t a show-stopping concern.
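To put that amplitude shortfall in perspective, a quick calculation (taking 650 mV as the midpoint of the observed range against the 1.0 V nominal) shows the output is nearly 4 dB low, which goes some way toward explaining why a fussier display refuses to declare a valid signal:

```python
import math

def level_shortfall_db(measured_vpp: float, nominal_vpp: float = 1.0) -> float:
    """Shortfall of a measured signal level relative to nominal, in dB
    (negative means below nominal)."""
    return 20.0 * math.log10(measured_vpp / nominal_vpp)

# Midpoint of the observed 600-700 mV range vs. the 1.0 Vpp NTSC nominal.
print(f"{level_shortfall_db(0.65):.1f} dB")  # about -3.7 dB
```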