Tuesday, January 13, 2009

Closed Door To Open Source

Perhaps if I recompiled the tool source code and tried again?

For those of you who are interested in chip design software and all things EDA, there's a very good new blog at the Electronics Design News site. (Disclosure: I know the author; we worked together for a couple of years.)

Today it asks why there are no real commercial instances of open source software taking the EDA world by storm. A lot of very reasonable explanations are posited, but in essence they all come down to one thing: "the market is too small". (In the comments, another answer - "there's no extant base of code with which to kick-start the process like there was in the operating system world" - is, I think, just another consequence of the same problem.)

Whilst I agree with the notion that EDA is a small market (see below), it's actually quite a lucrative one. Roughly speaking, the EDA tools business is in the $4 billion range, not an inconsiderable sum by any means. In comparison, the embedded software world, a sector that also gets a mention in the post, is less than half that size. And it's worth noting, of course, that open source is playing a significant role in that space, albeit a fairly static one; the word "spoiler" comes to mind ...

Although the scope of the EDA market is indeed a factor, it's not the dollar size that matters so much as the number of active projects using those tools. Received wisdom is that the number of new chip starts is declining, a state of affairs made worse, at least for the tools vendors, by the fact that the complexity of those that are underway is actually increasing. It's also the case that EDA suffers from a very Balkanised work-flow, each stage of which has its own discrete set of tools (synthesis, place-and-route, verification, etc.), meaning that a vendor wanting to offer a complete solution has to invest in multiple areas at once. Even so, and given the costs of these tools, you would have thought that some open-source effort would have gained real traction somewhere along the line, but it hasn't and, I'd contend, it likely won't.

Regardless of whether it's open- or closed-source, the quality of the end result of any significant software development project is still largely determined by how much testing can be done. The reality of the situation, though, is that this is almost all done after the event, when the project is largely complete or, worse, by the customer once the solution has been fielded. (Early releases of Vista, for example?)

In the open source world, this basic methodological approach means you throw something out there and wait for it to be applied to real projects. Those projects find the problems and the community provides the fixes. This is a fine and noble approach if we are talking about an operating system or a network stack; it would be an unmitigated disaster if we were talking silicon design.

Taping out a new chip these days runs in the $3m to $5m range. Not only is each design very costly to create, it also has a very long initial testing cycle, often on the order of months. Oh, and just to cap it all, what you are working on is likely leading-edge, highly proprietary IP that's guarded more jealously than a junkyard dog protects his bone. Net-net? As a user, there's no way on earth you can mature an open-source initiative without a) risking millions of dollars, b) losing a key market window, or c) exposing your latest highly-secret chip designs to all and sundry.

Where does this leave us? On the plus side, EDA is unlikely to find itself threatened by open-source offerings anytime soon, if ever. It's a large market, at least when judged by the standards of tool offerings in other sectors; EDA tools are a must-have, not a nice-to-have, so you have lock-in like you wouldn't believe; and there are very few competitors, so once you have set up your tent it should be plain sailing until you reach happy-ever-after land, all fat, dumb and happy, right?

Not so fast. Remember what we've already covered. R&D costs are extremely high in order to keep pace with the underlying physical advances that are driving ever-shrinking process geometries. Roughly speaking, every two generations of smaller node sizes requires one complete re-spin of EDA technology in order to complete the necessary design work. We've already seen that design starts are declining, so the active players are chasing an ever-dwindling number of fish in the overall EDA barrel. Oh, and the Valley hates you like there is no tomorrow.

EDA used to be a great market in which start-ups could get founded, develop innovative solutions and then get acquired by one of the handful of active players such as Cadence, Synopsys, Mentor, etc. But then it stopped. Those same companies said "enough", and switched instead to trying to develop organically. This left the VCs carrying a whole bunch of EDA start-ups that had been funded on the basis that the gravy train would roll along forever and for which, all of a sudden, there was now no way out. (Sound familiar? Social networking clones, anyone? Sequoia's Web 2.0 meltdown message?) The Valley hath no fury like a bunch of VCs scorned and, all elephant-like, they have very long memories. Consequently, getting a new start-up - either semiconductor- or EDA-based - funded in 2009 will be next to impossible unless something dramatically changes.

"No market" ultimately impacts much more than whether or not open-source solutions arise, it fundamentally prohibits innovation. Even if you don't buy this premise one iota, it's nevertheless a self-fulfilling prophecy: just a perception of there being no market is enough to cause this state of affairs to come into being, and that's exactly where EDA sits today.
