Diversify Or Die

With Moore’s Law running into barriers all over the place, it is becoming clear that the shrinking of digital circuits is not going to be a particularly profitable pursuit.


When, every two years, you could double an IC’s functions, double its speed and halve its power requirement, it was not difficult to sell your new products. Even the most intransigent customer couldn’t resist that sort of sell.

But now, power dissipation goes up with each generation, speed has to stay the same or dissipation goes up even more and, while shrinking can still add functions or reduce cost-per-function, that, by itself, is not so compelling a sell.

So, to create value and profits, some IC companies are thinking horizontally: What do we do well that we can make and sell?

Sharp, one of the most lateral-thinking IC companies, is using its microelectronics experience and knowledge to diversify widely.

One move is hunting around for places to put its solar cells – like on car roofs, cameras and cellphones.

Another diversification is to make low-power units that emit charged ions to clear the air of disease-carrying particles and viruses in homes, cars, aeroplanes, trains and buses.

A third diversification is into all the smart meter/smart grid stuff; and a fourth is LED lighting – which uses 12% of the power of incandescent lights.

This seems an attractive way of escaping the effects of the demise of digital scaling – using digital-scaling expertise to do something else.

A few years ago, the great Andy Grove, Co-Founder, and former CEO and Chairman of Intel, tried to persuade Intel to do something different. He suggested Intel should diversify into electric batteries for cars.

Intel, run these days by a very different breed of men to founder-scientists Bob Noyce, Gordon Moore and Andy Grove, couldn’t get their heads around the proposition.

Presumably they prefer to stick to shrinking digital circuits until the decline in the company’s share price forces the management to think outside their blinkers.

Some managements, like Sharp and STMicroelectronics, don’t have to be forced. Ten years ago, ST diversified into MEMS – a business sector that is now delivering a sustainable, profitable business.

All over the industry you see people doing interesting things with microelectronics – like using it to analyse swabs and blood samples, developing sensors for a myriad of applications, and building energy-harvesting products and ad hoc networks.

Pity the digital shrinkers because their days are numbered; for the survivors, it’s diversify or die.

TOMORROW MORNING: Ten Best Books About Chips


Comments


  1. Ian, I agree heat is a problem, but we have the technology to remove roughly 250W/cm2 from an IC, so we aren’t quite at that limit yet. Also, a single-core processor at 22nm would be far smaller than 1cm2. Yes, it’s expensive and definitely not green, but the demand is there: while running parallel cores works fine for many applications, too many others have been stuck at 3GHz for far too long.
    The problem Intel Israel struck, I was told, was that the EDA tools available at the time simply couldn’t resolve and balance all the physical data-path timing delays at those speeds. However, the tools are ever improving, and now include much better power-control techniques as well, so I believe it could be time for Intel to try again.

  2. The real problem with 5GHZ and 10GHz CPUs isn’t that they can’t be built — they can — but that increasing clock rate increases the throughput in GOPS per CPU core but it also increases power consumption per operation.
    If you have a given amount of processing to do, it uses less power to have more parallel processors running slower than one fast processor. This is why chips processing very high-speed data (e.g. 100Gb/s) process many bits in parallel at a low rate (e.g. 200b at 500MHz) instead of fewer bits at a higher rate (e.g. 20b at 5GHz); the power efficiency is much better.
    This has become an issue now because of the insane amount of logic that can be packed onto one chip and clocked very fast; you can design a CMOS chip right now which would dissipate >100W/cm2 if clocked at 5GHz.
    It’s not tools causing this problem, it’s physics.
    Ian
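The power argument in this comment can be sketched numerically. Using the classic dynamic-power model P = C·V²·f, and assuming supply voltage can be lowered along with clock frequency, two slow cores deliver the same aggregate throughput as one fast core at much lower power. All the figures below are illustrative assumptions, not measured values:

```python
# Illustrative sketch of the parallel-vs-serial power trade-off using the
# classic dynamic-power model P = C * V^2 * f. All numbers are hypothetical.

def dynamic_power(c_eff, vdd, freq):
    """Switching power of one core: P = C * V^2 * f (watts)."""
    return c_eff * vdd**2 * freq

C_EFF = 1e-9                  # effective switched capacitance per core (F), assumed
V_FAST, F_FAST = 1.2, 5e9     # one core at 5 GHz, higher Vdd
V_SLOW, F_SLOW = 0.8, 2.5e9   # two cores at 2.5 GHz, lower Vdd (assumed scaling)

p_serial = dynamic_power(C_EFF, V_FAST, F_FAST)        # 1 fast core
p_parallel = 2 * dynamic_power(C_EFF, V_SLOW, F_SLOW)  # 2 slow cores, same GOPS

print(f"one 5 GHz core:    {p_serial:.2f} W")    # 7.20 W
print(f"two 2.5 GHz cores: {p_parallel:.2f} W")  # 3.20 W
```

Throughput is the same in both cases, but power falls with V² while throughput falls only linearly with f, which is exactly why wide-and-slow beats narrow-and-fast on energy per operation.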

  3. Well, the brain happily chugs along at a few hundred Hz and does OK, so parallel processing can be acceptable. But I agree Intel shouldn’t have just given up on fast micros, as there is a huge market out there waiting for them when they do get round to shipping the 5GHz and, eventually, 10GHz processors they should have been shipping for years now. Multicore seems to have diverted their attention from solving the real problem, but hopefully they will refocus sometime soon, as it’s not an insurmountable problem, just one that the available EDA tools couldn’t solve.

  4. The cost cutting per generation of Moore’s Law improvements may have been helped by ever larger economies of scale. So, six inch wafer fabs beat four inch wafer fabs because the worldwide market had grown rapidly enough to pull demand to make full use of six inch wafers. When demand grew quickly enough it was justified to move up to 8 inch silicon, and so on.
    At some point, the next wafer fab can only be on the same wafer size as the previous one, or smaller, so without further process shrinks it does not deliver the cost saving which every consumer purchaser is so used to. Would people throw away their next swanky new computer to get the model after that?
    I’m in a house with three competent PCs at one desk and another three elsewhere. The 1GHz, ten-year-old hardware works fine for a lot of normal use if sensible allocations are made.
    We have literally as many keyboards, monitors and printers as a mid-nineties, mid-tier university physics research team, with two orders of magnitude more processing capacity, and we leave most of it either idle or doing whatever unproductive background tasks go on in modern computers.
    In the long run we would want to consolidate to a smaller number of boxes with much less power consumption and less visible domestic clutter.
    I have a long memory and notice how much churning by my PC is non-productive. I notice how it performs, in common home office use, no better than a clean install of software matched to ten year old hardware.
    There must exist a size of wafer fab which is simply too big to provide anything other than overcapacity. The expected consequence of having a few of those running is wildly exuberant advertising that tries to push consumers into purchase decisions before the backlog of bills comes due.

  5. That, Robert, is an extremely good point. As ever, it’s the standard programmable part which benefits best from the manufacturing economics of the industry.

  6. Diversification has many built-in problems and is fundamentally incompatible with high-volume, batch-oriented production. This problem is only made more acute by the insane functionality that can be crammed onto a 40nm chip. Even a 1 sq mm die can easily accommodate a full ARM microcontroller, or a whole slew of analog functions. When a single wafer yields 30K good die, you quickly have a problem identifying applications that can consume the output of a single 12-wafer lot.
    I think it will be interesting to see if existing high-volume chips start getting packaged in such a way that the component functions can be used separately. If you think about what is on a GSM baseband chip, you typically have at least an ARM7 core, a complete power-management unit (references, LDOs and buck regulators) plus Li-ion charge circuits, several audio and RF codecs, microphone amps and speaker drivers, plus a whole bunch of GPIO and an LCD display controller, without even thinking about the digital filters and protocol engines. All this for under $2.
    This makes a complete GSM chipset cheaper than a buck converter (from someone like LTC). So I think finding new ways to leverage these high volume devices will be more interesting than developing diversified chips from scratch.
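The volume problem this comment describes is easy to quantify. A crude area-only estimate (ignoring edge geometry and scribe lines, with an assumed 50% combined yield/edge loss) roughly supports the comment’s "30K good die per wafer" figure for a 1 sq mm die on a 300 mm wafer, and shows how large even one lot’s output is:

```python
import math

# Rough sanity check on the comment's numbers: how many 1 mm^2 die fit on a
# 300 mm wafer, and how many parts one 12-wafer lot produces. The 50% loss
# factor is an assumed illustrative figure, not process data.

def gross_die_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Crude estimate: usable wafer area / die area (ignores edge effects)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

gross = gross_die_per_wafer(300, 1.0)  # ~70,000 gross die
good = int(gross * 0.5)                # assume 50% combined yield/edge loss
lot = 12 * good                        # one 12-wafer lot

print(f"gross die/wafer: {gross}")
print(f"good die/wafer:  {good}  (assumed 50% loss)")
print(f"12-wafer lot:    {lot} parts")
```

Even with pessimistic losses, a single lot yields several hundred thousand parts, which is why a niche application cannot absorb the output of high-volume batch production.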

  7. Well surely, Mike, one insurmountable problem has already been reached – Intel couldn’t keep increasing the clock speed of its cores, and that’s why it went multi-core. Whether that’s a problem surmounted, or an insurmountable problem for which a fudge has been found, depends on the eye of the beholder.

  8. I’m not expecting Moore’s Law to pose an insurmountable problem for Intel for many, many generations yet. I would even bet that the first successful commercial molecular and/or quantum computers will have some remnants of the x86 instruction set in them.
    But for companies without the deep pockets of Intel and Samsung, it isn’t so much the limitations of IC process shrinking as the huge costs of doing so that will push them along the More than Moore route.

  9. As Moore’s Law runs into difficulties, another side effect will probably be a gradual migration from inefficient ISAs (x86) to more efficient ones (e.g. ARM). Intel may one day regret selling off its rights to the StrongARM/XScale architecture.
