Developing an understanding of the theory of innovation

Monday, August 17, 2009

When good enough means disruption part 2

Back in April I wrote an article about various potential disruptors built on the advance of cheap CPUs, cheap disk space, and faster, cheaper wireless (mobile or cellular) broadband.

The article was based on the theme of feature and performance overshoot by the mainstream vendors. In pursuit of the highest-margin, price-insensitive and therefore most demanding customers, technology suppliers overshoot mainstream requirements.

This gives low-end disruptors a foothold in the market: they offer significantly lower-priced products which just fulfill consumer requirements or, more importantly for the disruptor, fulfill a new need within the product area which isn't currently serviced by the major players.

A good example of overshoot is the current PC. Most if not all are more than fast enough for every application apart from computer gaming. The major vendors Intel and AMD (and Sun) have moved from packing ever more transistors into a single, faster CPU core to putting more cores on each chip: Dual Core and Quad Core for Intel and AMD, and Niagara for Sun.
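A toy sketch of what this shift means for software (not vendor code, and the function names are my own): instead of relying on one ever-faster core, the same job is split across a pool of worker processes, one per core.

```python
from multiprocessing import Pool

def square(n):
    # The per-item work a single core performs.
    return n * n

def run_on_cores(data, cores=4):
    # Split `data` across `cores` worker processes, e.g. a quad-core CPU.
    with Pool(cores) as pool:
        return pool.map(square, data)

if __name__ == "__main__":
    print(run_on_cores([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

The catch, of course, is that the work has to be parallelisable in the first place, which is exactly why this architectural change favours the workloads the graphics vendors already excel at.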
Interestingly enough, this change in architecture is starting to disrupt the established leaders in multicore chips, i.e. the graphics card vendors Nvidia and ATI.
Current graphics cards can have up to 128 small processing cores per card. Many people are starting to use this massive parallel-computation ability in areas outside of rendering pictures and video for games and movies. The vendors have released Software Development Kits (SDKs) to support this new usage and spur additional demand for their hardware.
This is a case where the technology was complex and the programming model hard to develop code in, and it is now moving towards a standard module/interface, which makes it easier to develop code. In other words, it is going from closed and integrated to open and potentially modular.
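To make the programming model concrete, here is a sketch in plain Python of the style these SDKs (such as Nvidia's CUDA) expose: you write a "kernel" that handles one data element, and the hardware runs thousands of instances at once. The parallel launch is only simulated here with a loop, and the names are illustrative, not from any real SDK.

```python
def saxpy_kernel(i, a, x, y, out):
    # One kernel instance: computes element i of out = a*x + y.
    out[i] = a * x[i] + y[i]

def launch(kernel, n, *args):
    # Stand-in for a parallel kernel launch over n elements;
    # on a GPU these iterations would run concurrently across cores.
    for i in range(n):
        kernel(i, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * 3
launch(saxpy_kernel, 3, 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0]
```

Expressing a problem this way, one independent element at a time, is what makes it portable onto 128 cores without any change to the kernel itself.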
So how are the CPU players Intel and AMD disrupting the graphics card vendors? By pursuing open interfaces on the motherboard to talk to their CPUs, and by adding graphics cores to their CPU architecture.
It used to be that onboard graphics meant poor to substandard graphics performance, hence the requirement for dedicated graphics cards. As time progresses, the CPU vendors can add more cores to their CPUs, some of which can be dedicated graphics cores.
The other disruptor enabled by this open interface is the FPGA (Field-Programmable Gate Array), a reconfigurable chip which can implement in hardware what was originally done purely in software on general-purpose CPUs.

Have Fun

Paul