{continued from previous two posts}
IBM had since disposed of its business school graduates. Gerstner had redirected IBM's research and development abilities into making better products, no longer worrying first about profits. So IBM was a major manufacturer of semiconductors for other companies' products. IBM started production of a new high-K material based on hafnium. While in production, the glass peeled off the semiconductor. IBM, like so many semiconductor manufacturers, could not make hafnium work in production.
Both Intel and AMD were suffering the same problems. AMD processors ran so hot that some users even burned the skin on their legs using an AMD laptop on their lap. But Intel had Moore's Law, which dictated that a high-K solution had to work. As noted so many times previously, how long does it take to develop an innovative product? Four to ten years. Intel CPUs are no different. Intel starts each new CPU by first making a static RAM to prove the process. If it works, then a CPU is constructed around that static memory. But Intel could not make hafnium work.
It took balls and the benchmark called Moore's Law. Intel decided to design the next generation processor using hafnium (high-K transistors) even though the process was not yet working, and knowing about the major disaster that had happened at IBM. Two years later, the risk paid off. Intel processors can be faster while consuming less power than AMD's. Intel's massive risk, if it failed, meant Intel might not have a new product for at least another year. A financial disaster. But again, the benchmark quantified by Moore's Law meant the spreadsheet would see undisturbed profits four to ten years later.
AMD is about to go the way of Zilog, Motorola, and IBM. Intel drove a further nail into AMD's coffin when it developed the Atom processor line, the processor that made netbook computers possible.
Which brings us to current history. But again, let's go back.
In 1990, Apple tried to make a tablet computer, then called a PDA (the Newton). Apple had cofounded a new microprocessor company, Advanced RISC Machines (ARM), in the UK around 1990 to develop a RISC processor. The Motorola 68000 was a CISC processor, and therefore failed. Intel was successful because their architecture was more RISC. For example, only one register (CX) did counting. Two others (SI and DI) did index pointing to memory locations. Only one register (AX) could input/output data to peripherals. Two registers processed a continuous series of data (ie a text string). Clearly RISC was a superior solution. But 1990 Apple was run by business school graduates. So the Newton was a disaster. Apple had to sell its piece of the ARM processor. The near disaster forced the British in ARM to upgrade the business model previously used by DIX (the DEC/Intel/Xerox Ethernet consortium), by the Intel/Harris/AMD et al second-source consortium, and by Compaq's EISA.
ARM had to collect partners to perform different aspects of a processor design. Rather than design, build, and market the entire processor, ARM licensed the basic design. This meant a single semiconductor could be manufactured that also contained an entire computer. Remember the µprocessor and the single-chip computer? This is called a System on a Chip (SoC). Partners include everyone from Cisco to Cadence Design Systems, competitors Qualcomm and Broadcom, Nvidia, and the many PC/104 companies. Even Intel has licensed the processor.
Discussed previously were three bottlenecks: speed, power, and software. Intel held the high ground with products that already had all three solved. But those advantages do not apply to smart phones or tablets. With all new software and a demand more concerned with power (ie fewer chips), the ARM processor is now the dominant processor in those products. Intel's Atom for netbooks may not be sufficient when the high ground has moved like a wave on the ocean. Fundamental to that change was the final arrival of interactive displays, eliminating the keyboard and mouse. Next may be voice recognition.
Well, Moore's Law worked when processor power was important. Moore's Law no longer applies to the next wave of innovation. Computers are now designed inside the integrated circuit (SoC) rather than around a Central Processing Unit. To some extent, that was obvious long ago. Every PC keyboard was always a complete computer (an 8051). Disk drives contain their own computers. Ethernet NICs and WiFi controllers are single chips already containing a computer (SoC). To have software stuck with hardware from only two manufacturers (AMD or Intel) no longer makes sense.
IBM's mainframes were undermined by the mini-computer. Mini-computers were somewhat undermined by workstations. Which in turn became PCs. Then laptops. And now mobile computers (smart phones and tablets). Missing is the shoe phone.
To understand where industry is going, where the jobs are, and how to make money investing, first learn from this history and these trends. What is coming can be found in what already existed. The progression of the processor. The next business model based on what the technical challenges really are. How innovation works when business school concepts are properly disparaged. What happens when top management does not come from where the work gets done. And most important, benchmarks defined by a product-oriented parameter - Moore's Law.
So, what is a benchmark to replace Moore's Law? That simple concept helped drive out business school thinking - it kept America productive. Free market competition alone is not enough. In free market competition, the damage is done long before competition punishes a bad company for not throwing out its management. And that, folks, demonstrates what is necessary to know to make successful investments.
Successful companies must have benchmarks to quantify the only thing that matters. Spreadsheets only report that reality typically four or more years later - too late. For example, Big Pharma has no such quantifier. Essential is to see a product (product-oriented thinking) long before spreadsheets can measure innovation. Intel's long and successful history demonstrates it.
Last edited by tw; 01-13-2012 at 06:43 PM.