Original article appeared at http://www.wired.com/wired/archive/10.07/Nvidia_pr.html
Everything Jen-Hsun Huang needs to know about running a semiconductor company he learned as a boy in eastern Kentucky. It was there, at the Oneida Baptist Institute, two decades before he founded Nvidia, that he developed a fanatical work ethic - scrubbing all the toilets in his three-story dorm every day. It was there that he learned to pursue his passion - taking up table tennis with the help of the guy who came to fill the vending machines, rising to the rank of master, and appearing in the pages of Sports Illustrated by age 14. And it was there that he first showed signs of leadership. Sharing quarters with an illiterate 17-year-old covered in tattoos and knife scars, Huang taught his roommate how to read.
Never mind that his stint in the Bluegrass State was a mistake - that Huang's aunt and uncle, recent immigrants to Tacoma, Washington, who spoke little English, unwittingly sent him to a reform school instead of a prep school. He got three meals a day and escaped the violence and civil unrest his parents faced, first in Taiwan, then in Thailand. "Wow. There it is. I haven't been back since," the 39-year-old Huang says softly, recalling his childhood while pulling up the school's Web site from Nvidia headquarters in Santa Clara, California. "I remember that part of my life more vividly than just about any other."
Two decades from now, Huang may look back on 2002 with similar clarity. Founded in 1993, Nvidia has grown 100 percent annually over the past four years to become the world's largest maker of graphics processing units (GPUs) - the semiconductors that drive ultra-realistic gaming, 3-D imagery, and video in PCs. Last year, the company made $177 million on $1.4 billion in sales while its top competitor, ATI Technologies, lost $54 million. Nvidia has risen to dominance by pushing the power of its high-end GPUs, letting the technology trickle down to cheaper price points, and hitting delivery windows. If that process sounds familiar, it is. In effect, Nvidia has become the Intel of graphics.
Until now, all of this has happened with Intel's blessing, because graphics chips help sell PCs. But Nvidia's new strategy is clear - and risky: to directly challenge Intel for control of the box. Nvidia's GeForce line of GPUs, which sell for as much as $400 retail, has long been a hit with gamers because of its ability to render high-quality 3-D on the fly. To move beyond gaming, Nvidia has partnered with Intel rival AMD to develop the nForce, an integrated chipset designed to handle multimedia tasks, like theater-quality DVD playback. It's a move that puts Nvidia squarely in competition with Intel.
Nvidia executives seem reverent when discussion turns to Intel, but they're quick to drop appearances whenever there's talk about the future of the industry. "What we've done in the past five years is staggering," says VP of investor relations Michael Hara. "What we can do in the next five years is going to blow your mind. In 10 years, we should be bigger than Intel."
To see how that could be possible requires a basic understanding of how semiconductors evolve. A personal computer uses the CPU to translate software instructions and do data-intensive calculations, like those required to run PowerPoint or Excel. For peripheral tasks like audio processing or Ethernet connectivity, the CPU offloads duties to specialized processors. The manufacturer of a specialized chip adds transistors over time, making it more powerful, until it masters its task. Then the chip begins to shrink until, ultimately, its functionality disappears into the CPU or another semiconductor. It's a harsh reality for a company like Creative Labs, which once had a business built on soundcards. Now, basic sound is a giveaway.
The cycle has served Intel well, until now. As the specialized chips around it have become commodified, the CPU has survived thanks to its power and versatility. But when it comes to multimedia - and that's where the demand is - the CPU gives way to the graphics chip, which is hundreds of times more efficient. The latest GeForce, scheduled to launch this summer, will have nearly 120 million transistors - more than double those on a Pentium 4. Unlike other specialized chips, the GPU will not likely shrink so much that it will be swallowed by the CPU. If anything, the reverse could happen. After all, no one needs a speedy 2-GHz CPU to run Excel.
For a perfect example of the changing dynamic between the GPU and CPU, look at the Xbox. It uses a special version of Nvidia's nForce chipset, built around a tricked-out GeForce3 to handle graphics and sound. Microsoft paid Nvidia more than it did Intel for its 733-MHz Pentium III. For Huang, it's a proof of concept. "The Xbox is how the computer will be built in the next 20 years. More semiconductor capacity will go to the user experience," he says. "The microprocessor will be dedicated to other things like artificial intelligence. That trend is helpful to us. It's a trend that's inevitable."
If he's right, then Nvidia will be smartly positioned. Even more so should it encroach on the CPU business. An April 1 story posted on Slashdot announced a merger between Nvidia and AMD - where Huang used to work. It was meant as a joke, but such a merger has been the subject of serious speculation for months. It would give Nvidia total control of the PC's innards and, eventually, an array of post-PC devices. Huang doesn't cop to this plan, but when he talks about expanding into handhelds, dashboards, and cell phones, he never suggests waiting for Intel - or anyone else. "Some people say the network is the computer. We believe the display is the computer," he says. "Anywhere there's a pixel, that's where we want to be."
Huang is Nvidia's amiable patriarch, doling out equal doses of reassuring hugs and tough love. He roams the halls of company headquarters, chatting and laughing with workers, remembering the names of their spouses and asking after their children. But he has little tolerance for screwups. In one legendary meeting, he's said to have ripped into a project team for its tendency to repeat mistakes. "Do you suck?" he asked the stunned employees. "Because if you suck, just get up and say you suck." The message: If you need help, ask for it.
Fact is, Huang knows there's little room for even one mistake in his business, much less the same one twice. It's the nature of the graphics-chip industry: A company rises to leadership only to miss a delivery window and roll over for an upstart with a better technology. Cirrus Logic, 3dfx Interactive, Tseng Labs, S3, Rendition, Chips and Technologies - they were all once leaders; now they're all gone.
Nvidia has sidestepped the boom-and-bust cycle by hewing to a simple philosophy: Technology matters, but the production calendar rules. "The first breath of success for Nvidia came when we recognized that the PC market has a pulse that's regular and predictable," says chief scientist David Kirk. PC manufacturers ship machines to resellers twice a year - in April and August. That means Nvidia has to have a new chip ready each February and June.
The process begins with hundreds of designers developing millions of lines of code for a new chip. When that's done, the bugs are worked out on a couple of $4.5 million Ikos emulator stations, which simulate the semiconductor in software form. Then the design is sent off to Taiwan Semiconductor (see "This Fab for Hire," page 105). When the chip comes back, the testing starts. During a tour of the Nvidia campus a few weeks before the GeForce4 launch, the GPUs were getting a workout in temperatures ranging from 32 to 140 degrees Fahrenheit. What's even trickier than making the deadlines is ensuring you don't hit them too early. Nvidia's architects squeeze every last day out of a production cycle to deliver the most powerful chip possible. "If you show up the day before the deadline and your competitor shows up a month before, you have a 10 percent wind at your back," Kirk says. He knows where his team needs to be on any product, within a day or two, 18 months in advance. "We have superior architecture because we work hard. But that's not enough. We have to show up at the right time."
And be fast. By doubling the number of transistors on its GPUs every six months - three times Moore's law - Nvidia has won favor among desktop OEMs, where the company has a 66 percent share of high-end graphics chips, and in workstations, where 40 percent of machines use Nvidia technology. Although Nvidia was a late entrant to notebooks, it plans to have 20 percent of the market by January. Nvidia's products are now available in Dell's Inspiron line, and Toshiba has even begun plastering the Nvidia logo on the outside of its Satellite model (think Intel Inside) to attract Nvidia zealots - a following so strong that dozens of fan sites chronicle the company's every technological development.
Nvidia has also scored points with game developers, who are happy to have a new toolbox of sophisticated anti-aliasing and shading techniques at their disposal. "I could get in real trouble for saying this," says the CEO of a top games publisher, "but the Xbox is the best console on the market, and Nvidia has a lot to do with that."
It's a quirk of fate that Nvidia had anything to do with the Xbox - or any other Microsoft product. In Nvidia's early days, Huang tried to end-run Microsoft with a proprietary programming interface. The decision, he now admits, almost killed Nvidia. In a move of desperation, he directed his engineers to build GPUs to work with Microsoft's Direct3D standard. It not only saved the company, but established a partnership that eventually led to the contract to develop the chipset for the Xbox - a contract worth as much as $500 million a year.
Here's why the graphics business is a great place to be: Eye candy - the purple glow along the horizon at sunset, a city skyline during a thunderstorm, the wrinkles in a puppy's face, pornography - has power. While computers today mainly convey text information and 2-D images, advances in graphics processing will change what's on our screens. And soon, high-res screens could be everywhere.
It doesn't take much imagination to envision new uses for 3-D imagery. Already, many rental cars come equipped with a satellite-guided, 2-D map and a robo-voice that scolds you for missing a turn. Before long, they'll have 3-D maps, like those being produced by Nvidia partner Keyhole Technologies, with the terrain rendered in real time. You'll know what landmarks to look for, how to route around road construction, and how far to the next In-N-Out Burger. Same for air travel. F-22 fighter pilots already use simulated 3-D environments in the cockpit. Another Nvidia partner, Quantum3D, sees the day when commercial jets will have screens that render airscape in real time to help pilots fly, and land, in zero visibility. Or how about medicine? One day, doctors will use 3-D as freely as scalpels during surgery.
If these scenarios play out, Nvidia will be there to profit. In the meantime, as a way to drive revenue, the company is banking on our increasing tendency to use our PCs to watch DVDs. Its nForce chipset, launched for AMD's Athlon architecture last October with less power than the chipset used in the Xbox, handles 3-D graphics as well as a GeForce2, turning your PC into a home theater by improving the video and sound experience. Because it's a fraction of the price of a high-end GeForce4, it holds the potential to increase sales outside of the gaming market. By April, Hewlett-Packard and Compaq had joined other manufacturers in introducing AMD/Nvidia machines. Such deals should help.
As graphics power continues to trickle down, and as the chipsets grow ever smaller, Huang plans to put them in anything with a screen. Maybe you can't see playing Halo on your cell phone, but what about on a Game Boy? How about a BlackBerry video show of your children?
For Nvidia, making all this happen means overcoming a slew of obstacles. For starters, the SEC is investigating the company's accounting practices. It's a serious matter - but not Enronesque. In late April, Nvidia finished an internal investigation and actually readjusted its profits upward. More troubling are the disappointing sales of the Xbox in Japan and Europe. If the trend continues, it would indicate that the gaming world is not all that impressed by Huang's vision of the future - and where gamers go, computing's version of Middle America is sure to follow. The Xbox will pick up when the quality and number of games increase. That will happen only when developers learn to use the power that Nvidia is giving them. "We need to make sure they're bringing our technology to its knees," says Huang.
And then there's Intel. The move to multimedia isn't lost on the CPU giant. It has a 40 percent share of integrated chipsets, and planned to launch its latest version, for the Pentium 4, in May. By developing nForce exclusively for AMD, Huang put Nvidia in a tricky position. Nvidia needs to work closely with Intel to ensure that its highly profitable GPU is compatible with the CPU. Now that Nvidia is competing with Intel, that partnership may be in jeopardy.
By choosing AMD, some say Huang is just going for the best deal - Intel charges a licensing fee; AMD does not. But Huang suggests that it's more about control. "The structure of the arrangement for companies building chipsets [for the Pentium 4] is so constrained that the opportunities are fleeting. They can only succeed where Intel is not," he says. "Going into that marketplace right now is a waste of our energy. We decided to go where we have the freedom to innovate. Once we build up that position and have an architecture that people recognize, then it's time to do a Pentium 4 chipset." In other words, he's trying to end-run Intel, as he attempted with Microsoft in 1995. This time, with a following among OEMs and gamers, a team of first-rate engineers, and a powerful brand, Nvidia is far stronger. Intel is still Intel, of course - one of the most effective and ruthless companies on the planet, with a stranglehold on the motherboard. And yet, recognizing the future of computing and being in a position to capitalize on it are two different things. Even with its dominant share in chipsets, Intel has the whiff of desperation about it. A few years ago, Intel tried its hand in graphics chips, only to make a quick exit. Now it's undercutting Nvidia on the low end with lesser technology (early reviews have not been kind) - the same approach many would-be Intel killers have tried in CPUs.
Nvidia, on the other hand, is in a position of strength. Huang's betting that the architecture of our computers is evolving along with our appetite for 3-D games, imagery, and video. If he's right, he's golden. If he's not, roadkill.
Either way, the prospect of creating a WinVidia platform is far too powerful to ignore. Especially for someone as self-assured as Huang. He may have learned a lot about patience and persistence in eastern Kentucky, but he seems to have skipped out on the Baptist lessons in temptation and pride.
- Jeffrey M. O'Brien
Senior editor Jeffrey M. O'Brien (firstname.lastname@example.org) wrote about file-sharing in Wired 10.05.