Larrabee pullout: GPU battle 'far from over'

Intel's decision to shelve what would have been its first discrete GPU (graphics processing unit) offers more breathing space for graphics market leaders Nvidia and AMD, but the battle is far from over, an analyst has pointed out.
On Monday, the chip giant announced it would not release a standalone graphics chip as its first Larrabee product, contrary to its earlier plans.
The chipmaker had performed a public demonstration of the Larrabee platform at the Intel Developer Forum in September, and followed up with another presentation at the SC09 conference last month, leading many to believe the company would deliver on its "2009 or 2010" timeframe for Larrabee.
According to In-Stat's chief technology strategist Jim McGregor, however, the latest development was not totally unexpected. "History in the electronics industry indicates that few new technologies meet their initial schedules and adoption of new technologies and methodologies takes two to three times longer than anticipated," he said in a research note this week.
McGregor noted that the decision to drop the standalone graphics chip was nevertheless "a major blow to Intel", and would allow AMD and Nvidia "more breathing room in the higher margin discrete GPU space". But in terms of integrated graphics, Intel still has an edge.
"[Intel's] plan was to enter the discrete GPU market on the high-end and [subsequently scale] the technology down to the integrated graphics solutions," he said. "Now, [it] will have to rely on the older graphics architecture for integrated solutions while it regroups and reinforces the Larrabee development efforts.
"This does put Intel at a disadvantage in terms of graphics technology, but with at least a twelve-month lead on rival AMD [in terms of] introducing processors with integrated graphics, the upcoming Westmere processor generation should still provide Intel with a price-, power- and performance-competitive offering for the value and mainstream PC segments."
Tough task at hand
From the start, Intel's aim with Larrabee was a graphics architecture "that was programmable just like a standard x86 processor", which required both a new hardware architecture and a new programming model. Both were significant challenges, McGregor noted.
Tom Halfhill, senior analyst for In-Stat's Microprocessor Report, concurred. "Larrabee was a potential threat to [AMD and Nvidia's] GPU businesses, but now it should be apparent that designing a state-of-the-art graphics processor is very hard, even for the world's biggest semiconductor company. Anyone who thought Intel would easily stomp AMD and Nvidia needs to rethink their position," he said.
To achieve its goals for graphics performance, Intel may have to compromise on x86 compatibility, he pointed out: "Intel is trying very hard to jam a square peg into a round hole. It may be possible, but obviously, it isn't easy."
Robert Sherbin, vice president of corporate communications at Nvidia, added in an e-mail: "The fact that a company with Intel's technical prowess and financial resources has struggled so hard to succeed with parallel computing shows just how exceptionally difficult a challenge this is."
Graphics efforts to continue
In-Stat's Halfhill noted, however, that Intel is not likely to give up on its GPU ambitions anytime soon.
"AMD is working on x86-compatible PC processors with integrated ATI graphics, so Intel will need a competing product," he said. "In addition, the growing market for general-purpose high-performance computing on GPUs (GPGPU) demands a response from Intel. And finally, even if Intel's graphics technology isn't competitive with the discrete GPUs from AMD and Nvidia, Intel could still adapt it for PC chipsets with integrated graphics."
McGregor pointed out that Intel could not afford to consider ending graphics development, given its growing focus on consumer electronics, as graphics "is a critical technology to all the major consumer and computing platforms". According to him, the battle in the graphics space is "far from over", given the rate of innovation in the industry.
On the use of GPUs for general computing tasks, Nvidia's Sherbin elaborated: "GPU computing has surely reached the tipping point. CUDA (Compute Unified Device Architecture) has been adopted in a wide range of applications."
"In consumer applications, nearly every major consumer video application has been, or will soon be, accelerated by CUDA," he said, citing computational biology and chemistry and fluid dynamics simulation as examples on the high-performance computing side.
With the launch of Microsoft Windows 7 and Apple Snow Leopard, GPU computing also "went mainstream", said Sherbin. "In these new operating systems, the GPU was not only [a] graphics processor, but also a general-purpose parallel processor accessible to any application."
Responding to ZDNet Asia's queries, a Hong Kong-based Intel spokesperson reiterated that Larrabee silicon and software development are behind where the chipmaker wanted to be at this point. "Additional plans for discrete graphics products will be discussed some time in 2010," he said.