Now that the dust has settled on AMD's takeover of ATi, DVdoctor Inc's John Ferrick - a former VP at Novell - considers what brought the two companies together and where the new AMD is headed.
So AMD and ATi finally did the deal. But what does it all mean?
Let's look first at why it happened and, in doing so, explode one all too common myth. It's widely held that all these ultra-clever hi-tech companies have clear strategies that stretch years ahead - and that they're primarily technology-driven. There may be an element of truth there but a lot more is going on.
It was interesting to read all the talk about why AMD did it, but what's largely been overlooked is what ATi saw in it. My perspective is that ATi made some very clear overtures to Intel, not just to AMD. Why else do you think that Intel was talking at the recent Computex trade show about there being a rumour of a deal?
ATi sees that the future lies in solutions that are ever more integrated, with much more integration on dies and chip-sets. It also needs low-cost fab facilities. In addition, ATi was aware that Intel had suffered three consecutive quarters of dismal results and that a major bloodbath was likely - to massively reduce costs.
Leading up to Q4 of 2005, Intel was celebrating and predicting a continuing sequence of record quarters. Then Q4 hit. Sales dropped and margins dropped but Intel showed no outward signs of panic.
However, Q1 2006 saw sales and margins down again and this was followed by horrible results for Q2, with sales down by around $4 billion from what Intel had been expecting a year earlier.
I'd estimate that from Q4 2005 to Q4 2006 sales could have been down by as much as $10 billion in total. Intel needed to take massive steps to cut its overheads - and for more reasons than you might think.
The company has, for instance, chosen to become extremely aggressive with pricing. This is to help it claw back market share lost to AMD.
But lower pricing means lower income per CPU and that's a problem, the more so when you realise that Intel has lots of supply contracts that include price protection. The prices in those contracts will have to be revised downwards and Intel's revenue will fall even further.
At the same time, Intel must know that the market overall isn't going to grow very much, so total sales aren't going to increase very much and nor are its own sales - even if it does pull back some share from AMD.
But what about Intel's sparkling new Core Duo, Core2 Duo and Core2 Extreme family of CPUs; haven't they received widespread critical acclaim from reviewers?
Indeed they have. But the thing to realise is that if, as a manufacturer, you believe that you've got great new products and you're pitching them at very competitive price points, what you don't usually do is set in train massive cost-savings. Not if you believe that these well-priced great new products are going to dramatically grow your sales.
So, Intel must believe that the Core and Core2 family – no matter how good it is - isn't going to power the overall market into putting on a huge sustained sales spurt and bring it a whole lot of new business.
Otherwise, it would stage a holding operation - shuffling things around a bit and also increasing its marketing to make sure that the expected sales boost did take place.
But what's actually happening is that it's laying off thousands of people, including managers - something I can't ever remember it doing before.
In my view, Intel's actions show that it thinks that there's a long-term problem. Indeed, the company itself has drawn parallels with the situation it found itself in back in 1985 when the then top man, Andy Grove, decided Intel had to completely exit the DRAM business.
Andy Grove himself is well known for saying that, "only the paranoid survive" – and ATi understands only too well that Intel is probably the most self-centred, winner-takes-all company in the business. So ATi will have known that marriage to Intel wasn't a very pleasant prospect compared to a life shared with AMD.
And some say that Dave Orton, the top man at ATi, has had his eyes on becoming head honcho of AMD not too long after the takeover.
It's also significant that ATi sees itself as a company that designs chips and only very reluctantly one that makes graphics cards.
An interesting event set the stage for all this during the Xbox 360 component-selection negotiations.
Microsoft remained furious at NVIDIA for its refusal to move on the chip-set prices for the first Xbox, so was determined not to get held hostage again and wanted a fab-licensing agreement.
NVIDIA refused and ATi got the business – presumably because it was willing to structure a fab-licensing deal.
It is probably not so clear but, in my estimation, AMD does see itself as a marketing/fab company. And it certainly is a company that's much more pragmatic than Intel and much more willing to work deals.
Its pragmatism is well illustrated by the way that it allowed NVIDIA to have the lion's share of the motherboard-chip business. Having known some of the early AMD team back in the Sanders days, it's clear to me that marketing and production were more important to AMD than simply design.
This emphasis on marketing and production has served AMD well. Had it been concerned mainly with engineering design, then it would probably never have gotten into the Intel chip-set clone business or even set out to make better CPUs than Intel.
Contrast this with IBM or any of the companies who falsely believed they had not just different architectures to Intel's but better ones, too - or, at least, were wrong in thinking that what they had was enough to ensure success.
So with AMD and ATi, we had a willing couple seriously considering marriage.
But what got them to the altar?
From ATi's side, there was Intel's massive reorganisation and concerns about fab-access and costs. There was also the possibility that Intel wouldn't renew its bus deal with ATi – even though such fears turned out to be misplaced.
From AMD's perspective – and the company is probably even better at analysing Intel's products than Intel itself – it perceived that its long-standing price-performance leadership over its rival was in jeopardy as a result of the new-generation Intel Core and Core2 family of CPUs.
Also significant is AMD's knowledge that Intel is likely to be willing to tolerate as big a profits hit as is needed for its CPUs to become more aggressive on pricing.
There's an old adage in Silicon Valley that if you are going to have a loss, you might as well make it massive, and get it over with, because it really will have little effect on share price and perception.
Intel might not actually end up posting any loss-making quarters but what it is doing in massively cutting costs is attempting to offset the major reduction in sales and margins on CPUs and chip-sets.
What's more important for Intel is that it meets expectations. So what it might do is simply take everything it can find that can be written off, dip into every possible reserve, book one huge, mind-boggling reduction in profitability and then get on with life.
Alongside that, it might institute even larger cuts than it's already announced, so that the outside world doesn't see a lot of red ink.
It could be argued that Intel doesn't really need to price its new CPUs as aggressively as it has done.
However, chip-pricing is totally marketing-created.
When you first start a line on a design, yields are poor, there's massive waste and so you guess at what the price will be when it comes to market.
But there's really no direct relationship between the cost of manufacture and the price at which you sell.
You write off your development, you write off the cost of fabrication plant and then price your product solely based on market conditions – and that price can be very different from the one you envisaged/guesstimated at the outset.
I remember - in my distant past - arguing the case for massively cutting the price of an Ethernet chip-set that was priced to market at 49 dollars. I reasoned that if we took that down to just four dollars we could increase the market by many orders of magnitude.
It took about a half hour to get the decision made, and the industry saw the resultant massive reduction in the price of networking kit as a result - and the effect that had is pretty much history.
Intel wanted to regain market share and, with the divisional red ink already flowing, felt it was in a position to tolerate what it saw as the necessary very aggressive pricing.
On the technology side, I believe that AMD sees that we are into a new section of the Moore’s Law curve.
Instead of getting more performance by adding more transistors, reducing die size and increasing clock speed - as was traditionally done - AMD and the rest of the industry see that there are likely to be only modest improvements in the basic core, but a massive increase in parallelism.
Dual core is just the beginning. With feature sizes down to 65 nanometres and better yields, we are looking at it being practical to produce four-core, eight-core, 16-core chips and beyond.
In addition, as most gamers or users of graphics-intensive apps know, the graphics processor is the big decider in many real-life situations, just as it is in the benchmarks game.
But, when I hear all the talk about SLI and CrossFire, I pull up short. To my mind, this way of getting two separate graphics cards working together can only be a short-term interim solution – and one that's clearly not anything like mass-market.
And, actually, it's not even a very new idea. Back in 1998, we had 3dfx Voodoo2 SLI solutions. NVIDIA, of course, bought the company as it was on the verge of demise in 2000 - over five years ago.
It would be better, in my view, to put multiple GPU cores on the same board. That's something that's already starting to happen and is only likely to accelerate as dies are made ever smaller and more cool-running.
The issue to some degree had been to see if the applications we run would move to this multiprocessor/multi-GPU model. And they have.
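To picture what that shift means in software terms, here's a minimal, purely illustrative sketch of the worker-pool model those applications have adopted: a set of workers dividing per-frame work - the bread and butter of video software - among themselves. Everything here is hypothetical: `process_frame` stands in for real work such as encoding or filtering, and Python threads stand in for the OS processes or native threads a real encoder would run on separate cores.

```python
# Illustrative only: the worker-pool model used by multi-core-aware
# applications. "process_frame" is a hypothetical stand-in for real
# per-frame work such as encoding or filtering a video frame.
from concurrent.futures import ThreadPoolExecutor

def process_frame(frame_number):
    # Token computation so the sketch is self-contained and testable.
    return frame_number * frame_number

# Divide eight frames among four workers; a real video application
# would use OS processes or native threads on separate CPU cores.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_frame, range(8)))

print(results)  # frames come back in submission order
```

The point of the model is that the per-frame function never needs to know how many cores exist - add cores, widen the pool, and throughput scales without rewriting the application logic.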
But is even the multi-GPU card a genuine long-term solution? I doubt it.
Every time a signal has to leave the chip-set, the die or the motherboard slot, you take a performance hit.
So I think it's more likely – and this is almost like going back to the days of dedicated sockets for floating-point units (FPUs) – that we're going to see the rise of motherboard-socketed GPUs.
A motherboard could have two, four or more GPU sockets and each plug-in graphics module could have two or more processor units. You'd be running multiple modules, each with multiple GPU cores. And how much sweeter is that than SLI or CrossFire?
This, of course, is something that is already happening with CPUs and, there, too, that trend could become more common.
You'd pick and choose the number of CPU and GPU cores based on your needs as a high-end user.
Something else at work is the mindset of engineers. Like mathematicians, they tend to favour elegant solutions – and SLI and CrossFire are in no way elegant.
Longer term, and another reason why I think AMD did the deal, is that we might see an MMX-type approach where the CPU chip-set starts to include more and more of what we think of as GPU functions.
At the other end of the scale, AMD certainly sees the various server markets as important - and low-cost integrated graphics solutions are highly significant there already.
In addition, there is clearly an emerging market for ultra low-cost PCs priced to be affordable to the huge numbers of people in China and India and other countries with large populations and low incomes.
What these people need if they are to come aboard the PC world isn't just low-cost solutions but solutions that also have decent performance - and a high level of integration makes that possible.
Another area that shouldn't be overlooked is ATi’s success in the games market.
NVIDIA might be ahead in the graphics-card side but there's no doubt that ATi has a considerable edge in games consoles.
Microsoft's Xbox 360 and Nintendo's Wii are ATi-based. NVIDIA's only recent design-win is for Sony's PS3 and I'll talk about the significance (or otherwise) of that a bit later on.
AMD, understandably, was disappointed that it couldn't win a major CPU-deal with Microsoft, having lost out not just with the first Xbox but also with the Xbox 360. Now, though, with the ATi acquisition, AMD has instantly become a key partner with Microsoft.
Interestingly, AMD has a technology-sharing deal with IBM - the supplier of Xbox 360 CPUs - so there is always the possibility of expanding that relationship around the Xbox 360.
Apple's adoption of Intel CPUs was another disappointment for AMD but, via ATi, it now has a strong relationship with Apple, too.
If Apple chief Steve Jobs truly wants to keep Intel on its toes, one way of doing it is with the threat of increasing the AMD content of Macs beyond that of just ATi GPUs.
And Intel does need to be kept on its toes because, although the company was really delighted to get the design-win with Apple, it sees the Apple desktop/notebook sector as a relatively small part of its overall business.
Another big factor for ATi when considering marriage was whether it could, if it remained single, deliver on the numerous design wins it's achieved in mobile phones and related areas.
These markets are design-sensitive, of course, but are equally sensitive to availability issues. The last thing that ATi wants is to be told: "We love your design but are worried about your ability to produce in volume."
AMD continues to spend billions on fab capability and ATi needs access to such production facilities. AMD in turn would welcome the high-volume business that the huge and ever-growing mobile sector represents.
There is another potentially massive CPU and graphics-intensive market. It is for next-generation high-definition DVD players.
If you look inside the Toshiba HD DVD system, you see a PC. If you look at the move to highly complex codecs for HD, you see a need for much higher-performance graphics/CPU chips - and ATi is already massively strong on the hi-def graphics-processing side, since it dominates the world-wide supply of chipsets for HD TV sets.
Usually I am very suspicious of mergers. Like real marriages, far too many of them don't work and don't meet the expectations of either partner.
As you've probably realised by now, though, I do think that the coming together of AMD and ATi is logical and also likely to turn out to be good thing for both companies if it's properly managed.
But what happens to ATi's big rival, NVIDIA?
Unlike some, I don’t see the AMD/ATi marriage as being fatal to NVIDIA or even a major negative. AMD is pragmatic and will, I believe, want to continue its relationship with NVIDIA for motherboard chipsets.
NVIDIA itself will be very wary of any deal with Intel for highly-integrated solutions. As we've seen from NVIDIA's dealings with Microsoft, it's unlikely to be very willing to do technology-design deals and would, instead, insist on deals in which it's selling products rather than the rights to use its technology. AMD/ATi will continue to supply cards, and so will NVIDIA.
It's not well understood but a lot of hi-tech companies have complex relationships where they work together on some things and compete on others.
Some firms are better at this balancing act than others. Intel's corporate paranoia, I think, has made it a less-than-ideal player but AMD and ATi seem to have no problem doing this and, nor I believe, will NVIDIA.
NVIDIA, as I said earlier, will supply the GPU chipset for the forthcoming Sony PS3. Some people think that the PS3 will be a massive success but I doubt that. I think that Sony's entire strategy of building an alternate gaming platform and tying it to Blu-ray Disc is very high-risk - but that's an opinion piece for another day.
So where does all this leave Intel? Probably with a bit more worry to add to its paranoia and probably willing to be even more aggressive on its pricing. The company, though, will focus sharply on execution, creating pressure that AMD needs to deal with.
Most important of all, I think that you and I - the end users - are going to come out of this as winners.
One of the concerns with the hugely-impressive benchmarks that Intel's been achieving with its new-gen Core and Core2 Duo CPUs is how AMD would be able to compete - short term and long term.
None of us would benefit in the long run if Intel's new CPUs were so good that they seriously wounded AMD. A weak competitor to Intel wouldn't make for a healthy market.
But with ATi on board, AMD now looks to be much better positioned than it was even a couple of weeks earlier.
And ATi now has assured access to fab facilities and can be a key player in the more integrated high-end multiple-core market, while NVIDIA will probably just keep moving right along.
But some questions do remain.
Does the future hold a much more expanded role for the AMD/ATi combo and Microsoft? With the Xbox group winning control of the Windows media area with Zune, we could be seeing more design-wins for the new double-A combo.
And does the future include a greatly expanded role for the AMD/ATi combo at Apple? Will AMD be able to get Apple to use its CPUs in Mac computers? More significantly, can AMD/ATi gain a major foothold in the hugely important iPod and derivatives business?
What is certain is that it's far easier to expand an existing relationship than to create a new one and the combined AMD/ATi group has its share of design-wins in some pretty key high-volume areas.
Clearly, interesting times lie ahead for AMD - but how do you think things will pan out? Let us hear your take in the DVdoctor news forum.