I cannot imagine living in a world in which the microprocessor had yet to be invented. In the late sixties, two companies were born one after the other, in the span of one year – first Intel, then AMD. How Intel and Texas Instruments (a semiconductor company) almost simultaneously invented the first microprocessor, I do not know. But I do know that in the mid-seventies, AMD reverse-engineered the Intel chip and cloned its own. And the battle between the two has raged on ever since.
Do I really care who wins? Actually, I don't. But a monopoly is bad for capitalism. Back in the 1990s, AMD was seen as the alternative to the mainstream chip maker Intel. Maybe the price was right; maybe the computer enthusiasts back then preferred a chip they could tweak to squeeze out the extra ounce of performance that was locked away from normal human beings like me and perhaps you. Either way, AMD has continued to play a part in pushing the technology frontier by competing against Intel. Checks and balances, so to speak.
I did make the switch to AMD in 2006. Why? That year, at least to me, was the first time I perceived AMD to be ahead of Intel in terms of technology. Its native dual core was said to be superior to the Intel version (back then, Intel's dual core was essentially two single-core dies glued together in one package; when Intel did catch up, its true dual core was branded as Core 2 Duo). Advertising was running hot. At one computer exhibition, there was a racing-game machine powered by the then-latest AMD chip. Seriously, out of nowhere, I needed to have an AMD Athlon 64 X2 chip – a dual core that supports the 64-bit architecture.
I could have been an AMD fan. I could have put "AMD4Live!" in my geek forum signature had they got the quad core chips right. Yes, Héctor! If you are reading this, you ought to reconsider your contribution, or rather the lack of it.
In 2007, AMD's share price dropped 60%. In the same year, its CEO Héctor Ruiz received a raise to a base salary of slightly more than $1 million a year (due to the 2006 performance, or so the press release said). Recruited by AMD's co-founder Jerry Sanders, Héctor Ruiz took over the CEO role from the flamboyant Sanders back in 2002, a year when AMD was financially strapped. AMD did turn around. But who was the hero?
Some point to Héctor Ruiz. But we all know that technology takes time to materialize from conception to production. Some praise the foundation laid by the co-founder Jerry Sanders. Some criticize the corporate direction set by Héctor. Where would AMD be today had it focused on advancing the dual core technology in which it was ahead, instead of channeling the effort into the research and development of a quad core? Look around you today – how many quad core computers do you see?
So, here is the double whammy for AMD. Not only does its ill-fated quad core, code-named Phenom, fail miserably against Intel's Core 2 Quad, but AMD has also lost its edge in the dual core market. Phenom is a disaster on several fronts. First, the chip's feature size is still stuck at 65 nm while Intel has got it down to 45 nm; I gather that the smaller the feature size, the less energy the chip should consume. Second, the clock speed of the AMD quad core trails by a mile. Third, the headroom for overclocking – i.e. squeezing extra performance out of the chip – is lower than on the Intel chips. And fourth, it ships with a bug that cripples what Phenom could otherwise do.
What more can I say? Héctor, you hear me? I could love Phenom.
I am a pragmatic dude. So I switched back to Intel. And this time, I gunned for a quad core. Many friends of mine gave me a why-would-you-need-a-quad-core kind of look whenever I mentioned that I would love to have one. Only my buddy Benny – who has been at my place and seen how I work my computer – understands.
That particular day – like any other day, really – I had my computer switched on with two screens displaying two different desktops at the same time. I use my left display for MSN Messenger, Internet Explorer, general computer health meters, and so on, and my right display for gaming and more. At times I do audio processing (for my band and my own work) while working on image processing (mainly for my blog site). Once in a blue moon, I do video processing too.
In short, I need a quad core. I massively multi-task.
But it was not an easy decision. Extreme versions – for which I would need to throw in an extra S$1,000 – aside, I could have got a higher clock speed with a dual core. But considering that a multi-core is supposed to split up the job and hence achieve a faster result, I can take a drop in clock speed. My old AMD chip has a clock speed of 2.22 GHz, which is still very decent, and it replaced my even older 3 GHz Intel single core chip.
So, is my new quad core really that fast? Not all applications support a multi-core architecture. Adobe Flash, for instance, is mostly a single-threaded application, and it hangs from time to time even on my spanking new high-tech machine. Some applications can only utilize two cores. Well, at least all the background programs are now taken care of by the other two cores.
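This trade-off is basically Amdahl's law: the speedup you get from extra cores is capped by the fraction of the work that stays single-threaded. A minimal sketch – the parallel fractions below are made-up illustrations, not measurements of any real application:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when `parallel_fraction`
    of the workload splits evenly across `cores`, and the rest
    stays on a single core."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A mostly single-threaded app (think Flash) barely gains from 4 cores:
print(round(amdahl_speedup(0.10, 4), 2))  # ~1.08x

# A well-threaded workload (say, a video encoder) gains far more:
print(round(amdahl_speedup(0.90, 4), 2))  # ~3.08x
```

Which is why a quad core shines for multi-tasking even when individual apps are poorly threaded: the operating system can park whole programs on different cores regardless.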
Am I happy with my decently clocked Intel quad core? Definitely! All the more so because I decided to get the brand new 45 nm technology instead of the first generation (65 nm) quad core. Though I must say the Q6xxx series is still the best bang for the buck, I went for the Intel Core 2 Quad Q9450 @ 2.66 GHz instead. A well-known benchmarking tool does give my new machine a 50% higher score thanks to the new processor, if that means anything.
* * *
OK. You see the three horizontal black lines on the left of my blog site? They are my yardsticks to remind me how long my post has become. Since I really wish not to exceed the second bar, pardon me as I switch into geek mode and talk a bit faster from here.
I have no intention of replacing my long-awaited nVidia 8800 GT graphics card that I proudly acquired in January this year. It can still handle 99% of the games out there (except Crysis, which hardly any graphics card these days can handle in full glory – perhaps the 9-series can). Besides, it is DirectX 10 enabled, and since I am now using Windows Vista, I am so going to see the smoke and fire in all their realism. And by the way, I am not into ATI.
Finding a motherboard took me a long time of research. And since I had decided on the chip – an Intel Core 2 Quad – you would have thought choosing a motherboard is just a matter of finding one that fits the chip, eh?
Not at all. Before you even get to that, you have to decide on the chipset. A chipset is the set of supporting chips that defines how the microprocessor connects to all the other hardware components. Typically, it is divided into a northbridge and a southbridge. The northbridge is simply a chip that connects the microprocessor to the memory and the video card (via the high-speed graphics bus), as well as to the southbridge via an internal bus. The southbridge connects to the rest of the components of your computer.
To decide between an Intel chipset and an nVidia chipset, I had to ask myself one question: how many graphics cards do I want in one machine?
Trust me, I have been there, done that with SLI (my old 6800GS × 2). My verdict: going down that path only makes sense if you want to achieve a performance that no single card on earth can deliver. In other words, rather than spending your money on more than one average card, invest in a darn good one. Not all games come with code optimized for SLI. And an SLI setup consumes a lot more electricity (one power cable for each additional card, and another to power the SLI portion of your motherboard). It generates more heat, more noise.
But if a 3-way graphics card setup is your cup of tea, the new nVidia 700 series chipset will certainly delight you. However, I read in the forums that this new chipset has problems with video playback, amongst other things. My advice? Go Intel.
Right now, you have 43 chipsets to choose from if you go the Intel way. Confused? I narrowed it down to either a P35 for a mainstream desktop or an X38/X48 for a performance desktop. Since I am not going for the extreme version of the quad core, why the X-series? The only thing that held me back is that the P35 chipset does not support PCI-e 2.0, and therefore I may not benefit fully from a future video card upgrade, as newer generations such as the nVidia 9-series do support PCI-e 2.0.
Does that matter? It does and it doesn't. It does because the technology is out there and I can't benefit from it. It doesn't because the difference between PCI-e 1.0 and 2.0 is the bus bandwidth, and hence speed. Have today's games saturated a 1.0 link? I read that there is still some way to go, and it will take time for games to fully utilize 2.0 anyway. Besides, it is bloody expensive to get an X38 or X48 motherboard.
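To put numbers on that bandwidth difference: PCI-e 2.0 doubles the per-lane signalling rate (2.5 GT/s to 5 GT/s), and with the 8b/10b encoding both generations use (8 data bits carried per 10 line bits), each lane delivers roughly 250 MB/s versus 500 MB/s. A quick back-of-envelope check for a ×16 graphics slot:

```python
def lane_bandwidth_mb(gigatransfers_per_s):
    """Usable MB/s per PCI-e lane: raw GT/s, minus the 8b/10b
    encoding overhead (8 data bits per 10 line bits), in bytes."""
    return gigatransfers_per_s * 1e9 * (8 / 10) / 8 / 1e6

pcie1_x16 = lane_bandwidth_mb(2.5) * 16  # PCI-e 1.0 x16: ~4,000 MB/s
pcie2_x16 = lane_bandwidth_mb(5.0) * 16  # PCI-e 2.0 x16: ~8,000 MB/s
print(pcie1_x16, pcie2_x16)
```

So a 1.0 slot already offers around 4 GB/s each way to the card – plenty of headroom for the games of the day, which is part of why skipping the X38/X48 felt safe.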
I have also decided to 'recycle' my beautiful 22″ Samsung wide-screen LCD monitor (2 ms!) that I bought last September. In total, the damage was S$1,700 in cash (the vendor charges more for other modes of payment). If you follow my specification and add in the graphics card and the LCD monitor that I already have, you can have your very own F-cup bitch for about S$2,500, which is very reasonable, I think. Try to get a similar specification if you are a fan of Apple – you will pay at least double, if not more.
OK, I have overshot my word limit by half a globe. I had better stop here. The technical specification of my new and very quiet machine:
Intel Core 2 Quad Q9450 @ 2.66 GHz
Gigabyte GA-P35-DS3
Corsair XMS2 DHX 4 GB (2 × 2 GB)
Asus EN8800GT 512 MB
Seagate 500 GB 7,200 RPM SATA II 32 MB × 2
LG 20× DVD SATA writer
Cooler Master 690 casing (with 4 × 12 cm fans and a side window!)
Thermaltake Toughpower 750 W PSU with modular cabling
Samsung SyncMaster 2232GW (2 ms, 3000:1 dynamic contrast)
Windows Vista Home Premium