Tuesday, July 24, 2007
ATI HD2900 XT BLUNDER vs nVidia 8800 GTX and ULTRA
If you read my earlier blog post about the ATI HD2900 XT's probable superiority over nVidia's 8800 GTX, you'll see from this chart that I was very wrong. I'm disappointed because the 8800 GTX was last year's video card, and since ATI's card just came out I expected the HD2900 XT to be the new fastest gaming video card in the world. I'm also a bit relieved, though, because now I don't have to shell out $500.00 for one.
Specification-wise, the nVidia 8800 GTX has a slower core clock, only 576MHz compared to the HD2900 XT's 743MHz. The 8800 GTX does have more memory, 768MB versus the HD2900 XT's 512MB, but both cards use GDDR3, so I doubt memory is the reason ATI's card falls behind. Additionally, if you want an even faster card you can go for the nVidia 8800 ULTRA. Personally, I try to avoid Ultra models because I don't want to burn out my card, which is inevitable since my systems run very hot with so many drives and processors in them. If you do go for an Ultra, make sure you have a long-term warranty, somewhere around 2 to 3 years; I usually burn out my Ultras after the first year. Non-Ultras last longer, almost forever actually, until they're obsolete, sold, or given away. I also don't find the speed difference worth the much higher price and the risk of burning out your card.
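If you want to see why I doubt memory is the bottleneck, you can run the rough bandwidth numbers yourself. This is a quick sketch; the bus widths and effective memory clocks below are my assumptions taken from the cards' published spec sheets, not from the benchmarks in the chart.

```python
# Rough memory-bandwidth arithmetic for the two cards.
# Assumed published specs: bus width in bits, effective memory clock in MHz.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    # bytes per second = (bits / 8) * effective clock, converted to GB/s
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

gtx_8800 = bandwidth_gb_s(384, 1800)   # 8800 GTX: 384-bit bus, ~1800MHz effective
hd2900xt = bandwidth_gb_s(512, 1656)   # HD2900 XT: 512-bit bus, ~1656MHz effective

print(f"8800 GTX : {gtx_8800:.1f} GB/s")
print(f"HD2900 XT: {hd2900xt:.1f} GB/s")
```

If those spec-sheet figures are right, the HD2900 XT actually has more raw memory bandwidth, which only makes its benchmark losses stranger.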
What's even more surprising is that the ATI HD2900 XT is even slower than nVidia's 7800 GTX. I never expected that a DirectX 10 video card released by ATI in this year's third quarter would be beaten by a DirectX 9 video card released early last year, or maybe the year before that, and in a DirectX 10 game no less. Absurd and unbelievable!
Sunday, July 22, 2007
USING THE XBOX 360 CONTROLLER WITH VISTA

The first time I used a controller with my PC was Microsoft's SideWinder; soon after, I got a Thrustmaster. Unfortunately, in those days very few PC games used controllers. In fact, I never really used them at all; they just sat on my table gathering dust until I gave them away out of age and uselessness.
It's a different story today. I was actually surprised at how many games play better with the Xbox 360 controller. Need For Speed is my favorite, followed by Psychonauts and Jade Empire. In fact, I only use my mouse and keyboard for FPS (First Person Shooter) and RTS (Real Time Strategy) games these days. I don't think any controller can replace the mouse and keyboard for those genres yet.
Wednesday, July 11, 2007
ATI vs nVidia Video Cards - Through The Eyes Of A Gamer
Most of the time, if you want to do more with your computer, it will need a faster video card. That's especially true when converting videos to other formats, when transcoding to and from DVDs, or when creating and editing pictures at their finest detail, and a gaming PC needs all the video crunching power it can get, especially for running newer, graphically demanding games.
The difference a fast video card makes is very obvious when transcoding a dual-layer DVD, which holds approximately 8GB of data. Normally the transcoding process takes about 5 hours, but with a really fast video card it won't reach 20 minutes. If your system combines a fast dual-core Intel or AMD processor with a high-end video card, transcoding a DVD won't even reach 7 minutes.
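To put those times in perspective, here's the arithmetic on the speedups the quoted numbers imply. The times are the rough figures from my own transcoding runs above, so treat the ratios as ballpark, not benchmark, results.

```python
# Quick sanity check on the transcoding times quoted above
# for an ~8GB dual-layer DVD.
baseline_min = 5 * 60      # ~300 minutes without a fast video card
gpu_min = 20               # with a really fast video card
combo_min = 7              # fast dual-core CPU + high-end video card

gpu_speedup = baseline_min / gpu_min     # ~15x
combo_speedup = baseline_min / combo_min # ~43x

print(f"Fast video card: ~{gpu_speedup:.0f}x faster")
print(f"CPU + card combo: ~{combo_speedup:.0f}x faster")
```

So a fast card alone cuts the job to about a fifteenth of the time, and pairing it with a fast dual-core processor pushes that past forty-fold.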
When playing games, a fast video card can mean the difference between turning all the special effects and features off versus maxing them all out. For those skeptics who think they can live with lower quality graphics settings, I'll tell you now that it's as different as a five-year-old's drawings and Da Vinci's paintings. Take for example these two PC games, Lara Croft Legends and Far Cry. If I turn down all the special graphic effects and settings, tree trunks become smooth and flat-colored, as if covered with paper, and mountains are plain drab grey with big blotches of brown and triangular angles wherever the elevation changes. But crank the graphics up to maximum and tree bark becomes real, with texture and color; mountains turn rocky, with actual vines hanging and crawling along their walls; shadows fall according to the light source, and suddenly you realize why your character has a flashlight and what it's for. Birds fly around, smoke billows from distant fires, water sparkles and looks so real you can imagine swimming in it, fish swim everywhere, and even the sand looks like real sand. When you shoot the water with your sub-machine gun it splashes exactly where the bullets hit, and you can even see bullets whizzing past you if you're underwater while bad guys shoot at you from their gunboats on the surface.
From my experience using many models of video cards, I think ATI has always been a tad faster than nVidia. I know there are “loyalists” out there who stick to one brand, and we all know most games display the nVidia logo more often than ATI's, but I actually find ATI the better choice of the two. nVidia is more often than not the first to come out with a newer video card, but soon after its release ATI introduces one that's faster. There are times when ATI releases first; shortly after, nVidia usually comes out with a faster card, but within a few weeks ATI will have one that's faster than nVidia's again. It's a weird yoyo cause-and-effect between these two giants, and what's even more impressive is that it isn't really a price struggle but a power war. Obviously their high-end clientele look for performance first, then price.
My quest for the fastest video card started when Half-Life was released by Valve/Sierra 9 years ago, in November 1998. During that time a default 1MB video card was enough to run almost any game; in fact, monster video cards didn't have more than 8MB of RAM and cost more than most high-end cards today. I couldn't believe that Half-Life needed at least an nVidia TNT with 16MB to perform just right. I remember buying Voodoo 2500 and 3500 cards with 16MB each and being able to play Half-Life very smoothly, except that Voodoo cards could only produce 16-bit color, which isn't as pretty as 32-bit, so when nVidia released its TNT I finally got to play Half-Life for the second time in its full 32-bit graphics glory. I tried almost every nVidia video card model after that, starting with the TNT, TNT2, GeForce, GeForce 2, GeForce 4, GeForce FX-5200, and GeForce 5700, but before buying the GeForce 5950 I decided to wait for the rumored GeForce 6800.
Funny thing is, I was satisfied with my nVidia 5700, and I wouldn't even have considered upgrading if not for the games Far Cry, Doom 3, and Half-Life 2, all of which need very high-end graphics cards. Before the GeForce 6800 was released, ATI came out with their X800, and it was so powerful I almost bought it, but I decided to wait for the GeForce 6800's release and reviews. When the GeForce 6800 finally came out it blew away the ATI X800's performance, but then there were rumors of ATI's tweaked X800XT, which would be faster than both the X800 and the GeForce 6800, so I waited again. When the ATI X800XT finally came out it was well worth the wait; I read the reviews, and after learning that it was indeed faster than the GeForce 6800, I bought it! That video card surpassed all expectations. The ATI X800XT served me well and still does; in fact, I have yet to see a game that won't run well on it.
After 2 years of using my ATI X800XT, I upgraded my system with an nForce chipset board because I wanted to try out the performance of an SLI (dual-video-card) setup. Unfortunately it didn't meet my expectations. It was a bit faster, but I didn't see a big difference compared to my ATI X800XT; in fact, the dual nVidia 7900 SLI didn't feel much faster than a single nVidia 7900 system. I then changed my video card to the nVidia 7950GX2, which was like an SLI of two 7900s using only one PCI-Express slot. Again, it was even slower than a single nVidia 7900 video card. I was about to buy the ATI X1900 but decided to wait for the nVidia 8800 GTX because of Vista and DirectX 10. ATI's video cards last year did not have DirectX 10 yet, not until after mid-2007.
I am very satisfied with my nVidia 8800GTX video card; it surpassed all my expectations and then some. But after reading the specs of the upcoming ATI 2900 DirectX 10 video card, and learning that it should be about 50% faster than the nVidia 8800GTX, I think I'll be switching back to ATI soon.
Bottom line: ATI rarely crashes, is more stable, and makes a very powerful video card. My problem with nVidia is that it's not very compatible with other chipsets. If I have an nVidia video card on a VIA chipset board, I find there are games that crash unless I upgrade the VIA chipset drivers or BIOS. In my experience, nVidia stops crashing only after all the upgrades and devices are installed, but ATI works fine even before you've installed any upgrades or devices.
Thursday, July 5, 2007
AMD vs Intel - Performance and Compatibility
After the heating issue was resolved I tried AMD, partly because of their cheaper prices compared to Intel, and because, as a gamer, speed is very important to me. I started with the AMD Athlon 1800, then slowly upgraded until the 3200, which is still the fastest 32-bit Athlon even today.
When the Athlon 64s came out, I told myself there was no way I'd downgrade to a slower 32-bit Pentium 4, and believe it or not, AMD's chips were still very cheap compared to the Intel P4, so I got the Athlon 64 3000, then later the 3500. Now I'm using an Athlon X2 5000. I never saw the need to spend over 300% more for an FX processor that delivers less than 30% additional speed compared to the X2.
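To make that value argument concrete, here's the ratio worked out in a few lines. The prices are made-up placeholders chosen only to match the "300% more" figure above, not actual 2007 street prices.

```python
# Price/performance reasoning: "300% more" means the FX costs
# over 4x the X2's price, for under 30% more speed.
x2_price, fx_price = 100.0, 400.0   # hypothetical prices, 300% markup
x2_speed, fx_speed = 1.00, 1.30     # FX at most 30% faster

x2_value = x2_speed / x2_price      # speed per dollar
fx_value = fx_speed / fx_price

print(f"X2 delivers about {x2_value / fx_value:.1f}x the speed per dollar")
```

However you slice it, the X2 ends up delivering roughly three times the performance per dollar, which is why the FX never made sense to me.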
Intel finally lowered the prices of their Extreme Edition dual-core processors, and they are much faster than AMD's FX, which is quite tempting; I'm seriously considering going back to Intel for my next upgrade.
The only problem with Intel chips is that they have big compatibility issues! Intel chips can't run on just any chipset board. Of course they don't tell you that, but if your board uses a chipset other than Intel's, namely VIA or nForce, you'll experience crashes, blue screens, and incompatibilities with ATI and nVidia video cards. AMD, on the other hand, runs smoothly on VIA, nForce, and even Intel chipsets.
Intel's compatibility problem might not matter to the average user whose only concerns are everyday applications, e-mail, and browsing the internet, but for a serious power user like me, or gamers in general, it means everything.
For example, I now have an nVidia GeForce 8800 GTX video card, and before that I had an nVidia GeForce 7900 SLI setup (meaning I had 2 video cards installed and running in tandem in my PC). The SLI setup will only work with nForce chipsets, and the nVidia GeForce 8800 GTX will supposedly perform better with nForce chipsets. I have another PC that uses an ATI X800 XT PE, and I'm planning to upgrade it with the upcoming ATI Radeon HD 2900 XT video card on a VIA chipset motherboard. On either of these setups I won't be able to use an Intel processor.
You might wonder why I don't just get an Intel chipset board. First, because the choices are too few, and when you set up an extreme gaming rig you want each and every specification to be exactly how you want it and nothing less. Second, the board brands I prefer only use either VIA or nForce chipsets, and I'm not prepared to shift to a brand I've never used before, even if it has the exact specifications I want.
Is ATI vs nVidia like or unlike AMD vs Intel?
With PCs taking over more of our daily needs, the demand for speed keeps increasing with each new application created every day. Thanks to companies like ATI and nVidia, video processors are always on par with the ever-advancing PC processors manufactured by both AMD and Intel.
High-end video cards can arguably be more powerful than some currently available AMD and Intel processors. Because of this, many applications are programmed to detect and then use the faster chip, whether that's the video card or the processor. With the advent of dual-core and multi-core processors, some programs direct the 2nd core to help either the 1st core or the video card, whichever needs it. Despite dual cores being the new technology today, dual video cards were in use long before the first dual-core processors even came out.
Not surprisingly, these four brands have their loyal fans! As with other brands, some people stick with their favorites to the bitter end, while others base their choices on reviews or popularity; some choose only the fastest chips, many like me are influenced by cost, and still others don't mind trying any brand.