Friday, July 16, 2010


When it comes to computer monitors, we all want the best display possible. I started with a 19-inch Samsung T190 LCD with a native resolution of 1440 x 900 (WXGA+), and I connected it to my computer with a DVI cable instead of the more common VGA cable. DVI gives a better picture quality since the signal stays digital, compared to the analog signal of a VGA connection. I was satisfied with the picture quality of the Samsung T190.

When I upgraded to a Samsung PX2370 LED monitor, a Full HD display capable of 1920 x 1080p, I had the option of connecting it to my computer via either DVI (Digital Visual Interface) or HDMI (High-Definition Multimedia Interface). This LED monitor also has a brighter display than an ordinary LCD monitor.

Since my video card (an Nvidia GTX 285) has both DVI and HDMI outputs, I tried to figure out which connection is better, DVI or HDMI. At first I connected the monitor via DVI, but the Nvidia Control Panel showed the display running at 1920 x 1080i and not 1920 x 1080p, which is the native resolution of my Samsung PX2370. I searched the internet for an answer, and every thread I read on the subject said that both DVI and HDMI carry an uncompressed digital signal, so both offer the same picture quality and it makes no difference which one you use to connect your computer.

However, when I connected the Samsung PX2370 to my computer via HDMI, the monitor displayed a better picture, and the Nvidia Control Panel confirmed that it was now running at 1080p. The information button on the Samsung PX2370 also shows that it is displaying a 1080p resolution.
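For what it's worth, the forum claim that DVI has enough bandwidth for 1080p checks out on paper. Here is a quick back-of-the-envelope calculation in Python (the timing numbers are the standard CEA-861 values for 1080p at 60 Hz, not anything specific to my setup) showing that the required pixel clock fits comfortably within single-link DVI's 165 MHz limit:

```python
# Back-of-the-envelope check: does 1080p60 fit within single-link DVI?
# The standard CEA-861 timing for 1920x1080 at 60 Hz uses a total frame of
# 2200 x 1125 pixels (visible pixels plus blanking intervals).
H_TOTAL = 2200          # total pixels per line, including blanking
V_TOTAL = 1125          # total lines per frame, including blanking
REFRESH_HZ = 60

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1_000_000
DVI_SINGLE_LINK_LIMIT_MHZ = 165  # maximum pixel clock for single-link DVI

print(f"1080p60 pixel clock: {pixel_clock_mhz} MHz")
print("Fits in single-link DVI:", pixel_clock_mhz <= DVI_SINGLE_LINK_LIMIT_MHZ)
```

The pixel clock comes out to 148.5 MHz, well under the single-link limit. So raw bandwidth is not the problem; the 1080i mode I saw over DVI was more likely the driver or the monitor's EDID defaulting to an HDTV-style interlaced mode.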

Thus, if you have a Full HD PC monitor, it is advisable to connect it to your computer via HDMI to be sure you get the full 1920 x 1080p resolution.

Friday, February 12, 2010


Nvidia recently gave us a peek at its much-awaited new GF100 series of video cards. At first many speculated that it would be named GTX 300, but it is now clear it will be called the GF100 series. These cards are based on the Fermi architecture, which is said to be much faster and to consume less power than the current GTX 200 cards. I hope so, for this is Nvidia's answer to ATI Radeon's 5000 series cards, which are capable of playing DX11 games. At present I think the only DX11 PC game is Dirt 2, along with the upcoming Battlefield: Bad Company 2. But I suppose that, just as DX10 games are playable on DX9 hardware, future DX11 games will also be playable on DX10 hardware, so there is no reason to feel sorry for those currently using Nvidia 9000 and GTX 200 series cards. DX11 games look better and more realistic; however, high-end video cards are quite expensive. I just hope that when Nvidia releases its new GF100 series, the cards will be reasonably priced so that I myself can purchase one (or two for SLI).


There are only a few PC games with PhysX. I recently played Batman: Arkham Asylum and saw how great the game looks, especially with PhysX enabled. To enable PhysX, however, it is recommended to have at least a GTX 260 or higher dedicated to PhysX with a 9800 GTX handling the graphics, or a GTX 280 or higher if you are using a single card. According to forums I have read, SLI should be disabled in games with PhysX: if you have an SLI setup, disable it so that one card runs the graphics and the other runs the PhysX.

I have only one video card, a GTX 285, and I played Batman with PhysX enabled at its maximum setting and 4x AA. With PhysX on the Normal setting I can play at 8x AA, and of course with PhysX disabled I can play at 16x AA. It would be great if I had an overclocked CPU. The only problem with Batman: Arkham Asylum is that it is plagued with numerous bugs.

This PhysX feature is available to those who are using Nvidia cards. I just don't know if it can be enabled with ATI Radeon cards.