So I'm sitting here reading last month's (March 2010) PC Pro magazine, flicking through the Asus Tech-in-Style supplement, when I notice the latest Asus EeeTop PC ET2010PNT. It's an all-in-one with a 20in display and an Atom D510 inside. Underneath, it also lists 'Nvidia GeForce G310 ION2 graphics'!
ORLY?
We know Nvidia's Ion 2 platform is due soon - although we're under NDA and can't reveal the date. We do know it will feature Optimus Technology for dynamic switching between integrated and Nvidia graphics. Whether that specifically will come to desktop products such as the EeeTop, rather than just notebooks, we don't yet know: leaving it out would be cheaper, and switching graphics to save power matters less when you're tied to a wall socket anyway.
We do know it will be paired with the latest desktop Atoms (D410 and D510), as well as their "N"-series netbook equivalents, but other CPUs? We don't know yet - Ion has always been about bringing graphics to value CPUs, and the G310 doesn't really offer enough horsepower for products above these.
The G310's details are:
- 16 shaders @ 625/1,530MHz (core/shader)
- 40nm, DirectX 10.1 compatible
- Up to 512MB DDR3/GDDR3 at 800MHz on a 64-bit interface
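For a sense of scale, here's a quick back-of-envelope sum in Python for that memory spec. We're assuming the quoted 800MHz is the effective (post-DDR) data rate rather than the base clock, so treat the result as a rough estimate:

```python
# Back-of-envelope G310 local memory bandwidth.
# Assumption: the quoted 800MHz is the effective (post-DDR) data rate.
bus_width_bytes = 64 // 8   # 64-bit memory interface
effective_rate = 800e6      # 800 million transfers per second
bandwidth_gbs = bus_width_bytes * effective_rate / 1e9
print(f"G310 local memory bandwidth: ~{bandwidth_gbs:.1f}GB/s")  # ~6.4GB/s
```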
In Pineview systems the graphics will only have a x4 PCI Express interface available to it, and if it takes all four lanes that leaves nothing for other onboard hardware such as Gigabit Ethernet. It's possible Nvidia will borrow some of its NF200 PCI Express multiplier technology and apply it here, making do with fewer lanes. There's also the question of how the limited DMI bus between the CPU and NM10 PCH will cope with the additional graphics data.
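To put numbers on that worry, here's the same sort of rough sum for the links the graphics data has to cross. We're assuming PCIe 1.1 signalling at 250MB/s per lane per direction, and a first-generation DMI link, which is electrically similar to a x4 PCIe 1.0 connection:

```python
# Rough per-direction bandwidth of the links graphics data must cross.
# Assumptions: PCIe 1.1 signalling (250MB/s per lane, per direction) and
# a first-generation DMI link, roughly equivalent to x4 PCIe 1.0.
PCIE_LANE_MBS = 250

gpu_link_gbs = 4 * PCIE_LANE_MBS / 1000  # x4 link to the G310
dmi_gbs = 4 * PCIE_LANE_MBS / 1000       # CPU <-> NM10 DMI link

print(f"G310 x4 link: ~{gpu_link_gbs:.1f}GB/s per direction")
print(f"DMI link:     ~{dmi_gbs:.1f}GB/s per direction")
# A busy GPU could fill the DMI pipe on its own, leaving little
# headroom for SATA, USB and Ethernet traffic hanging off the NM10.
```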
Whether that connection limits the performance gained from the extra frequencies - up from 450/1,100MHz in the original Ion - we've yet to see. It's likely that, because Nvidia doesn't have a DMI licence to make a real chipset, we're still stuck with just the pair of SATA ports from Intel's NM10 - down from a plentiful six on the original Ion, which gave system builders the flexibility to use them as needed.
We'll see soon how our predictions play out, and what tricks Nvidia has up its sleeves.