The iPad is just a few weeks away from launch, and one of the most notable features of Apple's tablet is its speed. The A4, Apple's in-house "system on a chip" that powers the iPad, has been touted as a far superior CPU compared to Samsung's 833 MHz chip found in the iPhone 3GS.
However, is the A4 really the next-generation chip it is made out to be? In a recent article, Jon Stokes of Ars Technica questions the A4's acclaimed superiority and makes an argument against it.
According to his sources, the A4 is a 1 GHz SoC built around a single Cortex A8 core and a PowerVR SGX GPU, not the dual-core Cortex A9 chip it was originally thought to be. That would make it comparable to the chips in many modern-day smartphones, and hence nothing revolutionary. Stokes also claims that the iPad's leaner hardware, which omits resource-hungry parts like a camera, lets the A4 run faster than it would in a smartphone where those features are present. Stokes writes:
"With one 30-pin connector on the bottom and no integrated camera of any kind, the A4 needs a lot less in the way of I/O support than comparable chips that are intended for smartphones or smartbooks. This means that the A4 is just a GPU, a CPU, memory interface block (NAND and DDR), possibly security hardware, system hardware, and a few I/O controllers. It's lean and mean to a degree that isn't possible with an off-the-shelf SoC."
Stokes speculates that the reason Apple has not made details of the A4 public is that there "isn't anything to write home about." In the absence of a "wow factor," he says, revealing the A4's specs could shift the focus from what the iPad offers to what the A4 lacks.
These arguments make sense, but at the end of the day, those who got a chance to play around with the iPad found it to be very responsive and in a different league from the iPhone.
So does it really matter whether the A4 is revolutionary, as long as it delivers great performance? Let us know your thoughts in the comments.
[via Ars Technica]