July 30, 2012
Moore's law illustrated (I)
The 256 MB USB flash drive cost as much as the 1 GB flash drive. The 8 GB flash drive cost a third as much as the 1 GB drive. I got the 8 GB flash drive because I needed to spend seven more bucks to get free shipping on an Amazon order. Seven bucks!
That's less than a dollar a gigabyte. I can remember paying a dollar a megabyte for hard drives. Hard drives now sell for ten cents a gigabyte. Not to mention that the actual chip on a USB flash drive is no bigger than your thumbnail.
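Back-of-the-envelope, those figures work out like this (a quick sketch in Python, using decimal units and the prices quoted above):

```python
# Price per gigabyte for the $7, 8 GB flash drive mentioned above.
flash_price = 7.00            # dollars
flash_gb = 8
print(f"Flash drive: ${flash_price / flash_gb:.2f}/GB")   # about $0.88/GB

# Hard drives: from a dollar a megabyte (i.e. $1000/GB) to ten cents a gigabyte.
old_per_gb = 1.00 * 1000      # $/GB at a dollar per megabyte
new_per_gb = 0.10             # $/GB today
print(f"Hard drives: {old_per_gb / new_per_gb:,.0f}x cheaper per byte")  # 10,000x
```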
Speaking of thumbs, the rule of thumb devised by Intel co-founder Gordon Moore held that microprocessor performance (usually counted in terms of transistor density) would double every two years. It's held true now for more than four decades.
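To see what that compounds to, a doubling every two years over four decades works out to roughly a million-fold increase (a minimal sketch):

```python
# Compound growth under Moore's law: one doubling every two years.
years = 40
doublings = years / 2                  # 20 doublings in four decades
growth = 2 ** doublings
print(f"About {growth:,.0f}x denser")  # 1,048,576x
```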
Moore's law will eventually run into quantum tunneling and heat dissipation problems, and the limits of photolithography. Manufacturers are already adjusting by moving motherboard components onto the CPU and increasing the number of cores.
When chip makers can't go smaller, they go sideways.
Supercomputers are not the hulking, super-specialized mainframes science fiction once imagined, but are massively parallel servers built using off-the-shelf components. And waiting in the wings are optical and quantum computers.
My first computer had 64 kilobytes of memory and two 191 kilobyte floppy disk drives.
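For scale, that whole machine's memory fits into the $7 flash drive more than a hundred thousand times over (simple arithmetic, using binary units):

```python
# Ratio of the 8 GB flash drive to 64 KB of early-PC memory.
first_pc_bytes = 64 * 1024          # 64 KB
flash_bytes = 8 * 1024**3           # 8 GB
print(f"{flash_bytes // first_pc_bytes:,}x")   # 131,072x
```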
Related posts
Moore's law illustrated (II)
The accidental standard
MS-DOS at 30
Labels: computers, mainframe, tech history, technology
Comments
And, if my memory serves, that computer cost $1200. Today that will buy you a kick-ass PC or two decent laptops.
Even more amusing is that my simple Samsung flip-phone has more computing horsepower and more memory.
Why hasn't Moore's law applied to video displays?
Case in point: 6+ years ago I bought a high-end Sony Vaio laptop with 1080 vertical pixels. Except for the crappy DVD drive (it's been replaced once and is still flaky), this has been an excellent computer.
I am a big fan of 1080 vertical pixels. Given the 1080p HD standard, one would think this would be the standard for today's laptop screens. But it is not. Even today, mid-tier laptops have only 900 vertical pixels, and the Intel graphics standard is 768.
Why does 1080p remain a high-end laptop feature?
Today's laptops provide more RAM, CPU, and hard drive storage but not pixels.
Why?
OK, the Sony is 1280 x 800, not quite 1080p. Still, my point remains: today's standard 15-inch laptops are 1366 x 768.
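For concreteness, the raw pixel counts behind these resolutions (simple arithmetic; the 1080p panel carries roughly twice the pixels of the common 1366 x 768 one):

```python
# Total pixel counts for the resolutions discussed in this thread.
resolutions = {
    "Sony Vaio (1280 x 800)": (1280, 800),
    "Common 15-inch (1366 x 768)": (1366, 768),
    "1080p (1920 x 1080)": (1920, 1080),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 1,024,000 vs 1,049,088 vs 2,073,600: 1080p is about 2x the pixels.
```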
Two reasons spring to mind:
1. Quality control. Silicon wafer sizes have grown along with transistor density, so losses from die defects are caught early in the production process. Losing even a couple of pixels (and pixel counts grow as the square of the resolution) means junking the entire screen, making the marginal cost of a defect much higher for the same failure rate. (See the toy yield calculation after this list.)
2. Consumer demand. Denser screens demand more processing resources, especially on low-end devices with shared graphics memory. Past a certain point, buyers will give up resolution before they will give up speed. That's why HD screens showed up first on phones and tablets, devices engineered to balance that cost/performance tradeoff.
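To make the quality-control point concrete, here's a toy zero-defect yield model; the per-pixel defect probability below is made up purely for illustration, not an industry figure:

```python
# Toy panel-yield model: chance a screen has zero dead pixels, assuming
# independent per-pixel defects. The defect rate below is hypothetical.
defect_prob = 1e-7   # assumed per-pixel defect probability (illustrative only)

for name, (w, h) in {"1366 x 768": (1366, 768), "1920 x 1080": (1920, 1080)}.items():
    pixels = w * h
    zero_defect_yield = (1 - defect_prob) ** pixels
    print(f"{name}: {pixels:,} pixels, zero-defect yield ~{zero_defect_yield:.1%}")
# Doubling the pixel count squares the zero-defect yield (0.90 -> ~0.81 here),
# so defect losses grow much faster than resolution.
```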