This story could be picked straight out of Clayton M. Christensen's "The Innovator's Dilemma".
As of today, these small, cheap computers are in most ways that matter inferior to their monolithic laptop and desktop counterparts. But many large players fear that they will grow in market share and hence start eating their margins. Fujitsu even goes as far as to deliberately stay out of the market.
However, the only reason that these smaller machines are less capable is that most processing today happens on the local machine. If web applications grow in the direction that Hacker News hopes, there will only be games and specialty applications (research and industry, plus maybe a few consumer applications) that require the horsepower of a traditional computer. Today, you could (rightly) state that lightweight machines are unergonomic and troublesome, but history dictates that with this much demand for a niche product, these problems will evaporate in a jiffy.
Even if the future still has a huge demand for desktop processing (say, if lots of data-transfer intensive things like video editing and processing become much more popular), lightweight machines still have huge competitive benefits: light weight, small size, low price and long battery life. One can even imagine the 'low price' requirement waived, which would give manufacturers headroom to innovate further in form factor. I'm thinking something in the direction of what the OLPC team is experimenting with: multiple touchscreens, enabling new areas of computer use. No doubt there is lots of territory to be explored here.
Fujitsu (and maybe even Dell) is making a huge mistake in not embracing this nascent market. It's a good thing for their investors that they are diversified. Give it five years, and computers resembling these will be everywhere. Fujitsu will be unable to catch up. These machines will easily gobble up the part of the market which uses laptops as portable word-processors and email-readers, and will probably expand into uses for which today's laptops are completely unsuitable.
Agreed. In 30-35 years' time (likely some fraction of that), the companies that are staying away because there's no money or the market is ill-defined are going to look as silly as the ones who thought the market for desktop personal computers had no margin or was just going to ruin the business-computer market.
If you're in the business of selling desktops and laptops you are going to get steamrolled. If you're in the business of selling people technology they want and can use, congratulations on your good fortune. I believe the innovation these machines bring is going to be _awesome_, and if the market for ugly shoebox sized boxes with noisy fans and 10+ attached cables dissolves, good riddance.
We hackers also need to pay close attention to this trend and how it will affect our work.
Between netbooks and smartphones, one wonders how much longer we can be lazy and get away with things like Flash, video, and huge JavaScript modules.
In 10-12 years, a $200 cellphone is probably going to have a faster CPU, more memory, and more disk space than the desktop or laptop you're using right now.
PS: A PC in 1998 ~= Pentium II @ 266MHz, 8GB hard disk, 128MB of RAM.
Most cellphones are faster than a PII, and an 8GB microSD card = $31.50.
Or alternatively, they will have similar amounts of computing power, but will be much smaller, lighter, cheaper, and have a better battery life.
If we're more careful programming, we can easily do all sorts of interesting stuff on the limited, massively low power, and ultracheap hardware I'd like to see.
A PII uses 7.5 million transistors @ ~233 MHz.
An Intel Core 2 Quad has 582 million transistors @ ~2400 MHz.
I don't think there is a point to using anything much less than a PII in most consumer cellphone / PDA devices. And 10 years from now, when 20-billion-transistor CPUs are normal for home PCs, I don't think using something less than a Core 2 Quad will have much of a point either. (Note: I am talking about devices that show websites and take pictures.)
PS: Granted, if you're building a toaster then an 8080 is probably overkill.
No, but using a $0.95, 7-million-transistor CPU to simply emulate a more expensive mechanical timer might not be. And then you get internet and an LCD and all sorts of stuff basically for free. You could even emulate that 8080 if you wanted.
If you're starting with an internet connection and an LCD, then there is a lot of value in an OK ($0.95-plus) CPU (Kindle, cellphone, PDA, laptop).
But if you're starting with a cheap ($0.05 or less) CPU, then adding an OK CPU, an LCD, and an internet connection is probably overkill (watch, toaster, fridge, dishwasher, etc.).
For my mobile devices, I'd rather see a modest computing power increase, and (assuming power per transistor shrinks such that power usage for a top-of-the-line CPU is approximately constant) we end up using around 5 to 10% of the current desktop's power.
To me, a laptop with 20 hours of charge and maybe the power of a current quad-core would be far more useful than a short-lived, hot, noisy machine like we have now.
The same goes for a cellphone. I'd rather have 100 hours of on-time than the ability to use heavyweight technologies. Being able to forget my charger without worrying much when travelling is far more important to me.
In 10-12 years, a $200 cellphone is probably going to have a faster CPU, more memory, and more disk space than the desktop or laptop you're using right now.
and it will be powered and cooled by what? I wouldn't want to hold a phone that gets as hot as my laptop does.
Well, the storage will be all memristors, so that will make the storage significantly larger, smaller, lower powered, and faster. I can't wait. HP says the first commercial memristor memory will be out in 2009, here's hoping. http://en.wikipedia.org/wiki/Memristor
Well, perhaps the storage may be better but I can't see how memristors will change the CPU heat equation. Why will memristors actually require less power to do the equivalent of what conventional CMOS transistors do? They both seem to require power to effect changes of state. Can anyone explain what it is about memristors that allow them to use much less power to operate at the same speed as CMOS?
Also, the wiki page itself says:
"Although the HP memristor is a major discovery for electrical engineering theory, it has yet to be demonstrated in operation at practical speeds and densities. Graphs in Williams' original report show switching operation at only ~1 Hz. Although the small dimension of the device seem to imply fast operation, the charge carriers move very slowly."
Woah. 1Hz is once per second, roughly nine or ten orders of magnitude slower than current CPU clock speeds, so unless we're talking about some radical alternative to von Neumann, I wouldn't expect them to be viable for CPUs for a few decades.
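For scale, a quick back-of-the-envelope check of that gap (assuming a roughly 3 GHz desktop clock, which is my assumption, not a figure from the thread):

```python
import math

cpu_hz = 3e9        # assumed ~3 GHz desktop CPU clock
memristor_hz = 1.0  # ~1 Hz switching rate reported in Williams' graphs

# How many orders of magnitude separate the two switching rates?
orders = math.log10(cpu_hz / memristor_hz)
print(round(orders, 1))  # 9.5
```

So the gap is closer to nine or ten orders of magnitude, which is still enormous.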
Caveat: I have only a minimal understanding of hardware and semiconductor physics.
As transistors get smaller they use less power per cycle. If power scaled linearly with transistor count and clock speed, going from a 34.8W, 7.5-million-transistor PII @ 233 MHz to a 500-million-transistor Core 2 Duo @ 2400 MHz would mean 34.8W × (500 × 2400) / (7.5 × 233) ≈ 23,897W, or roughly 24kW. The real chip obviously uses a tiny fraction of that.
Note: a Q6600 (Core 2 Quad) @ 2.40 GHz uses 95 W, but a lot of that is L2 cache, which is less power-hungry than its computational units.
PS: If you want portable power today, a U1400 Intel Core Solo @ 1.20 GHz uses 5.5 W and will crush a PII in the same way I expect a new 5.5W CPU to crush a Core 2 Duo in 10 years.
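The naive-scaling arithmetic above can be sketched in a few lines (the 500-million-transistor Core 2 figure is the round number used in this thread, not an official spec):

```python
# If CPU power draw scaled linearly with transistor count and clock speed
# (it doesn't, because smaller transistors use less power per cycle),
# here's what a Core 2-class chip "should" draw, extrapolating from a PII.
pii_watts = 34.8         # Pentium II power draw
pii_transistors = 7.5e6  # 7.5 million transistors
pii_mhz = 233

c2d_transistors = 500e6  # round Core 2 Duo figure from the thread
c2d_mhz = 2400

naive_watts = pii_watts * (c2d_transistors * c2d_mhz) / (pii_transistors * pii_mhz)
print(round(naive_watts))  # 23897
```

About 24 kW of "predicted" draw, against a real TDP under 100 W; that gap is the per-transistor power savings at work.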
That all makes sense since a PII and a Duo use the same kind of transistors. I was more curious about what the effect on power would be when memristors augment or replace transistors on a CPU.
Why would that industry be worried? These machines aren't replacing more powerful laptops or desktops: I bet people are buying them as an addition to a regular computer.
I was shopping for an Eee PC and reading reviews of it, and lots of people were saying just that - a cheap travel companion, not a laptop replacement. Hey, I myself was going to get it precisely for that reason.
Price isn't as important; you can get "real" laptops for cheap. I paid $299 for an Acer laptop for my parents: 1.7GHz Core 2 Duo, 1GB RAM, 15" LCD, 120GB HDD, big & usable keyboard. The whole thing is big, and that's why it's cheap.
The point here is size: these machines are painful to type on and screens are tiny for most kinds of work (be it documents, programming, image editing, etc). They're decent "time wasters" - youtube, reddit, etc.
One would think the MacBook Air would have been mentioned in this article. If the lack of a CD-ROM/DVD drive isn't evidence of a focus on web applications instead of local data, nothing is.
Yes, but the MacBook Air addresses the main problem with the "netbook" computers. It has a large screen and keyboard, while still being ridiculously light and thin.
So, if you can make netbooks thin and light but still wide (for keyboard and screen), and still cheap, you pretty much eliminate the only remaining argument for more "full featured" laptops. For now, it seems thin, light, and wide is enough of an engineering challenge that it's only possible for expensive notebooks. But it will be interesting to see if that will change.
My Dell Latitude X1 is a real laptop and not much bigger than an Eee PC. Not much more expensive than an Eee PC on eBay, either. I think the Eee PC-type things are a passing fad. Among other things, soon "real" notebooks of the same size won't be much more expensive than the semi-things.
It's the other half of Moore's Law, you get the same functionality at half the price. If you don't need more performance then the price drops, for example look at DRAM price curves.