Unnecessary Advancements
Posted: Thu Jan 17, 2008 10:09 pm
by lollynoob
I've been thinking, and I've realized something: ever since people have owned computers of their own (not just the PC; C64s and Apple IIs apply as well), they've been doing pretty much the same things on them, despite the orders of magnitude by which processor speeds have increased. People still type documents and print them out, people still play games, and people still talk over networks to other people to cooperate and enjoy themselves (even before the internet people did this; BBSes had pretty much the same function as forums today, there just weren't so many awful emoticons).
All things considered, is there really a point to the rat race to the next hertz mark, or the next digit in your instructions-per-second count? Sometimes it just seems like a pretty misguided effort; people don't really need fancy graphics to get the job done, and if they had never been invented, people wouldn't be so afraid of not having them. If anything, the habituation to flashy effects (e.g. relaxen and watch das blinkenlights) seems counterproductive, as so much effort and money is put into developing the next big effect, which will only distract people more from the task at hand.
Just think; what if all of this effort was put into something more useful? What if, instead of having 3 GHz processors that burn out faster than their 25-year-old counterparts, corporations had made an 8086 that never wore out? What if, instead of the next shiny improvement in the appearance of a program, revisions to software actually produced a smaller, faster, more efficient application?
Now, I'm not naive; I realize that all of this would be horrible for the economy--flashy products that need constant replacement are a capitalist's dream come true. I'm just wondering if it's really the best thing to avoid genuine improvements in the name of money.
Posted: Fri Jan 18, 2008 3:00 am
by AJ
Hi,
Although I agree with some of what you say, occasionally you really do see something which gives a genuine benefit to humanity. Think, for example, of bionic implants.
There have already been relatively successful retinal implants using a CCD-type device, giving people who had previously lost their sight some form of 'guiding sight'. Were it not for the miniaturisation of electrical components and the ability to develop fast components which run cool, this would not be possible. Some of the expertise to do this kind of thing inevitably comes from tweaking advances made in desktop computing technology.
Also, how about protein folding - this offers some real possibilities for major medical advances (particularly in the field of cancer research) and is only possible with huge amounts of CPU time. And weather prediction - has anyone else noticed how much better it has got in the past 5 years? Sometimes, a good weather forecast can save lives. What about flight training simulators? Realistic simulations can mean fewer crashes in the real world...the list goes on.
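To put rough numbers on the folding point - and this is only a toy sketch with made-up constants, not how real folding codes actually work - even a naive pairwise energy evaluation scales as O(N^2) in the number of atoms, and a folding trajectory needs millions of such evaluations:

Code:
#include <stdio.h>
#include <stdlib.h>

#define N_ATOMS 10000    /* a small protein plus some solvent (illustrative) */
#define N_STEPS 1000000  /* femtosecond-scale steps for ~1 ns of simulated time */

typedef struct { double x, y, z; } vec3;

/* One naive Lennard-Jones energy evaluation: O(N^2) pair terms. */
static double total_energy(const vec3 *p, int n)
{
    double e = 0.0;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            double dx = p[i].x - p[j].x;
            double dy = p[i].y - p[j].y;
            double dz = p[i].z - p[j].z;
            double r2 = dx * dx + dy * dy + dz * dz + 1e-9;
            double s6 = 1.0 / (r2 * r2 * r2);
            e += 4.0 * (s6 * s6 - s6);  /* 12-6 potential */
        }
    }
    return e;
}

int main(void)
{
    vec3 *atoms = malloc(N_ATOMS * sizeof *atoms);
    if (!atoms) return 1;
    for (int i = 0; i < N_ATOMS; i++) {
        atoms[i].x = rand() / (double)RAND_MAX;
        atoms[i].y = rand() / (double)RAND_MAX;
        atoms[i].z = rand() / (double)RAND_MAX;
    }

    /* ~5e7 pair terms per step, times 1e6 steps: ~5e13 terms for one
       nanosecond of simulated time - and folding takes far longer. */
    double pairs = (double)N_ATOMS * (N_ATOMS - 1) / 2.0;
    printf("pair terms per step: %.2e\n", pairs);
    printf("pair terms for %d steps: %.2e\n", N_STEPS, pairs * N_STEPS);
    printf("one step's energy: %g\n", total_energy(atoms, N_ATOMS));

    free(atoms);
    return 0;
}

That's why distributed projects throw thousands of CPUs at the problem.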
So yes - there is a lot of effort expended on the Gigahertz wars, and a lot of it is just for the 'bling' factor. But sometimes, some good comes out of it - and besides, would you want to go back to using an 8086 CPU for your day-to-day personal computing now?
Cheers,
Adam
Posted: Fri Jan 18, 2008 3:44 am
by JamesM
lollynoob wrote:people still play games, [...] people don't really need fancy graphics to get the job done,
People want to play games. Companies make games for profit. Each company wants to get more sales than its competitors. Each company will therefore try to make better games than its competitors. Therefore, each generation of games needs better hardware.
This argument can be applied to any field related to computers.
Posted: Fri Jan 18, 2008 4:08 am
by Solar
When things began for me, I had a ZX81: 40x25 b/w display on a 50 Hz TV, 1 kB RAM, 3.5 MHz, and if I wanted to save a program I had to hook the thing up to a cassette player and hope I got it right, because there was no way to verify success. "Type and print a document"? Ha, you wish. Games? "Guess the next random number between 1 and 10", or "move this piece of ASCII art around on the screen".
Next came the C64: 40x25 16-color display on a 50 Hz monitor, 64 kB RAM, and some more comfort when it came to storing data. Yes, I could "type and print a document", but nothing beyond the simplest of layouts, and it looked crappy when printed by a 9-pin dot-matrix printer. Games were nice, I admit. I can remember playing "The Bard's Tale" and thinking how cool it would be to actually see the monsters coming around the corner and have to fight them in real time, with some spiffy effects for the magic...
Next came the Amiga: 640x256 pixels with a 256-color palette on a 50 Hz monitor, 1 MB RAM, 7 MHz, and two floppy drives! Yay! After spending about the same amount of money all over again, that became 4 MB RAM, 25 MHz, and - gosh! - a hard drive. Now, for the first time, I learned about WYSIWYG. With the 24-pin printer, it even looked acceptable. Later, I even added a 28k modem to it, and a flicker fixer (doubling vertical resolution to 512, but still 50 Hz). Then I expanded again: a 50 MHz 68060, more RAM, and a gfx card allowing 1024x768 at 75 Hz, true color.
That was about the first time that I, personally, could have "settled" for what I had.
The next computer was a laptop, meaning I could take my computer with me. It played video and DVDs. I played my first first-person real-time game on it. It could do networking, at a whopping 10 MBit. Internet was at 64 kBit, and I could even talk on the phone while I was online. The CPU was a thoroughbred (a 500 MHz PIII), ten times faster than the one I had before. It still took ages to do anything major, like compiling a large application, but what the heck. At least I could fly a real flight simulator now.
My laptop today can do WLAN at 54 MBit, and I recode and burn my own DVDs in minutes instead of hours. I like my 16 MBit DSL connection (including a flatrate for internet and phone). I like to time-skip the TV I watch, or to record a movie from the start after watching it for 15 minutes to find out if it's good. Playing World of Warcraft was really cool for a while. Oh, and it cost only a tenth of what my first laptop did, quite literally.
Where on that road would I really have wanted to stop and say "no more"?
I wouldn't.
Today I am satisfied. I don't have needs that my current system cannot fulfill. Let's see again in two years or so.
But at work, we're working on a database that's in the hundreds-of-gigabytes range, on a 16-CPU machine with 64 GByte RAM, and we're constantly hacking the system to keep it from dying from lack of resources. Pilots get trained in simulators so they don't have to learn the hard lessons with passengers' lives at stake. Medicine, weather research and outer space exploration eat up clock cycles and will never have enough, for decades to come.
I admit that some developments have taken on pathological tendencies. I'd rather wait half a year for the next generation of chips than overclock what I have today, for example, because the price and risk just aren't worth the performance gain.
But I don't see the market being saturated. We're slowly getting to the point where the home user doesn't need more GHz anymore, but then there's the price and the power consumption to talk about, and the ecological impact of production... and if all the fifteen billion people of tomorrow want to surf the internet at 100 MBit each, the backbone providers will need all the clock cycles they can get...
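Back-of-envelope, taking those hypothetical numbers at face value:

Code:
#include <stdio.h>

int main(void)
{
    double users = 15e9;          /* "fifteen billion people of tomorrow" */
    double per_user_bps = 100e6;  /* 100 MBit each */
    double aggregate = users * per_user_bps;
    printf("aggregate demand: %.1e bit/s (%.1f exabit/s)\n",
           aggregate, aggregate / 1e18);
    return 0;
}

That comes out to 1.5 exabit/s of aggregate demand - whatever the backbones of tomorrow look like, they won't be bored.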
Posted: Sun Jan 20, 2008 1:53 pm
by Telgin
The improvements you see in desktop computers are mostly a side effect of the general improvement in computer technology. After all, desktops aren't all that's done with computers.
For instance, it wouldn't have been possible to have a computer-guided missile or bomb way back in the ENIAC days. A more recent example would be something like MRI technology, which relies on so much data being pushed through a computer that it probably wouldn't be possible with the computers of 20 years ago.
Now, you're right that consumers aren't seeing much of an enhancement from modern improvements. Text processing can only get so advanced (although Microsoft Office can consume a staggering amount of memory and processor cycles). Some day, though, we'll see something genuinely useful. Computers now aren't fast enough to do something impressive like true AI, but someday they will be. Then there will be a point.
Posted: Mon Jan 21, 2008 2:46 am
by Solar
For artificial intelligence we'd have to overcome genuine stupidity first.
Sorry, couldn't resist. I understand why researchers are excited about AI, but I don't see how it would solve any problems in a reliable way.
Posted: Mon Jan 21, 2008 12:37 pm
by mathematician
Sometimes I think the same, but try running a modern WYSIWYG word processor on a 4.77 MHz PC with 1 MB of memory.
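To put a rough number on that - assuming a 1024x768 true-color display purely for illustration - the framebuffer alone outstrips the whole machine:

Code:
#include <stdio.h>

int main(void)
{
    long w = 1024, h = 768, bytes_per_px = 3;  /* 24-bit color, illustrative */
    long fb = w * h * bytes_per_px;
    printf("framebuffer alone: %ld bytes (~%.2f MB)\n",
           fb, fb / (1024.0 * 1024.0));
    return 0;
}

That's about 2.25 MB just to hold one screenful of WYSIWYG output, before the word processor itself has loaded a single byte.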
Posted: Mon Jan 21, 2008 1:23 pm
by Zacariaz
Do we need faster CPUs/GPUs? I don't really think so, but then again, I didn't a few years ago either, and today I wouldn't be able to live without my Core 2.
I do believe that if we need new, better and faster hardware, it will mainly be needed for games (that is, for "normal" people). The problem with games today, as I see it, is that they're focused more on the graphics than the gameplay; thus I haven't encountered a game for years that was worth playing for more than a week. Sooner or later the graphics will be so lifelike that developers will have to focus on the gameplay instead. Though the development of new hardware will not stall for that reason, I do believe the goals will be changed radically - e.g. focusing on energy consumption instead.
This was just a somewhat messy and long-winded example, but my point is that I believe we will, some day, realize that we don't need any more "power". Sure, the development won't stop because of that, but the Hz race will.
Did that make any sense at all?
Posted: Mon Jan 21, 2008 3:37 pm
by lollynoob
First of all, thanks, everyone, for all the thought you've put into this whim of an idea I came up with. Secondly, I'd like to make a clarification.
The main thing is that when I wrote my original message, it was targeted at desktop computers--not at server systems or things like embedded systems for medical use, but at the computer the average person needs. Sure, a 386 (I'm skipping a few generations here for genuine usability) couldn't do real-time visual calculations in someone's false eye, but for desktop usage it could stand up to nearly any task with properly written software. Along with being more than enough for everyday use, if the focus of processor manufacturers weren't on speed, it could be on other things such as size, durability, or power consumption ("hold on, let me pull out my 30-year-old laptop that's been running on its batteries for a few days, to check something online"). Also, if such a processor were settled on (one that "just worked", as people like to say about things), imagine the innovation that could exist in fields such as software design--applications, commercial or not, couldn't rely on flashy gimmicks over actual performance to gain buyers; the software would have to actually be more efficient, more intuitive, and generally better to get support.
If anything, it seems, stagnation in the personal computer market would only cause improvements.
Posted: Mon Jan 21, 2008 3:58 pm
by mathematician
I can remember when somebody I knew installed Windows 3.2 onto a 486. After typing just a few words, her comment was "Isn't it slow?" She decided to go back to MS-DOS.
Posted: Mon Jan 21, 2008 6:51 pm
by ucosty
lollynoob wrote:Sure, a 386 [...] couldn't do real-time visual calculations in someone's false eye, but for desktop usage it could stand up to nearly any task with properly written software. [...] If anything, it seems, stagnation in the personal computer market would only cause improvements.
A 386 would be terrible. How many tasks, no matter how well written, could you run on one? Right at this very moment I have a Firefox instance with several tabs open, a Cisco Software IP Phone running, iTunes playing music, an SSH shell to my phone server, a VMWare session running a single VM in the background and several background tasks monitoring email and IM, and a web server with MySQL for development stuff. I'm not even running at capacity here and often have even more running. To top that off I'm talking about my laptop - my home desktop frequently runs more.
I know I'm probably not representative of the average computer user, but to suggest that the increments in processor speed, RAM and hard drive capacity, and video performance are useless is to reduce everybody's needs to the lowest common denominator.
lollynoob wrote:size, durability, or power consumption
As far as I know, processors don't wear out. I have an SGI Indy at home with a build date in 1994 that is still going strong with all original parts. I even fired up an old Atari 2600, which worked until the crappy power brick died.
That leaves size and power consumption. If you want small size and low power consumption, there are *lots* of CPUs that fit the requirement. Since you were talking about 8086s and 386s, I am going to assume you don't even care about speed that much.
I'll just point you to the Wikipedia entry for VIA processors.
http://en.wikipedia.org/wiki/List_of_VI ... processors
Of course I shouldn't need to mention ARM or the commercially licensed variants.
And for size, you can't really beat Pico-ITX.
http://www.trendygadget.com/2007/04/21/ ... -x86-mobo/
That being said, the electronics in modern PDAs are quite small, and the processors are getting better all the time. According to Wikipedia, the iPhone sports a 620 MHz ARM CPU.
Posted: Fri Jan 25, 2008 5:35 pm
by lollynoob
ucosty wrote:Right at this very moment I have a Firefox instance with several tabs open, a Cisco Software IP Phone running, iTunes playing music, an SSH shell to my phone server, a VMWare session running a single VM in the background and several background tasks monitoring email and IM, and a web server with MySQL for development stuff. I'm not even running at capacity here and often have even more running. To top that off I'm talking about my laptop - my home desktop frequently runs more.
Surely you can't be browsing multiple web pages, talking to someone over VoIP, listening to music, managing your phone server, dealing with another system in emulation, waiting for an e-mail, instant messaging, and doing web development all at the same time. If you are, that scares me--let alone the fact that you often manage more tasks than this; I wonder when you'd get the time to actually accomplish something.
It's examples like this that make me wish the personal computer had hung back at around early-90s speeds. People would actually get things done.
http://lifehacker.com/software/multitas ... 246988.php
Posted: Sat Jan 26, 2008 2:24 am
by ucosty
If you do any real web development you'll need a server, a client application, and an editor at a minimum. You'll probably also have Photoshop/Illustrator/whatever open so you can reference your design material. In my case I was developing an application that interacted with Cisco IP phones, and therefore also needed a Cisco IP phone client and the server to make it all work.