Hacker News

What I find fascinating in this short story is the concept of THE computer. That was the vision of the computer back only two or three decades ago (and I'm old enough to remember). Everyone has a terminal into the main computer, instead of the myriad of personal computers (handhelds, laptops, etc) that we actually have.

The concept of a major computer site taking up a huge amount of space might be superficially coming true with all the data centers we're building, and accessing the cloud instead of doing the computation locally is superficially like using a terminal, but the concept of a single, massive entity is not really being pursued. We understand the limitations. We don't have the technology, or even a theory, for how we would program one massive program, utilizing the zillions of little processors, as a single entity. We can't get multi-threading right even for comparatively trivial programs (compared to a Multivac, I mean) that involve no AI.
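To make the "we can't get multi-threading right" point concrete, here's a minimal Python sketch of the classic lost-update bug and its standard fix. (The example is mine, not from the thread; `counter += 1` compiles to a separate load, add, and store, so concurrent threads can overwrite each other's increments.)

```python
import threading

counter = 0       # shared state, unprotected
safe_counter = 0  # shared state, guarded by a lock
lock = threading.Lock()

def unsafe_increment(n):
    global counter
    for _ in range(n):
        counter += 1  # read-modify-write race: updates can be lost

def safe_increment(n):
    global safe_counter
    for _ in range(n):
        with lock:            # serialize the read-modify-write
            safe_counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(safe_counter)  # 400000 with the lock; the unsafe version often loses updates
```

If even this two-line critical section is easy to get wrong, coordinating zillions of processors as one program is a different order of problem entirely.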



Perspective change for you: It's the Internet.

It takes up a huge amount of space, utilizing zillions of processors, and is insanely parallel.


I find it surprising that no OS yet comes with a distributed processing framework (i.e. a pluggable system to support things like Folding@Home) installed and running by default. It might be the norm in ten years or so; then we'll actually get the effect you mention, where anyone can rent processor time on "the InterVAC" (or at least "the MicrosoftVAC" and "the GNUVAC", if it doesn't get standardized).
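The kind of pluggable framework described above generally boils down to a fetch/compute/report loop, which is roughly how Folding@Home and BOINC clients work. A toy sketch, with the scheduler simulated by a local queue (all names here, `fetch_work`, `report`, the work-unit format, are invented for illustration):

```python
import queue
import threading

work_queue = queue.Queue()  # stands in for the remote scheduler
results = {}
results_lock = threading.Lock()

def fetch_work():
    """Pull the next work unit, or None when the scheduler is drained."""
    try:
        return work_queue.get_nowait()
    except queue.Empty:
        return None

def report(unit_id, result):
    """Send a finished result back to the scheduler."""
    with results_lock:
        results[unit_id] = result

def worker():
    # The daemon loop: fetch a unit, crunch it with spare cycles, report.
    while (unit := fetch_work()) is not None:
        unit_id, payload = unit
        report(unit_id, sum(payload))  # placeholder computation per unit

# Seed the "scheduler" with 8 work units, then run 3 local workers.
for i in range(8):
    work_queue.put((i, list(range(i + 1))))

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # 8
```

An OS-level version would mostly add the boring-but-hard parts: sandboxing untrusted work units, throttling to idle cycles, and verifying results from untrusted machines.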



