I was just visiting my mother, who is in her late 60s, and (as usual) complaining to high heaven about the state of her digital/internet existence: an underpowered, beyond-obsolete eMac with a dial-up internet connection. We started talking about what, in a perfect world, would be the best thing for her to upgrade to (besides cable internet, for god’s sake).
As we have seen, “old people-optimized tech” is becoming quite the niche. But when I thought frankly about it, I couldn’t help but think that perhaps the best computer for an aging person who “just wants something that works” isn’t a ruthlessly simplified internet appliance, but the exact opposite: a top-of-the-line, specced-to-the-nines, more-horsepower-than-she’ll-ever-come-close-to-using desktop supercomputer.
Why? Isn’t all that expensive power just going to be “wasted” on someone like my mom? Absolutely. And that’s the whole point.
In fact, the whole idea of modern computer interfaces is completely predicated on “wasting” formidable computing power on frivolous things like graphical user interfaces, onscreen animation, and eye candy of every sort. (You know, all the stuff that makes using your computer not require the intelligence of an astrophysicist and the temperament of an actuary.) And if there’s anyone who needs a maximized amount of wasted power working in their favor, it’s a 66-year-old lady going on the internet.
SO:
She should have the fastest processor that money can buy (within reason), to reduce lagging and latency to a point indistinguishable from zero. In fact, an overpowered processor is the perfect investment for a non-power-user, because it will ensure that any of the simple things she does with her computer for the next X years will still happen near-instantaneously long after her machine is technically obsolete. Also, as the internet continues to grow in sophistication, “simple things” will become more processor-intensive whether we like it or not.
She should be maxed out like a mothertrucker on RAM. Not because she’s going to be multitasking between Final Cut Pro and After Effects. Rather, because again, anything that happens instantly and seamlessly is one less thing for her to even think about when using her computer. Plus, old people are not geniuses with file management. They dump everything on their desktop so they don’t “lose” it and keep application windows open in perpetuity. Having more RAM than is humanly necessary will offset these bad habits and ensure that whatever memory-hungry applications might come down the pike in the next 5 years, she’ll always have more than enough to never see a spinning beach ball.
She should have a huge hard drive. Will my mom be stocking up untold gigabytes of torrent files and lossless music? Of course not, but having a 1TB black hole inside her computer just means (again, are you seeing a pattern?) that she doesn’t have to think, choose, hem, or haw in any way about saving things, ever. No matter how many baby pictures or videos of grandchildren get sent her way.
She should have the latest and greatest major OS. An old person’s computer should be a seamless link between them and the same robust digital “present tense” that the rest of the world is living in, especially their children and grandchildren. Custom OSes like the one on the Litl or this new one are awesome in their own limited, circumscribed ways, but if few people are using them, that’s just another possible barrier to seamless communication (I think)—the elderly user gets pushed into a ghetto with different basic rules and tools than everyone else.
At this point, is OS X or Windows 7 really that daunting on the surface? No. But they’re still flexible and powerful enough to handle whatever’s coming in the near future that might turn out to be as ubiquitous as YouTube or video chatting have become.
What all this means is that if I had my druthers, I’d set my mom up with a rig that would blow my own current system out of the water, performance-wise. But that’s the thing: I’m going to own three or four new systems in the next ten years. My mom won’t, and shouldn’t have to.