[Editor's Note: I don't agree with this article, as most of the points about "running out of bits and bytes" are ridiculous. But I'm including it here for amusement's sake, as an example of how some people view the scary world of computers]

(Source: rep-am.com)

What a waste — the computer power spent on today’s movies.

What? Computer power? Movies? Waste? Let me explain.

First, except for some low-budget indie films shot in Mongolia, almost all pictures have some computer effects. The movie may be an intimate three-character drama set in a single motel room, but check out the credits. You’ll see at least one for a computer effects company.

Next, some films wouldn’t exist except for computers. All those action, superhero, sci-fi and fantasy pictures are completely dependent on artists and geeks bent over computer screens and keyboards creating all those eye-popping effects you love so much.

Then, we have feature-length cartoons. They’re all computer-animated now. Once, digital animation gave a new kind of life to cartoons, but now this new technology has pixelated the life out of them.

It doesn’t matter. Those cute, fuzzy little animals your kids want to see are computerized. Guaranteed.

“So?” as Dick Cheney might ask. Who cares? We’ve got an endless supply of bits and bytes, megas and teras, and silicon, too.

Maybe.

I heard a guy on one of those wonky NPR shows say the world is using as many bytes as there are stars in the known universe, and down the road we may run out.

That road will end in 20 years or so, computer scientists fear, according to an article on ZDNet.com. What we consider computer power — how much data and processing speed we get from our machines — depends on how many itsy-bitsy transistors scientists and engineers can shrink onto a silicon chip. “Semiconductor makers won’t be able to shrink transistors much, if at all, beyond 2021,” ZDNet reports.

“This looks like a fundamental limit,” ZDNet quotes Paolo Gargini, director of technology strategy at Intel, the company that dominates the computer chip field. Of course, that article ran in 2003. In the technology business, that’s a generation ago.

Still, even if these limits are old news, it doesn’t mean we will have an unlimited supply of computer power forever. Or even today.

Which brings me back to the movies.

Last weekend, I saw two new films that needed computers to exist in today’s movie marketplace: “Horton Hears a Who!” and “10,000 B.C.” Adapted from Dr. Seuss’ beloved children’s tale, “Horton” is a computer-animated cartoon that, except for a few moments, benefits not one pixel from digital technology.

Animated features like this, including all the Pixar ones, are literally sketched out on paper with pencils, Magic Markers, pastels and paints. Talented people create these films just as Walt Disney did, with artists drawing the characters, the backgrounds and the basic animation. Writers, editors and artists develop the plot on comic strip-like storyboards pinned to large walls. Only when the handmade art has been okayed is the movie ready for its final computer coloring, detailing and polishing.

Some cartoons benefit from the more realistic three-dimensionality of digital animation, but not “Horton.” Yes, the film captures much of the whimsical exaggeration of the original characters and scenery. But sorely missed is Seuss’ loosey-goosey drawing style, which gave so much life to his books. His art animated the printed page. For all the charm and inventiveness of the movie, the computer power 20th Century Fox spent was wasted.

If “Horton” could have been successfully made without computers, “10,000 B.C.” was totally dependent on them. Without computers, there would have been no movie. This oddly lifeless hodgepodge of prehistoric history, zoology, anthropology and a half-dozen other “ologies” features early human clans hunting giant mammoths. These scenes are amazing for a few moments, but quickly lose their energy.

More exciting, and simpler, is the sniff-and-stare-down between a giant saber-toothed tiger and our prehistoric hunter-hero. Later, he leads a gathering of tribes to liberate their peoples, enslaved by a cruel, more developed civilization that is building giant pyramids. The computer-generated “aerial” shots of thousands of slaves and hundreds of mammoths dragging house-size stones up the pyramids are stunning.

But without much of a compelling story to hold these digital eye-poppers together, what’s the point? For me, all this computer power is wasted.

What to spend it on? Computers won’t stop war or terrorism, and they won’t prevent poverty or hunger. Those efforts require human energy. Instead, how about putting computers to use on a SETI project?

SETI? If you’ve seen “Close Encounters of the Third Kind” (1977), those massive arrays of radio telescopes were part of SETI, the Search for Extraterrestrial Intelligence. SETI’s enormous dishes listen to radio noise from outer space, and its computers analyze that noise to detect patterns suggesting an intelligence out there. The odds are there are intelligent life forms somewhere else in the universe. Why not listen in?

Personally, I’d rather use more of our computer power searching for life in other star systems than making movies like “Horton Hears a Who!” or “10,000 B.C.” Neither needed the gazillions of terabytes and pixels poured into them. What’s more important: searching for extraterrestrial intelligence, or watching a caveman you don’t give a “Who!” about trying to spear a mammoth?
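[Editor's note: for the curious, here's a rough idea of what that "pattern detection" amounts to. Below is a toy Python sketch, not SETI's actual pipeline: it makes up one second of fake radio noise with a faint narrowband tone buried in it, then uses a Fourier transform to flag any frequency that rises far above the noise floor. Every number in it is invented for illustration; real searches run essentially this kind of test over millions of narrow channels at once, around the clock, which is where all that computer power goes.]

    import numpy as np

    # Toy narrowband-signal detection, loosely in the spirit of SETI's
    # noise analysis. Sample rate, tone frequency, and threshold are all
    # made-up illustrations, not values from any real pipeline.
    rng = np.random.default_rng(42)
    sample_rate = 10_000                    # samples per second (hypothetical)
    t = np.arange(0, 1.0, 1 / sample_rate)  # one second of "radio noise"

    # Broadband noise with a faint 1,234 Hz tone buried in it.
    recording = rng.normal(0.0, 1.0, t.size) + 0.2 * np.sin(2 * np.pi * 1234 * t)

    # A steady tone concentrates its power into a single frequency bin,
    # so it stands out in the spectrum even though it is invisible in the
    # raw waveform.
    power = np.abs(np.fft.rfft(recording)) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / sample_rate)

    # Flag any bin that rises far above the typical (median) noise power.
    candidates = freqs[power > 20 * np.median(power)]
    print("Candidate narrowband signals (Hz):", candidates)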


