
Subject: RE: Anthropomorphism is bad because...
From: Chris Despopoulos <despopoulos_chriss -at- yahoo -dot- com>
To: techwr-l -at- lists -dot- techwr-l -dot- com
Date: Fri, 25 Jun 2010 05:46:32 -0700 (PDT)

Well, we already do program anticipation. We just don't program the emotional components of anticipation. Let me put it another way. Humans and machines can both anticipate your actions. But they do it for different reasons.

Machines can recognize a pattern, evaluate the probable trajectory of that pattern, and prepare themselves (or, for systems, prepare other machines) to handle a probable state efficiently before that state arises. They do this because they are instructed to. They don't "want" to be more efficient, they don't seek approval, and they don't feel disappointment or threat when the anticipation doesn't pan out. They simply anticipate a state because they can, and because they are instructed to do so.
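To make that contrast concrete, here's a minimal sketch in Python -- the class and names are hypothetical, not from any real system -- of a machine "anticipating": it tallies which state tends to follow which, then prepares for the most probable successor. Pure instructed bookkeeping; nothing is at stake for it.

```python
from collections import Counter, defaultdict

class Anticipator:
    """Tracks which state tends to follow which, and 'prepares'
    by naming the most probable successor. No wants, no approval,
    no disappointment when the guess doesn't pan out."""

    def __init__(self):
        # transitions[s] counts how often each state followed s.
        self.transitions = defaultdict(Counter)
        self.last = None

    def observe(self, state):
        if self.last is not None:
            self.transitions[self.last][state] += 1
        self.last = state

    def anticipate(self):
        """Return the most probable next state, or None if no pattern yet."""
        follows = self.transitions.get(self.last)
        if not follows:
            return None
        return follows.most_common(1)[0][0]

a = Anticipator()
for s in ["open", "edit", "save", "open", "edit", "save", "open", "edit"]:
    a.observe(s)
print(a.anticipate())  # prints "save"
```

The point of the sketch: the machine "anticipates" only because the code tells it to, and its prediction costs it nothing either way.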

People (and other creatures) are motivated ahead of the problem. They are born with drives and instincts that they satisfy without even knowing it. These drives form the basis of their decisions. Take the drive for reproductive success (not just for offspring, but for successful offspring). The chains of decisions that explode from this drive are long, complex, and innumerable. One thing they lead you to do is recognize a pattern, evaluate the trajectory of that pattern, and prepare yourself (and possibly others) to handle a probable state. But you won't do any of that unless the situation matters to you. Ultimately, the situation has to trigger something at a low level -- something close to your drives and instincts: success at work, more sophisticated adornment, the success of loved ones, sheer pleasure, or some perverted variant of these or many other possibilities. Something has to trigger your *decision* to invest in the pattern recognition and anticipation. That investment is real -- you have something to gain or lose. This is why you equate emotions with anticipation. A machine cannot feel any of this when it anticipates.

So really, to say that programming the emotional component of anticipation will yield machines that fight to stay alive (er, turned on) is backward. The reverse is true... When the machine is invested in staying on, and learns that its actions can prolong the on state, then it will invest in anticipation. I suspect that as long as decision units are binary, that will never happen. A sentient organism includes astronomical numbers of cells that can each transmit a range of pleasure/pain signals. Averaging these values is far more complex than averaging binary cells. Of course, this kind of averaging is probably more akin to an analog computer, which can be more efficient. Here's a cool example... How do you find the balance point of a rod? You could measure the rod, analyze the material, and calculate -- weigh and measure, weigh and measure. Or you could hold your arms out, extend your index fingers, put the rod on your fingers, and then bring your fingers together. As you do that, the rod redistributes its weight so that your fingers meet at the balance point. I suspect this is analogous to how animals average out their arrays of pleasure/pain signals to arrive at decisions... more so than the digital Rube Goldberg machines we're cobbling together today.
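The finger trick even simulates nicely. Here's a rough Python sketch -- my own framing, with an assumed lever-rule model, not anything from the thread: the finger farther from the rod's center of mass carries less weight, so it has less friction and slides first, and the alternating slips walk both fingers to the balance point.

```python
def balance_point(center_of_mass, left=0.0, right=1.0, step=1e-3):
    """Simulate the two-finger rod-balancing trick on a unit rod.

    By the lever rule, the finger farther from the center of mass
    bears less of the rod's weight, so it has less friction and
    slides inward first. Alternating slips converge on the center.
    """
    while right - left > step:
        span = right - left
        # Fraction of the rod's weight carried by each finger (lever rule).
        w_left = (right - center_of_mass) / span
        w_right = (center_of_mass - left) / span
        # The lightly loaded finger is the one that slides.
        if w_left < w_right:
            left += step
        else:
            right -= step
    return (left + right) / 2

# The fingers meet near the rod's center of mass, wherever it is.
print(round(balance_point(0.7), 2))  # prints 0.7
```

Notice there's no explicit measurement of the rod anywhere -- the "answer" emerges from the feedback loop itself, which is the analog-computation point.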

