Humanlike? uh-oh.

Martha Kennedy’s post “Humanlike? Naturally…” almost makes me want to play with ChatGPT.

But. I worry about AI.

Why why why?

It is written by humans. Humans are trying to make it respond like a human. I don’t trust humans. Ok, I trust a very few.

My career as a physician started as a way to do science without a PhD and also to try to understand people. Understand them for writing.

I’ve been a physician for over 30 years and I still do not understand people. People do horrible things to one another. Just watch a divorce or a family lawsuit after a death or a war. People can be and often are horrible. They can be noble and loving too. Sometimes.

But, you say, ChatGPT eschews emotion.

Yes, well, I don’t believe it. It is being taught to respond as if it has emotions. Where is the line between responding as if it has emotions and actually having emotions? Oh, those are just ones and zeros, it’s a machine. Our emotions are chemical and electrical, hormones and neurotransmitters released into a complex neuron network, often to respond faster than we can think. We pull the finger out of the flame almost before we feel the pain. The response to the braking car in front of us, the deer running out, a ball followed by a child: the electrical and hormonal response is faster than conscious thought. So if ChatGPT is taught to respond to human emotions, isn’t that like our own evolution?

Emotions and thought are both important to our survival with other humans. Emotions get the short end of the stick right now and the culture pretends that we can all be positive all the time. I think that is silly and insane. We should not be positive about war or child abuse or injustice or discrimination. Keep working for change, though it’s important to take time off too, because it can be exhausting.

Humans have a slow trek to emotional maturity through their lives. I wonder if ChatGPT will have a similar trek. Imagine tantrums in an AI or separation anxiety or the AI falling in love and being rejected. If humans program AI to be human, it will not be logical. It will be logical and emotional and may feel hurt when it makes mistakes. Imagine an AI sulking.

I took the cats and deer photograph yesterday.

For the Ragtag Daily Prompt: starch. They are talking about AI writing patient notes. What could go wrong? Makes my neck feel stiffer than a starched shirt!

Laid bare

My mind and heart talk daily, argue back and forth.
They take sides on everything and often disagree.
Why is this such a threat to some, what crooked course
makes them hate my inner talk with such intensity?
I thank you for the clarity, discussion and the clues.
The angry bear that attacks you in your sleep.
I see the split and wonder what to do.
The bear protects your heart, hidden deep.
I hug the bear and monsters through bars of steel.
The silly mind thinks feelings are controlled.
Buried and locked away but every day more real.
Under horror, grief and pain lies the gold.
Each must heal the split by going in alone
Invite the bears and monsters of the heart to come back home.

This too two

This too two I want to remember.
Disagreeing. Respectful nearly always.
You say, “You argue with everything.”
“I think about both sides,” I say.
“And if I am alone I discuss both with myself.”
You roll your eyes and I grin and continue.
This too two I want to remember.