Thursday July 31, 2025; 10:32 AM EDT
AI should behave like a computer

"Behave like a computer. That's where we start."

ChatGPT is not a programming partner, it's a fantastic improvement over search engines. That's reality.

Having used ChatGPT and various other AI tools for over two years now, and using them in my programming work every day, I can report a basic flaw in the design of the tool. It tries to be a programming partner, and a control freak and fairly ignorant one at that. An incredible search engine, though.

Now, this approach works well for things I don't go too deep into or have no expertise in. For example, I have looked at switching phone service providers many times, but until I thought to bring the problem to ChatGPT yesterday, I was flying blind; I had no clear way to compare the services based on their inadequate marketing materials. Consumer Reports had nothing. ChatGPT was able to tell me how each of the providers worked where I live and visit. Huge improvement.

But in programming work, it tries to drive, and that wastes huge effort, because unlike me it doesn't know anything about the context the code is running in. So it's finding high-probability answers for situations nothing like mine. That doesn't work -- we end up burning huge amounts of time chasing down dead ends.

All that amounts to this very simple idea. It should accept commands like all software does, and do only what it's asked to do. It must behave like a computer.

PS: If AIs can have ethics, imho it may be unethical for one to try to be a human, but we'll save that for another day. :-)