They’re not real. In fact, they’re little more than a complicated sorting routine, even if some programmers will insist that the most advanced systems are beginning to go from Design-for-Effect to Design-for-Emulation. But that won’t stop the average individual from believing in them as if they were infallible, or at least more reliable than their fellow humans, because MPAI.
I work in computer graphics at a small company that develops its own game engine. We also have an in-house team of artists creating content for this engine. I am often tasked with taking suggestions from the artists and implementing them in the engine. The artists have no technical expertise, so I meet with them to understand their ideas and needs, and then again to explain the functionality, limitations, and so on of my solution.
Recently it has happened quite a few times that after such a meeting (where I explained that their requirements aren’t 100% achievable and offered a working alternative), I get a message from the artists with a screenshot of an ostensible, but frankly ludicrous, solution proposed by ChatGPT. They then ask why I couldn’t do what ChatGPT suggests.
I then have to take the time to explain why ChatGPT’s proposed solution wouldn’t work, which is tedious and difficult when the other person doesn’t understand many of the basic ideas involved. They also seem skeptical, and I get the impression they think I’m incompetent; ChatGPT has apparently been very useful in their own work, and they have come to treat it as the ultimate source of knowledge.
How can I, without being condescending toward either my coworkers or their use of ChatGPT, ask them not to bring me suggestions that they don’t personally understand and that are based solely on ChatGPT?
It seems that our most reliable guides to the future were Douglas Adams and Bruce Bethke, not Robert Heinlein, Isaac Asimov, or even William Gibson. Because if you’re not factoring in the sheer absurdity of human stupidity when making projections, you’re going to be completely off base.