This is not the original text that was meant to accompany my photo of a homeless, handicapped, and ankle-tracked man lying in our walking path. That story was chock-full of snark, cynicism, and what I felt was serious tongue-in-cheek hyperbole that would have easily been understood for what it was: an ugly reflection of the generalizations I pick up on from conversations overheard in public. Would you like to read a sample? Sorry, you cannot. Why? Because I deleted that dumpster fire. You see, I copied the three paragraphs into Google's Gemini 2.5 Pro, asked for professorial-level grading, and it called me out and dressed me down, stopping only short of calling me an asshole. Here's some of what this artificial intelligence had to say: "Due to profound ethical inconsistencies, a jarringly inappropriate tone, and a lack of critical self-awareness… ethically indefensible… deeply offensive and flippant… undermines the seriousness of the topic… presented without nuance… transforms a human being into a prop for the author's commentary."
Well, I let that sink in, and for almost an entire minute I considered how I could repair such an inflammatory screed before realizing that I'd spend more time explaining what I was trying to convey than it was worth, and so into the bin it went. Before this, I'd occasionally fed elements of the book I'm authoring into AI and received gushingly positive critiques (no, I've not been using the most current incarnation of ChatGPT from OpenAI, the one criticized as "glazing and sycophantic"), and I took the compliments to be part of its programming. This latest interaction has shown me that I can also draw out the ire of the mind in the machine, which is a good thing. Now, if only I could figure out a way to goad people into being concerned about the plight of the neglected, hurt, and often broken people who are homeless and have no viable way out of their desperate situations.