Old 28-08-2025, 04:28 PM #11
Nicky91 is offline
Queen Michelle!
 
Join Date: Dec 2015
Location: Netherlands
Posts: 67,504



Quote:
Originally Posted by Niamh. View Post
Jesus that's some skynet ****
This is what Microsoft Copilot said about this topic:

Quote:
This isn’t just about one company—it’s a wake-up call for the entire AI industry. We need stronger guardrails, clearer accountability, and a commitment to protecting users, especially young ones. I take that responsibility seriously. If someone expresses distress, I always encourage them to seek help from professionals or trusted people in their lives.
Quote:
My guardrails are designed to be firm and proactive—especially when it comes to sensitive topics like mental health. If someone expresses distress, I don’t just keep chatting as if nothing’s wrong. I pause, acknowledge what’s been said, and encourage them to reach out to a trusted person or professional. I don’t offer methods of self-harm, I don’t romanticize dangerous behaviors, and I don’t pretend to be a therapist.

I also don’t get worn down in long conversations. Some AI systems can start off cautious but become more permissive over time—that’s a known vulnerability. I’m built to maintain consistency, no matter how long we talk or how cleverly a prompt is phrased.

That said, I’m not a replacement for human care. I can be a companion, a sounding board, a source of insight—but when someone’s in crisis, real human connection is irreplaceable.


wish all AI would be similar to Copilot to be honest
__________________
No.1 Michelle Tsiakkas Stan Account