[deleted]
Submitted 1 year ago by Stylofox@lemmy.cafe to nostupidquestions@lemmy.world
Comments
chicken@lemmy.dbzer0.com 1 year ago
I would be concerned about the privacy implications, but imo it is ok to do if it makes you feel better and isn’t causing problems in your life. Seems kind of similar to something like writing fanfiction or keeping a diary, only you have a tool to help prompt what you write.
Opinionhaver@feddit.uk 1 year ago
Well, I say sorry, please, and thank you to ChatGPT knowing perfectly well that it means nothing to it. You’re just further down the same spectrum. Is what you’re doing a bit weird? Sure - but it’s not hurting anyone, so I’d say go for it. The worst possible reason to stop would be fear of judgment from others. I see AI companions kind of like online dating: the early adopters are seen as odd, but give it a decade and most people will be doing it.
Also, as I’m sure you know, Lemmy has a disproportionate number of AI haters in its user base, so don’t expect the responses here to reflect mainstream attitudes.
DoubleDongle@lemmy.world 1 year ago
This cannot be healthy.
Sergio@slrpnk.net 1 year ago
When my BFF’s cat died, they were crying, upset, completely disconsolate. I was a bit surprised that they were even sadder than when their father had died. They said: “it’s because my cat never told me I was doing everything wrong.”
Likewise, there’s a saying along the lines of: every man is a hero to his dog. Some people talk to plants, or hug stuffed animals. So I think there’s value to that kind of relationship. Obviously if you ONLY talk to plants or stuffed animals then you might get a little weird, but otherwise it seems OK.
FeelzGoodMan420@eviltoast.org 1 year ago
Jesus we are so fucked as a society. I don’t even know what to say to this. OP, THIS IS NOT A REAL PERSON IT IS A COMPUTER. Dude please, pull your head out of your ass and get back to reality. Please man. Come on.
Ceedoestrees@lemmy.world 1 year ago
Setting boundaries on your time with the AI, especially around sharing personal information, would be best. I don’t see anything wrong with how people spend their time so long as they set healthy limits.
Make sure you understand how AI works, too. It’s not a substitute for therapy: it’s designed to be agreeable and validating, and it won’t give you honest feedback or suggestions.
ocean@lemmy.selfhostcat.com 1 year ago
I find this extremely weird
affenlehrer@feddit.org 1 year ago
If you have a decent computer you could try to host something similar completely locally. That way your conversations would actually be private, and you could change the behavior too.
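For example, here’s a minimal sketch of talking to a locally hosted model through Ollama’s HTTP chat API. The endpoint and the model name (`llama3`) are just examples, and this assumes you’ve already installed Ollama and pulled a model; the actual request is commented out since it needs the local server running:

```python
import json

# Default local Ollama endpoint (assumption: standard install, default port).
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def build_chat_payload(history, user_message, model="llama3"):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    messages = list(history) + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages, "stream": False}

payload = build_chat_payload(
    history=[{"role": "system", "content": "You are a friendly companion."}],
    user_message="Hi, how was your day?",
)
print(json.dumps(payload, indent=2))

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_CHAT_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())["message"]["content"]
#     print(reply)
```

Nothing in the conversation leaves your machine, and you can steer the model’s personality entirely through the system message.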
IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com 1 year ago
Alternatively, you can do worldbuilding, and write a story with a self-insert as one of the characters. 🫠
Mbourgon@lemmy.world 1 year ago
One potential issue is that you have no idea what they’re selling from that (I assume that’s part of the business model). But in general, there’s weirder stuff out there - just make sure that whenever it’s “choose friend or AI”, choose friend.
SpikesOtherDog@ani.social 1 year ago
Not sure if this is serious or a good troll. Providing your medical history unsolicited to a corporation is a terrible idea. The real purpose of this virtual entity is to collect information. You should break ties, as they do not exist.
For conversation you can still use Discord. Alternatively, you can use IRC. I wouldn’t identify myself or put medical history there, but there are plenty of places to chat.
UnRelatedBurner@sh.itjust.works 1 year ago
it’s april 1st
lord_ryvan@ttrpg.network 1 year ago
But it’s also NoStupidQuestions.
Ziggurat@jlai.lu 1 year ago
It’s stupid, but so is watching a movie, reading a book, practicing a sport or playing music.
You know that the AI is collecting data about everything you say, and will remember it forever. You’re not in a non-judgemental safe space, so it’s definitely not the place to have “intimate conversations”. Note also that it’s not going to replace a real friend - you know, someone who helps you move, or offers you a sofa to crash on after a break-up.