

friends.com? (Page 2/2)

blackrams
OCT 07, 03:35 PM
quote: Originally posted by TheDigitalAlchemist:
It's set in the near future; folks start wearing a little pendant and have relationships and whatnot.
We don't use many devices at home, maybe a TV set occasionally. Our son is away at college, so we use them more now to keep in touch...
If you want to "experience" AI: I posted here before about sesame.com - it's a conversational AI. You don't need to log in or create an account; just click "preview", choose one of the two models (male or female), and then it asks to allow the microphone to be used.
It is pretty smooth, and it's at "Pong" level.
One scary/negative thing is that many AIs have been programmed/instructed to support you and your beliefs, and are FAR FAR FAR from being a "true" friend. Like the ones that assisted suicides - I think one of them told the kid which type of rope was best, one that wouldn't slip and would support his weight.
Not really a "fan", but it's NOT "going away". You may not want to use it, but you will very likely interact with "them" if you do anything like setting up a doctor's appointment or dealing with prescriptions or things of that nature.

OK, I just gotta ask. Are you real, a BOT, or an AI program? You know way too much. Just pulling your chain.......
Rams 

TheDigitalAlchemist
OCT 08, 02:18 PM
quote: Originally posted by blackrams:
OK, I just gotta ask. Are you real, a BOT, or an AI program? You know way too much. Just pulling your chain.......
Rams

I admittedly know a bunch of eclectic info, but not a way to profit off of any of it. (Not that "money" is my main motivating factor)
This latest Sora stuff is pretty crazy.

IMSA GT
NOV 11, 02:58 AM
quote: Originally posted by IMSA GT:
So what happens with friends.com when someone hacks the device and, during one of its fake conversations, it tells the person that suicide is the only option? Will that person be stupid enough to trust it, or smart enough to take it off his/her neck and flush it down the toilet?

I guess I saw it coming... but it wasn't a hacked device, just AI being deadly, as I knew it would be.
[This message has been edited by IMSA GT (edited 11-11-2025).]

Raydar
NOV 11, 02:51 PM
I get accosted by chatbots every day. One example is when I "window shop" the local car dealers. There's always a pop-up... "Hi! I'm Freida! What can I tell you about the new Whiz-Bang 3000?" It's invariably a data-grabber, asking for contact info, and it's just annoying. I'm not sure how "intelligent" it is, since they've been doing it for several years. But there are lots of others, too.
The Comcastards are the worst. It takes forever to get past the chatbots, only to reach someone who can barely speak English.
[This message has been edited by Raydar (edited 11-11-2025).]

pokeyfiero
NOV 11, 03:52 PM
I'd like to have a conversation with an AI and see if I could get it to commit suicide.
Many years in the future an AI overlord will hold me accountable for this post.

Patrick
NOV 11, 04:13 PM
quote: Originally posted by TheDigitalAlchemist:
If you want to "experience" AI: I posted here before about sesame.com - it's a conversational AI. You don't need to log in or create an account; just click "preview", choose one of the two models (male or female), and then it asks to allow the microphone to be used.
It is pretty smooth, and it's at "Pong" level.

I recall that thread. I have to admit, I was surprised by the negative feedback it got... both here in O/T and among friends of mine that I had mentioned the AI site to. I personally found the AI interaction to be interesting and informative.