Ha ha, no.
Google proposes Project Ellmann, a chatbot that intimately knows you
Submitted 11 months ago by throws_lemy@lemmy.nz to technology@lemmy.world
Comments
Squeak@lemmy.world 11 months ago
Sanctus@lemmy.world 11 months ago
Oh don't worry, we already have the data. This is just a formal announcement. Your new bot will arrive in 5 days.
andrew@lemmy.stuart.fun 11 months ago
The bot has already existed in a different form for years. Instead of you talking to it they asked it which ads are most effective to show you specifically.
Google is not just getting into ML. They’ve been at the bleeding edge for decades.
DampSquid@feddit.uk 11 months ago
Google has announced the closure of Project Ellmann, ending minutes of speculation
cheese_greater@lemmy.world 11 months ago
You’re just being cute, right?
reflex@kbin.social 11 months ago
Project Adman.
Fisk400@feddit.nu 11 months ago
Google can’t even keep a podcast service going. I certainly wouldn’t trust them with a little buddy that I care about.
800XL@lemmy.world 11 months ago
Joke's on us tho. Google is going ahead with this, it's just never going to be made available for public use. It's only for figuring out exactly what we'll buy.
treefrog@lemm.ee 11 months ago
Why have a bot just figure out what you want when you can also have it do direct marketing?
This announcement wasn’t for consumers, but advertisers.
KairuByte@lemmy.dbzer0.com 11 months ago
Nope.
Reality_Suit@lemmy.one 11 months ago
Mmmmm, how intimate? Will it know…everything? blushes
TheBlue22@lemmy.blahaj.zone 11 months ago
Yeah, I’d rather not, piss off.
isVeryLoud@lemmy.ca 11 months ago
I propose that we do not.
Caligvla@lemmy.dbzer0.com 11 months ago
No thanks.
Paragone@lemmy.world 11 months ago
The right to NONassociation should always outrank the right to association.
Molesters may claim the right to closely-associate, but the right-to-be-not-molested should outrank their association-right.
Nonassociation needs to be a fundamental right.
In multiple contexts.
Abusees who want no-contact to have teeth,
molester-survivors,
etc.
Including identity-molestation/theft, and other abuses of one’s personal information.
_ /\ _
RanchOnPancakes@lemmy.world 11 months ago
paaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaass
ArugulaZ@kbin.social 11 months ago
A chatbot that needs to mind its own damn business, I say.
paraphrand@lemmy.world 11 months ago
I’ve literally dreamed of having such a boy since the late 90s. But decades of following the tech industry since have shown me how I might not want that after all…
shea@lemmy.blahaj.zone 11 months ago
You probably meant bot (and not boy) but it sure made a funny mental image. I’m imagining a little robo Pinocchio type boy
GravityAce@lemmy.ca 11 months ago
Google can’t even serve me ads in the right language right now so… doubt this chatbot thing is going to work
Aatube@kbin.social 11 months ago
And it's named freaking "Ellmann"?
Brunbrun6766@lemmy.world 11 months ago
Bring it on, I’ll make it hate me
Whiskeyomega@kbin.social 11 months ago
Sounds like my therapist....
Destraight@lemm.ee 11 months ago
I’m not going to use it. Just like I don’t use that stupid AI Snapchat added
eighthourlunch@kbin.social 11 months ago
According to the people who know me intimately, the AI is gonna nope out even harder than I am.
Fades@lemmy.world 11 months ago
…… and they’ll sell or otherwise profit off of every goddamn byte of that data
canis_majoris@lemmy.ca 11 months ago
If Google actually made me a hentai waifu bot that would be based.
Kemwer@lemmy.world 11 months ago
Does it sound like Scarlett Johansson? If not, no.
sour@kbin.social 11 months ago
cute laptop always better
simple@lemm.ee 11 months ago
I’m impressed someone thought of that, wrote a presentation, rehearsed it, then presented it, and at no point thought it sounded creepy and invasive.
AtmaJnana@lemmy.world 11 months ago
It sounds like exactly what I would want, if it were open source, audited, and under my direct control.
www.w3.org/DesignIssues//Charlie.html
webghost0101@sopuli.xyz 11 months ago
It sounds like exactly what I have been saying is the future of human growth.
AI companions that are like a butler, best friend, therapist, mailperson, accountant, and lawyer all in one.
Your AI talks to their AI; before you ever meet, they each return a baseline of info, a conversational opener, and suggestions for a meeting date/location.
And absolutely yes on the open source under my direct control, cause holy shit, end of the world if it is not.
kromem@lemmy.world 11 months ago
I really don’t get users.
Google already has the capacity to do this level of analysis on the data you gave them to host, for their own private internal purposes.
But we should reject the opportunity to have that aggregate picture of our data turned back over to ourselves to make the most of what’s already the case?
This really reminds me of the saying “nothing about the situation has changed, only your information about the situation has changed.”
treefrog@lemm.ee 11 months ago
They’re marketing this as a personal salesbot to advertisers now. That’s what changed.
webghost0101@sopuli.xyz 11 months ago
Even if Google has your data, up till now there was not much brain muscle to properly analyze it into a realistic and detailed collection of intelligent knowledge. Just some cheap tricks like daily patterns.
An AI could potentially use the same data to learn things about you that you yourself do not know. It's not our information that has changed but Google's ability to harvest additional information from the data they already have.
I don't use Google services myself, but this should alarm people that do. The information they have provided is much more powerful than what was anticipated years ago.
inspxtr@lemmy.world 11 months ago
suggests either these people are so detached from reality, or they are appealing to a very specific set of people under the guise of a general appeal
sour@kbin.social 11 months ago
not even family members know everything about life
is private