A few weeks ago, I explicitly disabled it everywhere I could find within account.google.com and I’ve not used anything Gemini since. Now I get this email and find it enabled on my devices.
By default Gemini has permissions to access everything on screen, view your contacts/messages, and can be used from the lockscreen…
I’m not all that surprised, but I’m still annoyed, especially with having to opt out of data collection/access after it’s already been given access to everything.
on my devices
See, that’s where we’re fucking up. If a company can install things on “your” devices, it’s really their devices.
We don’t really own anything
These are pocket computers that turn into paperweights the moment our feudal lord Google decides to…
Google “SafetyCore” says hello.
I strongly suspect the Gemini team is trying to artificially inflate its usage statistics by opting people in without their consent and by requiring you to replace Assistant with Gemini in order to access popular tools like image generation.
It’s likely that in a few months they’ll get exposed, probably not even get a slap on the wrist for it, and go back to their shady business practices.
Sounds about right.
It may inflate the numbers for now, but overall I think it’s just going to have a negative impact. I may have decided to play with it at some point; I’m just somewhere between disinterested and distrusting of the tech. Now that it’s been shoved down my throat though, I adamantly REFUSE to touch Gemini. Installing it on my devices and giving it all my data without consent is absurd overreach and should be a felony.
I have the assistant disabled entirely because I find its gesture annoying. Gemini force-enabled it again a couple of days ago.
I tried Gemini before. The problem was reliability.
E.g., I say “add an event to my calendar: tomorrow 2 p.m., doctor visit”, and it parses the voice perfectly. The regular assistant would then just create the entry.
But Gemini sometimes goes like: “Adding an entry to your calendar is easy! Do you have an iPhone? Then these are the steps: …”
This is why LLMs are not appropriate for applications where regular syntax is needed, like calling an API. I don’t understand why they made the LLM the first step, handing off to the old hard-coded Google Assistant second, rather than the other way around. Having everything go through the LLM first is wasteful, slow, and unpredictable. I am very confused about Google’s decision here.
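A minimal sketch of the ordering being suggested here, assuming a toy regex-based intent parser and a placeholder `ask_llm` function (neither is any real Google API): known command syntax is handled deterministically first, and only unrecognized requests fall through to the model.

```python
import re

# Hypothetical routing sketch: deterministic intent parsing first,
# LLM fallback second. All names here are illustrative assumptions.

CALENDAR_PATTERN = re.compile(
    r"add an event to my calendar:\s*(?P<when>.+?),\s*(?P<title>.+)",
    re.IGNORECASE,
)


def ask_llm(text: str) -> str:
    # Placeholder for a real model call.
    return f"[LLM answer to: {text!r}]"


def handle_utterance(text: str) -> str:
    match = CALENDAR_PATTERN.match(text)
    if match:
        # Known syntax: predictable, fast, no model involved.
        return f"Created calendar entry '{match['title']}' at {match['when']}"
    # Open-ended request: hand it to the LLM.
    return ask_llm(text)


print(handle_utterance("Add an event to my calendar: tomorrow 2 p.m., doctor visit"))
print(handle_utterance("What's the tallest mountain in Europe?"))
```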
The funny part is when you say something and it corrects you
Or that weird thing ChatGPT was doing where it screamed “no” in your own voice before replying like nothing happened.
Edit: it was “no” not “help”
Or that weird thing ChatGPT was doing where it screamed “help” in your own voice before replying like nothing happened.
Wait what the fuck
It is vital that the ~~peasants~~ consumers use our AI to keep the ~~Lords~~ Shareholders happy.
When’s Google gonna put Gemini on the Google Home/Nest devices? I’d switch to ’em if they end up getting Gemini on them; the amount of times I ask my Echo Dot a simple question that it can’t answer but Gemini can is ridiculous.