If the HomePod Mini is in the room that I am in, it usually recognizes my command. For example, in the Master Bedroom, I say “hey Siri, turn on/off lights” and it does that for the Master Bedroom.
However, the HomePod Mini (the one I used above, my fault for not being clear, sorry) is in the Family Room (it’s one of those great rooms shared with the Kitchen). In the Kitchen, I have 4 different lights with their own switches:
Kitchen Main Light (lights up the main part of the kitchen)
Kitchen Sink Light (lights up just the above sink lights only)
Kitchen Table (lights up the light above the table)
Kitchen Cabinets (the string light under cabinet lighting)
This is where the problem sets in. Siri has trouble telling “Kitchen Main Light” from “Kitchen Sink Light,” and when I say “turn on Cabinets,” Siri replies back with “turn off Cabinets.” Unless I think of different names?
Sadly, you are correct. I’ve used Siri since it was a standalone app. In that time it has lost and gained abilities, and now has lost some again. And I do have something to compare it to.
My other assistant will respond to voice requests for information with a voice response, not “Here’s what I found on the web” and info on a screen that may be in my pocket. And my other assistant can continue to listen for 8 seconds for a follow-up question, then respond to two additional questions if needed.
Siri was the first digital assistant and it has been ten years since Apple acquired “her”. IMO, they squandered the lead this technology gave them and show no evidence that it will ever be more than a “me too” product.
That’s true. I had a shortcut that allowed me to say “Hey Siri, OK Google” and use the Google Assistant. But iOS 15 appears to have disabled it. Now I have to unlock my phone and swipe right to use the Assistant widget. On the plus side, the HomePod mini, with the Thread protocol, allows me to use my legacy HomePod, etc. to control my Nanoleaf lights.
But, IMO, an on-device assistant like Siri will never be able to match a cloud-based assistant. If it could, why would Apple choose to keep falling further behind Google and Amazon?
Now Hey Siri on my iPhone 11 responds about 1 time out of 10, and dictation is slow to start and cuts off after a few seconds. Hey Siri still works on my iPad Pro, but dictation sucks on it as well.
Siri is working well for me. I have rooms with lights and accessories and HomePods, in various configurations. Some with HomePods, some without.
“Hey Siri, turn on the kitchen” turns on the kitchen light from anywhere.
“Hey Siri, turn on the office sound” turns on a Wemo with a sound machine plugged in.
Walking into any room with a HomePod and saying, “hey Siri, lights” will toggle the lights.
“Hey Siri, turn off the office” turns off the office lights and sound machine outlet.
“Hey Siri, turn on outside” turns on lights assigned to the “outside” room.
It’s all very intuitive.
That’s not entirely true. As a French (Canada) speaker, I don’t have access to the “private” on-device functions of Siri, meaning that I have to use it cloud-based. I can assure you that it doesn’t make any difference. Compared to Alexa and Google Assistant, Siri is… well, an idiot.
While “idiot” may be a bit harsh, I agree that Siri is a distant third in the digital assistant race. And based on my recent experience, it may even be falling further behind.
People I respect say Alexa is much better than Siri but I have never used it. I do know that the Google Assistant is far more likely to understand my commands and requests, and answer a spoken request with a spoken answer. And Siri is more likely to answer by displaying data on a screen that may be in my pocket or the center console of my car. Or inform me I have to unlock my phone.
Siri was born brilliant, but since being “adopted” by Apple she hasn’t been given the attention she deserves. I don’t know if an on-device assistant can ever match a cloud-based one. So far there is no evidence that it can. In any event, Apple is also playing catch-up in cloud services.
And while I know the Google Assistant is starting to do more work on-device I suspect it is more to counter Apple’s privacy marketing campaign than for any technical advantage. Sadly, Siri remains a disappointment.
The thing I’d like to see with Siri, since they’re committed to not doing “big data” aggregation, is some mechanism for training it on-device, and perhaps a way of voluntarily submitting the “training” to Apple for review.
Say Siri messes up in some obvious way. Allow me to, perhaps using the keyboard, tell it what I said. Would this be inconvenient in the moment? Potentially. But if I had the time, I’d have a way to at least make my Siri better.
Just like how in map software, I’d LOVE to have a way to say, as I’m driving, “that turn isn’t available right now - navigate me a different way.”
Because what happens now is I drive past the turn I can’t make. The GPS starts insisting that I turn around. I’m in the middle of busy city traffic, and I have three options - follow its directions and once again attempt to make a turn where the road is physically blockaded, turn it off and guess on my own, or pull over into a parking lot and manually try to figure out a different route. Which isn’t too horrible, UNTIL the GPS - while I’m driving on my new route - says “I’ve found you a faster route”…and proceeds to reroute me down the previous route that’s physically closed off.
Some general way for a user to correct issues - either on-the-fly or via a dedicated training effort - would be huge.
I won’t pretend to know what would be needed to accomplish that. I managed telephone systems for a manufacturing company and later for a call center, and wrote scripts for routing calls and building auto attendants (Press or Say 1 for Sales, etc.)
Digital assistants are a whole new ballgame. I think I would need training just to know how to submit useful data to Apple.
For local training, a system similar to dictation correction (blue squiggly lines) would work for me. I’m guessing the cool visualized Siri response isn’t allowed to be sullied by any hint of text editing, though.
As far as submitting those corrections to Apple, I’d guess too few people would do it to matter compared to the fruit available to pick in their huge volume of training and differential-privacy data.
Yeah, I don’t know what would be needed to accomplish that - but we’re talking about a company that can look at collections of pixels and say “that’s a cat”, and that has some pretty extensive UI experience.
It’s absolutely a Hard Problem. But Apple has tons of money, and a history of solving hard problems.
That wouldn’t shock me, although there are definitely some inklings that Apple’s “design over function” mentality may be yielding a bit. I would think that improving Siri has to be high on their list of things to consider, unless they’re willing to give up on the assistant game entirely.