Hacker News

Hopefully this is a signal that Apple is going to take steps to improve Siri. If Apple can find a way to get the best of both worlds from data privacy and machine learning then it will have been well worth the wait.


I am really annoyed by Siri these days. I was so excited when they introduced it in 2011 (?), but I feel like there has been almost no progress since then. It’s so painful to use Siri for even the simplest of tasks.


I don’t use Siri enough to have a personal opinion, but I read this article: someone tested 800 questions (on a HomePod, not an iPhone); earlier this year Siri got 52% of the responses correct, and now it gets 74% right. From that it looks like it’s improving quite rapidly.

See this comparison of Siri, Alexa, Google Assistant, and Cortana: https://www.macrumors.com/2018/12/20/siri-on-homepod-vs-alex...


>From that it looks like it’s improving quite rapidly.

I fear the challenge will be one of perception. Speech is a make-or-break, first-impression kind of technology, since speech is so personal. I'm afraid that Siri will need to become twice as good as the competition to regain users' trust that it won't completely fail them, or even just to get them to retry it.

I wonder if, in the future, there will be some sort of marketing push called "Siri 2.0", or maybe even regular "releases" (even though I'm sure it's not updated on that kind of cadence). Since the technology is all invisible, there's no way to tell that something has changed, unlike a traditional OS, whose appearance changes at least slightly with each release.


Maybe. I don’t have any comparison. And I haven’t used Siri for a long time. But I started again 1-2 weeks ago and I’m super annoyed because it’s such a pain.


Totally agree. I'm on my first iOS device (an XR) since the first Android devices arrived. The speech recognition on Android is FAR better; it would get it right 99% of the time. Siri is so bad I've considered starting a Twitter feed of "sh Siri thinks I said..."

Oddly, even speech to text with GBoard on the iPhone is noticeably worse than on Android as well.


There's a subreddit that captures your idea: https://www.reddit.com/r/SiriFail/.


I've been using iPhones for some number of generations now, but the autocorrect on iOS is terrible compared with Android's from 5+ years ago.


I would suggest trying Shortcuts, because Shortcuts are so phenomenally easy to create, share, and modify that it blows my mind.

It took me all of 3 minutes to create a Shortcut that lets me message my wife with my ETA at my house when I say "hey siri coming home" (or use the Siri activation in my car) and then "yes" when she asks for confirmation to send it. Takes my current location, finds the route to my house, grabs the time, and plops it into a custom message that I typed up that's sent to my wife's iPhone.
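The flow of that shortcut is essentially a small pipeline: current location → route home → ETA → templated message. As a purely illustrative sketch (this is not Shortcuts code and these helpers don't exist in any Apple API; only the templating step is shown, with a hypothetical function name and message text):

```python
from datetime import datetime, timedelta

def eta_message(minutes_away: int, now: datetime) -> str:
    """Build the kind of templated text the shortcut sends.

    In the real Shortcut, `minutes_away` would come from a
    'Get Travel Time' action and `now` from 'Current Date'.
    """
    eta = now + timedelta(minutes=minutes_away)
    return f"On my way home, should arrive around {eta.strftime('%H:%M')}."

# 25 minutes away at 17:30
print(eta_message(25, datetime(2019, 6, 1, 17, 30)))
```

The nice part of Shortcuts is that each action's output feeds the next one in exactly this way, so the whole chain takes minutes to wire up in the GUI.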


Shortcuts is not quite the same as Siri.


After so many years Siri still doesn’t do “take me to the nearest McDonald’s” like Google Assistant can. And whenever Siri isn’t “confident” it just launches Safari, but it’s often not even the best starting point, and Siri sometimes gives up silently without feedback.

Also for a bit of fun, ask Siri for the population of Buffalo NY. I noticed because Siri also tells you that Buffalo is big, and then goes on to say...


I just tried “take me to the nearest McDonalds” and it replied “which local business? Tap the one you want” with a list from nearest to farthest, and when I tap one, it gives me directions to the one I chose.

As far as the second question goes - yeah, that one was weird. It said that the population was 69, but the text summary it displayed was 269,000. It’s the only city that displayed that bug.


Did you see how Google Assistant handles the nearest-McDonald’s request? Google doesn’t ask you “which local business? Tap the one you want.”


I'd rather have the list than a single choice. That way I can pick one that isn't in a crappy neighborhood, or is on the way to a place I'm going anyway.

My biggest frustration with my car's built-in navigation is that when I ask it for the "nearest" item, more than 50% of the time it tells me to make a U-turn because it picked the mathematically nearest item, rather than picking the conventionally nearest item.


I actually think if you said "closest" or "nearest" and Siri just gives you a list, then Siri has missed out on useful information. You can also say "Nearby McDonald's" to Google and get a list, versus "Go to the nearest McDonalds" and you'll get navigation.

There are all sorts of ways to handle this request, but Siri's is the laziest, and in doing so asks more of the user's attention. If you just say "McDonalds" to either voice assistant, you also get a list. From that perspective, it's as if Siri ignores any information that might be gleaned from the rest of your sentence.

Siri learns from us, but we also learn from Siri, and we might find that some words don't matter. You might as well just say "McDonalds" if Siri is going to ignore the rest.


It understands “take me to the closest...” and doesn’t give you a list, but “take me to the nearest...” does.

Reminds me of some of the chatbots I used to make. No matter how hard I tried to test them, they always worked as expected. The minute someone else used one, it fell apart. Subconsciously, I knew what would work and what wouldn’t.


Yeah, sure, you'd rather have a list. Why not have it return a list of answers for every question that has a definite answer like this, then? Options are better.


But that’s more of an interface choice than a technical limitation. Siri both understood the question and had the ability to navigate to it.

In my case, there were 9 McDonalds within a 10 mile radius.

In cases where there was only one location nearby, it took me right to it.


I think you give Siri too much of the benefit of the doubt when you say that Siri understands your request. You may be correct, but maybe Siri just looks for a location like "McDonalds" and most of the time just shows you a list? It's easier when you're ignoring the rest of the sentence as if it has no relevance to improving your response.

It's very hard for us to discuss Siri's internal state; by contrast it's easier to discuss Google's observable performance, which is to semantically differentiate between these two requests.

You can just ask Google, "Nearby McDonalds" and you'll get a list. "Go to the nearest McDonalds" and you get navigation.


“Take me to the nearest McDonalds” - gives me a list.

“McDonalds” - gives me a list.

“Take me to the closest McDonalds” - brings up maps and starts navigating to the closest McDonalds.

After further experimentation: Siri doesn’t understand “nearest” but does understand “closest” to mean that I don’t want a list.


If you said “take me to” and not “show me a list”, then it actually did the wrong thing by showing you a list. I suppose “not working” is technically an interface choice, though.


"what is the population of buffalo, new york" (that's what i told siri) shows me London?


> Also for a bit of fun, ask Siri for the population of Buffalo NY.

Nice


Someone at Apple may be reading this thread, because it returned the answer normally for me.


Yeah, they must have—this afternoon it told me the population of Buffalo, New York was 69. I’ve got a window, I was pretty sure that’s not correct.


The worst part for me is that it opens results in Safari and not Chrome (as well as that it opens directions in Apple Maps rather than Google Maps).


Isn’t it still the case that “third party” browsers on iOS are just wrappers around Safari/WebKit anyway?

For the longest time Chrome on iOS was just using WebKit/Safari’s UIWebView, then later WKWebView, for rendering webpages, much like many other iOS apps that display web content. For various reasons the App Store rules have always banned third-party browser rendering engines, and I haven’t heard of any recent change in this policy.

The only real advantage of Chrome on iOS was ancillary features like Google-account bookmark/history sync, etc., if you are all-in on Chrome elsewhere, which isn’t all that useful in the context of a link provided by Siri, for me at any rate. And the feature that lets apps access your password/autofill data only works with Safari on iOS, which is all the reason I need not to bother with the WebKit-wrapper rivals anyway.

The maps issue is significantly more annoying to me.


I use Siri to turn Hue lights on and off (sometimes), and sometimes to set reminders. I do feel that since I don’t have Siri on an "always on" speaker, it becomes less useful. I dislike the idea of having an "always on" speaker, though, so I'm not sure what to do. :)


If it makes you feel better, the “always on” recordings never leave your device. They’re not even that great at recognizing anything other than “this sounds like a ‘hey Siri’”.


For a fun experiment, yell RED FIRE TRUCK into your phone then immediately activate Siri manually.

With Hey Siri turned on, you will reliably get all three words. With it off, you will usually get Chuck or Truck, or sometimes FIRETRUCK depending on timing.

This seems to be for people who start talking before the Siri prompt is fully active, like “Set <button> Timer for 5 minutes”.

But it shows that Apple is typically pre-recording, if not also pre-processing, audio.


I don’t get anything; it just activates Siri normally.


I’m on an iPhone 8 and can get it to happen reliably. Weird.


How old is your iPhone? I don’t recall the first model to introduce it, but I believe all iPhones have supported an “always on” Hey Siri mode for a while now, even when the phone is in standby.


Yup, the first iPhone to introduce this was the iPhone 6s.


Oh, well maybe it is on then...


Honestly, voice control is a gimmick to me. I'd much prefer Apple make exactly zero compromises on privacy and let Siri languish.


Especially because the ecosystem that Siri can tap into is so nice, with HomeKit devices and Shortcuts.



