Apple’s Siri is a beloved virtual assistant, but it’s no secret that it needs work. Despite being the first virtual assistant featured on a smartphone, it has fallen behind the competition. That’s why today we’re discussing 6 different ways Apple can improve Siri. Read on to learn more.
When Siri first hit the scene for iPhones in 2011, it was a fascinating thing to behold. We all knew it was a gimmicky feature, at least at launch.
However, despite its limitations, it was easy to see that the potential for something more significant was there. If, of course, Apple chose to embrace it.
In 2012, Google released its own virtual assistant, Google Now, promising a myriad of future capabilities. While fans of Siri naturally expected Apple to compete aggressively with Google, Apple dropped the ball instead.
By the time Google announced a reimagining of its Google Now service, now known as Google Assistant, it was 2016, and Siri lagged far behind. It was a sad moment for Siri, which should have been leading the competition. It was just as sad for iOS users.
Still, Apple has since come a long way in improving Siri’s capabilities. Yet it still hasn’t caught up to the point of breaking even with Google Assistant.
While we do appreciate the work Apple’s put in over the last couple of years, there’s still so much more that needs to be done. That’s why today, we want to explore some of the ways we think Apple can improve its Siri virtual assistant. Let’s get started!
1. Bring a continued conversation mode to improve the flow
One of the best features introduced by Google Assistant is its Continued Conversation mode. Enabling this mode lets the assistant listen after it finishes a task (like providing information or performing an action).
This way, if you have a follow-up question, you can ask it without saying “OK Google” again. You can chain questions or comments together more fluidly and naturally.
Siri, on the other hand, barely offers this feature. It applies it occasionally, during certain specific actions, but not nearly often enough to be useful.
And why not? After all, it isn’t anything new. Google’s been doing it for a long while at this point.
It may seem like a little thing, but anyone who uses both virtual assistants knows it has an enormous impact. It’s hands down the first way Apple can improve Siri.
2. Provide more helpful in-app results and fewer webpage redirects
You ask Siri a question. Instead of getting a verbal summary along with an in-app result, you’re redirected to an external webpage along with an “I found this” response from Siri.
Chances are, you encounter this frequently, and you already know how frustrating it is. In the time it takes to ask a question, look at your phone, and open and read a webpage, you could have just Googled it yourself. What’s the point, Apple?
Users want and will appreciate more responses that are quick and helpful: verbal answers and in-app results we can see and hear without opening a browser or visiting a website.
3. Implement smarter interpretation of variations in users’ phrasing
Here’s another issue many iPhone users have probably noticed: using 2 different yet similar voice commands and getting different results. For example, say you ask a question, and Siri says it doesn’t understand. You might ask the same question again, only worded slightly differently. Suddenly, it has an answer.
While it’s understandable that certain phrases might differ enough for Siri to read them two different ways, there are plenty of times when different wording carries the same meaning. In those cases, the commands should produce the same results.
4. Enhance contextual understanding of user follow-up comments
One of the most significant changes ever made to Google Assistant is its inclusion of contextual understanding. This allows it to understand that when you ask a question like “What’s the weather like today?” a follow-up question such as “What about tomorrow?” means you’re still referring to the weather.
Apple added this feature to Siri not all that long ago, and it certainly was an improvement. However, once again, it’s as though Apple began implementing a new capability and lost interest shortly after.
The reality is that better contextual understanding means more natural conversation with Siri and more accurate responses. Combine that with a continued conversation mode, and you’ll be able to string together a brief dialogue far more effectively.
5. Bring offline support for basic tasks that don’t require internet
Here’s another way Apple can improve Siri: go somewhere with no signal and you’ll quickly notice there’s little, if anything, you can do with Siri. It’s so tethered to the internet that you can’t even ask it to do things that shouldn’t require a connection at all.
For example, dictating a note via voice command? Not happening. Apparently, the all-powerful voice assistant backed by an A15 chip can’t beat a pen and pad by itself. Um, what?
6. Expand Siri’s personality and add new ways to interact with it
This request is a little more frivolous than helpful, but it’s no less beneficial for Siri’s most devoted fans, who are likely also its power users. It’s hard not to beat a dead horse (as the expression goes) regarding the differences between Google Assistant and Siri.
Yes, Siri is known for its sassy and sometimes aloof persona, yet Google Assistant still comes out on top as the more personable of the two. I’m personally impressed by how often I ask Google Assistant a question or make a passing comment, only to get a quirky (if not surprising) response.
It wasn’t always like this, and that’s the point: Google puts in the hard work at a far greater rate than Apple. To be fair, Apple is adding more these days than it used to. We can see the interactions slowly beginning to expand, but it still needs a lot of work.
There are already many ways to interact with Google Assistant, including playing games. There’s no reason Apple can’t bring ways to interact with Siri that even Google hasn’t done yet.
For example, merely referring to users by name more often or engaging us on its own somehow. I’m spitballing here, but you get the idea.
Siri is a character that lives in our phones. If you’re going to take the time to give it a simulated personality, then let us feel like it’s a little more alive.
Siri is behind in the virtual assistant race, but it doesn’t have to be
Anyone can see that virtual assistants offer a lot of present-day value and future potential. There’s a reason so many companies and manufacturers are doubling down on integrating them into their products, websites, and operating systems.
What was once thought to be a gimmick has far exceeded expectations and taken on a new role in our society and consumer experiences. That’s why more and more people continue to embrace things like Siri and Google Assistant.
Siri is behind in the race for virtual assistant superiority, but it doesn’t have to be. Apple’s shown off all kinds of new chip technologies that can dramatically accelerate machine learning. However, we see very little of that being applied to Siri in a user-facing way.
What we do see is a new voice or two thrown our way here and there. And, hey, we love new voice options! But we really need Siri to mature and evolve beyond where it is now.
Google Assistant comes out on top in so many ways: it integrates more easily into smart homes and is more engaging to talk to. It’s been many years, and we’re still waiting on one of the most notoriously cutting-edge companies to catch up. Here’s looking at you, Apple.