At regular intervals, Apple shares insights into how it is improving its digital voice assistant, Siri. Now the manufacturer is exploring how Siri can better understand the names of local businesses and restaurants.
How Apple wants to improve Siri
In short: Apple has developed custom language models that take the user's location into account – known as Geo-LMs – to enhance Siri's automatic speech recognition. These models allow Siri to better anticipate which words or word sequences the user is likely to say.
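To illustrate the idea, here is a deliberately simplified sketch of how a region-specific language model can tip the balance between competing speech-recognition hypotheses. All names, scores, and probabilities are invented for illustration; Apple has not published its Geo-LM implementation as code.

```python
# Hypothetical sketch: reranking ASR hypotheses with a region-specific
# language model. Scores are invented for illustration only.

# Acoustic scores: how well each candidate transcript matches the audio.
hypotheses = {
    "schnitzel barn": 0.40,
    "schnitzelscheune xxl": 0.35,
}

# A generic language model barely knows the local restaurant name...
generic_lm = {"schnitzel barn": 0.30, "schnitzelscheune xxl": 0.01}
# ...while a Munich-specific Geo-LM boosts businesses in that region.
munich_geo_lm = {"schnitzel barn": 0.05, "schnitzelscheune xxl": 0.50}

def best_transcript(acoustic, lm):
    # Combine acoustic and language-model evidence (a simple product here).
    return max(acoustic, key=lambda h: acoustic[h] * lm.get(h, 1e-6))

print(best_transcript(hypotheses, generic_lm))     # -> schnitzel barn
print(best_transcript(hypotheses, munich_geo_lm))  # -> schnitzelscheune xxl
```

With the generic model, the phonetically similar but wrong transcript wins; with the location-aware model, the local business name comes out on top.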
For example, Apple indicates that it has created Geo-LMs for 169 areas in the US, covering around 80 percent of the population. In addition, Apple has developed one combined language model for all other regions. If a user is in one of the defined areas, Siri uses that area's speech model; if the user is outside all defined areas, or their whereabouts cannot be determined, a standard model is used.
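The fallback logic described above can be sketched in a few lines. This is a hypothetical illustration under the assumptions stated in the article; the region and model names below are invented.

```python
# Hypothetical sketch of the model-selection fallback described above.
# Region and model names are invented for illustration.

GEO_LMS = {"boston": "geo-lm-boston", "seattle": "geo-lm-seattle"}
GLOBAL_US_LM = "geo-lm-us-other"  # one combined model for all other regions
STANDARD_LM = "standard-lm"       # used when location is unavailable

def select_language_model(region):
    if region is None:             # whereabouts cannot be determined
        return STANDARD_LM
    if region in GEO_LMS:          # one of the defined areas
        return GEO_LMS[region]
    return GLOBAL_US_LM            # everywhere else

print(select_language_model("boston"))  # -> geo-lm-boston
print(select_language_model("fresno"))  # -> geo-lm-us-other
print(select_language_model(None))      # -> standard-lm
```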
What Apple presents in its latest research entry sounds rather technical, but if it achieves the goal, so much the better: based on your position, Siri can tell whether you mean Schnitzelscheune XXL in Munich or in Berlin, or a particular local florist. According to Apple's own figures, the newly developed language models reduced Siri's error rate by 41.9 to 48.9 percent in eight major cities: Boston, Chicago, Los Angeles, Minneapolis, New York, Philadelphia, Seattle and San Francisco.
Siri has recently been shown to be quite capable at speech recognition and answering questions. Still, there is room for improvement, and the new language models should help make Siri smarter.