Apple is apparently working with language-generating artificial intelligence
Apple is apparently working on language-generating artificial intelligence (AI) as chatbots become more popular.
The Cupertino-based tech firm recently held an internal conference focused on large language models and artificial intelligence, according to The New York Times.
Several teams, including those working on the Siri voice assistant, are reportedly testing "language-generating concepts" on a regular basis.
Alexa, Google Assistant, and Siri have all had trouble understanding different accents.
"The virtual assistants had more than ten years to establish themselves as essential. But, they were limited by clumsy design and errors, which allowed chatbots to flourish "cited in the study.
As ChatGPT swept the globe, Apple is apparently preparing to enter the AI race.
The company, however, declined to comment on Siri.
Voice assistants are "as stupid as a rock," according to Microsoft Chairman and CEO Satya Nadella in an interview with The Financial Times.
OpenAI has just announced its GPT-4 AI engine, which powers ChatGPT and accepts both text and image inputs.
The company wrote in a blog post, "We've launched GPT-4, the next milestone in OpenAI's endeavour to scale up deep learning."
To compete with OpenAI's ChatGPT, Google introduced its own AI service, "Bard," last month, making it available to "trusted testers" before making it "more generally available to the public."
Experts predict that voice assistants and chatbot technology will merge in the future.
If not for Silicon Valley Bank's (SVB) collapse last week, chatbots and AI would be dominating almost every tech conversation. OpenAI, the Microsoft-backed company, revealed its GPT-4 language model today.
Its rival Anthropic released the Claude chatbot. Google announced that AI is coming to its Workspace apps, such as Gmail and Docs. Microsoft's Bing has drawn attention with its chatbot-powered search. Which name is missing from the action? Apple.
Last month, the Cupertino-based company held an internal conference focused on large language models and AI. According to a report from The New York Times, several teams, including those working on Siri, regularly test "language-generating concepts."
According to a separate report from 9to5Mac, Apple has added a new framework for "Siri Natural Language Generation" in tvOS 16.4.
Many people, myself included, have long complained that Siri is unable to understand their queries. Siri (along with assistants such as Alexa and Google Assistant) has struggled to understand the varied accents and phonetics of people around the world, even when they speak the same language.
The recent success of ChatGPT and text-based search has made it easier for people to interact with different AI models. For now, though, the only way to interact with Apple's assistant Siri through text is to enable a feature buried in the accessibility settings.
John Burke, a former Apple engineer who worked on Siri, told The New York Times that Apple's assistant evolved slowly because of its "clunky code," which made even minor feature updates difficult to build.
He said Siri relies on a huge database containing a large number of phrases, so whenever engineers wanted to add features or new terminology, the entire database had to be rebuilt, a process that reportedly took up to six weeks.
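For illustration only, the Swift sketch below shows the kind of bottleneck Burke describes: a monolithic phrase store that must be reconstructed in full whenever new vocabulary is added, contrasted with an incremental store that can grow in place. The types and names here are invented for this example and do not reflect Apple's actual Siri codebase.

```swift
import Foundation

// Hypothetical illustration only; not Apple's implementation.

struct MonolithicPhraseStore {
    // All recognised phrases are baked into one immutable table.
    let phrases: Set<String>

    // Adding even one phrase means constructing the whole table again,
    // which mirrors the full "rebuild" described in the article.
    func rebuilt(adding newPhrases: [String]) -> MonolithicPhraseStore {
        MonolithicPhraseStore(phrases: phrases.union(newPhrases))
    }
}

final class IncrementalPhraseStore {
    // Phrases live in a mutable index that can grow in place.
    private var phrases: Set<String> = []

    func add(_ phrase: String) {
        phrases.insert(phrase)   // no rebuild required
    }

    func recognises(_ phrase: String) -> Bool {
        phrases.contains(phrase)
    }
}

// Usage sketch
let old = MonolithicPhraseStore(phrases: ["play music", "set a timer"])
let updated = old.rebuilt(adding: ["turn on the lights"]) // full reconstruction

let incremental = IncrementalPhraseStore()
incremental.add("play music")
incremental.add("turn on the lights")                     // cheap, in-place update
print(updated.phrases.count, incremental.recognises("play music"))
```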