People need to be able to talk with their technology, but none of the “conversational interfaces” (Apple’s Siri, Google Assistant, Amazon Alexa, Microsoft Cortana, Facebook Messenger, and the rest) can actually converse. In fact, they do not comprehend a single word you say to them.
Embedding our Natural Language Comprehension Core (NLCC) in third-party products such as these will, for the first time, enable people to explain their wants, needs, and desires to a machine and expect a sensible response.
Our technology now surrounds us wherever we go, and interfacing with it via visual displays, keyboards, and pointing devices has become increasingly problematic. The big tech companies have recognized this and are currently engaged in an expensive arms race to own “conversation as a platform.” They understand that conversational interfaces have the potential to completely alter how people interact with their products, and that shift threatens how these companies generate revenue. For example, where is Google if you can ask your digital assistant to search for you and never see the ads?
So far, these companies compete on a level playing field. The narrow AI tools at their disposal, lacking any capacity to create or process knowledge, are ill suited to language comprehension, and progress has been painfully slow. Even today, after millions of dollars in investment, the proliferating crop of natural language digital personal assistants is as likely to be made fun of as made use of.
Computational Knowledge, even at this early stage, represents a quantum leap in the ability of machines to understand what humans are saying, and the first of the big tech companies to license it will win the race to own the conversation (unless or until we license it to the rest).
The Natural Language Comprehension Core (NLCC) is a product designed to be embedded in third-party products, exactly as it is embedded in our own line of Sapiens products. If the customer’s product already has a natural language interface, the NLCC will take input from it, and the customer can decide whether to forward all inputs to the NLCC or only those their current system does not handle elegantly. In the majority of cases, we expect the NLCC’s ability to hold a short discussion with the user, to determine what they are actually after, will vastly enhance the user experience.
Once the user’s intent has been determined and the relevant information has been gathered, the NLCC will forward that information to the customer’s back end for execution.