Semantics and Pragmatics in Current Software Applications and in Web Search Engines: Exploring Innovations

In 1938 Charles Morris defined semiotics as built of three components: syntax, semantics, and pragmatics. These terms were later adopted by computer scientists. Most web users experience the web as a syntactic, symbolic system. The web was born with this character, but in recent years the semantic web has become important. Even as the semantic web grows more serious, one cannot ignore the underlying syntactic web, and the future seems tuned to a pragmatic approach. To follow the path that appears to lead to a pragmatic web, we analyze the evolution of the best-known search engine, Google, and one of the applications that, at least potentially, looks like the turning point for the pragmatics of communication via the web: Apple’s Siri.

«When you ask a question, you do not want to get 10 million answers. You want the 10 best answers.»

What Is Semantic Search?

When you run a search with a search engine like Google, you get a list of results based on the text you typed. In a few words, syntactic search is based on the written words: the results will be the same regardless of who types the text into the search query box.
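As a rough illustration of what purely syntactic matching means, here is a minimal Python sketch. The documents and the function name are invented for this example: results are ranked only by how many of the query’s exact words each document contains, so every user typing the same query gets the same list.

```python
# A minimal sketch of purely syntactic (keyword) search: documents are ranked
# only by how many of the query's exact words they contain, so every user who
# types the same query gets the same ranking.

def syntactic_search(query, documents):
    query_words = set(query.lower().split())
    scored = []
    for doc in documents:
        doc_words = set(doc.lower().split())
        score = len(query_words & doc_words)  # count exact word overlaps
        if score > 0:
            scored.append((score, doc))
    # higher overlap first
    return [doc for score, doc in sorted(scored, reverse=True)]

if __name__ == "__main__":
    docs = [
        "best pizza restaurant in Rome",
        "how to cook pizza at home",
        "history of the Roman Empire",
    ]
    print(syntactic_search("pizza restaurant", docs))
```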

Semantic search uses a sort of artificial intelligence (AI) to understand the searcher’s intent and the meaning of the query. Even if the database is still the same, a dictionary, this kind of search also draws on your personal data, your character, your lifestyle, in order to give you the best answer to your question. In a semantic search, Google will surf from word to word, looking at how they relate to each other and how they could be used in your search.
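To make the contrast concrete, the following toy Python sketch ranks documents by the similarity of meaning vectors rather than by exact words, and blends in a hypothetical user-interest vector as a stand-in for personal data. The tiny hand-made vectors are invented for illustration only; a real engine would use learned embeddings and far richer signals.

```python
# A minimal sketch of the idea behind semantic search: instead of matching
# exact strings, the query and the documents are compared as meaning vectors,
# and a (toy) user profile can shift the ranking. The hand-made vectors below
# are purely illustrative; a real system would use learned embeddings.
import math

TOY_VECTORS = {
    "pizza":      [0.9, 0.1, 0.0],
    "restaurant": [0.8, 0.2, 0.1],
    "recipe":     [0.3, 0.9, 0.0],
    "rome":       [0.2, 0.0, 0.9],
    "travel":     [0.1, 0.1, 0.9],
}

def text_vector(text):
    """Average the vectors of the known words in a text."""
    vectors = [TOY_VECTORS[w] for w in text.lower().split() if w in TOY_VECTORS]
    if not vectors:
        return [0.0, 0.0, 0.0]
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def semantic_search(query, documents, user_interest=None):
    query_vec = text_vector(query)
    if user_interest:  # personalization: blend in the user's interests
        interest_vec = text_vector(user_interest)
        query_vec = [(q + i) / 2 for q, i in zip(query_vec, interest_vec)]
    return sorted(documents,
                  key=lambda d: cosine(query_vec, text_vector(d)),
                  reverse=True)

if __name__ == "__main__":
    docs = ["pizza recipe", "restaurant rome", "travel rome"]
    # The same query can rank differently for a user interested in travel.
    print(semantic_search("pizza", docs))
    print(semantic_search("pizza", docs, user_interest="travel rome"))
```

Note how the same query produces a different ranking for the user whose profile leans towards travel: that shift is the personalization idea described above.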

Here’s an example: in a restaurant, we ask for the menu. Is everything on the menu edible? The list contains soups and main courses, but also wine, beer, and perhaps a service charge. There are large and small portions, but also the restaurant’s phone number and address. Not all of these things are edible!

Semantics on the web is not a recent concept: since 2008, all major search engines have been doing research on natural language keywords.

Natural language understanding (NLU) is a subtopic of natural language processing in artificial intelligence that deals with machine reading comprehension. Dan Miller, senior analyst and founder of Opus Research, says: «NLU should allow people to speak naturally and have a reasonable expectation that a machine on the other end is going to understand their intent.»
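As a very simplified picture of intent recognition, the sketch below maps an utterance to an intent by keyword overlap. The intent names and trigger words are invented, and real NLU systems rely on trained statistical models rather than hand-written rules; the point is only that different wordings can resolve to the same underlying intent.

```python
# A minimal sketch of intent recognition: the goal is not to match the literal
# words but to map an utterance to what the user wants. The intents and trigger
# words here are invented for illustration only.
import re

INTENT_KEYWORDS = {
    "find_restaurant": {"restaurant", "eat", "dinner", "lunch", "hungry"},
    "get_weather":     {"weather", "rain", "sunny", "temperature", "umbrella"},
    "set_reminder":    {"remind", "reminder", "remember", "appointment"},
}

def recognize_intent(utterance):
    """Return the intent whose trigger words best overlap the utterance."""
    words = set(re.findall(r"[a-z']+", utterance.lower()))
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

if __name__ == "__main__":
    # Different wordings, same underlying intent.
    print(recognize_intent("I'm hungry, where can I eat tonight?"))  # find_restaurant
    print(recognize_intent("Do I need an umbrella tomorrow?"))       # get_weather
```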

Is the technology behind voice-recognition software really geared towards recognizing the user’s intent, that is, what the user would like to ask the machine? Or is it simply a very fast lookup in an extremely large database, matched against the sequence of letters the user types or pronounces?
