DNP in Joint AI Research with the University of Electro-Communications

Aims to develop expression-based AI that reacts to human expressions and gestures

Dai Nippon Printing Co., Ltd.
The University of Electro-Communications

Dai Nippon Printing Co., Ltd. (DNP) has entered into joint research into artificial intelligence (AI) with the University of Electro-Communications (UEC). Working with the Intelligent Systems Laboratory under the guidance of Professor Takayuki Nagai, and with the laboratory of Professor Tomoaki Nakamura, the partners will research expression-based AI, whereby AI automatically generates appropriate replies and gestures in response to human speech, facial expressions and gestures.
The partners aim to develop, in FY 2017, expression-based AI that enables information devices, such as robots, chatbots (computer programs that simulate conversation over the internet) and digital signage, to engage in conversation accompanied by appropriate gestures. They then plan to conduct verification tests of automatic presentations on real information devices. Looking ahead, the partners will aim for applications in services that support communication with consumers, such as in-store guidance programs and e-commerce (EC) sites.

[Background]

AI is widely expected to drive operational reforms and innovation as a means of raising labor productivity. Since November 2014, DNP has promoted the creation of an Intelligent Communication Platform that supports smooth information exchange between humans and diverse information devices. In addition to speech recognition, the system aims to achieve natural, intelligent dialogue with various information devices through the intelligent processing functions that communication requires, analyzing the attributes, reactions and intentions of the consumers engaged in dialogue and transmitting the information best suited to them. Current AI technology can analyze audio (language) and images (movements and facial expressions), but human judgment, encoded by programmers, is still required to establish the correlations between language, gestures and expressions.
At the same time, as a method for AI to acquire concepts autonomously, UEC is promoting the development of Symbol Emergence in Robotics, in which AI develops on its own by observing its surroundings, acting, and analyzing its interactions with others, much as humans do during their growth stage. This is an attempt to develop a general-purpose artificial intelligence that learns and grows autonomously, by reproducing the process through which humans acquire language and motor skills as they grow, implemented in robots equipped with visual, auditory and tactile sensors along with motors. Through these efforts, UEC aims to establish basic technologies for smoother communication by having machines comprehend the meanings of human words and gestures.
Through this joint research, the partners aim to automate the AI's correlative analysis of language, gestures and facial expressions.

[Joint Research Summary]

The partners will develop expression-based AI that autonomously analyzes human gestures, facial expressions and language, and automatically generates replies in the form of language and gestures.

    1. Development of a library automatically generating gestures from sentences

    Using unsupervised learning, a technique that automatically extracts the language and gestures used in replies from large volumes of human video data and then constructs a model of the correlations between them, the partners will develop a library capable of automatically generating gestural expressions suited to the accompanying language, without any human-written rules. Unsupervised learning automatically analyzes vast amounts of data and derives rules, trends and other patterns from the features it computes.

    2. Reproduction of gestural information on information devices of different formats

    By adapting the output to a variety of information device modules, beginning with communication robots, the partners aim to commercialize automatically generated gestural expressions.
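The unsupervised correlation learning described in point 1 can be illustrated, in highly simplified form, as co-occurrence counting between words and observed gesture labels. This is a hypothetical sketch, not DNP's or UEC's actual method; all data, labels and function names below are invented for illustration.

```python
# Illustrative sketch (hypothetical, not the partners' actual system):
# learn word-to-gesture associations from paired observations without
# hand-written rules, by counting co-occurrences between words and
# discretized gesture labels extracted from video of human speakers.

from collections import Counter, defaultdict

# Hypothetical paired data: (utterance, observed gesture label).
paired_data = [
    ("hello there", "wave"),
    ("hello everyone", "wave"),
    ("look over there", "point"),
    ("look at this", "point"),
    ("i do not know", "shrug"),
    ("who knows", "shrug"),
]

# Co-occurrence model: word -> Counter of gesture labels seen with it.
cooccur = defaultdict(Counter)
for utterance, gesture in paired_data:
    for word in utterance.split():
        cooccur[word][gesture] += 1

def generate_gesture(sentence):
    """Return the gesture most strongly associated with the sentence's
    words; fall back to 'none' for entirely unseen vocabulary."""
    scores = Counter()
    for word in sentence.split():
        scores.update(cooccur.get(word, Counter()))
    return scores.most_common(1)[0][0] if scores else "none"

print(generate_gesture("hello friends"))  # learned association: wave
print(generate_gesture("look there"))     # learned association: point
```

A production system would replace the word counts with learned features from speech and video, and the discrete gesture labels with clusters discovered from motion data, but the principle, deriving language-gesture correlations from data rather than from programmer-set rules, is the same.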

[Looking Ahead]

Based on this joint research, UEC will further expand research into more generalized expression-based AI that maintains a concept of self and knows how to express it. This means treating expression not merely as robotic gestures in isolation, but as a whole that includes speech and its meaning along with expressions, making it possible to develop robots that address humans in a more life-like manner.
In addition to improving the Intelligent Communication Platform, DNP will integrate it into robots and chatbots with evolved communication functions, as well as digital signage, and deploy these in stores and various other facilities, including event venues. Beyond supporting greater operational efficiency at retail outlets, this will lead to new product development and the discovery of new businesses through deeper communication between companies and consumers.
The research outlined above forms part of efforts to achieve a versatile artificial intelligence capable of coexisting with people, a goal of the UEC Artificial Intelligence eXploration Research Center (for more information, visit http://aix.uec.ac.jp/en/). As a Center participant, DNP will strengthen cooperation with UEC to promote research into the commercialization and social implementation of AI technology in areas including publishing, education and marketing.

       
* Product prices, specifications and service contents mentioned in this news release are current as of the date of publication. They may be changed at any time without notice.
