Physics + Information Processing = Intelligent Computation
"Machine learning", in which computers predict the future by reading the relationships hidden in data and learning the rules behind them, is, so to speak, a modern magic mirror.
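As a toy illustration of "learning a rule from data" (invented data, not the laboratory's own code): fit a line to noisy observations, then apply the learned rule to a new input.

```python
import numpy as np

# Invented toy data: observations generated by the rule y = 2x + 1, plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, size=x.size)

# "Learn the rule": least-squares fit of slope and intercept.
A = np.vstack([x, np.ones_like(x)]).T
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

# "Predict the future": apply the learned rule to an unseen input.
x_new = 12.0
y_pred = slope * x_new + intercept
print(slope, intercept)  # recovered values close to the true 2 and 1
```

The fitted coefficients recover the hidden rule, which is the essence of the metaphor above: the "mirror" reflects structure that was already in the data.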
Our laboratory aims to make this modern magic usable by everyone. Take "deep learning", for example: it can automatically perform highly sophisticated tasks, but why is such a thing possible at all? We pursue adventurous research to uncover the essence of learning.
Deep learning requires a large amount of high-quality data. In reality, however, data cannot always be gathered easily. We therefore promote "sparse modeling", a technique for discerning the essential parts of a problem even from limited information. By exploiting this new information processing technology, we maximize the efficiency of experiments and measurements for acquiring data.
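One standard sparse-modeling tool is the LASSO, which recovers a signal with few nonzero components from fewer measurements than unknowns. A minimal sketch using iterative soft-thresholding (ISTA) on an invented toy problem (the sizes and regularization weight below are illustrative choices, not the laboratory's settings):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented toy problem: a 100-dimensional signal with only 5 nonzeros,
# observed through just 40 random linear measurements y = A @ x_true.
n, p, k = 40, 100, 5
A = rng.normal(size=(n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = 5 * rng.normal(size=k)
y = A @ x_true

# LASSO via ISTA: minimize ||y - A x||^2 / 2 + lam * ||x||_1.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2  # step = 1 / Lipschitz constant
x = np.zeros(p)
for _ in range(2000):
    grad = A.T @ (A @ x - y)       # gradient of the quadratic data term
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print(np.count_nonzero(np.abs(x) > 0.1))  # a sparse estimate emerges
```

Even though the system is underdetermined (40 equations, 100 unknowns), the sparsity prior makes the essential components recoverable, which is the point of the paragraph above.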
Underpinning these technologies is a class of mathematical problems called "optimization problems". Optimization problems appear in a wide variety of settings, and a worldwide race has begun to develop computing technologies dedicated to solving them efficiently. In our laboratory, we are developing computational techniques that exploit physical processes, such as "quantum annealing".
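As a concrete, hedged sketch of what such an optimization problem looks like (the graph instance is invented for illustration): many combinatorial problems, here max-cut on a tiny graph, can be written as minimizing a quadratic function of binary variables (a QUBO), which for a small size can still be checked by brute force.

```python
import itertools
import numpy as np

# Invented example: max-cut on a 4-node graph with these edges.
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)]
n = 4

# Max-cut as QUBO minimization: each edge (i, j) contributes
# -(x_i + x_j - 2 x_i x_j), which equals -1 exactly when the edge is cut.
Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1
    Q[j, j] -= 1
    Q[i, j] += 2

# Brute force over all 2^n binary assignments (feasible only for tiny n).
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
cut = sum(best[i] != best[j] for i, j in edges)
print(best, cut)  # a partition cutting 4 of the 5 edges
```

The exponential cost of brute force (2^n assignments) is exactly why dedicated hardware and physics-based heuristics for this problem class are being pursued worldwide.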
Quantum annealing is an optimization method that exploits quantum effects. Thanks to these effects, it can find optimal solutions faster than conventional optimization methods for some problems. However, many optimization problems still cannot be solved efficiently by quantum annealing. To develop methods that solve such problems efficiently, we aim to improve the performance of quantum annealing by drawing on techniques from information science.
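No quantum hardware is needed to experiment with the underlying idea: the classical cousin of quantum annealing, simulated annealing, minimizes the same kind of Ising energy by gradually lowering a temperature. A minimal sketch on an invented 8-spin ferromagnetic ring (all parameters below are illustrative assumptions), whose ground states are the two fully aligned configurations:

```python
import math
import random

random.seed(0)

# Invented instance: ferromagnetic Ising ring, E = -sum_i s_i s_{i+1},
# spins s_i in {-1, +1}. Ground states: all +1 or all -1, with E = -8.
n = 8

def energy(s):
    return -sum(s[i] * s[(i + 1) % n] for i in range(n))

s = [random.choice([-1, 1]) for _ in range(n)]
T = 2.0
for step in range(5000):
    i = random.randrange(n)
    # Energy change from flipping spin i (only its two neighbors matter).
    dE = 2 * s[i] * (s[i - 1] + s[(i + 1) % n])
    if dE <= 0 or random.random() < math.exp(-dE / T):
        s[i] = -s[i]  # accept the flip (Metropolis rule)
    T = max(0.01, T * 0.999)  # geometric cooling schedule

print(s, energy(s))  # energy -8 would mean a ground state was found
```

Quantum annealing replaces the thermal fluctuations here with quantum fluctuations (a transverse field) that are slowly switched off, but the problem encoding, an Ising or QUBO energy, is the same.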
Another important issue is handling large amounts of data efficiently. Data such as images, sounds, and texts become meaningful when mutually related elements come together. To process huge data such as 4K and 8K images efficiently, it is therefore important to exploit these relationships within the data. In our laboratory, we study graphical models, which handle huge data efficiently by expressing the relationships in the data as a graph structure. Based on graphical modeling, we aim to develop effective data processing systems that utilize the latest technologies, such as Bayesian statistics and statistical machine learning.
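As a toy illustration of a graphical model (invented data and parameters, not the laboratory's system): in a pairwise Markov random field on a chain, each variable is tied both to its noisy observation and to its neighbors, and a simple coordinate-wise scheme, iterated conditional modes (ICM), denoises by repeatedly choosing each variable's locally optimal value.

```python
import random

random.seed(3)

# Invented toy data: a binary 1-D "image" (a chain graph) with flip noise.
clean = [1] * 10 + [-1] * 10 + [1] * 10
noisy = [-v if random.random() < 0.15 else v for v in clean]

def local_energy(x, i, y, beta=2.0):
    # Data term ties x_i to its observation y_i; smoothness term (weight
    # beta, an illustrative choice) ties x_i to its chain neighbors.
    e = -x[i] * y[i]
    if i > 0:
        e += -beta * x[i] * x[i - 1]
    if i < len(x) - 1:
        e += -beta * x[i] * x[i + 1]
    return e

# ICM: sweep the chain, setting each variable to its locally optimal value.
x = list(noisy)
for _ in range(10):
    for i in range(len(x)):
        x[i] = 1
        e_plus = local_energy(x, i, noisy)
        x[i] = -1
        e_minus = local_energy(x, i, noisy)
        x[i] = 1 if e_plus <= e_minus else -1

print(sum(a != b for a, b in zip(x, clean)), "errors remain after denoising")
```

The graph structure is what makes this cheap: each update touches only a variable's neighbors, so the cost per sweep is linear in the data size, the property that matters when scaling to 4K and 8K images (where the graph becomes a 2-D grid).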
Building on these academic research results, we aim to change the world by giving back to society through many projects and collaborative research with companies.