Research

Research Topics

Large-Scale Knowledge Processing

With advances in Internet technologies, flexible knowledge media have become commonplace, letting us search, analyze, annotate, edit, and distribute large-scale, diverse knowledge content at will. However, as the explosive proliferation of big data becomes apparent everywhere around us, we are also being pushed to rethink and redefine the underlying computing technologies needed to make sense of, and make full use of, this knowledge. Many aspects of our society now depend heavily on data and computing, and breakthroughs are increasingly powered by advanced computing capabilities.

Our lab is driven by the art and theory of cutting-edge algorithms for large-scale knowledge processing, focusing on computational aspects of machine learning, artificial intelligence, data mining, data science, system optimization, and information security. Through this research, we contribute to solving real-world problems, such as improving social infrastructure for electric power, telecommunications, transportation, business, and disaster prevention, and analyzing big data in materials science and the life sciences.


Combinatorial Optimization and Mathematical Structure

The aim of combinatorial optimization is to find an optimal object in a collection of exponentially many (or infinitely many) discrete objects such as graphs. Many combinatorial optimization problems are NP-hard or co-NP-hard; that is, under the widely believed conjecture that P≠NP, no polynomial-time algorithm can solve them. On the other hand, there are also polynomial-time solvable combinatorial optimization problems, for which we exploit “good mathematical structures” to devise efficient algorithms. Our major goals are to clarify these good mathematical structures and to classify which combinatorial optimization problems are tractable and which are not.

Discrete convexity, the “convexity” of a function defined on the integer lattice, often appears as such a good mathematical structure. The theory of discrete convexity, known as Discrete Convex Analysis (DCA), is built on two kinds of convex functions, called L-convex and M-convex functions, and is expanding through a variety of mathematics. Here L-convexity generalizes submodularity (an abstraction of the diminishing-return property), and M-convexity generalizes matroids (a combinatorial structure abstracting and generalizing linear independence in vector spaces); both notions, and a classical algorithmic payoff of matroid structure, are illustrated below. The scope of DCA has been broadening through recent generalizations of L- and M-convexity, and DCA continues to develop. Discrete convexity appears not only in combinatorial optimization but also in economics, game theory, machine learning, and so on, so it lets us explore a wide variety of areas.
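To pin these notions down, the following are the standard characterizations as usually stated in the DCA literature; the ground set V, the dimension n, and the unit vectors e_i are notation introduced here for illustration. An L♮-convex function (a mild variant of L-convexity whose restriction to {0,1}^n recovers submodular set functions) is characterized by discrete midpoint convexity, and M-convexity by an exchange axiom generalizing basis exchange in matroids.

% Submodularity of a set function f : 2^V -> R
% (equivalent to the diminishing-return property):
f(X) + f(Y) \ge f(X \cup Y) + f(X \cap Y)
  \quad \text{for all } X, Y \subseteq V.

% Discrete midpoint convexity of f : \mathbb{Z}^n \to \mathbb{R}
% (the defining inequality of L$^\natural$-convexity):
f(p) + f(q) \ge f\left(\left\lceil \tfrac{p+q}{2} \right\rceil\right)
             + f\left(\left\lfloor \tfrac{p+q}{2} \right\rfloor\right)
  \quad \text{for all } p, q \in \mathbb{Z}^n.

% Exchange axiom of M-convexity: for all x, y \in \mathrm{dom}\, f and
% all i with x_i > y_i, there exists j with x_j < y_j such that
f(x) + f(y) \ge f(x - e_i + e_j) + f(y + e_i - e_j).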
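As a concrete illustration of how good structure yields efficient algorithms, here is a minimal, self-contained Python sketch of the greedy algorithm over the graphic matroid, i.e., Kruskal's minimum spanning tree algorithm; the matroid structure of acyclic edge sets is exactly what makes the greedy choice provably optimal. The graph at the bottom is a hypothetical toy instance, not data from our research.

def minimum_spanning_tree(num_vertices, edges):
    """Kruskal's algorithm: greedily add the cheapest edge that keeps the
    chosen set independent (i.e., acyclic) in the graphic matroid.

    edges: list of (weight, u, v) tuples with 0 <= u, v < num_vertices.
    Returns the list of chosen edges.
    """
    parent = list(range(num_vertices))  # union-find for cycle detection

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for weight, u, v in sorted(edges):  # scan edges by increasing weight
        root_u, root_v = find(u), find(v)
        if root_u != root_v:            # adding (u, v) keeps the set acyclic
            parent[root_u] = root_v     # merge the two components
            tree.append((weight, u, v))
    return tree

if __name__ == "__main__":
    # Toy instance: 4 vertices, 5 weighted edges.
    edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
    print(minimum_spanning_tree(4, edges))
    # Output: [(1, 0, 1), (2, 1, 3), (3, 1, 2)]

The same greedy template solves optimization over any matroid once an independence test is supplied; for general combinatorial problems without such structure, no comparable guarantee holds.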


Publications


Projects