Lübeck computer science diploma thesis wins the CAST-Förderpreis IT-Sicherheit 2004. We congratulate Ulrich Wölfel on first place in the "Students" category.
The maximum common subgraph problem is known to be NP-hard, although it has been applied in various areas. In the field of molecular biology, we can reduce the problem space by analyzing the structures of chemical compounds in biological pathways. In doing so, we have found that the tree-width of chemical compounds is bounded by a constant, and that the number of possible spanning trees of any compound is polynomially bounded. We present a polynomial-time algorithm for finding the maximum common connected induced subgraph of a degree-bounded partial k-tree and a connected graph whose number of possible spanning trees is polynomial.
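To make the problem concrete, here is a minimal brute-force sketch of the maximum common connected induced subgraph problem for tiny graphs. This is only an illustration of the problem definition, not the polynomial-time algorithm of the abstract; all function names are my own, and graphs are represented as adjacency dicts.

```python
from itertools import combinations, permutations

def induced(adj, verts):
    # Induced subgraph on the vertex set `verts`
    s = set(verts)
    return {v: {u for u in adj[v] if u in s} for v in verts}

def connected(adj):
    # Depth-first search from an arbitrary start vertex
    verts = list(adj)
    if not verts:
        return False
    seen, stack = {verts[0]}, [verts[0]]
    while stack:
        for u in adj[stack.pop()]:
            if u not in seen:
                seen.add(u)
                stack.append(u)
    return len(seen) == len(adj)

def isomorphic(a, b):
    # Try every bijection between the two vertex sets (fine for tiny graphs)
    va, vb = list(a), list(b)
    if len(va) != len(vb):
        return False
    for perm in permutations(vb):
        m = dict(zip(va, perm))
        if all((m[u] in b[m[v]]) == (u in a[v])
               for v in va for u in va if u != v):
            return True
    return False

def max_common_connected_induced_subgraph(g1, g2):
    # Largest k such that both graphs contain isomorphic
    # connected induced subgraphs on k vertices
    for k in range(min(len(g1), len(g2)), 0, -1):
        for s1 in combinations(g1, k):
            h1 = induced(g1, s1)
            if not connected(h1):
                continue
            for s2 in combinations(g2, k):
                h2 = induced(g2, s2)
                if connected(h2) and isomorphic(h1, h2):
                    return k
    return 0

# Example: a 4-cycle and a 4-vertex path share an induced path on 3 vertices
cycle = {'a': {'b', 'd'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'a', 'c'}}
path = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(max_common_connected_induced_subgraph(cycle, path))  # → 3
```

The exponential enumeration above is exactly what the constant tree-width and polynomially bounded spanning-tree count let the paper's algorithm avoid.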
In the general combinatorial auction, incentive compatibility, which requires agents to bid truthfully, is one of the most important issues. We discuss an incentive compatible mechanism based on a linear pricing scheme for the single-minded auction, a restricted case of the combinatorial auction in which each buyer desires a single fixed bundle of various commodities, improving on previous work on pricing bundles (i.e., the payments of buyers).
The computation of market equilibria is an important problem in both economic theory and theoretical computer science. We review the development of the problem over the past two years and discuss open problems.
The harmonic number H_{k} is a known lower bound for the competitiveness of any randomized online algorithm for the k-cache problem. It is also known that to achieve this competitiveness, the algorithm must remember the names of some of the pages that have been ejected from the cache. We call those pages "bookmarked" pages.
We present optimally competitive randomized online algorithms for the 2-cache problem and the 3-cache problem, using a novel technique called "knowledge states." Our algorithms sometimes randomly choose to "forget" information from the past.
The central device used is an "estimator" which estimates what the optimal cost has been so far, as a function of the state of the optimal offline algorithm. Not only does the algorithm use randomization to make its moves, but the estimator is updated at every step using a Las Vegas technique. The combination of estimator and algorithm distribution is called a knowledge state.
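The following is a minimal sketch of what a knowledge state might look like as a data structure, assuming the description above: a probability distribution over the algorithm's cache configurations paired with an estimator mapping offline configurations to cost estimates. The class and field names are illustrative, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class KnowledgeState:
    # Probability distribution over the randomized online algorithm's
    # possible cache configurations (frozenset of cached pages -> probability)
    distribution: dict
    # Estimator: estimate of the optimal offline cost so far, as a function
    # of the offline algorithm's configuration (frozenset -> nonnegative value)
    estimator: dict

    def is_valid(self):
        # Probabilities must sum to 1; cost estimates must be nonnegative
        return (abs(sum(self.distribution.values()) - 1.0) < 1e-9
                and all(v >= 0 for v in self.estimator.values()))

# Example for a 2-cache instance over pages p, q, r
ks = KnowledgeState(
    distribution={frozenset({'p', 'q'}): 0.5, frozenset({'p', 'r'}): 0.5},
    estimator={frozenset({'p', 'q'}): 0.0, frozenset({'p', 'r'}): 1.0},
)
print(ks.is_valid())  # → True
```

On each request, both components would be updated together: the distribution by a randomized move, the estimator by the Las Vegas update described above.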
Our knowledge state algorithm for the 2-cache problem uses only one bookmark, and our knowledge state algorithm for the 3-cache problem uses only two bookmarks. One question is, how many bookmarks are needed by an optimally competitive randomized online algorithm for the k-cache problem, for general k? We know that an algorithm with no bookmarks can achieve competitiveness 2H_{k-1}. What is the tradeoff between number of bookmarks and competitiveness?
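The gap in this tradeoff question can be computed directly: a short sketch comparing the lower bound H_k with the bookmark-free bound 2H_{k-1} stated above, using exact rational arithmetic.

```python
from fractions import Fraction

def H(k):
    # k-th harmonic number: H_k = 1 + 1/2 + ... + 1/k,
    # the lower bound for randomized k-cache competitiveness
    return sum(Fraction(1, i) for i in range(1, k + 1))

# Lower bound H_k versus the no-bookmark bound 2*H_{k-1}
for k in range(2, 6):
    print(k, float(H(k)), float(2 * H(k - 1)))
# For k = 2: optimal is H_2 = 1.5, while no bookmarks gives 2*H_1 = 2.0
```

Already at k = 2 the bookmark-free bound exceeds the optimum by a factor of 4/3, which is what makes the bookmark-versus-competitiveness tradeoff an interesting question.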
We have also used knowledge states successfully to achieve the lowest known competitiveness for the randomized 2-server problem on the line, although the result was obtained using a computer and has not yet been verified by hand.
In general, the knowledge state approach, for any online problem, is a systematic tool for limiting the complexity of a randomized online algorithm, perhaps at the cost of loss of optimality.