- Michael Bauland, Martin Mundhenk, Thomas Schneider, Henning Schnoor, Ilka Schnoor, Heribert Vollmer:
The Tractability of Model-Checking for LTL: The Good, the Bad, and the Ugly Fragments.
Electronic Notes in Theoretical Computer Science, 2008 (to appear).
In a seminal paper from 1985, Sistla and Clarke showed that the model-checking problem for Linear Temporal
Logic (LTL) is either NP-complete or PSPACE-complete, depending on the set of temporal operators
used. If, in contrast, the set of propositional operators is restricted, the complexity may decrease. This
paper systematically studies the model-checking problem for LTL formulae over restricted sets of propositional
and temporal operators. For almost all combinations of temporal and propositional operators, we
determine whether the model-checking problem is tractable (in P) or intractable (NP-hard). We then focus
on the tractable cases, showing that they are all NL-complete or even logspace-solvable. This leads to a
surprising gap in complexity between tractable and intractable cases. It is worth noting that our analysis
covers an infinite set of problems, since there are infinitely many sets of propositional operators.
- Markus Hinkelmann, Andreas Jakoby, Peer Stechert:
t-Private and t-Secure Auctions.
Journal of Computer Science and Technology, 23(5):694-710, 2008.
In most auction systems the values of bids are known to the auctioneer. This allows him to manipulate the outcome of the auction. Hence, one might be interested in hiding these values. Some cryptographically secure protocols for electronic auctions have been presented in the last decade. Our work extends these protocols in several ways. On the basis of garbled circuits, i.e., encrypted circuits, we present protocols for sealed-bid auctions that fulfill the following requirements: 1) the protocols are information-theoretically t-private for honest but curious parties; 2) the number of bits that can be learned by malicious adversaries is bounded by the output length of the auction; 3) the computational requirements for participating parties are very low: only random bit choices and bitwise computation of the XOR function are necessary. Note that one can distinguish between the protocol that generates a garbled circuit for an auction and the protocol to evaluate the auction. In this paper we address both problems. We present a t-private protocol for the construction of a garbled circuit that reaches the lower bound of 2t+1 parties, and a more randomness-efficient protocol for (t+1)^2 parties. Finally, we address the problem of bid changes in an auction.
- Jens Gramm, Arfst Nickelsen, Till Tantau:
Fixed-Parameter Algorithms in Phylogenetics.
The Computer Journal, 51(1):79-101, 2008.
We survey the use of fixed-parameter algorithms in
the field of phylogenetics, which is the study of
evolutionary relationships. The central problem in
phylogenetics is the reconstruction of the
evolutionary history of biological species, but its
methods also apply to linguistics, philology, and
architecture. A basic computational problem is the
reconstruction of a likely phylogeny (genealogical
tree) for a set of species based on observed
differences in the phenotype like color or form of
limbs, based on differences in the genotype like
mutated nucleotide positions in the DNA sequence, or
based on given partial phylogenies. Ideally, one
would like to construct so-called perfect
phylogenies, which arise from a very simple
evolutionary model, but in practice one must often
be content with phylogenies whose "distance from
perfection" is as small as possible. The
computation of phylogenies has applications in
seemingly unrelated areas such as genomic sequencing
and finding and understanding genes. The numerous
computational problems arising in phylogenetics
often are NP-complete, but for many natural
parametrizations they can be solved using
fixed-parameter algorithms.
- W. Bein, L. Larmore, R. Reischuk:
Knowledge States: A Tool for Randomized Online Algorithms.
In Proceedings of the 41st Hawaii International Conference on System Sciences (HICSS), p. 476.
IEEE Computer Society,
2008.
- Nadia Creignou, Henning Schnoor, Ilka Schnoor:
Non-uniform Boolean Constraint Satisfaction Problems with Cardinality Constraint.
In Proceedings of Computer Science Logic, Volume 5213 of Lecture Notes in Computer Science, pp. 109-123.
Springer,
2008.
We study the computational complexity of Boolean constraint satisfaction problems with cardinality constraint. A Galois connection between clones and co-clones has received a lot of attention in the context of complexity considerations for constraint satisfaction problems. This connection fails when considering constraint satisfaction problems that support in addition a cardinality constraint. We prove that a similar Galois connection, involving a weaker closure operator and partial polymorphisms, can be applied to such problems. Thus, we establish dichotomies for the decision as well as for the counting problems in Schaefer’s framework.
- Christian Hundt, Maciej Liskiewicz:
Combinatorial Bounds and Algorithmic Aspects of Image Matching under Projective Transformations.
In Proceedings of the 33rd International Symposium on Mathematical Foundations of Computer Science (MFCS), Volume 5162 of Lecture Notes in Computer Science, pp. 395-406.
Springer,
2008.
- Christian Hundt, Maciej Liskiewicz:
Two-dimensional Pattern Matching with Combined Scaling and Rotation.
In Proceedings of the 19th Symposium on Combinatorial Pattern Matching (CPM), Volume 5029 of Lecture Notes in Computer Science, pp. 5-17.
Springer,
2008.
- Andreas Jakoby, Maciej Liskiewicz, Aleksander Madry:
Susceptible Two-party Quantum Computations.
In Proceedings of the 2nd International Conference on Information Theoretic Security (ICITS), Volume 5155 of Lecture Notes in Computer Science, pp. 121-136.
Springer,
2008.
- Bodo Manthey, Till Tantau:
Smoothed Analysis of Binary Search Trees and Quicksort Under Additive Noise.
In Proceedings of the 33rd International Symposium on Mathematical Foundations of Computer Science (MFCS 2008), Volume 5162 of Lecture Notes in Computer Science, pp. 467-478.
Springer,
2008.
Show abstract
Binary search trees are a fundamental data structure and their height plays a
key role in the analysis of divide-and-conquer algorithms like quicksort.
We analyze their smoothed height under additive uniform noise: An
adversary chooses a sequence of~$n$ real numbers in the range
$[0,1]$, each number is individually perturbed by adding a value
drawn uniformly at random from an interval of size~$d$, and the
resulting numbers are inserted into a search tree.
An analysis of the smoothed tree height in terms of $n$ and $d$ lies
at the heart of our paper: We prove that the smoothed height of binary search
trees is $\Theta (\sqrt{n/d} + \log n)$, where $d \ge 1/n$ may depend on~$n$.
Our analysis starts with the simpler problem of determining the
smoothed number of left-to-right maxima in a sequence.
We establish matching bounds, namely
once more $\Theta (\sqrt{n/d} + \log n)$. We also apply our findings
to the performance of the quicksort algorithm and prove that the
smoothed number of comparisons made by quicksort is
$\Theta(\frac{n}{d+1} \sqrt{n/d} + n \log n)$.
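The smoothed input model described in the abstract above is easy to state in code. The following sketch is our own illustration, not taken from the paper; all function names are hypothetical. It shows the additive uniform perturbation and the left-to-right-maxima count that the analysis starts from:

```python
import random

def perturb(sequence, d, rng=None):
    """Additive uniform noise: each adversarial number (chosen in [0, 1])
    is shifted by a value drawn uniformly at random from [0, d]."""
    rng = rng or random.Random()
    return [x + rng.uniform(0, d) for x in sequence]

def left_to_right_maxima(sequence):
    """Count elements larger than everything before them in the sequence."""
    count, best = 0, float("-inf")
    for value in sequence:
        if value > best:
            best, count = value, count + 1
    return count
```

The left-to-right-maxima count is a natural stepping stone here because it equals the length of the rightmost path of the binary search tree obtained by inserting the sequence in order.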
- Michael Elberfeld, Till Tantau:
Computational Complexity of Perfect-Phylogeny-Related Haplotyping Problems.
In Proceedings of the 33rd International Symposium on Mathematical Foundations of Computer Science (MFCS 2008), Volume 5162 of Lecture Notes in Computer Science, pp. 299-310.
Springer,
2008.
Haplotyping, also known as haplotype phase prediction, is the
problem of predicting likely haplotypes based on genotype data. This
problem, which has strong practical applications, can be approached
using both statistical and combinatorial methods. While the
most direct combinatorial approach, maximum
parsimony, leads to NP-complete problems, the perfect phylogeny
model proposed by Gusfield yields a problem, called PPH, that can be
solved in polynomial (even linear) time. Even this may
not be fast enough when the whole genome is studied, leading to the
question of whether parallel algorithms can be used to solve the PPH
problem. In the present paper we answer this question affirmatively,
but we also give lower bounds on its complexity. In
detail, we show that the problem lies in Mod$_2$L, a subclass of the
circuit complexity class NC$^2$, and is hard for
logarithmic space and thus presumably not in NC$^1$. We also
investigate variants of the PPH problem that have been studied in
the literature, like the perfect path phylogeny haplotyping problem
and the combined problem where a perfect phylogeny of maximal parsimony
is sought, and show that some of these variants are TC$^0$-complete or
lie in AC$^0$.
- Johannes Textor, Björn Hansen:
Improved Simulation Algorithms for Agent Based Models of the Immune Response.
In Proceedings of the 19th European Meeting on Cybernetics and Systems (EMCSR 2008), pp. 476-481.
Austrian Society for Cybernetic Studies,
2008.
The immune response is a complex process
which involves large numbers of highly diverse cells, making it a well-suited application for agent-based modelling. A major
problem with current models is the lack of
biophysically sound simulation methodology.
In this paper, we analyze the key algorithms
of an important agent-based model of the
immune system, point out methodological
deficiencies, and propose alternatives which
correspond to discrete versions of established
continuous models. To analyze the effects
of these modifications, a detailed simulation
of the immune response based on the
proposed algorithms has been implemented and
tested.
- Markus Hinkelmann, Andreas Jakoby:
Preserving Privacy versus Data Retention.
Technical report SIIM-TR-A-08-04,
Schriftenreihe der Institute für Informatik/Mathematik der Universität zu Lübeck,
2008.
The retention of communication data has recently attracted much public interest, mostly because of the possibility of its misuse.
In this paper, we present protocols that address the privacy concerns of the communication partners.
Our data retention protocols store streams of encrypted data items,
some of which may be flagged as
critical (representing misbehavior). The frequent occurrence of critical data items justifies the self-decryption of all recently stored data items, critical or not.
Our first protocol allows the party gathering the retained data to decrypt
all data items collected within, say, the last half year
whenever the number of critical data items reaches some threshold within, say, the last month.
The protocol ensures that the senders of data remain anonymous but may reveal that different critical data items came from the same sender. We call this the affiliation of critical data. Our second, computationally more complex scheme obscures the affiliation of critical data with high probability.
- Markus Hinkelmann, Andreas Jakoby, Peer Stechert:
t-Private and t-Secure Auctions.
Technical report SIIM-TR-A-08-01,
Schriftenreihe der Institute für Informatik/Mathematik der Universität zu Lübeck,
2008.
In most auction systems the values of bids are known to the auctioneer. This allows him to manipulate the outcome of the auction. Hence, one is
interested in hiding these values. Some cryptographically secure protocols for electronic auctions have been presented in the last decade.
Our work extends these protocols in several ways. Based on garbled circuits, i.e., encrypted circuits, we present protocols for sealed-bid auctions that fulfill the following requirements:
1. Protocols are information-theoretically t-private for honest but curious parties.
2. The number of bits that can be learned by malicious adversaries is bounded by the output length of the auction.
3. The computational requirements for participating parties are very low: only random bit choices and bitwise computation of the XOR-function are necessary.
Note that one can distinguish between the protocol
that generates a garbled circuit for an auction and the protocol to evaluate the auction. In this paper we address both problems. We will present a t-private protocol
for the construction of a garbled circuit that reaches the lower bound of 2t+1 parties, and a more randomness-efficient protocol for (t+1)^2 parties.
Finally, we address the problem of bid changes in an auction.
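Requirement 3 above rests on the fact that XOR-based secret sharing needs nothing beyond random bits and bitwise XOR. The following is a generic sketch of that standard primitive, not the paper's auction protocol; the function names are ours:

```python
import secrets
from functools import reduce

def xor_all(bits):
    """Bitwise XOR of a sequence of bits."""
    return reduce(lambda a, b: a ^ b, bits, 0)

def share_bit(bit, parties):
    """Split one bit into `parties` shares: all shares XOR to the bit,
    while any proper subset of the shares is uniformly random and so
    reveals nothing about the bit (information-theoretic hiding)."""
    shares = [secrets.randbits(1) for _ in range(parties - 1)]
    shares.append(bit ^ xor_all(shares))
    return shares
```

Because reconstruction is again just an XOR, parties in such schemes never need more than coin flips and bitwise operations, which is the point of the low computational requirements claimed above.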
- Till Tantau:
Complexity of the Undirected Radius and Diameter Problems for Succinctly Represented Graphs.
Technical report SIIM-TR-A-08-03,
Schriftenreihe der Institute für Informatik/Mathematik der Universität zu Lübeck,
2008.
- Till Tantau:
Generalizations of the Hartmanis-Immerman-Sewelson Theorem and Applications to Infinite Subsets of P-Selective Sets.
Technical report ECCC-TR08-027,
Electronic Colloquium on Computational Complexity,
2008.
The Hartmanis-Immerman-Sewelson theorem is the classical link between the exponential and the polynomial time realm. It states that NE = E if, and only if, every sparse set in NP lies in P. We establish similar links for classes other than sparse sets:
1. E = UE if, and only if, all functions f: {1}^* -> Sigma^* in NPSV_g lie in FP.
2. E = NE if, and only if, all functions f: {1}^* -> Sigma^* in NPFewV lie in FP.
3. E = E^NP if, and only if, all functions f: {1}^* -> Sigma^* in OptP lie in FP.
4. E = E^NP if, and only if, all standard left cuts in NP lie in P.
5. E = EH if, and only if, PH cap P/poly = P.
We apply these results to the immunity of P-selective sets. It is known that they can be bi-immune, but not Pi_2^p/1-immune. Their immunity is closely related to top-Toda languages, whose complexity we link to the exponential realm, and also to king languages. We introduce the new notion of superkings, which are characterized in terms of exists-forall predicates rather than forall-exists predicates, and show that king languages cannot be Sigma_2^p-immune. As a consequence, P-selective sets cannot be Sigma_2^p/1-immune and, if E^NP^NP = E, not even P/1-immune.
- Michael Elberfeld, Till Tantau:
Computational Complexity of Perfect-Phylogeny-Related Haplotyping Problems.
Technical report SIIM-TR-A-08-02,
Schriftenreihe der Institute für Informatik/Mathematik der Universität zu Lübeck,
2008.
Haplotyping, also known as haplotype phase prediction, is the
problem of predicting likely haplotypes based on genotype data. This
problem, which has strong practical applications, can be approached
using both statistical and combinatorial methods. While the
most direct combinatorial approach, maximum
parsimony, leads to NP-complete problems, the perfect phylogeny
model proposed by Gusfield yields a problem, called PPH, that can be
solved in polynomial (even linear) time. Even this may
not be fast enough when the whole genome is studied, leading to the
question of whether parallel algorithms can be used to solve the PPH
problem. In the present paper we answer this question affirmatively,
but we also give lower bounds on its complexity. In
detail, we show that the problem lies in Mod$_2$L, a subclass of the
circuit complexity class NC$^2$, and is hard for
logarithmic space and thus presumably not in NC$^1$. We also
investigate variants of the PPH problem that have been studied in
the literature, like the perfect path phylogeny haplotyping problem
and the combined problem where a perfect phylogeny of maximal parsimony
is sought, and show that some of these variants are TC$^0$-complete or
lie in AC$^0$.
- Michael Elberfeld, Ilka Schnoor, Till Tantau:
Influence of Tree Topology Restrictions on the Complexity of Haplotyping with Missing Data.
Technical report SIIM-TR-A-08-05,
Schriftenreihe der Institute für Informatik/Mathematik der Universität zu Lübeck,
2008.
Haplotyping, also known as haplotype phase prediction, is the
problem of predicting likely haplotypes based on genotype data. One
fast haplotyping method is based on an evolutionary model where a
perfect phylogenetic tree is sought that explains the
observed data. Unfortunately, when data entries are missing, as is
often the case in real laboratory data, the resulting formal problem
IPPH, which stands for incomplete perfect phylogeny
haplotyping, is NP-complete and no theoretical results are known
concerning its approximability, fixed-parameter tractability or
exact algorithms for it. Even radically simplified versions, such as
the restriction to phylogenetic trees consisting of just two directed
paths from a given root, are still NP-complete, but here, at least,
a fixed-parameter algorithm is known. We generalize this algorithm
to arbitrary tree topologies and present the first
theoretical analysis of an algorithm that works on arbitrary
instances of the original IPPH problem. At the same time we
also show that restricting the tree topology does not always make
finding phylogenies easier: while the incomplete directed perfect phylogeny
problem is well-known to be solvable in polynomial time, we show
that the same problem restricted to path topologies is NP-complete.