Prof. Dr. Wrobel, Stefan
Showing 1–10 of 27 publications

Publication: Decision Snippet Features (2021-05-05)
Welke, Pascal; Alkhoury, Fouad
Decision trees excel at interpretability of their prediction results. To achieve the required prediction accuracies, however, large ensembles of decision trees (random forests) are often considered, reducing interpretability due to their large size. Additionally, their size slows down inference on modern hardware and restricts their applicability in low-memory embedded devices. We introduce Decision Snippet Features, which are obtained from small subtrees that appear frequently in trained random forests. We subsequently show that linear models on top of these features achieve comparable, and sometimes even better, predictive performance than the original random forest, while reducing the model size by up to two orders of magnitude.

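A minimal sketch of the idea (hypothetical helper names; each "snippet" is reduced here to a depth-1 stump, i.e., a feature/threshold pair, whereas the paper mines general frequent subtrees):

```python
# Sketch: mine frequent depth-1 "snippets" from a forest and use them
# as binary features. Toy stand-in for Decision Snippet Features.
from collections import Counter

def frequent_stumps(forest, min_count=2):
    """Count each stump (feature, threshold) once per tree and keep
    those occurring in at least min_count trees."""
    counts = Counter()
    for tree in forest:
        for stump in set(tree):      # count a stump once per tree
            counts[stump] += 1
    return sorted(s for s, c in counts.items() if c >= min_count)

def snippet_features(x, stumps):
    """Map a sample to a binary vector: 1 iff the stump's test fires."""
    return [1 if x[f] <= t else 0 for (f, t) in stumps]

# Toy forest of three trees, each reduced to its stumps.
forest = [[(0, 0.5), (1, 1.0)], [(0, 0.5)], [(1, 1.0), (2, 3.0)]]
stumps = frequent_stumps(forest)                 # [(0, 0.5), (1, 1.0)]
vec = snippet_features([0.3, 2.0, 1.0], stumps)  # [1, 0]
```

A linear model (e.g., logistic regression) would then be trained on such binary vectors instead of querying the full forest.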
Publication: Maximum Margin Separations in Finite Closure Systems (2021)
Seiffarth, Florian
Monotone linkage functions provide a measure of proximity between elements and subsets of a ground set. Combining this notion with Vapnik's idea of support vector machines, we extend the concepts of maximal closed set and half-space separation in finite closure systems to those with maximum margin. In particular, we define the notion of margin for finite closure systems by means of monotone linkage functions and give a greedy algorithm that efficiently computes a maximum margin closed set separation for two sets. The output closed sets are maximum margin half-spaces, i.e., they form a partitioning of the ground set, if the closure system is Kakutani. We have empirically evaluated our approach on different synthetic datasets. In addition to binary classification of finite subsets of the Euclidean space, we also considered the problem of vertex classification in graphs. Our experimental results provide clear evidence that maximal closed set separation with maximum margin yields much better predictive performance than separation with arbitrary maximal closed sets.

Publication: HOPS: Probabilistic Subtree Mining for Small and Large Graphs (2020)
Welke, Pascal; Seiffarth, Florian
Frequent subgraph mining, i.e., the identification of relevant patterns in graph databases, is a well-known data mining problem with high practical relevance, since, next to summarizing the data, the resulting patterns can also be used to define powerful domain-specific similarity functions for prediction. In recent years, significant progress has been made towards subgraph mining algorithms that scale to complex graphs by focusing on tree patterns and probabilistically allowing a small amount of incompleteness in the result. Nonetheless, the complexity of the pattern matching component used for deciding subtree isomorphism on arbitrary graphs has significantly limited the scalability of existing approaches. In this paper, we adapt sampling techniques from mathematical combinatorics to the problem of probabilistic subtree mining in arbitrary databases of many small to medium-size graphs or a single large graph. By restricting to tree patterns, we provide an algorithm that approximately counts or decides subtree isomorphism for arbitrary transaction graphs in sublinear time with one-sided error. Our empirical evaluation on a range of benchmark graph datasets shows that the novel algorithm substantially outperforms state-of-the-art approaches, both in the task of approximate counting of embeddings in single large graphs and in probabilistic frequent subtree mining in large databases of small to medium-sized graphs.

Publication: Maximal Closed Set and Half-Space Separations in Finite Closure Systems (2020)
Seiffarth, Florian
Motivated by various binary classification problems in structured data (e.g., graphs or other relational and algebraic structures), we investigate some algorithmic properties of closed set and half-space separation in abstract closure systems. Assuming that the underlying closure system is finite and given by the corresponding closure operator, we formulate some negative and positive complexity results for these two separation problems. In particular, we prove that deciding half-space separability in abstract closure systems is NP-complete in general. On the other hand, for the relaxed problem of maximal closed set separation we propose a simple greedy algorithm and show that it is efficient and has the best possible lower bound on the number of closure operator calls. As a second direction to overcome the negative result above, we consider Kakutani closure systems and first show that our greedy algorithm provides an algorithmic characterization of this kind of set system. As one of the major potential application fields, we then focus on Kakutani closure systems over graphs and generalize a fundamental characterization result based on the Pasch axiom to graph-structured partitioning of finite sets. Though the primary focus of this work is on the generality of the results obtained, we experimentally demonstrate the practical usefulness of our approach on vertex classification in different graph datasets.

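The greedy idea can be illustrated with a small sketch (hypothetical names; the interval closure on the integer line is a toy stand-in for an abstract closure operator, here a Kakutani one):

```python
# Sketch of greedy maximal closed set separation in a finite closure
# system, driven only by calls to the closure operator.
def greedy_separation(ground, closure, A, B):
    """Grow the closures of the two seed sets element by element,
    keeping them disjoint throughout."""
    A, B = closure(A), closure(B)
    if A & B:
        return None                     # seeds are not separable at all
    for e in ground:
        if e in A or e in B:
            continue
        Ae = closure(A | {e})
        if not (Ae & B):                # adding e to A keeps disjointness
            A = Ae
            continue
        Be = closure(B | {e})
        if not (A & Be):                # otherwise try the B side
            B = Be
    return A, B

def interval_closure(S):
    """Toy closure operator: the closure of S is its integer interval."""
    return set(range(min(S), max(S) + 1)) if S else set()

A, B = greedy_separation(range(6), interval_closure, {1}, {4})
# A = {0, 1, 2, 3}, B = {4, 5}: a half-space partition of the ground set
```

Because the interval closure system is Kakutani, the two output closed sets partition the whole ground set.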
Publication: Adiabatic Quantum Computing for Max-Sum Diversification (2020)
The combinatorial problem of max-sum diversification asks for a maximally diverse subset of a given set of data. Here, we show that it can be expressed as an Ising energy minimization problem. Given this result, max-sum diversification can be solved on adiabatic quantum computers, and we present proof-of-concept simulations which support this claim. This, in turn, suggests that quantum computing might play a role in data mining. We therefore discuss quantum computing in a tutorial-like manner and elaborate on its current strengths and weaknesses for data analysis.
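A hedged sketch of such an encoding (a generic textbook-style QUBO, not necessarily the paper's exact matrix): maximize pairwise distances within the chosen subset while a quadratic penalty enforces a subset of size k, then verify by brute force on a toy instance.

```python
# Sketch: encode max-sum diversification as x^T Q x over binary x.
import itertools
import numpy as np

def maxsum_qubo(D, k, lam):
    """Build Q so that minimizing x^T Q x maximizes sum_{i<j} d_ij x_i x_j
    subject to a penalty lam * (sum_i x_i - k)^2 (constant term dropped)."""
    Q = -np.asarray(D, dtype=float) / 2.0   # pairwise diversity reward
    Q += lam                                # cross terms of the penalty
    np.fill_diagonal(Q, lam * (1 - 2 * k))  # linear terms, since x_i^2 = x_i
    return Q

def brute_force(Q):
    """Exhaustive minimizer standing in for an adiabatic quantum device."""
    return min((np.array(b) for b in itertools.product([0, 1], repeat=len(Q))),
               key=lambda x: float(x @ Q @ x))

D = np.array([[0, 1, 2, 5],
              [1, 0, 1, 2],
              [2, 1, 0, 1],
              [5, 2, 1, 0]])               # toy distance matrix
x = brute_force(maxsum_qubo(D, k=2, lam=10.0))
# selects the two most distant points: x = [1, 0, 0, 1]
```

On a quantum annealer, Q would be handed to the device instead of being minimized exhaustively.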

Publication: Max-Sum Dispersion via Quantum Annealing (2019)
We devise an Ising model for the max-sum dispersion problem, which occurs in contexts such as Web search or text summarization. Given this Ising model, max-sum dispersion can be solved on adiabatic quantum computers; in proof-of-concept simulations, we solve the corresponding Schrödinger equations and observe that our approach works well.
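The change of variables behind such Ising formulations is mechanical; a small sketch (the standard substitution x = (1 + s)/2, verified here by enumeration, not code from the paper):

```python
# Sketch: convert a QUBO energy to an equivalent Ising energy.
import itertools
import numpy as np

def qubo_to_ising(Q):
    """Rewrite x^T Q x (x in {0,1}^n) as s^T J s + h.s + c (s in {-1,+1}^n)
    using x_i = (1 + s_i) / 2 and s_i^2 = 1."""
    Q = np.asarray(Q, dtype=float)
    J = Q / 4.0
    np.fill_diagonal(J, 0.0)                   # s_i^2 = 1 folds into c
    h = (Q.sum(axis=0) + Q.sum(axis=1)) / 4.0  # row plus column sums
    c = (Q.sum() + np.trace(Q)) / 4.0
    return J, h, c

# Sanity check: both energies agree on every binary assignment.
Q = np.array([[1.0, 2.0, 0.0], [0.0, -1.0, 3.0], [1.0, 0.0, 2.0]])
J, h, c = qubo_to_ising(Q)
for bits in itertools.product([0, 1], repeat=3):
    x = np.array(bits)
    s = 2 * x - 1
    assert np.isclose(x @ Q @ x, s @ J @ s + h @ s + c)
```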

Publication: Artificial intelligence meets IS researchers: Can it replace us? (2019)
Loebbecke, C.; El Sawy, O.; Kankanhalli, A.; Lynne Markus, M.; Te'Eni, D.
In the era of accelerating digitization and rapid advances in Artificial Intelligence (AI), more and more job tasks may be automated by AI. However, there is little critical analysis of how this will happen, if at all, and which professions it will affect to greater or lesser extents. A few studies suggest that highly creative and knowledge-intensive tasks cannot be substituted by AI. Yet, there have been examples of creative art pieces generated by AI algorithms that even art critics could not distinguish from human-drawn paintings. As IS (and most other) researchers, we pride ourselves on the scarcity, novelty, and creativity of our work. Thus, this panel will debate the critical question for IS academics of whether AI can and will replace our major activity, IS research, or even us, IS researchers.
Publication: A QUBO Formulation of the k-Medoids Problem (2019)
We are concerned with k-medoids clustering and propose a quadratic unconstrained binary optimization (QUBO) formulation of the problem of identifying k medoids among n data points without having to cluster the data. Given our QUBO formulation of this NP-hard problem, it should be possible to solve it on adiabatic quantum computers.
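The flavor of such a QUBO can be sketched generically (a plausible instantiation with hypothetical weights, not the paper's exact matrix): reward medoids that are mutually far apart, prefer central points, i.e., points with small total distance to all others, and penalize selections whose size differs from k.

```python
# Hypothetical k-medoids-style QUBO sketch with a brute-force solver.
import itertools
import numpy as np

def kmedoids_qubo(D, k, alpha=1.0, beta=1.0, lam=20.0):
    """Q combines a dispersion reward between chosen medoids (alpha),
    a centrality cost per chosen point (beta), and a size-k penalty (lam)."""
    D = np.asarray(D, dtype=float)
    Q = -alpha * D / 2.0 + lam                  # off-diagonal terms
    np.fill_diagonal(Q, beta * D.sum(axis=1) + lam * (1 - 2 * k))
    return Q

def brute_force(Q):
    """Exhaustive minimizer standing in for an adiabatic quantum device."""
    return min((np.array(b) for b in itertools.product([0, 1], repeat=len(Q))),
               key=lambda x: float(x @ Q @ x))

pos = np.array([0.0, 1.0, 9.0, 10.0])           # points on a line
D = np.abs(pos[:, None] - pos[None, :])         # pairwise distances
x = brute_force(kmedoids_qubo(D, k=2))
# picks the two cluster "centers" at positions 1 and 9: x = [0, 1, 1, 0]
```

Note how the centrality term steers the selection away from the extreme points 0 and 10, which a pure dispersion objective would prefer.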

Publication: Leveraging Domain Knowledge for Reinforcement Learning using MMC Architectures
Despite the success of reinforcement learning methods in various simulated robotic applications, end-to-end training suffers from extensive training times due to high sample complexity and does not scale well to realistic systems. In this work, we speed up reinforcement learning by incorporating domain knowledge into policy learning. We revisit an architecture based on the mean of multiple computations (MMC) principle known from computational biology and adapt it to solve a reacher task. We approximate the policy using a simple MMC network, experimentally compare this idea to end-to-end deep learning architectures, and show that our approach reduces the number of interactions required to approximate a suitable policy by a factor of ten.
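The MMC principle itself is easy to illustrate outside reinforcement learning (a hedged toy sketch of the principle, not the paper's policy network; segment lengths are left unconstrained for simplicity): every kinematic variable of a planar three-segment arm is repeatedly replaced by the damped mean of all ways to compute it from the others, which relaxes the arm into a posture reaching the clamped target.

```python
# Toy mean-of-multiple-computations (MMC) relaxation for a 3-segment arm.
import numpy as np

def mmc_step(L1, L2, L3, D1, D2, R, d=2.0):
    """One damped MMC update. The kinematic relations are
    D1 = L1 + L2, D2 = L2 + L3, and end-effector R = L1 + L2 + L3;
    each variable becomes the damped mean of its alternative computations."""
    L1n = (d * L1 + (D1 - L2) + (R - D2)) / (d + 2)
    L2n = (d * L2 + (D1 - L1) + (D2 - L3)) / (d + 2)
    L3n = (d * L3 + (D2 - L2) + (R - D1)) / (d + 2)
    D1n = (d * D1 + (L1 + L2) + (R - L3)) / (d + 2)
    D2n = (d * D2 + (L2 + L3) + (R - L1)) / (d + 2)
    return L1n, L2n, L3n, D1n, D2n

R = np.array([2.0, 1.0])               # clamped target position
L1 = L2 = L3 = np.array([1.0, 0.0])    # arbitrary initial segments
D1, D2 = L1 + L2, L2 + L3
for _ in range(60):
    L1, L2, L3, D1, D2 = mmc_step(L1, L2, L3, D1, D2, R)
# the relaxed state is kinematically consistent: L1 + L2 + L3 == R
```

Clamping a variable (here R) to a desired value and letting the network relax is what makes such architectures usable as a built-in kinematic prior for policy learning.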

Publication: Efficient Decentralized Deep Learning by Dynamic Model Averaging (2019)
Sicking, Joachim; Hüger, Fabian; Schlicht, Peter
We propose an efficient protocol for decentralized training of deep neural networks from distributed data sources. The proposed protocol makes it possible to handle the different phases of model training equally well and to quickly adapt to concept drifts. This leads to a reduction in communication by an order of magnitude compared to periodically communicating state-of-the-art approaches. Moreover, we derive a communication bound that scales well with the hardness of the serialized learning problem. The reduction in communication comes at almost no cost, as the predictive performance remains virtually unchanged. Indeed, the proposed protocol retains the loss bounds of periodically averaging schemes. An extensive empirical evaluation confirms a major improvement in the trade-off between model performance and communication, which could be beneficial for numerous decentralized learning applications, such as autonomous driving, or voice recognition and image classification on mobile phones.
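The communication-saving mechanism can be sketched as follows (a simplified sketch with hypothetical names; the actual protocol uses a distributed divergence test and partial synchronizations): nodes train locally and only trigger averaging when some model has drifted more than a threshold delta from the last agreed reference model.

```python
# Sketch of dynamic (drift-triggered) model averaging across nodes.
import numpy as np

def dynamic_averaging_round(models, reference, delta):
    """If any local model's squared distance to the reference exceeds
    delta, average all models (full sync); otherwise communicate nothing."""
    if max(float(np.sum((w - reference) ** 2)) for w in models) > delta:
        avg = np.mean(models, axis=0)
        return [avg.copy() for _ in models], avg, True
    return models, reference, False

reference = np.zeros(3)
close = [np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])]
models, reference, synced = dynamic_averaging_round(close, reference, delta=0.05)
# squared drifts are 0.01 each, below delta: no communication happens

far = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.0])]
models, reference, synced = dynamic_averaging_round(far, reference, delta=0.05)
# drift 1.0 exceeds delta: all nodes adopt the average [0.5, 0, 0]
```

During stable phases no node crosses the threshold and no communication occurs, which is where the order-of-magnitude savings over fixed-period averaging come from.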