Context-aware learning for intelligent mobile multimodal user interfaces
The paper presents a learning approach, based on association rule mining, for multimodal user interface adaptation in mobile environments. High-level knowledge about user preferences in multimodal interaction is inferred using data mining techniques over context parameters of the environment. The approach enables the automatic selection of multimodality-capable interaction devices and their corresponding rendering facilities for media output streams. An overview of the learning subsystem, which is part of the Distributed Communication Sphere (DCS) management architecture proposed within the EU IST-027617 project SPICE, is introduced. Furthermore, the design of the learning approach is discussed, including the definition and adaptation of snapshot data based on environment parameters. Frequent snapshots form the basis for learning, and thus for the described association rule mining algorithm. Theoretical simulation results are presented, and an outlook on the next research steps is given.
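To make the idea concrete, the following is a minimal, hypothetical sketch of association rule mining over context snapshots. The snapshot attributes (location, time of day, selected output device), the thresholds, and the brute-force enumeration are illustrative assumptions, not the paper's actual schema or algorithm.

```python
# Hypothetical sketch: association rule mining over context "snapshots".
# Attribute names (loc, time, out) and thresholds are illustrative only.
from itertools import combinations

# Each snapshot records context parameters as attribute=value items.
snapshots = [
    {"loc=home", "time=evening", "out=tv"},
    {"loc=home", "time=evening", "out=tv"},
    {"loc=home", "time=morning", "out=phone"},
    {"loc=office", "time=morning", "out=laptop"},
    {"loc=office", "time=morning", "out=laptop"},
    {"loc=home", "time=evening", "out=tv"},
]

MIN_SUPPORT = 0.3  # itemset must occur in >= 30% of snapshots
MIN_CONF = 0.8     # rule must hold for >= 80% of its antecedent's snapshots

def support(itemset):
    """Fraction of snapshots containing all items of the itemset."""
    return sum(itemset <= s for s in snapshots) / len(snapshots)

# Enumerate frequent itemsets (brute force suffices at this toy scale;
# Apriori-style pruning would be used on realistic data volumes).
items = sorted(set().union(*snapshots))
frequent = [
    frozenset(c)
    for k in range(1, len(items) + 1)
    for c in combinations(items, k)
    if support(frozenset(c)) >= MIN_SUPPORT
]

# Derive rules antecedent -> consequent with sufficient confidence,
# e.g. {loc=home, time=evening} -> {out=tv}.
rules = []
for itemset in frequent:
    if len(itemset) < 2:
        continue
    for k in range(1, len(itemset)):
        for ante_tuple in combinations(itemset, k):
            ante = frozenset(ante_tuple)
            conf = support(itemset) / support(ante)
            if conf >= MIN_CONF:
                rules.append((ante, itemset - ante, conf))

for ante, cons, conf in rules:
    print(sorted(ante), "->", sorted(cons), f"(conf={conf:.2f})")
```

A rule such as `{loc=home, time=evening} -> {out=tv}` would then drive device and rendering selection whenever the current context matches the rule's antecedent.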