The International Space Station (ISS) is a scientific laboratory in which astronauts conduct a wide variety of experiments on a tight schedule. To fulfill their tasks efficiently and correctly, astronauts need assistance, which can be provided (at least partially) by IT systems on board, among them robotic assistants such as the Crew Interactive Mobile Companion (CIMON). However, creating user interfaces for such systems is challenging, because astronauts often have to interact hands-free or cannot direct their attention to a visual display. These challenges can be met by multimodal user interfaces that support speech interaction, among other modalities. We describe the use context for speech interfaces on the ISS, their specific requirements, and possible solutions. Our concepts build on previous work carried out in acoustically demanding environments.