Title: Improving the Accuracy of Cable-Driven Parallel Robots Through Model Optimization and Machine-Learning
Authors: Fabritius, Marc; Kraus, Werner; Pott, Andreas
Type: conference paper
Date: 2023-12-19 (published 2023)
License: CC BY 4.0
Handle: https://publica.fraunhofer.de/handle/publica/458168
DOI: 10.1007/978-3-031-45705-0_55; 10.24406/h-458168 (https://doi.org/10.24406/h-458168)
Language: en
Keywords: cable-driven parallel robots; accuracy; model optimization; machine-learning; neural networks; XGBoost

Abstract: The accuracy of cable-driven parallel robots (CDPRs) is an important performance criterion in many of their applications. While various modeling and calibration approaches have been proposed to improve the accuracy of CDPRs, only a few works in the literature systematically compare the accuracy of different models and approaches in practice. Therefore, this work compares the accuracy improvements achieved by different CDPR and machine-learning (ML) models (linear regression, boosted regression trees, and neural networks) that are optimized or trained based on measurement data from a CDPR. A hyperparameter study is performed to select the most accurate models, which exhibit the least overfitting on a validation dataset. The accuracy of these models is evaluated in practice using an additional test measurement. Optimized CDPR models yield accuracy improvements of up to 61% on the training and 30% on the validation dataset. The best ML model achieves improvements of 66% and 41%, respectively. These results show that suitable optimized CDPR and ML models can significantly improve the accuracy of CDPRs in practice.
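The evaluation protocol described in the abstract — training several regressors on CDPR measurement data, selecting models on a validation split, and reporting the improvement over the uncompensated error on held-out data — can be sketched as follows. This is a minimal illustration with synthetic pose-error data, not the paper's actual setup; scikit-learn's GradientBoostingRegressor stands in for XGBoost, and all data shapes and model settings are assumptions.

```python
# Hypothetical sketch of the train/validation/test model-comparison
# protocol from the abstract. The pose-error data here is synthetic;
# the paper uses real CDPR measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor  # stand-in for XGBoost
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(600, 3))  # commanded poses (x, y, z), assumed
# synthetic positioning error: smooth nonlinear function of the pose plus noise
y = 0.5 * X[:, 0] + 0.3 * np.sin(2 * X[:, 1]) + 0.1 * X[:, 2] ** 2
y += rng.normal(0.0, 0.02, size=y.shape)

# 60 % training, 20 % validation, 20 % test (split ratios are assumptions)
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "boosted regression trees": GradientBoostingRegressor(random_state=0),
    "neural network": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
}

# uncompensated error: predicting zero correction everywhere
baseline = mean_absolute_error(y_val, np.zeros_like(y_val))

val_mae = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    val_mae[name] = mean_absolute_error(y_val, model.predict(X_val))
    print(f"{name}: validation MAE {val_mae[name]:.4f} "
          f"(improvement {100 * (1 - val_mae[name] / baseline):.0f}%)")

# select on the validation split, report once on the test split
best = min(val_mae, key=val_mae.get)
test_mae = mean_absolute_error(y_test, models[best].predict(X_test))
print(f"best model: {best}, test MAE {test_mae:.4f}")
```

Selecting on the validation split and reporting a single figure on a separate test split mirrors the paper's concern with overfitting: a model tuned until it excels on validation data may still degrade on fresh measurements.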