Authors: Xu, Junqi; Wang, Lvcheng; Boukhers, Zeyd; Indurkhya, Bipin; Yang, Cong
Dates: 2025-07-22; 2025-06-30
Handle: https://publica.fraunhofer.de/handle/publica/489833
DOI: 10.1145/3731715.3733492
Abstract: Automatic Question Generation (AQG) aims to generate valid, coherent questions from given text passages using pre-trained language models. While AQG has become a significant component of retrieval augmentation and agent-based systems, current QG models face limitations: sequential architectures such as the Transformer struggle to model complex logical structures and produce questions of limited coherence and depth. This paper proposes Q-Chain, a framework designed to optimize logical and educational value. Q-Chain features: (1) differentiable logic layers in GMP that model conditional dependencies and counterfactuals, ensuring logical rigor; (2) Bloom's taxonomy-guided gating that adjusts question difficulty to align with pedagogical goals; and (3) direct generation of logic-structured question graphs enhanced by counterfactual training. Experimental results show that Q-Chain outperforms GPT-3.5, T5, and BART on the MedQuAD dataset (F1 scores of 0.82 vs. 0.80, 0.72, and 0.74, respectively) and exhibits superior robustness to noisy inputs, achieving a 4.1/5 human rating on counterfactual questions, 40% higher than BART.
Language: en
Keywords: Automatic Question Generation; Multimodal; Few-shot; Prototype; Causal Reasoning; Knowledge Graphs; Cross-modal Alignment
Title: Q-Chain: A Causal-Aware Framework for Structural and Educational Question Generation
Type: conference paper