Abstract
This study introduces KnitDiff, a constraint-aware diffusion modeling framework for intelligent knit pattern generation. Integrating a KnitPrompt Engine with Stable Diffusion models, KnitDiff bridges creative design and textile engineering by translating user-friendly language into technically compliant prompts. The system incorporates three key innovations: (1) a knit vocabulary constructor that maps natural language inputs to structured textile parameters; (2) a dynamic learning system that adapts to user preferences and evolving fashion trends; and (3) a multidimensional evaluation module that quantifies texture, structural integrity, and aesthetic coherence. Reinforcement learning and quality-aware feedback loops guide iterative optimization, while a comprehensive prompt library facilitates both design diversity and manufacturability. Empirical evaluations show KnitDiff significantly improves generation quality, user satisfaction, and alignment with production standards compared with baseline models. The framework offers a scalable solution for democratizing textile design through human–AI collaboration.
