FASHN AI
July 22, 2025
FASHN v1.6 - Quality Mode Runtime Optimization
We’ve deployed additional optimizations to Quality Mode that reduce end-to-end inference time by approximately 2 seconds. Average runtime now ranges from 10–15 seconds (down from 12–17 seconds), depending on input resolution.
Importantly, this performance boost comes with no loss in output quality.
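If you want to verify the improvement on your own workloads, a simple wall-clock measurement around a Quality Mode request is enough. The sketch below assumes a submit-then-poll flow; the endpoint paths, field names, and the "quality" mode identifier are placeholders rather than confirmed API details, so substitute the values from the FASHN API documentation.

```python
# Minimal sketch for timing an end-to-end FASHN request.
# The endpoint paths, request fields, and "quality" mode name below are
# assumptions for illustration -- check the FASHN API docs for the exact schema.
import os
import time

import requests

API_BASE = "https://api.fashn.ai/v1"           # assumed base URL
API_KEY = os.environ["FASHN_API_KEY"]          # assumed env var holding your key
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

payload = {
    "model_image": "https://example.com/model.jpg",      # placeholder inputs
    "garment_image": "https://example.com/garment.jpg",
    "mode": "quality",                                    # assumed mode identifier
}

start = time.monotonic()

# Submit the job (assumed submit endpoint) ...
job = requests.post(f"{API_BASE}/run", json=payload, headers=HEADERS).json()

# ... then poll an assumed status endpoint until the job finishes.
while True:
    status = requests.get(f"{API_BASE}/status/{job['id']}", headers=HEADERS).json()
    if status.get("status") in ("completed", "failed"):
        break
    time.sleep(1)

elapsed = time.monotonic() - start
print(f"End-to-end runtime: {elapsed:.1f}s (status: {status.get('status')})")
```

Measured this way, a Quality Mode request should typically land in the 10–15 second range quoted above, with the exact figure depending on input resolution and network overhead on your side.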