Biased SD Video Generation
Ajitesh Bankula • RPI
Measuring and mitigating visual drift in diffusion pipelines using optical flow and activation/“neural noise” tracking.
Quick links: Code · Data · Resources · Results · License
Overview
This project investigates consistency issues in diffusion-based video generation and proposes metrics and interventions to reduce visual drift. We explore optical-flow-based stability measures and neuron/activation probes that correlate with cinematic style features, then test targeted adjustments to improve temporal coherence.
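For intuition about the optical-flow side of the metrics, here is a minimal sketch (assuming OpenCV and NumPy; an illustration, not the project's actual implementation) that scores temporal drift as motion-compensated warping error between consecutive frames. A curve that rises over a clip suggests accumulating drift; the frame-loading helper and error choice are assumptions for this example.

import cv2
import numpy as np

def load_frames(path: str) -> list[np.ndarray]:
    """Read all frames from a video file (the path is illustrative)."""
    cap = cv2.VideoCapture(path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()
    return frames

def warp_error(prev_bgr: np.ndarray, next_bgr: np.ndarray) -> float:
    """Mean absolute error between the next frame and the flow-warped previous frame."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    # Dense Farneback flow from next -> prev, so prev can be resampled into
    # next's coordinates (standard backward-warping for consistency metrics).
    flow_bw = cv2.calcOpticalFlowFarneback(next_gray, prev_gray, None,
                                           0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow_bw[..., 0]).astype(np.float32)
    map_y = (grid_y + flow_bw[..., 1]).astype(np.float32)
    warped_prev = cv2.remap(prev_bgr, map_x, map_y, cv2.INTER_LINEAR)
    # Residual after motion compensation: appearance change not explained by motion.
    return float(np.mean(np.abs(warped_prev.astype(np.float32)
                                - next_bgr.astype(np.float32))))

def drift_curve(frames: list[np.ndarray]) -> list[float]:
    """Per-pair warping errors across a clip."""
    return [warp_error(a, b) for a, b in zip(frames[:-1], frames[1:])]

A quick usage pattern would be drift_curve(load_frames("clip.mp4")), then comparing curves between baseline and intervention runs.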
Data
Small sample files live in Data → samples and are kept under Git LFS. Full datasets are linked from the Data page and the GitHub Releases; see the quick links at the top of this page.
Results
Key figures, tables, and short reports are collected on the Results page; see the quick links at the top of this page.
Cite
@misc{bankula2025biasedsd,
  title={Biased SD Video Generation: Measuring and Mitigating Visual Drift in Diffusion Pipelines},
  author={Bankula, Ajitesh},
  year={2025},
  howpublished={\url{https://ajiteshbankulaa.github.io/BiasedSDVideoGeneration/}},
  note={Version 0.1}
}
License
This project is licensed under the Apache License 2.0. See the LICENSE file on GitHub.
Last updated: 2025-08-11