Sample frames from the CHUG dataset, showcasing diverse real-world UGC-HDR content with variations in lighting, motion, orientation, and distortions. Best viewed when zoomed in.
The IDs of the CHUG videos are listed in chug-video.txt. To download a single video, replace VIDEO_ID in the following command:

aws s3 cp s3://ugchdrmturk/videos/VIDEO_ID.mp4 ./CHUG_Videos/

To download every video listed in chug-video.txt:

# Download each listed video from the CHUG S3 bucket.
while read -r video; do
  aws s3 cp "s3://ugchdrmturk/videos/${video}.mp4" ./CHUG_Videos/
done < chug-video.txt
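These commands assume the AWS CLI is installed and credentials are configured. If the bucket permits anonymous reads (an assumption about the ugchdrmturk bucket policy, not something stated here), the same download should work without credentials via the CLI's --no-sign-request flag:

# Assumes the bucket allows unauthenticated access; otherwise configure credentials.
aws s3 cp --no-sign-request s3://ugchdrmturk/videos/VIDEO_ID.mp4 ./CHUG_Videos/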
High Dynamic Range (HDR) videos enhance visual experiences with superior brightness, contrast, and color depth. The surge of User-Generated Content (UGC) on platforms like YouTube and TikTok introduces unique challenges for HDR video quality assessment (VQA) due to diverse capture conditions, editing artifacts, and compression distortions. Existing HDR-VQA datasets focus primarily on professionally generated content (PGC), leaving a gap in understanding real-world UGC-HDR degradations. To address this, we introduce CHUG: Crowdsourced User-Generated HDR Video Quality Dataset, the first large-scale subjective study of UGC-HDR quality. CHUG comprises 856 UGC-HDR source videos, transcoded across multiple resolutions and bitrates to simulate real-world viewing scenarios, totaling 5,992 videos. A large-scale study via Amazon Mechanical Turk collected 211,848 perceptual ratings. CHUG provides a benchmark for analyzing UGC-specific distortions in HDR videos.
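To make the transcoding step concrete, below is a minimal sketch of generating a resolution/bitrate ladder with ffmpeg. The ladder rungs, the libx265 encoder, the 10-bit pixel format, and the file names are illustrative assumptions, not CHUG's actual settings; a faithful HDR transcode would also carry through HDR10 color metadata (color primaries, transfer characteristics, mastering-display info), which is omitted here for brevity.

# Hypothetical height:bitrate ladder; CHUG's real encoder settings are not given here.
for rung in 2160:15M 1080:6M 720:3M; do
  height="${rung%%:*}"; bitrate="${rung##*:}"
  # Rescale to the target height (width follows, kept even via -2) and re-encode.
  ffmpeg -i source.mp4 -vf "scale=-2:${height}" \
    -c:v libx265 -pix_fmt yuv420p10le -b:v "${bitrate}" \
    -c:a copy "source_${height}p.mp4"
done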
Example scene categories in the CHUG gallery: Indoor Scene, Museum, Costumes, Kayaking, City, Mountains, Carousel, Nature, Sunset, Screen, Light Show, Birds.