First, if XBMC already has multithreaded 4K h.264 decoding support, I apologize, but I'd appreciate some help understanding what the issue is instead.
Anyway, deets first:
Test Video File:
http://videos5.hd-trailers.net/videos/El...K-HDTN.mp4
System:
AMD A8-3870K
16GB RAM
Windows 7 Ultimate
XBMC 13 Gotham Alpha 4
My experience is that 4K h.264 decoding in XBMC is not multithreaded. When I attempt to play back the 4K file, playback begins using software decoding (whether 'Allow DXVA' is enabled or not) and it promptly gets choppy at about 10fps. CPU usage floats around 25%-27%, which on this quad-core chip is consistent with a single core pegged, and it's basically unwatchable. This is much like watching certain 10bit h.264 media on machines with slower multicore CPUs before the Gotham alphas enabled multithreading.
I do have success playing 10bit h.264 and even 1080p lossless h.264, which demands upwards of 70% of the CPU at times. (I'm a student filmmaker, so I have some lossless h.264 files of my own work lying around that I upload to YouTube to avoid doing two lossy compression passes.) However, 4K seems to bury the needle on one core, and in the case of the A8-3870K, one core just ISN'T enough.
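To confirm the bottleneck is single-threaded decoding rather than raw CPU power, the same file can be decoded outside XBMC with ffmpeg while forcing the thread count (the filename below is a placeholder for the trailer linked above; `-threads 0` means auto-detect one thread per core):

```shell
# Decode-only benchmark, output discarded; -benchmark prints CPU time used.
# Single decode thread -- should roughly match XBMC's ~10fps result:
ffmpeg -threads 1 -i elysium-4k-trailer.mp4 -f null - -benchmark

# All cores (0 = auto-detect). If this keeps up with realtime, the
# decoder multithreads fine and XBMC just isn't requesting the threads:
ffmpeg -threads 0 -i elysium-4k-trailer.mp4 -f null - -benchmark
```

Note `-threads` is placed before `-i` so it applies to the decoder rather than an encoder.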
As we're on the verge of the 4K era, not multithreading 4K decoding seems like a major shortcoming in the near future, even if it isn't quite one now.