ChanServ changed the topic of #lima to: Development channel for open source lima driver for ARM Mali4** GPUs - Kernel driver has landed in mainline, userspace driver is part of mesa - Logs at https://oftc.irclog.whitequark.org/lima/
<marex> If I want to sample I420 (planar YUV), would that have any performance issues compared to sampling packed YUV like YUYV?
<marex> I guess that since the GPU only has two texture samplers, planar YUV would suffer a performance loss?
<anarsoul> marex: well, we expose 16 samplers, but it's always one sampler per instruction and thus one sampler per clock
<anarsoul> performance will be affected if you do the unpacking in the shader
* anarsoul looked up what YUYV is
<anarsoul> I'm not sure if it'll be more performant than planar YUV
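To make the "unpacking in the shader" point concrete, here is a minimal GLSL ES 1.00 sketch, assuming the YUYV buffer is uploaded as a GL_RGBA texture of width/2 texels (each texel carrying Y0, U, Y1, V for a horizontal pixel pair) and sampled with GL_NEAREST. All sampler and uniform names are invented for illustration; this is not lima code.

```glsl
precision mediump float;

uniform sampler2D u_yuyv;  // W/2 x H RGBA texture holding the packed YUYV data
uniform float u_width;     // image width W in pixels
varying vec2 v_uv;         // normalized coords spanning the full image

void main() {
    // One texture fetch per fragment; the texel covers two output pixels.
    vec4 p = texture2D(u_yuyv, v_uv);            // p = (Y0, U, Y1, V)

    // Pick Y0 or Y1 depending on whether this fragment is the left (even)
    // or right (odd) pixel of the packed pair.
    float odd = mod(floor(v_uv.x * u_width), 2.0);
    float y = mix(p.r, p.b, odd);
    vec2 uv = p.ga - 0.5;                        // chroma shared by the pair

    // Full-range BT.601 YUV -> RGB.
    gl_FragColor = vec4(y + 1.402 * uv.y,
                        y - 0.344 * uv.x - 0.714 * uv.y,
                        y + 1.772 * uv.x,
                        1.0);
}
```

So: one fetch per fragment, plus a few ALU instructions to select the right Y. Bilinear filtering can't be used on the packed texture because it would blend the interleaved luma and chroma channels, so any chroma smoothing has to be done by hand.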
<marex> anarsoul: as far as I can tell, if I read one 32-bit word from memory per 2 pixels (packed YUV), it results in one DRAM access every two pixels
<marex> it's kind of like sampling RGBA8888
<marex> anarsoul: but if I have e.g. I420 planar YUV, then for every pixel the GPU has to do three DRAM accesses: one for the Y component and one each for the U and V components
<marex> which would be considerably worse, right?
<marex> (and yes, I would very much like to do the YUV to RGB conversion in the pixel shader)
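For comparison, a sketch of the planar I420 path marex describes, under the assumption that the three planes are bound as separate GL_LUMINANCE textures: three texture instructions per fragment, one per plane, plus the same full-range BT.601 conversion (names again illustrative, not taken from lima or mesa).

```glsl
precision mediump float;

uniform sampler2D u_y;   // W   x H   luma plane
uniform sampler2D u_u;   // W/2 x H/2 chroma plane
uniform sampler2D u_v;   // W/2 x H/2 chroma plane
varying vec2 v_uv;

void main() {
    // Three fetches per fragment; the same normalized coordinate also
    // addresses the half-resolution chroma planes correctly.
    float y = texture2D(u_y, v_uv).r;
    float u = texture2D(u_u, v_uv).r - 0.5;
    float v = texture2D(u_v, v_uv).r - 0.5;

    // Full-range BT.601 YUV -> RGB.
    gl_FragColor = vec4(y + 1.402 * v,
                        y - 0.344 * u - 0.714 * v,
                        y + 1.772 * u,
                        1.0);
}
```

Limited-range (video-swing) input would additionally need the usual offset and scale applied to Y, and slightly different matrix coefficients.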