ChanServ changed the topic of #etnaviv to: #etnaviv - the home of the reverse-engineered Vivante GPU driver - Logs https://oftc.irclog.whitequark.org/etnaviv
tlwoerner has joined #etnaviv
tlwoerner_ has quit [Ping timeout: 480 seconds]
tlwoerner_ has joined #etnaviv
tlwoerner has quit [Ping timeout: 480 seconds]
tlwoerner has joined #etnaviv
tlwoerner_ has quit [Ping timeout: 480 seconds]
tlwoerner has quit [Remote host closed the connection]
pcercuei has quit [Quit: dodo]
Leopold_ has quit []
Leopold_ has joined #etnaviv
frieder has joined #etnaviv
lynxeye has joined #etnaviv
pcercuei has joined #etnaviv
chewitt has joined #etnaviv
mvlad has joined #etnaviv
<marex> austriancoder: hey, I was looking at https://gitlab.freedesktop.org/mesa/mesa/-/merge_requests/3418/ again (NV12 blit), since I am running into bandwidth issues on MX8MP, but that is using the GPU 3D for the blit, right? I wonder if it would also make sense to revisit the GPU 2D YUV blit from https://lists.freedesktop.org/archives/mesa-dev/2018-July/199703.html, which could take some load off the GPU 3D?
<marex> austriancoder: basically I am looking for a way to do pixel format conversion between the VPU there (planar or semiplanar YUV) and lcdif (packed YUV or RGB), and I think the GPU2D is the way to go, but I am not sure whether this is the right way to integrate that implementation
cphealy has joined #etnaviv
<lynxeye> marex: depends on where you want your video to go. The 2D GPU is faster than the 3D GPU and using the 2D GPU will reduce load on the underpowered 3D GPU, but if you want to integrate the video in a 3D scene then the yuv tiler is probably the better option, as the texture lookup is then still sampling yuv at 16bpp instead of rgb at 32bpp.
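(A rough illustration of the 16bpp vs 32bpp point above, for a hypothetical 1080p60 fullscreen stream; the resolution and refresh rate are illustrative, not from the discussion:
    1920 x 1080 x 60 fps x 2 B/px (16bpp yuv) ≈ 249 MB/s of texture reads
    1920 x 1080 x 60 fps x 4 B/px (32bpp rgb) ≈ 498 MB/s of texture reads
i.e. keeping the texture in yuv roughly halves the sampling bandwidth for the same scene.)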
<marex> lynxeye: I want to pump it directly into lcdif, hence the gpu2d
<marex> lynxeye: fullscreen that is
<marex> lynxeye: and sigh, I feel like the subtle design issues of the mx8mp are starting to show up now
<lynxeye> marex: In that case you might want to talk directly with the 2D GPU in your video pipeline. We did a PoC of a gstreamer element doing this on i.MX6 ages ago.
<marex> lynxeye: I was thinking about that too, but I don't want to do downstream hacks
<lynxeye> I don't see why one couldn't upstream such a gst element, aside from the obvious effort to do so.
<marex> lynxeye: upstreaming it into gstreamer isn't the problem, it just doesn't feel very systematic to me
<marex> lynxeye: like, if I can do ... v4l2sl... ! glvideoconvert ! gldownload (uh) ! kmssink , that would be ... not nice either
<marex> hmmmmmmm
<lynxeye> right, and going through GL you need to render, so you do the conversion using the yuv tiler or 2d gpu into an internal buffer and then use that to render into the buffer for kmssink. That's more 3D GPU load and bandwidth usage than using the 2D GPU directly to convert between the two buffers.
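(Rough per-pixel traffic comparison of the two paths, reusing the 16bpp/32bpp figures from above and ignoring caching and any tiling/compression savings; purely illustrative:
    direct 2D GPU blit: read yuv (2 B) + write rgb (4 B)                      ≈  6 B/px
    GL path:            yuv -> rgb into an internal buffer (2 B + 4 B),
                        then render that into the kmssink buffer (4 B + 4 B)  ≈ 14 B/px
so the GL route moves roughly twice the data per frame, on top of keeping the 3D GPU busy.)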
<marex> lynxeye: indeed
<marex> lynxeye: lemme think this through one more time
pcercuei has quit [Quit: bbl]
lynxeye has quit [Quit: Leaving.]
Leopold_ has quit [Remote host closed the connection]
Leopold_ has joined #etnaviv
frieder has quit [Remote host closed the connection]
<marex> hmmm, programming the G2D is surprisingly easy
<mntirc> marex: oh?
<marex> mntirc: yeah ... sec
<marex> mntirc: compile libdrm -- $ meson builddir/ -Dbuildtype=debug -Dtests=true -Detnaviv=enabled && ninja -C builddir/
<marex> mntirc: edit tests/etnaviv/etnaviv_2d_test.c
<marex> mntirc: compile and explore -- $ ninja -C builddir/ && ./builddir/tests/etnaviv/etnaviv_2d_test /dev/dri/renderD128 /tmp/etna.bmp
<marex> mntirc: then observe result in /tmp/etna.bmp
<marex> mntirc: repeat last two steps until happy
<mntirc> oh cool
<mntirc> gotta play around with that soon
<marex> mntirc: I got an nv12 to rgb blit out of it like this https://paste.debian.net/hidden/222571a2/
<marex> mntirc: mostly copied from the patch link above
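(For orientation, a minimal sketch of what such a modified etnaviv_2d_test.c can look like, built only on the public libdrm etnaviv API. The frame size, buffer sizes and the emit_nv12_to_rgb_blit() name are illustrative placeholders; the actual DE register programming for the NV12 -> RGB filter blit is exactly what the mesa-dev patch and the paste above carry, and is not reproduced here:

    /* sketch: NV12 -> RGB blit on the etnaviv 2D pipe, modelled on
     * libdrm's tests/etnaviv/etnaviv_2d_test.c */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    #include <etnaviv_drmif.h>   /* libdrm etnaviv userspace API */

    #define WIDTH  1920          /* illustrative frame size */
    #define HEIGHT 1080

    int main(int argc, char *argv[])
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s /dev/dri/renderD128\n", argv[0]);
            return 1;
        }

        int fd = open(argv[1], O_RDWR);
        struct etna_device *dev = etna_device_new(fd);
        /* the real test probes all cores for one exposing a 2D pipe; core 0 is a guess */
        struct etna_gpu *gpu = etna_gpu_new(dev, 0);
        struct etna_pipe *pipe = etna_pipe_new(gpu, ETNA_PIPE_2D);
        struct etna_cmd_stream *stream = etna_cmd_stream_new(pipe, 0x1000, NULL, NULL);

        /* source: NV12 = full-res Y plane + half-res interleaved CbCr plane (12bpp) */
        struct etna_bo *src = etna_bo_new(dev, WIDTH * HEIGHT * 3 / 2, DRM_ETNA_GEM_CACHE_WC);
        /* destination: 32bpp RGB surface that gets dumped to the BMP */
        struct etna_bo *dst = etna_bo_new(dev, WIDTH * HEIGHT * 4, DRM_ETNA_GEM_CACHE_WC);

        /* ... fill etna_bo_map(src) with test YUV data ... */

        /*
         * The interesting part goes here: program the 2D DE source/destination
         * state and kick the filter blit, using the etna_set_state() /
         * etna_set_state_from_bo() helpers the test already defines on top of
         * etna_cmd_stream_emit().  That register sequence is what the GPU 2D
         * YUV blit patch and the paste above provide.
         */
        /* emit_nv12_to_rgb_blit(stream, src, dst, WIDTH, HEIGHT);  -- placeholder */

        etna_cmd_stream_finish(stream);   /* submit and wait for the 2D GPU */

        /* ... write etna_bo_map(dst) out to /tmp/etna.bmp with the test's bmp helper ... */

        etna_bo_del(src);
        etna_bo_del(dst);
        etna_cmd_stream_del(stream);
        etna_pipe_del(pipe);
        etna_gpu_del(gpu);
        etna_device_del(dev);
        close(fd);
        return 0;
    }
)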
mvlad has quit [Remote host closed the connection]
gruetze_ is now known as gruetzkopf