ChanServ changed the topic of #wayland to: https://wayland.freedesktop.org | Discussion about the Wayland protocol and its implementations, plus libinput
tristianc6704 has joined #wayland
aktaboot has quit [Ping timeout: 480 seconds]
guru__ has joined #wayland
guru_ has quit [Ping timeout: 480 seconds]
columbarius has joined #wayland
co1umbarius has quit [Ping timeout: 480 seconds]
niecoinny[m] has quit [Quit: Client limit exceeded: 20000]
apol[m] has quit [Quit: Client limit exceeded: 20000]
kts has joined #wayland
bindu_ has quit []
kts has quit [Ping timeout: 480 seconds]
rv1sr has joined #wayland
kts has joined #wayland
doras has quit []
doraskayo has quit []
gnustomp[m] has quit []
xerpi[m] has quit []
bindu has joined #wayland
kts has quit [Ping timeout: 480 seconds]
bindu has quit [Ping timeout: 480 seconds]
psydroid[m] has quit []
bindu has joined #wayland
<emilio[m]>
<mclasen> "so you're telling me if browsers..." <- No, I'm telling you the computed value on the parent is always an absolute color because we compute it eagerly, so that we don't have to implement the weird "walk all the ancestor chain resolving currentColor"
<mclasen>
thanks, I think I figured out how to handle it
<emilio[m]>
<mclasen> "but what do you do if the..." <- But you're right that the spec might need to clarify this (probably make it closer to how browsers behave) since otherwise stuff like this is a problem
<emilio[m]>
Well I guess not
<emilio[m]>
You eventually always have an absolute color at the root
bindu_ has joined #wayland
bindu has quit [Ping timeout: 480 seconds]
bindu has joined #wayland
bindu_ has quit [Ping timeout: 480 seconds]
<Company>
emilio[m]: the spec says currentColor should be inherited, and inheriting according to the spec inherits the computed value, which might include currentColor, i.e. color-mix(currentColor 50%), which would then require looking up currentColor, which is inherited as the computed value, and now you have recursion
<Company>
that's because the inheritance part of the spec talks about computed values while currentColor gets looked up only for the used value, but it's not clear that this is done by looking up the used value of the parent (in fact, a strict reading of the spec would say the opposite)
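A rough C sketch of the eager computation emilio[m] describes, with entirely hypothetical types and names (SpecifiedColor, AbsoluteColor, compute_color), not any real CSS engine's API: because the parent's computed 'color' is always already absolute, substituting currentColor at compute time is a single lookup and never recurses up the ancestor chain.

    /* Hypothetical types for illustration only. */
    typedef struct { unsigned char r, g, b, a; } AbsoluteColor;

    typedef enum { COLOR_ABSOLUTE, COLOR_CURRENTCOLOR } SpecifiedColorKind;

    typedef struct {
        SpecifiedColorKind kind;
        AbsoluteColor value;              /* valid when kind == COLOR_ABSOLUTE */
    } SpecifiedColor;

    /* The parent's computed 'color' is always already absolute, so resolving
     * currentColor is a single lookup at compute time, never a walk up the
     * ancestor chain and never a recursion through inherited currentColor. */
    static AbsoluteColor
    compute_color(const SpecifiedColor *specified, AbsoluteColor parent_computed_color)
    {
        if (specified->kind == COLOR_CURRENTCOLOR)
            return parent_computed_color; /* substitute eagerly */
        return specified->value;
    }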
<zamundaaa[m]>
I think you need to explain what you actually want to do. First it was knowing window positions, then layout, now widgets. What is it?
<zamundaaa[m]>
As in, what do you want to do on a higher level?
<TzilTzal>
All of the above. I'd like to create child windows and lay them out where I want to.
<zamundaaa[m]>
That's not what I'm asking. What is the user facing thing you want to do?
<TzilTzal>
Precisely what I just said. Isn't that user facing?
<kennylevinsen>
You can control the position of xdg_popups (e.g., context menus, tooltips, overlays) relative to a top-level, but you cannot place top-levels. An app only sees what happens within a surface, relative to it, and nothing else.
iomari891 has quit [Ping timeout: 480 seconds]
<kennylevinsen>
You cannot see anything in absolute coordinates, and cannot place anything in absolute coordinates. Where top-levels go is decided solely by the compositor and user - hence "compositor policy".
andyrtr has joined #wayland
privacy has quit [Remote host closed the connection]
iomari891 has joined #wayland
<TzilTzal>
This is similar to X11, where you can only give hints about the position of a top-level window... but I'm also asking about querying the position - at least of xdg_popups?
<kennylevinsen>
xdg_popups go where you asked for them to be, relative to their parent surface
<kennylevinsen>
Unless you specified a position constraint that allows the compositor to shift it, which is useful for context menus and such
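A minimal C sketch of the popup placement kennylevinsen describes, using the generated xdg-shell client API; registry binding, the parent's xdg_surface, and the popup's wl_surface are assumed to exist already, and open_popup is just an illustrative name.

    #include <wayland-client.h>
    #include "xdg-shell-client-protocol.h"   /* generated by wayland-scanner */

    /* Sketch: open a popup at a point in the parent's coordinate space, and
     * let the compositor slide/flip it if it would end up constrained (e.g.
     * a context menu near a screen edge). */
    static struct xdg_popup *
    open_popup(struct xdg_wm_base *wm_base,
               struct xdg_surface *parent_xdg_surface,
               struct wl_surface *popup_wl_surface)
    {
        struct xdg_positioner *pos = xdg_wm_base_create_positioner(wm_base);
        xdg_positioner_set_size(pos, 200, 150);            /* popup size */
        xdg_positioner_set_anchor_rect(pos, 40, 40, 1, 1); /* parent-local point */
        xdg_positioner_set_anchor(pos, XDG_POSITIONER_ANCHOR_BOTTOM_RIGHT);
        xdg_positioner_set_gravity(pos, XDG_POSITIONER_GRAVITY_BOTTOM_RIGHT);
        xdg_positioner_set_constraint_adjustment(pos,
            XDG_POSITIONER_CONSTRAINT_ADJUSTMENT_SLIDE_X |
            XDG_POSITIONER_CONSTRAINT_ADJUSTMENT_FLIP_Y);

        struct xdg_surface *popup_xdg =
            xdg_wm_base_get_xdg_surface(wm_base, popup_wl_surface);
        struct xdg_popup *popup =
            xdg_surface_get_popup(popup_xdg, parent_xdg_surface, pos);
        xdg_positioner_destroy(pos);
        wl_surface_commit(popup_wl_surface);
        return popup;
    }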
<TzilTzal>
so if I want to create a button widget, I suppose I would use an xdg_popup as a child of my top level?
<TzilTzal>
(and paint it accordingly, etc..)
<zamundaaa[m]>
TzilTzal there's a lot of possible meanings for "child window". It sounds like you're after subsurfaces
<TzilTzal>
Possibly.
<kennylevinsen>
Ah, I think I understand what's going on here - you're trying to make a single window and just draw stuff in it, right?
<zamundaaa[m]>
Those are fixed to a different surface, and you can lay them out in more or less any way you want, inside the confines of the parent window
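A small C sketch of the subsurface route zamundaaa[m] mentions; binding wl_subcompositor from the registry and creating both wl_surfaces is assumed to happen elsewhere, and attach_child is an illustrative name.

    #include <wayland-client.h>

    /* Sketch: turn 'child' into a subsurface of 'parent' and place it at a
     * fixed offset in parent-local coordinates. */
    static struct wl_subsurface *
    attach_child(struct wl_subcompositor *subcompositor,
                 struct wl_surface *child, struct wl_surface *parent)
    {
        struct wl_subsurface *sub =
            wl_subcompositor_get_subsurface(subcompositor, child, parent);
        wl_subsurface_set_position(sub, 32, 64);  /* parent-local coordinates */
        wl_subsurface_set_desync(sub);            /* child may commit on its own */
        /* The child still needs a buffer attached and committed, and the new
         * position only takes effect when the parent is committed. */
        return sub;
    }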
<TzilTzal>
kennylevinsen: something like that, but I might want to also have something like a modal dialog (e.g. for file selection) and center it to the main window...
<zamundaaa[m]>
TzilTzal: modal dialogs are very different from popups
<TzilTzal>
Yes, I know they are.
<zamundaaa[m]>
To get them placed appropriately, you'll want to set the parent window with xdg foreign
wmlhwl has joined #wayland
<zamundaaa[m]>
(or in the future, use xdg-dialog)
<TzilTzal>
This is starting to sound even more complicated/cumbersome than X11 lol
<zamundaaa[m]>
oh, nevermind, xdg-dialog was merged a while ago
<kennylevinsen>
TzilTzal: for stuff in one window, the normal way is to just render everything into a single buffer as part of rendering and submit that as the toplevel content. Alternatively, as zamundaaa mentioned, you can use subsurfaces to render things independently and place them within the toplevel as if they were part of it. The first is the recommended approach - more efficient.
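A hedged sketch of that single-buffer approach: all widgets are "rendered" into one ARGB8888 buffer, which is then attached and committed as the toplevel's content. The widget struct, draw_frame, and the pre-created wl_shm buffer are assumptions for illustration, not any particular toolkit's API.

    #include <stdint.h>
    #include <wayland-client.h>

    /* Hypothetical widget type, for illustration only. */
    struct widget { int x, y, width, height; uint32_t argb; };

    /* Sketch: fill every widget into one ARGB8888 buffer and submit it as the
     * toplevel's content. 'pixels' is assumed to be the mmap'd storage of a
     * wl_buffer created from a wl_shm pool elsewhere. */
    static void
    draw_frame(struct wl_surface *toplevel, struct wl_buffer *buffer,
               uint32_t *pixels, int width, int height,
               const struct widget *widgets, int n_widgets)
    {
        for (int i = 0; i < n_widgets; i++) {
            const struct widget *w = &widgets[i];
            for (int y = w->y; y < w->y + w->height && y < height; y++)
                for (int x = w->x; x < w->x + w->width && x < width; x++)
                    pixels[y * width + x] = w->argb;   /* trivial rectangle fill */
        }
        wl_surface_attach(toplevel, buffer, 0, 0);
        wl_surface_damage_buffer(toplevel, 0, 0, width, height);
        wl_surface_commit(toplevel);
    }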
<zamundaaa[m]>
you might still want to use the older method for backwards compatibility with compositors that don't support it yet
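A sketch of the two dialog-parenting routes mentioned above, assuming the relevant globals (zxdg_exporter_v2/zxdg_importer_v2 from xdg-foreign-unstable-v2, and xdg_wm_dialog_v1 from the xdg-dialog-v1 staging protocol) have been bound from the registry. Header and helper names are illustrative, and the exported handle would normally be passed to the dialog's process out of band.

    #include <wayland-client.h>
    #include "xdg-foreign-unstable-v2-client-protocol.h"  /* generated headers; */
    #include "xdg-dialog-v1-client-protocol.h"            /* names may vary     */

    /* Exporter side: export the main window's surface; the compositor sends
     * back an opaque handle string via the 'handle' event. */
    static void
    handle_exported(void *data, struct zxdg_exported_v2 *exported, const char *handle)
    {
        (void)data; (void)exported; (void)handle;  /* store/forward the handle */
    }

    static const struct zxdg_exported_v2_listener exported_listener = {
        .handle = handle_exported,
    };

    static struct zxdg_exported_v2 *
    export_main_window(struct zxdg_exporter_v2 *exporter, struct wl_surface *main_surface)
    {
        struct zxdg_exported_v2 *exported =
            zxdg_exporter_v2_export_toplevel(exporter, main_surface);
        zxdg_exported_v2_add_listener(exported, &exported_listener, NULL);
        return exported;
    }

    /* Dialog side: import the handle and parent the dialog's surface to it. */
    static void
    parent_dialog(struct zxdg_importer_v2 *importer, const char *handle,
                  struct wl_surface *dialog_surface)
    {
        struct zxdg_imported_v2 *imported =
            zxdg_importer_v2_import_toplevel(importer, handle);
        zxdg_imported_v2_set_parent_of(imported, dialog_surface);
    }

    /* Newer route: mark the dialog's xdg_toplevel as modal via xdg-dialog-v1. */
    static void
    mark_modal(struct xdg_wm_dialog_v1 *dialog_wm, struct xdg_toplevel *dialog_toplevel)
    {
        struct xdg_dialog_v1 *dialog =
            xdg_wm_dialog_v1_get_xdg_dialog(dialog_wm, dialog_toplevel);
        xdg_dialog_v1_set_modal(dialog);
    }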
<kennylevinsen>
"Everything is a window" is a very X specific thing
<kennylevinsen>
Here, windows are distinct and separate things with their own life, but possibly some relation to something else - like being a context menu of a top level, or a modal dialog of a toplevel
<kennylevinsen>
But widgets are not windows.
<TzilTzal>
in Qt, everything inherits from QWidget :P
<ebassi>
"every widget is a window" is a very X11 thing that used to happen in the late '90s
<ebassi>
No modern toolkit actually does it that way
<ebassi>
Every toolkit has a windowing system surface that is bound to the top level "window", and then draws every element of the UI on it
<TzilTzal>
ebassi: they probably do to a certain extent because in Windows you'd still get a handle to that window/widget. So it's not simply an area on your toplevel surface you just paint into (if that's what you mean)
<ebassi>
TzilTzal: What has Windows to do with it? I thought we were talking about X11 and Wayland
<ebassi>
But, no: most modern toolkits also don't have native handles for each widget, even on Windows
<kennylevinsen>
TzilTzal: modern rendering on all platforms is indeed "paint it all into one buffer"
<TzilTzal>
ebassi: we are, but you said no modern toolkit thinks of things as "windows" so I thought you were talking in general terms.
<ebassi>
Some older toolkits may have a native surface for input, but it's, again, an older method
<kennylevinsen>
(win32/Windows Forms/etc. are not exactly modern UI frameworks)
<TzilTzal>
kennylevinsen: if you mean that they just simulate or implement every widget themselves on each platform, I'm not sure that's the case. On Mac they would use the native NS object in most cases.
<ebassi>
And it only works on X11
<ebassi>
TzilTzal: Anyway, it seems you're asking about Qt, so I'd redirect you on a Qt support channel
<TzilTzal>
ebassi: no, I just used it as an example.
<TzilTzal>
Anyway, thanks for your help.
<ebassi>
If you're asking about implementing your own toolkit, then I'd recommend not modelling it on what toolkits used to do 20 years ago
<TzilTzal>
ebassi: fair enough; it's a bit of a balance though, between using some of the underlying native functionality and implementing/simulating it yourself, isn't it?
<TzilTzal>
(I'm actually not looking to write a full toolkit... just trying some fairly simple stuff with Wayland. Don't want to use a heavy-weight solution like Qt.)
<ebassi>
TzilTzal: Well, in this case there is no balance: using subsurfaces for each separate UI element does not buy you anything
<ebassi>
You want a single buffer, and you want to draw everything into it, so you can reason about transparencies, and shadows, and things like that
<TzilTzal>
ebassi: what about something like Z ordering and detecting mouse clicks, etc..?
<ebassi>
Subsurfaces come into play for things like video overlays, or zero-copy rendering of GPU buffers
<ebassi>
TzilTzal: That's something that is entirely up to the toolkit
<ebassi>
Say, for instance, that your toolkit allows 3D transformations
<ebassi>
You get a click on a coordinate on the top level window; it's up to the toolkit to do the reverse transformation and find the UI element under those coordinates
<ebassi>
The windowing system has zero idea about those transformations
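A small C sketch of the toolkit-side picking ebassi describes, simplified to plain 2D offsets instead of full 3D transforms; the widget struct and the pick function are hypothetical.

    #include <stddef.h>

    /* Hypothetical widget tree; a real toolkit would store full transforms.
     * Here each widget only has an offset rectangle in its parent's space,
     * with children ordered back-to-front. */
    struct widget {
        int x, y, width, height;
        struct widget **children;
        size_t n_children;
    };

    /* Map a pointer coordinate reported on the toplevel surface to the widget
     * under it: undo this widget's offset (the "reverse transformation", here
     * just a translation) and recurse into children, topmost first. */
    static struct widget *
    pick(struct widget *w, double x, double y)
    {
        if (x < w->x || y < w->y || x >= w->x + w->width || y >= w->y + w->height)
            return NULL;

        double local_x = x - w->x;
        double local_y = y - w->y;

        for (size_t i = w->n_children; i > 0; i--) {
            struct widget *hit = pick(w->children[i - 1], local_x, local_y);
            if (hit)
                return hit;
        }
        return w;   /* no child under the point; this widget gets the event */
    }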
<kennylevinsen>
Native functionality is mostly an illusion - what you have is just libraries trying to render the same way. E.g., SwiftUI vs. Cocoa vs. Carbon, Flutter's Cupertino libs, whatever React Native has, ...
<TzilTzal>
Well, that's an advantage you get with the "everything is a window" approach I guess. You don't worry about that.
<TzilTzal>
The hierarchy is part of the windowing system and not your toolkit.
<TzilTzal>
Either way - I'll go read about some of what you mentioned. Thanks again for your help.
iomari891 has quit [Ping timeout: 480 seconds]
TzilTzal has quit [Remote host closed the connection]
rv1sr has quit []
wmlhwl has quit [Remote host closed the connection]
fmuellner has joined #wayland
___nick___ has quit [Ping timeout: 480 seconds]
sima has quit [Ping timeout: 480 seconds]
coldfeet has quit [Remote host closed the connection]