from https://wayland.freedesktop.org/faq.html
Is Wayland network transparent / does it support remote rendering? No, that is outside the scope of Wayland. To support remote rendering you need to define a rendering API, which is something I've been very careful to avoid doing. The reason Wayland is so simple and feasible at all is that I'm sidestepping this big task and pushing it to the clients. It's an interesting challenge, a very big task and it's hard to get right, but essentially orthogonal to what Wayland tries to achieve.
This doesn't mean that remote rendering won't be possible with Wayland, it just means that you will have to put a remote rendering server on top of Wayland. One such server could be the X.org server, but other options include an RDP server, a VNC server or somebody could even invent their own new remote rendering model. Which is a feature when you think about it; layering X.org on top of Wayland has very little overhead, but the other types of remote rendering servers no longer requires X.org, and experimenting with new protocols is easier.
It is also possible to put a remoting protocol into a wayland compositor, either a standalone remoting compositor or as a part of a full desktop compositor. This will let us forward native Wayland applications. The standalone compositor could let you log into a server and run an application back on your desktop. Building the forwarding into the desktop compositor could let you export or share a window on the fly with a remote wayland compositor, for example, a friend's desktop.
As of 2020, there are several projects that use these methods to provide GUI access to remote computers. The compositor Weston provides an RDP backend. GNOME has a remote desktop server that supports VNC. WayVNC is a VNC server that works with compositors based on the wlroots library, such as Sway. Waypipe works with all Wayland compositors and offers almost-transparent application forwarding, like ssh -X.
Sorry you're being downvoted. You're not crazy; this exists. It's internally called Click Methods in libinput, and you're looking to use the option called clickfinger: https://wayland.freedesktop.org/libinput/doc/1.11.3/clickpad_softbuttons.html
Erm, Wayland has a working clipboard api: https://wayland.freedesktop.org/docs/html/ch04.html#sect-Protocol-data-sharing
> Show me that there is no discussion to be had on configurability and I will change my tune.
From the libinput FAQ:
># Can you add a configuration option for $FEATURE?
> No. At least that's going to be the initial answer. [...]
> So the answer to this question will almost always be 'no'. A configuration option is, in most cases, a cop-out.
And yes, I have read the overlong wording the author uses to justify his position (that essentially boils down to “I don't want to support it”). But the most important phrasing of note there is the arrogance behind this particular statement in the FAQ itself:
> libinput has several features that are handled automatically (and correctly) that users wanted to have configuration options for initially.
(emphasis mine). Of course, those same several features that are handled automatically and "correctly" are a continuing source of issues for a lot of users, which leads to continual changes over time (amazing how many ways you can implement some stuff, all different and all "correct"), and to users finding their hardware suddenly not working anymore and then having to roll back to older libinput versions, or change input systems (assuming they were wise enough not to have switched to Wayland yet).
> I will change my tune.
Unless you happen to be a major contributor to libinput and willing to go against the gatekeeper's will (or the gatekeeper himself), you changing your tune wouldn't really affect the situation in any significant way.
But Wayland didn't start from scratch. They've taken advantage of existing code and protocols in their design.
> Wayland is not really duplicating much work. Where possible, Wayland reuses existing drivers and infrastructure. One of the reasons this project is feasible at all is that Wayland reuses the DRI drivers, the kernel side GEM scheduler and kernel mode setting. Wayland doesn't have to compete with other projects for drivers and driver developers, it lives within the X.org, mesa and drm community and benefits from all the hardware enablement and driver development happening there.
E: Quote from Wayland FAQ
I can't talk too much about my context (aggressive NDAs), but I have a very wide picture - the software I need to support is a mix of desktops, training simulators and mobile devices. We've used X (not Xorg) for parts of this system for many many years and are in a strategic shift where alternatives are being evaluated in a methodical way. Wayland and about 6 others are on the table. The one I definitely do not want is at an advantage right now (and it's not Wayland..).
The core protocols are not as simple as you may think. They include things like dragging, dropping, minimizing, maximizing, fullscreen - you name it. See for yourself: here
The average desktop user is unlikely to care which window system they use. If you don't already have a reason to switch from one to another, then there is little to no point in switching.
There is of course no answer to your question where there is no method by which to evaluate either case, even if it was an unambiguous binary choice, but some relevant questions are answered in the Wayland FAQ.
Most folks would be advised to stick to whatever is offered by default in their desktop distribution of choice since it is likely to be the best integrated and supported. You should switch and report back on your experience.
Ok so you are arguing like the last 8 years of Wayland development didn't happen, and don't consider that Wayland. Awesome. Let's take a look at the garbage wl_shell protocol that is part of "core" (a useless "core", since even compositors are dropping wl_shell and wl_output, but what the hell).
https://wayland.freedesktop.org/docs/html/apa.html#protocol-spec-wl_shell Imagine that: it defines window management as popups, transient surfaces, fullscreen surfaces, an "in move" state, a "toplevel" state, an "in resize" state, a "maximized" state, a "minimized" state. The client needs to know these things so it can draw titlebar text and buttons accordingly. To decorate. The server needs to be instructed about "move" and "toplevel" because it can't know whether a pixel on a surface is contents, border or shadow, since the client drew them when it, you know, decorated.
Downvote? Is this no longer true ( from here: https://wayland.freedesktop.org/faq.html#heading_toc_j_11 )
"""How can I replace Wayland's Window Manager?
The Wayland architecture integrates the display server, window manager and compositor into one process. You can think of Wayland as a toolkit for creating clients and compositors. It is not a specific single compositor or window manager. If you want a different window manager, you can write a new one. A 'libweston' effort is underway in order to allow new environments to reuse Weston's codebase and mechanics, whilst providing their own look and feel. """
Most X11 deployments use libinput these days and Wayland doesn't even handle input (to the best of my knowledge) and again most compositors just use libinput.
Libinput seems to use device specific acceleration:
> libinput uses device-specific pointer acceleration methods, with the default being the Linear pointer acceleration.
> The methods share common properties, such as Velocity calculation.
> This page explains the high-level concepts used in the code. It aims to provide an overview for developers and is not necessarily useful for users.
In the end it's all about how to present it to the user in a digestible way.
The missing feature is called "coasting" or "kinetic scrolling". And I agree it sucks when it doesn't work.
Recent Ubuntu based distros use libinput as their touchpad driver. Older versions of Ubuntu used the Synaptics driver. Synaptics implemented coasting but it had a quirk. Suppose in Firefox you start a scroll so it coasts and then go to press ctrl+tab to switch tabs. As soon as ctrl is pressed the current page zooms out. This is because the touchpad driver is still sending a scroll signal to the app, so it executes the ctrl+scroll down function which is zoom out.
libinput is a newer driver. They implemented coasting the "proper" way, which requires application support. If apps don't support coasting with libinput you get no coasting at all. It does fix the quirk of Synaptics. Over time more apps will get support for coasting, either natively or by being based on a UI framework that has support.
I find so few apps currently have coasting support it's too painful to run libinput. So I install the old Synaptics driver, xserver-xorg-input-synaptics. In a few years libinput support may be widespread enough to make it worthwhile.
I run Kubuntu LTS. Perhaps more up to date distros, especially rolling release distros will have newer versions of apps with better coasting support.
We rely on 3rd party libraries/drivers to handle mouse input. As far as I know, none of them support customizing the curve beyond choosing whether or not to have mouse acceleration enabled at all. Here's what libinput does (the default on most distros these days): https://wayland.freedesktop.org/libinput/doc/latest/pointer-acceleration.html
No. On X11, the display server and the window manager (and maybe also the compositing manager) are usually different programs. On Wayland, that's usually all done by the same program, which is usually called the compositor. See https://wayland.freedesktop.org/architecture.html.
OpenSUSE lets you choose which protocol is used to build the graphical interface. You can learn about this on freedesktop.org, but you might be asking for a simple answer to a practical question. That is, "Which option should I choose, why, and what difference does it make?"
Um... OK? So I've looked at the easy guide to the Wayland Architecture again, and I sorta think I can mostly understand it.
So like Kwin and Mutter etc, as well as being the window managers for KDE and GNOME, will now implement the compositor in that architecture diagram -- so Mir will be a Wayland-protocol compositor, but not on desktops -- on IoT like embedded displays in vehicles and the like. I think.
So It's not Mir as a competitor to Wayland -- Wayland is a protocol and Mir will be display server/compositor. But Mir also had a "native" protocol, I think, and I can't figure out the future plans regarding that from the OP's article and the articles that it links to.
Is my minimal understanding on the right lines?
Hey -- I would just dive right in and start by reading this code overview.
Simula involves the following:
For VR libraries, we're working with a few to try to figure out which one can get us Simula rendering on the HTC Vive. Here's OSVR, for example.
Do you have a Vive?
Pretty sure it's supposed to work with no configuration.
You might be interested in:
http://who-t.blogspot.com/2017/07/libinput-and-pressure-based-palm.html
https://wayland.freedesktop.org/libinput/doc/latest/palm_detection.html
https://wayland.freedesktop.org/libinput/doc/latest/reporting_bugs.html
Regarding the stylus, developers are usually tied to the keyboard and thus would never use a stylus to control their device. Not to blame them, but development in free software communities is often driven by "scratching your own itch". I know what I am talking about, because my brother has been bothering me for 10+ years about when "Linux" will finally be able to replace Windows on his stylus tablet, and why I (as a developer) "don't do anything about it" (or any other developer, for that matter).
Windows (even Windows Vista) is still ahead in stylus support compared to free alternatives. Configurable pen flicks, gestures, cursive handwriting recognition in multiple languages, on-screen keyboard that is switchable to handwriting input etc.
On the positive side, Wacom support in libinput/Wayland has been added, see e.g. https://wayland.freedesktop.org/libinput/doc/latest/tablet-support.html and it now needs more integration in desktop environments, applications, and (more importantly) in GUI toolkits.
See e.g. https://www.reddit.com/r/Fedora/comments/5dylj0/fedora_25_wayland_wacom_as_primary_mouse_input/ for a similar discussion, which is not really Fedora related.
> That last part was, of course, a fucking lie. Nobody agreed on anything.
This is impressively wrong. What you managed to describe in the previous paragraph happens to be the actual state of things. And then you said that it was false.
See XDG portals. GNOME, KDE, and wlr all agreed on and implemented the XDG portal standards which provide, among other things, standard interfaces for screenshots, screen recording, remote desktop, etc.
Clipboard is standardized in core Wayland, see https://wayland.freedesktop.org/docs/html/ch04.html#sect-Protocol-data-sharing. Primary selection is implemented as a standard extension, see https://pontaoski.github.io/readway/wp_primary_selection_unstable_v1.html.
/u/SpaghettiSort
Sadly there's no support for 4 finger taps in libinput:
> libinput does not support four-finger taps or any tapping with more than four fingers, even though some hardware can distinguish between that many fingers.
Additionally, utilities such as libinput-gestures only recognize events raised by libinput, so there's no way to do this with libinput atm.
This is an X server bug: using Coordinate Transformation Matrix for a relative pointing device (such as a mouse) is broken (the matrix is wrongly applied to coordinates from the WarpPointer request).
Looks like the only remaining way to configure pointer speed when using the libinput driver is to set the mouse resolution in hwdb configs:
1. Run xinput list-props 'USB OPTICAL MOUSE' and look for the Device Node property (it would be something like /dev/input/event12).
2. Run mouse-dpi-tool /dev/input/event12 (use the device node which you found at step 1), move the mouse as requested, then press Ctrl+C to exit. The program should print the appropriate hwdb entry for the device (two lines, the second line starts with a single space).
3. Create /etc/udev/hwdb.d/71-mouse-local.hwdb and place the new entry there.
4. Run systemd-hwdb update to update the hwdb cache.
5. Run udevadm trigger /dev/input/event12 (use the device node which you found at step 1) to apply the new resolution settings.
You may want to adjust the resolution value in MOUSE_DPI to your liking (increasing the resolution value will make the pointer move slower). In some cases you may need to fix the first line of the entry (e.g., for a PS/2 mouse, mouse-dpi-tool gave "unknown bus type", which did not match the real device).
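For illustration, the entry it prints has this shape (the vendor/product IDs, name and values here are made up; use exactly what mouse-dpi-tool prints for your mouse):

mouse:usb:v1234p5678:name:Example USB Optical Mouse:
 MOUSE_DPI=1000@125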
libinput only gives you the raw inputs but does not manipulate them in any way. Xorg can use libinput as the input driver. Xorg server has its own mouse and keyboard settings.
Here is an excerpt from the documentation:
This is one of the criticisms of Wayland: it does not define any standards for things like mouse or keyboard configuration (and much more), so each project has to implement its own, which causes a lot of incompatibility and fragmentation.
Have a look at https://wiki.archlinux.org/index.php/Libinput and https://wayland.freedesktop.org/libinput/doc/latest/palm_detection.html
If you can't make it work, and you'd like some more help, tell us your specific type of computer. Some have really finicky touchpads.
I think the way libinput is supposed to work is that you tell it your mouse's DPI and it normalizes things so that all mice make the cursor go more or less the same speed, at which point you can adjust the sensitivity parameter up/down to get the actual pointer speed you prefer. Games and other apps that want super-high-precision mouse info grab the raw (unaccelerated) relative mouse deltas via the XInput2 extension.
https://wayland.freedesktop.org/libinput/doc/latest/motion_normalization.html
I may very well be full of shit though, I haven't had a chance to try the hwdb thing yet with my G100s.
Unfortunately no
Many old laptops have semi-mt touchpads. These touchpads are not reliable for gestures and libinput disables gestures on them.
That is "Inertial Scrolling" and is available on Wayland. Unfortunately Linux Mint (Cinnamon, MATE, XFCE) still uses Xorg and the transition to Wayland could take a bit of time.
https://wayland.freedesktop.org/libinput/doc/latest/scrolling.html
Wayland is a protocol that the developers can implement how they see fit.
Wayland window managers are also the compositors and the display servers in one program. On X the WM runs on top of the server as any other userspace program.
There is XWayland which could be used to run X programs on Wayland for some time, but in the long term a Wayland backend would probably be ideal. (Based on what happened last time Wayland was hyped, moving to Wayland is also very long term...)
Libinput is the library that handles this
https://www.freedesktop.org/wiki/Software/libinput/
There's a setting somewhere for this, you just need to find it. Maybe here....
https://wayland.freedesktop.org/libinput/doc/latest/touchpad_pressure.html#touchpad_pressure_hwdb
Feedback:
I think you should add that Wayland is a replacement for X11; the X.org equivalent in the Wayland ecosystem is the various Wayland compositor libraries (wlroots/smithay/libweston).
>Has fantastic Wayland support.
I would not call GNOME's support "fantastic": Canonical decided to stay with X.org for 18.04 (for reasonable reasons), and the Wayland port also has some performance problems even when compared to the X.org port.
> i3 cons
Also has screen tearing. There is compton, but in my experience using it has its own problems.
The most important part is making sure this page is somehow discoverable, maybe finding it a good home like the Wayland documentation or Linux Journey, where it could be more discoverable and other people can maintain/improve it.
> Wayland makes no assumptions about the display.
Except it does communicate output properties, hence the wl_output subprotocol. The definition of that protocol wouldn't hold for a polar screen or actual 3D content.
> Somebody built a VR Wayland compositor that displayed windows in 3D space where they could be picked up, pinned to walls, rotated, whatever so windows had not only XYZ location but also rotation and scale to worry about.
They are still just planes in space with surface-relative coordinates. There's nothing that would make Wayland more or less suited than X for this purpose, and it's practically the same thing the VR desktop for Windows does.
> For all Wayland cares the screen could be circular and using polar coordinates for its pixels. > ... > But yeah, Wayland doesn't allow windows to decide their own position.
Except both these assumptions are incorrect due to the wl_subsurface protocol, which does allow parent-relative positioning in both X,Y and Z (order).
An easy trick to reproduce the X window positioning model? Write a client that takes a surface and sizes it according to the output (optionally, set it as fullscreen), marks it as transient, and attaches and commits a buffer with zero alpha. Create all your real windows as subsurfaces and use the set_position part of the subsurface protocol to control their positioning. Write your client to act as a compositor itself. Now you have window positioning control, input capture, input injection, clipboard control.
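A rough sketch of the subsurface part of that trick, assuming the client has already bound the wl_compositor and wl_subcompositor globals and created the parent surface and a buffer for the child (names are placeholders; buffer creation and error handling omitted):

#include <wayland-client.h>

/* sketch only: the caller is assumed to have bound these globals and
 * created the parent surface and the child buffer elsewhere */
static struct wl_subsurface *
place_child(struct wl_compositor *compositor,
            struct wl_subcompositor *subcompositor,
            struct wl_surface *parent_surface,
            struct wl_buffer *child_buffer)
{
    struct wl_surface *child = wl_compositor_create_surface(compositor);
    struct wl_subsurface *sub =
        wl_subcompositor_get_subsurface(subcompositor, child, parent_surface);

    /* position is relative to the parent's top-left corner and takes
     * effect on the next commit of the parent surface */
    wl_subsurface_set_position(sub, 200, 150);

    /* let the child surface update independently of the parent's commits */
    wl_subsurface_set_desync(sub);

    wl_surface_attach(child, child_buffer, 0, 0);
    wl_surface_commit(child);
    wl_surface_commit(parent_surface);
    return sub;
}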
Yeah I've seen it described as 'living documentation', I'd expect it to only be for reference rather than real world use but the Wayland site does say that " The Weston compositor is a minimal and fast compositor and is suitable for many embedded and mobile use cases."
When reddit threads on Wayland pop up there's usually someone decrying Weston's performance, so some people are using it; also I've seen Weston suggested loads of times as a fix for screen tearing.
https://wayland.freedesktop.org/docs/html/ch04.html
I started the work some time ago based on this, but never made significant progress, because it just wasn't high priority enough. But the wire-protocol is pretty simple.
The Wayland protocol itself does not have any networking functionality built in: there are rumours on the web that it is not "network transparent" like X11 is. It is, however, possible to write a compositor which talks to remote computers over the network. That essentially means that you are free to choose a protocol for network communication; it does not need to be the same as the server-client protocol used on your local machine (as is the case with X11). You can make it as efficient as you want it to be.
See also https://wayland.freedesktop.org/faq.html#heading_toc_j_8
would be much more interesting to see:
a thorough, technical comparison (not the emotional crap being spewed, tons and tons of "I feel like this and that") of the X, MIR and Wayland communication PROTOCOLS. For example, if someone tried to convince everyone to switch from HTTP to PANDA (made-up name, but it sounds cute, right?) and dump all the associated infrastructure and technical expertise built up during the last 20 or so years, there should be a damn fine explanation for doing so, not "it feels snappier". I'm looking for that explanation and haven't seen it in either Daniel's presentation or anyone else's.
going beyond the plain and obvious tearing / non-tearing, yes it's one of the more glaring problems from misconfiguration. I've had it in Weston too, because.. shitty drivers. I read things like https://wayland.freedesktop.org/docs/html/ch02.html#sect-Compositors-Embedding-Compositor
Ok, System Compositor, Session Compositor, Embedding Compositor. The happy ambiguity of what someone means by 'the compositor' now. But a compositor is not a 'shell', or is it?
"This is often used for applets in a panel, browser plugins and similar. Wayland doesn't directly allow this, but clients can communicate GEM buffer names out-of-band, for example, using D-Bus, or command line arguments when the panel launches the applet"
This reads to me like: "for this very common desktop IPC-related problem" (which is all Wayland is supposed to be) we don't approach it, but are happy to stack other IPC solutions together to work around it.
Seems like the most important work here is not as much liberating us from X as it is liberating us from the X driver model ...
> this review doesn't dive into
The Wayland wiki says it doesn't support it, so my guess is that it isn't in the stable version, and the review not diving into it is normal.
It looks like you're struggling a little with mouse control. Might want to make sure you're using the flat mouse accel profile in libinput (might not be applicable for you) and switch over to lower overall sensitivity, plus maybe a little mouse accel in game.
Otherwise, try not to jump onto people with rail out in the open -- you'll generally get out-DPS'd pretty hard by good players. So maybe try use rockets more, but slow down your aim with them so you're not aiming so short, and so you're thinking about where they are likely to go/dodge to, and can aim accordingly.
Since you're using the term "window manager", I'm assuming you're talking about X.
The window manager isn't really involved in either of these things. But if you're using a compositor, then that is involved in getting pixels from the clients to the screen. (Of course the compositor is often a part of the window manager, so the window manager is involved in that case; but it is involved because it is the compositor, not because it is the window manager.)
This page has a short explanation of how this works, both on X and on Wayland.
> What is the relationship between the display server and the window manager?
The window manager is a client and the display server is the server.
https://wayland.freedesktop.org/architecture.html
Don't be fooled by the address link. The site explains both how Xorg and Wayland work and their differences, using flowcharts detailing each step. I don't know how to ELI5 all of those, but I'm sure you can understand them.
Not sure what your specs are, etc. I can say on ubuntu 19.04 I had to rebuild libinput from source and my trackpad actually started working with multitouch stuff, and scrolling in general was much more fluid and natural
https://wayland.freedesktop.org/libinput/doc/latest/building.html
I think this (the LIBINPUT_IGNORE_DEVICE udev property) is the correct way to force any wayland compositor to ignore an input device.
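An untested sketch of what such a rule could look like (the file name and the device name are placeholders for your own setup):

# /etc/udev/rules.d/99-libinput-ignore.rules  (name is up to you)
ACTION=="add|change", KERNEL=="event*", ATTRS{name}=="Example Built-in Keyboard", ENV{LIBINPUT_IGNORE_DEVICE}="1"

Then reload with udevadm control --reload-rules && udevadm trigger; the device may need a re-plug or a session restart before the compositor actually lets go of it.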
This page gives a good overview of X and Wayland architecture: https://wayland.freedesktop.org/architecture.html
> I ask because after installing Arch the lazy way (EndeavourOS, thank you guys for the work), my Youtube videos were insufferably laggy until I learned that I had to enable full composition somewhere.
Are you running Nvidia by any chance? (it has an "enable full composition")
I was running X without compositor on AMD for quite a while and never any issues with tearing or stuttering.
Maybe give libinput a try.
I can't vouch for it because i've never used it myself but i have seen it being posted as a solution here and there.
X was created for a world which no longer exists. The idea was that it would implement its own lightweight graphical window toolkit, and be network transparent, meaning you could operate graphical windows on a remote server from a local terminal over a network. The current reality is that nobody uses that toolkit, because it's ugly and lacks features, so they use GTK or Qt which essentially just push pictures of windows to the screen, rendering the networking aspect all but useless. It's been extended and had things tacked on for 30 years to try and keep it up with the times. The result is a nightmarish tangled web of bastardized code which is very difficult to maintain. It wasn't designed to support compositing, so it doesn't do that very well. There's a bunch of unnecessary overhead with compositing on X because it's essentially a hack. This causes some lag.
https://wayland.freedesktop.org/x-architecture.png https://wayland.freedesktop.org/wayland-architecture.png
Wayland is designed from scratch with a narrower scope. X11 is bloated and carries a lot of legacy cruft around with it.
tldr:Wayland was created by X maintainers who were tired of enduring the trauma of maintaining X.
I haven't learnt how opensuse packaging works, but to get the latest, this could be a case of (a) building libinput from git (I was doing this a while back on Ubuntu, when upstream had T480 touchpad improvements I was impatient for) (https://wayland.freedesktop.org/libinput/doc/latest/building.html)
(b) use the tumbleweed package, which I guess is pretty up to date or (c) use OpenSuse packaging to build your own updated package.
I never had problems using a new libinput.
If you are using xorg, then the best way is to use libinput and learn how to use the xinput command.
The distributions have been slow to support all the libinput settings, which is crazy since it is the only thing that works with Wayland, so it is clearly the future.
Although KDE now seems to support all the libinput settings as of 5.16. Except multi-finger gestures, that still requires manual setup.
In general libinput exposes fewer settings than older drivers (by design).
e.g.
# xinput (to list all devices)
# xinput --list-props 13
DEVICE="SynPS/2 Synaptics TouchPad"
xinput --set-prop "$DEVICE" "libinput Accel Speed" 0
xinput --set-prop "$DEVICE" "libinput Natural Scrolling Enabled" 1
xinput --set-prop "$DEVICE" "libinput Tapping Enabled" 1
xinput --set-prop "$DEVICE" "libinput Disable While Typing Enabled" 1
I've been using a Thinkpad since 2017 and the libinput touchpad was very bad on my P50 initially. It turns out however that libinput is quite easy to work with and understand, and this is open source ... so I got in touch with the very friendly developer and sent some tweaks. The p50 is much better now. I use a T480 now, and I find the trackpad experience very good, as good as Windows (it's dual boot).
The trackpoint: libinput still does not do this very well in my opinion, it's way too fast for me, out of the box.
However, there is a "secret" setting: the 'magic multiplier': https://wayland.freedesktop.org/libinput/doc/latest/trackpoint-configuration.html
put the file in: /usr/share/libinput/local-overrides.quirks
I use on the t480
[Trackpoint Override]
MatchUdevType=pointingstick
AttrTrackpointMultiplier=.75
on a X230 my daughter uses, I made the multiplier 0.5
This is actually a case of open source done right. It allows for collecting of evidence to convince driver devs of the need to patch.
https://wayland.freedesktop.org/libinput/doc/1.11.3/touchpad_jumping_cursor.html
Completely different architectures, as shown here: https://wayland.freedesktop.org/architecture.html As you can see, X11 is complicated and has many layers; Wayland does not. Fewer layers == smoother/faster performance.
Not noobish at all... just very hard to answer. The best I can do is link to their FAQ. It is meant as part of a replacement for X11 (Xorg on Linux), so as a base for graphical output, but with different goals and a different architecture. A lot more work falls on DEs/WMs now, but with the gain that the solutions can be much better tailored to the individual devices.
Window protocol, if you will. "System" does not imply implementation; X11 is as much of a system and a protocol as Wayland is. The fact that we colloquially say "X11" to refer to the X.org implementation doesn't make X11 a specific implementation of the protocol.
Contrary to what you think, I know the basics, thanks, and I also happen to know a bit about the inner workings of Android. Now, you tell me: where in surfaceflinger's source code do you see code that handles more than just display (e.g. redispatching input events)?
At the same time, Wayland does more than just display.
A few examples:
- wl_event_loop. Where does surfaceflinger expose a general-purpose, I/O-relaying event loop to applications? Have a look at Android's Looper object.
- wl_data_offer and related structs/functions. Does surfaceflinger handle drag and drop, or more relevantly copy-paste actions? It does not. Binder is responsible for that.
- wl_shell. Surfaceflinger doesn't handle shell borders/surfaces AFAIK. Answers to what does will most likely be found in SystemUI's code.
- wl_seat. Surfaceflinger blits to a framebuffer, that's all; it's not aware of seats or any remotely similar concept.
- wl_pointer, wl_keyboard, wl_touch. That's inputflinger's job.
You're either pretending you know, or you do not realize the extent of your knowledge. Please refrain from looking down on others just because you think you know better; actually show you do.
Is Wayland network transparent / does it support remote rendering?
> This doesn't mean that remote rendering won't be possible with Wayland, it just means that you will have to put a remote rendering server on top of Wayland. […]
And please also read this.
> The idea behind the network transparency is to send drawing commands over the wire (a useful idea for the requirements of 30 years ago). Nowadays modern applications do not use X11 any more for rendering. They use technologies like Cairo, Clutter, QPainter (Raster) or OpenGL directly. Without using X11 for rendering you end up streaming pixels over the wire. And there are clearly better technologies to do that than X11. Face it: network transparency is going to break very soon even without Wayland. I want to see Qt 5 used over the wire.
Plus what /u/Valmar33 wrote.
Sorry, I do not know what that could be. Possible stuff to look at: are you using libinput or the synaptics driver? If libinput, I read about a version that was causing problems (I do not remember the details; development seems quite active, so it might be worth checking your version and its known problems). If libinput, "sudo libinput debug-events" will show you the data from the trackpad; I'm not sure what is good or bad there. Hopefully someone else has a better idea.
Other touchpad related stuff I read: configuration of the sensitivity with libinput: https://wayland.freedesktop.org/libinput/doc/1.7.3/touchpad_pressure.html For info, my hwdb config file contains:
libinput:name:SynPS/2 Synaptics TouchPad:dmi:*svnLENOVO:pvrThinkPadX240*
 LIBINPUT_ATTR_PRESSURE_RANGE=45:40
 LIBINPUT_ATTR_PALM_PRESSURE_THRESHOLD=120
 ID_INPUT_WIDTH_MM=87
 ID_INPUT_HEIGHT_MM=67
 LIBINPUT_ATTR_SIZE_HINT=87x67
That's what I thought until I read this: https://wayland.freedesktop.org/libinput/doc/latest/
>libinput is not used directly by applications, rather it is used by the xf86-input-libinput X.Org driver or wayland compositors. The typical software stack for a system running Wayland is:
Some examples that I found on the internet are calling functions that I can't find documentation for, but I did find a keyboard listener struct in libwayland-client's documentation.
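For what it's worth, this is roughly how that listener struct gets wired up with libwayland-client; a sketch assuming you already have a wl_seat with the keyboard capability (every handler is supplied, even if only as a stub, since events are dispatched straight to these function pointers):

#include <wayland-client.h>

static void kb_keymap(void *data, struct wl_keyboard *kb,
                      uint32_t format, int32_t fd, uint32_t size) { /* mmap/parse keymap here */ }
static void kb_enter(void *data, struct wl_keyboard *kb, uint32_t serial,
                     struct wl_surface *surface, struct wl_array *keys) { }
static void kb_leave(void *data, struct wl_keyboard *kb, uint32_t serial,
                     struct wl_surface *surface) { }
static void kb_key(void *data, struct wl_keyboard *kb, uint32_t serial,
                   uint32_t time, uint32_t key, uint32_t state)
{
    /* key is an evdev keycode, state is pressed/released */
}
static void kb_modifiers(void *data, struct wl_keyboard *kb, uint32_t serial,
                         uint32_t mods_depressed, uint32_t mods_latched,
                         uint32_t mods_locked, uint32_t group) { }
static void kb_repeat_info(void *data, struct wl_keyboard *kb,
                           int32_t rate, int32_t delay) { /* only sent with seat version >= 4 */ }

static const struct wl_keyboard_listener kb_listener = {
    .keymap = kb_keymap,
    .enter = kb_enter,
    .leave = kb_leave,
    .key = kb_key,
    .modifiers = kb_modifiers,
    .repeat_info = kb_repeat_info,
};

static void setup_keyboard(struct wl_seat *seat)
{
    struct wl_keyboard *keyboard = wl_seat_get_keyboard(seat);
    wl_keyboard_add_listener(keyboard, &kb_listener, NULL);
}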
One of the reasons I discontinued GLXOSD is that I don't have a pure Wayland-capable machine to use for testing and I don't really want to spend time fixing input handling bugs for the dying X.
There seems to be some kind of mechanism to enumerate the available modes in the docs https://wayland.freedesktop.org/docs/html/apa.html#protocol-spec-wl_output Not having used the API I don't know how this works, though
As far as i know with libinput you can only select either the adaptive profile which has acceleration or the flat profile without acceleration. Of course you can adjust the pointer speed on both profiles. Every OS has their own acceleration algorithm which feels a little bit different. Here is described how libinput does it: https://wayland.freedesktop.org/libinput/doc/latest/pointer-acceleration.html
I myself use the flat profile because i dont like any kind of acceleration for gaming.
Yup, just tried it, works as advertised. Though the libinput docs aren't fooling when they say
> Note that HW does not usually provide information about run-time resolution changes, libinput will thus not detect when a resolution changes to the non-default value.
So if you have a multi-DPI mouse like the G100s, make sure the default DPI (the asterisk'd one) is the one you want to use, like this:
# /etc/udev/hwdb.d/71-mouse.hwdb
# Logitech G100s
mouse:usb:v046dpc247:name:Logitech G100s Optical Gaming Mouse:
 MOUSE_DPI=1000@500 1750@500 *2500@500
If you're using MATE 1.18, note that the Mouse Preferences "Acceleration" slider won't actually let you select a 0.0 libinput sensitivity adjustment via the keyboard, only ±0.1 (see the mate-settings-daemon source). Use dconf-editor on /org/mate/desktop/peripherals/mouse to set motion-acceleration to 5.5 (which maps to 0.0) instead.
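If you prefer the terminal over dconf-editor, the same change should be doable with (same key path as above, assuming the schema hasn't moved in your MATE version):

dconf write /org/mate/desktop/peripherals/mouse/motion-acceleration 5.5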
Have a look at https://wayland.freedesktop.org/libinput/doc/latest/reporting_bugs.html The libinput developers are quite fast to respond to bug reports in my experience. Maybe even a workaround exists.
> also totem video player offers to install the missing h264 codec but the install hangs then stops. Any advice?
I would install mpv from rpmfusion. It's a lot better than Totem IMHO ;)
they said they won't implement it, basically because it's too hard^*, which I don't understand since the synaptics driver handles it just fine.
edit: * well, it's because there's a "bug" that sounds like expected behavior to me...
Might have something to do with new pressure settings introduced in libinput 1.7.x, for example my touchpad was impossible to use without really high amounts of pressure and attention until I adjusted it using this guide:
https://wayland.freedesktop.org/libinput/doc/latest/touchpad_pressure.html
It was happening on both Wayland and Xorg sessions.
Thanks! That's exactly what I failed to find on my own. (I think I ended up at https://wayland.freedesktop.org/docs/html/ch03.html#sect-Wayland-Architecture-wayland_architecture by Googling, but never thought to check the other chapters.)
I probably don't have the time to do it either, but at least now I know where to find the documentation if I get the itch..
[wayland_msg_hdr_including_type_(extension_"screenshot"), 8bits_for_format, 2or8bits_region_switch(fullscreen, region or window; 6bits can be for extensions), .. if window or specific region then window id or coordinates here]
image format would be just a bunch of #define -s (or enum, although i don't like them), or even an id from a pre-shared array of [id, format] pairs (with "format" being an array of [color(r/g/b/a), type(half_float,float,double,nbit_int)]).
server then returns either a "would you like me to give it to you in this way" (just a plain shared memory fd + size, or an opengl buffer or whatever vulkan thing, or or things we can't be sure of will be available on $platform; although just passing raw data through an UDS should always work and be used as a fallback)
a request for a certain way of delivery can even take up a few bits of the spare 6 bits above
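purely as an illustration of the shape such a thing would take (this is not an existing extension, just a made-up sketch written in the same XML format the official protocol definitions use):

<protocol name="example_screenshot">
  <interface name="example_screenshooter" version="1">
    <request name="capture">
      <arg name="output" type="object" interface="wl_output"/>
      <arg name="buffer" type="object" interface="wl_buffer"/>
    </request>
    <event name="done"/>
  </interface>
</protocol>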
how about writing down on a piece of paper all the requirements people have, then hacking it ?
even writing down all the platforms, past present and future, that wayland will run on and what ways of doing things they support
(note that i have just woken up and that i have not read the wayland protocol spec as it is in XML (i just found this) and i have an nvidia card so i can't play)
PS thinking about it, the server could return a "can't convert to format, would you like it in this format" msg, as alsa does it
(or "this format is native, would you rather take it like this ?")
as i said, if it's technical i volunteer to do some research and propose a protocol.
it's not rocket surgery
edit:
nvm, i doubt you care what i say
Wayland was created to make the lives of Xorg developers easier. They did this by placing more of the burden for the display stack on the shoulders of the desktop and window manager teams.
from Wayland FAQ:
> ...A 'libweston' effort is underway in order to allow new environments to reuse Weston's codebase and mechanics, whilst providing their own look and feel.
So there is hope
EDIT: a word
I'm very much not sure, and judging by you getting this far you may have already been here, but googling a bit brought this up, may be helpful:
I don't see this option in that 'list-props'.
Also, nice keyboard. I've got the Customizer 104 USB from them (they call it something different now) and I love it but I do wish I'd gotten one with the Trackpoint.
What you want is called "kinetic scrolling".
According to the libinput website, it is not available using libinput:
> libinput does not implement kinetic scrolling for touchpads. Instead it provides the libinput_event_pointer_get_axis_source() function that enables callers to implement kinetic scrolling on a per-widget basis, see Scroll sources.
-- https://wayland.freedesktop.org/libinput/doc/latest/faq.html
It is now up to the applications to implement kinetic scrolling.
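For the curious, the per-widget decision the docs describe looks roughly like this on the consumer side (a sketch only; the calls are real libinput API, but normally the compositor or the xf86-input-libinput driver sits in this position, not the application itself):

#include <libinput.h>

static void handle_scroll(struct libinput_event *ev)
{
    struct libinput_event_pointer *p = libinput_event_get_pointer_event(ev);

    if (libinput_event_pointer_get_axis_source(p) ==
        LIBINPUT_POINTER_AXIS_SOURCE_FINGER) {
        /* finger scrolling: the toolkit/widget may start a kinetic "fling"
           when the fingers leave the touchpad */
    } else {
        /* wheel or button scrolling: no kinetic behaviour */
    }
}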
Sorry for the harsh tone. Dunno why I was such an ass.
As far as I understand it, GNOME always uses linear pointer acceleration. It uses the speed setting as an acceleration factor (see figure: Linear pointer acceleration). Depending on the acceleration factor (-1 to 1) it uses a different acceleration curve. Everything below -0.75 is almost linear (acceleration should not be noticeable) but also really slow.
So the only way to remove acceleration is to reduce mouse speed. I don't know how to set a decent (for me) mouse speed without triggering a higher acceleration curve and therefore "activating" acceleration again.
That is why libinput has a flat acceleration profile (see link, bottom).
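If you are on an X session, the libinput driver exposes that profile as a device property, so something like this should switch a mouse to flat (the device name is a placeholder; use whatever xinput list shows for yours):

xinput set-prop "Your Mouse Name" "libinput Accel Profile Enabled" 0 1

Newer GNOME versions also have a gsettings key for it, though it may not exist on the version being discussed here: gsettings set org.gnome.desktop.peripherals.mouse accel-profile 'flat'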
Right. You can read about the architecture here if you want. There are some major potential performance gains with Wayland and some security improvements too. On the other hand, it no longer includes network transparency for having a program running on a server and its associated window appearing on your laptop, for instance, and people will have to use remote desktop tools instead. And the fact that window decorations - the title bar and its buttons - aren't handled by the window manager could theoretically cause problems for people such as myself doing certain weird things with their computers, but it seems to be less of a problem than expected.
Well that's strange indeed. The drag lock timeout lasts 300 milliseconds so it should be noticeable.
If you're going to report it to libinput I suggest reading this first as it could help you get your issue solved quickly.
Not possible with the current touchpad driver libinput, but it was possible with the older synaptics driver. I would advise against using old drivers though.
Depending on the application, single tap-and-drag will continue to scroll the view allowing you to highlight more text without moving the cursor. Firefox does this for example, but GNOME Shell doesn't.
If you want to set libinput tap-and-drag-lock instead run this command:
gsettings set org.gnome.desktop.peripherals.touchpad tap-and-drag-lock true
To disable it run the same command with false.
You can also try to measure if the currently used range and resolution are correct: https://wayland.freedesktop.org/libinput/doc/latest/absolute-coordinate-ranges.html
> I am not sure why it says "Tap-to-click: disabled", because it is enabled in the settings.
This tool is not aware of anything changed by the compositor, it can only show the default settings.
That would be something to be fixed at the libinput or kernel level when it comes to detecting the right events and the "speed" of the events would probably best be fixed at either the libinput or the toolkit/application level. GNOME only forwards these events.
This might help with detecting multi-touch: https://wayland.freedesktop.org/libinput/doc/latest/touchpad-pressure-debugging.html
And check if sudo libinput list-devices has the correct size for your touchpad.
You seem to have libinput edge scrolling enabled, change it to two fingers.
the first sentence from wayland.freedesktop.org:
"Wayland is intended as a simpler replacement for X, easier to develop
and maintain. GNOME and KDE are expected to be ported to it."
Yes that's exactly the heuristics the page is talking about, but still you should contact the developers as your hardware is clearly not working properly. I can't do anything to help you, sorry.
It looks like Linux doesn't fully support your touchpad.
If you know this touchpad works with 3 finger gestures (say in Windows) you could try raising an issue in libinput's gitlab. They'll help you better and faster if you give them the needed information to help you. They'll even forward the problem to an appropriate party if they can't fix it themselves.
Some touchpads, even if they are multi-touch, only report the positional coordinates of at most 2 fingers. Those are called 2-slot touchpads. Even if the touchpad can detect the presence of 3/4 finger taps, it can't determine their motion and so can't do 3/4 swipes.
As the page says, there are some software heuristics applied to two-slot touchpads to send 3/4 swipes even if the positional coordinates of the 3rd/4th finger can't be determined, but if the touchpad has worse hardware support than expected then no amount of software can help here.
Did it work before?
Try libinput debug-events and see if you can generate GESTURE_SWIPE_* events with 3 fingers. A short example:
event8  GESTURE_SWIPE_BEGIN   +1.466s   3
event8  GESTURE_SWIPE_UPDATE  +1.466s   3 -36.94/ 6.74 (-36.90/ 6.74 unaccelerated)
event8  GESTURE_SWIPE_UPDATE  +1.479s   3 -51.13/ 3.36 (-40.13/ 2.64 unaccelerated)
event8  GESTURE_SWIPE_UPDATE  +1.496s   3 -69.51/ 1.01 (-40.42/ 0.59 unaccelerated)
event8  GESTURE_SWIPE_END     +1.520s   3
The other key is play/pause. If you have an audio or video playing this key will resume/stop playback. I can confirm it shows this 🚫 when you have nothing playing.
I can't tell from your testing if you were spamming the ctrl key or you were just holding it. Also I don't know if you tested the key combinations here or you were just pressing ctrl. But I can see it takes ~8 ms for a KEY_PLAYPAUSE press to show after a KEY_LEFTCTRL press, and ~10 ms for a KEY_PLAYPAUSE release after a KEY_LEFTCTRL release. In one part the KEY_PLAYPAUSE release comes before the KEY_LEFTCTRL one, and the time between releases is shorter: 7 ms unlike the usual ~10 ms.
So it seems a bit regular, but also erratic. I doubt this has anything to do with the physical ⏯️ key located in the F10 position, it seems like a keyboard driver bug, you should start reporting it to libinput.
For the moment, one workaround would be to disable the PLAYPAUSE key, so it doesn't interfere with other shortcuts. In file /usr/share/X11/xkb/keycodes/evdev find the line:
<I172> = 172; // #define KEY_PLAYPAUSE 164
And comment it out (put // at the start). You could also disable it in the virtual console if you use it. Both workarounds will disable your physical ⏯️ key too, so it's important you find a proper fix.
Well the good news is that it seems to be recognized as something, as that is the GNOME On Screen Display. Try running
sudo libinput debug-events --show-keycodes
and press the control key to know the keycode. If you don’t have libinput tools, install it like so:
sudo apt update
sudo apt install libinput-tools
In traditional numpads all the keys only have one keycode, and XKB uses the NumLock modifier to toggle between the two symbol layers (NmLk off = directional, NmLk on = numeric). The default also allows for using Shift to change between the layouts (NmLk on + Shift = directional instead of numeric). Both NumLock and Shift are software modifiers.
The FN key is usually a hardware one, but some bluetooth keyboards regard them as software. In case it's a software one then you have the case described above, plus the FN modifier, totaling 4 layers (the combination of NmLk + Fn). If it's a hardware one (I think this is it) then you have two keycode layers (Fn on or off), each of them having 2 layers (NmLk on or off). If you want you can add more modifiers to the mix (Shift, AltGr) giving you 2^x symbol layers (where x = number of modifiers).
Regardless of what you have, you only need two symbol layers total: one for media keys and another for numbers, and you can certainly remap them with an XKB layout.
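To sketch what such a remap could look like (a made-up example, not taken from your keypad; the real keycodes come out of the debug-events run described next), an XKB symbols fragment might be:

partial keypad_keys
xkb_symbols "media_numpad" {
    // level 1 = media action, level 2 (selected by NumLock via the KEYPAD type) = digit
    key <KP7> { type= "KEYPAD", [ XF86AudioPrev, KP_7 ] };
    key <KP8> { type= "KEYPAD", [ XF86AudioPlay, KP_8 ] };
};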
But first let’s run some diagnostics, that way we can better understand the keypad, and maybe we can fix the unused NmLk modifier. Install libinput helper tools:
sudo apt update
sudo apt install libinput-tools
And run sudo libinput debug-events --show-keycodes. Press some keys and take notice of their KEY_X (###); if nothing shows up there's a good chance the key can't be remapped.
Natively Wayland does not solve this problem, as it is out of scope and overcomplicates things. Waypipe can solve your problem.
Sorry, I forgot to say you need to install the helper tools. I think for Pop_OS the package is libinput-tools:
sudo apt update
sudo apt install libinput-tools
It's a libinput "clickpad" option. You want "clickfinger" (https://wayland.freedesktop.org/libinput/doc/latest/clickpad-softbuttons.html?highlight=clickfinger#clickfinger) rather than "software button areas" (https://wayland.freedesktop.org/libinput/doc/latest/clickpad-softbuttons.html?highlight=clickfinger#software-buttons). To set this in Sway, as listed in man sway-input, you need the following:
input type:touchpad {
    click_method clickfinger
}
Note that it's clearer to call it a "two-finger click" and a "three-finger click", since "double click"/"triple click" usually mean clicking with a single finger two or three times in a row.
Don't know exactly how you do this, but I do know you'll want to check out this to see how libinput deals with button areas and 'clickfinger' behaviour, and the 'Libinput Configuration' section of sway-input's man page (man sway-input) to see how to apply that in your Sway config file.
https://wayland.freedesktop.org/faq.html
Is wayland replacing the X server?
-Mostly, yes.
You might wanna get off your horse and actually listen to user requests instead of bragging about stuff that already works. X is dead, yes, but that doesn't help users if for their use cases Wayland isn't ready yet; this is how products die when they stop listening to their users, who keep using the old bloated stuff till it breaks and then switch to something else. Also, since Wayland sits between the compositor and its clients, it's not only up to the compositor what to implement; the protocol (Wayland) has to provide a way for it. X11 being bloated and a security risk without a future is all the more reason to actually implement features instead of arguing why it doesn't matter that they are not there yet.
- A good introduction to Wayland's basic principles and functionality. Contains a few basic example clients (missing handling for multiple seats, output scaling, etc.)
- /usr/share/wayland/wayland.xml and /usr/share/wayland-protocols/*/*/*: the Wayland protocol definitions from which wayland-scanner generates wrapper code
- A webapp for reading Wayland protocol definitions. The protocol definitions aren't always up-to-date, though
- Documentation for Wayland's basic principles, wire protocol and libwayland functions
Ah, it's ClickMethod then. The default is buttonareas; change it to clickfinger.
By double tap do you mean two finger taps simultaneously? Or one tap after another a la double click?
If it's the former, this is already the default; you just need to enable Tapping and ensure Tapping Button Map is still set to lrm.
If it’s the latter then I don’t think so.
There's no need for that to happen. The reason you need to bypass the compositor in X is that the compositor is a separate pipeline. It completely inserts itself into the framebuffer process, which will always add that delay. This is because X was never built with compositors in mind (since they didn't exist in the 80's). With Wayland the compositor is the pipeline. Apps don't get to the framebuffer without being composited, ever. There is no skipping it: you'd be skipping the entire pipeline.
>The whole point of Wayland is that it's just a protocol.
It's not just a protocol.
From https://wayland.freedesktop.org/ ,
"Wayland is a protocol for a compositor to talk to its clients as well as a C library implementation of that protocol."
You need to use Weston. I don't intend to use Wayland any time soon.
Script I use to start and close Waydroid. I bind it to META + W -> https://github.com/rizzini/Dotfiles/blob/master/Documentos/scripts/waydroid.sh
a few references to touchscreens (i haven't looked through them) in the libinput documentation: https://wayland.freedesktop.org/libinput/doc/latest/search.html?q=touchscreen&check_keywords=yes&area=default
Probably no, unfortunately. I used to use both two-finger and edge scrolling with a supported touchpad, but on Wayland with libinput I can't use both.
https://wayland.freedesktop.org/libinput/doc/latest/scrolling.html
See: https://wayland.freedesktop.org/faq.html#heading_toc_j_8
VNC, RDP and NoMachine NX work.
Oh, and ssh X forwarding has worked for a long while for some cases, waypipe takes care of the rest.
You need to install the "build dependencies" for libinput first.
On Fedora, this is easy.
See: https://wayland.freedesktop.org/libinput/doc/latest/building.html
and in particular run the Fedora command under the subheading Build Dependencies
https://wayland.freedesktop.org/libinput/doc/latest/building.html#build-dependencies
There is no binary for this. Not many people know about it; it's only got 11 stars. I don't know why, it solves two very annoying problems for me with GNOME Wayland: the scroll speed can be changed, and I can do drag scrolling with a mouse button just as I can configure under Xorg.
Wayland is not supported because its input stack is so different from the one Xorg uses; Touchegg's scope is limited to Xorg only. Here the author briefly describes this problem (note that touchegg now supports X11/libinput). In order to give proper Wayland support, a new utility would need to be written from scratch with Wayland's input stack in mind.
I want to point out Wayland compositors recognize pinch gestures via libinput natively, and apps are adding support for smooth pinch-zooming such as firefox.
I think the answer is wayland. I can't find any hints to this in the libinput source, and there is this:
https://wayland.freedesktop.org/libinput/doc/1.10.7/scrolling.html#button_scrolling
So if Wayland knows how to do this out of the box, why o why can't we choose to turn it on for other scrolling devices!
There are no user-exposed options for what you want in libinput, but it's still possible, if a little bit harder.
For the 3 finger tap you can try this workaround: edit the libinput source code to ignore evdev events involving more than 2 finger taps. Note that in the newest version (1.18.1, Aug 03 2021), /src/evdev-mt-touchpad-tap.c has nfingers > 3 in line 136:
if (nfingers < 1 || nfingers > 3)
change it to:
if (nfingers < 1 || nfingers > 2)
For palm rejection you'd need to configure device quirks, here's the official documentation on how to debug touchpad pressure/size ranges. Bear in mind that the device quirk API is not public, you can do local changes but these will be overwritten in the next libinput update. Please submit a bug report upstream if you want to keep your quirks.
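To give an idea of the shape such an override takes (the attribute names are real quirk keys, but the match string and values here are placeholders you'd replace with what the debugging tool reports), a local-overrides.quirks entry could look like:

[Example Touchpad Pressure Override]
MatchUdevType=touchpad
MatchName=*Example Touchpad*
AttrPressureRange=24:10
AttrPalmPressureThreshold=100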
Here is my xorg:
[ 152.787] (II) event6 - Elan Touchpad: SYN_DROPPED event - some input events have been lost.
[ 152.787] (EE) client bug: timer event6 tap: scheduled expiry is in the past (-423ms), your system is too slow
[ 152.803] (EE) event6 - Elan Touchpad: kernel bug: Touch jump detected and discarded. See https://wayland.freedesktop.org/libinput/doc/1.18.1/touchpad-jumping-cursors.html for details
[ 153.402] (II) event6 - Elan Touchpad: device removed
A few things were omitted.
The built-in refresh rate monitor does NOT work on wayland.
Libinput, the only? lib for input on wayland, does NOT handle joysticks
> Joysticks have one or more axes and one or more buttons. Beyond that it is difficult to find common ground between joysticks and much of the interaction is application-specific, not system-specific
In reality joysticks use the same USB standard, they're even in the same class of devices as keyboards and mice.
>Modern game controllers and joysticks are often USB HID class devices. Unlike legacy game port devices, USB HID class game devices do not normally require proprietary drivers to function. Nearly all game devices will function using onboard drivers as long as the device is designed around the drivers and the USB HID class specifications.
Hopefully, if Steam deck pulls enough people, we'll get joysticks working out of the box. And maybe even proper drivers for audio.
> libinput-bin/stable,now
Suggests it is in fact installed, as does the libinput prefix for properties.
So what do you mean by sensitivity? Speed of the cursor or a minimal threshold for movement? Or something else?
I would try reducing libinput Accel Speed to 0.25 or so:
$ xinput set-prop 11 304 0.25