From: Ian Zimmerman
Date:
To: dng
Subject: Re: [DNG] Why X does keyboard and mouse.
On 2020-12-31 12:21, Hendrik Boom wrote:
> On Thu, Dec 31, 2020 at 11:53:51AM -0500, Steve Litt wrote:
> > <rant>
> > It didn't have to be this way. In 2020, better alternatives could have
> > been made. If I were the project manager, the first thing I'd do is
> > uncouple keyboard, mouse and video from each other. Why X has anything
> > to do with keyboard or mouse is beyond me.
>
> Perhaps because X was originally a means of having a graphical user
> interface to multiple machines over a network. Which usually involves
> a screen, keyboard, and mouse.
Also, many people prefer to use the keyboard to generate many, most, or
even all of the pointer events. While this remapping or emulation (whatever
you want to call it) can in theory be done at the application or toolkit
level [1], that means duplication, and some applications will inevitably
be left behind.
I'm surprised nobody else brought this up here :-P
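For example, something like this (just a rough sketch assuming Xlib plus
the XTest extension, built with -lX11 -lXtst; the movement and button
numbers are arbitrary) is all a keyboard-driven tool needs in order to
inject pointer events at the server, where every client sees them as
ordinary mouse input with no per-application support required:

/* sketch: server-side pointer injection via XTest */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int ev, err, major, minor;
    if (!XTestQueryExtension(dpy, &ev, &err, &major, &minor)) {
        fprintf(stderr, "XTest extension not available\n");
        XCloseDisplay(dpy);
        return 1;
    }

    /* Nudge the pointer 10 pixels right and down, then click button 1.
     * Every client sees these as ordinary pointer events. */
    XTestFakeRelativeMotionEvent(dpy, 10, 10, 0);
    XTestFakeButtonEvent(dpy, 1, True, 0);   /* press   */
    XTestFakeButtonEvent(dpy, 1, False, 0);  /* release */

    XFlush(dpy);
    XCloseDisplay(dpy);
    return 0;
}

As far as I know, this is the mechanism tools like xdotool sit on top of.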
So I think the idea of an integrated _input_ management layer is still
sound, at least.
Something also needs to keep track of which window pointer events occur
in and which window has focus for keyboard events. Sure, that can be
decoupled from the low-level _generation_ of events, but aren't we
already well along that path with Xorg and libinput?
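For what it's worth, that bookkeeping is already queryable from plain
Xlib; a rough sketch (built with -lX11) asking the server for the focus
window and the window currently under the pointer:

/* sketch: querying focus window and pointer window from the server */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    /* Window with keyboard focus, tracked by the server. */
    Window focus;
    int revert_to;
    XGetInputFocus(dpy, &focus, &revert_to);

    /* Window under the pointer, also tracked by the server. */
    Window root = DefaultRootWindow(dpy);
    Window root_ret, child_ret;
    int root_x, root_y, win_x, win_y;
    unsigned int mask;
    XQueryPointer(dpy, root, &root_ret, &child_ret,
                  &root_x, &root_y, &win_x, &win_y, &mask);

    printf("focus window:   0x%lx\n", (unsigned long)focus);
    printf("pointer window: 0x%lx at (%d,%d)\n",
           (unsigned long)child_ret, root_x, root_y);

    XCloseDisplay(dpy);
    return 0;
}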
I agree that X11 sucks, but I think the reasons lie more in the low-level
protocol design than in the overall partition of responsibilities.
[1]: which is the approach of Wayland, if I understand right.