Ancient X11 scaling technology
130 comments
June 24, 2025

pedrocr
kccqzy
> doing the "draw at 2x scale and then scale down" dance that was popularized by OSX
Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale. The earliest retina MacBook Pro in 2012 for example was 2x in both width and height of the earlier non-retina MacBook Pro.
Eventually I guess the cost of the hardware made this too hard. I mean for example how many different SKUs are there for 27-inch 5K LCD panels versus 27-inch 4K ones?
But before Apple committed to integer scaling factors and then scaling down, it experimented with more traditional approaches. You can see this in earlier OS X releases such as Tiger or Leopard. The thing is, it probably took too much effort for even Apple itself to implement in its first-party apps so Apple knew there would be low adoption among third party apps. Take a look at this HiDPI rendering example in Leopard: https://cdn.arstechnica.net/wp-content/uploads/archive/revie... It was Apple's own TextEdit app and it was buggy. They did have a nice UI to change the scaling factor to be non-integral: https://superuser.com/a/13675
pedrocr
> Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale.
That's an interesting related discussion. The idea that there is a physically correct 2x scale and that fractional scaling is a tradeoff is not necessarily right. First because different users will want to place the same monitor at different distances from their eyes, or have different eyesight, or a myriad other differences, so the ideal scaling factor for the same physical device depends on the user and the setup. But more importantly because having integer scaling be sharp and snapped to pixels while fractional scaling is a tradeoff is mostly a software limitation. GUI toolkits can still place all their UI at pixel boundaries even if you give them a target scaling of 1.785. They do need extra logic to do that and most can't. But in a weird twist of destiny, the most used app these days is the browser, and its rendering engines are designed to output at arbitrary factors natively yet in most cases can't, because the windowing system forces these extra transforms on them. 3D engines are another example: they can output whatever arbitrary resolution is needed but aren't allowed to. Most games can probably get around that in some kind of fullscreen mode that bypasses the scaling.
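To make the 1.785 example concrete, here's a minimal sketch of the extra logic such a toolkit needs (hypothetical helper names, not from any particular toolkit): lay out in logical units, but round every widget edge to a whole device pixel, and map input back the other way.

    #include <math.h>

    /* Snap a logical coordinate to a whole device pixel at an arbitrary
       scale factor such as 1.785, so widget edges stay crisp. */
    int logical_to_device_px(double logical, double scale) {
        return (int)lround(logical * scale);
    }

    /* Map a device-pixel input position back into logical units. */
    double device_to_logical(int device_px, double scale) {
        return device_px / scale;
    }

The catch is that two elements with the same logical size can end up a device pixel apart depending on where their edges land, and that bookkeeping is what most older toolkits never grew.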
I think we've mostly ignored these issues because computers are so fast and monitors have gotten so high resolution that the significant performance penalty (2x easily) and the introduced blurriness mostly go unnoticed.
> Take a look at this HiDPI rendering example in Leopard
That's a really cool example, thanks. At one point Ubuntu's Unity had a fake fractional scaling slider that just used integer scaling plus font size changes for the intermediate levels. That mostly works very well from the point of view of the user. Because of the current limitations in Wayland I mostly still do that manually. It works great for a single monitor and can work for multiple monitors if the scaling factors work out, because the font scaling is universal and not per output.
sho_hn
What you want is exactly how fractional scaling works (on Wayland) in KDE Plasma and other well-behaved Wayland software: The scale factor can be something quirky like your 1.785, and the GUI code will generally make sure that things nevertheless snap to the pixel grid to avoid blurry results, as close to the requested scaling as possible. No "extra window system transforms".
astrange
> But more importantly because having integer scaling be sharp and snapped to pixels and fractional scaling a tradeoff is mostly a software limitation. GUI toolkits can still place all ther UI at pixel boundaries even if you give them a target scaling of 1.785. They do need extra logic to do that and most can't.
The reason Apple started with 2x scaling is that this turned out not to be true. Free-scaling UIs were tried for years before that and never once got to acceptable quality. Not if you want to have image assets or animations involved, or if you can't fix other people's coordinate rounding bugs.
Other platforms have much lower standards for good-looking UIs, as you can tell from eg their much worse text rendering and having all of it designed by random European programmers instead of designers.
cosmic_cheese
Even today you run into the occasional foreign UI toolkit app that only renders at 1x and gets scaled up. We’re probably still years out from all desktop apps handling scaling correctly.
ndiddy
Wayland has supported X11 style fractional scaling since 2022: https://wayland.app/protocols/fractional-scale-v1 . Both Qt and GTK support fractional scaling on Wayland.
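For anyone curious what the client side looks like, here's a rough sketch (assuming the usual wayland-scanner generated bindings for fractional-scale-v1; resize_buffers() is a stand-in for application code): the compositor advertises the preferred scale as an integer number of 120ths, and the client sizes its buffer to match instead of letting the compositor resample.

    #include <stdint.h>
    /* header generated by wayland-scanner from fractional-scale-v1.xml;
       the exact file name depends on your build setup */
    #include "fractional-scale-v1-client-protocol.h"

    void resize_buffers(void *app_state, double scale);  /* app-provided stand-in */

    /* 1.25 arrives as 150, 1.5 as 180, and so on (numerator over 120). */
    static void handle_preferred_scale(void *data,
            struct wp_fractional_scale_v1 *fs, uint32_t scale_120ths) {
        double scale = scale_120ths / 120.0;
        /* Resize the buffer to round(logical_size * scale) and set the
           matching wp_viewport destination so the surface maps 1:1 to
           output pixels with no compositor-side scaling. */
        resize_buffers(data, scale);
    }

    static const struct wp_fractional_scale_v1_listener scale_listener = {
        .preferred_scale = handle_preferred_scale,
    };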
bscphil
Rather annoyingly, the compositor support table on this page seems to be showing only the latest version of each compositor (plus or minus a month or two, e.g. it's behind on KWin). I assume support for the protocol predates these versions for the most part? Do you know when the first versions of KDE and Gnome to support the protocol were released? Asking because some folks in this thread have claimed that a large majority of shipped Wayland systems don't support it, and it would be interesting to know if that's not the case (e.g. if Debian stable had support in Qt and GTK applications).
jdsully
Windows tried this for a long time and literally no app was able to make it work properly. I spent years of my life making Excel have a sane rendering model that worked on device-independent pixels and all that, but it's just really hard for people not to think in raw pixels.
kllrnohj
And yet every Android app does it just fine :)
The real answer is that it's hard to bolt this on later; the UI toolkit needs to support it from the start.
sho_hn
> doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux
Linux does not do that.
> It's strange that Wayland didn't do it this way from the start
It did (initially for integer scale factors, later also for fractional ones, though some Wayland-based environments did it earlier downstream).
maxdamantus
> Linux does not do that.
It did (or at least Wayland compositors did).
> It did
It didn't.
I complained about this a few years ago on HN [0], and produced some screenshots [1] demonstrating the scaling artifacts resulting from fractional scaling (1.25).
This was before fractional scaling existed in the Wayland protocol, so I assume that if I try it again today with updated software I won't observe the issue (though I haven't tried yet).
In some of my posts from [0] I explain why it might not matter that much to most people, but essentially, modern font rendering already blurs text [2], so further blurring isn't that noticeable.
[0] https://news.ycombinator.com/item?id=32021261
zozbot234
Isn't OS X graphics supposed to be based on Display PostScript/PDF technology throughout? Why does it have to render at 2x and downsample, instead of simply rendering vector-based primitives at native resolution?
kalleboo
OS X could do it; they actually used to support enabling fractional rendering like this through a developer tool (Quartz Debug).
There were multiple problems making it actually look good though, ranging from making things line up properly at fractional sizes (e.g. a "1 point line" becomes blurry at 1.25 scale) to the fact that most applications used bitmap images rather than vector graphics for their icons (and this includes the graphic primitives Apple used for the "lickable" buttons throughout the OS).
edit: I actually have an iMac G4 here so I took some screenshots since I couldn't find any online. Here is MacOS X 10.4 natively rendering windows at fractional sizes: https://kalleboo.com/linked/os_x_fractional_scaling/
IIRC later versions of OS X than this actually had vector graphics for buttons/window controls
astrange
No, CoreGraphics just happened to have drawing primitives similar to PDF.
Nobody wants to deal with vectors for everything. They're not performant enough (harder to GPU accelerate) and you couldn't do the skeuomorphic UIs of the time with them. They have gotten more popular since, thanks to flat UIs and other platforms with free scaling.
qarl
You're thinking of NeXTSTEP. Before OS X.
kergonath
NeXTSTEP was Display PostScript. Mac OS X has used Display PDF since way back in the developer previews.
wmf
No, I think integer coordinates are pervasive in Carbon and maybe even Cocoa. To do fractional scaling "properly" you need to use floating point coordinates everywhere.
kalleboo
Cocoa/Quartz 2D/Core Graphics uses floating-point coordinates everywhere and drawing is resolution-independent (e.g., the exact same drawing commands are used for screen vs print). Apple used to tout that OS X drawing was "based on PDF" but I think that only meant it had the same drawing primitives and could be captured in a PDF output context.
QuickDraw in Carbon was included to allow for porting MacOS 9 apps, was always discouraged, and is long gone today (it was never supported in 64-bit).
wmf
None of the toolkits (Motif, Tk, Gtk, Qt, etc.) could handle fractional scaling so if Wayland had taken the easy way out it would break every app.
nixosbestos
Except for the fact that Wayland has had a fractional scaling protocol for some time now. Qt implements it. There's some unknown reason that GTK won't pick it up. But anyway, it's definitely there. There's even a beta-level implementation in Firefox, etc.
6510
If the initial picture is large enough the blur from down-scaling isn't so bad. Say 1.3 pixel per pixel vs 10.1 pixels per pixel.
wmf
Drawing a circle is kind of cheating. The hard part of scaling is drawing UI elements like raster icons or 1px hairlines to look non-blurry.
okanat
And also doing it for multiple monitors with differing scales. Nobody claims X11 doesn't support different DPIs. The problems occur when you have monitors with differing pixel densities.
At the moment only Windows handles that use case perfectly, not even macOS. Wayland comes second if the optional fractional scaling is implemented by the toolkit and the compositor. I am skeptical that the Linux desktop ecosystem will do the right thing there, though. Both server-side decorations and fractional scaling being optional (i.e. requiring runtime opt-in from the compositor and the toolkit) are missteps for a desktop protocol. Both missing features are directly attributable to GNOME and their chokehold on GTK and other core libraries.
axus
Speaking of X11 and Windows, any recommended Windows Xservers to add to this StackOverflow post? https://stackoverflow.com/questions/61110603/how-to-set-up-w...
I hadn't heard of WSLg, vcxsrv was the best I could do for free.
okanat
With WSLg, Windows runs a native Wayland server and uses Xwayland to display X11 apps. You should be able to use any GUI app without any extra setup. You should double-check the environment variables though; sometimes .bashrc etc. or WSL's systemd support interferes with them.
Avamander
Where does Windows handle it? It's a hodgepodge of different frameworks that often look absolutely abysmal at any scale besides 100%.
okanat
Every UI framework that runs on Windows has to communicate using Win32 API at the lowest level. Here is the guide: https://learn.microsoft.com/en-us/windows/win32/hidpi/high-d...
Every GUI application on Windows runs an infinite event loop. In that loop you handle messages like [WM_INPUT](https://learn.microsoft.com/en-us/windows/win32/inputdev/wm-...). With Windows 8, Microsoft added a new message type: [WM_DPICHANGED](https://learn.microsoft.com/en-us/windows/win32/hidpi/wm-dpi...). To not break existing applications with an unknown message, Windows requires applications to opt in. The application needs to report its DPI awareness using the function [SetProcessDpiAwareness](https://learn.microsoft.com/en-us/windows/win32/api/shellsca...). Setting the DPI awareness state can also be done by attaching an XML manifest file to the .exe file.
With the message Windows not only provides the exact DPI to render the Window contents for the display but also the size of the window rectangle for the perfect pixel alignment and to prevent weird behavior while switching displays. After receiving the DPI, it is up to application to draw things at that DPI however it desires. The OS has no direct access to dictate how it is drawn but it does provide lots of helper libraries and functions for font rendering and for classic Windows UI elements.
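A minimal sketch of what that handling looks like in a plain Win32 window procedure (assuming per-monitor awareness was already declared via the manifest or SetProcessDpiAwareness; resource re-creation and error handling omitted):

    #include <windows.h>

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
        switch (msg) {
        case WM_DPICHANGED: {
            UINT newDpi = HIWORD(wParam);   /* new DPI for this window (X == Y) */
            RECT *r = (RECT *)lParam;       /* OS-suggested new window rectangle */
            /* Re-create DPI-dependent resources (fonts, cached bitmaps) for
               newDpi here, then move/resize to the suggested rect and repaint. */
            (void)newDpi;
            SetWindowPos(hwnd, NULL, r->left, r->top,
                         r->right - r->left, r->bottom - r->top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
            return 0;
        }
        default:
            return DefWindowProc(hwnd, msg, wParam, lParam);
        }
    }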
If the application is using a Microsoft-implemented .NET UX library (WinForms, WPF or UWP), Microsoft has already implemented the redrawing functions. You only need to include the manifest file in the .exe resources.
After all of this implementation, why does one still get blurry apps? Because those applications don't opt in to handle WM_DPICHANGED. So the only option left for Windows is to let the application draw itself at the default DPI and then stretch its image. Windows will map the input messages to the default-DPI pixel positions.
Microsoft does provide a halfway point between a fully DPI-aware app and an unaware app, if the app uses the old Windows resource files to store the UI in the .exe resources. Since those apps are guaranteed to use standard Windows UI elements, Windows can intercept the drawing functions and at least draw the standard controls at the correct DPI. That's called "system aware". Since it is intercepting the application's way of drawing, it may result in weird UI bugs though.
akdor1154
This is exactly right.
There is no mechanism for the user to specify a per-screen text DPI in X11.
(Or maybe there secretly is, and I should wait for the author to show us?)
somat
X11 has had this since day one. However, the trade-offs to actually employing it are... unfortunate. It leans really hard on the application to actually cross screen boundaries, and very few applications were willing to put the work in. So Xrandr was invented, which does more of what people want with multiple screens by treating them as parts of one large virtual screen, but you lose the per-screen DPI.
okanat
Natively in X11? No. Even with Xrandr, it is no. But you can obtain the display size and then draw things differently using OpenGL, but now you're reinventing the display protocol in your drawing engine (which is what GLX is, after all, but I digress). You need to onboard every toolkit to your protocol.
phkahler
>> The hard part of scaling is drawing UI elements like raster icons or 1px hairlines to look non-blurry.
And doing so actually using X not OpenGL.
kllrnohj
Yeah, this is kinda the big elephant in the room here? They didn't prove what they set out to prove. Yes, obviously OpenGL does scaling just fine; the entire point of Wayland is to get the compositor to just be a compositor. They didn't do any scaling with X. They didn't do anything at all with X other than ask it for some basic display information.
slackfan
X shouldn't be displaying anything that isn't a right angle anyway.
All circular UI elements are haram.
zozbot234
That depends on what kind of filtering is used when upscaling those icons. If you use modern resampling filters, you are more likely to get a subtle "oil painting" or "watercolor"-like effect with some very minor ringing effects next to sharp transitions (the effect of correctly-applied antialiasing, with a tight limit on spatial frequencies) as opposed to any visible blur. These filters may be somewhat compute-intensive when used for upscaling the entire screen - but if you only upscale small raster icons or other raster images, and use native-resolution rendering for everything else, that effect is negligible.
dark-star
Yeah, exactly. Nobody claimed that it is impossible to determine the physical geometry of your display (but that might be tricky for remote X sessions; I don't know if it would work there too?)
kvemkon
> tricky for remote X sessions, I don't know if it would work there too
The author did exactly this:
> Even better, I didn’t mention that I wasn’t actually running this program on my laptop. It was running on my router in another room, but everything worked as if
kunzhi
Interesting article, I'll admit when I first saw the title I was thinking of a different kind of "scaling" - namely the client/server decoupling in X11.
I still think X11 forwarding over SSH is a super cool and unsung/undersung feature. I know there are plenty of good reasons we don't really "do it these days" but I have had some good experiences where running the UI of a server app locally was useful. (Okay, it was more fun than useful, but it was useful.)
xioxox
It's certainly very useful. I do half my work using X11 over ssh and it works reasonably well over a LAN (at least using emacs, plotting, etc).
inetknght
"reasonably well" as in... yeah it works. But it's extremely laggy (for comparison, I know people who forwarded DirectX calls over 10Mbit ethernet and could get ~15 frames/sec playing Unreal Tournament in the early 00's), and any network blip is liable to cause a window that you can neither interact with nor forcefully close.
It felt like a prototype feature that never became production-ready for that reason alone. Then there's all the security concerns that solidify that.
But yes, it does work reasonably well, and it is actually really cool. I just wish it were... better.
cpach
Love this post. Reminds me of my former coworker G. He had exactly this attitude, and it made it possible for him to deliver results on most tasks that he set out for.
sho_hn
It's actually a somewhat bad and uninformed post, or perhaps the mistake (unclear whether made knowingly or not) is that it sets out to disprove a claim made by uninformed people.
No one with a good grasp of the space ever claimed that it wasn't possible on X11 to call into APIs to retrieve physical display size and map that to how many pixels to render. This has been possible for decades, and while not completely trivial is not the hard part about doing good UI scaling.
Doing good UI scaling requires infrastructure for dynamic scaling changes, for different scale factors per display within the same scene, for guaranteeing crisp hairlines at any scale factor, and so on and so forth.
Many of these problems could have been solved in X11 with additional effort, and some even made it to partial solutions available. The community simply chose to put its energy into bringing it all together in the Wayland stack instead.
kvemkon
> to disprove a claim made by uninformed people
KDE developer wrote recently:
> X11 isn’t able to perform up to the standards of what people expect today with respect to … 10 bits-per-color monitors, … multi-monitor setups (especially with mixed DPIs or refresh rates), … [1]
Multi-monitor setups have been working for 20+ years. 10-bit color is also supported (otherwise how would the PRO versions of graphics cards support this feature?).
> chose to put its energy into bringing it all together in
I cannot recall: was there any paper analyzing why the working and almost-working X11 features don't fit, why a few additional X11 extensions cannot be proposed anymore, and why another solution from scratch is inevitable? What is the significant difference between an X11 and a Wayland protocol extension?
[1] https://pointieststick.com/2025/06/21/about-plasmas-x11-sess...
denkmoon
Multi monitor with mixed DPIs absolutely does not work well in x11 in 2025. I don’t know about 20+ years ago.
sho_hn
Nate (the author of the blog post you linked), who I know personally very well, is a QA/product person focused on integration and fit and finish issues. What he means to say is that as a polished product, this is now available in the form of a Wayland-based desktop session without fiddling, while the same cannot be said of X11-based ones. It's meant as a pragmatic take, not as a history lesson.
That's quite similar to how I chose to phrase it, and comes down to where the community chose to spend the effort to solve all the integration issues to make it so.
Did the community decide that after a long soul-searching process that ended with a conclusion that things were impossible to make happen in X11, and does that paper you invoke exist? No, not really. Conversations like this certainly did take place, but I would say more in informal settings, e.g. discussions on lists and at places like the X.org conference. Plenty of "Does it make sense to do that in X11 still or do we start over?" chatter in both back in the day.
If I recall right, the most serious effort was a couple of people taking a few weeks to entertain a "Could we fix this in an X12 and how much would that break?" scenario. Digging up the old fdo wiki pages on that one would for sure be interesting for the history books.
The closest analogue I can think of that most of the HN audience is familiar with is probably the Python 2->3 transition and the decision to clean things up at the expense of backward compat. To this day, you will of course find folks arguing emotionally on either side of the Python argument as well.
For the most part, the story of how this happened is a bit simpler: It used to be that the most used X11 display server was a huge monolith that did many things the kernel would not, all the way to crazy things like managing PCI bus access in user space.
This slowly changed over the years, with strengthening kernel infra like DRM, the appearance of Kernel Mode Setting, with the evolution of libraries like Mesa. Suddenly implementing a display server became a much simpler affair that mostly could call into a bunch of stuff elsewhere.
This created an opening for a new, smaller project fully focused on the wire protocol and protocol semantics part, throwing away a lot of old baggage and code. Someone took the time to do that and demonstrate what it looks like; other people liked what they saw, and Wayland was born.
This also means: Plenty of the useful code of the X11 era actually still exists. One of the biggest myths is that Wayland somehow started over from scratch. A lot of the aforementioned stuff that over the years migrated from the X11 server to e.g. the kernel is obviously still what makes things work now, and libraries such as libinput, xkbcommon that nearly every Wayland display server implementation uses are likewise factored out of the X11 stack.
creatonez
> Perhaps not the most exciting task, but I figure it’s isomorphic to any other scaling challenge
And doing this for everything in the entire ecosystem of ancient GUI libraries? And dealing with the litany of different ways folks have done icons, text, and even just drawing lines onto the screen? That's where you run into a lot of trouble.
jchw
Sigh. And now that it's been long enough, everyone will conveniently forget all of the reasons why this wound up being insufficient, and think that all of the desktop environment and toolkit developers are simply stupid. (Importantly, applications actually did do this by default at one point. I remember a wonky-looking nvidia-xsettings because of this.)
The thing X11 really is missing (at least most importantly) is DPI virtualization. UI scaling isn't a feature most display servers implement because most display servers don't implement the actual UI bits. The lack of DPI virtualization is a problem though, because it leaves windows on their own to figure out how to logically scale input and output coordinates. Worse, they have to do it per monitor, and can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling. If anything doesn't do this or does it slightly differently, it will look wrong, and the user has little recourse beyond searching for environment variables or X properties that might make it work.
Explaining all of that is harder than saying that X11 has poor display scaling support. Saying it "doesn't support UI/display scaling" is kind of a misnomer though; that's not exactly the problem.
zozbot234
> can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling
It's silly that people keep complaining about this. It's a very minor effect, and one that can be solved in principle only by moving to pure vector rendering for everything. Generally speaking, a window will only ever span a single screen. It's convenient to be able to drag a window to a separate monitor, but having that kind of overlap as a permanent feature of one's workflow is just crazy.
> The thing X11 really is missing (at least most importantly) is DPI virtualization.
Shouldn't that kind of DPI virtualization be a concern for toolkits rather than the X server or protocol? As long as X is getting accurate DPI information from the hardware and reporting that to clients, what else is needed?
jchw
> It's silly that people keep complaining about this. It's a very minor effect, and one that can be solved in principle only by moving to pure vector rendering for everything.
If you have DPI virtualization, a very sufficient solution already exists: pick a reasonable scale factor for the underlying buffer and use it, then resample for any outputs that don't match. This is what happens in most Wayland compositors. Exactly what you pick isn't too important. You could pick whichever output overlaps the most with the window, or the output that has the highest scale factor, or some other criteria. It will not result in perfect pixels everywhere, but it is perfectly sufficient to clean up the visual artifacts.
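As a toy illustration of that policy (made-up structs, not any compositor's actual API): render at the highest scale of any output the window currently touches, and let the others get a resampled copy.

    #include <stdbool.h>

    struct rect   { int x, y, w, h; };
    struct output { struct rect geo; double scale; };

    static bool overlaps(const struct rect *a, const struct rect *b) {
        return a->x < b->x + b->w && b->x < a->x + a->w &&
               a->y < b->y + b->h && b->y < a->y + a->h;
    }

    /* Pick one buffer scale for a window: the highest scale among the
       outputs it overlaps; outputs that don't match get resampled. */
    double pick_buffer_scale(const struct output *outs, int n,
                             const struct rect *win) {
        double best = 1.0;
        for (int i = 0; i < n; i++)
            if (overlaps(win, &outs[i].geo) && outs[i].scale > best)
                best = outs[i].scale;
        return best;
    }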
Another solution would be to simply only present the surface on whatever output it primarily overlaps with. MacOS does this and it's seemingly sufficient. Unfortunately, as far as I understand, this isn't really trivial to do in X11 for the same reasons why DPI virtualization isn't trivial: whether you render it or not, the window is still in that region and will still receive input there.
> Generally speaking, a window will only ever span a single screen. It's convenient to be able to drag a window to a separate monitor, but having that kind of overlap as a permanent feature of one's workflow is just crazy.
The issue with the overlap isn't that people routinely need this; if they did, macOS or Windows would also need a more complete solution. In reality though, it's just a very janky visual glitch that isn't really too consequential for your actual workflow. Still, it really can make moving windows across outputs super janky, especially since in practice different applications do sometimes choose different behaviors. (e.g. will your toolkit choose to resize the window so it has the same logical size? will this impact the window dragging operation?)
So really, the main benefit of solving this particular edge case is just to make the UX of window management better.
While UX and visual jank concerns are below concerns about functionality, I still think they have non-zero (and sometimes non-low) importance. Laptop users expect to be able to dock and manage windows effectively regardless of whether the monitors they are using have the same ideal scale factor as the laptop's internal panel; the behavior should be clean and effective and legacy apps should ideally at least appear correct even if blurry. Being able to do DPI virtualization solves the whole set of problems very cleanly. MacOS is doing this right, Windows is finally doing this right, Wayland is doing this right, X11 still can't yet. (It's not physically impossible, but it would require quite a lot of work since it would require modifying everything that handles coordinate spaces I believe.)
> Shouldn't that kind of DPI virtualization be a concern for toolkits rather than the X server or protocol? As long as X is getting accurate DPI information from the hardware and reporting that to clients, what else is needed?
Accurate DPI information is insufficient as users may want to scale differently anyways, either due to preference, higher viewing distance, or disability. So that already isn't enough.
That said, the other issue is that there already exist applications that don't do perfect per-monitor scaling, and there doesn't exist a single standard way to have the per-monitor scaling preferences propagated in X11. It's not even necessarily a solved problem among the latest versions of all of the toolkits, since it at minimum requires support from desktop environment settings daemons, etc.
BearOso
I think having any kind of "scaling" preferences focuses too much on the technical aspect. It could be narrowed down to one setting like "zoom level" or just "size." This would mean that all UI elements change size exactly proportionately to one another. Ideally, rendering should happen at the exact resolution of the display, and scaling, as in resizing a bitmap using bilinear interpolation or whatever, doesn't need to be part of the pipeline except for outdated legacy programs.
In the past, the problem with UI toolkits doing proportional sizing was that they used bitmaps for UI elements. Since newer versions of Qt and GTK 4 render programmatically, they can do it the right way. Windows mostly does this too, even with Win32, as long as you're using the newer themes. macOS is the only one that has assets prerendered at integer factors everywhere and needs to perform framebuffer scaling to change sizes. But Apple doesn't care because they don't want you using third-party monitors anyway.
Edit: I'm not sure about Apple's new theme. Maybe this is their transition point away from fixed asset sizes.
jeffbee
I sort of wanted Fresco (previously Berlin, inspired by InterViews) to succeed, because in their model the UI toolkits really were server-side and they could be changed out while the application was running. Because they were targeting an abstract device (could be a 1200 dpi printer and a 72 dpi display at the same time) they got the property you mentioned, for free.
amiga386
It's not "can you provide the screen DPI to a window?" people bemoan, it's "can you draw one window across two screens with differing DPIs, transparent to the application?"
GranPC
Can this handle the case in which you have two displays with different DPIs side-by-side, and a window is placed in the middle of them both?
jekwoooooe
It’s astounding to me that in Linux, in 2025, I can’t just simply output a custom resolution. You are probably typing a response right now with some xrandr nonsense and I PROMISE you, it won’t do it. I can’t even scale my screen within a normal resolution to make it fit within a boundary. But I can do this in windows with an nvidis gpu. Crazy
dlcarrier
You don't have to use xrandr to create a custom framebuffer with scaling and/or centering, although it is capable of doing so. You can also use Gamescope (https://wiki.archlinux.org/title/Gamescope), which works on both X11 and Wayland, and with any GPU.
Traditionally it's used to launch a full-screen application, usually a game, but you can launch your window manager through it, if you want your desktop session to use a custom resolution with custom scaling and/or letterboxing.
lmm
> You are probably typing a response right now with some xrandr nonsense and I PROMISE you, it won’t do it.
Skill issue. You probably held your keyboard wrong or something. Simple xrandr commands work fine like they have for decades. (Of course if you've moved to Wayland then who knows).
throw39304949
Linux has no drivers for "nvidis gpu". You are probably defaulting into framebuffer at 800x600, just like Windows 11 on my old unsupported graphic card (works just fine on Linux).
As for xrandr nonsense, get AMD card!
jekwoooooe
Nope I can output 4k just fine. That’s not the issue. The issue is I want a 2.35:1 aspect resolution. I can do this on windows and nvidia but not on Linux (steam deck, intel nuc, etc)
kragen
I think it was yesterday that people on HN were saying GLX doesn't work over the network?
rwmj
$ ssh <remote> glxgears
runs fine!

jeffbee
Are there people who believe this? What do they think Indirect GLX is? XQuartz as the server and some Linux box as the client has always worked perfectly for me, GLX included.
compiler-devel
Brilliant. This is another piece of evidence on the pile of why we got Wayland: it's because people who understood X11 mostly retired and everyone else couldn't be bothered to learn X11 because it's "yucky C code" or something. And it bothers me that we lose remote rendering with Wayland (unless one fights with waypipe) that was just built-in to X11. Yes, it was slow, but actually if you're running a VM on your local system and using SSH to connect to it, then it works great. Sigh. I'm an old person yelling at clouds.
sho_hn
This is nonsensical myth-making. Despite the clickbait title, the APIs called in those code samples are very basic and not some forgotten wizardry.
compiler-devel
What part is nonsensical? Because Wayland is basically a fulfillment of jwz's CADT.
sho_hn
The part where we got Wayland because we lost a magic caste of rockstar engineers who could call XRRGetOutputInfo/XRRGetCrtcInfo.
nullc
> Yes, it was slow,
Not particularly, if you are on a low-latency network. Modern UI toolkits make applications way less responsive than classical X11 applications running across gigabit ethernet.
And even on a fast network, the Wayland alternative of 'use RDP' is almost unusable.
lelandbatey
I admire your tenacity. I think folks say "X11 doesn't support DPI scaling" when they should say "most programs written against X11 to use official Xlib functionality don't understand scaling".
In the article, the author uses OpenGL to make sure that they're interacting with the screen at a "lower level" than plenty of apps that were written against X. But that's the rub: I think the author neatly sidestepped it by mostly using stuff that's not in "vanilla" X11. In fact, the "standard" API of X via Xlib seems to only expose functions for working in raw pixels and raw pixel coordinates without any kind of scaling awareness. See XDrawLine as an example: https://www.x.org/releases/current/doc/man/man3/XDrawLine.3....
It seems to me that the RandR extension through xrandr is the thing providing the scaling info, not X11 itself. You can see that because the author calls `XRRGetScreenResourcesCurrent()`, a function that's not part of vanilla X11 (see the list of X library functions here as an example: https://www.x.org/releases/current/doc/man/man3/ )
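For what it's worth, the RandR query being described looks roughly like this standalone sketch (build with -lX11 -lXrandr; the DPI arithmetic from mm_width is my own illustration, not something quoted from the article):

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        XRRScreenResources *res =
            XRRGetScreenResourcesCurrent(dpy, DefaultRootWindow(dpy));
        for (int i = 0; i < res->noutput; i++) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            if (out->connection == RR_Connected && out->crtc) {
                XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
                /* Physical width in mm plus pixel width gives an effective DPI. */
                double dpi = out->mm_width
                    ? crtc->width * 25.4 / out->mm_width : 96.0;
                printf("%s: %ux%u px, %lu mm wide, ~%.0f DPI\n",
                       out->name, crtc->width, crtc->height, out->mm_width, dpi);
                XRRFreeCrtcInfo(crtc);
            }
            XRRFreeOutputInfo(out);
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }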
Now, xrandr has been a thing since the early 2000s, hence why it's ubiquitous, but due to its nature as an extension, and with plenty of existing code sitting around that's totally scale-unaware, I can see why folks believe X11 is scale-unaware.
arp242
So on my laptop I've been doing:
xrandr --output eDP --scale 0.8x0.8
For years and years, and I never really noticed any problems with it. Guess I don't run any "bad" scale-unaware programs? Or maybe I just never noticed(?)

At least from my perspective, for all practical purposes it seems to "just work".
nixosbestos
Good luck if you plug in an external monitor. (Not to speak of refresh rates)
dlcarrier
--output eDP
This parameter specifies which display to scale, so only the built-in display will be scaled. Running xrandr without any parameters returns all available outputs, as well as the resolutions the currently connected displays support.

arp242
I don't know about that; I use just one screen (laptop or HDMI, not both at the same time which is presumably what you're referring to) and it works for that. That's not really what the previous person was talking about either.
rwmj
At the office I plug in a monitor over USB-C and that just works on my X11 laptop. If something in a browser on the monitor was too large or too small I'd just zoom in/out until it was fine.
That's probably better than most scaling done on Wayland today because it's doing the rendering directly at the target resolution instead of doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux. If you do it that way you both lose performance and get blurry output. The only corner case a compositor needs to cover is when a client is straddling two outputs. And even in that case you can render at the higher size and get perfect output on one output and the same blurriness downside on the other, so it's still strictly better.
It's strange that Wayland didn't do it this way from the start given its philosophy of delegating most things to the clients. All you really need for arbitrary scaling is to tell apps "you're rendering to an MxN pixel buffer and, as a hint, the scaling factor of the output you'll be composited to is X.Y". After that the client can handle events in real coordinates and scale in the best way possible for its particular context. For a browser, PDF viewer or image processing app that can render at arbitrary resolutions, not being able to do that is very frustrating if you want good quality and performance. Hopefully we'll finally be getting that in Wayland now.