1366x768 civilization v image

Not sure whether you are referring to me or not, but we have now officially hijacked this thread, unfortunately. Apologies to the OP.

Firstly, every single resolution that is not equal to the actual 1:1 physical pixels of the screen is a "scaled" resolution by definition. This is true whether or not the resolution being rendered is a whole multiple of the native physical resolution. The method you describe is the one used by Apple to create the best possible rendering of a scaled resolution for HiDPI purposes, and it works extremely well for them.

For those scaled resolutions that look like a lower number of pixels than the screen physically has, the fuzziness is not very pronounced, and it is not caused by a lack of physical pixels (there are actually more pixels than the resolution being displayed) but by the fact that the scaling factor is not an exact pixel doubling. In contrast, scaled resolutions that look like a higher number of pixels than the screen has suffer significantly more, because there aren't enough physical pixels to display that resolution, so you really are missing physical pixels.

Resolution actually says nothing about pixels, and vice versa; the problem lies in the fact that since the beginning we've tied the two together. With modern HiDPI/Retina displays things are done differently: resolution is now tied to something else. IIRC Apple calls them points, each of which maps to a group of 4 pixels on a Retina display.

Anyway, the main thing to understand is that every resolution that cannot be translated easily to the physical pixels will require some computing: the machine has to calculate things that aren't there (either physical pixels, in the case of a larger resolution, or virtual ones, in the case of a lower one). It's the same way things like JPEG and MP3 work. This new way of doing things makes it rather difficult for a lot of people to grasp. In the end you pick between a nice, crisp screen or something that is a bit more fuzzy but gives you either bigger letters or more workspace (with smaller letters). With HiDPI/Retina displays you simply get away with this more easily.

It is, however, important to understand that the resolution that is actually rendered is twice the one you select. That means running anything but the default resolution will have an impact on computing power. In the case of the MacBook with the Core M this can make quite a difference: it's not the fastest machine, so you'd better leave it at the default resolution for best performance.

A display has a fixed number of physical pixels, no more than that number and no less. The problem with that is that if you match the actual resolution of the OS to that of the display, fonts and all other elements become really small. Apple solves this by blowing things up, and it uses a fixed number for that: 2, both horizontally and vertically, thus 2x2. That's the reason we talk about Retina displays having 4x as many pixels as the non-Retina version. The downside is that the effective working area becomes smaller: you have the same workspace as when you had half as many pixels, i.e. 1152x720 (2304/2 = 1152 and 1440/2 = 720), but you gain a lot of sharpness. In the end that's the only thing you gain: sharpness.

The "looks like" resolutions follow this same principle and therefore never really are that exact resolution, hence the proper wording "looks like" instead of the incorrect "is exactly" that you two are thinking of. In the case of "looks like", the system multiplies the resolution (1280x800 becomes 2560x1600), which is then scaled back to fit the physical pixels on the display. So is it 1280x800? No, it only looks like it.
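To make that arithmetic concrete, here is a minimal sketch, assuming the 2304x1440 panel and the fixed 2x factor mentioned above; the helper `describe_scaled_mode` is purely illustrative and not any actual macOS API. It works out what a "looks like" setting really renders and how that buffer maps onto the physical pixels.

```python
# Minimal sketch of the "render at 2x, then scale to the panel" behaviour
# described above. Names and structure are illustrative only.

def describe_scaled_mode(looks_like, panel, factor=2):
    """Return the rendered (backing) size, the panel scale ratio, and
    whether the buffer maps 1:1 onto the physical pixels."""
    backing = (looks_like[0] * factor, looks_like[1] * factor)  # what is actually rendered
    ratio = (backing[0] / panel[0], backing[1] / panel[1])      # resampling needed to fit the glass
    exact = ratio == (1.0, 1.0)                                 # 1:1 copy, no extra work
    return backing, ratio, exact

panel = (2304, 1440)  # the panel size used in the example above

# Exact 2x mode ("looks like" 1152x720): the backing store equals the panel.
print(describe_scaled_mode((1152, 720), panel))
# -> ((2304, 1440), (1.0, 1.0), True)

# "Looks like" 1280x800: rendered at 2560x1600, then squeezed onto 2304x1440.
print(describe_scaled_mode((1280, 800), panel))
# -> ((2560, 1600), (1.1111111111111112, 1.1111111111111112), False)
```

The exact 2x mode copies straight onto the panel, while the "looks like" 1280x800 mode has to render a larger 2560x1600 buffer and resample it down, which is precisely the extra computing (and slight fuzziness) the posts above describe.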