3840 x 2160 UHD
1ms Response Time
DVI, HDMI & DisplayPort
60Hz Refresh Rate
300cd/m2 Brightness
13 comments
GwanGy
29 Jul '17 · #1
The day would not be complete without an Acer monitor from Ebuyer. This looks like quite a good deal too.
EndlessWaves
29 Jul '17 · #2
It's nice to see HiDPI/Retina screens coming down in price; now if only they'd release them at other resolutions too.
89quidyoucantgowrong to EndlessWaves
29 Jul '17 · #4
'retina' {pukes}
taras to EndlessWaves
29 Jul '17 · #6
"rentina screens" what the .. - its not even a pixel standard - just beyond x dpi .. either use the x & y pixel counts, vertical pixel count or its proper name. I'm not being horrible but retina is not a standard - its marketing blurb
parsimony
29 Jul '17 · #3
I'm using one of these right now and it's a steal at this price.
Retina is a standard of sorts, one that Apple came up with. It's a calculation based on screen size, pixel count, the angular resolution of normal human vision, and an assumed viewing distance beyond which individual pixels can no longer be distinguished. Whether those assumptions are correct is up for debate, but I don't see the issue with using it as a general term for a modern sharp display. It's not the end of the world if he uses that term on an unscientific deals forum, unless we're just being snobby or anti-Apple.
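The calculation described above can be sketched in a few lines of Python. This is a rough sketch, not any Apple API: the function names are made up, and the 60 pixels-per-degree cutoff (one pixel per arc-minute of 20/20 acuity) is exactly the kind of assumption that's up for debate.

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels covered by one degree of visual angle.

    One degree of arc at viewing distance d spans 2*d*tan(0.5 deg)
    inches; multiplying by pixel density (PPI) gives pixels/degree.
    """
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

def looks_retina(ppi: float, distance_in: float, cutoff: float = 60.0) -> bool:
    """True once individual pixels subtend less than the assumed cutoff."""
    return pixels_per_degree(ppi, distance_in) >= cutoff

# iPhone 4 (326 PPI) held at a typical 12 inches:
pixels_per_degree(326, 12)   # ~68 px/degree
looks_retina(326, 12)        # True
```

Hold the same phone at 8 inches and it fails the same test, which is why the viewing-distance assumption matters as much as the pixel count.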
EndlessWaves
29 Jul '17 · #11
Actually in computer terms it's a major shift in technology.
Traditionally stuff in computers is sized in pixels and displayed 1:1 as that resulted in optimum image quality. So if you had something 200 units wide it would get physically smaller as your DPI increased. If your DPI got too high for your viewing distance it would be unusable.
In recent years it's become useful to separate physical size and pixels so that a 200 unit wide item could use 200 pixels. Or it could use 400 pixels and, if it's able to, add extra detail.
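That separation of physical size from pixels boils down to one multiplication. The helper below is illustrative, not any particular OS's API:

```python
def device_pixels(logical_units: int, scale: float) -> int:
    """Map resolution-independent layout units to physical pixels."""
    return round(logical_units * scale)

device_pixels(200, 1.0)  # 200 px on a classic 1:1 display
device_pixels(200, 2.0)  # 400 px on a HiDPI display, same physical size
```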
Newer operating systems like Android were designed this way from the start, but existing operating systems like Mac OS and Windows have been transitioning for the last decade.
In the older/normal style more pixels are always found on larger screens. Everything on screen looks identical but you can fit more of it on. That's exactly what you get if you buy a 2560x1440 screen at 27" or a 4K screen at 40"+, text and programs that individually look identical to a 1920x1080 21.5" screen, just much more of it on screen at once.
(Obviously there are things like videos and games that have always shown the same picture regardless of size, but they're exceptions.)
But this 3840x2160 screen at just 28" is completely different. If you tried to use it normally then many things would be unusably tiny. People will typically use it at either 1.5x sizing/scaling or 2x, giving similar on-screen space to a 2560x1440 or 1920x1080 screen respectively.
So unlike most higher resolution monitors this one isn't giving you any more space. It's something different for a computer monitor.
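The arithmetic behind those scale factors, as a quick sketch (helper name is illustrative):

```python
def effective_workspace(width_px: int, height_px: int, scale: float) -> tuple:
    """Logical workspace left after applying a UI scale factor."""
    return (round(width_px / scale), round(height_px / scale))

# This 3840x2160 panel at the two common factors:
effective_workspace(3840, 2160, 1.5)  # (2560, 1440), like a 27" QHD screen
effective_workspace(3840, 2160, 2.0)  # (1920, 1080), like a 21.5" FHD screen
```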
HiDPI/Retina are the most appropriate terms I'm aware of to signify this transition. If there is a better one I'd certainly be interested in learning about it.
fishmaster
29 Jul '17 · #12
A further explanation of this, with pictures for people who prefer pictures as explanations (I know I do :smiley: ): http://www.eizoglobal.com/library/basics/pixel_density_4k/index.html
The crucial words are "pixel density", and that's the reason this 4K monitor doesn't give you an enlarged workspace by default. However, if you want more workspace at the expense of being able to see text/icons properly without glasses :wink: you can dial the scaling factor down in your OS.
Also, HiDPI handling has been in OS X (now macOS) for a lot longer than in Windows, as mentioned earlier, due to the Retina standard Apple introduced in 2010 with the iPhone 4. Note that it's not the resolution that is Retina but the formula 2·d·r·tan(0.5°) (d = viewing distance, r = pixel density), based on a concept of pixels per degree, so it takes into account the distance from the screen as well as the resolution.
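Plugging this 28-inch 3840x2160 panel into that formula gives a rough minimum "Retina" viewing distance. This is a sketch assuming the ~60 px/degree cutoff (one pixel per arc-minute of 20/20 acuity), not a vendor figure:

```python
import math

# Native pixel density of a 28" 3840x2160 panel
diag_px = math.hypot(3840, 2160)   # ~4406 px along the diagonal
ppi = diag_px / 28                 # ~157 pixels per inch

# Distance (inches) beyond which one pixel subtends less than one
# arc-minute, i.e. the panel clears the assumed 60 px/degree cutoff
min_retina_in = 60 / (2 * ppi * math.tan(math.radians(0.5)))
print(round(ppi), round(min_retina_in))  # 157 22
```

In other words, at a normal desktop viewing distance of around two feet, this panel already sits at the threshold where individual pixels stop being distinguishable under those assumptions.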
If you want a simple explanation of the maths behind the formula, read the detailed description by user Palaemon in this forum: https://www.photography-forum.org/threads/pixel-density-camera-purchase-conundrums.126320/
I don't know when Apple first started supporting it, but the technology has been in Windows since at least 2000/XP, for monitors like the IBM T220 & T221, which offered 3840x2400 at 22" back in 2001.
When Microsoft shipped .NET Framework 3.0 in 2006, the new GUI framework it introduced was the resolution-independent WPF. They have been pushing the technology for some time.
Apple had the advantage of being able to control the screens of most Mac OS computers, so third party developers have had to deal with a rapidly increasing proportion of users that need it. I suspect they have put support higher up their priority list than their Windows counterparts.
Actual specs: https://www.acer.com/ac/en/GB/content/model/UM.PR0EE.001