52 Comments

  • ddrіver - Saturday, December 16, 2017 - link

    Iiyama still exists?
  • damianrobertjones - Saturday, December 16, 2017 - link

    Yes they do. You might need to get out more.
  • ddrіver - Sunday, December 17, 2017 - link

    Lots of monitors in the street?
  • Alexvrb - Monday, December 18, 2017 - link

    Display gang wars in Chicago.
  • Hinton - Friday, December 22, 2017 - link

    I can sense you get good grades in school.
  • guidryp - Saturday, December 16, 2017 - link

    When will this curved monitor fad just die?

    Thankfully TVs seem to be leaving this behind.
  • bubblyboo - Saturday, December 16, 2017 - link

    We need more 10-bit VA monitors ASAP. HDR does not happen on a 1500:1 IPS display.
  • Lolimaster - Saturday, December 16, 2017 - link

IPS was always shitty with its pathetic contrast. Even if you find an IPS monitor with a bit more than 1000:1, it still has the IPS glow cancer, and probably a grainy matte antiglare coating too.
  • Beaver M. - Saturday, December 16, 2017 - link

The thing is, curved panels reduce IPS glow greatly. But the other thing is that the production process for them still seems to be in its infancy, because they are even more prone to backlight bleeding.
  • Wolfpup - Thursday, January 4, 2018 - link

Ugh, yeah, how did people get the idea IPS was better than VA? I bought into the "IPS is just as good" thing until I started TV shopping again a year ago, and it's like, umm, no, they're not even close. IPS just has shittastic contrast, and even a bad VA panel destroys it. My VA TV I bought last year has LITERALLY 10x the contrast of a GOOD IPS TV/monitor. *LITERALLY*. And even a bad one would have 2x to 4x the contrast of the best IPS monitors/TVs.

I was looking at supposedly high-end TVs using IPS that couldn't display any detail at all in Lucifer; it was just a murky, mushy mess, while a low-end, dirt-cheap Sony 1080p VA set *DESTROYED* the "higher end" IPS sets. Not even close (while of course salesmen were trying to tell me they looked great, LOL).

    Thankfully I found a set that's both 4K/HDR AND VA, but you have to be careful...
  • Beaver M. - Saturday, December 16, 2017 - link

No thanks. VA panels are too slow (incl. high input lag) and their texture flickering problems are annoying as hell.
I'd rather have no HDR and have to deal with IPS glow than the VA problems.
  • Dug - Monday, December 18, 2017 - link

Not all are too slow, and not all have flickering problems.
  • Beaver M. - Tuesday, December 19, 2017 - link

They can't reach IPS, which aren't the fastest either.
Which ones don't have TEXTURE flickering problems?
  • Hinton - Friday, December 22, 2017 - link

    I have a 32" AMVA panel, 1440p.

There's no flickering, and it's not fucking curved.
  • Hinton - Friday, December 22, 2017 - link

    It doesn't have HDR.

And since it's VA, being curved is moronic.

    ---

    But I guess, judging by the comments, there's a reason for them, since their customer base is apparently clueless.
  • Wolfpup - Thursday, January 4, 2018 - link

    Texture flickering problems? High input lag? No they don't and no they don't. I don't even know what the former means, but if there's issues with it, it's an issue with the specific monitor/TV, not all of them in general.
  • imaheadcase - Saturday, December 16, 2017 - link

When people like you start to understand the reasons behind monitors vs. TVs, and why TVs are changing while monitors aren't going to anytime soon.

With monitors there's a noticeable difference using curved vs. flat. Unless a flat monitor has a certain feature that you want, there's zero reason now not to go curved. Curved monitors are just better in every way.

With TVs it never made sense, because you're simply not going to get as much benefit from it when it's 99% movie/TV watching. The only way it would make sense is on an OLED TV, but then you have to give up your firstborn for one, so that never happened.
  • imaheadcase - Saturday, December 16, 2017 - link

The real question you should be asking is when the FreeSync fad is going to end.
  • TechZombie - Saturday, December 16, 2017 - link

    When Gsync displays don't have a $100-200 difference in price.
  • Beaver M. - Saturday, December 16, 2017 - link

More like $200 to $350.
  • Morawka - Sunday, December 17, 2017 - link

IIRC there is actual DRAM on the G-Sync module. Look at DRAM prices right now; they are crazy high.
  • peevee - Monday, December 18, 2017 - link

    A pathetic 4MB framebuffer (which at even $10/GB comes to 4 cents worth of RAM) does not even come close to explaining the difference.
  • Morawka - Monday, December 18, 2017 - link

There are 3 SK Hynix dies on each G-Sync module, 2Gb each (768MB total).

    https://www.anandtech.com/show/7582/nvidia-gsync-r...
  • Morawka - Monday, December 18, 2017 - link

    The G-Sync board itself features an FPGA and 768MB of DDR3 memory
  • Samus - Saturday, December 16, 2017 - link

    You sound like someone who hasn't used a 144Hz monitor.
  • DanNeely - Saturday, December 16, 2017 - link

Probably never. Nvidia charges an arm and a leg for the G-Sync controller; as long as that's the case it's only ever going to show up on expensive premium displays. FreeSync (and its HDMI equivalent in v2.1 of that spec) is free; and the basic version (with limited refresh ranges) can be - and is being - baked into commodity panel controllers for a very low premium, and will probably work its way into everything except race-to-the-bottom-grade displays over the next few years.

Whether the same thing will happen with the much wider variability in refresh rates and other upgrades in FreeSync 2 is a more open question. Their general absence from the market (vs. FS1 displays) suggests either that controller availability is still a problem, that the controller itself is significantly more expensive (whether that's manufacturing cost or first-on-the-market premium pricing matters), or that the increased ranges need better and significantly more expensive panels. Some of those will correct themselves with time; others might keep FS2 availability relatively uncommon until AMD is competitive with NVidia at the very top of the market again.
  • FullmetalTitan - Saturday, December 16, 2017 - link

    The difference between FS1 and FS2 is primarily the existence of some extra qualifications. The reason g-sync is a high premium for monitors is a combination of controller, licensing, and certification.
As FS1 requires no formal certification (there are many optional features in this space, such as low framerate compensation, and some panels only support very narrow ranges like 45-60Hz but not higher or lower rates), it is basically a free feature.
FS2 requires HDR10 support, a controller capable of tone mapping bypass/switching (because Windows doesn't play nice outside of sRGB space), and support for AMD's API for said tone mapping bypass/switching.
It certainly won't cost anywhere near the premium that G-Sync does; that isn't the game AMD has historically played, plus they are bringing lessons from console into the PC space.
  • FullmetalTitan - Saturday, December 16, 2017 - link

To clarify: the controller issue is actually a pretty easy adjustment for the panel makers.
I suspect the issue is going to fall more heavily on the target market these FS2 panels are addressing, weighed against what currently constitutes the "typical AMD user" crowd. I expect to see more of the kind of bundling deal that was set up for the RX Vega launch, with AMD pushing specific models from partners that support FS2 (RX Vega bundles offered Samsung's new quantum-dot curved monitors and ultra-wide line).
  • Lolimaster - Saturday, December 16, 2017 - link

For productivity the curve is annoying, even worse when reading or looking at pictures. It only made sense for gaming (especially sims) or watching movies on a big-ass screen, nothing else.
  • Alexvrb - Saturday, December 16, 2017 - link

    "it only made sense for gaming"

    Yes, well, perhaps the headline has you confused. This gaming display is actually... a gaming display. If you're looking for a display that is best suited towards <insert primary purpose other than gaming>, you should probably look elsewhere.
  • limitedaccess - Sunday, December 17, 2017 - link

Generalist usage is very important for many buyers. Even if they want a gaming-focused display, that display could be used for many other tasks as well. In that sense a non-curved gaming-focused display would be a pro, while a curved one would be a negative.

Even gaming is a rather broad category. How is a curved display for non-first-person, or to some extent third-person, games? What if you primarily played overhead-view games? What about those who primarily play competitive games? Is a curve a benefit or a detriment for those people looking for a gaming display?

The idea of a curved monitor being better in every way, or even especially for gaming, is flawed.
  • Hurr Durr - Sunday, December 17, 2017 - link

Color correctness or curvature is the last consideration for competitive play; they only want speed.
  • Alexvrb - Monday, December 18, 2017 - link

    I never said it was better in every way. Far from it. Where did you get that idea? Look at the comment I was replying to. "For productivity the curve is annoying". Well, for gaming it's typically fine. If it wasn't, they wouldn't sell. Tons of gamers love em. If you don't, that's swell, buy a flat one.
  • peevee - Monday, December 18, 2017 - link

    What's wrong with a mild curve for productivity?
  • Hixbot - Saturday, December 16, 2017 - link

I think for most desktop use cases you would want flat. If you're buying a monitor only for games and movies, and won't web browse, word process, design, or do other computing tasks, buy curved. Actually, even for games and movies, curved just adds geometric distortion. It would be cool if a game or movie could know how far you're sitting from the monitor and correct the geometry for peripheral vision, but until then you are watching flat content on a curved screen.
  • sonny73n - Monday, December 18, 2017 - link

Curve, curve! Do you see things around you as convex? If not, why do you need a concave monitor? And how do you keep your eyes at all times at the center of your curved monitor's arbitrary arc?
  • Samus - Saturday, December 16, 2017 - link

That's because it is a gimmick on TVs. You simply aren't close enough to even notice the curve. On a monitor you are two feet away, so the immersion works better.

Also, this monitor looks awesome. Finally, 144Hz VA panels.
  • Alexvrb - Saturday, December 16, 2017 - link

I have to agree that curved displays are nice for gaming (though not essential). As this is a gaming-focused display, those not gaming should probably look elsewhere for a display.

    As far as high refresh VA panels go, I agree but we definitely need testing for input lag.
  • Lolimaster - Saturday, December 16, 2017 - link

The new VA quantum-dot monitors from Samsung would be great with 144Hz at 1080p/1440p, without the crappy curve.
  • Solandri - Sunday, December 17, 2017 - link

    The curve is structural. Take a piece of paper, grab the bottom corners and try to hold it upright. The top half will fold over. Now try it again except give it a slight curve. The top half will stay up.

    The curve gives the structure some rigidity against flopping over. This wasn't important when TVs had big tubes, or LCD backlights required they be 2 inches thick. But the thinner you make the frame, the less rigidity it has. Curving it can give you back some of that rigidity without having to make it thicker or use stronger structural materials like steel.
  • Icehawk - Saturday, December 16, 2017 - link

    1080p and 32"? Hard pass. 27" is as big as I am willing to go with that and even there I much prefer a 2k monitor.
  • abrowne1993 - Saturday, December 16, 2017 - link

    I don't think 2K means what you think it means.
  • FullmetalTitan - Saturday, December 16, 2017 - link

    I think you know he meant 1440p/QHD
  • Death666Angel - Saturday, December 16, 2017 - link

    That's a stupid use of it though. 2k / 4k, 1080p / 2160p. 2560x1440 is closer to 3k than 2k.
  • FullmetalTitan - Saturday, December 16, 2017 - link

    I agree, the fact that every major resolution 'node' has like 4 names is endlessly frustrating. At least this is more sane than the mobile space where there is essentially a continuum of resolutions and aspect ratios
  • Lord of the Bored - Sunday, December 17, 2017 - link

    And that's why I still type a complete resolution out manually like a REAL MAN that knows how to configure IRQ lines with jumper blocks and all that other arcane crap that died out over a decade ago as people stopped needing to know how a computer actually WORKED and just started leaning on buzzwords.
    ...
    That said, I'm hooked to a 1920x1080 display right now. 16:9, as the resolution implies, but doesn't actually specify(Alert! Assumption of square pixels! Image distortion imminent! Danger! Danger!).
  • Hurr Durr - Sunday, December 17, 2017 - link

    >ME KNOW IRQs
    >ME DA REAL MAN

    I bet you also install gentoo in da club.
  • Icehawk - Monday, December 18, 2017 - link

    LOL, yeah I meant "whatever x 1440", "2k" was probably not the best way of saying that - I agree the naming conventions are super screwed and it's probably better to just spell out exactly what resolution you are calling out for. IMO, a vertical resolution under 1440 in 2017 is just not something I would ever buy again for personal use.

So far IME the ultrawides can be a negative for gaming; Overwatch, for example, will actually shrink the amount you can see vertically on a 21:9 :(
  • piroroadkill - Monday, December 18, 2017 - link

    16:9, curved - huh? What for..
    31.5".. nice... 1920×1080... what the?
    Who is this product for?
  • valinor89 - Tuesday, December 19, 2017 - link

People with seriously impaired vision, apparently...
  • BenJeremy - Monday, December 18, 2017 - link

    For computer monitors, curved displays make sense (single viewer)... that said, what I don't understand is the continued release of 1080p resolution monitors, and shallow curves. Gamers need WQHD or better in ultra-wide, and we want it to wrap around us.

    Monitors have competition now, at least for gaming, with VR headsets. Giving gamers a wider FOV will provide player advantages and keep monitors relevant.
  • Wolfpup - Thursday, January 4, 2018 - link

    Okay, VA got my attention. WAY too many monitors (and even TVs) using IPS (or TN). VA's the way to go IMO.

    BUT curved? I don't want that, and of course ideally I'd like probably 4K at this point for that size, and want...well, ideally want Freesync AND Gsync (not even sure it's possible to support both).

    But other than being curved, that actually sounds like a really nice lower end monitor. Particularly when even "high end" monitors are using TN and IPS most of the time now.
