I need to buy a new monitor.
Should I get a 144hz monitor? Is it worth the extra money?

Edit: Also, what do you think about 1080p vs 1440p?

9 years ago*


Is a 144 Hz monitor worth the extra cost? Is the change in FPS noticeable to the eye?

Yes - 144 Hz is worth the extra money
No - Stick to 60 Hz

If you've got enough FPS, then yes.


It doesn't cause motion sickness in case it's 'too much FPS'?


Well, if you have 120+ FPS you will notice the difference; it's noticeably smoother. If you get less than 120 FPS, don't buy it.


There is no such thing as too much FPS.


ROFL
Best joke of the year, you made my day ^_^

My answer is the same as everyone else's: if you have the FPS and the cash, go for it; if you're like me and get below 60 FPS in almost any modern game (med-high settings), don't.


More Hz reduces motion sickness; it doesn't increase it.
I got my Asus ROG Swift because I wanted to reduce my motion sickness in games. It seems to have had an impact.


More Hz reduces motion sickness; it doesn't increase it.

Not always. Well, not for me at least. I can't speak for everyone else.


Well, maybe it didn't help in your case, but I doubt it made it worse.


60 FPS on a 144 Hz monitor is heaps better than 60 FPS on 60 Hz.
Regardless of your FPS, 144 Hz is awesome.


3kliksphilip has an excellent YouTube video on it.


Yes, as the first comment said, it depends on what you play and what you want. For me, a 60 Hz 1440p IPS is better than 144 Hz, because I love the visuals you can get rather than the smoothness that 144 Hz gives you, and I don't play many FPS games or MOBAs. There is a Korean monitor that is 1440p, 27", IPS, and overclockable (96-120 Hz), so you can get really good visuals with greater smoothness than a 60 Hz monitor. It's also pretty cheap, around $300. http://www.amazon.com/Perfect-QNIX-Evolution-2560x1440-Monitor/dp/B00TPGCAAS/ref=sr_1_1?ie=UTF8&qid=1436080706&sr=8-1&keywords=qnix+27+2560x1440.
Beware: if you ever buy this, be sure to order a third-party DVI cable, because the one that comes with the monitor is SHIT.


Interesting. Is there a noticeable difference between 1080p and 1440p?


Yep. I didn't use to think so, but there is. However, only get 1440p if you have a very high-end GPU. Otherwise, don't bother.


Yes, but it's nothing super exciting unless you're talking about a 27" or larger size (or have a very fine/professional eye and aren't just gaming on it).


Native resolution is literally all that matters when it comes to what resolution monitor you are getting. There's no difference as long as it's native.


If you can wait, go ahead and wait. It's not something you should go out and get immediately if you have other things you want and already have a great 1080p monitor.


If you play at a high competitive level in an FPS game, go for it.


If your rig can render more than 100 FPS then go for it; otherwise it's a waste of money for the next couple of years.


A FreeSync/G-Sync monitor is better.


Even if I have an AMD card?


FreeSync would be your choice then.


FreeSync monitors are for AMD cards and G-Sync for Nvidia. You can go to AMD's site to see if your card is compatible.


Nope: FreeSync monitors are for everyone (it's just a DisplayPort protocol modification, an optional VESA standard, for NOW), because you don't need additional hardware like G-Sync does. BUT Nvidia won't support it, because they want to promote their closed technology.
So G-Sync is and will be Nvidia-only, but FreeSync isn't; technically they would just need to support it in their drivers. :P

http://wccftech.com/nvidia-plans-support-freesync-displayport-adaptive-sync/


I know that. But right now FreeSync is supported only by AMD cards, and we can't make suggestions to someone based on what might happen in the future.


You are right, but my intention is to clarify that choosing G-Sync over FreeSync will bind you to Nvidia cards forever (or at least for the lifespan of their closed technology). I wasn't a fan of PhysX first, and now this monitor lock-in thing. -.-
Just to clarify, I'm not an AMD fanboy... Nvidia has over 70% market share; I hope/suppose it's not only because of the brand... :P


I understand. Every GPU I've ever had has been AMD, because they usually offer better value for money and I think supporting the underdog is better for consumers in the long term, but the current reality is that you can't recommend a FreeSync monitor to an Nvidia user unless they are thinking about buying a new GPU too. :)


G-Sync is heaps better, though. The ranges for FreeSync are too small. G-Sync also works in windowed mode, which FreeSync does not. I paid a lot for my G-Sync monitor, and I have yet to regret it.


Even if it is better, de facto it chains you to Nvidia cards, so it's a no-no for me. And after the latest news about the company, I will stay far away from them (the 4 GB false advertising, the Arkham Knight 60 FPS trailer, and generally the poor optimization of every game sponsored by Nvidia). I'm very glad it's working for you, though. :)


I don't really care for AMD or Nvidia. I chose G-Sync because it was the better product. If in two years' time I decide I really must have an AMD card, then I will just sell the monitor and get something comparable for AMD. But I honestly don't see Nvidia being outplayed in the GPU department so badly that I would bother changing the whole setup. My last GPU was an AMD 5780 that I had for 5 years, but since then they have only cut their R&D budget and haven't had any big wins. Not to mention Nvidia seems to have better drivers and less frame stuttering, which used to give me migraines.


FreeSync's maximum range is 9-240 Hz, but in practice it is limited to the supported range of the monitor.


That is the theoretical range.
In practice, so far no monitor exists, even on paper, that actually has a decent range.
If the OP decided to buy a FreeSync monitor today, he would have to put up with the restricted range, which would not be an issue with G-Sync.


Deleted

This comment was deleted 11 months ago.


That is because all 144 Hz monitors use TN panels, which are worse than IPS/VA panels in terms of image quality (and color fidelity).


Since last summer there have been 120 and 144 Hz IPS panels out.


Depends on the panel. While as a general rule IPS is better than TN, the high-end TN panels are better than the majority of IPS panels if you don't need color accuracy and can ignore the narrow viewing angles.


It's a matter of taste, but you will never see a TV with a TN panel, for this exact reason. I appreciate image quality, and TN panels (600€ ones included) are way below my standards.

Anyway, the perfect LCD monitor technology doesn't exist; all panels have their pros and cons.


The reason TVs are not TN is viewing angles. A PC monitor is in front of you, and you will almost always be sitting in exactly the same position, so angles aren't an issue. A TV is meant to be viewed from different angles by multiple people at the same time.


Because of TN panels' shitty viewing angles AND image quality. A TV is meant to display accurate colors and the best picture.


Nope. Good TN panels do surpass the average eIPS screens you see for cheap.

If people cared about accurate colors on TVs, they wouldn't be coming out of the factory with white points around 9000K.


See it like this: how often do you purchase a monitor?

They usually last for years to come, so IMHO go with the best 144 Hz monitor you are able to afford ^^


That's true, monitors stay with you for a while. Trying to decide if it's worth... well, literally twice the price.


Just buy a dirt-cheap 60 Hz monitor and upgrade to 144 Hz in the future, when they are cheaper and better.


Depends on what games you are playing.


If you have money to spare, you can buy it. If not, I don't think it's worth buying. For me the difference in performance is too small to pay twice as much as for 60 Hz.


It is worth it ONLY if you are playing fast-paced competitive shooters.


If you have to ask, then NO... they are not IPS or VA panels, so the colors & viewing angles won't be as good, and you will very rarely be able to reach anywhere close to that framerate unless you have extreme hardware (even then, a lot of games can get CPU-limited).

As for FreeSync: do you care about tearing? Do you always have vsync on, like I do? Are you sure you're staying with your AMD card for more than the next year?

You asked about 1440p... well, you're going to get all kinds of anecdotal stories. You really should know what you want after reading about how the various specs & features work (in this case, 1440p means lower FPS since it's more demanding; it also means all 1080p content like Blu-rays, videos, or console games will be blurry, since it has to upscale, or will have black bars on all sides if the monitor can display at 1:1 scaling).


Interesting, thank you for the reply.

Hmm, a bit of a case for going above 1080p versus the hardware demand is that at 1440p you don't need all the fancy FXAA filters that you do at 1080p, so it's less taxing on the system?

It does seem like the resolutions above 1080p have to mature a bit before buying a 2K screen... hmm...


Just for your information, I'll add that, contrary to what kn00tcn says, Blu-rays will look better on a 1440p screen, because usually 1080p is not able to display all the details (because of the aspect ratio).

And buy a monitor with an IPS or PLS screen. Seriously, don't even consider TN. And 1080p = 2K; 1440p = 2.5K.


U wot m8? Isn't the actual video data 1920x1080? It doesn't have more pixels than that... (it doesn't have to be x1080 due to black bars, but it's 1920 wide for sure), so it's impossible for it to look better.


It is. But the aspect ratio usually isn't 16:9 = 1.78:1; it is 2.35:1. That makes the pixels rectangular by default, not square like on a monitor. A Blu-ray movie on a 1920x1080 monitor will be scaled down to 1920x817 pixels. To prevent any data loss, the same movie would have to be scaled up to 2538x1080 with the correct aspect ratio. Read about this in short here or here or on Wikipedia.
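The arithmetic behind those numbers is easy to check: a 2.35:1 picture on a 16:9 panel either fills the width (and loses rows to black bars) or would need a wider panel to use every row. A minimal sketch; the function names are just for illustration:

```python
# Letterbox math for a 2.35:1 film inside 16:9 frames.
# Matches the figures in the comment above: 1920x817 active picture,
# or 2538x1080 if the film were to use all 1080 rows.

def fitted_height(width: int, aspect: float) -> int:
    """Picture height when the film fills the panel's full width."""
    return round(width / aspect)

def width_for_full_height(height: int, aspect: float) -> int:
    """Panel width needed for the film to use every row."""
    return round(height * aspect)

print(fitted_height(1920, 2.35))          # 817  -> black bars above/below
print(width_for_full_height(1080, 2.35))  # 2538 -> wider than any 16:9 panel
```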


What does that have to do with anything? I am talking about losing sharpness due to scaling the image, exactly as your first link said:

"If you're one of the dozen or so people who bought one of those 21:9 TVs, or have a ultra-wide-screen projection setup (as I do), expanding the image to fill the wider screen can result in a slight softness."

If the data is 1920 wide (it doesn't matter how tall; black bars are your friend, as someone in the second link said), then there is no more data: you have the maximum quality (source-material sharpness) when displaying it at 1920.

So then how can Blu-rays 'look better'? Not all are super wide, btw; what about 16:9 1080p TV shows?

Similarly, if you have a 21:9 monitor that's the same width as a 16:9 monitor at the same viewing distance, showing some 21:9 movie, then both images will be visually the same, other than the 21:9 one being blurrier due to having a different number of pixels than the source footage; the 16:9 merely has extra black bars, since it's a taller monitor in comparison... only CRTs can display arbitrary resolutions without post-processing.


That also doesn't make sense. Are you going to mathematically position the monitor exactly far enough away from a 1080p one that the size of the rectangle is the same? Or would you have the monitor in the same position and just have more space: more pixels, same quality & distance, more immersive? (I know I wouldn't move further away; I also sit quite close to my monitors, only a forearm's length.)


Why would there be bars if you are watching 1080p content on a 1440p screen?
As for videos, I seriously doubt anyone can tell by looking at a video whether it has been upscaled or not. In fact, there are plugins that let you play video at 144 Hz. I personally never saw the point, but some people can apparently tell the difference.


Yes, you can tell the difference. Some people want to smooth videos; that's fine, so there's a point, just be aware that the image is being altered.

You would have black bars if you choose to display content 1:1. For example, a console game will be 1920x1080 (it CANNOT render more than that), therefore a 1440p monitor is going to be blurry by default, or crisp with black bars at 1:1. (I could overcomplicate the example by mentioning how some last-gen games are 1600x900 or 1280x720.)


Can somebody answer me this, please? It might be stupid, but:
For example, if I go and buy a 1080p 144 Hz monitor and I can't maintain 144 FPS in games, will it be stuttering, looking ugly, tearing, and stuff like that?
I have lots of games that I can't max out at 60 FPS, so that's what I'm wondering. Or will it be the same as if I had a 60 Hz monitor, with the only difference being that I can see more FPS when I get more than 60?
And I've read everywhere that they're good for competitive games. Are they really?


As far as I understand, tearing is mainly caused by having too many FPS for the monitor, so you should get less tearing. This also means you don't need to use V-sync to reduce tearing, and not using V-sync helps reduce input lag. Your minimum FPS won't change but your maximum will, so the ups and downs may be larger (stuttering), but at high FPS this should not be as noticeable.

Edit: changed it a bit to link input lag with V-sync and stuttering with FPS ups/downs


Thanks a lot, that explains it.
I never used V-sync; I can't stand the input lag.
And since I'm mainly playing shooters and I love multiplayer games, I think this would be nice... but it's still way too expensive. The cheapest ones I've found are ~$250 on Amazon.


Yes, I meant to say input lag with V-sync; I edited my description. Yeah, I think it might be nice, but some people won't see any difference, so you should hopefully try it out sometime before buying.


Hopefully I'll get to try it out sometime. I've watched some YT videos; sometimes enemies appear like half a second earlier.


I love shooters & multiplayer, but I always vsync... there are things to note about input lag:
- monitors themselves have their own input lag caused by their image processing; every LCD is laggier than a CRT, and in extreme cases a laggy LCD (anything over 30 ms of lag) could be worse with vsync off than a CRT with vsync on
- you may have noticed some games aren't as laggy as others when vsync is enabled; this means it's time to investigate...
- one trick I use is to cap the max framerate to 60 (some people do 59, but I don't want to see that stutter); I have noticed that in some game engines that don't cap internally, vsync on results in extreme lag, while capped + vsync greatly reduces the lag
- tools I use to cap: Bandicam, RTSS (RivaTuner Statistics Server, it comes with Afterburner), or if the game engine has its own config tweak, I would try that first
- there are even tools to force vsync in their own somewhat complex ways, with multiple options, like RadeonPro

$250 is expensive? But that's almost down to the price of regular 24" monitors; even crap ones start at $130+.
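Where the engine supports it, the engine-level cap mentioned above is usually a single console variable. For example, in Source-engine games such as CS:GO, `fps_max` is the real variable; the value here is just an illustration:

```
// autoexec.cfg (Source engine)
fps_max 120   // cap rendering at 120 FPS; capping plus vsync can reduce queued-frame lag
```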


I'm mainly playing CS:GO now, sometimes Dota; that's about all I care about that is competitive. I don't really mind lower framerates in games like The Witcher 3 or similar.
Yeah, I do cap my games to 60, except in CS:GO, where it is 120.
Oh, it is expensive. I come from Bosnia; wages here are around 800-900 BAM, and right now $1 is around 1.8 BAM.
Not to mention that the technology we have here is at least a few years behind anywhere in the West, so I'd have to pay for shipping and all the other stuff.


But what is your own wage & how much are you willing to spend? :P (I expect Bosnia to be catching up; maybe there are online stores or other enthusiasts. I... left Bulgaria when I was a kid, but I visit every few years, and it's greatly improved compared to before... in the 90s they thought a hamburger contained ham; now there are multiple competing tech stores like in Western countries. Eastern Europe seems to be doing better lately, never mind Greece's money issues.)

If you're trying to keep costs down, you might as well not bother with luxury features, especially if you're not sure you'll notice them or you're fine with what you have now (no annoyances, etc.). Plus the technologies SHOULD improve over time, so buying later might end up being a good situation.

The other thing is your priorities; my primary need in a monitor is color quality, not speed.

Competitive is one thing, but do you think a few milliseconds will make a big difference? Are you on a team?


I'm almost 17 years old, and I'm trying my best to earn as much as I can helping around, a bit over YouTube, and stuff like that.

Oh, I know the technology is going to improve, and I'm looking forward to VR more than this monitor, but I was simply wondering if it's as good as some people keep saying it is. And maybe if it were a bit cheaper, I could sell my current one and try to buy it.

I currently have an LG E2342, bought... I don't know, 2-3 years ago at least, and I've never had that many monitors, so I can't compare the color quality... never had that much choice here either.

In CS:GO, sometimes you can see an enemy almost half a second earlier than you could at 60 Hz; I watched some YT comparison videos and saw that. Because I don't have godlike reflexes, it could come in handy.
Nope, I'm currently LE. I was merely referring to matchmaking in CS:GO. It's fun, because people are trying to win and are communicating. But if someday I could improve enough to play even semi-pro matches, it would be great. But if you meant team as in a party, sure, kinda; it's not a full party, but I've got a couple of friends who I play with.


Show me the comparison; I have also heard of people just running very high framerates even on 60 Hz monitors & feeling they have an edge.

Think about the time difference, though: 30 FPS is 33.33 milliseconds per frame, 60 FPS is 16.67 ms, 120 FPS is 8.33 ms, 144 FPS is 6.94 ms... notice how it's not linear.
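The per-frame arithmetic is just 1000 / fps; a minimal sketch:

```python
# Frame time in milliseconds for a given framerate: one second
# (1000 ms) divided by the number of frames shown in that second.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

Note how the gain shrinks: going from 30 to 60 FPS saves ~16.7 ms per frame, while going from 120 to 144 saves only ~1.4 ms.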


Sorry, I've been busy. I've just been searching on YT for "CS:GO 60 vs 144 Hz monitor" and watching some videos, but they're now buried deep down in my history, and I couldn't find the one I was talking about, because they're quite long.

It's really late now, and I've really got no clue where you're getting those ms delays, or whatever they might be, from...


I was writing the length of time per frame for those four framerates; I wanted to show you that it's just a few milliseconds of difference at high FPS, making a high refresh rate not as big of a deal (compared to some console games being 30 FPS, or your computer being slow).


Tearing is caused by not being aligned to the refresh rate, at ANY framerate: on 60 Hz, if you're at 59 FPS or 61 FPS, you're tearing (technically you're tearing at 60 FPS too if you're not synced; the tear point stays in one spot, which can be even more distracting).


The higher the refresh rate, the less obvious tearing is (since there are more & shorter-lived tear points).


Thanks for all these informative posts/comments.


^this.


Yes, it does; you can especially see it in the latest hits. I tried The Witcher 3 at 144 Hz, then switched to 100 Hz and noticed a small difference, then switched to 60 Hz and could clearly see the less smooth framerate. So my answer: yes.


It is worth it if you play a lot of FPS games, but don't try to do graphics/photo work with it, lol.


It is worth it because it puts less strain on the eyes. So if you work a lot or spend much time in front of a monitor, get it.
For gaming... only if you have hardware that can handle 144 frames per second. AFAIK even today that means SLI GTX 980s / CrossFire R9 290Xs, or a Titan X, for games made in the past 2-3 years, especially above 1080p.

1440p, or "2K", is a lot better, yes, because the higher DPI makes everything look sharper.

It's another matter that a "2K" 144 Hz monitor costs a fortune.


2k is just another fancy way of saying 1080p. And those are cheap.


Are you sure? Because 2160p monitors, such as this one, have existed on the consumer market for some time.


Your example is a 4K monitor; it says so right there in the description.

According to Wiki:
"2K resolution is a generic term for display devices or content having horizontal resolution on the order of 2,000 pixels. DCI or the Digital Cinema Initiatives defines 2K resolution standard as 2048×1080, or 1998x1080 as Flat presentation."

While not exactly 1920x1080, that's close enough as far as performance at that resolution is concerned.

The whole naming system is ambiguous, but once you get close to monitors around 4,000 pixels wide, they are called 4K, regardless of how many vertical pixels they have.


I really hate marketing sometimes, with their stupid, always-changing naming conventions...
Thank you for clearing this up for me, though; I appreciate it.


If you have a good PC, buy 144 Hz. If you don't play much competitive multiplayer, you can even buy an LED TV with Clear Motion/MotionFlow/TruMotion 200 Hz or more; you'll get double the FPS (the TV's processor interpolates the extra frames) in games without needing a new GPU. I have one; it works like a charm.


Just save your money; do not buy an expensive monitor... Maybe by Christmas there will finally be VR, Oculus or another. Then you will need to upgrade your graphics card (if you don't have a great one now) and also buy a VR headset, which should cost around 300 dollars.
With VR you will see a much bigger screen and also a higher framerate (I think they are planning 90 FPS).


Yep, it's current technology that will be outdated and replaced in the very near future. Stick with what you've got for now.


Closed 9 years ago by Tzell.