9 years ago*


What resolution do you usually game at?

4k
1440
1080
other
Deleted

This comment was deleted 4 years ago.

9 years ago*

Deleted

This comment was deleted 5 years ago.

9 years ago

Deleted

This comment was deleted 4 years ago.

9 years ago

For HDMI 1.4, yes, but HDMI 2.0 can do 60 Hz, as can Nvidia Kepler+ cards using compression over HDMI 1.4.

9 years ago

2880x1620

9 years ago

The UI of what? Many games have UI display scaling. If you're referring to Windows, set display scaling to 150% in the control panel.

9 years ago

900p

9 years ago

Still here. Moving to 1080p when my monitor dies.

9 years ago

Deleted

This comment was deleted 5 years ago.

9 years ago

Got a 1440x900 (16:10). The older standard.

9 years ago

The best standard. 900p masterrace :p

9 years ago

I don't know the specifics of how it works technically, but I'm just using a plain Intel HD 4000 with the output on a 1080p screen.

9 years ago

My next computer this year will be 2K. I doubt any single GPU that doesn't break the bank will handle 4K at 60 fps.

9 years ago

Deleted

This comment was deleted 5 years ago.

9 years ago

Too expensive for my blood to go higher than 1080p

9 years ago

Yep, I'm gonna stay on 5760x1080 for a while myself until prices go down.

9 years ago

Yeah... I play on one screen only

9 years ago

Same here, though I'm thinking of getting a second monitor to follow chats or watch movies while grinding in games...

9 years ago

I'm using 1280x1024 on my 42 inch TV, partially because I like to be able to read things without tweaking the font sizes, and because I'm not that big of a stickler for resolution.

9 years ago

I have a 1600x900 monitor but I play in 1080p.

9 years ago

I don't understand the punchline.

9 years ago

Errrr, it's not a punchline? lol

Sort of like downsampling I guess -> Clicky

9 years ago

I didn't know you could downsample like that without a tool like GeDoSaTo. I have both 900p and 1080p monitors, and I don't really notice much difference with the 900p one, although it can mainly only run older or lower-spec games with its Intel HD 4400.

9 years ago

I am using a 42 inch 4K TV as a monitor. It works superbly for some games, but for multiplayer shooters... well, it's not the best thing in the world. The worst part is the refresh rate, max 30 Hz. Some games look weird because of the refresh rate when you start to play in 4K, but if you really like the image quality, you get accustomed to it. If you play a lot of multiplayer shooters, I DON'T recommend a 4K TV. Not sure about monitors.

For games like Diablo, Lego, Batman and Saints Row it works well. I like shooters, but most of the ones I play are singleplayer, because getting good mouse sensitivity in multiplayer is hell. I am running Heroes of the Storm and it runs flawlessly on this TV.

If you want to go pro in multiplayer shooters, a 4K TV as a monitor is not the best choice. Also, you will notice some mouse lag in Windows. Not much, but it will be noticeable.

Finally, I have made tons of mods and maps for a couple of games. For design, 4K is a beast: I can do a ton of stuff on the huge screen. I used three monitors in the past, but the bezels are so annoying when you are focusing on something and need to look at the other screen. For design, mods, maps and such, 4K is super cool. Works for me.

Running with good gear, most games go Ultra in 4K: 16 GB RAM, a 5 TB hard drive and an AMD R9 200 Series card. Personally, I love the experience and the quality of everything. It all looks amazing, and I am more than happy to be able to play games on this PC.

Best Regards!

9 years ago

Deleted

This comment was deleted 5 years ago.

9 years ago

No problem, feel free to join me anytime. Best luck!

9 years ago

Technology today is capable of downsampling 4K to your native resolution. Ground Zeroes, and eventually The Phantom Pain, look terrific at 1200p. It is a great demo of what you can expect with a larger monitor.
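
Downsampling like this is conceptually just rendering more pixels than the display has and averaging them down to native resolution, which is why it acts like antialiasing. A toy NumPy sketch of the idea (an illustration only; GeDoSaTo and driver features like DSR use more sophisticated filters):

```python
import numpy as np

def downsample_2x(frame):
    """Average each 2x2 block of pixels into one output pixel.

    frame: (H, W, 3) array rendered at 2x the target resolution;
    returns an (H/2, W/2, 3) array. Real implementations use fancier
    filters (e.g. Gaussian), but the principle is the same.
    """
    h, w, c = frame.shape
    blocks = frame.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Stand-in for a 3840x2160 render scaled to a 1920x1080 display,
# shrunk to 8x8 -> 4x4 so the demo stays tiny.
hi_res = np.random.rand(8, 8, 3)
lo_res = downsample_2x(hi_res)
print(lo_res.shape)  # (4, 4, 3)
```

Each output pixel carries information from four rendered pixels, which is where the smoother edges come from.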

9 years ago

TIL monitors cost 4k dollars.

9 years ago

1200p

9 years ago

Good old 8:5 :) I have 1050p myself. I wish TVs had standardised on 1200p instead of 1080p because the former really hit a sweet spot.

9 years ago

4K. CrossFire Radeon R9 280. Iiyama 4K 28" monitor.

9 years ago

1080p.

Want to upgrade, don't have money for that.

9 years ago

I have a 1080p monitor on my desktop but I run 1440 on my surface. Sometimes I stream games to my surface.

9 years ago

Full HD on a 27" Acer monitor. AMD Radeon R9 280X with 3 GB GDDR5, 8 GB RAM. Everything is smooth as butter.

9 years ago

so what kind of butter do you use?

9 years ago

Peanut butter, of course!

9 years ago

oh so jelly

9 years ago

I run 2560x1080 (21:9) with this system: i5-3570, GTX 970, 16 GB RAM, 1 TB SSD. My main issue with certain games is not the performance but the aspect ratio, though most games support it (or there is a fix). Love 21:9! :)

A single 780 is not enough for 4K. Not even close. ;)

9 years ago

Depends on the game. I bet you could easily get 60 fps at maximum detail out of Quake 3, even on a 770.

But Metro 2033 needed four Titans to hit Ultra detail at 4K and 60 fps.

9 years ago

Of course it depends. Everything always depends. ^^

But I don't think someone who asks this question wants to play only 15-year-old games. The general answer for modern games? Not even close! ;)

9 years ago

And Metro 2033 is five years old now; it is not exactly a modern game either. And the 780 is a slightly weaker version of the Titan.

I was just making the point that you can't generalize. I bet that with some time I could find some 2014 or maybe even 2015 games that work just about fine in 4K on a GeForce 780. But still, that is about a quarter of the power needed to run a five-year-old game perfectly in 4K. If I remember correctly, Metro 2033 had framerate drops to something like 30 fps in 1080p with a single Titan (don't quote me on that). But I think you can get Skyrim to run at 30+ fps on High/Very High settings on both of those cards (you might need to overclock them a little; again, don't quote me on that, but I remember seeing Skyrim at Very High details, 1440p, 80+ fps on an overclocked Titan without SLI).

9 years ago

Of course you can generalize; that doesn't mean there can't be exceptions. Crysis was the one game that asked for more GPU power than most people could deliver, for many years after release. But in general you were fine for years with an 8800 GTX, so the statement "in general you can run basically everything maxed out on an 8800 GTX" was absolutely fine, even with that one exception.

So when I say now that a 780 is not sufficient for 4K, it should be obvious that I am not talking about Quake 3. ;) And I think it is a reasonable assumption that someone who asks about 4K probably wants to play games in 4K that are graphically up to date and quite demanding. That doesn't mean we are only talking about high-end graphics like Crysis 3, BF4 or AC Unity; even games like Borderlands 2/3 will need significant processing power at 4K and high settings.

9 years ago

The Borderlands Pre-Sequel has some optimization issues. For example, it always wants to run on three cores, a leftover from the Xbox 360 conversion: dual-core processors get an uneven load, and on four cores one core sits idle. The problem grows with higher resolutions, since all rendering happens in a single thread. As a result you can get it to run at 60 fps on a GeForce 680 with a quad-core i5/i7, but with a dual-core processor you will get less than 40 fps. A high-end triple-core AMD will also work fine. The power of a single core matters too; just adding more cores won't fix anything. To be honest, this is a problem with all games using DirectX (maybe DX12 will fix it).

Still, you probably could get 60 fps at 4K in Crysis 3 or Battlefield 4 on a single GF 780. Just set the details to minimum.

Also, the question is: do you want 60 fps, or are you fine with 30? Because you can get 30 fps on a 780 in Tomb Raider, Hitman: Absolution and DiRT: Showdown, all on High/Very High settings. And I am sure this list is much longer; I just don't have the time to look for more examples.

9 years ago

Sorry, you can't get 60 fps in Crysis 3 on a 780, not even on low settings. It's actually not that easy to find a benchmark that also shows low quality settings, but I found one:

http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/14

39.4 fps is pretty bad for a shooter; in fact, many PC gamers would consider that barely playable. And this is on low settings. Choosing the lowest settings just for 4K resolution is already questionable in my opinion; raise the graphics quality and you can completely forget it.

Not even the 980 gets 60 fps. That's what I mean: if you want to be able to play modern games in 4K, you need a setup with at least two GPUs. Yes, there are less demanding games which you will be able to play at 4K@60fps if you adjust the settings. But if you invest so heavily in a 4K setup, are those games really why you are doing it? I would assume someone who wants a 4K-ready rig also wants to enjoy games with high graphical fidelity, and many of those are demanding like crazy, unfortunately. ;)

You mentioned Tomb Raider:

http://www.tomshardware.com/news/amd-radeon-r9-290x-benchmark-performance,24732.html

27 fps on Ultra at 4K. So even to maintain a 30 fps average, you will have to reduce graphics quality. And we are talking about average fps here, of course; you will probably have to go significantly lower to stay above 30 all the time. And of course you have to be willing to play at 30 fps instead of 60 in the first place. I personally wouldn't want that, even if 30 fps is more acceptable in a third-person game than in shooters, racing games and the like.

Well, that's my point. With a 780 you will have to make huge compromises left and right if you want to play at 4K: either you heavily reduce the graphics settings, or you can completely forget it, because you won't get a playable framerate even on the lowest settings.

9 years ago

Can you show me where I said that you can play Tomb Raider on Ultra in 4K? Because as I recall, I clearly said High/Very High settings. On Crysis 3 I also said "probably".

And if you look a few posts below, you will find my post explaining why it is better to play at Ultra details in 1080p instead of lowering quality to get 4K resolution.

9 years ago

Yes, you said "probably", and I didn't say anywhere that you didn't, or that you said anything else. You said you assume a 780 can probably run Crysis 3 on low settings at 4K; then it was my turn in this discussion, and I tried to show that a 780 won't do it even on low settings. No need to get defensive, we are just discussing a topic we obviously both like to talk about. :)

9 years ago

So maybe I overreacted. Sorry about that.

And yes, I like to talk about it. I love to talk about how games are made and about the technical side of gaming. And for some time now I have considered the resolution battles to be just another way to sell technology that is next to useless but very easy to market. 4K resolution is a great marketing tool, but since we still lack the hardware to push 1080p to its limits, it is much too early to jump to 4K.

Fun fact: Skyrim (and probably 99% of all games) must upscale some textures on the fly even when playing at 720p, because the source texture resolution is too low. And while this is usually invisible to the player, graphics quality could easily be boosted by using higher resolution textures. Even the high resolution mod still isn't high enough to work flawlessly at 1080p; there is still room for improvement, which would give better results than just pushing the rendering resolution higher. Compressed textures at the optimal resolution for displaying Skyrim in 4K would take between 65 and 80 GB, and until games are prepared to use that kind of resolution, playing in 4K is just another form of antialiasing.

9 years ago

Yep, texture quality is extremely important, and that is something that is especially bad in many console ports, unfortunately. It makes me think of that TotalBiscuit video where he plays the first few minutes of The Vanishing of Ethan Carter (great game, btw). He walks onto a bridge, sees a sign, stops and says something like:

"This is an awesome sign texture. I can actually read all the text on it. This is really impressive. In fact, I think this is the most gorgeous sign I have ever seen in a video game."

xD

9 years ago

It's too early for 4k (with a reasonable budget)

9 years ago

+1

9 years ago

Deleted

This comment was deleted 5 years ago.

9 years ago

Deleted

This comment was deleted 2 months ago.

9 years ago

1080p on desktop, 1366x768 on laptop. 720p on phone.

Seriously, resolution is overrated. 1080p at 60 fps vs 4K at 45 fps: you will always get the better gaming experience at 1080p. At 4K you will see the LoD transitions that remain invisible at 1080p. Antialiasing, texture quality, lighting, shadows: all of them improve visuals in a way that is unobtainable by simply pumping more resolution into a game.

First of all, the human eye has its own internal resolution, and it is very close to 1080p. Does that mean it is pointless to go higher? Well, no. To make sure you don't get distortion while transferring any digital signal, you need to sample at twice the highest frequency in the signal. The same applies to resolution, but in two dimensions. In simpler words: the human eye can see only a very small difference between 1080p and 4K.
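
The "sample at twice the highest frequency" rule referenced above is the Nyquist criterion, and the distortion it avoids is aliasing. A tiny one-dimensional sketch of what goes wrong below that rate (pure illustration, unrelated to any particular display):

```python
import numpy as np

# A signal sampled below twice its frequency shows up as a
# different (aliased) frequency.
fs = 100          # sampling rate, Hz
f_true = 70       # signal frequency, above the Nyquist limit fs/2 = 50 Hz
t = np.arange(0, 1, 1 / fs)
samples = np.sin(2 * np.pi * f_true * t)

# At these sample points, the 70 Hz sine is indistinguishable from
# an inverted 30 Hz sine: sin(2*pi*0.7*n) == -sin(2*pi*0.3*n) for integer n.
alias = -np.sin(2 * np.pi * (fs - f_true) * t)
print(np.allclose(samples, alias))  # True
```

The same folding happens in two dimensions with fine image detail, which is why undersampled renders shimmer and need antialiasing.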

But these numbers are the best possible scenario. In the UK there was a test: groups of random people looked at two or three screens at different resolutions. All were 40" screens showing the same movie, recorded in 1080p; one screen showed it at native resolution, a second at 720p. The group that saw three screens also watched the same movie at 480p, all rescaled using lossless compression. All the screens had been calibrated for optimal colour balance, brightness, contrast and saturation. In the two-screen group, 58% claimed they couldn't see any difference between the screens, 11% claimed 720p looked better, and only 31% correctly picked 1080p as better. In the three-screen group, 29% said they couldn't see any difference, 8% picked 480p as best, 21% picked 720p and 42% picked 1080p.

Now, I know you can't directly translate those results to games, mostly because of aliasing, UI and fonts, but it still shows the pattern: when you get to a high enough resolution, you can't see any difference. (And the sales department has an army of people calibrating and de-calibrating screens to "prove" that you can.) And while, in my opinion, anyone who can't see a difference between 480p and 1080p should get some professional help and maybe glasses, I still claim that anything above 1080p on today's hardware is overkill.

There is a big price tag on 4K resolution. It doubles the resolution in both dimensions, so the computer needs to calculate four times more pixels, and that hurts. You won't get as big a performance drop from adding antialiasing, doubling texture resolution (which, by the way, is often too low for optimal display even at 1080p), improving shadows, or doing one of a hundred other things that improve video quality.

Also note that to get true 4K you need to increase the resolution of the textures, or you will get blurry textures; it won't be clearly visible, but at some angles and in some scenes you will notice that something is wrong. That increases texture size by a factor of four. So the textures for a game, instead of using 8 GB (the average size of the textures in a PS4 game; textures are usually built for 1080p even in 900p games, and there is usually little to no difference between the target texture resolutions in those two modes), will now take about 32 GB. That's another 24 GB to download or put on Blu-ray, a problem which isn't solved yet (but let's face it, games will soon exceed the size of a BD anyway, and some games will require two discs or a 60-100 GB download).

Part of the problem is texture compression: you can't use .jpg, you need either uncompressed textures or something like A8L8, S3TC or 3Dc, and those produce big files, like 16 MB uncompressed and between 4 and 2 MB compressed, times 2 or 3 textures per object, multiplied by the number of objects in the game... There are ways around this, but it takes a lot of time and effort to use them in a way the player won't notice. It is much cheaper to just leave all the textures big. In most games, the difference in visual quality is much bigger between High and Ultra at 1080p than between High at 1080p and High at 4K.
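
The arithmetic above is easy to check: 4K is exactly four times the pixels of 1080p, and the quoted per-texture sizes match a 2048x2048 texture stored uncompressed at 4 bytes/pixel versus S3TC-compressed at 1 byte/pixel (DXT5) or 0.5 bytes/pixel (DXT1). A quick sketch (the 2048x2048 size is my assumption for the example):

```python
MB = 1024 * 1024

def pixels(w, h):
    return w * h

def texture_mb(side, bytes_per_pixel):
    """Size of a square texture in MiB for a given storage format."""
    return side * side * bytes_per_pixel / MB

# 4K renders four times the pixels of 1080p.
print(pixels(3840, 2160) / pixels(1920, 1080))  # 4.0

# A 2048x2048 texture, matching the sizes quoted above:
print(texture_mb(2048, 4))    # uncompressed RGBA8 -> 16.0
print(texture_mb(2048, 1))    # DXT5, 1 byte/pixel -> 4.0
print(texture_mb(2048, 0.5))  # DXT1, 0.5 byte/pixel -> 2.0

# Doubling texture side length quadruples the footprint: 8 GB -> 32 GB.
print(8 * (4096 / 2048) ** 2)  # 32.0
```

The factor-of-four jump falls out of the geometry: twice the width times twice the height, for both rendered pixels and texture texels.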

tl;dr - Get a 4K screen only if you can play games at Ultra details. If you can't, stick to 1080p or 1440p.

Also, the technical side of graphics doesn't matter as much as the overall aesthetics. Play Silent Hill 2, Shadow of the Colossus, This War of Mine or Braid and you will know what I am talking about.

9 years ago

Another human eye can't see for shit post. It took hours to read that at 20fps.

9 years ago

Deleted

This comment was deleted 5 years ago.

9 years ago

First of all, sorry for the somewhat chaotic way of putting this. I know I have a tendency to do that, and English isn't my first language. (I don't live in the UK; I live in Poland, but Polish surveys are usually manipulated beyond comprehension.)

The one important thing in my post: it is not hard to run a game at 4K resolution. The problem is that unless you have extreme hardware (like four 780s instead of one), it will either look worse than the same game at 1080p on the same hardware, or it will drop frames.

9 years ago

Am I the only one who games at 3K, then?

EDIT: Are people talking about resolution or #p? I'm assuming the former but just figured I would ask. Never know these days.

9 years ago*

Does running some games at over 4K via DSR on a 1920 by 1200 (greater than 1080p) monitor kind of count as 4K? I marked "other" (as 1920 by 1200 isn't 1080p).

9 years ago*

Deleted

This comment was deleted 5 years ago.

9 years ago

I switch between down-sampling from up to 3840 by 2400 and 3840 by 2160 (4K), depending on whether I'm using my computer monitor or my TV as my main display.

9 years ago

2K games are not bad either, though there is some criticism regarding their DLC practices ;-)

9 years ago

1366x768 Master Race.
...Anyone?

9 years ago

Me!

9 years ago

Me too

9 years ago

You're not hipster enough for 1680x945.

9 years ago

Me! With a Zotac GTX 970! (Yeah, it seems like a waste).

9 years ago

It does! I hope that you're at least using DSR, but even then it DEFINITELY seems like a waste...

9 years ago

But of course I'm using DSR! Even so, I didn't buy this GPU for gaming, but for rendering stuff in Blender, etc...

9 years ago

I love my 1920x1200 monitor :3 I really wish 16:10 was the standard. As it is, it's better for everything that's not a movie ;p

9 years ago

Closed 9 years ago by upandout8.