Edit: Thanks for the help guys. It truly helped me solidify my CPU choice. Thank you.

---------------------
I have been researching new CPUs to choose one for my next gaming build, and one thing has confused me. I hope someone here can explain it to me.

I currently have an i7 4790K, which is getting old, runs hot, and is prone to random reboots, prompting me to build a new PC.
My options were basically the Ryzen 3900X or the Intel i9 9900K.
I was really keen to finally go back to an AMD CPU after a long time (an Athlon 3200+ was my last one), but despite all the promises of third-gen Ryzen beating the Intel 9900K in games, it actually didn't. In most games benchmarked it delivered about 10-15% fewer FPS than the 9900K, and Far Cry 5 had an even bigger delta between the two. If you factor in emulators and games like CSGO, Ryzen starts to look like a pretty bad investment from a purely gaming perspective.

That by itself would suggest that third-gen Ryzen, while a very capable chip in most scenarios, is not quite good enough to beat the 9900K in gaming. However, reviewers and people in general seem to just wave that away, saying the difference will shrink at higher resolutions. This is what I am struggling to come to terms with. How does a higher resolution make the Ryzen better? All it shows is that games tend to become GPU bound when you play them at 1440p on ultra settings. These arguments don't take the following into account at all:

1) As a gamer, I am likely to change the video settings to get the FPS I want, which doesn't necessarily leave the game GPU bound.

2) The average gamer, myself included, will likely use multiple GPUs over the life of a build. My last two PCs lasted about 5 years each, and I bought multiple GPUs over those years. I am currently running a 1080 and will likely upgrade to a 3080 Ti or even a 5080 Ti at some point. Games will become more demanding and my GPU will be upgraded to keep up with them, but my CPU will likely stay the same, because changing the CPU means I have to purchase a new motherboard, and probably new memory since DDR5 would be the standard by then. Spending the same amount of money on Ryzen vs Intel means that I am literally getting 10-15% fewer FPS in most games, and the CPU will become a bottleneck when I purchase a new GPU.

This isn't necessarily a complaint about these particular CPU tests, but about CPU testing as a whole. I have been hearing these same arguments for a very long time. The fact is, every single build I have ever made gets to a point where it is CPU bound for a while before I am forced to pull the trigger and upgrade the whole setup.

I am hoping one of you can clarify why people ignore the 1080p testing results and would rather go by the 1440p results, where the CPU isn't even the bottleneck. Is there something inherent to these higher resolutions that makes the CPU less relevant regardless of the settings? Or do CPUs still scale similarly as long as the game isn't GPU bound? I intend to purchase a CPU tomorrow, so this is my last-ditch effort to get some additional information to help me make an educated decision.

Thanks.

4 years ago*


Gamers Nexus did quite a good job covering the new Ryzen launch: AMD Ryzen 9 3900X Review & Benchmarks: Premiere, Blender, Gaming, & More

The 3900X and 9900K may be at roughly the same price, but you get 1.5x the cores and threads with Ryzen.
If you JUST want to game then there is no need to care, but if you want to do some streaming or audio / video / photo / professional stuff, then more cores are always handy.

also running games at 300fps with intel instead of 250 fps with amd doesn't gain you anything in the real world.

4 years ago

Yeah, I love watching their stuff.
I don't make any videos, and I don't do any scientific work or editing (just the occasional edit).
If Ryzen were only 5% behind Intel in FPS, I would still have gone Ryzen. But since gaming is my primary reason for using the PC, I feel I should focus on the gaming results.

4 years ago

well then you don't really need 12 cores anyway.
you could go for the 3800x with 8c/16t. doesn't really go faster but saves you 100 bucks for other stuff. better board or maybe some extra ram. or a sexy nh-d15.

fun fact: i'm rocking an i7 4770k and asus strix 1080 myself.

4 years ago

Budget isn't really an issue for me right now. :)
I already intend to buy the NH-D15 no matter what CPU I purchase.
And I already intend to get 32GB ram, so I don't think purchasing more will make it any better.

4 years ago

make sure to watch buildzoid's mainboard guide if you haven't already. ;)

if i had the money i'd buy a 3900x right away. i want them cores.
10% more fps doesn't mean much to me.
100 or 110 fps? both great.
30 or 33 fps? both shit.

in fact i'm planning a 3900x build for my little sister right now.
then i'll get her 'old' 1700x system. which i planned and built just two years ago. xD

again my current 4770k has considerably better single thread performance than 1700x but i prefer more cores.

4 years ago

I will watch it. Thanks.
And I agree that the difference between 100 and 110 FPS isn't that meaningful for most people, and neither is 30 vs 33 FPS. However, it is quite a difference at 60 vs 66.
I am particularly sensitive to stuttering, as it causes me migraines in certain games (I couldn't play games like Witcher 3 or Darksiders 2 for a very long time, until I had high-end hardware to run them at a very high FPS). Higher FPS mitigates it somewhat.

Once again, if I had a use case where more cores would be helpful, I would totally buy the Ryzen, if only to future-proof the build with more cores and lower temps.

4 years ago

BTW, Witcher 3 at launch and after its many updates (i.e. now) performs completely differently, even on older graphics cards; it was much better optimized later (much higher FPS), and that always matters a great deal regardless of your specs.

4 years ago

I might give it a try again some day and hope that it doesn't cause me any migraines. Back when I did try it, I had to use mods and Cheat Engine to remove the fish-eye effect and increase the FOV just to not be nauseated by the game. Maybe with higher FPS and less choppiness, it might be playable.

4 years ago

Deleted

This comment was deleted 4 years ago.

4 years ago

Thanks for bringing this up. This is exactly what I want to clarify.
Yes, I game at 1440p and probably will be for the foreseeable future. While I am sure 4K is better to look at, I prefer high frame rates for the smoother feel. Plus, gaming at lower resolutions triggers my migraines in some games. Higher frame rates help considerably and allow me to play for longer before I have to step away.

Like I pointed out in my post, at higher resolutions a 2080 Ti might be the bottleneck, but a 3080 Ti or 5080 Ti probably will not be. I intend to keep this CPU for around 5 years (unless there is an amazing leap in CPU capabilities sooner), so after my GPU upgrade I do expect my games to start becoming CPU bound even at 1440p. I expect my future GPU to have 30-100% more capability than a 2080 Ti, especially with next-gen Nvidia quite possibly being a 7nm GPU. In that case, would it still be only a 5% difference when the CPU is the bottleneck? And even if we don't quite get a 100% performance increase, I doubt I would be playing at ultra settings like a lot of benchmarks seem to test; I tend to lower my settings to realistically get more FPS out.

If I am GPU bound I can lower settings and mitigate it, but if I am CPU bound, there isn't a whole lot I can do to make a game run better.
Or is there something inherent to the resolution that makes the CPU less relevant at 1440p compared to 1080p?

4 years ago

I deleted my reply earlier when I should not have, because I hadn't read your original post entirely and thought I was answering the wrong concerns. I've addressed in another comment why the bottleneck changes as you go up in resolution.

To answer: yes, as GPUs get more powerful, the CPU will increasingly become the bottleneck past some tipping point. When RivaTuner says your CPU is at 100% load, that's when it's bottlenecking you.

But remember that improvement in tech is geometric. The Intel may be 10% faster now, but that difference will shrink in comparison as other technology advances. And Ryzen has way more threads to work with, which means it will actually scale better into the future than Intel does, because multi-threading optimization is how CPU performance increases will happen going forward. According to Cinebench results normalized for clock rate, Zen 2 actually beats Intel on IPC. A higher clock rate is literally Intel's only remaining advantage now, and it's nothing a good OC can't fix, since Zen 2 draws less power anyway.
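
If you want to sanity-check that "normalized for clock rate" idea yourself, the arithmetic is just score divided by clock. A tiny sketch; the scores and clocks below are made-up placeholders, not real Cinebench numbers:

```python
# Hypothetical single-thread scores and sustained clocks; plug in your own runs.
def per_clock_score(single_thread_score: float, clock_ghz: float) -> float:
    """Crude IPC proxy: benchmark score divided by the clock it was achieved at."""
    return single_thread_score / clock_ghz

zen2_like = per_clock_score(single_thread_score=500.0, clock_ghz=4.6)    # placeholder numbers
intel_like = per_clock_score(single_thread_score=510.0, clock_ghz=5.0)   # placeholder numbers

# The higher-clocked chip can win the raw score while losing per clock.
print(round(zen2_like, 1), round(intel_like, 1))   # 108.7 vs 102.0
```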

4 years ago*

Would you overclock the 9900K to 5GHz? That's what it seems to require to get that 15% lead, based on reading a couple of comparisons. Since games don't use multiple cores that well, it all comes down to the single-core speed you can get with overclocking.

4 years ago

Eventually I might overclock it, but only when my CPU becomes a bottleneck in games, which probably won't be the case straight away.
Overclocking isn't a given, though. And even if I do, I might not aim for 5GHz, but rather a bit lower to keep the thermals in check. I guess it all depends on the silicon quality of the CPU.

4 years ago

Is your current CPU overclocked? Why is it heating up and crashing? The only reason it should heat up more than normal is if the cooler isn't making good contact (and the thermal paste probably needs replacing), or if you are pushing the voltage because you want higher clocks or are trying to stabilize a failing CPU.

I am curious because I am still running a 3570k that has been overclocked since I bought it in 2012 and it is still running fine.

4 years ago

Not quite sure. My CPU has had issues from day one.
It got so hot that it would crash at stock settings, so I never really got a chance to overclock it.
Then I went out and bought an aftermarket cooler, which meant changing the paste, and the temps got a bit better after that. But the CPU was still throttling. So I now keep my case open and it is fine during normal gaming sessions. However, for the past month or so it has been crashing about two or three times a week, forcing a restart. I can only imagine I had a bad sample from the beginning, because I don't know what else could be causing it. I even tried swapping in the power supply and motherboard from another one of my PCs.

4 years ago

That sounds horrible. If the CPU was throttling, even on the stock cooler, it is clearly defective. You probably could have got it replaced fairly easily under warranty when it was new.

4 years ago

Yeah, I probably should have. But I didn't realize how bad it was initially, and I didn't want to wait for a replacement, which would have left me without a PC over what I perceived at the time to be a small issue. Of course, knowing what I do now, I agree that I should have RMA'ed it.

4 years ago

One thing to note, in case you do plan to overclock, is that the 3900X draws less power than the 9900K at full load, and probably runs at a lower temp too.

4 years ago

That's true, and it is indeed a factor.
That said, overclocking the Ryzen isn't really in the cards. A lot of samples are not even reaching the advertised boost clock, let alone any further overclock. And the ones that did overclock don't seem to give any better performance in games.

4 years ago

The thing is, the 3900X is better than the 9900K in (almost) everything except gaming, so if you do anything else besides gaming you should go with the 3900X. And with future games the extra core count will help a lot, because as individual cores stop getting faster, developers have to optimize for multi-core CPUs. That will only increase over time, not decrease.

4 years ago

I agree that if I did anything aside from gaming, 12 cores would be good. But honestly, nowadays I just game on my PC. My days of running and experimenting with multiple VMs, Linux, hackintoshes, etc. are behind me.

Even today very few games use 8 cores, and the ones that do don't necessarily show much improvement over 6 cores. Next-gen consoles will likely have 8 cores as well, so I am not too concerned about games suddenly making the jump to 8+ cores in the next 5 years. There might be the occasional one like Ashes of the Singularity, but until a significant portion of gamers have more than 8 cores, devs won't really spend their effort on using more of them.

4 years ago

If you take any repeating process, the highest possible throughput is set by the slowest stage in that process. This goes for anything, not just computation. It's a simple truth they really should teach in middle school.

At low resolutions, the GPU can handle the load faster than the CPU can issue commands to it, so the tardiness of the CPU dictates how fast frames can be drawn. At some point as resolution increases, this flips and the CPU ends up waiting for the GPU to complete its tasks. The CPU is no longer the slowest stage in the pipeline, so variance there doesn't matter to the total frame rate.

So at 720p/1080p the Ryzen is indeed slower than the Intel. But bump that up to 1440p+ and your processor is literally twiddling its thumbs waiting for the GPU to finish.
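
To put rough numbers on that "slowest stage" idea, here's a minimal sketch; the millisecond figures are invented purely to show the shape of it, not measurements:

```python
# "Slowest stage sets the frame rate" in a few lines.
# All millisecond figures are hypothetical placeholders, not benchmark data.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Each frame waits on whichever stage takes longer."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu, slow_cpu = 5.0, 5.7      # the slower CPU needs ~14% longer per frame

# Low resolution: the GPU finishes in 4 ms, so the CPU is the slow stage
# and the CPU gap shows up almost 1:1 in FPS.
print(fps(fast_cpu, 4.0), fps(slow_cpu, 4.0))    # 200.0 vs ~175.4

# 1440p ultra: the GPU now needs 9 ms, so both CPUs sit waiting on it
# and deliver the same ~111 FPS.
print(fps(fast_cpu, 9.0), fps(slow_cpu, 9.0))    # ~111.1 vs ~111.1
```

The OP's worry maps onto the same toy model: a future GPU upgrade shrinks the GPU term again, and the CPU goes back to being the slow stage.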

4 years ago

Yes, but like I mentioned in my original post, I expect to purchase a 3080 Ti and later possibly a 5080 Ti (going by my previous purchasing history) for use with this CPU. I can easily expect 60-100% better performance from a 4080 Ti/5080 Ti, or whatever they choose to name it at that time, through better hardware and by not playing games at ultra. Would the CPU at 1440p still not be the limiting factor then? Unlike a GPU, a CPU tends to stick around for longer.

4 years ago

I've responded above. Sorry I'm a bit scatterbrained at the moment.

4 years ago

In a sense, yes, but not in the long run or at higher resolutions.
Many game engines were designed around Intel, since they are the market leader (and in some cases the engine developers were "sponsored" by them), plus Intel is still using their very, very, very, very old architecture that boils down to "have a fully functional strong individual core and let's see how many of them fit on a die." AMD instead goes the "have a lot of small computational chiplets and one large chip for common tasks like IO" way.

Furthermore, at larger resolutions the graphics-related CPU overhead becomes a lot more visible, regardless of the CPU. Every game has its internal logic, a scripting engine, which can only use the CPU to run the game portion of the game. But as you increase resolution, a larger and larger part of the CPU's workload goes to supplying the GPU with work, so the scripting engine takes up a smaller percentage and the Intel optimisation gradually fades. (And, well, AMD has started "sponsoring" game engine development as well, so we can expect a slightly more even playing field.)
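
If it helps, here's a hedged sketch of that dilution effect, treating the CPU's frame budget as just two parts (game logic vs. feeding the GPU); the fractions and the 15% figure are made up for illustration:

```python
# Amdahl-style dilution: a per-core advantage only helps the share of the CPU's
# frame budget that is game logic. Fractions and speedup are illustrative only.

def overall_cpu_speedup(logic_fraction: float, logic_speedup: float) -> float:
    """Whole-frame CPU speedup when only the 'game logic' part runs faster."""
    return 1.0 / (logic_fraction / logic_speedup + (1.0 - logic_fraction))

# Game logic is 80% of the CPU's per-frame work, and one CPU runs it 15% faster:
print(round(overall_cpu_speedup(0.8, 1.15), 3))   # ~1.116 -> most of the 15% survives

# If, as claimed above, logic shrinks to 40% of the CPU's work at higher resolution:
print(round(overall_cpu_speedup(0.4, 1.15), 3))   # ~1.055 -> the same advantage matters less
```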

Also, a final tidbit: the guys behind Linus Tech Tips built their own new large LAN gaming arena from Ryzens, and that was months before the 3000 series…

4 years ago

So it's not a simple matter of loading up the CPU cores with the work that needs to be done. Rather, the workload is divided into different parts (the scripting engine's load plus supplying the GPU), and the reason for Intel's higher FPS is the scripting part, not the supplying-the-GPU part? So when supplying the GPU takes up a greater portion of the workload, percentage differences in scripting speed become less relevant as a whole.

Just curious, is the supplying-the-GPU part not affected by the architecture? If it is, are there any benchmarks or even videos talking about this particular side of CPU use? I just want some more information so I can understand the concept better.

4 years ago

It is affected by the architecture. However, when testers set the game's affinity to a single side of the Ryzen (halving the available threads), the difference suddenly was more in the 5% range. Currently there are obvious issues with Windows not knowing how to handle the chiplets; its scheduling is all over the place, making data travel unnecessarily through the interconnecting bridge within the CPU. If people manually force an application to stay on one side, the 3900X almost turns into two i9-9900Ks (which it is, if you want to really dumb things down).
However, Intel will supposedly also go the chiplet way once they ditch 14nm (so, in about seven centuries), so either they run into the same problem, or the engineers at Microsoft will have solved it by then with a scheduler that isn't still expecting 15-year-old hardware under it.
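
For anyone who wants to repeat that "pin the game to one side of the chip" experiment, here's a minimal sketch using psutil. Which logical CPU numbers belong to which CCD/CCX depends on your OS and topology, so the range below is only an assumption; verify it first (e.g. with lscpu or Task Manager):

```python
# Sketch: pin an already-running game to a subset of logical CPUs so its threads
# stay on one side of the chip. Needs psutil and sufficient privileges.
import psutil

def pin_process(pid: int, logical_cpus: list) -> None:
    proc = psutil.Process(pid)
    proc.cpu_affinity(logical_cpus)              # setter: restrict to these CPUs
    print("pinned to:", proc.cpu_affinity())     # getter: confirm the new mask

# Hypothetical example: keep the game on logical CPUs 0-11.
# Whether 0-11 really maps to one CCD depends on your topology -- check it first.
# pin_process(12345, list(range(12)))
```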

4 years ago

Yeah, I read about the scheduler problems under Windows. Funnily enough, Linux has no such issues. I do admit that Windows will probably fix this particular issue soon.
But I have seen benchmarks where only specific cores were used to get past this issue, and it still lags behind.

4 years ago

They will lag behind for some time, just like Intel lagged behind like crazy in the Athlon days. Game engines are usually optimised for the more popular architecture, and these days that is clearly Intel and Nvidia. If you literally do nothing but game on a PC 100% of the time (plus maybe zero-load work like browsing), then buying an i9-9900K can be the better option, albeit not necessarily a future-proof one. One of the many reasons all major (and many minor) tech outlets say that Ryzen is the way to go is that, on current trends, it will only go upwards, while Intel is playing catch-up in a game where they deliberately nailed their own foot to the floor.

4 years ago

But I like to see more + in Intel's 14nm+++++++ :hehe:

4 years ago

Gamer Meld even has that one in their store as a mug. =D

4 years ago

Lol. Looks like Intel's failure and "drawers full of designs" will be remembered for generations :D

4 years ago

Those 10-15% differences will decrease as the Ryzen 3000 platform matures. Also worth noting that while Intel does have higher FPS in the games tested, in reality that 10 FPS difference is nothing you will notice. Gamers these days get too worked up about squeezing out every possible FPS. You're not gonna feel or see a difference between 140 and 130 FPS. On lower-end computers such a difference would of course be noticeable (50 FPS vs 60 FPS, for example).

That having been said, if gaming is all you do and you are certain you're not gonna want PCIe 4.0 for faster SSDs (and GPUs later on), then the i9 would be the obvious choice between the two.

4 years ago

Not sure about not noticing a 10-15% dip in FPS. FPS is just one side of the story: even when the FPS stays the same, if the frame times become less even you can notice a stutter. I certainly can. If you aren't CPU limited, then yes, 90-100 FPS feels similar enough. But when you are CPU limited, that FPS drop also comes with frame-time stutter that can be perceived. I admit I am more sensitive to it than a typical gamer, but Tech Report published an article back in 2011 talking about it: https://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/. Most benchmarks still don't use this approach, but some, including Gamers Nexus, do.
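
For anyone curious how the frame-time view differs from a plain FPS average, here's a small sketch of one common way "1% lows" are derived from a capture; the frame times are fabricated just to show the arithmetic:

```python
# Average FPS vs "1% low" FPS from per-frame times (ms). The capture is invented:
# mostly smooth 10 ms frames plus a handful of 40 ms hitches.
frame_times_ms = [10.0] * 990 + [40.0] * 10

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

worst_count = max(1, len(frame_times_ms) // 100)
worst_1pct = sorted(frame_times_ms, reverse=True)[:worst_count]
low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))

print(round(avg_fps, 1), round(low_1pct_fps, 1))   # ~97.1 FPS average, but 1% lows of 25.0
```

Outlets differ in the exact definition (some use the 99th-percentile frame time instead of averaging the worst 1%), but the point stands: two captures with the same average can feel very different.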

As for X570, I don't think I would be tempted to get that over an X470 even if I did go Ryzen. Comparable Asus ROG motherboards for these two chipsets differ in price by about AUD 200. And the newer board gets you higher power draw, which makes it run hotter, plus a chipset fan that will likely fail before the product's end of life (I have had bad experiences with chipset fans in the past). The only upside is PCIe 4.0, which from a gaming perspective might potentially give about a 1% FPS boost in very specific games? Apparently even using PCIe 2.0 today only reduces FPS by about 2%. We are nowhere close to the stage where PCIe 4.0 is a worthwhile improvement. And while I did consider a PCIe 4.0 boot SSD, people everywhere assure me that while those drives are faster for many operations, even a SATA drive is barely any slower at actually loading games and applications.

4 years ago

Stutters and average FPS are not completely correlated. You can see in reviews for example that the i5-9600K has good average FPS but low 1% lows.

The 3900X does have somewhat lower 1% lows on average, but there are some games where it has lower average FPS and comparable 1% lows, (looking at TechSpot's review, for example).

4 years ago

ZombieLoad Attack
https://zombieloadattack.com/
Fending off Zombieload attacks will crush your performance | ZDNet
https://www.zdnet.com/article/fending-off-zombieload-attacks-will-crush-your-performance/
Intel Tried to Bribe Dutch University to Suppress Knowledge of MDS Vulnerability | TechPowerUp
https://www.techpowerup.com/255563/intel-tried-to-bribe-dutch-university-to-suppress-knowledge-of-mds-vulnerability

Benchmarks from before the incident and benchmarks from after it can tell different stories.
You should be mindful of that.
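
On a Linux test box you can at least check which state you're benchmarking in. A minimal sketch that just reads the kernel's vulnerability report (the sysfs path exists on reasonably recent kernels):

```python
# Report the kernel's view of CPU vulnerability mitigations (Linux only).
# Handy for knowing whether a benchmark run had MDS/ZombieLoad mitigations
# enabled or disabled.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if vuln_dir.is_dir():
    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name}: {entry.read_text().strip()}")
else:
    print("No vulnerability reporting here (older kernel, or not Linux).")
```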

4 years ago

All the benchmarks I have seen were done with all the patches applied, and Intel still comes out ahead.
As far as scummy practices go, I wanted to purchase Ryzen because of that. But then AMD launched Ryzen and pretended with their benchmarks that the gaming performance of the 3800X was similar to the 9900K, which turned out to be false. So I can't even give them the moral high ground on this.

4 years ago

Good post, but keep in mind that all of the above mitigations are optional. Also, there is no virus or tool out yet that uses those attack methods. But Intel does appear to be much more vulnerable than AMD; OP should consider that too.

4 years ago

Stick with your gut.

4 years ago

Aside from having fewer cores, the 9900K also runs hotter than the 3900X, which means it may fail sooner.
That said, yeah, the 3000 series did not live up to the hype (it's fast, but not nearly as fast as rumored), and Intel is still king for high-end gaming.

4 years ago

CPUs are not prone to failing; most of the time it's the motherboard that dies.

4 years ago

It kinda did live up to the hype, for people who do editing, rendering or 3D related work.

4 years ago

Price-wise, sure, but remember the rumors were talking about them hitting 5GHz. They're great, but not 5GHz great.
For production purposes (as opposed to gaming), though, I'd recommend a Ryzen 3900X over the 9900K any day.

4 years ago

Assuming you're going to stick with this PC for a few years, I'd say the main thing likely to be relevant to gaming is that next-gen consoles will be Ryzen based, which could mean developers become more familiar with how to get the best performance out of this architecture. The larger number of cores could also have an effect in the long run; 4-core CPUs, which were more than enough two years ago, are now starting to show their age.

4 years ago

Consoles currently have 8 cores, and rumours pin the next gen at the same. I do think we are getting to a stage where 8 cores might be used for gaming, but I don't see us moving past that. It takes too much development effort, when in all honesty games are likely to be GPU bound for the average gamer anyway.

4 years ago

Yes, but current consoles have pretty weak cores without hyperthreading; they would at best be comparable to 4-core desktop CPUs, and probably weak ones at that. We currently have PC games which use 8 cores reasonably well, and a lot more computing power than the consoles provide.

It would of course depend on how the desktop platform develops. But it's been said that Intel's next chip family will have 10 cores, so presumably we'll continue going up in core count.

Of course 8 cores may be enough, but as developers get used to having a lot of high-power cores, there's a chance that even 12-core CPUs will see some benefit from that.

Also, with AMD you'll have an upgrade path if you do decide to upgrade the CPU.

I think that in all likelihood the 9900K will be the better gaming CPU for at least a year or two. Beyond that, I'm not sure.

4 years ago

AMD has said that they will only support AM4 for one more generation. And if I do buy the 3900X, I doubt I would be upgrading again next year.

4 years ago

Possibly not, but you could buy a 2020 16 core CPU in two years.

4 years ago

Like I said, cores are not useful for gaming beyond 8. If I needed more cores, I would just buy the 3950x.

4 years ago

I don't plan on getting either, because both are way too expensive for my tastes.
But I can provide a couple of reasons for the question you're asking:

  • (Your point #2) The 9900k might be better TODAY, when most games only know how to utilize 4-6 cores (at best). But as you mentioned, a CPU is something that's supposed to last 5+ years. So the situation may be very different in a few years.

And more importantly:

  • (Your point #1) As you mentioned, you're going to be changing your settings according to your FPS. That is true.
    But what's also true for the vast majority of gamers is that we change our settings to make games look as good as possible (within the FPS constraints).
    What I mean is: you won't run your game on ultra graphics if it means you're getting 10 FPS from it.
    But it's equally pointless to run your game on medium just because it gives you 300 FPS.
    What most people will do is crank up their game to look as good as possible while still maintaining a reasonable (30+) frame rate. Which means you'll almost always be running a GPU-bound game.
    Otherwise, your game would be running at 300+ FPS, and not the 30-90 FPS that most people actually run their games at.
    As soon as the frame rate sits at any kind of reasonable level, it's the GPU that's limiting you and not the CPU.
4 years ago

For your first point, I am not convinced that games will use more cores in the next few years. The consoles having 8 cores was a big reason for that push in games. The next gen seems to also be using 8 cores, so I don't expect to see any changes on that front for the next generation.

Second, any game will eventually be CPU or GPU limited. Right now, a lot of the games I currently play, like Vermintide, Division 2, and CSGO, are CPU bound for me. I can't get to 100 FPS (300 for CSGO) even if I turn everything down to what I consider the bare minimum acceptable, because the games are CPU limited. I am not saying I need 100 FPS to play games, but when I am spending that much money, it's kinda something I expect.

And finally, aside from thermals and the small chance that games might use more than 8 cores in the near future, I still cannot find a single reason to actually go Ryzen. For the same price, I would be giving up an assured performance boost for the possibility of a future one.

4 years ago

I agree with you. When the first-gen Ryzen came out, a good number of people (in the tech industry) insisted that games would become much more multithreaded in the future and Ryzen would become more relevant. Guess what happened: single-threaded performance still rules for gaming.

EDIT: Intel does seem to be much less secure compared to Ryzen, though.

4 years ago

For your first point, I am not convinced that games will use more cores in the next few years. The consoles having 8 cores was a big reason for that push in games. The next gen seems to also be using 8 cores, so I don't expect to see any changes on that front for the next generation.

I agree it's not clear cut. I expected PC games to become N-core optimized by now, instead of being 2-core/4-core/6-core/8-core optimized, so it's not known when (if ever) this will happen.
But I think it's definitely a possibility people consider.

Second, any game will eventually be CPU or GPU limited. Right now, a lot of the games I currently play, like Vermintide, Division 2, and CSGO, are CPU bound for me. I can't get to 100 FPS (300 for CSGO) even if I turn everything down to what I consider the bare minimum acceptable, because the games are CPU limited. I am not saying I need 100 FPS to play games, but when I am spending that much money, it's kinda something I expect.

Well, any game will be bound by something. Even if you build an infinitely fast CPU/GPU, it will be bound by memory, disk or bus speed. That's understandable.
I haven't played any of the games you mentioned, so I'm not sure how new or graphics-intensive they are. I know CSGO is an older game with older graphics, so it makes sense that it reaches 300 FPS and still doesn't use all the GPU's capabilities.
A new, modern game will have a much lower FPS and will become GPU bound much sooner.
Quite frankly, you don't need a GTX 1080 or i9 9900K to run CSGO. It will run just as well on an i3 from five generations ago and any low-end GPU from Nvidia or AMD, so all that CPU/GPU power is wasted on it...

4 years ago*

Your CPU is still decent. For the random reboots, check the temperatures on the CPU and motherboard; sometimes simply changing the thermal paste under the chipset heatsink helps, but it could also be a PSU issue. What brand and model is it?

Ryzen is a very decent CPU. I am personally waiting for DDR5 and 5th-gen Ryzen, or for when games require something more than an i7 3770 @ 4.3GHz.

4 years ago

I have tried new paste, new cooler and even a new motherboard and PSU.

4 years ago

Have you tried delidding the CPU and changing the paste under the heat spreader?

4 years ago

You say that your CPU is getting old and hot and causes reboots.
I've seen the price and the performance differences.
I don't think it's worth changing, given the small increase in performance.
You'd be better off changing your thermal paste and checking other reasons why your CPU is so hot.
Lack of cooling? Dust?
Remember that the CPU is not the most important part for games.

4 years ago

I am CPU bound in quite a lot of the games I play.
And I have tried new paste, a new cooler, and even a new motherboard and PSU.

4 years ago

I'd choose Ryzen:
it's cheaper, plus maybe a bundled game later.
A 10-15% FPS difference in games doesn't matter to me. Do you play at 144Hz? Hehe.
For gaming it will last more than 5 years; if it's for work, I'd say only 2-3 years.

4 years ago

Just upgrade your GPU now to an RTX 2070 Super or RTX 2080 Super, and wait until next year to see how the CPU market evolves.

Note: I have this exact CPU (i7 4790K) with a GTX 970 G2, and I'm doing exactly that: just upgrading the GPU to a 2070 Super until next year.

4 years ago

Don't forget that the 9900K needs a cooler. Not just any cheap cooler, but a really good one.

The 3900X comes with a very good cooler included in the price.

Nobody should even buy an Intel now unless it's ridiculously cheap.

4 years ago

I'm buying a good cooler no matter which brand of CPU I get.

4 years ago

The 9900K is overkill for gaming, but what people usually forget (or they live with their parents) is that it uses a lot more power for little gain compared to, say, an i7 8700K.

4 years ago*

Just some quick comments...

When I did my last rebuild (Dec 2018), I went AMD TR2 2950X, and it is freaking AMAZING. No OC required, although that's certainly a possibility. I always aftermarket cool my CPUs, so I generally don't calculate it into the price; however, I will say that there aren't as many options that can handle the TR2 footprint, so you may want to look first. Most are kinda pricey, but I found an air cooler with 2 fans/grills that is efficient and not expensive (Cooler Master). Only downside is that it's big and doesn't have good clearance for high-profile RAM, limiting the # of slots you can use (going to convert to liquid cooling soon, because I want to slap a lot of RAM in here). But if you don't use high profile RAM, it's an amazing and inexpensive cooler.

Having 16 cores is a game-changer for me, as I do a lot of audio/video conversion, and ripping Blu-rays has never been faster. Gaming-wise, I'm running ultra-wide, but still 1080p, and I can literally throw anything at it with just a 1070 Ti, always max out all settings, and everything runs like butter. It's literally NEVER CPU bound no matter what I put through there, again with no OC. I expect that if I were doing 4K I could hit a limit, but I would expect to be more GPU bound than CPU bound in my case.

I'm glad that I went AMD this time...I bought an upper-end CPU from AMD and it wasn't cheap, but it was still a lot less than the equivalent Intel and it's actually faster than Intel CPUs in the same class.

4 years ago

Yeah, I love that Ryzen has shaken up the CPU industry so much in so little time. Maybe we can now start seeing some more innovation and faster improvements. I'm glad Threadripper is working out well for you.

4 years ago

Me too. It's a beast, and it runs cool and 100% stable. And OMG, the video processing! Stuff that used to run in 6-10 hours on my FX-8350 Black Edition now completes in 45 minutes to an hour. It's amazing.

4 years ago

I saw news today that shops have sold out their whole stock and need to wait for the next shipments; even some early buyers are still waiting for their orders.

So you may be forced to wait a bit if you want to buy AMD right now.

https://www.reddit.com/r/Amd/comments/cdo02r/3900x_sold_out_across_all_of_europe/

4 years ago

Yeah, it is proving to be quite a popular option, especially for productivity use. I went ahead and purchased an Intel though; I felt it suited my gaming needs more. In the end I think both options are close enough that neither is a wrong choice.

4 years ago

Ryzen, because if you wanna upgrade most of them use the same motherboard socket, while Intel basically forces you to buy a new mobo (possibly RAM too) if you wanna change it. Or limits your mobo options if it's the motherboard you want to change/upgrade. So I guess you can say it's a more customizable build.

4 years ago

AMD has already stated that they are phasing out AM4 next year. So even if I were to buy a CPU today, by the time my next upgrade is due it will be a full system upgrade either way, especially since by then I would expect DDR5 to be commonplace.

4 years ago

I am a long time Intel guy but Ryzen is much better bang for your buck.

If all you do is gaming, you don't need such a powerful CPU anyway; chips like the 9900K and 3900X are way overkill for any gaming you will be doing.

If you just want to game, then your current CPU is more than adequate. You just need to figure out why it is getting such bad thermals.

If you are just looking for a reason to get a new chip, then Ryzen will be more future proof. Those extra cores might not seem like much now, but if you ever end up trying to stream a game on some triple-monitor setup, or do any sort of multi-tasking, you will be glad you have them.

A Ryzen 1600/2600/3600 would be more than enough for your needs and would be reasonably priced compared to their Intel counterparts.

4 years ago

If you haven't made your purchase yet, some things to consider, in descending order of weight:

Multi-tasking - Unless you keep a very sanitized gaming environment, it's easy for small background applications to eat away at your peak performance: VOIP programs, anti-virus, backup software, chat programs, a browser with some (or in my case hundreds of) tabs open, launchers (Steam, Blizzard), etc., or running in windowed mode. Even if you don't do things like streaming, running a Plex server, or other potentially intensive tasks, these small programs will chew away that Intel lead pretty quickly. Benchmarks are there to show differences in ideal, clean circumstances. In the real world, that 5% difference may disappear or even reverse in AMD's favor with just your normal workload (see the sketch after this list).

Changes in gaming - VR and 3D audio are becoming more accessible, which will benefit from additional threads. As AI gets better, I expect that we'll see some games include more aggressive trainable AI that will eat any additional threads you can throw at it. Some of these are already being done. It's reasonable to expect it to become more common over the next 2-4 years.

Monitor refresh rate - Your performance might be bound by your monitor. Adaptive refresh rate may be more important than raw FPS.

PCIe 4.0 - While not a requirement or a strong performance benefit now, as textures get bigger this may become more important, especially since you mentioned this PC lasting several GPU upgrades.

Competition - Finally, there may be some value for you to go with the underdog. If the performance hit is small enough, voting to support the innovative company with your wallet will have an impact on the market long run. Intel has long been sitting on their laurels and capitalizing on their lack of competition. Supporting the underdog will help encourage Intel to start being innovative again.

Changes in hobbies/life - Giving up less than 5% FPS in most cases may be worth "future proofing" against any changes you may have in the next few years. If it were only a minor increase to general computing I'd not put this down, but it's a night and day difference. You might catch the streaming bug. Or get into hosting a server for a game you play. Or whatever. The ability to game with minimal FPS hit while doing something else is a pretty big opportunity gain.


If none of that resonates with you and you're still excited to get an Intel chip, then by all means, go Intel. I'm just here to play devil's advocate to make you feel better about your choice. ^-^
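
If you want to see how much background load is actually sitting there on your own machine (re: the multi-tasking point above), here's a rough psutil sketch; the one-second window and the top-10 cutoff are arbitrary choices:

```python
# Rough look at how much CPU the background clutter is using right now.
# Needs psutil; per-process cpu_percent() requires a priming call plus a sample window.
import psutil

procs = list(psutil.process_iter(["name"]))
for p in procs:
    try:
        p.cpu_percent(None)                 # prime the per-process counters
    except psutil.Error:
        pass

psutil.cpu_percent(interval=1.0)            # one-second sampling window

usage = []
for p in procs:
    try:
        usage.append((p.cpu_percent(None), p.info["name"]))
    except psutil.Error:
        pass

for pct, name in sorted(usage, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{pct:5.1f}%  {name}")           # top background CPU users during the window
```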

4 years ago

Thanks for sharing the POV. I do love having someone play devil's advocate, as it helps me make sense of my decisions, which is part of what this thread was for. I have just ordered my Intel, and am now looking forward to receiving it. :)

4 years ago

We got the 2700 a while ago because we also needed a new mobo.
The AMD socket doesn't change as often as Intel's, so the mobo will last us longer.
We are not the 144 FPS ultra 4K type of gamers though; we wanted something decent and mid-range for at least 5 years.
Also, the possibility of upgrading the CPU + GPU in a few years without needing a new mobo was appealing.

4 years ago

Yeah, I think you made a good choice.

4 years ago
