
Horrendous Performance

notstarboard
notstarboard Member Posts: 3,903
edited October 11 in Ask the Community

When I took a hiatus from this game in 2021 or 2022, I used to get about 40-50% GPU utilization on the lobby screen at 1080p low settings with an i5-6500, GTX 1060 6GB, and 16 GB of DDR3. I have since upgraded to an i7-7700 and an RX 6600, and after coming back to the game I now see a whopping 65-70% GPU utilization on the lobby screen at 1080p low settings. It was more like 70-75% last patch with all of the Castlevania eye candy. An RX 6600 is something like 75-100% stronger than a GTX 1060 6GB, and even if DBD runs substantially better on Nvidia hardware, an RX 6600 would still be a hefty upgrade over a GTX 1060. Moreover, the recommended GPUs on the Steam page are somehow still a GTX 760 or AMD HD 8800, which have something like 25% of the performance of an RX 6600.
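To put rough numbers on that: combining the utilization figures above with the assumed performance gap between the two cards gives a back-of-envelope estimate of how much heavier the lobby scene has become. The performance ratio below is taken from the 75-100% figure above, not from any benchmark I'm citing, so treat this as illustrative arithmetic only:

```python
# Back-of-envelope estimate of how much heavier the lobby scene has become.
# The RX 6600 / GTX 1060 performance ratio is assumed from the 75-100% figure
# above, not a benchmark; utilization ranges are the ones quoted in this post.

old_util = (0.40, 0.50)      # GTX 1060 lobby utilization range
new_util = (0.65, 0.70)      # RX 6600 lobby utilization range
perf_ratio = (1.75, 2.00)    # assumed RX 6600 performance relative to GTX 1060

# "Effective demand" = utilization x relative GPU throughput, normalized to the 1060.
low = (new_util[0] * perf_ratio[0]) / old_util[1]   # most conservative combination
high = (new_util[1] * perf_ratio[1]) / old_util[0]  # most generous combination

print(f"Estimated increase in lobby GPU demand: {low:.1f}x to {high:.1f}x")
# With these assumed inputs, that works out to roughly 2.3x-3.5x.
```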

I understand that engine updates have been made over the life of the game, including an update to UE5, but this game is not remotely visually complex enough, especially on the lobby screen, to justify such high GPU usage.

I've seen reports of resolution being dropped on console and frame rate being worse than ever. (Is it even possible to play this game on a Switch or PS4 anymore?) The game hangs and stutters like nobody's business while launching nowadays, and will even crash while launching every so often. I still get sporadic half-second freezes during regular matches too, although this has been happening since 2018 at least.

I did not submit this to the bug reporting part of the forum because I don't think this is a bug. Everything appears to be working as intended. The game is just terribly optimized.

I would like to call attention to this and generate some more discussion. Not only does poor performance hurt the game by putting it out of reach of gamers with lower-spec systems, including some systems that once could play this game just fine, but it's a huge waste of energy when you consider how many player hours are racked up every day. I would like to see the developers slow their release cadence to prioritize performance and stability, along with improvements to existing content. In theory this could also help them release killers that are in a good state from the get-go instead of the unsustainable cycle of "ill-conceived killer > hasty rework > killer still problematic > repeat" that has plagued many of the recent releases. Curious if others agree.

Post edited by Rizzo on

Answers

  • ArkInk
    ArkInk Member Posts: 730

    I agree. Slowing chapters to maybe three a year and putting the extra resources towards performance and general design reworks seems like a good idea, given how often recent updates have arrived with serious issues.

  • jajay119
    jajay119 Member Posts: 1,061

    Frame rate is awful on Series X when indoors - and not even properly indoors. The little tents/tech huts on the Skull Merchant maps run terribly. I'm not sure what they've done, but it's been that way for a while now.

    No wonder the fog hasn’t been seen for ages: my console would probably explode.

  • Aceislife
    Aceislife Member Posts: 434
    edited October 10

    So wait. You're not happy about GPU utilization going up a bit, even though your framerate is presumably still fine, since you didn't mention it? Did you play at 120 FPS with your 1060 too? If your utilization is that low, it tells me your GPU has absolutely zero problem delivering the performance you're getting. You're looking at numbers that don't matter. I'd imagine the move to UE5 made the game a little harder to run, but if your RX 6600 can run the game at (presumably) 120 FPS with just ~70% usage, that already tells you how easy this game is to run.

  • notstarboard
    notstarboard Member Posts: 3,903
    edited October 10

    My average FPS has always been fine, but I played at 60 FPS then and I play at 60 FPS now. That implies the game requires more than twice as much GPU horsepower to run at low settings now as it did a couple of years ago.

    GPU utilization matters because it shows roughly how much performance headroom you have. Even more importantly, it also correlates with power draw, which matters as long as electric grids are mostly powered by fossil fuels.
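    To put a rough number on the power angle, here's a quick sketch. It assumes a roughly linear relationship between utilization and board power between idle and full load, which is a simplification (real cards scale clocks and voltage non-linearly), and the wattage figures are assumed for an RX 6600-class card, not measured:

    ```python
    # Rough sketch: estimate GPU board power from utilization, assuming a linear
    # ramp between idle and full-load power. All wattage figures are assumptions
    # for an RX 6600-class card, not measurements.

    def estimated_gpu_power(utilization, idle_w=10.0, full_load_w=130.0):
        """Very rough board-power estimate in watts for a utilization of 0.0-1.0."""
        return idle_w + utilization * (full_load_w - idle_w)

    # Hypothetical lobby-screen comparison at two utilization levels:
    for util in (0.45, 0.70):
        print(f"{util:.0%} utilization -> ~{estimated_gpu_power(util):.0f} W")
    # ~64 W vs ~94 W with these assumptions - that gap is why utilization matters to me.
    ```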

  • Shinkiro
    Shinkiro Member Posts: 116

    Your GPU utilization is completely normal for gaming; you're making a fuss about nothing. Remember that the total utilization isn't just the game but everything the GPU is currently doing, including resources that are reserved but not in actual use. Unless you're at 100% and getting bottlenecked by your GPU, it's basically irrelevant as long as your performance isn't being compromised.

    That said, your CPU is getting on a bit; I don't see why you "upgraded" to a CPU from 2017.

  • notstarboard
    notstarboard Member Posts: 3,903
    edited October 10

    This is not normal for DBD, for my hardware on any game with these hardware requirements, or for rendering any scene as visually basic as the DBD campfire. My understanding is that GPU utilization reflects the current load on the GPU, not "reserved resources" that aren't actually being used; even if the underlying system were based on requesting resources that must be allocated before computations can run (no idea), that would happen so frequently that it would still boil down to GPU utilization = load on the GPU at a macro level. GPU load does include non-DBD activities, yes, but I have the same barebones Windows install as ever. I don't even have a second monitor, and running in full-screen mode doesn't make an appreciable difference.

    The i7-7700 is the second-best CPU my motherboard supports. The only stronger option is the i7-7700K, but it's substantially more expensive nowadays than the i7-7700 in exchange for only a small performance uplift. I was also wary of buying a several-year-old CPU that others may have overclocked aggressively, especially since my motherboard doesn't support overclocking, so I couldn't even go that route to further extend the life of my system. The i7-7700 is something like 30-40% stronger than an i5-6500, which is the difference between being able to play Battlefield with Discord up, for example, and dropping frames like crazy even when I run Discord on another device. It will likely extend the life of this system by 3-5 years, which was definitely worth a net ~$60 to me.

    DBD is not very CPU intensive to run, at least. An i7-7700 is plenty for DBD.

  • Aceislife
    Aceislife Member Posts: 434
    edited October 10

    A far more accurate metric would be to see how much FPS you can push. By the way, why on earth would you play at 60 FPS? Even if you have a 60 Hz screen, your input latency would be lower running the game at 120 FPS.

  • notstarboard
    notstarboard Member Posts: 3,903
    edited October 10

    Running 120 FPS through a 60 Hz display imperceptibly reduces latency at the cost of a massive increase in power draw, fan noise, temps, etc. It's not remotely worth it for me. VSync is a setting I swear by no matter what game I'm playing; preventing screen tearing alone would be worth it, but when you factor in the aforementioned drawbacks of higher frame rates it's a slam dunk.

    My display can technically do 75 Hz but I only run it at 60 Hz because the increase in power draw isn't worth the minor improvements to smoothness and latency.

  • Aceislife
    Aceislife Member Posts: 434

    Even using G-Sync at 60 Hz when you could do 120 Hz is bad enough latency-wise (by the way, it's a noticeable improvement), but using VSync guarantees even worse input latency compared to G-Sync/FreeSync. There would be screen tearing without both of them, you're right about that.

  • notstarboard
    notstarboard Member Posts: 3,903

    The absolute worst case for latency added to a specific frame by VSync would be 1/30th of a second (33 ms), and that's compared to a hypothetical infinite frame rate. At ~120 FPS, the average added latency would be around 1/60th of a second (17 ms). (Rough arithmetic at the end of this post.)

    This is not a game that requires precise inputs in such a small window. The vast majority of people won't even notice 17ms, but they will notice screen tearing and their fans spinning loudly.

    If you're playing esports, sure, minimize latency to the theoretical limit. If not, you're just wasting energy for little tangible benefit; given that the planet is warming alarmingly, I see no reason for an average gamer to use about double the power for such a tiny gain, especially when it comes with screen tearing. I do think G-Sync/FreeSync on a 60 Hz display would be respectful of power use as well and are fine options; the key is just not rendering way more frames than a 60 Hz display can even show.
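    For reference, here's the rough arithmetic I'm using for those figures. It treats the latency VSync adds on a 60 Hz display as roughly one refresh interval in the typical case and two in the worst case, which is a simplification that ignores buffering modes and render queue depth:

    ```python
    # Rough arithmetic behind the 17 ms / 33 ms figures above. Treats the typical
    # latency added by VSync on a 60 Hz display as about one refresh interval and
    # the worst case as two. It ignores buffering modes and render queue depth,
    # so it's only meant to put the numbers in context.

    refresh_hz = 60
    refresh_ms = 1000 / refresh_hz            # ~16.7 ms per refresh

    typical_added_ms = refresh_ms             # ~17 ms, the figure cited above
    worst_case_added_ms = 2 * refresh_ms      # ~33 ms, the upper bound cited above

    blink_ms = 100                            # commonly cited blink duration, for scale
    print(f"Typical added latency:    ~{typical_added_ms:.0f} ms")
    print(f"Worst-case added latency: ~{worst_case_added_ms:.0f} ms")
    print(f"For scale, a blink is:    ~{blink_ms} ms")
    ```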

  • notstarboard
    notstarboard Member Posts: 3,903

    @Rizzo why was this booted to the nether regions of the forum? :(

  • Rizzo
    Rizzo Member, Administrator, Mod Posts: 17,845

    The thread was moved to the correct section, as technical issues are discussed here.

  • Aceislife
    Aceislife Member Posts: 434
    edited October 11

    You do realize that 17 ms is added between every frame you get? You can try it for yourself: just flick the mouse back and forth, and it's super noticeable. And it's more about an enjoyable experience than a competitive advantage, but that's just my opinion. One other thing: why do you seem so scared of the GPU using more power?

  • notstarboard
    notstarboard Member Posts: 3,903

    I realize that, but 17 ms is nothing. It's more than six times shorter than a typical blink, and only a bit more than the extra latency you'd get from using a Bluetooth mouse instead of a wired one (~10 ms). It's not noticeable for me at all, especially in a game like DBD, and the effect is so slight overall that I'll still use VSync for online FPS games.

    I care a lot about power use because 60% of the electricity from the grid in my country is derived from fossil fuels and the planet is warming alarmingly. I don't want my hobbies to contribute to that more than they have to.

  • notstarboard
    notstarboard Member Posts: 3,903

    Strange response, as I would have assumed technical issues would go in one of the "Feedback and Issues" subsections. This is meant as more of a discussion of development philosophy than a technical issue, because as I stated in my post, everything is working as intended.

    Ask the Community is described as the section of the forum for "simple questions", and this is not a simple question. I am looking for discussion around this issue, and the issue is of general interest; ergo, General Discussions / "open discussions on various topics". The number of upvotes on a post like this indicates that it's of interest to other players. I admit this is probably too cynical, but I can't help but feel like this was moved because the mods and/or devs don't want to draw attention to it in the main forum.

    I have been playing long enough to remember the devs circa 2019 finally being forced to comment on the game's poor performance, especially on console, due to community pressure. This resulted in them committing to major optimization efforts, but, to the extent that these were done at all, they never actually improved performance. And now the game is about twice as hard to run as it was back then, in exchange for only modest visual improvements at best. It's clear to me that optimization is not a focus in spite of persistent performance issues, and the developers need to hear from the community on this.

  • Aceislife
    Aceislife Member Posts: 434
    edited October 11

    Ok, thinking that a what, 30-50 watt difference (probably way less) is going to affect those things in the slightest is delusional. And the sacrifices you make as an individual will not help with it. Make no mistake of that. And I assume you've tried both, since you seem to know, instead of just looking at the numbers.

  • notstarboard
    notstarboard Member Posts: 3,903

    Do you vote, even though the odds of your vote making a difference in the election are vanishingly small? Most people do, because if we do our part and trust that others will do the same, we can collectively have a better outcome.

    Saving 30-50 watts playing DBD is not even a drop in the bucket, but when you make lots of these sorts of optimizations in your life, and when millions of other people do the same, you can start to make an impact (rough math at the end of this post). Advocacy is of course also important, but reducing your personal footprint is still worth doing.

    I have indeed tried both; I don't notice any difference in DBD. Even in shooters the difference is so subtle I'm not sure if I can notice it or not.
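    For what it's worth, here's the rough math behind the "millions of people" point above. The player count and weekly hours are assumed purely for illustration; they're not actual DBD figures:

    ```python
    # Back-of-envelope aggregate for the 30-50 W point. The player count and
    # weekly hours below are hypothetical, chosen only to illustrate the scale.

    watts_saved_per_player = 40      # midpoint of the 30-50 W range discussed
    assumed_players = 1_000_000      # hypothetical number of players making the change
    hours_per_week = 5               # hypothetical average playtime per player

    kwh_per_week = watts_saved_per_player * assumed_players * hours_per_week / 1000
    print(f"~{kwh_per_week:,.0f} kWh saved per week under these assumptions")
    # ~200,000 kWh per week with these made-up inputs - tiny per person, not so tiny in aggregate.
    ```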