Killrate is a bad standard for balancing
Honestly, killrate is such a bad standard for balancing. The difference between a 3k game and a 4k game is not big enough to warrant a 25% difference in strength rating.
There are multiple factors that I will get to, but a big one is slugging for the 4k. If a killer slugs for the 4k, they usually have a huge advantage: they only have to find and hook one last survivor, and that survivor physically cannot open the gate. If a killer always slugs for the 4k, it's not hard to imagine that they get the 4k far more often than a killer who never does, so their kill rate in those games is 25 percentage points higher. This can easily add up to a total kill rate that is more than 5 points higher.
But the choice to slug for the 4k, compared to not slugging for it, is completely unrelated to the power of the killer and mostly unrelated to the skill of the player. I would not say that a player who slugged for the 4k performed 25% better than one who didn't.
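A rough back-of-the-envelope sketch of that effect (the 20% endgame figure below is a made-up number for illustration, not from any real data):

```python
# Hypothetical: a killer reaches the "3 dead, 1 survivor left" endgame
# in 20% of their matches (made-up number for illustration).
endgame_rate = 0.20

# A 3k is a 75% kill rate for that match, a 4k is 100%, so slugging to
# convert the 3k into a 4k adds 25 percentage points to that match.
per_match_gain = 1.00 - 0.75

# Gap between a killer who always slugs for the 4k in that situation
# and one who never does:
overall_gain = endgame_rate * per_match_gain
print(f"overall kill rate gap: {overall_gain:.1%}")  # 5.0%
```

So two players of identical strength can sit 5 points apart on kill rate purely because of that one habit.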
It also makes some killers seem stronger than they are. I'll give an example. I know Nightlight is far from entirely accurate, but it's the best data we've got. Looking at the kill rate, Dredge (57.07%) sits above Hag (55.29%). If you look at the details, however, Hag has the higher win rate: she gets a 3k (19.23%) or a 4k (30.77%) in 50% of her matches, compared to Dredge's 46.74%. But because Dredge's 4-kill rate (39.13%) is MORE THAN 5 times as high as his 3-kill rate (7.61%), his kill rate ends up higher. This is not an accurate picture of how oppressive the killers actually are. I'd argue that for most solo queue survivors a Hag is worse to face than a Dredge, and for a SWF they are roughly the same.
Now, why Dredge's 4k rate is so much higher, I don't know. Nightlight has a small sample size, and its data tends to be inaccurate for reasons I don't want to get into now, so it could, for example, be a slug-happy Dredge main on a server where Nightlight users are more active, or something similar. I do not think this is a completely accurate representation of the actual kill rate. But my point still stands: a 4k does not intrinsically mean more than a 3k, except that the killer was potentially happy to slug for it.
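To make the Dredge/Hag comparison concrete, here's a small sketch. The 3k/4k shares are the Nightlight figures quoted above; the 0k/1k/2k splits are invented so the totals land near the quoted kill rates:

```python
# Kill rate vs. win rate from a kill-count distribution.
# dist[k] = share of matches ending with exactly k kills.
def kill_rate(dist):
    return sum(k * p for k, p in dist.items()) / 4

def win_rate(dist):
    return dist[3] + dist[4]  # win = 3k or 4k

# 3k/4k shares quoted from Nightlight; 0k/1k/2k shares are made up.
hag    = {0: 0.29,   1: 0.01, 2: 0.20, 3: 0.1923, 4: 0.3077}
dredge = {0: 0.2626, 1: 0.05, 2: 0.22, 3: 0.0761, 4: 0.3913}

for name, dist in (("Hag", hag), ("Dredge", dredge)):
    print(f"{name}: kill rate {kill_rate(dist):.1%}, win rate {win_rate(dist):.1%}")
# Hag: kill rate 55.4%, win rate 50.0%
# Dredge: kill rate 57.1%, win rate 46.7%
```

Dredge comes out on top of the kill rate even though Hag wins more often; the heavy 4k share does all the work.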
Even without slugging for the 4k, I think the performance difference between a 4k and a 3k is not nearly as big as the difference between a 0k and a 1k, a 1k and a 2k, or a 2k and a 3k. But each step affects the kill rate by the same amount.
There are also intrinsic factors: mobility, stealth, detection, slugging potential. All four contribute to a 4k rate that is higher than the 3k rate, even without intentionally slugging for the 4k. A Wraith has an easier time finding the hatch and patrolling the gates than a Bubba. The Twins are more likely than a Clown to naturally reach a game state where all remaining survivors are on the floor. These factors increase the kill rate by 25 percentage points in the relevant games, which can add up to a few percentage points overall, even though they have very little to do with the killer's performance in the actual trial.
So my proposal is that, while kill rate can be kept in the back of people's heads, we should use the killer's win rate/tie rate/loss rate to balance killers instead of the average kill rate. (A win being 3 or 4 kills, a tie being 2 kills, a loss being 1 or 0 kills.)
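A minimal sketch of what that record would look like, reusing the same made-up Hag distribution from above (this is just my framing of the proposal, nothing official):

```python
# Win/tie/loss record from a kill-count distribution, using the post's
# definitions: win = 3k or 4k, tie = 2k, loss = 0k or 1k.
def record(dist):
    return {
        "win":  dist[3] + dist[4],
        "tie":  dist[2],
        "loss": dist[0] + dist[1],
    }

hag = {0: 0.29, 1: 0.01, 2: 0.20, 3: 0.1923, 4: 0.3077}
for outcome, share in record(hag).items():
    print(f"{outcome}: {share:.0%}")  # win: 50%, tie: 20%, loss: 30%
```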
Comments
-
This is a good point. Getting the first kill of a match is not the same amount of effort as getting the final kill of a match, especially given how killers can snowball.
-
Kill rate, admittedly, is not the only measure of player skill. All else being equal (map, killer choice, latency, etc.), if two different killer players went up against the exact same 4-man squad, the killer who got 12 unique hooks would generally be considered more skilled than the killer who got 4, even if they both got a 4k.
However, kills are the only primary measure that generally matters, because as soon as you change this metric to anything else, it leads to very bizarre and abusable scenarios.
Let's take hooks. If you just make the metric "how many hooks does the killer get on average", an immediate consequence is that (some) people will slug excessively. The killer can still eliminate all 4 survivors this way, or at least get a couple of kills, and definitively win the match, yet the "metric" will show a loss.
The devs have been very specific in the past that they don't want to introduce an official measurement that would let a survivor die (definitely a loss for the survivor) while the killer somehow also records a loss (no hooks). Nor should they.
-
Especially when it's communal kill rates. Some players quite literally are just really good at the game and earn their high MMR.
Personally, I think they should be nerfing/buffing killers based on how obviously their kits are ass/broken. Nurse and Blight are an immediate thought; their powers are pretty unfair. Trapper, Ghostface and Skull Merchant are the top 3 worst killers in the game in terms of kit design and strength. Skull Merchant is awaiting a rework soon, but what about Trapper and some buffs to Ghostface? Why can Ghostface even be revealed? Why is it so buggy? Trapper still doesn't have the Trapper Sack basekit.
There's just so much the devs could be doing instead of basing changes entirely off of stats or super casual players who somehow can't learn how Trapper's power works.
-
Probably why the devs were trying to address things like slugging and tunnelling, so they, and we, could get a more accurate picture.
-
Your post was very long, so I didn't read it all, so I'll keep mine short. You can't balance killers when you have methods or strategies that affect kill rate as much as we currently have; your own post explains it in extreme detail. So until every killer tunnels and slugs for the 4k, or none do, you can't create a healthy game.