D&D 5E Precision Strike: How to calculate

FrogReaver

The most respectful and polite poster ever
If one were to use all Battlemaster superiority dice on Precision Attack, how would that affect a character's effective chance to hit?

How would someone go about calculating or even estimating this?
 


I estimate that, with fairly smart application, Precision Attack can increase effective chance to hit by about 15%.
 

Well, the really critical section of the maneuver is this one, underlined:

"Precision Attack. When you make a weapon attack roll against a creature, you can expend one superiority die to add it to that roll. You can use this maneuver before or after the attack roll, but before any effects of the attack have been applied."

Assuming you don't have a DM who dickishly "applies the effects" before you have a chance to think about it, this means that you'll be adding the die only in situations where you think you've failed the roll, but only by a moderate margin--in other words, attacks in the (very very roughly) 5-12 region depending on the enemy's AC. A rolled 5 is unlikely to hit even a low-AC enemy early on (2+5+3 = 10, which will miss most enemies), whereas a rolled 12 is very likely to hit most things (2+12+3 = 17, few enemies at level 3 will have an AC of 18 or higher). This range might rise, slightly, as you get into very high levels where enemies could have an AC of 20 or more, but I doubt it would go much higher than that. I think it's also fair to assume that a Battlemaster is going to pay attention to things like the rolled number for their allies' attacks--if they see a rolled 11 hitting, they won't bother spending the dice for numbers higher than that, but will shift their acceptable range to match. Same for seeing a rolled 15 miss. (Assuming the characters have the same total attack bonus, which is ~fairly~ likely.)

So the range from 5 to 12 is 8 faces of the d20, shifted around contextually depending on the player's observations. We assume, then, that this means 40% of the time, when it's available, the Battlemaster adds (die/2) to their attacks, and that all of those attacks would normally miss. If the rolled number is at the low end (5), then the BM is adding +4/5/6 (depending on level) to that attack. This turns a rolled 5 (= 10 total) into a rolled 14/15/16--enough to hit most moderately-armored targets; similarly, it turns a rolled 12 (= 17 total) into a rolled 21+, which hits nearly everything. So we can assume that most of these attacks hit; let's call it 75%.

Hitting 75% of the 40% of attacks that would have always missed before translates to (.75)*(.40) = .30, a 30% increase to hit. This is both a relatively high estimate (because I assumed all of those 40% of attacks *always* would miss without it) and a relatively low estimate (because I rounded down the average Superiority Dice value). On the whole, I'd say it's a good number, but if you want to be very conservative, you could call it a 20% to 25% increase in number of successful attacks and you would almost certainly not be over-estimating the true result. 15% sounds like an excessively low estimate to me, if we're assuming very shrewd use.
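To make the structure of that estimate concrete, here is a minimal sketch (the parameters are the rough guesses above, not exact values, and the function name is just for illustration):

```python
# Structure of the estimate: gain = (fraction of attacks that get the die)
#                                  x (fraction of those that convert to hits)
def estimated_gain(window_fraction, conversion_rate):
    return window_fraction * conversion_rate

print(estimated_gain(0.40, 0.75))   # 0.30 -> the 30% figure above
print(estimated_gain(0.40, 0.625))  # 0.25 -> a more conservative conversion guess
print(estimated_gain(0.40, 0.50))   # 0.20 -> the low end of the 20-25% range
```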

Unfortunately, there are at least three different axes of variation that make it impossible to truly "calculate" the amount of increase. Shrewd play on the player's part, variations in the target AC, and whether the PC's numbers remain "in step" with the growth of average enemy AC over time. The first, of course, is the most meaningful reason--if we assume a less-shrewd player, many of the dice may be wasted, resulting in a reduced benefit.
 

With a d8 superiority die, it gives you a fixed +22.5% chance to hit whenever you need a natural 10+ on the attack's d20 roll.

Needed roll 10: 55% chance to hit, 77.5% with Precision Attack.
Needed roll 20: 5% chance to hit, 27.5% with Precision Attack.

Below a needed roll of 10 the bonus is reduced, because the high d8 rolls are wasted.

With a d10 it's +27.5% on a needed natural roll of 12+.
With a d12 it's +32.5% on a needed natural roll of 14+.
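These numbers are easy to verify. Here is a minimal sketch (mine, not the poster's); it assumes the needed roll is high enough that every face of the superiority die can matter, and that the die is spent on any miss it could possibly save:

```python
# Expected to-hit gain from always spending the die on a savable miss.
# Each miss-by-k outcome has probability 1/20, and the die saves it with
# probability (die_size - k + 1) / die_size; summing over k gives (die_size + 1) / 40.
for die_size in (8, 10, 12):
    gain = sum((die_size - k + 1) / die_size for k in range(1, die_size + 1)) / 20
    print(die_size, round(gain, 4), (die_size + 1) / 40)
# 8  0.225 0.225  -> +22.5%
# 10 0.275 0.275  -> +27.5%
# 12 0.325 0.325  -> +32.5%
```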
 

With a d8 superiority die, it gives you a fixed +22.5% chance to hit whenever you need a natural 10+ on the attack's d20 roll.

Needed roll 10: 55% chance to hit, 77.5% with Precision Attack.
Needed roll 20: 5% chance to hit, 27.5% with Precision Attack.

Below a needed roll of 10 the bonus is reduced, because the high d8 rolls are wasted.

With a d10 it's +27.5% on a needed natural roll of 12+.
With a d12 it's +32.5% on a needed natural roll of 14+.

While this is fair, it assumes that the player is not being at all "strategic" about their use of the ability. Strategically employing it, only in those cases where you need it rather than all the time, tweaks these numbers (up, rather than down, but still).

At the same time, this also sets a floor for how much benefit you can get out of it, so that IS a very useful thing to know. Minimum 22.5% net-average benefit. This will go up, if you're able to strategically use it excluding those cases where you know you won't need it (like I said, even a fairly un-strategic player will probably not use Precision Attack with a rolled 15.)
 

While this is fair, it assumes that the player is not being at all "strategic" about their use of the ability. Strategically employing it, only in those cases where you need it rather than all the time, tweaks these numbers (up, rather than down, but still).

At the same time, this also sets a floor for how much benefit you can get out of it, so that IS a very useful thing to know. Minimum 22.5% net-average benefit. This will go up, if you're able to strategically use it excluding those cases where you know you won't need it (like I said, even a fairly un-strategic player will probably not use Precision Attack with a rolled 15.)

Agree, but in a 5-man party the target's AC gets revealed in one round, maybe two.

And the maneuver is more valuable the higher the target's AC is,

because the absolute increase to the to-hit chance is the same, but the relative boost to DPS varies greatly.

Needing only a roll of 10, you go from 55% to-hit to 77.5% to-hit: a 41% increase in DPS.

But needing a roll of 20, where 5% becomes 27.5%, DPS is increased by 450%.
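A quick sketch of that comparison (my own; it assumes damage per hit is unchanged, so the relative DPS increase equals the relative hit-chance increase):

```python
# Relative DPS gain = (boosted hit chance - base hit chance) / base hit chance,
# assuming damage on a hit stays the same.
for base, boosted in [(0.55, 0.775), (0.05, 0.275)]:
    print(f"{base:.0%} -> {boosted:.1%}: +{(boosted - base) / base:.0%} DPS")
# 55% -> 77.5%: +41% DPS
# 5% -> 27.5%: +450% DPS
```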
 

With a d8 superiority die, it gives you a fixed +22.5% chance to hit whenever you need a natural 10+ on the attack's d20 roll.

Needed roll 10: 55% chance to hit, 77.5% with Precision Attack.
Needed roll 20: 5% chance to hit, 27.5% with Precision Attack.

Below a needed roll of 10 the bonus is reduced, because the high d8 rolls are wasted.

With a d10 it's +27.5% on a needed natural roll of 12+.
With a d12 it's +32.5% on a needed natural roll of 14+.

Are you taking into account that a player only gets about 4 uses of precision attack (that eventually climbs to 6) per short rest?

And once you realize it can't be used on every turn, you will also realize that treating the d8 as a fixed 4.5 average value skews the results higher as well. Under that assumption, the few times you do use it there is no chance of missing (assuming you know the monster's AC). In reality, when you use it there's on average a 10% to 25% chance you still miss.
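A rough Monte Carlo sketch of that limit (entirely my own numbers: 12 attacks per short rest, +5 to hit against AC 15, four d8 superiority dice, and a die spent on any miss it could possibly save):

```python
import random

def short_rest_block(attacks=12, bonus=5, ac=15, dice=4, die_size=8):
    """Count hits over one batch of attacks with a limited pool of superiority dice."""
    hits = 0
    for _ in range(attacks):
        d20 = random.randint(1, 20)
        if d20 != 1 and (d20 == 20 or d20 + bonus >= ac):
            hits += 1                                   # natural hit
        elif dice > 0 and d20 != 1 and ac - (d20 + bonus) <= die_size:
            dice -= 1                                   # spend a die on a savable miss
            if d20 + bonus + random.randint(1, die_size) >= ac:
                hits += 1
    return hits

trials = 100_000
avg_hits = sum(short_rest_block() for _ in range(trials)) / trials
print(avg_hits / 12)  # effective hit chance; sits between 55% and the 77.5% unlimited-dice figure
```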
 

Well, the really critical section of the maneuver is this one, underlined:

"Precision Attack. When you make a weapon attack roll against a creature, you can expend one superiority die to add it to that roll. You can use this maneuver before or after the attack roll, but before any effects of the attack have been applied."

Assuming you don't have a DM who dickishly "applies the effects" before you have a chance to think about it, this means that you'll be adding the die only in situations where you think you've failed the roll, but only by a moderate margin--in other words, attacks in the (very very roughly) 5-12 region depending on the enemy's AC. A rolled 5 is unlikely to hit even a low-AC enemy early on (2+5+3 = 10, which will miss most enemies), whereas a rolled 12 is very likely to hit most things (2+12+3 = 17, few enemies at level 3 will have an AC of 18 or higher). This range might rise, slightly, as you get into very high levels where enemies could have an AC of 20 or more, but I doubt it would go much higher than that. I think it's also fair to assume that a Battlemaster is going to pay attention to things like the rolled number for their allies' attacks--if they see a rolled 11 hitting, they won't bother spending the dice for numbers higher than that, but will shift their acceptable range to match. Same for seeing a rolled 15 miss. (Assuming the characters have the same total attack bonus, which is ~fairly~ likely.)

So the range from 5 to 12 is 8 faces of the d20, shifted around contextually depending on the player's observations. We assume, then, that this means 40% of the time, when it's available, the Battlemaster adds (die/2) to their attacks, and that all of those attacks would normally miss. If the rolled number is at the low end (5), then the BM is adding +4/5/6 (depending on level) to that attack. This turns a rolled 5 (= 10 total) into a rolled 14/15/16--enough to hit most moderately-armored targets; similarly, it turns a rolled 12 (= 17 total) into a rolled 21+, which hits nearly everything. So we can assume that most of these attacks hit; let's call it 75%.

Hitting 75% of the 40% of attacks that would have always missed before translates to (.75)*(.40) = .30, a 30% increase to hit. This is both a relatively high estimate (because I assumed all of those 40% of attacks *always* would miss without it) and a relatively low estimate (because I rounded down the average Superiority Dice value). On the whole, I'd say it's a good number, but if you want to be very conservative, you could call it a 20% to 25% increase in number of successful attacks and you would almost certainly not be over-estimating the true result. 15% sounds like an excessively low estimate to me, if we're assuming very shrewd use.

Unfortunately, there are at least three different axes of variation that make it impossible to truly "calculate" the amount of increase. Shrewd play on the player's part, variations in the target AC, and whether the PC's numbers remain "in step" with the growth of average enemy AC over time. The first, of course, is the most meaningful reason--if we assume a less-shrewd player, many of the dice may be wasted, resulting in a reduced benefit.

So 30% more hits at a 50% chance to hit means a 15-point increase in the chance to hit?

50% * 1.30 = 65%?
 

While this is fair, it assumes that the player is not being at all "strategic" about their use of the ability. Strategically employing it, only in those cases where you need it rather than all the time, tweaks these numbers (up, rather than down, but still).

At the same time, this also sets a floor for how much benefit you can get out of it, so that IS a very useful thing to know. Minimum 22.5% net-average benefit. This will go up, if you're able to strategically use it excluding those cases where you know you won't need it (like I said, even a fairly un-strategic player will probably not use Precision Attack with a rolled 15.)

22.5% is actually a theoretical maximum, and one that is unachievable in practice. It does make for a nice upper bound, though. The best you can do is use Precision Attack wisely, and you will approach the 22.5% increase.
 

Well, the really critical section of the maneuver is this one, underlined:

"Precision Attack. When you make a weapon attack roll against a creature, you can expend one superiority die to add it to that roll. You can use this maneuver before or after the attack roll, but before any effects of the attack have been applied."

Assuming you don't have a DM who dickishly "applies the effects" before you have a chance to think about it, this means that you'll be adding the die only in situations where you think you've failed the roll, but only by a moderate margin--in other words, attacks in the (very very roughly) 5-12 region depending on the enemy's AC. A rolled 5 is unlikely to hit even a low-AC enemy early on (2+5+3 = 10, which will miss most enemies), whereas a rolled 12 is very likely to hit most things (2+12+3 = 17, few enemies at level 3 will have an AC of 18 or higher). This range might rise, slightly, as you get into very high levels where enemies could have an AC of 20 or more, but I doubt it would go much higher than that. I think it's also fair to assume that a Battlemaster is going to pay attention to things like the rolled number for their allies' attacks--if they see a rolled 11 hitting, they won't bother spending the dice for numbers higher than that, but will shift their acceptable range to match. Same for seeing a rolled 15 miss. (Assuming the characters have the same total attack bonus, which is ~fairly~ likely.)

1. Very wise to note that you will only apply the die when you have failed the roll by a moderate margin. (In testing I've found the sweet spot to be missing by up to 4.)


So the range from 5 to 12 is 8 faces of the d20, shifted around contextually depending on the player's observations. We assume, then, that this means 40% of the time, when it's available, the Battlemaster adds (die/2) to their attacks, and that all of those attacks would normally miss. If the rolled number is at the low end (5), then the BM is adding +4/5/6 (depending on level) to that attack. This turns a rolled 5 (= 10 total) into a rolled 14/15/16--enough to hit most moderately-armored targets; similarly, it turns a rolled 12 (= 17 total) into a rolled 21+, which hits nearly everything. So we can assume that most of these attacks hit; let's call it 75%.

2. Very wise to notice that a character's chance to hit doesn't actually affect the calculations.
3. The estimate that 75% of the Battlemaster's superiority die uses turn a miss into a hit is actually very good. However, getting to that number involves realizing it's a die roll and still has a chance to miss (unless you only need a +1 in order to hit). That miss chance is why you only have about a 75%-85% chance of landing any given superiority-die Precision Attack.

While your base chance to hit doesn't impact the calculations, the number of attacks you expect to make per short rest does. More attacks mean more chances for close misses. That either lets you use more superiority dice on Precision Attack per short rest (assuming you don't modify your usage rule), OR you tighten the rule and each die you do spend has a higher chance of turning a miss into a hit (because there are more misses by 1, by 2, by 3, and so on when you increase the number of attacks).

Hitting 75% of the 40% of attacks that would have always missed before translates to (.75)*(.40) = .30, a 30% increase to hit. This is both a relatively high estimate (because I assumed all of those 40% of attacks *always* would miss without it) and a relatively low estimate (because I rounded down the average Superiority Dice value). On the whole, I'd say it's a good number, but if you want to be very conservative, you could call it a 20% to 25% increase in number of successful attacks and you would almost certainly not be over-estimating the true result. 15% sounds like an excessively low estimate to me, if we're assuming very shrewd use.

4. Using a superiority die on the 40% of attacks that missed but have a chance of being turned into a hit actually lowers your success rate of turning a miss into a hit to:
(8/8 + 7/8 + 6/8 + 5/8 + 4/8 + 3/8 + 2/8 + 1/8) / 8 = 56.25%. 56.25% * 40% comes out to 22.5%, as it should.

5. To lower that 22.5% further, you must realize that not every opportunity to use a superiority die on Precision Attack is one you can actually take. You only have 4 uses per short rest, and you could have more than 4 opportunities between short rests. Any rest period where you have already spent 4 Precision Attacks and need a 5th lowers that 22.5% even more, and this will come up a nontrivial amount of the time.

75% is a good estimate if you only plan on using Precision Attack on no more than 20% of attacks: 75% * 20% = 15%. The 56.25% figure is the one to use if you use it on 40% of attacks.
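A small sketch of that trade-off (my own assumptions: d8 die, a needed roll high enough that every miss margin is in play, unlimited dice), showing how the usage threshold moves the per-use landing rate and the overall gain:

```python
# Policy: spend the d8 only when the attack misses by at most `margin`.
for margin in range(1, 9):
    usage = margin / 20                                    # fraction of attacks that get a die
    land = sum((8 - k + 1) / 8 for k in range(1, margin + 1)) / margin
    print(f"miss by <= {margin}: used on {usage:.0%} of attacks, "
          f"lands {land:.2%}, overall gain {usage * land:.2%}")
# e.g. margin 4: used on 20%, lands 81.25%, gain 16.25%
#      margin 8: used on 40%, lands 56.25%, gain 22.50%
```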

Unfortunately, there are at least three different axes of variation that make it impossible to truly "calculate" the amount of increase. Shrewd play on the player's part, variations in the target AC, and whether the PC's numbers remain "in step" with the growth of average enemy AC over time. The first, of course, is the most meaningful reason--if we assume a less-shrewd player, many of the dice may be wasted, resulting in a reduced benefit.

Actually, this discussion is helping me a lot. Thank you. I think I may soon be able to get the calculations to back up my simulation!
 
