Recently, I've been examining the "passing premium," the difference in expected gain between a pass play and a run play. After extending the research of this paper, it appears that passing yields a better average gain than running, even after accounting for incompletions, sacks, and interceptions. This would suggest that NFL teams should pass more often than they currently do, because at an optimal mix the expected gains of running and passing should be equal.
"We can think of passing and running as two investments, each with its own expected payoff and volatility. "
Unfortunately, finding the optimum mix of runs and passes is more complicated than comparing average expected gains. Commenter "JG" pointed out that passing's comparatively high variance (passes are often incomplete, or result in sacks or turnovers) means that passing should carry a higher expected payoff; it would not be worth the risk otherwise. If a team could get the same expected gain by only running, why would it ever risk a pass?
This kind of analysis is based on financial portfolio theory, a branch of math that analyzes and weighs risks and rewards. We can think of passing and running as two investments, each with its own expected payoff and volatility. When a team calls a running play, it invests in a run at the price of one down, hoping for a payoff in yards. Running would be like buying a share of GE. Passing would be more like buying a share of a tech startup. There is more upside for rapid gain, but there is also a decent chance you'll lose the kids' college fund.
The author of this site proposes several possible applications of the Sharpe Ratio in football. The Sharpe Ratio is a financial measure of expected return per unit of variability. Specifically, it is the ratio of an investment's average return in excess of a risk-free alternative to the standard deviation of the investment's returns.
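In symbols, if R̄ is the average return, R_f the return of the risk-free alternative, and σ the standard deviation of the returns:

$$\text{Sharpe Ratio} = \frac{\bar{R} - R_f}{\sigma}$$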
By comparing the Sharpe Ratios of running and passing, we can see whether one tactic carries a premium over the other after accounting for each tactic's risk. We could also compare two different passing strategies, such as a high-risk/high-reward passing offense versus a high-percentage "dink and dunk" offense.
Consider the simple fictitious example below. Team A is the high-risk/high-reward passing team and Team B is the higher-percentage passing team. The table lists the results of several pass attempts for each team (order doesn't matter for the Sharpe Ratio). Both teams average the same number of yards per attempt, but Team A had more incompletions and sacks along with more yards per completion. For the risk-free alternative, I'll use a zero-yard "QB flop" play. Each team also threw one interception, counted as the equivalent of -45 yards.
| Pass | Team A | Team B |
|------|--------|--------|
| Pass 1 | 40 | 27 |
| Pass 2 | 22 | 18 |
| Pass 3 | 20 | 18 |
| Pass 4 | 15 | 13 |
| Pass 5 | 15 | 13 |
| Pass 6 | 10 | 13 |
| Pass 7 | 3 | 10 |
| Pass 8 | 0 | 5 |
| Pass 9 | 0 | 3 |
| Pass 10 | 0 | 0 |
| Pass 11 | 0 | 0 |
| Pass 12 | -5 | 0 |
| Pass 13 | -5 | -5 |
| Pass 14 | -10 | -10 |
| Pass 15 | -45 | -45 |
| Avg YPA | 4.00 | 4.00 |
| Std Dev | 18.86 | 16.75 |
| Sharpe Ratio | 0.21 | 0.24 |
In this example, the Sharpe Ratio is higher for Team B's high-percentage offense, suggesting its rewards better justify its risks. We would get similar results for any comparison of higher-risk versus lower-risk tactics, assuming the average net gain is equal.
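For anyone who wants to check the arithmetic, here is a minimal sketch (not part of the original example) that reproduces the table from the listed outcomes, using the zero-yard QB flop as the risk-free baseline:

```python
import statistics

# Pass outcomes in yards, taken from the table above (interception counted as -45).
team_a = [40, 22, 20, 15, 15, 10, 3, 0, 0, 0, 0, -5, -5, -10, -45]
team_b = [27, 18, 18, 13, 13, 13, 10, 5, 3, 0, 0, 0, -5, -10, -45]

RISK_FREE = 0.0  # the zero-yard "QB flop" play

def sharpe(outcomes, risk_free=RISK_FREE):
    """Average gain in excess of the risk-free play, per unit of standard deviation."""
    avg = statistics.mean(outcomes)
    sd = statistics.stdev(outcomes)  # sample standard deviation
    return avg, sd, (avg - risk_free) / sd

for name, plays in [("Team A", team_a), ("Team B", team_b)]:
    avg, sd, ratio = sharpe(plays)
    print(f"{name}: avg YPA = {avg:.2f}, std dev = {sd:.2f}, Sharpe = {ratio:.2f}")
# Team A: avg YPA = 4.00, std dev = 18.86, Sharpe = 0.21
# Team B: avg YPA = 4.00, std dev = 16.75, Sharpe = 0.24
```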
The potential for the application of the Sharpe Ratio and all of Portfolio Theory in football strategy is vast. We might finally answer the question of whether a boom/bust running back like Barry Sanders is better than a straight-ahead pounder like Jamal Lewis. We could analyze the merits of Mike Martz's high risk/reward passing doctrine. I'm sure I'll be pursuing such applications in future research. In the meantime, however, the next post will critique a very interesting research paper that makes great strides in applying portfolio theory directly to the passing premium issue.
Continue reading part 2 of The Passing Paradox.
The value of consistency is HUGE in football. An offense that could get 2.5 yards every down, without fail, would be the greatest offense in the history of the league, far better than the 2007 Patriots, who gained 6.2 yards per play.
The difference between a 0 yard gain and a 3 yard loss on 1st and 10 is gigantic, but the difference between a 15 yard gain and an 18 yard gain is basically meaningless. The marginal utility of yards gained declines dramatically once you get beyond the first down marker.
I would expect football's optimal play selection to be skewed very heavily towards high-percentage plays. (Not necessarily always runs. Wes Welker-type plays are probably awesome too.)
Consistency could be overrated too, in a way. The more plays a team needs to execute on its way to scoring, the more chances there are for something to go wrong.
For example, take the theoretical 2.5 yd/snap team. Say about 3% of all plays result in a fumble lost, and a 2.5 yd/snap team would need 32 plays to put together an 80-yd drive. They'd almost never reach the end zone, fumbling about once per drive.
A high risk offense that moved the ball in 20 yard chunks but risked an interception or fumble on a proportionate number of plays (20/2.5 = 8, 3% × 8 = 24%) would still score relatively often. They would only turn over the ball on 65% of their drives and score TDs on the other 35%.
Both examples are straw men, but hopefully they express my point. The marginal utility of yards above 10 has to take into account that the team will need fewer first downs and fewer plays to score. There is more risk inherent in each additional play required, and every additional first down required gives the defense another opportunity to stop the drive.
So aggressive and apparently risky deep throws may not be so risky in the long run. Pun not intended!
I'd be interested to see this "Sharpe Ratio" analysis applied not on a per-play basis, but on a per-fresh-set-of-downs basis. So, to take the example of your two straw men, the first gains an average of 8.85 yards per set of downs with an 11.5% chance of a turnover, while the second team gains an average of 15.2 yards per set of downs with a 24% chance of a turnover.
This approach encapsulates one of the main advantages of a consistent offense - namely, that you can sustain drives.
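Those per-set-of-downs figures can be reproduced from the straw-man assumptions in the earlier comment (2.5 yds per play with a 3% turnover chance for the consistent team, one 20-yd play with a 24% turnover chance for the boom/bust team); a rough sketch:

```python
import math

# Per-set-of-downs comparison of the two straw-man offenses described above.
# Assumed numbers come from the comment thread: the consistent team gains 2.5 yds
# on every play with a 3% turnover chance per play; the boom/bust team gains 20 yds
# per play with a 24% turnover chance per play. Yards gained before a mid-series
# turnover are ignored, to match the figures quoted above.

def per_set_of_downs(yards_per_play, turnover_per_play, first_down_yards=10):
    plays_needed = math.ceil(first_down_yards / yards_per_play)
    keep_ball = (1 - turnover_per_play) ** plays_needed
    expected_yards = keep_ball * yards_per_play * plays_needed
    return expected_yards, 1 - keep_ball

for name, ypp, to in [("Consistent (2.5 yd/play)", 2.5, 0.03),
                      ("Boom/bust (20 yd/play)", 20, 0.24)]:
    yds, turnover = per_set_of_downs(ypp, to)
    print(f"{name}: {yds:.2f} yds per set of downs, {turnover:.1%} turnover chance")
# Consistent (2.5 yd/play): 8.85 yds per set of downs, 11.5% turnover chance
# Boom/bust (20 yd/play): 15.20 yds per set of downs, 24.0% turnover chance
```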
Agreed, but you're getting a little ahead of me. Ultimately, the most useful application would be a Sharpe Ratio for every down-and-distance combination.
First, I really enjoyed reading this series of posts on the "Passing Paradox" and overall I think you've come up with some great stuff on this site.
Second, I have a question about the applicability of portfolio theory to an analysis of NFL play-calling. Stock returns (according to the Efficient Markets hypothesis) are thought to follow a random walk. In other words, there is no correlation between past and future returns. Do you believe this is the case with plays during an NFL game? Have you seen calculations that suggest the absence of autocorrelation among NFL plays?
Keep up the good work.
Max: Interesting point. Aside from watching too much CNBC, I'm not an expert on market theories. But here are a few thoughts.
I think of daily equity prices as random noise on top of fundamental trends. Underneath are the health and growth of a company and general market conditions. But the day-to-day variation is essentially random. In other words, it's a biased random walk.
Football plays are similar in that a team has fundamental abilities and skills, but the play-to-play outcome is a random function on top of the team's fundamental abilities.
For example, a good team's passing outcomes might be 8 yds, 0 yds, 15 yds, 12 yds. A bad team's passing outcomes might be 0 yds, 7 yds, 6 yds, 9 yds. The play-to-play outcomes are random, but the better team has an underlying fundamental advantage.
The market theories are often focused on forecasting the next day's stock price. I wouldn't suggest that we use portfolio theory to try to predict the outcome of the next play in a game, just to suggest what the optimum play choice should be for various situations--the same way financial advisers build portfolios for people in different stages of life.
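One way to check Max's autocorrelation question empirically, sketched here with a made-up sequence of gains (real play-by-play data would be substituted):

```python
import statistics

# Made-up sequence of successive play gains (yards); real play-by-play data
# would be substituted here to test for autocorrelation.
gains = [8, 0, 15, 12, 3, -2, 7, 0, 22, 5, 1, 9, 0, 14, 6]

def lag1_autocorrelation(xs):
    """Correlation between each play's gain and the next play's gain."""
    mean = statistics.mean(xs)
    num = sum((a - mean) * (b - mean) for a, b in zip(xs, xs[1:]))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

print(f"lag-1 autocorrelation: {lag1_autocorrelation(gains):.3f}")
# A value near zero would suggest successive plays behave like independent draws
# around the team's underlying ability, as described above.
```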
"For example, take the theoretical 2.5 yd/snap team. Say about 3% of all plays result in a fumble lost, and a 2.5 yd/snap team would need 32 plays to put together an 80-yd drive. They'd almost never reach the end zone, fumbling about once per drive."
Not true. If they fumble 3% of the time, then they gain 2.5 yards 97% of the time, so they'd put together an 80-yd drive (.97)^32 = 38% of the time, and score 2.6 points per drive, making them one of the best offenses in the NFL.
"A high risk offense that moved the ball in 20 yard chunks but risked an interception or fumble on a proportionate number of plays (20/2.5 = 8, 3% × 8 = 24%) would still score relatively often. They would only turn over the ball on 65% of their drives and score TDs on the other 35%."
No, they'd score on (.76)^4 = 33% of their drives, and average 2.3 points per drive, which is still very good, but not as good as the more consistent offense. As long as they turn the ball over on a proportionate number of plays, the more consistent offense will always average more points.
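Those corrected drive figures follow directly from the stated assumptions (an 80-yd drive, a turnover chance on every play, and a touchdown worth roughly 7 points); a quick sketch:

```python
# Chance of finishing an 80-yd drive when every play carries a turnover risk.
# Assumed numbers come from the comments above: the consistent team needs 32 plays
# (80 / 2.5) with a 3% turnover chance per play; the boom/bust team needs 4 plays
# (80 / 20) with a 24% turnover chance per play. A touchdown is treated as worth
# roughly 7 points; the exact value doesn't change the comparison.

POINTS_PER_TD = 7

def drive_outcome(plays_needed, turnover_per_play):
    score_prob = (1 - turnover_per_play) ** plays_needed
    return score_prob, score_prob * POINTS_PER_TD

for name, plays, to in [("Consistent (2.5 yd/play)", 32, 0.03),
                        ("Boom/bust (20 yd/play)", 4, 0.24)]:
    p, pts = drive_outcome(plays, to)
    print(f"{name}: scores on {p:.0%} of drives, about {pts:.1f} points per drive")
# Consistent (2.5 yd/play): scores on 38% of drives, about 2.6 points per drive
# Boom/bust (20 yd/play): scores on 33% of drives, about 2.3 points per drive
```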
OK, I think you're right. But hopefully my point is still valid: the fewer plays needed to move the ball, the fewer opportunities there are for something to go wrong. Low risk isn't that great a thing if you are exposed to that risk very frequently.
The 2.5 yard/play team would only get about three drives...
Though I know it wasn't the intent, the portfolio/risk analysis has given me a finer appreciation for completion percentage. I've always been a sheer YPA guy, feeling a lower completion percentage paired with higher yards per completion is roughly equivalent. It goes to show the genius behind the WCO (an offense I don't particularly like) as it devises a way to get most of the benefits of the pass while minimizing most of the risk.
I would think an analysis would show that, within a small range of YPA, passers with higher completion percentages are much better (running game, etc. being equal), due to the ability to sustain drives. This trend would only fade for passers with very low completion percentages but very high yards per completion.
I'll have to look and see if you've covered this in one of your other pieces. Football is a bit of an odd sport to cover, because in almost every scenario you can use the moneyball outs/downs theory, with sustainability and consistency being the key things. A guy who completes 6-yard passes nearly every time he throws is better than the guy who completes 12-yard ones occasionally. However, where this analysis falls apart (and what makes football difficult) is when you incorporate the "super gainer" who completes few passes (or gets few solid run gains) but the ones he gets are TDs.
It seems in most scenarios there's not a huge advantage to 20-yard completions over 10-yard completions (some advantage, due to needing fewer first downs), but there is a cutoff point, around 40 yards, where most completions are going for TDs and all the models go haywire. It's somewhat similar to the home-run hitter in baseball, but, as with all stats in football, more complex.
I see a similarity to portfolio theory in that the optimum portfolio should include a mix of "holdings" (run and pass) to minimize risk for a given expected gain. Plus there is a sort of unsystematic risk associated with the wrong mix of playcalls that can be diversified away (by keeping the defense guessing). And there is an ever-present systematic risk of a low return associated with the "market." In this case, this "market risk" would be the defense's ability, when matched up against a given offense, to create a loss of yardage no matter what mix of plays is chosen. But a difference in football is that overinvesting in one holding (calling all passes, for instance) will decrease the expected return and increase the risk, due to the defense's ability to guess the playcall. If you put all of your money in GE, GE's expected return and beta are still what they are. In the investment world, expected return and sigma do not worsen if some opponent knows that there is a 100% chance that an individual investor's next investment, or playcall, will be GE. In football, they do.
The run/pass decision is slightly different from equity selection in that your choice of mix greatly affects the average expected gain. If you are throwing on every down, your expected gain will be lower than if your mix is 50/50.
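A toy illustration of that last point, with entirely made-up payoff numbers, just to show how the choice of mix feeds back into the expected gain when the defense can key on the more frequent call:

```python
# Toy model: the defense keys on whatever you call most often, so each play type's
# expected gain shrinks as its share of the playcalls grows. All payoff numbers
# below are made up purely for illustration.

def expected_gain(pass_share):
    run_share = 1 - pass_share
    pass_yds = 7 - 4 * pass_share  # passes average less as defenses sit in coverage
    run_yds = 5 - 3 * run_share    # runs average less as defenses stack the box
    return pass_share * pass_yds + run_share * run_yds

for share in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"pass {share:.0%} of the time -> {expected_gain(share):.2f} expected yds/play")
# pass 0% of the time -> 2.00 expected yds/play
# pass 25% of the time -> 3.56 expected yds/play
# pass 50% of the time -> 4.25 expected yds/play
# pass 75% of the time -> 4.06 expected yds/play
# pass 100% of the time -> 3.00 expected yds/play
```

In this made-up model the expected gain peaks at roughly a 57/43 pass/run mix, and calling all passes does worse than splitting 50/50, which is exactly the point.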