It seems pretty obvious that the ideal place to kick would be inside a dome, but how much more accurate does it actually make kickers?
Over the past five seasons, from 2010 to 2014, there were five domed stadiums in play: Atlanta, Detroit, New Orleans, St. Louis, and Minnesota (excluding 2014). We’ll ignore retractable roofs for this study.
There was an average of 995 field goal attempts each season in the NFL, which comes out to about 31 per kicker.
Field goals in our focus group—those kicked in domes—had an 85.8% success rate (601 of 700). Those kicked in the non-domed stadiums had an 83.6% success rate (3575 of 4276).
So yes, our prediction that kickers would be more accurate in domes does appear to be true. The catch is that the dome rate was only 2.2 percentage points better than the non-dome rate. If a kicker attempts 31 field goals in a season, being 2.2% more accurate affects just over half a field goal.
I would take any extra advantage I could get, but over a season about two points is not a whole heck of a lot.
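The dome-versus-outdoor comparison above boils down to a few lines of arithmetic. This sketch just plugs in the attempt and make totals quoted in this section (nothing else is assumed):

```python
# Dome vs. non-dome field goal accuracy, 2010-2014, using the totals above.
dome_made, dome_att = 601, 700
out_made, out_att = 3575, 4276

dome_rate = dome_made / dome_att   # ~85.8%
out_rate = out_made / out_att      # ~83.6%
gap = dome_rate - out_rate         # ~2.2 percentage points

# At ~31 attempts per kicker per season, the gap works out to:
attempts_per_kicker = 31
extra_makes = gap * attempts_per_kicker  # well under one field goal

print(f"dome: {dome_rate:.1%}, outdoor: {out_rate:.1%}, "
      f"edge: {extra_makes:.2f} FGs/season (~{extra_makes * 3:.1f} points)")
```

Running it shows the edge is roughly two-thirds of a field goal per kicker per season, or about two points.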
Appendix: Fantasy Football
An oft-repeated bit of advice in fantasy football is to grab a kicker who plays in a dome, based on the aforementioned thinking that kickers in domes are more accurate. As we just saw, while that is technically true, it won't have much effect over the whole season.
The bigger fault with this thinking is that team offenses have a much larger effect on kickers than the stadium. It is odd that this thinking persists when the kickers in domes over the past three years have finished in these positions among kickers with standard fantasy kicker scoring: 1st, 3rd, 4th, 5th, 16th, 17th, 18th, 20th, 22nd, 22nd, 24th, 29th, 34th.
The top three finishers (first, third, and fourth) were Blair Walsh, Matt Bryant, and Jason Hanson, but nobody could have taken advantage, as all three had those outstanding seasons in 2012.
The PAT, or more officially the “try,” has been around for decades, but the NFL has decided that kickers nowadays have gotten so good that things need to be a little more challenging when it comes to extra points. Thus teams in 2015 will be given a choice: Either snap the ball from the 15-yard line to kick the extra point (making it a 33-yard kick) or keep the ball on the two and go for the two-point conversion.
This leads to a few questions, like how much tougher will this make converting extra points? And knowing that, does it now make more sense to keep the ball at the two and go for two points?
Kickers in 2014 made 114 of 118 (97%) field goals from 30 to 33 yards, and more specifically 32 of 33 (97%) kicks from the 33-yard line. It's not quite as automatic as extra points were (kickers missed just eight of 1,251 PATs, a 99.4% success rate), but with the increased emphasis on kicks from that range, it is probably a safe bet that the 97% clip won't drop much. (If it holds at 97%, it would mean about 38 missed XPs rather than eight. That's about one per team per year.)
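The projected-misses figure comes straight from applying the 33-yard make rate to the 2014 PAT volume. A quick check, using only the numbers quoted above:

```python
# Projected extra-point misses if the 33-yard rate (~97%) holds,
# applied to the 2014 PAT volume of 1,251 attempts.
attempts = 1251
old_misses = 8                     # actual 2014 misses from the old spot
new_miss_rate = 1 - 0.97
projected_misses = attempts * new_miss_rate  # ~38 league-wide

teams = 32
print(f"~{projected_misses:.0f} misses league-wide, "
      f"~{projected_misses / teams:.1f} per team per year")
```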
So if moving the ball back won’t have much effect, we should just keep kicking, right?
Over the past five seasons teams have gone for two 289 times and converted 139, which comes out to a 48% success rate. Assuming the 33-yard FG rate holds, a team averages .97 points per PAT (1 pt x 97%). At a 48% conversion rate (2 pts x 48%), going for two is worth an average of .96 points.
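The expected-value comparison is just two multiplications, using the 97% and 48% rates from this section:

```python
# Expected points per attempt under the new rule: a 33-yard PAT
# versus a two-point try, using the rates quoted above.
pat_rate = 0.97       # 33-yard kick success rate
two_pt_rate = 0.48    # two-point conversion rate, 2010-2014

ev_pat = 1 * pat_rate   # 0.97 points per PAT
ev_two = 2 * two_pt_rate  # 0.96 points per two-point try

print(f"PAT EV: {ev_pat:.2f}, two-point EV: {ev_two:.2f}")
```

The two expected values land within a hundredth of a point of each other, which is the whole punchline.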
The two options are surprisingly even in terms of value, considering that teams elect to kick the PAT after 97% of touchdowns. That near-even split makes it tough to give a clear recommendation. If the number of two-point attempts increases, the 48% success rate could rise or fall (especially against teams with a weak or strong goal-line defense), which would make going for two more or less attractive.
If teams, for example, could start converting two-point conversions at anything over a 50% clip, it would be worth going for it every time. Take a similar situation: there were 44 occurrences last season in which teams went for it on 4th-and-2, and they converted 26 (59%). If teams could keep that rate on two-point conversions, the tries would be worth an average of 1.18 points. That's worth about an extra seven points over the season (it's not much, but nobody would turn down an extra touchdown).
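The break-even point and the 4th-and-2 scenario above can be checked the same way. Only the figures already quoted in this section go in:

```python
# Break-even two-point rate against a 97% PAT, plus the value of
# converting at the 59% clip teams managed on 4th-and-2 last season.
pat_ev = 1 * 0.97
break_even = pat_ev / 2        # convert above this rate and going for two wins

ev_at_59 = 2 * (26 / 44)       # ~1.18 points per attempt at the 4th-and-2 rate
edge_per_td = ev_at_59 - pat_ev  # extra points per touchdown vs. kicking

print(f"break-even: {break_even:.1%}, "
      f"EV at 59%: {ev_at_59:.2f}, edge: {edge_per_td:.2f} pts/TD")
```

The break-even rate lands at 48.5%, which is why the current 48% clip leaves the two options in a dead heat, and why anything over 50% tips it decisively.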
As it stands now though, the numbers say there is virtually no advantage in either option. We’re talking about less than half a point per season. Moving the extra point attempt back may lead to teams going for two more often, which could change things, but if the conversion rates stay consistent, it won’t make any difference at all.
So whether your team lines up to go for two or just trots out the kicker as if nothing has changed: Don’t worry, they’re safe either way.
Gentlemen, we can rebuild him. We have the technology.
We have the capability to make the world’s first bionic umpire. K-Zone will be that umpire. Better than he was before. Better… stronger… faster… more accurate‑er.
And then we will ignore him. Because The Human Element™
When many televised ballgames go to or return from a commercial break, they will flash up each team’s runs, hits, and errors, but sometimes they’ll throw in a bonus number: The number of runners each team has left on base. The inclusion of that number seems to indicate that stranded runners are an important aspect of the game. But is this actually true?
It does not come up often (announcers will bemoan a team leaving the bases loaded), but nobody really talks about why leaving guys on is bad, presumably because it seems so obvious: failing to knock in runners costs the team runs. In other words, a high number of runners left on should mean fewer guys crossing the plate. Analysts don't tout teams for leaving another ten guys on tonight!
Here’s the thing: If we look at every MLB game from 2010-2014, we find that as the number of runners left on increases, so does the number of runs scored. Exactly the opposite of what everyone thinks!
The conventional thinking treats it as if each team has a finite number of baserunners each night, and the job is simply to knock them home. This is obviously not the case. A pitcher prevents a team from scoring runs by preventing runners from getting on base, so baserunners and runs clearly go hand in hand. There is a snowball effect at work: the more guys who get on base, the more are going to score.
But the number of runs scored comes down to the timing of the hits, not just the number of hits. The goal is obviously not to leave more guys on base; it's the other way around. The teams that score runs put more guys on base, so more of them end up stranded when the timely hit doesn't come, but at least those teams gave themselves chances to hit with runners on.
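The snowball effect is easy to see in a toy Monte Carlo. This is a sketch, not the article's 2010-2014 game data: it assumes a crude station-to-station model (every plate appearance is either an out or a single that moves each runner up exactly one base), and the two on-base probabilities are made-up stand-ins for a weak and a strong offense:

```python
import random

def sim_game(p_on_base, innings=9):
    """Toy station-to-station model: each plate appearance is either an
    out or a single that advances every runner exactly one base."""
    runs = lob = 0
    for _ in range(innings):
        bases = [False, False, False]       # 1st, 2nd, 3rd
        outs = 0
        while outs < 3:
            if random.random() < p_on_base:
                if bases[2]:
                    runs += 1               # runner on third scores
                bases = [True] + bases[:2]  # everyone moves up one base
            else:
                outs += 1
        lob += sum(bases)                   # runners stranded this inning
    return runs, lob

random.seed(42)
results = {}
for p in (0.28, 0.36):                      # assumed weak vs. strong OBP
    games = [sim_game(p) for _ in range(20000)]
    avg_runs = sum(r for r, _ in games) / len(games)
    avg_lob = sum(l for _, l in games) / len(games)
    results[p] = (avg_runs, avg_lob)
    print(f"OBP {p:.2f}: {avg_runs:.2f} runs, {avg_lob:.2f} LOB per game")
```

Even in this crude model, the better offense finishes with more runs *and* more runners left on base per game, which is exactly the pattern in the real box scores.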
So don’t be too concerned the next time your team leaves a dozen guys on; not that you will necessarily be thrilled, but more baserunners is better than fewer.