Baseball’s Unbreakable Records, In Graph Form

There was a great thread last week about which baseball records will never be broken. I took some of the top suggestions and put them into graph form.

Many of the graphs present the top five all-time leaders along with the active leader. This can make some records look closer to falling than they really are, but keep in mind that roughly 20,000 other guys did not even make the chart in the first place.

It would seem many of these records will indeed stand forever. One I wasn’t aware of before reading the thread takes the cake:

The use of pitchers has clearly changed over the years. A guy today hits 100 pitches and he is done. That obviously influenced the above chart, as it does this one:

A reduction in innings also means a reduction in strikeouts. CC Sabathia has already pitched 13 seasons, is 33 years old, and is not even halfway to Nolan Ryan’s strikeout record:

With the increasing emphasis on bullpens, pitcher wins also matter much less in today’s game. When guys were throwing complete games in 95% of their starts, though, they got a lot more decisions:

It also meant they got many more losses, which is why Cy Young’s loss record will probably never be broken either. Ironically, losing that many games doesn’t mean a pitcher was bad; he had to be good enough to hang around long enough to build up the loss tally:

Of course, even if Young’s loss record is someday broken, he will no doubt hold on to his wins record much longer:

Another change over the years has been the increase in runs scored. It is unlikely we will ever see another pitcher post a sub-1 ERA over a whole season. Current pitchers are not necessarily less dominant, though. If we look at ERA+, which compares each ERA to the league average of its year, many recent pitchers are in line with the historic seasons of Keefe and Leonard.

Perhaps it’s not a real record category, but Johnny Vander Meer’s feat of throwing two consecutive no-hitters will be tough to beat. Unlike most of the other records, some of which take decades to accomplish, this one only takes 18 innings. So I could potentially see someone tying it at some point, but three in a row? Good luck.

On to offense, we find DiMaggio’s famed 56-game hit streak. Unlike with pitchers, there has not been a drastic change in how offensive players are used. So it may take a long time, but this one isn’t insurmountable.

Ichiro arrived in the US at age 27 and stole a career-high 56 bags during his rookie year. Had he started in MLB at age 21 and stolen 56 every season until he turned 27, his current total would be one more than Tim Raines’s for fourth all-time.

Derek Jeter has played at least 145 games in 15 of his 19 seasons. If he continued to hit his career mark of .312 and didn’t miss another game, he would need five more years to break Rose’s record. It’s a safe bet this one will stand for a while.
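That five-year estimate is easy to sanity-check. Here is a back-of-the-envelope sketch; the hit totals and at-bat pace below are my own rough assumptions (Rose’s record is the official 4,256, but Jeter’s entering total and his at-bats per season are approximations for illustration):

```python
import math

# Back-of-the-envelope estimate (assumed, approximate figures).
rose_record = 4256        # Pete Rose's career hit record
jeter_hits = 3316         # Jeter's approximate career total entering 2014 (assumed)
career_avg = 0.312        # Jeter's career batting average
at_bats_per_season = 650  # assumes a full, healthy season at his usual pace

hits_per_season = career_avg * at_bats_per_season   # ~203 hits per year
hits_needed = rose_record + 1 - jeter_hits          # hits required to pass Rose
seasons_needed = math.ceil(hits_needed / hits_per_season)

print(seasons_needed)  # → 5
```

Under those optimistic assumptions, five full seasons is indeed the minimum, which is why the record looks safe.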

Another record that would take about five years to break under optimal conditions is Hank Aaron’s career mark for total bases. A-Rod is doubtful to reach it. Jeter won’t make it either. Albert Pujols, at 34, is about seven great years away.

One you don’t hear about every day is Chief Wilson’s record of 36 triples in a season, set in 1912. Only Curtis Granderson and Lance Johnson have topped 20 triples in a season in the past 20 years.

It would be a stretch for Pujols to play that long, but not so much for Cal Ripken, Jr., who as you know played every game for 16 seasons. Another cool fact someone mentioned: Japanese player Tomoaki Kanemoto did not miss an at-bat or an inning on defense for ten years.

Finally, we’ve got the Cubs’ drought of 105 years without a World Series. If Epstein can’t pull it off, it may take 200 years.

Are Soccer Goals Really That Big of a Deal?

Soccer fans celebrate scoring harder than fans of any other sport. A goal in hockey or a run in baseball will bring about a few high fives. Touchdowns are fun, but there is a big difference between the excitement of a 90-yard run and a 3-yard dump-off into the end zone. Unless the score is tight and the hour late, a layup might bring a smile to a basketball fan’s face. Might.

But you already knew all that. The more something happens, the less exciting it gets. That’s why NFL Sundays are so big: with 16 games to baseball’s 162, a win for a football team is worth ten for a baseball team.

But isn’t this GGGGGGGOOOOOOOOLLLLLLLLLL shenanigans just a little overkill? How much bigger is a goal in soccer than a touchdown in football?

Average points scored per team in each game:

Points Per Game

If we divide everything out so that a soccer goal is equal to 1 and then round everything off, we find this:

Adjusted Points

One goal in soccer is the equivalent of two goals in hockey, a three-run homer, two touchdowns and a field goal, or 35 slam dunks.
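A minimal sketch of that normalization, using per-game scoring averages I am assuming for illustration (the chart’s actual figures may differ slightly):

```python
# Assumed average points scored per team per game, for illustration only.
avg_score = {
    "soccer (goals)": 1.4,
    "hockey (goals)": 2.7,
    "baseball (runs)": 4.3,
    "football (points)": 22.5,
    "basketball (points)": 100.0,
}

# Scale everything so that one soccer goal equals 1, then round.
baseline = avg_score["soccer (goals)"]
adjusted = {sport: round(score / baseline, 1) for sport, score in avg_score.items()}

for sport, value in adjusted.items():
    print(f"{sport:<20} {value}")
```

With these assumed averages, a hockey goal comes out near 2, a run near 3, a football point total near 16 (two touchdowns and a field goal), and basketball points near 70 (about 35 dunks), matching the chart’s rough conclusions.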

Football and basketball, which award more than one point per score, are the most interesting. Our brains have difficulty understanding such differing values. For the same reason, you do not become ten times happier if your blog gets 100 hits as opposed to ten hits. If you did, a million hits would paralyze you with utter joy for a fortnight.

In football, even without the field goal, two quick touchdowns are indeed a very big deal and would have even the most passive fan pumping her fist.

The NBA is on a whole different level. Try to imagine some absurd shot in a basketball game that would award 70 points to your team. It’s almost incomprehensible because it would significantly, and I mean significantly, boost your team’s chances of winning the game. You would lose your mind if your team sunk that shot.

This obviously isn’t a perfect comparison. On occasion, a soccer or baseball team will score three or more times the average—four goals or 12 runs, respectively. No basketball team will ever reach that multiple. The record for points in an NBA game was 173, which was 2.1 times the league average in 1953.

Still, the numbers give us the general idea that soccer fans aren’t all that crazy after all.

No, Game Length Isn’t Hurting Attendance

“If the people don’t want to come out to the ballpark, nobody’s going to stop them.” - Yogi Berra

The Pirates and Cubs played a 16-inning game last week and few people who stayed until the end cared that it took almost six hours. Most would have stayed until the sun came up. But longer games are apparently the last thing some people want to see.

A “high-ranking executive” told Buster Olney on Monday he was so worried about games being too long that he proposed shortening them to seven innings. He backed the idea so strongly that he would not allow his name to be published in connection with it.

I would never support making games seven innings, but game length comes up every year, so maybe something does need to change. But before we get ahead of ourselves let’s actually look at the numbers: How long are games, really? And is attendance falling because of it?

The length record was set in 2000, when the average game lasted 178 minutes. It dropped off after that, but according to the guys holding the stopwatches, games have once again increased in length over the past decade. By a whopping ten minutes. Ten minutes, guys. That’s more than nine minutes! It does add up, though: over a whole season that’s 27 hours you could have spent doing whatever it is you do when you aren’t watching baseball… reading high-quality baseball blogs, perhaps.
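The season-long arithmetic is simple enough to check, assuming a full 162-game schedule:

```python
# Ten extra minutes per game, spread over a full 162-game season.
extra_minutes_per_game = 10
games_per_season = 162

extra_hours = extra_minutes_per_game * games_per_season / 60
print(extra_hours)  # → 27.0
```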

Surely this 5% increase in game length has driven fans past their threshold for how much baseball they can consume, right? Eh.

Attendance has clearly been more volatile over the past decade than game length, but it was higher last year than it was in 2004. The biggest drop came between two seasons in which there was virtually no change in game length. Besides, even if attendance were declining, it would take a lot more than a few graphs to pin it on game length.

Younger fans may want shorter games, but if a 2-hour, 50-minute game is fine when a 2-hour, 57-minute game is too long, then the problem isn’t baseball.

Sources: One and Two

This post was originally published at The McEffect.

Congratulations! Maybe.

A story from Alan Watts:

Once upon a time there was a Chinese farmer whose horse ran away. And all the neighbors came around to commiserate that evening, “So sorry to hear your horse has run away. That’s too bad.”

And he said, “Maybe.”

The next day the horse came back, bringing seven wild horses with it. And everybody came around in the evening and said, “Oh, isn’t that lucky? What a great turn of events! You’ve now got eight horses.”

And he said, “Maybe.”

The next day his son tried to ride one of the horses, and was thrown off and broke his leg. And they all said, “Oh dear, that’s too bad.”

And he said, “Maybe.”

The following day the conscription officers came around to force people into the army and they rejected his son because he had a broken leg. And all the people came around and said, “Isn’t that great!”

And he said, “Maybe.”

That is the attitude of not thinking in terms of gain or loss, advantage or disadvantage… so you never really know if something is fortune or misfortune.

Exploration of Evil

We join our young soldier in a hospital. A shell exploded near him two months ago, but his recovery is nearly complete, and he has been assigned to a non-combat position away from the front lines.

But the guilt of knowing his friends will remain in the heat of combat while he works behind a desk is unbearable. He sends a note to his commanding officer, asking to rejoin his company and fight once again. His officer grants the request, and he returns to the front lines. About a year later, a gas attack leaves him unable to speak or see. He is sent back to the hospital, and the war ends while he is recovering for the second time.

Similar plots have played out during every war. While few people volunteer to march into the horrors of battle, many injured soldiers do anything they can to get back to the most dangerous places in the world. They know they could die a painful death at any time—they have seen their friends suffer and die time and time again—but like our young soldier, there is no place they would rather be than beside their friends, fighting to end the war, even if it means sacrificing their own lives. People who act with such unselfishness are often called heroes.

This particular soldier, on the other hand, is not usually called a hero. Most people just call him Adolf Hitler.

Orcs Have Feelings Too

“The sad truth is that most evil is done by people who never make up their minds to be good or evil.”
- Hannah Arendt

The Lord of the Rings is a timeless tale of adventure that teaches many lessons worth learning. It also contains a huge lie. Very few characters in The Lord of the Rings are not clearly good or evil. Gandalf is good. Sauron is bad. Merry and Pippin can be stupid, but there is no question that they mean well. The orcs, with actions as ugly as their physical appearance, are clearly evil. The bad guys in J.R.R. Tolkien’s fantasy world exist for no other reason than to be killed by the good guys. It is a great story, but we often forget that real life does not work this way. In real life, orcs have feelings too.

Another fantasy story, the one that is the basis for the popular television show Game of Thrones, is a bit more realistic in terms of morality. Don’t worry, no spoilers here. George R.R. Martin tells each chapter of his novels from a different character’s perspective. The plots of the show and the books are similar, but when I read the rationale behind certain actions, the “bad guys” never seem so bad. Their actions have not changed, but knowing the motivations behind them makes a difference in whether these characters come across as good or evil!

I got to wondering whether knowing the reason behind actions really can affect whether something, or someone, is good or evil, and if it can, whether evil even exists in the first place. Surely it must. I mean, what about the horrible things Hitler did? Those are our questions. I will tell you up front that there are no definitive answers. But that does not mean we cannot learn something from looking, whether at fiction or reality. After all, we cannot prevent ourselves from being evil if we don’t know what it is in the first place.

What is Evil Anyway?

“There is a sum of evil equal to the sum of good, the continuing equilibrium of the world requires that there be as many good people as wicked people.”
- Marquis de Sade

Good could not exist without evil. The age-old question of why there is evil in the world appears nonsensical when you consider it in those terms. Nobody questions why there is good in the world; why, then, question the existence of evil? If everyone were good, “good” would cease to mean anything. We don’t call the sun “the yellow sun,” because there is no non-yellow sun to differentiate it from. Evil is the yang to good’s yin; either both exist or neither does.

In day-to-day life, we identify certain acts as good or evil based on a moral code. The word moral comes from the Latin for “proper behavior.” How we arrive at that moral code is a bit more complicated. This may not represent the entire world accurately, but most people I have encountered seem to be moral absolutists. That is, they feel morals come from a code of conduct, often handed down by a God, that determines whether actions are good or evil. In moral absolutism, the reasons and circumstances surrounding actions are meaningless. An absolutist would have to determine which code to use, and why it is superior to the other codes. That is not always straightforward, because people who claim their code comes from a list like the Ten Commandments rarely object to modern laws that are not included in the Commandments—like the one that outlaws slavery.

Absolutism is not the only option, though. A moral universalist is willing to take the circumstances of an action into account. Universalists say we can determine whether actions are good or evil by examining other cultures and finding where they agree. This is why most cultures consider killing a person evil, but if it happens during a war or in self-defense, fire away. Moral relativists, meanwhile, feel that different cultures can all have legitimate, if different, definitions of good and evil; some relativists hold that no one code is objectively more correct than any other. Under relativism, it is not fair to reverse the Golden Rule: just because you treat other people the way you want to be treated does not mean they have to treat you the way you want to be treated. Finally, we have moral nihilism, which posits that there is no such thing as good or evil.

Religion and morals often become intermixed, but as we see with the Ten Commandments, it is not easy to develop an unmoving set of rules to govern an ever-moving world. Many religious people are not absolutists, and I suspect many who think they are have simply never taken a step back to really consider where good and evil come from. Many countries have developed sensible laws without the guidance of any particular religion. The Westminster Dictionary of Christian Ethics states that religion and morality “are to be defined differently and have no definitional connections with each other.” Still, many people believe atheists, who obviously do not hold to codes passed down by God, are moral nihilists. I could not find any evidence for that claim, and Richard Dawkins seems to make a good point about morals in general:

“If I ask myself why I don’t steal, why I pay my taxes, why I do all the things that keep society going? I suppose it’s a slightly irrational feeling that I wouldn’t wish to live in the kind of society where people behave in the sort of ways that I wouldn’t wish them to behave in, and therefore I shouldn’t behave in those ways either.”

Few people, it would seem, object to The Golden Rule. It need not come from God himself, or even a religious text, to be a valid way to conduct one’s actions.

Each stance has strengths and weaknesses, which is why definitions of good and evil vary not only across time and place but from individual to individual. Everybody has a different ideal society, and we can only say things are good or bad insofar as they move the current world toward or away from that ideal. An editor at Reuters said the agency refuses to use the word “terrorist” in its articles because “one man’s terrorist is another man’s freedom fighter.” From the perspective of most people, 9/11 was a horrible event, but the guys responsible for it probably considered it a success. Arguments can be made for either case, but we can never arrive at an objectively correct answer, because my ideal world is not inherently superior to yours; if it were, this whole section wouldn’t be necessary. For the purposes of this piece, though, we’ll take a universalist’s standpoint and continue as if evil is indeed a real thing.

Perspective: How Fiction Can Teach Us About Good and Evil

“So what I told you was true, from a certain point of view… You’re going to find that many of the truths we cling to depend greatly on our own point of view.”
– Obi-Wan Kenobi

With exceptions like the Game of Thrones novels, we usually hear only one perspective throughout books and movies. One of my professors in college stressed that a reader should always be aware of who the narrator is for this very reason. It can make a major difference in what we take from the story. A story like “The Tell-Tale Heart” would be very different if it were told by someone other than the insane guy doing the killing (sorry if I ruined the ending for you). The events might be the same, but depending on who is telling the story, the information left out can be just as important as what is included.

Author Alain de Botton said, “Tragic art, as it developed in the theaters of ancient Greece, in the fifth century B.C., was essentially an art form devoted to tracing how people fail, and also according them a level of sympathy, which ordinary life would not necessarily accord them.” De Botton expands on this theme in his book, How Proust Can Change Your Life. Two examples from the book:

  • Tragic end for Verona lovebirds: after mistakenly thinking his sweetheart dead, a young man took his life. Having discovered the fate of her lover, the woman killed herself in turn.
  • A young mother threw herself under a train and died in Russia after domestic problems.

In other words, if you saw the Tell-Tale Heart guy on the Channel 2 news, you would hear a minute-long story about a deranged murderer who is clearly evil and probably deserves to be put in jail for life, at best. More at 11! When we read the story, though, hearing his perspective at least allows us to have a little sympathy for him as he descends into madness.

We read, as Proust did, stories like these in the newspaper or see them on the news every day without thinking much of them. Yet we are well aware there is a great deal more to the lives of Romeo, Juliet, and Anna Karenina than can fit into a sentence or two. In recognizing this, de Botton writes, we should realize there is nothing special about these particular characters; it is how their stories are presented to us that makes the difference. Given time and thought, any line in a newspaper could grow into a meaningful tale. We can reduce years to a single line or write thousands of pages about them.

As for our original question: can more backstory influence what we think of a person? It obviously can change how we perceive someone; whether that makes their actions less evil depends on how we believe morals are determined. The less we know about someone, the more we assume. If we already see someone as a bad guy, we are not sure what to make of things like the story of Hitler in the introduction. And on that note…

What About Hitler?

“Understanding is not excusing.”
– Philip Zimbardo

Inevitably, when you start saying things like “evil is not absolute; perspective can make a difference,” people will say: But what about Hitler? Surely he was a bad guy! To do anything other than denounce Hitler as a creature who ascended out of Hell ready to kill makes some people nervous. It also completely ignores the reality of the situation. This is not to say we must view Hitler as good. But without taking into account not just Germany but the entire world at the time, we judge Hitler based on his two-line newspaper bio—a failed art student goes on to become the leader of Germany; the largest genocide ever occurs under his watch—rather than seeing him as a real person who lived in the same world you and I do. We do so at our own peril.

Not the typical Hitler mental image.

Even many people who have studied him in depth have been unable to overcome the simple-minded view that, in a world of endless shades of grey, people are either black or white. One of the few who attempt to is R.H.S. Stolfi, who writes in his biography of Hitler:

“The great biographers present Hitler as incapable of calm and casual social conversation and observe that he preferred to engage in tirade and pontification… Interpreting Hitler as a hate-filled egomaniac, the biographers underestimate the man, misjudge the disruption of the times and prove incapable of overcoming elemental hatred for the subject of their biographies…”

A paper called “The 8 Stages of Genocide” was written in 1996, outlining what it takes for events like the Holocaust to take place. The first stage is dividing people into groups; the second is attaching symbols to those groups; the third is dehumanizing the people within a certain group. In these first three steps we can see not only what the Nazis were able to accomplish but, perhaps ironically, how many people today view Hitler and the Nazis. We paint a picture of the Nazis as so evil for blindly killing so many people, yet never hesitate to feel they deserve nothing more than to be lined up and shot themselves. Again, we do not have to consider the Holocaust any less horrible to step back and examine it from a different angle, but doing so is perhaps difficult because the history books tend to leave out a few things about the United States’ own past.

Eugenics: No Really, This Actually Happened

“A father shall immediately put to death a son recently born, who is a monster, or has a form different from that of members of the human race.”
- The Laws of the Twelve Tables, Rome

The beginning of the movie 300 includes a scene in which the community’s elders examine a newborn child for “defects.” Hollywood did not invent that. The scene was reality for every child in Sparta, and for many in Rome and Athens. The result of this widely accepted infanticide was a culture in which all citizens fit the mold of what the community thought they should be. The practice, though it would be considered barbaric by today’s standards, did not die out with the Romans.

After reading Charles Darwin’s On the Origin of Species, English polymath Sir Francis Galton began work on a theory he ultimately labeled eugenics. With Darwin having explained evolution and the passing of traits from one generation to the next, Galton felt people should define and then pursue the ideal society. If animals could be bred selectively to produce desired traits in their offspring, so could humans.

Galton wrote in 1869, “…to obtain by careful selection a permanent breed of dogs or horses gifted with peculiar powers of running, or of doing anything else, so it would be quite practicable to produce a highly-gifted race of men by judicious marriages during several consecutive generations.” This was not a particularly controversial movement. Supporters included George Bernard Shaw, H. G. Wells, Theodore Roosevelt, and Winston Churchill. Alexander Graham Bell wrote a few papers in which he explored the heredity of deafness and encouraged deaf couples not to reproduce.

Many people of the day, including Americans, readily applied the logic of livestock breeding to people. In 1908, the Louisiana State Fair held a ‘Better Baby Contest’ in which children were scored on a 1,000-point scale across various physical and mental categories. At the other end of life, though it was never done, some Americans supported the construction of gas chambers for euthanasia. Yes, you read that right.

As thinking on eugenics continued—today it is defined as “the theory and practice of improving the genetic quality of the human population”—it branched in two directions: positive and negative. Positive eugenics is what Galton pursued: selective breeding among humans based on which hereditary traits seemed favorable. Negative eugenics was the discouragement of reproduction by those with hereditary traits perceived as poor. The latter would grow from discouragement into the forced sterilization of those with less-desirable traits.

In 1907, Indiana became the first state to pass legislation permitting the forced sterilization of select individuals, with an act “to prevent procreation of confirmed criminals, idiots, imbeciles and rapists.” Over 30 states followed with similar laws. Some were challenged, but in 1927, in Buck v. Bell, the US Supreme Court ruled that compulsory sterilization of mental patients was constitutional. Justice Oliver Wendell Holmes, Jr. wrote:

“It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind.”

Over 60,000 Americans were forcibly sterilized between 1907 and 1963, with over one-third of the procedures performed in California. People in North Carolina could be forcibly sterilized for having an IQ under 70; the youngest was a ten-year-old girl. American eugenics advocates continually sought to expand the program in their own country. Upon hearing what the Nazis were doing, the superintendent of Virginia’s Western State Hospital told the local paper, “the Germans are beating us at our own game.”

As tragic as forced sterilization is, Hitler and the Nazis obviously took things further. Eugenics programs in the United States declined after WWII; had the Nazis’ actions not been so extreme, the programs may well have continued. The ruling in Buck v. Bell has never been officially overturned.

Milgram, Zimbardo, You, Me

“All evil begins with 15 volts.”
– Philip Zimbardo

The two most infamous psychology experiments were conducted by two guys who went to the same high school. Both give key insights into how people come to do evil things. First were the experiments of Stanley Milgram. I could tell you what he did, but it is better if you just listen to Philip Zimbardo explain it:


Milgram demonstrated that the thinking of “Well, Americans could never do that” was completely bogus. Create the right conditions and you can get practically anybody to do horrible things. If there was any doubt, Zimbardo’s own experiment eliminated it quickly. Here he is explaining his ‘Stanford Prison Experiment’ (warning: there is some rear nudity).


The only way to prevent ourselves from committing evil is to realize that everyone—even your grandma—is, under the right circumstances, capable of doing horrible things. Zimbardo’s research tells us that individuals are responsible for evil, but perhaps even more responsible are external factors, including the situation and the system. Zimbardo says that to prevent those circumstances from arising, there must be oversight: “This is what you will do. This is what you can’t do. We will do no harm. And if you do harm, here’s what’s going to happen: you’re going to get in trouble, you’re going to lose your job.”

Preventing Evil Comes From Understanding It

“Who will guard the guards themselves?”
– Juvenal

Bad acting and Jar-Jar Binkses aside, Star Wars, specifically the life of Anakin Skywalker, is classic American tragedy. Anakin chops up little kids and becomes the most evil person in the galaxy, but we will never hate him like some goofy villain on Batman, because we know everything Anakin did was for love. Watching Star Wars allows us to see Anakin’s life in full context, an opportunity we rarely have with real people. Taking a step back to realize there is more than meets the eye, rather than immediately judging the people we see on the news, is the only way not to become them. Here is Thomas Boswell on the modern-day bad guys in another field:


Boswell was talking about steroids in baseball, a controversial topic but clearly a less important one than the Nazis. His point, and Keats’s point, about negative capability holds true regardless of the subject, though. Dividing people into neat little groups is easy. But saying ‘Hitler was evil. Period. The end. Next topic.’ does not help us learn anything, nor does it prevent us from doing bad things ourselves. Dismissing him so easily, without taking the time to understand him as a person, increases the chances that we do something to hurt another.

After the First World War, Hitler never killed anyone himself, but he helped create the circumstances for others to do so on a scale never before seen. The Nazis who carried out the plans of the Holocaust—whether from Hitler or other leaders—were just as guilty. They were not a unique case, however. Milgram’s experiments confirmed his worst fears: Americans were just as capable of the same brutal behavior. We need to hold ourselves to a higher degree of responsibility than our common moral codes demand, and realize that not only our actions but the society we are building can lead to bad places.

Some people have an extremely high resistance (it would be interesting to see how Gandhi would have performed on the Milgram shock test), but nobody is purely good or evil. Our surroundings influence us, even when we do not realize it. The more we view our world through the lens of The Lord of the Rings and the less through Star Wars, the more we increase our chances of doing evil. If we see ourselves as not-orcs, we can do anything and still be ‘good people.’ That is a fiction.

Further sources

Replays Won’t Be Perfect, But They’ll Teach Us Something

They say money can’t buy happiness, but fans sure seemed glad on Monday that Major League Baseball spent $30 million on a new replay room this off-season. Whether that will last is another matter.

Both challenges at PNC Park on Opening Day went in the Pirates’ favor, but there are a few issues with the system that will take center stage sooner or later. Some might lead to changes in subsequent years, and depending on how things shake out, others could legitimize fans who think their hometown team is being shortchanged more often than not.

To Review or Not To Review?

As you may have heard, each team gets to challenge one play per game. If the play is overturned, you get another challenge. Also, from the 7th inning onward the umps can elect to review any play they want to, which will not be charged as a challenge to either team.
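The challenge bookkeeping described above can be sketched as a tiny state machine. This is my own simplification of the rules as the post describes them (one challenge per team, retained if the call is overturned, plus umpire-initiated reviews from the 7th inning on), not MLB’s official wording:

```python
# A simplified model of the 2014 replay-challenge rules described above.
class ReplayRules:
    def __init__(self):
        self.challenges = {"home": 1, "away": 1}  # one challenge per team

    def manager_challenge(self, team, overturned):
        """A manager challenges a call; a successful challenge is retained,
        a failed one is spent."""
        if self.challenges[team] == 0:
            return "denied: no challenges remaining"
        if not overturned:
            self.challenges[team] -= 1
        return "overturned" if overturned else "call stands"

    def crew_chief_review(self, inning):
        """From the 7th inning on, the umpires may initiate a review
        that is not charged to either team."""
        return inning >= 7

rules = ReplayRules()
print(rules.manager_challenge("away", overturned=True))   # → overturned
print(rules.manager_challenge("away", overturned=False))  # → call stands
print(rules.manager_challenge("away", overturned=False))  # → denied: no challenges remaining
print(rules.crew_chief_review(10))  # → True
```

The third call is exactly the failure mode discussed below: once the challenge is spent, a wrong call can only be fixed if the umpires choose to review it themselves.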

On Monday, Bryan Morris picked off Emilio Bonifacio at first base. The first-base umpire, Bob Davidson, blew the call. Since it was the 10th inning, the umps could have reviewed it themselves, but none of them seemed to consider the idea until Clint Hurdle came out to challenge it.

This raises a few questions. First, how close does a play have to be before the umps review it on their own? Had Hurdle stayed in the dugout would the umps have avoided the replay because they knew Hurdle had a challenge remaining? Or would they have been more likely to review the play if Hurdle had already used his challenges?

As it played out, the umps essentially forced Hurdle to use his challenge rather than looking at the play themselves. In this case he got it back, but had Hurdle blown a challenge earlier in the game, he could have argued until he was red in the face (which we know he excels at) and the umps could have refused to review the play—even though the call was wrong.

The scenario is almost certain to happen: the call will be wrong, the manager will be out of challenges, he’ll argue, the umps will refuse to review, the call will stay wrong, and the umps will be blasted on the post-game show. The irony for the umps is that it almost has to be that way. If they called up New York every time a manager came out to argue, it would defeat the whole point of limiting challenges in the first place.

But it gets worse. If you want to argue that managers should be more careful about when they use their challenges and that they deserve what they get even when the call was wrong, then what is the point of having replay at all?

Replays Are Now Stats

When NFL refs blow a call, there are so many of them on the field that it can be difficult to say who should shoulder the blame. But in baseball, everyone watching knows that Davidson blew the call (not to pick on him; he certainly won’t be the last). Because blame can be assigned so precisely, replays have become another stat.

At the end of the season, we will be able to add up which ump had the most calls overturned. Is there some punishment for that guy, beyond the public embarrassment of knowing he’s the worst at his job? We shouldn’t judge too quickly, though, as we won’t know for years whether some umps are consistently good (or bad) year after year or whether their quality fluctuates over time.

Like most people, I tend to accept that when you play over 1,400 innings in a season, some calls will go your way and others won’t. But now we’ll have numbers telling us which teams, or even players, had the most incorrect calls made against them. Depending on how things shake out, replay statistics could show us that things do in fact even out over time and we were right all along. Then again, we might find that the umps consistently favor some teams over others. How do you stop that? More replays.

No matter what they end up telling us, the replays and the numbers they produce will be worth looking at more than once.

This post was originally published at The McEffect.