What if the groove rule, that much-maligned restriction announced by golf's ruling bodies in 2008 in an effort to diminish the spin-producing capacity of certain iron and wedge groove designs, the rule that so many in the golf industry decried as an unnecessary, ineffectual and expensive burden on their business, the rule that was to everyday golfers at best confusing and at worst ignored, what if that rule actually was having the desired effect?
It just might be the case. The same statistics the U.S. Golf Association used to prove there was a disconnect between driving accuracy and money rank on the PGA Tour, the statistics that led to the decision to roll back groove performance, are showing a marked improvement this year.
To put things in specific perspective, the USGA's charts showed that the correlation between driving accuracy and money rank was around 0.5 in the 1980s, which, importantly, was very similar to the correlation numbers at that time for both putting and greens in regulation. The number for driving accuracy then dipped to around 0.25 for most of the 1990s and early 2000s, and from 2003-2006 it fell to basically zero. This set the stage for the argument that there was insufficient penalty for missing the fairway, and the belief was that once the new groove stipulations started penalizing control from the rough, players presumably would learn the value of hitting approach shots from the fairway. Theoretically, the thinking went, players who were more accurate would do better on tour, win more money, and ultimately the correlation between driving accuracy and money rank would improve.
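For readers curious what is actually being measured, the statistic behind those charts is a correlation coefficient between two season-long rankings. Here is a minimal sketch of the Pearson calculation using invented ranks for five hypothetical players, not actual tour data:

```python
# Sketch of the Pearson correlation coefficient, the statistic the
# USGA's charts track between driving-accuracy rank and money rank.
# The player ranks below are invented for illustration only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: each position pairs one player's driving-accuracy
# rank with his money rank (1 = best in both cases).
accuracy_rank = [1, 2, 3, 4, 5]
money_rank = [2, 1, 4, 3, 5]

# A value near +1 would mean accurate drivers dominate the money list;
# a value near 0 means accuracy tells you little about earnings.
print(round(pearson(accuracy_rank, money_rank), 2))  # → 0.8
```

A coefficient near zero, as the tour showed from 2003-2006, means knowing a player's driving-accuracy rank told you essentially nothing about his money rank.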
That didn't exactly happen immediately. While the correlation coefficient in 2009, the last year aggressive grooves were allowed, dipped below zero, it improved only slightly in the first year the new groove rule was in place and actually dipped below zero again in 2011.
This year, however, the correlation factor is 0.24, essentially three times higher than it was in 2010. More interesting, perhaps, is how that number compares to past history on the USGA's famous charts: it essentially matches the average correlation coefficient from 1992-2002.
Do you remember the Bomb and Gouge school of golf, the style of hitting it as far as possible with little regard for accuracy and then wedging it safely onto the green, thanks in no small part to the efficiency of aggressive grooves, the style that became popular in the late 1990s and early 2000s? It can be suggested that the groove rule was a response to Bomb and Gouge. In a way, the groove rule attacked the "Gouge" part of the equation. So it would also seem logical that if the groove rule were successful, it would cut in half the rapid decline of the correlation coefficient between driving accuracy and money rank.
The numbers from this year seem to suggest that's exactly what has happened. It's certainly possible that this year's numbers are an anomaly, a spike that is not indicative of where the real trend is headed, or they could be the result of some other factor, like course conditions, the type of player graduating from the Web.com Tour or even typical hole locations at tournaments. Nor is it a positive sign for those hoping the groove rule would fundamentally return the way golf is played to the 1980s that Rory McIlroy, No. 1 on the money list and the No. 1 player in the world, ranks 152nd in driving accuracy.
But the one number that fueled the early discussion about whether grooves needed to be changed is dramatically different this year. Perhaps it's time to wonder why, and what it might mean for rulemaking in the future.