A lot of buzz is out there about the NFL using renewable energy credits (RECs) to offset energy usage during today's Super Bowl. This is nothing new for the NFL; in fact, it has been using RECs for the past four Super Bowls. This year the NFL is teaming up with NextEra Energy to use RECs to match consumption not only during the big game but also throughout the preparation leading up to it, including last weekend's Pro Bowl. That got me thinking about my own Super Bowl watching habits and how much energy I'll be consuming. Like most Americans, I'll be attending a party with friends gathered around an energy-hungry HDTV, I'll be opening the fridge dozens of times to retrieve beers, and I'll probably order pizza from a kitchen that keeps an oven running all day. So while the NFL might be offsetting the energy consumed during the game, what about the rest of the energy consumed by millions of Americans like me, and does energy usage actually increase on the day of a big game?
Looking at historical load data from ERCOT, the grid operator for most of Texas, I was surprised by what I discovered. Using January and February of 2008 as a test, Super Bowl Sunday was actually the lowest-consumption Sunday of the two months in Texas. The load turned out to be nearly 10% lower on Super Bowl Sunday than the average load for the period.
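For anyone who wants to try this comparison themselves, here's a minimal sketch. The daily load totals below are made-up placeholder numbers, not real ERCOT data (the real hourly load archives are published by ERCOT and would need to be downloaded separately); only the comparison logic is the point.

```python
from datetime import date, timedelta

# Super Bowl XLII was played on February 3, 2008.
SUPER_BOWL_SUNDAY = date(2008, 2, 3)

def sundays(start, end):
    """Yield every Sunday between start and end, inclusive."""
    d = start
    while d.weekday() != 6:  # Monday=0 ... Sunday=6
        d += timedelta(days=1)
    while d <= end:
        yield d
        d += timedelta(days=7)

def pct_below_sunday_average(daily_load, game_day, start, end):
    """How far (in %) the game-day load sits below the mean Sunday load."""
    loads = [daily_load[d] for d in sundays(start, end)]
    mean = sum(loads) / len(loads)
    return 100 * (mean - daily_load[game_day]) / mean

# Made-up illustration: every Sunday draws 900,000 MWh except game day.
start, end = date(2008, 1, 1), date(2008, 2, 29)
loads = {d: 900_000 for d in sundays(start, end)}
loads[SUPER_BOWL_SUNDAY] = 810_000

print(round(pct_below_sunday_average(loads, SUPER_BOWL_SUNDAY, start, end), 1))
```

With real data you'd swap the hard-coded dictionary for daily totals summed from ERCOT's hourly load spreadsheets; everything else stays the same.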
So maybe the NFL deserves extra credit for its efforts. Not only is it using offsets for the energy the game consumes, it may also have a hand in lowering energy consumption for the entire US. Maybe that's because instead of all of us sitting around at our individual residences consuming energy, this is a Sunday when we gather together and concentrate our energy consumption at a friend's house. On average, 17 people attend a Super Bowl party. Banding 17 people together to watch one TV must save energy, right? Game on!