Photovoltaics and Grid Efficiency: Two Competing Goals

As public awareness of the dangers of climate change has grown over the past decade, there has been an increasing push to make our electric grid “greener.” One of the results of this push has been a huge increase in the amount of solar photovoltaics now supplying power to the grid. Indeed, solar PV has been one of the fastest growing renewable sources over the past few years, with module shipments growing 122% between 2009 and 2010 and 43% between 2010 and 2011 according to the EIA [1]. Driven by federal, state, and local incentives, many of these modules are being installed on the rooftops of homes and connected to the distribution grid, thereby reducing the amount of electricity that must be supplied by conventional sources like coal and natural gas.

Growth of PV Module Shipments (Source: EIA [1])

However, solar and other renewables are not the only game in town when it comes to reducing the carbon footprint of our electric grid. There has also been a drive towards greater efficiency in an effort to allow our existing generation capacity to meet higher demand. Here, I am not referring to the more visible efficiency efforts like promoting the latest lighting and HVAC technologies in new buildings and conservation among residential customers, though these are definitely important. Rather, I am talking about a behind-the-scenes effort to increase the efficiency of the grid itself. This is done through a practice known as conservation voltage reduction (CVR). This practice is still being studied by a number of utilities, but has been estimated to reduce energy consumption by 2-4% [2]. While that may not sound like much, consider that the US consumed 3.75 PWh (that’s petawatt-hours) of electricity last year [3]. A 2% reduction would account for 75 TWh, or roughly the total electricity consumption of the state of New Jersey in 2011 [4]! But, there may be trouble on the horizon as the ever-increasing number of PV panels connected to the grid threatens to disrupt CVR schemes.
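
If you want to sanity-check that back-of-the-envelope number yourself, the arithmetic fits in a few lines of Python (the consumption figure and the 2% savings fraction are simply the numbers cited above, not new data):

```python
# Rough check of the CVR savings estimate quoted above.
us_consumption_pwh = 3.75      # annual US electricity consumption, PWh [3]
cvr_savings_fraction = 0.02    # low end of the 2-4% CVR estimate [2]

savings_twh = us_consumption_pwh * 1000 * cvr_savings_fraction  # PWh -> TWh
print(f"Estimated annual savings: {savings_twh:.0f} TWh")
# -> 75 TWh, roughly New Jersey's total 2011 consumption [4]
```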

The idea behind CVR is fairly straightforward: reduce the power consumption of load devices by reducing the voltage supplied to them. When an appliance or light is plugged into a wall outlet, it is supplied with a voltage that causes an electrical current to flow. The amount of current that flows depends on the electrical impedance of the appliance. For a DC circuit, the power drawn is equal to the product of the current and the voltage (P = I*V). For an AC circuit, the story is a bit more complicated, but similar enough that the above definition works for this explanation. So, by reducing the voltage level slightly, the current to most devices is also reduced (because the impedance of the device has not changed). As a result, the device draws less power and energy is conserved over the long term. This is all done without affecting the operation of the appliance. Though the voltage of most outlets in the US is nominally 120 V, it is allowed to vary by ±6 V. So, your wall outlet voltage might be as high as 126 V or as low as 114 V, and your appliances have to be capable of functioning at any voltage within this range (known as ANSI Range A) [5]. What the CVR scheme does is keep the voltage within the lower half of this acceptable range, while still staying above the 114 V lower limit. That way, your appliances function normally while consuming less power.
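
To make that concrete, here is a small Python sketch of the simplest case: a purely resistive, constant-impedance load. The 144-ohm appliance is an invented example (roughly a 100 W resistive load at 120 V); real loads like motors and switching power supplies deviate from this ideal V²/Z behavior, which is part of why measured CVR savings are in the low single digits rather than the 6-7% this toy case suggests.

```python
# Toy illustration of CVR on a purely resistive (constant-impedance) load.
# Z = 144 ohms is an assumed value: about a 100 W load at the nominal 120 V.

def power_watts(volts, impedance_ohms):
    """P = V^2 / Z for a simple resistive load."""
    return volts ** 2 / impedance_ohms

Z = 144.0
p_nominal = power_watts(120.0, Z)   # nominal outlet voltage
p_reduced = power_watts(116.0, Z)   # a CVR setpoint, still above the 114 V floor

print(f"At 120 V: {p_nominal:.1f} W")   # ~100 W
print(f"At 116 V: {p_reduced:.1f} W")   # ~93.4 W
print(f"Reduction: {(1 - p_reduced / p_nominal) * 100:.1f}%")  # ~6.6%
```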

So, why is solar bad for conservation voltage reduction? Most solar systems in the US are grid-tied, meaning they feed the power they produce back into the grid rather than powering your home directly. From the utility's point of view, the PV panels act like new, small generators supplying power locally. Since more power is being supplied near the load, less has to be supplied by the utility. This is great from an environmental standpoint. However, it also has the side effect of raising the voltage level along the line [6]. While this would not normally be a problem (as long as the voltage stayed within the ANSI range), it interferes with CVR schemes: the utility deliberately designs the system to keep the voltage low along the line, but the PV panels raise it in a way that was not planned for. And, because solar is an intermittent source, it is difficult for the utility to predict when it will have to compensate for the PV-driven voltage rise to keep the line voltage low.
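
To get a feel for the size of this effect, here is a rough sketch using the common ΔV ≈ (P·R + Q·X)/V approximation for the voltage rise at a point where power is injected into a feeder. Every number in it is invented for illustration; this is not data from any real utility or from the studies cited here.

```python
# Back-of-envelope voltage rise from PV injection, using dV ~ (P*R + Q*X) / V.
# All values below are assumed, illustrative numbers.

p_pv = 500e3        # W, aggregate PV power injected along the feeder (assumed)
q_pv = 0.0          # var, assuming inverters run at unity power factor
r_line = 0.5        # ohms, line resistance back to the substation (assumed)
x_line = 1.0        # ohms, line reactance over the same stretch (assumed)
v_nominal = 7200.0  # V, line-to-neutral voltage on a 12.47 kV feeder

delta_v = (p_pv * r_line + q_pv * x_line) / v_nominal
rise_pct = delta_v / v_nominal * 100

print(f"Voltage rise: {delta_v:.1f} V ({rise_pct:.2f}% of nominal)")
print(f"On a 120 V base: {rise_pct / 100 * 120:.2f} V")
# ~0.5% here, i.e. roughly half a volt at the outlet -- small on its own, but it
# eats directly into the margin a CVR scheme holds above the 114 V lower limit.
```

The exact figure matters less than the direction: every volt of unplanned rise is a volt the utility can no longer shave off through CVR.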

Line voltage and ANSI Range A limits (Source: Rocky Mountain Power [5])

All this being said, there is some good news here. Currently, the amount of power being generated by residential PV panels is not enough to significantly affect the line voltage or any potential CVR schemes. Studies are being done to determine how much power must be supplied by photovoltaics before serious problems arise. However, utilities will need to be careful in planning their efficiency efforts going forward, as the penetration of photovoltaics is likely to continue growing in the coming years.

[1] http://www.eia.gov/renewable/annual/solar_photo/

[2] https://www.dom.com/business/dominion-voltage/edge-program.jsp

[3] http://www.eia.gov/electricity/annual/html/epa_01_02.html

[4] http://www.eia.gov/electricity/data/state/

[5] http://psc.state.wy.us/pscdocs/dwnload/CVR%20Presentation%204%2012%202012%20(1).pdf

[6] http://www1.eere.energy.gov/solar/pdfs/42298.pdf
