Optimizing Array Voltage for Battery-Based Systems
Inside this Article
Off-grid system designers have significant experience working with low- and medium-capacity...
We introduce electricians and integrators who are new to battery-based grid-tied PV installations to...
Reduce up-front system cost and optimize long-term performance by avoiding the pitfalls of string...
Before the advent of modern maximum power point tracking (MPPT) photovoltaic controllers, configuring a PV array for a given battery bank’s nominal voltage was a fairly simple exercise. Older series-type pulse-width-modulation charge controllers had a simple 1:1 relationship with the battery bank. For example, a 12 Vdc-nominal module operating at approximately 17 Vmp could be used to charge a 12 Vdc-nominal flooded-cell lead acid (FLA) battery, which typically requires about 14.4 V during the absorption cycle and about 15 V for equalization charging. Because of this simple relationship, array voltage received minimal attention during the design stage, other than correctly sizing the PV-to-controller wiring to minimize voltage drop. Similarly, calculating maximum circuit current to specify the PV controller and overcurrent protection devices was fairly straightforward.
With today’s more advanced charge controller technology, MPPT controllers are the industry standard. They offer significant advantages over non-MPPT controllers, including optimized energy harvest and the option to configure the array at higher voltages than the nominal battery voltage. Most MPPT controllers have dc-to-dc voltage step-down functionality, which allows you to use a high-voltage array for low-voltage battery charging. The benefits of higher array voltages include lower array currents, reduced power loss in the home-run wiring, reduced conduit size and cost, and the option to locate the array farther from the battery bank to minimize any potential shading issues. Installers also prefer smaller-gauge conductors because they are easier to work with.
By nature of their operation, MPPT controllers deliver improved system performance and can reduce copper costs when compared to non-MPPT controllers. However, achieving that optimal performance introduces greater system design complexity. Factors that must be considered during the design phase include battery charging voltage requirements, the use of high-voltage arrays, the evolving requirements of the National Electrical Code, the impact of a site’s annual ambient temperature range on array voltage, and the characteristics of the modules specified.
At a battery reference temperature of 25°C (77°F), a 48 Vdc-nominal FLA battery bank typically requires net charging voltages of approximately 59 Vdc for the absorption stage and 62 Vdc for the equalization stage. MPPT controllers typically have temperature-adjusted operational limits between 140 Vdc and 145 Vdc, with absolute voltage limits of 150 V. In addition, most readily available circuit breakers used in dc applications carry a 150 Vdc rating. At first glance, the 91-volt span between 59 Vdc and 150 Vdc may seem fairly large, but operational and NEC considerations quickly shrink that range to a fairly narrow “sweet spot.” Consider, for instance, a relatively extreme example where the array will be operating in an environment that is very hot in the summer and very cold in the winter.
A key operational trait of the dc-to-dc buck-type converter used in many MPPT controllers is that the output voltage is always lower than the input voltage—a 2 Vdc drop is common. Combine this loss with another 2 Vdc drop in the conductors between the array and the battery, and the array’s operational maximum power point voltage must always be about 4 V higher than a battery bank’s target charging voltages.
In addition, a module’s maximum power voltage (Vmp) at STC is based on an illuminated cell temperature of 25°C in a laboratory environment. Because PV cells and modules frequently operate at approximately 25°C to 35°C above ambient temperature, a PV module really cannot be expected to produce full-rated power unless the ambient temperature is approximately -10°C. A module’s temperature-related power variance is manifested primarily as a change in output voltage. For crystalline modules, a typical open-circuit voltage temperature coefficient is -0.35%/°C; the module’s voltage drops as the cell temperature increases. In many locations, it is not uncommon for cell temperature to reach 65°C or higher at mid-day in the summer (30°C to 35°C ambient plus 30°C to 35°C cell temperature rise above ambient). In this case, the actual cell temperature will be about 40°C above the 25°C STC temperature. With these assumptions, the resulting voltage drop percentage due to elevated cell temperature is: 40°C x -0.35%/°C = -14%.
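The derating arithmetic above can be sketched as a small Python helper. The coefficient and temperatures are this example's assumptions (always use the values from your module's datasheet and your site's climate data), not fixed constants:

```python
def voltage_derate_pct(cell_temp_c, coeff_pct_per_c=-0.35, stc_temp_c=25.0):
    """Percent change in module voltage caused by cell temperature
    deviating from the 25 deg C STC reference."""
    return (cell_temp_c - stc_temp_c) * coeff_pct_per_c

# Summer worst case from the text: 30-35 C ambient plus a ~30-35 C
# cell temperature rise yields roughly a 65 C cell temperature.
print(round(voltage_derate_pct(65), 1))  # -14.0 (percent)
```

For a cold-climate winter check, the same function with a low cell temperature returns a positive percentage, which is what pushes the array toward the controller's upper voltage limit.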
In this example, the array’s operational voltage is about 86% of the STC specifications (100% – 14%). Put another way, the array’s STC Vmp specification needs to be about 116% of the estimated minimum operational voltage required. With this percentage in hand, you can now specify the array’s minimum Vmp at STC: (59 V (minimum target battery voltage) + 4 V (loss in CC and wiring)) x 116% = 73 V.
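Putting the pieces together, the minimum array Vmp at STC follows directly. A minimal sketch, using this article's example values (59 V absorption target, 4 V of converter and wiring losses, and the 86% hot-weather factor derived above); dividing by 0.86 is equivalent to the article's multiply-by-116% shorthand after rounding:

```python
def min_stc_vmp(target_batt_v=59.0, loss_v=4.0, hot_derate=0.86):
    """Minimum STC Vmp so the hot-weather operating voltage still
    covers the battery charging target plus converter/wiring losses."""
    return (target_batt_v + loss_v) / hot_derate

print(round(min_stc_vmp(), 1))  # 73.3 (V), ~73 V as in the text
```

Substituting the 62 V equalization target for the 59 V absorption target raises the minimum accordingly, so size against whichever charging stage demands the highest voltage.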