The PV I-V curves (current versus voltage at different temperatures and irradiances, and under standard test conditions) can't be entered in the PV inputs. As far as I can tell from the program, the derating factor is the alternative. But how can the user estimate the derating factor from the I-V curves?
HOMER assumes the PV array is outfitted with a maximum power point tracker (MPPT), in which case the output of the array is effectively linear with incident solar radiation, regardless of the DC bus voltage. The PV array's I-V (current versus voltage) curve is therefore not relevant.
In the absence of an MPPT, the I-V curve is relevant, but HOMER will not model the system properly. There is no straightforward way to calculate the derating factor from the I-V curve. Without an MPPT, the PV array may be exposed to a fluctuating voltage that is at times very far from the maximum power point and at other times fairly close to it. My guess is that in a PV-battery system without an MPPT, if the DC bus voltage is well matched to the PV array, the appropriate derating factor might be between 50% and 70%; if the DC voltage is not well matched to the PV array, the appropriate derating factor could be considerably lower.
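Under the MPPT assumption, the treatment above reduces to a simple linear model: rated power times derating factor times the irradiance ratio. A minimal sketch (the function name and the specific derating values are illustrative assumptions, not HOMER's internals):

```python
# Sketch of the linear PV model implied by the MPPT assumption:
# output scales with incident irradiance, scaled by a derating factor.
# The array's I-V curve never enters the calculation.
# Values below are illustrative, not taken from HOMER.

def pv_output_kw(rated_kw, derating_factor, irradiance_kw_m2, irradiance_stc=1.0):
    """Linear PV model: rated power x derating x (G / G_STC)."""
    return rated_kw * derating_factor * (irradiance_kw_m2 / irradiance_stc)

# A 4 kW array at 0.75 kW/m^2, with an MPPT (derating ~0.80) versus
# a well-matched system without one (guessed derating ~0.60):
with_mppt = pv_output_kw(4.0, 0.80, 0.75)     # 2.4 kW
without_mppt = pv_output_kw(4.0, 0.60, 0.75)  # 1.8 kW
```

The derating factor is the only knob: everything the I-V curve would normally capture (voltage mismatch, wiring losses, soiling) gets lumped into that one multiplier.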
To roughly account for reduced output in warm climates, there are two choices:
- HOMER includes the option to model temperature explicitly, which derates the PV output based on the cell temperature.
- If a rougher estimate is sufficient, you can simply reduce the derating factor to 80% or 70% (and not model temperature), but that factor applies throughout the year, so it does not capture any seasonal effect.
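The trade-off between the two options can be sketched as follows. This is an illustrative comparison, not HOMER's code; the temperature coefficient and the cell temperatures are assumed typical values:

```python
# Contrast an explicit temperature derate (option 1) with a flat
# year-round derating factor (option 2). All values are assumptions
# chosen for illustration, not taken from HOMER.

ALPHA_P = -0.005   # assumed power temperature coefficient, per degC
T_STC = 25.0       # cell temperature at standard test conditions, degC

def temp_derated_output(rated_kw, cell_temp_c, base_derating=0.9):
    """Option 1: derate for temperature explicitly, hour by hour."""
    return rated_kw * base_derating * (1.0 + ALPHA_P * (cell_temp_c - T_STC))

def flat_derated_output(rated_kw, derating=0.75):
    """Option 2: one reduced derating factor, applied all year."""
    return rated_kw * derating

# A hot summer hour (cell temp 55 degC) vs a mild winter hour (25 degC):
summer = temp_derated_output(4.0, 55.0)  # 4 * 0.9 * 0.85 = 3.06 kW
winter = temp_derated_output(4.0, 25.0)  # 4 * 0.9 * 1.00 = 3.6 kW
flat = flat_derated_output(4.0)          # 3.0 kW, summer and winter alike
```

Option 1 captures the seasonal swing (output drops only when the cells run hot); option 2 pays the same penalty year-round, which understates winter output and may overstate summer output.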