
David P. Larson
Mem. ASME
Department of Mechanical and Aerospace Engineering,
Jacobs School of Engineering,
University of California, San Diego,
La Jolla, CA 92093-0411
e-mail: [email protected]

Carlos F. M. Coimbra¹
Professor
Mem. ASME
Department of Mechanical and Aerospace Engineering,
Jacobs School of Engineering,
University of California, San Diego,
La Jolla, CA 92093-0411
e-mail: [email protected]

Direct Power Output Forecasts From Remote Sensing Image Processing

A direct methodology for intra-day forecasts (1–6 h ahead) of power output (PO) from photovoltaic (PV) solar plants is proposed. The forecasting methodology uses publicly available images from geosynchronous satellites to predict PO directly without resorting to intermediate irradiance (resource) forecasting. Forecasts are evaluated using four years (January 2012–December 2015) of hourly PO data from two nontracking, 1 MWp PV plants in California. For both sites, the proposed methodology achieves forecasting skills ranging from 24% to 69% relative to reference persistence model results, with root-mean-square error (RMSE) values ranging from 90 to 136 kW across the studied horizons. Additionally, we consider the performance of the proposed methodology when applied to imagery from the next generation of geosynchronous satellites, e.g., Himawari-8 and the geostationary operational environmental satellite (GOES-R). [DOI: 10.1115/1.4038983]

1 Introduction

Solar forecasting is one of the enabling technologies for integration of intermittent renewable resources into the electric grid. Research on the topic is typically categorized by forecast horizon: intra-hour (<1 h ahead), intra-day (≥1 h ahead), and day-ahead (1–3 days ahead). Intra-hour forecasting generally relies on local telemetry and ground-based sky imagery (see, e.g., Refs. [1–4]), while day-ahead forecasts are usually based on numerical weather prediction models and model output statistics (see, e.g., Refs. [5–10]). For the interval ranging from 1 to 6 h ahead, pure [11] or hybrid [12] remote sensing methods are competitive. The interplay of several forecasting techniques for solar energy integration is reviewed in Refs. [13–15].

1.1 Prior Work. A majority of previous solar forecasting research has focused on the solar resource rather than power output (PO). In particular, the existing literature is strongly biased toward forecasts of global horizontal irradiance (GHI), the component of solar and atmospheric radiation that is the most relevant for nontracking photovoltaic (PV) power systems. There are fewer studies on predicting direct normal irradiance [4,16–18], and even fewer on the prediction of PO of operational solar power plants [6–8,19–22].

Relevant intra-day forecasting studies in the past have focused on satellite-to-irradiance models coupled with cloud motion vector (CMV) fields. Satellite-to-irradiance modeling is a mature, but still active area of research (see, e.g., Refs. [22–25]). Perez et al. [11] predicted hourly GHI for 1–6 h ahead at SURFRAD stations throughout the continental United States, while Coimbra and coworkers [12] developed a hybrid forecasting methodology based on CMV fields and artificial neural networks (ANNs) to predict GHI in Merced, CA, and Davis, CA, over horizons of 30–120 min ahead. Nonnenmacher et al. [16] forecasted GHI 1–3 h ahead in San Diego, CA, using streamlines determined from CMV fields. Zagouras et al. [26] investigated the use of lagged exogenous variables and spatiotemporal correlations to improve 1–3 h ahead GHI forecasts at seven CIMIS sites in California. Most recently, Mozorra Aguiar et al. [27] used ANNs to forecast GHI 1–6 h ahead for sites in the Canary Islands, off the northwest coast of Africa.

For a recent review of PO forecasting of solar power plants, readers are referred to Ref. [15]. In this work, we highlight the PO forecasting literature focused on sites in California and the surrounding southwestern United States, the region of interest in this paper. Chu et al. [28] evaluated intra-hour forecasts of PO of a 48 MWp PV plant near Las Vegas, NV, while Lipperheide et al. [29] forecasted the PO of the same 48 MWp plant, but focused on horizons of less than 180 s ahead. Lonij et al. [30] used a network of residential PV systems in Tucson, AZ, to forecast PO with horizons of 15–90 min. Pedro and Coimbra [19] evaluated 1–2 h ahead PO forecasts, generated without exogenous inputs, for a 1 MWp PV farm in Merced, CA. Larson et al. [7] recently forecasted day-ahead PO of two 1 MWp PV plants in San Diego County, CA.

1.2 Contributions. The goal of this work is to accurately and directly predict PO of PV power plants 1–6 h ahead. Rather than forecasting the solar resource and then developing a secondary resource-to-power (RTP) model, we focus on directly predicting the PO from the input variables. Removing the need for the RTP model minimizes data cross-dependencies and simplifies the implementation of the method by utilities and solar power plant operational managers. The proposed methodology only requires historical PO and satellite imagery for training, whereas other methods may require meteorological data streams (humidity, temperature, etc.) as well as diagnostic information related to power plant operations (e.g., PV panel temperature) for the RTP model, in addition to the inputs for the solar resource forecast. Such data streams may not be available due to (1) costs of instrument installation and maintenance, or (2) information security requirements at the power plant.

After training, the proposed methodology does not depend on ground telemetry or PO data to generate forecasts, i.e., the subsequent forecasts are not dependent on real-time data from the power plant. This means our forecast models are independent of a data connection with the power plant, and therefore are not susceptible to failures due to sensor malfunctions or data transfer latency. While remote sensing is not fully dependable at all times, the reliability of remote sensing data transfer is typically much higher than that of local ground telemetry. Also, the two solar power plants used for testing and validation in this work are typical of hundreds of installations in California.

2 Forecast Methodology

We seek to develop a learnable mapping $f(\cdot)$ between a set of real-valued satellite-derived features $x^{(i)} \in \mathbb{R}^n$ at the current time $t$ and ground conditions $y^{(i)} \in \mathbb{R}$ at some future time $t + \Delta t$.

¹Corresponding author.
Contributed by the Solar Energy Division of ASME for publication in the Journal of Solar Energy Engineering: Including Wind Energy and Building Energy Conservation. Manuscript received June 5, 2017; final manuscript received December 14, 2017; published online February 20, 2018. Assoc. Editor: Geoffrey T. Klise.


More specifically, we are interested in finding $f : \mathbb{R}^n \mapsto \mathbb{R}$ for intra-day forecasting of PO of PV solar farms (see Fig. 1). To accomplish these goals, we consider a linear model based on least squares (LS) and a nonlinear model based on support vector regression (SVR).

Our model selections are influenced primarily by two goals. First, both researchers and nonresearchers, e.g., power plant operators, should be able to implement the models. Least squares and support vector regression are standard regression methods, and are supported by most modern scientific programming environments, e.g., MATLAB, PYTHON, and R. This increases the likelihood that our forecast methodology will be adopted and that our results can be reproduced. Second, there should be prior literature supporting the use of the models for solar forecasting tasks. Indeed, both methods have been used successfully in the published solar forecasting literature [7,31–33].

2.1 Least Squares. Least squares is a common linear regression method and can be solved efficiently. We use a regularized version of LS known as ridge regression, which solves the objective

$$\text{minimize} \quad \sum_{i=1}^{m} \left( f\!\left(x^{(i)}; \theta, b\right) - y^{(i)} \right)^2 + \alpha \lVert \theta \rVert^2 \qquad (1)$$

where $\{\theta \in \mathbb{R}^n,\ b \in \mathbb{R}\}$ are the model parameters, and $f(x^{(i)}; \theta, b) = \theta^T x^{(i)} + b$ is the forecast function. The $\ell_2$ regularization term ($\alpha \lVert \theta \rVert^2$) is used to discourage overfitting, where $\alpha$ is the regularization constant.
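As an illustration only, the ridge regression step can be sketched in Python with scikit-learn; the array shapes and the value of the regularization constant alpha below are assumptions for demonstration, not the settings used in the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Illustrative shapes: m training hours, n = w*w satellite-derived features
# plus one clear-sky PO feature per sample (assumed layout; see Sec. 4).
m, n = 4000, 64 * 64 + 1
rng = np.random.default_rng(0)
X_train = rng.normal(size=(m, n))        # stand-in for normalized image features
y_train = rng.uniform(0, 1000, size=m)   # stand-in for measured PO in kW

# Ridge regression solves Eq. (1); `alpha` is the l2 regularization constant.
ls_model = Ridge(alpha=1.0)
ls_model.fit(X_train, y_train)

y_hat = ls_model.predict(X_train[:5])    # example PO forecasts (kW)
print(y_hat)
```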

2.2 Support Vector Regression. Support vector regression is a form of support vector machines used for nonlinear regression tasks. In this study, we use ν-SVR [34,35], as implemented in LIBSVM [36]. The objective function is

$$
\begin{aligned}
\text{minimize} \quad & \frac{1}{2}\lVert \theta \rVert_2^2 + C\left( \nu\epsilon + \frac{1}{m}\sum_{i=1}^{m}\left( \xi_i + \xi_i^* \right) \right) \\
\text{subject to} \quad & f\!\left(x^{(i)}; \theta, b\right) - y^{(i)} \le \epsilon + \xi_i \\
& y^{(i)} - f\!\left(x^{(i)}; \theta, b\right) \le \epsilon + \xi_i^* \\
& \xi_i,\ \xi_i^* \ge 0, \quad i = 1, \dots, m \\
& \epsilon \ge 0
\end{aligned}
\qquad (2)
$$

where $\{\theta \in \mathbb{R}^n,\ b \in \mathbb{R},\ \xi \in \mathbb{R}^m,\ \xi^* \in \mathbb{R}^m,\ \epsilon \in \mathbb{R}\}$ are the model parameters, $f(x^{(i)}; \theta, b) = \theta^T \phi(x^{(i)}) + b$ is the forecast function, and $\phi : \mathbb{R}^n \mapsto \mathbb{R}^{n'}$ maps the input $x^{(i)}$ to a higher dimensional feature space. The two remaining variables, $\nu \in (0, 1]$ and $C \in \mathbb{R}$, are hyperparameters. To improve computational performance, we use the radial basis function kernel: $K(x, x') = \exp\!\left(-\gamma \lVert x - x' \rVert_2^2\right)$.
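A corresponding nonlinear model can be sketched with scikit-learn's NuSVR, which wraps the LIBSVM ν-SVR solver and exposes the ν, C, and RBF γ hyperparameters of Eq. (2). The synthetic data and hyperparameter values are placeholders, not the cross-validated values reported later in the paper.

```python
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 64 * 64))   # stand-in for normalized image features
y_train = rng.uniform(0, 1000, size=300)    # stand-in for measured PO in kW

# nu, C, and gamma play the roles of the hyperparameters in Eq. (2) and the
# RBF kernel K(x, x') = exp(-gamma * ||x - x'||^2); values are placeholders.
svr_model = NuSVR(nu=0.5, C=10.0, kernel="rbf", gamma="scale")
svr_model.fit(X_train, y_train)
y_hat = svr_model.predict(X_train[:5])      # example PO predictions (kW)
```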

2.3 Smart Persistence. Following prior literature, we include a smart persistence model as a benchmark for evaluating our forecast methodology:

$$\hat{y}(t + \Delta t) = \frac{y(t)}{y_{\mathrm{clear}}(t)}\, y_{\mathrm{clear}}(t + \Delta t) \qquad (3)$$

where $y(t)$ is the target value at time $t$, $\Delta t$ is the forecast horizon, e.g., 1 h, $\hat{y}(t + \Delta t)$ is the forecasted value, and $y_{\mathrm{clear}}$ is the clear sky value. When the target ($y$) is GHI, the fraction in Eq. (3) is commonly referred to as the clear sky index ($k_t$). We use the air mass-independent model described in Refs. [37–39] to estimate clear sky GHI. For clear sky PO, we used approximately 160 manually identified clear days, covering all four seasons, to fit a linear mapping between clear sky GHI and PO for each PV plant. As discussed in Ref. [7], the assumption of a linear relationship between GHI and PO is valid for the two sites.
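The smart persistence benchmark of Eq. (3) reduces to a few lines. In the sketch below, po and po_clear are assumed to be hourly pandas Series of measured and clear sky PO on a regular index, and the clipping of the denominator near sunrise is our own guard rather than part of the paper's method.

```python
import pandas as pd

def smart_persistence(po: pd.Series, po_clear: pd.Series, horizon_hours: int) -> pd.Series:
    """Eq. (3), indexed by the forecast valid time t + dt."""
    # Clear-sky index analogue for PO; clip guards against division by ~0 near sunrise.
    k = po / po_clear.clip(lower=1e-6)
    # Value valid at time s uses k from s - horizon and the clear-sky PO at s.
    return k.shift(horizon_hours) * po_clear
```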

3 Data

We analyze our forecast methodology using backward-averaged hourly AC PO from two PV plants in Southern California: Canyon Crest Academy (CCA) in San Diego and La Costa Canyon (LCC) in Carlsbad. Both plants have nontracking PV panels at a fixed 5 deg incline, with AC nameplate capacities of 1 MWp each. Additionally, both CCA (32.959 deg, -117.190 deg) and LCC (33.074 deg, -117.230 deg) are less than 10 km from the Pacific Ocean, with temperate climates that result in minimal temperature effects on PV panel efficiency. However, the PV panels at LCC are aligned 30 deg southwest due to the site's characteristics, leading to a difference in peak output of the two sites under identical conditions. For further details on the two sites, readers are referred to Ref. [7].

3.1 Satellite Images. The geostationary operational environmental satellite (GOES) system comprises geosynchronous satellites operated by the National Oceanic and Atmospheric Administration through the National Environmental Satellite, Data, and Information Service. For the purposes of this study, we use only visible wavelength images from GOES-15, which is currently designated as GOES West. The visible image channel is centered at 0.63 μm, with a spectral range of 0.53–0.75 μm, a spatial resolution of 1 km, and a temporal resolution of one image per 30 min.

A $w \times w$ square region, centered on the target site, is extracted from each image and then flattened into a vector $\tilde{x}^{(i)} \in \mathbb{R}^n$, where $n = w \times w$. Each image is then normalized:

$$x^{(i)} = \frac{\tilde{x}^{(i)} - \mathrm{avg}\!\left(\tilde{x}^{(i)}\right)}{\sqrt{\mathrm{var}\!\left(\tilde{x}^{(i)}\right) + 10}} \qquad (4)$$

where $\tilde{x}^{(i)} \in [0, 255]^n$ is the unprocessed 8 bit grayscale image vector, $\mathrm{avg}(\cdot)$ is the arithmetic mean value, and $\mathrm{var}(\cdot)$ is the variance. In Eq. (4), the numerator normalizes the brightness and the denominator normalizes the contrast. After normalization, the image vectors are stacked to create a matrix $X \in \mathbb{R}^{m \times n}$, where $m$ is the number of images.
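A sketch of the cropping and normalization of Eq. (4), assuming the GOES-15 visible frame is already loaded as an 8 bit NumPy array and that (row, col) is the pixel nearest the target site; the navigation from latitude/longitude to the satellite grid is not shown.

```python
import numpy as np

def image_features(image: np.ndarray, row: int, col: int, w: int = 64) -> np.ndarray:
    """Crop a w x w window centered on (row, col) and normalize per Eq. (4)."""
    half = w // 2
    patch = image[row - half:row + half, col - half:col + half].astype(float)
    x = patch.ravel()                               # flatten to n = w*w features
    return (x - x.mean()) / np.sqrt(x.var() + 10)   # brightness / contrast normalization

# Stacking m normalized image vectors yields the design matrix X (m x n), e.g.:
# X = np.vstack([image_features(img, row, col) for img in images])
```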

Fig. 1 A diagram of the overall forecast methodology. First, a 480 × 680 pixel satellite image at time t is cropped down to a w × w region of interest, centered around the target site. The w × w cropped image is then transformed into an n-length feature vector x(t) and fed into a forecast model, e.g., SVR, to produce a prediction of the target output ŷ(t + Δt).


Four years (2012–2015) of hourly PO measurements and satellite-derived features are merged to create data sets for training and testing our intra-day PO forecast methodology (see Fig. 2). Although the images are available every 30 min, we only use images from the top of each hour to generate the forecasts. Additionally, night values are removed from the data set, i.e., only data points with solar zenith angle θz < 85 deg are included. No other exogenous inputs, e.g., GHI and ambient air temperature, were considered. In particular, our choice to ignore temperature as a forecast input is justified as both sites lie in coastal areas with temperate climates, where temperature fluctuations have less than a 0.5% effect on panel efficiency.
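One way to apply the daytime filter is sketched below; the use of the pvlib library for solar position is our own illustrative choice (the paper does not state which solar position routine was used), while the 85 deg zenith threshold follows the text.

```python
import pandas as pd
import pvlib

# Hypothetical hourly index and PO series for CCA; in practice these come from
# the plant's metered data and the matching GOES-15 image timestamps.
times = pd.date_range("2012-01-01", "2012-01-08", freq="1h", tz="UTC")
po = pd.Series(0.0, index=times)   # placeholder PO (kW)

solpos = pvlib.solarposition.get_solarposition(times, latitude=32.959, longitude=-117.190)
daytime = solpos["zenith"] < 85.0  # keep only daytime points (Sec. 3)
po_day = po[daytime]               # the same mask is applied to the image feature rows
```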

The data sets used in our analysis exceed the overall time span of any known intra-day PO forecast studies. Pedro and Coimbra's [19] study is the most analogous one to our paper due to its focus on PO from a 1 MWp PV plant. However, their forecast testing set consisted of eight months, compared to the 24 months in our testing sets. And while they studied one 1 MWp PV plant, our work evaluates PO forecasts for two separate 1 MWp PV plants. Note that this comparison is not meant to lessen the scientific value of Ref. [19]. Rather, the comparison is meant only to illustrate the value provided by the data sets in our paper.

4 Results and Discussion

We evaluate our methodology by predicting PO at CCA and LCC. For each site, the forecasts are generated at the top of the hour using the most recent satellite image. In addition to the satellite-derived features, the LS and SVR models are supplied clear sky values of PO at the forecast target time t + Δt. The clear sky values are included to provide the models with both diurnal and seasonal information. Ground telemetry is not included as input to the models in order to minimize data dependencies and improve the robustness of the models to ground-level operating conditions, e.g., delay or loss of data streams.

The forecasting models are trained on two years of data (2012–2013) and then tested on a separate set of two years (2014–2015) to ensure unbiased performance evaluations. Cross-validation is used to select the hyperparameters of the LS and SVR models for each horizon and site. Figure 2 illustrates the training and testing sets used for the four scenarios. The data sets shown are complete, i.e., they show the union of the output (PO) and input variables (satellite images). In Sec. 4.1, we discuss the error metrics used in evaluating the forecasts. Section 4.2 evaluates the effect of input image size choice on the forecast performance. In Sec. 4.3, we analyze the impact of the model choice, i.e., LS versus SVR. And in Sec. 4.4, we compare the performance of our forecast methodology to prior intra-day forecasting literature.
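Hyperparameter selection on the 2012–2013 training years can be sketched as a standard grid search; the grids, the five-fold splitting, and the synthetic data below are illustrative assumptions, since the paper does not specify the cross-validation scheme.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import Ridge
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 50))      # placeholder for 2012-2013 features
y_train = rng.uniform(0, 1000, size=200)  # placeholder PO targets (kW)

# One search per site and per forecast horizon (1-6 h); grids are illustrative.
ls_search = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0, 100.0]}, cv=5)
svr_search = GridSearchCV(
    NuSVR(kernel="rbf"),
    {"nu": [0.25, 0.5, 0.75], "C": [1.0, 10.0, 100.0], "gamma": ["scale", 1e-4]},
    cv=5,
)
ls_search.fit(X_train, y_train)   # 2012-2013 data only; 2014-2015 held out for testing
svr_search.fit(X_train, y_train)
```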

4.1 Error Metrics. We evaluate the models using standard error metrics: the root-mean-square error (RMSE) and the mean bias error (MBE) [2,13], as well as the forecast skill ($s$) proposed by Marquez and Coimbra [40]:

$$s = 1 - \frac{\mathrm{RMSE}}{\mathrm{RMSE}_p} = 1 - \frac{\sqrt{\frac{1}{N}\sum_{t=1}^{N}\left(\hat{y}(t) - y(t)\right)^2}}{\sqrt{\frac{1}{N}\sum_{t=1}^{N}\left(y(t) - y_p(t)\right)^2}} \qquad (5)$$

where $y$ is the measured value, $\hat{y}$ is the forecasted value, and $y_p$ is the persistence forecasted value.
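Written out for clarity, the error metrics of Sec. 4.1 are straightforward; the MBE sign convention below (measured minus forecast, so that overprediction gives a negative value) is inferred from the Table 1 caption.

```python
import numpy as np

def rmse(y, y_hat):
    """Root-mean-square error between measured y and forecast y_hat."""
    return float(np.sqrt(np.mean((y_hat - y) ** 2)))

def mbe(y, y_hat):
    """Mean bias error, measured minus forecast (sign convention inferred
    from the Table 1 caption: overprediction gives a negative MBE)."""
    return float(np.mean(y - y_hat))

def forecast_skill(y, y_hat, y_persist):
    """Forecast skill of Eq. (5): s = 1 - RMSE / RMSE_p."""
    return 1.0 - rmse(y, y_hat) / rmse(y, y_persist)
```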

4.2 Effect of Image Size. The input image size may impact forecast performance. To investigate the effects, we compare the RMSE of LS and SVR intra-day forecast models trained on square image regions with w ∈ {4, 8, 16, 32, 64, 128}. As shown in Fig. 3, increasing the input image size generally decreases the RMSE for both the LS and SVR models for CCA. However, there are diminishing returns for both models as the number of features (n = w × w) grows beyond w = 64. Doubling the image width from w = 64 to w = 128 increases the mean RMSE of the LS models by 5.6%, as well as the minimum and maximum RMSE values by 2% and 5%, respectively. For the SVR models, increasing from w = 64 to w = 128 decreases the maximum RMSE by 8% while causing a negligible increase in mean RMSE (0.4%) and a non-negligible increase in the minimum RMSE, from 35 kW to 49 kW. A similar trend is observed for the LCC site. Hereafter, all forecast results are reported for models trained on 64 × 64 image regions.

The optimal image size (w = 64) can be explained by two complementary mechanisms. First, both LS and SVR are well suited for data sets where the number of samples equals or exceeds the number of features, i.e., m = n or m > n. Images with w = 64 produce n ≈ 4k features, which is the same order of magnitude as the number of samples in the training sets for each forecast horizon (m ≈ 3k–6k). In comparison, w = 128 images produce n ≈ 16k features, an order of magnitude larger than the number of samples, i.e., m ≪ n. Second, larger image sizes cover larger spatial regions, and therefore can provide the models with information on cloud systems farther away from the target site. In other words, while the first mechanism provides pressure to decrease n relative to m, the second mechanism drives an increase in n. The balance between these two factors led to the optimal image width of w = 64.
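The image-size study of Fig. 3 amounts to repeating the training for each window width. A minimal sketch follows, with build_dataset a hypothetical helper standing in for the feature extraction of Sec. 3.1; the reported value is the training RMSE, as in Fig. 3.

```python
import numpy as np
from sklearn.linear_model import Ridge

def build_dataset(w: int):
    """Hypothetical helper: returns (features, targets) for crop width w."""
    rng = np.random.default_rng(0)
    return rng.normal(size=(300, w * w)), rng.uniform(0, 1000, size=300)

rmse_by_width = {}
for w in [4, 8, 16, 32, 64, 128]:
    X_w, y_w = build_dataset(w)
    model = Ridge(alpha=1.0).fit(X_w, y_w)
    pred = model.predict(X_w)
    rmse_by_width[w] = float(np.sqrt(np.mean((pred - y_w) ** 2)))  # training RMSE
print(rmse_by_width)
```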

4.3 Effect of Model Choice. Table 1 provides bulk statistics on the performance of the forecast models on the two years of testing data (2014–2015). For both sites, the SVR forecasts achieve the lowest RMSE and highest skill values for all horizons. For 1–2 h ahead, the difference in skill between the LS and SVR models is non-negligible (Δs > 1%). However, beyond 3 h, the two models achieve similar skill values (Δs < 1%). This result implies that while SVR is the best of the models evaluated, both LS and SVR are expressive enough to predict the intra-day PO behavior from satellite imagery. Additionally, for applications where SVR is too complex to implement, users of our forecast methodology can instead employ LS and achieve comparable results.

Fig. 2 The merged PO and satellite image data sets for the two power plants: (a) CCA and (b) LCC. Two years of data (2012–2013) are designated for training the forecast models (shown in gray), with two additional years (2014–2015) used for testing (shown in black).



For readers familiar with the forecast skill metric, the values in Table 1 may, at first glance, appear unusually large. Skill values reported in the PO forecasting literature rarely exceed 40%, regardless of forecast horizon [15]. However, the definition of forecast skill means the reference persistence model can have a large impact, i.e., a poorly performing reference model can inflate the performance of a forecast method, as measured by the skill. In this study, both the persistence and smart persistence models performed poorly (RMSE > 250 kW) over the entire intra-day horizon range, thereby resulting in abnormally large, but nonetheless valid, forecast skill values.

Besides bulk statistics, it is also important to consider error distributions when evaluating forecast models. As shown in Larson et al. [7], violin plots present a compact solution for visualizing multiple forecast error distributions [41]. Figure 4 uses violin plots to compare the error distributions of the LS and SVR models across the studied forecast horizons. Together with the bulk statistics, the violin plots of Fig. 4 reinforce the point that the LS and SVR models are both expressive enough to predict the intra-day PO from the satellite imagery.

Table 2 summarizes forecast performance across all horizons (1–6 h ahead), but breaks down the results by season: Spring (March–May), Summer (June–August), Fall (September–November), and Winter (December–February). Forecast performance is reported as relative RMSE (rRMSE), which is the RMSE relative to the mean PO of the specified season. For all seasons and both sites, SVR achieves the lowest rRMSE values of the tested models. While the highest SVR rRMSE values occur during the Winter for both sites (0.294 for CCA and 0.295 for LCC), the lowest values are in the Fall for CCA (0.219) versus Summer for LCC (0.241). Additionally, while both sites employ identical PV technology and are separated by less than 15 km, there is a non-negligible difference in forecast performance between the sites for Spring, Summer, and Fall. One factor in this difference is the abundance of solar microclimates throughout the region. For more information, readers are referred to Ref. [42].

Figure 5 visualizes the distributions of forecast absolute errors (AE) as functions of solar zenith and azimuthal angles. As we do not consider nighttime forecasts, there are fewer data points for the longer horizons. For example, the CCA testing set contains approximately 6800 1 h ahead forecasts versus 3900 6 h ahead forecasts. To prevent biasing the results, the plots only include 1 h ahead forecasts. By grouping the errors by quartile, we see that the largest errors, i.e., the fourth quartile of errors, occur primarily in the morning. This result can be explained by two factors: (1) the high albedo in the visible channel satellite imagery near sunrise and (2) the prevalence of a marine layer in the region. Recent studies have proposed the use of other imaging wavelengths, e.g., infrared, to reduce early morning forecast errors [43]. However, overlapping image data sets were not available at the time of this study.

4.4 Comparison to Prior Work. The performance of our forecast methodology is comparable to prior intra-day PO studies. The genetic algorithm enhanced ANN from Pedro and Coimbra [19] achieved skill values of 32.2% and 35.1%, relative to a smart persistence model, for horizons of 1 and 2 h, respectively. In comparison, our methodology achieves skill values of 30.6% (CCA) and 29.1% (LCC) for 1 h ahead, and 46.1% (CCA) and 45.1% (LCC) for 2 h ahead. The divergence in skill values for 2 h ahead is possibly due to the genetic algorithm-ANN model's lack of exogenous inputs, which was a self-imposed constraint by the authors. Unfortunately, a lack of overlapping satellite imagery and PO data sets for the PV plant in Ref. [19] prevents direct comparison.

Fig. 3 Effect of image size on forecast accuracy of the CCA training set (2012–2013), measured by RMSE (kW). The mean RMSE for the LS (solid) and SVR (dashed) models across the six horizons (1–6 h ahead) are shown bounded by the minimum and maximum RMSE values. A similar trend is observed for the LCC site.

Table 1 Forecast performance on the testing set (2014–2015) for CCA and LCC for horizons of 1–6 h ahead. The column names (1 h, ..., 6 h) indicate the forecast horizon in hours. Negative MBE values correspond to models biased toward overpredicting PO and vice versa. Skill is reported relative to a smart persistence model. Bolded values indicate the forecast with the best value for a specific error metric, site, and forecast horizon.

CCA
                  1 h     2 h     3 h     4 h     5 h     6 h
  RMSE (kW)
    Smart Pers.   137.1   201.5   268.5   329.2   376.9   401.9
    LS            114.4   126.1   134.6   136.9   135.0   128.4
    SVR           104.8   121.8   132.0   136.4   134.8   128.4
  MBE (kW)
    Smart Pers.    51.6    95.0   140.3   185.0   219.8   243.6
    LS              3.2     4.9     2.2    21.2    22.2    23.3
    SVR            -8.5    22.5   -13.3   -19.8   -21.9   -21.5
  s (%)
    Smart Pers.     —       —       —       —       —       —
    LS             16.5    37.4    49.9    58.4    64.2    68.1
    SVR            23.5    39.5    50.8    58.6    64.2    68.1

LCC
                  1 h     2 h     3 h     4 h     5 h     6 h
  RMSE (kW)
    Smart Pers.   117.9   173.7   233.4   289.1   333.7   359.6
    LS             96.6   106.5   114.5   118.1   117.1   113.2
    SVR            89.9   101.9   112.2   117.5   116.4   112.5
  MBE (kW)
    Smart Pers.    47.8    87.2   129.4   171.6   205.4   227.2
    LS             20.2     1.4    21.6    21.7    21.2    23.3
    SVR           -10.3    -9.6   -13.7   -15.1   -15.4   -16.3
  s (%)
    Smart Pers.     —       —       —       —       —       —
    LS             18.1    38.7    51.0    59.1    64.9    68.5
    SVR            23.8    41.3    51.9    59.4    65.1    68.7


Wolff et al. [33] forecasted PO for 15 min–5 h ahead for 921 PV systems of unspecified nameplate capacities. Although not reported directly, approximate forecast skills can be derived from Figs. 8 and 9 of Ref. [33]. The SVR-based models achieved skill scores in the range of 20–36% for horizons of 1–5 h ahead. Our methodology, which also uses SVR, achieves skill scores of 24–64% for the same forecast horizons. However, it is important to note that Wolff et al. used power data with a 15 min temporal resolution while we forecasted hourly averaged values.

Although our work focuses on forecasting PO, comparisons to the intra-day solar resource forecasting literature can still be of value. GHI forecast results are particularly well suited for comparison as the PO of both CCA and LCC is linearly correlated with GHI due to their use of nontracking PV panels [7]. And fortunately, there have been multiple intra-day GHI forecasting studies at sites within the same region as both power plants. Nonnenmacher and Coimbra [16] achieved GHI forecast skill scores of 8–16% for horizons of 1, 2, and 3 h at the University of California San Diego campus (32.88, -117.23). Zagouras et al. [26] obtained skills of 13–23% (1–3 h ahead) for the CIMIS weather station at Torrey Pines (32.90, -117.25). Perez et al. [11] produced forecasts with skills of 6–19% (1–6 h ahead) for the SURFRAD station at the Desert Rock Airport, NV (36.62, -116.02). The skill scores (>23%) achieved by our methodology, compared to these three analogous studies, indicate that our results are competitive with other intra-day forecast methods.

Another important comparison is between the intra-day PO forecast results of this paper and the day-ahead PO results of Larson et al. [7]. Both papers use PO data from the same PV plants, i.e., CCA and LCC, and cover similar time ranges (2011–2014 for Ref. [7] versus 2012–2015 for this study). Larson et al. reported day-ahead PO forecasts with skills of 15–23% for CCA and 13–20% for LCC, relative to a persistence model. In comparison, the intra-day PO forecasts attain higher skill scores than the day-ahead scenarios (24–68% for CCA and 24–69% for LCC). Intra-day forecasts achieving higher skill scores than day-ahead forecasts are in agreement with results from the forecasting literature, e.g., Ref. [11].

5 Impact of Next-Generation Satellite Imagery

In November 2016, the first satellite in the next-generation GOES-R series was launched, and is currently undergoing testing. Although the transition from the current GOES-N Series will not begin until approximately Fall 2017, we can already consider the potential impacts of the next-generation imagery provided by GOES-R on the proposed forecasting methodology. The new Advanced Baseline Imager (ABI) will provide three times the spectral bands, four times the spatial resolution, and up to five times the temporal resolution of the current GOES-N satellites.²

Fig. 4 Forecast errors on the testing set (2014–2015) for (a) CCA and (b) LCC, grouped by horizon (1–6 h ahead). The distributions for LS and SVR are shown side by side to enable direct comparisons of the two models for the same horizon. The horizontal dashed and dotted lines within each distribution represent the quartiles, with the width approximated using kernel density estimate techniques. Negative MBE values correspond to the model overpredicting the PO.

Table 2 Performance of forecasts across all studied horizons (1–6 h ahead) on the testing set (2014–2015). Results are broken down by site and season: Spring (3/1–5/31), Summer (6/1–8/31), Fall (9/1–11/30), and Winter (12/1–2/28). RMSE values are reported relative to the mean PO of the season. Bold values indicate the forecast model with the lowest RMSE for a specific season.

rRMSE              Spring   Summer   Fall     Winter   Total

CCA
  Mean PO (kW)     572.9    585.0    502.9    425.5    527.2
  Smart Pers.      0.516    0.533    0.532    0.586    0.539
  LS               0.227    0.239    0.226    0.300    0.244
  SVR              0.223    0.232    0.219    0.294    0.238

LCC
  Mean PO (kW)     456.1    477.9    407.4    346.1    425.9
  Smart Pers.      0.541    0.594    0.592    0.641    0.590
  LS               0.250    0.250    0.250    0.296    0.259
  SVR              0.243    0.241    0.244    0.295    0.252



A direct comparison of forecasts with GOES-N and GOES-R imagery for the two California sites is not possible at this time. However, the Advanced Himawari Imager (AHI) on the Japanese Meteorological Agency's Himawari-8 geostationary weather satellite has comparable specifications to the GOES-R ABI: 16 spectral bands, 0.5 km per pixel resolution visible images, and six full disk images per hour. While the AHI full disk images do not overlap with California, the images are available from July 2015 to present.

5.1 Experiment Setup. Here, we aim to provide an analysis of the potential impact of the GOES-R ABI images on the proposed methodology using images from the Himawari-8 AHI and ground telemetry from sites in Australia. The goal is to understand whether the proposed methodology will still be applicable once the transition to the next-generation satellite imagery is complete. Additionally, the following analysis provides insight into whether the methodology can be adapted to multiple satellite imagery products with differing qualities, and therefore whether or not it can be successfully applied to sites outside of North America.

We compare the forecast performance between AHI visible images at full resolution (0.5 km per pixel; comparable to the GOES-R ABI), and images that we have down-sampled to 1.0 km per pixel to reduce model complexity and storage requirements. All other variables are fixed; we evaluate the same intra-day forecast models described in Sec. 2, extract features from hourly grayscale images, and train on historical hourly PO data from a nontracking PV power system.

Full resolution, full disk images from Himawari-8 were downloaded from Japan's National Institute of Information and Communications Technology (NICT). NICT provides a publicly accessible web portal of historical Himawari-8 images.³ Although available as red/green/blue composites, we convert all images to grayscale to minimize the influence of the additional spectral channels, which are not the focus of this analysis, and then perform the preprocessing detailed in Sec. 3.1.
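The grayscale conversion and the 2x down-sampling used for the Himawari-8 comparison can be sketched as below; the luminance weights and the block-averaging scheme are standard choices that we assume here, since the paper does not specify the exact resampling method.

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an RGB composite to grayscale (standard luminance weights, assumed)."""
    return rgb[..., :3] @ np.array([0.299, 0.587, 0.114])

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average 2x2 pixel blocks: 0.5 km/pixel -> 1.0 km/pixel."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
```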

For the purposes of this analysis, we use publicly available data from the Australian PV Institute Solar Map, funded by the Australian Renewable Energy Agency.⁴ The Australian PV Institute provides hourly power generation data from 53 PV installations across Australia, with nameplate capacities in the range of 2–35 kWp. We select the 20 kWp PV installation in Blacktown, NSW (-33.763 deg, 150.907 deg), as it has one of the largest nameplate capacities among the APVI sites with data that overlap with the available Himawari-8 images, and has minimal quality control issues.

One year of overlapping Himawari-8 images and PO data from Blacktown is available for the analysis (January 2016–2017). The year of data is partitioned into a training set consisting of the odd months (January, March, etc.), and a testing set of the even months (February, April, etc.). As noted in Larson et al. [7], this partitioning ensures that the models are trained, and also tested, on data from all four seasons. Additionally, as with the two California sites, night values were removed from both the training and testing sets. Based on the training set results, a spatial image size of 128 km × 128 km was selected for the full analysis.

5.2 Comparison Results. Table 3 summarizes the intra-day forecasting performance for the 20 kWp Blacktown site using the Himawari-8 images. Both the LS and SVR models achieve forecast skill values that are similar to those for the CCA and LCC sites (30–70%). Additionally, as with the two California sites, the SVR models have overall lower RMSE and higher skill values than the LS models. However, for certain applications, e.g., generating forecasts on low-power devices, the difference in performance may not be statistically significant enough to justify the added complexity and lower interpretability of the SVR models, as compared to LS.

Fig. 5 Kernel density estimates of the AE of the 1 h ahead SVR forecasts on the CCA testing set. The AE values are grouped into columns by quartile: (1) first quartile, (2) second quartile, (3) third quartile, and (4) fourth quartile. Darker areas represent higher densities. The top row (Reference) shows the AE densities as functions of the solar azimuthal (φ) and zenith (θz) angles when the forecast was generated, while the bottom row (Valid) shows the solar angles of the targeted time for the forecasts. Sunrise and sunset are at θz = 90 deg, while φ < 180 deg corresponds to the morning and φ > 180 deg to the evening. The majority of the largest errors, i.e., the fourth quartile of errors, are produced by forecasts generated near sunrise. A similar error distribution is seen for the LCC site (figure omitted due to length constraints).

²For a complete comparison between the ABI and the imager on the current GOES-N Series, please see: http://www.goes-r.gov/spacesegment/abi.html

³http://himawari8.nict.go.jp/
⁴http://pv-map.apvi.org.au



Based on this analysis, down-sampling the image inputs does not have a statistically significant negative impact on the forecast performance. The absolute difference in RMSE between the low-resolution and high-resolution images is less than or equal to 0.4% for both the LS and SVR models. The lack of forecast performance degradation, coupled with the benefit of lower computation and storage requirements of the low-resolution image features, suggests that down-sampling the image inputs is a viable strategy.

Before moving on, we note a few caveats to these results. First, to enable a more direct comparison to the other intra-day results in this study, we did not consider the impact of the increased temporal resolution of the next-generation satellite images. Higher spatial and temporal resolution images may enable detection of small-scale and short-lived cloud systems, which current satellite images can miss, and therefore could outweigh the lower computation and storage benefits of down-sampling the images.

Second, our analysis only considers a year of data (six months of training and six months of testing) for one sub-1 MWp PV site. While we took steps to ensure the validity of our analysis (see Sec. 5.1), a longer data set would add increased confidence to the analysis. Additionally, a PV power plant with a larger nameplate capacity, e.g., 500 kWp or 1 MWp, would enable a more direct comparison to the intra-day forecast results for CCA and LCC. Unfortunately, data for a larger scale PV power plant in the area covered by Himawari-8 were not available at the time of this study.

6 Conclusions

We presented a methodology for forecasting PO of PV power plants for horizons of 1–6 h ahead. Our methodology enables direct prediction of PO of operational solar farms from satellite imagery. This has two major benefits: first, by directly forecasting PO, we remove the need for intermediate irradiance forecasts and resource-to-power modeling, which are typically site specific; second, by removing the need for additional data dependencies, e.g., meteorological telemetry, we ensure that the methodology is generalizable to power plants with varying levels of pre-existing instrumentation. Together, these two features make the proposed forecasting methodology well posed to be applied to a range of operational PV power plants.

We evaluated the proposed direct remote sensing forecast methodology using two nontracking, 1 MWp PV plants in California. Four years (2012–2015) of combined PO and satellite imagery data were used to analyze the forecast methodology's performance. Over a testing set of two years (2014–2015), our methodology achieved RMSE values of 90–136 kW over horizons of 1–6 h ahead and forecast skill scores of 24–69%, relative to the baseline smart persistence model.

Additionally, we presented an analysis of the potential impact of next-generation satellite imagery from GOES-R on the intra-day forecasts. Using PO data from a PV power plant in Australia and images from the Himawari-8 geostationary satellite as a proxy for GOES-R, we evaluated the effects of the higher resolution imagery. The results showed that the methodology can easily be deployed for the imaging products of other satellites. The higher spatial resolution imagery does not improve or degrade the forecast performance. Down-sampling the images to reduce computation and storage costs was also shown to be compatible with the methodology. These results help quantify the impact that a new generation of satellite imagery systems, e.g., GOES-R, will have on direct remote sensing-to-power solar forecasting performance.

Acknowledgment

The authors gratefully acknowledge the technical support provided by the administration and staff of Canyon Crest Academy, particularly Dr. E. Gerstin. Additionally, the authors would like to thank Dr. K. T. Murata and the NICT Science Cloud project for technical support related to accessing historical Himawari-8 images.

References

[1] Quesada-Ruiz, S., Chu, Y., Tovar-Pescador, J., Pedro, H. T. C., and Coimbra, C. F. M., 2014, "Cloud-Tracking Methodology for Intra-Hour DNI Forecasting," Sol. Energy, 102, pp. 267–275.
[2] Marquez, R., and Coimbra, C. F. M., 2011, "Forecasting of Global and Direct Solar Irradiance Using Stochastic Learning Methods, Ground Experiments and the NWS Database," Sol. Energy, 85(5), pp. 746–756.
[3] Marquez, R., Gueorguiev, V., and Coimbra, C. F. M., 2012, "Forecasting of Global Horizontal Irradiance Using Sky Cover Indices," ASME J. Sol. Energy Eng., 135(1), p. 011017.
[4] Saade, E., Clough, D. E., and Weimer, A. W., 2013, "Use of Image-Based Direct Normal Irradiance Forecasts in the Model Predictive Control of a Solar-Thermal Reactor," ASME J. Sol. Energy Eng., 136(1), p. 010905.
[5] Perez, R., Lorenz, E., Pelland, S., Beauharnois, M., Van Knowe, G., Hemker, K. J., Heinemann, D., Remund, J., Muller, S. C., Traunmuller, W., Steinmauer, G., Pozo, D., Ruiz-Arias, J. A., Lara-Fanego, V., Ramirez-Santigosa, L., Gaston-Romero, M., and Pomares, L. M., 2013, "Comparison of Numerical Weather Prediction Solar Irradiance Forecasts in the US, Canada and Europe," Sol. Energy, 94, pp. 305–326.
[6] Pelland, S., Galanis, G., and Kallos, G., 2013, "Solar and Photovoltaic Forecasting Through Post-Processing of the Global Environmental Multiscale Numerical Weather Prediction Model," Prog. Photovolt.: Res. Appl., 21(3), pp. 284–296.
[7] Larson, D. P., Nonnenmacher, L., and Coimbra, C. F. M., 2016, "Day-Ahead Forecasting of Solar Power Output From Photovoltaic Plants in the American Southwest," Renewable Energy, 91, pp. 11–20.
[8] Pierro, M., Bucci, F., De Felice, M., Maggioni, E., Perroto, A., Spada, F., Moser, D., and Cornaro, C., 2016, "Deterministic and Stochastic Approaches for Day-Ahead Solar Power Forecasting," ASME J. Sol. Energy Eng., 139(2), p. 021010.

Table 3 Forecast performance on the testing set (even months of 2016) for the Blacktown, NSW, Australia site. RMSE and MBE are reported relative to the site nameplate capacity (20 kWp), while the forecast skill is relative to the smart persistence model (same as Table 1). The LS and SVR model results are grouped by the resolution of the input images: low resolution (1.0 km per pixel) and high resolution (0.5 km per pixel). Both the high and low-resolution input images cover the same spatial distance of 128 km × 128 km (high resolution: 256 × 256 pixels, low resolution: 128 × 128 pixels). Bolded values denote the model with the best performance for a specific forecast horizon and metric.

                    1 h    2 h    3 h    4 h    5 h    6 h
RMSE (%)
  Smart Pers.       21.7   30.3   37.8   43.4   47.0   48.0
  LS (low res.)     15.5   16.5   17.0   16.8   16.0   14.1
  SVR (low res.)    14.1   14.9   15.4   15.7   15.2   13.7
  LS (high res.)    15.4   16.5   17.2   17.1   16.3   14.5
  SVR (high res.)   14.4   15.3   15.7   15.8   15.5   13.7
MBE (%)
  Smart Pers.        3.5    5.5    6.9    7.7    7.7    6.9
  LS (low res.)      0.5   -0.2   -0.7   -1.1   -1.2   -1.3
  SVR (low res.)     0.5    0.1   -1.0   -1.5   -2.0   -1.9
  LS (high res.)     0.6   -0.2   -0.8   -1.3   -1.5   -1.6
  SVR (high res.)    0.2   -0.6   -1.0   -1.4   -2.3   -1.6
s (%)
  Smart Pers.        —      —      —      —      —      —
  LS (low res.)     28.6   45.7   55.1   61.2   65.9   70.7
  SVR (low res.)    35.0   50.7   59.2   63.9   67.6   71.4
  LS (high res.)    29.1   45.6   54.6   60.7   65.2   69.8
  SVR (high res.)   33.8   49.6   58.5   63.5   67.1   71.4


[9] Aryaputera, A. W., Yang, D., and Walsh, W. M., 2015, "Day-Ahead Solar Irradiance Forecasting in a Tropical Environment," ASME J. Sol. Energy Eng., 137(5), p. 051009.
[10] Cornaro, C., Bucci, F., Del Frate, F., Peronaci, S., and Taravat, A., 2015, "Twenty-Four Hour Solar Irradiance Forecast Based on Neural Networks and Numerical Weather Prediction," ASME J. Sol. Energy Eng., 137(3), p. 031011.
[11] Perez, R., Kivalov, S., Schlemmer, J., Hemker, K., Jr., Renné, D., and Hoff, T. E., 2010, "Validation of Short and Medium Term Operational Solar Radiation Forecasts in the US," Sol. Energy, 84(12), pp. 2161–2172.
[12] Marquez, R., Pedro, H. T. C., and Coimbra, C. F. M., 2013, "Hybrid Solar Forecasting Method Uses Satellite Imaging and Ground Telemetry as Inputs to ANNs," Sol. Energy, 92, pp. 176–188.
[13] Inman, R. H., Pedro, H. T. C., and Coimbra, C. F. M., 2013, "Solar Forecasting Methods for Renewable Energy Integration," Prog. Energy Combust. Sci., 39(6), pp. 535–576.
[14] Law, E. W., Prasad, A. A., Kay, M., and Taylor, R. A., 2014, "Direct Normal Irradiance Forecasting and Its Application to Concentrated Solar Thermal Output Forecasting—A Review," Sol. Energy, 108, pp. 287–307.
[15] Antonanzas, J., Osorio, N., Urraca, R., de Pison, F. J. M., and Antonanzas-Torres, F., 2016, "Review of Photovoltaic Power Forecasting," Sol. Energy, 136, pp. 78–111.
[16] Nonnenmacher, L., and Coimbra, C. F. M., 2014, "Streamline-Based Method for Intra-Day Solar Forecasting Through Remote Sensing," Sol. Energy, 108, pp. 447–459.
[17] Chu, Y., Pedro, H. T. C., and Coimbra, C. F. M., 2013, "Hybrid Intra-Hour DNI Forecasts With Sky Image Processing Enhanced by Stochastic Learning," Sol. Energy, 98(Pt. C), pp. 592–603.
[18] Chu, Y., Li, M., Pedro, H. T. C., and Coimbra, C. F. M., 2015, "Real-Time Prediction Intervals for Intra-Hour DNI Forecasts," Renewable Energy, 83, pp. 234–244.
[19] Pedro, H. T. C., and Coimbra, C. F. M., 2012, "Assessment of Forecasting Techniques for Solar Power Output With No Exogenous Inputs," Sol. Energy, 86(7), pp. 2017–2028.
[20] Zamo, M., Mestre, O., Arbogast, P., and Pannekoucke, O., 2014, "A Benchmark of Statistical Regression Methods for Short-Term Forecasting of Photovoltaic Electricity Production, Part I: Deterministic Forecast of Hourly Production," Sol. Energy, 105, pp. 792–803.
[21] Zamo, M., Mestre, O., Arbogast, P., and Pannekoucke, O., 2014, "A Benchmark of Statistical Regression Methods for Short-Term Forecasting of Photovoltaic Electricity Production—Part II: Probabilistic Forecast of Daily Production," Sol. Energy, 105, pp. 804–816.
[22] Perez, R., Schlemmer, J., Hemker, K., Kivalov, S., Kankiewicz, A., and Gueymard, C., 2015, "Satellite-to-Irradiance Modeling—A New Version of the SUNY Model," 42nd IEEE Photovoltaic Specialist Conference (PVSC), New Orleans, LA, June 14–19, pp. 1–7.
[23] Perez, R., Ineichen, P., Moore, K., Kmiecik, M., Chain, C., George, R., and Vignola, F., 2002, "A New Operational Model for Satellite-Derived Irradiances: Description and Validation," Sol. Energy, 73, pp. 307–317.
[24] Perez, R., Ineichen, P., Kmiecik, M., Moore, K., Renne, D., and George, R., 2004, "Producing Satellite-Derived Irradiances in Complex Arid Terrain," Sol. Energy, 77(4), pp. 367–371.
[25] Vignola, V., Harlan, P., Perez, R., and Kmiecik, M., 2007, "Analysis of Satellite Derived Beam and Global Solar Radiation Data," Sol. Energy, 81(6), pp. 768–772.
[26] Zagouras, A., Pedro, H. T. C., and Coimbra, C. F. M., 2015, "On the Role of Lagged Exogenous Variables and Spatial-Temporal Correlations in Improving the Accuracy of Solar Forecasting Methods," Renewable Energy, 78, pp. 203–218.
[27] Mozorra Aguiar, L., Pereira, B., David, M., Diaz, F., and Lauret, P., 2015, "Use of Satellite Data to Improve Solar Radiation Forecasting With Bayesian Artificial Neural Networks," Sol. Energy, 122, pp. 1309–1324.
[28] Chu, Y., Urquhart, B., Gohari, S. M. I., Pedro, H. T. C., Kleissl, J., and Coimbra, C. F. M., 2015, "Short-Term Reforecasting of Power Output From a 48 MWe Solar PV Plant," Sol. Energy, 112, pp. 68–77.
[29] Lipperheide, M., Bosch, J. L., and Kleissl, J., 2015, "Embedded Nowcasting Method Using Cloud Speed Persistence for a Photovoltaic Power Plant," Sol. Energy, 112, pp. 232–238.
[30] Lonij, V. P., Brooks, A. E., Cronin, A. D., Leuthold, M., and Koch, K., 2013, "Intra-Hour Forecasts of Solar Power Production Using Measurements From a Network of Irradiance Sensors," Sol. Energy, 97, pp. 58–66.
[31] Kaur, A., Nonnenmacher, L., Pedro, H. T. C., and Coimbra, C. F. M., 2016, "Benefits of Solar Forecasting for Energy Imbalance Markets," Renewable Energy, 86, pp. 819–830.
[32] Fonseca, J. G., Oozeki, T., Ohtake, H., Shimose, K., Takashima, T., and Ogimoto, K., 2014, "Regional Forecasts and Smoothing Effect of Photovoltaic Power Generation in Japan: An Approach With Principal Component Analysis," Renewable Energy, 68, pp. 403–413.
[33] Wolff, B., Kuhnert, J., Lorenz, E., Kramer, O., and Heinemann, D., 2016, "Comparing Support Vector Regression for PV Power Forecasting to a Physical Modeling Approach Using Measurement, Numerical Weather Prediction, and Cloud Motion Data," Sol. Energy, 135, pp. 197–208.
[34] Vapnik, V. N., 1998, Statistical Learning Theory, Wiley, New York.
[35] Chang, C. C., and Lin, C. J., 2002, "Training ν-Support Vector Regression: Theory and Algorithms," Neural Comput., 14(8), pp. 1959–1977.
[36] Chang, C. C., and Lin, C. J., 2011, "LIBSVM: A Library for Support Vector Machines," ACM Trans. Intell. Syst. Technol., 2(3), pp. 1–27.
[37] Ineichen, P., and Perez, R., 2002, "A New Airmass Independent Formulation for the Linke Turbidity Coefficient," Sol. Energy, 73(3), pp. 151–157.
[38] Ineichen, P., 2006, "Comparison of Eight Clear Sky Broadband Models Against 16 Independent Data Banks," Sol. Energy, 80(4), pp. 468–478.
[39] Gueymard, C. A., 2012, "Clear-Sky Irradiance Predictions for Solar Resource Mapping and Large-Scale Applications: Improved Validation Methodology and Detailed Performance Analysis of 18 Broadband Radiative Models," Sol. Energy, 86(8), pp. 2145–2169.
[40] Marquez, R., and Coimbra, C. F. M., 2012, "Proposed Metric for Evaluation of Solar Forecasting Models," ASME J. Sol. Energy Eng., 135(1), p. 011016.
[41] Hintze, J. L., and Nelson, R. D., 1998, "Violin Plots: A Box Plot-Density Trace Synergism," Am. Stat., 52(2), pp. 181–184.
[42] Zagouras, A., Inman, R. H., and Coimbra, C. F. M., 2014, "On the Determination of Coherent Solar Microclimates for Utility Planning and Operations," Sol. Energy, 102, pp. 173–188.
[43] Hammer, A., Kuhnert, J., Weinreich, K., and Lorenz, E., 2015, "Short-Term Forecasting of Surface Solar Irradiance Based on Meteosat-SEVIRI Data Using a Nighttime Cloud Index," Remote Sens., 7(7), pp. 9070–9090.
