
The Convective Storms Group Guide to the Weather Research and Forecasting (WRF) Model

Created by the NCSU Convective Storms Group:
Dr. Matthew Parker

Jerilyn Billings
Adam French

Last Updated: 6 August 2006

This Guide was modified from The Forecasting Lab’s Guide to the Weather Research and Forecasting (WRF) Model

Found at: http://tempest.meas.ncsu.edu/wrf/

Created by the NCSU Forecasting Lab:

Dr. Mike Brennan
Megan Gentry

Nicole Haglund
Kevin Hill

Dr. Gary Lackmann
Kelly Mahoney

With additional contributions from:
Matt Borkowski

Zach Brown
Dr. Matt Parker

Last Updated: 27 July 2006


1. Copying files for WRF and WRFSI on the Meso Servers

You will need to copy the wrf and wrfsi directories from other locations.

Ideal Case: If you are running an ideal case, use the following:

cd data

cp /usr/local/Install_Files/WRFV2.1.2.TAR.gz .

gunzip WRFV2.1.2.TAR.gz

tar -xvf WRFV2.1.2.TAR

This will create a WRFV2 directory

Real Case: (use this version for now) If you are running a real case use the following

cd data

cp /data/chnablu/WRFV2.1.1.TAR .

tar -xvf WRFV2.1.1.TAR

This will create the WRFV2 directory (I suggest changing its name, for example, to real_WRFV2.1.1)

2. Compiling WRF and installing WRFSI

1. Compiling WRF on Meso Servers

a. To compile WRF and WRFSI, the build needs to point to the netcdf libraries.

So, in your .bashrc file you need to add the following line:

export NETCDF=/usr/local/netcdf
export MPICH=/usr/local/mpich

Note: for best results, in our experience, WRF should be set up and compiled before WRFSI.


b. Configure WRF. In the WRFV2 directory, type:

./configure

You will then choose from the options shown how you want to compile WRF. You’ll want to choose option 3 (MPICH, RSL_LITE) for nested runs.

This creates a file called configure.wrf which contains all the default compilation options for the platform you are working on.

c. Compile WRF

If you type

./compile

you will see a list of choices for compiling the generic WRF model as well as options for several test cases.

To compile the model for a test case (ideal), type (remember to use version 2.1.2):

./compile testname > messages

where testname is the name of the test case, e.g. testname = em_quarter_ss

If the model compiles successfully, the files

wrf.exe
ideal.exe

will appear in the /main directory.

To compile WRF for a case with real data, type (remember to use version 2.1.1):

./compile em_real > messages

If this compiles successfully, the files

ndown.exe
real.exe
wrf.exe

will appear in the /main directory


To clean directories of all object files and executables (which you will need to do if your first compiling attempt didn’t work and you’re doing it again!), type

./clean -a

removes all built files, including configure.wrf

2. Compiling WRF and WRFSI on Bigdog

If you have an aliases.csh file on Bigdog you should not need to set any netcdf variables. Use the following aliases.csh if you don’t have one

cp /home/jmbilli2/aliases793.csh /home/yourid

You will also need to copy the WRFV2.1.1 and wrfsiv2.1.2 to your /share directory

cp $group793/wrf/WRF2.1.1.TAR.gz .

Untar and unzip the WRFV2 software

You’ll need to configure and compile this software on Bigdog; the process is similar to that on the meso servers.

In the WRFV2 directory type:

configure

You will then choose from the options shown about how you want to compile WRF (single processor (1), MPI (3), etc.). You’ll want to choose option 5

Next, if you type

compile

you will see a list of choices for compiling the generic WRF model as well as options for several test cases.

To compile the model for a test case type:

compile <testname>

If the model compiles successfully, the files

wrf.exe


ideal.exe

will appear in the /main directory.

To compile WRF for a case with real data type

compile em_real

If this compiles successfully, the files

ndown.exe
real.exe
wrf.exe

will appear in the /main directory

Finally you will copy the WRFSI software.

cd into the WRFV2 directory and copy the wrfsi code:

cp $group793/wrf/wrfsi_v2.1.1.tar.gz .

Untar and unzip the WRFSI software.

3. Get GEOG data (Not needed at this time on meso servers)

Now you must obtain all of your geographical data. You can copy these files from someone else’s directory where they have already been downloaded. Copy the file GEOG.tar.gz:

On Meso Servers

cp /data/chnablu/WRF_WRFSI/GEOG.tar.gz .

You will need to unzip and untar this file in your wrfsi/extdata directory.

On Bigdog

cp $group793/wrf/GEOG.tar.gz .

You will need to unzip and untar this file in your wrfsi/extdata directory

4. Wind Fix

You will need to untar and unzip this file in your wrfsi/ directory to fix the wind for NARR data.


On Meso Servers

cp /data/chnablu/WRF_WRFSI/narr_si.tar .

On Bigdog

cp /lockers/PAMS/Volume1/gary/wrf/narr_si.tar .

5. Install wrfsi

On Meso Servers

cd into the wrfsi/ directory and run the perl script:

./install_wrfsi.pl

If this worked you should have an executable in this directory called “wrf_tools”

On Bigdog

a. Type “add pgi” in the terminal window (no quotes!)

b. Run the perl script install_wrfsi.pl (found in the /WRFV2/wrfsi/ directory) to install wrfsi. (Answer yes when it asks if you want to install the graphical user interface!)

c. If this worked you should have an executable in this directory called “wrf_tools”

3. Using WRFSI to set up your run

1. If on the Bigdog cluster, get on a compute node: type qrsh at the command prompt.

2. From /WRFV2/wrfsi/ directory, type wrf_tools

3. When WRFSI comes up, the first thing you will do is create your domain:

a. In the “Horizontal Grid” tab, draw your approximate domain in the box provided and click “update map.” (Unless your domain extends to very high latitudes, you should use LCC for your projection.)

b. Enter the grid spacing and the horizontal grid dimensions, clicking update map again. Repeat until you have the grid you want.

c. Choose your vertical levels (make sure your ICs have data up to the level you choose for the pressure at the top of the model)

d. After checking to make sure that everything is pointing to the right data locations…

e. Localize your domain. (This actually creates the domain grid that you have specified.)

4. The second step is the “Initial Data” step. Here you will set up the initial and boundary conditions for your run.

a. Type the path to each data source that you are using in the corresponding space.

b. Click on the script tab at the top, and choose your data source, editing the command line to reflect the forecast length in hours (-l), the time interval between boundary condition files in hours (-t), and the cycle time to use for the model simulation (-s) in YYYYMMDDHH format. Run this script for each data source that you are using (e.g., if you are using both Eta and a separate SST field, run the script once for each one.)

c. SST Data: If you are using a separate SST field, set -l to 0, -t to 12, and HH to 00. (Recent experiments suggest instead setting -l to the forecast length.)
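In WRFSI v2.x the data-source scripts are grib_prep.pl invocations; the exact script name and the data-source label appear in the Script tab, so treat this as a sketch only. Using the -l, -t, and -s flags described above, a 24-hour run starting at 00 UTC 6 August 2006 with boundary files every 3 hours from Eta data might be launched as:

```
grib_prep.pl -l 24 -t 3 -s 2006080600 ETA
```

The source name at the end (ETA here) must match one of the data sources you configured in step a.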

5. Finally, you are ready to interpolate your data. This is similar to the last step:

a. In the “Controls” tab, just make sure the data labels are all pointing to the right data. (Note: If using an additional SST field, type ‘SSTDATA’ into the “CONSTANTS_FULL_NAME” blank.)

b. Click on the “Script” tab, and as before, modify the command line (only this time (-f) is the forecast length in hours, and the other two options are as above). [Depending on the size of your domain, running these scripts can take quite a while, so give them a chance!]

c. If this runs successfully you will have files that look like: wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS in the /wrfsi/domains/yourdomainname/siprd directory.

4. Running WRF

[The data is already there to run WRF for one of the idealized test cases, and they can be run by following the instructions here: http://www.mmm.ucar.edu/wrf/users/wrfv2/runwrf.html]

To run a real case:

1. Move all of the “wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS” files that were created in the last step of WRFSI into /test/em_real

2. Edit the namelist.input file (located in /test/em_real) to reflect your run specifications (see the namelist.input hints at the end of this manual; a description of the variables can be found online at http://www.mmm.ucar.edu/wrf/users/wrfv2/wrf-namelist.html, and the list of available model physics packages is in the README file in the WRFV2 directory). The wrfsi namelist may also help you with your domain dimensions, etc. It is located at: /WRFV2/wrfsi/domains/yourdomainname/static/wrfsi.nl.
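For orientation, here is a fragment of the &time_control section of namelist.input with illustrative values only (a 24-hour run starting 00 UTC 6 August 2006, with 3-hourly boundary files); substitute the dates, lengths, and intervals for your own case:

```
 &time_control
 run_days           = 1,
 run_hours          = 0,
 start_year         = 2006,
 start_month        = 08,
 start_day          = 06,
 start_hour         = 00,
 end_year           = 2006,
 end_month          = 08,
 end_day            = 07,
 end_hour           = 00,
 interval_seconds   = 10800,
 input_from_file    = .true.,
 history_interval   = 180,
 frames_per_outfile = 1,
 /
```

Note that interval_seconds (10800 = 3 h) must match the -t interval you used when preparing the boundary condition files in WRFSI.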

a. Single processor mode: To process the IC files, type

real.exe

and the files wrfinput_d01 and wrfbdy_d01 will be generated.

To run the model, type

wrf.exe

As the model runs, files called wrfout_d01_* will be created.

b. MPI mode

1. First, be sure that “nio_tasks_per_group” in the namelist.input file is equal to the number of processors you will be using.

2. real.exe can be run on a compute node from a terminal window, or on multiple processors using a script like the one found in /lockers/PAMS/Volume1/jmbilli2/scripts/innernest.csh. When real.exe runs successfully, the files wrfinput_d01 and wrfbdy_d01 will be generated in the test/em_real directory.

3. Finally, to run the model, run wrf.exe. This can also be easily done from a script such as the one found in: /lockers/PAMS/Volume1/jmbilli2/scripts/wrf_run.sh. As the model runs, files called wrfout_d01_* will be created.

5. Performing a nested WRF run

Assumptions: First, I am assuming that you have the WRF model installed as well as WRFSI and are able to open WRFSI without any problems. I also assume that you have been successful in performing a simulation using a single domain. If either of those statements is not true, then you will need to look elsewhere for more help before trying to perform a nested run. Here I will highlight the differences between running a single domain and running nested domains, but will skip over most of the procedure that is identical. The procedure for making a nested run is generally similar to a normal run, so you need to begin by running WRFSI...

Running WRFSI


Run the following steps on Bigdog

1. Create the outer domain:

After opening the program by typing wrf_tools, click on the domain selection button on the left hand side of the interface. Then you will want to either load your existing domain and place a nest within it, or create a new domain for your nested run. If you create a new domain you will first need to create the outer domain by dragging the cursor on the map, selecting the correct map projection and then clicking update map.

Next, you will see the domain which you have just created. This domain can be edited by clicking on one of the squares on the outer edge of the domain and dragging with the mouse. If you want to edit the grid more finely, to the right of the domain there is an option for the “fine scale editing mode”. If you select the option for “grid values” you can edit the number of horizontal grid points as well as the grid spacing.

At this point, you need to think about the grid spacing for your outer domain. If you will be performing a nested run with feedback on (i.e. data is passed between the grids) then you need to have an odd ratio between the outer domain and the inner domain(s). If you are leaving feedback off, this ratio can be odd or even. Either way, make sure that the ratio will be an integer.

As an example, the following grid spacings would be acceptable if you wanted feedback:

2 grids: 24 and 8km, or 21 and 7km

3 grids: 27, 9 and 3km

2. Create the inner nest(s)

After creating your outer domain and figuring out what grid spacing you are going to use for your nest(s), the next step is to create the nest(s). To the right of the map near the top, there are two tabs: MOAD domain and NEST domain. Click on the NEST domain tab to begin working on the nested domain.

First, you need to select the domain id. The first nest you create would have the id 2, a second nest would have the id 3, and so on for more nests. After selecting the id number, the parent id is automatically set as the domain that is outside of that inner domain. Next you can select the grid spacing ratio between the nested domain and its parent domain, keeping in mind that if you want feedback on, this number must be odd. After those values are set, simply click and drag on the map to create the nested domain, like you did to create the outer domain. After creating the domain, you can edit it by changing the lower left and upper right grid values. Those numbers correspond to where the lower left and upper right grid points of your nested domain are located within the parent domain. Edit these numbers until you are satisfied with the location of the inner domain.

The procedure for adding more nested domains is identical to what you just did. To create a nested grid within the nest, simply go to domain id and click on d03. You will want to change the parent id to 2. After these changes, you would follow the same procedure as you did to create the first nested domain.

Finally, after you are done creating the domains, click next. You then edit the vertical grid; the procedure here is similar to what you would do for a single domain run. For a nested run, you only specify the vertical grid dimensions once, and this will be used for all of the domains you are running. This means if you want to have very high vertical resolution on your inner-most domain, you will need to run all your domains with the same large number of vertical levels.

After clicking next again, you advance to the localization parms tab. Here you can see the values that exist for the domain configuration which you have already set up. Be sure that the num_domains variable is set to the total number of domains which you have created; i.e. 2 if you have one nested domain, 3 if you have 2 nested domains, etc. You may want to write down the values for DOMAIN_ORIGIN for the lower left and upper right grid points, because you will need these values later, or they can be found later on.

Next you will advance through the menus and localize the domain, which may take longer than it did for just 1 coarse domain so be patient.

The procedure for processing the initial data and interpolating the data is identical to what you do for a single domain. Before you interpolate the data, be sure that the following settings are correct:

● Num_Active_Subnests: refers to the total number of nested domains you are using.

● Active_Subnests: should be the total number of domains in your model run.

When you are done with WRFSI, take a look at the output data, which is located in: /wrfsi/domains/yourdomainname/siprd

You should have wrf_real_input_d01 files for the duration of your model run, as you would for a single domain model run. You should also have a wrf_real_input_d0* file just at the initial time for your model run, where the * refers to the number of the domain. If you have one nested domain, you will just have one wrf_real_input_d02 file at the initial time; if you have 2 nested domains, then you should have wrf_real_input_d02 and wrf_real_input_d03 files, and so on for however many domains you use.

**If you do not have the proper input files, check the num_domains variable in wrfsi and make sure the number there is the number of grids in your model run.


Copy or move these initial condition files to the WRFV2/test/em_real directory; next, these files need to be processed using real.exe.

WRF Procedure

1. Preparing and running real.exe for the nested domain(s)

Next, go to the test/em_real directory. You will need to run real.exe as before, but the procedure for running it with nested domains is somewhat tedious. You will need to run real.exe twice if you have one nested domain and 3 times if you have 2 nested domains. Each time you run real.exe you are just processing the initial condition files for one of the domains at a time. You will run real.exe for the domain with the smallest grid spacing first, and then end by running it for the coarsest domain.

1a. Rename some files

The procedure for running real.exe is the same as before, however each time you run it you need to be sure that the files and variables are set for the domain you are processing. Since real.exe will be looking for the wrf_real_input_d01 file, you need to rename the file you are trying to process as d01.

First, rename the actual wrf_real_input_d01 file as something else (i.e. original_wrf_real_input_d01) so you do not accidentally overwrite it. Next, rename the wrf_real_input_d02 file (if you have 1 nested domain) or the wrf_real_input_d03 file (if you have 2 nested domains) as wrf_real_input_d01. This way, real.exe will process the domain with the smallest grid spacing first.
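The whole renaming shuffle for one nested domain can be sketched as follows. This is a self-contained illustration using empty placeholder files in a scratch directory (the real files also carry a YYYY-MM-DD_HH:MM:SS suffix, omitted here for readability, and real.exe is simulated with a touch):

```shell
# Sketch of the file-renaming sequence for processing nest d02 with real.exe.
work=$(mktemp -d)
cd "$work"
touch wrf_real_input_em.d01 wrf_real_input_em.d02        # placeholder input files

mv wrf_real_input_em.d01 original_wrf_real_input_em.d01  # protect the real d01
mv wrf_real_input_em.d02 wrf_real_input_em.d01           # nest masquerades as d01

# ... edit namelist.input and run real.exe here; for the nest it
# produces a wrfinput_d01 file (simulated with touch below) ...
touch wrfinput_d01

mv wrfinput_d01 wrfinput_d02                             # relabel output with its true id
mv wrf_real_input_em.d01 wrf_real_input_em.d02           # give the nest its name back
mv original_wrf_real_input_em.d01 wrf_real_input_em.d01  # restore the real d01
```

After the dust settles you are back to the original input file names, plus a properly labeled wrfinput_d02.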

1b. Edit the namelist.input file for the domain you are processing

Next, you need to edit the namelist.input file. Since real.exe processes 1 domain at a time, you just need to edit the first column of numbers. Therefore, make sure that the “input from file” row of the namelist is set to true for only the first column. At this point, you need to edit the namelist.input file for the innermost domain. Most of the values in the file do not need to be edited, with the following exceptions:

● Run days, hours, minutes and seconds: set all these values to 0, since you are only processing the files for the nested domains at the initial time.

● Start time and end time: be sure to set the end date/time to the same as the start date/time, with both being the start time of your model run. The reason is you are only processing the files for the nested domains at the initial time.

● max_dom: this number refers to the number of domains you have; since you are only processing one domain at this time, it should be set to 1.


● s_we, e_we, s_sn, e_sn:

● s_we and s_sn: refers to the start point of your grid in the west/east direction and north/south direction, respectively. Here both should be set to 1.

● e_we and e_sn: refers to the end points of the grid you are processing. These values were listed in wrfsi, or can be found by typing the following into a separate terminal window:

ncdump -h wrf_real_input_em.d0*

where * refers to the number of the domain

The grid dimensions can be found near the end of the dump: WEST-EAST_GRID_DIMENSION and SOUTH-NORTH_GRID_DIMENSION. These represent the end points of the grid; use these values to set e_we and e_sn:

e_we: should be set to the value found after “WEST-EAST_GRID_DIMENSION”, or the value found while running wrfsi.

e_sn: should be set to the value found after “SOUTH-NORTH_GRID_DIMENSION”, or the value found while running wrfsi.

● dx and dy: be sure to set these values to the grid spacing you chose while running wrfsi, noting that this number should be in meters.

● Note that other values in the column remain the same, and you do not need to change values such as grid id, level, i_parent_start, etc. since we are treating the domain as if it is the only domain.
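Since the full ncdump header is long, you can pipe it through grep to pull out just the grid dimensions, e.g. `ncdump -h wrf_real_input_em.d02 | grep GRID_DIMENSION`. The snippet below runs that same grep over a captured header excerpt (the 190/160/31 values are illustrative, not from a real file), so you can see the attribute names to look for:

```shell
# grep a saved "ncdump -h" header excerpt for the grid dimension attributes
grep GRID_DIMENSION <<'EOF'
 :WEST-EAST_GRID_DIMENSION = 190 ;
 :SOUTH-NORTH_GRID_DIMENSION = 160 ;
 :BOTTOM-TOP_GRID_DIMENSION = 31 ;
 :DX = 9000.f ;
EOF
```

With the illustrative header above, e_we would be 190 and e_sn would be 160 for this domain.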

If this doesn’t work, you can also copy the namelist.input from /lockers/PAMS/Volume1/jmbilli2/scripts/namelist.input.innernest and change the times and grid dimensions in the first column.

1c. Run real.exe

Now you should be all set to run real.exe, so simply type real.exe and see what happens (be sure you are on a compute node). If it runs successfully you will get a wrfinput_d01 file. If it does not work successfully, then you probably didn't set the grid dimensions properly and you should be sure to set e_we and e_sn correctly.

If your nested domain has a lot of grid points, you may need to run real.exe by submitting a script to the queue. Use /lockers/PAMS/Volume1/jmbilli2/scripts/innernest.sh, changing the file destination to the correct destination, then submit the script with “qsub innernest.sh”. You can only submit a job on the head node. Again, if it’s successful you will get a wrfinput_d01 file.

1d. Rename some files (again), save your namelist


Next, you need to rename the wrf_real_input_em.d01 file back to its original name (either d02 or d03), and also rename the wrfinput_d01 file that you just got to wrfinput_d0*, where * refers to the number of the domain that you just processed.

In order to run real.exe again, you will need to change the values in the namelist.input file for the next domain. You will probably want to save the namelist.input file you created for the nested domain so you can have it if you ever need to rerun real.exe. Rename it something like namelist.input.innernest.

1e. Run real.exe for the next nested domain(s) (if you have more than 1 nested domain)

If you have another nested domain, you need to repeat the above procedure for that domain.

2. Run real.exe for your outermost domain

Finally, you need to run real.exe for the last time in order to process the initial condition files for your outermost domain. Be sure to change the name of the wrf input file for the outermost domain back to its original name before running real.exe again. In addition, you will need to make a few changes to the namelist.input file, outlined below:

● run days, hours, etc: now you are running real.exe for the duration of your model run, so set these values to the proper length of your model run.

● end year, month, etc: again, now you need to set these values to correspond to the proper end time of your model run.

● change grid spacing and grid dimensions to properly reflect the values of your outermost grid.

After running real.exe for the outermost domain, you should get wrfinput_d01 and wrfbdy_d01 files. This time you do not need to rename them!

If this doesn’t work you can also copy the outernest namelist.input from /lockers/PAMS/Volume1/jmbilli2/scripts/namelist.input.outer and change the times and grid dimensions in the first column.

Run the following steps on your chosen meso server.

3. Copy all namelists, wrfinput_d0* files, and wrfbdy_d01 to your chosen meso server.

I suggest you create a directory in the test/em_real directory with a name that describes your run. From inside this directory, copying all files from the em_real directory can be done by:

cp ../* .

Now you need to copy the input files over from bigdog.


scp yourid@bigdog:directory_where_files_located/wrfinput* .

scp yourid@bigdog:directory_where_files_located/wrfbdy* .

I’d also copy the namelists that you’ve used.

scp yourid@bigdog:directory_where_files_located/namelist.* .

4. Setting up the namelist to submit your model run

Now you need to edit the namelist.input file one last time. This time you will be making changes to either 2 or 3 columns depending on the number of nested domains you are using. Be sure to save the namelist.input file you last used for the outermost domain before you start editing it so you have it in case you need it later.

There are a lot of values you need to edit, so be sure to make the following changes:

● start time/end time: keep in mind you can start and end the simulations on individual domains at different times if you would like.

● input_from_file: set column 2 to true if you have one nest, and set column 2 and 3 to true if you have 2 nested domains.

● history_interval: you probably want to set this value to 180 for every grid so you get output every 3 hours, but you could use other values as well.

● frames_per_outfile: set this to 1 for each grid so you get an output file at each output interval.

● max_dom: be sure to set this value to the total number of domains that you are using (2 if you have one nested domain, 3 if you are using 2 nested domains)

● e_we: the value in these columns corresponds to the end point in the east-west direction for each grid. The value here for each grid will be the same as the e_we value that you used when you were running real.exe for that specific grid. If you forgot to save those namelists, you can always use the ncdump -h command as earlier and look at WEST-EAST_GRID_DIMENSION.

● e_sn: same as e_we, except the end point in the north-south direction for each grid. Follow instructions above for e_we, and use the value found after SOUTH-NORTH_GRID_DIMENSION.

● e_vert: be sure to change this value, if necessary, in the column(s) corresponding to your nested domain(s).

● dx and dy: change these values to correspond to your domains. (value in meters)

● i_parent_start and j_parent_start: for the outer domain, these values should both be 0. For your nested domain(s), these values can be found by using the ncdump -h command and looking at i_parent_start and j_parent_start.


● parent_grid_ratio and parent_time_step_ratio: these values should be the same and correspond to the ratio between the outer grid and the nested grid. Example: If you had a 27 km outer grid and 9 km nested grid, then the value in the second column would be 3. The model needs to use a smaller time step for the inner grid in order to satisfy CFL criteria.

● feedback: set to 0 for no feedback, or 1 if you want feedback.

● physics options: you can use different physics options on the different domains if desired. For example, if you had a small enough grid spacing on the nested domain you could choose to not use cumulus parameterization. Or, you can set the values the same for each domain.

Another example of a nested namelist used to run the WRF can be found in /data/chnablu/scripts/namelist.input.run
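As an illustration of the column layout described above, the &domains section for a 27-km outer grid with one 9-km nest might contain entries like the following. All of the numbers are hypothetical; use your own grid dimensions and take the parent-start values from your wrfsi setup or the ncdump output:

```
 &domains
 max_dom                = 2,
 e_we                   = 100,   190,
 e_sn                   = 100,   160,
 e_vert                 = 31,    31,
 dx                     = 27000, 9000,
 dy                     = 27000, 9000,
 grid_id                = 1,     2,
 parent_id              = 0,     1,
 i_parent_start         = 0,     31,
 j_parent_start         = 0,     17,
 parent_grid_ratio      = 1,     3,
 parent_time_step_ratio = 1,     3,
 feedback               = 1,
 /
```

The first column describes the outer domain and the second the nest; note the grid ratio of 3 (odd, so feedback = 1 is allowed) and dx/dy given in meters.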

5. Submit the model run

Submit the model choosing how many processors you’d like to use on your chosen meso server.

mpirun -np 7 ./wrf.exe

Here 7 is the number of processors, and can be any number from 1-8.

You can also submit the run so that you can logout of your local machine using the no hangup command

nohup mpirun -np 7 ./wrf.exe &

6. Viewing WRF Output

1. Setting up the files for Grads

You will need to make a grads directory in your home directory; within that directory you will need to add two more directories:

lib
dat
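Both directories can be created in one step. In this sketch, GRADS_ROOT is just a stand-in so it can be pointed somewhere other than your real home directory; ordinarily you would simply use $HOME:

```shell
# create the grads/lib and grads/dat directories described above
GRADS_ROOT="${GRADS_ROOT:-$HOME}"
mkdir -p "$GRADS_ROOT/grads/lib" "$GRADS_ROOT/grads/dat"
```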

In the lib directory you will need to copy any grads scripts you have (i.e. dbz.gs, etc.). In the dat directory you will need to copy the .dat grads files; so, from your dat directory:

cp /usr/local/lib/grads/* .


In your .bashrc file you will need to add the line

export GASCRP=$HOME/grads/lib

2. Using Grads

Copy the Grads control file to your directory

cp /usr/local/WRF2GrADS/control_file .

Edit the path name in the control_file

wrf_to_grads control_file youroutputname

Edit the time interval in the youroutputname.ctl file: change the **hr to your model output interval (example: 30mn).
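For example, for a 24-hour run with output every 30 minutes (49 output times; the start date shown is illustrative), the tdef line in the .ctl file would read:

```
tdef 49 linear 00Z06aug2006 30mn
```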

Then start GrADS:

gradsc
