GoMSS Nowcast Documentation

This is a collection of documentation about the Gulf of Maine and Scotian Shelf (GoMSS) Nowcast project. The focus of the project is the automated production of a daily ocean physics prediction for the GoMSS region using the NEMO version 3.6 ocean model.

About the Project

High-resolution ocean forecasting models have been developed in DFO under the coordination of the DFO-EC-DND inter-departmental CONCEPTS program. The models are based on the Nucleus for European Modelling of the Ocean (NEMO), a state-of-the-art community ocean model framework that includes comprehensive ocean dynamics. Recent results have shown the benefits of upgrading the model codes to NEMO version 3.6. Besides the ongoing work in CONCEPTS to develop 24/7 operational forecasting capacity on EC computers, it is also desirable for DFO researchers to carry out operational tests for technical development, and the results for such tests can also provide open boundary forcing to the nearshore forecasting models under development for various DFO projects.

The objective of this project is to develop software for automatic online documenting of model codes, model configuration and analysis scripts, and routine downloading large-scale model outputs for setting up the initial and boundary conditions of the NEMO models, and atmospheric forcing fields for driving the ocean models.

This project draws inspiration and some software from the Salish Sea NEMO model, its nowcast/forecast system, and the software that powers it.

Contents

GoMSS NEMO Model

Atmospheric Forcing Domain

Comparison of the GoMSS NEMO model domain from GoMSS-NEMO-config/bathy_meter.nc and Environment Canada GEM 2.5km resolution HRDPS research model sub-domain from http://collaboration.cmc.ec.gc.ca/science/outgoing/dominik.jacques/meopar_east/.

In [2]:
import matplotlib.cm
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns
import xarray as xr
In [4]:
%matplotlib inline
NEMO Model Domain
In [5]:
bathy = xr.open_dataset('../../GoMSS-NEMO-config/bathy_meter.nc')
In [6]:
bathy
Out[6]:
<xarray.Dataset>
Dimensions:     (x: 668, y: 218)
Coordinates:
  * x           (x) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 ...
  * y           (y) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 ...
Data variables:
    X           (x) float64 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 ...
    Y           (y) float64 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 ...
    nav_lon     (y, x) float64 -68.37 -68.34 -68.32 -68.29 -68.27 -68.24 ...
    nav_lat     (y, x) float64 37.57 37.58 37.59 37.59 37.6 37.61 37.62 ...
    Bathymetry  (y, x) float64 4e+03 4e+03 4e+03 4e+03 4e+03 4e+03 4e+03 ...
Attributes:
    Bathymetry: Etopov1v_bathymetry
In [7]:
fig, ax = plt.subplots(1, 1, figsize=(14, 8))
cmap = matplotlib.cm.get_cmap('viridis_r')
cmap.set_bad('burlywood')
mesh = ax.pcolormesh(
    bathy.nav_lon, bathy.nav_lat,
    np.ma.masked_values(bathy.Bathymetry, 0),
    cmap=cmap,
)
cbar = fig.colorbar(mesh, ax=ax)
cbar.set_label('Depth [m]')
ax.set_aspect(
    1 / np.cos(np.median(bathy.nav_lat) * np.pi / 180),
    adjustable='box-forced',
)
ax.set_xlabel('Longitude [°E]')
ax.set_xlim(-71.5, -51)
ax.set_ylabel('Latitude [°N]')
ax.set_ylim(37.5, 48)
ax.grid(axis='both')
plt.tight_layout()
_images/GoMSS-NEMO-model_AtmosphericForcingDomain_6_0.png
EC HRDPS Research Model Domain
In [8]:
atmos = xr.open_dataset('../../2016072700_012_2.5km_534x684.nc')
In [9]:
atmos
Out[9]:
<xarray.Dataset>
Dimensions:       (time_counter: 1, x: 534, y: 684)
Coordinates:
  * time_counter  (time_counter) datetime64[ns] 2016-07-27T12:00:00
  * x             (x) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
  * y             (y) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
Data variables:
    nav_lon       (y, x) float32 285.263 285.289 285.314 285.339 285.364 ...
    nav_lat       (y, x) float32 40.6528 40.6412 40.6295 40.6178 40.6061 ...
    tair          (time_counter, y, x) float64 296.3 296.4 296.6 296.5 296.3 ...
    qair          (time_counter, y, x) float64 0.01233 0.01243 0.01255 ...
    seapres       (time_counter, y, x) float64 1.015e+05 1.015e+05 1.015e+05 ...
    precip        (time_counter, y, x) float64 0.0 0.0 0.0 0.0 0.0 0.0 0.0 ...
    therm_rad     (time_counter, y, x) float64 356.6 357.3 358.2 357.4 357.2 ...
    solar         (time_counter, y, x) float64 322.6 322.7 322.9 323.2 323.3 ...
Attributes:
    history: Wed Jul 27 08:14:44 2016: ncks -A /cnfs/dev/mrb2/arma/armadja/MEOPAR/ocean/tmp//FB_2016072700_012_534x684.nc /cnfs/dev/mrb2/arma/armadja/MEOPAR/ocean/tmp//2016072700_012_2.5km_534x684.nc
Wed Jul 27 08:14:43 2016: ncks -A /cnfs/dev/mrb2/arma/armadja/MEOPAR/ocean/tmp//FI_2016072700_012_534x684.nc /cnfs/dev/mrb2/arma/armadja/MEOPAR/ocean/tmp//2016072700_012_2.5km_534x684.nc
Wed Jul 27 08:14:43 2016: ncks -A /cnfs/dev/mrb2/arma/armadja/MEOPAR/ocean/tmp//PR_2016072700_012_534x684.nc /cnfs/dev/mrb2...
    NCO: 4.0.8
In [10]:
fig, ax = plt.subplots(1, 1, figsize=(14, 8))
mesh = ax.pcolormesh(
    atmos.nav_lon, atmos.nav_lat,
    np.ma.masked_values(atmos.tair[0, ...], 0),
    cmap='viridis',
)
cbar = fig.colorbar(mesh, ax=ax)
cbar.set_label('2m Air Temperature [K]')
ax.set_aspect(
    1 / np.cos(np.median(atmos.nav_lat) * np.pi / 180),
    adjustable='box-forced',
)
ax.set_xlabel('Longitude [°E]')
ax.set_xlim(285, 312)
ax.set_ylabel('Latitude [°N]')
ax.set_ylim(33.6, 53.2)
ax.grid(axis='both')
plt.tight_layout()
_images/GoMSS-NEMO-model_AtmosphericForcingDomain_10_0.png

So, the EC HRDPS domain clearly includes the GoMSS NEMO domain

HRDPS Grid Points on GoMSS NEMO Domain
In [11]:
fig, ax = plt.subplots(1, 1, figsize=(14, 8))
cmap = matplotlib.cm.get_cmap('viridis_r')
cmap.set_bad('burlywood')
mesh = ax.pcolormesh(
    bathy.nav_lon, bathy.nav_lat,
    np.ma.masked_values(bathy.Bathymetry, 0),
    cmap=cmap,
)
cbar = fig.colorbar(mesh, ax=ax)
cbar.set_label('Depth [m]')
ax.scatter(
    (atmos.nav_lon-360)[::10, ::10], atmos.nav_lat[::10, ::10],
    marker='.', color='red',
    label='Every 10th HRDPS Grid Point',
)
ax.set_aspect(
    1 / np.cos(np.median(bathy.nav_lat) * np.pi / 180),
    adjustable='box-forced',
)
ax.legend(loc='lower right', numpoints=1)
ax.set_xlabel('Longitude [°E]')
ax.set_xlim(-71.5, -51)
ax.set_ylabel('Latitude [°N]')
ax.set_ylim(37.5, 48)
ax.grid(axis='both')
plt.tight_layout()
_images/GoMSS-NEMO-model_AtmosphericForcingDomain_13_0.png

NEMO Atmospheric Forcing Files

Description of the algorithms used to construct NEMO atmospheric forcing files from the Environment Canada GEM 2.5km HRDPS research model product (http://collaboration.cmc.ec.gc.ca/science/outgoing/dominik.jacques/meopar_east/).

The algorithms are implemented in the nowcast.workers.make_weather_forcing worker module of the GoMSS_Nowcast package.

Contents

In [1]:
from glob import glob
from pathlib import Path

import matplotlib.pyplot as plt
import pandas
import xarray
In [2]:
%matplotlib inline
Concatentation of gzipped hourly forecast files

The hourly forecast files downloaded from http://collaboration.cmc.ec.gc.ca/science/outgoing/dominik.jacques/meopar_east/ are gzip compressed netCDF files. The NEMO atmospheric forcing files that the make_weather_forcing worker produces contain all of the hours from a given day’s forecast. In order to perform that concatenation in conjunction with the calculations on the precipitation variable, the hourly forecast files are decompressed into a temporary directory.

For this example, we’ll produce the 2016-10-26 forcing file. To do that we need to work with the files from the 2016-10-26 forecast (excluding the hour 0 file) as well as the final 2 hourly files from the 2016-10-25 forecast:

In [3]:
gunzip_dir = Path('/tmp/gunzipped/')
for f in sorted([f for f in gunzip_dir.iterdir()]):
    print(f)
/tmp/gunzipped/2016102500_023_2.5km_534x684.nc
/tmp/gunzipped/2016102500_024_2.5km_534x684.nc
/tmp/gunzipped/2016102600_001_2.5km_534x684.nc
/tmp/gunzipped/2016102600_002_2.5km_534x684.nc
/tmp/gunzipped/2016102600_003_2.5km_534x684.nc
/tmp/gunzipped/2016102600_004_2.5km_534x684.nc
/tmp/gunzipped/2016102600_005_2.5km_534x684.nc
/tmp/gunzipped/2016102600_006_2.5km_534x684.nc
/tmp/gunzipped/2016102600_007_2.5km_534x684.nc
/tmp/gunzipped/2016102600_008_2.5km_534x684.nc
/tmp/gunzipped/2016102600_009_2.5km_534x684.nc
/tmp/gunzipped/2016102600_010_2.5km_534x684.nc
/tmp/gunzipped/2016102600_011_2.5km_534x684.nc
/tmp/gunzipped/2016102600_012_2.5km_534x684.nc
/tmp/gunzipped/2016102600_013_2.5km_534x684.nc
/tmp/gunzipped/2016102600_014_2.5km_534x684.nc
/tmp/gunzipped/2016102600_015_2.5km_534x684.nc
/tmp/gunzipped/2016102600_016_2.5km_534x684.nc
/tmp/gunzipped/2016102600_017_2.5km_534x684.nc
/tmp/gunzipped/2016102600_018_2.5km_534x684.nc
/tmp/gunzipped/2016102600_019_2.5km_534x684.nc
/tmp/gunzipped/2016102600_020_2.5km_534x684.nc
/tmp/gunzipped/2016102600_021_2.5km_534x684.nc
/tmp/gunzipped/2016102600_022_2.5km_534x684.nc
/tmp/gunzipped/2016102600_023_2.5km_534x684.nc
/tmp/gunzipped/2016102600_024_2.5km_534x684.nc

To avoid restart artifacts from the HRDPS product, the final hour of the previous day’s forecast is used as the 0th hour of weather forcing:

In [4]:
nc_filepaths = [gunzip_dir/'2016102500_024_2.5km_534x684.nc']

We extend that list with the hourly forecast files for the day of interest:

In [5]:
nc_filepaths.extend(
    [gunzip_dir/Path(f).name for f in sorted(glob((gunzip_dir/'20161026*').as_posix()))])
for f in nc_filepaths:
    print(f)
/tmp/gunzipped/2016102500_024_2.5km_534x684.nc
/tmp/gunzipped/2016102600_001_2.5km_534x684.nc
/tmp/gunzipped/2016102600_002_2.5km_534x684.nc
/tmp/gunzipped/2016102600_003_2.5km_534x684.nc
/tmp/gunzipped/2016102600_004_2.5km_534x684.nc
/tmp/gunzipped/2016102600_005_2.5km_534x684.nc
/tmp/gunzipped/2016102600_006_2.5km_534x684.nc
/tmp/gunzipped/2016102600_007_2.5km_534x684.nc
/tmp/gunzipped/2016102600_008_2.5km_534x684.nc
/tmp/gunzipped/2016102600_009_2.5km_534x684.nc
/tmp/gunzipped/2016102600_010_2.5km_534x684.nc
/tmp/gunzipped/2016102600_011_2.5km_534x684.nc
/tmp/gunzipped/2016102600_012_2.5km_534x684.nc
/tmp/gunzipped/2016102600_013_2.5km_534x684.nc
/tmp/gunzipped/2016102600_014_2.5km_534x684.nc
/tmp/gunzipped/2016102600_015_2.5km_534x684.nc
/tmp/gunzipped/2016102600_016_2.5km_534x684.nc
/tmp/gunzipped/2016102600_017_2.5km_534x684.nc
/tmp/gunzipped/2016102600_018_2.5km_534x684.nc
/tmp/gunzipped/2016102600_019_2.5km_534x684.nc
/tmp/gunzipped/2016102600_020_2.5km_534x684.nc
/tmp/gunzipped/2016102600_021_2.5km_534x684.nc
/tmp/gunzipped/2016102600_022_2.5km_534x684.nc
/tmp/gunzipped/2016102600_023_2.5km_534x684.nc
/tmp/gunzipped/2016102600_024_2.5km_534x684.nc

Now we construct an “in-memory” representation of the the netCDF file that is our objective. To do that we use the xarray package.

In [6]:
ds = xarray.open_mfdataset([nc_filepath.as_posix() for nc_filepath in nc_filepaths])

“In-memory” is not strictly accurate because the concatenated dataset is rather large. The xarray.open_mfdataset() function employs another Python library called dask behind the scenes. dask delays the processing of the concatenated dataset until it is written to disk, enabling the the processing to be applied to the constituent forecast files one at a time. Doing so avoids loading all of the forecast files into memory at once, and takes advantage of multiple cores to do the processing.

So, we now have a data structure in which the hourly forecast files that we want are concatenated into time-series of atmospheric forcing variables.

Calculation of precipitation flux from cumulative precipitation

To visualize the dataset we will use a point in the HRDPS grid that is close to Yarmouth Airport (YQI). There is an Environment Canada meteo monitoring station at YQI from which hourly and daily observations are available.

In [7]:
y, x = 274, 155
print(ds.nav_lat.values[0, y, x], ds.nav_lon.values[0, y, x] - 360)
43.7817 -66.1517944336

Looking at the precipitation variable in our dataset:

In [8]:
ds.precip[:, y, x].plot(
    marker='o', label='{0.long_name} [{0.units}]'.format(ds.precip))
lgd = plt.legend(loc='best')
_images/GoMSS-NEMO-model_NEMO-AtmosForcing_15_0.png

It is evident that the forecast precipitation is a cumulative value.

The hourly precipitation amount (in \(\frac{m}{hr}\)) is the hour-to-hour differences of the total precipitation values. To convert that to precipitation flux the factor is:

\[\frac{m}{hr} \times 1000 \frac{kg}{m^3} \times \frac{1}{3600} \frac{hr}{s} = \frac{1}{3.6}\]

So, working backward from the end of the forecast period, we calculate the flux for each hour:

In [9]:
for hr in range(24, 1, -1):
    ds.precip.values[hr] = (ds.precip.values[hr] - ds.precip.values[hr-1]) / 3.6
    print(hr, ds.precip.values[hr, y, x])
24 2.53478194483e-06
23 7.41960118628e-06
22 3.80719171113e-05
21 3.07586419189e-05
20 3.72315601756e-06
19 3.96086963721e-06
18 5.9035503202e-07
17 4.94063392075e-06
16 1.63548651876e-05
15 7.79322969417e-06
14 3.79420170147e-06
13 3.09635652229e-06
12 3.78012191504e-05
11 0.000188606726523
10 7.70576459925e-05
9 3.61938469319e-07
8 1.35291985417e-06
7 1.09993359527e-05
6 5.07865479449e-06
5 1.09111574097e-05
4 4.44664894733e-06
3 1.38603910374e-05
2 9.56290983191e-06

The hour 1 flux is the hour 1 cumulative value divided by 3.6 because the cumulative value at hour 0 is, by definition, zero:

In [10]:
ds.precip.values[1] /= 36
print(1, ds.precip.values[1, y, x])
1 5.98205941868e-07

As above, we avoid restart artifacts from the HRDPS product by calculating the hour 0 flux from the difference between the cumulative precipitation values in the final 2 hours of the previous day’s forecast files:

In [11]:
day_m1_precip = {}
for hr in (23, 24):
    nc_filepath = gunzip_dir/'2016102500_0{}_2.5km_534x684.nc'.format(hr)
    with xarray.open_dataset(nc_filepath.as_posix()) as hr_ds:
        day_m1_precip[hr] = hr_ds.precip.values[0, ...]
ds.precip.values[0] = (day_m1_precip[24] - day_m1_precip[23]) / 36
print(0, ds.precip.values[0, y, x])
0 2.44207007604e-07

Finally, we update the metadata attributes of the precip variable to reflect the change from cumulative to flux values:

In [12]:
ds.precip.attrs['units'] = 'kg m-2 s-1'
ds.precip.attrs['long_name'] = 'precipitation flux'
ds.precip.attrs['valid_max'] = 3

With that our precipitation near YQI now looks like:

In [13]:
ds.precip[:, y, x].plot(
    marker='o', label='{0.long_name} [{0.units}]'.format(ds.precip))
lgd = plt.legend(loc='best')
_images/GoMSS-NEMO-model_NEMO-AtmosForcing_25_0.png
Calculation of solid precipitation component

NEMO uses the solid phase precipitation (snow) flux in addition to the total precipitation flux. Since the HRDPS product provides only total precipitation (which we have converted to flux above), we will calculate the snow flux by assuming that precipitation at grid points where the air temperature is >273.15 Kelvin is snow.

In [14]:
snow = ds.precip.copy()
snow.name = 'snow'
snow.attrs['short_name'] = 'snow'
snow.attrs['long_name'] = 'solid precipitation flux'
snow.values[ds.tair.values >= 273.15] = 0
snow
Out[14]:
<xarray.DataArray 'snow' (time_counter: 25, y: 684, x: 534)>
array([[[  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        ...,
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00]],

       [[  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        ...,
        [  4.71105673e-10,   3.35895238e-10,   1.86492365e-10, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  4.66930149e-10,   3.34405960e-10,   1.79570322e-10, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  4.60884318e-10,   3.14873707e-10,   1.62259786e-10, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00]],

       [[  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        ...,
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00]],

       ...,
       [[  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        ...,
        [  8.63036891e-08,   4.41076745e-08,   1.20501846e-08, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  6.95294449e-08,   3.00454644e-08,   7.77361731e-09, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  4.83119125e-08,   2.01756123e-08,   6.08449439e-09, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00]],

       [[  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        ...,
        [  1.90385521e-07,   1.24513437e-07,   6.79671233e-08, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  1.85208925e-07,   1.17347238e-07,   6.19586229e-08, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  1.67548953e-07,   1.08436949e-07,   6.12702590e-08, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00]],

       [[  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        ...,
        [  3.47586359e-07,   2.59830652e-07,   1.76381094e-07, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  3.47946904e-07,   2.53843366e-07,   1.63869116e-07, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
        [  3.07654394e-07,   2.29752730e-07,   1.54432540e-07, ...,
           0.00000000e+00,   0.00000000e+00,   0.00000000e+00]]])
Coordinates:
  * y             (y) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
  * x             (x) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
  * time_counter  (time_counter) datetime64[ns] 2016-10-26 ...
Attributes:
    units: kg m-2 s-1
    valid_min: 0.0
    valid_max: 3
    long_name: solid precipitation flux
    short_name: snow
    online_operation: inst(only(x))
    axis: TYX
    savelog10: 0.0

Merging the snow DataArray into the dataset to produce a new DataSet object:

In [15]:
ds_with_snow = xarray.merge((ds, snow))
ds_with_snow
Out[15]:
<xarray.Dataset>
Dimensions:       (time_counter: 25, x: 534, y: 684)
Coordinates:
  * y             (y) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
  * x             (x) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
  * time_counter  (time_counter) datetime64[ns] 2016-10-26 ...
Data variables:
    tair          (time_counter, y, x) float64 279.8 279.5 279.2 279.9 279.7 ...
    seapres       (time_counter, y, x) float64 1.027e+05 1.027e+05 1.027e+05 ...
    precip        (time_counter, y, x) float64 0.0 0.0 0.0 0.0 0.0 0.0 0.0 ...
    nav_lon       (time_counter, y, x) float32 285.263 285.289 285.314 ...
    nav_lat       (time_counter, y, x) float32 40.6528 40.6412 40.6295 ...
    solar         (time_counter, y, x) float64 0.0 0.0 0.0 0.0 0.0 0.0 0.0 ...
    therm_rad     (time_counter, y, x) float64 244.5 244.3 243.9 244.5 243.9 ...
    v_wind        (time_counter, y, x) float64 -2.484 -2.443 -2.443 -2.67 ...
    u_wind        (time_counter, y, x) float64 1.914 1.888 1.881 2.001 1.927 ...
    qair          (time_counter, y, x) float64 0.002977 0.002969 0.003103 ...
    snow          (time_counter, y, x) float64 0.0 0.0 0.0 0.0 0.0 0.0 0.0 ...

Although we can see above that there are non-zero snow fluxes at some grid points it appears that there is no significant snow flux at YQI on 2016-10-26:

In [16]:
ds_with_snow.snow[:, y, x].plot(
    marker='o', label='{0.long_name} [{0.units}]'.format(ds_with_snow.snow))
lgd = plt.legend(loc='best')
_images/GoMSS-NEMO-model_NEMO-AtmosForcing_31_0.png
Storing the atmospheric forcing file

The forcing files are stored as netCDF4/HDF5 files. Doing so allows Lempel-Ziv deflation to be applied to each of the variables, resulting in a 50% to 80% reduction in the file size on disk.

The time units must also be specified when storing the file. We will use the fairly common seconds since 1970-01-01 00:00:00 units.

Those things are applied to the dataset by providing an encoding dict:

In [17]:
encoding = {var: {'zlib': True} for var in ds_with_snow.data_vars}
encoding['time_counter'] = {
    'units': 'seconds since 1970-01-01 00:00:00',
    'zlib': True,
}

encoding
Out[17]:
{'nav_lat': {'zlib': True},
 'nav_lon': {'zlib': True},
 'precip': {'zlib': True},
 'qair': {'zlib': True},
 'seapres': {'zlib': True},
 'snow': {'zlib': True},
 'solar': {'zlib': True},
 'tair': {'zlib': True},
 'therm_rad': {'zlib': True},
 'time_counter': {'units': 'seconds since 1970-01-01 00:00:00', 'zlib': True},
 'u_wind': {'zlib': True},
 'v_wind': {'zlib': True}}

And, with that, we can store the dataset on disk:

In [18]:
ds_with_snow.to_netcdf('gem2.5_y2016m10d26.nc', encoding=encoding)

It is at this point that the concatenation of the hourly forecast files into the forcing dataset occurs, as well as the calculation on the precipitation variable. The to_netcdf() method may take 60 seconds or longer, depending on the memory size, CPU speed, and number of cores available.

Plots of forcing variables time-series at a point
In [19]:
fig, axs = plt.subplots(5, 2, figsize=(20, 10))
var_axs = [
    ('precip', 0, 0),
    ('snow', 0, 1),
    ('tair', 1, 0),
    ('qair', 1, 1),
    ('u_wind', 2, 0),
    ('v_wind', 2, 1),
    ('solar', 3, 0),
    ('therm_rad', 3, 1),
    ('seapres', 4, 0),
]
for var, ax_row, ax_col in var_axs:
    factor = 10000 if var == 'precip' else 1
    (ds_with_snow.data_vars[var][:, y, x] * factor).plot(
        ax=axs[ax_row, ax_col], marker='o',
        label='{0.long_name} [{0.units}]'.format(ds_with_snow.data_vars[var]))
    axs[ax_row, ax_col].legend(loc='best')
    axs[ax_row, ax_col].grid()
fig_title = fig.suptitle(
    '{0[0]:%Y-%m-%d %H:%M} UTC to {0[24]:%Y-%m-%d %H:%M} UTC\n'
    '{1}°N, {2}°W'
    .format(
        pandas.to_datetime(ds_with_snow.time_counter.values),
        ds_with_snow.nav_lat[0, y, x].values,
        abs(ds_with_snow.nav_lon[0, y, x].values - 360),
))
_images/GoMSS-NEMO-model_NEMO-AtmosForcing_38_0.png
In [20]:
ds.close()
ds_with_snow.close()

GoMSS Nowcast System

The GoMSS (Gulf of Maine and Scotian Shelf) Nowcast system is a deployment of the GoMSS_Nowcast Python package on the nemo2 host in the cluster.mathstat.dal.ca HPC cluster. GoMSS_Nowcast is built on top of the NEMO_Nowcast framework package.

Please see NEMO Nowcast Framework Architecture for a description of the system architecture.

The deployment and configuration of the system are described in the GoMSS_Nowcast docs.

Results of the automated daily model runs can be found in /data1/dlatornell/nowcast-sys/results/.

Best Practices

Building XIOS

This section documents the recommended way to build the XIOS input/output server that is used by the NEMO-3.6 code.

The key feature of this way of working with XIOS is that the build configuration files (also know as kbd:arch files) are maintained in a separate, version controlled, public repository from the XIOS codebase. That repository is called XIOS-ARCH and is available on Bitbucket at https://bitbucket.org/gomss-nowcast/xios-arch. Maintaining a public repository of the arch files makes them more easily sharable among researchers, both within the GoMSS model group, and with the NEMO user community in general. The repository serves as a “single source of truth”, making it easy for users to obtain the most up to date compiler options, etc. that their colleagues have figures out. The repository also preserves the history of changes to the arch files.

  1. Check out XIOS source code from the SVN repository. The arch files in the XIOS-ARCH repository have been tested with:

    • XIOS-1.0 SVN revision 703, for which the checkout command is:

      svn co -r 703 http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/branchs/xios-1.0 XIOS
      
    • XIOS-2.0 SVN revision 856, for which the checkout command is:

      svn co -r 856 http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS
      

    Note

    If you test these files with a later SVN revision, or update them to work with a later revision, please update this file and the XIOS-ARCH/README.rst file with the revision number.

  2. Use Mercurial to clone the XIOS-ARCH repository beside your XIOS checkout: If you have ssh key authentication set up on Bitbucket (see Bitbucket ssh Set-up) you can use the command:

    hg clone ssh://hg@bitbucket.org/gomss-nowcast/xios-arch XIOS-ARCH
    

    Otherwise, use:

    hg clone https://your-userid@bitbucket.org/gomss-nowcast/xios-arch XIOS-ARCH
    

    substituting your Bitbucket userid for your-userid, and entering your Bitbucket password when prompted.

  3. Symlink the arch-* files for the system that you are working on into the XIOS/arch/ directory. For example, on nemo.cluster.mathstat.dal.ca, symlink each of the DAL/arch-MATHSTAT_CLUSTER.* files into the XIOS/arch/ directory:

    cd XIOS/arch
    ln -s ../../XIOS-ARCH/DAL/arch-MATHSTAT_CLUSTER.env
    ln -s ../../XIOS-ARCH/DAL/arch-MATHSTAT_CLUSTER.fcm
    ln -s ../../XIOS-ARCH/DAL/arch-MATHSTAT_CLUSTER.path
    
  4. Build XIOS with XIOS/make_xios. For example, on nemo.cluster.mathstat.dal.ca, use:

    ./make_xios --arch MATHSTAT_CLUSTER --job 16 2>&1 | tee make.log
    

    The --arch MATHSTAT_CLUSTER selects the correct build configuration for the cluster.mathstat.dal.ca HPC cluster and compilers. The --job 16 option distributes the build across 16 processors to speed things up. The 2>&1 | tee make.log suffix captures stderr and stdout together and stores them in make.log as well as displaying them on the screen.

Web Services

The GoMSS Nowcast project relies on a collection of web services to provide version control repository hosting, issue tracking, documentation rendering and hosting, and software package distribution. Together this collection of interlinked web service accounts provide a web portal for the project.

This section described how those web service accounts are set up, managed, and interconnected.

Software Repositories and Issue Tracking

A team account on the bitbucket.org service is used to provide centralized, publicly accessible storage for the Mercurial version control repositories that contain the project’s code and documentation. Bitbucket is also used to provide an issue tracker for each repository.

The Bitbucket Cloud Teams page provides information on the concept of Bitbucket Teams.

A base account for the project with the user name gomss-nowcast-project was created from the Bitbucket Signup page. The credentials for that account are controlled by Doug Latornell <doug.latornell@43ravens.ca>.

Using the gomss-nowcast-project account, a team called GoMSS-Nowcast was created with the team id gomss-nowcast. The team’s overview page is at https://bitbucket.org/gomss-nowcast/.

User Groups Management

The gomss-nowcast-project account defaults to being a member of both the Administrators and the Developers groups of the gomss-nowcast team. The Manage Teams > User groups page was used to add Doug Latornell to the Administrators and Developers groups.

Members of the Administrators can manage what groups other users are members of, and what type of access (Read, Write, or Admin) they have to each of the team repositories.

Members of the Developers group have write access to all team repositories, and can create new team repositories.

Repository Issue Trackers

When a new repository is created there is a Issue tracker checkbox that can be clicked to provide the repository with an Issue Tracker. See https://bitbucket.org/gomss-nowcast/docs/issues for an example.

If an issue tracker was not enabled when a repository was created it can be added later on the repository’s Settings > Issue tracker settings page.

There are also Settings > Issues links to set up lists of Components, Milestones, and Versions that can be used to classify and organize issues.

Repository issue trackers can be used to organizer development tasks, bug reports, and enhancement ideas.

Documentation Rendering and Hosting

An account on the readthedocs.org service is used to render and host the main project documentation as well as documentation from project software packages.

An account for the project with the user name gomss-nowcast was created from the readthedocs.org Signup page. The credentials for that account are controlled by Doug Latornell <doug.latornell@43ravens.ca>.

To render and host documentation on readthedocs.org from a version control repository the repository must be imported into a readthedocs.org project. Repositories can only be set up a projects from the gomss-nowcast account. However, once a project has been created other users’ accounts on readthedocs.org can be linked as administrators into the project.

Project administrators have access to:

  • An admin interface that provides control over various aspects of the rendering and hosting, including deletion of the project
  • A button that can be used to force a build of the project
  • The docs build history for the project, enabling them to debug build problems
Project Set-up

To set up a readthedocs.org project to render and host the documentation from a Bitbucket repository use the Import a Project button on the dashboard page. Then use the Import Manually button to get to the Project Details page.

Fill in the Name text box. The name that you choose will be the sub-domain part of the URL at which the docs will be hosted; e.g. the name gomss-nowcast will produce docs at http://gomss-nowcast.readthedocs.io/. Each project name must be unique.

Fill in the Repository URL text box with the URL of the repository containing the source files for the documentation that you want to render and host in the project; for example, https://bitbucket.org/gomss-nowcast/docs.

Set the Repository type selector to Mercurial.

Click the Next button.

If all goes well, the repository will be cloned from Bitbucket on to a readthedocs server, Sphinx will be run on the directory tree containing the conf.py file (typically docs/), and the rendered docs will be viewable at https://<project-name>.readthedocs.io/. In other words, an automated version of the process described in Building the Documentation to Preview Changes.

Webhook to Build Docs When Changes are Pushed

To cause the documentation hosted on readthedocs to be automatically updated every time changes are pushed to the corresponding repository on Bitbucket use the Settings > Webhooks page of the repository on Bitbucket.

Click the Add webhook button.

Fill in the Title text box with readthedocs.

Fill in the URL text box with http://readthedocs.org/build/<project_name>, where <project_name> is the project name that you chose for this repository on readthedocs.

Click the Save button.

To test the webhook, make a change to a file in your local clone of the repository, commit the change, and push it to Bitbucket.

  • On the Settings > Webhooks > View requests page for the readthedocs webhook you created on Bitbucket you should see a record of the webhook request being sent to http://readthedocs.org/build/<project_name>.
  • On the Projects > <project_name> > Builds page for the readthedocs project associated with the repository you should see a build in progress or recently completed.

Software Distribution Channel

An anaconda.org organization is used to host and distribute Python packages for installation via the conda package manager.

A personal anaconda.org account is required in order to create an organization. The GoMSS Nowcast organization was created by Doug Latornell <doug.latornell@43ravens.ca>. Other personal anaconda.org accounts can be given write or administrative access to the GoMSS Nowcast organization. Write access to the organization allows users to upload conda packages. Administrative access provides full control over all packages and users in the organization including the ability to add/delete users and groups of users.

Installing Packages from the Channel

To use conda packages you need to have installed the Anaconda Python distribution or the Miniconda package manager. Please see the conda docs for an introduction to conda, and advice on which installation option to choose.

Regardless of which installation option you choose, Python 3 is strongly recommended. All of the packages in the GoMSS Nowcast organization channel are developed, tested, and packaged for Python 3.5.

To install a package from the GoMSS Nowcast organization channel use the -c gomss-nowcast option in the conda install command, for example:

$ conda install -c gomss-nowcast nemo_nowcast

You can also add gomss-nowcast to the channels section of your .condarc file to cause the GoMSS Nowcast organization channel to be searched every time you use the conda install command; see the conda configuration docs for details.

To use the GoMSS Nowcast organization channel in a conda environment description file, include a stanza like:

channels:
  - gomss-nowcast
  - defaults

Creating Documentation

This section describes how documentation directory trees are set up. The special case of the GoMSS Nowcast Project Documentation is described separately from Documentation in Code Repositories because the former is in a repository that contains only documentation, while documentation in code repositories is only one element of their contents.

GoMSS Nowcast Project Documentation

Version Control Repository Creation

The GoMSS Nowcast project documentation was initialized by creating a public Mercurial version control repository in the gomss-nowcast team account on the Bitbucket web service. The repository from Bitbucket was cloned:

$ cd projects/gomss-nowcast/
$ hg clone ssh://hg@bitbucket.org/gomss-nowcast/docs
Initial Files and Directories Creation

In an activated Working Environment the sphinx-quickstart command was used to create a Sphinx configuration file, conf.py, a master documentation file, index.rst for the project docs, a Makefile to control the documenation build processes, and 3 empty directories:

  • _build/ where the rendered docs produced by the build process will be stored
  • _static where static asset files such as images used in the docs will be stored
  • _templates where custom templates for the docs pages will be stored
$ source activate gomss-nowcast-docs
(gomss-nowcast-docs)$ cd docs
(gomss-nowcast-docs)$ sphinx-quickstart

The default setting values offered by sphinx-quickstart were accepted with the following exceptions:

The project name will occur in several places in the built documentation.
> Project name: GoMSS Nowcast
> Author name(s): GoMSS Nowcast Project Contributors

Sphinx has the notion of a "version" and a "release" for the
software. Each version can have multiple releases. For example, for
Python the version is something like 2.5 or 3.0, while the release is
something like 2.5.1 or 3.0a1.  If you don't need this dual structure,
just set both to the same value.
> Project version: 1
> Project release [1]: 1

Please indicate if you want to use one of the following Sphinx extensions:
...
> intersphinx: link between Sphinx documentation of different projects (y/n) [n]: y
...
> mathjax: include math, rendered in the browser by MathJax (y/n) [n]: y
...

A Makefile and a Windows command file can be generated for you so that you
only have to run e.g. `make html' instead of invoking sphinx-build
directly.
> Create Makefile? (y/n) [y]:
> Create Windows command file? (y/n) [y]: n
Sphinx Build Configuration File

The conf.py file was edited to:

  • Set configuration options that we want
  • Delete the many commented out configuration settings that described defaults that we want to use
  • Add come code to make the copyright year range automatically adjust to include future years

Please read the conf.py file and the Sphinx build configuration file documentation to learn more.

Master Documentation File

The index.rst file was edited to add a description of the project, and to start the documentation Contents section with the CONTRIBUTORS.rst file.

When the documentation is rendered to HTML the contents of the index.rst file become those of the top level index.html page.

License Choice

A Creative Commons Attribution 4.0 International License was chosen for the GoMSS Nowcast project documentation because it is a permissive license that requires attribution for use of the documentation. It is appropriate for the contents of the repository because they are primarily text rather than code.

The blob of HTML that provides the necessary license links and metadata was generated on the Creative Commons License Chooser page.

The copyright notice and license citation were added to the index.rst file as its License section.

Project Contributors File

The CONTRIBUTORS.rst file was created and text added to it to provide information about the leadership of the GoMSS Nowcast Project and a list of people who contribute to it.

Documentation Build and Preview

The command described in Building the Documentation to Preview Changes were used to do an initial build of the documentation which was previewed in a browser by opening the docs/_build/html/index.html.

Initial Version Control Commit

After correction of any typos and/or layout issue revealed by the rendered docs preview, the new documentation files were added and committed to the version control repository with:

(gomss-nowcast-docs)$ hg add
(gomss-nowcast-docs)$ hg commit -m"Init repo w/ README, license, CONTRIBUTORS, index pg & Sphinx build config."

Documentation in Code Repositories

To be written

Building and Uploading Conda Packages

Conda packages are built using “recipes” that provide package metadata and build instructions. The conda recipes for the packages in the GoMSS Nowcast organization channel are stored in the project’s conda-recipes repository on Bitbucket. The sections below described two methods of creating conda recipes, building packages with them, and uploading those packages to the channel.

In order to be able to build packages and upload them to the project channel you need to install the conda-build and anaconda-client packages:

$ conda install conda-build anaconda-client

Building Conda Packages from PyPI

Reference: The Building conda packages with conda skeleton tutorial

The conda skeleton command provides a way to build package recipe files for Python packages that exist in the Python Package Index (PyPI). Here we’ll use the schedule package as an example of building a conda package based on a PyPI package and uploading it to the GoMSS Nowcast organization channel so that it can be installed via conda install.

  1. Run the conda skeleton command:

    $ cd conda-recipes/
    $ CONDA_BLD_PATH=/tmp/cbtmp conda skeleton pypi schedule
    

    After a bunch of output, the result will be a schedule directory containing 3 files:

    meta.yaml:

    Contains all the metadata in the recipe. Only the package name and package version sections are required; everything else is optional.

    bld.bat:

    Windows commands to build the package.

    build.sh:

    Linux and OS X commands to build the package.

  2. With the recipe files in place the package is built with:

    $ conda build --croot /tmp/cbtmp schedule
    

    After a lot of output, if the build is successful, it concludes with a message like:

    # If you want to upload this package to anaconda.org later, type:
    #
    # $ anaconda upload /home/doug/miniconda3/conda-bld/linux-64/schedule-0.3.2-py35_0.tar.bz2
    #
    # To have conda build upload to anaconda.org automatically, use
    # $ conda config --set anaconda_upload yes
    
  3. Upload the package to the channel use:

    $ anaconda upload -u gomss-nowcast /home/doug/miniconda3/conda-bld/linux-64/schedule-0.3.2-py35_0.tar.bz2
    
  4. Add the recipe files to the conda-recipes Mercurial repository, commit them, and push them to Bitbucket:

    $ hg add schedule
    $ hg commit -m"Add recipe for schedule-0.3.2 from PyPI."
    $ hg push
    

Building Conda Packages from Scratch

Reference: The Building conda packages from scratch tutorial

The nemo_nowcast package recipe was created from scratch. conda-recipes/nemo_nowcast contains two files:

  1. A README.md file to identify the recipe and its provenance:

    This is an artisanal, hand-crafted recipe.
    
  2. A meta.yaml file containing the recipe metadata and build instructions:

    # Conda package metadata for NEMO_Nowcast
    
    package:
      name: nemo_nowcast
      version: "0.3"
    
    build:
      number: 1
      script: python setup.py install
    
    source:
      hg_url: https://bitbucket.org/43ravens/nemo_nowcast
    
    requirements:
      build:
        - circus
        - python
        - pyyaml
    
      run:
        - python
    
    about:
      home: https://bitbucket.org/43ravens/nemo_nowcast
      license: Apache License, Version 2.0
      license_file: LICENSE
      summary: A framework for building NEMO ocean model nowcast automation systems.
    
    extra:
      maintainers:
       - Doug Latornell <doug.latornell@43ravens.ca>
    

The Building conda packages from scratch tutorial and meta.yaml file docs provide details of how to populate the sections of the meta.yaml file.

With the recipe files in place the package is built with:

$ cd conda-recipes
$ conda build --croot /tmp/cbtmp nemo-nowcast

After a lot of output, if the build is successful, it concludes with a message like:

Finished processing dependencies for NEMO-Nowcast==0.3
found egg dir: /home/doug/warehouse/conda_envs/_build/lib/python3.5/site-packages/NEMO_Nowcast-0.3-py3.5.egg
number of files: 20
Fixing permissions
Fixing permissions
BUILD END: nemo_nowcast-0.3-py35_1
Nothing to test for: nemo_nowcast-0.3-py35_1
# If you want to upload this package to anaconda.org later, type:
#
# $ anaconda upload /home/doug/miniconda3/conda-bld/linux-64/nemo_nowcast-0.3-py35_1.tar.bz2
#
# To have conda build upload to anaconda.org automatically, use
# $ conda config --set anaconda_upload yes

To upload the package to the channel use:

$ anaconda upload -u gomss-nowcast /home/doug/miniconda3/conda-bld/linux-64/nemo_nowcast-0.3-py35_1.tar.bz2

Don’t forget to add the recipe files to the conda-recipes Mercurial repository, commit them, and push them to Bitbucket:

$ hg add schedule
$ hg commit -m"Add recipe for nemo_nowcast tip from bitbucket."
$ hg push

Uploading Packages to the GoMSS-Nowcast Channel

To upload a package to the GoMSS Nowcast organization channel you need:

  • A personal anaconda.org account
  • Write access to the organization (contact Doug Latornell <doug.latornell@43ravens.ca> or another organization administrator)
  • The anaconda-client package installed (conda install anaconda-client)

Once you have built and tested a conda package, and authenticated into you personal anaconda.org account with the anaconda login command, you can upload your package with the command:

$ anaconda upload -u gomss-nowcast /path/to/package.tar

The -u gomss-nowcast option is required because anaconda upload defaults to using your personal account.

Documentation Maintenance

The GoMSS Nowcast project documentation can be edited on-line in the bitbucket.org interface by following the links in the upper right-hand corner of any page of the rendered pages.

This section described the set-up that you need to do if you want to edit the documentation on your own computer, and render it locally to HTML to preview your changes before you commit and push them to the public repository.

Software Versions

The GoMSS Nowcast project documentation is developed and maintained using Python 3.5 or later, and Sphinx 1.4.1 or later.

Getting the Source Files

Use Mercurial to clone the documentation repository from the gomss-nowcast team account on bitbucket.org with the command:

$ hg clone ssh://hg@bitbucket.org/gomss-nowcast/docs

or

$ hg clone https://<your_userid>@bitbucket.org/gomss-nowcast/docs

if you don’t have ssh key authentication set up on Bitbucket.

Working Environment

Setting up an isolated development environment using Conda is recommended. Assuming that you have Miniconda3 or the Anaconda Python distribution installed, you can create and activate an environment called gomss-nowcast-docs that will have all of the Python packages necessary to build the documentation on your computer with the commands:

$ cd docs
$ conda create -n gomss-nowcast-docs python=3 sphinx
$ source activate gomss-nowcast-docs

You can edit the documentation files in any code-type editor. They are plain text files. The files are written in reStructuredText and converted to HTML using Sphinx.

To deactivate the environment use:

(gomss-nowcast-docs)$ source deactivate

Building the Documentation to Preview Changes

Building the documentation is driven by docs/Makefile. With your Working Environment activated, use:

(gomss-nowcast-docs)$ cd docs
(gomss-nowcast-docs)$ make clean html

to do a clean build of the documentation. The output looks something like:

rm -rf _build/*
sphinx-build -b html -d _build/doctrees   . _build/html
Running Sphinx v1.4.1
making output directory...
loading pickled environment... not yet created
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 3 source files that are out of date
updating environment: 3 added, 0 changed, 0 removed
reading sources... [100%] index
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] index
generating indices... genindex
writing additional pages... search
copying static files... done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory... done
build succeeded.

Build finished. The HTML pages are in _build/html.

The HTML rendering of the docs ends up in /docs/_build/html/. You can open the index.html file in that directory in your browser to preview the results of the build before committing and pushing your changes to Bitbucket.

Whenever you push changes to the docs repository on Bitbucket the documentation is automatically re-built and rendered at http://gomss-nowcast.readthedocs.io/en/latest/.

Version Control Repository

The GoMSS Nowcast project documentation source files are available docs Mercurial repository at https://bitbucket.org/gomss-nowcast/docs.

Issue Tracker

Development tasks, bug reports, and enhancement ideas are recorded and managed in the issue tracker at https://bitbucket.org/gomss-nowcast/docs/issues.

GoMSS Nowcast Project Contributors

The Gulf of Maine and Scotian Shelf (GoMSS) Nowcast project is lead by Dr. Youyu Lu in the Ocean Assessment and Prediction Section, Fisheries and Oceans Canada, Bedford Institute of Oceanography.

The following people have contributed code, documentation, etc. to the GoMSS Nowcast project repositories hosted on Bitbucket:

License

The GoMSS Nowcast Project Documentation are copyright 2016 by the GoMSS Nowcast project contributors.

Creative Commons License The GoMSS Nowcast Project Documentation by the GoMSS Nowcast project contributors is licensed under a Creative Commons Attribution 4.0 International License .