
Web Services: Accessing Daymet Data

Access to the Daymet data set is available through web services which allow direct browser viewing and/or file download from a browser URL with defined parameters. Understanding these services allows a user to query and automate machine-to-machine downloads of data. Available web services are described in detail below.

Spatial and Temporal Subsets of Daymet Data Using the THREDDS NetCDF Subset Service (NCSS) for Grids

This document describes how to extract and download a spatial and/or temporal subset of individual variables of Daymet daily gridded data within a bounding box of longitude and latitude. The same methodology can be applied to the Daymet climatological summary files, also available via THREDDS. Download this guide: NCSS_Daymet_Subset_Guide_v3.pdf (PDF)

Introduction to THREDDS and NetCDF Subset Service

Access to the Daymet collection of CF-compliant netCDF files is available through a Thematic Real-time Environmental Distributed Data Services (THREDDS) Data Server. A THREDDS Data Server allows users to find, access, and download data from a simple, hierarchical catalog within a web browser or compatible client software. THREDDS instances are available that point to the Daymet mosaic data set, the Daymet 2-degree x 2-degree tile data, and pre-derived annual and monthly climatologies. In the mosaic daily data set, each Daymet variable is a continuous grid over the entire North American study region, for each year of data. Due to the file sizes, it may be cumbersome to download the entire North American extent of the data. Subsetting the data through THREDDS is one option for users who don't need access to the entire data set.

Through the THREDDS Data Server, there is an integrated NetCDF Subset Service (NCSS) with a REST API that allows subsetting of netCDF datasets in coordinate space. Gridded data subsets are returned in CF-compliant netCDF-3 or netCDF-4 format. The THREDDS Data Server provides an interactive GUI web form that allows users to perform NCSS-based spatio-temporal subsetting and download of one Daymet variable-year per submission. These interactive requests can be automated through programmatic, machine-to-machine requests, which involve constructing and submitting an HTTP request to the THREDDS Data Server as an extended URL with defined parameters. Each of these methods is described below.
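The extended-URL pattern these machine-to-machine requests follow can be sketched in shell. The base endpoint and dataset path below are hypothetical placeholders, not the actual Daymet server addresses:

```shell
#!/bin/sh
# Sketch of the NCSS extended-URL anatomy. BASE and DATASET are hypothetical
# placeholders; substitute the actual Daymet THREDDS NCSS endpoint and catalog path.
BASE="https://thredds.example.org/thredds/ncss"
DATASET="daymet/1988/daymet_v3_tmin_1988_na.nc4"

# Subset parameters are appended after "?", joined with "&":
QUERY="var=lat&var=lon&var=tmin"
QUERY="${QUERY}&north=36.61&west=-85.37&east=-81.29&south=33.57"
QUERY="${QUERY}&time_start=1988-01-01T12:00:00Z&time_end=1988-12-31T12:00:00Z"
QUERY="${QUERY}&accept=netcdf"

URL="${BASE}/${DATASET}?${QUERY}"
echo "$URL"
```

Once a real endpoint is substituted, issuing `wget "$URL"` (or pasting the echoed URL into a browser) submits the request.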

NetCDF Subset Service Access GUI

Link to the ORNL DAAC THREDDS Data Server for Daymet data or through the THREDDS access points available on the Daymet Project web site.

Several file sets for Daymet data are available; daily mosaic files for North America are found here. Browse through the catalog of data to a year and then a variable selection. Once there, a number of data access options will be presented, including NetcdfSubset (NCSS).

THREDDS Access Options

Clicking the NetcdfSubset link leads to the GUI web form for the NetCDF Subset Service.


GUI Interface to the NetCDF Subset Service

On the GUI form:

  • Under Select Variable(s): Check lat, lon, and the desired Daymet data variable (e.g., tmin). It is very important to include the lat and lon variables from the Daymet file.
  • Enter the Coordinate and Time subset.
  • DO NOT select the "Add 2D Lat/Lon to file" option. All Daymet netCDF data files were created to be CF-compatible, and lat/lon variables are already contained in the data files. Selecting the "Add 2D Lat/Lon to file" option instead of including the existing "lat" and "lon" variables in the subset causes the "lat" and "lon" variables to be recreated on the fly; the recreated values may be incorrect due to transformation errors, and the option significantly slows down the subset process.
  • Choose the Output Format; netcdf-4 is recommended (see the accept parameter description below).
  • Click Submit

A popup response will appear in your window asking you to save or open the file.

A note about the lat/lon subset and Daymet projection system

The Daymet data set is defined in a Lambert Conformal Conic (LCC) projection system. It is therefore necessary for the NCSS subset tool to find a minimum bounding area, in the corresponding LCC projected system, that encompasses the input geographic lat/lon bounding box. This minimum bounding area is used to subset the Daymet data. The resulting output file will be rectangular in the LCC projection, with the corners estimated from the minimum bounding area of the input lat/lon coordinates. An example is provided below: a bounding box that includes Great Smoky Mountains National Park and the surrounding National Forests is input in latitude/longitude. In a geographic coordinate system (fig 1a) the bounding box is rectangular. In LCC projected coordinates (fig 1b) the same geographic bounding box is shown, along with the minimum bounding area that includes the lat/lon subset coordinates (green). One time slice of the resulting subset Daymet minimum temperature netCDF file is displayed in a geographic coordinate system (fig 2a) and in the Daymet LCC projection (fig 2b).

[Figures: NCSS subsets of projected Daymet files, shown in the Geographic Coordinate System (figs 1a, 2a) and the Daymet LCC Projected System (figs 1b, 2b)]

Evaluating the URL request

The query generates the HTTP GET NCSS request URL (note that this request can be obtained directly from the NCSS GUI page and can act as a template for other requests):

It is possible to simply copy and paste this URL into a web browser to issue the NCSS subset request. Knowing the structure of this HTTP request allows simple modifications to be made for different queries: for example, a different netCDF file can be queried, or the HTTP request parameters can be changed to download different subsets.


In addition to the NCSS URL, parameters after the "?" provide the subsetting information. The parameters accepted by the NCSS are described below:

var: This parameter allows users to specify the netCDF variables, including both coordinate variables and Daymet data variables, to subset. Multiple "var=VARNAME" pairs can appear in the HTTP GET request URL if a subset includes multiple netCDF variables.

north: The northern extent of the bounding box (latitude in decimal degrees) of the subset

west: The western extent of the bounding box (longitude in decimal degrees) of the subset

east: The eastern extent of the bounding box (longitude in decimal degrees) of the subset

south: The southern extent of the bounding box (latitude in decimal degrees) of the subset

horizStride: Will take every nth point (in both x and y) of the gridded dataset. The default, "1", will take every point.

time_start: The beginning of the time range. Specify the time in the form yyyy-mm-ddThh:mm:ssZ

time_end: The end of the time range. Specify the time in the form yyyy-mm-ddThh:mm:ssZ

timeStride: Will take only every nth time in the available series on gridded datasets. The default, "1", will take every time step.

accept: The format of the subset data returned by the NCSS; "netcdf" (netCDF-3 format) is currently the only option available.
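Assembled into a single request, the parameters above look like the following. The base URL here is a hypothetical placeholder for the actual Daymet NCSS endpoint:

```shell
#!/bin/sh
# Assemble an NCSS request using every parameter described above.
# BASE is a hypothetical placeholder for the real Daymet NCSS endpoint.
BASE="https://thredds.example.org/thredds/ncss/daymet/1985/daymet_v3_prcp_1985_na.nc4"

URL="${BASE}?var=lat&var=lon&var=prcp"
URL="${URL}&north=36.61&west=-85.37&east=-81.29&south=33.57&horizStride=1"
URL="${URL}&time_start=1985-01-01T12:00:00Z&time_end=1985-12-31T12:00:00Z&timeStride=1"
URL="${URL}&accept=netcdf"

echo "$URL"
# Once a real endpoint is in place, download the subset with:
#   wget -O prcp_1985_subset.nc "$URL"
```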


Automating Downloads

In general, all Daymet data granules available through the NCSS follow this URL pattern: [YEAR]/daymet_v3_[DAYMETVAR]_[YEAR]_[region].nc4

Further automation is possible with a downloading agent such as wget. The output filename can be specified after the "-O", as shown below.

wget -O tmin_subset.nc4 ""


Users can programmatically change [YEAR], [DAYMETVAR], and [region] to automatically construct the base URL for each single Daymet granule. Multiple URL submissions can then be issued programmatically by dynamically updating parameters including the Daymet file, the longitude and latitude bounds, and the start and end times. Both stride parameters should be kept equal to 1 to maintain the native spatial resolution and the daily temporal resolution.

Below is an example bash script to automate the download of years 1980 through 1983 of Daymet data for three variables (tmin, tmax, and prcp) of the example Great Smoky Mountains National Park spatial subset from above. Note the leap-year handling: Daymet's 365-day calendar means a leap year's last day is December 30.

for year in {1980..1983..1}; do
  for par in tmin tmax prcp; do

    if [ $(( year % 4 )) -eq 0 ]; then
      # Leap year: Daymet's 365-day calendar ends on December 30
      wget -O ${par}_${year}GSMNPsubset.nc4 "${year}/daymet_v3_${par}_${year}_na.nc4?var=lat&var=lon&var=${par}&north=36.61&west=-85.37&east=-81.29&south=33.57&horizStride=1&time_start=${year}-01-01T12:00:00Z&time_end=${year}-12-30T12:00:00Z&timeStride=1&accept=netcdf"
    else
      wget -O ${par}_${year}GSMNPsubset.nc4 "${year}/daymet_v3_${par}_${year}_na.nc4?var=lat&var=lon&var=${par}&north=36.61&west=-85.37&east=-81.29&south=33.57&horizStride=1&time_start=${year}-01-01T12:00:00Z&time_end=${year}-12-31T12:00:00Z&timeStride=1&accept=netcdf"
    fi

  done
done

Download Size Limitations

The current Daymet NCSS has a size limit of 6 GB for each single subset request. NCSS subset requests that would exceed this limit return an error message instead of the subset data.
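Because granules are organized by variable and year, one way to stay under this limit is to split a large request into one request per year. A sketch, with a hypothetical placeholder base URL (non-leap years are used so time_end can remain December 31; Daymet's 365-day calendar ends leap years on December 30):

```shell
#!/bin/sh
# Split a multi-year subset into one NCSS request per year to stay
# under the 6 GB per-request limit. BASE is a hypothetical placeholder
# for the real Daymet NCSS endpoint.
BASE="https://thredds.example.org/thredds/ncss/daymet"
VAR="tmin"
COUNT=0
for YEAR in 1981 1982 1983; do   # non-leap years; leap years would end at Dec 30
  URL="${BASE}/${YEAR}/daymet_v3_${VAR}_${YEAR}_na.nc4?var=lat&var=lon&var=${VAR}"
  URL="${URL}&north=36.61&west=-85.37&east=-81.29&south=33.57"
  URL="${URL}&time_start=${YEAR}-01-01T12:00:00Z&time_end=${YEAR}-12-31T12:00:00Z&accept=netcdf"
  echo "$URL"                    # or: wget -O ${VAR}_${YEAR}_subset.nc "$URL"
  COUNT=$((COUNT + 1))
done
```

Keeping each request to a single variable-year also matches how the granules are stored, so no single request approaches the size limit.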

Daymet Single Pixel Data Extraction


The Single Pixel Data Extraction Tool allows users to enter a single geographic point by latitude and longitude in decimal degrees. A routine translates the (lon, lat) coordinates into projected Daymet (x, y) coordinates, which are used to access the Daymet database of daily interpolated surface weather variables. Daily data from the nearest 1 km x 1 km Daymet grid cell are extracted from the database and formatted as a table with one column for each Daymet variable and one row for each day. The Single Pixel Data Extraction Tool also provides the option to download multiple coordinates programmatically; a multiple-extraction script is freely available for download.

In addition to the Single Pixel Extraction Tool, a Daymet Single Pixel Extraction Web Service API is provided, based on a REST URL architecture. This web service allows browser viewing (in both table and graph form) or CSV file download of the data for lat/lon locations provided directly in the browser URL. CSV file download is also possible through command-line utilities such as wget and cURL.

Watch the segment of a NASA Earthdata webinar explaining the Single Pixel Data Extraction Tool, its usage, and the web-based services on YouTube.

Understanding REST URL

Example REST URL:
  1. Latitude (required): Enter a single geographic point by latitude, value between 14.5N and 52.0N.
    Usage Example: lat=43.1

    Longitude (required): Enter a single geographic point by longitude, value between -131.0 and -53.0 decimal degrees.
    Usage Example: lon=-85.3

  2. CommaSeparatedVariables (optional): Daymet parameters include minimum and maximum temperature, precipitation, humidity, shortwave radiation, snow water equivalent, and day length.

    • tmax - maximum temperature
    • tmin - minimum temperature
    • srad - shortwave radiation
    • vp - vapor pressure
    • swe - snow-water equivalent
    • prcp - precipitation
    • dayl - daylength

    Usage Example: vars=tmax,tmin
    All variables are returned by default.

  3. CommaSeparatedYears (optional): Current Daymet product (version 3) is available from 1980 to the latest full calendar year.
    Usage Example: years=2012,2013
    The years parameter takes precedence over dates.

  4. StartDate & EndDate (optional): Current Daymet product (version 3) is available from 1980 to the latest full calendar year. Date elements follow ISO 8601 convention: YYYY-MM-DD
    Usage Example: start=2012-01-31&end=2012-03-31
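The numbered parameters above combine into a single request URL. A sketch in shell, where the base URL is a hypothetical placeholder for the actual Single Pixel service endpoint:

```shell
#!/bin/sh
# Build a Daymet single-pixel request from the parameters described above.
# BASE is a hypothetical placeholder for the real service endpoint.
BASE="https://daymet.example.org/single-pixel/api/data"
LAT="43.1"
LON="-85.3"

URL="${BASE}?lat=${LAT}&lon=${LON}&vars=tmax,tmin&start=2012-01-31&end=2012-03-31"
echo "$URL"
# Download the CSV once a real endpoint is substituted:
#   curl -s "$URL" > daymet_point.csv
```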

Types of Web services offered

There are several ways to invoke Daymet web services to download CSV data files or view data.

Wget and cURL

wget and cURL are simple-to-use command-line tools for downloading files. wget saves an ASCII text CSV file; cURL writes the data to the terminal, but the output can be redirected into a file.

From the command line, execute:

$ wget ''


$ curl ''
example wget command: $ wget ',tmin&years=2012,2013'
all parameters: $ wget ',2013'
one parameter, all years: $ wget ''
one parameter, one month: $ wget ''

example curl command: $ curl ',tmin&years=2012,2013'
all parameters: $ curl ',2013'
one parameter, all years: $ curl ''
one parameter, one month: $ curl ''

See the wget and curl help files for details.

Browser View

Data can be viewed in any web browser. In the address bar, type:
example URL: ,tmin&years=2012,2013
all parameters: ,2013
one parameter, all years:

Java - A tool for automated data extraction at single or multiple locations via the single pixel extraction service is available for download.

See the README.txt file for instructions for multiple location extraction.

Batch Downloads - Batch Download Examples
A GitHub repo of methods (Bash, Java, and Python) for automating the download of multiple locations of the Daymet Single Pixel data.

R - DaymetR (credit goes to Koen Hufkens)
Functions to (batch) download single pixel DAYMET data directly into your R workspace, or save them as CSV files on your computer. In addition, code is provided to download gridded data for a region of interest specified by a top left / bottom right coordinate pair or a single pixel location.
For details, please check the DaymetR website.

Python - DaymetPy (credit goes to Koen Hufkens)
Functions to (batch) download single pixel Daymet data directly into your Python workspace, or save them as CSV files on your computer. Both a batch version and a single-download version are provided. The server does not restrict the number of queries you make, but consider downloading gridded data if you need extensive coverage within a single region.
For details, please check the DaymetPy website.