Ottawa Chapter/tutorials/Raster elevation

Revision as of 15:38, 6 October 2007

We'll use elevation data to demonstrate some of the raster processing capabilities of GRASS. Some previous experience with GRASS is assumed in these instructions, but not much - for example, having done the "GRASS in a nutshell" tutorial, or any of the vector-based tutorials we have already run, should be enough background.

GRASS stores raster data as grids of integer or real numbers, or can store categorical data by maintaining a lookup table of category codes against numbers stored in the grid. By default, the grids are compressed on the fly, reducing the storage space penalty sometimes ascribed to raster data.

Raster capabilities were the first main strength of GRASS, and all early development concentrated on raster data. The move from version 4 to 5 of the system brought in better treatment of NULL data, and the introduction of floating point raster grids. Since then, however, raster development has been rather quiet, while the entire vector engine of GRASS was replaced. Now that the vector side is making excellent progress, there is talk about rethinking the GRASS raster data model. However, this tutorial should be broadly applicable to any GRASS 5.x or 6.x system, and should remain applicable for the foreseeable future. The instructions included below, unless otherwise noted, were tested on a GRASS 6.1 system.

Of course, raster processing (at least in a fully functional raster GIS) is a huge topic, so to narrow it down I've decided to concentrate today on functions particularly relevant to DEM analysis or production. And to get things going especially quickly, I'm starting by downloading an existing DEM. I will download data for the Ottawa area from Geobase.ca - you are welcome to use whatever elevation data you like; the steps I present will be generally applicable to any similar dataset.

I have started by creating an account on geobase.ca, and downloading the DEMs for the NTS map tiles 031G05, 031G06, 031G10, and 031G11 - those sheets that generally cover the Ottawa area, both sides of the provincial border. These files will be available for you to try in the lab on March 23. If you read the metadata, you will see that these come as pairs of 1201x1201 pixel files with a resolution of 0.75 seconds, NAD83 lat long data.

Setting up a GRASS location

To get started, we need to create a new location in GRASS. You can do this by either entering in all the projection information manually, or using the "EPSG shortcut" - choose the "Create new location with EPSG code" option in the startup screen, and (in this case), enter the EPSG code 4269, which specifies the NAD83 datum and the GRS80 ellipse. You will be asked to exit and then re-enter GRASS, choosing your newly created location. When you start GRASS back up, you will be ready to import the DEM into your newly created Lat-Long location.
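Once you have restarted GRASS in the new location, a quick sanity check is worthwhile - a sketch (the exact output format of g.proj varies a little between versions):

```shell
# Confirm that the new location really carries the projection the EPSG
# code promised - for EPSG 4269 this should report a lat/long system
# on the NAD83 datum:
g.proj -p
```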

Importing DEM data

DEMs from NRCan and many other sources are supported by the GDAL library, so importing them is pretty painless. Run r.in.gdal, either at the command line or using the menu system (File|Import|Raster Map|Various formats using GDAL). You will need to run this twice, since the DEM data is split into west and east tiles. You will end up with two new rasters in your database - I named them deme and demw in my session, and will use those names below.
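The two import runs might look like the sketch below - the input filenames here are hypothetical (they depend on how your Geobase download is unpacked), but the output names match those used in the rest of this tutorial:

```shell
# Import the east and west DEM tiles separately; substitute the actual
# filenames from your own download for these example names.
r.in.gdal input=031g05_dem_e.dem output=deme
r.in.gdal input=031g05_dem_w.dem output=demw
```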

To make it easier to work with the DEM, we'll start off by mosaicking them together into one file. A nice simple way to do this is to set the working region in GRASS to a space that includes the whole DEM and use the raster calculator (r.mapcalc) to create a new combined map. For the 031G05 map sheet, the relevant region is:

north: 45:30:00.375N
south: 45:14:59.625N
east: 75:29:59.625W
west: 76:00:00.375W
resolution 0.75 seconds (0:00:00.75)
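Those values can be handed straight to g.region; a sketch:

```shell
# Set the working region from the map sheet extents quoted above,
# then print it back to confirm the grid dimensions.
g.region n=45:30:00.375N s=45:14:59.625N e=75:29:59.625W w=76:00:00.375W res=0:00:00.75
g.region -p
```

At 0.75-second resolution this extent works out to 1201 rows by 2401 columns, which is consistent with two 1201x1201 tiles sharing one column along their common edge.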

Now, we can use r.mapcalc to build a new raster that uses the deme data where that is available and the demw data everywhere else in the current region:

r.mapcalc dem="if(isnull(deme), demw, deme)"


The mapcalc expression can be interpreted as "Create a new map called dem, which is filled in with values defined by checking each location and if the deme map is NULL there (no data), then use the data in the demw map, otherwise use the data in deme".
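To check that the mosaic has no gaps, r.univar can summarize the new map - a sketch (look for the NULL cell count in its output):

```shell
# Summarize the mosaicked DEM; the report includes a count of NULL
# cells, which should be zero if the two tiles covered the whole region.
r.univar map=dem
```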

Now you can visualize your new dem map, using either the GUI (d.m), or the d.rast command. If the color map is not appropriate, you can change it using r.colors (in the GUI, Raster|Manage Map Colors|Modify Color Table).
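For example (a sketch - "srtm" is one of the color rules shipped with GRASS 6; see the r.colors manual page for the list available on your system):

```shell
# Display the DEM, switch to an elevation-style color table, and redraw;
# "srtm" is an assumption here - pick any table your GRASS lists.
d.rast dem
r.colors map=dem color=srtm
d.rast dem
```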

Sampling a raster map

Many types of analysis make use of sampling raster layers at particular locations. This is done, for example, in accuracy assessments of DEMs. You might, for example, have a file of known elevations in the Ottawa area which you would like to compare to the elevations in this DEM. Although we don't have that, we could simulate the experience by sampling this DEM, interpolating a NEW DEM, then comparing the elevations recorded at those locations in the two DEMs. This can be done using the following steps (lines starting with # are comments and are not run; you can cut and paste these commands into your GRASS prompt window if you want):

# first, create 250 randomly located points in our working region
v.random -z output=sample250 n=250
# Now create an attribute database for those vector points, and an attribute field called 
#   demelev which will hold double precision real numbers
v.db.addtable sample250 col="demelev double"
# Fill in the demelev field in the sample250 table using the values in the raster dem
v.what.rast rast=dem vect=sample250 col=demelev
# Show us the sample250 attribute database
v.db.select sample250

Now for the interpolation - there are various possibilities, requiring different forms of input data. For the purposes of this workshop, since it uses the data in the format we already have, and because this method is generally useful for DEM interpolation, we'll use regularized spline tension interpolation. Then we'll sample the new raster using our 250 sample points from above, creating a new vector database with the values from the new DEM in a field called newdem, the values from the old vector data in a field called demelev, and it will also create a new field diff which has the difference between the two:

# Interpolate a new raster called newdem, using the data in the demelev field of the sample250 vector
v.surf.rst input=sample250 zcolumn=demelev elev=newdem
# display this new raster in my mapping window
d.rast newdem
# sample the new DEM elev and compare to the old DEM samples
v.sample in=sample250 col=demelev rast=newdem out=elev_sample
# show us the results
v.db.select elev_sample
# create a simple statistical summary of the differences between the two DEMs
v.univar elev_sample col=diff

Elevation derivatives

Now that we have an elevation model, we would probably like to create derivative products. First, you can create slope and aspect maps from an elevation model using r.slope.aspect. Run the program now, specifying dem as the input map, and creating (at least) new slope and aspect maps. These maps are often useful diagnostic tools for DEMs because they can highlight artefacts from the interpolation process or the source data - can you see any such features in your new maps?

Shaded relief maps offer a valuable visualization tool for elevation data - try out the r.shaded.relief command, with something like (feel free to experiment):

r.shaded.relief map=dem shadedmap=relief altitude=30 azimuth=270
d.rast relief

For even more impressive visualization possibilities, try out the NVIZ tool - you can launch it at the prompt with the command: nviz el=dem

Watershed analysis

Next, we're going to analyze the elevation data for hydrologically meaningful units - we'll use r.watershed to divide up the area into hillslopes and basins, and extract per-pixel drainage directions. This is all done using the "assumption" that water will always flow downhill - the algorithm "climbs" up from every point in the DEM, determining the area that drains to each location, and identifying local maxima so that drainage boundaries and flow paths can be determined. This is a fairly computationally intensive process, so we'll get it started and leave it running while we do the next steps. Execute the following command:

r.watershed elevation=dem threshold=500 drainage=draindir basin=basinmap half.basin=hillslope visual=watervis
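When the run eventually finishes, the outputs can be inspected along these lines (a sketch; units=k asks r.report for square kilometres):

```shell
# Display the basin map once r.watershed has finished, then report
# how much area fell into each basin, in square kilometres.
d.rast basinmap
r.report map=basinmap units=k
```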

While that's running, we'll consider line of sight analysis.

Line of Sight analysis

The GRASS module r.los can calculate line of sight based on elevation data, a starting point, an observation height, and a limit on how far the maximum visibility should be. This can be used to, for example, calculate the "vista" from a lookout, or to determine the potential coverage of a broadcast technology that requires line of sight access.
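For reference, a typical invocation follows the pattern below - a sketch only, since (as discussed next) r.los needs a projected location, and the viewpoint coordinates here are hypothetical:

```shell
# Hypothetical example: viewshed from a point given in projected
# coordinates, observer 1.75 m above ground, visibility capped at 10 km.
r.los input=dem output=vista coordinate=445000,5030000 obs_elev=1.75 max_dist=10000
d.rast vista
```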

We have a small problem right now, though, since at least as of this writing the r.los module does not work on lat/long data - it requires a rectangular coordinate system. But that just provides an opportunity to talk about reprojecting data, so let's launch into that.

In an unused shell terminal, start up a new session of GRASS, and again create a new location based on an EPSG code. Any rectangular coordinate system will do for the purpose of demonstration - often UTM locations are desired, and in this institution we often use "MTM", a Canadian Modified Transverse Mercator system, because data from the City of Ottawa and the NCC that are available to our students are usually in this coordinate system. So you can choose a different projection if you want, but I have demonstrated this technique using EPSG code 2146, which specifies MTM Zone 9. As above, you will need to specify the EPSG code and a new location name, then GRASS will exit and you need to restart it, choosing the new location when it fires back up.

Now you've got a new GRASS database that uses a rectangular coordinate system, but so far you've got no data. If you run g.region -p, you'll see that you also have a default region that doesn't make a lot of sense (nonsense coordinates). So the first thing we need to do before creating new data is to set a sensible region. In this case, we want to be able to fit a reprojected version of our DEM into this region, so we need to transform the coordinates of the edges of the DEM from lat long to MTM (or other rectangular coordinate system). I did this using cs2cs, a coordinate translator that is part of the PROJ package, which GRASS uses for projections. This is a pretty low-level program - it simply takes a list of coordinates in one system and spits out equivalent versions in another system - it can do this on text files, or if you know how to manipulate these programs in a UNIX shell environment, you can do all sorts of things on the command line - I'll provide one example here. If you type in the command in the form: cs2cs proj=latlong datum=NAD83 to (substitute new proj info here) -r