In the first part, we learned to use ERDAS Imagine to perform supervised classification of image files. We learned to use the Signature Editor and to create Areas of Interest (AOIs) in order to classify different land use types. One thing I liked was the ability to use the Inquire tool to click on an area where we know the land use type, and then use the Region Growing Properties window to grow a region from that Inquire point matching that land use type; I found this more accurate than drawing a polygon to represent a land use type. Adjusting the Euclidean distance to get the best results is somewhat tricky and I could use a little more practice with it, but I get the general idea: it sets the range of digital numbers (DN) away from the seed pixel that will be accepted as part of that category.
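To illustrate the idea outside of ERDAS Imagine, here is a minimal Python sketch of seed-based region growing with a spectral Euclidean distance threshold. The array layout, function name, and 4-connected neighborhood are my own assumptions for illustration, not how Imagine's Region Growing tool is actually implemented.

```python
# A minimal sketch of seed-based region growing with a spectral Euclidean
# distance threshold. Assumes a NumPy array of shape (bands, rows, cols);
# names and the 4-connected neighborhood are illustrative assumptions.
from collections import deque
import numpy as np

def grow_region(image, seed, max_distance):
    """Return a boolean mask of pixels connected to the seed whose spectral
    Euclidean distance from the seed pixel's spectrum is within max_distance."""
    bands, rows, cols = image.shape
    seed_spectrum = image[:, seed[0], seed[1]].astype(float)
    mask = np.zeros((rows, cols), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-connected neighbors
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not mask[nr, nc]:
                dist = np.linalg.norm(image[:, nr, nc].astype(float) - seed_spectrum)
                if dist <= max_distance:  # within the accepted DN range of the seed
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask
```

In this sketch, a larger `max_distance` behaves like a larger Euclidean distance setting: more neighboring pixels fall within the accepted DN range and the grown region expands.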
We also learned to evaluate signature histograms to determine whether signatures are spectrally confused, which occurs when one signature contains pixels belonging to more than one feature. To check this in the histogram, we select two or more signatures and compare their values in a single band; if the two signatures overlap, they are spectrally confused, which causes problems when classifying the image. We can look at the spectral confusion of all the signatures at once by viewing the mean plots: if the signature means are close together across the bands, spectral confusion could be a problem.
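As a rough illustration of that overlap check, here is a small sketch that compares two signatures' DN ranges in a single band and computes per-band means in the spirit of a mean plot. The data layout and function names are assumptions for illustration, not part of ERDAS Imagine.

```python
# Illustrative check for spectral confusion between training signatures;
# each signature is assumed to be an array of shape (n_pixels, n_bands).
import numpy as np

def band_ranges_overlap(sig_a, sig_b, band):
    """Return True if the two signatures' DN ranges overlap in the given band,
    a rough indicator of spectral confusion in that band."""
    a_min, a_max = sig_a[:, band].min(), sig_a[:, band].max()
    b_min, b_max = sig_b[:, band].min(), sig_b[:, band].max()
    return a_min <= b_max and b_min <= a_max

def mean_plot(signatures):
    """Return per-band mean DN for each named signature, like a mean plot:
    signatures whose means track closely across bands may be confused."""
    return {name: pixels.mean(axis=0) for name, pixels in signatures.items()}
```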
In Exercise 3, we learned to classify images, and there are various classification options. In this case, we used Maximum Likelihood classification, which assigns each pixel to a class based on the probability that the pixel belongs to that class. This process also creates a distance file, which is very interesting: the brighter pixels signify a greater spectral Euclidean distance, so the brighter the pixel, the more likely it is misclassified, and this helps determine when more signatures are required to obtain a good classification. We also learned to merge classes, which basically uses the Recode tool we learned last week to merge all like signatures into one class (e.g., merging 4 residential signatures into 1).
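For context, a generic maximum likelihood classifier models each class as a multivariate Gaussian estimated from its signature, assigns each pixel to the most probable class, and can also report how far each pixel sits from its assigned class. The sketch below follows that generic formulation, with a Euclidean distance output loosely analogous to the distance file; it is not ERDAS Imagine's implementation, and all names are illustrative.

```python
# A minimal sketch of per-pixel maximum likelihood classification under a
# Gaussian assumption, plus a simple distance output: the spectral Euclidean
# distance from each pixel to its assigned class mean.
import numpy as np

def maximum_likelihood_classify(pixels, class_stats):
    """pixels: (n_pixels, n_bands) array of DN values.
    class_stats: dict mapping class name -> (mean vector, covariance matrix)
    estimated from the training signatures.
    Returns (labels, distances)."""
    names = list(class_stats)
    n_pixels = pixels.shape[0]
    log_likelihoods = np.empty((n_pixels, len(names)))
    for j, name in enumerate(names):
        mean, cov = class_stats[name]
        diff = pixels - mean
        inv_cov = np.linalg.inv(cov)
        # Mahalanobis term of the Gaussian log-likelihood for every pixel
        mahal = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
        log_likelihoods[:, j] = -0.5 * (mahal + np.log(np.linalg.det(cov)))
    best = log_likelihoods.argmax(axis=1)          # most probable class per pixel
    labels = np.array(names)[best]
    # Euclidean distance from each pixel to its assigned class mean; larger
    # (brighter) values flag pixels that fit their class poorly.
    means = np.array([class_stats[n][0] for n in names])
    distances = np.linalg.norm(pixels - means[best], axis=1)
    return labels, distances
```

Merging classes after classification is then just a lookup that maps several class labels (say, four residential signatures) onto one output value, which is essentially what the Recode tool does.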
In the final portion of the lab, we reclassified land use for the area of Germantown, Maryland in order to track the dramatic increase in land consumption over the last 30 years. Initially, I used the Drawing tool to draw polygons at the Inquire points, but I wasn't getting a good classification result, so I switched to creating the signatures from "seed" points, using the Inquire points as the seeds. I created the signatures based on the coordinates given in the lab and used the histograms to select bands with little spectral confusion. Based on the histograms and the mean plots, I used bands 5, 4, and 6 for my image. My reclassified image is below.
I ran into some problems after recoding, as the classification is not as accurate as I had hoped. Many of the urban areas are misclassified as either roads or agricultural areas. The areas classified as water are classified well, as, surprisingly, are the vegetation classes; since they are spectrally similar, I anticipated more misclassified vegetation than there is. The two land use classes that are overclassified (more pixels assigned to them than are actually present) are roads and agriculture, and I believe many urban/residential areas are misclassified as agriculture or roads, likely due to similarities between their spectral signatures (possibly in band 6). I did not have time to add more signatures than detailed in the lab, and I think doing so would have improved my classification. Even so, I think supervised classification is an excellent technique, and I'm glad I learned how to use it.