Monday, April 27, 2015

Cartographic Skills Final Project

Our final project was designed to pull a semester's worth of learning together into a final map for this course. The objective was to take on the role of someone working at the U.S. Department of Education and create a map of 2013 mean SAT (or ACT) scores by state and the participation rate of high school graduates, with the idea that it would be published in the Washington Post.

We were to obtain a basemap from an outside source and apply a suitable projection to that basemap. We had to input the SAT data into Excel and import the tabular data into ArcMap. We wanted to present two datasets on one map (participation rate and test scores) and classify and symbolize the data in a way that easily communicates the information.

For this project, I chose to use SAT data from the U.S. Census Bureau. I downloaded the basemap (also from the U.S. Census Bureau) and projected it into the Albers Conical Equal Area projection. This initially caused problems with my scale bar, because I defined the projection before adding the data; when I restarted the project and added the data first, the scale bar problem disappeared. After creating the Excel file, I added the data table and joined it with the states shapefile using Joins and Relates, making sure to keep only matching records (there were a lot of "null" values when I selected the other option).

Once they were joined, I could make some map decisions. First, I wanted to display the participation rate as a choropleth map, which is appropriate here because participation rates change abruptly at enumeration boundaries (in this case, the state boundaries). I chose a suitable color scheme, using light colors for smaller values and darker colors for higher values. I used the natural breaks classification scheme because it displayed the data well and kept like values clustered together in the same class. The natural breaks method minimizes the variance within classes and maximizes the variance between classes, which I think works well with this dataset. To display the second dataset, I copied the original joined states/tabular layer and overlaid it on top of the first one. Using this second layer, I could symbolize the SAT scores while still being able to see the choropleth map of participation rates. For this data, I decided to use graduated symbols, again with a natural breaks classification scheme.
The graduated symbols allow the map reader to easily understand the information and, I felt, displayed it best: small circles represent lower mean test scores and larger circles represent higher scores, which is an intuitive way of presenting the data. Again, I used the natural breaks classification scheme to keep the range within each class fairly uniform and to keep like values together in a single class as much as possible. I used a total of 5 classes for both datasets. For the choropleth map, many more classes would make the color variation so small that the classes would be hard to distinguish. With the graduated symbols, 5 classes seemed to be the right amount, and I didn't want my symbols to become too small or too large. Once I had decided on the symbology and classification, I inserted my legend and scale bar, labeled the states, and exported the map to finish in CorelDraw.
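The idea behind natural breaks can be sketched in a few lines of Python. This is a brute-force illustration of the optimization, not ArcMap's actual (much faster) Jenks implementation, and the participation-rate values are made up for the example:

```python
from itertools import combinations

def natural_breaks(values, n_classes):
    """Brute-force Jenks-style natural breaks: pick class boundaries that
    minimize the total within-class sum of squared deviations."""
    data = sorted(values)

    def sdam(chunk):  # sum of squared deviations from the chunk mean
        m = sum(chunk) / len(chunk)
        return sum((v - m) ** 2 for v in chunk)

    best_breaks, best_cost = None, float("inf")
    # try every choice of n_classes - 1 interior cut positions
    for cuts in combinations(range(1, len(data)), n_classes - 1):
        bounds = (0,) + cuts + (len(data),)
        cost = sum(sdam(data[a:b]) for a, b in zip(bounds, bounds[1:]))
        if cost < best_cost:
            best_cost = cost
            best_breaks = [data[b - 1] for b in bounds[1:]]
    return best_breaks  # upper value of each class

# hypothetical participation rates (percent) for a handful of states
rates = [3, 4, 5, 6, 20, 22, 25, 60, 65, 70, 93, 95]
print(natural_breaks(rates, 3))
```

Notice how the breaks land between the natural clusters in the data, which is exactly why like values end up in the same class.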

Most of the work I did in CorelDraw was moving things around to make the map look neat and professional. I arranged things so that the graduated symbols and the state labels were not overlapping each other or a state boundary. Sometimes this meant placing the label and/or symbol off to the side and drawing a leader line to the map. To stretch the map some without stretching the symbology, I placed the graduated symbols in their own layer, separate from the map. I also made sure to stretch the scale bar with the map, since I imported it and wanted it to remain accurate. I resized both Alaska and Hawaii and created an inset using CorelDraw's drawing features, noting that they are not to scale. Finally, I added the essential elements (title, data source, who created it) and used an effective background color that emphasizes the map.

I really enjoyed making this map once my projection issues were sorted out. I liked that it drew on everything we've learned this semester, from early topics (data classification and symbology) to more recent ones (joining tabular data). This course has taught me that putting a lot of thought into the story I'm trying to tell is essential to creating an effective map. I really enjoyed this course and I hope to carry what I learned here into my future endeavors.


Saturday, April 25, 2015

Final Project

Our final project took a lot of time and effort and really required us to use everything we've learned all semester to produce a well-polished product. Our assignment was to take the role of a GIS analyst hired as a consultant by Florida Power and Light (FPL) to perform GIS analysis for the proposed Bobwhite-Manatee transmission line. We were given several criteria and were to produce maps and analysis to determine whether the FPL preferred route satisfied them. We were then to create a PowerPoint presentation to explain our process and conclusions. The maps for this project were produced entirely in ArcMap.

Criterion #1: It has relatively few homes in close proximity. This criterion exists because of concerns that electromagnetic radiation could be hazardous to health. For this assignment, "close proximity" means within 400 feet. For the map I created to demonstrate compliance, I added the aerial imagery provided and the FPL preferred corridor, and created a buffer layer extending 400 feet outside the corridor. To identify homes, I created another layer and digitized the structures that appeared to be homes. I zoomed in and out, starting at the top of the corridor (and buffer) areas, and placed a point wherever I believed there was a home within either the buffer or the corridor. I added a field to the attribute table and coded it "1" if the home was within the corridor or "2" if it was within the buffer (but outside the corridor).

The most difficult part was identifying homes. The ones with swimming pools in the backyard were fairly obvious, but there were some structures I discounted because they were too large to be a home (and didn't appear to be an apartment complex) or were in an industrial area. When I was still unsure, I erred on the side of caution and digitized the structure as a home. Once I had this layer on my map, I symbolized homes within the corridor as green points and homes within the 400-foot buffer but outside the corridor as white points. Relatively few homes are impacted, so this criterion appears to be well satisfied.

We also want to minimize the impact on land parcels, so I created a second map showing this. I added the land parcels of Manatee and Sarasota counties, used the Intersect tool to separate them into parcels from each county within the FPL corridor and within the 400-foot buffer, and displayed this on my map. Many more parcels are affected in Manatee County than in Sarasota County. Much of this appears to be either public land or, at the very least, not a major impact on homeowners, given the small number of homes affected. Regardless, the number of parcels impacted is a main factor in my cost analysis definition of suburban land (as opposed to rural land).

Criterion #2: It generally avoids schools and school sites. Again, this criterion is in place because of health concerns about electromagnetic radiation, so we can fortunately reuse the same 400-foot buffer from the first criterion. For this map, I kept the FPL corridor and buffer layers, but added separate layers for school and daycare sites. Since this data covered the entire state, I clipped it to the extent of the study area, which I displayed with a line fill that I think shows the area well. I showed schools in the study area as red school symbols and daycares as green school symbols. I did not need to clip these to the buffer or corridor regions, as there are clearly none there. This route is an excellent choice based on this criterion.

Criterion #3: It avoids large areas of environmentally sensitive lands. To evaluate this criterion, I created two maps. First, I wanted to compare the land parcels identified as conservation land with the area within the FPL preferred corridor. I clipped the conservation land layer to the extent of the study area, and also to the extent of the preferred corridor. The red polygons within the FPL corridor are where the corridor encroaches on conservation land. The main point of this map is that while there are over 14,000 acres of conservation land within the study area, the FPL Preferred Corridor only encroaches on 164 acres of it. The second map focused on displaying the wetlands and uplands in the study area and the corridor. I clipped the wetlands (and uplands) to the extent of the study area, and employed the same technique for the extent of the FPL corridor. Using the attribute table, I categorized the wetlands into lakes, swamps and marshes, and rivers, and made each category easily distinguishable. To determine the acreage within the FPL preferred corridor, I used Select By Attributes to select each wetland type. Of the 19,000 acres of wetlands in the study area, the FPL corridor goes through approximately 900 acres. Based on both of these maps, the FPL Preferred Corridor does indeed manage to avoid large areas of environmentally sensitive lands.
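The select-and-sum step amounts to grouping the clipped wetland polygons by category and totaling their acreage. A small sketch of that bookkeeping in Python, with entirely hypothetical acreage records:

```python
from collections import defaultdict

# hypothetical clipped wetland records: (wetland_type, acres)
clipped_wetlands = [
    ("swamps and marshes", 410.2),
    ("lakes", 35.7),
    ("swamps and marshes", 388.9),
    ("rivers", 22.4),
    ("lakes", 18.1),
]

def acreage_by_type(records):
    """Mimic Select By Attributes plus a sum: total acres per category."""
    totals = defaultdict(float)
    for wtype, acres in records:
        totals[wtype] += acres
    return dict(totals)

totals = acreage_by_type(clipped_wetlands)
print(totals)
print(f"total wetlands in corridor: {sum(totals.values()):.1f} acres")
```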

Criterion #4: The line can be built along this route for a reasonable cost. For the cost estimate, I relied heavily on the .pdf about transmission line costs provided on the course website. On this map, I show the study area and the FPL preferred corridor. To determine the length of the new line, I zoomed in closely and, starting at the top of the corridor, traced the centerline with the Measure tool down to the bottom extent of the corridor, determining that the new transmission line will be just under 25 miles long. I drew a line on my map showing the route. Using the .pdf, I determined the cost to be $1,100,000 per mile; this value assumes a single-circuit line using tubular steel pole construction. I categorized the area as suburban, since the route crosses a moderate number of land parcels, which multiplies the cost of the transmission line by 1.2. The new line is over 20 miles long, so no extra cost is attributed to its length. This put the total engineering and construction cost of the new Bobwhite-Manatee transmission line at $32,760,000. This seems reasonable, and the transmission line route is relatively straight, so I believe this is one of the least expensive routes for the new line.
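The arithmetic behind the estimate is simple to check: cost = miles × $1,100,000 × 1.2. Working backwards from the $32,760,000 total recovers the measured route length:

```python
COST_PER_MILE = 1_100_000    # single-circuit, tubular steel pole (from the cost .pdf)
SUBURBAN_MULTIPLIER = 1.2    # suburban land crossing

def line_cost(miles):
    """Engineering + construction estimate for a suburban single-circuit line."""
    return miles * COST_PER_MILE * SUBURBAN_MULTIPLIER

# back out the measured route length implied by the $32,760,000 estimate
implied_miles = 32_760_000 / (COST_PER_MILE * SUBURBAN_MULTIPLIER)
print(f"{implied_miles:.2f} miles")
```

The result comes out just under 25 miles, consistent with the length measured along the corridor centerline.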

Conclusions: Based on the four criteria, this seems to be an excellent route for the new Bobwhite-Manatee transmission line. There are few homes and schools in the area, it minimizes its impact on land parcels and environmentally sensitive land, and it appears to be a very efficient route.

We also needed to create a PowerPoint presentation and a slide-by-slide summary, which you can see in the hyperlinks below.

Alan Hickford Final Presentation

Slide-by-slide summary of Final Presentation


Friday, April 10, 2015

Module 12 - Google Earth

This week we learned a little more about Google Earth, specifically about how to convert maps from ArcGIS to KML format, how to create Google Earth maps, and how to record Google Earth tours.

The objective of this assignment was to:
1) display our dot density map of south Florida in Google Earth, and
2) create a Google Earth tour of southern Florida's major cities.

First, I opened ArcMap and my dot density map from a couple of weeks ago, as I wanted to convert it into KML format, which can be opened with Google Earth. I also went into Google Earth and turned off the unnecessary layers. I used the Map to KML tool (in ArcToolbox > Conversion Tools > To KML) to convert my dot map to KML, making sure to check the "convert vector to raster" box, which turns all layers into image files. Although this doesn't allow us to edit the old features, it reduces the file size and the time required to run the conversion tool. I added the SouthFlorida shapefile, converted it to KML format using the Layer to KML tool, and copied both files to my computer.
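Under the hood a KML file is just XML, so a single-placemark version of what these conversion tools write can be sketched with the standard library alone (the coordinates, placemark name, and file name here are purely illustrative, and real exports add styles, folders, and geometry for every feature):

```python
import os
import tempfile

# Minimal single-placemark KML document
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>
  </Document>
</kml>
"""

def write_placemark(path, name, lon, lat):
    """Write a one-placemark KML file and return its text."""
    kml = KML_TEMPLATE.format(name=name, lon=lon, lat=lat)
    with open(path, "w") as f:
        f.write(kml)
    return kml

out = os.path.join(tempfile.gettempdir(), "miami.kml")
print(write_placemark(out, "Downtown Miami", -80.1918, 25.7617))
```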

One of the more interesting (and occasionally frustrating) parts of this lab was in using and setting up the layers in Google Earth. It was tricky figuring out how to change the altitudes and drawing order of the layers to get the map the way I wanted. For example, the counties layer needed to be at the "highest" altitude so that you could click on them and see the attributes, but I also needed to adjust the transparency and colors so I could see the rest of the map (I went with eliminating the fill and just showing the outline). I also played with the altitude of the density dots, displaying them at a higher altitude.

Next, we wanted to create a tour of several locations in south Florida. Creating the tour was rather straightforward. I placed placemarks wherever I wanted a tour stop, then turned off their layers so the yellow placemarks wouldn't be visible on the tour. I also turned off the legend so the whole map could be seen during the tour. I started with a view of all of south Florida and hit record, waited a few seconds, clicked on the next stop of the tour, and so on, until I reached the last stop, returned to the original view, and stopped recording. This is where I ran into some issues: although I turned the dots layer off and on while recording, as described in the lab, that didn't carry over to the playback. The layer is either always on or always off (I could choose which during playback; it just wouldn't toggle the way I did it while recording). Other than that, the tour feature in Google Earth works very well and is a fun way to look at a map. The screenshot below is from my map, showing downtown Miami. Again, the legend is turned off here to get a better look at the area.


Monday, April 6, 2015

Module 13 - Georeferencing, Editing, and ArcScene

This week we jumped right into georeferencing data, editing features, and an introduction to ArcScene. The major topics we learned were:

 - Georeferencing data using the Control Points tool
 - Georeferencing an unknown raster image to known vector data
 - Interpreting residual and Root Mean Square errors
 - Digitizing building and road features
 - Practicing polynomial transformations
 - Creating hyperlinks in ArcGIS
 - Creating multiple ring buffers
 - Overlaying data in a 3D environment

First, we learned georeferencing in ArcMap. After adding the feature layers and the UWF north and south photos, I noticed that the photos did not line up, because the layer didn't come with an accompanying world file. Starting with the north photo and using the Georeferencing toolbar, I fit the photo to the display. To line the image up properly, I used the Add Control Points tool: I identified a common point on both the known and unknown layers and linked them by clicking first on the location in the unknown raster image, then on the known location in the reference data. This shifted the image so the layers lined up better. I did this for 10 different points, making sure to spread them throughout the image.

Viewing the link table and the RMS error shows how accurately the data lines up. The residual value shows how well each link agrees with how the layer is currently displayed; a lower value usually means the control point is more accurately georeferenced. However, if multiple control points are all offset by a similar amount, the transformation absorbs that shared offset, so a high residual value could actually belong to a more accurately georeferenced point (it would be an outlier). Just looking at the residual isn't enough; you need to visually inspect the map as well. The Root Mean Square (RMS) error is a measure of the differences between the predicted and actual values, and indicates the accuracy of the georeferencing. After updating the georeferencing, I performed the same process with the southern photo. This photo was more distorted, so the residual error values were different, and I needed to change the transformation to a 2nd order polynomial. A higher order transformation allows the raster to bend and warp more than lower order transformations do.
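The residual and RMS calculations behind the link table can be sketched directly: each residual is the distance between where the transformation puts a control point and where it actually belongs, and the RMS error is the root of the mean squared residuals. The link coordinates below are made up for the example:

```python
from math import hypot, sqrt

# hypothetical link table: predicted (transformed) vs. actual reference coords
links = [
    # (x_pred, y_pred, x_actual, y_actual)
    (1200.0, 3400.0, 1203.0, 3404.0),
    (2250.0, 1810.0, 2248.0, 1808.5),
    (3100.0, 2900.0, 3100.5, 2899.0),
]

def residual(xp, yp, xa, ya):
    """Distance between where the link predicts the point and where it is."""
    return hypot(xa - xp, ya - yp)

residuals = [residual(*link) for link in links]
rms = sqrt(sum(r * r for r in residuals) / len(residuals))
print([f"{r:.2f}" for r in residuals], f"RMS = {rms:.2f}")
```

Note how a single large residual (an outlier) dominates the RMS because the residuals are squared before averaging.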

Next, we learned about editing. With the Editor toolbar, we are able to create and edit data. For me, the most important thing to remember about editing is that I have to save my edits or I will lose them when I close the editor or the program. I wanted to digitize the UWF gym, so I started an editing session on the Buildings layer and chose to create a polygon. I used straight segments to outline the perimeter of the building, placing many vertices a short distance apart to maintain accuracy. I edited the attribute table and saved the edits. Then I digitized a UWF campus road. This was basically the same process, except that I digitized along the centerline of the road. I also used the Snapping toolbar and enabled edge snapping to snap the ends of that road to the intersecting roads, then saved my edits.

The multiple ring buffer tool was fairly simple and effective. We were given a scenario in which we want to create a new conservation easement for a bald eagle nest on campus property. UWF shows a 330-foot easement already in place around the nest, but the FWC requires a 660-foot protected area. Showing this with the Multiple Ring Buffer tool was quite easy; you just input the buffer distances and set the dissolve option to ALL (if you want the tool to generate the output as a single feature).
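The logic the ring buffer captures can be sketched by classifying a point into the innermost ring it falls inside. The nest location and the test points below are hypothetical:

```python
from math import hypot

NEST = (0.0, 0.0)                # hypothetical nest location (feet)
RING_DISTANCES = [330.0, 660.0]  # UWF easement, then FWC protected area

def ring_for(point, nest=NEST, rings=RING_DISTANCES):
    """Return the innermost ring a point falls inside, the way a multiple
    ring buffer classifies features by distance; None if outside all rings."""
    d = hypot(point[0] - nest[0], point[1] - nest[1])
    for r in rings:
        if d <= r:
            return r
    return None

print(ring_for((100, 200)))   # about 224 ft from the nest: 330 ft easement
print(ring_for((300, 400)))   # 500 ft from the nest: 660 ft FWC area
print(ring_for((600, 600)))   # about 849 ft from the nest: outside both
```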

Hyperlinks are a useful tool to have as well. In an editing session on the Eagle's Nest layer, I edited the attribute table to add the student web address with my username, and saved the edits. Then I went into the layer properties and selected to show the content as a URL.

After all this, I needed to show a map of the UWF campus as well as show the Protection Buffer in an inset. Just zooming in on the buffer layer wouldn't really show where the buffers were in comparison to the campus, so I showed both on my inset map, with a basemap underneath.


Finally, we learned about creating 3D scenes. In ArcScene, I added the feature layers and photos, as well as the DEM. We had an introduction to ArcScene in last week's module in the other course, so I knew what to expect from my output. I selected "Floating on a custom surface" with the DEM layer, and to eliminate the black line between the raster images, I added a layer offset. To get the 3D appearance, I extruded the features by height, selecting the option to add the extrusion value to each feature's maximum height; this way the buildings are kept level. I also changed the vertical exaggeration to emphasize the 3D effect. I exported the scene to 2D, saved it, and added it to ArcMap. From there, I created a map with the essential map elements, making sure the two features I digitized were displayed. I changed the color of the digitized features to red so they can be seen more easily. That map is shown below. On to the final!
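The "add to each feature's maximum height" extrusion can be sketched numerically: every vertex's roof gets the footprint's highest base elevation plus the building height, so the roof stays flat even on sloping terrain. The elevations and height below are hypothetical:

```python
def extrude_level(base_z_values, building_height):
    """Extrude a footprint so the roof stays level: add the height to the
    footprint's maximum base elevation (my understanding of ArcScene's
    "add it to each feature's maximum height" option)."""
    roof_z = max(base_z_values) + building_height
    return [(z, roof_z) for z in base_z_values]  # (floor, roof) per vertex

# hypothetical footprint on sloping terrain, 10 m tall building
walls = extrude_level([30.0, 31.5, 29.0, 30.5], 10.0)
print(walls)  # every vertex shares the same roof elevation
```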


Wednesday, April 1, 2015

Module 11: 3D Mapping

This week's lab introduced us to 3D mapping. I enjoyed learning about this topic, as I've seen and been interested in 3D maps but have never created them or had much exposure to them. This lab had 3 parts. The first was the ESRI training, which taught us visualization techniques. The second involved learning some applications and converting 2D data to 3D, and the third reinforced some more 3D concepts. While 3D maps aren't ideal in every situation (road maps, or maps where structures or the 3D element can obscure whatever you wish to see), they are very useful and visually pleasing, and I enjoyed learning about them. Some learning objectives were:

- Performing techniques to visualize raster and feature data in 3D
- Converting 2D features to 3D using lidar-derived elevation values
- Demonstrating proficiency with the 3D Analyst extension and ArcScene
- Exporting data to KMZ and viewing it in Google Earth
- Describing applications of 3D data

The ESRI training guide walked us through 3D mapping and 5 main concepts/tools:

1. Setting base heights for raster and feature data
2. Setting vertical exaggeration
3. Setting illumination and background color
4. Extruding buildings and wells
5. Extruding parcel values

Setting base heights was very useful and seems to be used in all aspects of 3D mapping. Setting the vertical exaggeration is very useful as well; it's usually used to exaggerate small variations in terrain so they are easier to visualize, although in one step of the module I needed to decrease the vertical exaggeration instead. Learning about illumination and background color was also informative: I was able to change the illumination depending on the time of year and time of day, and you can even change the weather if you desire (at least in ArcGlobe). When extruding buildings and wells, I extruded the buildings upward to show their heights and the wells downward to show their depths. I think extruding parcel values was one of the more useful parts of this lab. I was able to extrude buildings based on their total value, which shows that data well and would be very useful in urban planning scenarios. The ESRI training also taught us 3D visualization terminology and concepts, such as z-values, raster data, triangulated irregular networks (TINs), terrain datasets, multipatch features, and 3D features.

The second part of the lab involved converting 2D data to 3D. The raster surface for this dataset was provided, but it was derived using lidar (light detection and ranging). This part of the lab was mainly about learning 3D Analyst and its capabilities. After adding the building footprint and Boston.tif files, I used 3D Analyst to create random points. The objective is to generate these random points within the building shapefile, add surface (elevation) information to those points, and summarize it for each building. We created 100 random points per building (more points per building, of course, means a finer resolution). I added the surface information and used the Summary Statistics tool. Here is where I will need to be careful in the future: if I join the mean Z value to the building footprint layer using the wrong field (OID being the wrong one), each building gets the information for the building next to it, because OID values range from 1-343 while FID and CID values range from 0-342. Just like in programming, this looks like it would be very difficult to spot once I progressed further through an exercise. I performed the join and exported the data to save it. I extruded the features in the layer, using the expression builder to extrude to "Mean Z." We wanted to share the data so that it's viewable on a platform such as Google Earth, so I used the Layer to KML tool to create a .kmz file of my data, which can then be opened in Google Earth.
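The off-by-one danger is easy to reproduce with a toy join; the field names and values here are hypothetical stand-ins for the real tables:

```python
# Toy illustration of the OID/FID off-by-one join pitfall.
# FID is 0-based (0..n-1); OID here is 1-based (1..n), so joining the
# mean-Z table's OID against the footprints' FID shifts everything by one.
footprints = {fid: f"building_{fid}" for fid in range(3)}   # FID 0..2
mean_z_by_oid = {oid: 10.0 * oid for oid in range(1, 4)}    # OID 1..3

def join(targets, table, shift=0):
    """Join an attribute table onto footprints by key (+shift to correct)."""
    return {fid: table.get(fid + shift) for fid in targets}

wrong = join(footprints, mean_z_by_oid)            # FID 0 unmatched, rest shifted
right = join(footprints, mean_z_by_oid, shift=1)   # FID 0 matches OID 1, etc.
print(wrong)
print(right)
```

The "wrong" join runs without any error message, which is exactly why this kind of mistake is so hard to spot later.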

The objective of Part 3 was to compare and contrast two maps: Charles Minard's 2D map of Napoleon's Russian Campaign of 1812, and a 3D version of the same map. Both are excellent maps, but I prefer the 3D version because the advance into Russia is shown above the surface and the retreat below it. Temperature information is extruded downward (negative temperatures), and the 3D line visibly expands and shrinks as the size of the army increases and decreases. It showed every bit of information the 2D map showed, but I feel it looked better and less "cramped." To me, the 2D map tries to put almost too much on one map, and the text was very small. The 3D map also incorporated colors that stood out, making it that much easier to read and analyze.