Tuesday, March 28, 2017

Lab 7: Conducting a Distance Azimuth Tree Survey

Introduction

This lab was designed to practice surveying skills over large areas of land without the use of GPS technology or survey stations.  GPS and other advanced technology can fail or be unavailable, so it is important to know how to survey even with the most basic techniques.  This lab uses the concepts of distance and azimuth to complete a survey of the locations of trees in an area of Putnam Park.

Study Area

The survey was conducted in Putnam Park on the University of Wisconsin-Eau Claire campus along Putnam Drive.  Putnam Park is a heavily forested area that curves through part of the Eau Claire campus. Putnam Park is owned by the university and was designated a State Natural Area in 1976.

Figure 1.  The area of Putnam Park that was used for the tree survey on the University of Wisconsin- Eau Claire campus.

Methods

There are two different types of geographic data, explicit and implicit.

Explicit geographic data is data collected to within centimeters of accuracy.  This data is collected using unmanned aerial vehicles (UAVs) and is typically needed only by professional surveyors.

Implicit geographic data is collected for its relative use.  This data is not concerned with exact measurements and exact locations, but rather is generated to capture the overall layout of a geographic area.

In this lab, implicit geographic data was created using two measurements, distance and azimuth, taken from a known geographic location.  Three different surveying methods were used, each at a different location in Putnam Park, to determine the location and diameter of the trees in the area.  For each tree recorded, four values were noted: the geographic coordinates of the origin (the origin was the same for all trees collected at a given location), the distance (meters), the azimuth (degrees), and the diameter of the tree being measured (centimeters).  With the combination of these four measurements, a survey of the relative locations and diameters of the trees can be generated.
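Recovering a tree's position from these measurements is simple trigonometry: the azimuth (measured clockwise from north) and distance define an east/north offset from the origin. A minimal sketch in Python, using hypothetical values:

```python
import math

def tree_offset(distance_m, azimuth_deg):
    """Convert a distance/azimuth pair into east/north offsets (meters)
    from the survey origin. Azimuth is measured clockwise from north."""
    az = math.radians(azimuth_deg)
    east = distance_m * math.sin(az)
    north = distance_m * math.cos(az)
    return east, north

# Hypothetical tree: 20 m from the origin at an azimuth of 90 degrees (due east)
east, north = tree_offset(20.0, 90.0)
print(round(east, 2), round(north, 2))  # 20.0 0.0
```

Adding these offsets to the origin's coordinates gives each tree's relative position, which is exactly the kind of implicit geographic data this lab set out to produce.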

This lab went back to the basics, using only handheld devices and no advanced GPS systems. At each of the three locations where a different survey method was used, the specific coordinates of a specified location were first recorded.  This location served as the origin from which the distance and azimuth were measured for the trees in the area.  The origin coordinates used at each of the three locations were recorded and converted to decimal degrees.  At every location a compass was also used to record the azimuth of each tree from the origin.  The last piece of equipment used at each location was a DBH tape, a tape measure made especially to measure the diameter at breast height of a tree.  All three of these devices can be seen in Figures 2 and 3. Figure 4 shows a diagram of how to measure DBH.  The measuring tape was wrapped around the tree trunk at chest height of the surveyor to obtain the diameter of the tree at breast height.


Figure 2. Compass used at each location to measure the azimuth of each tree from the origin.

Figure 3. GPS unit used to get origin coordinates and DBH tape used to get the diameter of the recorded trees.

Figure 4. A diagram of how the DBH of trees were taken using a DBH tape.  The tape measure was wrapped around the trunk of the tree at breast height to obtain the diameter at breast height measurement. 
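The conversion of the GPS unit's degrees-minutes-seconds readout into decimal degrees, mentioned above, can be sketched as follows (the coordinate values here are hypothetical, not the actual survey origins):

```python
def dms_to_dd(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to decimal degrees.
    Southern and western hemisphere values are negative."""
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    return -dd if hemisphere in ("S", "W") else dd

# Hypothetical origin near Eau Claire, WI
lat = dms_to_dd(44, 47, 56, "N")
lon = dms_to_dd(91, 30, 13, "W")
print(round(lat, 4), round(lon, 4))  # 44.7989 -91.5036
```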

Location 1:
At location 1 the distance to each tree was found using the most high-tech of the three techniques: a laser distance finder (Figure 5).  This device works by looking through the viewfinder and lining it up with the object whose distance is desired.  At this location the compass was not used; instead the laser distance finder gave both the distance and azimuth measurements.  The benefit of this technique is that the surveyor does not need to physically move to the object being surveyed but can stay in one location during the survey.  This survey technique could be used to survey a wide variety of objects with vertical height.

At this location the coordinates of the origin were first recorded using the GPS device.  Next the laser distance finder was used at the origin to target each tree and measure its distance from the origin.  The laser distance finder also gave the azimuth of the targeted tree. Lastly the DBH tape was used to measure the diameter at breast height of each surveyed tree.  All four of these measurements were recorded.  One problem encountered was that trees could only be measured if they were in direct line of sight from the origin.  If a tree was directly behind another tree or a building, it could not be recorded from that origin point.  Another problem with the laser distance finder was that the measurement needed to be taken at eye level on the tree.  It takes practice to identify this spot, and it could be difficult if the tree was located at a different elevation than the surveyor.  To avoid these problems, the only trees surveyed during this session were at the same elevation as the surveyor and in plain sight from the origin, not hidden behind other objects.

Figure 5. Laser distance finder used at location 1 to find distance and azimuth.

Location 2:
Location 2 used the least advanced technology of the three surveying techniques.  A tape measure (Figure 6) was used to measure the distance of the surveyed tree from the origin.  The benefit of this technique is that it is extremely low-tech; there is no worry that the device will run out of battery or have technological issues because it does not operate on batteries.  This type of surveying could be preferable in extreme climates that would drain battery life quickly.

At this location the coordinates of the origin were first recorded using the GPS device.  Next, at the origin, the azimuth of the targeted tree was taken using the compass. Then the tape measure was used by having one person at the origin hold one end while another person walked the other end out to the surveyed tree, giving the distance from the origin to the tree.  Lastly the DBH tape was used to measure the diameter at breast height of each surveyed tree.  All four of these measurements were recorded.  Problems encountered were that only trees in the direct line of sight of the origin, with no interference from other objects in the park, could be measured.  The tape measure needed to stretch directly from the origin to the surveyed tree; it could not bend around another object, or the measurement would not be accurate.  Another problem was keeping the tape measure taut: the farther the tree was from the origin, the more likely the tape was to sag.  Finally, the surveyor had to physically walk to the object being surveyed to get the distance; if a river or a mud pit had to be crossed, this method would not be realistic to use.  To avoid these problems, location 2 was surveyed in an area of the park that was more open and had a 360-degree span of trees.  The location also had terrain that was easy to walk on and trees a short distance from the origin.

Figure 6. Tape Measure used at location 2 to find distance.

Location 3:
At location 3 a distance finder device was used.  The device had two parts: one held at the origin and one held against the tree.  The part at the origin was pointed at the part held against the tree, and once a button on the origin part was pressed, the tree part would beep.  The distance in meters between the two parts would display on the screen of the origin-held part.  The benefit of this survey method is that one physical component of the device targets another, leading to consistency and accuracy.  This device could be used to survey the distance to a wide variety of objects or locations. The limitation of this method is that one component has to be physically moved to the surveyed object, so it could not be used in hazardous locations.

At this location the coordinates of the origin were first recorded using the GPS device.  Next, at the origin, the azimuth of the targeted tree was taken using the compass. Then the distance finder device was used to get the distance from the origin to the tree.  Lastly the DBH tape was used to measure the diameter at breast height of each surveyed tree.  All four of these measurements were recorded in a notebook.  Problems encountered were that if a large object, like a large tree, separated the two parts of the distance finder, the distance could not be registered.  Also, as with the tape measure, the surveyor had to physically move to the object being surveyed to get the distance, so crossing a river or a mud pit would make this method unrealistic.  Unlike the tape measure, however, there was no physical component between the two ends, so if a river could be crossed by a bridge, the distance could still be recorded across it.  To avoid these problems, the origin was placed in an open area that did not have trees blocking other trees.


Figure 7. Two-part distance finder device used at location 3 to find distance.



Following the survey, the four measurements for each of the three locations were manually entered and organized in an Excel spreadsheet (Figure 8).  At each location ten trees were surveyed, resulting in a survey of 30 trees.

Figure 8. Each of the four measurements (coordinates, distance, azimuth, and tree diameter) for each tree surveyed, in an Excel spreadsheet.

After the data was organized and entered into Excel, the spreadsheet was imported into a new file geodatabase.  The table was brought into ArcMap and the points were assessed to check that the data appeared correct.  Next, two tools were run on the data points.  First, the Data Management tool Bearing Distance to Line was used by entering the Excel table as the input table, setting the x field to the longitude column, the y field to the latitude column, the distance field to the distance column, and the bearing field to the azimuth column.  This tool creates a new feature class containing line features after taking into account the x and y coordinates, distance, and azimuth of each record.  The second tool run was the Data Management tool Feature Vertices to Points.  Its input was the output of the Bearing Distance to Line tool.  This tool creates a new feature class containing points taken from the vertices of the line features created by Bearing Distance to Line. In Figure 9, the output from both tools in ArcMap is shown.
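For intuition, the geometry these two tools perform can be sketched in plain Python: Bearing Distance to Line turns each table row into a two-vertex line from the origin to the measured point, and Feature Vertices to Points collects those vertices. This sketch uses a crude meters-to-degrees approximation rather than the tools' actual geodesic math, and the coordinates are hypothetical:

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate meters per degree of latitude

def bearing_distance_to_line(lat, lon, distance_m, azimuth_deg):
    """Approximate the Bearing Distance to Line output for one record: a line
    from the origin to the point at the given distance and azimuth
    (azimuth measured clockwise from north)."""
    az = math.radians(azimuth_deg)
    dlat = distance_m * math.cos(az) / M_PER_DEG_LAT
    dlon = distance_m * math.sin(az) / (M_PER_DEG_LAT * math.cos(math.radians(lat)))
    return [(lon, lat), (lon + dlon, lat + dlat)]

def feature_vertices_to_points(lines):
    """Approximate Feature Vertices to Points: collect every vertex of every line."""
    return [vertex for line in lines for vertex in line]

# Hypothetical survey row: origin at 44.7990 N, -91.5000 E, tree 25 m away at azimuth 135
line = bearing_distance_to_line(44.7990, -91.5000, 25.0, 135.0)
points = feature_vertices_to_points([line])
print(points[-1])  # the tree's approximate (lon, lat)
```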

Figure 9. The output of the Bearing Distance to Line and Feature Vertices to Points tools in ArcMap.

Results

The plotting of the surveyed tree locations worked well, and every surveyed tree was appropriately represented in the results.  The main concern regarding the results of the study is that Putnam Park is a very hilly area with many different elevations; with all three surveying methods there is some concern that elevation changes might have skewed the distance measurements.  This data could be used as implicit geographic data.  As for the diameter measurements of the trees, those are expected to be fairly accurate, with the only concern being that the DBH would vary with the surveyor's breast height, which could skew the measurements.  Another concern is that the software symbolized the origin points as trees; they are not trees but rather the origin points from which the data was measured.  Overall the data appears to be an accurate and reliable data source.  Figures 10 and 11 show the results from the survey.


Figure 10.  The surveyed locations of the 30 trees. Locations 1, 2, and 3 are located in order from left to right in the image.


Figure 11.  The 30 trees surveyed in Putnam Park symbolized according to their DBH measurements.
Sources

Wisconsin Department of Natural Resources. (n.d.). Retrieved March 27, 2017, from http://dnr.wi.gov/topic/Lands/naturalareas/index.asp?SNA=134.

Monday, March 13, 2017

Lab 6: Construction of a point cloud data set, true orthomosaic, and digital surface model using Pix4D software

Introduction

This lab worked with Pix4D, a premier software package for creating point clouds.  Previously captured images were imported into the software and processed into a point cloud and an animated flyover.

Basic Overview of Pix4D

This software is easy to use and user friendly, and the Pix4D user guide is readily available online.  The Pix4Dmapper software is an image-processing package that operates by finding thousands of common points between images.  A characteristic point found in an image is called a keypoint, and when two keypoints found in two different images are identified as the same, they are called matched keypoints. Each group of matched keypoints forms a single 3D point.  The more overlap that exists between two images, the more matched keypoints there will be, increasing the accuracy of the 3D points created.

  • Common Questions:
    • What is the overlap needed for Pix4D to process imagery?
      • The ideal image overlap depends on the type of terrain and the objects to be reconstructed.  The generally recommended overlap for most cases is to have at least 75% frontal overlap with 60% side overlap.  An example of a correct flight path with proper overlap can be seen in Figure 1. 
Figure 1. Model of a correct flight path that can be taken to 
acquire correct overlap from the Pix4D user guide.
    • What if the user is flying over sand/snow, or uniform fields?
      • If the user is flying over snow, sand, or a large uniform area, the overlap will need to be increased because in these cases, there is very little visual content due to the high uniformity.  
      • A high overlap of at least 85% frontal overlap and at least 70% side overlap must be used.  Also the exposure settings should be set to get as much contrast as possible in the image.   
    • What is Rapid Check?
      • Rapid Check is a program that runs inside of Pix4D.  It is an alternative initial processing mode that prioritizes speed over accuracy: imagery is processed very quickly, but the accuracy is relatively low.
    • Can Pix4D process multiple flights? What does the pilot need to maintain if so?
      • Pix4D can process images taken from multiple flights, as long as there is enough overlap between the flights and the images were taken under the same conditions.  The same conditions include weather, cloud cover, sun direction, and terrain that is generally the same, with no new buildings or developments.  In Figure 2 the difference between two flight paths having enough overlap and not enough overlap can be seen.
      • In order for multiple flights to be processed, the pilot needs to make sure spatial resolution is maintained between the flights by maintaining the same flight height.
Figure 2. Models of the correct and incorrect amount of 
overlap needed between two flights for Pix4D processing to occur.  
Model taken from the Pix4D user guide.

    • Can Pix4D process oblique images? What type of data do you need if so?
      • Pix4D can produce high-quality point clouds from oblique images.  Oblique imagery consists of aerial photographs taken at roughly a 45-degree angle to the ground.  This type of imagery is helpful in the reconstruction of buildings because it provides images not just of the tops of objects but also of their sides.  This type of image processing requires a specific image acquisition plan.  For example, when reconstructing a building, the building should be flown around a first time with a camera angle of 45 degrees.  It should then be flown around a second and third time, increasing the flight height while decreasing the camera angle with each round.  Enough overlap should be ensured with a 5 to 10 degree change between each image; larger buildings require images taken at smaller increments.  The Orthoplane tool needs to be utilized when processing oblique images.
Figure 3. An example of how images should be acquired for 
oblique image processing of a building for Pix4D software. 

    • Are GCPs necessary for Pix4D? When are they highly recommended?
      • In an area of interest, GCPs are points of known coordinates.  These coordinates are obtained by using traditional surveying methods or other sources like LiDAR or other maps of the area.  GCPs are highly recommended, however they are not required for the use of Pix4Dmapper.  The accuracy of projects is increased when GCPs are used and GCPs can be used to check the accuracy of points on the map.     
    • What is the quality report?
      • The quality report is automatically generated for each output created by Pix4D.  It gives an assessment of the image quality of the software's output: a general quality check covering the images, dataset, camera optimization, matching, and georeferencing.  It gives a preview of what the outputs will look like before densification.  The quality report includes an image that shows where each image was taken, rendering the flight path of the images.  It also gives additional quality measurements and data to help the user better understand the quality of their data.
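The overlap recommendations above translate directly into flight-planning numbers: the spacing between flight lines follows from the image footprint width and the side overlap, and the distance between exposures follows from the footprint length and the frontal overlap. A small sketch, using an assumed (hypothetical) footprint size:

```python
def flight_spacing(footprint_w_m, footprint_l_m, side_overlap, frontal_overlap):
    """Given an image's ground footprint (meters) and the desired overlaps
    (as fractions), return the flight-line spacing and the along-track
    distance between exposures."""
    line_spacing = footprint_w_m * (1.0 - side_overlap)
    exposure_spacing = footprint_l_m * (1.0 - frontal_overlap)
    return line_spacing, exposure_spacing

# Hypothetical 100 m x 75 m footprint with the recommended 60% side / 75% frontal overlap
print(flight_spacing(100.0, 75.0, 0.60, 0.75))  # (40.0, 18.75)
```

Raising the overlaps to the 70%/85% recommended for snow, sand, or uniform fields shrinks both spacings, which is why those flights need more images over the same area.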
Using the Software

This demo with Pix4Dmapper was completed with imagery already taken of the Litchfield mine site.  First the Pix4Dmapper Pro software was opened and Create a New Project was selected.  In the New Project window the name of the project and the location for the new project to save to were set.  The name given to the new project becomes the name of the folder where all of the outputs of the program will be saved.  Next was selected at the bottom of the window, and in the following screen all images to be processed were imported.  After selecting Next again, the next window prompts the user to select image properties.  The Pix4D software reads the EXIF metadata to determine the proper coordinate system associated with the data if the images are geotagged.  The camera model is also detected from the metadata.  Both of these parameters can be changed if necessary.  While changing the camera model, the shutter model should also be checked to make sure it is set correctly.  In the next window, the Select Output Coordinate System window, the default is usually accepted.  The next window lets the user select the processing options template.  There are a variety to choose from, such as 3D Maps, 3D Models, Ag Multispectral, and more.  The best one should be selected for your project needs; typically the 3D Maps option is used.  Finish can now be selected at the bottom of the window.

The Map View screen will then appear, showing the overall layout of the flight pattern.  This is a good preliminary check of the data quality the user is working with.  If the data points appear organized in a linear, precise layout, the data quality can be expected to be good; if the data appears scattered with no sort of pattern, chances are there is not enough overlap between images and the rendered data quality will be poor.  For the example being completed the data is very organized and linear, so good data quality can be expected.  In the processing toolbar at the bottom of the window, Point Cloud and Mesh and DSM, Orthomosaic, and Index should be unchecked, with only Initial Processing checked.  Processing options can also be customized if desired.  After reviewing this and making any changes, Start can be selected to begin initial processing.  This takes time, but once complete an initial quality report is created (Figure 4).  The quality report gives an assessment of the image quality; for this example the image quality is good and there are no concerns.  The quality report includes a preview of the data (Figure 5), a diagram showing the initial image positions (Figure 6), and further data accuracy analysis. The data will also appear in the window as it does in Figure 7.  When processing is complete all steps will appear green; if there is an error, it will appear red.

For the final step, processing is run again, but this time Initial Processing is unchecked and Point Cloud and Mesh and DSM, Orthomosaic, and Index are both checked.  After selecting Start there is a long wait period.  Once Pix4D is complete the data should appear as it does in Figure 8.  Further work with the data can then be conducted; an example includes creating an animation. The output DSM and orthomosaic can be found in folders located where the save parameters were set.  The folders appear as they do in Figure 9.

Figure 4. Quality report for the Litchfield mine site data.


Figure 5. Preview of data within the quality report.

Figure 6. Initial image positions layout from the quality report; the flight path can be seen.


Figure 7. How the Litchfield mine data appears after running initial processing.  All of the steps are complete and the processing is done because the steps all appear green.  There are no errors because none of the steps appear red. 

Figure 8.  How the Litchfield mine data appears after running the point cloud and mesh, and DSM, orthomosaic, and index processing.  

Figure 9. What the output folders look like after completing all of the processing steps.  They are saved in the designated location in a folder named after the project title.


Discussion

The orthographic imagery and DEM created from using the Pix4D software were used to create the maps in Figure 10 and Figure 11.

In Figure 10 it can be seen that at the Litchfield mine site there are many piles of what appears to be a light colored gravel.  There is minimal vegetation surrounding the mine site.  There are wide paths located between the piles of mined gravel allowing for vehicles to enter the mine site to export the mined resources.
Figure 10.  An orthomosaic image of the Litchfield mine in Eau Claire County, WI, created with Pix4D software.


In Figure 11 it can be seen that the Litchfield mine is of relatively low elevations with the exception of small patches of higher elevation where the mined resources are piled.  The two highest elevations on the east and south edges of the DSM are not parts of the mine operation.  The highest pile of mined resources is located in the mid-east portion of the mine.  The largest pile is located in the central north of the mine.     

Figure 11.  A DSM of the Litchfield mine in Eau Claire County, WI created by Pix4D software. 


An animation of the Litchfield mine was also created.  The animation "flies" the viewer over the area of interest, giving a full 3D view of the study area.


Video 1. Animation of the Litchfield mine site.

Final Critique of Pix4D software

Pix4D software contains a very thorough and capable set of tools.  In terms of creating a DSM, an orthomosaic, and an animation, this software is very good and makes quality outputs.  The software has many applications and can be useful across fields and disciplines.  It is easy to use and the user manual is easy to read.  The GCP tool set has yet to be experimented with, but if it operates as well as the tools and processes used in this lab, this software will be quite impressive.  This is a great tool set to learn and the applications are endless.

Monday, March 6, 2017

Lab 4: Development of a Field Navigation Map

Introduction

The purpose of this lab was to create two functional navigation maps to be used in a future exercise.  Navigation requires two sets of tools.  First, tools to perform the actual navigation are needed; these range from the stars to GPS systems, or even a simple map.  Second, a location system is needed to assign values to geographic locations.  Location systems often consist of some variation of a coordinate system and some type of projection.  In this lab two types of coordinate systems were used: the UTM coordinate system and a world Geographic Coordinate System in decimal degrees.

Methods

Two navigation maps of the same study area were created during this lab using different coordinate systems: UTM and GCS.

Study Area
The study area used for the navigation maps was a piece of land called the Priory.  The Priory is an area of land owned and operated by the University of Wisconsin-Eau Claire.  The land is used as a private dormitory and includes a 120-acre wooded area that serves as a children's nature preserve.  Maps of the Priory in relation to the University of Wisconsin-Eau Claire and of the Priory itself are in Figure 1 and Figure 2.

Figure 1. Image highlighting the location of the Priory in reference to the University of Wisconsin-Eau Claire main campus.

Figure 2. Image displaying the appearance of the Priory campus.

Coordinate Systems

Coordinate systems are reference systems used to identify the locations of land features, imagery, and geographic observations.  Coordinate systems allow different data sets to be combined when the data sets share the same coordinate system.

Coordinate systems are defined by four key features:

  • The measurement framework
    • The framework can either be geographic, meaning spherical coordinates are obtained from measurements from the earth's center, or planimetric, meaning the earth's coordinates have been projected onto a 2D flat surface.
  • The units of measurement
    • Latitude and longitude measurements typically use decimal degrees, while projected coordinate systems usually use feet or meters.
  • The definition of the map projection for projected coordinate systems.
  • The properties of other used measurement systems
    • examples: spheroid reference, datum, standard parallels 


UTM

The UTM, or Universal Transverse Mercator, coordinate system is a specialized version of the Transverse Mercator projection.  This projection divides the globe into 60 zones, each split into a north and a south half.  Each zone encompasses 6° of longitude and has its own central meridian.  A specific UTM zone is selected based on which zone the study area resides in; within each UTM zone there is minimal distortion of the land it covers.  The 48 contiguous states of the United States fall within 10 different UTM zones.  These specific zones are highlighted in Figure 3.

Figure 3. The UTM zones of the 48 contiguous states of the United States.
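Because the zones are a regular 6° grid, the zone a study area falls in can be computed directly from its longitude. A minimal sketch (the `% 60` guards the 180° edge case):

```python
import math

def utm_zone(longitude_deg):
    """Return the UTM zone number (1-60) for a longitude in decimal degrees.
    Zone 1 starts at 180 degrees W; each zone spans 6 degrees."""
    return int(math.floor((longitude_deg + 180.0) / 6.0)) % 60 + 1

# Eau Claire, WI sits near -91.5 degrees longitude
print(utm_zone(-91.5))  # 15
```

This matches the zone used for the Priory maps below, UTM Zone 15N (the N simply records that the study area is in the northern hemisphere).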

GCS

The GCS, or Geographic Coordinate System, utilizes a 3D spherical surface to define locations on earth.  A GCS utilizes an angular unit of measure, a prime meridian, and a spheroid-based datum.  Longitude and latitude are calculated from the angle measured from the center of the earth to the location on the earth's surface.  These angles are often written as decimal degrees.  Figure 4 shows a model of the earth divided by lines of longitude and latitude.

Figure 4. Model of the globe demonstrating how longitude and latitude lines are used to calculate decimal degrees of locations on earth.


Results & Discussion

The following navigation maps were created by using either a UTM or GCS coordinate system.

In order to create the UTM coordinate system map, all of the datasets had to share the same projection.  All of the data sets for the UTM Priory map (the basemap, contour intervals, and study area boundary) were projected to NAD 1983 UTM Zone 15N.  A 50 meter UTM grid was added to the map for navigation reference purposes.  The map created can be seen in Figure 5.

In order to create the GCS coordinate system map, all of the datasets again had to share the same coordinate system.  All of the data sets for the GCS Priory map (the basemap, contour intervals, and study area boundary) were projected to the Geographic Coordinate System North American 1983.  A decimal degrees grid of longitude and latitude lines was added to the map for navigation reference purposes.  The map created can be seen in Figure 6.

The maps were created with a cautious eye so as not to allow the maps to become too busy.  Oftentimes maps are created using all available data sets; the purpose of the map can then become lost, and the map can become so busy it is unusable.  The basemap was used in both maps because, when navigating, it can be useful to match the terrain you are seeing with the terrain on the map for location purposes.  The contour lines were clipped to be only slightly larger than the navigation boundary to reduce the busyness that would have occurred if the contour lines had covered the whole map.  The navigation boundary was included because it is important to know the general area you are located in and narrow down the possible locations.





Figure 5. A UTM coordinate system map of the Priory displayed using a 50 meter UTM grid.


Figure 6.  A Geographic Coordinate System map of the Priory displayed using a decimal degrees grid.

Conclusion

This lab provided valuable insight into how to create navigation maps and the importance of understanding how different coordinate systems can be used.  We now often rely on technology to fulfill our navigation needs through the increasing availability of GPS navigation systems, but technology is not always available for use.  It is important to know how to use simple navigation maps to fulfill our navigation needs.

Sources

Coordinate systems, map projections, and geographic (datum) transformations. (n.d.). Retrieved March 06, 2017, from http://resources.esri.com/help/9.3/arcgisengine/dotnet/89b720a5-7339-44bo-8b58-0f5bf2843393.htm.

Map Basics. (n.d.). Retrieved March 04, 2017, from http://www.edc.uri.edu/nrs/classes/nrs409509/Lectures/4MapBasics/Map_Basics.htm.

Priory Hall. (n.d.). Retrieved March 04, 2017, from http://www.uwec.edu/Housing/residencehalls/priory/priory-hall.htm.

What are geographic coordinate systems? (n.d.). Retrieved March 05, 2017, from http://desktop.arcgis.com/en/arcmap/10.3/guide-books/map-projections/about-geographic-coordinate-systems.htm.

Sunday, March 5, 2017

Lab 5: Using Survey 123 to gather survey data using your smart phone

Introduction

Smartphones are becoming more and more advanced.  These devices can be used to collect data by connecting them to data collection devices or by utilizing the GPS capabilities within the device itself.  This lab uses the online ESRI tutorial to demonstrate how Survey123 for ArcGIS can be used to gather survey-based field data.

About Survey 123

Survey 123 is a website that makes accurate field data collection easy. Survey 123 for ArcGIS is also an app that can be downloaded onto smartphones and used to collect field data.  Survey 123 allows users to create automated surveys that can be made available to a wide range of users.  The website also allows you to analyze the results of the survey in a variety of ways and to create a web app to share the survey results.

Methods 

For this lab, the Get Started with Survey 123 for ArcGIS tutorial was completed.  The tutorial consisted of four different lessons, each explained in Figure 1. The survey was created to help a homeowners association (HOA) develop a tool to support their community in becoming prepared for disasters like home fires and earthquakes.  The survey gathers data about nine Fix-It safety checks to determine how prepared for disaster the community is.
Figure 1. The four lessons completed for the Get Started with Survey 123 for ArcGIS tutorial.

To begin the tutorial, the Survey 123 for ArcGIS webpage had to be opened; its homepage appears in Figure 2.  To gain access, the user must log in with an ArcGIS Online or ArcGIS Enterprise account.


Figure 2. The Survey 123 for ArcGIS homepage.

Lesson 1: Create a Survey


A web survey was created called the "HOA Emergency Preparedness Survey".  Survey 123 is equipped with a user-friendly interface that makes generating a survey easy.  Figure 3 shows the toolbar that lets the user select which types of survey questions to add to the survey.  Once selected, a question type can be customized to the options and layout the user desires.  Many different question types were utilized in the tutorial; Figure 4 and Figure 5 show some of the survey questions created.  Survey questions can also be linked so that if the participant selects one answer, a second question appears to follow up on it.  For example, if "Single family (house)" is selected for the question "What type of residence do you live in?", another question appears asking "How many levels does your home have?".  This follow-up only appears for that answer because it would not be relevant if the alternative answer "Multi-family (apartment, condo)" was selected.

Figure 3. The Survey 123 toolbar showing all of the different types of survey questions that can be created. 

Figure 4. Samples of Number, Image, and Multiple Choice questions created for the survey tutorial. 

Figure 5. A sample of the numerous Single Choice and Yes/No questions made for the tutorial survey.

Lesson 2: Complete and submit the Survey

Once the survey is set up in the desired fashion, it should be checked for spelling errors, because once the survey is submitted it cannot be revised or changed.  After submission, settings can be changed so that the desired audience can access the survey.  The group of people allowed to take the survey can be set to Everyone (Public), Members of my organization, or customizable groups.  In the Survey 123 webpage for the created survey, the main toolbar includes several tabs related to the survey data (Overview, Design, Collaborate, Analyze, Data) and one that gives the URL for the survey.  The created survey can be accessed through that URL or through the Survey 123 for ArcGIS app, which can be downloaded onto smartphones or tablets.  Figure 6 shows what the free downloadable app looks like.  At this point in the tutorial survey data is needed, so the user completes the survey at least six times using the app and the website.  Each time the survey is completed, a friendly notification, pictured in Figure 7, displays saying the data was properly submitted.

Figure 6. The free Survey 123 for ArcGIS downloadable app for smartphones and tablets.

Figure 7. The pop-up display that appears after the survey has been successfully taken. 

Lesson 3: Analyze the survey data

After the survey data was collected, it was examined through graphs and maps highlighting trends in the data.  For each survey question, a graph is automatically rendered for user analysis.  The graph can be set to a column, bar, pie, or map representation of the data.  Figure 8 shows a column representation of the data from one of the survey questions. Statistics are also automatically generated for the results of each survey question, as pictured in Figure 9.

Figure 8. Column chart for the results of one of the tutorial survey questions.

Figure 9. Statistics for the results of the "What type of residence do you live in?" survey question. 

Lesson 4: Share your survey data

In the ArcGIS Map Viewer connected to Survey 123, a web map can be created to display survey results. Pop-ups for each data point can be configured to display the attributes that are desirable to viewers while keeping the confidentiality of survey participants where applicable.  A web app can also be created to allow a different viewing experience of the map. Figure 10 displays the webpage used to customize the web app.

Figure 10. Web app configuration page.

Results

The two main products created in the Survey 123 online tutorial were a survey and a web app used to display the data.  The URL for each is given below.  A screenshot of the web app is shown in Figure 11.

URL for the survey created:
https://survey123.arcgis.com/share/ad825a0e454b46ff86e3cadda98b8abd

URL for the survey results Web App: https://arcg.is/11bCKf

Spatial patterns can be seen as the data is observed.  The study was conducted among a small study population at the University of Wisconsin-Eau Claire in Eau Claire, WI, so it makes sense that most homes appear in Eau Claire, other Wisconsin cities, and Minnesota.  The most common household sampled was a single-family house with two to three levels. The average number of people in each household was 4, with a minimum of 2 and a maximum of 6.  The age range with the greatest representation within the households sampled was 18-60 years old, which makes sense since the survey was conducted on a college campus.  Generally, most homes sampled do not have computers, televisions, cabinets, or bookshelves secured.  The item most commonly found in each household was a tent. If the survey were given to a larger population, more reliable conclusions and clearer spatial patterns could be drawn.


Figure 11. A screenshot of the web app for the survey data points.


Conclusions

Survey 123 for ArcGIS might be useful in my future research.  I am a Biology and Geography double major with a minor in Environmental Science, so in the future I plan to be working on environmental research.  I can certainly see myself using Survey 123 in my future career.  This app would work well in the field.  For example, if I were collecting soil samples in the field, I could first create a survey in the lab with questions like "soil sample number", "appearance", and "depth sampled at", or even a location field where the exact GPS location of where the sample was taken could be saved.  I could even include a picture field to photograph the sample site or the sample itself for future reference and save it in the app.  When I conducted the soil sampling, all of my survey data would be neatly organized in one location, minimizing the risk of losing data or disorganizing my samples.  Survey 123 for ArcGIS has many different applications and is a great tool that can be utilized by field researchers everywhere.

Sources


Get Started with Survey123 for ArcGIS. (n.d.). Retrieved March 05, 2017, from https://learn.arcgis.com/en/projects/get-started-with-survery123/.


Monday, February 20, 2017

Lab 3: Cartographic Fundamentals: Essentials in map creation, description, and interpretation


Introduction

Lab 3 worked to refine cartographic skills and proper map interpretation methods.  The fundamental map elements required in each map were emphasized: title, watermark, data sources, locator map, scale bar, and north arrow.

Map Creations & Interpretations

Displayed below are five different maps that were created using ArcMap and PowerPoint. 

Figure 1 gives a map of the data collected in Labs 1 and 2.  The data was processed through a spline interpolation method to be displayed on the map.  In analyzing the map, there is an elevated plateau in the northwest corner, a hill rising out of a valley in the northeast corner, and a semicircular ridge in the southeast corner.  These three features lie in a higher elevation range, closer to the maximum height of 0.11 cm than any other features in the sandbox, with the plateau in the northwest corner having the greatest elevation.  In the southwest corner there is a depression, the lowest feature in the sandbox, lying close to -11.72 cm relative to the sandbox's sea level.  The 360° three-dimensional rotation of the terrain highlights that the land surrounding these features is slightly uneven but lies in the mid elevations and remains relatively flat.  The locator map displays the University of Wisconsin-Eau Claire campus where the data was collected; the sandbox was located east of Phillips Science Hall.  Figure 2 gives the statistics for the sandbox data set. All data points fell in the range of -11.72 to 0.11 cm.  The mean of the data set was -5.23 cm with a standard deviation of 1.97 cm.  The flat terrain covering the greatest portion of the map around the created features most likely lies near -5.23 cm. The maximum of the range lies in the northwest and the minimum in the southwest.  
Figure 1. Sandbox terrain data collected in labs 1 and 2 displayed using the spline interpolation method. 


Figure 2. Statistics calculated for the spline created from the sandbox survey sample points.  All statistical data is given in centimeters. 
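The summary statistics in Figure 2 can be reproduced directly from the sampled Z values.  A minimal sketch, assuming the values are loaded into a Python list (the values below are made-up placeholders, and ArcGIS may use the sample rather than the population standard deviation):

```python
import statistics

# Hypothetical subset of sampled Z values (cm); the real survey used 400 points.
z_values = [-5.2, -4.8, -11.72, 0.11, -6.3, -5.0]

z_min = min(z_values)
z_max = max(z_values)
z_mean = statistics.mean(z_values)
z_stdev = statistics.pstdev(z_values)  # population standard deviation

print(f"range: {z_min} to {z_max} cm")
print(f"mean: {z_mean:.2f} cm, std dev: {z_stdev:.2f} cm")
```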

The map in Figure 3 shows the grave sites at the Hadlyville Cemetery, located in Eau Claire County, Wisconsin.  Each grave site has been labeled by last name.  Notice that family members have been buried close to one another: four Petersen family members reside in the center of the cemetery, and the Sessions family grave sites lie in the northeast corner.  The map also displays that couples were buried together; the Chases and the Hadleys are each buried in pairs on the south side of the cemetery, closest to the farm field.  Generally, grave sites are most concentrated in the northeast and southwest corners of the cemetery, close to the tree line of the forest.      

Figure 3. The grave sites of the Hadlyville Cemetery by last name. 

The map in Figure 4 is of the grave sites of the Hadlyville Cemetery located in Eau Claire County, Wisconsin.  Each grave site has been labeled according to the year of death of its occupant.  This data is missing for several grave sites, the majority of which are located on the eastern portion of the cemetery.  There is no real organization to the location of grave sites related to year of death; the graves tend to be randomly scattered in that respect.  The most recent death was in 2006 and is located in the northeast corner of the cemetery.  The earliest year of death at this cemetery was 1859, located in the middle of the western portion of the cemetery.  Most of the grave sites are from the nineteenth and twentieth centuries.   

Figure 4. The grave sites of the Hadlyville Cemetery labeled by year of death.  

The map in Figure 5 is of the grave sites of the Hadlyville Cemetery located in Eau Claire County, Wisconsin.  Each grave site has been symbolized according to whether the grave is standing.  The status of several graves is unknown, and the majority of the graves are standing.  Only four of the grave sites with known status are not standing; these lie in the southeastern and southwestern corners along the forest tree line and in the middle of the cemetery.   

Figure 5. The grave sites of the Hadlyville Cemetery symbolized by whether the grave is standing.  

The map in Figure 6 is of the grave sites of the Hadlyville Cemetery located in Eau Claire County, Wisconsin.  Each grave site has been symbolized according to a range of years of death.  Using this symbology, it can be seen that there is no system to the location of burial related to year of death.  In the southeast corner of the cemetery only older graves exist, but otherwise the years of death are scattered throughout the cemetery.  

Figure 6.  The grave sites of the Hadlyville Cemetery symbolized by year of death.  


Conclusions

Creating a properly constructed map is crucial for accurate map interpretation.  The symbology and attributes of a map have a large effect on the information that can be extracted from a map.   

Monday, February 13, 2017

Lab 2: Sandbox Survey Part 2: Visualizing and refining the terrain survey

Introduction

In Lab 1, a systematic point sampling technique was used to sample the terrain of a sandbox using a grid system.  An Excel file of X, Y, and Z values was created for 400 sample points.  Figure 1 shows a sample of what that Excel file looked like.
Figure 1. A piece of the Excel data table containing the sampling points.  The data in this table has been normalized. 

Lab 2 is a follow-up to Lab 1, where the survey data points created in Lab 1 are brought into ArcGIS and used to create a surface terrain that matches the data collected.  If the sampling method was effective, the digital terrain model should match what the surface of the sandbox looked like when the sampling points were taken in Lab 1.  

The first step in preparing the survey data for ArcGIS is a process called data normalization.  Data normalization is part of data management, and involves organizing, analyzing, and setting up the data in a way that promotes efficiency.  This process is especially important for data that will be reused and shared.   For this lab, the data had to be normalized in Excel before it could be imported into ArcGIS.  This was done by placing the survey points in a systematic layout with the X, Y, and Z values in their correct columns. Each row represents a coordinate, and in this lab the data will be displayed so that the Z value for each X, Y coordinate point can be visualized.  The data featured in Figure 1 has been normalized. 
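The column layout described above can also be produced with a short script.  A minimal sketch, assuming the raw points are (x, y, z) tuples; the file name survey_points.csv is a hypothetical example:

```python
import csv

# Hypothetical raw sample points (x, y, z) in grid units and centimeters.
points = [(0, 0, -5.2), (1, 0, -4.8), (0, 1, -6.3), (1, 1, -11.72)]

# Write one point per row with X, Y, Z in their own columns --
# the layout ArcMap's Add XY Data tool expects.
with open("survey_points.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["X", "Y", "Z"])
    writer.writerows(points)
```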

Methods   

After the data was normalized in Excel, a geodatabase was created in ArcCatalog where the Excel file could be imported.  In Excel the data was set as numeric, and the single Excel table was imported into the geodatabase.  In ArcMap the table was brought in using the Add XY Data function; the data appeared as it does in Figure 2.  Once the X, Y accuracy was confirmed by the points appearing in a systematic grid, a feature class containing the Z values was created from the data.  The data appeared perfectly in a systematic grid, so there was no need to make any changes to the data or improvements to the survey.                                       
Figure 2. When the sampled data points were brought into ArcMap this grid was formed.  The grid was observed to make sure no data points were entered incorrectly and the X, Y data appeared in a systematic grid. 
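That visual grid check can also be scripted.  The sketch below is a hypothetical helper, not part of the lab workflow, that verifies every cell of an expected grid is present, assuming the imported points are (x, y) tuples:

```python
# Hypothetical check that imported X, Y points form a complete
# systematic grid (the Lab 1 survey used 400 points).
def is_complete_grid(points, n_cols, n_rows):
    expected = {(x, y) for x in range(n_cols) for y in range(n_rows)}
    return set(points) == expected

# Toy 3 x 2 example: all six cells are present, so the grid is complete.
sample = [(x, y) for x in range(3) for y in range(2)]
print(is_complete_grid(sample, 3, 2))  # True
```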

Five different interpolation methods were conducted on the data points to create a continuous surface: IDW, Natural Neighbor, Kriging, Spline, and TIN.  To conduct these interpolations both the Spatial Analyst and 3D Analyst extensions in ArcMap must be active.   

  • IDW: Inverse Distance Weighted interpolation
    • Cell values are determined through a function of inverse distance that weights a combination of nearby sample points. This method operates under the assumption that a variable's influence decreases the farther a location is from a sample point.  
    • Places weight on specific points; not a good choice when points are not densely packed.
    • ArcToolbox-> Spatial Analyst Tools-> Interpolation-> IDW
  • Natural Neighbor:
    • An interpolation value is estimated by applying an algorithm that finds the closest points to an area and applies appropriate weights to them. 
    • Peaks, ridges, and other unusual features do not get properly represented because interpolated values must lie within the range of the input values.
    • ArcToolbox-> Spatial Analyst Tools-> Interpolation-> Natural Neighbor
  • Kriging:
    • A formula that weights surrounding measured Z-values to estimate the Z-values for unmeasured locations. 
    • Requires measured Z-values, but does a good job at making predictions for unmeasured Z-values.
    • ArcToolbox-> 3D Analyst-> Raster Interpolation-> Kriging Tool
  • Spline:
    • A mathematical function is used to fit a terrain with minimized surface curvature to specific input points.  Can be thought of as a piece of rubber being bent and molded around set input points.
    • Creates a digitally smooth surface, not ideal for representing small changes in elevation. 
    • ArcToolbox-> Spatial Analyst Tools-> Interpolation-> Spline
  • TIN: Triangulated Irregular Network
    • A form of vector-based digital geographic data that is created by using data points.  The data points are connected by a series of triangles creating sharp edges. 
    • Creates a digitized appearance; does not create smooth edges, and features appear more pointed than they are.  Able to handle data sets of varying complexity by adjusting the triangulation method. 
    • ArcToolbox-> 3D Analyst-> Data Management-> Tin-> Create Tin
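As a concrete illustration of the first method above, IDW can be sketched in a few lines of plain Python.  This is a simplified version of the idea behind the ArcMap tool, not its actual implementation; the power parameter and sample points below are made-up placeholders:

```python
import math

def idw(x, y, samples, power=2):
    """Estimate z at (x, y) from (sx, sy, sz) samples by weighting
    each sample by the inverse of its distance raised to `power`."""
    num = 0.0
    den = 0.0
    for sx, sy, sz in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return sz  # coincident with a sample point: exact value
        w = 1.0 / d ** power
        num += w * sz
        den += w
    return num / den

# Toy samples (x, y, z in cm): nearer points dominate the estimate.
samples = [(0, 0, -5.0), (1, 0, -4.0), (0, 1, -6.0)]
print(idw(0.1, 0.1, samples))
```

Note how the estimate at a sample point is exact while values in between are pulled toward the nearest samples, which is exactly the "bubbly" behavior discussed in the IDW results below.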

Each interpolation method was run in ArcMap, and then the feature class was displayed in ArcScene, where the terrain took on a 3D view.  In ArcScene, elevation surfaces were set to float on a custom surface.  Visibility was set to render the layer at all times, with the light settings set to use smooth shading if possible and to shade areal features relative to the scene's light position.
  
Results & Discussion

For reference, Figure 3 is an image from the Lab 1 sandbox survey that shows what the actual terrain of the real-life sandbox looked like.  The best interpolation model will appear most like Figure 3 and most accurately represent the surface of the sandbox.


Figure 3. The terrain of the sandbox sampled to create the following interpolated images.


IDW
Figure 4 is a classic IDW example because the surface created appears bubbly.  This model places great emphasis on the sample point values and then decreases that influence in the areas around each point.  In Figure 4, clear circular bumps and indents can be seen all over the image.  In Figure 5, the vertical changes between points can be seen; especially in the bottom right-hand corner, on the "C", various mountain shapes appear on what was a straight ridge in the actual sandbox.  This method gets the elevation values at the exact sample points correct but is extremely inaccurate at locations between them.  The terrain of the sandbox was not nearly this bumpy, so this is a bad representation of what the surface of the sandbox looked like.
Figure 4. Vertical view of the IDW interpolation method.
Figure 5. 3-D visualization of the IDW interpolation method.

Natural Neighbor:
This interpolation method gives a more accurate representation of the sandbox terrain than IDW.  Each feature that was created in the sandbox can easily be seen with this method.  Figure 6 shows that the edges of the features appear rougher than they actually were in the sandbox, but the features themselves appear as smooth as they were.  In Figure 7, rough peaks can be seen on the letter "C" that were not this exaggerated in the sandbox.  Overall this method does a good job of getting the general terrain of the sandbox correct, but details like edges and peaks are not accurately represented.
Figure 6. Vertical view of the Natural Neighbor interpolation method.

Figure 7. 3-D visualization of the Natural Neighbor interpolation method.

Kriging:
This interpolation method does not do a good job of representing the terrain of the sandbox.  In Figure 8, the message written in the sandbox is not readable because this interpolation method took a more geometric approach to creating the terrain model.  The image almost appears as though disks are laid on top of each other to represent the different elevations.  The model does not give off a natural appearance but looks more digitized, a result of the weighted formula used to create it.  Figure 9 shows that the depths of the valleys and depressions are not properly represented; they are not as deep as they actually were.  The same goes for the ridges and hills, which appear flatter than they were.      
Figure 8. Vertical view of the Kriging interpolation method.

Figure 9. 3-D visualization of the Kriging interpolation method.

Spline:
This interpolation method does a good job of representing the general elevation changes in the terrain. As seen in Figure 10, each element of the terrain is visible.  The concern with this interpolation method is how smooth it makes the features appear.  In Figure 11, the over-smoothness of every feature can be seen; the "I" appears with perfectly even up-slopes, where in the actual sandbox the slope was more uneven.  Small, detailed elevation changes cannot be seen with this model, but general elevation changes are nicely represented.
Figure 10. Vertical view of the Spline interpolation method.

Figure 11. 3-D visualization of the Spline interpolation method. 

TIN:
In this interpolation method the created features in the sand can easily be seen.  In Figure 12, each feature is easily distinguishable, but the heart shape clearly shows that this method does not represent curves well; the heart appears more like a triangle.  This is understandable given that TINs are created by connecting triangles, but it makes this method less than ideal for terrains with curving features.  Figure 13 shows that the elevations are represented in a very stratified, linear fashion that is not always correct.  Edges and peaks also appear sharper than they actually were, again due to the triangulation.  Generally, though, this method does a good job representing elevation changes.
Figure 12. Vertical view of the TIN interpolation method.

Figure 13. 3-D visualization of the TIN interpolation method.


 
Conclusion

This two-lab activity of mapping the surface of a sandbox took a hands-on approach that allowed for exposure to different spatial sampling methods, data collection, data normalization, and data interpolation methods for creating 3D surface models.  A systematic point sampling technique, using a grid system, was used to collect vertices with X, Y, and Z data.  This data was normalized in Excel and imported into ArcMap, where five different interpolation methods were used: IDW, Natural Neighbor, Kriging, Spline, and TIN.  These models were displayed in ArcScene, which allowed for 3D viewing and better interpretation of the interpolated models.  Each interpolation method had advantages and disadvantages in representing the data and mapping the terrain. The methods conducted in these lab projects can be used in many other forms of gathering geospatial data.  Strings and tape measures might not be used when modeling real-world terrain, but collecting X, Y, and Z vertices and performing normalization and interpolation on the data is common.  For example, in remote sensing, LiDAR XYZ data points are collected, normalized, and interpolated. These labs worked to expose students to these methods.  
          
Sources

How IDW Works. (n.d.). retrieved February 11, 2017, from http://desktop.arcgis.com/en/arcmap/10.3/tools/3d-analyst-toolbox/how-idw-works.htm.

How Kriging Works. (n.d.). Retrieved February 11, 2017, from http://desktop.arcgis.com/en/arcmap/10.3/tools/spatial-analyst-toolbox/how-kriging-works.htm.

How Natural Neighbor Works. (n.d.). Retrieved February 11, 2017, from http://desktop.arcgis.com/en/arcmap/10.3/tools/3d-analyst-toolbox/how-natural-neighbor-works.htm.

How Spline Works.  (n.d.). Retrieved February 11, 2017, from http://desktop.arcgis.com/en/arcmap/10.3/tools/3d-analyst-toolbox/how-spline-works.htm.

Normalization - definition - Esri Support GIS Dictionary. (n.d.). Retrieved February 11, 2017, from http://support.esri.com/other-resources/gis-dictionary/term/normalization.

TIN in ArcGIS Pro. (n.d.). Retrieved February 13, 2017, from http://pro.arcgis.com/en/pro-app/help/data/tin/tin-in-arcgis-pro.htm.