Thursday, December 22, 2016

Pix4DMapper: Processing UAS Data

Introduction:

For this week's lab, our task is to create an orthomosaic image using Pix4Dmapper, a program used by geographers to build accurate, georeferenced 3D models and maps. This makes the data easy to read and easy to analyze. The task at hand is to use this program and the UAS data provided to create a map of the study area and calculate different aspects of that study area.

Using Pix4D:

The most important thing to do when using Pix4D is to make sure the images overlap, and not by a small amount: the higher the overlap, the more accurate the 3D model that can be created. Overlap matters in this program because Pix4D reconstructs the scene through aerial triangulation (AT), which makes the 3D image sharper and more accurate than it would otherwise be. The main goal of aerial triangulation is to determine the position and orientation of the images collected by the UAS; this allows the software to match features across multiple overlapping images and create a legible map of the study area.

In some cases aerial photographs do not capture as much information as needed, such as when the terrain is covered with snow or sand. These surfaces are highly reflective and show minimal visual content, so the required overlap rates are much higher. The minimums for sand and snow are 85% frontal overlap and 70% side overlap; these are the lowest values at which the images can still be processed correctly.

Pix4D also offers a rapid check feature, in which all of the data is processed very quickly to verify the flight. The disadvantage of this method is that the results have low accuracy, since the data is processed at a quick rate rather than thoroughly. The program has a few other strengths as well: it can process multiple flights at once, provided certain requirements are met, the main one being that the coordinate system is the same for all of the images. Oblique images may also be processed, but only if the overlaps are of good quality and GCPs are present.
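To make those overlap percentages concrete, here is a small sketch (not part of the lab itself) showing how frontal and side overlap translate into the spacing between photos and between flight lines. The 100 m x 75 m image footprint is a hypothetical number chosen just for illustration.

```python
def capture_spacing(footprint_along_m, footprint_across_m, frontal_overlap, side_overlap):
    """Distance between successive photos (along-track) and between
    flight lines (across-track) for a given footprint and overlap fraction."""
    along = footprint_along_m * (1 - frontal_overlap)
    across = footprint_across_m * (1 - side_overlap)
    return along, across

# Pix4D's minimums for snow/sand terrain: 85% frontal, 70% side overlap.
# Hypothetical 100 m x 75 m image footprint:
along, across = capture_spacing(100, 75, 0.85, 0.70)
print(round(along, 1), round(across, 1))  # 15.0 m between photos, 22.5 m between lines
```

The takeaway is that higher overlap means photos are taken closer together, so more images cover any one patch of ground for the aerial triangulation to match.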
GCPs (ground control points) will not be used during this project, but they are a good option because they make the results more accurate. All of the results are presented in the quality report, where the data is analyzed and the figures are created.

Results:

The class was provided with UAS imagery collected by Professor Joseph Hupy; the study area is a sand mine located in the south region of Eau Claire. To start the project, all of the images were imported into Pix4DMapper, and this is where the area of interest was chosen. The AOI was selected directly over the sand mine, and the next step was to process the images and wait for the quality report to be generated. The creation of the quality report took about 15 minutes, and the results were quite interesting. Below are figures 1-3 from the quality report, representing the information put into the report as well as maps showing the study area in different ways.


Figure 1: The quality report generated from processing and analyzing the overlapping images










Figure 2: The orthomosaic and DSM produced from the overlapping images

Figure 3: Shows where in the study area the overlap between images was strong and where it was weak

The areas in green represent locations with many overlapping images, making the results for those areas very accurate and easy to read, while the yellow and red areas contain poor overlap between images. Most of the areas with poor overlap are on the edges of the map, but the main point taken from figure 3 is that as long as the study area itself shows strong overlap between the images, the results will be positive.

Conclusion:

Overall this was a very enjoyable and informative lab; learning and using a new program to create and analyze 3D imagery expanded my knowledge of how to create legible and accurate maps. Pix4D is a very user-friendly program, making it easy to use for first-timers, and it is one of the better tools available for creating high-quality results. This course has taught me a lot throughout the semester, and I feel that ending by learning how to use a new program such as this is the perfect way to complete the course.

Tuesday, December 6, 2016

GPS Topography Survey





Introduction:

For this week's activity the class was asked to take different points around a study area located on the UW - Eau Claire campus. Each point was taken using a survey-grade GPS device, which allowed the class to place its points within 5 meters of the real-life location; a device like this makes it much easier to collect data in the field. Elevation data was recorded at every point along with its position. Using this elevation data, the class will create a continuous surface of the study area from which a map can be made in ArcMap. The most important part of creating a surface map is to take enough data points across areas of different relief so that the changes in elevation are noticeable on the map; this allows the study area to be correctly represented. The sampling method matters as well: in this case the class used a random sampling method, because it would best represent the differences in elevation across the study area. The GPS device was portable and took points at real-world locations using a tripod that needed to be level in order to take each point correctly.


Study Area:

Below is Figure 1, the study area as captured from Google Maps, so it does not show the elevation of the area.



Figure 1: The Area where the elevation points were collected by the class
Location: Area in front of Centennial Hall at the University of Wisconsin - Eau Claire campus, in the middle where the statue is located.


Methods:

The materials used in the survey were a tripod stand that needed to be level, a Topcon Tesla Field Controller, and a Topcon Hiper SR positioning unit. Below are figures 2 and 3 representing the materials used in the survey.

Figure 2: Topcon Hiper SR positioning unit on a level tripod, used in the survey to collect the elevation data points

Figure 3: Topcon Tesla Field Controller used to log each point while the positioning unit collected it


These devices connected over Bluetooth to a handheld unit that could download the collected data onto a computer. Once all of the data was collected, Professor Hupy processed it and shared it in a temporary folder as a text file. From there the class needed to transfer the data from the text file to an Excel file so that it could be used in ArcMap to create maps representing the study area. The final step was to create five different maps, each with a different interpolation. The five types of interpolation are IDW, Spline, Natural Neighbor, Kriging, and a TIN.
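The text-file-to-spreadsheet step can also be done in a few lines of code. This is just an illustrative sketch, not what the class actually ran; the point values are made up, and the real file's columns may differ.

```python
import csv
import io

# Hypothetical contents of the shared text file: point id, easting, northing, elevation
raw = """1 617998.10 4957880.25 240.12
2 618003.42 4957885.60 240.55
3 618009.77 4957891.08 241.03"""

# Write the whitespace-delimited values out as a CSV that a GIS can import
with open("survey_points.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "x", "y", "z"])  # header row for X, Y, and Z fields
    for line in io.StringIO(raw):
        writer.writerow(line.split())
```

A CSV with a clean header row drops straight into ArcMap's Add XY Data workflow, which is essentially what the Excel file was for.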

Results/Discussion:

Below are figures 4 through 8 representing each of the interpolation tools used in this survey.

Figure 4: IDW Interpolation of the elevation data points taken from the study area

Figure 5: Kriging Interpolation of the elevation data points taken from the study area

Figure 6: Natural Neighbor Interpolation of the elevation data points taken from the study area  
Figure 7: Spline Interpolation of the elevation data points taken from the study area


Figure 8: TIN Raster created from the elevation data points taken from the study area

After looking at all of the different interpolation tools used on the elevation data from the study area, I believe that the Natural Neighbor interpolation method is the most accurate of the five. It does the best job of representing the differences in elevation across the study area; the other methods did well too, but not as effectively or accurately as Natural Neighbor.


Conclusion:

I believe that this survey was an overall success. The task at hand was to collect data points from a study area selected by Professor Hupy using a survey-grade GPS device; once the data was collected, it was time to transfer it in order to create maps. The overall process was simple and easy to understand, and the GPS device made collecting the data much easier than expected. It was a great learning experience and gave the class a view of what is possible with a GPS device, especially one that is so accurate to real-life locations.

Tuesday, November 15, 2016

Microclimate

Introduction:

For this activity the class was split up into groups and assigned a zone to take data points from. Each group was equipped with a Kestrel unit, shown in figure 1 below, and a base plate compass to determine where the wind was coming from. Each student was also instructed to download Arc Collector onto his/her mobile device, because this is where all of the attribute data for each data point would be logged. My group was assigned zone 5; below is figure 2, which represents each of the 5 zones that data was taken from. All members of the class needed to record their group number, temperature, dew point, wind speed, and wind direction. Each phone contained a live feed of all the other groups, allowing us to keep track of where the other data points were being taken during the activity.

Figure 1: Kestrel unit used to collect attribute data for data points

Figure 2: Each zone where data was collected from

Methods:

While collecting data points, the tools used were Arc Collector on each student's mobile device, a Kestrel unit, and a base plate compass. The Kestrel unit was used to obtain data such as temperature, wind speed, wind direction, and dew point; a measurement of each of these attributes was taken at every data point. Once collected, the values were entered into Arc Collector, which was directly attached to a geodatabase on ArcGIS Online, making it easy to transfer all of the data points and their attribute data over to ArcMap. Once in ArcMap, each group was required to make an interpolation of each field in the attribute data. I decided to do an Inverse Distance Weighted (IDW) interpolation, which generates a map showing a value for each cell by averaging the sample data points closest to that cell, weighted by distance.
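The core of IDW is simple enough to sketch in a few lines. This is a minimal illustration of the idea, not ArcMap's actual implementation (which adds search radii and other options); the sample values are made up.

```python
def idw(points, query, power=2):
    """Inverse Distance Weighted estimate at `query` from (x, y, value) samples.
    Nearer samples get larger weights, so they dominate the average."""
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0:
            return v  # query sits exactly on a sample point
        w = 1.0 / d2 ** (power / 2)  # weight = 1 / distance^power
        num += w * v
        den += w
    return num / den

samples = [(0, 0, 10.0), (1, 0, 20.0), (0, 1, 20.0), (1, 1, 30.0)]
print(idw(samples, (0.5, 0.5)))  # equidistant samples -> plain average, 20.0
```

Raising `power` makes the surface hug the nearest measurements more tightly, which is why IDW maps often show "bullseyes" around isolated readings.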


Results:

Below are the maps created for this field activity, each representing a different field in the attribute data.

Figure 3: Showing the temperature difference between areas around UWEC campus


Overall, most areas in the figure above are about the same temperature, except for the area in the far bottom right corner. That area is much cooler, or had a much lower average temperature, compared to the other areas on the map. Below is figure 4, representing the dew point around the UWEC campus.


Figure 4: Showing the difference between dew point values in the area surrounding the UWEC campus


Looking at the figure above, once again most of the area has about the same dew point, except for the area in the bottom right corner. Dew point is a measure of the moisture in the air; since the bottom right corner has a lower dew point than the other areas, there is less moisture in the air there, which fits with that area also being cooler than most, as seen in figure 3. Below is figure 5, representing the wind speed at the location of each data point.
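The link between temperature, humidity, and dew point can be sketched with the Magnus approximation. This is only an illustration with made-up readings; the coefficients are one common textbook approximation, not whatever formula the Kestrel uses internally.

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Approximate dew point (deg C) via the Magnus formula.
    `rel_humidity` is a fraction, e.g. 0.80 for 80% RH."""
    a, b = 17.625, 243.04  # common Magnus coefficients (approximation)
    gamma = math.log(rel_humidity) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# At the same air temperature, drier air yields a lower dew point:
print(round(dew_point_c(10.0, 0.80), 1))  # humid reading
print(round(dew_point_c(10.0, 0.40), 1))  # dry reading, noticeably lower
```

This is consistent with the map: the dry corner of campus shows a dew point well below the rest even if the air temperatures are close.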


Figure 5: Showing the wind speed collected at each data point location around the UWEC campus area


The areas with the highest average wind speed are those with the greatest elevation and the areas near the river. This is to be expected: wind travels faster over flat, smooth surfaces such as water, and higher, more exposed ground experiences less friction from surrounding buildings and trees, so winds tend to be faster at higher elevations. The final figure represents the direction in which the wind is moving; below is figure 6. This figure shows the wind direction of each data point at the time it was collected, with the areas divided into sections by the bearing of the wind. In order to collect wind direction data properly, one must point in the direction the wind is coming from, not where it is moving towards.


Figure 6: Arrows representing the wind direction at each data point collected around the UWEC campus area


Conclusion:

This activity was a great experience for all of the students involved; it shows how easy it is to collect data and create maps representing it. It taught me more than expected, proving that there are programs out there specifically built to make our lives simpler when it comes to data collection.

Tuesday, November 8, 2016

Priory Navigation Part II

Introduction:

For the second part of the navigation activity, the class was divided into groups, and each group was given a different set of five points by Professor Hupy. Every group was provided with a GPS to track its current location, printed maps created in the previous activity, and a base plate compass. The GPS was only to be used for insight; the main tools for direction were the compass and the map. Here is a link to the maps used in the activity: Priory Navigation Map. Below are figures 1 and 2 representing the tools used in the lab.


Figure 1: Trimble GPS used to help guide the groups through the activity


Figure 2: Base plate compass, the main tool used to guide the groups through this activity

Study Area:

The navigation activity took place in the areas surrounding the Priory at UWEC, located at Priory Hall: 1190 Priory Rd, Eau Claire, WI 54701. The conditions for the day were sunny with a high of around 65 degrees Fahrenheit. The terrain varied depending on where a person was located: in some areas there were large trenches with steep topography, while other areas were flat and very easy to walk through.

Methods:

The first step was for each group to receive its points from Professor Hupy; the points given to group six are listed below:

Set #5 of Points:

618011, 4957883
618093, 4957823
618107, 4957942
618195, 4957878
618220, 4957840

The points above allowed group six to start the journey; since there weren't six sets of five points, group six was required to work through the fifth set of points in reverse order. Before anything else, each person's pace count was needed so that the number of steps required to reach each point could be estimated; this was done by measuring out 50 meters and counting your steps, counting the same foot every time while walking. To find each point listed above, the first step was to locate it on the map and mark it correctly; next, the base plate compass was used to find the correct direction to start walking. To do this, the edge of the compass was aligned from group six's current location to the next point. Once the needle was within the red borders on the compass, or "the red is in the shed," group six could start walking towards that point, using the GPS device basically as a last resort in case the group got lost. This process was repeated after each point was found. Each group had to provide evidence that each point was located, so a picture of each point was required. Below are the images of each point collected by group six.
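The bearing-and-pace arithmetic for each leg can be sketched in code. This is only an illustration: the 0.75 m pace length is a hypothetical stand-in for each person's measured 50 m pace count, and the azimuth here is relative to grid north on the UTM map (ignoring declination).

```python
import math

def leg(from_pt, to_pt, pace_length_m=0.75):
    """Azimuth (degrees clockwise from grid north), straight-line distance,
    and estimated pace count between two UTM (easting, northing) points."""
    de = to_pt[0] - from_pt[0]  # change in easting
    dn = to_pt[1] - from_pt[1]  # change in northing
    azimuth = math.degrees(math.atan2(de, dn)) % 360
    dist = math.hypot(de, dn)
    return azimuth, dist, round(dist / pace_length_m)

# First leg of set #5: (618011, 4957883) -> (618093, 4957823)
az, dist, paces = leg((618011, 4957883), (618093, 4957823))
print(round(az), round(dist), paces)  # roughly southeast, about 100 m
```

Working a leg out this way before setting off is essentially what "set the bearing, then walk your pace count" does with the compass and map.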


Figure 3: The first point located by group 6 or point #5 in the fifth set of points

Above was the first point located by group six. This was an especially hard point to find because it was the last point in the set of five, so it was the point furthest away. It was also on the other side of a trench, making it difficult to get to, but once it was reached the picture was taken and the process of finding the next point began.


Figure 4: The second point located by group 6 or the fourth point of the fifth set of points

This point was much easier to find because group six had started to get the hang of locating the points using just the compass, only referring to the GPS when needed. This point was located on the opposite side of the trench, so finding it required crossing the trench once again.


Figure 5: The third point located by group 6 or the third point on the fifth set of points

The third point was probably the easiest to find, because it was located on the edge of a trail; once group six had gotten out of the forest it was very easy to locate. The point wasn't covered by any trees and was basically in an open area, which made the marker very easy to spot.

Figure 6: The fourth point located by group 6 or the second point on the fifth set of points

The fourth point was also difficult, because it required group six to go back into the trench where the landscape was very steep, making it hard to walk through and hard to get down to the point. The marker for the fourth point was also well covered by the surrounding trees and leaves, making it very hard to locate.


Figure 7: The final point located by group 6 or the first point on the fifth set of points

The final point should have been easy to locate, because it was also just off the trail, but it proved harder to find because by this time the GPS device used by group six had shut off. At that point group six had no GPS to check its current location against; the only tool available was the base plate compass, making the final point harder to find.

Discussion:

This activity was very enjoyable, because it allowed each member of every group to learn a new set of skills. Navigation is a very important part of life, especially when technology fails and all that is available is a paper map; that is where these skills come in handy. Group six only had a few issues: one was that the GPS device failed near the end of the activity, and another was that after some of the points, group six's pace count wasn't always kept track of; instead, only the bearing to the next location was used. In the end, the errors that did occur were minuscule and didn't affect the field activity in any large way. Below are figures 8 and 9, representing the navigation trails of each group and the navigation trail of group six alone.


Figure 8: A map representing the trails recorded by the Trimble GPS device for each group


Reviewing the figure above, most of the trails aren't very neat, meaning that each group had problems locating the points. For instance, looking at some of the trails, one will notice clumps of points where groups had to turn around and retrace their steps to figure out the correct bearing. In some areas the trails straighten out, meaning that at certain points in the activity the groups had figured out the compass and GPS device and were using them to their advantage.


Figure 9: A map representing the individual trail of group six recorded by the Trimble GPS device

Looking at the map above, group six did not have many areas with large clumps of points, meaning the group didn't have to turn around a lot. Most of the points fall in a straight line, so the group had a good idea of the general direction needed to travel to locate each point in the fifth set.


Conclusion:

This activity taught my group members and me a lot. One: do not rely mainly on technology or a GPS device, because at some point the batteries may die and you will be lost. Two: navigational skills are very important, because if the technology dies and all you have is a compass and a map, you can still direct yourself correctly. The final point learned from this activity is that one always needs to be prepared when going out into the field; even the smallest mistakes can cause problems, so make sure all of the required information is accurate and correct.

Tuesday, November 1, 2016

Priory Navigation

Introduction:

The ways that people navigate have evolved exponentially, moving from looking at paper maps during long car drives to using the GPS on our cell phones. Usually there is no way a person can get lost nowadays, because they always have their phone on them, but what happens if the phone dies and all they have is a map? That is why the old navigation techniques will never die, and why the objective of this lab is to create two navigation maps that will be used to guide our way. For these maps to work, they need to be as accurate as possible and able to provide proper directions. For this to happen, the maps need to be in the proper coordinate system and the correct projection, so that the person reading the map will find themselves in the correct location.


Methods:

The first map created contained a UTM grid at 50-meter spacing, while the second contained a decimal degree grid. Both maps needed to contain certain elements: a north arrow, a scale bar, an RF scale, the projection, the coordinate system, a list of data sources, a watermark with the cartographer's name, and the pace count. Most of the data for the maps was supplied by Professor Joe Hupy, and with a few modifications the maps were created. Both the UTM grid and the decimal degree grid were created in ArcMap. The UTM grid was created with spacing at every 50 meters to make it easier to navigate using the map, while the decimal degree grid was made using 5-second intervals for the same reason.
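Laying out a grid like this is just picking every multiple of the spacing that falls across the map extent, which a small sketch can show. The easting extent below is a hypothetical example, not the actual bounds of the Priory map.

```python
def grid_ticks(minimum, maximum, spacing):
    """Grid-line values covering [minimum, maximum] at a fixed spacing,
    starting from the first multiple of `spacing` at or below the minimum."""
    start = (minimum // spacing) * spacing  # snap down to a grid multiple
    ticks = []
    v = start
    while v <= maximum:
        ticks.append(v)
        v += spacing
    return ticks

# Hypothetical easting extent of the map, 50 m UTM grid:
print(grid_ticks(617980, 618240, 50))
# -> [617950, 618000, 618050, 618100, 618150, 618200]
```

Snapping to round multiples is what makes the grid usable in the field: a navigator can read any labeled line and interpolate to the nearest 50 m by eye.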


Results:

Figure 1: Priory study area with a decimal degree grid separated by 5 second intervals





Figure 2: Priory study area with a UTM grid with 50 meter intervals

Conclusion:

Overall I am very excited for this lab and to learn how to correctly navigate through the selected study area using only a map. This is going to be very interesting, because it is making the class think in a different way, one that they aren't used to. It will also hopefully show the class the importance of being able to read actual maps rather than just a GPS device. This exercise helps the class understand the importance of accuracy as well, both when reading and creating maps meant for navigation: if one mistake slips through, this field activity can turn into a nightmare where groups get lost because their maps aren't accurate to real-world features.

Tuesday, October 25, 2016

Distance Azimuth Survey

Introduction:

This project had the class splitting up into groups once again, each taking data from a center point to different types of trees surrounding that point. Each group needed to create a survey plot that will prove useful when technology fails and backups are needed. The study area is a dirt trail in the middle of a forest/swamp full of different types of trees. Below is figure 1, which represents the study area where points were taken by the groups using GPS devices, and from which the distance and azimuth of each tree relative to the center point were calculated.

Figure 1: Yellow blocks represent where the data was collected,
it is located on a dirt trail behind Davies and Phillips at UWEC

The figure above represents the study areas where the data points were collected; the areas selected are located on Putnam Trail, directly behind the Davies Center on the University of Wisconsin - Eau Claire campus. This area is very swampy, which makes it a good place for a variety of different types of trees to grow, and it was selected because of that large variety of trees to take azimuth readings on. Once the groups arrived at the study area, they split up among three separate center points. The longitude and latitude were taken for each center point, and then azimuths and distances were taken to different species of trees found nearby. The groups were assigned GPS units to record the location of each tree origin in the survey. Along with the distance to each tree, more attributes were collected: the azimuth, the tree type, and the diameter. Once all of the data was collected, the groups combined it into an Excel spreadsheet so that it could be used in ArcMap. Below is figure 2, representing the Excel spreadsheet containing all of the data.



Figure 2: Excel spreadsheet for the survey containing all of the attribute data for each data point collected

The data collected included the longitude and latitude of each center point, the distance in meters from that center point to each tree chosen for the survey, the azimuth from the center point in degrees, the diameter of the tree taken at chest height, the tree species, and the sample area number. Once all of the data was collected, the groups combined it and then moved on to mapping it out using ArcMap.


Methods:

This survey contained many steps, all of which are included below:

Step 1:

Locate the study area. This area must contain a large variety of tree types and should be easy to locate using technology such as Google Maps, for data accuracy reasons.

Step 2:

Obtain the GPS devices and measuring tape from the professor and identify the center point from which all of the data will be obtained, keeping data accuracy in mind when performing these steps.

Step 3:
Select ten different trees for the survey, recording all of the data for each attribute.

Step 4:
Use a compass to obtain the azimuth by directing the compass at the tree in question for the survey.

Step 5:
Use the distance device provided by the professor to obtain the distance from the center point to the tree in question.

Step 6:
Determine the tree species from its physical features, for instance leaf shape, bark color, and texture.

Step 7:
Use the measuring tape provided to determine the diameter of the tree at chest height.

Step 8:
All members of the group record the information collected for each tree in the survey.

Step 9:
Transfer all of the data collected from the notebook to an Excel spreadsheet and combine it with the other groups' data to have more data overall.

Step 10:
Convert the Excel spreadsheet into a table in ArcMap, then run the 'Bearing Distance To Line' tool to draw the lines from the center point to the trees surveyed.

Step 11:
Use the feature class produced by the last tool and run the 'Feature Vertices To Points' tool to place a point at the end of each line, representing the tree measured from the center point.

Step 12:
Create a map of the end result from the data collected.
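The geometry behind steps 10 and 11 is a polar-to-Cartesian conversion, which can be sketched outside ArcMap. This is an illustration of the math only, with a made-up center point and record, not a reimplementation of the Esri tools.

```python
import math

def tree_position(center, azimuth_deg, distance_m):
    """Offset an (easting, northing) center point by a field-measured
    azimuth (degrees clockwise from north) and distance, which is the
    per-record geometry behind 'Bearing Distance To Line'."""
    rad = math.radians(azimuth_deg)
    return (center[0] + distance_m * math.sin(rad),   # easting component
            center[1] + distance_m * math.cos(rad))   # northing component

# Hypothetical record: a tree 12 m from the center point at azimuth 90 deg (due east)
x, y = tree_position((617500.0, 4957700.0), 90.0, 12.0)
print(round(x, 2), round(y, 2))  # shifts 12 m east, northing unchanged
```

Each survey row (center point, azimuth, distance) maps to exactly one tree location this way, which is why an organized table is all the backup method really needs.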


Below is figure 3, which represents the final map of the data collected.

Figure 3: Final points from the data collected in the field, representing each surveyed tree's distance from the center point

In figure 3, each tree collected in the survey is represented by a green tree symbol, and the distances from the center point are represented by red lines of varying lengths in meters.


Results/Discussion:

There were a few problems encountered initially, although nothing went wrong while collecting the data. When the data was first added, some of the groups entered their data incorrectly, which made a portion of the points appear in the wrong location, as can be seen below in figure 4.

Figure 4: Yellow boxes contain the data points collected from the original data 

Visible in the figure above is the data that was entered incorrectly: the longitude and latitude were wrong for the northernmost study area and the study area at the very bottom of the map. One was initially located nowhere near the site where the data was collected, and the other was located in the parking lot outside the Davies Center, approximately 30 meters in front of where it was supposed to be. This was solved by correcting a few of the numbers in the longitude and latitude so that the data appeared in the correct locations. These methods are very useful when technology fails and backups are needed; as long as the right equipment is in the possession of the surveyor, all is well. A pro of this method is that it is easy to use as long as the data is recorded in an organized fashion, for example in a table format. The technology that has surpassed this method includes rangefinders and different types of GPS units for collecting data points; points collected with a survey-grade GPS can log all of the attribute data in the field and are already compatible for transfer into ArcMap. The results taken by group 1 all seemed to be in their correct locations, because all of the data collected was entered into the Excel spreadsheet correctly. However, it is hard to tell how accurate the locations of the data points are on the map, because the tree cover is so thick it is hard to see through.


Conclusion:

This was overall a good lab; the class, including myself, was able to learn survey techniques for if/when technology fails in the field. The only major error in the survey was the incorrect GPS points logged in the spreadsheet, so it is very important to use the correct GPS points, as well as to log all of the attribute information correctly so that there aren't any mistakes in the final product. The accuracy in this project was subpar because the tree cover was so thick that the GPS locations were hard to pick up. Older equipment can be frustrating to work with at times, but it is a good skill to learn just in case it's needed. It would have been more interesting to collect even more data points so that the different groups could get a better idea of these methods, in case they are the only option in the end.










Tuesday, October 18, 2016

Field Activity #4: Digital Elevation Surface Part II

Introduction:

In the previous lab the class used a sandbox as its study area, 114 x 114 cm and filled with sand. Each group's first task was to create a variety of terrain types: a ridge, a hill, a depression, a valley, and a plain. Our group created a 19 x 19 grid system, with each point 6 cm apart from its neighbors. Once the grid was created, a measurement was taken at each point, with the top of the sandbox serving as the zero elevation level; after the data points were collected they were transferred into an Excel spreadsheet, ready for use in ArcMap. Data normalization is used to reduce data redundancy and improve data integrity; this skill was very important in this lab because it allowed the class to easily find any errors in the data and fix them. Once the data was normalized, each group had accurate and usable data, allowing it to create 3D models of its landscape. The data points collected show the difference in elevation from point to point, which allowed each group to recreate its landscape through the use of interpolation. Interpolation is a tool found in ArcMap that predicts cell values in a raster from the data points collected. The interpolation methods used in part II of this activity are IDW, Kriging, Natural Neighbor, Spline, and the creation of a TIN raster.
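Turning the sandbox grid into X, Y, Z rows is simple bookkeeping, sketched below. The sample depth readings are made up; only the 6 cm spacing and the rim-as-zero convention come from the lab.

```python
# Each sandbox cell is 6 cm apart on the grid; the box rim is elevation zero,
# so depths measured below the rim become negative z-values.
def grid_to_xyz(depths_cm, spacing_cm=6):
    """Turn a row-major grid of depth readings into (x, y, z) triples."""
    triples = []
    for row, line in enumerate(depths_cm):
        for col, depth in enumerate(line):
            triples.append((col * spacing_cm, row * spacing_cm, -depth))
    return triples

# Tiny 2 x 3 corner of the grid with hypothetical depth readings (cm below the rim):
sample = [[4.0, 5.5, 6.0],
          [3.5, 5.0, 7.5]]
points = grid_to_xyz(sample)
print(points[0], points[-1])  # -> (0, 0, -4.0) (12, 6, -7.5)
```

Rows in this X, Y, Z shape are exactly what the Excel sheet needs to hold for ArcMap to interpolate a surface, and scanning the z column for out-of-range values is the quick normalization check described above.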

Methods:

The first step for this project was to create a folder specifically for all of the data collected and maps created, and then a geodatabase within that folder. After the geodatabase was created, the Excel file with all of the data was imported into it, making sure that all of the data entered was set as numeric in Excel. The Excel file containing all X, Y, and Z data was opened in ArcMap, and then the different interpolation methods were implemented. The first is the IDW (Inverse Distance Weighted) method, which averages all of the data within each cell of the grid system; the closer a data point is to the center of the cell in question, the larger the weight it has on that cell's average. The Kriging method estimates the surface of the landscape within each individual cell from a scattered set of points with z-values. This procedure produces a prediction of what the surface looks like from the data collected in the study area, while providing a measure of accuracy for each surface prediction given. Natural Neighbor is the next method; it takes the z-value provided for each input point, finds the closest data points neighboring the query location, and creates a surface that is smooth everywhere except at the locations with input values. The Spline method estimates values for each cell using a mathematical function that minimizes the overall curvature of the surface, creating a smooth surface that passes exactly through the data points collected and making the depiction of the study area very accurate. The final method is the TIN (Triangular Irregular Network), which connects all of the data points by triangulating a set of vertices. This forms a series of connected triangles that create a surface displaying elevation.
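To make the IDW weighting concrete, here is a minimal sketch of the inverse-distance idea in plain Python. This is an illustration of the general technique, not ArcMap's actual implementation (which adds options like search radii and barriers); the function name, power parameter, and sample values are assumptions for demonstration.

```python
# Toy inverse-distance-weighted (IDW) estimator: nearer samples get
# larger weights, w = 1 / d**power.
import math

def idw(samples, qx, qy, power=2):
    """Estimate z at (qx, qy) from a list of (x, y, z) samples."""
    num = 0.0
    den = 0.0
    for x, y, z in samples:
        d = math.hypot(qx - x, qy - y)
        if d == 0:
            return z  # query point coincides with a sample
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

# Four hypothetical corner samples (cm); the query point (3, 3) is
# equidistant from all of them, so the estimate is a simple average.
samples = [(0, 0, -4), (6, 0, -2), (0, 6, 0), (6, 6, 2)]
print(round(idw(samples, 3, 3), 2))  # -1.0
```

Raising `power` makes the nearest samples dominate more strongly, which is the same trade-off the ArcMap tool exposes.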
Once each of the interpolation methods had been run in ArcMap, the data was moved into ArcScene, where the image was shown in 3-D format. In order to create a map for each of the interpolation methods, each of the 3-D images needed to be saved as a layer file so they could be opened in ArcMap and the legends from ArcScene could be used in the final maps. Next, each 3-D image was exported as a 2-D JPEG that still displays the image in 3-D perspective. The orientation used helps each viewer see most if not all of the different terrain types represented in the study area, and the scale is very important because it allows someone to tell how much elevation is present in a certain area.

Results/Discussion:

The different interpolation methods show multiple advantages and disadvantages in representing the elevation data from the study area. Some of the methods proved to show the data in similar ways, but others were better at representing certain types of terrain. Below is figure 1, which shows the results of the IDW interpolation method.
Figure 1: IDW Interpolation of Sandbox Study Area


As one can see from the image above, the areas at the zero elevation level are represented by the color green, while the areas above that level are blue and the areas below are yellow. The disadvantage of this method is that in the top right corner, the depressions aren't represented at the correct depth; each of them is shallower than it is supposed to be. There is the same problem with the valley on the left side of the map, which is also rendered too shallow. What can be taken from this image is that the IDW interpolation method doesn't represent areas below the zero elevation accurately, while the areas above that elevation level are represented accurately. Below is figure 2, representing the Kriging interpolation method of the sandbox study area.
Figure 2: Kriging Interpolation of Sandbox Study Area


After looking at figure 2, one can see that the Kriging method is probably the worst at representing elevation. The valley and depressions aren't represented correctly when compared to the study area; they are far too shallow. Meanwhile, the ridge and hill are rendered far too low, once again failing to match their real-life counterparts in the study area. Figure 2 is supposed to represent a 3-D image based on the data collected in this activity, yet it almost looks as if it is still 2-D. The method still allows somebody to see where the features are located because of the colors, but it doesn't correctly show the elevation levels it's supposed to. The map seems to show the valley at the same depth as most of the study area, which is not the case; the only other areas that should show depth values close to the valley's are the depressions. Below is figure 3, which shows the Natural Neighbor interpolation method.
Figure 3: Natural Neighbor Interpolation for Sandbox Study Area


This interpolation method does an especially good job of representing the valley and depressions; the valley is at the far left of figure 3 and colored blue, and the three depressions are at the top left of figure 3, also colored blue. The main issue with this method is that everything below the zero elevation level is rendered at the same depth. This can lead to problems, because people can misread figure 3 and misunderstand how deep some of the features are supposed to be. The next method, shown in figure 4, is the Spline interpolation method.

Figure 4: Spline Interpolation of Sandbox Study Area


This method is the best at representing the landscape that was created in the sandbox; the Spline interpolation does the best job of distinguishing what is supposed to be below the zero elevation level from what is supposed to be above it. It also does a very good job of representing gradual increases or decreases in the landscape: in the valley, for example, the walls change color as they descend, showing that the depth is increasing. The final method is shown in figure 5. While not really an interpolation method, the Triangular Irregular Network (TIN) does a very good job of representing the different elevation and depth levels of the valley, depressions, hill, and ridge. Looking at figure 5, somebody can easily see which colors are the highest (white, gray, and brown) and which represent the lowest points in the landscape (green, beige, and light blue). In the end, though, the best method at representing the entire landscape as a whole is the Spline interpolation.
Figure 5: TIN Raster of Sandbox Study Area

Revisit Survey:

When revisiting the survey of the landscape that was created, group 2 went back to the sandbox and collected more data points toward the back right of the study area, where the depressions and the ridge are located. In order to do this, the group decided to change the sampling scheme in that area from 6 x 6 cm to 3 x 6 cm, which provides more detailed digital imagery of that area. Looking at figure 6 below, one can see that the area where the depressions and ridge are located has much more detail than before in figure 4. Not only does it show a more accurate depth for the depressions, but also a more accurate height for the ridge. The colors now change continuously in those areas because of the additional data points, turning that portion of the map into a more accurate depiction of the study area.
Figure 6: Spline Interpolation of Revised Sandbox Study Area
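The densified revisit can be sketched as mixing two sampling intervals over the grid: the original coarse spacing everywhere, and a finer spacing inside the sub-area of interest. The extent of the densified area below is a made-up assumption for illustration; only the 114 cm frame and the 6 cm and 3 cm intervals come from the lab.

```python
# Sketch: halve the x-spacing from 6 cm to 3 cm inside a sub-area
# (here the right side of the box) while keeping 6 cm elsewhere.
SIZE_CM, COARSE, FINE = 114, 6, 3
DENSE_X_START = 60  # hypothetical left edge of the densified area

xs = list(range(0, DENSE_X_START, COARSE)) + \
     list(range(DENSE_X_START, SIZE_CM, FINE))
ys = range(0, SIZE_CM, COARSE)

points = [(x, y) for x in xs for y in ys]
coarse_only = (SIZE_CM // COARSE) ** 2  # the original 19 x 19 = 361

print(len(points) > coarse_only)  # densifying adds sample points: True
```

The extra rows of points are what give the Spline surface the smoother, more continuous color gradients visible in figure 6.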


Conclusion:

This survey was mainly about data normalization, which is common to many other field-based surveys that have been performed to obtain data. If somebody were to do this activity without normalizing the data, it would be very time consuming because of all the extra data points that would need to be collected. This project is different from other projects because instead of mapping an already existing landscape, the groups were able to design their own and map it themselves. It is not always this easy to create such a detailed grid-based survey; this project was manageable because it was at such a small scale. Compared to projects where someone might have to collect data for a study area spanning hundreds of acres, it would be much harder to create a grid for something that large. Elevation isn't the only type of data these interpolation methods can be used on; they also work on data such as climate or temperature. For example, by creating a grid system over a large area and taking temperature readings from each grid unit, a map of the gradual climate change across that study area can be made through the use of interpolation.

Tuesday, October 11, 2016

Field Activity #4: Creation of Digital Elevation Surface

Introduction:

For this project the class was given two bins filled with sand; these bins represented our study area. The first step in this activity was to create a variety of terrain types: a ridge, a hill, a depression, a valley, and a plain. In order to map out these terrain types, a sampling method must be set up. But first, what is sampling, especially in a spatial sense? Sampling is taking a number of points/samples to determine the big picture; in a way, it's a shortcut method to investigate the variable(s) in question. Each sample point contains data about that specific location, which allows a map to be created of the area the sample points are collected from. There are a few different ways to go about sampling this landscape. The first is random sampling, where points are randomly placed around the landscape and the data from each of those points is recorded. Hopefully the random points would fall on the terrain features, but since the sampling technique is random, there are no guarantees. Systematic sampling is another technique that could be used for this activity: you start with a random point and then move on from it at a periodic interval that doesn't change throughout the experiment. This will likely create a grid pattern, which makes all of the sample points organized and easy to collect data from. The final sampling method is stratified sampling, where the samples are divided into separate groups based on similar features or data points. The lab objective was to create a landscape in the provided bin of sand, create a grid system to map out the landscape, log the data from the grid system into an Excel spreadsheet, and finally use a computer program to create a digital image of the landscape created.
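The contrast between random and systematic sampling can be sketched in a few lines of Python. The study-area size and 6 cm interval mirror the lab; the random seed and point placement are assumptions for illustration only.

```python
# Toy comparison of systematic vs. random sampling over a
# 114 x 114 cm study area like the sandbox.
import random

SIZE_CM = 114
INTERVAL_CM = 6

# Systematic: step at a fixed interval, producing an organized grid.
systematic = [(x, y)
              for x in range(0, SIZE_CM, INTERVAL_CM)
              for y in range(0, SIZE_CM, INTERVAL_CM)]

# Random: the same number of points, scattered with no guarantee
# that any particular terrain feature is hit.
random.seed(0)
random_pts = [(random.uniform(0, SIZE_CM), random.uniform(0, SIZE_CM))
              for _ in range(len(systematic))]

print(len(systematic))  # 19 x 19 = 361 points
```

With the same point budget, the systematic grid guarantees even coverage, which is why a grid-like layout was the natural choice for the sandbox.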


Methods:

The sampling method used for this activity was stratified sampling, because it made the most sense with the landscape that was created and it would be the least time-consuming method. A similar alternative would have been the systematic sampling method. Since the stratified method was used, fewer points were taken, and the overall accuracy will be much higher because similar points are being combined. The materials used in this lab include a sandbox with a wooden frame, copious amounts of beach sand, a meter stick for measuring, push pins to mark important locations for the grid system, and string to create the grid over the landscape. The sampling scheme was 6 cm x 6 cm, because that is small enough to collect enough data for the activity, but large enough that the number of data points isn't overwhelming and hard to organize. To lay out this scheme, a meter stick was placed on the wooden frame of the sandbox, and push pins were put 6 cm apart on every side of the frame. String was then used to connect the pins directly across from each other so that the grid system could be created. Below is an image of the grid system that was created using push pins, lines of string, and a meter stick.

Figure 1: Grid System for sampling method and Topography that was created

Since the data collection started at one of the corners of the sandbox, a traditional (x, y) coordinate system was set up, where each sample correlates with two coordinates; in this project there is also a z value for the elevation of each point. The top of the wooden frame was used as the reference point for zero elevation. To record all of the data, a table was drawn out in a notebook with the exact same grid layout as the figure above so that the elevation data for each sample point could be correctly recorded and organized. The data was entered in this fashion starting at the first point: 1-1-Z, 2-1-Z, 3-1-Z, 4-1-Z, and so on. This made it especially easy to transfer the data from the notebook to an Excel spreadsheet.
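The transfer step amounts to writing one `x,y,z` row per sample point into a file Excel (or ArcMap) can open. A minimal sketch with the standard `csv` module is below; the demo rows are hypothetical, and an in-memory buffer stands in for an actual file on disk.

```python
# Log (x, y, z) sample points as CSV rows for Excel/ArcMap.
# The values below are hypothetical demo points, not lab data.
import csv
import io

rows = [(1, 1, -3), (2, 1, -4), (3, 1, -2)]  # (x, y, z), z in cm

buf = io.StringIO()  # stands in for open("elevation.csv", "w", newline="")
writer = csv.writer(buf)
writer.writerow(["x", "y", "z"])  # header row for the spreadsheet
writer.writerows(rows)

print(buf.getvalue().splitlines()[0])  # x,y,z
```

Keeping the z column numeric at this stage avoids the type problems mentioned in the Methods section when the file is later imported into the geodatabase.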


Results/Discussion:

The final number of sample points collected was 361. The minimum value was -14, the maximum was 4, the mean was -4.7, and the standard deviation was 3. The sampling method that was chosen worked perfectly; as discussed by the group, each grid unit was one sample point, and this made it much easier to organize all of the data that was collected. One of the major problems that occurred during this activity was that the strings had some slack, which made it difficult to take the elevation points because the lines needed to sit at the height of the wooden frame, which was designated as the zero point for elevation. This was overcome by tightening the lines up to the required height. Each grid unit was also hard to measure, not because of the height of the line but because the terrain within each grid unit isn't flat, making it hard to average out the elevation for each individual sample point.
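Summary statistics like those above can be computed directly from the z column with the standard library. The short list below is demo data, not the 361 actual sample points, so only the method is shown, not the reported values.

```python
# Compute min, max, mean, and standard deviation of elevation values.
import statistics

# z elevations in cm; hypothetical demo values.
z = [-14, -6, -5, -4, -3, 0, 4]

print(min(z), max(z))                  # elevation range
print(round(statistics.mean(z), 1))    # mean elevation
print(round(statistics.pstdev(z), 1))  # population standard deviation
```

Running the same four lines over the real 361-point column is a quick sanity check that no mistyped value (say, -41 for -14) slipped into the spreadsheet.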


Conclusion:

The sampling method used in this activity relates to most other sampling methods and to the definition of sampling itself. In this case it especially relates to the systematic sampling method, because a grid system was created to acquire the points rather than just placing random points and taking data from them. It is crucial when doing spatial analysis to have a sampling method; a sampling method allows you to generalize a large area from a smaller sample. There is no possible way to collect every single point of data in a large area, but through sampling, every point doesn't need to be collected and comparable results are achieved. Sampling spatial data for a larger area is essentially the same concept at a different scale. Once all of the numbers were analyzed, the sampling method chosen did a decent job: each sampling point came out with an elevation level and all of the terrain types were accounted for, though some sampling points could have been more accurate. To refine the survey and increase the sampling density, a smaller sampling scheme should be used so that even more accurate data can be collected from each grid unit.