Welcome to my first post of the final semester of my GIS graduate certificate program! For the first lab in Special Topics in GIS, we learned about the fundamentals of data quality. In ArcGIS Pro, we explored data quality by calculating the average location of a set of waypoints on a map and then using that average location to find the data's horizontal and vertical precision and accuracy.
Here is the map created in ArcGIS Pro showing the buffers I created for the 50th, 68th, and 95th precision percentiles. To calculate the distance needed for each buffer to correspond with the three precision percentiles, I found the index number by multiplying each percentile by the total number of waypoints. Then, after sorting the waypoints in ascending order by distance from the average location, I counted the waypoints from top to bottom until I reached the index number. The corresponding distance was what I used for each buffer distance (see the sketch below). I also calculated the vertical precision at the 68th percentile using the average elevation I derived from the provided data.
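To make the index-number step concrete, here is a minimal Python sketch of the same procedure. The distance values are made up for illustration (they are not the lab's actual waypoint distances, though I placed 4.39 at the 68th-percentile slot to mirror my result):

```python
# Hypothetical distances (m) from each waypoint to the average location.
distances = sorted([2.1, 3.4, 4.39, 1.8, 5.2, 6.7, 2.9, 4.1, 3.3, 5.9])

for percentile in (0.50, 0.68, 0.95):
    # Index number = percentile x total number of waypoints.
    index = round(percentile * len(distances))
    # Counting down the sorted list to that index gives the distance
    # that becomes the buffer radius for this percentile.
    buffer_distance = distances[index - 1]
    print(f"{int(percentile * 100)}th percentile buffer: {buffer_distance} m")
```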
I then compared the horizontal and vertical precision results with the horizontal and vertical accuracy. Precision measures how close together the observations are, while accuracy measures how close they are to the true value. We used a reference point included in the provided data to determine the accuracy of our waypoints. The horizontal accuracy, measured with the Measure Tool, was 3.14 m, while the horizontal precision at the 68th percentile was 4.39 m, meaning the data is horizontally more accurate than precise. The average elevation of the waypoints was 28.52 m (calculated), while the elevation of the reference point was 22.58 m (given). The vertical accuracy, found by taking the absolute difference between the two elevations, was 5.94 m. The vertical precision at the 68th percentile was 4.38 m, so vertically the data is slightly more precise than accurate, since the accuracy value is a larger distance than the precision value.
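As a quick check of the vertical numbers, here is the same comparison worked out in a short Python snippet using the values reported above:

```python
avg_elevation = 28.52   # average waypoint elevation (m, calculated)
ref_elevation = 22.58   # reference point elevation (m, given)

# Vertical accuracy = absolute difference between the two elevations.
vertical_accuracy = abs(avg_elevation - ref_elevation)   # 5.94 m
vertical_precision_68 = 4.38                             # 68th percentile (m)

print(f"Vertical accuracy:  {vertical_accuracy:.2f} m")
print(f"Vertical precision: {vertical_precision_68:.2f} m")
# Precision < accuracy here: the waypoints cluster together more
# tightly than they sit near the true elevation.
```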
