Tag Archives: Quality

Data Quality: Step 1: The thin end of the wedge

For those who came in late, my last post was on the idea of data quality and some really simple things you can do to make your work, and of lesser importance, others' work, easier (only kidding). The first time people think, and in many cases cuss, about this topic is when they receive data from a supplier (internal or external) and it does not meet their standards. Some things can be checked easily. As mentioned in my earlier post, the coordinate system is important and easy to check. The next is to check the overlay: does the new data look right when placed over data you already have and are confident in?
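As a sketch of that coordinate-system check in plain Python (the layer names, EPSG codes and metadata structure here are invented for illustration; in practice you would read the spatial reference from the data itself, e.g. from a .prj file or your GIS catalog):

```python
# Sketch: flag incoming layers whose coordinate system differs from the
# project standard. The layer records below are hypothetical stand-ins for
# metadata you would read from the actual datasets.

EXPECTED_EPSG = 28356  # assumption: a project standard such as GDA94 / MGA zone 56

def check_coordinate_systems(layers, expected_epsg):
    """Return the names of layers whose EPSG code differs from the standard."""
    return [layer["name"] for layer in layers if layer["epsg"] != expected_epsg]

incoming = [
    {"name": "roads", "epsg": 28356},
    {"name": "parcels", "epsg": 4326},  # supplied in WGS84 instead
]

mismatches = check_coordinate_systems(incoming, EXPECTED_EPSG)
print(mismatches)  # any layer listed here needs reprojecting before overlay
```

Anything the check reports should be reprojected (or sent back to the supplier) before you trust an overlay against your existing data.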

Now we get to the crux of this post: what was the agreement when you requested the data?

  1. How many features do you expect?
  2. Are all the fields expected to be filled out?
  3. What fields were you expecting?

These are easy to inspect using ArcCatalog, but they are not always foremost in our minds when we request data. If you have checked the data and it does not meet requirements, you should be able to request changes. BUT is that in the agreement? This puts pressure on the timeliness of your checks. To make this easier, there are two paths you can go down: buy software that will run the checks you want, or write a model to pass over the data. If you want to check for NULL or inconsistent values, the Frequency tool is a good way, as it provides a simple count of the values for the key fields.
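A minimal stand-in for that Frequency-style check in plain Python (the field name and attribute rows are made up; with Esri tools you would run the Frequency geoprocessing tool or a cursor over the feature class instead):

```python
from collections import Counter

# Sketch: tally the distinct values of a key field, surfacing NULLs and
# inconsistent spellings. The records are hypothetical attribute rows.
records = [
    {"STATUS": "Active"},
    {"STATUS": "Active"},
    {"STATUS": "ACTIVE"},  # inconsistent casing
    {"STATUS": None},      # NULL value
]

frequency = Counter(row["STATUS"] for row in records)
print(dict(frequency))

# A count against None reveals NULLs; near-duplicate keys such as
# "Active" vs "ACTIVE" reveal inconsistent values worth standardising.
null_count = frequency[None]
```

The same tally, run per key field, gives you the quick "does this meet the agreement?" answer before the data goes any further.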


Some other tools that may be interesting are Check Geometry, Repair Geometry and, in my opinion the pick of the crop, GIS Data Reviewer. In coming blogs I will go into some other options for data quality.

Have Fun, Christopher B

Data Quality: It deodorises and disinfects

It seems that data quality is an idea we all agree we need, and some of us are even acting on. Like metadata, it is something GIS professionals feel needs to be provided to us, but how, as members of the GIS community, do we make sure we are delivering what we preach?

Here is a list of a few simple things we can do to make sure that our clients, both internal and external, can actually achieve the outcomes they require.

  1. Make sure that when interrogating data you use the coordinate system requested by your client. Many projects are done in a default coordinate system like WGS84, which is fine as long as everyone is aware of any distortions being propagated.
  2. If providing multiple datasets to a client, make sure they are ALL in the same coordinate system. That way, when the client goes on to use the data, ArcMap does not have to do as much work. If the datasets are in different known coordinate systems, ArcMap will transform the layers on the fly to try to make them fit, which takes processing power as well as time.
  3. If you are converting data from one environment to another, be aware of any changes in format that may occur. For example, when converting from Petrosys (mining software) to shapefile: Petrosys holds its information in high precision, and the shapefile does not. As a result you lose precision and may end up with NULL features (features whose area is less than the resolution of the feature class). There are many ways to correct this, and I am happy to answer questions on it.
  4. Be aware of the client's needs. If they want to do street routing, a free download of Cartographic Feature Classes (tiled by map sheet) probably isn't the best option, as the data has been edited for cartographic accuracy, not network analysis.
  5. If receiving data, make sure that the data you receive is what you asked for.
  6. Always provide information on the intended uses of the data so it is not used to make misleading decisions.
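The precision problem in point 3 can be sketched in a few lines of Python. The polygon and storage resolution below are invented for illustration; the point is that rounding high-precision coordinates to a coarser format's resolution can collapse a sliver polygon into a zero-area (NULL) feature:

```python
# Sketch: rounding coordinates to a coarser storage resolution can
# collapse tiny polygons to zero area. Values are hypothetical.

def shoelace_area(ring):
    """Unsigned area of a polygon ring given as (x, y) tuples."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def snap(ring, resolution):
    """Round every coordinate to the target format's resolution."""
    return [(round(x / resolution) * resolution,
             round(y / resolution) * resolution) for x, y in ring]

# A sliver polygon 0.0004 units wide -- narrower than a 0.001 resolution.
sliver = [(0.0, 0.0), (10.0, 0.0), (10.0, 0.0004), (0.0, 0.0004)]

print(shoelace_area(sliver))               # non-zero in the high-precision source
print(shoelace_area(snap(sliver, 0.001)))  # collapses to 0.0 after conversion
```

Running a check like this on converted data (or its real equivalent, such as Check Geometry) before delivery catches these degenerate features while they can still be fixed.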

These things all sound simple, but the simple things are sometimes what we forget to do. I will be writing some more focused posts in this blog about fixes to problems that I and others have experienced.
Christopher B