Data Quality is an idea we all agree we need, and some of us are even putting into practice. Like metadata, it is something GIS professionals feel should be provided to us, but how, as members of the GIS community, do we make sure we are delivering what we preach?
Here is a list of a few simple things we can do to make sure that our clients, both internal and external, can actually achieve the outcomes they require.
- Make sure that when interrogating data you use the coordinate system requested by your client. Many projects are done in a default coordinate system such as WGS84, which is fine, as long as everyone is aware of any distortions being propagated.
- If providing multiple datasets to a client, make sure they are ALL in the same coordinate system; that way, ArcMap does not have to do as much work when the client goes on to use the data. If the datasets are in different known coordinate systems, ArcMap will transform the layers on the fly to try to make them fit, which costs processing power as well as time.
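A quick pre-delivery check like the sketch below can catch a mixed-coordinate-system delivery before the client ever opens it. This is a minimal illustration, not a real workflow: the layer names and EPSG codes are hypothetical, and in practice you would read the CRS from each dataset's metadata rather than a hand-built dictionary.

```python
# Minimal sketch: verify every layer in a delivery shares one coordinate
# system before it is sent. Layer names and EPSG codes are made-up examples.

def check_crs_consistency(layers):
    """Return the set of distinct CRS identifiers across the layers.

    `layers` maps layer name -> CRS identifier (e.g. an EPSG code).
    A delivery is consistent only when exactly one CRS is present.
    """
    return set(layers.values())

delivery = {
    "roads": "EPSG:28356",    # hypothetical: GDA94 / MGA zone 56
    "parcels": "EPSG:28356",
    "contours": "EPSG:4326",  # hypothetical: WGS84 -- the odd one out
}

distinct = check_crs_consistency(delivery)
if len(distinct) > 1:
    print("Inconsistent delivery, reproject before sending:", sorted(distinct))
```

If the check reports more than one CRS, reproject the outliers yourself rather than leaving ArcMap to transform them on the fly.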
- If you are converting data from one environment to another, be aware of any changes in format that may occur. For example, when converting from Petrosys (mining software) to shapefile: Petrosys holds its information in high precision, and the shapefile format does not. As a result you lose precision and may end up with NULL features (features whose area is less than the resolution of the feature class). There are many ways to correct this, and I am happy to answer questions on it.
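One way to catch this before delivery is to flag any polygon whose area falls below the target feature class's coordinate resolution, since those are the candidates for collapsing to NULL features. The sketch below uses the shoelace formula on plain coordinate lists; the resolution value and ring geometries are illustrative assumptions, not values from any particular dataset.

```python
# Minimal sketch: after converting high-precision data (e.g. from Petrosys)
# to shapefile, flag polygons whose area is below the feature class
# resolution. Resolution and geometry values here are illustrative only.

def polygon_area(ring):
    """Planar area of a closed ring of (x, y) vertices (shoelace formula)."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def sub_resolution(rings, resolution):
    """Indices of rings whose area is smaller than resolution squared."""
    return [i for i, r in enumerate(rings) if polygon_area(r) < resolution ** 2]

rings = [
    [(0, 0), (10, 0), (10, 10), (0, 10)],                   # 100 sq units, fine
    [(0, 0), (0.0005, 0), (0.0005, 0.0005), (0, 0.0005)],   # tiny sliver
]
print(sub_resolution(rings, resolution=0.001))  # flags the sliver: [1]
```

Flagged features can then be repaired (for example by buffering, snapping, or dissolving slivers) before the client ever sees a NULL feature.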
- Be aware of the client's needs. If they want to do street routing, a free download of Cartographic Feature Classes (tiled on map sheet) probably isn't the best option, as it has been edited for cartographic accuracy, not network analysis.
- If receiving data, make sure that what you receive is what you asked for.
- Always provide information on the intended uses of the data, so it is not used to make misleading decisions.