Managing Validity Errors In Tax Parcel Geospatial Datasets

Understanding Validity Errors in Parcel Data

Validity errors refer to problems in tax parcel geospatial data that cause the dataset to be invalid or not meet quality standards. These errors prevent parcels from being accurately represented spatially and analyzed properly. Common validity errors include inaccurate coordinate data, overlapping parcels, and gaps between parcels.

It is crucial that tax parcel datasets maintain topological validity so that real-world spaces are represented digitally in an accurate, consistent way. Validity errors can lead to incorrect map outputs, analysis errors, and problems joining attribute data. Managing these errors through detection, correction, and prevention techniques is key to maintaining high-quality parcel datasets.

Defining Validity Errors

The main types of validity errors in parcel datasets are:

  • Inaccurate coordinate data – Parcel vertices or lines have positions that do not match true ground locations
  • Overlapping parcels – Digital parcel boundaries unintentionally intersect or overlap
  • Gaps between parcels – Sliver gaps remain between parcel polygons that should share a common boundary

These errors cause problems because analysis relies on parcels accurately representing real spaces. Overlaps and gaps make parcel area calculations inaccurate. Poor coordinate accuracy prevents joining additional geospatial data.

Validity refers to logical consistency rules for how spatial data should be structured topologically. The two main standards are:

  • Must not overlap – No overlapping polygons, lines or points
  • Must not have gaps – All spatial objects that should be connected are properly joined and intersecting

When these rules are broken, the parcel fabric has logical inconsistencies and key analytical functions will fail. Hence monitoring and fixing validity errors is essential.

Common Causes of Validity Errors

A few core root causes introduce validity errors into parcel datasets:

Inaccurate Coordinate Data

Poor coordinate accuracy is often traced back to shortcomings of GPS collection devices, which can log incorrect vertex positions that then get stored in the parcel layer. Man-made features like buildings interfere with satellite signals, contributing to inaccuracies. Natural features like heavy tree canopy can also introduce small position errors when collecting data in the field.

Overlapping Parcels

Common sources of unintended parcel overlaps include:

  • Mapping errors during new parcel creation – new boundaries not properly snapped to existing fabric
  • Incorrect editing procedures – parcel edits, combinations, or splits resulting in overlaps
  • Coordinate shifts from datum or projection changes
  • Conflicting records for true parcel ownership boundaries

Proper snapping, topology rules, and editor training help avoid overlaps, but they still occur. Overlaps invalidate analysis that depends on each parcel occupying a unique space, so fixing them is essential.

Gaps Between Parcels

Gaps between parcel polygons typically stem from:

  • Coordinate inaccuracies leading to vertices not properly intersecting
  • Failure to snap parcel lines together when editing
  • Unaccounted-for measurement error, leaving unintended gaps between boundaries assumed to be connected

Gaps lead to slivers of unlabeled space which can cause errors calculating total land area. Fixing gaps requires finding and reconciling small discrepancies between neighbor parcel lines.
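The effect of a sliver gap on area totals can be shown with a small pure-Python sketch (a hypothetical example, independent of any GIS API): two parcels that should meet at x = 5 instead leave a 0.01-unit sliver, so their areas no longer sum to the area of the enclosing block.

```python
def shoelace_area(vertices):
    """Area of a simple polygon via the shoelace formula."""
    total = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# A 10 x 10 block that should be split into two adjoining parcels.
block = [(0, 0), (10, 0), (10, 10), (0, 10)]

# The right parcel was digitized starting at x = 5.01, leaving a sliver gap.
parcel_a = [(0, 0), (5, 0), (5, 10), (0, 10)]
parcel_b = [(5.01, 0), (10, 0), (10, 10), (5.01, 10)]

gap_area = shoelace_area(block) - shoelace_area(parcel_a) - shoelace_area(parcel_b)
print("Unaccounted sliver area:", round(gap_area, 3))
```

Summing parcel areas here under-reports the block by the sliver's area, which is exactly the kind of discrepancy a total land area report would surface.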

Detecting Validity Errors

To manage validity errors, GIS administrators must first systematically detect where problems exist within the parcel data. There are automated and manual techniques to shine light on coordinate, overlap, and gap issues.

Running Topology Rules

Most GIS platforms provide custom topology rules that can scan for geometric anomalies. By running Must Not Overlap and Must Not Have Gaps rules, the system automatically flags violations for further review. Users set a tolerance threshold which defines the spatial difference to trigger error creation. Running topology checking regularly after edits provides early warning of emerging issues.
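The role of the tolerance threshold can be sketched in plain Python. This is a simplified, one-dimensional stand-in for what a topology engine does with full geometries; the function and data are hypothetical:

```python
def flag_gaps(boundary_pairs, tolerance):
    """Flag pairs of adjacent boundary coordinates whose separation
    exceeds the cluster tolerance. Smaller differences are treated as
    coincident, as a topology engine's tolerance setting would."""
    errors = []
    for left_x, right_x in boundary_pairs:
        gap = abs(right_x - left_x)
        if gap > tolerance:
            errors.append((left_x, right_x, gap))
    return errors

# Shared-edge x coordinates digitized for three parcel boundaries.
pairs = [(100.0, 100.0005), (250.0, 250.02), (400.0, 400.0)]

# With a 0.01-unit tolerance, only the 0.02-unit discrepancy is flagged.
print(flag_gaps(pairs, tolerance=0.01))
```

Setting the tolerance too high would silence the second pair as well, which is why visual inspection (below) remains a useful complement.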

Visual Inspection

GIS professionals can manually scan for errors by visually reviewing parcels at different map scales, looking for alignment issues, gaps and overlaps. This can reveal issues topology rules might not catch if threshold tolerance values are set too high. Certain problem areas known to historically have coordinate issues can be checked more rigorously. Visual review is time intensive but methods like new imagery comparison can make inspection more efficient.

Querying for Empty/Null Values

Validity errors often manifest as empty or null attributes when associated tables and spatial data become out of sync. Querying the attribute table for blank parcel number, ownership, or location values can reveal gaps and overlaps. High counts of nulls suggest alignment issues between tables and geometry.
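A simple attribute scan of this kind can be sketched in plain Python, here over a list of dictionaries standing in for the parcel attribute table (field names are hypothetical):

```python
parcels = [
    {"parcel_no": "12-001", "owner": "Smith", "situs": "12 Oak St"},
    {"parcel_no": "",       "owner": "Jones", "situs": "14 Oak St"},
    {"parcel_no": "12-003", "owner": None,    "situs": None},
]

required_fields = ["parcel_no", "owner", "situs"]

# Collect parcels where any required attribute is blank or null.
suspect = [
    p for p in parcels
    if any(p.get(f) in (None, "") for f in required_fields)
]

print(len(suspect), "parcels with missing attributes")
```

A rising count of such records over time is a useful early signal that geometry and attribute tables are drifting out of sync.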

These detection techniques can be used in conjunction to thoroughly monitor parcel data validity. GIS teams should determine the optimal checking frequency based on update cycles and quality requirements.

Fixing Validity Errors

Once errors are detected, steps must be taken to reconcile gaps, eliminate overlaps, and improve coordinate accuracy. The correction method depends on the specific error source. Procedures fall into three core areas:

Handling Bad Coordinate Data

Adjusting inaccurate vertex and line coordinates requires realigning geometry to physical ground control points or other trusted geospatial reference data. This can involve:

  • Snapping to a higher-accuracy control network from survey-grade GPS, total stations, etc.
  • Snapping to recent orthoimagery known to be well aligned to the ground
  • Applying a distributed offset to smoothly realign lines based on measured error

Coordinate correction should adhere to relative accuracy standards for correction threshold and quantifiable improvement. Metadata should log all alignment edits for posterity.
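The distributed-offset approach mentioned above can be sketched in pure Python: a correction vector measured at one end of a line is blended linearly along its vertices so the realignment is smooth rather than a hard shift. This is a simplified illustration; names and values are hypothetical.

```python
import math

def distribute_offset(vertices, dx, dy):
    """Apply a correction (dx, dy) measured at the line's end,
    blended from zero at the start to full at the end, weighted by
    cumulative distance along the line."""
    dists = [0.0]
    for (x1, y1), (x2, y2) in zip(vertices, vertices[1:]):
        dists.append(dists[-1] + math.hypot(x2 - x1, y2 - y1))
    total = dists[-1]
    return [
        (x + dx * d / total, y + dy * d / total)
        for (x, y), d in zip(vertices, dists)
    ]

line = [(0.0, 0.0), (50.0, 0.0), (100.0, 0.0)]
# Survey control shows the far end sits 0.4 units north of truth.
corrected = distribute_offset(line, dx=0.0, dy=-0.4)
print(corrected)
```

The start vertex stays fixed, the midpoint moves half the correction, and the end receives the full measured offset, preserving the line's overall shape.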

Resolving Parcel Overlaps

Fixing overlaps requires identifying the authoritative boundary line among conflicting polygon edges. This authoritative line is preserved while the others are clipped to conform. Determining the master line depends on factors like:

  • Legal ownership – title transfers, easements etc.
  • Seniority – older boundary generally takes precedence
  • Data source – coordinates from GPS, imagery etc.
  • Spatial accuracy – resolution, precision and ground condition when measured

Once the source line is decided, intersecting polygons are reshaped with topology rules to snap boundary lines. This removes the overlap while maintaining parcel fabric continuity.
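One way to encode the precedence factors above is a simple ranking function. This is a hypothetical scoring scheme for illustration only; in practice, authoritative-boundary determinations rest on legal and survey review.

```python
# Relative quality of boundary data sources (illustrative weights).
SOURCE_RANK = {"survey_gps": 3, "orthoimagery": 2, "digitized_plat": 1}

def authority_score(candidate):
    """Rank a candidate line: legal documentation first, then
    seniority (older record wins), then data source quality."""
    return (
        1 if candidate["has_legal_record"] else 0,
        -candidate["year_recorded"],          # older = higher precedence
        SOURCE_RANK.get(candidate["source"], 0),
    )

candidates = [
    {"id": "A", "has_legal_record": False, "year_recorded": 1968, "source": "digitized_plat"},
    {"id": "B", "has_legal_record": True,  "year_recorded": 1994, "source": "survey_gps"},
    {"id": "C", "has_legal_record": True,  "year_recorded": 2005, "source": "orthoimagery"},
]

master = max(candidates, key=authority_score)
print("Authoritative line:", master["id"])
```

Encoding the criteria as a sort key keeps the precedence order explicit and auditable when many conflicting edges must be resolved in batch.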

Closing Parcel Gaps

Fixing gaps requires finding and eliminating sliver holes between parcel features that should intersect in reality. The major steps are:

  1. Identifying gaps from topology checks or visual review
  2. Snapping and extending neighboring polygon boundaries to logically fill gaps
  3. Applying edge-matching techniques to align shared borders
  4. Using autofill topology functions where available to reconstruct holes

As gaps are closed, spatial conformity between parcel polygons gradually improves, strengthening digital representation.
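The snapping step above can be sketched as a small pure-Python routine that pulls each vertex of one boundary onto any vertex of its neighbor lying within the snap tolerance. This is a simplified stand-in for a GIS snapping environment; all names are hypothetical.

```python
import math

def snap_vertices(boundary, targets, tolerance):
    """Move each vertex in `boundary` onto the nearest vertex in
    `targets` when the two are within `tolerance` of each other."""
    snapped = []
    for x, y in boundary:
        nearest = min(targets, key=lambda t: math.hypot(t[0] - x, t[1] - y))
        if math.hypot(nearest[0] - x, nearest[1] - y) <= tolerance:
            snapped.append(nearest)
        else:
            snapped.append((x, y))
    return snapped

# Neighbor parcel's shared edge, and our edge digitized slightly short.
neighbor_edge = [(5.0, 0.0), (5.0, 10.0)]
our_edge = [(4.99, 0.0), (4.99, 10.0)]

print(snap_vertices(our_edge, neighbor_edge, tolerance=0.05))
```

Both vertices fall within the 0.05-unit tolerance and snap onto the neighbor edge, closing the sliver gap without disturbing vertices that genuinely differ.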

Automating Error Checking with Python

While GIS platforms provide built-in topology and reporting functions, GIS admins can further automate error detection using scripting. Python is well suited for repeatedly batch analyzing datasets to expose issues. Example techniques include:

Sample Script for Custom Topology Rules

```python
import arcpy

# Local variables
parcel_features = "parcels.shp"
gap_errors = "gap_errors.shp"
overlap_errors = "overlap_errors.shp"

# Must Not Have Gaps rule: rebuild polygons from the parcel boundaries.
# Any gap enclosed by parcels becomes a new polygon with no identical
# match in the original layer.
arcpy.FeatureToLine_management(parcel_features, "parcel_lines.shp")
arcpy.FeatureToPolygon_management("parcel_lines.shp", "rebuilt_polys.shp")
rebuilt = arcpy.MakeFeatureLayer_management("rebuilt_polys.shp", "rebuilt_lyr")
arcpy.SelectLayerByLocation_management(
    rebuilt, "ARE_IDENTICAL_TO", parcel_features,
    invert_spatial_relationship="INVERT")
arcpy.CopyFeatures_management(rebuilt, gap_errors)

# Must Not Overlap rule: intersecting the parcel layer with itself
# outputs only the areas where two or more parcels overlap.
arcpy.Intersect_analysis(parcel_features, "intersections.shp")
arcpy.CopyFeatures_management("intersections.shp", overlap_errors)
```

This snippet finds gaps between parcels as sliver polygons and records where features overlap. Custom rules can be added to regularly flag issues for repair.

Using Overlap and Gap Detection Tools

Spatial analysis tools such as Find Identical (to detect duplicate or misaligned geometry) and Intersect (to find overlaps) can be scripted using Python iterators and data access cursors. Admins can build validation functions that automatically populate reports of discovered issues. Something like:

```python
overlap_ids = set()

# Read all parcel geometries once, then compare each pair.
parcel_geoms = list(arcpy.da.SearchCursor(parcels, ["OID@", "SHAPE@"]))

for i, (oid_a, geom_a) in enumerate(parcel_geoms):
    for oid_b, geom_b in parcel_geoms[i + 1:]:
        # Geometry.overlaps() is True when the interiors intersect
        # without one parcel fully containing the other.
        if geom_a.overlaps(geom_b):
            overlap_ids.update((oid_a, oid_b))

print("Overlapping parcels:", sorted(overlap_ids))
```

Scripting geospatial analysis introduces added flexibility to customize batch error detection.

Maintaining Validity Over Time

To sustain parcel data health, key policies and practices should be put in place for ongoing management once errors are resolved. Preventive measures are essential to minimize rework.

Training Data Editors

Many data validity issues originate from editors making inadvertent geometry mistakes during editing sessions. Comprehensive training on proper editing workflows can prevent introducing errors, for example:

  • Snapping rules to align shared borders
  • Topology checks before saving edits
  • Usage of autofill functions

Equipping editors with knowledge of spatial data rules and quality procedures substantially improves edit outcomes.

Establishing Update Procedures

Sound data maintenance procedures can sustain integrity, for instance:

  • Weekly visual review spot checks on high risk areas
  • Batch topology checks after major edits
  • Periodic alignment analysis against survey control data

Baking quality assurance into update workflows reduces long term maintenance effort.

Running Regular Reports

Lastly, monitoring metrics provides visibility into data health trends. Tracking factors like:

  • Number of gaps / overlaps flagged
  • Count of null attributes
  • Failed topology checks

…over time paints a picture of whether quality is improving or degrading. Strategic reporting sustains accountability to quality standards.
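Such a trend report can be sketched in a few lines of Python; the metric names and counts below are illustrative, not drawn from a real dataset.

```python
# Monthly error counts from topology checks (illustrative data).
history = [
    {"period": "2024-01", "gaps": 42, "overlaps": 17, "null_attrs": 120},
    {"period": "2024-02", "gaps": 35, "overlaps": 12, "null_attrs": 96},
    {"period": "2024-03", "gaps": 21, "overlaps": 9,  "null_attrs": 60},
]

def trend(metric):
    """Return 'improving' if the latest count is below the first,
    'degrading' if above, else 'flat'."""
    first, last = history[0][metric], history[-1][metric]
    return "improving" if last < first else "degrading" if last > first else "flat"

for metric in ("gaps", "overlaps", "null_attrs"):
    print(metric, trend(metric))
```

Even a report this simple, run on a schedule, gives managers an objective view of whether quality practices are paying off.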
