Sunday, May 31, 2009

Ranty, rant, rant: There ought to be a law ...

Correct me if I'm wrong, but we are near the end of the first decade of the 21st century, right?  We've had more than a decade of federal initiatives to make geospatial data more available and interoperable.  Then why do I get handed a largish data set (4 to 7 GB, depending on the format), produced by a federal agency, in a proprietary format that requires me to have a license to ESRI products?  
My task is to move this data set into PostgreSQL/PostGIS, but since I don't have the necessary SDE libraries on the system, I can't even compile ogr2ogr with SDE support to accomplish this.  This really is a vendor problem, specifically an ESRI problem.  If I were handed an Oracle export file, I could easily move the data by compiling ogr2ogr, because Oracle provides the necessary libraries in its client SDK, which is available by download.  
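For comparison, here is a rough sketch of what that Oracle path looks like with a GDAL 1.x autotools build. The Instant Client path, the `scott/tiger@orcl` connection string, and the `parcels` table name are all placeholders, not real values from this data set:

```shell
# Point GDAL's configure script at the Oracle Instant Client SDK
# (directories are hypothetical; adjust for your install):
./configure --with-oci-include=/opt/oracle/instantclient/sdk/include \
            --with-oci-lib=/opt/oracle/instantclient \
            --with-pg=/usr/bin/pg_config
make && make install

# After loading the export file into an Oracle instance (e.g. with imp),
# ogr2ogr can copy a table straight into PostGIS:
ogr2ogr -f PostgreSQL PG:"dbname=gis" OCI:"scott/tiger@orcl" parcels
```

No equivalent path exists for the ESRI format, because the client libraries ogr2ogr would need to link against are not freely downloadable.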
Ironically, the purpose of this exercise is to make this data more easily available.  However, I can't make it past first base because I'm hamstrung by the data format.  This is such a '90s problem.  Although shapefiles are the de facto lingua franca for geospatial data, the format has its limitations, namely a 2 GB limit on dbf files.  It was unusual for a data set to be larger than 1 GB in the '90s, but 20 years later, data sets over 2 GB are not uncommon.  The current federal procurement language should be amended to mandate that geospatial software be able to produce entire data sets in an open and accessible format.

1 comment:

  1. Amen to that. ESRI first promised free libraries for file geodatabase in 2006. I'm not that thrilled with it in any event - multi-file formats are always problematic. I'm pulling for spatialite.
