
Bounds error for tiled datasets without overlap

See original GitHub issue

With my own server (running the current version 1.4.1 of opentopodata), I see the following problem regarding the EU-DEM dataset:

curl "https://myserver/v1/eudem25m?locations=50.100,8.387"
{
  "results": [
    {
      "elevation": 362.4136962890625, 
      "location": {
        "lat": 50.1, 
        "lng": 8.387
      }
    }
  ], 
  "status": "OK"
}

curl "https://myserver/v1/eudem25m?locations=50.101,8.387"
{
  "error": "Location '50.101,8.387' has latitude outside of raster bounds", 
  "status": "INVALID_REQUEST"
}

curl "https://myserver/v1/eudem25m?locations=50.102,8.387"
{
  "results": [
    {
      "elevation": 362.61474609375, 
      "location": {
        "lat": 50.102, 
        "lng": 8.387
      }
    }
  ], 
  "status": "OK"
}

For some reason, the location 50.101,8.387 triggers this bogus error about the latitude being outside the raster bounds, although it is clearly inside the EU-DEM bounds (and the two neighboring locations work fine).

The error occurs every time I query this location on my own server, but interestingly it does not occur with the public server:

curl "https://api.opentopodata.org/v1/eudem25m?locations=50.101,8.387"             
{
  "results": [
    {
      "elevation": 360.98956298828125, 
      "location": {
        "lat": 50.101, 
        "lng": 8.387
      }
    }
  ], 
  "status": "OK"
}

I assume the difference might be due to a slightly different data setup?
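Since the issue title points at tiled datasets without overlap, one way to compare data setups (a diagnostic sketch; the tile directory is purely an assumption) is to check which tile the failing coordinate maps into, and whether it lands on the outermost pixel row or column of that tile:

# Diagnostic sketch (tile paths are an assumption, not from the thread):
# gdallocationinfo reports the pixel/line that lon 8.387, lat 50.101 maps
# to in each tile. A point in the outermost pixel row or column cannot be
# interpolated without data from the neighbouring tile.
for f in /path/to/eudem25m/*.tif; do
  echo "$f"
  gdallocationinfo -wgs84 "$f" 8.387 50.101
done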

And then, independent of this specific problem, I noticed another small issue that occurs when querying two locations at once:

curl "https://myserver/v1/eudem25m?locations=50.101,8.387|50.102,8.387"  
{
  "error": "Location '50.102,8.387' has latitude outside of raster bounds", 
  "status": "INVALID_REQUEST"
}

As shown above, the latitude 50.101 is the problematic one, but here the error message mentions the other one (50.102). This might be an off-by-one issue in the error diagnostics.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 15 (15 by maintainers)

Top GitHub Comments

1 reaction
janusw commented, Jun 24, 2022

Late comment to this thread: by default, the gdal_merge script does not handle the NODATA values correctly for the bkg10m dataset. The arguments -n -9999 -a_nodata -9999 are required to handle them properly.
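For reference, the full merge command would then look something like this (file names follow the later comment and are otherwise an assumption):

# Sketch based on the flags above: -n tells gdal_merge which input pixel
# value to ignore as NODATA, and -a_nodata writes that value into the
# output file's metadata.
gdal_merge.py -n -9999 -a_nodata -9999 -o bkg-dgm10.tif bkg-dgm10/*.tif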

0 reactions
janusw commented, Feb 15, 2021

> And finally, for your command gdal_merge.py -o bkg-dgm10.tif bkg-dgm10/*.tif: GDAL creates uncompressed GeoTIFFs by default, which would explain the 4x size increase. If you add -co COMPRESS=DEFLATE it should bring the size down.

Ah, good point. Will try that soon.

If one can get rid of the size penalty, this might be a good alternative to the buffering solution mentioned above. It is a bit simpler at least.

So, I finally got around to trying this (sorry that it took me so long), and created one large compressed TIF file from the BKG dataset (10m grid):

gdal_merge.py -o bkg-dgm10.tif ../bkg-dgm10/*.tif -co COMPRESS=DEFLATE -co BIGTIFF=YES -co NUM_THREADS=ALL_CPUS

That took around 7 min and produced a file of 8.1G. That's completely fine (given that the source files total 6.4G) and much better than the 22G uncompressed file.

But unfortunately, the performance for querying many points at once also got a bit worse. With a query of slightly more than 500 points, the compressed single TIF was almost three times slower than the uncompressed one, but still more than twice as fast as the VRT (with opentopodata version 1.5.0).

As noted above, the difference is barely noticeable for single-point requests.
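One untested idea for narrowing that gap (my assumption, not something verified here): GDAL writes strip-organized GeoTIFFs by default, so each point read decompresses a whole scanline strip. Rewriting the file with internal tiling means only a small block has to be decompressed per lookup:

# Hypothetical follow-up, not from the thread: rewrite the merged file
# with internal tiling (256x256 blocks by default) to speed up random
# point access on the compressed GeoTIFF.
gdal_translate -co COMPRESS=DEFLATE -co TILED=YES -co BIGTIFF=YES \
  -co NUM_THREADS=ALL_CPUS bkg-dgm10.tif bkg-dgm10-tiled.tif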
