arc.write error

I am getting the following error while writing an sf object (a data frame) to a GDB. The data frame was imported as a feature class from a GDB via the sf package. After some data manipulation, I am trying to write the data frame back into the same GDB as a new feature class/layer.
I read in another post that sf does not support writing a feature class to a GDB, so I had to use arcgisbinding. However, when I use sf::st_write to write the data frame as a shapefile, it works just fine. arc.write works on the sample nc dataset in the sf package but does not work on my dataset. I don't know if it matters, but my sf object has more than 700,000 rows and 82 columns. How can I fix this?
Error
Error in .call_proxy("arc_write", path, pairlist(data = data, coords = coords, :
insert row failed
Code

library(sf)
library(arcgisbinding)
arc.check_product()
#> product: ArcGIS Pro (12.8.0.29751)
#> license: Advanced
#> version: 1.0.1.244
arc.write("path/GDB.gdb/Feature_Class_Name", data = df, overwrite = TRUE)
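For context, the round trip described above can be sketched as follows. This is only a sketch: the GDB path and the source layer name are placeholders, and arcgisbinding needs a licensed ArcGIS install to initialize.

```r
library(sf)
library(arcgisbinding)

# arcgisbinding must bind to a licensed ArcGIS product before arc.write works
arc.check_product()

# Read a feature class from the GDB via sf (placeholder path and layer name)
df <- st_read("path/GDB.gdb", layer = "Source_Feature_Class")

# ... data manipulation on df ...

# Write the result back into the same GDB as a new feature class
arc.write("path/GDB.gdb/Feature_Class_Name", data = df, overwrite = TRUE)
```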
Issue Analytics
- Created: a year ago
- Comments: 20 (1 by maintainers)
Another solution we can try: if you have a shapefile, you could do the following.

The above should work, with the caveat that your data will be at the mercy of what we could write to a shapefile and of the checks (or lack thereof) that went into creating that shapefile in the first place.
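One common form of this shapefile route can be sketched as follows; the paths are placeholders, and this assumes the data survives sf's shapefile export intact.

```r
library(sf)
library(arcgisbinding)
arc.check_product()

# Export the cleaned data frame to a shapefile first (placeholder path)
st_write(df, "path/df_export.shp", delete_layer = TRUE)

# Read the shapefile back and push it into the GDB with arc.write
shp <- st_read("path/df_export.shp")
arc.write("path/GDB.gdb/Feature_Class_Name", data = shp, overwrite = TRUE)
```

Keep in mind that shapefiles truncate field names to 10 characters and silently drop columns they cannot represent, so compare the columns of shp against df before relying on the result.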
@Saadi4469 thank you. As for your question about why the shapefile works: shapefiles are not subject to the stringent value and geometry checks that feature classes are subject to. This means it is easy to export to a .shp, but you may end up with problematic geometries in terms of their description (vertices) and/or relationships (such as coincidence, holes, etc.). Please also double-check that the geometries and the values make sense. If a shapefile export cannot output a column, it silently drops it, whereas the arc object throws the error that you saw.

Thank you for the column types. I do see some issues in them. I do not know the details of your dataset, but some columns that should be the same type are different; for instance, high_chance_00_year00 is a character type whereas low_chance_15_year00 is a numeric type. Would you spot-check some of these inconsistent columns? If R converts a whole column to character, that might mean there is an inconsistent row value somewhere in that column, such as a comma instead of a period, or some other character.

An easy solution here would be casting: you can set the types of the columns explicitly before writing them out.