Inserting into SQL server with fast_executemany results in MemoryError
Environment
To diagnose, we usually need to know the following, including version numbers. On Windows, be sure to specify 32-bit or 64-bit Python:
- Python: 3.6.8
- pyodbc: 4.0.26
- OS: Alpine 3.8
- DB: Azure SQL Database
- driver: Microsoft ODBC Driver 17 for SQL Server
Issue
I’m loading data from a SQL Server 2016 instance into an Azure SQL Database. When inserting rows with a parameterized INSERT statement and fast_executemany=False, it works perfectly fine. When I turn fast_executemany on, only a very brief error message is displayed:
  in bulk_insert_rows
    cursor.executemany(sql, row_chunk)
MemoryError
This is all I get. I’ve tried setting different encodings on the connection, as described here: https://github.com/mkleehammer/pyodbc/wiki/Unicode. The insert fails every single time with fast_executemany set to True and succeeds every single time with it set to False.
Looking for other ideas to troubleshoot. Thanks.
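For reference, a minimal sketch of the kind of load code involved, assuming a hypothetical table dbo.target_table and a list of parameter tuples; the connection string, table, and column names below are placeholders, not taken from the actual job:

```python
import pyodbc

# Placeholder connection string; driver name matches the environment above.
conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;UID=user;PWD=secret"
)
cnxn = pyodbc.connect(conn_str)

# Encodings tried per https://github.com/mkleehammer/pyodbc/wiki/Unicode
# (made no difference in this case):
cnxn.setdecoding(pyodbc.SQL_CHAR, encoding="utf-8")
cnxn.setdecoding(pyodbc.SQL_WCHAR, encoding="utf-16le")
cnxn.setencoding(encoding="utf-8")

cursor = cnxn.cursor()
cursor.fast_executemany = True  # works fine when this is False

sql = "INSERT INTO dbo.target_table (col1, col2) VALUES (?, ?)"
row_chunk = [("a", 1), ("b", 2)]  # in practice, a chunk of rows read from SQL Server 2016
cursor.executemany(sql, row_chunk)  # raises MemoryError with fast_executemany = True
cnxn.commit()
```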
@gordthompson yes, using SQL_WVARCHAR works:
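Something like this (same placeholder table/column names as above; one setinputsizes entry per parameter marker in the INSERT):

```python
cursor = cnxn.cursor()
cursor.fast_executemany = True

# Bind the parameter explicitly as SQL_WVARCHAR with size/precision (0, 0),
# i.e. nvarchar(max), instead of letting the driver guess from the data.
cursor.setinputsizes([(pyodbc.SQL_WVARCHAR, 0, 0)])

sql = "INSERT INTO dbo.target_table (big_text_col) VALUES (?)"
cursor.executemany(sql, row_chunk)
cnxn.commit()
```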
(The (0, 0) for size and precision instructs the driver to bind as nvarchar(max) instead of regular nvarchar — and is needed if you want to insert >4000 characters.)
Although the fast_executemany feature was designed with SQL Server in mind, it is meant to be as generic as pyodbc itself, so adding references to database-specific types would not be a good idea (and how would it even know? At the ODBC interface it just looks like a very large character/binary column). With fast_executemany, pyodbc allocates the parameter buffers up front based on the bound column sizes, so a max-length column can demand a very large allocation; if you do have 2GB free (entirely possible on a 64-bit system), it can certainly make use of it.