Subject: Re: [firebird-support] Execute Block size limit
Author liviuslivius
Yes, limiting the number of inserts per block can help, but there is a better approach.

What kind of connectivity components are you using? If FireDAC, then look at Array DML. It builds the prepared batch execution for you, and I get performance of e.g. 200,000 records per second.
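The Array DML approach mentioned above can be sketched roughly as follows. This is a minimal sketch, not a tested implementation: it assumes a `TFDConnection` named `FDConnection1`, a hypothetical target table `MYTABLE(COL1, COL2)` with text columns, and input rows as comma-separated strings; adapt the column list and parsing to your CSV/Excel data.

```pascal
// Sketch: FireDAC Array DML bulk insert (assumed table MYTABLE(COL1, COL2)).
procedure BulkInsert(Rows: TStringList);
var
  Qry: TFDQuery;
  i: Integer;
  Parts: TArray<string>;
begin
  Qry := TFDQuery.Create(nil);
  try
    Qry.Connection := FDConnection1;
    Qry.SQL.Text := 'INSERT INTO MYTABLE (COL1, COL2) VALUES (:p1, :p2)';
    // One array slot per row; FireDAC sends the whole set as a batch.
    Qry.Params.ArraySize := Rows.Count;
    for i := 0 to Rows.Count - 1 do
    begin
      Parts := Rows[i].Split([',']);
      Qry.Params[0].AsStrings[i] := Parts[0];
      Qry.Params[1].AsStrings[i] := Parts[1];
    end;
    Qry.Execute(Rows.Count); // execute the batch in one round trip
  finally
    Qry.Free;
  end;
end;
```

The key point is that the statement is prepared once and the parameter arrays are shipped in bulk, so you avoid both per-row round trips and the size limit of a hand-assembled EXECUTE BLOCK.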

If you have small record sets to import, then executing the inserts one by one is also a good method, as it only takes a second to finish.
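For the one-by-one case, a prepared parameterized insert inside a single transaction already performs well for a few thousand rows. A minimal sketch under the same assumptions as above (connection `FDConnection1`, hypothetical table `MYTABLE(COL1, COL2)`):

```pascal
// Sketch: one-by-one inserts with a single prepared statement.
procedure InsertOneByOne(Rows: TStringList);
var
  Qry: TFDQuery;
  i: Integer;
  Parts: TArray<string>;
begin
  Qry := TFDQuery.Create(nil);
  try
    Qry.Connection := FDConnection1;
    Qry.SQL.Text := 'INSERT INTO MYTABLE (COL1, COL2) VALUES (:p1, :p2)';
    Qry.Prepare; // prepare once, execute many times
    FDConnection1.StartTransaction;
    try
      for i := 0 to Rows.Count - 1 do
      begin
        Parts := Rows[i].Split([',']);
        Qry.Params[0].AsString := Parts[0];
        Qry.Params[1].AsString := Parts[1];
        Qry.ExecSQL;
      end;
      FDConnection1.Commit; // one commit for the whole import
    except
      FDConnection1.Rollback;
      raise;
    end;
  finally
    Qry.Free;
  end;
end;
```

Committing once at the end (or every few thousand rows) is usually enough; committing every 50 rows mostly adds overhead rather than solving any statement-size limit.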

Karol Bieniaszewski

-------- Original message --------
From: "homer@... [firebird-support]" <>
Date: 08.03.2018 02:03 (GMT+01:00)
Subject: [firebird-support] Execute Block size limit


I have a Delphi program that assembles an Execute Block with a lot of Insert statements. The Insert statements can have an unknown number of columns up to about 35. There are no BLOBs, and all fields are text. The records come from a CSV or Excel file, and have an unknown number, possibly 2,000 to 5,000.

The assembled Execute Block works great with 12 fields and 100 records, but fails at some point after that. I'm looking for a best-practice solution. Do I need to limit the number of Insert statements, or will a Commit solve the problem if inserted after every 50 or so Inserts?

Thanks in advance. I'm new to using Execute Blocks, but was unable to find a definitive "best practice" answer. Sorry if this is a duplicate question, but I really did search.