Subject: Re: [firebird-support] Bulk upload options
Just found this article https://www.arbinada.com/en/node/1425


Firebird: bulk insert and bulk export
Submitted by Serguei_Tarassov on Fri, 02/08/2013 - 16:50
Firebird doesn't support an INSERT BULK command or a bcp.exe utility, but
external tables can be a good replacement. A simple example is below.

Check that the firebird.conf file allows creating external tables. Add or
uncomment a line like

ExternalFileAccess = Full
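(Side note: Full opens the whole filesystem to the server. If that is a
concern, firebird.conf also accepts a restricted directory list; a sketch,
assuming the file lives under c:\temp:

ExternalFileAccess = Restrict c:\temp

Changes to firebird.conf typically require a server restart to take effect.)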

Create the external table:


CREATE TABLE BCP_DATA EXTERNAL FILE 'c:\temp\temp.txt' (
ID CHAR(10) NOT NULL,
NAME CHAR(100),
CRLF CHAR(2) -- CHAR(1) for Linux
);
COMMIT;
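Note that every row in the external file is a fixed-length record: assuming a
single-byte character set (e.g. NONE or WIN1252), the layout above gives
10 + 100 + 2 = 112 bytes per record, with CHAR values space-padded to their
declared length. Multi-byte character sets make each CHAR column
correspondingly wider on disk.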
Bulk export from a Firebird table (ensure that the temp.txt file is empty;
external tables accept only INSERT and SELECT, so the file has to be emptied
or deleted at the OS level):


INSERT INTO BCP_DATA(ID, NAME, CRLF)
SELECT ID, NAME, ascii_char(13) || ascii_char(10) FROM MY_REAL_TABLE;
As a result, you have the temp.txt file in fixed-length-column format.

Bulk import (ensure that temp.txt has data in the fixed-length format):


INSERT INTO MY_REAL_TABLE(ID, NAME)
SELECT ID, NAME FROM BCP_DATA;
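Since CHAR columns come back space-padded, you may want to trim them on the
way in; a small variation on the statement above (TRIM is built in from
Firebird 2.0):

INSERT INTO MY_REAL_TABLE(ID, NAME)
SELECT TRIM(ID), TRIM(NAME) FROM BCP_DATA;
COMMIT;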
I think this solution will do!
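If you put the CREATE TABLE and the import INSERT into a script, the whole
load can be run from the command line with isql; a minimal sketch, assuming a
hypothetical database path and SYSDBA credentials:

isql -user SYSDBA -password masterkey localhost:C:\data\mydb.fdb -i bulk_import.sql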




----- Original Message -----
From: "fabianch@... [firebird-support]"
<firebird-support@yahoogroups.com>
To: <firebird-support@yahoogroups.com>
Sent: Wednesday, May 16, 2018 12:43 PM
Subject: [firebird-support] Bulk upload options


> I am looking for the fastest way to run a bulk insert.
> I believe one way would be using isql.exe. I need to plan a solution to
> upload around 500 million records into a single table. Does anyone have
> experience with bulk inserts via isql? Would it be X times faster than
> a normal application inserting the records inside a begin transaction
> and committing at the end?
> The issue is that I have a remote database (not an FB database) and my
> options are reading the remote DB and writing into FB one record at a
> time in a loop, or asking the remote DB to dump the table into a txt
> file, compressing the file with WinRAR or whatever compression, and then
> using that file to execute a bulk upload to FB. This is an ongoing
> issue, not a one-off, so I am trying to design the best-performing
> solution, not a one-off trick.
> I know the volume of data will be around 250 GB per run, hence it could
> be compressed at the non-FB server and sent to the FB server via the
> internet; the compressed size would probably be just 5% of the
> uncompressed size as it is all text, so we are talking about 12 GB
> approx. The only question is: would uploading the txt file into a table
> be quicker than a line-by-line insert into the table?