Subject | RE: [firebird-support] External file load |
---|---|
Author | Rick Debay |
Post date | 2009-02-23T17:23:08Z |
> Your later proposal to deal with the oversizeness of the record using a SP will bump into yet another limit
I implemented a selectable stored procedure; it didn't hit any limits.
> If your date data is in a variety of string formats that won't resolve to a *single* recognised Firebird date literal format that can be cast
The SP handles the various date formats internally.
> If you have any control over the creation of the fixed length input data
None, each row is 3,000 bytes.
Now the problem is that I'd like to pass the table name into the stored
procedure and use EXECUTE STATEMENT; otherwise dropping the external
table requires that I drop the SP. The problem is that I get "token
size exceeds limit" when I try to compile the SP. What is the limit on
the size of the select statement I'm trying to execute?
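For context, the shape of what I'm compiling is roughly this (the
identifiers are simplified placeholders, not my real ones, and my real
column list runs to 326 columns):

```sql
CREATE PROCEDURE LOAD_EXTERNAL (EXT_TABLE_NAME VARCHAR(63))
RETURNS (COL1 VARCHAR(30), COL2 VARCHAR(10))
AS
DECLARE VARIABLE STMT VARCHAR(8000);
BEGIN
  /* Build the SELECT against whatever external table was passed in */
  STMT = 'SELECT COL1, COL2 FROM ' || EXT_TABLE_NAME;
  FOR EXECUTE STATEMENT STMT
      INTO :COL1, :COL2
  DO
    SUSPEND;
END
```

With hundreds of columns the generated SELECT string becomes very long,
which is presumably what trips the token-size error at compile time.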
-----Original Message-----
From: firebird-support@yahoogroups.com
[mailto:firebird-support@yahoogroups.com] On Behalf Of Helen Borrie
Sent: Friday, February 20, 2009 5:56 PM
To: firebird-support@yahoogroups.com
Subject: Re: [firebird-support] External file load
At 03:25 AM 21/02/2009, you wrote:
>I created an external table DDL to import a fixed width file with 326
>columns. To move that data in to the internal table, I was going to
>select from a view. I want the view to trim trailing blanks and to
>convert the file's various date formats to Firebird dates.
>Without any data conversion the view can be created without any
>problem.
>When I try to trim all text fields that are longer than one character,
>I get an error that the table definition, at 65574 bytes, is too large.
>The strange part is that originally I was left and right trimming ALL
>columns. The error gave the same table size; only left trimming
>specific columns didn't reduce the size at all.
If you're trying to use UDFs then it's more than likely your UDF
declarations are specifying returns that are too large. If you're
importing to Fb 2.1 then study the internal functions and blow away
those UDFs.
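For example, from Fb 2.1 the trimming can be done with the built-in
TRIM function, with no UDF declaration at all (table and column names
here are illustrative):

```sql
/* Firebird 2.1+: TRIM is an internal function; no UDF needed. */
SELECT TRIM(TRAILING FROM CUST_NAME) AS CUST_NAME,
       TRIM(LEADING FROM ACCT_NO)    AS ACCT_NO
FROM EXT_THE_FILE;
```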
>I have two questions, what can I do to reduce the size of the metadata,
>and if I can't use a view how should I transform the data when
>importing in to the internal table?
>
>ISC ERROR CODE:335544351
>ISC ERROR MESSAGE: unsuccessful metadata update new record size of
>65574 bytes is too big TABLE V_THE_FILE
This message means that the size of the input data is wider than the
64KB limit on record size. This limit applies *regardless* of the
method you use to import this data. Cinderella wears a size 64KB
slipper. The Ugly Sisters in your external file can't fit into
Cinderella's slipper.
Review the plot.
The view definition will present a subsequent, different issue because
its BLR is larger than the 64 KB limit on BLR (that's the SELECT
statement that defines the view, when compiled).
Your later proposal to deal with the oversizeness of the record using a
SP will bump into yet another limit: 48 KB is the maximum size of BLR
that can be created for PSQL objects.
Go right back to the external table definition. Count bytes in the
input record. If the input data is unicode, you'll have a difficult job
counting bytes, but the byte count will max out at four bytes per
character (character_length x 4). If it's ANSI, you can count 1 for 1.
If you have any control over the creation of the fixed length input data
then make each column only as large as the largest input. If that's
still too large then you have an Ugly Sister. Choose a field
(preferably) or a group of fields that you are sure represent a unique
key for the input and split the input into two, including the key
field[s] in both, and define two tables. That way you can stitch them
together afterwards, within the limits.
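A sketch of the two-table split (file names, widths and the key field
are illustrative only):

```sql
/* Each half repeats the unique key so the halves can be re-joined. */
CREATE TABLE EXT_PART1 EXTERNAL FILE 'part1.dat' (
  REC_KEY CHAR(10),
  FIELD_A CHAR(200)
);

CREATE TABLE EXT_PART2 EXTERNAL FILE 'part2.dat' (
  REC_KEY CHAR(10),
  FIELD_B CHAR(200)
);

/* Stitch them together afterwards: */
SELECT P1.REC_KEY, P1.FIELD_A, P2.FIELD_B
FROM EXT_PART1 P1
JOIN EXT_PART2 P2 ON P2.REC_KEY = P1.REC_KEY;
```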
Forget the "view" approach for mangling the input.
-- If you define the external table's string fields to be varchar of the
same size as the full width of the input (including the trailing
blanks), then the engine will do the right thing with the trailing
blanks.
-- If your date data is in a variety of string formats that won't
resolve to a *single* recognised Firebird date literal format that can
be cast, then import them as varchar and accept that you're going to
have to post-process them with a stored procedure.
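A minimal sketch of such a post-processing SP, assuming just two input
formats ('MM/DD/YYYY' and 'YYYYMMDD') and illustrative table and column
names; the real formats and keys will differ:

```sql
CREATE PROCEDURE FIX_DATES
AS
DECLARE VARIABLE REC_ID INTEGER;
DECLARE VARIABLE RAW VARCHAR(10);
BEGIN
  FOR SELECT REC_ID, RAW_DATE FROM STAGING
      INTO :REC_ID, :RAW
  DO
  BEGIN
    IF (POSITION('/' IN RAW) > 0) THEN
      /* 'MM/DD/YYYY' -> DATE */
      UPDATE STAGING
      SET REAL_DATE = CAST(SUBSTRING(RAW FROM 7 FOR 4) || '-' ||
                           SUBSTRING(RAW FROM 1 FOR 2) || '-' ||
                           SUBSTRING(RAW FROM 4 FOR 2) AS DATE)
      WHERE REC_ID = :REC_ID;
    ELSE
      /* 'YYYYMMDD' -> DATE */
      UPDATE STAGING
      SET REAL_DATE = CAST(SUBSTRING(RAW FROM 1 FOR 4) || '-' ||
                           SUBSTRING(RAW FROM 5 FOR 2) || '-' ||
                           SUBSTRING(RAW FROM 7 FOR 2) AS DATE)
      WHERE REC_ID = :REC_ID;
  END
END
```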
-- For numeric data, make sure all data in each field is properly
convertible to the defined type; for example, anticipate integer
overflow by defining all int fields as BigInt; if you have ints that
are wider than 18 numerals then define them as varchar of appropriate
width; make sure the tool you are using to create the export file does
not overfill the decimal parts of non-integer numbers and also does not
attach a decimal part to fields you have defined as integers.
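Illustrative column choices for the internal target table along those
lines (names and widths are placeholders):

```sql
CREATE TABLE IMPORT_TARGET (
  QTY    BIGINT,        -- anticipate overflow: BigInt rather than Integer
  BIG_ID VARCHAR(25),   -- wider than 18 numerals: keep it as text
  AMOUNT NUMERIC(18,2)  -- decimal part must fit the declared scale
);
```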
./heLen