Got a 7GB pgsql dump #23
Comments
I haven’t looked at this in a very long time. This was based on some other
abandoned code.
There are a few strategies that come to mind for you. I haven’t explored
them; they’re just off the top of my head, roughly ordered from easiest to
most difficult.
- see if you can get a machine with more resources to do the conversion,
maybe connecting from your laptop?
- see if you can break apart the dump file by insert statement for the
various tables (and maybe later split a single table into multiple inserts
with a regex)
- bring up the MySQL DB and, in your app, write a script that copies a row
from PG and writes it to MySQL
- update this code here to work as a stream or pipe rather than loading
everything into memory
I realize the above aren’t super helpful. I mostly inherited this code and
the community has pushed it forward here and there.
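The streaming idea in the last strategy can be sketched roughly like this. This is only an illustration in Python (the project itself is PHP), and the conversion logic is deliberately naive — it assumes tab-separated `COPY ... FROM stdin;` blocks as in PostgreSQL's text dump format, handles `\N` as NULL, and only escapes single quotes:

```python
import io
import re

def pg_copy_to_inserts(lines):
    """Stream a PostgreSQL dump line by line, turning each
    COPY ... FROM stdin; block into MySQL INSERT statements.
    Memory use stays bounded by one line, not the whole dump."""
    copy_re = re.compile(r"^COPY\s+(\S+)\s+\(([^)]*)\)\s+FROM\s+stdin;")
    table = cols = None
    for line in lines:
        line = line.rstrip("\n")
        if table is None:
            m = copy_re.match(line)
            if m:
                table, cols = m.group(1), m.group(2)
        elif line == "\\.":
            table = None  # backslash-dot ends the COPY data block
        else:
            vals = [
                "NULL" if f == "\\N" else "'" + f.replace("'", "''") + "'"
                for f in line.split("\t")
            ]
            yield f"INSERT INTO {table} ({cols}) VALUES ({', '.join(vals)});"

dump = "COPY cn (id, l) FROM stdin;\n1\thello\n2\t\\N\n\\.\n"
for stmt in pg_copy_to_inserts(io.StringIO(dump)):
    print(stmt)
```

Because the generator consumes one line at a time, the same function works when passed an open file handle over a multi-gigabyte dump.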
appears related to #2
Even with 16 GB of RAM the conversion fails. Starting from line 24 in pg2mysql.inc.php, the script eventually stops at line 419, where the resulting MySQL code is concatenated into a very long string to be returned later. This is what causes the error.
See also https://www.airpair.com/php/fatal-error-allowed-memory-size
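One way around building up a single huge return string — sketched here in Python purely as an illustration, not as the project's actual PHP code — is to write each converted statement to the output file as soon as it is produced, so memory use is bounded by the longest line rather than the whole dump:

```python
def convert_streaming(in_path, out_path, convert_line):
    """Read the dump line by line and write each converted statement
    immediately, instead of accumulating one giant output string."""
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(convert_line(line))
```

Here `convert_line` stands in for whatever per-line conversion the tool performs; the point is only the incremental write.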
To convert a big file, change the file pg2mysql.inc.php.
Hi there,
I have a 7.7 GB pgsql dump file that needs to be converted into a MySQL dump.
However, my PHP memory limit is 512 MB and my system has 4 GB of RAM, with currently 700 MB free.
How can we split the parsing so that pg2mysql divides the MySQL dump into a number of parts?
COPY cn (id, did, fid, l, m, ff, g, by, db, id_rc, id_rd, ac, ad, an, sa, dn, misc) FROM stdin;
... multiple
... lines
... of
... loooong
... data
... here
\.
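To answer the splitting question directly: once the rows of a COPY block have been parsed, they can be grouped into fixed-size multi-row INSERTs, and each group can go to a separate output file or batch. A minimal sketch — not part of pg2mysql, with illustrative names and naive quoting:

```python
def chunked_inserts(table, cols, rows, chunk_size=1000):
    """Group rows into multi-row INSERT statements of at most
    chunk_size rows each, so output can be split into parts."""
    for i in range(0, len(rows), chunk_size):
        values = ",\n".join(
            "(" + ", ".join(
                "NULL" if v is None else "'" + str(v).replace("'", "''") + "'"
                for v in row
            ) + ")"
            for row in rows[i:i + chunk_size]
        )
        yield f"INSERT INTO {table} ({cols}) VALUES\n{values};"
```

With 2,500 rows and a chunk size of 1,000, this yields three INSERT statements; MySQL's `max_allowed_packet` limit is one reason to keep each chunk modest.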