
memory leak in v0.4.0 load data from postgres #706

Open
xma08 opened this issue Nov 18, 2024 · 2 comments
Labels: bug (Something isn't working)

xma08 commented Nov 18, 2024

What language are you using?

Python

What version are you using?

v0.4.0

What database are you using?

PostgreSQL

What dataframe are you using?

Pandas

Can you describe your bug?

A memory leak appears after upgrading from v0.3.3 to v0.4.0 when using the main API cx.read_sql to load data from Postgres into a Pandas DataFrame.

What are the steps to reproduce the behavior?

If possible, please include a minimal example, including:

Example query / code
import connectorx as cx

sql = "select * from test_sample"
for _ in range(100):
    # each iteration re-runs the same query; on v0.4.0 process memory grows every pass
    df = cx.read_sql("<conn string>", sql)
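
One way to quantify the growth (a minimal sketch, not part of the original report, assuming psutil is installed) is to print the process RSS after each call:

# Not from the original report: a sketch for observing the leak with psutil.
# On v0.4.0 the printed RSS climbs each pass; on v0.3.3 it should stay roughly flat.
import os

import connectorx as cx
import psutil

proc = psutil.Process(os.getpid())
sql = "select * from test_sample"
for i in range(100):
    df = cx.read_sql("<conn string>", sql)
    # resident set size in MiB after each query
    print(i, proc.memory_info().rss / 1024 / 1024)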

What is the error?

Running the simple query above over many iterations, memory usage kept increasing. The issue went away after rolling back to v0.3.3.

[Screenshot: process memory usage climbing over repeated iterations]
xma08 added the bug label on Nov 18, 2024
wangxiaoying (Contributor) commented

Hi @xma08 , thanks for reporting the issue!

Have you tested the last alpha release before v0.4.0, pip install connectorx==0.3.4a3? Just to narrow down the possible causes of the leak.

xma08 (Author) commented Nov 29, 2024

Yes, I just tried 0.3.4a3 and didn't see this issue.
