Can't get "to_sql" to work on Datalore

Hi,
I can’t seem to get the following code to work on Datalore. The code runs with no errors; it just doesn’t upload any data to the database.
If I download the notebook and run it as a Jupyter notebook, it works fine and the data is uploaded.
Any ideas?

import psycopg2
from sqlalchemy import create_engine

conn_string = 'postgresql://xxxxx'

db = create_engine(conn_string)
conn = db.connect()

try:
    poa.to_sql('poa_irradiance', con=conn, if_exists='append', chunksize=2000, index=True, method='multi')
except Exception as e:
    print(e)
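
One thing worth ruling out first, as a minimal sketch assuming SQLAlchemy 2.x (the connection string and the poa DataFrame below are placeholders): in 2.x a Connection no longer autocommits, so to_sql can write into an open transaction that is silently rolled back when the connection is discarded, which would match "runs with no errors, uploads nothing".

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('postgresql://xxxxx')  # placeholder connection string

# placeholder standing in for the real poa DataFrame
poa = pd.DataFrame({'irradiance': [512.3, 498.7]})

with engine.connect() as conn:
    poa.to_sql('poa_irradiance', con=conn, if_exists='append',
               chunksize=2000, index=True, method='multi')
    conn.commit()  # SQLAlchemy 2.x: without this the insert is rolled back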

Have you tried installing psycopg2-binary? I got errors in the past without psycopg2-binary, since psycopg2 cannot be installed directly.

My environment.yaml file for the Datalore notebook, inside the Notebook files, looks like this:

datalore-env-format-version: "0.2"
datalore-package-manager: "pip"
datalore-base-env: "minimal"
dependencies:
- pip:
  - SQLAlchemy==2.0.16
  - psycopg2-binary==2.9.6

Then code like the following

import sqlalchemy, os
from sqlalchemy.engine import Engine

pw = os.environ["postgres_db_pw"]
engine: Engine = sqlalchemy.create_engine(f"postgresql+psycopg2://dbuser:{pw}@hostIP/db")

# save the dataframe to the staging db
# need to write dataframe index as a column
with engine.begin() as db_conn:
    df_11_unprocessed.to_sql(name="plasma_proteome_11", con=db_conn, if_exists="replace", index=True)
    df_mapper.to_sql(name="plasma_proteome_mapper", con=db_conn, if_exists="replace", index=True)

should work. I set environment variables to store the credentials and attached them to the notebook.
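
The engine.begin() block opens a transaction and commits it automatically on exit, so no explicit commit is needed. To confirm the rows actually landed, a quick check reusing the engine from above (table name taken from the sketch):

from sqlalchemy import text

with engine.connect() as conn:
    n = conn.execute(text("SELECT COUNT(*) FROM plasma_proteome_11")).scalar()
    print(n)  # counts only committed rows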

Hi, thanks for your message.

Unfortunately, it didn’t work using your method either.

The size in KB of the Postgres table increases, but no rows are added to the table.

If I run the code in a Jupyter notebook on my local PC, it works fine and the data is added to the table.

Weird! Any help would be appreciated.
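
A table growing in size while no new rows show up is consistent with inserts that were rolled back: the aborted tuples still occupy heap space until VACUUM reclaims them. A speculative check for a session stuck in an open transaction, using the same engine as above (pg_stat_activity is a standard PostgreSQL view):

from sqlalchemy import text

with engine.connect() as conn:
    stuck = conn.execute(text(
        "SELECT pid, state, query FROM pg_stat_activity "
        "WHERE state LIKE 'idle in transaction%'"
    )).fetchall()
    print(stuck)  # any rows here point at an uncommitted transaction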
