Can't use Postgresql Connection ID to connect DB and create DAG #44553
Hello All,
I deployed the OpenMetadata environment in Docker. The Airflow version is 2.9.1.
I can create an Airflow connection in the UI, but it doesn't show up when I list connections with the CLI.
So I created one with the CLI:
airflow@68713c565cc0:/opt/airflow$ airflow connections add 'AF_POSTGRES_CONN' \
[2024-12-02T00:57:35.581+0000] {crypto.py:82} WARNING - empty cryptography key - values will not be stored encrypted.
Successfully added `conn_id`=AF_POSTGRES_CONN : postgres://openmetadata_user:******@172.16.240.3:5432/openmetadata_db

airflow@68713c565cc0:/opt/airflow$ airflow connections list
id | conn_id          | conn_type | description | host         | schema          | login             | password              | port | is_encrypted | is_extra_encrypted | extra_dejson | get_uri
===+==================+===========+=============+==============+=================+===================+=======================+======+==============+====================+==============+======================================================================================
1  | AF_POSTGRES_CONN | postgres  | None        | 172.16.240.3 | openmetadata_db | openmetadata_user | openmetadata_password | 5432 | False        | False              | {}           | postgres://openmetadata_user:openmetadata_password@172.16.240.3:5432/openmetadata_db
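(Note: the trailing backslash on my `connections add` command above means the rest of the line was cut off when I pasted it. I believe the one-line equivalent, with the URI assembled from the columns in the list output, would be roughly:)

airflow connections add 'AF_POSTGRES_CONN' \
    --conn-uri 'postgres://openmetadata_user:openmetadata_password@172.16.240.3:5432/openmetadata_db'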
When I try to create the DAG with Python, it raises this error:
airflow@68713c565cc0:/opt/airflow/dags$ python ./demo_02.py
Traceback (most recent call last):
File "/opt/airflow/dags/./demo_02.py", line 35, in
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/baseoperator.py", line 484, in apply_defaults
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/providers/common/sql/operators/sql.py", line 232, in init
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/baseoperator.py", line 484, in apply_defaults
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/providers/common/sql/operators/sql.py", line 142, in init
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/baseoperator.py", line 484, in apply_defaults
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/baseoperator.py", line 881, in init
airflow.exceptions.AirflowException: Invalid arguments were passed to SQLExecuteQueryOperator (task_id: create_table). Invalid arguments were:
**kwargs: {'postgres_conn_id': 'AF_POSTGRES_CONN'}
airflow@68713c565cc0:/opt/airflow/dags$ cat demo_02.py
from datetime import datetime, timedelta
from airflow import DAG
#from airflow.providers.postgres.operators.postgres import PostgresOperator
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator as PostgresOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.operators.python import PythonOperator

# Define the default arguments for the DAG
default_args = {
    # ... contents lost in the paste ...
}

# Define the DAG
dag = DAG(
    # ... arguments lost in the paste ...
)

# Define a Python function to insert records using PostgresHook
def insert_records_with_hook():
    # ... body lost in the paste ...
    pass

# Define the tasks
create_table = PostgresOperator(
    task_id='create_table',
    postgres_conn_id='AF_POSTGRES_CONN',  # this kwarg triggers the exception above
    # ... remaining arguments lost in the paste ...
)

insert_records_task = PythonOperator(
    # ... arguments lost in the paste ...
)

# Set up the task dependencies
create_table >> insert_records_task
Is there any issue with this, or can someone give me sample code to finish my task?
Thanks a lot,
Tony
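Update: reading the error again, it looks like SQLExecuteQueryOperator takes a generic conn_id argument, while postgres_conn_id belonged to the old PostgresOperator, so aliasing the import alone is not enough; the kwarg has to change too. Below is a minimal sketch of what I think the corrected DAG would look like. The schedule, default_args, and SQL statements are placeholders I invented, not my original values:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

default_args = {
    "owner": "airflow",  # placeholder
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    dag_id="demo_02",
    default_args=default_args,
    start_date=datetime(2024, 12, 1),
    schedule=None,  # placeholder; set a real schedule as needed
    catchup=False,
)

# Insert records through PostgresHook, which still accepts postgres_conn_id
def insert_records_with_hook():
    hook = PostgresHook(postgres_conn_id="AF_POSTGRES_CONN")
    hook.run("INSERT INTO demo_table (name) VALUES ('hello');")  # placeholder SQL

create_table = SQLExecuteQueryOperator(
    task_id="create_table",
    conn_id="AF_POSTGRES_CONN",  # SQLExecuteQueryOperator uses conn_id, not postgres_conn_id
    sql="CREATE TABLE IF NOT EXISTS demo_table (id SERIAL PRIMARY KEY, name TEXT);",  # placeholder SQL
    dag=dag,
)

insert_records_task = PythonOperator(
    task_id="insert_records",
    python_callable=insert_records_with_hook,
    dag=dag,
)

create_table >> insert_records_task

If that is right, the connection itself is fine and only the operator kwarg needs to change.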