Amazon Redshift/PostgreSQL¶
Amazon Redshift/PostgreSQL data sources and sinks are in the database category of I/O connectors.
Tool documentation¶
Connector type values¶
The `"type"` value to use in the source or sink JSON files.
| Connector type | Value |
|---|---|
| Data source | DKDataSource_PostgreSQL |
| Data sink | DKDataSink_PostgreSQL |
Connection properties¶
You can choose between two connection methods:
- Username-Password method: specify values for the `database`, `hostname`, `username`, `password`, and `port` fields.
- AWS Credentials method: specify values for the `cluster`, `access-key`, and `secret-key` fields.
| Field | Scope | Type | Required? | Description |
|---|---|---|---|---|
| `database` | source/sink | string | depends on method | PostgreSQL database name. |
| `hostname` | source/sink | string | depends on method | PostgreSQL host name. |
| `username` | source/sink | string | depends on method | PostgreSQL username. |
| `password` | source/sink | string | depends on method | PostgreSQL password. |
| `port` | source/sink | string/number | depends on method | PostgreSQL port. |
| `cluster` | source/sink | string | depends on method | AWS cluster name. |
| `region` | source/sink | string | no | AWS region name. |
| `access-key` | source/sink | string | depends on method | AWS access key ID. |
| `secret-key` | source/sink | string | depends on method | AWS secret access key credential. |
| `aws-session-token` | source/sink | string | no | AWS session token, optionally passed with the access and secret keys in order to assume an IAM (AWS Identity and Access Management) role. |
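For the AWS Credentials method, a source configuration might look like the following sketch. The cluster name, region, and vault paths are placeholder values, not real credentials:

```json
{
    "type": "DKDataSource_PostgreSQL",
    "name": "postgresql_datasource",
    "config": {
        "cluster": "example-redshift-cluster",
        "region": "us-east-1",
        "access-key": "#{vault://aws/access-key}",
        "secret-key": "#{vault://aws/secret-key}"
    },
    "keys": {},
    "tests": {}
}
```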
Connections¶
See Connection Properties for more details on connection configurations.
Defined in kitchen-level variables¶
postgresql in kitchen overrides
```json
{
    "postgresqlconfig": {
        "username": "#{vault://postgresql/username}",
        "password": "#{vault://postgresql/password}",
        "hostname": "#{vault://postgresql/hostname}",
        "port": "#{vault://postgresql/port}",
        "database": "#{vault://postgresql/database}"
    }
}
```
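The `{{postgresqlconfig.username}}`-style references in the connector files resolve against these kitchen-level variables. Conceptually, the substitution behaves like simple template expansion over a nested dictionary; the following Python sketch is illustrative only and is not the tool's actual template engine (the variable values shown are placeholders):

```python
import re

# Kitchen-level variables as they might look after vault resolution.
# Values here are placeholders for illustration.
variables = {
    "postgresqlconfig": {
        "username": "analyst",
        "hostname": "db.example.com",
    }
}

def expand(text, variables):
    """Replace {{group.field}} references with values from a nested dict."""
    def resolve(match):
        group, field = match.group(1).split(".")
        return str(variables[group][field])
    return re.sub(r"\{\{(\w+\.\w+)\}\}", resolve, text)

print(expand('"hostname": "{{postgresqlconfig.hostname}}"', variables))
```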
The Connection tab in a Node Editor¶

Expanded connection syntax¶
For a data source¶
postgresql_datasource.json
```json
{
    "type": "DKDataSource_PostgreSQL",
    "name": "postgresql_datasource",
    "config": {
        "username": "{{postgresqlconfig.username}}",
        "password": "{{postgresqlconfig.password}}",
        "hostname": "{{postgresqlconfig.hostname}}",
        "port": "{{postgresqlconfig.port}}",
        "database": "{{postgresqlconfig.database}}"
    },
    "keys": {},
    "tests": {}
}
```
For a data sink¶
postgresql_datasink.json
```json
{
    "type": "DKDataSink_PostgreSQL",
    "name": "postgresql_datasink",
    "config": {
        "username": "{{postgresqlconfig.username}}",
        "password": "{{postgresqlconfig.password}}",
        "hostname": "{{postgresqlconfig.hostname}}",
        "port": "{{postgresqlconfig.port}}",
        "database": "{{postgresqlconfig.database}}"
    },
    "keys": {},
    "tests": {}
}
```
Condensed connection syntax¶
Note
Do not use quotes for your condensed connection configuration variables.
For a data source¶
postgresql_datasource.json
```json
{
    "type": "DKDataSource_PostgreSQL",
    "name": "postgresql_datasource",
    "config-ref": "postgresqlconfig",
    "keys": {},
    "tests": {}
}
```
For a data sink¶
postgresql_datasink.json
```json
{
    "type": "DKDataSink_PostgreSQL",
    "name": "postgresql_datasink",
    "config-ref": "postgresqlconfig",
    "keys": {},
    "tests": {}
}
```
Local connections¶
You can view PostgreSQL database contents locally by configuring a connection in an IDE like PyCharm or in a database query tool like DBeaver or pgAdmin.
Tip
Connecting PyCharm or DataGrip to PostgreSQL: See the Database connection topic in PyCharm help for detailed information.
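Most of these tools also accept a standard libpq-style connection URI built from the same fields used in the connector configuration. A minimal Python sketch, assuming example values (5439 is the default Redshift port; credentials must be percent-encoded):

```python
from urllib.parse import quote, urlparse

# Example connection fields; all values are placeholders.
config = {
    "username": "analyst",
    "password": "s3cret/with:chars",
    "hostname": "db.example.com",
    "port": 5439,
    "database": "warehouse",
}

# Build postgresql://user:pass@host:port/dbname, percent-encoding
# the credentials so reserved characters do not break the URI.
uri = "postgresql://{}:{}@{}:{}/{}".format(
    quote(config["username"], safe=""),
    quote(config["password"], safe=""),
    config["hostname"],
    config["port"],
    config["database"],
)

# Round-trip through urlparse to confirm the pieces survive encoding.
parsed = urlparse(uri)
print(parsed.hostname, parsed.port, parsed.path)
```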
Other configuration properties¶
See the following topics for common properties and runtime variables:
File encoding requirements¶
Files used with data sources and data sinks must be encoded in UTF-8. Non-Unicode characters can cause problems when sinking data to database tables and can produce errors when running related tests.
For CSV and other delimited files, use Save As in your spreadsheet program and select UTF-8 encoding, or use a text editor with encoding options.
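One way to catch a mis-encoded file before a run is to attempt a strict UTF-8 decode and, only if that fails, re-encode the file. A minimal sketch, assuming the fallback encoding is Latin-1 (adjust `fallback_encoding` for your actual source files):

```python
from pathlib import Path

def ensure_utf8(path, fallback_encoding="latin-1"):
    """Rewrite the file as UTF-8 if it does not already decode as UTF-8."""
    raw = Path(path).read_bytes()
    try:
        raw.decode("utf-8")
        return False  # already valid UTF-8, nothing to do
    except UnicodeDecodeError:
        text = raw.decode(fallback_encoding)
        Path(path).write_text(text, encoding="utf-8")
        return True  # file was converted in place

# Example: a Latin-1 file containing a non-ASCII character.
sample = Path("sample.csv")
sample.write_bytes("id,café\n".encode("latin-1"))
converted = ensure_utf8(sample)
print(converted, sample.read_text(encoding="utf-8"))
```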
Data source example¶
postgresql_datasource.json
```json
{
    "type": "DKDataSource_PostgreSQL",
    "name": "postgresql_datasource",
    "config-ref": "samples_postgres_config",
    "set-runtime-vars": {
        "key_count": "global_key_count"
    },
    "keys": {
        "create_table": {
            "sql": "{{load_text('postgres/create.sql')}}",
            "query-type": "execute_scalar",
            "set-runtime-vars": {
                "result": "create_table_result"
            }
        },
        "insert_data": {
            "sql": "{{load_text('postgres/insert.sql')}}",
            "query-type": "execute_scalar",
            "set-runtime-vars": {
                "result": "insert_data_result"
            }
        },
        "my_query": {
            "sql": "select * from {{samples_postgres_config.schema}}.{{redshifttablename}};",
            "query-type": "execute_query",
            "set-runtime-vars": {
                "column_count": "my_query_column_count",
                "row_count": "my_query_row_count"
            }
        }
    },
    "tests": {
        "test-runtime_variable_create_table_result": {
            "test-variable": "create_table_result",
            "action": "stop-on-error",
            "test-logic": "create_table_result == 0"
        },
        "test-runtime_variable_insert_data_result": {
            "test-variable": "insert_data_result",
            "action": "stop-on-error",
            "test-logic": "insert_data_result == 2"
        },
        "test-runtime_variable_my_query_column_count": {
            "test-variable": "my_query_column_count",
            "action": "stop-on-error",
            "test-logic": "my_query_column_count == 2"
        },
        "test-runtime_variable_my_query_row_count": {
            "test-variable": "my_query_row_count",
            "action": "stop-on-error",
            "test-logic": "my_query_row_count == 2"
        },
        "test-runtime_variable_global_key_count": {
            "test-variable": "global_key_count",
            "action": "stop-on-error",
            "test-logic": "global_key_count == 3"
        }
    }
}
```