Microsoft SQL Server/Azure Synapse¶
Microsoft SQL Server (MS SQL)/Azure Synapse data sources and sinks are in the database category of I/O connectors.
Tool documentation¶
Connector type values¶
The "type" value to use in the source or sink JSON files.
| Connector type | Value |
|---|---|
| Data source | DKDataSource_MSSQL |
| Data sink | DKDataSink_MSSQL |
Connection properties¶
The properties to use when connecting to a Microsoft SQL Server/Azure Synapse instance from Automation.
| Field | Scope | Type | Required? | Description |
|---|---|---|---|---|
| database | source/sink | string | yes | MSSQL database name. |
| hostname | source/sink | string | yes | MSSQL host name. |
| password | source/sink | string | yes | MSSQL password. |
| port | source/sink | string/number | no | MSSQL port. Default port is 1433. |
| username | source/sink | string | yes | MSSQL username. An Azure Synapse username must include the subdomain. As per this pymssql resource, you must use `username@xxx`, as in 'username@server' rather than 'username' or 'username@server.sql.azuresynapse.net'. |
| azure-database | source/sink | Boolean | yes | Selection for Azure Synapse database (true) or Microsoft SQL Server database (false). |
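The Azure Synapse username rule above can be easy to get wrong. The helper below is a hypothetical sketch (not part of the connector) that derives the `username@server` form pymssql expects from a username and a fully qualified host name:

```python
def format_synapse_username(user: str, server: str) -> str:
    """Return a username in the 'user@server' form that pymssql expects
    for Azure Synapse. Illustrative helper only."""
    # Drop any existing '@...' suffix, e.g. '@server.sql.azuresynapse.net'
    base = user.split("@", 1)[0]
    # Keep only the server subdomain, not the fully qualified host name
    subdomain = server.split(".", 1)[0]
    return f"{base}@{subdomain}"
```

For example, both `alice` and `alice@server.sql.azuresynapse.net` normalize to `alice@server` when the host is `server.sql.azuresynapse.net`.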
Connections¶
See Connection Properties for more details on connection configurations.
Defined in kitchen-level variables¶
mssqlconfig in kitchen overrides
{
"mssqlconfig": {
"username": "#{vault://mssql/username}",
"password": "#{vault://mssql/password}",
"hostname": "#{vault://mssql/hostname}",
"database": "#{vault://mssql/database}",
"port": "#{vault://mssql/port}",
"azure-database": false
}
}
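The `#{vault://...}` references above are resolved by the platform against its secret store at runtime. As a rough illustration of that substitution only (the function name and the simple path-to-secret mapping are assumptions, not the real implementation):

```python
import re

def resolve_vault_refs(config: dict, secrets: dict) -> dict:
    """Replace '#{vault://path}' placeholders with values from a
    path -> secret mapping. Illustrative sketch of the platform's
    runtime resolution; not the actual mechanism."""
    pattern = re.compile(r"#\{vault://([^}]+)\}")
    resolved = {}
    for key, value in config.items():
        if isinstance(value, str):
            value = pattern.sub(lambda m: str(secrets[m.group(1)]), value)
        resolved[key] = value
    return resolved

secrets = {"mssql/username": "svc_user", "mssql/port": "1433"}
conf = {
    "username": "#{vault://mssql/username}",
    "port": "#{vault://mssql/port}",
    "azure-database": False,
}
resolved = resolve_vault_refs(conf, secrets)
```

Non-string values such as the `azure-database` Boolean pass through unchanged.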
The Connection tab in a Node Editor¶

Expanded Connection Syntax¶
For a data source¶
mssql_datasource.json
{
"name": "mssql_datasource",
"type": "DKDataSource_MSSQL",
"config": {
"username": "{{mssqlconfig.username}}",
"password": "{{mssqlconfig.password}}",
"hostname": "{{mssqlconfig.hostname}}",
"database": "{{mssqlconfig.database}}",
"port": "{{mssqlconfig.port}}",
"azure-database": false
},
"keys": {},
"tests": {}
}
For a data sink¶
mssql_datasink.json
{
"name": "mssql_datasink",
"type": "DKDataSink_MSSQL",
"config": {
"username": "{{mssql.username}}",
"password": "{{mssql.password}}",
"hostname": "{{mssql.hostname}}",
"database": "{{mssql.database}}",
"port": "{{mssql.port}}",
"azure-database": false
},
"keys": {},
"tests": {}
}
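Since the connection-properties table notes the pymssql username format, it is plausible the connector uses pymssql internally. Assuming that, the expanded config keys would map onto `pymssql.connect()` keyword arguments roughly as follows (a sketch for orientation, not the connector's actual code):

```python
def to_pymssql_kwargs(config: dict) -> dict:
    """Map the connector's config keys onto pymssql.connect() keyword
    arguments. Assumes pymssql under the hood, per the username note
    in the connection-properties table."""
    return {
        "server": config["hostname"],
        "user": config["username"],
        "password": config["password"],
        "database": config["database"],
        # pymssql accepts the port as an int; the config allows string/number
        "port": int(config.get("port", 1433)),
    }

# Usage: pymssql.connect(**to_pymssql_kwargs(resolved_config))
```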
Condensed connection syntax¶
Note
Do not use quotes for your condensed connection configuration variables.
For a data source¶
mssql_datasource.json
{
"type": "DKDataSource_MSSQL",
"name": "mssql_datasource",
"config-ref": "mssqlconfig",
"keys": {},
"tests": {}
}
For a data sink¶
mssql_datasink.json
{
"type": "DKDataSink_MSSQL",
"name": "mssql_datasink",
"config-ref": "mssqlconfig",
"keys": {},
"tests": {}
}
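The condensed syntax simply points at the kitchen-level variable by name instead of repeating each field. Conceptually, the platform resolves a node's effective config like this (an illustrative sketch; Automation performs this lookup itself):

```python
def effective_config(node: dict, kitchen_vars: dict) -> dict:
    """Return the connection config for a node, whether it uses the
    expanded 'config' block or the condensed 'config-ref' name.
    Illustrative only."""
    if "config" in node:
        return node["config"]
    # Condensed syntax: look the named config up in kitchen variables
    return kitchen_vars[node["config-ref"]]

kitchen_vars = {"mssqlconfig": {"hostname": "db.example.com", "azure-database": False}}
node = {
    "type": "DKDataSource_MSSQL",
    "name": "mssql_datasource",
    "config-ref": "mssqlconfig",
}
```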
Local connections¶
MS SQL database contents may be viewed locally by configuring connections with IDEs like PyCharm or database query tools like DBeaver or SQL Server Management Studio.
Tip
Connecting PyCharm or DataGrip to MS SQL: See the Database connection topic in PyCharm help for detailed information.
Other configuration properties¶
See the following topics for common properties and runtime variables.
Additional MS SQL step (key) properties¶
| Field | Type | Required? | Description |
|---|---|---|---|
| batch-size | string/number | no | Supported for MS SQL data sinks only. Specifies the number of rows to send to the server as a single packet. Default is 5000. Used by the freebcp query type only. |
| maxerrors | integer | no | Available only when query-type is freebcp. Specifies the maximum number of errors allowed on insert. |
File encoding requirements¶
Files used with data sources and data sinks must be encoded in UTF-8. Non-Unicode characters can cause problems when sinking data to database tables and errors when running related tests.
For CSV and other delimited files, use Save As in the originating program and select UTF-8 encoding, or consider using a text editor with encoding options.
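If a file was exported in a legacy encoding, it can also be re-saved as UTF-8 programmatically. A minimal sketch, assuming the source encoding is known (cp1252 is a common Windows export default; adjust as needed):

```python
from pathlib import Path

def reencode_to_utf8(path: str, source_encoding: str = "cp1252") -> None:
    """Re-save a delimited file as UTF-8 in place.

    Assumes the original encoding is known; decoding with the wrong
    source encoding silently corrupts non-ASCII characters.
    """
    text = Path(path).read_text(encoding=source_encoding)
    Path(path).write_text(text, encoding="utf-8")
```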
Data source example¶
Configure a key to insert a WAITFOR interval.
mssql_datasource.json
{
"type": "DKDataSource_MSSQL",
"name": "mssql_datasource",
"config": {{mssqlconfig}},
"keys": {
"wait": {
"query-type": "execute_scalar",
"sql": "WAITFOR DELAY '{{delay_time2}}';"
}
},
"tests": {
}
}
Data sink example¶
Load data from a file-based source into a SQL Server table by mapping through a delimited file.
Note the col-delimiter and date-format required for proper mapping into MS SQL.
mssql_datasink.json
{
"type": "DKDataSink_MSSQL",
"name": "mssql_datasink",
"config": {{mssqlconfig}},
"keys": {
"mapping1_sink": {
"query-type": "freebcp",
"table-name": "{{redshiftConfig.schema}}.raw_demo__orders",
"format": "csv",
"col-delimiter": "|",
"date-format": "%Y-%m-%d"
}
},
"tests": {
"Num_New_Orders_Min": {
"action": "stop-on-error",
"test-variable": "num_new_orders",
"type": "test-contents-as-integer",
"test-logic": {
"test-compare": "greater-than",
"test-metric": 1000
}
}
}
}
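To see why `col-delimiter` and `date-format` matter, here is a sketch of producing rows in the shape this sink key declares: pipe-delimited columns with dates rendered as `%Y-%m-%d`. The function and sample data are illustrative, not part of the connector:

```python
import csv
import io
from datetime import date

def rows_to_pipe_csv(rows):
    """Serialize rows using the '|' col-delimiter and '%Y-%m-%d'
    date-format declared in the sink key. Illustrative only."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="|", lineterminator="\n")
    for row in rows:
        writer.writerow(
            [v.strftime("%Y-%m-%d") if isinstance(v, date) else v for v in row]
        )
    return buf.getvalue()

sample = [(1001, date(2024, 3, 15), "shipped")]
# rows_to_pipe_csv(sample) yields "1001|2024-03-15|shipped\n"
```

A date formatted any other way (e.g. `03/15/2024`) would fail to map cleanly into an MS SQL date column during the freebcp load.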