dbt-sqlserver's Issues

dbt data tests error with 'with' clause usage

-- assert count of rows

with cte as (
select count(distinct([category])) as cnt
from table
)
select cnt
from cte
where cnt <> 2

('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near the keyword 'with'. (156) (SQLExecDirectW); [42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon. (319); [42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near ')'. (102)")

I then add a semicolon before the WITH and get this error.

('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near ';'. (102) (SQLExecDirectW); [42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near ')'. (102)")

Any thoughts on why this is occurring?
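
For context: dbt wraps data tests in an internal CTE in order to count failing rows, so a test that itself begins with WITH compiles to one WITH clause nested inside another, which SQL Server rejects (the 319 error above hints at this). Illustratively, using the internal CTE name shown in the Synapse issue further down:

    with dbt__CTE__INTERNAL_test as (
    with cte as (
    select count(distinct([category])) as cnt
    from table
    )
    select cnt
    from cte
    where cnt <> 2
    ) select count(*) from dbt__CTE__INTERNAL_test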

README documentation suggestion - Username should be in domain\username

@mikaelene sweet adapter! Thank you for this contribution!

When I first tried to connect with dbt debug, I could not connect until my username was in the domain\username format. At first, I was expecting a plain username or a user@domain format. I didn't think to use the domain\username format until reading some documentation on pymssql (link).

Not sure if anyone else needs the hint or if the situation is specific to my setup. But if you think it is helpful, perhaps something similar to the following would nudge users in the right direction.

Configure your profile

  type: sqlserver
  ...
  user: domain\username
  ...

Define as_columnstore=false from dbt_project.yml

Hi,

I am trying to define the config option as_columnstore from dbt_project.yml:

models:
  my_project:
    mart:
      materialized: table
      as_columnstore: False

The materialized config works, but as_columnstore seems to be ignored.

Am I doing something wrong or is this currently unsupported?

Thanks, Kim
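
For anyone else hitting this: one workaround, while the project-level config is sorted out, is to set the option in the model file itself. This is only a sketch, assuming the adapter honors model-level configs the same way it honors materialized:

    {{
      config(
        materialized='table',
        as_columnstore=false
      )
    }}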

dbt version 0.17.2 - dbt found two macros with the name "test_unique" in the namespace "dbt"

Getting the issue below:

c:\DV_Demo>dbt test
Running with dbt=0.17.2
* Deprecation Warning: dbt v0.17.0 introduces a new config format for the
dbt_project.yml file. Support for the existing version 1 format will be removed
in a future release of dbt. The following packages are currently configured with
config version 1:
 - dbt_sqlserver

For upgrading instructions, consult the documentation:
  https://docs.getdbt.com/docs/guides/migration-guide/upgrading-to-0-17-0
Encountered an error:
Compilation Error
  dbt found two macros with the name "test_unique" in the namespace "dbt". Since these macros have the same name and exist in the same namespace, dbt will be unable to decide which to call. To fix this, change the name of one of these macros:
  - macro.dbt_sqlserver.test_unique (macros\schema_tests.sql)
  - macro.dbt_sqlserver.test_unique (macros\schema_tests.sql)

Version check:

c:\DV_Demo>dbt --version
installed version: 0.17.2
   latest version: 0.17.2

Up to date!

Plugins:
  - postgres: 0.17.2
  - redshift: 0.17.2
  - snowflake: 0.17.2

This was working with 0.15.2; after upgrading dbt with --upgrade I am getting this issue.

dbt found two macros with the name "test_unique" in the namespace "dbt".

80% chance I'm missing something obvious here... I think it has to do with my conda env, but here goes anyway.

I used pip to install dbt-sqlserver. dbt debug checks out, no problem. However, I'm encountering the issue below with dbt run.

> dbt run -t sqlserver
Running with dbt=0.16.1
Encountered an error:
Compilation Error
  dbt found two macros with the name "test_unique" in the namespace "dbt". Since these macros have the same name and exist in the same namespace, dbt will be unable to decide which to call. To fix this, change the name of one of these macros:
  - macro.dbt_sqlserver.test_unique (macros/schema_tests.sql)
  - macro.dbt_sqlserver.test_unique (macros/schema_tests.sql)

here's the full stacktrace from the logs...

Traceback (most recent call last):
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/main.py", line 81, in main
    results, succeeded = handle_and_check(args)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/main.py", line 159, in handle_and_check
    task, res = run_from_args(parsed)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/main.py", line 212, in run_from_args
    results = task.run()
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/task/runnable.py", line 351, in run
    self._runtime_initialize()
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/task/runnable.py", line 107, in _runtime_initialize
    super()._runtime_initialize()
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/task/runnable.py", line 75, in _runtime_initialize
    self.load_manifest()
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/task/runnable.py", line 63, in load_manifest
    self.manifest = get_full_manifest(self.config)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/perf_utils.py", line 23, in get_full_manifest
    return load_manifest(config, internal, set_header)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/parser/manifest.py", line 646, in load_manifest
    return ManifestLoader.load_all(config, internal_manifest, macro_hook)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/parser/manifest.py", line 336, in load_all
    loader.load(internal_manifest=internal_manifest)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/parser/manifest.py", line 204, in load
    self.macro_hook(macro_manifest)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/perf_utils.py", line 21, in set_header
    adapter.connections.set_query_header(manifest)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/adapters/base/connections.py", line 46, in set_query_header
    self.query_header = MacroQueryStringSetter(self.profile, manifest)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/adapters/base/query_headers.py", line 75, in __init__
    ctx = self._get_context()
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/adapters/base/query_headers.py", line 84, in _get_context
    return generate_query_header_context(self.config, self.manifest)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/context/configured.py", line 136, in generate_query_header_context
    return ctx.to_dict()
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/context/configured.py", line 121, in to_dict
    dct.update(self.get_macros())
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/context/configured.py", line 116, in get_macros
    nsp.add_macros(self.manifest.macros.values(), self._ctx)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/context/configured.py", line 76, in add_macros
    self.add_macro(macro, ctx)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/context/configured.py", line 66, in add_macro
    raise_duplicate_macro_name(macro_func.macro, macro, namespace)
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/exceptions.py", line 664, in raise_duplicate_macro_name
    node_2.unique_id, node_2.original_file_path
  File "/Users/anders.swanson/opt/anaconda3/lib/python3.7/site-packages/dbt/exceptions.py", line 363, in raise_compiler_error
    raise CompilationException(msg, node)
dbt.exceptions.CompilationException: Compilation Error
  dbt found two macros with the name "test_unique" in the namespace "dbt". Since these macros have the same name and exist in the same namespace, dbt will be unable to decide which to call. To fix this, change the name of one of these macros:
  - macro.dbt_sqlserver.test_unique (macros/schema_tests.sql)
  - macro.dbt_sqlserver.test_unique (macros/schema_tests.sql)

Interpolate values in log lines

I'm seeing some issues where the logs in logs/dbt.log are missing some crucial information that would help with debugging.

Check out this example:
https://github.com/mikaelene/dbt-sqlserver/blob/fb228938805289b1ba84bf21534730d3e8d5a63c/dbt/adapters/sqlserver/connections.py#L144-L146

I think this code should instead read:

    logger.debug('On {}: {}....', connection.name, sql[0:512])
else:
    logger.debug('On {}: {}', connection.name, sql)

Changing the %s fields to {} will get the logger to correctly interpolate log values, as shown here.

I'd be happy to make a PR for this change :)

Overriding the database in the dbt_project file does not work

When I override the database to something different from the database configured in the profile, the CLI says it's going to build into the override database, but it's executed on the profile database.

Example profiles.yml:

datawarehouse:
  target: development
  outputs:
    development:
      type: sqlserver
      driver: 'ODBC Driver 17 for SQL Server'
      server: bisql
      port: 1433
      windows_login: True
      database: Test
      schema: test
      threads: 4

example dbt_project.yml:

...
models:
  my_datawarehouse:
    materialized: view
    products:
      +database: ProductsModels
      +schema: ProductsModels

When run, the CLI says "created view model ProductsModels.ProductsModels.Metrics...", but in reality the view is created as Test.ProductsModels.Metrics.

Refactor the create table as macros

The macro for creating a table for a model (sqlserver__create_table_as) is quite complicated, and in case of an error it leaves unfinished models in the database. The next run might then error because the temp view already exists, etc.

The macro should be split up into different parts, and it could perhaps use the sqlserver__create_view_as macro instead of encapsulating the view creation in an EXEC() command. Or is there a dbt macro that also drops the temp view if it already exists?
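
For illustration, making the temp-view step idempotent could be as simple as a guard along these lines (object names hypothetical; note that DROP ... IF EXISTS itself requires SQL Server 2016+, as a later issue here points out):

    IF OBJECT_ID('dbo.my_model__dbt_tmp_temp_view', 'V') IS NOT NULL
        DROP VIEW dbo.my_model__dbt_tmp_temp_view;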

Model files throw 20018 errors

Summary

What happened:

  • Loaded CSVs as flat files directly to DB via SQL Server Management Studio
  • Ran dbt run <- errors thrown for 3 models in the jaffle_shop model directory. See errors below.

What I expected to happen:

After loading CSVs directly into SQL Server DB, I expected dbt run to successfully load and run models found in staging and marts.

Errors

Running with dbt=0.14.0
Found 8 models, 20 tests, 0 snapshots, 0 analyses, 131 macros, 0 operations, 3 seed files, 0 sources

23:01:38 | Concurrency: 1 threads (target='dev')
23:01:38 |
23:01:38 | 1 of 8 START view model dbt_dev.stg_orders....................... [RUN]
23:01:40 | 1 of 8 OK created view model dbt_dev.stg_orders.................. [OK in 1.61s]
23:01:40 | 2 of 8 START view model dbt_dev.stg_payments..................... [RUN]
23:01:41 | 2 of 8 OK created view model dbt_dev.stg_payments................ [OK in 0.33s]
23:01:41 | 3 of 8 START view model dbt_dev.stg_customers.................... [RUN]
23:01:42 | 3 of 8 OK created view model dbt_dev.stg_customers............... [OK in 0.33s]
23:01:42 | 4 of 8 START table model dbt_dev.customer_orders................. [RUN]
23:01:42 | 4 of 8 ERROR creating table model dbt_dev.customer_orders........ [ERROR in 0.31s]
23:01:42 | 5 of 8 START table model dbt_dev.customer_payments............... [RUN]
23:01:43 | 5 of 8 ERROR creating table model dbt_dev.customer_payments...... [ERROR in 0.24s]
23:01:43 | 6 of 8 START table model dbt_dev.order_payments.................. [RUN]
23:01:44 | 6 of 8 ERROR creating table model dbt_dev.order_payments......... [ERROR in 0.24s]
23:01:44 | 7 of 8 SKIP relation dbt_dev.dim_customers....................... [SKIP]
23:01:44 | 8 of 8 SKIP relation dbt_dev.fct_orders.......................... [SKIP]
23:01:44 |
23:01:44 | Finished running 3 view models, 5 table models in 5.97s.

Completed with 3 errors:

Database Error in model customer_orders (models\marts\core\intermediate\customer_orders.sql)
  (208, b"Invalid object name 'dbt_dev.customer_orders__dbt_tmp_temp_view'.DB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\n")
  compiled SQL at target\compiled\jaffle_shop\marts\core\intermediate\customer_orders.sql

Database Error in model customer_payments (models\marts\core\intermediate\customer_payments.sql)
  (208, b"Invalid object name 'dbt_dev.customer_payments__dbt_tmp_temp_view'.DB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\n")
  compiled SQL at target\compiled\jaffle_shop\marts\core\intermediate\customer_payments.sql

Database Error in model order_payments (models\marts\core\intermediate\order_payments.sql)
  (208, b"Invalid object name 'dbt_dev.order_payments__dbt_tmp_temp_view'.DB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\n")
  compiled SQL at target\compiled\jaffle_shop\marts\core\intermediate\order_payments.sql

Done. PASS=3 ERROR=3 SKIP=2 TOTAL=8

Environment Details

  • remote MSSQL: MS SQL Server 2017
  • dbt version: 0.14.0
  • dbt-sqlserver: 0.1.3
  • pymssql = 2.1.4

custom data tests with CTEs fail

background

Per @alittlesliceoftom's Slack thread.

Steps to reproduce:

  1. Write a data test that uses a CTE.
  2. dbt wraps the data test's query into another CTE.
  3. Synapse returns an error because it does not allow nested CTEs.

Example

tests/example.sql

WITH cte_test AS (
    SELECT * FROM {{ref('interesting_moments')}}
)
select TOP 0 * FROM cte_test

target/compiled/ava_its_data/tests/example.sql

This doesn't work, as Synapse does not allow a WITH clause nested inside another WITH clause:

with dbt__CTE__INTERNAL_test as (
WITH cte_test AS (
    SELECT * FROM "june_sql_pool"."ajs_mkt"."interesting_moments"
)
select TOP 0 * FROM cte_test
)select count(*) from dbt__CTE__INTERNAL_test

potential workarounds

haven't thought much about the best solution yet...

add the wrapped CTE to the end of the CTE chain?

Synapse doesn't support nested CTEs, but it does support "chained referencing" of CTEs inside a single WITH clause. So the example below works, but I don't know how I would conditionally do this only when the data test has a CTE.

WITH
cte_test AS (
    SELECT * FROM "june_sql_pool"."ajs_mkt"."interesting_moments"
),
dbt__CTE__INTERNAL_test AS (
    select TOP 0 * FROM cte_test
)
select count(*) from dbt__CTE__INTERNAL_test

create and dispose of a view / temp table instead of a CTE?

CREATE VIEW  dbt__CTE__INTERNAL_test as
WITH cte_test AS (
    SELECT * FROM "june_sql_pool"."ajs_mkt"."interesting_moments"
)
select TOP 0 * FROM cte_test;

select count(*) from dbt__CTE__INTERNAL_test

DROP VIEW dbt__CTE__INTERNAL_test

New adapter testing framework

Hey Mikael! Nice work on the quick v0.18.0 turnaround :)

We've released a new testing framework, pytest-dbt-adapter:

The dbt-integration-tests framework was a neat experiment with behave, and it served us well for the past year, but we found it wasn't a sustainable way to support the matrix of functionality across databases. With the flexibility of the pytest-based approach, each adapter can declare its supported features by overriding or disabling the predefined test sequences.

Running tests with pytest-dbt-adapter would look like adding a file sqlserver.dbtspec and executing pytest path/to/sqlserver.dbtspec. The repo linked above contains sample specs for Postgres, Spark, and Presto. You can also see how we've hooked up the tests to run via tox + CircleCI in the dbt-spark and dbt-presto repos.

We'd like to support you in adopting this testing framework, whether that's by writing the code, answering questions, or making needed adjustments to the existing tests. Let me know!

clean up relation construction

we should try to standardize on a convention of using {{ relation }} wherever possible over these variants:

  • {{ relation.schema }}.{{ relation.identifier }}
  • to_relation.schema ~ '.' ~ to_relation.identifier

@jtcohen6, how do we disable the database being included by default when calling {{ relation }}? Do we have to create a child of the Relation class?

EXEC('CREATE SCHEMA {{ relation.without_identifier().schema }}')

create view {{ relation.schema }}.{{ relation.identifier }} as

EXEC sp_rename '{{ from_relation.schema }}.{{ from_relation.identifier }}', '{{ to_relation.identifier }}'

{%- set full_relation = relation.schema ~ '.' ~ relation.identifier -%}

EXEC('create view {{ tmp_relation.schema }}.{{ tmp_relation.identifier }} as
{{ temp_view_sql }}
');

{%- set full_to_relation = to_relation.schema ~ '.' ~ to_relation.identifier -%}
{%- set full_from_relation = from_relation.schema ~ '.' ~ from_relation.identifier -%}
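
If the standard BaseRelation API applies here, one answer that avoids subclassing is the include() helper, which renders a relation without its database part. A sketch of what the rename call above could become:

    EXEC sp_rename '{{ from_relation.include(database=false) }}', '{{ to_relation.identifier }}'

That would replace the schema ~ '.' ~ identifier concatenations with a single rendered relation.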

increase CI test coverage for connection target types

Given the weird behavior we saw with connection strings and SqlPassword=True, @NandanHegde15 had the idea of a separate Circle CI job that would verify that dbt debug works on all commonly used target configurations. We can either add this to #62 or do it in a new PR after #62 is merged.

The ones I can think of:

  • trusted_connection=True
  • windows_integrated=True
  • when Encrypt and TrustServerCertificate aren't specified
  • the various combos of Encrypt, TrustServerCertificate, and SqlPassword
  • Active Directory
    • Password
    • Integrated (trickier as it would require a VM and an AD user in the same RG as the Azure SQL server)
    • Interactive

sqlserver__get_columns_in_query not working

background

sqlserver__get_columns_in_query was added by @qed- in #56 as basically a near copy of the base adapter version, get_columns_in_query. I'm working on it for the workaround suggested by the venerable @jtcohen6 in microsoft/dbt-synapse#40.

TL;DR

The return statement below isn't working because:

  • accessing the .columns attr of load_result().table throws an error, and
  • when I can get an Agate table, I can't access its 'column' column and turn it into a list
{{ return(load_result('get_columns_in_query').table.columns | map(attribute='name') | list) }}

{% macro sqlserver__get_columns_in_query(select_sql) %}
{% call statement('get_columns_in_query', fetch_result=True, auto_begin=False) -%}
select TOP 0 * from (
{{ select_sql }}
) as __dbt_sbq
where 0 = 1
{% endcall %}
{{ return(load_result('get_columns_in_query').table.columns | map(attribute='name') | list) }}
{% endmacro %}

reproduction

In the https://github.com/dbt-msft/jaffle_shop_mssql/ repo, I make a dummy model for testing (anders.sql, below) after running dbt seed.

environment

installed dbt-sqlserver in develop mode (pip install -e .) from the latest commit of the master branch (full list of packages installed)

$ pip list | grep "dbt"
dbt-core                 0.19.0rc2
dbt-sqlserver            0.19.0rc2  /Users/anders.swanson/repos/dbt-sqlserver
pytest-dbt-adapter       0.4.0

anders.sql

{% set select_sql = 'select * from ' ~ ref('raw_orders') %}
{% set output = sqlserver__get_columns_in_query(select_sql) %}
{{ log('output:\n' ~ output, info=True) }}

error

full stacktrace

Running with dbt=0.19.0-rc2
Encountered an error:
Compilation Error in model anders (models/marts/core/anders.sql)
  'None' has no attribute 'table'
  
  > in macro sqlserver__get_columns_in_query (macros/adapters.sql)
  > called by model anders (models/marts/core/anders.sql)

attempts to fix

1) adding log information

I commented out the return statement and replaced it with these logs; below is the result.

{% set result = load_result('get_columns_in_query') %}
{{ log('result:\n' ~ result, info=True) }}
{{ log('result_table:\n' ~ result['table'], info=True) }}
{{ return([]) }}

full log

Running with dbt=0.19.0-rc2
result:
None
result_table:

output:
[]
Found 9 models, 20 tests, 0 snapshots, 0 analyses, 363 macros, 0 operations, 3 seed files, 0 sources, 0 exposures

17:35:15 | Concurrency: 1 threads (target='azsql')
17:35:15 | 
result:
{'response': AdapterResponse(_message='OK', code=None, rows_affected=0), 'data': [], 'table': <agate.table.Table object at 0x7f8d7d133240>}
result_table:
| column     | data_type |
| ---------- | --------- |
| id         | Number    |
| user_id    | Number    |
| order_date | Number    |
| status     | Number    |

2) using sqlserver__get_columns_from_relation()'s return syntax

sqlserver__get_columns_from_relation()'s return syntax works fine with

  {% set table = load_result('get_columns_in_relation').table %}
  {{ return(sql_convert_columns_in_relation(table)) }}

So I tried that, but sql_convert_columns_in_relation() expects each column to be a row, so converted_columns is an empty list.

{% set table = load_result('get_columns_in_query').table %}
{{ log('table:\n' ~ table, info=True) }}
{% set coverted_columns = sql_convert_columns_in_relation(table) %}
{{ log('coverted_columns:\n' ~ coverted_columns, info=True) }}
{{ return(coverted_columns) }}
table:
| column     | data_type |
| ---------- | --------- |
| id         | Number    |
| user_id    | Number    |
| order_date | Number    |
| status     | Number    |

coverted_columns:
[]
output:
[]
17:49:06 | Done.

3) access first column of Agate table

I thought I could copy this pattern from the Agate docs to get the 'column' column:
table.columns[0] or table.columns['column']. It fails again with the same original error:

 'None' has no attribute 'table'

Which peeves me to no end, because you can see above that result.table is an agate.table.Table object!

4) debug in an ipython term to figure out how to properly query Agate table...

  1. I tried to use {{ debug() }} but got an error that debug is undefined.
  2. I saw the note in the docs about setting DBT_MACRO_DEBUGGING.
  3. I ran export DBT_MACRO_DEBUGGING=1.
  4. I called dbt compile and got "No module named 'ipdb'".
  5. pip install ipdb
  6. Still "No module named 'ipdb'".
  7. Opened this GitHub issue.
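
One observation on the logs in attempt 1: the first result: None is printed before the "Found 9 models ..." line, i.e. while dbt is still parsing, when load_result() has nothing to return yet; the populated Agate table only shows up at execution time. If that's the cause, the usual pattern is to guard the return with dbt's execute flag. A sketch, not a verified fix for the .columns access itself:

    {% if execute %}
        {{ return(load_result('get_columns_in_query').table.columns | map(attribute='name') | list) }}
    {% else %}
        {{ return([]) }}
    {% endif %}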

dbt version 0.16 issue

Hi, I upgraded dbt from 0.15.2 to 0.16.1 and dbt-sqlserver stopped working, and I am confused by the error. I haven't found any duplicates in schema_tests.sql.

Running with dbt=0.16.1
Encountered an error:
Compilation Error
  dbt found two macros with the name "test_unique" in the namespace "dbt". Since these macros have the same name and exist in the same namespace, dbt will be unable to decide which to call. To fix this, change the name of one of these macros:
  - macro.dbt_sqlserver.test_unique (macros\schema_tests.sql)
  - macro.dbt_sqlserver.test_unique (macros\schema_tests.sql)

thanks

how to debug adapter macros?

full disclosure: total newb post

I'm trying to adapt your adapter to azure-synapse. I'm getting (and expecting) errors, but I'm unsure how to isolate which macros I need to tweak. Normally this is where I'd attach a debugger and step through the process, but I'm definitely missing some skills and context.

Originally, I was getting an error related to sp_rename which makes sense as the sproc isn't supported in Synapse. So I CMD+F for sp_rename and find the sqlserver__rename_relation macro, which I tweak to use the Synapse-supported RENAME OBJECT syntax.

For example, I'm trying to create a simple view with zero jinja in it (for simplicity's sake).

But now I'm getting the following error, which is troubling, because the sql compiled at stage/stg_headcount.sql works without issue. It must be another macro, no?

error (\n added for clarity)

here's the full log w/ stacktrace

> dbt run --target sqlserver --model stg_headcount
('42000', "[42000]
[Microsoft][ODBC Driver 17 for SQL Server][SQL Server]
Parse error at line: 1, column: 15:
Incorrect syntax near ''source.stg_headcount__dbt_tmp''. (103010) (SQLExecDirectW)")
compiled SQL at target/run/ava_its_data/stage/stg_headcount.sql

stg_headcount.sql

SELECT
    MonthCode
    ,CountryNm
    ,Total_Employees
FROM "may_test_sqlpool"."source"."monthly_headcount"

stg_schema.yml

version: 2

models:
  - name: stg_headcount
    description: >
      demo table for debugging
    columns:
      - name: MonthCode
      - name: CountryNm
      - name: Total_Employees

stage/stg_headcount.sql

create view source.stg_headcount__dbt_tmp as
    SELECT
    MonthCode
    ,CountryNm
    ,Total_Employees
FROM "may_test_sqlpool"."source"."monthly_headcount"

Incompatible with dbt>0.15, should update setup.py

The referenced issue describes incorrect duplicate macros when using dbt>0.15 with dbt-sqlserver=0.15.

"One helpful change might be for the maintainer of dbt-sqlserver to change their requirements for the package from >=0.15.0 to something like >=0.15.0, <0.16."
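
In setup.py terms, that pin would look roughly like this (a sketch of standard setuptools syntax, not the package's actual file):

    install_requires=[
        'dbt-core>=0.15.0,<0.16.0',
    ]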

refactor seed jinja to avoid table value constructor

plugging away on the azure-synapse fork. So far it's been just a matter of replacing unsupported functions. Why they're now unsupported... who knows!?!

Anyway, turns out this classic SQL Server idiom is no longer supported.

INSERT INTO source.country_codes (country_code, country_name, region) values
('NG', 'Nigeria', 'GM'),('NP', 'Nepal', 'GM')

This SO post gives the workaround as:

INSERT INTO source.country_codes (country_code, country_name, region)
SELECT 'NG', 'Nigeria', 'GM' UNION ALL
SELECT 'NP', 'Nepal', 'GM'

However, I'm a jinja newb and I'm struggling to refactor macros/materializations/seed/seed.sql.
Any shoves in the right direction? I thought the loop below is what builds the rows, but instead it looks like that's just what would create the column list piece ((country_code, country_name, region)). (A rough sketch of one possible refactor follows the snippet.)

{% set sql %}
    insert into {{ this.render() }} ({{ cols_sql }}) values
    {% for row in chunk -%}
        ({%- for column in agate_table.column_names -%}
            ?
            {%- if not loop.last%},{%- endif %}
        {%- endfor -%})
        {%- if not loop.last%},{%- endif %}
    {%- endfor %}
{% endset %}
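
Here's the rough (untested) sketch, reusing the same chunk and agate_table variables as the original: keep the outer loop over rows, but have each iteration emit its own SELECT of ? placeholders, joined with UNION ALL instead of commas:

    {% set sql %}
        insert into {{ this.render() }} ({{ cols_sql }})
        {% for row in chunk -%}
            select
            {%- for column in agate_table.column_names %}
                ?{%- if not loop.last %},{% endif %}
            {%- endfor %}
            {%- if not loop.last %} union all{% endif %}
        {%- endfor %}
    {% endset %}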

Snapshot Second Run Problem

I am having a problem running a snapshot after the initial snapshot table is created (i.e., running dbt snapshot for the second time).

I am pretty new to dbt, so I could have some sort of configuration problem too, but I am able to materialize other objects.

I have a SQL Server Profiler trace running to capture the activity, and it seems that it creates a temp table, then drops the temp table, then tries to use the temp table to do the merge into the snapshot.

I am attaching the trace results so you can see exactly what is happening on my server.

trace.trc.zip

Conventions for Python code

Let's decide on which conventions to apply for Python code.

Issue originates from discussion about python import order

I'll put them back in the same order and open a separate PR for this. I do not care too much either way. The following resources are useful:

According to PEP8, Python imports should be on separate lines (per library) and the imports are grouped in three blocks:

  1. Standard library imports.
  2. Related third party imports.
  3. Local application/library specific imports.

Next to that, it is common to sort the imports within each of these three groups alphabetically, to improve searchability. Finally, some put import ... before from ... import ..., or vice versa, or mix them.

Originally posted by @JCZuurmond in #71 (comment)

dbt.log not showing the sql code being executed

I'm not sure if there is a way to configure dbt or dbt-sqlserver to show the sql code being executed in the log.

If not, it would be really nice to have a way to see it, for debugging and a better understanding of what's happening under the hood.

For instance, one of my incrementally materialized models has the unique_key='...' option set. Being able to see the exact delete and insert statements would help me understand what's being done.

Thank you!

Error when collation of server instance is different than database instance

Hi!

Ran into an issue running a dbt project on a SQL Server 2014 instance (I believe the same error would occur on later versions, but I have not tested that).

When the collation of a database is different from the collation set at the server level, dbt will fail with the error message:

Cannot resolve collation conflict between "SQL_Latin1_General_CP1_CI_AS" and "Finnish_Swedish_100_CI_AS" in UNION ALL operator occurring in SELECT statement column 2

The reason is the union made in the macro sqlserver__get_columns_in_relation, where a dataset containing column info from tempdb is unioned with a dataset containing column info for the specific database. The two datasets have different collations on the varchar fields and cannot be unioned without specifying a collation.

I managed a simple workaround where I collate the columns in the tempdb dataset to the DATABASE_DEFAULT collation (the target database's), and it seems to work, though there might be issues in cases I've not come across.

So, I added "collate database_default" on columns "column_name" and "data_type" in the tempdb dataset in the sqlserver__get_columns_in_relation macro, resulting in the code below:

{% macro sqlserver__get_columns_in_relation(relation) -%}
  {% call statement('get_columns_in_relation', fetch_result=True) %}
      SELECT
          column_name,
          data_type,
          character_maximum_length,
          numeric_precision,
          numeric_scale
      FROM
          (select
              ordinal_position,
              column_name,
              data_type,
              character_maximum_length,
              numeric_precision,
              numeric_scale
          from INFORMATION_SCHEMA.COLUMNS
          where table_name = '{{ relation.identifier }}'
            and table_schema = '{{ relation.schema }}'
          UNION ALL
          select
              ordinal_position,
              column_name collate database_default,
              data_type collate database_default,
              character_maximum_length,
              numeric_precision,
              numeric_scale
          from tempdb.INFORMATION_SCHEMA.COLUMNS
          where table_name like '{{ relation.identifier }}%') cols
      order by ordinal_position


  {% endcall %}
  {% set table = load_result('get_columns_in_relation').table %}
  {{ return(sql_convert_columns_in_relation(table)) }}
{% endmacro %}

Check if index exists before attempting to create it

Currently if you use create_nonclustered_index() on an incremental table, the refresh fails, because the index already exists.

I understand the current pattern is to use a pre-hook with drop_all_indexes_on_table(), but that means all indexes have to be rebuilt on every refresh, and the indexes can't be used for the refresh merge.

Is there anything that prevents us from checking if an index exists in create_nonclustered_index() and skipping the creation if it does?

I have implemented it like this and so far it seems to be working well...

{% macro ii_create_clustered_index(columns, unique=False) -%}
    {{ log("Creating clustered index...") }}
    {% set idx_name = this.table + '__clustered_index_on_' + columns|join('_') %}

    if not exists(select * from sys.indexes 
                  where 
                    name = '{{ idx_name }}' and 
                    object_id = OBJECT_ID('{{ this }}')
    )
    begin

        create
        {% if unique -%}
        unique
        {% endif %}
        clustered index
            {{ idx_name }}
            on {{ this }} ({{ '[' + columns|join("], [") + ']' }})
    end
{%- endmacro %}


{% macro ii_create_nonclustered_index(columns, includes=False) %}
    {{ log("Creating nonclustered index...") }}
    {% set idx_name = this.table + '__index_on_' + columns|join('_') %}

    if not exists(select * from sys.indexes 
                  where 
                    name = '{{ idx_name }}' and 
                    object_id = OBJECT_ID('{{ this }}')
    )
    begin
        create nonclustered index {{ idx_name }}
            on {{ this }} ({{ '[' + columns|join("], [") + ']' }})
            {% if includes -%}
                include ({{ '[' + includes|join("], [") + ']' }})
            {% endif %}
    end
{% endmacro %}

DB-Lib error message 20002

Hello and thank you for this contrib.

My dbt debug fails on the connection test:

 >Database Error
  (20002, b'DB-Lib error message 20002, severity 9:\nAdaptive Server connection failed (jaffle-shop-warehouse.database.windows.net:1433)\n')

dbt-sqlserver 0.1.4.1
dbt 0.14.0
pymssql 2.1.4 | 2.13
python 3.6.8 | 2.7.15

The database is an Azure SQL DB. I have the correct firewall settings all the way around and can connect to the db with the Azure query editor.

Add application name to ODBC string

Microsoft collects anonymous aggregate statistics about SQLServer + Synapse usage via the "Application Name" property of connections. After we spoke with a Synapse PM the other day, they're curious to know how many people are using dbt with SQLServer + Synapse.

If you're open to it, I believe we can add an APP parameter to the ODBC connection string. For reference:
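
As a sketch, where the adapter assembles its pyodbc connection string in connections.py, that could be a single extra element (the literal application name here is just an example):

    con_str.append("APP=dbt-sqlserver")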

V0.18.0 -- dbt_macro__sqlserver__get_catalog' takes not more than 1 argument(s)

Running dbt docs generate gives the following non-breaking error message. The generated catalog.json is still able to be served and looks okay.

Encountered an error while generating catalog: Compilation Error in macro sqlserver__get_catalog (macros/catalog.sql)
  macro 'dbt_macro__sqlserver__get_catalog' takes not more than 1 argument(s)
  
  > in macro get_catalog (macros/adapters/common.sql)
  > called by macro sqlserver__get_catalog (macros/catalog.sql)
dbt encountered 1 failure while writing the catalog

All the other core adapters' get_catalog() macros have the function signature below. Changing sqlserver's resolved the error message, but:

  1. sqlserver__get_catalog() doesn't use any arguments anyway (should it, @mikaelene?), and
  2. I'm leery of what other issues it might introduce...

{% macro <adapter>__get_catalog(information_schema, schemas) -%}

error installing with pip

I tried to run pip install dbt-sqlserver and it errored. Looks like you've vendored the pymssql lib in there and it has a dependency on Cython?

I'm trying this on a Mac

Collecting dbt-sqlserver
  Downloading https://files.pythonhosted.org/packages/92/6a/805de0abd4ed1c823760282072f78fb66095e659e45376cfd82dec4b72f0/dbt_sqlserver-0.1.4.1-py3-none-any.whl
Collecting dbt-core>=0.14.0 (from dbt-sqlserver)
  Downloading https://files.pythonhosted.org/packages/62/ee/63b4b36a486487bf4713415736092a63f3654d023540d8cf31745debe75b/dbt-core-0.14.0.tar.gz (534kB)
     |████████████████████████████████| 542kB 9.2MB/s
Collecting pymssql>=2.1.4 (from dbt-sqlserver)
  Downloading https://files.pythonhosted.org/packages/2e/81/99562b93d75f3fc5956fa65decfb35b38a4ee97cf93c1d0d3cb799fffb99/pymssql-2.1.4.tar.gz (691kB)
     |████████████████████████████████| 696kB 16.4MB/s
    ERROR: Command errored out with exit status 1:
     command: /Users/myself/Projects/py3test/.venv/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/tx/gnjzcly97p94whybspstf9b00000gn/T/pip-install-v5n8y1mc/pymssql/setup.py'"'"'; __file__='"'"'/private/var/folders/tx/gnjzcly97p94whybspstf9b00000gn/T/pip-install-v5n8y1mc/pymssql/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
         cwd: /private/var/folders/tx/gnjzcly97p94whybspstf9b00000gn/T/pip-install-v5n8y1mc/pymssql/
    Complete output (7 lines):
    /Users/myself/Projects/py3test/.venv/lib/python3.7/site-packages/setuptools/dist.py:45: DistDeprecationWarning: Do not call this function
      warnings.warn("Do not call this function", DistDeprecationWarning)
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/private/var/folders/tx/gnjzcly97p94whybspstf9b00000gn/T/pip-install-v5n8y1mc/pymssql/setup.py", line 88, in <module>
        from Cython.Distutils import build_ext as _build_ext
    ModuleNotFoundError: No module named 'Cython'
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

Add support for 'dbt docs generate' by implementing get_catalog

Currently dbt documentation and the built-in documentation site don't work with dbt-sqlserver:

$ dbt docs generate
Running with dbt=0.14.3
Found 8 models, 0 tests, 0 snapshots, 0 analyses, 238 macros, 0 operations, 0 seed files, 0 sources

16:59:32 | Concurrency: 1 threads (target='dev')
16:59:32 |
16:59:32 | Done.
16:59:32 | Building catalog
Encountered an error:
Compilation Error in macro get_catalog (macros\adapters\common.sql)
  get_catalog not implemented for mssql

  > in macro default__get_catalog (macros\adapters\common.sql)
  > called by macro adapter_macro (macros\adapters\common.sql)
  > called by macro get_catalog (macros\adapters\common.sql)
  > called by macro get_catalog (macros\adapters\common.sql)

Make dbt_utils and surrogate_key() work

After installing the adapter, most of the dbt_utils functions didn't work. I falsely assumed it was because this adapter didn't implement all the required functionality, but after reading this dbt issue I understood it's not related to this adapter at all.

While the dbt team works on a solution for this I wanted to document my workaround here.

Add to dbt_modules/dbt_utils/macros/custom/sqlserver_adapters.sql:

{% macro sqlserver__type_string() %}
    varchar(max)
{% endmacro %}

{% macro sqlserver__concat(fields) -%}
    concat({{ fields|join(', ') }}, '')
{%- endmacro %}

{% macro sqlserver__hash(field) -%}
    convert(varchar(32), hashbytes('md5', {{field}}), 2) 
{%- endmacro %}

Add dbt_modules/.gitignore:

*
!*/
!dbt_utils/macros/custom/**
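
With those three shims in place, a dbt_utils.surrogate_key() call compiles to a concat + hashbytes expression that SQL Server accepts. For example (model and column names hypothetical; the exact macro signature depends on your dbt_utils version):

    select
        {{ dbt_utils.surrogate_key(['order_id', 'customer_id']) }} as order_key
    from {{ ref('orders') }}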

Add support for sql server named instance

If the SQL Server host name contains a backslash (pointing at a named instance), the connection to the server fails.

This is because if a named instance is specified, the port number must be omitted.

This can be fixed by replacing line 90 in connections.py with this code:

if "\\" in credentials.host:
                # if there is a backslash \ in the host name the host is a sql-server named instance
                # in this case then port number has to be omitted
                con_str.append(f"SERVER={credentials.host}")
            else:
                con_str.append(f"SERVER={credentials.host},{credentials.port}")

Maybe there is a more elegant way to do it, but the above works for me :)

Creating Indexed Views

In SQL Server we can have indexed views to add speed without needing to recreate data.

However, if we try to add these with post-hooks we'll get this error:

Cannot create index on view 'YourModel' because the view is not schema bound. (1939) (SQLExecDirectW)")

To solve this we need to add WITH SCHEMABINDING to the view creation statement, or alter the view afterwards. The simplest way I can think of to do this is to add a line like this to the config:

"with": "SCHEMABINDING"

which would then also support some other WITH options.

e.g.
https://stackoverflow.com/questions/8506487/cannot-create-index-on-view-view-table-name-because-the-view-is-not-schema-bou

Happy to implement if @mikaelene can give me a pointer on where to start?
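
For illustration, the T-SQL we'd want the materialization to emit looks like this (object and column names hypothetical; a schema-bound view requires two-part names and no SELECT *):

    CREATE VIEW dbo.YourModel WITH SCHEMABINDING AS
    SELECT id, amount
    FROM dbo.YourSourceTable;
    GO
    CREATE UNIQUE CLUSTERED INDEX ix_yourmodel ON dbo.YourModel (id);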

Cannot Authenticate with CLI

I am trying to use the SQL Server adapter with the CLI authentication.
My config is the following :

dbt version: 0.18.1
dbt-mssql version: 0.18.1
python version: 3.7.4
os info: Windows-10-10.0.19041-SP0

Here is my profiles.yml

my_profile:
  target: dev
  outputs:
    dev:
      type: sqlserver
      driver: 'ODBC Driver 17 for SQL Server'
      server: ip_of_my_server
      port: port_of_my_server
      database: db_of_my_server_that_is_not_master
      schema: my_schema
      authentication: CLI

I run az login successfully:

You have logged in. Now let us find all the subscriptions to which you have access...

But when I run dbt debug I encounter the following error:

Connection:
  database: my_db
  schema: my_schema
  port: 1433
  UID: None
  client_id: None
  authentication: CLI
  encrypt: False
  trust_cert: False
  Connection test: ERROR

dbt was unable to connect to the specified database.
The database returned the following error:

  >Database Error
  ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user '<token-identified principal>'. (18456) (SQLDriverConnect); [28000] [Microsoft][ODBC Driver 17 for SQL 
Server][SQL Server]Login failed for user '<token-identified principal>'. (18456)")

If I use:

authentication: ActiveDirectoryInteractive
user: [email protected]

It works, but I have to authenticate on every command executed, which is not efficient.
Do you have any idea what I could be doing wrong? Or how we can fix this?

hooks with multiple results stop execution after first result

Hi, I noticed that if a hook contains multiple commands that return results, dbt stops execution once it gets the first result. It also sometimes happens when SQL returns some intermediate messages (like a rowcount).

To reproduce it I created a useless script just to show that execution stops right after EXEC sp_who2 and the following steps are not executed. It stops a few milliseconds after the first EXEC, which is why I added the WAITFOR DELAY.

{{
  config(   
    pre_hook = " 
       EXEC('DROP VIEW IF EXISTS test_bug_view')    
        
       EXEC sp_who2 -- this returns a result and execution of the hook stops

       WAITFOR DELAY '00:00:04';
       EXEC('CREATE VIEW test_bug_view AS SELECT 1 as col')
       "
  )
}}
SELECT * FROM test_bug_view --this fails, because it wasn't created in the pre-hook

As a workaround, I now make sure that there are no intermediate results, and it works; for example, by using NO_OUTPUT with EXEC.

I'd be grateful if somebody can fix it :)
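
For the rowcount-message case specifically, a standard way to suppress those intermediate messages is to start the hook's SQL with:

    SET NOCOUNT ON;  -- suppresses "n rows affected" messages for the batch

That won't help with hooks that genuinely return result sets (like EXEC sp_who2); those still need to be silenced or moved out of the hook.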

Azure connection error on macOS using FreeTDS (port issue)

I was having connection problems to Azure.

Using the ODBC driver I was getting this error:

Connection:
  database: sm_analysis
  schema: dbt_kim
  UID: hostmaster
  windows_login: False
libc++abi.dylib: terminating with uncaught exception of type std::runtime_error: collate_byname<char>::collate_byname failed to construct for C/en_US.UTF-8/C/C/C/C
[1]    5062 abort      dbt debug
/Users/Kim/.pyenv/versions/3.6.5/lib/python3.6/multiprocessing/semaphore_tracker.py:143: UserWarning: semaphore_tracker: There appear to be 1 leaked semaphores to clean up at shutdown   
  len(cache))

Then I tried using FreeTDS, but was getting this error:

dbt was unable to connect to the specified database.
The database returned the following error:

  >Database Error
  ('08S01', '[08S01] [FreeTDS][SQL Server]Unable to connect: Adaptive Server is unavailable or does not exist (20009) (SQLDriverConnect)')

Check your database credentials and try again. For more information, visit:
https://docs.getdbt.com/docs/configure-your-profile

I finally narrowed down the problem to the fact that the port is not specified in the connection string.

After making these changes I was able to connect:

diff --git a/dbt/adapters/sqlserver/connections.py b/dbt/adapters/sqlserver/connections.py
index e336dcc..5a9568c 100644
--- a/dbt/adapters/sqlserver/connections.py
+++ b/dbt/adapters/sqlserver/connections.py
@@ -19,6 +19,7 @@ class SQLServerCredentials(Credentials):
     host: str
     database: str
     schema: str
+    port: Optional[int] = 1433
     UID: Optional[str] = None
     PWD: Optional[str] = None
     windows_login: Optional[bool] = False
@@ -87,6 +88,7 @@ class SQLServerConnectionManager(SQLConnectionManager):
             con_str = []
             con_str.append(f"DRIVER={{{credentials.driver}}}")
             con_str.append(f"SERVER={credentials.host}")
+            con_str.append(f"PORT={credentials.port}")
             con_str.append(f"Database={credentials.database}")
 
             if not getattr(credentials, 'windows_login', False):

Schema tests raise SQL error when columns contain space

Hi,

Schema tests on columns that contain spaces currently raise a SQL error. This is due to the macro generating the following SQL code:

select count(*) as validation_errors
from "DB"."schema"."table"
where My Column Name is null

I tried to change the column name in schema.yml to put it in the format "'My Column Name'", but while the schema test now works, the documentation shows 2 different entries for my column, one with My Column Name and one with 'My Column Name' (including the single quotes).

I also tried to replace My Column Name with [My Column Name], but I get the same result: the test works, but I am getting a duplicate column in the documentation.

I think that wrapping the column name in single quotes in the macros defined in schema_tests.sql would prevent this problem. That is, replacing {% set column_name = kwargs.get('column_name', kwargs.get('arg')) %} with {% set column_name = "'" ~ kwargs.get('column_name', kwargs.get('arg')) ~ "'" %}.

Does it sound ok? If so, I am happy to create a Pull Request with those changes.

Recreating clustered columnstore index fails on sql server 2014

Hi All!

I've set up a project using a SQL Server 2014 DB and ran into an error when running models: "Incorrect syntax near the keyword 'IF'..."

This seems to occur in the macro sqlserver__create_clustered_columnstore_index because of the syntax DROP INDEX IF EXISTS {{relation_name}}.{{cci_name}}, which will not work in SQL Server 2014, where the sys tables have to be queried instead.

I tried changing the code in my local fork to:

if exists (
        select * from
        sys.indexes where name = '{{cci_name}}'
        and object_id=object_id('{{relation_name}}')
    )
  drop index {{full_relation}}.{{cci_name}}

And it seems to work fine in my local dev environment

Br
Anders

CLI Authentication fails

I am logged in through the CLI, but I am receiving the error below in dbt debug

profiles.yml:

default:
  target: dev
  outputs:
    dev:
      type: sqlserver
      driver: 'ODBC Driver 17 for SQL Server'
      encrypt: true
      trust_cert: false
      authentication: cli
      server: "{{ env_var('HOSTNAME') }}"
      port: 1433
      database: "{{ env_var('DATABASE') }}"
      schema: "{{ env_var('SCHEMA') }}"

dbt debug:

Running with dbt=0.19.0
dbt version: 0.19.0
python version: 3.8.7
python path: /usr/local/bin/python
os info: Linux-4.19.121-linuxkit-x86_64-with-glibc2.2.5
Using profiles.yml file at /dbt/config/profiles.yml
Using dbt_project.yml file at /dbt/projects/default/dbt_project.yml

Configuration:
  profiles.yml file [OK found and valid]
  dbt_project.yml file [OK found and valid]

Required dependencies:
 - git [OK found]

Connection:
  database: <private>
  schema: <private>
  port: 1433
  UID: None
  client_id: None
  authentication: cli
  encrypt: True
  trust_cert: False
  Connection test: ERROR

dbt was unable to connect to the specified database.
The database returned the following error:

  >Database Error
  ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user ''. (18456) (SQLDriverConnect)")

dbt source snapshot-freshness UTC ?

Not entirely sure if this is a dbt issue or dbt-sqlserver.

When using the 'freshness' support in dbt and calling dbt source snapshot-freshness, it seems to be comparing against local time rather than UTC.

Any ideas? Or is this performed internally in dbt?
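
If the adapter's current-timestamp macro maps to GETDATE() (local server time), that would explain it, since dbt's freshness math expects UTC. A sketch of a fix under that assumption (the macro name follows the adapter dispatch convention and is not verified against the adapter's source):

    {% macro sqlserver__current_timestamp() -%}
        getutcdate()
    {%- endmacro %}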

dbt-adapter-test test_dbt_snapshot_strategy_check_cols failing

This is the only test (besides data_test_ephemeral_models) that isn't passing CI. The relevant Circle CI run is visible to those who belong to the dbt-MSFT org. Here's the full gist from CircleCI that includes all the SQL executed when only this test is run and no others. Below is the pytest result summary.

Perhaps the merge functionality isn't working properly and it's only inserting or only updating?

============================= test session starts ==============================
platform linux -- Python 3.7.9, pytest-6.1.2, py-1.9.0, pluggy-0.13.1 -- /root/project/.tox/integration-synapse/bin/python
cachedir: .pytest_cache
rootdir: /root/project
plugins: dbt-adapter-0.3.0
collecting ... collected 1 item                                                               

test/integration/sqlserver.dbtspec::test_dbt_snapshot_strategy_check_cols FAILED [100%]

=================================== FAILURES ===================================
________________ usecase: test_dbt_snapshot_strategy_check_cols ________________
Unknown error: expected cc_all_snapshot to have 30 rows, but it has 20
assert Decimal('20') == 30
  +Decimal('20')
  -30 in test index 12 (item_type=relation_rows)

Incorrect syntax near the keyword 'if'

installed version: 0.15.0

I am having a bit of trouble running the test SQL script created by dbt init, which is select 1 as id.

profiles.yml

dbt-sqlserver:
  outputs:
    dev:
      type: sqlserver
      driver: 'ODBC Driver 17 for SQL Server'
      threads: 1
      server: excitingserver.com
      port: 1433
      user: excitinguser
      password: excitingpassword
      database: regulardatabase
      schema: dbo
      windows_login: false
  target: dev

error

Running with dbt=0.15.0
Found 1 model, 0 tests, 0 snapshots, 0 analyses, 136 macros, 0 operations, 0 seed files, 0 sources

15:43:46 | Concurrency: 1 threads (target='dev')
15:43:46 |
15:43:46 | 1 of 1 START view model dbo.my_first_dbt_model....................... [RUN]
15:43:47 | 1 of 1 ERROR creating view model dbo.my_first_dbt_model.............. [ERROR in 0.07s]
15:43:47 |
15:43:47 | Finished running 1 view model in 0.49s.

Completed with 1 error and 0 warnings:

Database Error in model my_first_dbt_model (models/example/my_first_dbt_model.sql)
  ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near the keyword 'if'. (156) (SQLExecDirectW)")
  compiled SQL at target/compiled/kreplach/example/my_first_dbt_model.sql

Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1

I'm assuming the answer is to be found in one of the macros, but I'm not smart enough to trace it back.

DROP TABLE IF EXISTS syntax only works on SQL Server 2016 and later

The syntax DROP TABLE IF EXISTS ... works only on SQL Server 2016 and later.

https://github.com/mikaelene/dbt-sqlserver/blob/fb228938805289b1ba84bf21534730d3e8d5a63c/dbt/include/sqlserver/macros/adapters.sql#L44

Thus dbt-sqlserver will currently fail on previous versions (mine is 2012) with something like:

  ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near the keyword 'if'. (156) (SQLExecDirectW)")

I'm happy to submit a PR, but is there an overall test suite that I can run somehow to verify other macros aren't similarly incompatible? I am not familiar enough with dbt at the moment to know how one tests an adapter.
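
For reference, the version-agnostic equivalent is the classic OBJECT_ID guard (object name hypothetical):

    IF OBJECT_ID('dbo.my_table', 'U') IS NOT NULL
        DROP TABLE dbo.my_table;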

Connection Issue

I received the following error when attempting to connect to my database (localhost):

Database Error ('08001', '[08001] [Microsoft][ODBC Driver 17 for SQL Server]TCP Provider: No connection could be made because the target machine actively refused it.\r\n (10061) (SQLDriverConnect); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. (10061)')

I'm using dbt and dbt-sqlserver v0.18.1

dbt-test-integration fails for `dbt seed` feature

FWIW, I saw that the dbt-integration-test is passing on 2 of 3 features. See output below. Here are the highlights:

  • 2 features passed, 1 failed, 0 skipped -> Run dbt seed successfully
  • 12 scenarios passed, 1 failed, 0 skipped -> Make sure rows are added to on_run_hook table
  • 145 steps passed, 1 failed, 2 skipped, 0 undefined

dbt-integration-test output

$ ./bin/run-with-profile jaffle_shop
Test direct copying of source tables    # features/001_basic_materializations.feature
  Test materialized='view' -- @1.1   .........
  Test materialized='table' -- @1.2   .........
  Test materialized='incremental' -- @1.3   .........

Test re-materializing models as different types    # features/002_rematerialize_models.feature
  Materialize as view first, then view -- @1.1   ............
  Materialize as view first, then table -- @1.2   ............
  Materialize as view first, then incremental -- @1.3   ............
  Materialize as table first, then view -- @1.4   ............
  Materialize as table first, then table -- @1.5   ............
  Materialize as table first, then incremental -- @1.6   ............
  Materialize as incremental first, then view -- @1.7   ............
  Materialize as incremental first, then table -- @1.8   ............
  Materialize as incremental first, then incremental -- @1.9   ............

Test pre- and post-run hooks    # features/003_hooks.feature
  Make sure rows are added to on_run_hook table  ..........F
--------------------------------------------------------------------------------
FAILURE in step 'I successfully run "dbt seed"' (features/003_hooks.feature:109):
Assertion Failed: Running with dbt=0.14.0
Found 6 models, 2 tests, 0 snapshots, 0 analyses, 246 macros, 2 operations, 4 seed files, 0 sources

23:45:42 |
23:45:42 | Running 1 on-run-start hook
23:45:42 | 1 of 1 START hook: test.on-run-start.0............................... [RUN]
Database error while running on-run-start
Encountered an error:
Database Error
  (208, b"Invalid object name 'dbt_dev.on_run_hook'.DB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\n")

Expected: <0>
     but: was <2>

--------------------------------------------------------------------------------


Failing scenarios:
  features/003_hooks.feature:77  Make sure rows are added to on_run_hook table

2 features passed, 1 failed, 0 skipped
12 scenarios passed, 1 failed, 0 skipped
145 steps passed, 1 failed, 2 skipped, 0 undefined
Took 12m5.674s

Environment Details

  • remote MSSQL: MS SQL Server 2017
  • dbt version: 0.14.0
  • dbt-sqlserver: 0.1.3
  • pymssql = 2.1.4
