
loopback-connector-mssql's People

Contributors

0candy, amir-61, ataft, bajtos, bitmage, candytangnb, christiaanwesterbeek, code-vicar, crandmck, dhmlau, emrahcetiner, eugene-frb, foysalosmany, frbuceta, gunjpan, idoshamun, jannyhou, joostdebruijn, landgenoot, loay, nabdelgadir, raymondfeng, rmg, sam-github, sashasochka, siddhipai, ssh24, superkhau, trimeego, virkt25


loopback-connector-mssql's Issues

Models generated use actual Column name, not valid camel cased Class properties

Before I assume this is an issue: is LoopBack intending that models/objects in JavaScript have camel-cased properties, and that these properties are translated to their corresponding column names via the model's definition schema? If so, I don't believe this works properly for the mssql connector.

Most if not all of the mssql.js code takes the actual column names from the model's property keys. See the mssql buildInsert() method: it uses the model's JS property names, which happen to be the same as what slc loopback:model generates.

If I generate a model using the slc command line (slc loopback:model), I can make a model with SQL-valid column names like the ones below, but they are not valid JS object property names, mainly because of the dashes.

{
  "name": "testColumnNames",
  "base": "Model",
  "properties": {
    "My_First_Col": {
      "type": "string"
    },
    "the-Second_cOlumn": {
      "type": "string"
    },
    "theThirdColumn": {
      "type": "string"
    },
    "Only-Dash": {
      "type": "string"
    },
    "Only_Underscore": {
      "type": "string"
    },
    "ALLUPPER": {
      "type": "string"
    }
  },
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": []
}

I was about to paste in the schema that Discovery generates for this table, but discoverModelProperties() on it returns an empty array...

Anyway, I also noticed that in the mssql tests the column names used for CRUD are all camel-case "acceptable" names. I suggest running the same tests with valid SQL column names that come out differently when translated into camel-cased or class-ified property names.
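For reference, this is the shape of mapping I would expect to be able to use: JS-friendly property names that map to the real column names through the connector-specific columnName setting, the way the discovered models posted elsewhere in this tracker do. The property names below are only illustrative:

"myFirstCol": {
  "type": "string",
  "mssql": {
    "columnName": "My_First_Col"
  }
},
"theSecondColumn": {
  "type": "string",
  "mssql": {
    "columnName": "the-Second_cOlumn"
  }
}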

Query for getting primary keys inadvertently returns foreign keys too

The query executed in lib/discovery.js, queryForPrimaryKeys, returns not only primary keys but also foreign keys. This in turn causes schemas to be generated with foreign-key properties whose id property is > 0, which in turn produces an error when updating model-instance foreign-key properties through updateAttributes: "id property (some foreign key column) cannot be updated from A to B".

Root cause is the query defined in the queryForPrimaryKeys function on the test table inventory.

SELECT
  kc.table_schema     AS "owner", 
  kc.table_name       AS "tableName",
  kc.column_name      AS "columnName",
  kc.ordinal_position AS "keySeq",
  kc.constraint_name  AS "pkName"
FROM information_schema.table_constraints tc
JOIN information_schema.key_column_usage kc
  ON kc.table_name      = tc.table_name
 AND kc.table_schema    = tc.table_schema
WHERE tc.constraint_type  ='PRIMARY KEY'
  AND kc.ordinal_position IS NOT NULL
  AND kc.table_schema     = 'dbo'
  AND kc.table_name       = 'inventory'
ORDER BY kc.table_schema, kc.table_name, kc.ordinal_position;

which returns:

owner tableName columnName  keySeq pkName                        
dbo   inventory location_id 1      location_fk                   
dbo   inventory id          1      PK__inventor__3213E83F07F6335A
dbo   inventory product_id  1      product_fk 

But it should return:

owner tableName columnName  keySeq pkName                        
dbo   inventory id          1      PK__inventor__3213E83F07F6335A

There's a join criterion missing. This should be the join:

JOIN information_schema.key_column_usage kc
  ON kc.table_name      = tc.table_name
 AND kc.constraint_name = tc.constraint_name -- < the missing criterium
 AND kc.table_schema    = tc.table_schema

Without that criterion you're getting every key column of the table, which includes all the foreign key columns too.

Do you agree?

migration does not support BLOB column types

In our project we have a model that stores files as attachments in a MySQL MEDIUMBLOB. I have this in my model file, which was generated by the discoverSchema function:

"data" : { 
        "type": "Binary",
        "required": false,
        "length": 16777215,
        "precision": null,
        "scale": null,
        "mysql":
         { "columnName": "data",
           "dataType": "mediumblob",
           "dataLength": 16777215,
           "dataPrecision": null,
           "dataScale": null,
           "nullable": "Y"
         }
      }

When we run autoupdate, it tries to change the column type for data in the database and in doing so generates an invalid ALTER statement:

loopback:connector:mysql SQL: ALTER TABLE `app_attachment` CHANGE COLUMN `data` `data` MEDIUMBLOB(16777215) NULL, params: [] +1ms
loopback:connector:mysql Error: {"code":"ER_PARSE_ERROR","errno":1064,"sqlState":"42000","index":0} +2ms

you can see the SQL:

ALTER TABLE `app_attachment` CHANGE COLUMN `data` `data` MEDIUMBLOB(16777215) NULL

to create a MEDIUMBLOB column in mysql, you do not specify the length as above, a valid version of this statement would be:

ALTER TABLE `app_attachment` CHANGE COLUMN `data` `data` MEDIUMBLOB NULL

And if you remove the length and dataLength properties from the model file, it falls back to TEXT:

ALTER TABLE `app_attachment` CHANGE COLUMN `data` `data` TEXT NULL, params: [] +1ms

I've traced the issue down to the function buildColumnType in migration.js:

MySQL.prototype.buildColumnType = function buildColumnType(propertyDefinition) {
    var dt = '';
    var p = propertyDefinition;
    switch (p.type.name) {
      default:
      case 'JSON':
      case 'Object':
      case 'Any':
      case 'Text':
        dt = columnType(p, 'TEXT');
        dt = stringOptionsByType(p, dt);
        break;
      case 'String':
        dt = columnType(p, 'VARCHAR');
        dt = stringOptionsByType(p, dt);
        break;
      case 'Number':
        dt = columnType(p, 'INT');
        dt = numericOptionsByType(p, dt);
        break;
      case 'Date':
        dt = columnType(p, 'DATETIME'); // Currently doesn't need options.
        break;
      case 'Boolean':
        dt = 'TINYINT(1)';
        break;
      case 'Point':
      case 'GeoPoint':
        dt = 'POINT';
        break;
      case 'Enum':
        dt = 'ENUM(' + p.type._string + ')';
        dt = stringOptions(p, dt); // Enum columns can have charset/collation.
        break;
    }
    return dt;
  };

In addition to this function not handling BLOB types (BLOB, MEDIUMBLOB, and LONGBLOB), the discoverSchema function probably shouldn't provide length/dataLength when it finds these column types.
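A minimal sketch of the kind of handling I have in mind for buildColumnType; the helper and the 'Binary' case below are assumptions on my part, not the connector's current code:

function blobColumnType(p) {
  // MEDIUMBLOB(16777215) is invalid MySQL syntax, so emit the BLOB variant
  // without a length and ignore p.length / p.mysql.dataLength here.
  var declared = ((p.mysql && p.mysql.dataType) || 'blob').toUpperCase();
  var blobTypes = ['TINYBLOB', 'BLOB', 'MEDIUMBLOB', 'LONGBLOB'];
  return blobTypes.indexOf(declared) !== -1 ? declared : 'BLOB';
}

// and inside the switch in buildColumnType, assuming the juggler resolves the
// model's "Binary" type to a constructor named 'Binary':
//   case 'Binary':
//     dt = blobColumnType(p);
//     break;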

Thanks.

Identity columns aren't correctly identified

https://github.com/strongloop/loopback-connector-mssql/blob/master/lib/mssql.js#L193

var isIdentity = (prop && prop.type === Number && prop.generated !== false);

This statement should determine whether a specific column has the identity attribute. But when the generated attribute is not defined in the JSON model, it comes through as undefined, and undefined !== false.

The assumption here is that prop.generated defaults to false. My guess is that changing that line to the line below could resolve this problem.

var isIdentity = (prop && prop.type === Number && !!prop.generated);

Support read-only Views as models

Does LB plan to support read-only views in MsSQL where there might not be a primary key?

Integrating existing 3rd-party systems into the SL micro-services architecture is very valuable. Currently, I assume we could add (hack in) wrapper views with a fake primary key, doing something like select someKey, * from view, in order to comply.

@raymondfeng What are your thoughts?

Fix CI failure on master branch

The master branch fails with this error:

1) MS SQL server connector should auto migrate/update tables:
   Error: timeout of 5000ms exceeded. Ensure the done() callback is being called in this test.

URL Parameter not working properly

I'm currently working with "loopback-connector-mssql": "^2.3.1", trying to connect to a database in Azure. If I use the mssql module directly I'm able to connect and perform a basic query, but when I try using it through LoopBack I get a timeout error.

My datasource looks like

module.exports = {
    db: {
        connector: 'loopback-connector-mssql',
        url: "mssql://user:[email protected]:1433/itx-pcr-dev?schema=dbo",
        debug: true,
        encrypt: true
    }
};

And in the terminal I get the following error

Web server listening at: http://0.0.0.0:3000
Browse your REST API at http://0.0.0.0:3000/explorer
## connectors:mssql is licensed from 2015-11-06T17:35:44.146Z to 2015-12-06T17:35:44.146Z.
  loopback:connector:mssql Connection error:  +527ms { [ConnectionError: Failed to connect to undefined:1433 - connect ECONNREFUSED 127.0.0.1:1433]
  name: 'ConnectionError',
  message: 'Failed to connect to undefined:1433 - connect ECONNREFUSED 127.0.0.1:1433',
  code: 'ESOCKET' }
Connection fails:  { [ConnectionError: Failed to connect to undefined:1433 - connect ECONNREFUSED 127.0.0.1:1433]
  name: 'ConnectionError',
  message: 'Failed to connect to undefined:1433 - connect ECONNREFUSED 127.0.0.1:1433',
  code: 'ESOCKET' } 
It will be retried for the next request.

events.js:141
      throw er; // Unhandled 'error' event
      ^
ConnectionError: Failed to connect to undefined:1433 - connect ECONNREFUSED 127.0.0.1:1433

Somehow it ends up pointing at 127.0.0.1 instead of the real host. Any thoughts?
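For what it's worth, a datasource definition with discrete settings instead of the url string avoids whatever is going wrong in the URL parsing. The host, user, and password values below are placeholders, and the options.encrypt part follows the Azure settings shown in another issue in this tracker:

module.exports = {
    db: {
        connector: 'mssql',
        host: 'xxxxx.database.windows.net', // placeholder Azure host
        port: 1433,
        database: 'itx-pcr-dev',
        schema: 'dbo',
        user: 'user',
        password: 'password',
        debug: true,
        options: {
            encrypt: true // required for Azure SQL
        }
    }
};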

The target table 'dbo.Test' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause

I am using the StrongLoop framework to develop a REST API with MSSQL as the server. I have a table Test in my DB which has a trigger. Whenever I try to insert or update a record, it gives me the error "The target table 'dbo.Test' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause".

But my mssql.js file is the same as in the original loopback-connector-mssql repository. How do I make it work with triggers?

my app version:
D:\App>slc -v
strongloop v5.0.0 (node v0.12.4)
├── [email protected] (15814c0)
├── [email protected] (70e7f5b)
├── [email protected] (72dbb7c)
├── [email protected] (1b220f7)
├── [email protected] (dbaa1a5)
├── [email protected] (f46e58f)
├── [email protected] (b17cbc0)
├─┬ [email protected] (50fcfa5)
│ └── [email protected] (8554cb3)
├── [email protected] (01a38d5)
├── [email protected]
└── [email protected]

Please help me figure out how to get this to work with a trigger in place.

Security hole - auth token is simple integer

The AccessToken table is created with an auto-increment integer id, so session authorization tokens are 1, 2, 3, etc.

Anyone can use /explorer or curl to try tokens from 1 upwards until a valid token is found.

Default newid() for uniqueidentifier is not according to the docs

Hi,

DEFAULT newid() here: https://github.com/strongloop/loopback-connector-mssql/blob/master/lib/migration.js#L336-L342

Now I'm trying to generate a GUID on the server side using "defaultFn": "guid", for example:

    "id": {
      "type": "String",
      "id": true,
      "generated": false,
      "defaultFn": "guid",
      "mssql": {
        "dataType": "UNIQUEIDENTIFIER",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }

And it is not successful, because you are generating a new ID on the database side, even with "generated": false set.

Thanks.

Cannot create SQL Server table with a character key

I cannot create a SQL Server table with a character key using automigrate. It works fine with a MySQL datasource. LoopBack 2.0.2, loopback-connector-mssql 1.1.0

 "idInjection": false,
  "properties": {
    "code": {
      "type": "String",
      "id": true,
      "length": 20,
      "nullable": false,
      "generated": false,
      "mysql": {
        "dataType": "varchar",
        "dataLength": 20
      },
      "mssql": {
        "dataType": "varchar",
        "dataLength": 20
      }
    },

I have tried several variations on the above without success.

The automigrate code in mssql.js calls the propertiesSQL function. The column identified as the key is always generated as an int identity column.

MsSQL.prototype.propertiesSQL = function (model) {
  // debugger;
  var self = this;
  var objModel = this._models[model];
  var modelPKID = this.idName(model);

  var sql = ["[" + modelPKID + "] [int] IDENTITY(1,1) NOT NULL"];
  Object.keys(objModel.properties).forEach(function (prop) {
    console.log(prop)
    if (prop === modelPKID) {
      return;
    }
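For reference, a sketch of what I would expect propertiesSQL to do instead: honor a non-generated primary key's declared mssql type rather than hard-coding the int identity column. This is only an illustration of the idea, not a tested patch:

MsSQL.prototype.propertiesSQL = function (model) {
  var self = this;
  var objModel = this._models[model];
  var modelPKID = this.idName(model);
  var pkProp = objModel.properties[modelPKID];

  var pkSQL;
  if (pkProp && pkProp.generated === false && pkProp.mssql && pkProp.mssql.dataType) {
    // e.g. "[code] [varchar](20) NOT NULL" for the character key above
    var dt = '[' + pkProp.mssql.dataType + ']';
    if (pkProp.mssql.dataLength) {
      dt += '(' + pkProp.mssql.dataLength + ')';
    }
    pkSQL = '[' + modelPKID + '] ' + dt + ' NOT NULL';
  } else {
    // current behavior: always an auto-increment integer key
    pkSQL = '[' + modelPKID + '] [int] IDENTITY(1,1) NOT NULL';
  }

  var sql = [pkSQL];
  Object.keys(objModel.properties).forEach(function (prop) {
    if (prop === modelPKID) {
      return;
    }
    // ... unchanged column handling for the remaining properties ...
  });
};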

Problem with relations and subqueries

(Pardon the "bad" title, I couldn't think of anything better)

I have a User model object, defined like this:

{
  "name": "CmsUsers",
  "options": {
    "idInjection": false,
    "mssql": {
      "schema": "dbo",
      "table": "Users"
    }
  },
  "properties": {
    "userid": {
      "type": "String",
      "required": true,
      "length": null,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "UserId",
        "dataType": "uniqueidentifier",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "parentuserid": {
      "type": "String",
      "required": false,
      "length": null,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "ParentUserId",
        "dataType": "uniqueidentifier",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "YES"
      }
    },
    "firstname": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "FirstName",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "lastname": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "LastName",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "username": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "Username",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "password": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Password",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "company": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Company",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "address1": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Address1",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "address2": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Address2",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "postalcode": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "PostalCode",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "city": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "City",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "country": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Country",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "email": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "EMail",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "phone": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Phone",
        "dataType": "varchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "mobile": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Mobile",
        "dataType": "varchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "userlanguageid": {
      "type": "String",
      "required": true,
      "length": null,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "UserLanguageId",
        "dataType": "uniqueidentifier",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "cmschannelid": {
      "type": "String",
      "required": false,
      "length": null,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "CMSChannelId",
        "dataType": "uniqueidentifier",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "YES"
      }
    },
    "lastlogin": {
      "type": "Date",
      "required": true,
      "length": null,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "LastLogin",
        "dataType": "datetime",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "usertype": {
      "type": "Number",
      "required": true,
      "length": null,
      "precision": 10,
      "scale": 0,
      "mssql": {
        "columnName": "UserType",
        "dataType": "int",
        "dataLength": null,
        "dataPrecision": 10,
        "dataScale": 0,
        "nullable": "NO"
      }
    },
    "userstatusflags": {
      "type": "Number",
      "required": true,
      "length": null,
      "precision": 10,
      "scale": 0,
      "mssql": {
        "columnName": "UserStatusFlags",
        "dataType": "int",
        "dataLength": null,
        "dataPrecision": 10,
        "dataScale": 0,
        "nullable": "NO"
      }
    },
    "timestamp": {
      "type": "String",
      "required": true,
      "length": null,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "Timestamp",
        "dataType": "timestamp",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    }
  },
  "relations": {
    "roles": {
      "type": "hasMany",
      "model": "CmsRoles"
    }
  }
}

And a Role model object defined like this:

{
  "name": "CmsRoles",
  "options": {
    "idInjection": false,
    "mssql": {
      "schema": "dbo",
      "table": "Roles"
    },
    "relations": {
      "users": {
        "model": "CmsUsers",
        "type": "belongsTo",
        "foreignKey": "userid"
      }
    }
  },
  "properties": {
    "roleid": {
      "type": "String",
      "required": true,
      "length": null,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "RoleId",
        "dataType": "uniqueidentifier",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "owneruserid": {
      "type": "String",
      "required": true,
      "length": null,
      "precision": null,
      "scale": null,
      "id": 1,
      "mssql": {
        "columnName": "OwnerUserId",
        "dataType": "uniqueidentifier",
        "dataLength": null,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "rolename": {
      "type": "String",
      "required": true,
      "length": 50,
      "precision": null,
      "scale": null,
      "mssql": {
        "columnName": "RoleName",
        "dataType": "nvarchar",
        "dataLength": 50,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    }
  }
}

Using the Explorer, I try to access CmsUsers/{id}/roles, and, when running with debug enabled, the connector outputs the following:

  loopback:connector:mssql SQL: SELECT * FROM (SELECT [UserId],[ParentUserId],[FirstName],[LastName],[Username],[Password],[Company],[Address1],[Address2],[PostalCode],[City],[Country],[EMail],[Phone],[Mobile],[UserLanguageId],[CMSChannelId],[LastLogin],[UserType],[UserStatusFlags],[Timestamp],ROW_NUMBER() OVER ( ORDER BY [UserId],[ParentUserId],[Username],[UserLanguageId],[CMSChannelId]) AS RowNum FROM [dbo].[Users] WHERE [UserId]=@param1) AS S
 WHERE S.RowNum > 0 AND S.RowNum <= 1
 Parameters: ["12345678-1234-1234-1234-123456789AB"] +3s
  loopback:connector:mssql Result: {"name":"RequestError","message":"Conversion failed when converting from a character string to uniqueidentifier.","code":"EREQUEST","number":8169,"lineNumber":1,"state":2,"class":16,"serverName":"CMS-DB","procName":"","precedingErrors":[]} undefined +15ms

However, if I take that very same SQL query and run it manually in SQL Server Management Studio, with a line like this on top: DECLARE @param1 AS uniqueidentifier = '12345678-1234-1234-1234-123456789AB', then the query succeeds. Obviously the connector assumes the wrong type for the @param1 parameter, despite UserId being marked as "dataType": "uniqueidentifier" in the model file.

Have I done something wrong, or missed something crucial, or is this an issue with the connector?

Connection is busy

This is a "heads up" to help others who see this error rather than an issue report.

When we erroneously attempted to update a table row with an invalid value (ex: a 300 character long string into an nvarchar(255)) other loopback calls would begin to fail, and we'd see the following error for all subsequent calls until a timeout occurred:

RequestError: Connection is busy with results for another command
    at StreamEvents.<anonymous> ([path]\node_modules\loopback-connector-mssql\node_modules\mssql\lib\msnodesqlv8.js:624:21)

To resolve this, we ensured that only valid data was being inserted/updated.
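In case it helps others, declaring the length constraint on the model keeps over-long values from ever reaching the connector. The model and property names below are placeholders for our actual ones:

// e.g. in common/models/my-model.js
module.exports = function (MyModel) {
  // Reject strings longer than the underlying nvarchar(255) column
  // before any SQL is issued.
  MyModel.validatesLengthOf('description', {max: 255});
};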

Connection With Keep-Alive

The load balancing policy on Azure is to disconnect connections that are idle for 30 minutes or more.
Is there any way to connect to Azure SQL with keep-alive connection?

Fault-Tolerance Support

Regarding SQL Server connection params for fault-tolerance, we need support for Data Source=Server_A;Failover Partner=Server_B

Customer is scheduled to go to production in 4 weeks. This is a blocker.

PLATAPI-383

limit and order

An observation that may be a documentation issue.
With loopback-connector-mssql, it is possible to define a limit without an order.
This results in "bad SQL".

With a console.log(command) in ../node_modules/loopback-connector-mssql/node_modules/mssql/lib/tedious.js, one observes ROW_NUMBER() OVER(null).

Adding order: 'column' gives OVER(ORDER BY [column]), and one is happy again.
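For anyone else hitting this, pairing limit with order in the filter avoids the OVER(null) entirely (the model name below is illustrative):

SomeModel.find({limit: 10, order: 'id ASC'}, function (err, rows) {
  // the generated paging query now uses ROW_NUMBER() OVER (ORDER BY [id])
});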

As stated, this could be a general documentation problem for all loopback-connector-* modules.

Br Owen

Edit:
Summary: if you discover a model from a database using slc arc and there is already an id column in the database, make sure you understand the "is id" flag that Arc shows when presenting the result of the discovery...

Filter for NULL?

I don't see any way to filter for NULL/not NULL columns. Am I missing something? Is this functionality planned?
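For reference, this is the filter shape I would expect to work if null filtering is supported; whether {neq: null} is translated to IS NOT NULL likely depends on the connector version, so treat this as a sketch rather than a confirmed answer (model and property names are illustrative):

// expecting ... WHERE [middleName] IS NULL
SomeModel.find({where: {middleName: null}}, console.log);

// expecting ... WHERE [middleName] IS NOT NULL
SomeModel.find({where: {middleName: {neq: null}}}, console.log);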

mssql connector doesn't work on Azure

Hi, it seems like there is some problem with the mssql connector on Azure. Even before the app boots, it crashes with the following error:

WARNING: LoopBack connector "mssql" is not installed as any of the following modules:

 ./connectors/mssql
loopback-connector-mssql

To fix, run:

    npm install loopback-connector-mssql
Application has thrown an uncaught exception and is terminated:
Error: 
WARNING: LoopBack connector "mssql" is not installed as any of the following modules:

 ./connectors/mssql
loopback-connector-mssql

To fix, run:

    npm install loopback-connector-mssql

    at DataSource.setup (D:\home\site\wwwroot\node_modules\loopback-datasource-juggler\lib\datasource.js:304:28)
    at new DataSource (D:\home\site\wwwroot\node_modules\loopback-datasource-juggler\lib\datasource.js:109:8)
    at Registry.createDataSource (D:\home\site\wwwroot\node_modules\loopback\lib\registry.js:349:12)
    at dataSourcesFromConfig (D:\home\site\wwwroot\node_modules\loopback\lib\application.js:415:19)
    at EventEmitter.app.dataSource (D:\home\site\wwwroot\node_modules\loopback\lib\application.js:222:12)
    at D:\home\site\wwwroot\node_modules\loopback-boot\lib\executor.js:169:9
    at D:\home\site\wwwroot\node_modules\loopback-boot\lib\executor.js:260:5
    at Array.forEach (native)
    at forEachKeyedObject (D:\home\site\wwwroot\node_modules\loopback-boot\lib\executor.js:259:20)
    at setupDataSources (D:\home\site\wwwroot\node_modules\loopback-boot\lib\executor.js:168:3)

On the other hand, the Postgres connector works flawlessly. I used Azure App Service (API app) for deployment. I checked the contents of the node_modules directory; indeed, it had the loopback-connector-mssql package inside, but LoopBack just fails to pick it up. It works locally, on the other hand.

The only idea I have is that it relates to the fact that the app is launched using IISNode. But why doesn't the Postgres connector crash as well?

Connect to instance

How do I connect to a named instance, for example SERVER\SQLEXPRESS?

config = {
  user: 'user',
  password: 'pass',
  host: 'server',
  database: 'dbname',
  supportsOffSetFetch: Math.random() > 0.5
}
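If the connector passes its options straight through to tedious (an assumption on my part), a named instance can usually be addressed with options.instanceName instead of a port, for example:

config = {
  connector: 'mssql',
  host: 'SERVER',          // machine name only, no \SQLEXPRESS suffix
  database: 'dbname',
  user: 'user',
  password: 'pass',
  options: {
    instanceName: 'SQLEXPRESS' // resolved via the SQL Browser service;
                               // do not also set a fixed port in this case
  }
}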

Regression in 2.7.0: Float Column Type is Rounded to Integer

I have this model JSON:

"properties": {
"percentage": {
"type": "number",
"required": true,
"mssql": {
"dataType": "float"
}
}
}

In 2.6.0 (and possibly earlier), the value is correctly written to the database table.

In 2.7.0, the value is rounded to the nearest integer.

AuthToken Error "Conversion failed when converting from a character string to uniqueidentifier."

When I try to use the login API

http://localhost:3000/api/Users/login?include=user

I am getting the following error in the console:

RequestError: Conversion failed when converting from a character string to uniqueidentifier.
    at Connection.<anonymous> (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\lib\tedious.js:661:17)
    at Connection.emit (events.js:107:17)
    at Parser.<anonymous> (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\node_modules\tedious\lib\connection.js:472:15)
    at Parser.emit (events.js:107:17)
    at Parser.<anonymous> (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\node_modules\tedious\lib\token\token-stream-parser.js:40:24)
    at Parser.emit (events.js:107:17)
    at readableAddChunk (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\node_modules\tedious\node_modules\readable-stream\lib\_stream_readable.js:198:16)
    at Parser.Readable.push (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\node_modules\tedious\node_modules\readable-stream\lib\_stream_readable.js:162:10)
    at Parser.Transform.push (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\node_modules\tedious\node_modules\readable-stream\lib\_stream_transform.js:133:32)
    at Parser.callee$1$0$ (E:\projects\ssgfluux\node_modules\loopback-connector-mssql\node_modules\mssql\node_modules\tedious\lib\token\stream-parser.js:92:18)

This is my migration script

app.models().forEach(function(Model) {
    app.dataSources.db.automigrate(Model.modelName, function(err) {
      if (err) console.log(err);
    });
  });

(Screenshot: accesstoken fields.)
This is the column structure I am seeing in the MSSQL server manager.

Connecting to Azure SQL Server shows SQL errors while creating tables

I'm not able to use an Azure SQL database and I'm running into some strange errors. After some debugging and testing I'm now able to connect to the database successfully, but now I'm getting SQL errors while the database tables are being created.

Here are my improvements so far:
Part 1: enable encryption; this is mandatory for Azure. Without encryption there's no chance to connect to the database, and you'll see absolutely no errors or hints about what's going on! Set the option "encrypt" to "true"; this setting is passed through to the Tedious module.

    "options": { 
      "encrypt": "true" 
    }

Part 2: for the DB username, use the syntax "USERNAME@SERVERNAME", where "SERVERNAME" is the first part of the Azure DNS name of your SQL instance; otherwise you'll encounter the following error:

loopback:connector:mssql Connection error:  +654ms { name: 'ConnectionError',
  message: 'Server name cannot be determined.  It must appear as the first segment 
of the server\'s dns name (servername.database.windows.net).  Some libraries do not 
send the server name, in which case the server name must be included as part of the 
user name (username@servername).  In addition, if both formats are used, the server names must match.',
  code: 'ELOGIN' }

Here are the important settings for connecting to an Azure SQL database. You have to define your datasources.json the right way.

  "accountDB": {
    "name": "accountDB",
    "connector": "mssql",
    "host": "SERVERNAME.database.windows.net",
    "port": 1433,
    "database": "DBNAME",
    "username": "USERNAME@SERVERNAME",
    "password": "PASSWORD",
    "options": { 
      "encrypt": "true" 
    }
  }

With these settings I can now successfully connect to the Azure SQL database. But now I run into SQL errors while the database tables are being created.

Activating debug logging:

DEBUG=loopback:connector:* node server/create-test-data.js

Console output:

  loopback:connector:mssql Connection established:  +648ms SERVERNAME.database.windows.net
  loopback:connector:mssql SQL: IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[account]') AND type in (N'U'))
BEGIN
    DROP TABLE [dbo].[account]
END null +1ms
  loopback:connector:mssql Result: undefined undefined +59ms
  loopback:connector:mssql SQL: SET ANSI_NULLS ON;
SET QUOTED_IDENTIFIER ON;
SET ANSI_PADDING ON;
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[account]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[account] (
    [id] [int] IDENTITY(1,1) NOT NULL,
    [email] [nvarchar](255) NULL,
    [level] [int] NULL,
    [created] [datetime] NULL,
    [modified] [datetime] NULL,
PRIMARY KEY CLUSTERED
(
    [id] ASC
) WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF,
ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]
END;
 null +2ms
  loopback:connector:mssql Result: {"name":"RequestError","message":"'Filegroup reference and 
partitioning scheme' is not supported in this version of SQL Server.","code":"EREQUEST","precedingErrors":[]} undefined +83ms
  loopback:connector:mssql RequestError: 'Filegroup reference and partitioning scheme' 
is not supported in this version of SQL Server.

The first step, dropping the database table, works (I had created one by hand). But the creation of the new table fails.
Hope these details help you fix this issue. If you need further details, I'm happy to help.

Put requests with non-existent properties result in an error

Put requests containing non-existent properties (properties that do not exist in the model definition) result in the following error:

TypeError: Cannot read property 'id' of undefined
    at MySQL.SQLConnector.buildFields (\loopback-connector-mysql\node_modules\loopback-connector\lib\sql.js:861:24)
    at MySQL.SQLConnector.buildFieldsForUpdate (\loopback-connector-mysql\node_modules\loopback-connector\lib\sql.js:892:21)
    at MySQL.SQLConnector.buildUpdate (\loopback-connector-mysql\node_modules\loopback-connector\lib\sql.js:583:21)
    at MySQL.SQLConnector.update (\loopback-connector-mysql\node_modules\loopback-connector\lib\sql.js:601:19)
    at MySQL.SQLConnector.updateAttributes (\loopback-connector-mysql\node_modules\loopback-connector\lib\sql.js:570:8)
    at ModelConstructor.<anonymous> (\loopback-datasource-juggler\lib\dao.js:2253:23)
    at ModelConstructor.trigger (\loopback-datasource-juggler\lib\hooks.js:65:12)
    at ModelConstructor.<anonymous> (\loopback-datasource-juggler\lib\dao.js:2218:14)
    at ModelConstructor.trigger (\loopback-datasource-juggler\lib\hooks.js:65:12)
    at \loopback-datasource-juggler\lib\dao.js:2217:12

The field builder could simply skip a property when it is not in the model definition. Perhaps it would be even better to remove such properties before they hit the data-connector layer?
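A minimal sketch of the guard I have in mind inside the field builder; this is an assumption about where the fix could live, not the shipped code:

// inside SQLConnector.prototype.buildFields(model, data, ...):
var props = this.getModelDefinition(model).properties;
Object.keys(data).forEach(function (key) {
  var p = props[key];
  if (!p) {
    // key is not declared on the model; skip it instead of reading p.id
    return;
  }
  // ... existing handling that pushes column names and parameter values ...
});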

{inq: []} produces invalid SQL

This query:

Project.find({
  fields: { id: true },
  where: { id: { inq: [] } }
}, function () {});

Produces this SQL:

SELECT [id] FROM [dbo].[Project]
WHERE 0
ORDER BY [id]

Which is invalid, and fails with this error:

RequestError: An expression of non-boolean type specified in a context where a condition is expected, near 'ORDER'.

The query presented is generated by the changes method, which is part of the Loopback replication implementation. This particular query will be generated when there are no changes detected.

I think the changes method should ignore this case and not send a query. I will send a separate issue/pull request for that.

I bring up the issue here though, because I think that the MSSQL driver should probably also gracefully handle this situation by creating a no-op query. Something like this:

SELECT [id] FROM [dbo].[Project]
WHERE 0 = 1
ORDER BY [id]

or...

WHERE [id] IS NULL

or...

WHERE [id] IN (NULL)

I'm concerned about the latter two though, as they might incorrectly produce results in the case of a NULLABLE field. Is there a better way to form such a query? I'm happy to make a pull request if I can get some feedback on what the correct solution would be.
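A sketch of the connector-side guard I have in mind, assuming the right place is the expression builder and using the ParameterizedSQL helper that loopback-connector exposes; this is not the current code:

var ParameterizedSQL = require('loopback-connector').ParameterizedSQL;

// When `inq` (or `nin`) receives an empty array, return a constant predicate
// instead of the bare 0 that currently reaches the WHERE clause.
function buildEmptyInClause(operator) {
  // `inq []` can never match anything, `nin []` always matches
  return new ParameterizedSQL(operator === 'inq' ? '1=0' : '1=1');
}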

Thanks!

accessToken id data type - impossible to change from uniqueidentifier to nvarchar

I have this model definition:

{
  "name": "accessToken",
  "plural": "accessTokens",
  "base": "AccessToken",
  "idInjection": false,
  "options": {
    "idInjection": false
  },
  "properties": {
    "id": {
      "type": "string",
      "id": true,
      "generated": false,
      "mssql": {
        "dataType": "nvarchar(255)"
      }
    }
  },
  "validations": [],
  "relations": {
    "user": {
      "type": "belongsTo",
      "model": "user",
      "foreignKey": "userId"
    }
  },
  "acls": [],
  "methods": {}
}

Running app.dataSources.db.autoupdate() results in the following DDL for me:

CREATE TABLE AccessToken
(
    id UNIQUEIDENTIFIER PRIMARY KEY NOT NULL,
    ttl INT,
    created DATETIME,
    userId INT
);

Instead, the id column should have the nvarchar(255) data type. This is most definitely a bug.

Syntax for Indexes Model JSON Does Not Match Documentation

Based on the documentation at the bottom of this page:

https://docs.strongloop.com/display/public/LB/Model+definition+JSON+file#ModeldefinitionJSONfile-Indexes

I believe that I should be able to do this:

"indexes": {
"ERPIDandLTPID": {
"entityReportingPeriodId": 1,
"lowerTierPartnerId": 1
}
}

But after studying the source, I need to do this:

"indexes":{
"blah": {
"columns": "entityReportingPeriodId,lowerTierPartnerId"
}
}

I'm fine with either syntax but it would help if the code and docs matched.

How to make query on a field of a relation

Let's say I have a table B which belongs to table A, and table A has a type column. How can I make a filter query on table B based on the type field in table A?
I tried

where: {'tableA.type': {inq: [1, 2]}}

I don't think dot notation works here; I keep getting an error like this in the console and the app crashes.

/home/jaideepdas/Projects/ssgfluux/node_modules/loopback-connector-mssql/lib/mssql.js:275
  if (prop.type === String) {
          ^

TypeError: Cannot read property 'type' of undefined
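As far as I can tell, the parent query's where clause cannot reference columns of a related model. The closest built-in I know of filters the included relation instead, which restricts the related records rather than the table-B rows themselves; a sketch, assuming the relation on B is named tableA:

TableB.find({
  include: {
    relation: 'tableA',
    scope: {where: {type: {inq: [1, 2]}}}
  }
}, function (err, rows) {
  // each row.tableA() result is filtered by type; the rows themselves are not
});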

connectors:mssql licensing missing or invalid

When I "slc run", it showed:

## connectors:mssql licensing missing or invalid. Please verify your licenses in the Licenses page by running StrongLoop Arc by typing "slc arc --licenses" . If you have questions about your license or StrongLoop licensing please contact [email protected].

But when I run "slc arc --licenses", I cannot find the licenses page (is it under the logout button?).

When I query http://0.0.0.0:3000/api/Users:

{
  "error": {
    "name": "RequestError",
    "status": 500,
    "message": "Invalid object name 'dbo.ACL'.",
    "code": "EREQUEST",
    "number": 208,
    "lineNumber": 1,
    "state": 1,
    "class": 16,
    "serverName": "WIN-SV2EV1V9NJ4",
    "procName": "",
    "precedingErrors": [],
...

Please advise. Thank you very much.

Insert into Table with Active Trigger

{
  "error": {
    "name": "RequestError",
    "status": 500,
    "message": "The target table 'dbo.tableName' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause.",
    "code": "EREQUEST",
    "precedingErrors": [],
    "stack": "RequestError: The target table 'dbo.tableName' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause.\n    at Connection.<anonymous> (app/node_modules/loopback-connector-mssql/node_modules/mssql/lib/tedious.js:608:17)\n    at Connection.emit (events.js:95:17)\n    at Parser.<anonymous> (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/connection.js:450:15)\n    at Parser.emit (events.js:95:17)\n    at Parser.nextToken (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/token/token-stream-parser.js:103:18)\n    at Parser.addBuffer (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/token/token-stream-parser.js:80:17)\n    at Connection.sendDataToTokenStreamParser (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/connection.js:884:35)\n    at Connection.STATE.SENT_CLIENT_REQUEST.events.data (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/connection.js:289:23)\n    at Connection.dispatchEvent (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/connection.js:748:59)\n    at MessageIO.<anonymous> (app/node_modules/loopback-connector-mssql/node_modules/mssql/node_modules/tedious/lib/connection.js:676:22)"
  }
}
--I imagine the query used is similar to
INSERT INTO dbo.tableName
        ( ID ,
          Col1 ,
          Col2 
        )
        OUTPUT INSERTED.ID
VALUES  ( 0 ,
          'Col1' ,
          'Col2'
        )
--Microsoft documentation on OUTPUT with TRIGGER enabled: http://msdn.microsoft.com/en-us/library/ms177564.aspx

mssql Data source-specific options

I define datasources.json:

"chs": {
    "host": "xx.xx.xx.xx",
    "port": 1433,
    "database": "chs_services",
    "username": "xxx",
    "password": "xxx",
    "name": "chs",
    "connector": "mssql"
  }

Then I defined this in Account.json:

...
"options": {
  "chs": {
    "table": "chs_employees"
  },

Then it returned:

{
  "error": {
    "name": "RequestError",
    "status": 500,
    "message": "Invalid object name 'dbo.Account'.",
    "code": "EREQUEST",
    "number": 208,
    "lineNumber": 1,
    "state": 1,
    "class": 16,
    "serverName": "WIN-SV2EV1V9NJ4",
    "procName": "",
    "precedingErrors": [],
...

It seems that it uses the model name "Account" as the table name rather than the name "chs_employees" defined in Account.json.
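If I read the other model files in this tracker correctly, the connector-specific overrides are keyed by the connector name ("mssql"), not by the datasource name, so the override would presumably need to look like this (an assumption based on those examples):

"options": {
  "mssql": {
    "schema": "dbo",
    "table": "chs_employees"
  }
},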

SQL Server Express

Hi, is it possible to connect to SQL Server Express LocalDB? Usually it doesn't have a port, username, or password, since it was created for development purposes.

IN and NOT IN string escaped after join

Hi,

In mssql.js on line 503 it seems that you are escaping the result of the join, which leads to unwanted single quotes. It is currently this:

return escape(val.join("','"));

I think it should be something like this:

return val.map(function(v) { return escape(v); }).join(",");

Please let me know if I have missed something.

Regards

INFORMATION_SCHEMA vs information_schema

Hi,

In Turkish, the lowercase of I is not i (it is ı), so the queries in discovery.js that use "information_schema" fail. I tried changing it to INFORMATION_SCHEMA, and it seems to work.

Please see "Turkey Test"

Sincerely

Windows SSO Authentication

Does the connector support using Windows SSO to authenticate to SQL Server via Windows Authentication, or is the only supported method SQL Server auth with username/password?

loopback modules compatibilities

loopback 2.1.2 and loopback-connector-mssql require different versions of loopback-datasource-juggler as peer dependencies.

npm ERR! peerinvalid The package loopback-datasource-juggler does not satisfy its siblings' peerDependencies requirements!
npm ERR! peerinvalid Peer [email protected] wants loopback-datasource-juggler@^2.0.0
npm ERR! peerinvalid Peer [email protected] wants loopback-datasource-juggler@...

GeoPoint inconsistency?

Hello,

I'm starting to use this connector in preparation for a deployment; I'm migrating from the memory connector. I have a model with a gps: "geopoint" property. When querying an instance of that model, the memory connector properly returns an object similar to {lat: number, lng: number}, while the mssql connector returns a string (resulting in a cascade of errors). Also, when inspecting the database, the type of that column is varchar(255); wouldn't "geometry" be better suited?

The fix is easy on my side, but since it may affect more people I thought I could at least file a bug report even if it ends up not being one :)

Daniel

slc debug with mssql connector

I can't manage to debug a LoopBack application which uses loopback-connector-mssql.
It always throws a timeout error.

Connection fails:  { [ConnectionError: Failed to connect to xxxxxx:1433 in 15000ms]
  name: 'ConnectionError',
  message: 'Failed to connect to xxxxxx:1433 in 15000ms',
  code: 'ETIMEOUT' } 
It will be retried for the next request.

Violation of PRIMARY KEY When there is an Identity column in model

If the model has an identity column with id = true and generated = true, I get this error when trying to insert:

Error:
"[DatabaseError: RequestError: Violation of PRIMARY KEY constraint 'PK_TblLkp'. Cannot insert duplicate key in object 'dbo.TblLkp'.]"

Model:

{
    "name": "TblLkp",
    "options": {
        "idInjection": false,
        "mssql": {
            "schema": "dbo",
            "table": "TblLkp"
        },
        "relations": {}
    },
    "properties": {
        "id": {
            "type": "Number",
            "id": true,
            "generated": true,
            "required": true,
            "length": null,
            "precision": 10,
            "scale": 0,
            "mssql": {
                "columnName": "id",
                "dataType": "int",
                "dataLength": null,
                "dataPrecision": 10,
                "dataScale": 0,
                "nullable": "NO"
            }
        },
        "subTypeCode": {
            "type": "Number",
            "required": true,
            "length": null,
            "precision": 10,
            "scale": 0,
            "mssql": {
                "columnName": "SubTypeCode",
                "dataType": "int",
                "dataLength": null,
                "dataPrecision": 10,
                "dataScale": 0,
                "nullable": "NO"
            }
        },
        "dataType": {
            "type": "String",
            "required": true,
            "length": 100,
            "precision": null,
            "scale": null,
            "mssql": {
                "columnName": "DataType",
                "dataType": "varchar",
                "dataLength": 100,
                "dataPrecision": null,
                "dataScale": null,
                "nullable": "NO"
            }
        },
        "dataValue1": {
            "type": "String",
            "required": true,
            "length": 100,
            "precision": null,
            "scale": null,
            "mssql": {
                "columnName": "DataValue1",
                "dataType": "varchar",
                "dataLength": 100,
                "dataPrecision": null,
                "dataScale": null,
                "nullable": "NO"
            }
        },
        "dataValue2": {
            "type": "String",
            "required": true,
            "length": 100,
            "precision": null,
            "scale": null,
            "mssql": {
                "columnName": "DataValue2",
                "dataType": "varchar",
                "dataLength": 100,
                "dataPrecision": null,
                "dataScale": null,
                "nullable": "NO"
            }
        }
    },
    "base": "LookupBaseModel"
}

Generated SQL:

SET IDENTITY_INSERT [dbo].[TblLkp] ON;

DECLARE @ids TABLE (id VARCHAR(50))

INSERT INTO [dbo].TblLkp OUTPUT INSERTED.[id] into @ids VALUES(0,11,N'sdfs',N'r',N's')

SET IDENTITY_INSERT [dbo].[TblLkp] OFF;

SELECT id AS insertId from @ids

Test failure

It seems this test has been failing since July 23rd:

The error message:

Conversion failed when converting from a character string to uniqueidentifier.

The test case has been passing, and it still passes on all other connectors except loopback-connector-mssql.

@superkhau @0candy @loay The test was added on July 7th and was passing up to July 23rd; do you know why it started failing? We may need to skip this test for now if we cannot figure out the reason soon. Something is wrong just with loopback-connector-mssql.

CC: @bajtos @raymondfeng

(Screenshot: screen shot 2016-07-27 at 12 28 11 pm)

Why does toDatabase() convert dates to UTC?

I know it's best practice to store dates as UTC but often developers are working with existing systems that do not. I'm not sure why the connector should force this on the developer as the default.

Most DB access layers using MSSQL that I've used do not do this conversion automatically. I think it should either leave the date alone or maybe make it an option to convert to UTC that could be 'true' by default if you don't want to make a breaking change.

Also, it appears fromDatabase() does not convert back. It really can't because not all MSSQL date types support timezones.

Wrong parameter types in parameterizedSQL

If you have a SQL table with char or varchar data types, the LoopBack connector always sends nvarchar as the parameter type. At the very least it should take the proper type from the model definition, instead of completely ignoring the type defined in the model and leaving the type guessing to the mssql package.

For example if properties definition in model is like this:

"properties": {
"Vrsta": {
"type": "String",
"id": true,
"required": true,
"length": 2,
"precision": null,
"scale": null,
"mssql": {
"columnName": "Vrsta",
"dataType": "char",
"dataLength": 2,
"dataPrecision": null,
"dataScale": null,
"nullable": "NO"
},
"_selectable": false
},
"Stevilka": {
"type": "String",
"id": true,
"required": true,
"length": 8,
"precision": null,
"scale": null,
"mssql": {
"columnName": "Stevilka",
"dataType": "char",
"dataLength": 8,
"dataPrecision": null,
"dataScale": null,
"nullable": "NO"
},
"_selectable": false
}
}

The parameter type in the generated SQL should be char instead of nvarchar, because when this happens SQL Server will not use indexes, which dramatically slows down queries.

See: https://lostechies.com/jimmybogard/2012/07/18/troubleshooting-sql-index-performance-on-varchar-columns/

As I see it, the problem is in this line of loopback-connector-mssql:

https://github.com/strongloop/loopback-connector-mssql/blob/master/lib/mssql.js#L140

where the guessing of the parameter type is left to the mssql package instead of taking it from the model definition.
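A sketch of the mapping I mean, using the explicit parameter types that the mssql package exposes; the property-to-type mapping below is an assumption about how the connector could do it, not its current behaviour:

var sql = require('mssql');

// Map a model property's mssql.dataType/dataLength to an explicit parameter
// type so sp_executesql declares char(n) instead of nvarchar(n).
function parameterType(prop) {
  var ms = prop && prop.mssql;
  if (ms && ms.dataType === 'char') return sql.Char(ms.dataLength);
  if (ms && ms.dataType === 'varchar') return sql.VarChar(ms.dataLength);
  return sql.NVarChar(); // the current effective behaviour
}

// e.g. request.input('param1', parameterType(properties.Stevilka), '16-00039');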

Currently, mssql generates SQL like this:

exec sp_executesql @Statement=N'SELECT [Vrsta],[Stevilka] FROM [eNarocanje].[ZahtevekPozicija] WHERE [Stevilka]=@param1 AND [Vrsta]=@Param2 ORDER BY [Vrsta],[Stevilka]',@params=N'@param1 nvarchar(8), @Param2 nvarchar(2)',@param1=N'16-00039',@Param2=N'Z1'

But the correct SQL would be:

exec sp_executesql @Statement=N'SELECT [Vrsta],[Stevilka] FROM [eNarocanje].[ZahtevekPozicija] WHERE [Stevilka]=@param1 AND [Vrsta]=@Param2 ORDER BY [Vrsta],[Stevilka]',@params=N'@param1 char(8), @Param2 char(2)',@param1=N'16-00039',@Param2=N'Z1'

update with string or varchar as primary key not working

Could you please add support for the case where the primary key is not numeric?

Currently, my quick fix is to add "id = escape(id)" before generating the SQL query.

For example,

//update
id = escape(id); // the added line to quick fix
sql = "UPDATE " + tblName + MsSQL.newline;
sql += "SET " + combined.join() + MsSQL.newline;
sql += "WHERE [" + modelPKID + "] = (?);" + MsSQL.newline;
sql += "SELECT " + id + " AS pkid;";
fieldValues.push(id);

queryForeignKeys returns incorrect fkTableName

The ds.discoverForeignKeys function, which uses queryForeignKeys from discovery.js, returns the pkTableName as both pkTableName and fkTableName on SQL Server 2016. The following query should return the correct result for fkTableName:

SELECT kcu.constraint_name AS "fkName", kcu.table_schema AS "fkOwner", t.name AS "fkTableName",
kcu.column_name AS "fkColumnName", kcu.ordinal_position AS "keySeq",
'PK' AS "pkName", ccu.table_schema AS "pkOwner",
ccu.table_name AS "pkTableName", ccu.column_name AS "pkColumnName"
FROM information_schema.table_constraints tc
JOIN information_schema.key_column_usage AS kcu
ON tc.constraint_schema = kcu.constraint_schema AND tc.constraint_name = kcu.constraint_name
JOIN information_schema.constraint_column_usage ccu
ON ccu.constraint_schema = tc.constraint_schema AND ccu.constraint_name = tc.constraint_name
JOIN sys.foreign_keys AS f
ON f.name = ccu.constraint_name
JOIN sys.foreign_key_columns AS fc 
ON f.object_id = fc.constraint_object_id
JOIN sys.tables t 
ON t.object_id = fc.referenced_object_id
WHERE tc.constraint_type = 'FOREIGN KEY'

Here's the corresponding DDL that shows the foreign key constraint:

CREATE TABLE [dbo].[PAYMENTS](
	[PAYMENT_KEY] [int] IDENTITY(1,1) NOT NULL,
	[DRG Definition] [varchar](250) NOT NULL,
	[Provider Id] [varchar](50) NOT NULL,
	[Hospital Referral Region Description] [varchar](150) NOT NULL,
	[Total Discharges ] [numeric](18, 4) NULL,
	[Covered Charges] [numeric](18, 4) NULL,
	[Total Payments] [numeric](18, 4) NULL,
	[Medicare Payments] [numeric](18, 4) NULL,
	[Year] [int] NOT NULL,
 CONSTRAINT [PK_PAYMENTS] PRIMARY KEY CLUSTERED 
(
	[PAYMENT_KEY] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

ALTER TABLE [dbo].[PAYMENTS]  WITH CHECK ADD  CONSTRAINT [FK_PAYMENTS_PROVIDERS] FOREIGN KEY([Provider Id])
REFERENCES [dbo].[PROVIDERS] ([Provider Id])

ALTER TABLE [dbo].[PAYMENTS] CHECK CONSTRAINT [FK_PAYMENTS_PROVIDERS]
