psdataverse's Issues

Get-DataverseGlobalOptionSet

Add a new function to retrieve all the information about a global option set by its name. It would also be interesting to support the following parameters:
-LanguageCode, defaulting to 1033. This parameter only affects the parameter below.
-OnlyValueLabels makes the function return only value/label pairs in the form of [Int64], [string]
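A hypothetical sketch of the proposed function, assuming operations go through the existing Send-DataverseOperation and use the standard Web API metadata path GlobalOptionSetDefinitions(Name='<name>'); the response shape (Options, Label.LocalizedLabels) follows the documented metadata format:

```powershell
function Get-DataverseGlobalOptionSet {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)][string]$Name,
        [int]$LanguageCode = 1033,
        [switch]$OnlyValueLabels
    )
    # GlobalOptionSetDefinitions(Name='...') is the standard Web API metadata path.
    $response = Send-DataverseOperation @{ Uri = "GlobalOptionSetDefinitions(Name='$Name')" }
    if (-not $OnlyValueLabels) { return $response }
    foreach ($option in $response.Options) {
        # Pick the label that matches the requested language code.
        $label = ($option.Label.LocalizedLabels |
            Where-Object { $_.LanguageCode -eq $LanguageCode } |
            Select-Object -First 1).Label
        [pscustomobject]@{ Value = [int64]$option.Value; Label = [string]$label }
    }
}
```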

Automatic pagination

Get requests (queries) whose responses contain more than 5,000 rows are paginated, and retrieving each additional page requires a new request. This calls for a loop that detects the next page and sends follow-up requests.

To make it easier to work with such queries, a new parameter (-AutoPaginate) on Send-DataverseOperation could take care of retrieving the pages automatically.
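The loop that -AutoPaginate would run can be sketched as follows; it assumes Send-DataverseOperation returns the parsed JSON page, including the standard OData @odata.nextLink property that Dataverse emits while more pages remain:

```powershell
# Sketch of the auto-pagination loop; Invoke-WithAutoPagination is a
# hypothetical name, not part of the module.
function Invoke-WithAutoPagination {
    param([Parameter(Mandatory)][string]$Uri)
    do {
        $page = Send-DataverseOperation @{ Uri = $Uri }
        $page.value          # emit the rows of the current page to the pipeline
        # Dataverse returns @odata.nextLink while there are more pages;
        # it is absent ($null) on the last page, which ends the loop.
        $Uri = $page.'@odata.nextLink'
    } while ($Uri)
}
```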

Add unit tests

Now that PSDataverse's core has become more mature, it's time to add unit tests for both the C# part (binary cmdlets) and the PowerShell cmdlets.
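For the PowerShell half, a Pester spec is the idiomatic starting point. This is a hypothetical sketch (the asserted behavior is illustrative, not the module's documented contract), and it requires the Pester module to run:

```powershell
# Hypothetical Pester spec; by convention the file would be named *.Tests.ps1.
Describe 'Send-DataverseOperation' {
    It 'fails when no operation is given' {
        { Send-DataverseOperation -ErrorAction Stop } | Should -Throw
    }
}
```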

Get-DataverseAttributes

A new Cmdlet to make downloading attribute metadata easier. The syntax can be like the following.

Get-DataverseAttributes [-EntityLogicalName] <String[]> [[-AttributeType] <String>] [[-Select] <String>]
    [[-Filter] <String>] [[-Expand] <String>] [-WhatIf] [-Confirm] [<CommonParameters>]

Example 1:

PS > Get-DataverseAttributes -EntityLogicalName 'account', 'contact'

Example 2:

PS > 'account', 'contact' | Get-DataverseAttributes

Example 3:

PS > Get-DataverseAttributes -EntityLogicalName 'account' -AttributeType Decimal

Example 4:

PS > Get-DataverseAttributes -EntityLogicalName 'account' -AttributeType Picklist -Expand OptionSet -Filter "IsValidForRead eq true"

Shorten the name of cmdlets

Currently all Cmdlets use the following pattern for naming:

<verb>-Dataverse<noun>

For operations with a long noun part in the name, like Get-DataverseRowCount, typing the full name can be tedious.

Consider renaming all the existing Cmdlets to the following pattern:

<verb>-Dv<noun>

To continue supporting the current names, old functions can become wrappers to call new functions.

Function Clear-DvTable {
    # Implementation here
}
Function Clear-DataverseTable {
    # Forward all arguments to the new name so existing scripts keep working.
    Clear-DvTable @args
}

New Cmdlet `ConvertTo-Code` to convert any object into code

A new Cmdlet ConvertTo-Code can convert any PowerShell object (or array of objects) into code based on a given template. This will allow code generation based on metadata retrieved from Power Platform (or any other source). The template approach makes this command useful for unforeseen future scenarios.

Syntax

ConvertTo-Code [-InputObject] <PSObject[]> -Template <String>

Future Enhancements

  • Support loading template from file using -TemplateLiteralPath <String>
  • Support loading template from file using -TemplatePath <String>
  • Support known templates using -KnownTemplate <String>
  • Support outputting to file using -OutPath <String>
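A minimal sketch of how ConvertTo-Code could work, assuming a simple {PropertyName} placeholder convention for the template (the issue does not specify one):

```powershell
function ConvertTo-Code {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, ValueFromPipeline)][psobject[]]$InputObject,
        [Parameter(Mandatory)][string]$Template
    )
    process {
        foreach ($item in $InputObject) {
            $line = $Template
            # Replace each {PropertyName} placeholder with the property's value.
            foreach ($prop in $item.PSObject.Properties) {
                $line = $line.Replace('{' + $prop.Name + '}', [string]$prop.Value)
            }
            $line
        }
    }
}
```

With this sketch, `[pscustomobject]@{ Name = 'accountid'; Type = 'Guid' } | ConvertTo-Code -Template 'public {Type} {Name};'` would emit `public Guid accountid;`.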

Verification code expired before contacting the server

For scripts running for more than an hour, Send-DataverseOperation reports "Verification code expired before contacting the server".
This happens with device-based authentication, running in Visual Studio Code on PowerShell 7.4.1 with the latest version of PSDataverse.

Add a Cmdlet to count table rows

There are different ways of counting rows in Power Platform. The most reliable is to request all rows and count them.

  • The difficulty is that the Web API paginates the result when more than 5,000 rows are to be returned.
  • A good side effect of this approach is that a filter can be sent to perform a specific count (e.g. all accounts whose revenue is higher than $1M).
  • The downside is that this is the slowest technique and puts the highest pressure on the resources of both client and server.

Another approach is to ask the Web API for the count based on the last snapshot.

  • This is the fastest way and has the lowest impact on both client and server.
  • But the result might not be up to date; it can be up to a day old.

The Cmdlet should let the user choose the strategy and default to the most reliable approach.
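The slow-but-reliable strategy can be sketched like this; Get-DvRowCount is a hypothetical name, and the sketch assumes Send-DataverseOperation returns parsed pages carrying the standard @odata.nextLink property:

```powershell
# Sketch of the "count everything" strategy: page through all rows and sum
# the page sizes. Slow, but exact and filterable.
function Get-DvRowCount {
    param(
        [Parameter(Mandatory)][string]$EntitySetName,
        [string]$Filter
    )
    $uri = $EntitySetName
    if ($Filter) { $uri += '?$filter=' + $Filter }
    $count = 0
    do {
        $page = Send-DataverseOperation @{ Uri = $uri }
        $count += @($page.value).Count
        $uri = $page.'@odata.nextLink'   # absent on the last page
    } while ($uri)
    $count
}
```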

Enable true "one-command" operations by integrating Connect-Dataverse into Send-DataverseOperation

On some occasions there is only ever one operation to run, and having to issue two commands, one to connect and another to do the actual work, gets in the way. Bringing the parameters of Connect-Dataverse into Send-DataverseOperation as optional parameters would improve that. It would also make it easier for other functions built (now or in the future) on top of Send-DataverseOperation to follow the same convention.

Add Clear-DataverseTable cmdlet

Syntax:
Clear-DataverseTable [-SchemaName] <String[]>

Example:
Clear-DataverseTable accounts,contacts

This cmdlet should be implemented in PowerShell language and not as a binary Cmdlet.

ℹī¸ Remarks

  • The command blocks until the operation is completed.
  • Cancellation is supported, but rows already deleted stay deleted; per the official documentation of the BulkDelete action, deletion proceeds in blocks of n × 1000 rows.
  • Rollback won't be possible.
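A hypothetical PowerShell-only sketch; the operation hashtable shape (Uri/Method/Value) and the BulkDelete payload fields are assumptions to be verified against the BulkDelete action documentation:

```powershell
# Hypothetical sketch: submit one BulkDelete job per table.
function Clear-DataverseTable {
    [CmdletBinding(SupportsShouldProcess)]
    param([Parameter(Mandatory, ValueFromPipeline)][string[]]$SchemaName)
    process {
        foreach ($table in $SchemaName) {
            if (-not $PSCmdlet.ShouldProcess($table, 'Bulk delete all rows')) { continue }
            Send-DataverseOperation @{
                Uri    = 'BulkDelete'
                Method = 'POST'
                Value  = @{
                    # An unfiltered query per table deletes every row.
                    QuerySet              = @(@{ EntityName = $table })
                    JobName               = "Clear-DataverseTable: $table"
                    SendEmailNotification = $false
                    RecurrencePattern     = ''
                    StartDateTime         = (Get-Date).ToUniversalTime().ToString('o')
                    ToRecipients          = @()
                    CCRecipients          = @()
                }
            }
        }
    }
}
```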

More verbose Authentication/permission errors

Hey,

I have two suggestions.

  • When authenticating, I overlooked that I was using the wrong clientId. The error the module gave was generic. Microsoft itself probably returned a far more detailed error, but it wasn't passed through. Would it be possible to just pass through the error Microsoft gives? In PowerShell I'd implement this as throw "Authentication error: $($_.Exception.Message)".

  • When trying to retrieve data I missed that the application did not have permission to the specific table. Instead of returning an error I just got empty data. I only got an error when I tried to look up a specific record.

P.S. Love the module so far; it's feeling far more versatile than Microsoft.Xrm.Data.PowerShell, which is stuck on PowerShell 5.

Syntax Error in Connect-Dataverse Example #3

The third Connect-Dataverse example should have "clientid" instead of "client_id" (the current example leads to: Connect-Dataverse: The given key 'clientid' was not present in the dictionary.).

In addition it is also missing the resource key (which is present in the other examples).

I think the example should be like this:
Connect-Dataverse "authority=https://login.microsoftonline.com/<your-tenant-id>/oauth2/authorize;clientid=1950a258-227b-4e31-a9cf-717495945fc2;device=true;resource=https://<your-environment-name>.crm4.dynamics.com/" -InformationAction Continue

Setup CI + CD

The pipeline should include the following steps.

  1. Build the .NET code.
  2. Prepare the PowerShell module (full automation is not required yet).
  3. Produce the artifact.
  4. Publish to the PowerShell Gallery.

Make Connect-Dataverse work with MSAL.PS

MSAL.PS is maintained by the AzureAD team and wraps the MSAL.NET library. It gives the user more control over authentication. The Get-MsalToken cmdlet in that module returns an AuthenticationResult object that contains the access token needed for interacting with Dataverse.

Added value:

  • Puts user in control of authentication.
  • Enables the user to easily reuse the authentication for other commands (e.g. MS Graph API, custom APIs)

Example use case:

$token = Get-MsalToken -ClientId 1950a258-227b-4e31-a9cf-717495945fc2 -TenantId 12345678-ABCD-ABCD-ABCD-1234567890AB
Connect-Dataverse $token

-Batchsize does not handle in-line If/else statements

When an if/else statement is written inside a foreach loop over records, the -BatchSize parameter seems unable to handle that logic. Running Send-DataverseOperation one record at a time works with no issue. When the if comes directly after the foreach, followed by an else that repeats all columns, -BatchSize works with that as well.

PSFunctions folder is not found in the release package

Install-Module PSDataverse is successful, but when running Import-Module PSDataverse, the following error occurs:

Get-ChildItem: Cannot find path '...PowerShell\Modules\PSDataverse\0.0.4\PSFunctions' because it does not exist.

Upon checking I discovered that the PSFunctions folder that contains pure PS functions is not packaged during the release.

Piping in JSON not working

I am having issues running even the most basic commands when piping JSON into the Send-DataverseOperation function. The commands work if I provide the operation after the call, like this: Send-DataverseOperation '{"Uri":"WhoAmI"}'. However, when piping it in following the example in the README ('{"Uri":"WhoAmI"}' | Send-DataverseOperation), I get the following error: Send-DataverseOperation: No operation has been given. Please provide an operation using either of -InputOperation or -InputJson or -InputObject arguments.

Piping in non JSON commands does seem to work, for example the following works just fine: @{Uri="WhoAmI"} | Send-DataverseOperation
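Until the binding issue is fixed, a workaround consistent with the report (hashtable input binds correctly) is to parse the JSON before piping; ConvertFrom-Json -AsHashtable requires PowerShell 6+. The stub below stands in for the real cmdlet only so the snippet runs on its own:

```powershell
# Stand-in for PSDataverse's Send-DataverseOperation, only so this snippet
# is self-contained; in a real session the module's cmdlet receives the hashtable.
function Send-DataverseOperation {
    param([Parameter(ValueFromPipeline)][hashtable]$InputObject)
    process { $InputObject.Uri }
}

# Workaround: convert the JSON to a hashtable client-side, then pipe it.
'{"Uri":"WhoAmI"}' | ConvertFrom-Json -AsHashtable | Send-DataverseOperation
```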

Possibility to disable affinity cookie

By sending the header Arr-Disable-Session-Affinity with the value true, we can disable the affinity cookie and make sure requests are distributed across multiple servers.
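A sketch of what the module could do internally: add the header once to a shared HttpClient so every request opts out of affinity (header name and value follow the Azure Application Request Routing convention):

```powershell
# Opt out of server affinity by sending Arr-Disable-Session-Affinity: true
# on every request made through the shared HttpClient.
$client = [System.Net.Http.HttpClient]::new()
$client.DefaultRequestHeaders.Add('Arr-Disable-Session-Affinity', 'true')
```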

Explore the possibility of using TPL Dataflow Library

Context

PSDataverse relies on TPL and custom logic to manage parallel execution of multiple batch requests. The custom logic takes care of distributing the work, managing the max degree of parallelism, merging the result, and piping it back to PowerShell.

What is TPL Dataflow Library?

According to Microsoft Documentation:
The Task Parallel Library (TPL) provides dataflow components to help increase the robustness of concurrency-enabled applications. These dataflow components are collectively referred to as the TPL Dataflow Library. This dataflow model promotes actor-based programming by providing in-process message passing for coarse-grained dataflow and pipelining tasks. The dataflow components build on the types and scheduling infrastructure of the TPL and integrate with the C#, Visual Basic, and F# language support for asynchronous programming.

Benefits

Looking at the official documentation and samples, it seems possible (with a lot of work) to replace the current implementation with the TPL Dataflow Library. The added value would be less custom concurrency logic to maintain and a standard approach that is potentially more familiar to developers.

It is not yet clear whether switching to TPL Dataflow would bring other benefits, such as better resource utilization or better error handling.
