cisagov / scubagear

Automation to assess the state of your M365 tenant against CISA's baselines

Home Page: https://www.cisa.gov/resources-tools/services/secure-cloud-business-applications-scuba-project

License: Creative Commons Zero v1.0 Universal

PowerShell 29.23% HTML 25.19% CSS 0.38% JavaScript 0.84% Open Policy Agent 44.36%
assessment-tool cisa cybersecurity m365 open-policy-agent open-source powershell rego scuba security security-automation contributions-welcome

scubagear's Issues

Evaluate permitting more narrowly scoped CA policies

Today, we require that CA policies apply to all applications. Since this is an M365 baseline, it should be permissible to allow CA policies that apply only to Office 365 (there is a built-in label for this), to all apps, or even potentially to explicit lists of apps.

This may be required in practice as some apps may not be able to enforce certain requirements due to their use cases (e.g. being public facing).

Add note about Unblock-File

💡 Summary

Add note about Unblock-File.

Motivation and context

Needed when downloading files, since Windows marks files downloaded from the internet as blocked.

Implementation notes

# After extracting the downloaded release, unblock the files so PowerShell will run them
cd ScubaGear-0.1.0
Get-ChildItem -Recurse . | Unblock-File

Acceptance criteria

A note about Unblock-File has been added to the documentation.

Fix error handling bugs for teams Rego 2.10

Issue found: the Teams Rego for policy 2.10 should be modified.
[Screenshot: Screen Shot 2022-12-13 at 7 21 42 AM]
This should be an easy fix: just two lines for 2.10 need to change the cmdlet
to "Get-CsTeamsMeetingBroadcastPolicy"; it is currently set to "Get-CsTeamsMeetingPolicy".

Originally posted by @Dylan-MITRE
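A minimal sketch of the intended two-line change, assuming the provider export keys mirror the source cmdlets (both key names here are assumptions, not the actual ScubaGear schema):

```rego
package teams

# Before (incorrect): policy 2.10 read the meeting policies,
# i.e. the output of Get-CsTeamsMeetingPolicy:
#   Policies := input.teams_meeting_policies

# After (correct): read the broadcast policies instead,
# i.e. the output of Get-CsTeamsMeetingBroadcastPolicy
Policies[Policy] {
    Policy := input.broadcast_policies[_]
}
```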

Getting multiple product results when any number of products are assigned to the ProductNames variable.

🐛 Summary

What's wrong? Please be specific.

When setting the $ProductNames variable to any subset of products, the script still runs against all products.

Steps to reproduce the behavior:

  1. Set parameters:
     $LogIn = $true
     $ProductNames = @("aad")
     $Endpoint = ""
     $OutPath = "./Reports"
     $OPAPath = "./"

  2. Run .\RunSCuBA.ps1

Expected behavior

What did you expect to happen that didn't?

The script should only log into AAD and give results for AAD. However, all products are logged into, and results are given for each.

Add any screenshots of the problem here.

Below is the output from the results of selecting just aad to be tested.

Baseline Conformance Reports   Details
Azure Active Directory         9 tests passed, 3 warnings, 10 tests failed, 8 manual checks needed
Microsoft 365 Defender         67 tests passed, 7 warnings, 3 tests failed, 6 manual checks needed
Exchange Online                10 tests passed, 4 tests failed, 23 manual checks needed
OneDrive for Business          4 tests passed, 1 warning, 1 test failed, 2 manual checks needed
SharePoint Online              4 tests passed, 1 test failed, 1 manual check needed
Microsoft Teams                16 tests passed, 9 manual checks needed

Issues Checking Multiple Defender DLP Policies

🐛 Summary

A tenant with two DLP policies was tested: one for Teams and one for devices. The Teams policy checks for PII, credit cards, and UK passports; the device policy only checks for credit cards. The output of the script states that the requirement for PII is met, but not all policies are checking for PII. The script needs to be modified to iterate through the policies and provide the correct output for policies that do not meet the requirement.
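A hedged Rego sketch of the iteration the issue asks for: collect the policies that do not check for PII, rather than passing as soon as any one policy does. All field names and the sensitive-information-type label are assumptions about the provider export, not the actual ScubaGear schema:

```rego
package defender

# Policies that do NOT include the assumed PII sensitive-information type
PoliciesMissingPII[Policy.Name] {
    Policy := input.dlp_compliance_rules[_]   # assumed export key
    Types := {T.name | T := Policy.ContentContainsSensitiveInformation[_]}
    not Types["U.S. Social Security Number (SSN)"]  # illustrative label
}

# The requirement is met only when no policy is missing the check;
# PoliciesMissingPII doubles as the list to surface in the report
RequirementMet {
    count(PoliciesMissingPII) == 0
}
```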

Add CONTRIBUTING and RELEASE documentation

As pointed out in the public repo, we don't have a CONTRIBUTING.md file yet which will contain instructions for users that want to contribute to the project. Review similar coding efforts at CISA to re-use their language. Have the contributing file reference other documentation, such as the RELEASE file.

OneDrive 2.7 report output shows "Currently cannot be checked automatically" but there is a setting in the provider JSON that can be checked

Problem

The OneDrive 2.7 "Legacy Authentication SHALL Be Blocked" report output says that the setting cannot be checked automatically but I found a setting named LegacyAuthProtocolsEnabled that is in the existing provider export JSON.

Fix

Modify the Rego policy to check for the LegacyAuthProtocolsEnabled setting. Follow the instructions in the baseline document to configure the setting and test to ensure that this setting that I found is the right one.
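A hedged sketch of the Rego change, assuming the setting appears in the provider export under an SPO tenant key (the key name and output shape are assumptions):

```rego
package onedrive

# OneDrive 2.7 "Legacy Authentication SHALL Be Blocked":
# pass when LegacyAuthProtocolsEnabled is false in the SPO tenant export
tests[{
    "Requirement": "Legacy Authentication SHALL Be Blocked",
    "Control": "OneDrive 2.7",
    "ReqMet": Status
}] {
    Status := input.SPO_tenant.LegacyAuthProtocolsEnabled == false
}
```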

Agency 2 Pilot: Defender 2.2 potential false "PASS" results

Agency 2 noted that their DLP is managed outside Defender through another system; however, they noted that they have policies enabled for credit cards, TIN, SSN, and PII for monitoring purposes. When they ran the assessment script, the results for the sensitive information came up as "PASS", but the policies are not configured to block, even though the baseline states "the DLP policy SHOULD be set to block sharing sensitive information".

Cannot connect to multiple tenants

🐛 Summary

We ran this on our test tenant, and then against a dev environment.
The report from the dev environment lists users and settings from the test tenant.
This happens even after a reboot, deleting the scuba folder, and running again from a new location.

To reproduce

Steps to reproduce the behavior:

  1. Run Scuba against tenant 1
  2. Run Scuba against tenant 2
  3. Review reports and you will see details from tenant 1 in tenant 2.

Expected behavior

Would have expected a sign-in to occur for each tenant (which did seem to occur), with the first run reporting data from tenant 1 only, tenant 2 then being prompted to set up the PowerShell app, and the second run reporting data from tenant 2 only.
See related issues about retrieving the initial domain.

Any helpful log output or screenshots


Teams Provider Error after leaving the PowerShell terminal running

[Screenshot: long-running Teams error]

Unable to repeat this error consistently; it only seems to happen on Test Tenant 1 (G5), as it doesn't occur on the Test Tenant 3 (E5) I'm testing on. Documenting this odd error here.

As far as I can tell, it only happens when I leave my PowerShell terminal connected to Test Tenant 1 (G5) for some period of time. Then I'll rerun ScubaGear with teams in the list of $ProductNames, and this error, which only affects the Get-CsTeamsMeetingPolicy cmdlet, will pop up from the Teams provider.

Running the disconnect cmdlet (e.g., Disconnect-MicrosoftTeams) and closing and opening new PowerShell terminals don't seem to make this error go away. Instead, the error will vanish by itself if I just go do something else for an hour.

Unit Testing Passing Parameters

When using Pester, there is an issue with passing command-line parameters.
This will need to be resolved so that a user won't have to edit the tests themselves.
Currently trying to find a workaround.

Investigate how to consistently alert the user when they do not have the required permissions

I ran the Coral release code for each of the products independently with a test user that does not have any Azure AD roles.

Although each of the products does alert the user that permissions are missing, the output message is very different for each product. Also, in some cases (EXO/Defender/Teams) the tool produces a report, whereas for other products (AAD, OneDrive, SharePoint) no report is produced.

The scope of this issue would be to investigate if there is a way to consistently display a message to the user alerting them about the missing permissions and also ensure that the tool handles the scenario in the same way. This is probably not a high priority issue, but more like a code standardization / cleanup task.
I have screenshots below showing how each product handles not having the required user role.

[Screenshots: one per product, showing how each handles a missing user role]

Azure Government and GCC High Support

💡 Summary

Added some logic and a new parameter to support targeting Azure Government and GCC High cloud environments. Added examples and details to the readme.

Motivation and context

We work almost exclusively with clients in Azure Government and GCC High. Targeting the various services with different connection strings requires some tedious checking. Beyond the connection strings, there are various differences between commercial Azure/M365 and GCC High.

You can change a parameter to target your cloud environment, or further expand to the other Azure clouds (i.e., DoD).

Implementation notes

The forked repository could be pulled into this repo. However, many files were edited to ensure the cloud environment identity is carried through the modules, so a review needs to take place.

https://github.com/sentinelblue/ScubaGear

Acceptance criteria

How do we know when this work is done?

  • Validated against a GCC High tenant. I validated against our tenant in GCC High/Azure Gov.
  • How do we handle errors caused by configuration differences between GCC High and Commercial? The Teams provider throws some errors due to missing properties on the 'Get-CsTenant' cmdlet. It appears these values aren't populated in my tenant, or aren't populated in GCC High generally.

Adjust exported cmdlet names to be ScubaGear specific

    I'm wondering if, in the future, we rename all exported commands to include the name "Scuba" in them, so as to distinguish them from any other cmdlets on the system. This would be similar to the way that MS Graph has the "Mg" prefix (Get-MgUser) in its cmdlet names. I don't think this is critical for Coral.

Originally posted by @tkol2022
This is probably most important for export cmdlets.

e.g. Disconnect-Tenant -> Disconnect-SCuBATenant
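One low-risk way to do such a rename, sketched here on the assumption that the function lives in a module .psm1 (the function body and export line are illustrative, not the actual module code): keep the old name as an alias so existing scripts don't break.

```powershell
# Hypothetical sketch: rename the exported function, keep the old
# name as an alias for backwards compatibility
function Disconnect-SCuBATenant {
    # ... existing Disconnect-Tenant implementation ...
}
Set-Alias -Name Disconnect-Tenant -Value Disconnect-SCuBATenant
Export-ModuleMember -Function Disconnect-SCuBATenant -Alias Disconnect-Tenant
```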

Fix EXO deprecated alert policies in MS.EXO.16.1

[Screenshot: Defender 2.9]

Defender 2.9 was showing as a fail in the report and highlighted 2 policies.

  • "Malware campaign detected after delivery"
  • "Unusual increase in email reported as phish"

Both of these prebuilt alert policies have disappeared from the Alert Policy list and thus from current Provider exports.
I looked back at an older provider JSON and found that the policies were still there a little over a month ago.

[Screenshot: October provider JSON]

The names of these policies are listed in EXO 2.16, so this will require both a baseline policy update and a Rego code change.

Improve handling of complex mail flow and DKIM/DMARC/SPF

I was investigating the DKIM-related issues with Agency 1, which seem to be due mostly to their more complex mail flow configuration.

Agency 1 mail flow is more complex than just office365 to the internet. They route their mail back through what looks like an on-prem Forcepoint email gateway, which is what actually signs their mail as seen by recipient. This is based on me looking at some inbound email from people at Agency 1. This uses selector agency1seg2048a._domainkey.agency1.gov. and it seems to work as expected.

However, EXO is also configured with DKIM signing of email being sent to the Forcepoint gateway. They have multiple domains in EXO, some of which have DKIM "disabled" and others "enabled". The actual mail samples I saw seem to use DKIM signing with a selector associated with the domain "agency1365.onmicrosoft.com", which is marked as enabled in the config. This is apparently what Microsoft does when the custom domain DKIM policy is not enabled (as in this example). This mail is purely internal to Agency 1 infrastructure, between O365 and their on-prem email gateway (which also appears to validate DKIM).

We probably at least need to add some documentation indicating this behavior for situations with more complicated mail flow. In theory, we could detect mail routing configs that send outbound email not directly to the destination and use that to influence our messages.

PowerPlatform policy 2.1 is missing a check in the Rego

Problem

PowerPlatform policy 2.1 contains two separate implementation configurations in the baseline instructions, steps 4 and 5. The Rego currently only checks for step 4, so configuration step 5 is missing.


The Fix

Another Rego check for policy 2.1 needs to be added to the code. The Rego should check that the disableTrialEnvironmentCreationByNonAdminUsers field in the provider export is set to true.
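A hedged sketch of the additional check (the export key path and the rule's output shape are assumptions about the provider schema):

```rego
package powerplatform

# Step 5 of the 2.1 baseline instructions: non-admins must not be able
# to create trial environments
tests[{
    "Requirement": "Non-admin trial environment creation disabled",
    "Control": "Power Platform 2.1",
    "ReqMet": Status
}] {
    Status := input.environment_creation.disableTrialEnvironmentCreationByNonAdminUsers == true
}
```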

Add GCCHigh Endpoint

I modified the code to connect to my GCC High tenants, but it would be nice to have a more elegant built-in endpoint option.

Sample Report Output

Is it possible to provide a sample output of the report generated by this tool?

Various sections of report incorrectly list failure

Initially ran a baseline and corrected the ones I saw fit. Tenant is commercial. Followed CISA documents, such as this. Re-ran and got these results:

Defender 2.7

All but the last entry list failure.

Defender 2.8

Malware in attachments is set to block.

Defender 2.9

Not seeing either "disabled" report: Malware campaign detected after delivery, Unusual increase in email reported as phish. User is E5 licensed.

SharePoint 2.5

Setting was already configured to block.

User interaction required for evaluating SPO when using MFA

🐛 Summary

I'm trying to produce a report using a Docker container as part of a headless workflow. As part of the SharePoint provider, Get-SPOTenant is used, which requires authenticating with Connect-SPOService first. According to the documentation, that requires user interaction (or a global administrator username/password with MFA turned off).

The Graph API can use certificates to authenticate without user interaction, but as far as I can tell, the settings for the SharePoint and OneDrive baselines aren't exposed via the Graph API.

To reproduce

Steps to reproduce the behavior:

  1. Install dependencies
  2. Connect-MgGraph -ClientID <client id> -TenantId <tenant id> -CertificateThumbprint <thumbprint>
  3. $InitDom = (Get-MgOrganization).VerifiedDomains | Where-Object {$_.isInitial}
  4. $InitDomPrefix = $InitDom.Name.split(".")[0]
  5. Connect-SPOService -Url "https://$($InitDomPrefix)-admin.sharepoint.com"

Expected behavior

To be able to evaluate baselines in a headless workflow.

Any helpful log output or screenshots

PS C:\> Connect-SPOService -Url "https://$($InitDomPrefix)-admin.sharepoint.com"
Connect-SPOService : No valid OAuth 2.0 authentication session exists
At line:1 char:1
+ Connect-SPOService -Url "https://$($InitDomPrefix)-admin.sharepoint.c ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Connect-SPOService], AuthenticationException
    + FullyQualifiedErrorId : Microsoft.Online.SharePoint.PowerShell.AuthenticationException,Microsoft.Online.SharePoint.PowerShell.ConnectSPOService

Explore how to improve the execution performance of the Azure AD provider

The Azure AD export provider is by far the longest-running of all the products and can take over a minute.
I ran a profiler tool and determined that most of the time in this provider is spent executing the functions Get-PrivilegedUser and Get-PrivilegedRole. Both of those functions have loops, and inside the loops, calls to back-end M365 services are made.

This issue is to explore if there's a way we can make significant improvements to the execution time by exploring alternative provider export algorithms. The ideal outcome of this issue would be a code prototype.
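As a starting point for the prototype, one common pattern is to hoist the per-user Graph calls out of the loop: fetch the candidate users in a single filtered request and index them locally. This is a sketch, not the actual provider code; Microsoft Graph does support `id in (...)` filters on users, but batch-size limits apply, so large ID lists would need chunking.

```powershell
# Sketch: replace N sequential Get-MgUser calls with one filtered query
# ($PrivilegedUsers is assumed to hold objects with an Id property)
$Ids = $PrivilegedUsers | ForEach-Object { "'$($_.Id)'" }
$Users = Get-MgUser -Filter "id in ($($Ids -join ','))" -All

# Index by Id for O(1) lookups inside the existing loop
$UsersById = @{}
foreach ($U in $Users) { $UsersById[$U.Id] = $U }
```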

Agency 2 Pilot: OneDrive; 2.2 and 2.3 error if anyone links are disabled (Policy 2.1)

OneDrive for Business policy 2.1 is "Anyone Links SHOULD Be Turned Off". If this is turned off, the current Rego will mark 2.2 and 2.3 as "FAIL" ("Expiration Date SHOULD Be Set for Anyone Links" and "Link Permissions SHOULD Be Set to Enabled Anyone Links to View"), since the admin cannot set an expiration date or permissions to "view".

Suggest adding a "PASS" scenario to the Rego: if Anyone links are disabled, 2.2 and 2.3 will also show up as "PASS".
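A hedged sketch of that "PASS" scenario; the sharing-capability field name and enum value are assumptions about the SPO provider export:

```rego
package onedrive

# Policy 2.1: Anyone links are off when the tenant's OneDrive sharing
# capability is anything other than "Anyone" (assumed enum value 2,
# ExternalUserAndGuestSharing)
AnyoneLinksDisabled {
    input.SPO_tenant.OneDriveSharingCapability != 2
}

# Policies 2.2/2.3 are vacuously met when Anyone links are disabled,
# since expiration and view-only permissions cannot be configured then
ExpirationRequirementMet { AnyoneLinksDisabled }
LinkPermissionRequirementMet { AnyoneLinksDisabled }
```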

Keep track of the latest Microsoft Teams and Microsoft Graph PowerShell module versions

Making this issue to track the MicrosoftTeams PowerShell module 5.0 release.
SetUp.ps1 installs the latest module versions by default.
For ScubaGear, we are enforcing a maximum version of 4.9999 for MicrosoftTeams. The currently released version is 4.9.1.
Based on the Teams module release pattern, 5.0 should be due for release in the next few months.

When the Teams module updates to 5.0, this issue covers testing and raising the maximum and minimum allowable versions (Teams 4.9.1 added application-based auth support) in RequiredVersions.ps1.
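The version bound itself can be expressed with standard PowerShellGet parameters; this sketch mirrors the constraint described above (the variable names are illustrative, and the actual RequiredVersions.ps1 layout may differ):

```powershell
# Sketch: pin MicrosoftTeams below 5.0 until 5.0 has been tested
$TeamsMinimum = '4.9.1'   # first version with application-based auth
$TeamsMaximum = '4.9999'  # cap until 5.0 is validated
Install-Module -Name MicrosoftTeams `
    -MinimumVersion $TeamsMinimum `
    -MaximumVersion $TeamsMaximum `
    -Scope CurrentUser
```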

Getting an unable to parse input: yaml error

🐛 Summary

When I run the script using all products, I get a message saying "unable to parse input: yaml: line 184: found unknown escape character", followed by a bunch of "object not found" errors from Orchestrator.psm1. If I run the products individually in the RunScuba script, the issue does not occur.

Basically, if I run this, I get the error:
$ProductNames = @("aad", "defender", "exo", "onedrive", "sharepoint", "teams") # The specific products that you want the tool to assess.

If I run the script with this (tested with each product), it runs fine:
$ProductNames =@("teams")

Any help would be greatly appreciated.

[Screenshot: 2022-11-11_10-25-43]

Unexpected exception returned from msal

🐛 Summary

When attempting to scan AAD, multiple AAD prompts occur even though the account being used is a Global Reader or even Global Admin and the enterprise application has the appropriate consent granted for the organization. This occurs during the "Running the AAD Provider; 1 of 1 Product settings extracted" step. If you respond to the constant authentication prompts about 20 times, one of two things will occur.

  1. Powershell will eventually return an error saying "Unexpected exception returned from msal".
  2. MS logon will deny logging in with an error that says: "We couldn't sign you in, please try again later". Selecting the option for "use another account" and supplying the same credentials will result in the error above from item #1.

To reproduce

Steps to reproduce the behavior:

  1. RunSCUBA.ps1 with logon=True and products including AAD.

Expected behavior

Should complete the AAD check

Any helpful log output or screenshots

ERROR when getting the MS "we couldn't sign you in..."

Export-AADProvider : Check the second error message below and if it appears to be related to permissions, your user
account must have a minimum of Global Reader role to run this script. You must also get an administrator to consent to the required MS Graph Powershell application permissions. View the README file for detailed instructions and then try again. At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Orchestrator.psm1:154 char:31 + $RetVal = Export-AADProvider | Select-Object -Las ... + ~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Export-AADProvider

Get-MgRoleManagementDirectoryRoleAssignmentScheduleInstance : Code: generalException
Message: Unexpected exception returned from MSAL.
At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Providers\ExportAADProvider.psm1:221 char:34
+ ... gnments = @(Get-MgRoleManagementDirectoryRoleAssignmentScheduleInstan ...
+ CategoryInfo : NotSpecified: (:) [Get-MgRoleManag...leInstance_List], AuthenticationException
+ FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgRoleManagementDirectoryRoleAssignmentScheduleInstance_List

ERROR when just clicking on the authentication account about 20 times.
PS C:\temp2\ScubaGear-main> .\RunSCuBA.ps1
Export-AADProvider : Check the second error message below and if it appears to be related to permissions, your user account must have a minimum of the Global Reader role to run this script. You must also get an administrator to consent to the required MS Graph PowerShell application permissions. View the README file for detailed instructions and then try again.
At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Orchestrator.psm1:154 char:31
+ $RetVal = Export-AADProvider | Select-Object -Las ...
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Export-AADProvider

Get-MgUser : Code: generalException
Message: Unexpected exception returned from MSAL.
At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Providers\ExportAADProvider.psm1:120 char:17
+ ... $AADUser = Get-MgUser -ErrorAction Stop -UserId $User.Id
+ CategoryInfo : NotSpecified: (:) [Get-MgUser_Get], AuthenticationException
+ FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgUser_Get

Add any screenshots of the problem here.

Add retry logic to DNS that attempts to retry against a public resolver. Ensure that report indicates level of confidence in the correctness of the result.

Agency 2's assessment showed a "FAIL", but when looking into the domains flagged by the assessment script, many were sub-domains of ones configured in the DNS records hosted by the agency's domain. Agency 2 also mentioned that they noticed two domains that should be approved.

Ethan did check some domains; one provided an SPF record and one provided a "null" output. Ethan also investigated, and it could be some weird DNS lookup failures on our end. Maybe we just need to do more testing against tenants with many domains (Agency 2 has around 60).
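A sketch of the retry-with-fallback logic, assuming the Windows DnsClient module's Resolve-DnsName cmdlet is available; the function name, resolver list, and confidence labels are illustrative, not the actual ScubaGear code:

```powershell
function Get-TxtRecordWithFallback {
    param([string]$Domain)
    # Try the system resolver first ($null), then public resolvers
    $Resolvers = @($null, '8.8.8.8', '1.1.1.1')
    foreach ($Server in $Resolvers) {
        try {
            $Params = @{ Name = $Domain; Type = 'TXT'; ErrorAction = 'Stop' }
            if ($Server) { $Params['Server'] = $Server }
            $Answer = Resolve-DnsName @Params
            return [PSCustomObject]@{
                Records    = $Answer.Strings
                # Lower confidence when only a public resolver answered,
                # so the report can flag the result accordingly
                Confidence = if ($Server) { 'Medium' } else { 'High' }
            }
        } catch { continue }  # lookup failure: try the next resolver
    }
    # All resolvers failed: surface as "unable to determine", not "FAIL"
    [PSCustomObject]@{ Records = $null; Confidence = 'None' }
}
```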

Getting Started

🐛 Summary

The instructions tell us to run the setup.ps1 file, but nothing about what we need to do before this step. Are we supposed to clone the repo, download some files, or something else? A little bit of extra help for newbies would be helpful.


ProviderSettingsExport.json is broken JSON

🐛 Summary

ProviderSettingsExport.json is broken JSON, resulting in the following reporting error:

ConvertFrom-Json : Invalid JSON primitive: .
At C:\Users\Sankgreall\Documents\AzureDevOps\ScubaGear\PowerShell\ScubaGear\Modules\CreateReport\CreateReport.psm1:39
char:44
+ $SettingsExport =  Get-Content $FileName | ConvertFrom-Json
+                                            ~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [ConvertFrom-Json], ArgumentException
    + FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.ConvertFromJsonCommand

Upon investigating ProviderSettingsExport.json and sticking it in a JSON validator, it flagged the following section:

            "conditional_access_policies": ,
    "authorization_policies": {
    "AllowEmailVerifiedUsersToJoinOrganization":  false,
    "AllowInvitesFrom":  "everyone",
    "AllowUserConsentForRiskyApps":  null,
    "AllowedToSignUpEmailBasedSubscriptions":  true,
    "AllowedToUseSspr":  true,
    "BlockMsolPowerShell":  false,
    "DefaultUserRoleOverrides":  null,
    "DefaultUserRolePermissions":  {
                                       "AllowedToCreateApps":  true,
                                       "AllowedToCreateSecurityGroups":  true,
                                       "AllowedToReadBitlockerKeysForOwnedDevice":  true,
                                       "AllowedToReadOtherUsers":  true
                                   },
    "DeletedDateTime":  null,
    "Description":  "Used to manage authorization related settings across the company.",
    "DisplayName":  "Authorization Policy",
    "EnabledPreviewFeatures":  [

                               ],
    "GuestUserRoleId":  "10dae51f-b6af-4016-8d66-8c2a99b929b3",
    "Id":  "authorizationPolicy",
    "PermissionGrantPolicyIdsAssignedToDefaultUserRole":  [
                                                              "ManagePermissionGrantsForSelf.microsoft-user-default-legacy"
                                                          ],
    "AdditionalProperties":  {

                             }
},

I'm assuming this is either supposed to be "conditional_access_policies": "[some value]" or "conditional_access_policies": {... nested dict}

To reproduce

Steps to reproduce the behavior:

  • I've been conducting my testing on a newly provisioned O365 tenant under the MS Developer Sandbox scheme. This is on E3 licensing and configured with a baseline of features.

  • Running the script against this tenant led to this issue. It should be possible to reproduce, as all sandbox tenants are theoretically the same.

Expected behavior

The report should be populated, but the tool is unable to parse the JSON.

Current Rego does not check for empty values which can result in False Positives

I noticed this in Teams, but this oversight affects EXO and most likely multiple other products as well.
This ties into the provider error-handling issues: before error handling was implemented, an error would be thrown and an incorrect report would be displayed.
However, after adding error handling, we discovered a critical oversight in the current Rego, so this can be considered an extension of the error-handling issues.

Below is a screenshot of the provider JSON after adding error handling and running the provider with an account that has insufficient permissions. This is expected behavior.

[Screenshot: NullCheck]

However, below is a screenshot of the report. Notice that everything is passing, and some policies are missing; Teams has 16 testable policies.
[Screenshot: Report]

Below is the Rego logic for Teams 2.3, where we check whether any policies have a property that is listed as a SHOULD NOT be allowed.
[Screenshot: Teams 2.3 Rego]

Since no policies are returned, no policies meet the fail condition, so the check passes and nothing is output to the report.
[Screenshot: Teams 2.3 result]

The Rego logic for Teams 2.2 likewise passes, since no policies meet the fail condition.
[Screenshot: Teams 2.2]

This needs to be addressed fully in the Rego, which we expect will take a large amount of rework unless a quicker fix can be found.
The temporary solution for the Coral release is to catch the false positives in CreateReport.psm1 (see picture below) and display the error.
However, TestResults.json will still contain the false positives, which is a critical bug that will need to be addressed immediately in the near future.

[Screenshot: TempFixPowerShellError]
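For reference, the shape of the missing guard is roughly this (rule, key, and property names are illustrative, not the actual ScubaGear Rego): instead of passing whenever no policy matches the fail condition, the Rego must first confirm that policy data was actually returned.

```rego
package teams

# Guard: the provider actually returned policy data
PoliciesReturned {
    count(input.meeting_policies) > 0
}

# A fail condition over the policies (illustrative property name)
FailingPolicies[Policy.Identity] {
    Policy := input.meeting_policies[_]
    Policy.AllowAnonymousUsersToStartMeeting == true
}

# Pass only when data exists AND nothing fails; an empty export can
# no longer produce a silent pass
RequirementMet {
    PoliciesReturned
    count(FailingPolicies) == 0
}
```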

Report is creating errors when run with non ASCII characters

Hello,

When I run the runscuba.ps1 script, the report is empty, and I get an error saying InputObject has a null value.

See this extract from the PowerShell output (the same ConvertTo-Csv error appears twice):

unable to parse input: yaml: invalid trailing UTF-8 octet
ConvertTo-Csv : Cannot bind argument to parameter 'InputObject' because it is null.
At C:\scubaO365\ScubaGear-main\PowerShell\ScubaGear\Modules\Orchestrator.psm1:272 char:46
+ ... $TestResultsCsv = $TestResults | ConvertTo-Csv -NoTypeInformation
+ CategoryInfo : InvalidData : (:) [ConvertTo-Csv], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.ConvertToCsvCommand

Do you have a solution?

Thanks

Agency 2 Pilot: OneDrive 2.4, 2.5, and 2.6; Agency Defined Domains

Agency 2 has enabled restrictions for OneDrive for Business using a list of domains but showed "FAIL" for 2.4 and 2.6 and "PASS" for 2.5.

There are already ongoing discussions on possible consolidation of these policies and Microsoft has provided test scripts to run to verify findings. Testing is expected to occur post Coral.

From Ram:

  1. Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A"
    What is the experience of the Mac sync client when trying to sync OneDrive for Business?
    A. Mac joined to the 786548DD-877B-4760-A749-6B1EFBC1190A domain
    B. Mac joined to a domain different from "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A"
    C. Mac not joined to any domain

  2. Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A" -BlockMacSync:$true
    Same Mac sync client questions A, B, and C as in item 1.

  3. Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A" -BlockMacSync:$false
    Same Mac sync client questions A, B, and C as in item 1.

  4. Set-SPOTenantSyncClientRestriction -BlockMacSync:$false (with no domain specified in either the enable or disable list)
    What is the experience of the Mac sync client when trying to sync OneDrive for Business?
    A. Mac joined to any domain
    B. Mac not joined to any domain

  5. If you run two separate commands:
    Set-SPOTenantSyncClientRestriction -BlockMacSync:$true
    Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A" -BlockMacSync:$false
    Same Mac sync client questions A, B, and C as in item 1.

Support Assignment Exclusions for Conditional Access Policies

Updated AAD policy and unit tests to define policy for users, groups, and roles, and to test for any exclusions to these.
Policy updates were made to 2.1, 2.2, 2.3, 2.4, 2.9, 2.10, and 2.13.

Policy 2.17 involves hybrid joined and compliant devices and is unchanged. This is the only "should" vs "shall" requirement, and we are not currently applying these policies in the tenant configurations.

Additional Defender products

💡 Summary

Please add baselines and assessments for the other products in the M365 Defender suite, specifically Defender for Endpoint, Defender for Identity and Defender for Cloud Apps.

Motivation and context

Only providing baselines for Defender for Office 365 is not sufficient; there are many other attack paths that don't involve email.

This would be useful because organizations need help protecting themselves from all of these attack paths.

Implementation notes

It would be similar to the documents and tools provided for Defender for O365

Acceptance criteria

When the baselines and assessments for the rest of the M365 Defender suite have been published.

Azure AD Baseline Feedback


Overview

From an analysis of the Azure AD baseline that is part of SCuBA, I would like to provide feedback from the perspective of someone who has worked in a consultative role in the space of Azure AD with a variety of organizations.

It's understandable that the government provide guidance for agencies when implementing M365 and subsequently Azure AD tenants that are providing the identity layer to those services.

There are areas that are of somewhat concern, with public consumption of the guidance and the potential for adoption of such guidance without proper understanding, in particular in the commercial space. While it's understandable that the primary directive of this guidance is not to strengthen security of commercial organizations, it's likely that those organizations may attempt to consume this guidance as written to the letter. Considering how easy it is to run the ScubaGear tooling, one can see organizations easily adopting this tooling for baselining their environments, yet not necessarily understanding the ramifications behind adopting certain guidelines, especially those that tend to be more security-heavy or do not align with Microsoft recommendations. Along similar lines, there are other security recommendations for properly securing an Azure AD tenant that are not covered within here, and it may be helpful to note that this is not a comprehensive security baseline for all aspects of Azure AD.

This submission relates to my recent blog post analyzing the results, but I want to speak a bit further to some of them here. You can find that post at https://ericonidentity.com/2022/10/26/cisa-scuba-diving-into-the-azure-ad-baseline.

Perhaps it's simply a matter of CISA acknowledging within the guidance that, if the public decides to consume it, it may not cover all aspects of identity security in Azure AD and that the recommendations may not align with any particular business model.

General Areas of Concern

SHALL v SHOULD

It could be presumed that SHALL and SHOULD align to the NIST definitions, but this is not stated within the guidance itself. While these terms may be clear in the federal space, including the definitions in the guidance would help readers outside of it, especially given the open-source audience.

2.2 and 2.3 - Implementation Gap

For AAD tenants that have Azure AD Identity Protection available (AAD P2 or M365 E/G5), it has often been misunderstood whether Conditional Access or AAD Identity Protection is the right place to implement risk-based policies. It's understandable to design the implementation around CA, considering that only one user-risk policy and one sign-in-risk policy can be defined within AAD Identity Protection.

There is one gap, though, when these policies are not defined within Identity Protection. Because CA policies are only ever evaluated in the resource tenant, if the user's account is at risk or the sign-in is risky, the CA policy defined in the home tenant will not be evaluated when the user authenticates to the resource tenant. This is a primary driver for keeping baseline Identity Protection policies enabled for the majority of the user base and then building from there with CA policies.

2.9 and 2.10 - Usability Issues

It can be presumed that the recommendations here are to align to NIST 800-63B 4.2.3 or 4.3.3 for reauthentication and 7.3 for secret retention/session persistence.

It's understandable that CISA guidance would align to NIST, but from a usability perspective it is highly likely to become problematic, especially on mobile devices. Again, without insight into the user personas within federal agencies, this may or may not be an issue. But this area in particular is one where consumers outside the federal space who want to apply CISA guidance should be warned that usability is at risk.

A 12-hour time window will generally cause little interference for a "normal desk job" worker, who likely locks their Windows device when away (or the device locks itself on idle), as unlocking will trigger reauthentication. Therefore, such a policy could be applied without interfering with devices under the organization's control.

Mobile devices, though, can suffer under these settings, as they can interfere with client applications. A common example is session expiration in the Outlook application, where users do not realize that they have stopped receiving email because the client does not alert them that reauthentication is necessary.

2.14 - Lack of Break-glass accounts

The general recommendation on break-glass accounts is missing. Perhaps CISA and the federal space have accounted for the inherent risk, but organizations that accidentally lock themselves out of their tenant have to go through a lengthy process to regain access. Hence the need for break-glass accounts.

While there may also be the possibility that the federal space has different mechanisms for tenant recovery, commercial organizations that do not account for break-glass, regardless of their size, will spend days locked out of their tenant from a management perspective, which depending on the severity of what was implemented, could be highly impactful to the entire organization.

Likewise, in the case of issues with the Azure MFA services or Azure AD PIM role assignment, Microsoft's recommendation is that at least one break-glass account be exempt from MFA and permanently assigned Global Administrator. Again, CISA guidance may align to federal requirements and recommendations, but this advice may not necessarily translate well beyond those walls.
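As a hedged illustration of auditing the break-glass setup described above, the following read-only PowerShell sketch lists Conditional Access policies that do not exclude a designated break-glass account. `Connect-MgGraph` and `Get-MgIdentityConditionalAccessPolicy` are real Microsoft Graph PowerShell cmdlets; the object id is a placeholder, and the property path should be verified against the installed SDK version:

```powershell
# Read-only audit: find CA policies that do NOT exclude the break-glass account.
# Requires the Microsoft Graph PowerShell SDK and a signed-in admin.
Connect-MgGraph -Scopes 'Policy.Read.All'

$BreakGlassId = '00000000-0000-0000-0000-000000000000'  # placeholder object id

Get-MgIdentityConditionalAccessPolicy |
    Where-Object { $_.Conditions.Users.ExcludeUsers -notcontains $BreakGlassId } |
    Select-Object DisplayName, State
```

A report like this could be run after any CA policy change to catch accidental lockout paths early.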

2.14 and 2.15 - Implementation Gap

Out of the box, all role assignments in Azure AD PIM provide an 8-hour role elevation lifetime. This is perhaps covered in other federal recommendations outside this guidance, but for roles such as Global Administrator, Microsoft recommends that activation be limited to one hour. General identity security guidance aligns in that a highly privileged role should be limited to the minimum viable window; it's unlikely a Global Administrator requires activated privileges for an entire 8 hours. Alternatively, the recommendation should indicate that roles be deactivated through Azure AD PIM once the work requiring the role is complete.
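A hedged sketch of auditing the activation window via the Microsoft Graph roleManagementPolicies API follows; `Expiration_EndUser_Assignment` is the documented rule id that controls maximum activation duration, while the 1-hour threshold mirrors the Microsoft recommendation above and the warning format is illustrative:

```powershell
# Illustrative: flag PIM role policies whose activation window exceeds one hour.
Connect-MgGraph -Scopes 'RoleManagementPolicy.Read.Directory'

$Uri = "/v1.0/policies/roleManagementPolicies?" +
       "`$filter=scopeId eq '/' and scopeType eq 'DirectoryRole'&`$expand=rules"
$Policies = (Invoke-MgGraphRequest -Method GET -Uri $Uri).value

foreach ($Policy in $Policies) {
    $Rule = $Policy.rules | Where-Object { $_.id -eq 'Expiration_EndUser_Assignment' }
    if ($Rule) {
        # maximumDuration is an ISO 8601 duration string, e.g. 'PT8H'
        $Max = [System.Xml.XmlConvert]::ToTimeSpan($Rule.maximumDuration)
        if ($Max -gt [TimeSpan]::FromHours(1)) {
            Write-Warning "$($Policy.displayName): activation window is $Max"
        }
    }
}
```

Parsing the duration with `XmlConvert::ToTimeSpan` avoids lexical comparison bugs (e.g. `'PT12H'` sorting before `'PT1H'` as a string).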

Exchange External Sender Policy: Multiple Rules not iterated through

For Exchange 2.7, the script marks this check as not implemented if multiple external sender policies are present and the first policy is disabled, even when the follow-on rules are appropriately configured. Suggest iterating through all of the policies and checking the configuration of each to determine the outcome.
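A minimal PowerShell sketch of the suggested iteration, assuming the external sender check is based on Exchange Online transport rules (`Get-TransportRule` and the property names below are real Exchange Online Management output fields, but the exact criteria ScubaGear uses should be matched in the actual fix):

```powershell
# Illustrative: evaluate every external-sender rule, not just the first one.
$Rules = Get-TransportRule | Where-Object {
    $_.FromScope -eq 'NotInOrganization' -and
    ($_.PrependSubject -or $_.ApplyHtmlDisclaimerText)
}

$Compliant = $false
foreach ($Rule in $Rules) {
    # A single enabled, correctly configured rule should satisfy the check,
    # even when an earlier (e.g. first-listed) rule is disabled.
    if ($Rule.State -eq 'Enabled') {
        $Compliant = $true
        break
    }
}
```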

Improve Cached Tenant Settings cmdlet support

These are my comments on Run-Cached:

  • I tested it and no errors. It only works if there is a ProviderSettingsExport.json in the current directory. See my last bullet.
  • Rename it to make it unique just like we did for Disconnect-Tenant. FYI - Cassey is referencing Run-Cached in her regression test script. (This may be OBE as regression test script likely replaced by new testing framework.)
  • When I execute Run-Cached without any parameters, it simply acts like Run-Scuba. I think we should change the default behavior to distinguish it. Perhaps by default it should set -ExportProvider to $false?
  • Since we are exporting Run-Cached, do we need to add any instructions to the README? Or do we simply let developers figure this script out by examining the code on their own?
  • I think the script should output a friendly message when there is no ProviderSettingsExport.json file in the current directory or in OutPath. Right now the user gets a bunch of generic errors.
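A possible guard for the missing-file case is sketched below; `$OutPath` is assumed to be the tool's existing output-path parameter, and the message wording is illustrative:

```powershell
# Fail fast with a friendly message instead of cascading generic errors.
$SettingsFile = Join-Path -Path $OutPath -ChildPath 'ProviderSettingsExport.json'
if (-not (Test-Path -Path $SettingsFile)) {
    Write-Error ("No ProviderSettingsExport.json found in '$OutPath'. " +
        "Run the full tool once to generate the export, or point -OutPath at an existing one.")
    return
}
$SettingsExport = Get-Content -Path $SettingsFile -Raw | ConvertFrom-Json
```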

_Originally posted by @tkol2022

Should also be renamed from Invoke-RunCached to Invoke-SCuBACached

Device Security Baseline for Cross-Tenant Access

💡 Summary

Section 2.17 of the Microsoft Azure Active Directory Baseline states that the device claim (compliance status of the device) from the home tenant should be trusted for guests in the resource tenant. When conditional access policies are evaluated during authentication, they only render a binary answer of whether a device is compliant or not. While this type of policy provides a rudimentary filter for blocking access to non-compliant devices, it does not guarantee that a guest user's device complies with the resource tenant's security requirements, and it fundamentally goes against the zero trust (ZT) concept of not allowing implicit trust across boundaries.

Motivation and context

Let's consider a simple scenario in which a resource tenant and a home tenant have differing security requirements for HAADJ/AADJ devices. If the resource tenant requires BitLocker to be enabled for a device to be marked as compliant, but the home tenant does not, then a user from the home tenant (with whom a resource has been shared) can access that resource as a guest user within the resource tenant without meeting the resource tenant's BitLocker requirements. This carries a significant security risk if no other DLP policies are applied to the resources. This raises the question: what is the real value of section 2.17 in the context of guest users as it stands today? While general guidance for securing devices has been laid out in many publications from NIST and CISA, there has to be consensus and a shared understanding of what is considered secure before implementation steps 2 and 3 under section 2.17.4 are executed. This is confounded by the fact that CISOs and CIOs may have opposing opinions on this topic.

We also have to consider that sharing real-time device configuration across two tenants is probably not a capability that Microsoft can (or will) implement for the conditional access policy feature set any time soon. If and when they do, it can be incorporated into the baseline. In the meantime, to be pragmatic, CISA should also provide a framework for how device trust should be established.

This would be helpful so that CISOs and CIOs can follow guidance from a single authoritative source and build consensus on the absolute minimum required device configuration to meet the security thresholds for cross-tenant collaboration.

Implementation notes

The framework could include the following:

  • Publishing a security baseline for Windows Devices (or use an existing one that defines the bare minimum required).
  • Developing scripts that can generate reports about whether security baselines for a tenant have been implemented or not.
  • Providing guidance on how home tenant ISOs should supply their security baseline reports as evidence to resource tenant ISOs through Interagency agreements (or some other medium).
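As a hedged sketch of the second bullet, a report generator could compare a tenant's Intune compliance policies against a published minimum baseline. `Get-MgDeviceManagementDeviceCompliancePolicy` is a real Microsoft Graph PowerShell cmdlet; the specific settings checked (`bitLockerEnabled`, `passwordRequired`, which exist on the `windows10CompliancePolicy` type) stand in for whatever baseline is ultimately published:

```powershell
# Illustrative: report which Windows compliance policies enforce baseline settings.
Connect-MgGraph -Scopes 'DeviceManagementConfiguration.Read.All'

Get-MgDeviceManagementDeviceCompliancePolicy | ForEach-Object {
    $Props = $_.AdditionalProperties
    [PSCustomObject]@{
        Policy           = $_.DisplayName
        BitLockerEnabled = $Props['bitLockerEnabled']   # windows10CompliancePolicy setting
        PasswordRequired = $Props['passwordRequired']
    }
}
```

A home tenant ISO could export output like this as evidence for the resource tenant ISO under an interagency agreement, as the third bullet suggests.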

Acceptance criteria

Resource tenant implementers have a shared understanding with home tenant implementers on the minimum security/configuration requirements for devices, before cross-tenant access is enabled.
