
Pwned Passwords - Azure Function


APIs for the k-anonymity Pwned Passwords implementation
Visit Pwned Passwords · View Pwned Passwords API · Report an Issue


Give a Star! ⭐

If you like the project, please consider giving it a star to raise awareness!

About Pwned Passwords - Azure Function

This repository holds the code for the Pwned Passwords Azure Function - the web endpoint that interacts directly with Azure Blob Storage to retrieve the SHA-1 hashes of Pwned Passwords, using a fixed five-character SHA-1 hash prefix to anonymise requests. For more details on the k-anonymity model and Pwned Passwords, see Troy Hunt's blog post here. The Pwned Passwords Cloudflare Worker can be found here.
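To illustrate the k-anonymity flow from a consumer's point of view, here is a minimal client-side sketch: only the first five characters of the SHA-1 hash ever leave the client, and the full suffix is matched locally against the returned range. The public api.pwnedpasswords.com base address and the one-SUFFIX:COUNT-pair-per-line response shape are assumptions about the deployed service; the code is illustrative and not part of this repository.

// Minimal sketch of a k-anonymity client query (illustrative only).
using System;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

class KAnonymityClientSketch
{
    static async Task<int> GetPrevalenceAsync(string password)
    {
        // Hash the password with SHA-1 and hex-encode it.
        using var sha1 = SHA1.Create();
        string hash = Convert.ToHexString(sha1.ComputeHash(Encoding.UTF8.GetBytes(password)));

        // Only the first five characters of the hash are sent to the API.
        string prefix = hash[..5];
        string suffix = hash[5..];

        using var http = new HttpClient { BaseAddress = new Uri("https://api.pwnedpasswords.com/") };
        string body = await http.GetStringAsync($"range/{prefix}");

        // The response lists every known suffix for that prefix as "SUFFIX:COUNT", one per line.
        foreach (string line in body.Split('\n', StringSplitOptions.RemoveEmptyEntries))
        {
            string[] parts = line.Trim().Split(':');
            if (parts[0].Equals(suffix, StringComparison.OrdinalIgnoreCase))
                return int.Parse(parts[1]);
        }

        return 0; // The password was not found in the corpus.
    }
}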

Getting Started with Contributions

Any and all are welcome to contribute to this project. Please ensure that you read our Code of Conduct first.

Prerequisites

It is recommended to use Visual Studio 2019, VS Code, or JetBrains Rider for working on the Pwned Passwords Azure Function. You will also need the .NET 5.0 SDK installed.

Once you have installed your IDE of choice, make sure that you install the relevant Azure Functions extensions.

  • For Visual Studio 2019, make sure you have installed the Azure Development workload. If you haven't, use the Visual Studio Installer to do so (instructions on this can be found here).
  • For VS Code, use the VS Code Azure Functions extension.
  • For JetBrains Rider, install the Azure Toolkit and make sure to install the Azure Functions core tools from the settings (Tools | Azure | Functions).

An Azure Storage emulator will also be needed for local emulation of Azure Blob Storage; Azurite is a commonly used option.
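For example (assuming Node.js is available), Azurite can be installed and started with:

npm install -g azurite
azurite --silent --location ./azurite

The default development connection string (UseDevelopmentStorage=true) can then be used in the configuration below.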

Running Locally

You should configure a local.settings.json file to hold the Configuration Manager values for PwnedPasswordsConnectionString, BlobContainerName, TableStorageName and MetadataTableStorageName within the Functions project. Ensure that this file is not committed (it is ignored by Git within the Functions project).

local.settings.json should contain the following correctly configured values:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "<Your Connection String from Azure Storage Emulator",
        "PwnedPasswordsConnectionString": "<Your Connection String from Azure Storage Emulator>",
        "BlobContainerName": "<Name of Blob Container you created>",
        "TableStorageName": "<Name of Table Storage you created",
        "MetadataTableStorageName": "<Name of second Table Storage you created>"
    }
}

You can then use a utility such as cURL or a web browser to query the locally running Azure Functions endpoints, typically served at http://localhost:7071.

curl -X GET http://localhost:7071/range/21BD1
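If data exists for the requested prefix, the response is a plain-text list of 35-character hash suffixes and their prevalence counts, one SUFFIX:COUNT pair per line; the values below are illustrative only:

0018A45C4D1DEF81644B54AB7F969B88D65:1
00D4F6E8FA6EECAD2A3AA415EEC418D38EC:2
011053FD0102E94D6AE2F8B83D76FAF94F6:1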

To-Do List

  • Authenticated Endpoint which can receive SHA-1/NTLM/prevalence data
  • Extract SHA-1 hashes into Azure Blob Storage - either appending them in order to the relevant file or updating the count value
  • Add SHA-1/NTLM hashes to downloadable corpus - this should be updated monthly
  • Corresponding Cloudflare cache item for corpus or blob storage file must be invalidated - this shouldn't be more than once per day for a cache item

The whole premise of open sourcing the Pwned Passwords Azure Function is to foster further development of the Pwned Passwords project. This to-do list has been taken from the blog post announcing the open sourcing of Pwned Passwords; a sketch illustrating points 2 and 3 follows the list.

  1. There's an authenticated endpoint that'll receive SHA-1 and NTLM hash pairs of passwords. The hash pair will also be accompanied by a prevalence indicating how many times it has been seen in the corpus that led to its disclosure. As indicated earlier, volumes will inevitably fluctuate and I've no idea what they'll look like, especially over the longer term.
  2. Upon receipt of the passwords, the SHA-1 hashes need to be extracted into the existing Azure Blob Storage construct. This is nothing more than 16^5 different text files (because each SHA-1 hash is queried by a 5 character prefix), each containing the 35 byte SHA-1 hash suffix of each password previously seen and the number of times it's been seen.
  3. "Extracted into" means either adding a new SHA-1 hash and its prevalence or updating the prevalence where the hash has been seen before.
  4. Both the SHA-1 and NTLM hashes must be added to a downloadable corpus of data for use offline and as per the previous point, this will mean creating some new entries and updating the counts on existing entries. Due to the potential frequency of new passwords and the size of the downloadable corpuses (up to 12.5GB zipped at present), my thinking is to make this a monthly process.
  5. After either the file in blob storage or the entire downloadable corpus is modified, the corresponding Cloudflare cache item must be invalidated. This is going to impact the cache hit ratio which then impacts performance and the cost of the services on the origin at Azure. We may need to limit the impact of this by defining a rate at which cache invalidation can occur (i.e. not more than once per day for any given cache item).
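The following is a minimal sketch, under assumed types and file layout, of the "extracted into" behaviour described in points 2 and 3: parse the existing SUFFIX:COUNT lines for one five-character-prefix bucket, then either increment an existing count or add a new entry. It is illustrative only and does not mirror the repository's actual storage code.

// Illustrative only: merge one (suffix, prevalence) pair into the text of a
// single prefix bucket file, where each line is "SUFFIX:COUNT".
using System;
using System.Collections.Generic;
using System.Linq;

static class BucketMergeSketch
{
    public static string Merge(string bucketText, string suffix, int prevalence)
    {
        // Parse the existing bucket into a suffix -> count dictionary.
        var counts = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
        foreach (string line in bucketText.Split('\n', StringSplitOptions.RemoveEmptyEntries))
        {
            string[] parts = line.Trim().Split(':');
            counts[parts[0]] = int.Parse(parts[1]);
        }

        // Point 3: add a new hash suffix, or update the prevalence of one seen before.
        counts[suffix] = counts.TryGetValue(suffix, out int existing)
            ? existing + prevalence
            : prevalence;

        // Keep entries ordered by suffix, matching the ordering of the range responses.
        return string.Join("\n", counts
            .OrderBy(kv => kv.Key, StringComparer.Ordinal)
            .Select(kv => $"{kv.Key}:{kv.Value}"));
    }
}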

Maintainers

This repository is currently maintained by @TroyHunt.

License

This project has been released under the BSD 3-clause license. More information can be found by viewing the license here.

pwnedpasswordsazurefunction's People

Contributors

bbosman · danjagnow · dependabot[bot] · ievangelist · maartenba · martins-vds · mdawsonuk · nonameformee · ovation22 · rzontar · slang25 · stebet · thomaslevesque · troyhunt


pwnedpasswordsazurefunction's Issues

Idempotency Concern of the ProcessAppendQueueItem Function

I would like to kindly bring attention to a potential issue in ProcessAppendQueueItem, which is invoked by messages that enter the %TableNamespace%-ingestion queue. For each queue item, ProcessPasswordEntry is called to first increment the pwned password's prevalence stored in the table entries and then increment the prevalence in blob storage. However, this method is not idempotent. Suppose the function crashes after incrementing the table entries; when the function retries, the table entries and blob entries will be incremented again. The values stored in the table entries hence deviate from the blob entries and from the ground truth. If the retry happens after both the table entries and the blob entries have been incremented, both counters will deviate from the ground truth due to duplicate updates.

Though the deviation of the prevalence counter is not a huge problem, the issue can be resolved by maintaining a LastRequestId field in each table entry and blob entry, storing the invocation id of the last ProcessAppendQueueItem call. The invocation id is constant across Azure Functions retries. Before updating a table/blob entry, if item.LastRequestId == FunctionContext.InvocationId, the update has already happened, so skip it and continue to the next table entry or blob. Otherwise, if item.LastRequestId != FunctionContext.InvocationId, update the prevalence counter and set the LastRequestId field to FunctionContext.InvocationId.
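A sketch of the suggested guard, assuming a record with hypothetical NumMatches and LastRequestId properties and a retry-stable invocation id passed in as a string (names are illustrative, not the repository's actual types):

// Illustrative idempotency guard: skip entries already updated by this invocation.
using System;

static class IdempotencySketch
{
    public static void ApplyPrevalence(PasswordEntrySketch entry, int prevalence, string invocationId)
    {
        if (string.Equals(entry.LastRequestId, invocationId, StringComparison.Ordinal))
        {
            // This invocation (or a retry of it) has already applied the update.
            return;
        }

        entry.NumMatches += prevalence;
        entry.LastRequestId = invocationId;
    }
}

// Hypothetical record used only for this sketch.
class PasswordEntrySketch
{
    public int NumMatches { get; set; }
    public string LastRequestId { get; set; } = "";
}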

Thank you for considering this potential issue. I hope this suggestion can help improve the idempotency of the ProcessAppendQueueItem. Please feel free to reach out if you have any questions or concerns.

Providing k-anonymity model for NTLM hashes?

Any chance of providing k-anonymity ranges for NTLM hashes?
I think this would be valuable for checking local users' passwords in small AD domains, where downloading the whole corpus is a bit too much.

IsHash regex has no timeout

I noticed the different regexes have no timeouts, which means they could feasibly be used to bring the service down with the "right" input (I will make a PR for this tomorrow, but creating an issue for posterity).
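For reference, .NET regexes accept a matchTimeout so a pathological input throws RegexMatchTimeoutException instead of running unbounded. A minimal sketch (the pattern and names below are illustrative, not the repository's actual IsHash implementation):

using System;
using System.Text.RegularExpressions;

static class HashRegexSketch
{
    // Illustrative pattern for a 40-character hex SHA-1 hash; the timeout bounds
    // worst-case matching so hostile input cannot stall the function.
    private static readonly Regex Sha1Hash = new Regex(
        "^[A-Fa-f0-9]{40}$",
        RegexOptions.Compiled,
        TimeSpan.FromMilliseconds(100));

    public static bool IsSha1Hash(string input)
    {
        try
        {
            return Sha1Hash.IsMatch(input);
        }
        catch (RegexMatchTimeoutException)
        {
            return false; // Treat a timed-out match as invalid input.
        }
    }
}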

UpForGrabs

Anything that would be useful? I'd love to assist.

Up For Grabs 2022

Hi all - I would like to contribute and was wondering if there are any issues that may have not been opened yet. If anyone wouldn't mind pointing me in the right direction, I would very much appreciate it. Thank you!
