
isomorphic-git's Issues

tags

TODOs:

  • create lightweight tag
  • list lightweight tags
  • delete lightweight tag
  • push lightweight tag
  • fetch lightweight tag
  • checkout lightweight tag
  • create annotated tag
  • read annotated tag
  • list annotated tags (covered by listing lightweight tags)
  • delete annotated tag
  • push annotated tag
  • fetch annotated tag
  • checkout annotated tag

Implementation notes:
Writing an annotated tag parser should be trivial. Here's what one looks like. The tagger line appears to use the same format as the author and committer lines in git commits.

object af4d84a6a9fa7a74acdad07fddf9f17ff3a974ae
type commit
tag v0.0.9
tagger Will Hilton <[email protected]> 1507071414 -0400

0.0.9
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1

iQIcBAABAgAGBQJZ1BW2AAoJEJYJuKWSi6a5S6EQAJQkK+wIXijDf4ZfVeP1E7Be
aDDdOLga0/gj5p2p081TLLlaKKLcYj2pub8BfFVpEmvT0QRaKaMb+wAtO5PBHTbn
y2s3dCmqqAPQa0AXrChverKomK/gUYZfFzckS8GaJTiw2RyvheXOLOEGSLTHOwy2
wjP8KxGOWfHlXZEhn/Z406OlcYMzMSL70H26pgyggSTe5RNfpXEBAgWmIAA51eEM
9tF9xuijc0mlr6vzxYVmfwat4u38nrwX7JvWp2CvD/qwILMAYGIcZqRXK5jWHemD
/x5RtUGU4cr47++FD3N3zBWx0dBiCMNUwT/v68kmhrBVX20DhcC6UX38yf1sdDfZ
yapht2+TakKQuw/T/K/6bFjoa8MIHdAx7WCnMV84M0qfMr+e9ImeH5Hj592qw4Gh
vSY80gKslkXjRnVes7VHXoL/lVDvCM2VNskWTTLGHqt+rIvSXNFGP05OGtdFYu4d
K9oFVEoRPFTRSeF/9EztyeLb/gtSdBmWP2AhZn9ip0a7rjbyv5yeayZTsedoUfe5
o8cB++UXreD+h3c/F6mTRs8aVELhQTZNZ677PY71HJKsCLbQJAd4n+gS1n8Y/7wv
Zp4YxnShDkMTV3rxZc27vehq2g9gKJzQsueLyZPJTzCHqujumiLbdYV4i4X4CZjy
dBWrLc3kdnemrlhSRzR2
=PrR1
-----END PGP SIGNATURE-----

Note that this is NOT a 'clearsigned' GPG document. 🙄 Rather, it is a 'detached' signature concatenated onto the end of the payload, with an extra newline in between.

payload.txt

git tag
object af4d84a6a9fa7a74acdad07fddf9f17ff3a974ae
type commit
tag v0.0.9
tagger Will Hilton <[email protected]> 1507071414 -0400

0.0.9

signature.txt

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1

iQIcBAABAgAGBQJZ1BW2AAoJEJYJuKWSi6a5S6EQAJQkK+wIXijDf4ZfVeP1E7Be
aDDdOLga0/gj5p2p081TLLlaKKLcYj2pub8BfFVpEmvT0QRaKaMb+wAtO5PBHTbn
y2s3dCmqqAPQa0AXrChverKomK/gUYZfFzckS8GaJTiw2RyvheXOLOEGSLTHOwy2
wjP8KxGOWfHlXZEhn/Z406OlcYMzMSL70H26pgyggSTe5RNfpXEBAgWmIAA51eEM
9tF9xuijc0mlr6vzxYVmfwat4u38nrwX7JvWp2CvD/qwILMAYGIcZqRXK5jWHemD
/x5RtUGU4cr47++FD3N3zBWx0dBiCMNUwT/v68kmhrBVX20DhcC6UX38yf1sdDfZ
yapht2+TakKQuw/T/K/6bFjoa8MIHdAx7WCnMV84M0qfMr+e9ImeH5Hj592qw4Gh
vSY80gKslkXjRnVes7VHXoL/lVDvCM2VNskWTTLGHqt+rIvSXNFGP05OGtdFYu4d
K9oFVEoRPFTRSeF/9EztyeLb/gtSdBmWP2AhZn9ip0a7rjbyv5yeayZTsedoUfe5
o8cB++UXreD+h3c/F6mTRs8aVELhQTZNZ677PY71HJKsCLbQJAd4n+gS1n8Y/7wv
Zp4YxnShDkMTV3rxZc27vehq2g9gKJzQsueLyZPJTzCHqujumiLbdYV4i4X4CZjy
dBWrLc3kdnemrlhSRzR2
=PrR1
-----END PGP SIGNATURE-----

To verify: gpg --verify signature.txt payload.txt
To create:

# sign the payload, producing a detached ASCII-armored signature
gpg --armor --output signature.txt --detach-sig payload.txt
# concatenate payload + signature to form the tag object body
cat payload.txt signature.txt > newtag
# write it into the object database as a tag object
SHA=$(git hash-object -t tag -w newtag)
# point a tag ref at the new object
echo "$SHA" > .git/refs/tags/newtag
# verify the signature
git tag -v newtag
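Given that format, the parsing really is close to trivial. A rough sketch (one assumption here: headers run until the first blank line, and everything after it, including any trailing PGP signature block, is kept as the message):

```javascript
// Sketch of an annotated tag parser. Headers ('object', 'type', 'tag',
// 'tagger') occupy the lines before the first blank line; the rest is
// the tag message (plus any detached signature appended to it).
function parseAnnotatedTag (text) {
  const headerEnd = text.indexOf('\n\n')
  const headers = {}
  for (const line of text.slice(0, headerEnd).split('\n')) {
    const space = line.indexOf(' ')
    headers[line.slice(0, space)] = line.slice(space + 1)
  }
  return { ...headers, message: text.slice(headerEnd + 2) }
}
```

A fuller parser would additionally split the tagger line into name, email, timestamp, and timezone offset, the same way commit author lines are split.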

Pull crashes Node, cannot be caught using try-catch

Hey, pull brings Node down for me in the TypeScript repro below:

import * as git from 'isomorphic-git';
import * as fs from 'fs-extra';
import * as readline from 'readline';

void async function() {
  console.log('Removing…');
  await fs.remove('junk');
  console.log('Removed.');

  let errored = false;

  try {
    console.log('Cloning…');
    await git.clone({ fs, dir: 'junk', ref: 'master', url: 'https://github.com/TomasHubelbauer/bloggo.git' });
    console.log('Cloned.');
  } catch (error) {
    console.log('Failed to clone:', error);
    errored = true;
  }

  try {
    console.log('Pulling…');
    await git.pull({ fs, dir: 'junk', ref: 'master' });
    console.log('Pulled.');
  } catch (error) {
    console.log('Failed to pull:', error);
    errored = true;
  }

  // Added to prove the process will end without any user interaction
  !errored && await new Promise(resolve => {
    readline.createInterface(process.stdin, process.stdout).question('Blocking…', (answer) => resolve(answer));
  });
}()

Here's the full output:

Removing…
Removed.
Cloning…
Cloned.
Pulling…
Using ref=master

I am using Node v9.4.0, Isomorphic Git version "0.9.0" and FS Extra version "5.0.0".

support .gitignore files

So thinking a little about what this means, specifically what git commands are affected?

  • status should return 'ignored' instead of 'absent'
  • add does not currently support adding directories recursively, but if it did it should use .gitignore information.
  • add should maybe throw an error if you try to add an ignored file? IDK

Huh, that's really about it.
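For the status piece, the pattern matching could be sketched like this (a deliberately minimal version: real .gitignore semantics also include negation with !, anchored patterns, ** globs, and per-directory files):

```javascript
// Minimal sketch of .gitignore matching: skip blank/comment lines, treat
// '*' as "any run of non-slash characters", and let a pattern match at any
// path segment boundary.
function isIgnored (filepath, gitignore) {
  return gitignore
    .split('\n')
    .map(line => line.trim())
    .filter(line => line && !line.startsWith('#'))
    .some(pattern => {
      const p = pattern.replace(/\/$/, '') // dir patterns match the dir itself
      const re = new RegExp(
        '(^|/)' +
          p.replace(/[.+^${}()|[\]\\]/g, '\\$&').replace(/\*/g, '[^/]*') +
          '(/|$)'
      )
      return re.test(filepath)
    })
}
```

With something like this, status could report 'ignored' for matching paths, and a recursive add could filter its file list through it.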

How to automatically verify PGP signed commits?

To start with, we have only the committer identity "William Hilton [email protected]" and the GPG signature.

To trust the signature, we can (first) simply check that the email used in the GPG signature is for that committer (or author) email address. However that does not establish trust - someone could have made that GPG key just to fake the commit. We must find evidence that the GPG key itself is legit by finding other references to it.

An obvious first place to look would be Github. We can link the email to an individual account by doing a search like this:

https://api.github.com/search/[email protected]&in=email

This can give us a Github account username. Ironically, Github's API doesn't let us get the raw GPG Public Keys:

The data returned in the public_key response field is not a GPG formatted key. When a user uploads a GPG key, it is parsed and the cryptographic public key is extracted and stored. This cryptographic key is what is returned by the APIs on this page. This key is not suitable to be used directly by programs like GPG. source

Well then what is it suitable for? They don't answer that question. Le sigh. What they DO give us are the public key IDs, e.g.

[
    {
        ...
        "key_id": "9609B8A5928BA6B9",
        "emails": [
            {
                "email": "[email protected]",
                "verified": true
            }
        ],
        ...
    },

So now we could compare the key id used in the signature with the one listed in the Github API. If it matches, then we can say: barring a MITM attack that compromised our interaction with Github, Github agrees this is a valid signature / email pairing.

This is probably sufficient for our needs, since we can autogenerate a keypair in the browser and upload it to Github the first time users log in via OAuth. Keeping things Github-centric keeps it simple. Fewer moving parts.


Let's talk moving parts!

Not everyone wants to upload their public keys to Github. There will be cases where people want other ways to verify keys. One option is to use the MIT keyserver to look up keys:

http://pgp.mit.edu/pks/lookup?op=get&search=0x9609B8A5928BA6B9

We can also look up keys by username on keybase:

https://keybase.io/_/api/1.0/user/autocomplete.json?q=wmhilton

Or on onename:

https://api.onename.com/v1/users/wmhilton

but both of those require "usernames" rather than email addresses, which is what we have. Unless we try the Github username, and then we're back to running in circles.

Config parsing

Hey @wmhilton I've decided to abandon my effort in pursuing libgit2 + emscripten, simply because mmap doesn't work properly with an adaptation of virtualfs as the emscripten in-memory fs. Doing it properly would require a soft MMU implemented in JavaScript, and I'm not up to that.

So I've decided to go back to my old plan, which is to reimplement git in JS. For now I'm looking through your codebase to adapt. The first output is config parsing. I see that you're using ini-style parsing without a real parser, so I've implemented a real git-config parser: https://github.com/MatrixAI/js-virtualgit/tree/master/lib/config All tests are passing for it. A couple of things are missing (tilde expansion for inclusion paths, and conditional includes), but they aren't important at this stage.

Oh, and while I'm adapting your code: I have slightly different requirements than yours, which is why I can't just directly use your codebase, but I hope we can collaborate and share code/algorithms/ideas. So feel free to use my lexer/parser/interpreter for your isomorphic-git if you want.

support ofs-deltas

Currently fetch only supports ref-deltas. It would be nice to support ofs-deltas as well.
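For reference, an ofs-delta header stores the base object's location as a negative byte offset within the same packfile (instead of the base's oid, as a ref-delta does), using git's variable-length encoding in which the high bit of each byte means "more bytes follow". A sketch of the offset decoder:

```javascript
// Decode the variable-length offset that prefixes an ofs-delta entry.
// Each byte contributes its low 7 bits; the high bit signals continuation.
function decodeOfsDeltaOffset (bytes) {
  let i = 0
  let byte = bytes[i++]
  let offset = byte & 0x7f
  while (byte & 0x80) {
    byte = bytes[i++]
    // the +1 removes redundant encodings, so every offset has one form
    offset = ((offset + 1) << 7) | (byte & 0x7f)
  }
  return { offset, bytesRead: i }
}
```

The base object then lives at (this entry's start position − offset), so resolving it requires an index from packfile byte offsets to objects.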

Error when committing without "author"

In the documentation author doesn't appear as required for commit but this error is thrown:

models.js:440 Uncaught (in promise) TypeError: Cannot read property '_named' of undefined
    at isNamedSection (models.js:440)
    at GitConfig.get (models.js:460)
    at config (commands.js:135)

The error goes away if I specify author.

My guess is that all commits must have an author so it should be marked as required.

git status for directories

Currently git status only works on a single file at a time. We probably will want to make a recursive version sometime.

Better 401 Unauthorized error message

Currently git.push is throwing

Error: Unparsable response from server! Expected 'unpack ok' or 'unpack [error message]' but got ''

when it should probably just throw the fetch response

POST http://localhost:3000/wmhilton/test.git/git-receive-pack 401 (Unauthorized)

Faster cloning

Acceptance criteria: Be able to clone https://gitlab.com/gitlab-org/gitlab-ce.git

For reference, on my desktop using canonical git:

> git clone https://gitlab.com/gitlab-org/gitlab-ce                          
Cloning into 'gitlab-ce'...                                                  
warning: redirecting to https://gitlab.com/gitlab-org/gitlab-ce.git/         
remote: Counting objects: 857065, done.                                      
remote: Compressing objects: 100% (255551/255551), done.                     
remote: Total 857065 (delta 653254), reused 765557 (delta 592042)            
Receiving objects: 100% (857065/857065), 380.04 MiB | 5.55 MiB/s, done.      
Resolving deltas: 100% (653254/653254), done.                                
Checking out files: 100% (12449/12449), done.                                
✓ (5m13s) [2017-12-18 19:39:50]

(packfile is 389,165 KB, pack index is 23,437 KB, packed-refs 202 KB, index 1,399 KB).

Add useful error message when checking out a branch that hasn't been fetched

See #43

The error reported is:

Failed to read git object with oid e1593a418bbf61846ce6f044bd01c9cd3cde2004

isomorphic-git needs to be smart enough to realize what has happened and say something like

Error: Tried to checkout a branch that isn't available locally - do git.fetch({ref: 'gh-pages'}) to make the branch available locally

GitRefManager.updateRemoteRefs not handling remote with combined fetch and push URL

I have a config like this:

[core]
repositoryformatversion = 0
filemode = false
bare = false
logallrefupdates = true
symlinks = false
ignorecase = true

[remote "origin"]
url = https://gitlab.com/TomasHubelbauer/bloggo.git

This code in GitRefManager.updateRemoteRefs fails for me:

if (!refspecs) {
  refspecs = await config.getall(`remote.${remote}.fetch`);
}

I think the config should be fine? Should the logic here be extended to cover the case where a single url entry serves as both the fetch and push URL, with no explicit fetch refspec, as above? I am not too familiar with Git internals, so I am not sure exactly.
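If extending the logic is the right call, the fix might be as small as falling back to the default refspec git itself writes on `git remote add` whenever `remote.<name>.fetch` is absent (a sketch; `fetchRefspecs` is a hypothetical helper name, not existing code):

```javascript
// When a remote section has only a url (git treats a bare url as both the
// fetch and push URL), fall back to the conventional default refspec.
function fetchRefspecs (refspecs, remote) {
  if (refspecs === undefined || refspecs.length === 0) {
    return [`+refs/heads/*:refs/remotes/${remote}/*`]
  }
  return refspecs
}
```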

Make GitObjectManager aware of .git/shallow

Right now only GitRemoteHTTP uses .git/shallow. It might be nice to check that file in GitObjectManager so if you try to checkout a shallow commit it will have a helpful error.

getObject

So as nice as the high-level API is, there are times where you want to get your hands dirty, and do specialized batch operations without wasting time doing a full "git.checkout".

To that end, I'm thinking about adding an API function that would provide some of the benefits of using the GitObjectManager directly.

await git.getObject({
  oid: string,
  format: string
})

values for format:

  • 'raw' - returns the raw, deflate-compressed object file. Useful for manually unpacking objects from packfiles or efficiently shuffling around loose objects - whenever you don't care about the contents you can save time by not unzipping them
  • 'plain' - returns the inflated object as a Buffer, git header included. Useful for calculating SHA object ids
  • 'content' - returns the object content, minus the git header
  • 'parsed' - the default. Parses the object and returns a JavaScript object using one of the schemas (CommitDescription, BlobDescription, TreeDescription, TagDescription)

Example usage:

let oid = await git.resolveRef({fs, dir, ref: 'mybranch'})
let commit = await git.getObject({fs, dir, oid})
let tree = await git.getObject({fs, dir, oid: commit.tree})
for (let file of tree.entries) {
  if (file.path === 'package.json') {
    let blob = await git.getObject({fs, dir, oid: file.oid})
    console.log(blob.toString('utf8'))
  }
}

Fast checkout

One path to faster cloning, which will also save disk space, provide faster checkout for all branches, and resolve #8, is finishing support for utilizing packfiles in .git/objects/pack. This issue is just to track that.

I added the necessary functionality to the GitPackfile model (previously unused by my actual code) today. The next step is integrating GitPackfile into the GitObjectManager so using the packfile is transparent.

combine getConfig and setConfig into a single config

Reasoning: the canonical git CLI uses git config with one argument to get values and with two arguments to set values (similar to jQuery, which also overloads getters and setters in a chainable interface). While this could potentially lead to accidental gets instead of sets, I think the convenience for typing and for remembering (by operating the same way as cgit) is worth the complexity of overloading the config function.
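The overload could look something like this (a sketch only; the Map stands in for the parsed config file, which in the real thing would be read from and written back to .git/config):

```javascript
// Sketch of a jQuery-style overloaded `config` command: passing only `path`
// reads a value, passing `path` and `value` writes one.
const store = new Map() // placeholder for the parsed .git/config contents

function config ({ path, value }) {
  if (value === undefined) {
    return store.get(path) // one argument: get
  }
  store.set(path, value) // two arguments: set
}
```

The risk the paragraph mentions is visible here: forgetting `value` silently turns a set into a get, so the docs would need to call that out.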

Allow for pluggable types of remotes

I want to use this in conjunction with the dat archive API in BeakerBrowser.

The way it works is you have what looks like a filesystem that's distributed in a peer to peer network.
If someone has access to the URL (a public key) of your FS, they can find the data from peers in the network.
Only the owner of the filesystem (that has the private key) can write to the FS, but anyone can read and replicate it if they have the URL.

BeakerBrowser provides APIs for interacting with these Dat archives to JS in a browser context.
This means that people can make applications that are totally decentralized and don't need any third party services for storing their data.

What I'd like to be able to do is to have a dat archive be used as somebody's git history. That way other people can get the data without you having to store it in a git server.

This will also be the foundation for something like Github, but totally decentralized and p2p.

Ideally, what I'd like to do is set up git remotes which point to other people's dat archives, and to read the other person's dat archive directly instead of trying to make requests to a git server.

Emit progress events while cloning

Maybe something like

git('.')
  .progress(callback)
  .clone(url)

That way we can still await the clone, and we don't have the locality disconnect of making "git" itself an EventEmitter.

git reset

I'm trying to reset a branch to a particular commit (i.e. git checkout <branch> && git reset [--soft] <commit>) but there doesn't seem to be a reset command.

@wmhilton do you have plans to implement this in the near future? I don't want to swamp you with more requests 😅 Seems like an easy thing to do manually by editing stuff in .git/

Querying and Cancelling long-running tasks

I don't know if this should be built in - I don't want to create a new problem for users who already have a multithreading / job queue in place. And obviously it won't matter too much if I can make the time for individual operations negligible.

However, a working job-queue based demo is needed for my own projects and probably many others. So some sort of API for starting, cancelling, and getting progress percentages is needed. Sadly the state of the art (FetchObserver) is not yet figured out, so we have no API examples from the native web platform from which to copy.

I think the best bet is to provide a working example using a library like http://npm.im/workerpool and if necessary provide a wrapper.

Can I retrieve a blob from a tree?

Is it possible to retrieve a blob from a tree? My use case is that I want to be able to grab files out of the git repository without checking out the branch. This allows me to extract files from branches in the repository without affecting the worktree.

I'm looking for something to replace this nodegit operation: http://www.nodegit.org/api/tree_entry/#getBlob

Here's an example:

const fs = require('fs')
const git = require('nodegit')

async function entryToFile (entry) {
  const blob = await entry.getBlob()
  const contents = blob.content()
  const stat = new fs.Stats()
  stat.mode = entry.filemode()
  stat.size = contents.length
  return { path: entry.path(), contents, stat }
}

const dir = 'isogit'
const branchName = 'develop'

;(async () => {
  const repo = await git.Repository.open(dir)
  const walker = (await(await repo.getBranchCommit(branchName)).getTree()).walk()
  const files = await new Promise((resolve, reject) => {
    const collector = []
    walker.on('entry', (entry) => collector.push(entryToFile(entry)))
    walker.on('error', (err) => reject(err))
    walker.on('end', () => resolve(Promise.all(collector)))
    walker.start()
  })
  files.forEach((file) => console.log([file.path, file.stat.size]))
})()

I would also need a way to get a list of files from a branch without checking it out. I don't really need a walker. Just a flat list of files in the branch would suffice.

Clean up tracked files when switching branches

Right now checkout simply

  • blows away the git index
  • reads the commit tree into the index
  • then overwrites files.

What it should do is

  • delete the files that are in the index
  • blow away the index
  • read the commit tree into the index
  • then write only the files that aren't already on disk
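The delete/write decision above is just set arithmetic over the two file lists. A sketch (skipping the oid comparison that would further filter out files already up to date on disk):

```javascript
// Given the paths tracked in the current index and the paths in the target
// commit's tree, compute what checkout should delete and (at most) write.
function checkoutPlan (indexedPaths, targetPaths) {
  // tracked files absent from the target tree must be removed
  const del = [...indexedPaths].filter(p => !targetPaths.has(p))
  // a fuller version would also compare oids against what's on disk
  const write = [...targetPaths]
  return { del, write }
}
```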

Certain commands (e.g. clone) should update the repo config file

The initial repo config file is pretty bare:

[core]
	repositoryformatversion = 0
	filemode = false
	bare = false
	logallrefupdates = true
	symlinks = false
	ignorecase = true

I've never edited my config file but here's what it is for esgit:

[core]
	repositoryformatversion = 0
	filemode = false
	bare = false
	logallrefupdates = true
	symlinks = false
	ignorecase = true
[remote "origin"]
	url = https://github.com/wmhilton/esgit
	fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
	remote = origin
	merge = refs/heads/master
[branch "dist"]
	remote = origin
	merge = refs/heads/dist

Look into what commands (fetch? merge? checkout?) auto-insert entries into the config file and mimic that behavior in esgit.

Not coercing file modes when adding to Tree

Currently, the "add" command is not coercing file permission modes to be strictly '100644' (normal) or '100755' (exec) when adding them to the index. This subtle bug reared its head when I compared the basic-test commit sha created when run in karma (the browser) to the basic-test commit sha when run in jasmine (in node).
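The coercion itself can be tiny (a sketch, assuming the only distinction we need for blobs is normal vs executable; symlinks and submodules would need their own cases):

```javascript
// Coerce an arbitrary stat mode to one of the two blob modes git records:
// 0o100755 if any execute bit is set, otherwise 0o100644.
function normalizeMode (mode) {
  return (mode & 0o111) ? 0o100755 : 0o100644
}
```

Running index entries through something like this keeps tree oids identical across platforms whose filesystems report different permission bits, which is exactly the karma-vs-jasmine discrepancy described above.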

Tests say Success: null ms - a problem?

Hey, maybe this is just my confusion, but most (all?) tests running in SauceLabs seem to be returning null for the runtime value in ms. I meant to take a look at a random session from my latest PR, but the video doesn't load for me to be able to check on whether the execution goes through okay. Are there any more tools that could be used to verify nothing is amiss?

listRemoteBranches

From #43

Currently, the listBranches function only looks for heads. Could you add a function that lists remote branches (or all branches?).

I think... add a listRemoteBranches function that takes a remote argument defaulting to undefined, and returns all remote branches, or, if remote is specified, just the branches in that remote.

Support reading from .git/packed-refs

Cloning a repo using git version 2.10.1.windows.1 results in a totally empty .git/refs/remotes folder. For compatibility, we need to add support for at least reading refs from https://git-scm.com/docs/git-pack-refs

> cat .git/packed-refs
# pack-refs with: peeled fully-peeled
a2dd810e222b7b02fc53760037d9928cb97c645d refs/remotes/origin/dist
5741bed81a5e38744ec8ca88b5aa4f058467d4bf refs/remotes/origin/git-fetch
025860fcfb6af84739a924ff49bcbda036855b1a refs/remotes/origin/master
930182105c9d144f952a191a29f40d1945bca990 refs/remotes/origin/rollup
e10ebb90d03eaacca84de1af0a59b444232da99e refs/remotes/origin/test-branch
92e7b4123fbf135f5ffa9b6fe2ec78d07bbc353e refs/remotes/origin/test-branch-shallow-clone
1e40fdfba1cf17f3c9f9f3d6b392b1865e5147b9 refs/tags/test-tag
1a2149e96a9767b281a8f10fd014835322da2d14 refs/tags/v0.0.1
9e3ee22249ed50acccfd3996dadb5d27019a7dad refs/tags/v0.0.2
3e6345233bb696737784f423ace943e0eaa2b30c refs/tags/v0.0.3
^b3ed1e3f15c9bcab23833dbb5ef6a8e2198ec4e2
01509d00409c556c331bb278269c6ca770eb7c52 refs/tags/v0.0.4
^3eb8f48d22cac58d8ba42237cb2227ef90bfce08
ff03e74259efab829557d0b3c15d6c76b9458262 refs/tags/v0.0.5
20668e724eed5fffd23968793aee0592babac2ab refs/tags/v0.0.6
^641859e5e6bad88afab83a4a3e94903ed1d8e10b
6dedfbd21a0633055a93c05cc8b4b5cd89f2b708 refs/tags/v0.0.7
4606f7652aba2b7e8d7c70eb0aa6cd75226f4d83 refs/tags/v0.0.8
^025860fcfb6af84739a924ff49bcbda036855b1a
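A reader for that format might look like this (a sketch based on the sample above: '#' lines are comments, and a '^' line carries the peeled commit oid for the annotated tag on the preceding line):

```javascript
// Parse .git/packed-refs into a Map of ref name → oid. Peeled oids are
// stored under '<ref>^{}', mirroring git's own peeled-ref notation.
function parsePackedRefs (text) {
  const refs = new Map()
  let last = null
  for (const line of text.split('\n')) {
    if (!line || line.startsWith('#')) continue
    if (line.startsWith('^')) {
      // peeled object id for the preceding annotated tag
      refs.set(last + '^{}', line.slice(1))
    } else {
      const oid = line.slice(0, 40)
      const ref = line.slice(41)
      refs.set(ref, oid)
      last = ref
    }
  }
  return refs
}
```

Ref resolution would then check the loose refs directory first and fall back to this map, since git treats loose refs as authoritative over packed ones.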

Cloning a tag fails during checkout

I can successfully clone a specific branch by passing a ref value. However, when I try to clone a (lightweight) tag it fails. From what I can tell the fetch works, but it fails during checkout while trying to resolve the ref origin/tag-name.

I'm cloning with:

    await git.clone({
      // ...
      ref: 'tag-name',
    });

I see that fetch has a tags argument, but clone never passes it, so my guess is that the initial fetch does not include the tags.

Thanks for this awesome library

WebSockets?

Tim Caswell at js-git stated, "I would recommend using the git pack-protocol over secure websockets. It would essentially be the same as the ssh protocol, but authenticate using normal https methods and tunnel over websocket binary messages."

I'm wondering what you might think of this for isomorphic-git?

(Very excited to try it out, btw, when I get a chance...)

Handle file checkout errors gracefully

Current isomorphic-git output:

(node:7996) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: ENOENT: no such file or directory, open 'E:\git\temp\isogit\deployed at Mon Jan  8 01:55:46 UTC 2018 by Deployment Bot (from Travis CI)'

The checkout is aborted and the rest of the files are not checked out.

Desired output:

WARNING: Skipping file 'E:\git\temp\isogit\deployed at Mon Jan  8 01:55:46 UTC 2018 by Deployment Bot (from Travis CI)': Unable to create file or directory

The checkout returns a warning but all the other files are checked out.

Expose commit signing payload

We can eliminate the awkward dependency injection of OpenPGP.js and make the API truly agnostic (which I need if I want to try using kbpgp) by the following:

  • instead of verify, add an option to commit and log that exposes the output of GitCommit.withoutSignature() as payload and GitCommit.isolateSignature() as signature
  • instead of sign, expose GitCommit.fromPayloadSignature({payload, signature})

thus instead of:

await git.commit(...)
await git.sign(...)

one would do:

let {payload} = await git.commit({..., signing: true});
let signature = YourLibraryPGP.sign(payload);
await git.replace({signature, payload});

Error when pushing without "ref"

The documentation says that push doesn't require a ref but I get the following if I don't specify it:

Cannot read property 'startsWith' of undefined
    at Object.push (commands.js:841)

The error goes away if I pass a ref.

Thanks for this amazing library

Fetch all branches that match a pattern, ideas for API enhancements, and some bugs that were found

It seems that if a branch has a disconnected history from the main branch, isomorphic-git fails to check it out. You can see the problem on this repository:

const git = require('isomorphic-git')
const fs = require('fs')

;(async () => {
  await git.fetch({
    fs,
    dir: 'isogit',
    url: 'https://github.com/isomorphic-git/isomorphic-git.git',
  })

  await git.checkout({
    fs,
    dir: 'isogit',
    remote: 'origin',
    ref: 'gh-pages',
  })
})()

The error reported is:

Failed to read git object with oid e1593a418bbf61846ce6f044bd01c9cd3cde2004

If you change the ref from gh-pages to develop, it works fine.

git merge

OK let's break this down into tasks...

File: src/commands/merge.js
pseudocode

import diff3 from 'node-diff3' // at least this exists
// find most recent common ancestor of ref a and ref b
let o = await findMergeBase(a, b)
// for each file, determine whether it is present or absent or modified (see http://gitlet.maryrosecook.com/docs/gitlet.html#section-217)
let diff = await findChangedFiles(a, o, b)
for (let file of diff) {
  // for simple cases of add, remove, or modify files
  updateMergeIndex(file);
  updateWorkTree(file);
  // for files that changed on both branches, compute the diff3.
  if (file.a !== file.o && file.b !== file.o && file.a !== file.b) {
    // use a new variable: `diff` is the list we're still iterating over
    let merged = diff3(file.a, file.o, file.b)
    // If the diff3 merge was unsuccessful, mark the conflict
    if (merged.length > 1 || merged[0].conflict) {
      markConflictInIndex(file)
    }
    // regardless, save the result to the work dir
    fs.writeFile(file.name, formatDiff3(merged))
  }
}

Tasks:

  • implement findMergeBase(a, b)
  • implement findChangedFiles(a, o, b)
  • implement updateMergeIndex(file)
  • implement updateWorkTree(file)
  • implement markConflictInIndex(file)
  • implement formatDiff3(diff)
  • implement mergeTree(a, o, b)
  • implement mergeFile(a, o, b)

copied and pasted from my work document:

This is trickier, and would involve implementing more of the merge in isomorphic-git. This would undoubtedly be a good thing for everyone who uses isomorphic-git, but it is a bit of a time commitment to do well.

Simple cases involving merges that don't involve the same files should be doable in just a couple days. The algorithm is something like:

  1. Compute the nearest common ancestor of commit A and B, call it O. (This code already exists as findMergeBase)
  2. Compute two tree patches: O -> A and O -> B. (This algorithm should be very similar to that used in statusMatrix I think) Note: this should also be used to speed up and make checkout safer.
  3. Check that the two tree patches do not contain any operations that happen to the same file.
  4. If there are no operations on the same file, apply both patches to O and create a merge commit D whose parents are A and B.
  5. Move the current branch to point to D.
  6. Attempt to checkout the current branch.

We need a datastructure for storing tree patches. I propose something like:

const patches = [
  {filepath: 'TODO.MD', op: 'rm', before: 'f4c8920', after: null},
  {filepath: 'TODO.md', op: 'write', before: null, after: 'ec63514'},
  {filepath: 'lib', op: 'mkdir', before: null, after: 'he3414c'},
  {filepath: 'lib/app.rb', op: 'write', before: null, after: '00750ed'},
]

Including before oids makes it possible to safely detect if files have changed between the time the patch is computed and the time the patch is performed. For instance, a "safe delete with rollback" might consist of:

  • create a temporary directory 'tmp'
  • move <file> to tmp/<before> (e.g. mv 'TODO.MD' to 'tmp/f4c8920...')
  • compute the SHA of tmp/<before> and make sure it matches before.
  • if not, move the file back and trigger a rollback.
  • if so, delete tmp/<before>

The after oids should simply be written to the indicated filepath. For directories, I guess the tree oid could be computed and verified afterwards, but if all the file oids match it should be correct.

Edit:
So.... actually most of that is not needed. There's no need to compute two "tree patches" for O -> A and O -> B. There's no need to merge the patches or apply patches. In the end I just merged the commits in a single call to walkBeta1.

Unable to push to empty repository on GitHub

First of all: thanks for this wonderful library. It seems to be exactly what I need.

I am experimenting with this a bit (Firefox 57) and I can't get a simple clone-edit-commit-push cycle to work. What am I doing wrong? In the <head> of my html I have included isomorphic-git and browserfs as instructed. In the body I have a script tag containing:

git('.').clone('https://cors-buster-jfpactjnem.now.sh/github.com/wDhTIG/rthKRF');
fs.writeFile('README.md', 'Test', 'utf8', (err) => {
    if (err) throw err;
    console.log('The file has been saved!');
});
git('.').auth('mysecret')
        .add('README.md')
        .commit('Test commit')
        .push('master');

I get the following errors in the console log:

The character encoding of the HTML document was not declared. The document will render with garbled text in some browser configurations if the document contains characters from outside the US-ASCII range. The character encoding of the page must be declared in the document or in the transfer protocol.
git.html
Unhandled promise rejection
TypeError: l(...) is undefined
Stack trace:
e/<@https://unpkg.com/isomorphic-git:1:80863
n@https://unpkg.com/isomorphic-git:1:125272
f/<@https://unpkg.com/isomorphic-git:1:126316
o/</e[t]@https://unpkg.com/isomorphic-git:1:125448
i@https://unpkg.com/isomorphic-git:1:120912
i/<@https://unpkg.com/isomorphic-git:1:121012
B/</<@https://unpkg.com/isomorphic-git:1:273404
B/<@https://unpkg.com/isomorphic-git:1:273274
f@https://unpkg.com/isomorphic-git:1:257078
isomorphic-git:1:273747
Source map error: request failed with status 404
Resource URL: https://unpkg.com/isomorphic-git
Source Map URL: bundle.umd.min.js.map

Support reading from .git/objects/pack/*

A fresh clone is leaving my packfiles packed. To be fully compatible with git, we need to support reading from .pack, .idx file pairs in the .git/objects/pack directory.

Info messages while pushing/fetching

Is there a way to get the informational messages sent by the server while pushing/fetching? I'm talking about what is normally prefixed by remote: .... in regular command-line git.

Commits shouldn't contain full path names

We need to break flat trees (e.g. index) into nested trees when we create Commit objects.

It seems to work locally, but then on git push it fails (because of packfiles etc.):

remote: error: object 2dc67cd03eedd139371ccb99990ffd416777f3b7: fullPathname: contains full pathnames
