
torrefy's Introduction

torrefy


An ESM package that uses the Web Streams API to create v1, v2, or hybrid torrents in your web browser.

πŸ—This package is under active development.πŸ—

Install

npm i torrefy # or yarn add torrefy

Basic usage

import { create, encode, decode } from "torrefy";

// create a test file
const testFile = new File(
  ["Hello world. This is the test file content."],
  "testfile.txt"
);

// calculate (hash) the meta info of the test file
const metaInfo = await create([testFile]);

// bencode meta info into a readable stream
const torrentStream = encode(metaInfo);

// tee the readable stream into two readable streams
const [torrentStream1, torrentStream2] = torrentStream.tee();

// consume the first readable stream as an array buffer
const torrentBinary = await new Response(torrentStream1).arrayBuffer();

// decode the second readable stream into meta info
const decodedMetaInfo = await decode(torrentStream2);
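
The resulting buffer can then be saved as a .torrent file in the browser. A minimal sketch using only standard DOM APIs (the file name is arbitrary):

// wrap the array buffer in a Blob and trigger a download
const torrentBlob = new Blob([torrentBinary], { type: "application/x-bittorrent" });
const torrentUrl = URL.createObjectURL(torrentBlob);
const anchor = document.createElement("a");
anchor.href = torrentUrl;
anchor.download = "testfile.torrent";
anchor.click();
URL.revokeObjectURL(torrentUrl);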

Features

Supports Creating V1, V2 or Hybrid Torrents

This package supports creating v1, v2, or hybrid torrents.

Covers Various Web File APIs

This package can handle input files or directories acquired from the File API, the File and Directory Entries API, or the File System Access API.
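
For example, a FileList from a file input element or a directory handle from the File System Access API can be passed to create. This is only a sketch: it assumes create accepts these inputs as-is (per the FileDirLike input type mentioned in the issues below), and showDirectoryPicker requires a supporting browser and a user gesture.

import { create } from "torrefy";

// files chosen through a classic file input element
const fileInput = document.querySelector('input[type="file"]');
const metaInfoFromFiles = await create(fileInput.files);

// a directory picked with the File System Access API
const directoryHandle = await showDirectoryPicker();
const metaInfoFromDirectory = await create(directoryHandle);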

Supports Comprehensive Options

TBD

Supports Handling Progress

TBD

Exposes Stream-Based APIs

The create function consumes an iterable of input files as ReadableStreams with options and populates a MetaInfo object. This function internally uses several TransformStreams to chop the files into pieces and hash them.

The encode function consumes any bcodec friendly entity (e.g. a MetaInfo object) and bencodes it into a ReadableStream.

The decode function consumes any bcodec friendly ReadableStream (e.g. a torrent ReadableStream) and bdecodes it into the corresponding entity. This function internally uses a TransformStream called Tokenizer to tokenize the input ReadableStream and then calls the parse function to parse the Tokens.
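
For example, an existing torrent file fetched over the network can be bdecoded straight from the response body stream (a small sketch; the URL is a placeholder):

import { decode } from "torrefy";

// fetch an existing .torrent file and bdecode its body stream
const response = await fetch("https://example.com/some.torrent");
const decodedMetaInfo = await decode(response.body);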

All TransformStreams used in this package are also exported.

Supports a Comprehensive Set of Bcodec Friendly JavaScript Types

Bcodec friendly JavaScript types include (for the time being):

| Bcodec Type \ JavaScript Type | Strict | Loose |
| --- | --- | --- |
| ByteString | string | string, ArrayBuffer |
| Integer | number, bigint | number, bigint, boolean |
| List | Strict[] | Loose[] |
| Dictionary | {[key: string]: Strict} | {[key: string]: Loose}, Map<string \| ArrayBuffer, Loose> |
| ignored | - | undefined, null |

The encode function supports all Loose type inputs and the decode function always returns Strict type results.
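
For example, the following round trip feeds a Loose object into encode and gets a Strict result back from decode (a sketch; it assumes booleans are bencoded as the integers 0 and 1 and that Maps are treated like dictionaries, per the table above):

import { encode, decode } from "torrefy";

// a Loose input: booleans, Maps and null are accepted
const loose = {
  active: true, // a Loose Integer
  attrs: new Map([["color", "red"]]), // a Loose Dictionary
  skipped: null, // ignored entirely
};

// bencode to a ReadableStream, then bdecode back
const strict = await decode(encode(loose));
// strict contains only Strict types, e.g. numbers instead of booleans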

Supports Hooks in Bencoding

You can register encoder hooks when using the encode function. A common use case is extracting the bencoded info dictionary and calculating the infohash. (This package doesn't provide an out-of-the-box function to calculate the infohash for now.)

To use encoder hooks, you will have to install the peer dependency @sec-ant/trie-map, which acts as an encoder hook system and allows you to register encoder hooks with iterable paths as keys. Refer to its README to learn more about the package.

This package provides several helper functions to help you register hooks in a hook system and consume their results as you please: useUint8ArrayStreamHook, useArrayBufferPromiseHook, useTextPromiseHook. You can also define your own functions to register and use hooks.

Here is how you would typically use this feature:

import { encode, EncoderHookSystem, useArrayBufferPromiseHook } from "torrefy";
import { TrieMap } from "@sec-ant/trie-map";

// create a dummy object to encode
const dummyObject = {
  a: "b",
  c: 1,
  info: {
    foo: "bar",
  },
  s: ["t"],
};

// initialize an encoder hook system
const hookSystem: EncoderHookSystem = new TrieMap();

// register an encoder hook under dummyObject.info path in the hook system
// and consume the result as an array buffer promise
const infoArrayBufferPromise = useArrayBufferPromiseHook(["info"], hookSystem);

// pass the hook system as an input argument to the encode function
const bencodedReadableStream = encode(dummyObject, hookSystem);

// consume the result of the hook
const infoArrayBuffer = await infoArrayBufferPromise; // => ArrayBuffer(12)
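
Since the package doesn't calculate the infohash itself, the hook result can be passed to the Web Crypto API. A sketch for a v1 infohash (SHA-1 over the bencoded info dictionary; v2 infohashes use SHA-256 instead):

// hash the bencoded info dictionary with SHA-1
const infoHashBuffer = await crypto.subtle.digest("SHA-1", infoArrayBuffer);

// format the digest as a lowercase hex string
const infoHash = [...new Uint8Array(infoHashBuffer)]
  .map((byte) => byte.toString(16).padStart(2, "0"))
  .join("");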

torrefy's Issues

migrate to vite/vitest

Vite uses esbuild to prebundle dependencies and transform source code in dev mode, which makes it a perfect choice for testing the package in a browser environment without losing TypeScript features. And Vite is fast.

refactor web streams to async iterator/generators

Some related links

Basically, there are two functions that need implementing:

Other remaining work, including refactoring transformers to async generators, is already done in another branch of this repo, but it still needs testing as I also refactored many other places.
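
As a rough illustration only (not the actual code in that branch), a chunking TransformStream can typically be rewritten as an async generator that consumes an async iterable of Uint8Array chunks and yields fixed-size pieces:

async function* chunkPieces(source, pieceLength) {
  let buffer = new Uint8Array(0);
  for await (const part of source) {
    // append the incoming bytes to whatever is left over
    const next = new Uint8Array(buffer.length + part.length);
    next.set(buffer);
    next.set(part, buffer.length);
    buffer = next;
    // yield full pieces as soon as they are available
    while (buffer.length >= pieceLength) {
      yield buffer.slice(0, pieceLength);
      buffer = buffer.slice(pieceLength);
    }
  }
  // flush the final, possibly shorter piece
  if (buffer.length > 0) {
    yield buffer;
  }
}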

Weird broken results with Cloudflare Workers

Trying to get this module to work with a Cloudflare Worker, but I seem to always get a weird truncated response. Any ideas on what I'm doing wrong?

interface Env {
}

import { create, encode } from "torrefy";

export default {
	async fetch(
		request: Request,
		env: Env,
		ctx: ExecutionContext
	): Promise<Response> {
		const testFile = new File(
			["Hello world. This is the test file content."],
			"testfile.txt"
		);

		// calculate (hash) the meta info of the test file
		const metaInfo = await create([testFile]);
		console.debug(metaInfo);

		// bencode meta info into a readable stream
		const torrentStream = encode(metaInfo);

		return new Response(torrentStream);
	},
};

I end up getting the following data returned:

00000000: 6431 303a 6372 6561 7465 6420 6279 3133  d10:created by13
00000010: 3a63 7265 6174 696f 6e20 6461 7465 343a  :creation date4:
00000020: 696e 666f 65                             infoe

ReadableStream instead of FileDirLike?

For large files, buffering the entire thing into an ArrayBuffer or Blob for new File(...) isn't always doable. Would it be possible to have create([my_ReadableStream]) to avoid needing to buffer everything at once?
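
Until streams are accepted directly, the workaround is still to buffer the stream and wrap it in a File before calling create, which is exactly what this request hopes to avoid (a sketch; my_ReadableStream and the file name are placeholders):

import { create } from "torrefy";

// buffer the whole stream into memory, then wrap it in a File
const blob = await new Response(my_ReadableStream).blob();
const bufferedFile = new File([blob], "large-file.bin");
const metaInfo = await create([bufferedFile]);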
