License: MIT License


FaceAPI

AI-powered Face Detection & Rotation Tracking, Face Description & Recognition, Age & Gender & Emotion Prediction for Browser and NodeJS using TensorFlow/JS


Live Demo: https://vladmandic.github.io/face-api/demo/webcam.html


Additional Documentation




Examples


Browser

A browser example that uses static images and showcases both models
as well as all of the extensions is included in /demo/index.html
It can be accessed directly via GitHub Pages at:
https://vladmandic.github.io/face-api/demo/index.html

A browser example that uses a live webcam is included in /demo/webcam.html
It can be accessed directly via GitHub Pages at:
https://vladmandic.github.io/face-api/demo/webcam.html


Demo using FaceAPI to process images
Note: The photos shown below were taken by me

screenshot

Demo using FaceAPI to process live webcam

screenshot


NodeJS

NodeJS examples are:

  • /demo/node-simple.js: Simplest possible NodeJS demo for FaceAPI in under 30 lines of JavaScript code
  • /demo/node.js:
    Uses TFJS native methods to load images without external dependencies
  • /demo/node-canvas.js and /demo/node-image.js:
    Use the external canvas module to load images,
    which also allows for drawing and saving images inside a NodeJS environment
  • /demo/node-match.js:
    Simple demo that compares face similarity from a given image
    to a second image or a list of images in a folder
  • /demo/node-multiprocess.js:
    Multiprocessing showcase that uses a pool of worker processes
    (node-multiprocess-worker.js)
    Main starts a fixed pool of worker processes, with each worker having
    its own instance of FaceAPI
    Workers notify main when they are ready, and main dispatches
    jobs to each ready worker until the job queue is empty
2021-03-14 08:42:03 INFO:  @vladmandic/face-api version 1.0.2
2021-03-14 08:42:03 INFO:  User: vlado Platform: linux Arch: x64 Node: v15.7.0
2021-03-14 08:42:03 INFO:  FaceAPI multi-process test
2021-03-14 08:42:03 STATE:  Main: started worker: 1888019
2021-03-14 08:42:03 STATE:  Main: started worker: 1888025
2021-03-14 08:42:04 STATE:  Worker: PID: 1888025 TensorFlow/JS 3.3.0 FaceAPI 1.0.2 Backend: tensorflow
2021-03-14 08:42:04 STATE:  Worker: PID: 1888019 TensorFlow/JS 3.3.0 FaceAPI 1.0.2 Backend: tensorflow
2021-03-14 08:42:04 STATE:  Main: dispatching to worker: 1888019
2021-03-14 08:42:04 STATE:  Main: dispatching to worker: 1888025
2021-03-14 08:42:04 DATA:  Worker received message: 1888019 { image: 'demo/sample1.jpg' }
2021-03-14 08:42:04 DATA:  Worker received message: 1888025 { image: 'demo/sample2.jpg' }
2021-03-14 08:42:06 DATA:  Main: worker finished: 1888025 detected faces: 3
2021-03-14 08:42:06 STATE:  Main: dispatching to worker: 1888025
2021-03-14 08:42:06 DATA:  Worker received message: 1888025 { image: 'demo/sample3.jpg' }
2021-03-14 08:42:06 DATA:  Main: worker finished: 1888019 detected faces: 3
2021-03-14 08:42:06 STATE:  Main: dispatching to worker: 1888019
2021-03-14 08:42:06 DATA:  Worker received message: 1888019 { image: 'demo/sample4.jpg' }
2021-03-14 08:42:07 DATA:  Main: worker finished: 1888025 detected faces: 3
2021-03-14 08:42:07 STATE:  Main: dispatching to worker: 1888025
2021-03-14 08:42:07 DATA:  Worker received message: 1888025 { image: 'demo/sample5.jpg' }
2021-03-14 08:42:08 DATA:  Main: worker finished: 1888019 detected faces: 4
2021-03-14 08:42:08 STATE:  Main: dispatching to worker: 1888019
2021-03-14 08:42:08 DATA:  Worker received message: 1888019 { image: 'demo/sample6.jpg' }
2021-03-14 08:42:09 DATA:  Main: worker finished: 1888025 detected faces: 5
2021-03-14 08:42:09 STATE:  Main: worker exit: 1888025 0
2021-03-14 08:42:09 DATA:  Main: worker finished: 1888019 detected faces: 4
2021-03-14 08:42:09 INFO:  Processed 15 images in 5944 ms
2021-03-14 08:42:09 STATE:  Main: worker exit: 1888019 0
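The dispatch pattern visible in the log above — a fixed pool of workers draining a shared job queue — can be sketched in plain JavaScript. This is a simplified in-process simulation; the actual demo forks separate worker processes and communicates over IPC:

```javascript
// Simplified in-process simulation of the node-multiprocess dispatch loop:
// a fixed pool of "workers" drains a shared job queue until it is empty.
// The real demo forks child processes and exchanges IPC messages instead.

async function runPool(jobs, poolSize, processJob) {
  const queue = [...jobs];
  const results = [];
  async function worker(id) {
    while (queue.length > 0) {
      const job = queue.shift(); // main hands the next job to this ready worker
      results.push(await processJob(id, job));
    }
  }
  // start the fixed pool; each worker keeps pulling until the queue is empty
  await Promise.all(Array.from({ length: poolSize }, (_, id) => worker(id)));
  return results;
}

// usage: pretend each "image" takes some time to process
const images = ['sample1.jpg', 'sample2.jpg', 'sample3.jpg', 'sample4.jpg'];
runPool(images, 2, async (id, image) => {
  await new Promise((resolve) => setTimeout(resolve, 10)); // fake detection work
  return { worker: id, image };
}).then((results) => console.log(`processed ${results.length} images`));
```

Because JavaScript executes each synchronous segment atomically, the `shift()` call cannot hand the same job to two workers, which is the same guarantee the real demo gets from dispatching over IPC from a single main process.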

Note that @tensorflow/tfjs-node or @tensorflow/tfjs-node-gpu
must be installed before running any of the NodeJS examples




Quick Start

Simply include the latest version of FaceAPI directly from a CDN in your HTML
(pick one: jsdelivr or unpkg)

<script src="https://cdn.jsdelivr.net/npm/@vladmandic/face-api/dist/face-api.js"></script>
<script src="https://unpkg.dev/@vladmandic/face-api/dist/face-api.js"></script>

Installation

FaceAPI ships with several pre-built versions of the library:

  • dist/face-api.js: IIFE format for client-side Browser execution
    with TFJS pre-bundled
  • dist/face-api.esm.js: ESM format for client-side Browser execution
    with TFJS pre-bundled
  • dist/face-api.esm-nobundle.js: ESM format for client-side Browser execution
    without TFJS pre-bundled
  • dist/face-api.node.js: CommonJS format for server-side NodeJS execution
    without TFJS pre-bundled
  • dist/face-api.node-gpu.js: CommonJS format for server-side NodeJS execution
    without TFJS pre-bundled and optimized for CUDA GPU acceleration

Defaults are:

{
  "main": "dist/face-api.node.js",
  "module": "dist/face-api.esm.js",
  "browser": "dist/face-api.esm.js"
}

The bundled TFJS can be used directly via the export: faceapi.tf

The reason for the additional nobundle version is to let you include
a specific version of TFJS instead of relying on the pre-packaged one

FaceAPI is compatible with TFJS 2.0+, 3.0+ and 4.0+

All versions include a sourcemap
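As an illustrative decision guide — this helper is not part of FaceAPI, just a restatement of the variant list above in code form:

```javascript
// Illustrative helper (not part of FaceAPI) that maps an environment
// to the matching pre-built bundle from the list above.
function pickBundle({ node = false, gpu = false, bundledTfjs = true, esm = true } = {}) {
  if (node) return gpu ? 'dist/face-api.node-gpu.js' : 'dist/face-api.node.js'; // CommonJS, TFJS not bundled
  if (!bundledTfjs) return 'dist/face-api.esm-nobundle.js'; // bring your own TFJS
  return esm ? 'dist/face-api.esm.js' : 'dist/face-api.js'; // ESM vs IIFE, TFJS bundled
}

console.log(pickBundle({ node: true, gpu: true })); // dist/face-api.node-gpu.js
console.log(pickBundle({ bundledTfjs: false }));    // dist/face-api.esm-nobundle.js
```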




There are several ways to use FaceAPI:

1. IIFE script

Recommended for quick tests and for backward compatibility with older browsers that do not support ESM, such as IE

This is the simplest way to use FaceAPI within a browser.
Simply download dist/face-api.js, include it in your HTML file, and it's ready to use:

<script src="dist/face-api.js"></script>

Or skip the download and include it directly from a CDN:

<script src="https://cdn.jsdelivr.net/npm/@vladmandic/face-api/dist/face-api.js"></script>

IIFE script bundles TFJS and auto-registers global namespace faceapi within Window object which can be accessed directly from a <script> tag or from your JS file.


2. ESM module

Recommended for usage within Browser

2.1. Direct Import

To use an ESM import directly in a browser, you must load your script (e.g. index.js) with type="module":

  <script src="./index.js" type="module"></script>

and then in your index.js

  import * as faceapi from 'dist/face-api.esm.js';

2.2. With Bundler

Same as above, but the expectation is that you've installed the @vladmandic/face-api package:

  npm install @vladmandic/face-api 

and that you'll package your application using a bundler such as webpack, rollup, or esbuild.
In that case, you do not need to load the script as a module - that depends on your bundler configuration:

  import * as faceapi from '@vladmandic/face-api';

or, if your bundler doesn't resolve the recommended module field, force usage with:

  import * as faceapi from '@vladmandic/face-api/dist/face-api.esm.js';

or, to use the non-bundled version:

  import * as tf from '@tensorflow/tfjs';
  import * as faceapi from '@vladmandic/face-api/dist/face-api.esm-nobundle.js';

3. NPM module

3.1. Import CommonJS

Recommended for NodeJS projects

Note: FaceAPI for NodeJS does not bundle TFJS due to binary dependencies that are installed during TFJS installation

Install with:

  npm install @tensorflow/tfjs-node
  npm install @vladmandic/face-api 

And then use with:

  const tf = require('@tensorflow/tfjs-node')
  const faceapi = require('@vladmandic/face-api');

If you want to force the CommonJS module instead of relying on the recommended field:

  const faceapi = require('@vladmandic/face-api/dist/face-api.node.js');

If you want GPU-accelerated execution in NodeJS, you must have the CUDA libraries already installed and working.
Then install the appropriate version of FaceAPI:

  npm install @tensorflow/tfjs-node-gpu
  npm install @vladmandic/face-api 

And then use with:

  const tf = require('@tensorflow/tfjs-node-gpu')
  const faceapi = require('@vladmandic/face-api/dist/face-api.node-gpu.js'); // this loads face-api version with correct bindings for tfjs-node-gpu

If you want to use FaceAPI in NodeJS on platforms where the tensorflow binary libraries are not supported, you can use the NodeJS WASM backend:

  npm install @tensorflow/tfjs
  npm install @tensorflow/tfjs-backend-wasm
  npm install @vladmandic/face-api 

And then use with:

  const tf = require('@tensorflow/tfjs');
  const wasm = require('@tensorflow/tfjs-backend-wasm');
  const faceapi = require('@vladmandic/face-api/dist/face-api.node-wasm.js'); // use this when using face-api in dev mode
  wasm.setWasmPaths('https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm/dist/');
  await tf.setBackend('wasm');
  await tf.ready();
  ...

If you want to use graphical functions inside NodeJS,
you must provide an appropriate graphics library, as
NodeJS does not include implementations for DOM elements
such as HTMLImageElement or HTMLCanvasElement:

Install Canvas for NodeJS:

npm install canvas

Patch the NodeJS environment to use the newly installed Canvas library:

const canvas = require('canvas');
const faceapi = require('@vladmandic/face-api');

const { Canvas, Image, ImageData } = canvas
faceapi.env.monkeyPatch({ Canvas, Image, ImageData })
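Conceptually, monkeyPatch just swaps the DOM constructors that the library's environment object uses. A minimal sketch of the mechanism, with a stand-in class instead of the real canvas package (illustrative only, not the actual faceapi internals):

```javascript
// Minimal sketch of the monkey-patch mechanism: the library keeps an
// environment object holding DOM constructors, and patching replaces them
// with NodeJS-capable implementations (e.g. from the canvas package).
const env = {
  Canvas: null,
  Image: null,
  ImageData: null,
  monkeyPatch(impl) { Object.assign(this, impl); },
  createCanvasElement() {
    if (!this.Canvas) throw new Error('Canvas implementation not provided');
    return new this.Canvas(1, 1);
  },
};

// stand-in for require('canvas') so the sketch is self-contained
class FakeCanvas { constructor(w, h) { this.width = w; this.height = h; } }

env.monkeyPatch({ Canvas: FakeCanvas });
console.log(env.createCanvasElement() instanceof FakeCanvas); // true
```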




Weights

Pretrained models and their weights are included in ./model.




Test & Dev Web Server

To install development dependencies, use npm install --production=false

The built-in test & dev web server can be started with:

npm run dev

By default it starts an HTTP server on port 8000 and an HTTPS server on port 8001:

2022-01-14 09:56:19 INFO:  @vladmandic/face-api version 1.6.4
2022-01-14 09:56:19 INFO:  User: vlado Platform: linux Arch: x64 Node: v17.2.0
2022-01-14 09:56:19 INFO:  Application: { name: '@vladmandic/face-api', version: '1.6.4' }
2022-01-14 09:56:19 INFO:  Environment: { profile: 'development', config: '.build.json', package: 'package.json', tsconfig: true, eslintrc: true, git: true }
2022-01-14 09:56:19 INFO:  Toolchain: { build: '0.6.7', esbuild: '0.14.11', typescript: '4.5.4', typedoc: '0.22.10', eslint: '8.6.0' }
2022-01-14 09:56:19 INFO:  Build: { profile: 'development', steps: [ 'serve', 'watch', 'compile' ] }
2022-01-14 09:56:19 STATE: WebServer: { ssl: false, port: 8000, root: '.' }
2022-01-14 09:56:19 STATE: WebServer: { ssl: true, port: 8001, root: '.', sslKey: 'build/cert/https.key', sslCrt: 'build/cert/https.crt' }
2022-01-14 09:56:19 STATE: Watch: { locations: [ 'src/**', 'README.md', 'src/**', 'src/**' ] }
2022-01-14 09:56:19 STATE: Compile: { name: 'tfjs/node/cpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 143, outputBytes: 1276 }
2022-01-14 09:56:19 STATE: Compile: { name: 'faceapi/node/cpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node.js', files: 162, inputBytes: 234787, outputBytes: 175203 }
2022-01-14 09:56:19 STATE: Compile: { name: 'tfjs/node/gpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-gpu.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 147, outputBytes: 1296 }
2022-01-14 09:56:19 STATE: Compile: { name: 'faceapi/node/gpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-gpu.js', files: 162, inputBytes: 234807, outputBytes: 175219 }
2022-01-14 09:56:19 STATE: Compile: { name: 'tfjs/node/wasm', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-wasm.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 185, outputBytes: 1367 }
2022-01-14 09:56:19 STATE: Compile: { name: 'faceapi/node/wasm', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-wasm.js', files: 162, inputBytes: 234878, outputBytes: 175294 }
2022-01-14 09:56:19 STATE: Compile: { name: 'tfjs/browser/tf-version', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-version.ts', output: 'dist/tfjs.version.js', files: 1, inputBytes: 1063, outputBytes: 1662 }
2022-01-14 09:56:19 STATE: Compile: { name: 'tfjs/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 2, inputBytes: 2172, outputBytes: 811 }
2022-01-14 09:56:19 STATE: Compile: { name: 'faceapi/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm-nobundle.js', files: 162, inputBytes: 234322, outputBytes: 169437 }
2022-01-14 09:56:19 STATE: Compile: { name: 'tfjs/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 11, inputBytes: 2172, outputBytes: 2444105 }
2022-01-14 09:56:20 STATE: Compile: { name: 'faceapi/browser/iife/bundle', format: 'iife', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.js', files: 162, inputBytes: 2677616, outputBytes: 1252572 }
2022-01-14 09:56:20 STATE: Compile: { name: 'faceapi/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm.js', files: 162, inputBytes: 2677616, outputBytes: 2435063 }
2022-01-14 09:56:20 INFO:  Listening...
...
2022-01-14 09:56:46 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'text/html', size: 1047, url: '/', remote: '::1' }
2022-01-14 09:56:46 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'text/javascript', size: 6919, url: '/index.js', remote: '::1' }
2022-01-14 09:56:46 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'text/javascript', size: 2435063, url: '/dist/face-api.esm.js', remote: '::1' }
2022-01-14 09:56:47 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 4125244, url: '/dist/face-api.esm.js.map', remote: '::1' }
2022-01-14 09:56:47 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/json', size: 3219, url: '/model/tiny_face_detector_model-weights_manifest.json', remote: '::1' }
2022-01-14 09:56:47 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 193321, url: '/model/tiny_face_detector_model.bin', remote: '::1' }
2022-01-14 09:56:47 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/json', size: 28233, url: '/model/ssd_mobilenetv1_model-weights_manifest.json', remote: '::1' }
2022-01-14 09:56:47 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 5616957, url: '/model/ssd_mobilenetv1_model.bin', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/json', size: 8392, url: '/model/age_gender_model-weights_manifest.json', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 429708, url: '/model/age_gender_model.bin', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/json', size: 8485, url: '/model/face_landmark_68_model-weights_manifest.json', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 356840, url: '/model/face_landmark_68_model.bin', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/json', size: 19615, url: '/model/face_recognition_model-weights_manifest.json', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 6444032, url: '/model/face_recognition_model.bin', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/json', size: 6980, url: '/model/face_expression_model-weights_manifest.json', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'application/octet-stream', size: 329468, url: '/model/face_expression_model.bin', remote: '::1' }
2022-01-14 09:56:48 DATA:  HTTPS: { method: 'GET', ver: '2.0', status: 200, mime: 'image/jpeg', size: 144516, url: '/sample1.jpg', remote: '::1' }




Build

If you want to do a full rebuild, either download the npm module:

npm install @vladmandic/face-api
cd node_modules/@vladmandic/face-api

or clone the git project:

git clone https://github.com/vladmandic/face-api
cd face-api

Then install all dependencies and run the rebuild:

npm install --production=false
npm run build

The build process uses the @vladmandic/build module, which creates an optimized build for each target:

> @vladmandic/face-api@1.7.1 build /home/vlado/dev/face-api
> node build.js

2022-07-25 08:21:05 INFO:  Application: { name: '@vladmandic/face-api', version: '1.7.1' }
2022-07-25 08:21:05 INFO:  Environment: { profile: 'production', config: '.build.json', package: 'package.json', tsconfig: true, eslintrc: true, git: true }
2022-07-25 08:21:05 INFO:  Toolchain: { build: '0.7.7', esbuild: '0.14.50', typescript: '4.7.4', typedoc: '0.23.9', eslint: '8.20.0' }
2022-07-25 08:21:05 INFO:  Build: { profile: 'production', steps: [ 'clean', 'compile', 'typings', 'typedoc', 'lint', 'changelog' ] }
2022-07-25 08:21:05 STATE: Clean: { locations: [ 'dist/*', 'typedoc/*', 'types/lib/src' ] }
2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/node/cpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 143, outputBytes: 614 }
2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/node/cpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node.js', files: 162, inputBytes: 234137, outputBytes: 85701 }
2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/node/gpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-gpu.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 147, outputBytes: 618 }
2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/node/gpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-gpu.js', files: 162, inputBytes: 234141, outputBytes: 85705 }
2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/node/wasm', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-wasm.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 185, outputBytes: 670 }
2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/node/wasm', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-wasm.js', files: 162, inputBytes: 234193, outputBytes: 85755 }
2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/browser/tf-version', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-version.ts', output: 'dist/tfjs.version.js', files: 1, inputBytes: 1063, outputBytes: 400 }
2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 2, inputBytes: 910, outputBytes: 527 }
2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm-nobundle.js', files: 162, inputBytes: 234050, outputBytes: 82787 }
2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 11, inputBytes: 910, outputBytes: 1184871 }
2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/browser/iife/bundle', format: 'iife', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.js', files: 162, inputBytes: 1418394, outputBytes: 1264631 }
2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm.js', files: 162, inputBytes: 1418394, outputBytes: 1264150 }
2022-07-25 08:21:07 STATE: Typings: { input: 'src/index.ts', output: 'types/lib', files: 93 }
2022-07-25 08:21:09 STATE: TypeDoc: { input: 'src/index.ts', output: 'typedoc', objects: 154, generated: true }
2022-07-25 08:21:13 STATE: Lint: { locations: [ 'src/' ], files: 174, errors: 0, warnings: 0 }
2022-07-25 08:21:14 STATE: ChangeLog: { repository: 'https://github.com/vladmandic/face-api', branch: 'master', output: 'CHANGELOG.md' }
2022-07-25 08:21:14 INFO:  Done...
2022-07-25 08:21:14 STATE: Copy: { input: 'types/lib/dist/tfjs.esm.d.ts' }
2022-07-25 08:21:15 STATE: API-Extractor: { succeeeded: true, errors: 0, warnings: 417 }
2022-07-25 08:21:15 INFO:  FaceAPI Build complete...




Face Mesh

The FaceAPI landmark model returns a 68-point face mesh, as detailed in the image below:

facemesh
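The 68 points follow the common iBUG-style annotation, so fixed index ranges map to face regions; face-api's FaceLandmarks68 getters (getJawOutline, getNose, getMouth, etc.) slice these ranges. A plain-JS sketch:

```javascript
// Index ranges of the 68-point landmark convention used by the model.
// Region names mirror face-api's FaceLandmarks68 getters; boundaries are
// [start, end) as in Array.prototype.slice.
const REGIONS = {
  jawOutline:   [0, 17],
  leftEyeBrow:  [17, 22],
  rightEyeBrow: [22, 27],
  nose:         [27, 36],
  leftEye:      [36, 42],
  rightEye:     [42, 48],
  mouth:        [48, 68],
};

// given the flat array of 68 {x, y} points, pull out one region
function regionPoints(positions, region) {
  const [start, end] = REGIONS[region];
  return positions.slice(start, end);
}

// usage with dummy points
const points = Array.from({ length: 68 }, (_, i) => ({ x: i, y: i }));
console.log(regionPoints(points, 'mouth').length); // 20
```

Note that "left" and "right" here follow the image's perspective, not the subject's.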




Note

This is an updated version of face-api.js built against the latest available TensorFlow/JS, as the original is not compatible with tfjs >=2.0.
Forked from face-api.js version 0.22.2, which was released on March 22nd, 2020

Why? I needed a FaceAPI that does not cause version conflicts with newer versions of TensorFlow
And since the original FaceAPI was open-source, I've released this version as well

The changes ended up being too large for a simple pull request, so this became a full-fledged version of its own
Plus many features have been added since the original inception

Although a lot of work has gone into this version of FaceAPI and it will continue to be maintained,
at this time it is completely superseded by my newer library Human, which covers the same use cases
but extends them with newer AI models, additional detection details, compatibility with the latest web standards, and more


Differences

Compared to face-api.js version 0.22.2:

  • Compatible with TensorFlow/JS 2.0+, 3.0+ and 4.0+
    Currently using TensorFlow/JS 4.16
    Original face-api.js is based on TFJS 1.7.4
  • Compatible with WebGL, CPU and WASM TFJS Browser backends
  • Compatible with both tfjs-node and tfjs-node-gpu TFJS NodeJS backends
  • Updated all type castings for TypeScript type checking to TypeScript 5.3
  • Switched bundling from UMD to ESM + CommonJS with fallback to IIFE
    Resulting code is optimized per-platform instead of being universal
    Fully tree shakable when imported as an ESM module
    Browser bundle process uses ESBuild instead of Rollup
  • Added separate face-api versions with tfjs pre-bundled and without tfjs
    When using -nobundle version, user can load any version of tfjs manually
  • Typescript build process now targets ES2018 instead of dual ES5/ES6
    Resulting code is clean ES2018 JavaScript without polyfills
  • Removed old tests, docs, examples
  • Removed old package dependencies (karma, jasmine, babel, etc.)
  • Updated all package dependencies
  • Updated TensorFlow/JS dependencies since backends were removed from @tensorflow/tfjs-core
  • Updated mobileNetv1 model due to batchNorm() dependency
  • Added version class that returns JSON object with version of FaceAPI as well as linked TFJS
  • Added test/dev built-in HTTP & HTTPS Web server
  • Removed mtcnn and tinyYolov2 models as they were non-functional in the latest public version of FaceAPI
    This means the valid models are tinyFaceDetector and mobileNetv1
    If there is demand, I can re-implement them
  • Added face angle calculation that returns roll, yaw and pitch
  • Added typedoc automatic API specification generation during build
  • Added changelog automatic generation during build
  • New process to generate TypeDocs bundle using API-Extractor

Credits



Contributors

abdemirza, bettysteger, khwalkowicz, mayankagarwals, meeki007, ninadontwant, patrickhulce, rebser, thesohaibahmed, vladmandic, xemle


face-api's Issues

face-api.esm-nobundle.js seems to include tensorflow

I understood we had to specifically use face-api.esm-nobundle.js in order to be able to use tensorflow externally.
But here are my vendors before:
[screenshot]

Then when I import face-api:

import * as faceapi from '@vladmandic/face-api/dist/face-api.esm-nobundle.js';

[screenshot]

Did I understand something wrong?

Can't load models from URI on the node.js side

When I try to load models from node I get this error; I can load them from disk without any problem:

await faceapi.nets.faceExpressionNet.loadFromDisk(modelPath); // loads fine

await faceapi.nets.faceExpressionNet.loadFromUri("http://localhost:5000/app/ai/faceapi/models/"); // can't load; throws this error:

functions: TypeError: fetch is not a function
    at fetchOrThrow (C:\Development\hemsor\ticore\node_modules\@vladmandic\face-api\dist\face-api.node.js:1513:21)
    at fetchJson (C:\Development\hemsor\ticore\node_modules\@vladmandic\face-api\dist\face-api.node.js:1532:17)
    at loadWeightMap (C:\Development\hemsor\ticore\node_modules\@vladmandic\face-api\dist\face-api.node.js:1573:26)
    at SsdMobilenetv1.loadFromUri (C:\Development\hemsor\ticore\node_modules\@vladmandic\face-api\dist\face-api.node.js:1661:29)
    at C:\Development\hemsor\functions\lib\handler\example.handler.js:121:39
    at Generator.next ()
    at fulfilled (C:\Development\hemsor\functions\lib\handler\example.handler.js:24:58)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)

I need to load from a URI because I will put them into a Google Cloud Function, which has no disk.

Best Regards.

Protection against image fraud

Any idea how? Recognizing real people is not a problem, I know it's possible. But is it implemented here? I myself have been using the original project for 2 years.
Thank you

Is there WASM support?

"Face-api.js" includes "WEBGL" and "CPU". CPU is very slow. I want to use "WASM" for tensorflowjs. Now tensorflowjs supports WASM:

https://blog.tensorflow.org/2020/03/introducing-webassembly-backend-for-tensorflow-js.html

How can I use it with WASM? TensorflowJS with WASM is below:

https://github.com/tensorflow/tfjs/tree/master/tfjs-backend-wasm

I added "import * as faceapi from './dist/face-api.esm.js';". After that, how can I switch to WASM with an import?

Can you give an example?

The current tensorflowjs version is 2.7.0. Also, will you update it to tensorflowjs version 2.8.1?

Thanks in advance.

wasm streaming compile failed: TypeError: Failed to execute 'compile' on 'WebAssembly': Incorrect response MIME type. Expected 'application/wasm'

this happens to me sometimes and sometimes it does not:

vendor.5719bdd2.js:14910 [Deprecation] SharedArrayBuffer will require cross-origin isolation as of M91, around May 2021. See https://developer.chrome.com/blog/enabling-shared-array-buffer/ for more details.

14.337f99a0.js:1 Detect Error: Error: TinyYolov2 - load model before inference

vendor.5719bdd2.js:195 wasm streaming compile failed: TypeError: Failed to execute 'compile' on 'WebAssembly': Incorrect response MIME type. Expected 'application/wasm'.

falling back to ArrayBuffer instantiation

I work with Vue:

import * as faceapi from "@vladmandic/face-api/dist/face-api.esm.js";
var backendWasm = await this.backendWasm();
Promise.all([faceapi.nets.tinyFaceDetector.loadFromUri("/statics/models")]);

this.video.addEventListener(
    "play",
    function() {
      self.canvasFace = document.getElementById("c1");
      self.displaySize = {
        width: self.videoWidth,
        height: self.videoHeight
      };
      faceapi.matchDimensions(self.canvasFace, self.displaySize);
      self.canvasText = document.getElementById("c2");
      self.timerCallback();
    },
    false
  );

timerCallback: async function() {
  console.log('timer');
  if (this.video.paused || this.video.ended) {
    return;
  }
  let box = {};
  let xBox;
  let boxArea;
  let anchBoundBox;
  let altBoundBox;
  let yBox;
  if (this.$q.screen.xs) {
    box = { x: 50, y: 37.5, width: 200, height: 225 };
    xBox = 10;
    yBox = box.y + 25;
    boxArea = 20000;
  } else {
    box = { x: 75, y: 37.5, width: 150, height: 225 };
    xBox = box.x - 35;
    yBox = box.y + 45;
    boxArea = 21000;
  }

  var lenghtx = box.x + box.width;
  var lenghty = box.y + box.height;
  // see DrawBoxOptions below
  const drawOptions = {
    lineWidth: 2,
    boxColor: "#23b2be"
  };
  const drawBox = new faceapi.draw.DrawBox(box, drawOptions);
  drawBox.draw(this.canvasFace);
  let inputSize = 128;
  let scoreThresholds = 0.2;
  let self = this;
  faceapi
    .detectAllFaces(
      self.video,
      new faceapi.TinyFaceDetectorOptions({ inputSize, scoreThresholds })
    )
    .then(result => {
      const resizedDetections = faceapi.resizeResults(
        result,
        self.displaySize
      );

      if (resizedDetections[0]) {
        if (self.$q.screen.xs) {
          anchBoundBox = resizedDetections[0].box["width"] - 40;
          altBoundBox = resizedDetections[0].box["height"] - 35;
        } else {
          anchBoundBox = resizedDetections[0].box["width"] - 15;
          altBoundBox = resizedDetections[0].box["height"] - 15;
        }
        if (
          anchBoundBox + resizedDetections[0].box["x"] > lenghtx ||
          resizedDetections[0].box["x"] < xBox ||
          resizedDetections[0].box["y"] < yBox ||
          altBoundBox + resizedDetections[0].box["y"] > lenghty
        ) {
          self.validCentered = false;
          self.textAlert = "Centre su rostro";
          self.paintBoxAlert(self.textAlert);
          self.$emit("detect-centered", false);
        } else {
          self.validCentered = true;
          self.$emit("detect-centered", true);
        }
        if (resizedDetections[0].box.area > boxArea) {
          self.validProximity = true;
          self.$emit("detect-proximity", true);
        } else {
          self.textAlert = "Acerque su rostro";
          self.paintBoxAlert(self.textAlert);
          self.validProximity = false;
          self.$emit("detect-proximity", false);
        }
      } else {
        self.validProximity = false;
        self.validCentered = false;
      }
      if (self.validCentered && self.validProximity) {
        self.canvasText
          .getContext("2d")
          .clearRect(0, 0, self.canvasText.width, 20);
      }
      requestAnimationFrame(() => self.timerCallback());
      return true;
    })
    .catch(err => {
      console.error(`Detect Error: ${err}`);
      return false;
    });
  return false;
},

how to fully optimize processing to take advantage of all available hardware

I run this function:

  async function analyzeFrame (frameData, index) {
    const tensor = await getTensorFromBuffer(frameData)
    const faces = await faceapi.detectAllFaces(tensor, optionsSSDMobileNet).withFaceLandmarks().withFaceDescriptors()
    tensor.dispose()
    for (const face of faces) { // forEach(async ...) never awaits its callback; prefer for...of
      // console.log(face)
    }
  }

for hundreds of frames of a video. When doing so, it takes about half the time it would using only tfjs-node. It fills the RAM of my two 1080s, but the only usage that changes when I start the script is that GPU 1's copy utilization goes to about 15%.

I want to take better advantage of my hardware and run this faster.

Windows 10, CUDA 10, TFJS 2.7.0
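
One way to keep the GPU busier is to have several frames in flight at once instead of awaiting each `analyzeFrame()` call serially. A minimal sketch of that idea, assuming an `analyzeFrame`-style async analyzer (`processInBatches` is a hypothetical helper, not part of face-api; the dummy analyzer below stands in for the real one):

```javascript
// Process frames in fixed-size batches: each batch runs concurrently,
// batches run sequentially so memory stays bounded.
async function processInBatches(frames, analyze, batchSize = 4) {
  const results = [];
  for (let i = 0; i < frames.length; i += batchSize) {
    const batch = frames.slice(i, i + batchSize);
    // All analyses in this batch are started at once and awaited together.
    results.push(...await Promise.all(batch.map((frame, j) => analyze(frame, i + j))));
  }
  return results;
}

// Demo with a dummy analyzer that just echoes the frame index.
processInBatches([10, 20, 30, 40, 50], async (frame, index) => index, 2)
  .then((out) => console.log(out)); // resolves to [0, 1, 2, 3, 4]
```

Whether this actually speeds things up depends on the backend; with tfjs-node-gpu the kernels themselves run on the GPU, so concurrency mainly helps overlap image decoding with inference.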

findBestMatch - is not working correctly

Hi,
When I inserted this face into the db:
cGc
and compared it with this one, it found the first one! Why? It should return unknown.
cGc40
The distance threshold is 0.5, and the actual result is:
"bestMatchPerson": {
"_label": "2",
"_distance": 0.44472999761672266
}
What am I missing? Are the image inputs not good enough for comparison?
Thanks in advance.
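
For context on the result above: a FaceMatcher-style best match returns the nearest stored descriptor, and only falls back to "unknown" when even the nearest one exceeds the distance threshold, so a distance of 0.4447 against a 0.5 threshold counts as a match. A minimal sketch of that logic (illustrative names, not the actual face-api internals):

```javascript
// Euclidean distance between two descriptor vectors of equal length.
function euclideanDistance(a, b) {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Return the nearest labeled descriptor, or "unknown" if even the
// nearest one is farther than the threshold.
function findBestMatch(query, labeled, threshold = 0.5) {
  let best = { label: "unknown", distance: Infinity };
  for (const { label, descriptor } of labeled) {
    const distance = euclideanDistance(query, descriptor);
    if (distance < best.distance) best = { label, distance };
  }
  return best.distance <= threshold ? best : { label: "unknown", distance: best.distance };
}

// A distance of ~0.42 is below the 0.5 threshold, so this counts as a match.
console.log(findBestMatch([0, 0], [{ label: "2", descriptor: [0.3, 0.3] }]));
```

Lowering the threshold makes matching stricter, at the cost of more false "unknown" results for genuine matches.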

How can my web app enable it all (WASM + SIMD + multi-thread)?

I tried Google Chrome and Firefox.
There is a SIMD problem in Google Chrome:

DEBUG: false
IS_BROWSER: true
IS_NODE: false
IS_TEST: false
PROD: true
WASM_HAS_MULTITHREAD_SUPPORT: true
WASM_HAS_SIMD_SUPPORT: false

My Google Chrome version is 87.
How can I enable SIMD in Google Chrome?
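
One way to check whether the browser's engine supports WASM SIMD at all is to validate a tiny module that contains a SIMD instruction; the byte sequence below is the probe used by the wasm-feature-detect project (note that in older Chrome versions SIMD also had to be switched on manually, e.g. via a chrome://flags entry, before it shipped enabled by default):

```javascript
// WebAssembly.validate() returns true only if the engine accepts this
// module, which uses a SIMD (v128) instruction in its body.
function wasmSimdSupported() {
  return WebAssembly.validate(new Uint8Array([
    0, 97, 115, 109, 1, 0, 0, 0, 1, 5, 1, 96, 0, 1, 123,
    3, 2, 1, 0, 10, 10, 1, 8, 0, 65, 0, 253, 15, 253, 98, 11,
  ]));
}

console.log("WASM SIMD supported:", wasmSimdSupported());
```

The same probe works in Node and in browsers, so it can gate whether to load the SIMD-enabled WASM backend binary.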

Information about models used by FaceAPI

Hi!

For academic purposes I would need to get more information about some models (Tiny Face Detector, Face Recognizer Net and Face Landmark Model).

Is there any documentation (papers, benchmarks...) that I could check out? Who developed and trained these models?

Thanks!

There is a memory leak: faceapi.extractFaceTensors and faceapi.extractFaces

faceapi.tf.setBackend('webgl');
faceapi.tf.enableProdMode();
faceapi.tf.ENV.set('DEBUG', false);
await faceapi.tf.ready(); // ready() returns a promise

.......
const videoTensor = await faceapi.tf.browser.fromPixels(video);
const detections = await faceapi.detectAllFaces(videoTensor, new faceapi.SsdMobilenetv1Options({ minConfidence: minScore, maxResults }));
const faceImages = await faceapi.extractFaceTensors(videoTensor, detections);

videoTensor.dispose(); // dispose() is synchronous; no await needed
faceImages[0].dispose();

I monitor my GPU memory and I can see a memory leak. Maybe we can use x.dispose(); I tried it, but the problem continues...
I checked the faceapi.detectAllFaces function; there is no leak there. I think it is in the faceapi.extractFaceTensors function.
How can I fix it?
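
For reference, the usual pattern that avoids leaking extracted faces is to dispose every tensor returned by extractFaceTensors, not only the first one. A standalone sketch with mocked tensors so it runs without tfjs (`disposeAll` and `makeTensor` are illustrative names, not face-api API):

```javascript
// Dispose every tensor in an array; tf.Tensor.dispose() is synchronous.
function disposeAll(tensors) {
  for (const t of tensors) t.dispose();
}

// Mock "tensor" that just tracks whether it was disposed; in the real
// code these would come from faceapi.extractFaceTensors().
const makeTensor = () => ({ disposed: false, dispose() { this.disposed = true; } });

const faceImages = [makeTensor(), makeTensor(), makeTensor()];
disposeAll(faceImages);
console.log(faceImages.every((t) => t.disposed)); // logs true
```

With real tfjs tensors, comparing `tf.memory().numTensors` before and after each frame is a quick way to confirm whether any tensor is still being leaked.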

ReferenceError: require is not defined

Hi

This is the error:

file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:5
(followed by several thousand characters of the minified face-api.node.js bundle, omitted here)
this.databaseAction(this.modelPath,e)}async load(){return this.databaseAction(this.modelPath)}databaseAction(e,t){return new Promise((n,s)=>{const i=this.indexedDB.open(ap,lb);i.onupgradeneeded=()=>ub(i),i.onsuccess=()=>{const o=i.result;if(t==null){const a=o.transaction(Eo,"readonly"),c=a.objectStore(Eo),h=c.get(this.modelPath);h.onsuccess=()=>{if(h.result==null)return o.close(),s(new Error(`Cannot find model with path '${this.modelPath}' in IndexedDB.`));n(h.result.modelArtifacts)},h.onerror=p=>(o.close(),s(h.error)),a.oncomplete=()=>o.close()}else{const a=gh(t),c=o.transaction(Mr,"readwrite");let h=c.objectStore(Mr);const p=h.put({modelPath:this.modelPath,modelArtifactsInfo:a});let m;p.onsuccess=()=>{m=o.transaction(Eo,"readwrite");const y=m.objectStore(Eo),b=y.put({modelPath:this.modelPath,modelArtifacts:t,modelArtifactsInfo:a});b.onsuccess=()=>n({modelArtifactsInfo:a}),b.onerror=w=>{h=c.objectStore(Mr);const L=h.delete(this.modelPath);L.onsuccess=()=>(o.close(),s(b.error)),L.onerror=T=>(o.close(),s(b.error))}},p.onerror=y=>(o.close(),s(p.error)),c.oncomplete=()=>{m==null?o.close():m.oncomplete=()=>o.close()}}},i.onerror=o=>s(i.error)})}}Do.URL_SCHEME="indexeddb://";const GT=e=>ae().getBool("IS_BROWSER")&&(!Array.isArray(e)&&e.startsWith(Do.URL_SCHEME))?cF(e.slice(Do.URL_SCHEME.length)):null;en.registerSaveRouter(GT),en.registerLoadRouter(GT);function cF(e){return new Do(e)}function lF(e){return e.startsWith(Do.URL_SCHEME)?e.slice(Do.URL_SCHEME.length):e}class hF{constructor(){this.indexedDB=hb()}async listModels(){return new Promise((e,t)=>{const n=this.indexedDB.open(ap,lb);n.onupgradeneeded=()=>ub(n),n.onsuccess=()=>{const s=n.result,i=s.transaction(Mr,"readonly"),o=i.objectStore(Mr),a=o.getAll();a.onsuccess=()=>{const c={};for(const h of a.result)c[h.modelPath]=h.modelArtifactsInfo;e(c)},a.onerror=c=>(s.close(),t(a.error)),i.oncomplete=()=>s.close()},n.onerror=s=>t(n.error)})}async removeModel(e){return e=lF(e),new Promise((t,n)=>{const 
s=this.indexedDB.open(ap,lb);s.onupgradeneeded=()=>ub(s),s.onsuccess=()=>{const i=s.result,o=i.transaction(Mr,"readwrite"),a=o.objectStore(Mr),c=a.get(e);let h;c.onsuccess=()=>{if(c.result==null)return i.close(),n(new Error(`Cannot find model with path '${e}' in IndexedDB.`));{const p=a.delete(e),m=()=>{h=i.transaction(Eo,"readwrite");const y=h.objectStore(Eo),b=y.delete(e);b.onsuccess=()=>t(c.result.modelArtifactsInfo),b.onerror=w=>n(c.error)};p.onsuccess=m,p.onerror=y=>(m(),i.close(),n(c.error))}},c.onerror=p=>(i.close(),n(c.error)),o.oncomplete=()=>{h==null?i.close():h.oncomplete=()=>i.close()}},s.onerror=i=>n(s.error)})}}const Ci="/",ko="tensorflowjs_models",YT="info",uF="model_topology",dF="weight_specs",pF="weight_data",mF="model_metadata";function pte(){if(!ae().getBool("IS_BROWSER")||typeof window=="undefined"||typeof window.localStorage=="undefined")throw new Error("purgeLocalStorageModels() cannot proceed because local storage is unavailable in the current environment.");const e=window.localStorage,t=[];for(let n=0;n<e.length;++n){const s=e.key(n),i=ko+Ci;if(s.startsWith(i)&&s.length>i.length){e.removeItem(s);const o=qT(s);t.indexOf(o)===-1&&t.push(o)}}return t}function HT(e){return{info:[ko,e,YT].join(Ci),topology:[ko,e,uF].join(Ci),weightSpecs:[ko,e,dF].join(Ci),weightData:[ko,e,pF].join(Ci),modelMetadata:[ko,e,mF].join(Ci)}}function qT(e){const t=e.split(Ci);if(t.length<3)throw new Error(`Invalid key format: ${e}`);return t.slice(1,t.length-1).join(Ci)}function fF(e){return e.startsWith(Fo.URL_SCHEME)?e.slice(Fo.URL_SCHEME.length):e}class Fo{constructor(e){if(!ae().getBool("IS_BROWSER")||typeof window=="undefined"||typeof window.localStorage=="undefined")throw new Error("The current environment does not support local storage.");if(this.LS=window.localStorage,e==null||!e)throw new Error("For local storage, modelPath must not be null, undefined or empty.");this.modelPath=e,this.keys=HT(this.modelPath)}async save(e){if(e.modelTopology instanceof 
ArrayBuffer)throw new Error("BrowserLocalStorage.save() does not support saving model topology in binary formats yet.");{const t=JSON.stringify(e.modelTopology),n=JSON.stringify(e.weightSpecs),s=gh(e);try{return this.LS.setItem(this.keys.info,JSON.stringify(s)),this.LS.setItem(this.keys.topology,t),this.LS.setItem(this.keys.weightSpecs,n),this.LS.setItem(this.keys.weightData,eF(e.weightData)),this.LS.setItem(this.keys.modelMetadata,JSON.stringify({format:e.format,generatedBy:e.generatedBy,convertedBy:e.convertedBy,userDefinedMetadata:e.userDefinedMetadata})),{modelArtifactsInfo:s}}catch(i){throw this.LS.removeItem(this.keys.info),this.LS.removeItem(this.keys.topology),this.LS.removeItem(this.keys.weightSpecs),this.LS.removeItem(this.keys.weightData),this.LS.removeItem(this.keys.modelMetadata),new Error(`Failed to save model '${this.modelPath}' to local storage: size quota being exceeded is a possible cause of this failure: modelTopologyBytes=${s.modelTopologyBytes}, weightSpecsBytes=${s.weightSpecsBytes}, weightDataBytes=${s.weightDataBytes}.`)}}}async load(){const e=JSON.parse(this.LS.getItem(this.keys.info));if(e==null)throw new Error(`In local storage, there is no model with name '${this.modelPath}'`);if(e.modelTopologyType!=="JSON")throw new Error("BrowserLocalStorage does not support loading non-JSON model topology yet.");const t={},n=JSON.parse(this.LS.getItem(this.keys.topology));if(n==null)throw new Error(`In local storage, the topology of model '${this.modelPath}' is missing.`);t.modelTopology=n;const s=JSON.parse(this.LS.getItem(this.keys.weightSpecs));if(s==null)throw new Error(`In local storage, the weight specs of model '${this.modelPath}' are missing.`);t.weightSpecs=s;const i=this.LS.getItem(this.keys.modelMetadata);if(i!=null){const a=JSON.parse(i);t.format=a.format,t.generatedBy=a.generatedBy,t.convertedBy=a.convertedBy,t.userDefinedMetadata=a.userDefinedMetadata}const o=this.LS.getItem(this.keys.weightData);if(o==null)throw new Error(`In local 
storage, the binary weight values of model '${this.modelPath}' are missing.`);return t.weightData=tF(o),t}}Fo.URL_SCHEME="localstorage://";const jT=e=>ae().getBool("IS_BROWSER")&&(!Array.isArray(e)&&e.startsWith(Fo.URL_SCHEME))?gF(e.slice(Fo.URL_SCHEME.length)):null;en.registerSaveRouter(jT),en.registerLoadRouter(jT);function gF(e){return new Fo(e)}class yF{constructor(){A(ae().getBool("IS_BROWSER"),()=>"Current environment is not a web browser"),A(typeof window=="undefined"||typeof window.localStorage!="undefined",()=>"Current browser does not appear to support localStorage"),this.LS=window.localStorage}async listModels(){const e={},t=ko+Ci,n=Ci+YT;for(let s=0;s<this.LS.length;++s){const i=this.LS.key(s);if(i.startsWith(t)&&i.endsWith(n)){const o=qT(i);e[o]=JSON.parse(this.LS.getItem(i))}}return e}async removeModel(e){e=fF(e);const t=HT(e);if(this.LS.getItem(t.info)==null)throw new Error(`Cannot find model at path '${e}'`);const n=JSON.parse(this.LS.getItem(t.info));return this.LS.removeItem(t.info),this.LS.removeItem(t.topology),this.LS.removeItem(t.weightSpecs),this.LS.removeItem(t.weightData),n}}const Va="://";class Ts{constructor(){this.managers={}}static getInstance(){return Ts.instance==null&&(Ts.instance=new Ts),Ts.instance}static registerManager(e,t){A(e!=null,()=>"scheme must not be undefined or null."),e.endsWith(Va)&&(e=e.slice(0,e.indexOf(Va))),A(e.length>0,()=>"scheme must not be an empty string.");const n=Ts.getInstance();A(n.managers[e]==null,()=>`A model store manager is already registered for scheme '${e}'.`),n.managers[e]=t}static getManager(e){const t=this.getInstance().managers[e];if(t==null)throw new Error(`Cannot find model manager for scheme '${e}'`);return t}static getSchemes(){return Object.keys(this.getInstance().managers)}}function cp(e){if(e.indexOf(Va)===-1)throw new Error(`The url string provided does not contain a scheme. 
Supported schemes are: ${Ts.getSchemes().join(",")}`);return{scheme:e.split(Va)[0],path:e.split(Va)[1]}}async function KT(e,t,n=!1){A(e!==t,()=>`Old path and new path are the same: '${e}'`);const s=en.getLoadHandlers(e);A(s.length>0,()=>`Copying failed because no load handler is found for source URL ${e}.`),A(s.length<2,()=>`Copying failed because more than one (${s.length}) load handlers for source URL ${e}.`);const i=s[0],o=en.getSaveHandlers(t);A(o.length>0,()=>`Copying failed because no save handler is found for destination URL ${t}.`),A(o.length<2,()=>`Copying failed because more than one (${s.length}) save handlers for destination URL ${t}.`);const a=o[0],c=cp(e).scheme,h=cp(e).path,p=c===cp(e).scheme,m=await i.load();n&&p&&await Ts.getManager(c).removeModel(h);const y=await a.save(m);return n&&!p&&await Ts.getManager(c).removeModel(h),y.modelArtifactsInfo}async function bF(){const e=Ts.getSchemes(),t={};for(const n of e){const s=await Ts.getManager(n).listModels();for(const i in s){const o=n+Va+i;t[o]=s[i]}}return t}async function wF(e){const t=cp(e),n=Ts.getManager(t.scheme);return n.removeModel(t.path)}async function LF(e,t){const n=!1;return KT(e,t,n)}async function SF(e,t){const n=!0;return KT(e,t,n)}class IF{fetch(e,t){return fetch(e,t)}now(){return performance.now()}encode(e,t){if(t!=="utf-8"&&t!=="utf8")throw new Error(`Browser's encoder only supports utf-8, but got ${t}`);return this.textEncoder==null&&(this.textEncoder=new TextEncoder),this.textEncoder.encode(e)}decode(e,t){return new TextDecoder(t).decode(e)}}if(ae().get("IS_BROWSER")){ae().setPlatform("browser",new IF);try{Ts.registerManager(Fo.URL_SCHEME,new yF)}catch(e){}try{Ts.registerManager(Do.URL_SCHEME,new hF)}catch(e){}}const xF={importFetch:()=>F2()};let Ga;function mte(){Ga=null}function fte(e){Ga=e}function gte(){return Ga}class TF{constructor(){this.util=require("util"),this.textEncoder=new this.util.TextEncoder}fetch(e,t){return 
ae().global.fetch!=null?ae().global.fetch(e,t):(Ga==null&&(Ga=xF.importFetch()),Ga(e,t))}now(){const e=process.hrtime();return e[0]*1e3+e[1]/1e6}encode(e,t){if(t!=="utf-8"&&t!=="utf8")throw new Error(`Node built-in encoder only supports utf-8, but got ${t}`);return this.textEncoder.encode(e)}decode(e,t){return e.length===0?"":new this.util.TextDecoder(t).decode(e)}}ae().get("IS_NODE")&&ae().setPlatform("node",new TF);function wt(e,t="float32",n){return t=t||"float32",ay(e),new an(e,t,n)}function AF(e,t){const n=W(e,"x","cast");if(!Wr(t))throw new Error(`Failed to cast to unknown dtype ${t}`);if(t==="string"&&n.dtype!=="string"||t!=="string"&&n.dtype==="string")throw new Error("Only strings can be casted to strings");const s={x:n},i={dtype:t};return G.runKernelFunc(o=>o.cast(n,t),s,null,ka,i)}const ve=z({cast_:AF});function vF(e){const t=W(e,"x","clone",null),n=()=>G.makeTensorFromDataId(t.dataId,t.shape,t.dtype),s={x:t};return G.runKernelFunc(n,s,null,$l)}const Pr=z({clone_:vF});function XT(e,t=!1){console.log(e.toString(t))}$T();const NF={buffer:wt,cast:ve,clone:Pr,print:XT};Vk(NF);const CF="model",RF=".json",OF=".weights.bin";function JT(e){return new Promise(t=>setTimeout(t)).then(e)}class Ya{constructor(e){if(!ae().getBool("IS_BROWSER"))throw new Error("browserDownloads() cannot proceed because the current environment is not a browser.");e.startsWith(Ya.URL_SCHEME)&&(e=e.slice(Ya.URL_SCHEME.length)),(e==null||e.length===0)&&(e=CF),this.modelTopologyFileName=e+RF,this.weightDataFileName=e+OF}async save(e){if(typeof document=="undefined")throw new Error("Browser downloads are not supported in this environment since `document` is not present");const t=window.URL.createObjectURL(new Blob([e.weightData],{type:"application/octet-stream"}));if(e.modelTopology instanceof ArrayBuffer)throw new Error("BrowserDownloads.save() does not support saving model topology in binary formats yet.");{const 
n=[{paths:["./"+this.weightDataFileName],weights:e.weightSpecs}],s={modelTopology:e.modelTopology,format:e.format,generatedBy:e.generatedBy,convertedBy:e.convertedBy,weightsManifest:n},i=window.URL.createObjectURL(new Blob([JSON.stringify(s)],{type:"application/json"})),o=this.jsonAnchor==null?document.createElement("a"):this.jsonAnchor;if(o.download=this.modelTopologyFileName,o.href=i,await JT(()=>o.dispatchEvent(new MouseEvent("click"))),e.weightData!=null){const a=this.weightDataAnchor==null?document.createElement("a"):this.weightDataAnchor;a.download=this.weightDataFileName,a.href=t,await JT(()=>a.dispatchEvent(new MouseEvent("click")))}return{modelArtifactsInfo:gh(e)}}}}Ya.URL_SCHEME="downloads://";class EF{constructor(e){if(e==null||e.length<1)throw new Error(`When calling browserFiles, at least 1 file is required, but received ${e}`);this.files=e}async load(){const e=this.files[0],t=this.files.slice(1);return new Promise((n,s)=>{const i=new FileReader;i.onload=o=>{const a=JSON.parse(o.target.result),c=a.modelTopology;if(c==null){s(new Error(`modelTopology field is missing from file ${e.name}`));return}t.length===0&&n({modelTopology:c});const h=a.weightsManifest;if(h==null){s(new Error(`weightManifest field is missing from file ${e.name}`));return}let p;try{p=this.checkManifestAndWeightFiles(h,t)}catch(w){s(w);return}const m=[],y=[],b=[];h.forEach(w=>{w.paths.forEach(L=>{y.push(L),b.push(null)}),m.push(...w.weights)}),h.forEach(w=>{w.paths.forEach(L=>{const T=new FileReader;T.onload=v=>{const C=v.target.result,O=y.indexOf(L);b[O]=C,b.indexOf(null)===-1&&n({modelTopology:c,weightSpecs:m,weightData:op(b),format:a.format,generatedBy:a.generatedBy,convertedBy:a.convertedBy,userDefinedMetadata:a.userDefinedMetadata})},T.onerror=v=>s(`Failed to weights data from file of path '${L}'.`),T.readAsArrayBuffer(p[L])})})},i.onerror=o=>s(`Failed to read model topology and weights manifest JSON from file '${e.name}'. 
BrowserFiles supports loading Keras-style tf.Model artifacts only.`),i.readAsText(e)})}checkManifestAndWeightFiles(e,t){const n=[],s=t.map(o=>VT(o.name)),i={};for(const o of e)o.paths.forEach(a=>{const c=VT(a);if(n.indexOf(c)!==-1)throw new Error(`Duplicate file basename found in weights manifest: '${c}'`);if(n.push(c),s.indexOf(c)===-1)throw new Error(`Weight file with basename '${c}' is not provided.`);i[a]=t[s.indexOf(c)]});if(n.length!==t.length)throw new Error(`Mismatch in the number of files in weights manifest (${n.length}) and the number of weight files provided (${t.length}).`);return i}}const DF=e=>ae().getBool("IS_BROWSER")&&(!Array.isArray(e)&&e.startsWith(Ya.URL_SCHEME))?kF(e.slice(Ya.URL_SCHEME.length)):null;en.registerSaveRouter(DF);function kF(e="model"){return new Ya(e)}function FF(e){return new EF(e)}function ZT(e,t,n,s){a(e),n=n==null?0:n,s=s==null?1:s,c(n,s);let i=0;const o=h=>(h.then(p=>{const m=n+ ++i/e.length*(s-n);return t(m),p}),h);function a(h){A(h!=null&&Array.isArray(h)&&h.length>0,()=>"promises must be a none empty array")}function c(h,p){A(h>=0&&h<=1,()=>`Progress fraction must be in range [0, 1], but got startFraction ${h}`),A(p>=0&&p<=1,()=>`Progress fraction must be in range [0, 1], but got endFraction ${p}`),A(p>=h,()=>`startFraction must be no more than endFraction, but got startFraction ${h} and endFraction ${p}`)}return Promise.all(e.map(o))}async function QT(e,t){t==null&&(t={});const n=t.fetchFunc==null?ae().platform.fetch:t.fetchFunc,s=e.map(y=>n(y,t.requestInit,{isBinary:!0})),i=0,o=.5,a=t.onProgress==null?await Promise.all(s):await ZT(s,t.onProgress,i,o),c=a.map(y=>y.arrayBuffer()),h=.5,p=1,m=t.onProgress==null?await Promise.all(c):await ZT(c,t.onProgress,h,p);return m}async function eA(e,t="",n,s){const i=a=>QT(a,{requestInit:s}),o=tA(i);return o(e,t,n)}function tA(e){return async(t,n="",s)=>{const i=t.map(()=>!1),o={},a=s!=null?s.map(()=>!1):[],c=[];if(t.forEach((w,L)=>{let T=0;w.weights.forEach(v=>{const 
C="quantization"in v?v.quantization.dtype:v.dtype,O=ib[C]*M(v.shape),D=()=>{i[L]=!0,o[L]==null&&(o[L]=[]),o[L].push({manifestEntry:v,groupOffset:T,sizeBytes:O})};s!=null?s.forEach((k,F)=>{k===v.name&&(D(),a[F]=!0)}):D(),c.push(v.name),T+=O})}),!a.every(w=>w)){const w=s.filter((L,T)=>!a[T]);throw new Error(`Could not find weights in manifest with names: ${w.join(", ")}. 
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            

ReferenceError: require is not defined
    at new TF (file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:5:43877)
    at file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:5:44377
    at file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:1:18641
    at file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:1:18737
    at file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:1:99
    at file:///home/ali/Desktop/Nodejs%20Projects/face-api-test2/node_modules/@vladmandic/face-api/dist/face-api.node.js:3957:9445
    at ModuleJob.run (internal/modules/esm/module_job.js:146:23)
    at async Loader.import (internal/modules/esm/loader.js:165:24)
    at async Object.loadESM (internal/process/esm_loader.js:68:5)

Process finished with exit code 1

This is package.json:

{
  "type": "module",
  "name": "face-api-test",
  "version": "0.0.1",
  "description": "Face API test",
  "main": "index.mjs",
  "dependencies": {
    "@tensorflow/tfjs-node": "latest",
    "@tensorflow/tfjs": "latest",
    "@vladmandic/face-api": "latest",
    "canvas": "latest",
    "express": "latest",
    "formidable": "latest"
  },
  "scripts": {
    "start": "tsc && node dist/faceDetection.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Ali Tafakkori",
  "license": "ISC"
}

This is index.mjs:

import '@tensorflow/tfjs'
import * as canvas from 'canvas';
import * as faceapi from '@vladmandic/face-api/dist/face-api.esm.js';
import express from "express";
import * as formidable from "formidable";

import fs from 'fs';
import path from 'path';

const {IncomingForm} = formidable;
const app = express();
const {Canvas, Image, ImageData} = canvas;
faceapi.env.monkeyPatch({Canvas, Image, ImageData});

function saveFile(fileName, buf) {
    const baseDir = "./out";
    if (!fs.existsSync(baseDir)) {
        fs.mkdirSync(baseDir)
    }

    fs.writeFileSync(path.resolve(baseDir, fileName), buf)
}

const minConfidence = 0.5;

// TinyFaceDetectorOptions
const inputSize = 408;
const scoreThreshold = 0.5;

function getFaceDetectorOptions(net) {
    return net === faceapi.nets.ssdMobilenetv1
        ? new faceapi.SsdMobilenetv1Options({minConfidence})
        : new faceapi.TinyFaceDetectorOptions({inputSize, scoreThreshold})
}

async function run() {
    console.log(new Date().toISOString());
    const start = new Date().getTime();

    await faceapi.nets.ssdMobilenetv1.loadFromDisk('./weights');
    await faceapi.nets.faceRecognitionNet.loadFromDisk('./weights');
    await faceapi.nets.faceLandmark68Net.loadFromDisk('./weights');
    await faceapi.nets.faceLandmark68TinyNet.loadFromDisk('./weights');
    await faceapi.nets.ageGenderNet.loadFromDisk('./weights');
    await faceapi.nets.faceExpressionNet.loadFromDisk('./weights');

    //await faceRecognition();
    //await detectSingleFace();

    console.log(new Date().getTime() - start);
    console.log(new Date().toISOString());
}

run().then();

async function detectSingleFace(query) {
    const img = await canvas.loadImage(query);
    const res = await faceapi.detectSingleFace(img, getFaceDetectorOptions(faceapi.nets.ssdMobilenetv1))
        .withFaceLandmarks()
        .withAgeAndGender()
        .withFaceExpressions()
        .withFaceDescriptor();
    const out = faceapi.createCanvasFromMedia(img);
    //faceapi.draw.drawFaceLandmarks(out, res);
    faceapi.draw.drawDetections(out, res);
    //faceapi.draw.drawFaceExpressions(out, res);
    fs.writeFileSync('./out/faceDetection.jpg', out.toBuffer('image/jpeg'));
}

async function detectAllFaces(query) {
    const img = await canvas.loadImage(query);
    const results = await faceapi.detectAllFaces(img).withFaceLandmarks().withFaceDescriptors()
    const out = faceapi.createCanvasFromMedia(img);
    faceapi.draw.drawDetections(out, results);
    //faceapi.draw.drawFaceLandmarks(out, results);
    fs.writeFileSync(`./out/faces/faces.jpg`, out.toBuffer('image/jpeg'));
}

async function faceRecognition(queryImagePath, src) {

    const referenceImage = await canvas.loadImage(src);
    const queryImage = await canvas.loadImage(queryImagePath);

    const resultsRef = await faceapi.detectAllFaces(referenceImage, getFaceDetectorOptions(faceapi.nets.ssdMobilenetv1))
        .withFaceLandmarks()
        .withFaceDescriptors();

    const resultsQuery = await faceapi.detectAllFaces(queryImage, getFaceDetectorOptions(faceapi.nets.ssdMobilenetv1))
        .withFaceLandmarks()
        .withFaceDescriptors();

    const faceMatcher = new faceapi.FaceMatcher(resultsRef);

    const labels = faceMatcher.labeledDescriptors
        .map(ld => ld.label);
    const refDrawBoxes = resultsRef
        .map(res => res.detection.box)
        .map((box, i) => new faceapi.draw.DrawBox(box, {label: labels[i]}));
    const outRef = faceapi.createCanvasFromMedia(referenceImage);
    refDrawBoxes.forEach(drawBox => drawBox.draw(outRef));

    saveFile('referenceImage.jpg', outRef.toBuffer('image/jpeg'));

    const queryDrawBoxes = resultsQuery.map(res => {
        const bestMatch = faceMatcher.findBestMatch(res.descriptor);
        return new faceapi.draw.DrawBox(res.detection.box, {label: bestMatch.toString()});
    });
    const outQuery = faceapi.createCanvasFromMedia(queryImage);
    queryDrawBoxes.forEach(drawBox => drawBox.draw(outQuery));
    saveFile('queryImage.jpg', outQuery.toBuffer('image/jpeg'));
    console.log('done, saved results to out/queryImage.jpg');
}


app.post("/faceRecognition", function (req, res) {
    IncomingForm().parse(req, async function (err, fields, files) {
        if (err) {
            return res.status(503).send(err);
        } else {
            const src = files.src.path;
            const query = files.query.path;
            console.log(new Date().toISOString());
            const start = new Date().getTime();
            await faceRecognition(query, src);
            console.log(new Date().getTime() - start);
            console.log(new Date().toISOString());
            res.redirect("index.html");
        }
    });
});

app.post("/detectSingleFace", function (req, res) {
    IncomingForm().parse(req, async function (err, fields, files) {
        if (err) {
            return res.status(503).send(err);
        } else {
            const query = files.query.path;
            console.log(new Date().toISOString());
            const start = new Date().getTime();
            await detectSingleFace(query);
            console.log(new Date().getTime() - start);
            console.log(new Date().toISOString());
            res.redirect("index.html");
        }
    });
});

app.post("/detectAllFaces", function (req, res) {
    IncomingForm().parse(req, async function (err, fields, files) {
        if (err) {
            return res.status(503).send(err);
        } else {
            const query = files.query.path;
            console.log(new Date().toISOString());
            const start = new Date().getTime();
            await detectAllFaces(query);
            console.log(new Date().getTime() - start);
            console.log(new Date().toISOString());
            res.redirect("index.html");
        }
    });
});

app.use(express.static("./static"));

app.listen(7700, function () {
    console.log("Server ready on 7700 port");
});
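The ReferenceError above is raised because dist/face-api.node.js is a CommonJS bundle that calls require, while the "type": "module" setting makes Node evaluate files as ES modules, where no global require exists. A minimal stdlib-only sketch of the underlying mechanism (illustrative, not a complete face-api fix by itself):

```javascript
// ES modules have no global `require`; a CommonJS-style loader can be
// recreated from the standard library via createRequire.
import { createRequire } from 'module';

const require = createRequire(import.meta.url);

// CommonJS-style loading now works inside this ESM file:
const os = require('os');
console.log(typeof require, os.platform());
```

Alternatively, importing the package root ('@vladmandic/face-api') instead of a specific dist file may let the package's own entry-point resolution pick a build that matches the module system in use.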

Can this still be used client-side?

I'm getting complaints from javascript about "require".

There is an example folder, but the example doesn't work because the basic "face-api.js" is missing from the dist folder. The file https://vladmandic.github.io/face-api/dist/face-api.js that index.html calls doesn't seem to exist anymore.

How does the detectAllFaces function handle a fake photo shown from a mobile phone?

Hi,
I used this code to detect all faces. I want to prevent a user from photographing himself with a phone and then showing the phone's photo to the camera. How can I prevent that in code?
Right now a fake image shown from a phone is also detected as a real face.
I saw this link in the issue history; is there no option to install it through npm?
https://github.com/vladmandic/anti-spoofing

const ssdOptions = { minConfidence: 0.1, maxResults: 10 };
const optionsSSDMobileNet = new faceapi.SsdMobilenetv1Options(ssdOptions);
const faces = await faceapi
  .detectAllFaces(tensor, optionsSSDMobileNet)
  .withFaceLandmarks()
  .withFaceExpressions()
  .withFaceDescriptors()
  .withAgeAndGender();
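The descriptors returned by withFaceDescriptors() are plain numeric vectors, and FaceMatcher-style matching boils down to a nearest-neighbor search by Euclidean distance. A dependency-free sketch (the 0.6 threshold is the commonly cited default; treat it as an assumption):

```javascript
// L2 distance between two descriptor vectors of equal length.
function euclideanDistance(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) {
    const d = a[i] - b[i];
    sum += d * d;
  }
  return Math.sqrt(sum);
}

// Return the closest labeled descriptor, or 'unknown' when nothing
// falls under the distance threshold.
function findBestMatch(query, labeled, threshold = 0.6) {
  let best = { label: 'unknown', distance: Infinity };
  for (const { label, descriptor } of labeled) {
    const distance = euclideanDistance(query, descriptor);
    if (distance < best.distance) best = { label, distance };
  }
  return best.distance <= threshold
    ? best
    : { label: 'unknown', distance: best.distance };
}
```

Note that this does not address spoofing: a printed or on-screen photo of a face produces a perfectly valid descriptor, which is why liveness detection requires a separate model.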

import * as seedrandom from 'seedrandom'

Issue Description
I ran into a problem with "import * as seedrandom from 'seedrandom';" when building.

Steps to Reproduce
git clone https://github.com/vladmandic/face-api
cd face-api
npm install
npm run build
npm run dev

Expected Behavior
I expect it to run correctly and to see my face detected with the camera.

Environment

  • FaceAPI version
    the latest face-api version from vladmandic

  • FaceAPI module used (e.g. face-api, face-api.esm, face-api.esm-nobundle)
    I just want to run the demo locally

  • Browser or NodeJS and version (e.g. NodeJS 14.15 or Chrome 86)
    browser version is chrome 89.0.4389.82 and the nodejs version is v12.18.3

  • OS and Hardware platform (e.g. Windows 10, Ubuntu Linux on x64, Android 10)
    os version is Windows 7

  • Packager (if any) (e.g, webpack, rollup, parcel, esbuild, etc.)
    no packager

The error message:

         > node_modules/@tensorflow/tfjs-data/dist/dataset.js:19:28: error: Could not resolve "seedrandom" (mark it as external to exclude it from the bundle)
            19 │ import * as seedrandom from 'seedrandom';
               ╵                             ~~~~~~~~~~~~
        
         > node_modules/@tensorflow/tfjs-data/dist/iterators/lazy_iterator.js:19:28: error: Could not resolve "seedrandom" (mark it as external to exclude it from the bundle)
            19 │ import * as seedrandom from 'seedrandom';
               ╵                             ~~~~~~~~~~~~
        
        2021-03-16 09:26:04 STATE:  Monitoring: [ 'package.json', 'demo', 'src', [length]: 3 ]
        2021-03-16 09:26:04 ERROR:  Build error [
          {
            "location": {
              "column": 28,
              "file": "node_modules/@tensorflow/tfjs-data/dist/dataset.js",
              "length": 12,
              "line": 19,
              "lineText": "import * as seedrandom from 'seedrandom';",
              "namespace": ""
            },
            "notes": [],
            "text": "Could not resolve \"seedrandom\" (mark it as external to exclude it from the bundle)"
          },
          {
            "location": {
              "column": 28,
              "file": "node_modules/@tensorflow/tfjs-data/dist/iterators/lazy_iterator.js",
              "length": 12,
              "line": 19,
              "lineText": "import * as seedrandom from 'seedrandom';",
              "namespace": ""
            },
            "notes": [],
            "text": "Could not resolve \"seedrandom\" (mark it as external to exclude it from the bundle)"
          }
        ]
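
One common way to clear this esbuild error is to either install `seedrandom` or mark it as external, exactly as the message suggests. A minimal sketch of an esbuild build script, hedged: the file name, entry point, and output path below are hypothetical, only the `external` option is the point.

```javascript
// build.js (hypothetical name): mark seedrandom as external so esbuild
// stops trying to resolve it inside the bundle
require('esbuild').build({
  entryPoints: ['src/index.ts'], // hypothetical entry point
  bundle: true,
  outfile: 'dist/app.js',        // hypothetical output path
  external: ['seedrandom'],      // excludes the unresolved import from the bundle
}).catch(() => process.exit(1));
```

Alternatively, `npm install seedrandom` makes the import resolvable and the error goes away without any config change.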

Additional

  • For installation or startup issues include your package.json
  • For usage issues, it is recommended to post your code as a gist

Help understanding - bundled version of tfjs

I'm trying to let users choose between your bundled version of tfjs and the non-bundled version.

The issue I'm having with your bundled version:
I set the backend

faceapi.tf.setBackend('tensorflow');

I get the error: Backend name 'tensorflow' not found in registry

I figure this is because I'm not using

const tf = require('@tensorflow/tfjs-node');

since I'm using your bundled version of tfjs.

However, if I run a version check without require('@tensorflow/tfjs-node'), because I'm using your bundled version:

let faceapi_tf_version_core = faceapi.tf.version_core;

I get: 2.7.0

and if I just try to get the Backend without setting it:

let tf_req = faceapi.tf.getBackend();

it's undefined.

So, how do I check that your bundled version of tfjs is initialized?

Section of code I'm using for this check:

//load modules based on switch statement; without break, execution falls through from the first match down
    switch (nobundle_tensorflow)
    {
      case true:
      let tf = require(tfjs_module_require_value);
      default:
      let faceapi = require(vladmandic_face_api_require_value); //JavaScript face recognition API for nodejs see https://www.npmjs.com/package/@vladmandic/face-api

      // function to initialize tfjs and env settings
      async function init_tfjs()
      {
          await faceapi.tf.setBackend('tensorflow');
          await faceapi.tf.enableProdMode();
          await faceapi.tf.ENV.set('DEBUG', false);
          await faceapi.tf.ready();
      }

      let faceapi_tf_version_core = faceapi.tf.version_core;
      let tf_req = faceapi.tf.getBackend();

      //error check of await function init_tfjs
      var init_tfjs_no_error = true;
      init_tfjs().catch(error => {
      init_tfjs_no_error = ("Could not initialize tfjs " + error);
      });
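
For what it's worth, a pattern that should answer the "is tfjs initialized" question: the backend is only meaningful after `tf.ready()` resolves, so read it afterwards rather than before. A runnable sketch; `stubTf` and `initTfjs` are illustrative stand-ins (not library APIs), and with the real bundled build you would call the same methods on `faceapi.tf` (where `'tensorflow'` as a backend name exists only when `@tensorflow/tfjs-node` is loaded).

```javascript
// Sketch: check tfjs initialization by awaiting ready() before getBackend().
// `stubTf` stands in for faceapi.tf so this example runs anywhere.
const stubTf = {
  _backend: undefined,
  async setBackend(name) { this._backend = name; },
  async ready() { /* real tfjs resolves once backend init finishes */ },
  getBackend() { return this._backend; },
};

async function initTfjs(tf, backend) {
  await tf.setBackend(backend); // 'tensorflow' exists only with tfjs-node loaded
  await tf.ready();             // wait for initialization to complete
  return tf.getBackend();       // defined only after initialization
}

initTfjs(stubTf, 'webgl').then((b) => console.log('backend:', b));
```

Reading `getBackend()` or `version_core` before `ready()` resolves is exactly the situation described above: the version string is baked in, but the backend is still undefined.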

Using FaceAPI with WASM in next.js (client side only)

Issue Description
When trying to dynamically import dependencies to use wasm, face-api seems to ignore it and still uses webgl.
In Next.js, it's important to load these dependencies only on the client side, avoiding larger bundles and other SSR problems.

Steps to Reproduce

React.useEffect(_ => {
        Promise.all([
          import('@tensorflow/tfjs'), // dynamic import
          import('@tensorflow/tfjs-backend-wasm') // dynamic import
        ]).then(async ([tf, wasm]) => {
          window.tf = tf;
          wasm = wasm;
          window.wasm = wasm;
          wasm.setWasmPaths('https://cdn.jsdelivr.net/npm/@tensorflow/[email protected]/dist/');

          await tf.setBackend('wasm');
          await tf.ready();
          console.log(tf.getBackend()); // wasm <<<--- OK

          await createScript('/face-detection-ia/face-api.min.js', scrId); // imports the script from /public

          const faceapi = window.faceapi;

          try {
            await Promise.all([
              faceapi.nets.tinyFaceDetector.loadFromUri('/face-detection-ia/models/'),
              faceapi.nets.faceLandmark68Net.loadFromUri('/face-detection-ia/models/'),
            ]); // this also loads all the models OK
            console.log('DONE loading AI stuff');
            console.log(faceapi.tf.getBackend()); // webgl <<<<<----- !!!!!!!!!
          } catch (error) {
            console.error('ERROR loading AI stuff', error);
            return;
          }
        });
  }, []);

Expected Behavior

After loading the faceapi script, faceapi.tf.getBackend() should be wasm.
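
A hedged guess at the cause: the prebuilt `face-api.min.js` script bundles its own tf instance, separate from the dynamically imported `@tensorflow/tfjs`, so the wasm backend set on `window.tf` never reaches `faceapi.tf`. A sketch of re-running the backend setup on face-api's own tf namespace after the script loads; this assumes the bundled tf also has the wasm backend registered, otherwise the nobundle build with an external tfjs is the alternative. The function name is illustrative.

```javascript
// Sketch (assumption): set the backend on the tf instance face-api actually uses,
// not on the separately imported @tensorflow/tfjs module.
async function forceFaceApiBackend(faceapi, backend) {
  await faceapi.tf.setBackend(backend);
  await faceapi.tf.ready();
  return faceapi.tf.getBackend(); // should now report the requested backend
}
```

Usage would be something like `await forceFaceApiBackend(window.faceapi, 'wasm')` right after `createScript` resolves, before loading the models.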

Environment

  • Browser or NodeJS and version (e.g. NodeJS 14.15 or Chrome 89)?
    Chrome and Firefox LTS
  • OS and Hardware platform (e.g. Windows 10, Ubuntu Linux on x64, Android 10)?
    MacOS on a MacBook Pro
  • Packager (if any) (e.g, webpack, rollup, parcel, esbuild, etc.)
    Next.js

Issue coming from justadudewhohacks/face-api.js#799

Bad performance detecting faces wearing a mask

Issue Description
Detection performs poorly on faces wearing a mask, but the original face-api.js handles them well.
Steps to Reproduce

Expected Behavior

Environment

  • Module version? 1.1.12
  • Built-in demo or custom code? custom code
  • Type of module used (e.g. js, esm, esm-nobundle)? js
  • Browser or NodeJS and version (e.g. NodeJS 14.15 or Chrome 89)? Chrome 90
  • OS and Hardware platform (e.g. Windows 10, Ubuntu Linux on x64, Android 10)? macOS
  • Packager (if any) (e.g, webpack, rollup, parcel, esbuild, etc.)? webpack

Additional

  • For installation or startup issues include your package.json
  • For usage issues, it is recommended to post your code as a gist

Cannot use import statement outside a module

Hi!
I get this error when I run the code:

/node_modules/@vladmandic/face-api/build/index.js:1
import * as tf from '@tensorflow/tfjs-core';
^^^^^^

SyntaxError: Cannot use import statement outside a module
    at wrapSafe (internal/modules/cjs/loader.js:1053:16)
    at Module._compile (internal/modules/cjs/loader.js:1101:27)
    at Module._extensions..js (internal/modules/cjs/loader.js:1157:10)
    at Object.require.extensions.<computed> [as .js] (/home/martin/Trabajo/SevntV2/src/Monitoreo.Core.Server/node_modules/ts-node/src/index.ts:851:44)
    at Module.load (internal/modules/cjs/loader.js:985:32)
    at Function.Module._load (internal/modules/cjs/loader.js:878:14)
    at Module.require (internal/modules/cjs/loader.js:1025:19)
    at require (internal/modules/cjs/helpers.js:72:18)
    at Object.<anonymous> (/home/martin/Trabajo/SevntV2/src/Monitoreo.Core.Server/src/commons/env.ts:5:1)
    at Module._compile (internal/modules/cjs/loader.js:1137:30)
Waiting for the debugger to disconnect...

My package.json is:

{
  "name": "monitoreo.core.server",
  "version": "1.0.0",
  "description": "Modulo Core de Reco Facial y Monitoreo Sevnt",
  "main": "dist/app.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@tensorflow/tfjs": "^2.3.0",
    "@tensorflow/tfjs-node-gpu": "^2.3.0",
    "@types/express": "^4.17.7",
    "@vladmandic/face-api": "^0.4.1",
    "dotenv": "^8.2.0",
    "event-loop-stats": "^1.3.0",
    "express": "^4.17.1",
    "microjob": "^0.7.0",
    "rtsp-ffmpeg": "0.0.14",
    "socket.io": "^2.3.0",
    "tslib": "^1.11.1"
  },
  "devDependencies": {
    "@types/jasmine": "^3.5.14",
    "@types/node": "^13.9.2",
    "@types/socket.io": "^2.1.6",
    "canvas": "2.6.1",
    "jasmine": "^3.5.0",
    "jasmine-core": "^3.5.0",
    "karma": "^4.4.1",
    "karma-chrome-launcher": "^3.1.0",
    "karma-jasmine": "^3.1.1",
    "karma-typescript": "^5.1.0",
    "rollup": "^2.26.8",
    "rollup-plugin-commonjs": "^10.1.0",
    "rollup-plugin-node-resolve": "^5.2.0",
    "rollup-plugin-typescript2": "^0.26.0",
    "rollup-plugin-uglify": "^6.0.4",
    "ts-node": "^8.7.0",
    "typescript": "^3.8.3"
  }
}
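
For reference, a hedged sketch of loading the package in a CommonJS / ts-node context: the expectation (an assumption based on the error path, which points at `build/index.js` containing an ESM `import`) is that requiring the package root resolves to the CommonJS bundle instead. The try/catch keeps the sketch runnable even where the package is not installed.

```javascript
// Sketch: prefer require() of the package root in a CommonJS setup
// instead of importing a file that ships as an ES module.
let faceapi;
try {
  faceapi = require('@vladmandic/face-api'); // expected to resolve to the CJS build
  console.log('face-api loaded');
} catch (err) {
  console.log('face-api not installed here:', err.code);
}
```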

Implement head position angles (yaw, roll, pitch)

Hi, do you have plans to implement head position angles (yaw, roll, pitch)?

pose: {
  pitch_angle: {value: 11.102898}
  roll_angle: {value: -20.291693}
  yaw_angle: {value: 14.172521}
}

like the face-api discussion here: https://github.com/justadudewhohacks/face-api.js/issues/107
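
As it happens, detection results from this module already carry an `angle` object with roll, pitch, and yaw (visible in the detection JSON dumps elsewhere on this page). A small sketch of reading it; the function name is illustrative, and the radians-to-degrees conversion is an assumption about the units (values like 0.83 in the dumps suggest radians rather than degrees).

```javascript
// Sketch: read head pose from a detection result's `angle` field.
// Assumption: angles are in radians.
function headPoseDegrees(detection) {
  const { roll = 0, pitch = 0, yaw = 0 } = detection.angle || {};
  const deg = (r) => (r * 180) / Math.PI;
  return { roll: deg(roll), pitch: deg(pitch), yaw: deg(yaw) };
}

console.log(headPoseDegrees({ angle: { roll: Math.PI / 6, pitch: 0, yaw: -Math.PI / 2 } }));
```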

Missing Types

Hi Vladimir,
Although it works normally, your module is missing the types. When I try to use it in VS Code, this message is shown when hovering the import:

Could not find a declaration file for module '@vladmandic/face-api'.
'.../node_modules/@vladmandic/face-api/dist/face-api.node.js' implicitly has an 'any' type.
Try `npm i --save-dev @types/vladmandic__face-api`
 if it exists or add a new declaration (.d.ts) file containing `declare module '@vladmandic/face-api';`js(7016)

import * as faceapi from "@vladmandic/face-api";
or:
import * as faceapi2 from '@vladmandic/face-api/dist/face-api.esm.js';

The original face-api.js works.

Later, I plan to clone your project locally to see what we can do.

faceapi is working for me on desktop but not on mobile

faceapi is working for me on desktop but not on mobile.

I am using wasm, as recommended to me, and it worked perfectly for a long time, both on desktop and mobile.

import * as faceapi from "@vladmandic/face-api/dist/face-api.esm.js";

async start() {
  /** face-api is only needed for the selfie */
  if (this.type == "selfie") {
    await this.backendWasm();
    await this.setupFaceAPI();
  }
},
async setupFaceAPI() {
  await faceapi.nets.tinyFaceDetector.loadFromUri("/statics/models");
},

async backendWasm() {
  await faceapi.tf.setWasmPaths("../statics/");
  await faceapi.tf.setBackend("wasm");
},

A couple of days ago it stopped working on mobile. Could it be a question of library versions? TensorFlow? face-api?
[screenshots: browser network logs, one from mobile and two from desktop]

If you look at the first screenshot, not all the wasm files load; that's on mobile.

But in the other two screenshots you can see that all the files load; that's on desktop.

I use Chrome.

"@tensorflow/tfjs": "^3.3.0",
"@tensorflow/tfjs-backend-wasm": "^3.3.0",
"@vladmandic/face-api": "^1.1.12",

[screenshot]

Whether or not to use withFaceExpressions() and withAgeAndGender() in detectAllFaces

Hi,
I have two actions, register and compare. I would like to ask whether
to use withFaceExpressions() and withAgeAndGender() or not.
When would I use them?
Thanks in advance.
```
const ssdOptions = { minConfidence: 0.1, maxResults: 10 };
const optionsSSDMobileNet = new faceapi.SsdMobilenetv1Options(ssdOptions);
const faces = await faceapi
  .detectAllFaces(tensor, optionsSSDMobileNet)
  .withFaceLandmarks()
  .withFaceExpressions()
  .withFaceDescriptors()
  .withAgeAndGender();
```
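
For register/compare flows, only the descriptors (and the landmarks they are computed from) are needed; withFaceExpressions() and withAgeAndGender() run extra model passes whose results FaceMatcher never consumes, so they only make sense when you actually display emotion or age/gender. A minimal sketch; the function name is illustrative.

```javascript
// Sketch: the minimal chain for face registration / comparison.
// Expressions and age/gender are extra inference passes FaceMatcher never uses.
async function describeFacesForMatching(faceapi, input, options) {
  return faceapi
    .detectAllFaces(input, options)
    .withFaceLandmarks()     // landmarks are needed to align faces for descriptors
    .withFaceDescriptors();  // descriptors are what register/compare consume
}
```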

Migrating from the old face-api to @vladmandic/face-api

I am using the old face-api with React + Next.js, and it is running on Node v8, which raises many issues. As far as I can tell, the old face-api ran 100% client-side, so it put no load on the server. Can I implement the new @vladmandic/face-api in the same manner? I want it to put no load on the server, so that I don't need to worry if website traffic rises.

issues importing face-api on a custom server configuration with webpack

If I use either of these two:

import * as faceapi from "components/lib/face-api.esm";
import * as faceapi from "@vladmandic/face-api/dist/face-api.esm";

it works fine for me in my project locally,

but on the server, importing the library from node_modules doesn't work:

import * as faceapi from "@vladmandic/face-api/dist/face-api.esm";
Module not found: Error: Can't resolve '@vladmandic/face-api' in '/opt/whatsign/frontend/src/components/lib'
resolve '@vladmandic/face-api' in '/opt/whatsign/frontend/src/components/lib'
  Parsed request is a module
  using description file: /opt/whatsign/frontend/package.json (relative path: ./src/components/lib)
    Field 'browser' doesn't contain a valid alias configuration
    resolve as module
      /opt/whatsign/frontend/src/components/lib/node_modules doesn't exist or is not a directory
      /opt/whatsign/frontend/src/components/node_modules doesn't exist or is not a directory
      /opt/whatsign/frontend/src/node_modules doesn't exist or is not a directory
      /opt/whatsign/node_modules doesn't exist or is not a directory
      /opt/node_modules doesn't exist or is not a directory
      /node_modules doesn't exist or is not a directory
      looking for modules in /opt/whatsign/frontend/node_modules
        using description file: /opt/whatsign/frontend/package.json (relative path: ./node_modules)
          Field 'browser' doesn't contain a valid alias configuration
          using description file: /opt/whatsign/frontend/package.json (relative path: ./node_modules/@vladmandic/face-api)
            no extension
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api doesn't exist
            .mjs
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api.mjs doesn't exist
            .js
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api.js doesn't exist
            .vue
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api.vue doesn't exist
            .json
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api.json doesn't exist
            .wasm
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api.wasm doesn't exist
            as directory
              /opt/whatsign/frontend/node_modules/@vladmandic/face-api doesn't exist
      looking for modules in /opt/whatsign/frontend/node_modules/@quasar/app/node_modules
        using description file: /opt/whatsign/frontend/node_modules/@quasar/app/package.json (relative path: ./node_modules)
          Field 'browser' doesn't contain a valid alias configuration
          using description file: /opt/whatsign/frontend/node_modules/@quasar/app/package.json (relative path: ./node_modules/@vladmandic/face-api)
            no extension
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api doesn't exist
            .mjs
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.mjs doesn't exist
            .js
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.js doesn't exist
            .vue
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.vue doesn't exist
            .json
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.json doesn't exist
            .wasm
              Field 'browser' doesn't contain a valid alias configuration
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.wasm doesn't exist
            as directory
              /opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api doesn't exist
[/opt/whatsign/frontend/src/components/lib/node_modules]
[/opt/whatsign/frontend/src/components/node_modules]
[/opt/whatsign/frontend/src/node_modules]
[/opt/whatsign/node_modules]
[/opt/node_modules]
[/node_modules]
[/opt/whatsign/frontend/node_modules/@vladmandic/face-api]
[/opt/whatsign/frontend/node_modules/@vladmandic/face-api.mjs]
[/opt/whatsign/frontend/node_modules/@vladmandic/face-api.js]
[/opt/whatsign/frontend/node_modules/@vladmandic/face-api.vue]
[/opt/whatsign/frontend/node_modules/@vladmandic/face-api.json]
[/opt/whatsign/frontend/node_modules/@vladmandic/face-api.wasm]
[/opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api]
[/opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.mjs]
[/opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.js]
[/opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.vue]
[/opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.json]
[/opt/whatsign/frontend/node_modules/@quasar/app/node_modules/@vladmandic/face-api.wasm]
 @ ./src/components/lib/faceapi.js 1:14-45
 @ ./node_modules/@quasar/app/lib/webpack/loader.transform-quasar-imports.js!./node_modules/babel-loader/lib??ref--2-0!./node_modules/@quasar/app/lib/webpack/loader.auto-import-client.js?kebab!./node_modules/vue-loader/lib??vue-loader-options!./src/components/Camare4.vue?vue&type=script&lang=js&
 @ ./src/components/Camare4.vue?vue&type=script&lang=js&
 @ ./src/components/Camare4.vue
 @ ./node_modules/@quasar/app/lib/webpack/loader.transform-quasar-imports.js!./node_modules/babel-loader/lib??ref--2-0!./node_modules/@quasar/app/lib/webpack/loader.auto-import-client.js?kebab!./node_modules/vue-loader/lib??vue-loader-options!./src/pages/signingProcess/selfiePhotoFaceApi.vue?vue&type=script&lang=js&
 @ ./src/pages/signingProcess/selfiePhotoFaceApi.vue?vue&type=script&lang=js&
 @ ./src/pages/signingProcess/selfiePhotoFaceApi.vue
 @ ./src/router/routes.js
 @ ./src/router/index.js
 @ ./.quasar/app.js
 @ ./.quasar/client-entry.js
 @ multi ./.quasar/client-entry.js

@vladmandic/face-api usage

I have been having some issues lately with face-api.js and have not been able to deploy it on the server, so I tried your version, and I'm not sure if I'm using it right or not. I'm getting a bunch of errors; can you please review the code and errors?

pasting the code here.

const express = require("express");
const faceapi = require("@vladmandic/face-api");
const fetch = require("node-fetch");
const path = require("path");
const canvas = require("canvas");
const tnsr = require("@tensorflow/tfjs-node");

const faceDetectionNet = faceapi.nets.ssdMobilenetv1;

const minConfidence = 0.5;
const trainingDataLength = 1;
const inputSize = 384;

const scoreThreshold = 0.5;

function getFaceDetectorOptions(net = faceapi.nets.ssdMobilenetv1) {
  return net === faceapi.nets.ssdMobilenetv1
    ? new faceapi.SsdMobilenetv1Options({
        minConfidence,
      })
    : new faceapi.TinyFaceDetectorOptions({
        inputSize,
        scoreThreshold,
      });
}
const faceDetectionOptions = getFaceDetectorOptions(faceDetectionNet);

const { Canvas, Image, ImageData } = canvas;
faceapi.env.monkeyPatch({ Canvas, Image, ImageData });
faceapi.env.monkeyPatch({ fetch: fetch });

loadModels = async () => {
  const weightsDir = path.resolve(__dirname, "src/weights");
  await faceDetectionNet.loadFromDisk(weightsDir);
  await faceapi.nets.faceLandmark68Net.loadFromDisk(weightsDir);
  await faceapi.nets.faceRecognitionNet.loadFromDisk(weightsDir);
};

const app = express();
verifyFace = async () => {
  await loadModels();
  const dir1 = path.resolve(__dirname, "a.jpg");
  const dir2 = path.resolve(__dirname, "b.jpg");
  const application_id = "123";
  const QUERY_IMAGE = dir1;
  const queryImage = await canvas.loadImage(QUERY_IMAGE);

const resultsQuery = await faceapi
	.detectSingleFace(queryImage, faceDetectionOptions)
	.withFaceLandmarks()
	.withFaceDescriptor();
const labeledFaceDescriptors = await loadLabeledImages();
const faceMatcher = new faceapi.FaceMatcher(labeledFaceDescriptors);
if (resultsQuery) {
	const bestMatch = faceMatcher.findBestMatch(resultsQuery.descriptor);
	var match_percentage = 100 - Math.ceil(bestMatch._distance * 100);
	db.query(
		"update required_documents set _8applicant_image_verification_status = ? where application_id  = ?",
		[match_percentage, application_id],
		(err, result, field) => {
			if (err) throw err;
			if (result.affectedRows > 0) {
				console.log("match percentage added", match_percentage);
			} else {
				console.log("match percentage not added");
			}
		}
	);
}
const labels = faceMatcher.labeledDescriptors.map((ld) => ld.label);

function loadLabeledImages() {
	const labels = [application_id];
	return Promise.all(
		labels.map(async (label) => {
			const descriptions = [];
			//              let url = `../public/images/${application_id}/training_data`;

			//       const TrainingData = path.resolve(__dirname, dir2);
			for (let i = 1; i <= trainingDataLength; i++) {
				//   const REFERENCE_IMAGE = TrainingData + `/abc (${i}).jpg`;
				const referenceImage = await canvas.loadImage(dir2);
				const detections = await faceapi
					.detectSingleFace(referenceImage, faceDetectionOptions)
					.withFaceLandmarks()
					.withFaceDescriptor();
				descriptions.push(detections.descriptor);
			}

			return new faceapi.LabeledFaceDescriptors(label, descriptions);
		})
	);
}

};

and the errors I'm getting are below:

user2@Hamza MINGW64 ~/Documents/letsTryAgain
$ node app.js
node-pre-gyp info This Node instance does not support builds for N-API version 6
node-pre-gyp info This Node instance does not support builds for N-API version 6
2020-09-08 14:39:17.082639: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
(node:13856) UnhandledPromiseRejectionWarning: TypeError: t.toFloat is not a function
at C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:21279:59
at Array.map ()
at C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:21279:46
at C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:2604:16
at Engine.scopedRun (C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:2614:19)
at Engine.tidy (C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:2603:17)
at tidy (C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:4327:17)
at NetInput.toBatchTensor (C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:21263:12)
at C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:23288:33
at C:\Users\user2\Documents\letsTryAgain\node_modules\@vladmandic\face-api\dist\face-api.cjs:2604:16
(node:13856) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag --unhandled-rejections=strict (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:13856) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Please review the code and errors and suggest a solution, or if possible, modify it for me. I am trying to recognize faces on the server side.
Thanks in advance.
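
One thing worth checking, as a hedged guess only: the code requires `@tensorflow/tfjs-node` alongside the face-api bundle, so tensors created by one tf instance may end up being fed to models running on another, which can surface as errors like `t.toFloat is not a function`. A tiny comparison sketch; the function is illustrative, not a library API.

```javascript
// Sketch: compare the tf instance bundled with face-api against a separately
// required one; distinct instances or differing versions are a red flag.
function tfLooksConsistent(faceapiTf, externalTf) {
  if (faceapiTf === externalTf) return true;                  // same instance: fine
  return faceapiTf.version_core === externalTf.version_core;  // at least same version
}
```

Called as `tfLooksConsistent(faceapi.tf, tnsr)` in the code above, a `false` result would point at a version mismatch between the two tfjs copies.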

Face-api best match issue

When trying to match two detection results I got this error; the error, base image data, image data, and code snippets are below.

Environment: Node.js

Matcher code:

const baseFaceTensor = await urlImageFace(imageUrl);
        let baseImageDetection = await FACE_API.detectSingleFace(baseFaceTensor);

        const faceTensor = await urlImageFace(imageUrl);
        let imageDetection = await FACE_API.detectSingleFace(faceTensor);

        console.log(`PARAMS`);
        console.log(`Base image detection : ${JSON.stringify(baseImageDetection)}`);
        console.log(`Image detection : ${JSON.stringify(imageDetection)}`);

        let faceMatcher = await FACE_API.getFaceMatcher(baseImageDetection, CX.AIMonitoringUtil.FACE_DISTANCE_THRESHOLD);
        let bestMatch = faceMatcher.findBestMatch((imageDetection as any).descriptor);
        console.log(`Best match : ${JSON.stringify(bestMatch)}`);

urlImageFace function:

async function urlImageFace(url: string): Promise<Canvas> {
    const img = await canvas.loadImage(url);
    const c: Canvas = canvas.createCanvas(img.width, img.height);
    const ctx = c.getContext('2d');
    ctx.drawImage(img, 0, 0, img.width, img.height);
    return c;
}

Error:

Error: FaceRecognizer.constructor - expected inputs to be of type LabeledFaceDescriptors | WithFaceDescriptor<any> | Float32Array | Array<LabeledFaceDescriptors | WithFaceDescriptor<any> | Float32Array>
    at C:\Development\hemsor\functions\node_modules\@vladmandic\face-api\dist\face-api.node.js:4481:13
    at Array.map (<anonymous>)
    at new FaceMatcher (C:\Development\hemsor\functions\node_modules\@vladmandic\face-api\dist\face-api.node.js:4471:43)
    at Object.<anonymous> (C:\Development\hemsor\functions\lib\util\vlad-face-api.util.js:19:16)
    at Generator.next (<anonymous>)
    at C:\Development\hemsor\functions\lib\util\vlad-face-api.util.js:8:71
    at new Promise (<anonymous>)
    at __awaiter (C:\Development\hemsor\functions\lib\util\vlad-face-api.util.js:4:12)
    at Object.getFaceMatcher (C:\Development\hemsor\functions\lib\util\vlad-face-api.util.js:18:12)
    at C:\Development\hemsor\functions\lib\util\ai-util.js:109:71

Base image detection (condensed from the full JSON dump): {"detection":{"_imageDims":{"_width":640,"_height":480},"_score":0.9686928987503052,"_box":{"_x":454.96,"_y":219.26,"_width":100.70,"_height":121.38}},"landmarks":{…68 positions…},"unshiftedLandmarks":{…},"alignedRect":{…},"angle":{"roll":0.0525,"pitch":0.8318,"yaw":-0.1930},"expressions":{"neutral":0.0074,"happy":0.9923,…}} — note there is no "descriptor" field in the dump.

Image detection: identical to the base image detection (same score, box, and landmarks; the dump was truncated).
9.88436728715897,"_y":296.3741569519043},{"_x":485.24633407592773,"_y":294.77281177043915},{"_x":489.85130846500397,"_y":293.72017508745193},{"_x":463.71563509106636,"_y":267.46946316957474},{"_x":465.22086694836617,"_y":264.9259752333164},{"_x":470.7351824641228,"_y":264.15974792838097},{"_x":476.72491282224655,"_y":266.8614670932293},{"_x":471.60392463207245,"_y":269.0662105977535},{"_x":466.11671432852745,"_y":269.2822581231594},{"_x":499.16064500808716,"_y":265.42910647392273},{"_x":503.96093785762787,"_y":261.68926510214806},{"_x":510.0673234462738,"_y":261.6694568991661},{"_x":516.157124042511,"_y":264.71637177467346},{"_x":510.39352774620056,"_y":266.95047959685326},{"_x":503.7195062637329,"_y":266.5281714498997},{"_x":470.3068068623543,"_y":312.33296221494675},{"_x":472.18376606702805,"_y":308.0835347175598},{"_x":477.0624226331711,"_y":305.3708242177963},{"_x":481.02106511592865,"_y":305.8787118792534},{"_x":484.65346455574036,"_y":304.50840109586716},{"_x":495.04682183265686,"_y":306.8852195739746},{"_x":505.4706907272339,"_y":309.66006284952164},{"_x":494.9705547094345,"_y":312.89019548892975},{"_x":487.4601876735687,"_y":315.1039686203003},{"_x":481.3606923818588,"_y":315.98395335674286},{"_x":476.9816866517067,"_y":315.8076087832451},{"_x":472.9878506064415,"_y":314.95839834213257},{"_x":471.2059488892555,"_y":312.17052268981934},{"_x":477.1176925301552,"_y":309.77621471881866},{"_x":481.7724800109863,"_y":309.25895845890045},{"_x":487.95234620571136,"_y":309.3286928534508},{"_x":503.8042551279068,"_y":309.58249604701996},{"_x":487.6369780302048,"_y":309.75305646657944},{"_x":481.8918204307556,"_y":310.40967333316803},{"_x":477.4653887748718,"_y":310.478051841259}]},"unshiftedLandmarks":{"_imgDims":{"_width":100,"_height":121},"_shift":{"_x":0,"_y":0},"_positions":[{"_x":2.8363198041915894,"_y":55.53846165537834},{"_x":0.9282846003770828,"_y":67.30577111244202},{"_x":1.7212577164173126,"_y":78.87702876329422},{"_x":4.001714661717415,"_y":89.377936661243
44},{"_x":7.2853438556194305,"_y":101.04729413986206},{"_x":11.003860831260681,"_y":109.40444457530975},{"_x":14.885737001895905,"_y":114.35240602493286},{"_x":19.981645047664642,"_y":118.29585754871368},{"_x":32.33825862407684,"_y":120.9450865983963},{"_x":47.402822971343994,"_y":117.975082218647},{"_x":61.95681095123291,"_y":114.05274069309235},{"_x":73.8696277141571,"_y":109.12772113084793},{"_x":84.19778347015381,"_y":100.01971971988678},{"_x":91.24339818954468,"_y":88.34456366300583},{"_x":94.91926431655884,"_y":76.26055014133453},{"_x":97.84825444221497,"_y":63.417521476745605},{"_x":99.70425367355347,"_y":50.170067220926285},{"_x":1.760033331811428,"_y":39.33026343584061},{"_x":3.340226411819458,"_y":34.25826722383499},{"_x":7.473420351743698,"_y":31.679371178150177},{"_x":12.407904118299484,"_y":31.29122704267502},{"_x":17.608967423439026,"_y":32.74717864394188},{"_x":38.62195312976837,"_y":31.830199122428894},{"_x":45.1655775308609,"_y":29.64791429042816},{"_x":53.70455980300903,"_y":28.860158681869507},{"_x":63.58250379562378,"_y":31.17997944355011},{"_x":72.19768166542053,"_y":35.650726556777954},{"_x":26.50584876537323,"_y":45.70918273925781},{"_x":23.413443565368652,"_y":53.436697006225586},{"_x":19.939321279525757,"_y":60.61528640985489},{"_x":18.8243567943573,"_y":67.92525255680084},{"_x":19.45754438638687,"_y":75.19814097881317},{"_x":21.494296193122864,"_y":76.5531114935875},{"_x":24.92510825395584,"_y":77.1145133972168},{"_x":30.28707504272461,"_y":75.51316821575165},{"_x":34.89204943180084,"_y":74.46053153276443},{"_x":8.756376057863235,"_y":48.20981961488724},{"_x":10.26160791516304,"_y":45.66633167862892},{"_x":15.775923430919647,"_y":44.900104373693466},{"_x":21.765653789043427,"_y":47.601823538541794},{"_x":16.644665598869324,"_y":49.806567043066025},{"_x":11.157455295324326,"_y":50.02261456847191},{"_x":44.20138597488403,"_y":46.16946291923523},{"_x":49.001678824424744,"_y":42.429621547460556},{"_x":55.10806441307068,"_y":42.40981334447861},{
"_x":61.19786500930786,"_y":45.45672821998596},{"_x":55.43426871299744,"_y":47.690836042165756},{"_x":48.760247230529785,"_y":47.26852789521217},{"_x":15.347547829151154,"_y":93.07331866025925},{"_x":17.22450703382492,"_y":88.82389116287231},{"_x":22.103163599967957,"_y":86.11118066310883},{"_x":26.061806082725525,"_y":86.61906832456589},{"_x":29.69420552253723,"_y":85.24875754117966},{"_x":40.087562799453735,"_y":87.62557601928711},{"_x":50.51143169403076,"_y":90.40041929483414},{"_x":40.011295676231384,"_y":93.63055193424225},{"_x":32.5009286403656,"_y":95.8443250656128},{"_x":26.4014333486557,"_y":96.72430980205536},{"_x":22.02242761850357,"_y":96.54796522855759},{"_x":18.028591573238373,"_y":95.69875478744507},{"_x":16.2466898560524,"_y":92.91087913513184},{"_x":22.158433496952057,"_y":90.51657116413116},{"_x":26.813220977783203,"_y":89.99931490421295},{"_x":32.99308717250824,"_y":90.06904929876328},{"_x":48.844996094703674,"_y":90.32285249233246},{"_x":32.67771899700165,"_y":90.49341291189194},{"_x":26.93256139755249,"_y":91.15002977848053},{"_x":22.5061297416687,"_y":91.2184082865715}]},"alignedRect":{"_imageDims":{"_width":640,"_height":480},"_score":0.9686928987503052,"_classScore":0.9686928987503052,"_className":"","_box":{"_x":446.00994672626257,"_y":238.91130944490433,"_width":118.53116288781166,"_height":110.50191349983216}},"angle":{"roll":0.05245019508173883,"pitch":0.8317519703872617,"yaw":-0.19297368625573164},"expressions":{"neutral":0.00735729094594717,"happy":0.9922767281532288,"sad":0.00025446360814385116,"angry":0.00000297511974167719,"fearful":1.666869864358489e-9,"disgusted":0.00010860323527595028,"surprised":1.277310346381455e-8}}

WebGL silent issues cause empty results on some clients

Good day,

I switched from "face-api.js": "0.22.2" to "@vladmandic/face-api": "1.5.3".
node -v: v14.15.0, 13.latest, v14.15.17
I use face recognition and matching of 2 faces in the browser with React.js.

I am facing a problem: on some users' computers the code below returns []

        const detections = await faceapi
            .detectSingleFace(imageRef.current, new faceapi.SsdMobilenetv1Options())
            .withFaceLandmarks()
            .withFaceDescriptor();
        console.log('fl detection', detections);

All affected machines run the latest version of Chrome, on Windows 7, 8, or 10.
For example, on these computers the test page https://vladmandic.github.io/face-api/demo/ does not draw any points on the sample faces.

Can you tell me which way to dig? What could the problem be?

Screenshot 1
Screenshot 2
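One way to narrow this down (a sketch, not a library API; `detectWithFallback` is an illustrative name): WebGL can fail silently on some GPU/driver combinations, returning empty results without throwing, so retrying the same input on the cpu backend isolates whether the backend is at fault.

```javascript
// Illustrative sketch: retry detection on the cpu backend when webgl
// silently returns no result. `faceapi` is passed in explicitly;
// detectWithFallback is not part of the library API.
async function detectWithFallback(faceapi, input) {
  let result = await faceapi
    .detectSingleFace(input, new faceapi.SsdMobilenetv1Options())
    .withFaceLandmarks()
    .withFaceDescriptor();
  if (!result && faceapi.tf.getBackend() === 'webgl') {
    // webgl may fail silently on some GPU/driver combinations;
    // retry once on the slower but more compatible cpu backend
    await faceapi.tf.setBackend('cpu');
    await faceapi.tf.ready();
    result = await faceapi
      .detectSingleFace(input, new faceapi.SsdMobilenetv1Options())
      .withFaceLandmarks()
      .withFaceDescriptor();
  }
  return result;
}
```

If the cpu retry finds faces where webgl does not, the problem is the WebGL backend on that machine, not the models or the input.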

processing of one specific image results in an exception

identify "media/Photos/Concourse d'Elegance/Modern (21).jpg"
media/Photos/Concourse d'Elegance/Modern (21).jpg JPEG 5075x3765 5075x3765+0+0 8-bit sRGB 10.0642MiB 0.000u 0:00.009
Uncaught (in promise) DOMException: Failed to execute 'drawImage' on 'CanvasRenderingContext2D': The image argument is a canvas element with a width or height of 0.
      Ur @ imageToSquare.ts:24
      (anonymous) @ NetInput.ts:147
      (anonymous) @ NetInput.ts:130
      (anonymous) @ engine.js:327
      scopedRun @ engine.js:337
      tidy @ engine.js:326
      tidy @ globals.ts:192
      toBatchTensor @ NetInput.ts:129
      (anonymous) @ FaceRecognitionNet.ts:25
      (anonymous) @ engine.js:327
      scopedRun @ engine.js:337
      tidy @ engine.js:326
      tidy @ globals.ts:192
      forwardInput @ FaceRecognitionNet.ts:24
      (anonymous) @ FaceRecognitionNet.ts:66
      (anonymous) @ engine.js:327
      scopedRun @ engine.js:337
      tidy @ engine.js:326
      tidy @ globals.ts:192
      computeFaceDescriptor @ FaceRecognitionNet.ts:65
      async function (async)
      then @ ComposableTask.ts:6
      setTimeout (async)
      tryFn @ util_base.js:289
      (anonymous) @ util_base.js:291
      repeatedTry2 @ util_base.js:276
      addItemToPoll @ gpgpu_context.js:391
      (anonymous) @ gpgpu_context.js:372
      pollFence @ gpgpu_context.js:371
      createAndWaitForFence @ gpgpu_context.js:163
      read @ backend_webgl.js:246
      read @ backend_webgl.js:223
      read @ engine.js:929
      data @ tensor.js:230
      detect @ modelDetect.js:135
      async function (async)
      detect @ modelDetect.js:106
      process2 @ processImage.js:235
      async function (async)
      process2 @ processImage.js:202
      processFiles @ process.js:67
      async function (async)
      processFiles @ process.js:39
      main @ process.js:121
      async function (async)
      main @ process.js:111
      load (async)
      (anonymous) @ process.js:124
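The crash comes from `drawImage` being handed a face region whose computed width or height rounds down to zero. A hedged workaround sketch (`isUsableBox` is an illustrative helper, not a library function): filter out degenerate boxes before running the recognition net.

```javascript
// Illustrative helper (not part of faceapi): reject detection boxes whose
// extracted canvas would have zero width or height and crash drawImage.
function isUsableBox(box, minSize = 1) {
  return Number.isFinite(box.x) && Number.isFinite(box.y)
    && box.width >= minSize && box.height >= minSize;
}

// usage sketch: const safe = detections.filter((d) => isUsableBox(d.box));
```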

Can't `require('@vladmandic/face-api')` and can't import

I switched to using this repo from face-api.js, but there are differences not noted in your README.

In particular this one:
Can't require('@vladmandic/face-api') and can't import either:

[1] /Users/me/app/electron/face-extraction.js:1
[1] import * as faceapi from '@vladmandic/face-api';
[1] ^^^^^^
[1]
[1] SyntaxError: Cannot use import statement outside a module

I used to:

faceapi = require("face-api.js");

But I cannot:

const faceapi = require('@vladmandic/face-api');

Error when requiring:

[1] Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /Users/me/app/node_modules/@vladmandic/face-api/dist/face-api.esm.js
[1] require() of ES modules is not supported.
[1] require() of /Users/me/app/node_modules/@vladmandic/face-api/dist/face-api.esm.js from /Users/me/app/electron/face-extraction.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.
[1] Instead rename face-api.esm.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from /Users/me/app/node_modules/@vladmandic/face-api/package.json.
[1] 

Version: "@vladmandic/face-api": "^0.7.4"
NodeJS Version: 12.16.3
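As the error message itself explains, this release ships with "type": "module" in its package.json, so `require()` fails by design. From a CommonJS file, dynamic `import()` still works; a minimal sketch (`loadModule` is an illustrative wrapper, not a library function):

```javascript
// Illustrative sketch: dynamic import() is available inside CommonJS files,
// so an ESM-only package can still be loaded without converting the project.
async function loadModule(specifier) {
  return import(specifier);
}

// usage sketch:
// const faceapi = await loadModule('@vladmandic/face-api');
```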

Unknown dtype undefined

Hello!

I recently moved from the original face-api.js library to @vladmandic/face-api because I need to use the WASM backend in my face recognition application.

The models work great on the webgl and cpu backends, but when I switch to wasm, the following error shows up:

Error: Unknown dtype undefined

I wonder if I have not configured face-api correctly, or if it is a problem with tfjs or the models.

The source code for the face-api implementation is here: https://github.com/pabloegpf1/Face-App/blob/master/src/scripts/faceApi.js
You can check out the application here: https://www.pabloescriva.com/Face-App/?backend=wasm

incompatibility with web worker

I am using face-api.js and it works great with the wasm backend.
However, trying to do

faceapi.tf.setWasmPaths("../node_modules/@tensorflow/tfjs-backend-wasm/dist/");
await faceapi.tf.setBackend("wasm");

results in an error

Uncaught error: initialization of  backend wasm failed

and when I try to set the backend to cpu, it shows:

 SSDMOBIlENETV1 : load model before inference

I am using a simple demo as shown in the examples, but inside a web worker.
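A hedged setup sketch for the order of operations, which matters with the wasm backend: `setWasmPaths` must run before `setBackend`, and `tf.ready()` must resolve before any model is loaded or inference runs (which is what the "load model before inference" message points at). The path below is an assumption; inside a web worker it must be a URL the worker can actually fetch, so a raw `node_modules` path usually fails. `initWasmBackend` is an illustrative name.

```javascript
// Illustrative init sketch (initWasmBackend is not a library function).
// Order matters: 1) point tfjs at the .wasm binaries, 2) switch backends,
// 3) wait for initialization, and only then load models.
async function initWasmBackend(tf, wasmPath) {
  tf.setWasmPaths(wasmPath);   // must happen before setBackend
  await tf.setBackend('wasm');
  await tf.ready();
  return tf.getBackend();
}

// usage sketch, path is an assumption -- serve the wasm files yourself:
// await initWasmBackend(faceapi.tf, '/assets/tfjs-wasm/');
// await faceapi.nets.ssdMobilenetv1.load(modelPath);
```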

Can't get result from some url

When I try this url, it works and a face is detected:

baseImg = https://cdn.etohum.com//3415.jpg?width=300&height=300&crop=true&type=jpeg&top=true

But with the one below I can't get a result, probably because it is an RGBA image. We hit the same thing on the Human side and, thanks to you, it was solved there. The face-api and Human data types are different, so I guess this time I need a reliable way to convert the image for faceapi:

img = https://firebasestorage.googleapis.com/v0/b/zetform-63ea7.appspot.com/o/organizations%2FuQbMQXCc2tOqjcG4NpEz%2Fmonitorings%2FRvNxY92jwm7FmqjjBSIA%2Ftestinstances%2FykCFB5VobSyqVXkygDOs%2Fphotos%2F3_2021-04-20%2017%3A55%3A25.jpg?alt=media&token=9e4045f3-7132-4b55-8182-ccc35cb406e2

main function

const baseFaceTensor = await urlImageFace(baseImageUrl);
console.log(`Base image url : ${baseImageUrl}`);
let baseImageDetection = await FACE_API.detectSingleFace(baseFaceTensor);
console.log(baseFaceTensor);
console.log(`Base image result : ${JSON.stringify(baseImageDetection)}`);

const faceTensor = await urlImageFace(imageUrl);
console.log(`Image url : ${imageUrl}`);
let imageDetection = await FACE_API.detectSingleFace(faceTensor);
console.log(`Image result : ${JSON.stringify(imageDetection)}`);

converter function

async function urlImageFace(url: string): Promise<Canvas> {
  const img = await canvas.loadImage(url);
  const c: Canvas = canvas.createCanvas(img.width, img.height);
  const ctx = c.getContext('2d');
  ctx.drawImage(img, 0, 0, img.width, img.height);
  return c;
}

img result is undefined

baseImg result ;

{"detection":{"_imageDims":{"_width":300,"_height":300},"_score":0.9976093769073486,"_classScore":0.9976093769073486,"_className":"","_box":{"_x":124.07849729061127,"_y":58.481401205062866,"_width":54.043176770210266,"_height":68.65620017051697}},"landmarks":{"_imgDims":{"_width":54,"_height":68},"_shift":{"_x":124.07849729061127,"_y":58.481401205062866},"_positions":[{"_x":120.85518828034401,"_y":81.02721881866455},{"_x":120.71701441705227,"_y":88.84860563278198},{"_x":120.98090274631977,"_y":96.0622034072876},{"_x":121.51622291654348,"_y":102.37310743331909},{"_x":122.78241972997785,"_y":109.49110436439514},{"_x":125.95677074790001,"_y":115.32119131088257},{"_x":130.28979291021824,"_y":119.44327092170715},{"_x":136.257010191679,"_y":124.08218765258789},{"_x":145.44469159841537,"_y":127.46338200569153},{"_x":155.15684974193573,"_y":126.09272766113281},{"_x":161.93911027908325,"_y":123.1921558380127},{"_x":167.1705765724182,"_y":120.24840211868286},{"_x":171.88425850868225,"_y":115.14139103889465},{"_x":175.22645723819733,"_y":108.75134587287903},{"_x":177.0073914527893,"_y":102.24288082122803},{"_x":178.92500245571136,"_y":95.49140310287476},{"_x":179.8350180387497,"_y":88.06687307357788},{"_x":127.940269947052,"_y":75.45436489582062},{"_x":131.76312348246574,"_y":73.33397936820984},{"_x":136.0147544145584,"_y":73.12188285589218},{"_x":139.93764925003052,"_y":73.84130990505219},{"_x":143.3716133236885,"_y":75.51655995845795},{"_x":158.46489536762238,"_y":77.16854918003082},{"_x":162.0581327676773,"_y":76.36341595649719},{"_x":166.23921489715576,"_y":76.35853803157806},{"_x":170.78442299365997,"_y":77.56704139709473},{"_x":173.98983550071716,"_y":80.27199971675873},{"_x":149.7866354584694,"_y":85.10771954059601},{"_x":149.1838851571083,"_y":90.43016183376312},{"_x":148.42311841249466,"_y":95.14306616783142},{"_x":147.69247180223465,"_y":99.32246899604797},{"_x":142.78614234924316,"_y":100.2460241317749},{"_x":144.79968523979187,"_y":101.60989356040955},{"_x":147.672
54835367203,"_y":102.40084290504456},{"_x":150.8202343583107,"_y":102.04974579811096},{"_x":153.16719889640808,"_y":101.63336110115051},{"_x":132.85474580526352,"_y":81.75076687335968},{"_x":135.57794165611267,"_y":81.2574702501297},{"_x":139.20949870347977,"_y":81.60047125816345},{"_x":142.37340027093887,"_y":83.65502202510834},{"_x":139.38527888059616,"_y":84.35140407085419},{"_x":135.51377302408218,"_y":83.66151916980743},{"_x":157.91564548015594,"_y":85.28257095813751},{"_x":161.4507862329483,"_y":84.01510286331177},{"_x":165.25217378139496,"_y":84.35735607147217},{"_x":167.62154495716095,"_y":85.82754683494568},{"_x":164.6939502954483,"_y":87.08810234069824},{"_x":160.8412960767746,"_y":86.4825668334961},{"_x":134.39271372556686,"_y":106.06995034217834},{"_x":138.76355463266373,"_y":105.34680151939392},{"_x":144.6028035879135,"_y":105.35100865364075},{"_x":147.4310240149498,"_y":106.07166075706482},{"_x":149.98530024290085,"_y":105.82894802093506},{"_x":156.2930817604065,"_y":107.24864101409912},{"_x":160.88993632793427,"_y":108.75533413887024},{"_x":155.0191719532013,"_y":112.29439330101013},{"_x":150.7055457830429,"_y":113.77883076667786},{"_x":146.32683044672012,"_y":113.92894196510315},{"_x":142.47318810224533,"_y":113.0419499874115},{"_x":138.50945180654526,"_y":110.89265561103821},{"_x":134.87744334340096,"_y":106.21800255775452},{"_x":142.9297295808792,"_y":107.0421793460846},{"_x":147.11835139989853,"_y":107.6575436592102},{"_x":151.34333431720734,"_y":108.0135247707367},{"_x":159.98774528503418,"_y":108.62691926956177},{"_x":150.7983877658844,"_y":110.58375144004822},{"_x":146.81495493650436,"_y":110.74887943267822},{"_x":142.94862306118011,"_y":110.17836689949036}]},"unshiftedLandmarks":{"_imgDims":{"_width":54,"_height":68},"_shift":{"_x":0,"_y":0},"_positions":[{"_x":-3.2233090102672577,"_y":22.545817613601685},{"_x":-3.361482873558998,"_y":30.367204427719116},{"_x":-3.0975945442914963,"_y":37.58080220222473},{"_x":-2.5622743740677834,"_y":43.891706
228256226},{"_x":-1.296077560633421,"_y":51.009703159332275},{"_x":1.878273457288742,"_y":56.8397901058197},{"_x":6.211295619606972,"_y":60.96186971664429},{"_x":12.178512901067734,"_y":65.60078644752502},{"_x":21.366194307804108,"_y":68.98198080062866},{"_x":31.078352451324463,"_y":67.61132645606995},{"_x":37.860612988471985,"_y":64.71075463294983},{"_x":43.092079281806946,"_y":61.767000913619995},{"_x":47.805761218070984,"_y":56.65998983383179},{"_x":51.14795994758606,"_y":50.26994466781616},{"_x":52.92889416217804,"_y":43.76147961616516},{"_x":54.8465051651001,"_y":37.01000189781189},{"_x":55.75652074813843,"_y":29.585471868515015},{"_x":3.861772656440735,"_y":16.97296369075775},{"_x":7.684626191854477,"_y":14.852578163146973},{"_x":11.936257123947144,"_y":14.640481650829315},{"_x":15.85915195941925,"_y":15.359908699989319},{"_x":19.29311603307724,"_y":17.03515875339508},{"_x":34.38639807701111,"_y":18.687147974967957},{"_x":37.97963547706604,"_y":17.882014751434326},{"_x":42.160717606544495,"_y":17.877136826515198},{"_x":46.705925703048706,"_y":19.08564019203186},{"_x":49.911338210105896,"_y":21.790598511695862},{"_x":25.708138167858124,"_y":26.626318335533142},{"_x":25.10538786649704,"_y":31.948760628700256},{"_x":24.344621121883392,"_y":36.661664962768555},{"_x":23.613974511623383,"_y":40.84106779098511},{"_x":18.707645058631897,"_y":41.764622926712036},{"_x":20.721187949180603,"_y":43.12849235534668},{"_x":23.59405106306076,"_y":43.91944169998169},{"_x":26.741737067699432,"_y":43.568344593048096},{"_x":29.088701605796814,"_y":43.15195989608765},{"_x":8.776248514652252,"_y":23.269365668296814},{"_x":11.499444365501404,"_y":22.776069045066833},{"_x":15.1310014128685,"_y":23.119070053100586},{"_x":18.294902980327606,"_y":25.17362082004547},{"_x":15.306781589984894,"_y":25.87000286579132},{"_x":11.435275733470917,"_y":25.180117964744568},{"_x":33.83714818954468,"_y":26.801169753074646},{"_x":37.372288942337036,"_y":25.5337016582489},{"_x":41.17367649078369,"_y":2
5.8759548664093},{"_x":43.54304766654968,"_y":27.346145629882812},{"_x":40.615453004837036,"_y":28.606701135635376},{"_x":36.76279878616333,"_y":28.001165628433228},{"_x":10.314216434955597,"_y":47.58854913711548},{"_x":14.68505734205246,"_y":46.865400314331055},{"_x":20.524306297302246,"_y":46.86960744857788},{"_x":23.35252672433853,"_y":47.59025955200195},{"_x":25.90680295228958,"_y":47.34754681587219},{"_x":32.21458446979523,"_y":48.767239809036255},{"_x":36.811439037323,"_y":50.27393293380737},{"_x":30.940674662590027,"_y":53.812992095947266},{"_x":26.62704849243164,"_y":55.29742956161499},{"_x":22.248333156108856,"_y":55.44754076004028},{"_x":18.394690811634064,"_y":54.56054878234863},{"_x":14.43095451593399,"_y":52.41125440597534},{"_x":10.798946052789688,"_y":47.73660135269165},{"_x":18.851232290267944,"_y":48.56077814102173},{"_x":23.039854109287262,"_y":49.17614245414734},{"_x":27.26483702659607,"_y":49.53212356567383},{"_x":35.90924799442291,"_y":50.1455180644989},{"_x":26.719890475273132,"_y":52.10235023498535},{"_x":22.736457645893097,"_y":52.267478227615356},{"_x":18.870125770568848,"_y":51.69696569442749}]},"alignedRect":{"_imageDims":{"_width":300,"_height":300},"_score":0.9976093769073486,"_classScore":0.9976093769073486,"_className":"","_box":{"_x":114.80521405488253,"_y":67.68773294091226,"_width":70.94160434603693,"_height":65.20979897975921}},"angle":{"roll":-0.11672766465663492,"pitch":0.07139194507918284,"yaw":-0.33357952252539474},"expressions":{"neutral":3.8578268357625234e-10,"happy":1,"sad":1.5390822297178808e-12,"angry":3.267717346711052e-11,"fearful":1.6766032938614166e-14,"disgusted":7.074605345991358e-9,"surprised":2.202393432557126e-12},"descriptor":{"0":-0.1349954903125763,"1":-0.00383098516613245,"2":0.026064850389957428,"3":-0.06253636628389359,"4":-0.07294207811355591,"5":-0.10778254270553589,"6":0.04157354682683945,"7":-0.055339716374874115,"8":0.11064469069242477,"9":-0.001926228404045105,"10":0.12700673937797546,"11":-0.07955881
953239441,"12":-0.1769789755344391,"13":-0.012578554451465607,"14":-0.01091816183179617,"15":0.05831509828567505,"16":-0.20441895723342896,"17":-0.10909132659435272,"18":-0.09465648978948593,"19":-0.037060223519802094,"20":0.06026291847229004,"21":-0.023215286433696747,"22":0.025254035368561745,"23":0.0751262903213501,"24":-0.12100088596343994,"25":-0.294894814491272,"26":-0.14616863429546356,"27":-0.007880615070462227,"28":0.05986291915178299,"29":-0.034005340188741684,"30":0.0668824091553688,"31":0.16395270824432373,"32":-0.1730918139219284,"33":-0.11485938727855682,"34":0.0687878206372261,"35":0.1032743901014328,"36":0.005319777876138687,"37":-0.01689540222287178,"38":0.18072715401649475,"39":-0.020627979189157486,"40":-0.13309717178344727,"41":0.05123228579759598,"42":0.050005458295345306,"43":0.36286723613739014,"44":0.14997558295726776,"45":0.1098690778017044,"46":-0.026445843279361725,"47":-0.057895153760910034,"48":0.10304757952690125,"49":-0.28323161602020264,"50":0.06952199339866638,"51":0.19740347564220428,"52":0.048222701996564865,"53":0.031479865312576294,"54":0.0920865386724472,"55":-0.09178395569324493,"56":0.1103983223438263,"57":0.1232658177614212,"58":-0.20526114106178284,"59":0.054151881486177444,"60":0.0885997787117958,"61":-0.03711840882897377,"62":0.08623755723237991,"63":-0.12073357403278351,"64":0.1085720956325531,"65":0.03570355847477913,"66":-0.0881534144282341,"67":-0.09348952770233154,"68":0.05820057541131973,"69":-0.13804568350315094,"70":0.004564085975289345,"71":0.045562125742435455,"72":-0.12411823123693466,"73":-0.19604453444480896,"74":-0.30623453855514526,"75":0.076019287109375,"76":0.43541425466537476,"77":0.1760491132736206,"78":-0.12408383190631866,"79":0.03355195373296738,"80":-0.03132125362753868,"81":-0.03928499296307564,"82":0.1120387464761734,"83":0.00067950040102005,"84":-0.18385279178619385,"85":-0.005263097584247589,"86":-0.13899032771587372,"87":0.06951217353343964,"88":0.15472730994224548,"89":0.02182161808013916,"90":
-0.02992740273475647,"91":0.1791854053735733,"92":0.049786705523729324,"93":-0.028928251937031746,"94":0.0649631917476654,"95":0.025132261216640472,"96":-0.10960439592599869,"97":0.02571445144712925,"98":-0.21299579739570618,"99":0.0500495582818985,"100":0.02644030749797821,"101":-0.08359766006469727,"102":0.022061776369810104,"103":0.12894141674041748,"104":-0.1345120668411255,"105":0.12630078196525574,"106":-0.013402394950389862,"107":-0.07819202542304993,"108":-0.10407000780105591,"109":0.0002585873007774353,"110":-0.1769947111606598,"111":-0.0453077033162117,"112":0.11151634156703949,"113":-0.2253071665763855,"114":0.07506467401981354,"115":0.19966936111450195,"116":0.03586537018418312,"117":0.1311870813369751,"118":0.0607939176261425,"119":0.06628672033548355,"120":-0.04801439493894577,"121":0.05651909112930298,"122":-0.05420826002955437,"123":-0.025805585086345673,"124":0.1171688660979271,"125":0.027032427489757538,"126":0.06286288052797318,"127":0.007505020126700401}}
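If the failing image really is delivered as 4-channel RGBA while the working one is RGB, normalizing the pixel data is one direction to try. A hedged sketch of a plain alpha-stripping helper (`rgbaToRgb` is illustrative, not a faceapi API); in NodeJS, `tf.node.decodeImage(buffer, 3)` achieves the same thing by forcing 3-channel decoding.

```javascript
// Illustrative helper (not part of faceapi): drop the alpha channel from
// RGBA pixel data, e.g. from ctx.getImageData(...).data, producing RGB.
function rgbaToRgb(rgba) {
  const rgb = new Uint8ClampedArray((rgba.length / 4) * 3);
  for (let i = 0, j = 0; i < rgba.length; i += 4, j += 3) {
    rgb[j] = rgba[i];         // R
    rgb[j + 1] = rgba[i + 1]; // G
    rgb[j + 2] = rgba[i + 2]; // B (alpha at rgba[i + 3] is discarded)
  }
  return rgb;
}
```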

Time taken on first run much higher than subsequent runs

Issue Description:
On running the browser API, the first encoding takes about 30x longer than subsequent ones in the same run, and about 15x longer on subsequent runs in the same tab. Can all of this be attributed to cache misses, or can loading be improved even if it takes more time up front?

Steps to Reproduce:
In the demo folder, add two files with pasted content:
TimeMultipleRuns.js

import * as faceapi from '../dist/face-api.esm.js';

const modelPath = "../model";

await faceapi.tf.setBackend('webgl');
faceapi.tf.enableProdMode();
faceapi.tf.ENV.set('DEBUG', false);
await faceapi.tf.ready();

const minScore = 0.3; // minimum score
const maxResults = 10; // maximum number of results to return

const optionsSSDMobileNet = new faceapi.SsdMobilenetv1Options({
    minConfidence: minScore,
    maxResults
});

document.getElementById("run").addEventListener("click", run);

let imgElement;

async function run() {
  await faceapi.nets.ssdMobilenetv1.load(modelPath);
  await faceapi.nets.faceLandmark68Net.load(modelPath);
  await faceapi.nets.faceRecognitionNet.load(modelPath);

  console.log("Models loaded");

  const inputElement = document.getElementById("file");
  const file = inputElement.files[0];

  if (FileReader) { // FileReader supported
    const fr = new FileReader();
    fr.onload = async function () {
      imgElement = new Image();
      imgElement.onload = ImgOnLoad;
      imgElement.src = fr.result;
    };
    fr.readAsDataURL(file);
  } else {
    console.log("File reader not supported");
  }
  console.log(inputElement.files[0]);
}

async function ImgOnLoad() {
  const canvas = document.getElementById("canvas");
  canvas.width = imgElement.width;
  canvas.height = imgElement.height;
  const ctx = canvas.getContext("2d");
  ctx.drawImage(imgElement, 0, 0);

  const encoding = await calculateEncoding(canvas);
  console.log(encoding);
}


async function calculateEncoding(img) {
  let result;

  for (let i = 0; i < 10; ++i) {
    const startTime = performance.now();

    // we want to detect the best face in each file;
    // optionsSSDMobileNet is already an SsdMobilenetv1Options instance,
    // so it can be passed directly without re-wrapping in the constructor
    result = await faceapi
      .detectSingleFace(img, optionsSSDMobileNet)
      .withFaceLandmarks()
      .withFaceDescriptor();

    const endTime = performance.now();
    console.log(`Time for run ${i + 1} in ms: ${endTime - startTime}`);
  }

  // only return a descriptor if a face was found
  return result !== undefined ? result.descriptor : undefined;
}

TimeMultipleRuns.html

<!DOCTYPE html>
<html>
<head>
    <script src="./TimeMultipleRuns.js" type="module"></script>

</head>
<body>
  <input type="file" id="file" name="file"/><br><br>
  <button id="run">Run</button><br><br>
  <canvas id="canvas" width="640" height="480"></canvas>
</body>
</html>

Expected Behavior:
Even if load() takes more time, subsequent runs should only be doing calculations on values already loaded in memory.

Environment:

  • Module version : Latest clone of the repo
  • Browser or NodeJS and version (e.g. NodeJS 14.15 or Chrome 89) : Chrome 89
  • OS and Hardware platform (e.g. Windows 10, Ubuntu Linux on x64, Android 10) : Ubuntu 20 on x64
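The one-time cost on the first call is expected: on the webgl backend the first inference compiles shaders and uploads model weights to the GPU. A common mitigation is a throwaway warm-up inference at startup so the cost lands outside the user-visible path; a hedged sketch (`warmup` is an illustrative name, not a library function):

```javascript
// Illustrative warm-up sketch: run one inference whose result is discarded,
// purely to trigger shader compilation and weight upload ahead of time.
async function warmup(faceapi, input, options) {
  await faceapi
    .detectSingleFace(input, options)
    .withFaceLandmarks()
    .withFaceDescriptor();
}

// usage sketch: call once after the models load, e.g. with a small canvas:
// await warmup(faceapi, someSmallCanvas, optionsSSDMobileNet);
```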

Unable to use faceapi in a Next.js React application: "Module not found: Can't resolve 'fs'"

Issue Description
Getting the following error:

error - ./node_modules/@vladmandic/face-api/dist/face-api.esm.js:8:25031
Module not found: Can't resolve 'fs'

Steps to Reproduce
Create a Next.js application as described here:

https://nextjs.org/learn/basics/create-nextjs-app/setup

Added the face-api packages (npm install), including TensorFlow.
Invoked the API to load the models:
await faceapi.nets.ssdMobilenetv1.loadFromUri('/models')

And got the error shown above.

Expected Behavior
It should have loaded the models.

Environment
Windows 10. Nodejs. React application. Using VS code editor. Browser is chrome/Edge.

  • Module version?
  • Built-in demo or custom code? - custom
  • Type of module used (e.g. js, esm, esm-nobundle)? js
  • Browser or NodeJS and version (e.g. NodeJS 14.15 or Chrome 89)? - Node v14.13.1, Chrome Version 89.0.4389.90 (Official Build) (64-bit)
  • OS and Hardware platform (e.g. Windows 10, Ubuntu Linux on x64, Android 10)? Windows 10
  • Packager (if any) (e.g, webpack, rollup, parcel, esbuild, etc.)? standard webpack

Additional

  • For installation or startup issues include your package.json
  • For usage issues, it is recommended to post your code as gist
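A workaround sketch for the 'fs' resolution error, assuming Next.js with webpack 5 (verify the exact shape against your Next.js version): the esm bundle references the NodeJS-only `fs` module, a reference typically only exercised when running under NodeJS, so it can be stubbed out of client-side bundles via next.config.js.

```javascript
// next.config.js -- hedged sketch, assuming webpack 5 (Next.js >= 11):
// resolve the NodeJS-only 'fs' module to an empty stub in browser bundles.
module.exports = {
  webpack: (config, { isServer }) => {
    if (!isServer) {
      config.resolve.fallback = { ...config.resolve.fallback, fs: false };
    }
    return config;
  },
};
```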

@vladmandic/face-api/dist/face-api.node.js requires ES module???

trying to migrate to your fork of face-api

Using:

let faceapi   = require('@vladmandic/face-api/dist/face-api.node.js');

I get:

Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /home/meeki/node-red-contrib-facial-recognition/node_modules/@vladmandic/face-api/dist/face-api.node.js
require() of ES modules is not supported.
require() of /home/meeki/node-red-contrib-facial-recognition/node_modules/@vladmandic/face-api/dist/face-api.node.js from /home/meeki/node-red-contrib-facial-recognition/facial-recognition.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.
Instead rename face-api.node.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from /home/meeki/node-red-contrib-facial-recognition/node_modules/@vladmandic/face-api/package.json.

interesting
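Given that error, one pattern that works (a sketch, assuming Node >= 12.17): require() cannot load an ES module, but dynamic import() can be called from CommonJS code. The loadEsm helper is hypothetical; the package path is the one from the error above:

```javascript
// Dynamic import() works from CommonJS even when require() throws
// ERR_REQUIRE_ESM; it returns a promise for the module namespace.
async function loadEsm(name) {
  return import(name);
}

// usage: const faceapi = await loadEsm('@vladmandic/face-api/dist/face-api.node.js');
```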

Question - yes/no - JavaScript without polyfills

You state: "Typescript build process now targets ES2018 and instead of dual ES5/ES6
Resulting code is clean ES2018 JavaScript without polyfills"

So I no longer have to use the monkeyPatch?

// patch nodejs environment, we need to provide an implementation of
// HTMLCanvasElement and HTMLImageElement
const { Canvas, Image, ImageData } = canvas
faceapi.env.monkeyPatch({ Canvas, Image, ImageData })

If so, that would be great; I could lose the canvas dependency that has been giving me trouble of late.

thank you,
meeki007

Multiple issues with Face Recognition

Thanks for this package; it is really helpful for lots of developers, and I appreciate your efforts. I am using it in my project. Unfortunately, I am facing some issues, listed below.

My Codebase:

Loading Image:

const fs = require('fs');
const tf = require('@tensorflow/tfjs-node');

const buffer = fs.readFileSync(image_path);
const decoded = tf.node.decodeImage(buffer);
const casted = decoded.toFloat();
const loadedImage = casted.expandDims(0);
decoded.dispose(); // release intermediate tensor from memory
casted.dispose(); // release intermediate tensor from memory

Getting Descriptors:

const result = await FaceAPI
  .detectAllFaces(loadedImage, new FaceAPI.SsdMobilenetv1Options({ minConfidence: 0.5 }))
  .withFaceLandmarks()
  .withFaceDescriptors();

Issue-1: Sometimes it seems to create issues with PNG images. I don't know why; any comment please?
Issue-2: Some images don't have multiple faces, but I still get multiple descriptors. (Let me know where I can share the image.)
Issue-3: Some images have a face but still return no descriptors. I have changed the minConfidence value and it works. How do I determine the right value?
Issue-4: Getting the error below while executing the .withFaceDescriptors() method for a specific image. It seems to just be configuration fine-tuning, but I am unable to identify the right configuration. Also, why does it show UnhandledPromiseRejectionWarning while I am using try-catch in my codebase?

(node:6042) UnhandledPromiseRejectionWarning: Error: Invalid TF_Status: 3
Message: Input to reshape is a tensor with 1048576 values, but the requested shape has 786432
at NodeJSKernelBackend.executeSingleOutput (/FR/code/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:209:43)
at Object.kernelFunc (/home/manish/Desktop/MIND/FR/code/node_modules/@tensorflow/tfjs-node/dist/kernels/Reshape.js:33:27)
at kernelFunc (/FR/code/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3139:32)
at /FR/code/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3203:110
at holdResultWrapperFn (/FR/code/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:1628:23)
at NodeJSKernelBackend.<anonymous> (/FR/code/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:510:17)
at step (/FR/code/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:61:23)
at Object.next (/FR/code/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:42:53)
at /FR/code/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:36:71
at new Promise (<anonymous>)

Environment

  • Ubuntu 18.04 with Node-v12.18.1
  • "@tensorflow/tfjs-node": "^3.3.0"
  • "@vladmandic/face-api": "^1.0.2"
  • "face-api.js": "^0.22.2"
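One plausible explanation for Issue-4 (an assumption based on the numbers in the error, not a confirmed diagnosis): 1048576 = 512 x 512 x 4 while 786432 = 512 x 512 x 3, which suggests a 4-channel (RGBA) PNG being fed to a model expecting 3 channels. Forcing 3 channels at decode time is one way to avoid this; loadImageTensor is a hypothetical helper that assumes @tensorflow/tfjs-node is installed (required lazily inside the function):

```javascript
// Decode an image file to a batched RGB float tensor, forcing 3 channels
// so an RGBA PNG cannot trip a reshape that expects H x W x 3 values.
function loadImageTensor(imagePath) {
  const fs = require('fs'); // assumption: @tensorflow/tfjs-node is installed
  const tf = require('@tensorflow/tfjs-node');
  const buffer = fs.readFileSync(imagePath);
  const decoded = tf.node.decodeImage(buffer, 3); // 3 = force RGB, drop alpha
  const casted = decoded.toFloat();
  const batched = casted.expandDims(0);
  decoded.dispose(); // release intermediate tensors
  casted.dispose();
  return batched;
}
```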

return process.dlopen(module, path.toNamespacedPath(filename));

Hey! I'm using your package in my project for face recognition and it works fine. I copied the same project to a different computer and am getting this error.
I have tried all the solutions available online, reinstalled node modules a number of times, and tried all versions of Node and TensorFlow as well as your package; not sure where the error is coming from.

here is the error stack:

return process.dlopen(module, path.toNamespacedPath(filename));
^

Error: A dynamic link library (DLL) initialization routine failed.
\\?\C:\Users\User\Documents\digitallending_sql\node_modules\@tensorflow\tfjs-node\lib\napi-v5\tfjs_binding.node
at Object.Module._extensions..node (internal/modules/cjs/loader.js:1206:18)
at Module.load (internal/modules/cjs/loader.js:1000:32)
at Function.Module._load (internal/modules/cjs/loader.js:899:14)
at Module.require (internal/modules/cjs/loader.js:1042:19)
at require (internal/modules/cjs/helpers.js:77:18)
at Object.<anonymous> (C:\Users\User\Documents\digitallending_sql\node_modules\@tensorflow\tfjs-node\dist\index.js:58:16)
at Module._compile (internal/modules/cjs/loader.js:1156:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1176:10)
at Module.load (internal/modules/cjs/loader.js:1000:32)
at Function.Module._load (internal/modules/cjs/loader.js:899:14)
at Module.require (internal/modules/cjs/loader.js:1042:19)
at require (internal/modules/cjs/helpers.js:77:18)
at Object.<anonymous> (C:\Users\User\Documents\digitallending_sql\functions\facedetection.js:2:1)
at Module._compile (internal/modules/cjs/loader.js:1156:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1176:10)
at Module.load (internal/modules/cjs/loader.js:1000:32)
at Function.Module._load (internal/modules/cjs/loader.js:899:14)
at Module.require (internal/modules/cjs/loader.js:1042:19)
at require (internal/modules/cjs/helpers.js:77:18)
at Object.<anonymous> (C:\Users\User\Documents\digitallending_sql\controllers\applicationController.js:6:20)
at Module._compile (internal/modules/cjs/loader.js:1156:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1176:10)
at Module.load (internal/modules/cjs/loader.js:1000:32)
at Function.Module._load (internal/modules/cjs/loader.js:899:14)
at Module.require (internal/modules/cjs/loader.js:1042:19)
at require (internal/modules/cjs/helpers.js:77:18)
at Object.<anonymous> (C:\Users\User\Documents\digitallending_sql\routes\applicationRouter.js:3:29)
at Module._compile (internal/modules/cjs/loader.js:1156:30)


I'm using Windows 10, Node 12.16.2.

React native support (mobile)

Please let me know if there is support for React Native, and if not, what changes can be done to support it.

Can't use in a lerna/create-react app project

I am using the library in a TypeScript lerna module which is built with tsc and then bundled as part of a create-react-app. I had no issues with the original face-api but wanted the latest fdjs. I am installing the npm module with npm install.

If I only do:

import * as faceapi from '@vladmandic/face-api'

I get a compile error:

Failed to compile
../face-detect-plugin/node_modules/@vladmandic/face-api/dist/face-api.js 344:47
Module parse failed: Unexpected token (344:47)
File was processed with these loaders:

  • ./node_modules/babel-loader/lib/index.js
    You may need an additional loader to handle the result of these loaders.
    | } : T || {};
    |
    var E = g(f(T.entropy ? [x, w(e)] : x ?? y(), 3), _),

| F = new d(_),
| D = function () {

if I add:

import * as tf from '@tensorflow/tfjs'

before importing face-api, I get an error in the console:

Uncaught SyntaxError: Unexpected token '!'

If I look at the code I see this:

class MathBackendCPU extends !(function webpackMissingModule() { var e = new Error("Cannot find module '@tensorflow/tfjs-core'"); e.code = 'MODULE_NOT_FOUND'; throw e; }()) {
constructor() {
super();
this.blockSize = 48;
this.firstUse = true;
this.data = new !(function webpackMissingModule() { var e = new Error("Cannot find module '@tensorflow/tfjs-core'"); e.code = 'MODULE_NOT_FOUND'; throw e; }())(this, !(function webpackMissingModule() { var e = new Error("Cannot find module '@tensorflow/tfjs-core'"); e.code = 'MODULE_NOT_FOUND'; throw e; }())());
}

node got Error: Size(1048576) must match the product of shape 512,512,3

Using the code from the NodeJS example. I send the base64 image from the client using canvas.toDataURL(); then on NodeJS I use the following code to transform the base64 to jpg before calling FaceAPI.
Note that with a normal jpg from the internet the lib works as expected. However, I downloaded the img.jpg and inspected it, and I can't find any difference.

const content = message.value.toString().replace(/^data:image\/png;base64,/, '');
await fsp.writeFile('./img.jpg', content, 'base64');
const resized = await sharp('./img.jpg')
  .rotate()
  .resize(400, 400)
  .toBuffer();
await fsp.writeFile('./img.jpg', resized);
const tensor = await image('./img.jpg');

const res = await faceapi
  .detectAllFaces(tensor, optionsSSDMobileNet)
  .withFaceLandmarks()
  .withFaceExpressions()
  .withFaceDescriptors()
  .withAgeAndGender();

// const faceMatcher = new faceapi.FaceMatcher(res)
console.log('res:', res);
Here is the error code

        throw Error("Size(" + size + ") must match the product of shape " + shape);
              ^
      Error: Size(1048576) must match the product of shape 512,512,3
          at util.inferFromImplicitShape (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/util_base.ts:317:13)
          at forward (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/ops/reshape.ts:64:13)
          at /home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:625:31
          at /home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:433:20
          at Engine.scopedRun (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:444:19)
          at Engine.tidy (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:431:17)
          at kernelFunc (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:625:20)
          at /home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:639:23
          at Engine.scopedRun (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:444:19)
at Engine.runKernelFunc (/home/thape/faceApi/node_modules/@tensorflow/tfjs-core/src/engine.ts:636:10)
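A plausible cause (an assumption from the numbers, not a confirmed diagnosis): 1048576 = 512 x 512 x 4, so the decoded image likely still carries an alpha channel; the client sends a PNG data URL even though the file is saved with a .jpg extension, and the sharp resize preserves the alpha. Stripping alpha and re-encoding as real JPEG in the sharp pipeline is one fix; normalizeToRgbJpeg is a hypothetical helper that assumes sharp is installed (required lazily inside the function):

```javascript
// Re-encode an image as an RGB JPEG so later decoding yields H x W x 3
// values, even when the input was actually an RGBA PNG.
async function normalizeToRgbJpeg(inputPath, outputPath) {
  const sharp = require('sharp'); // assumption: sharp is installed
  const fsp = require('fs').promises;
  const buffer = await sharp(inputPath)
    .rotate()
    .resize(400, 400)
    .removeAlpha() // drop the alpha channel
    .jpeg()        // re-encode as real JPEG, which has no alpha
    .toBuffer();
  await fsp.writeFile(outputPath, buffer);
}
```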

Unknown file extension

Hi, I cannot use it on Ubuntu:

Error [ERR_UNKNOWN_FILE_EXTENSION]: Unknown file extension: /home/ali/Desktop/Nodejs%20Projects/face-api-test/node_modules/@vladmandic/face-api/dist/face-api.cjs
at Loader.resolve [as _resolve] (internal/modules/esm/default_resolve.js:93:13)
at Loader.resolve (internal/modules/esm/loader.js:58:33)
at Loader.getModuleJob (internal/modules/esm/loader.js:113:40)
at ModuleWrap.promises.module.link (internal/modules/esm/module_job.js:32:40)
at link (internal/modules/esm/module_job.js:31:36)
