
react-native-facerecognition's Introduction


React Native Face Recognition [Android]


Summary

💥 New version with new features @3.1.0!


Have you ever wanted to detect and recognize faces? This is now possible in React Native. This library aims to provide the best detection and recognition results, as well as facial landmarks.

Important

  • New to face recognition? We recommend starting with the default values described below.
  • Looking to contribute? Feel free to join our community on Slack and take a look at the upcoming updates.
  • If the library doesn't work for you, check whether your phone's CPU is one of those:

After downloading, put that package into react-native-facerecognition/android/app/src/main/jniLibs and that's all!

⚠️ Since we're focusing our efforts on the next updates, we warmly welcome any issues/pull requests that improve the library.

Features

  • Face Detection
  • Face Recognition (LBPH Algorithm only)
  • Face tracking
  • Facial landmarks
  • Train your own dataset
  • Works without any internet connection!
  • All devices supported.

Upcoming updates

  • Add EigenFaces Algorithm ❌
  • Add Fisherface Algorithm ❌
  • UI Component ✔️

Real world examples

Face tracker · Face landmarks (example GIFs)


Code Example

import Camera from 'react-native-facerecognition';
...
<Camera
  ref={(cam) => { this.camera = cam; }}
  style={styles.preview}
  aspect={Camera.constants.Aspect.fill}
  captureQuality={Camera.constants.CaptureQuality.high}
  touchToFocus
  torchMode={Camera.constants.TorchMode.on}
  rotateMode={Camera.constants.RotateMode.on}
  cameraType={Camera.constants.CameraType.front}
  model={Camera.constants.Model.lbp}
  dataset
  distance={200}
  onTrained={this.onTrained}
  onRecognized={this.onRecognized}
  onUntrained={this.onUntrained}
  onUnrecognized={this.onUnrecognized}
/>

🔥 Check out our main example for the right way to get started.

Get Started

npm install react-native-facerecognition --save
or
yarn add react-native-facerecognition

⚠️ We highly recommend using the Gradle daemon for faster builds, and ignoring any errors that may occur along the way.

  • Inside projectName/android/build.gradle, add these lines.
repositories {
    ...
    google()
}
...
classpath 'com.android.tools.build:gradle:3.0.1'
...
allprojects {
    repositories {
        ...
        maven { url "https://jitpack.io" }
        maven { url "https://maven.google.com" }
        ...
    }
}
       
  • Then open projectName/android/gradle/wrapper/gradle-wrapper.properties and change the distribution URL.
distributionUrl=https\://services.gradle.org/distributions/gradle-4.4-all.zip

1.1 Direct Setup

  • Link the library after installing it.
react-native link react-native-facerecognition 
  • Add these lines to build.gradle at projectName/android/app/build.gradle.
compileSdkVersion 26
buildToolsVersion "26.0.2"
....
....
  • Then put these lines into settings.gradle at projectName/android/settings.gradle.
include ':openCV'
project(':openCV').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-facerecognition/android/openCV')
  • Add the camera permissions below to AndroidManifest.xml, and remove the line indicated at the end.
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" android:required="true"/>
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>
<uses-feature android:name="android.hardware.camera.front" android:required="true"/>
<uses-feature android:name="android.hardware.camera.front.autofocus" android:required="false"/>
...
<activity
  android:windowSoftInputMode="adjustResize" // remove this line so the UI works correctly

1.2 Manual Setup

  • First, add these lines to MainApplication.java at projectName/android/app/src/main/java/com/projectName/MainApplication.java.
import cv.reactnative.facerecognition.FaceModulePackage;
...
return Arrays.<ReactPackage>asList(
  new FaceModulePackage()
);
  • Then add these lines to build.gradle at projectName/android/app/build.gradle.
compileSdkVersion 26
buildToolsVersion "26.0.2"
....
....
....
dependencies {
  implementation project(':react-native-facerecognition')
  }
  • Inside settings.gradle, add these lines; you can find the file at projectName/android/settings.gradle.
include ':react-native-facerecognition'
project(':react-native-facerecognition').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-facerecognition/android/app')
include ':openCV'
project(':openCV').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-facerecognition/android/openCV')
  • Add the camera permissions below to AndroidManifest.xml, and remove the line indicated at the end.
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" android:required="true"/>
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>
<uses-feature android:name="android.hardware.camera.front" android:required="true"/>
<uses-feature android:name="android.hardware.camera.front.autofocus" android:required="false"/>
...
<activity
  android:windowSoftInputMode="adjustResize" // remove this line so the UI works correctly

🔥 You're ready to go!

Documentation

How it works:

There are three main steps. First, detect the face in the image. Second, train on every face you will need to recognize later (Person1, Person2, Person3, and so on). Third, after training, recognize all those faces by their names.

1.3 Props

model

  • Camera.constants.Model.cascade [DEPRECATED]
    • Higher recall & More trained faces
  • Camera.constants.Model.landmarks
    • Higher accuracy at recognition
  • Camera.constants.Model.lbp
    • Higher precision & Faster & Less trained faces

cameraType

  • Camera.constants.CameraType.front
  • Camera.constants.CameraType.back

captureQuality

  • Camera.constants.CaptureQuality.low
  • Camera.constants.CaptureQuality.medium
  • Camera.constants.CaptureQuality.high

aspect

  • Camera.constants.Aspect.fit
  • Camera.constants.Aspect.fill
  • Camera.constants.Aspect.stretch

dataset (default: false)

  • Train on your own images. Copy them into react-native-facerecognition/android/app/src/main/assets/dataset.

If you use this option, you must use the lbp model for it to work. See the test photo in that folder for the naming convention: every photo should be named like name_number.jpg or name_number.png.
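As a sanity check before copying photos in, filenames can be validated against that convention. The helper below is a hypothetical illustration, not part of the library:

```javascript
// Hypothetical helper (not part of the library): checks that a dataset photo
// follows the documented "name_number.jpg/png" convention, e.g. "john_1.jpg".
function isValidDatasetPhoto(filename) {
  return /^[A-Za-z]+_\d+\.(jpg|png)$/.test(filename);
}
```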

distance

<Camera distance = {200} />
  • The distance between the face and the camera. This is very important to keep recognition working reliably and to keep the result mostly accurate (it grossly affects the recognition result).

rotateMode - (Landscape/Portrait)

  • Camera.constants.RotateMode.on
  • Camera.constants.RotateMode.off

1.4 Functions

  • takePicture()

Take a picture, then process it to detect the face in it

  • train(Object)

Train the algorithm with a newly detected face

  • identify()

Take a picture, then predict whose face it is

  • clear()

Clear all previously trained faces
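The typical call order of these functions (detect, then train, then identify) can be sketched as follows. This is a hypothetical promise-style wrapper around a camera ref exposing the methods above; the real component reports outcomes through the events in the next section rather than return values:

```javascript
// Hypothetical sketch (assumes promise-style wrappers around the documented
// functions; the actual component reports results via events instead).
async function enrollThenIdentify(camera, personName) {
  await camera.takePicture();               // 1. detect a face in the current frame
  await camera.train({ name: personName }); // 2. train on the detected face
  return camera.identify();                 // 3. predict whose face it is
}
```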

1.5 Events

onTrained

  • Called after successful training

onUntrained

  • If training fails, this function is called with the error

onRecognized You get details about the recognized face:

  • name The face's name
  • confidence This number indicates how reliable the result is. A low number (usually < 100) means a good result, depending on the distance you set

onUnrecognized

  • If recognition fails, this function is called with the error
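Since onRecognized reports a confidence score where lower is better, a handler might gate matches with a threshold. The helper and default cutoff below are illustrative assumptions, not library API:

```javascript
// Hypothetical handler logic: per the docs, a low confidence value
// (usually < 100, depending on the configured distance) means a good match.
function isConfidentMatch(result, threshold = 100) {
  return result.confidence < threshold;
}
```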

Recommendations

If you're not familiar with OpenCV and face recognition, play it safe and use our default arguments; we have taken care of the details for you. If you use your own arguments, note that you may affect the accuracy, depending on your settings.

  • Recognition default arguments

    • distance = {200}
  • Training images count

    • There's no maximum, but on average 3–4 photos per face guarantee high accuracy
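To check whether a dataset meets that 3–4 photos-per-face guideline, you could count photos per name from the filenames. A hypothetical sketch, assuming the documented name_number.jpg/png scheme:

```javascript
// Hypothetical helper: counts dataset photos per face name, assuming the
// documented "name_number.jpg/png" naming scheme; other files are ignored.
function photosPerFace(filenames) {
  const counts = {};
  for (const f of filenames) {
    const m = /^([A-Za-z]+)_\d+\.(?:jpg|png)$/.exec(f);
    if (m) counts[m[1]] = (counts[m[1]] || 0) + 1;
  }
  return counts;
}
```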

Updates

🚀 In Updates you can find all previous versions and changes, kept up to date!

Thanks to

rn-openALPR rn-camera

Licence

MIT

react-native-facerecognition's People

Contributors

aboozaid


react-native-facerecognition's Issues

UI suggestion

Hi @assemmohamedali,
Take a look at this graphic that I made.
Let me know what you think.

import React, { Component } from 'react';
import {
  Platform,
  StyleSheet,
  Text,
  View,
  TouchableOpacity,
  ToastAndroid,
  ListView,
  TextInput,
  TouchableHighlight,
  DeviceEventEmitter,
  StatusBar,
  Image,
} from 'react-native';
import { RNCamera } from 'react-native-camera';
import Face from 'react-native-facerecognition';
import DialogManager, {
  SlideAnimation,
  DialogContent,
  DialogTitle,
  DialogButton,
} from 'react-native-dialog-component';

export default class App extends Component {
  constructor(props) {
    super(props);
    this.ds = new ListView.DataSource({
      rowHasChanged: (r1, r2) => r1 !== r2,
    });
    const faces = [];
    this.state = {
      dataSource: this.ds.cloneWithRows(faces),
      captured: 1,
      faces: faces,
      type: 'front',
    };
  }
  componentDidMount() {
    StatusBar.setHidden(true);
    DeviceEventEmitter.addListener(
      'onFaceRecognized',
      this.onFaceRecognized.bind(this)
    );
    DeviceEventEmitter.addListener('onClean', this.onClean.bind(this));
  }
  render() {
    return (
      <View style={styles.container}>
        <RNCamera
          ref={ref => {
            this.camera = ref;
          }}
          style={styles.preview}
          type={this.state.type}
          permissionDialogTitle={'Permission to use camera'}
          permissionDialogMessage={
            'We need your permission to use your camera phone'
          }
          onMountError={msg => this.cameraError(msg)}
          onCameraReady={() => {
            this.onCameraReady();
          }}>
          <View
            style={{
              flex: 0,
              flexDirection: 'row',
              justifyContent: 'flex-end',
            }}>
            <TouchableOpacity
              onPress={this.recognizePicture.bind(this)}
              style={styles.capture}>
              <Image
                style={{ width: 35, height: 35 }}
                source={{
                  uri:
                    'https://cdn3.iconfinder.com/data/icons/antivirus-internet-security-thin-colored-outline/33/facial_recognition-512.png',
                }}
              />
            </TouchableOpacity>
            <TouchableOpacity
              onPress={() =>
                this.setState({
                  type: this.state.type === 'back' ? 'front' : 'back',
                })
              }
              style={styles.capture}>
              <Image
                style={{ width: 35, height: 35 }}
                source={{
                  uri:
                    'https://user-images.githubusercontent.com/20476002/43048582-2fa1b864-8dea-11e8-968a-2bb209302604.png',
                }}
              />
            </TouchableOpacity>
            <TouchableOpacity
              onPress={() => this.clean()}
              style={styles.capture}>
              <Image
                style={{ width: 35, height: 35 }}
                source={{
                  uri:
                    'https://cdn2.iconfinder.com/data/icons/basic-4/512/delete-512.png',
                }}
              />
            </TouchableOpacity>
          </View>
          <View style={{ flex: 1, justifyContent: 'flex-end' }}>
            <TouchableOpacity
              onPress={this.takePicture.bind(this)}
              style={styles.capture}>
              <Image
                style={{ width: 35, height: 35 }}
                source={{
                  uri:
                    'https://user-images.githubusercontent.com/20476002/43048598-67bed272-8dea-11e8-9736-a6ebb6e28362.png',
                }}
              />
            </TouchableOpacity>
          </View>
        </RNCamera>
      </View>
    );
  }

  onFaceRecognized(data) {
    ToastAndroid.show(
      'Recognized: ' + data.name + ' Distance: ' + data.distance,
      ToastAndroid.LONG
    );
  }
  onClean(msg) {
    this.setState({ faces: [] });
    ToastAndroid.show(msg, ToastAndroid.SHORT);
  }
  clean() {
    Face.Clean();
  }
  onCameraReady() {
    Face.Start(
      Face.Detection.DEEP,
      success => {
        ToastAndroid.show('Train initialized', ToastAndroid.SHORT);
      },
      error => {
        ToastAndroid.show(error, ToastAndroid.LONG);
      }
    );
  }

  takePicture = async function() {
    if (this.camera) {
      const options = { width: 200, base64: true };
      const data = await this.camera.takePictureAsync(options);
      Face.Detect(
        data.base64,
        detected => {
          ToastAndroid.show(detected, ToastAndroid.SHORT);
          this.setState({ image64: data.base64 });
          this.onFaceDetect();
        },
        error => {
          ToastAndroid.show(error, ToastAndroid.SHORT);
        }
      );
    }
  };
  recognizePicture = async function() {
    if (this.camera) {
      const options = { width: 200, base64: true };
      const data = await this.camera.takePictureAsync(options);
      Face.Detect(
        data.base64,
        detected => {
          ToastAndroid.show(detected, ToastAndroid.SHORT);
          Face.Identify(data.base64, unrecognized => {
            ToastAndroid.show(unrecognized, ToastAndroid.SHORT);
          });
        },
        error => {
          ToastAndroid.show(error, ToastAndroid.SHORT);
        }
      );
      console.log(data.uri);
    }
  };
  onFaceDetect() {
    if (this.state.faces.length == 0) this.newFaceDetected();
    else {
      DialogManager.show(
        {
          title: 'Trained Faces',
          titleAlign: 'center',
          haveOverlay: false,
          animationDuration: 200,
          SlideAnimation: new SlideAnimation({ slideFrom: 'top' }),
          children: (
            <DialogContent>
              <View>
                <ListView
                  dataSource={this.state.dataSource}
                  renderRow={this.renderRow.bind(this)}
                />
              </View>
              <DialogButton
                text="Close"
                align="right"
                onPress={() => DialogManager.dismiss()}
              />
              <DialogButton
                text="New Face"
                align="right"
                onPress={() => this.newFaceDetected()}
              />
            </DialogContent>
          ),
        },
        () => {
          console.log('callback - show');
        }
      );
    }
  }
  newFaceDetected() {
    DialogManager.show(
      {
        title: 'Train Face',
        titleAlign: 'center',
        haveOverlay: false,
        animationDuration: 200,
        SlideAnimation: new SlideAnimation({ slideFrom: 'top' }),
        children: (
          <DialogContent>
            <View>
              <TextInput
                placeholder="face name"
                onChangeText={Fname => this.setState({ Fname })}
              />
            </View>
            <DialogButton text="Save" onPress={() => this.newFaceImage()} />
          </DialogContent>
        ),
      },
      () => {
        console.log('callback - show');
      }
    );
  }
  newFaceImage() {
    const faces = [
      ...this.state.faces,
      { Fname: this.state.Fname, captured: this.state.captured },
    ];
    const images = { image64: this.state.image64, Fname: this.state.Fname };
    Face.Training(images, result => alert(result), err => alert(err));
    this.setState({ dataSource: this.ds.cloneWithRows(faces), faces });
    DialogManager.dismissAll();
  }
  saveCaptureImage(faceData) {
    if (faceData.captured == 5)
      ToastAndroid.show('More photos are not allowed', ToastAndroid.SHORT);
    else {
      const slice = this.state.faces.slice();
      slice.map(face => {
        if (face.Fname == faceData.Fname) face.captured++;
      });
      this.setState({ dataSource: this.ds.cloneWithRows(slice) });
      const images = { image64: this.state.image64, Fname: faceData.Fname };
      Face.Training(images, result => alert(result), err => alert(err));
    }
    DialogManager.dismiss();
  }
  renderRow(rowData) {
    return (
      <TouchableHighlight
        onPress={() => this.saveCaptureImage(rowData)}
        underlayColor="transparent">
        <View
          style={{
            flex: 1,
            flexDirection: 'row',
            padding: 15,
            alignItems: 'center',
            borderColor: '#D7D7D7',
            borderBottomWidth: 1,
          }}>
          <Text style={{ fontSize: 16 }}>{rowData.captured}</Text>
          <View style={{ paddingLeft: 20 }}>
            <Text style={{ fontSize: 18 }}>{rowData.Fname}</Text>
          </View>
        </View>
      </TouchableHighlight>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    flexDirection: 'column',
    backgroundColor: 'white',
  },
  preview: {
    flex: 1,
    //justifyContent: 'flex-end',
    //alignItems: 'center',
  },
  capture: {
    flex: 0,
    //backgroundColor: '#D7D7D7',
    //borderRadius: 5,
    padding: 10,
    //paddingHorizontal: 10,
    alignSelf: 'center',
    margin: 10,
  },
});

I cannot run Android after linking your library

I got this error when I linked:
FAILURE: Build failed with an exception.

  • Where:
    Build file 'D:\mobile-app\node_modules\react-native-facerecognition\android\app\build.gradle' line: 40

  • What went wrong:
    A problem occurred evaluating project ':react-native-facerecognition'.

Project with path ':openCV' could not be found in project ':react-native-facerecognition'.

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

  • Get more help at https://help.gradle.org

My react-native version is 0.57.3
react-native-cli: 2.0.1

It says it cannot find the openCV module, but it is there in your package.
Thank you in advance

Training

Thanks for your lib.
Is there a way to use training with images instead of 'this.camera'?

NativeModules.Face

I've been trying to use this repo and make some updates,
but I think the problem is in NativeModules.Face.

Is NativeModules still usable?

_reactNativeFace.recognition.default.Detection.DEEP

Hello guys,

I'm trying to create a simple face recognition app, but I can't proceed with development due to this error. I don't know where it came from or what I should fix.

Here's a screenshot of the error I've encountered:

[screenshot: error-encountered]

Android Device

  • Device: [ASUS Zenfone Max]
  • OS: [Android Marshmallow 6.0.1]

Here's my code


import React, { Component } from 'react';
import {
  Platform,
  StyleSheet,
  Text,
  View,
  TouchableOpacity,
  ToastAndroid,
  ListView,
  TextInput,
  TouchableHighlight,
  DeviceEventEmitter
} from 'react-native';
import { RNCamera } from 'react-native-camera';
import Face from 'react-native-facerecognition'
import DialogManager, { SlideAnimation, DialogContent, DialogTitle, DialogButton } from 'react-native-dialog-component';

export default class App extends Component {
  constructor(props) {
    super(props);
    this.ds = new ListView.DataSource({
      rowHasChanged:(r1,r2) => r1 !== r2
    });
    const faces = [];
    this.state = {
      dataSource: this.ds.cloneWithRows(faces),
      captured: 1,
      faces: faces,
      type: 'front'
    };
  }
  componentDidMount() {
    DeviceEventEmitter.addListener("onFaceRecognized", this.onFaceRecognized.bind(this))
    DeviceEventEmitter.addListener("onClean", this.onClean.bind(this))
  }
  render() {
    return (
      <View style={styles.container}>
        <RNCamera
            ref={ref => {
              this.camera = ref;
            }}
            style = {styles.preview}
            type={this.state.type}
            permissionDialogTitle={'Permission to use camera'}
            permissionDialogMessage={'We need your permission to use your camera phone'}
            onMountError={(msg) => this.cameraError(msg)}
            onCameraReady={() => {this.onCameraReady()}}
        />
        <View style={{flex: 0, flexDirection: 'row', justifyContent: 'center',}}>
        <TouchableOpacity
            onPress={this.takePicture.bind(this)}
            style = {styles.capture}
        >
            <Text style={{fontSize: 14}}> SHOT </Text>
        </TouchableOpacity>
        <TouchableOpacity
            onPress={this.recognizePicture.bind(this)}
            style = {styles.capture}
        >
            <Text style={{fontSize: 14}}> RECOGNIZE </Text>
        </TouchableOpacity>
        <TouchableOpacity
            onPress={() => this.setState({type: this.state.type === 'back' ? 'front' : 'back',})}
            style = {styles.capture}
        >
            <Text style={{fontSize: 14}}> FLIP </Text>
        </TouchableOpacity>
        <TouchableOpacity
            onPress={() => this.clean()}
            style = {styles.capture}
        >
            <Text style={{fontSize: 14}}> CLEAN </Text>
        </TouchableOpacity>
        </View>
      </View>
    );
  }
  
  onFaceRecognized(data) {
    ToastAndroid.show("Recognized: " + data.name + " Distance: " + data.distance, ToastAndroid.LONG)
  }
  onClean(msg) {
    this.setState({faces: []})
    ToastAndroid.show(msg, ToastAndroid.SHORT)
  }
  clean() {
    Face.Clean();
  }
  onCameraReady() {
    Face.Start(Face.Detection.DEEP, (success) => {
      ToastAndroid.show("Train initialized", ToastAndroid.SHORT)
    }, (error) => {
      ToastAndroid.show(error, ToastAndroid.LONG)
    })
  }

  takePicture = async function() {
    if (this.camera) {
      const options = { width: 200, base64: true }
      const data = await this.camera.takePictureAsync(options)
      Face.Detect(data.base64, (detected) => {
        ToastAndroid.show(detected, ToastAndroid.SHORT)
        this.setState({image64: data.base64})
        this.onFaceDetect()
      }, (error) => {
        ToastAndroid.show(error, ToastAndroid.SHORT)
      })
    }
  };
  recognizePicture = async function() {
    if (this.camera) {
      const options = { width: 200, base64: true };
      const data = await this.camera.takePictureAsync(options)
      Face.Detect(data.base64, (detected) => {
        ToastAndroid.show(detected, ToastAndroid.SHORT)
        Face.Identify(data.base64, (unrecognized) => {
          ToastAndroid.show(unrecognized, ToastAndroid.SHORT)
        })
      }, (error) => {
        ToastAndroid.show(error, ToastAndroid.SHORT)
      })
      console.log(data.uri);
    }
  };
  onFaceDetect() {
    if(this.state.faces.length == 0)
      this.newFaceDetected();
    else {
      DialogManager.show({
        title: 'Trained Faces',
        titleAlign: 'center',
        haveOverlay: false,
        animationDuration: 200,
        SlideAnimation: new SlideAnimation({slideFrom: 'top'}),
        children: (
          <DialogContent >
              <View>
                <ListView dataSource = {this.state.dataSource} renderRow = {this.renderRow.bind(this)} />
              </View>
              <DialogButton text = "Close" align = 'right' onPress = {() => DialogManager.dismiss()} />
              <DialogButton text = "New Face" align = 'right' onPress = {() => this.newFaceDetected()} />
          </DialogContent>
        ),
      }, () => {
        console.log('callback - show')
      });
    }
  }
  newFaceDetected() {
    DialogManager.show({
      title: 'Train Face',
      titleAlign: 'center',
      haveOverlay: false,
      animationDuration: 200,
      SlideAnimation: new SlideAnimation({slideFrom: 'top'}),
      children: (
        <DialogContent>
            <View>
              <TextInput placeholder="face name" onChangeText={(Fname) => this.setState({Fname})} />
            </View>
          <DialogButton text = "Save" onPress= {() => this.newFaceImage()}/>
        </DialogContent>
      ),
    }, () => {
      console.log('callback - show');
    });
  }
  newFaceImage() {
    const faces = [...this.state.faces, {Fname: this.state.Fname, captured: this.state.captured}]
    const images = {image64: this.state.image64, Fname: this.state.Fname}
    Face.Training(images, (result) => alert(result), (err) => alert(err))
    this.setState({dataSource: this.ds.cloneWithRows(faces), faces})
    DialogManager.dismissAll()
  }
  saveCaptureImage(faceData) {
    if(faceData.captured == 5)
      ToastAndroid.show("More photos are not allowed", ToastAndroid.SHORT)
    else {
      const slice = this.state.faces.slice()
      slice.map((face) => {
        if(face.Fname == faceData.Fname)
          face.captured++
      })
      this.setState({dataSource: this.ds.cloneWithRows(slice)})
      const images = {image64: this.state.image64, Fname: faceData.Fname}
      Face.Training(images, (result) => alert(result), (err) => alert(err))
    }
    DialogManager.dismiss()
  }
  renderRow(rowData) {
    return(
          <TouchableHighlight onPress= {() => this.saveCaptureImage(rowData)} underlayColor='transparent' >
            <View style = {{
                flex:1,
                flexDirection: 'row',
                padding: 15,
                alignItems: 'center',
                borderColor: '#D7D7D7',
                borderBottomWidth: 1
            }}>
            <Text style = {{fontSize: 16}}>{rowData.captured}</Text>
                <View style = {{paddingLeft: 20}}>
                  <Text style = {{fontSize: 18}}>{rowData.Fname}</Text>
                </View>
            </View>
          </TouchableHighlight>
    );
  }
}


const styles = StyleSheet.create({
  container: {
    flex: 1,
    flexDirection: 'column',
    backgroundColor: 'white'
  },
  preview: {
    flex: 1,
    justifyContent: 'flex-end',
    alignItems: 'center'
  },
  capture: {
    flex: 0,
    backgroundColor: '#D7D7D7',
    borderRadius: 5,
    padding: 10,
    paddingHorizontal: 10,
    alignSelf: 'center',
    margin: 10
  }
});

I already followed the documentation and its setup.

I really need your help, guys. I hope you can help me.

Thank you and God bless,
John Mark

Hello author, I have the following questions:

1. Does it adapt to rn0.6.0+?

2. Package not found?
"OpenCV Manager package was not found! Try to install it?"

3. "Error while updating property 'mounted' of a view managed by: FaceCamera
null"

Environment:
"react": "16.8.3",
"react-native": "0.59.9",
"react-native-facerecognition": "^3.1.0"
arm64-v7a

A problem occurred configuring project ':app'. > Could not get unknown property 'mergeResourcesProvider' for object of type com.android.build.gradle.internal.api.ApplicationVariantImpl.

./android/build.gradle

// Top-level build file where you can add configuration options common to all sub-projects/modules.

buildscript {
    ext {
        buildToolsVersion = "26.0.2"
        minSdkVersion = 21
        compileSdkVersion = 26
        targetSdkVersion = 29
    }
    repositories {
        google()
        jcenter()
    }
    dependencies {
        //classpath("com.android.tools.build:gradle:3.5.3")
        classpath 'com.android.tools.build:gradle:3.0.1'
        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
    }
}

allprojects {
    repositories {
        mavenLocal()
        maven {
            // All of React Native (JS, Obj-C sources, Android binaries) is installed from npm
            url("$rootDir/../node_modules/react-native/android")
        }
        maven {
            // Android JSC is installed from npm
            url("$rootDir/../node_modules/jsc-android/dist")
        }

        google()
        jcenter()
        maven { url 'https://www.jitpack.io' }
        maven { url "https://maven.google.com" }
    }
}

./android/app/build.gradle

apply plugin: "com.android.application"

import com.android.build.OutputFile

/**
 * The react.gradle file registers a task for each build variant (e.g. bundleDebugJsAndAssets
 * and bundleReleaseJsAndAssets).
 * These basically call react-native bundle with the correct arguments during the Android build
 * cycle. By default, bundleDebugJsAndAssets is skipped, as in debug/dev mode we prefer to load the
 * bundle directly from the development server. Below you can see all the possible configurations
 * and their defaults. If you decide to add a configuration block, make sure to add it before the
 * apply from: "../../node_modules/react-native/react.gradle" line.
 *
 * project.ext.react = [
 *   // the name of the generated asset file containing your JS bundle
 *   bundleAssetName: "index.android.bundle",
 *
 *   // the entry file for bundle generation. If none specified and
 *   // "index.android.js" exists, it will be used. Otherwise "index.js" is
 *   // default. Can be overridden with ENTRY_FILE environment variable.
 *   entryFile: "index.android.js",
 *
 *   // https://reactnative.dev/docs/performance#enable-the-ram-format
 *   bundleCommand: "ram-bundle",
 *
 *   // whether to bundle JS and assets in debug mode
 *   bundleInDebug: false,
 *
 *   // whether to bundle JS and assets in release mode
 *   bundleInRelease: true,
 *
 *   // whether to bundle JS and assets in another build variant (if configured).
 *   // See http://tools.android.com/tech-docs/new-build-system/user-guide#TOC-Build-Variants
 *   // The configuration property can be in the following formats
 *   //   'bundleIn${productFlavor}${buildType}'
 *   //   'bundleIn${buildType}'
 *   // bundleInFreeDebug: true,
 *   // bundleInPaidRelease: true,
 *   // bundleInBeta: true,
 *
 *   // whether to disable dev mode in custom build variants (by default only disabled in release)
 *   // for example: to disable dev mode in the staging build type (if configured)
 *   devDisabledInStaging: true,
 *   // The configuration property can be in the following formats
 *   //   'devDisabledIn${productFlavor}${buildType}'
 *   //   'devDisabledIn${buildType}'
 *
 *   // the root of your project, i.e. where "package.json" lives
 *   root: "../../",
 *
 *   // where to put the JS bundle asset in debug mode
 *   jsBundleDirDebug: "$buildDir/intermediates/assets/debug",
 *
 *   // where to put the JS bundle asset in release mode
 *   jsBundleDirRelease: "$buildDir/intermediates/assets/release",
 *
 *   // where to put drawable resources / React Native assets, e.g. the ones you use via
 *   // require('./image.png')), in debug mode
 *   resourcesDirDebug: "$buildDir/intermediates/res/merged/debug",
 *
 *   // where to put drawable resources / React Native assets, e.g. the ones you use via
 *   // require('./image.png')), in release mode
 *   resourcesDirRelease: "$buildDir/intermediates/res/merged/release",
 *
 *   // by default the gradle tasks are skipped if none of the JS files or assets change; this means
 *   // that we don't look at files in android/ or ios/ to determine whether the tasks are up to
 *   // date; if you have any other folders that you want to ignore for performance reasons (gradle
 *   // indexes the entire tree), add them here. Alternatively, if you have JS files in android/
 *   // for example, you might want to remove it from here.
 *   inputExcludes: ["android/", "ios/"],
 *
 *   // override which node gets called and with what additional arguments
 *   nodeExecutableAndArgs: ["node"],
 *
 *   // supply additional arguments to the packager
 *   extraPackagerArgs: []
 * ]
 */

project.ext.react = [
    enableHermes: false,  // clean and rebuild if changing
]

apply from: "../../node_modules/react-native/react.gradle"

/**
 * Set this to true to create two separate APKs instead of one:
 *   - An APK that only works on ARM devices
 *   - An APK that only works on x86 devices
 * The advantage is the size of the APK is reduced by about 4MB.
 * Upload all the APKs to the Play Store and people will download
 * the correct one based on the CPU architecture of their device.
 */
def enableSeparateBuildPerCPUArchitecture = false

/**
 * Run Proguard to shrink the Java bytecode in release builds.
 */
def enableProguardInReleaseBuilds = false

/**
 * The preferred build flavor of JavaScriptCore.
 *
 * For example, to use the international variant, you can use:
 * `def jscFlavor = 'org.webkit:android-jsc-intl:+'`
 *
 * The international variant includes ICU i18n library and necessary data
 * allowing to use e.g. `Date.toLocaleString` and `String.localeCompare` that
 * give correct results when using with locales other than en-US. Note that
 * this variant is about 6MiB larger per architecture than default.
 */
def jscFlavor = 'org.webkit:android-jsc:+'

/**
 * Whether to enable the Hermes VM.
 *
 * This should be set on project.ext.react and mirrored here. If it is not set
 * on project.ext.react, JavaScript will not be compiled to Hermes Bytecode
 * and the benefits of using Hermes will therefore be sharply reduced.
 */
def enableHermes = project.ext.react.get("enableHermes", false);

android {
    compileSdkVersion rootProject.ext.compileSdkVersion
    buildToolsVersion "26.0.2"

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }

    defaultConfig {
        applicationId "com.facereco"
        minSdkVersion rootProject.ext.minSdkVersion
        targetSdkVersion rootProject.ext.targetSdkVersion
        versionCode 1
        versionName "1.0"
    }

    splits {
        abi {
            reset()
            enable enableSeparateBuildPerCPUArchitecture
            universalApk false  // If true, also generate a universal APK
            include "armeabi-v7a", "x86", "arm64-v8a", "x86_64"
        }
    }

    signingConfigs {
        debug {
            storeFile file('debug.keystore')
            storePassword 'android'
            keyAlias 'androiddebugkey'
            keyPassword 'android'
        }
    }

    buildTypes {
        debug {
            signingConfig signingConfigs.debug
        }
        release {
            // Caution! In production, you need to generate your own keystore file.
            // See https://reactnative.dev/docs/signed-apk-android.
            signingConfig signingConfigs.debug
            minifyEnabled enableProguardInReleaseBuilds
            proguardFiles getDefaultProguardFile("proguard-android.txt"), "proguard-rules.pro"
        }
    }

    // applicationVariants are e.g. debug, release
    applicationVariants.all { variant ->
        variant.outputs.each { output ->
            // For each separate APK per architecture, set a unique version code as described here:
            // https://developer.android.com/studio/build/configure-apk-splits.html
            def versionCodes = ["armeabi-v7a": 1, "x86": 2, "arm64-v8a": 3, "x86_64": 4]
            def abi = output.getFilter(OutputFile.ABI)
            if (abi != null) {  // null for the universal-debug, universal-release variants
                output.versionCodeOverride =
                        versionCodes.get(abi) * 1048576 + defaultConfig.versionCode
            }
        }
    }
}
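The per-ABI `versionCodeOverride` arithmetic above can be sketched in plain JavaScript. The offset multiplier `1048576` is 2^20, so each ABI gets a non-overlapping version-code range on the Play Store while the app's own `versionCode` occupies the low bits:

```javascript
// Mirrors the versionCodes map from the Gradle snippet above.
const versionCodes = { "armeabi-v7a": 1, "x86": 2, "arm64-v8a": 3, "x86_64": 4 };

// Compute the overridden version code for a split APK:
// a large per-ABI base offset plus the app's own versionCode.
function versionCodeOverride(abi, appVersionCode) {
  return versionCodes[abi] * 1048576 + appVersionCode;
}

console.log(versionCodeOverride("arm64-v8a", 1)); // 3145729
```

Because every ABI's range starts at a distinct multiple of 2^20, the Play Store never sees two split APKs with the same version code.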

dependencies {
    implementation project(':react-native-facerecognition')
    implementation fileTree(dir: "libs", include: ["*.jar"])
    //noinspection GradleDynamicVersion
    implementation "com.facebook.react:react-native:+" // From node_modules

    implementation "androidx.swiperefreshlayout:swiperefreshlayout:1.0.0"

    debugImplementation("com.facebook.flipper:flipper:${FLIPPER_VERSION}") {
        exclude group:'com.facebook.fbjni'
    }

    debugImplementation("com.facebook.flipper:flipper-network-plugin:${FLIPPER_VERSION}") {
        exclude group:'com.facebook.flipper'
        exclude group:'com.squareup.okhttp3', module:'okhttp'
    }

    debugImplementation("com.facebook.flipper:flipper-fresco-plugin:${FLIPPER_VERSION}") {
        exclude group:'com.facebook.flipper'
    }

    if (enableHermes) {
        def hermesPath = "../../node_modules/hermes-engine/android/";
        debugImplementation files(hermesPath + "hermes-debug.aar")
        releaseImplementation files(hermesPath + "hermes-release.aar")
    } else {
        implementation jscFlavor
    }
}

// Run this once to be able to run the application with BUCK
// puts all compile dependencies into folder libs for BUCK to use
task copyDownloadableDepsToLibs(type: Copy) {
    from configurations.compile
    into 'libs'
}

apply from: file("../../node_modules/@react-native-community/cli-platform-android/native_modules.gradle"); applyNativeModulesAppBuildGradle(project)

./android/gradle/wrapper/gradle-wrapper.properties
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https://services.gradle.org/distributions/gradle-4.4-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists

./android/settings.gradle

rootProject.name = 'facereco'
include ':react-native-facerecognition'
project(':react-native-facerecognition').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-facerecognition/android')
apply from: file("../node_modules/@react-native-community/cli-platform-android/native_modules.gradle"); applyNativeModulesSettingsGradle(settings)
include ':app'
include ':openCV'
project(':openCV').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-facerecognition/android/openCV')

./android/app/src/main/AndroidManifest.xml

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" android:required="true"/>
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>
<uses-feature android:name="android.hardware.camera.front" android:required="true"/>
<uses-feature android:name="android.hardware.camera.front.autofocus" android:required="false"/>
<application
  android:name=".MainApplication"
  android:label="@string/app_name"
  android:icon="@mipmap/ic_launcher"
  android:roundIcon="@mipmap/ic_launcher_round"
  android:allowBackup="false"
  android:theme="@style/AppTheme">
  <activity
    android:name=".MainActivity"
    android:label="@string/app_name"
    android:configChanges="keyboard|keyboardHidden|orientation|screenSize|uiMode"
    android:launchMode="singleTask">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
  </activity>
  <activity android:name="com.facebook.react.devsupport.DevSettingsActivity" />
</application>

App crashes when adding react-native-facerecognition


Links Expired

Hi, the links provided for the jniLibs files have expired. I'm unable to download them. The app crashes on arm64-v8a phones.
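If you already have the native libraries from another source, placing them boils down to creating the per-ABI folders under the jniLibs path the README points at and copying the `.so` files in. A sketch (the `LIBS_SRC` location and library file names are assumptions; adjust to wherever you unpacked your download):

```shell
# Assumed location of your downloaded OpenCV native libraries.
LIBS_SRC=./opencv-libs
# The jniLibs path referenced in the README.
JNILIBS=./node_modules/react-native-facerecognition/android/app/src/main/jniLibs

for abi in armeabi-v7a arm64-v8a x86 x86_64; do
  # Create the per-ABI folder the Android build expects.
  mkdir -p "$JNILIBS/$abi"
  # Copy any matching .so files; silently skip ABIs you did not download.
  cp "$LIBS_SRC/$abi/"*.so "$JNILIBS/$abi/" 2>/dev/null || true
done
```

After copying, a clean rebuild (`cd android && ./gradlew clean`) is usually needed so the libraries get packaged into the APK.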

Cannot find symbol error

/home/shubham/Desktop/projects/FaceRecognition/android/app/build/generated/rncli/src/main/java/com/facebook/react/PackageList.java:18: error: cannot find symbol
import cv.reactnative.FaceModulePackage;
^
symbol: class FaceModulePackage
location: package cv.reactnative
/home/shubham/Desktop/projects/FaceRecognition/android/app/build/generated/rncli/src/main/java/com/facebook/react/PackageList.java:64: error: cannot find symbol
new FaceModulePackage()
^
symbol: class FaceModulePackage
location: class PackageList
2 errors

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':app:compileDebugJavaWithJavac'.

Compilation failed; see the compiler error output for details.

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

  • Get more help at https://help.gradle.org

React Native warns me when using this package

Here's the log I got when trying to install this package.
I hope you follow their guide to make it better.
Thanks.

The following packages use deprecated "rnpm" config that will stop working from next release:

App Crashes

After installation, the app crashes on launch.

Faces matching and custom face dataset

Faces matching
How can I compare two faces and determine whether they belong to the same person?

Face dataset
I want to create and store a face dataset for authentication.
Is it possible? How can I do it?
@assemmohamedali thanks for this module
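For the authentication use case above, the result delivered to the `onRecognized` callback (shown in the README example) could be checked against your own user list. This is only a sketch: the event field `name` and the event shape are assumptions, not documented API, so verify them against the library source:

```javascript
// Hypothetical handler for the Camera's onRecognized callback.
// `event.name` is assumed to carry the matched label from the trained
// dataset; `knownUsers` is your app's own list of enrolled labels.
function handleRecognition(event, knownUsers) {
  // Only treat a match as authenticated if the label is one we enrolled.
  if (event && knownUsers.includes(event.name)) {
    return { authenticated: true, user: event.name };
  }
  return { authenticated: false, user: null };
}

console.log(handleRecognition({ name: "alice" }, ["alice", "bob"]));
```

The library trains and matches on-device (the README notes it works without internet), so the dataset itself lives in phone storage rather than on a server.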

Error while updating property 'mounted' of a view managed by: FaceCamera

I got two errors with your example when starting it:


Error while updating property 'mounted' of a view managed by: FaceCamera

null

No implementation found for long org.opencv.face.LBPHFaceRecognizer.created_0(int,int,int,int,double) (tried Java_org_opencv_face_LBPHFaceRecognizer_create_10 and Java_org_opencv_face_LBPHFaceRecognizer_create_10__IIID)

updateViewProp
ViewManagersPropertyCache.java: 95


  1. Package not found - OpenCV Manager package was not found! Try to install it?

Any suggestion, Thanks

The package is outdated and no longer working!

Could not determine the dependencies of task ':app:preDebugBuild'
Could not resolve all task dependencies for configuration ':app:debugRuntimeClasspath'
Could not resolve project :react-native-facerecognition ,
......
.....
.....
etc.

Please update the package or close this issue.

How to run the program on Android?

I am new to React Native and many errors occur when running this app. Can you please guide me step by step on how to run it?

Couple of Questions

  • I'm unable to get your example code up and running.
  • Integrating the library with React Native required a few fixes on the configuration side.

Even though I managed to get it up and running,
it says "OpenCV Manager not found. Trying to install OpenCV."

  • Tapping "yes" redirects to the Play Store, which says the item was not found.

It visibly throws this error:
com.facebook.react.bridge.JSApplicationIllegalArgumentException: Error while updating property 'mounted' of a view managed by: FaceCamera
at com.facebook.react.uimanager.ViewManagersPropertyCache$PropSetter.updateViewProp(ViewManagersPropertyCache.java:98)
at com.facebook.react.uimanager.ViewManagerPropertyUpdater$FallbackViewManagerSetter.setProperty(ViewManagerPropertyUpdater.java:131)
at com.facebook.react.uimanager.ViewManagerPropertyUpdater.updateProps(ViewManagerPropertyUpdater.java:51)
at com.facebook.react.uimanager.ViewManager.updateProperties(ViewManager.java:46)
at com.facebook.react.uimanager.NativeViewHierarchyManager.updateProperties(NativeViewHierarchyManager.java:139)
at com.facebook.react.uimanager.UIViewOperationQueue$UpdatePropertiesOperation.execute(UIViewOperationQueue.java:93)

ASK: Where do you save the dataset?

Is your feature request related to a problem? Please describe.
First of all, thanks for the amazing possibilities you opened by making this library.
My only question after reading your documentation is regarding the dataset itself.
I get a feeling that this is managed by you or your library currently using either your server or phone's storage.
Please correct me if my assumption was wrong :)
If I want to integrate my app with this feature to existing database, how would I go about it?

Describe the solution you'd like
It would be amazing if we could supply the dataset to your library too, so that we could have an integrated system/database for, say, a Guest Book app across multiple branches of the same company.
Please let me know! :)

Thanks again!

Fail Start

Can't run, even after cloning this project:

 error: cannot find symbol
            new FaceModulePackage()

Steps tried:

Cloned the example repo, ran npm install, copied the v8a libs. Still can't run.

Could not resolve project :react-native-facerecognition

FAILURE: Build failed with an exception.

  • What went wrong:
    Could not determine the dependencies of task ':app:compileDebugJavaWithJavac'.

Could not resolve all task dependencies for configuration ':app:debugCompileClasspath'.
Could not resolve project :react-native-facerecognition.
Required by:
project :app
> Unable to find a matching configuration of project :react-native-facerecognition:
- None of the consumable configurations have attributes.
