videosdk-live / videosdk-rtc-flutter-sdk-example

WebRTC based video conferencing SDK for Flutter (Android / iOS)

Home Page: https://docs.videosdk.live/flutter/guide/video-and-audio-calling-api-sdk/getting-started

Dart 77.87% Kotlin 0.05% Swift 7.04% Objective-C 0.02% HTML 1.50% Ruby 1.24% CMake 4.11% C++ 7.87% C 0.30%
flutter android ios webrtc video-calling sdk video

videosdk-rtc-flutter-sdk-example's Introduction

Video SDK for Flutter (Android and iOS)


At Video SDK, we’re building tools to help companies create world-class collaborative products with capabilities for live audio/video, cloud recording composition, RTMP/HLS streaming, and interaction APIs.

Demo App

📲 Download the sample iOS app here: https://testflight.apple.com/join/C1UOYbxh

📱 Download the sample Android app here: https://appdistribution.firebase.dev/i/80c2c6cc9fcb89b0

Features

  • Real-time video and audio conferencing
  • Enable/disable camera
  • Mute/unmute mic
  • Switch between front and back camera
  • Change audio device
  • Screen share
  • Chat
  • Recording

Setup Guide


Prerequisites

  • If your target platform is iOS, your development environment must meet the following requirements:
    • Flutter 2.0 or later
    • Dart 2.12.0 or later
    • macOS
    • Xcode (Latest version recommended)
  • If your target platform is Android, your development environment must meet the following requirements:
    • Flutter 2.0 or later
    • Dart 2.12.0 or later
    • macOS or Windows
    • Android Studio (Latest version recommended)
  • If your target platform is iOS, you need a real iOS device.
  • If your target platform is Android, you need an Android simulator or a real Android device.
  • Valid Video SDK Account

Run the Sample App

1. Clone the sample project

Clone the repository to your local environment.

$ git clone https://github.com/videosdk-live/videosdk-rtc-flutter-sdk-example.git

2. Copy the .env.example file to .env

Create a .env file by copying the provided .env.example file.

$ cp .env.example .env

3. Modify the .env file

Generate a temporary token from your Video SDK account and set it in the .env file.

AUTH_TOKEN = "TEMPORARY-TOKEN";
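
The app reads this value with the flutter_dotenv package (the same helper is used in the sample's fetchToken function quoted later on this page). A minimal sketch of loading it at startup, assuming the .env file is registered as an asset in pubspec.yaml:

import 'package:flutter/widgets.dart';
import 'package:flutter_dotenv/flutter_dotenv.dart';

Future<void> main() async {
  // Ensure bindings are ready before reading bundled assets.
  WidgetsFlutterBinding.ensureInitialized();

  // Load the .env file (assumes it is listed under assets in pubspec.yaml).
  await dotenv.load(fileName: ".env");

  // The temporary token generated from the VideoSDK dashboard.
  final String token = dotenv.env['AUTH_TOKEN'] ?? "";
  debugPrint("AUTH_TOKEN loaded: ${token.isNotEmpty}");
}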

4. Install the dependencies

Install all the dependencies required to run the project.

flutter pub get

5. Run the sample app

Bingo, it's time to push the launch button.

flutter run

Key Concepts

  • Meeting - A Meeting represents real-time audio and video communication.

    Note: Don't get confused between the Room and Meeting keywords; both refer to the same thing 😃

  • Sessions - A particular duration you spend in a given meeting is referred to as a session; you can have multiple sessions for a particular meetingId.

  • Participant - A Participant represents someone attending the meeting's session. The local participant represents yourself (you); from your perspective, every other attendee is a remote participant.

  • Stream - A Stream is video or audio media content that is published by either the local participant or a remote participant.
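
To make these concepts concrete, here is a minimal sketch that walks a room after joining. It assumes, as the snippets later in this README do, that Room exposes localParticipant and a participants map, and that each Participant exposes a streams map:

import 'package:videosdk/videosdk.dart';

void printRoomState(Room room) {
  // The local participant represents you.
  print("Local participant: ${room.localParticipant.displayName}");

  // Everyone else attending the current session is a remote participant.
  room.participants.forEach((id, participant) {
    // Each participant can publish audio, video or screen-share streams.
    participant.streams.forEach((streamId, stream) {
      print("${participant.displayName} is publishing a ${stream.kind} stream");
    });
  });
}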


Android Permission

Add the following permissions to the AndroidManifest.xml file.

    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.READ_PHONE_STATE" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />

    <!-- Needed to communicate with already-paired Bluetooth devices. (Legacy up to Android 11) -->
    <uses-permission
        android:name="android.permission.BLUETOOTH"
        android:maxSdkVersion="30" />
    <uses-permission
        android:name="android.permission.BLUETOOTH_ADMIN"
        android:maxSdkVersion="30" />

    <!-- Needed to communicate with already-paired Bluetooth devices. (Android 12 upwards)-->
    <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />

iOS Permission

Add the following entry to your Info.plist file, located at <project root>/ios/Runner/Info.plist:

<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) Camera Usage!</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) Microphone Usage!</string>

iOS Screen share Setup

Please refer to this documentation guide to set up screen share for iOS.


Token Generation

A token is used to create and validate a meeting using the API, and also to initialize a meeting.

🛠️ Development Environment:

  • For development, you can use a temporary token. Visit the VideoSDK dashboard to generate a temporary token.

🌐 Production Environment:

  • For production, you have to set up an authentication server to authorize users. Follow our official example repository, videosdk-rtc-api-server-examples, to set up an authentication server.

Note:

A development environment (temporary) token expires after 7 days.
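
In the production setup, the client fetches a short-lived token from your own authentication server rather than bundling one. A minimal sketch, assuming (as this sample's fetchToken helper does) that your server exposes a GET /get-token endpoint returning {"token": "<jwt>"}:

import 'dart:convert';
import 'package:http/http.dart' as http;

Future<String> fetchToken(String authUrl) async {
  // authUrl is the base URL of your auth server built from
  // videosdk-rtc-api-server-examples, e.g. https://your-server.example.com
  final response = await http.get(Uri.parse('$authUrl/get-token'));

  if (response.statusCode != 200) {
    throw Exception('Failed to fetch token: ${response.statusCode}');
  }
  return json.decode(response.body)['token'] as String;
}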


API: Create and Validate meeting

  • create meeting - Please refer to this documentation to create a meeting.
  • validate meeting - Please refer to this documentation to validate the meetingId. (A minimal REST sketch for both calls follows the createRoom snippet below.)

  • You can initialize the meeting using the createRoom() method. createRoom() creates and returns a new Room object for the initiated meeting.
  Room room = VideoSDK.createRoom(
        roomId: "abcd-efgh-ijkl",
        token: "YOUR TOKEN",
        displayName: "GUEST",
        micEnabled: true,
        camEnabled: true,
        maxResolution: 'hd',
        defaultCameraIndex: kIsWeb ? 0 : 1,
        notification: const NotificationInfo(
          title: "Video SDK",
          message: "Video SDK is sharing screen in the meeting",
          icon: "notification_share", // drawable icon name
        ),
      );
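
For reference, the create and validate calls mentioned above are plain HTTPS requests authorized with your token. A minimal sketch, with the caveat that the endpoint paths used below (https://api.videosdk.live/v2/rooms and .../rooms/validate/<roomId>) are assumptions here; confirm them against the linked documentation before relying on them.

import 'dart:convert';
import 'package:http/http.dart' as http;

// Creates a new room and returns its roomId.
// The endpoint path is an assumption; verify it against the VideoSDK docs.
Future<String> createMeeting(String token) async {
  final response = await http.post(
    Uri.parse('https://api.videosdk.live/v2/rooms'),
    headers: {'Authorization': token},
  );
  return json.decode(response.body)['roomId'] as String;
}

// Validates an existing roomId; a 200 response means the room exists.
Future<bool> validateMeeting(String token, String roomId) async {
  final response = await http.get(
    Uri.parse('https://api.videosdk.live/v2/rooms/validate/$roomId'),
    headers: {'Authorization': token},
  );
  return response.statusCode == 200;
}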

// unmute mic
room.unmuteMic();

// mute mic
room.muteMic();

  • The room.getAudioOutputDevices() method allows a participant to list all of the attached audio output devices (e.g., Bluetooth and earphones).
 // get connected audio output devices
 List<MediaDeviceInfo> outputDevices = room.getAudioOutputDevices();
  • The local participant can change the audio output device using the switchAudioOutput(MediaDeviceInfo device) method of the Room class.
// change mic
room.switchAudioOutput(mediaDeviceInfo);

// enable webcam
room.enableCam();

// disable webcam
room.disableCam();

// switch webcam
room.changeCam(deviceId);

  • The chat feature allows participants to send and receive messages about specific topics to which they have subscribed.
// publish
room.pubSub.publish(String topic, String message, PubSubPublishOptions pubSubPublishOptions);

// pubSubPublishOptions is an object of PubSubPublishOptions, which provides options such as persist, which persists the message history for upcoming participants.


//subscribe
PubSubMessages pubSubMessageList = room.pubSub.subscribe(String topic, Function(PubSubMessage) messageHandler);


//unsubscribe
room.pubSub.unsubscribe(topic, Function(PubSubMessage) messageHandler);


// Message Handler
void messageHandler(msg){
  // Do something
  print("New message received: $msg");
}
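
As a usage pattern, a chat widget typically subscribes when it mounts and unsubscribes when it is disposed so the handler does not outlive the screen. A minimal sketch following the signatures shown above (the topic name "CHAT" is an arbitrary example):

// Inside a StatefulWidget's State that has access to the Room as `room`.
final List<PubSubMessage> chatLog = [];

void messageHandler(PubSubMessage message) {
  setState(() => chatLog.add(message));
}

@override
void initState() {
  super.initState();
  // Start receiving messages published to the topic.
  room.pubSub.subscribe("CHAT", messageHandler);
}

@override
void dispose() {
  // Unregister the same handler that was registered above.
  room.pubSub.unsubscribe("CHAT", messageHandler);
  super.dispose();
}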

// Only one participant will leave/exit the meeting; the rest of the participants will remain.
room.leave();

// The meeting will come to an end for each and every participant. So, use this function in accordance with your requirements.
room.end();

By registering callback handlers, VideoSDK sends callbacks to the client app whenever there is a change or update in the meeting after a user joins.

    room.on(
      Events.roomJoined,
      () {
        // This event will be emitted when a localParticipant(you) successfully joined the meeting.
      },
    );

    room.on(Events.roomLeft, (String? errorMsg) {
      // This event will be emitted when a localParticipant(you) left the meeting.
      // [errorMsg]: It will have the message if meeting was left due to some error like Network problem
    });

    room.on(Events.recordingStarted, () {
      // This event will be emitted when recording of the meeting is started.
    });

    room.on(Events.recordingStopped, () {
      // This event will be emitted when recording of the meeting is stopped.
    });

    room.on(Events.presenterChanged, (_activePresenterId) {
      // This event will be emitted when any participant starts or stops screen sharing.
      // [participantId]: Id of participant who shares the screen.
    });

    room.on(Events.speakerChanged, (_activeSpeakerId) {
      // This event will be emitted when the active speaker changes.
      // [participantId] : Id of active speaker
    });

    room.on(Events.participantJoined, (Participant participant) {
      // This event will be emitted when a new participant joined the meeting.
      // [participant]: new participant who joined the meeting
    });

    room.on(Events.participantLeft, (participantId) {
      // This event will be emitted when a joined participant leaves the meeting.
      // [participantId]: id of the participant who left the meeting
    });

By registering callback handlers, VideoSDK sends callbacks to the client app whenever a participant's video, audio, or screen share stream is enabled or disabled.

  participant.on(Events.streamEnabled, (Stream _stream) {
    // This event will be triggered whenever a participant's video, audio or screen share stream is enabled.
  });

  participant.on(Events.streamDisabled, (Stream _stream) {
    // This event will be triggered whenever a participant's video, audio or screen share stream is disabled.
  });
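
In practice these participant-level handlers are attached per participant, typically for room.localParticipant after joining and for each remote participant when Events.participantJoined fires. A minimal sketch based on the events shown above:

void watchParticipant(Participant participant) {
  participant.on(Events.streamEnabled, (Stream stream) {
    // stream.kind distinguishes audio, video and screen-share streams.
    print("${participant.displayName} enabled a ${stream.kind} stream");
  });

  participant.on(Events.streamDisabled, (Stream stream) {
    print("${participant.displayName} disabled a ${stream.kind} stream");
  });
}

void registerParticipantEvents(Room room) {
  room.on(Events.roomJoined, () {
    watchParticipant(room.localParticipant);
    room.participants.values.forEach(watchParticipant);
  });

  room.on(Events.participantJoined, (Participant participant) {
    watchParticipant(participant);
  });
}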

If you want to learn more about the SDK, read the Complete Documentation of Flutter VideoSDK


Project Description


Note :

  • master branch: Better UI with One-to-One call experience.
  • v1-code-sample branch: Simple UI with Group call experience.

App Behaviour with Different Meeting Types

  • One-to-One meeting - The One-to-One meeting allows 2 participants to join a meeting in the app.

  • Group Meeting - The Group meeting allows any number of participants to join a meeting in the app, with a maximum of 6 participants on screen.


Project Structure

  • We have separated the screens and widgets into the following folder structure:
    • one-to-one - It includes all files related to the OneToOne meeting.
    • common - It includes all the files that are used in both meeting types (OneToOne and Group calls).
    • conference-call - It includes all files related to the conference call.

Common Content

1. Create or join Meeting

  • join_screen.dart: It shows the user the option to create or join a meeting and to set the initial webcam and mic status.

    • api.dart: It includes all the API calls for creating and validating a meeting.

    • joining_details.dart: This widget allows the user to enter the meetingId and name for the meeting.

    • If Join Meeting is clicked, it will show the following:

      • Dropdown for Meeting Type - This dropdown is used to select the meeting mode: Group Call or One To One call.
      • EditText for ParticipantName - This edit text will contain the name of the participant.
      • EditText for MeetingId - This edit text will contain the meeting Id that you want to join.
      • Join Meeting Button - This button will call the join meeting API with the meetingId you entered.
    • If Create Meeting is clicked, it will show the following:

      • Dropdown for Meeting Type - This dropdown is used to select the meeting mode: Group Call or One To One call.
      • EditText for ParticipantName - This edit text will contain the name of the participant.
      • Join Meeting Button - This button will call the join meeting API with a new meetingId.

2. ParticipantList

  • participant_list.dart and participant_list_item.dart files are used to show Participant list.

3. Meeting Actions

  • Meeting actions are present in the meeting_action_bar.dart

    • MoreOptions:

    • AudioDeviceList:

    • LeaveOrEndDialog:

4. Meeting Top Bar

  • meeting_appbar.dart: It contains the meeting timer, switch camera option and recording indicator.

5. Chat

  • chat_screen.dart: It contains the chat screen made using PubSub.

One-to-one

  • one_to_one_meeting_screen.dart: It contains the complete layout for one to one meeting.

  • one_to_one_meeting_container.dart: It contains the logic to render the participants in the miniview and large view.

  • participant_view.dart: It is used to display the individual stream of the participant.
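
As a rough illustration of what participant_view.dart does, here is a minimal sketch that renders a participant's video stream with RTCVideoView (assuming, as in this repository, that the videosdk package re-exports RTCVideoView and that an enabled video Stream carries a ready renderer):

import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';

class SimpleParticipantView extends StatelessWidget {
  final Stream? videoStream;
  final String displayName;

  const SimpleParticipantView({
    super.key,
    required this.videoStream,
    required this.displayName,
  });

  @override
  Widget build(BuildContext context) {
    // Show the video if the participant has an enabled video stream,
    // otherwise fall back to showing the participant's name.
    return videoStream?.renderer != null
        ? RTCVideoView(
            videoStream!.renderer!,
            objectFit: RTCVideoViewObjectFit.RTCVideoViewObjectFitCover,
          )
        : Center(child: Text(displayName));
  }
}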

conference-call

  • conference_participant_grid.dart: It contains the management of the participant grid.

  • participant_grid_tile.dart: It contains the widget for a single participant which is displayed in the grid.

  • conference_screenshare_view.dart: It contains the widget which will display the screen share in the meeting.


Examples

Examples for Conference

Examples for Live Streaming


Documentation

Read the documentation to start using Video SDK.


Community

  • Discord - To get involved with the Video SDK community, ask questions and share tips.
  • Twitter - To receive updates, announcements, blog posts, and general Video SDK tips.

videosdk-rtc-flutter-sdk-example's People

Contributors

ahmedbhesaniya97, arjun-kava, bhumisalat, chintanrajpara, ishabodiwala, rajansurani, shuaixiaoqiang, videosdkadmin, yash-chudasama


videosdk-rtc-flutter-sdk-example's Issues

iOS: By default the audio only works through an earpiece, not the main speaker

Version: videosdk: 1.0.10
OS: iOS
Device: iPhone 12

Hi, I have a bug that I can't quite resolve at first glance. For some reason, the iOS SDK defaults to using the earpiece audio instead of the main speakers. When I connect a second source, like Bluetooth earbuds, it works correctly. I can switch between the main speaker and the earbuds with no issues (but still no earpiece in the sources).

Here is how my room is configured:

room = VideoSDK.createRoom(
      defaultCameraIndex: 1,
      roomId: widget.meetingId,
      token: widget.token,
      displayName: 'Customer',
      micEnabled: micEnabled,
      camEnabled: webcamEnabled,
      maxResolution: 'hd',
      notification: const NotificationInfo(
        title: "Video SDK",
        message: "Video SDK is sharing screen in the meeting",
        icon: "notification_share",
      ),
    );

When nothing is connected to the phone, getAudioOutputDevices returns only 1 audio device with "Speaker" deviceId. The sound comes through the earpiece:


Flutter Web [QUESTION]

I would like to inquire about flutter web support. At what stage is the implementation and is it possible to count on it in the near future?

Progress

Is there any way to track the progress for complete support for mobile and web?

type 'MultiChildLayoutParentData' is not a subtype of type 'FlexParentData' in type cast

Flutter Doctor -

[√] Flutter (Channel stable, 3.3.10, on Microsoft Windows [Version 10.0.19045.3930], locale en-IN)
• Flutter version 3.3.10 on channel stable at E:\flutter
• Upstream repository https://github.com/flutter/flutter.git
• Framework revision 135454af32 (1 year, 1 month ago), 2022-12-15 07:36:55 -0800
• Engine revision 3316dd8728
• Dart version 2.18.6
• DevTools version 2.15.0

[√] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
• Android SDK at C:\Users\Admin\AppData\Local\Android\sdk
• Platform android-33, build-tools 33.0.0
• Java binary at: E:\Android Studio\jre\bin\java
• Java version OpenJDK Runtime Environment (build 11.0.13+0-b1751.21-8125866)
• All Android licenses accepted.

[√] Chrome - develop for the web
• Chrome at C:\Program Files\Google\Chrome\Application\chrome.exe

[X] Visual Studio - develop for Windows
X Visual Studio not installed; this is necessary for Windows development.
Download at https://visualstudio.microsoft.com/downloads/.
Please install the "Desktop development with C++" workload, including all of its default components

[√] Android Studio (version 2021.3)
• Android Studio at E:\Android Studio
• Flutter plugin can be installed from:
https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
https://plugins.jetbrains.com/plugin/6351-dart
• Java version OpenJDK Runtime Environment (build 11.0.13+0-b1751.21-8125866)

[√] VS Code (version 1.85.1)
• VS Code at C:\Users\Admin\AppData\Local\Programs\Microsoft VS Code
• Flutter extension version 3.80.0

[√] Connected device (3 available)
• Windows (desktop) • windows • windows-x64 • Microsoft Windows [Version 10.0.19045.3930]
• Chrome (web) • chrome • web-javascript • Google Chrome 120.0.6099.217
• Edge (web) • edge • web-javascript • Microsoft Edge 120.0.2210.91

[√] HTTP Host Availability
• All required HTTP hosts are available

Error StackTrace -

I/flutter (32283): meeting_screen:getRoomToken:response.data
I/flutter (32283): eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhcGlrZXkiOiI5MDFkMmNkMi0xOWRjLTQ0YjctYTkyMS1iMWZmNTI0MzRjZmUiLCJwZXJtaXNzaW9ucyI6WyJhbGxvd19qb2luIiwiYWxsb3dfbW9kIl0sInZlcnNpb24iOjIsInJvb21JZCI6Imowa2YtNDR2NS1veG5tIiwiaWF0IjoxNzA1MjIwNDU1LCJleHAiOjE3MDUyMjEwNTV9.kVQ-wultNO9leEE-tLBvV-W3uRTVRBRNmaPTE3Q-kZw
D/FlutterWebRTCPlugin(32283): onConnectionChangeCLOSED
I/flutter (32283): generateProfileLevelIdForAnswer() | result: [profile:1, level:31
I/FlutterWebRTCPlugin(32283): getUserMedia(audio): mandatory: [googEchoCancellation2: true, googNoiseSuppression: true, echoCancellation: true, googAutoGainControl: true, googDAEchoCancellation: true, googEchoCancellation: true], optional: [sourceId: 0]
D/FlutterWebRTCPlugin(32283): MediaStream id: 7f9f4300-abfa-4a72-a188-09c3f76fa9e9
D/FlutterWebRTCPlugin(32283): addStreamcom.cloudwebrtc.webrtc.utils.AnyThreadResult@77ebd34
I/FlutterWebRTCPlugin(32283): getUserMedia(video): ConstraintsMap{mMap={frameRate=30, facingMode=user, width=1280, optional=[{sourceId=1}], height=720}}
D/FlutterWebRTCPlugin(32283): Creating video capturer using Camera2 API.
D/FlutterWebRTCPlugin(32283): create user specified camera 1 succeeded
D/FlutterWebRTCPlugin(32283): changeCaptureFormat: 1280x720@30
D/FlutterWebRTCPlugin(32283): MediaStream id: b9a41c6f-7b9b-44fd-ac45-6989a0539471
I/flutter (32283): type 'MultiChildLayoutParentData' is not a subtype of type 'FlexParentData' in type cast
D/FlutterWebRTCPlugin(32283): CameraEventsHandler.onCameraOpening: cameraName=1
I/flutter (32283): #0      Flexible.applyParentData (package:flutter/src/widgets/basic.dart:5030)
I/flutter (32283): #1      RenderObjectElement._updateParentData (package:flutter/src/widgets/framework.dart:6061)
I/flutter (32283): #2      RenderObjectElement.attachRenderObject (package:flutter/src/widgets/framework.dart:6082)
I/flutter (32283): #3      RenderObjectElement.mount (package:flutter/src/widgets/framework.dart:5751)
I/flutter (32283): #4      SingleChildRenderObjectElement.mount (package:flutter/src/widgets/framework.dart:6299)
I/flutter (32283): #5      Element.inflateWidget (package:flutter/src/widgets/framework.dart:3863)
I/flutter (32283): #6      Element.updateChild (package:flutter/src/widgets/framework.dart:3592)
I/flutter (32283): #7      ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #8      Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #9      ComponentElement._firstBuild (package:flutter/src/widgets/framework.dart:4859)
I/flutter (32283): #10     ComponentElement.mount (package:flutter/src/widgets/framework.dart:4853)
I/flutter (32283): #11     Element.inflateWidget (package:flutter/src/widgets/framework.dart:3863)
I/flutter (32283): #12     Element.updateChild (package:flutter/src/widgets/framework.dart:3586)
I/flutter (32283): #13     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #14     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #15     StatelessElement.update (package:flutter/src/widgets/framework.dart:4956)
I/flutter (32283): #16     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #17     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #18     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #19     StatelessElement.update (package:flutter/src/widgets/framework.dart:4956)
I/flutter (32283): #20     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #21     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #22     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #23     StatelessElement.update (package:flutter/src/widgets/framework.dart:4956)
I/flutter (32283): #24     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #25     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #26     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #27     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #28     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #29     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #30     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #31     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #32     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #33     RenderObjectElement.updateChildren (package:flutter/src/widgets/framework.dart:5904)
I/flutter (32283): #34     MultiChildRenderObjectElement.update (package:flutter/src/widgets/framework.dart:6460)
I/flutter (32283): #35     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #36     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #37     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #38     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #39     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #40     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #41     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): #42     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #43     StatefulElement.update (package:flutter/src/widgets/framework.dart:5082)
I/flutter (32283): #44     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #45     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #46     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): #47     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #48     StatefulElement.update (package:flutter/src/widgets/framework.dart:5082)
I/flutter (32283): #49     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #50     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #51     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #52     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #53     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #54     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #55     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): #56     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #57     StatefulElement.update (package:flutter/src/widgets/framework.dart:5082)
I/flutter (32283): #58     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #59     SingleChildRenderObjectElement.update (package:flutter/src/widgets/framework.dart:6307)
I/flutter (32283): #60     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #61     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #62     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #63     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #64     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #65     SingleChildRenderObjectElement.update (package:flutter/src/widgets/framework.dart:6307)
I/flutter (32283): #66     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #67     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #68     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): #69     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #70     StatefulElement.update (package:flutter/src/widgets/framework.dart:5082)
I/flutter (32283): #71     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #72     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #73     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): #74     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #75     StatefulElement.update (package:flutter/src/widgets/framework.dart:5082)
I/flutter (32283): #76     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #77     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #78     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #79     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #80     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #81     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #82     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #83     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #84     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #85     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #86     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #87     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #88     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #89     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #90     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): #91     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #92     StatefulElement.update (package:flutter/src/widgets/framework.dart:5082)
I/flutter (32283): #93     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #94     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #95     Element.rebuild (package:flutter/src/widgets/framework.dart:4604)
I/flutter (32283): #96     ProxyElement.update (package:flutter/src/widgets/framework.dart:5228)
I/flutter (32283): #97     Element.updateChild (package:flutter/src/widgets/framework.dart:3570)
I/flutter (32283): #98     ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4904)
I/flutter (32283): #99     StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:5050)
I/flutter (32283): Another exception was thrown: Instance of 'DiagnosticsProperty<void>'
I/flutter (32283): Another exception was thrown: Instance of 'DiagnosticsProperty<void>'
D/FlutterWebRTCPlugin(32283): onIceGatheringChangeGATHERING
D/FlutterWebRTCPlugin(32283): onIceCandidate
D/FlutterWebRTCPlugin(32283): onIceCandidate
D/FlutterWebRTCPlugin(32283): onIceCandidate
D/FlutterWebRTCPlugin(32283): onConnectionChangeCONNECTING
D/FlutterWebRTCPlugin(32283): onIceCandidate
D/FlutterWebRTCPlugin(32283): onSelectedCandidatePairChanged
D/FlutterWebRTCPlugin(32283): onIceGatheringChangeCOMPLETE
D/FlutterWebRTCPlugin(32283): addStreamcom.cloudwebrtc.webrtc.utils.AnyThreadResult@307fdc9
D/FlutterWebRTCPlugin(32283): onConnectionChangeCONNECTED
D/FlutterWebRTCPlugin(32283): CameraEventsHandler.onFirstFrameAvailable
W/FlutterWebRTCPlugin(32283): FlutterRTCVideoRenderer.setVideoTrack, set video track to 5cf5d1ec-011d-4d90-bfdd-4d5523dc3e9a

missing VideoRenderer.onFirstFrameRendered implementations

/C:/Users/AppData/Local/Pub/Cache/hosted/pub.dartlang.org/flutter_webrtc-0.8.12/lib/src/native/rtc_video_renderer_impl.dart:11:7: Error: The non-abstract class 'RTCVideoRenderer' is missing implementations for these members:

  • VideoRenderer.onFirstFrameRendered
    Try to either
  • provide an implementation,
  • inherit an implementation from a superclass or mixin,
  • mark the class as abstract, or
  • provide a 'noSuchMethod' implementation.

class RTCVideoRenderer extends ValueNotifier
^^^^^^^^^^^^^^^^
/C:/flutter/.pub-cache/hosted/pub.dartlang.org/webrtc_interface-1.0.8/lib/src/rtc_video_renderer.dart:51:13: Context: 'VideoRenderer.onFirstFrameRendered' is defined here.
Function? onFirstFrameRendered;
^^^^^^^^^^^^^^^^^^^^

Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel stable, 3.3.0, on Microsoft Windows [Version 10.0.22000.856], locale en-US)
[√] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
[√] Chrome - develop for the web
[X] Visual Studio - develop for Windows
X Visual Studio not installed; this is necessary for Windows development.
Download at https://visualstudio.microsoft.com/downloads/.
Please install the "Desktop development with C++" workload, including all of its default components
[√] Android Studio (version 2020.3)
[√] IntelliJ IDEA Ultimate Edition (version 2020.3)
[√] VS Code (version 1.71.0)
[√] Connected device (4 available)
[√] HTTP Host Availability

! Doctor found issues in 1 category.

Not getting entryRequested

When two or three users join the call at the same time, I am not getting the entry request. Also, when the host joins the meeting immediately after creating a group call, the join request again does not come.

[Android] NullPointerException when initializing VideoCapturer on Xiaomi/Redmi devices running Android < 9.0

Description:
I've been working on integrating the Video SDK into a Flutter project, and overall, the implementation works fine on most devices. However, I encountered an exception specifically on Xiaomi/Redmi devices running Android versions earlier than 9.0. The exception trace is as follows:

E/AndroidRuntime(30301): java.lang.NullPointerException: Attempt to invoke interface method 'void org.webrtc.VideoCapturer.initialize(org.webrtc.SurfaceTextureHelper, android.content.Context, org.webrtc.CapturerObserver)' on a null object reference
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.getUserVideo(GetUserMediaImpl.java:730)
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.getUserMedia(GetUserMediaImpl.java:597)
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.access$000(GetUserMediaImpl.java:84)
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl$1.invoke(GetUserMediaImpl.java:456)
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.lambda$requestPermissions$1(GetUserMediaImpl.java:849)
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl$$ExternalSyntheticLambda0.invoke(Unknown Source:6)
E/AndroidRuntime(30301): 	at com.cloudwebrtc.webrtc.utils.PermissionUtils$1.onReceiveResult(PermissionUtils.java:115)
E/AndroidRuntime(30301): 	at android.os.ResultReceiver$MyRunnable.run(ResultReceiver.java:50)
E/AndroidRuntime(30301): 	at android.os.Handler.handleCallback(Handler.java:873)
E/AndroidRuntime(30301): 	at android.os.Handler.dispatchMessage(Handler.java:99)
E/AndroidRuntime(30301): 	at android.os.Looper.loop(Looper.java:201)
E/AndroidRuntime(30301): 	at android.app.ActivityThread.main(ActivityThread.java:6810)
E/AndroidRuntime(30301): 	at java.lang.reflect.Method.invoke(Native Method)
E/AndroidRuntime(30301): 	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:547)
E/AndroidRuntime(30301): 	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:873)
I/Process (30301): Sending signal. PID: 30301 SIG: 9

Additional Observations:
The issue seems to be specific to Xiaomi/Redmi devices running Android versions earlier than 9.0. I have followed the instructions in this link to address the problem, but the error still persists on other devices. Additionally, on Xiaomi devices, switching the camera also does not function as expected.

Devices Affected:

  • Xiaomi Mi 8 (Android 8.1)
  • Redmi Note 7 (Android 8.0)

Expected Behavior:
The Video SDK should work consistently across various Android devices, including Xiaomi/Redmi models with Android versions earlier than 9.0. Switching the camera should also function properly on all supported devices.

Steps to Reproduce:

  1. Install the app on Xiaomi/Redmi devices with Android < 9.0.
  2. Attempt to use the Video SDK functionalities, such as starting a video call or switching the camera.

Possible Solution:
As I investigated further, it appears that the issue might be related to how the VideoCapturer is initialized on these specific devices. I would appreciate any guidance, insights, or potential fixes from the community to resolve this problem.

Thank you for your assistance!

VideoSDK Crashes on iOS 16.4 Simulator and Samsung S8+ Android 9 with NullPointerException

The VideoSDK continues to experience operational failures on both the iOS 16.4 Simulator and the Samsung S8+ running Android 9. A crash persists while attempting to use the VideoSDK, yielding a NullPointerException within the logs. This issue might be correlated with Flutter WebRTC integration, as indicated by the following URL: flutter-webrtc/flutter-webrtc#1365.

Steps to Reproduce:

  1. Install VideoSDK version 1.1.5.
  2. Launch the iOS 16.4 Simulator or use a Samsung S8+ device with Android 9.
  3. Integrate the VideoSDK into a sample application or use an existing application that employs the SDK's video functionality.
  4. Attempt to initiate video playback or recording.

Expected Behavior:
VideoSDK should successfully initialize and execute video-related functions on both iOS 16.4 Simulator and Samsung S8+ Android 9 devices.

E/AndroidRuntime(10644): java.lang.NullPointerException: Attempt to invoke interface method 'void org.webrtc.VideoCapturer.initialize(org.webrtc.SurfaceTextureHelper, android.content.Context, org.webrtc.CapturerObserver)' on a null object reference
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.getUserVideo(GetUserMediaImpl.java:730)
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.getUserMedia(GetUserMediaImpl.java:597)
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.access$000(GetUserMediaImpl.java:84)
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl$1.invoke(GetUserMediaImpl.java:456)
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl.lambda$requestPermissions$1(GetUserMediaImpl.java:849)
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.GetUserMediaImpl$$ExternalSyntheticLambda0.invoke(Unknown Source:6)
E/AndroidRuntime(10644): 	at com.cloudwebrtc.webrtc.utils.PermissionUtils$1.onReceiveResult(PermissionUtils.java:115)
E/AndroidRuntime(10644): 	at android.os.ResultReceiver$MyRunnable.run(ResultReceiver.java:50)
E/AndroidRuntime(10644): 	at android.os.Handler.handleCallback(Handler.java:873)
E/AndroidRuntime(10644): 	at android.os.Handler.dispatchMessage(Handler.java:99)
E/AndroidRuntime(10644): 	at android.os.Looper.loop(Looper.java:214)
E/AndroidRuntime(10644): 	at android.app.ActivityThread.main(ActivityThread.java:7050)
E/AndroidRuntime(10644): 	at java.lang.reflect.Method.invoke(Native Method)
E/AndroidRuntime(10644): 	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:494)
E/AndroidRuntime(10644): 	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:965)

Environment:

  • VideoSDK version: 1.1.5
  • iOS Simulator version: 16.4
  • Device: Samsung S8+
  • Android version: 9

Steps Taken So Far:

  1. Verified the issue by replicating it on both platforms.
  2. Tested the provided example application using VideoSDK version 1.1.5.
  3. Investigated the error logs for insights into the root cause.
  4. Noted the resemblance between this issue and the one outlined at flutter-webrtc/flutter-webrtc#1365.

Expected Impact:
This issue impedes the effective utilization of VideoSDK features on iOS 16.4 Simulator and Samsung S8+ Android 9 devices, adversely affecting developers and users who depend on the SDK's functionality. It is imperative to address this issue promptly to uphold a seamless user experience.

Note:
This issue report aims to spotlight the concern and foster collaboration among the developer community to identify and rectify the underlying issue. If you have encountered this issue or possess relevant insights, please contribute to the ongoing discussion. Additionally, the potential connection to the Flutter WebRTC issue flutter-webrtc/flutter-webrtc#1365 could offer a valuable avenue of investigation.

.env file not found

flutter:
Error: unable to find directory entry in pubspec.yaml: C:\Users\pprag\Downloads\azaz.env\

java.lang.NullPointerException: Attempt to invoke interface method 'void org.webrtc.VideoCapturer.initialize(org.webrtc.SurfaceTextureHelper, android.content.Context, org.webrtc.CapturerObserver)' on a null object reference

java.lang.NullPointerException: Attempt to invoke interface method 'void org.webrtc.VideoCapturer.initialize(org.webrtc.SurfaceTextureHelper, android.content.Context, org.webrtc.CapturerObserver)' on a null object reference
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.GetUserMediaImpl.getUserVideo(GetUserMediaImpl.java:761)
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.GetUserMediaImpl.getUserMedia(GetUserMediaImpl.java:628)
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.GetUserMediaImpl.access$100(GetUserMediaImpl.java:88)
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.GetUserMediaImpl$2.invoke(GetUserMediaImpl.java:478)
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.GetUserMediaImpl.lambda$requestPermissions$1(GetUserMediaImpl.java:880)
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.GetUserMediaImpl$$ExternalSyntheticLambda0.invoke(Unknown Source:6)
E/AndroidRuntime(16634): at com.cloudwebrtc.webrtc.utils.PermissionUtils$1.onReceiveResult(PermissionUtils.java:115)
E/AndroidRuntime(16634): at android.os.ResultReceiver$MyRunnable.run(ResultReceiver.java:50)
E/AndroidRuntime(16634): at android.os.Handler.handleCallback(Handler.java:790)
E/AndroidRuntime(16634): at android.os.Handler.dispatchMessage(Handler.java:99)
E/AndroidRuntime(16634): at android.os.Looper.loop(Looper.java:214)
E/AndroidRuntime(16634): at android.app.ActivityThread.main(ActivityThread.java:6977)
E/AndroidRuntime(16634): at java.lang.reflect.Method.invoke(Native Method)
E/AndroidRuntime(16634): at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:528)
E/AndroidRuntime(16634): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:910)
I/Process (16634): Sending signal. PID: 16634 SIG: 9
Lost connection to device.

My app crashes after calling the join method on Android 8. I have tested it on higher versions and it works fine. I'm using videosdk: ^1.1.10.

Quit Spamming

Hi, could I request you stop spamming other communities with messages directing people to your website?

Multiple customers have reported it to us and find it very annoying.

Lack of Support for Retrieving the Currently Selected Audio Device and Defaulting to External Speaker on Certain Cases

Currently, I am working with the Video SDK integrated into a Flutter project for real-time communication. The implementation is generally successful; however, I have encountered an issue related to audio devices.

Problem:

The Video SDK does not provide a method to retrieve the currently selected audio device, such as the earpiece, speakerphone, or Bluetooth headset, which is essential for users to know which device is currently in use.

Defaulting to External Speaker:
Furthermore, I noticed that in some specific scenarios, the app defaults to the external speaker even when other audio devices are available and selected. This issue particularly affects users who prefer using earpieces or Bluetooth headsets for privacy or convenience reasons.

Expected Behavior:

The Video SDK should expose a method or API that enables developers to retrieve information about the currently selected audio device.
The app should respect the user's selected audio device and not automatically default to the external speaker unless explicitly chosen.

Steps to Reproduce:

  1. Launch the app on various devices.
  2. Connect different audio devices such as earpieces, Bluetooth headsets, or external speakers.
  3. Observe the audio output device that the app defaults to in each scenario.

Devices and Environments Affected:
This issue has been observed on the following devices and environments:

  • Xiaomi Mi 9T (Android 10)
  • Samsung Galaxy S10 (Android 11)
  • Google Pixel 4 (Android 12)
  • iOS devices running iOS 14 and 15

Additional Information:
I have investigated the official documentation and forums, but there are no available solutions or workarounds to address this problem. As this issue can significantly impact user experience, I hope the development team can prioritize adding support for retrieving the currently selected audio device and improving the handling of default audio output.

Suggested Solution:

  1. Introduce a new method in the Video SDK that enables developers to obtain information about the currently selected audio device (e.g., "getCurrentAudioDevice()").
  2. Review the audio handling logic to ensure the app respects the user's selected audio device and does not automatically default to the external speaker without explicit user input.

Your prompt attention to this matter would be greatly appreciated. Thank you for your dedication to enhancing the Video SDK and delivering a seamless real-time communication experience.

If there are any additional details or testing I can provide to assist with resolving this issue, please let me know.

Thank you!

Screen resolution issue on M2 chip mac

On the M2 Mac, there are 4 different resolutions:

  1. 1024x666
  2. 1280x832
  3. 1470x956
  4. 1710x1112

With resolutions 1 and 2 it's working fine, but with 3 and 4 it's not working properly. I can see the screen, but the image looks crooked and is drawn as three images in a row.

Note: On other Mac chip models with different resolutions it's working fine. I am seeing the issue only on the M2 Mac.


Please, someone help me fix this issue.

Unbind all the `handler` from the `event`

I have listened to room events when starting a call, and I need to unbind all the listened events.

You are using the events2 package and there is an off method. Please, can you make this method public ...?

Video not rendering on iOS

Hi, videos (including shared screens) from other participants are not rendered on iOS 17.4 (Simulator). Is there something I am missing?

Can't run on iOS

I installed the videosdk package and now I can't run my app.

When I do flutter run, it throws:

Unhandled Exception: MissingPluginException(No implementation found for method getAll on channel plugins.flutter.io/shared_preferences

I tried flutter clean and uninstalling and reinstalling the app, with no success.

Am I missing something? If I remove the videosdk package, everything works fine. Please help me.

Thanks

How can I obtain an Auth Server URL? [Question]

Hello there. I am new to this plugin, and I am having an issue getting the token. I have looked in the example and this is the method I found:

  Future<String> fetchToken() async {
    final String? _AUTH_URL = dotenv.env['AUTH_URL'];
    String? _AUTH_TOKEN = dotenv.env['AUTH_TOKEN'];

    if ((_AUTH_TOKEN?.isEmpty ?? true) && (_AUTH_URL?.isEmpty ?? true)) {
      toastMsg("Please set the environment variables");
      throw Exception("Either AUTH_TOKEN or AUTH_URL is not set in .env file");
      return "";
    }

    if ((_AUTH_TOKEN?.isNotEmpty ?? false) &&
        (_AUTH_URL?.isNotEmpty ?? false)) {
      toastMsg("Please set only one environment variable");
      throw Exception("Either AUTH_TOKEN or AUTH_URL can be set in .env file");
      return "";
    }

    if (_AUTH_URL?.isNotEmpty ?? false) {
      final Uri getTokenUrl = Uri.parse('$_AUTH_URL/get-token');
      final http.Response tokenResponse = await http.get(getTokenUrl);
      _AUTH_TOKEN = json.decode(tokenResponse.body)['token'];
    }

    // log("Auth Token: $_AUTH_TOKEN");

    return _AUTH_TOKEN ?? "";
  }

How can I create an $_AUTH_URL? Is it available on the dashboard, and if not, would anyone please help me create it?

Thanks !

Audio output coming from earpiece

After joining a call, by default the audio is coming from the earpiece, not from the speaker.

I joined a call, I am not connected to any other external audio output device, and every time I am getting audio from the earpiece, not from the speaker.

Below is the sample code:
Room room = VideoSDK.createRoom(
  roomId: GroupSharingApi.instance.meetingId,
  token: GroupSharingApi.instance.videoSDKToken,
  displayName: '${_profile.firstName} ${_profile.lastName}',
  micEnabled: widget.micEnabled,
  camEnabled: widget.camEnabled,
  maxResolution: 'hd',
  multiStream: false,
  participantId: _userId,
  defaultCameraIndex: 1,
  notification: const NotificationInfo(
    title: "Video SDK",
    message: "Video SDK is sharing screen in the meeting",
    icon: "notification_share", // drawable icon name
  ),
);

Blank dark screen

I started using this package last month for my project and it has been working fine. I was able to have video calls on my flutter app.

But since yesterday, it hasn't been working. When I try to have a video call, the screen just remains dark and blank; it doesn't even show my video.

I also tried the videosdk code example and it was the same thing.

This is my debug console log:

I/org.webrtc.Logging(17810): NativeLibrary: Loading native library: jingle_peerconnection_so
I/org.webrtc.Logging(17810): NativeLibrary: Loading library: jingle_peerconnection_so
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
I/org.webrtc.Logging(17810): PeerConnectionFactory: PeerConnectionFactory was initialized without an injected Loggable. Any existing Loggable will be deleted.
I/org.webrtc.Logging(17810): EglBase14Impl: SDK version: 29. isEGL14Supported: true
I/org.webrtc.Logging(17810): EglBase14Impl: Using OpenGL ES version 2
I/org.webrtc.Logging(17810): WebRtcAudioManagerExternal: Sample rate is set to 48000 Hz
I/org.webrtc.Logging(17810): WebRtcAudioManagerExternal: Sample rate is set to 48000 Hz
E/org.webrtc.Logging(17810): JavaAudioDeviceModule: HW NS not supported
I/org.webrtc.Logging(17810): JavaAudioDeviceModule: createAudioDeviceModule
I/org.webrtc.Logging(17810): JavaAudioDeviceModule: HW NS will not be used.
I/org.webrtc.Logging(17810): JavaAudioDeviceModule: HW AEC will be used.
I/org.webrtc.Logging(17810): WebRtcAudioEffectsExternal: ctor@[name=main, id=2]
I/org.webrtc.Logging(17810): WebRtcAudioRecordExternal: ctor@[name=main, id=2]
I/org.webrtc.Logging(17810): WebRtcAudioTrackExternal: ctor@[name=main, id=2]
W/AudioCapabilities(17810): Unsupported mime audio/x-ima
W/AudioCapabilities(17810): Unsupported mime audio/mpeg-L1
W/AudioCapabilities(17810): Unsupported mime audio/mpeg-L2
W/AudioCapabilities(17810): Unsupported mime audio/x-ms-wma
W/VideoCapabilities(17810): Unsupported mime video/wvc1
W/VideoCapabilities(17810): Unsupported mime video/avc-wfd
W/VideoCapabilities(17810): Unsupported mime video/mp43
W/VideoCapabilities(17810): Unrecognized profile/level 1/32 for video/mp4v-es
W/VideoCapabilities(17810): Unrecognized profile/level 32768/2 for video/mp4v-es
W/VideoCapabilities(17810): Unrecognized profile/level 32768/64 for video/mp4v-es
W/VideoCapabilities(17810): Unsupported mime video/wvc1
W/VideoCapabilities(17810): Unsupported mime video/x-ms-wmv7
W/VideoCapabilities(17810): Unsupported mime video/x-ms-wmv8
I/org.webrtc.Logging(17810): WebRtcAudioRecordExternal: enableBuiltInAEC(true)
I/org.webrtc.Logging(17810): WebRtcAudioEffectsExternal: setAEC(true)
I/org.webrtc.Logging(17810): PeerConnectionFactory: onWorkerThreadReady
I/org.webrtc.Logging(17810): PeerConnectionFactory: onSignalingThreadReady
I/org.webrtc.Logging(17810): PeerConnectionFactory: onNetworkThreadReady
D/MediaConstraintsUtils(17810): mandatory constraints are not a map
I/FlutterWebRTCPlugin(17810): getUserMedia(audio): mandatory: [], optional: [sourceId: audio-1]
D/FlutterWebRTCPlugin(17810): MediaStream id: 78fae78f-de4f-4ebf-b8fb-6d9c97b5163b
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getLong(Ljava/lang/Object;J)J (greylist,core-platform-api, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getObject(Ljava/lang/Object;J)Ljava/lang/Object; (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putObject(Ljava/lang/Object;JLjava/lang/Object;)V (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getLong(Ljava/lang/Object;J)J (greylist,core-platform-api, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getLong(Ljava/lang/Object;J)J (greylist,core-platform-api, linking, allowed)
I/flutter (17810): request timeout
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)

On iOS, the audio output changed from the main speaker to the earpiece speaker, and after mute the other person on the call can still hear my audio; on Android it works fine

When I enable my mic on iOS, my audio output changes from the main speaker to the earpiece speaker.
Below is my code snippet:

Mute and unmute function:
onMicButtonPressed: () {
  if (audioStream != null) {
    meeting.muteMic();
  } else {
    meeting.unmuteMic();
  }
},

Full code sample

class ConferenceScreen extends StatefulWidget {
final bool micEnabled, camEnabled, chatEnabled, isHost;
final String customRoomId;
final int eventId;

const ConferenceScreen({
Key? key,
required this.isHost,
required this.eventId,
required this.customRoomId,
this.micEnabled = true,
this.camEnabled = true,
this.chatEnabled = true,
}) : super(key: key);

@override
State<ConferenceScreen> createState() => _ConferenceScreenState();
}

class _ConferenceScreenState extends State<ConferenceScreen> {
late Room meeting;
bool _joined = false;
bool callLeave = false;
Stream? videoStream;
Stream? audioStream;
List cameras = [];
List<Map<String,String>> participantList = [{}];

// Stream? remoteParticipantShareStream;

Future joinAndCreateInitCall() async {
String _userId =
await SharedPreferencesHelper.getValue(SharedPreferencesHelper.userId);
ProfileDTO _profile =
await SharedPreferencesHelper.getValue(SharedPreferencesHelper.profile)
.then((profile) {
return ProfileDTO.fromJson(json.decode(profile), _userId);
}).catchError((e) {});
Room room = VideoSDK.createRoom(
roomId: GroupSharingApi.instance.meetingId,
token: GroupSharingApi.instance.videoSDKToken,
displayName: '${_profile.firstName} ${_profile.lastName}',
micEnabled: widget.micEnabled,
camEnabled: widget.camEnabled,
maxResolution: 'hd',
multiStream: true,
participantId: _userId,
defaultCameraIndex: 1,
notification: const NotificationInfo(
title: "Video SDK",
message: "Video SDK is sharing screen in the meeting",
icon: "notification_share", // drawable icon name
),
);

// Register meeting events
registerMeetingEvents(room);

log('room join');
// Join meeting
room.join();
Wakelock.enable();
log('room join');

}

@override
void initState() {
joinAndCreateInitCall();
super.initState();
}

@override
void dispose() {
log('dispose called');
meeting.leave();
Wakelock.disable();
super.dispose();
}

@override
Widget build(BuildContext context) {
return _joined
? Container(
height: MediaQuery.of(context).size.height * 0.4,
color: Colors.black,
alignment: Alignment.center,
child: Column(
mainAxisSize: MainAxisSize.min,
children: [
Flexible(child: ConferenceParticipantGrid(meeting: meeting)),
const SizedBox(height: 10),
AnimatedCrossFade(
duration: const Duration(milliseconds: 300),
crossFadeState: CrossFadeState.showFirst,
secondChild: const SizedBox.shrink(),
firstChild: MeetingActionBar(
isMicEnabled: audioStream != null,
isCamEnabled: videoStream != null,
// Called when Call End button is pressed
onCallEndButtonPressed: () {
callLeave = true;
meeting.leave();
},
// Called when Call leave button is pressed
// onCallLeaveButtonPressed: () {
// callLeave = true;
// meeting.leave();
// },
// Called when mic button is pressed
onMicButtonPressed: () {
if (audioStream != null) {
meeting.muteMic();
} else {
meeting.unmuteMic();
}
},
// Called when camera button is pressed
onCameraButtonPressed: () {
if (videoStream != null) {
meeting.disableCam();
} else {
meeting.enableCam();
}
},
onSwitchMicButtonPressed: (details) async {
List outptuDevice =
meeting.getAudioOutputDevices();
double bottomMargin = (70.0 * outptuDevice.length);
final screenSize = MediaQuery.of(context).size;
await showMenu(
context: context,
color: Colors.white,
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(12)),
position: RelativeRect.fromLTRB(
screenSize.width - details.globalPosition.dx,
details.globalPosition.dy - bottomMargin,
details.globalPosition.dx,
(bottomMargin),
),
items: outptuDevice.map((e) {
return PopupMenuItem(
value: e, child: Text(e.label));
}).toList(),
elevation: 8.0,
).then((value) {
if (value != null) {
meeting.switchAudioDevice(value);
}
});
},

                onParticipantPressed: () {
                  log('custom room id from confrense screen ${widget.customRoomId}');
                  showModalBottomSheet(
                    context: context,
                    enableDrag: false,
                    builder: (context) => ParticipantList(meeting: meeting, isHost: widget.isHost,eventId:widget.eventId,customRoomId: widget.customRoomId),
                  );
                },
                onSwitchCamera: (){
                  MediaDeviceInfo newCam = cameras.firstWhere((camera) => camera.deviceId != meeting.selectedCamId);
                  meeting.changeCam(newCam.deviceId);
                },
              ),
            ),
          ],
        ),
      )
    : SizedBox(
        height: MediaQuery.of(context).size.height * 0.4,
        child: Center(
          child: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            mainAxisSize: MainAxisSize.min,
            children: [
              Text(widget.isHost ? "Creating a Room" : "Please wait host will let you in soon!",
                  style: TextStyle(fontSize: 20, color: Colors.white, fontWeight: FontWeight.w500)),
              const SizedBox(height: 10),
              CupertinoActivityIndicator(
                color: Colors.white,
              ),
            ],
          ),
        ),
      );

}

  void registerMeetingEvents(Room _meeting) {
    // Called when joined in meeting
    _meeting.on(
      Events.roomJoined,
      () {
        log('on room joined');
        setState(() {
          meeting = _meeting;
          cameras = meeting.getCameras();
          _joined = true;
        });
      },
    );

    // Called when meeting is ended
    _meeting.on(Events.roomLeft, (String? errorMsg) {
      if (errorMsg != null) {
        showSnackBarMessage(
            message: "Meeting left due to $errorMsg !!", context: context);
      }
      GroupSharingApi.instance.showCallScreen.value = false;
    });

    // Keep a local map of participant id -> display name for leave messages
    _meeting.on(Events.participantJoined, (Participant participant) {
      participantList.add({'${participant.id}': '${participant.displayName}'});
      showSnackBarMessage(
          message: "${participant.displayName} joined the room!",
          context: context);
    });
    _meeting.on(Events.participantLeft, (String participantId) {
      showSnackBarMessage(
          message:
              "${participantList.firstWhere((element) => element.containsKey(participantId))[participantId]} left the room!",
          context: context);
      participantList.removeWhere((element) => element.containsKey(participantId));
    });

    // Called when a local stream is enabled
    _meeting.localParticipant.on(Events.streamEnabled, (Stream _stream) {
      if (_stream.kind == 'video') {
        setState(() {
          videoStream = _stream;
        });
      } else if (_stream.kind == 'audio') {
        setState(() {
          audioStream = _stream;
        });
      }
    });

    // Called when a local stream is disabled
    _meeting.localParticipant.on(Events.streamDisabled, (Stream _stream) {
      if (_stream.kind == 'video' && videoStream?.id == _stream.id) {
        setState(() {
          videoStream = null;
        });
      } else if (_stream.kind == 'audio' && audioStream?.id == _stream.id) {
        setState(() {
          audioStream = null;
        });
      }
    });

    // Called when a participant requests to join the meeting
    _meeting.on(Events.entryRequested, (data) {
      log('user entry request');
      var name = data["name"];
      var allow = data["allow"];
      var deny = data["deny"];
      showRequestDialog(context, "Join Request",
          "Do you want to allow $name to join room?", allow, deny);
    });

    // Handle camera requested
    _meeting.on(Events.cameraRequested, (data) {
      log('requested for camera');
      var allow = data["accept"];
      var deny = data["reject"];
      showRequestDialog(context, "Enable Camera Request",
          "Host requested to enable camera!", allow, deny);
    });

    // Handle mic requested
    _meeting.on(Events.micRequested, (data) {
      log('requested for mic');
      var allow = data["accept"];
      var deny = data["reject"];
      showRequestDialog(context, "Enable Mic Request",
          "Host requested to enable mic!", allow, deny);
    });

    // Surface SDK errors to the user
    _meeting.on(
        Events.error,
        (error) => {
              showSnackBarMessage(
                  message: error['name'].toString() +
                      " :: " +
                      error['message'].toString(),
                  context: context)
            });
  }
}

void showSnackBarMessage(
    {required String message,
    Widget? icon,
    Color messageColor = Colors.white,
    required BuildContext context}) {
  ScaffoldMessenger.of(context).removeCurrentSnackBar();

  ScaffoldMessenger.of(context).showSnackBar(SnackBar(
      margin: const EdgeInsets.symmetric(horizontal: 16, vertical: 4),
      behavior: SnackBarBehavior.floating,
      shape: RoundedRectangleBorder(borderRadius: BorderRadius.circular(8)),
      content: Row(
        children: [
          if (icon != null) icon,
          Flexible(
            child: Text(
              message,
              style: TextStyle(
                color: messageColor,
                fontSize: 14,
                fontWeight: FontWeight.w500,
              ),
              overflow: TextOverflow.fade,
            ),
          )
        ],
      )));
}

void showRequestDialog(BuildContext context, String title, String content,
        Function allow, Function deny) =>
    showDialog(
      context: context,
      builder: (context) => AlertDialog(
        title: Text(title),
        content: Text(content),
        actions: [
          TextButton(
            child: const Text("Deny"),
            onPressed: () async {
              await deny();
              Navigator.of(context).pop();
            },
          ),
          TextButton(
            child: const Text("Allow"),
            onPressed: () async {
              await allow();
              Navigator.of(context).pop();
            },
          ),
        ],
      ),
    );

My flutter Doctor:
Doctor summary (to see all details, run flutter doctor -v):
[!] Flutter (Channel stable, 3.7.3, on macOS 13.0.1 22A400 darwin-arm64, locale
en-IN)
! Warning: dart on your path resolves to
/opt/homebrew/Cellar/dart/2.18.5/libexec/bin/dart, which is not inside
your current Flutter SDK checkout at
/Users/macintosh/Desktop/Sangam/flutter_sdk/flutter. Consider adding
/Users/macintosh/Desktop/Sangam/flutter_sdk/flutter/bin to the front of
your path.
[!] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
βœ— cmdline-tools component is missing
Run path/to/sdkmanager --install "cmdline-tools;latest"
See https://developer.android.com/studio/command-line for more details.
βœ— Android license status unknown.
Run flutter doctor --android-licenses to accept the SDK licenses.
See https://flutter.dev/docs/get-started/install/macos#android-setup for
more details.
[βœ“] Xcode - develop for iOS and macOS (Xcode 14.2)
[βœ“] Chrome - develop for the web
[βœ“] Android Studio (version 2021.3)
[βœ“] VS Code (version 1.75.1)
[βœ“] Connected device (4 available)
[βœ“] HTTP Host Availability

! Doctor found issues in 2 categories.

My videosdk version: 1.0.9

After muting the mic, the other person on the call can still hear my audio; on Android it works fine.
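
One way to narrow this down on the affected iOS device is to check whether the local audio track is actually stopped after muteMic(). A minimal debug sketch (a troubleshooting assumption, not an official fix; the helper name is hypothetical) that reuses the event wiring and the log helper already used in the code above:

void traceLocalAudio(Room room) {
  // If Events.streamDisabled never fires for the 'audio' kind after muteMic(),
  // the sender-side track is still live, which would match the symptom above.
  room.localParticipant.on(Events.streamEnabled, (Stream stream) {
    if (stream.kind == 'audio') log('local audio stream enabled: ${stream.id}');
  });
  room.localParticipant.on(Events.streamDisabled, (Stream stream) {
    if (stream.kind == 'audio') log('local audio stream disabled: ${stream.id}');
  });
}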

Video Sample:

WhatsApp.Video.2023-02-26.at.8.01.33.PM.mp4

Cannot run on Flutter 3.3.10

I tried Java versions 17, 15, 11, and 8, but I'm still getting this error.

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':app:processDebugResources'.

Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
Failed to transform audioswitch-c498d866c57f1d88056d5e7e7a78d622e3b0c046.aar (com.github.davidliu:audioswitch:c498d866c57f1d88056d5e7e7a78d622e3b0c046) to match attributes {artifactType=android-compiled-dependencies-resources, org.gradle.category=library, org.gradle.libraryelements=jar, org.gradle.status=release, org.gradle.usage=java-runtime}.
> Execution failed for AarResourcesCompilerTransform: C:\Users\abdullah\.gradle\caches\transforms-3\f81649683aef4eb6854f409a8444a1d1\transformed\jetified-audioswitch-c498d866c57f1d88056d5e7e7a78d622e3b0c046.
> C:\Users\abdullah\.gradle\caches\transforms-3\f81649683aef4eb6854f409a8444a1d1\transformed\jetified-audioswitch-c498d866c57f1d88056d5e7e7a78d622e3b0c046\AndroidManifest.xml

  • Try:

Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.

BUILD FAILED in 15s
Exception: Gradle task assembleDebug failed with exit code 1

VideoSDK blocks audio even after exit and dispose

I am using the VideoSDK together with RingtonePlayer. The ringtone plays initially, but once a call (audio/video) has been initiated and eventually disposed of, the SDK still blocks sound from both the RingtonePlayer and other audio on the phone.
I would appreciate it if this could be looked into soon.

Sometimes during a call, participants' videos get swapped

When the call gets connected and video is enabled, participants' video feeds sometimes get swapped.

My videosdk version: 1.0.9
My flutter doctor output:

[!] Flutter (Channel stable, 3.7.3, on macOS 13.0.1 22A400 darwin-arm64, locale
en-IN)
! Warning: dart on your path resolves to
/opt/homebrew/Cellar/dart/2.18.5/libexec/bin/dart, which is not inside
your current Flutter SDK checkout at
/Users/macintosh/Desktop/Sangam/flutter_sdk/flutter. Consider adding
/Users/macintosh/Desktop/Sangam/flutter_sdk/flutter/bin to the front of
your path.
[!] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
βœ— cmdline-tools component is missing
Run path/to/sdkmanager --install "cmdline-tools;latest"
See https://developer.android.com/studio/command-line for more details.
βœ— Android license status unknown.
Run flutter doctor --android-licenses to accept the SDK licenses.
See https://flutter.dev/docs/get-started/install/macos#android-setup for
more details.
[βœ“] Xcode - develop for iOS and macOS (Xcode 14.2)
[βœ“] Chrome - develop for the web
[βœ“] Android Studio (version 2021.3)
[βœ“] VS Code (version 1.75.1)
[βœ“] Connected device (2 available)
[βœ“] HTTP Host Availability

My Flutter code implementation:

Room room = VideoSDK.createRoom(
  roomId: GroupSharingApi.instance.meetingId,
  token: GroupSharingApi.instance.videoSDKToken,
  displayName: '${_profile.firstName} ${_profile.lastName}',
  micEnabled: widget.micEnabled,
  camEnabled: widget.camEnabled,
  maxResolution: 'hd',
  multiStream: true,
  participantId: _userId,
  defaultCameraIndex: 1,
  notification: const NotificationInfo(
    title: "Video SDK",
    message: "Video SDK is sharing screen in the meeting",
    icon: "notification_share", // drawable icon name
  ),
);

// Register meeting events
registerMeetingEvents(room);

and the participant grid UI code:

import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';
import 'package:creatics_mobile/group_sharing/video_sdk/widgets/conference-call-participant-grid/participant_grid_tile.dart';

class ConferenceParticipantGrid extends StatefulWidget {
  final Room meeting;

  const ConferenceParticipantGrid({Key? key, required this.meeting})
      : super(key: key);

  @override
  State<ConferenceParticipantGrid> createState() =>
      _ConferenceParticipantGridState();
}

class _ConferenceParticipantGridState
    extends State<ConferenceParticipantGrid> {
  late Participant localParticipant;
  String? activeSpeakerId;
  String? presenterId;
  int numberofColumns = 1;
  int numberOfMaxOnScreenParticipants = 6;
  String quality = "high";

  Map<String, Participant> participants = {};
  Map<String, Participant> onScreenParticipants = {};

  @override
  void initState() {
    localParticipant = widget.meeting.localParticipant;
    participants.putIfAbsent(localParticipant.id, () => localParticipant);
    participants.addAll(widget.meeting.participants);
    presenterId = widget.meeting.activePresenterId;
    updateOnScreenParticipants();
    // Setting meeting event listeners
    setMeetingListeners(widget.meeting);

    super.initState();
  }

  @override
  void setState(fn) {
    if (mounted) {
      super.setState(fn);
    }
  }

  @override
  Widget build(BuildContext context) {
    return GridView.builder(
        shrinkWrap: true,
        itemCount: onScreenParticipants.length,
        gridDelegate: SliverGridDelegateWithFixedCrossAxisCount(
            crossAxisCount: 2, crossAxisSpacing: 4.0, mainAxisSpacing: 4.0),
        itemBuilder: (BuildContext context, int index) {
          return ParticipantGridTile(
              participant: onScreenParticipants.values.toList()[index],
              quality: quality,
              activeSpeakerId: activeSpeakerId);
        });
  }

  void setMeetingListeners(Room _meeting) {
    // Called when participant joined meeting
    _meeting.on(
      Events.participantJoined,
      (Participant participant) {
        final newParticipants = participants;
        newParticipants[participant.id] = participant;
        setState(() {
          participants = newParticipants;
          updateOnScreenParticipants();
        });
      },
    );

    // Called when participant left meeting
    _meeting.on(
      Events.participantLeft,
      (participantId) {
        final newParticipants = participants;
        newParticipants.remove(participantId);
        setState(() {
          participants = newParticipants;
          updateOnScreenParticipants();
        });
      },
    );

    // Called when the active speaker changes
    _meeting.on(
      Events.speakerChanged,
      (_activeSpeakerId) {
        setState(() {
          activeSpeakerId = _activeSpeakerId;
          updateOnScreenParticipants();
        });
      },
    );

    // Called when the presenter changes
    _meeting.on(Events.presenterChanged, (_presenterId) {
      setState(() {
        presenterId = _presenterId;
        numberOfMaxOnScreenParticipants = _presenterId != null ? 2 : 6;
        updateOnScreenParticipants();
      });
    });

    // Called when speaker is changed
    _meeting.on(Events.speakerChanged, (_activeSpeakerId) {
      setState(() {
        activeSpeakerId = _activeSpeakerId;
      });
    });

    // Shrink the grid while the local participant is screen sharing
    _meeting.localParticipant.on(Events.streamEnabled, (Stream stream) {
      if (stream.kind == "share") {
        setState(() {
          numberOfMaxOnScreenParticipants = 2;
          updateOnScreenParticipants();
        });
      }
    });
    _meeting.localParticipant.on(Events.streamDisabled, (Stream stream) {
      if (stream.kind == "share") {
        setState(() {
          numberOfMaxOnScreenParticipants = 6;
          updateOnScreenParticipants();
        });
      }
    });
  }

  updateOnScreenParticipants() {
    Map<String, Participant> newScreenParticipants = <String, Participant>{};
    // Take at most numberOfMaxOnScreenParticipants participants, in order.
    participants.values
        .toList()
        .sublist(
            0,
            participants.length > numberOfMaxOnScreenParticipants
                ? numberOfMaxOnScreenParticipants
                : participants.length)
        .forEach((participant) {
      newScreenParticipants.putIfAbsent(participant.id, () => participant);
    });
    // Make sure the active speaker is always on screen.
    if (!newScreenParticipants.containsKey(activeSpeakerId) &&
        activeSpeakerId != null) {
      newScreenParticipants.remove(newScreenParticipants.keys.last);
      newScreenParticipants.putIfAbsent(
          activeSpeakerId!,
          () => participants.values
              .firstWhere((element) => element.id == activeSpeakerId));
    }
    // Lower the requested stream quality as more tiles come on screen.
    if (!listEquals(newScreenParticipants.keys.toList(),
        onScreenParticipants.keys.toList())) {
      setState(() {
        onScreenParticipants = newScreenParticipants;
        quality = newScreenParticipants.length > 4
            ? "low"
            : newScreenParticipants.length > 2
                ? "medium"
                : "high";
      });
    }
    if (numberofColumns !=
        (newScreenParticipants.length > 2 || numberOfMaxOnScreenParticipants == 2
            ? 2
            : 1)) {
      setState(() {
        numberofColumns =
            newScreenParticipants.length > 2 || numberOfMaxOnScreenParticipants == 2
                ? 2
                : 1;
      });
    }
  }
}

WhatsApp Image 2023-02-25 at 11 53 19 AM

As you can see in the screenshot:

  1. On Mahadev's device the video is correct, but on Shruti's device Mahadev's video is shown instead, which is wrong.

Here I have joined from only my own device, and I can see myself in two places: once with the correct name and once with an incorrect name.
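
If the swap happens when the grid re-sorts (active speaker changes or participants join/leave), one thing worth trying is giving each tile a stable key so Flutter does not reuse one tile's video renderer for a different participant. A minimal sketch of the idea as a drop-in for the itemBuilder in the grid code above, assuming ParticipantGridTile forwards its key to the framework (the usual StatefulWidget pattern); this is a suggestion, not a confirmed fix:

itemBuilder: (BuildContext context, int index) {
  final participant = onScreenParticipants.values.toList()[index];
  return ParticipantGridTile(
      // Stable key per participant so a tile's video view is not recycled
      // for a different participant when the on-screen list is re-ordered.
      key: ValueKey(participant.id),
      participant: participant,
      quality: quality,
      activeSpeakerId: activeSpeakerId);
}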

Others have no sound and no video when I access the web from iOS

I join the web meeting from three devices: Android, iOS, and macOS. On macOS it works fine only in Chrome; in Safari there is no sound and no video. The same problem occurs when I join from iOS, while Android works fine. Can you please look into this?

No video stream - Camera not enabled

Hi all.
I've been trying to integrate the SDK into my existing app but could not get the video stream. I've also tried this example app and it's not working either. I figured out that the camera is not enabled. When I try to enable the camera, I get this error:

webrtc.Logging(22287): SurfaceTextureHelper: stopListening()
[log] Unable to RTCPeerConnection::createOffer: peerConnectionCreateOffer(): WEBRTC_CREATE_OFFER_ERROR: Session error code: ERROR_CONTENT. Session error description: Failed to set local video description recv parameters for m-section with mid='video'..
[log] #0      Transport._produce
      <asynchronous suspension>
#1      FlexQueue._runTask
      <asynchronous suspension>

When I try to switch between the front and back camera, I get this output:
[log] camera is not enabled!
Flutter doctor output:

[βœ“] Flutter (Channel stable, 3.10.0, on Pop!_OS 22.04 LTS 6.2.6-76060206-generic, locale fr_FR.UTF-8)
[βœ“] Android toolchain - develop for Android devices (Android SDK version 33.0.0-rc2)
[βœ“] Chrome - develop for the web
[βœ“] Linux toolchain - develop for Linux desktop
[βœ“] Android Studio (version 2022.3)
[βœ“] Connected device (3 available)
[βœ“] Network resources

Tested on Samsung T225N (Android 13), Pixel 4XL (Android 13)
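
The underlying failure here looks like the WEBRTC_CREATE_OFFER_ERROR above, which a client-side guard cannot fix; still, the secondary "camera is not enabled!" log when switching cameras can be avoided by only calling changeCam once a local video stream exists. A minimal sketch reusing the names from the example code (the helper name is hypothetical):

void switchCameraIfEnabled(
    Room meeting, List<MediaDeviceInfo> cameras, Stream? videoStream) {
  // Only attempt a camera switch when the local video stream is live;
  // otherwise the SDK just reports "camera is not enabled!".
  if (videoStream == null) {
    log('camera not enabled yet, skipping switch');
    return;
  }
  final newCam = cameras
      .firstWhere((camera) => camera.deviceId != meeting.selectedCamId);
  meeting.changeCam(newCam.deviceId);
}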

Audio quality is not good in Flutter

When I play a YouTube video alongside a video call in my app, I am unable to hear the audio from the video call. Even after reducing the volume of the YouTube player, I still don't get clear audio from the video call.

Screen remains at initializing meeting

Video calls do not progress beyond the screen that says "Initializing meeting". This occurs both in our app and in the example code supplied here.

The same code worked previously, but this issue has persisted for the past 24 to 48 hours.

Console:
I/org.webrtc.Logging(17810): NativeLibrary: Loading native library: jingle_peerconnection_so
I/org.webrtc.Logging(17810): NativeLibrary: Loading library: jingle_peerconnection_so
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
I/org.webrtc.Logging(17810): PeerConnectionFactory: PeerConnectionFactory was initialized without an injected Loggable. Any existing Loggable will be deleted.
I/org.webrtc.Logging(17810): EglBase14Impl: SDK version: 29. isEGL14Supported: true
I/org.webrtc.Logging(17810): EglBase14Impl: Using OpenGL ES version 2
I/org.webrtc.Logging(17810): WebRtcAudioManagerExternal: Sample rate is set to 48000 Hz
I/org.webrtc.Logging(17810): WebRtcAudioManagerExternal: Sample rate is set to 48000 Hz
E/org.webrtc.Logging(17810): JavaAudioDeviceModule: HW NS not supported
I/org.webrtc.Logging(17810): JavaAudioDeviceModule: createAudioDeviceModule
I/org.webrtc.Logging(17810): JavaAudioDeviceModule: HW NS will not be used.
I/org.webrtc.Logging(17810): JavaAudioDeviceModule: HW AEC will be used.
I/org.webrtc.Logging(17810): WebRtcAudioEffectsExternal: ctor@[name=main, id=2]
I/org.webrtc.Logging(17810): WebRtcAudioRecordExternal: ctor@[name=main, id=2]
I/org.webrtc.Logging(17810): WebRtcAudioTrackExternal: ctor@[name=main, id=2]
W/AudioCapabilities(17810): Unsupported mime audio/x-ima
W/AudioCapabilities(17810): Unsupported mime audio/mpeg-L1
W/AudioCapabilities(17810): Unsupported mime audio/mpeg-L2
W/AudioCapabilities(17810): Unsupported mime audio/x-ms-wma
W/VideoCapabilities(17810): Unsupported mime video/wvc1
W/VideoCapabilities(17810): Unsupported mime video/avc-wfd
W/VideoCapabilities(17810): Unsupported mime video/mp43
W/VideoCapabilities(17810): Unrecognized profile/level 1/32 for video/mp4v-es
W/VideoCapabilities(17810): Unrecognized profile/level 32768/2 for video/mp4v-es
W/VideoCapabilities(17810): Unrecognized profile/level 32768/64 for video/mp4v-es
W/VideoCapabilities(17810): Unsupported mime video/wvc1
W/VideoCapabilities(17810): Unsupported mime video/x-ms-wmv7
W/VideoCapabilities(17810): Unsupported mime video/x-ms-wmv8
I/org.webrtc.Logging(17810): WebRtcAudioRecordExternal: enableBuiltInAEC(true)
I/org.webrtc.Logging(17810): WebRtcAudioEffectsExternal: setAEC(true)
I/org.webrtc.Logging(17810): PeerConnectionFactory: onWorkerThreadReady
I/org.webrtc.Logging(17810): PeerConnectionFactory: onSignalingThreadReady
I/org.webrtc.Logging(17810): PeerConnectionFactory: onNetworkThreadReady
D/MediaConstraintsUtils(17810): mandatory constraints are not a map
I/FlutterWebRTCPlugin(17810): getUserMedia(audio): mandatory: [], optional: [sourceId: audio-1]
D/FlutterWebRTCPlugin(17810): MediaStream id: 78fae78f-de4f-4ebf-b8fb-6d9c97b5163b
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getLong(Ljava/lang/Object;J)J (greylist,core-platform-api, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getObject(Ljava/lang/Object;J)Ljava/lang/Object; (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putObject(Ljava/lang/Object;JLjava/lang/Object;)V (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->
putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getLong(Ljava/lang/Object;J)J (greylist,core-platform-api, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->getLong(Ljava/lang/Object;J)J (greylist,core-platform-api, linking, allowed)
I/flutter (17810): request timeout
W/.rimotli.avoca(17810): Accessing hidden method Lsun/misc/Unsafe;->putLong(Ljava/lang/Object;JJ)V (greylist, linking, allowed)
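
Since the log above ends with a plain "request timeout", it may help to attach the error listener before calling join() so any join or connection failure is surfaced rather than leaving the screen stuck. A minimal sketch using the same event wiring shown earlier in this example (the helper name is hypothetical):

void joinWithErrorLogging(Room room) {
  // Register the error handler first so failures during join are not missed.
  room.on(Events.error, (error) {
    log('videosdk error: ${error['name']} :: ${error['message']}');
  });
  room.on(Events.roomJoined, () => log('room joined'));
  room.join();
}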
