zmxv / react-native-sound
React Native module for playing sound clips
License: MIT License
Hello, I have the same problem another user reported before, but on the Android platform. I am using React Native 0.21 and Android API 22.
Could you add support for playing sound from a URL, e.g. http://www.mydomain.com/some/sound1.mp3?
If my filename is something like this:
audioFileName += "/" + timeStamp;
I have no problem with playing/pausing/etc. However, if the filename has an extension, like so:
audioFileName += "/" + timeStamp + ".wav";
I get the following error:
failed to load the sound Object {message: "resource not found", code: -1}
This error is on Android. My device is running Marshmallow.
I would like to play a downloaded mp3 file, but I can't.
The download directory is under the path returned by react-native-fs's ExternalDirectoryPath.
I checked whether the file exists using exists() from react-native-fs, and it returned true.
onError's error argument is below:
{"message":"resource not found","code":"-1"}
I found that this problem is caused by the file extension being trimmed on Android.
On Android, external file paths require the extension; only bundled files are resolved without one.
https://github.com/zmxv/react-native-sound/blob/master/sound.js#L12
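The trimming happens in the sound.js line linked above. A hedged sketch of the distinction being described (resolveFilename is a hypothetical helper for illustration, not part of the library):

```javascript
// Hypothetical helper illustrating the Android behavior discussed above:
// bundled resources are looked up by name without an extension, while
// external file paths must keep their extension.
function resolveFilename(filename, isBundled) {
  if (isBundled) {
    // strip the extension for the resource lookup
    return filename.replace(/\.[^.\/]+$/, '');
  }
  return filename; // external files need the extension kept
}
```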
I'm using react-native-networking module to download a file from a server which saves it to contents://downloads/my_downloads/{id} but sound fails to load the file given the path and file name (id). I am not too familiar with the file system on Android, could there be a permissions issue?
My app plays a short click sound when the user taps a button. I find that if she taps the button more than about 4 times a second, the clicks get dropped.
I was using react-native-audioplayer before and I switched to this one because it was not mixing properly with other sounds on the device. I also like the ability to set the volume with this component.
But with react-native-audioplayer the user can tap much faster and the clicks keep up almost as fast as it is possible to tap the device.
Do you have any ideas on what might be causing the delay? As the readme suggests, I call var sound = new Sound() when the component mounts, then just call sound.play() when the button is tapped.
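One common mitigation for dropped rapid-fire clicks is to preload several instances of the same clip and rotate through them, so a new tap never has to reuse a still-playing instance. A minimal sketch (createPool is illustrative; in a real app the factory would return new Sound(...) instances):

```javascript
// Round-robin pool: rotate across pre-created players so overlapping
// taps each get their own instance instead of being dropped.
function createPool(createPlayer, size) {
  const players = [];
  for (let i = 0; i < size; i++) players.push(createPlayer());
  let next = 0;
  return {
    play() {
      const player = players[next];
      next = (next + 1) % players.length; // advance the rotation
      player.play();
      return player;
    },
  };
}
```

In a React Native app the factory might be () => new Sound('click.wav', Sound.MAIN_BUNDLE, onLoad), with a pool size of 4-8 for fast tapping.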
undefined is not an object (evaluating 'RNSound.setLooping')?
Hey,
I'm trying this out on a clean install of RN 0.29, but unfortunately nothing comes out of my speakers. I tried the demo with an older version of RN and everything worked fine. I also had this working in a previous app using RN 0.27.
Thanks,
Peter
Great job by the way :)
libRNSound.a(RNSound.o)) was built for newer iOS version (9.0) than being linked (8.0)
Can we handle this warning by updating the targeted version for RNSound?
error Object {code: "ENSOSSTATUSERRORDOMAIN2003334207", message: "The operation couldn’t be completed. (OSStatus error 2003334207.)", nativeStackIOS: Array[17], domain: "NSOSStatusErrorDomain"}
It doesn't work if I use an mp3 file in my component folder:
Music/
  music.js
  background1.mp3

playSound() {
  var s = new Sound('background2.mp3', Sound.DOCUMENT, (e) => {
    if (e) {
      console.log('error', e);
    } else {
      console.log('duration', s.getDuration());
      s.play();
    }
  });
}
transforming [======================================= ] 99% 539/546
TypeError: unsupported file type
at lookup (/Users/user/Dev/app/app/node_modules/image-size/lib/index.js:35:9)
at /Users/user/Dev/app/app/node_modules/image-size/lib/index.js:93:22
at /Users/user/Dev/app/app/node_modules/image-size/lib/index.js:50:9
at /Users/user/Dev/app/app/node_modules/graceful-fs/graceful-fs.js:43:10
at FSReqWrap.oncomplete (fs.js:82:15)
It works fine if I put the mp3 file outside the project scope (on my Desktop, for instance) and load it by '/Users/user/Desktop/abc.mp3', but when I move the mp3 file into my project, it throws this error.
I have it working just fine with mp3s, but I can't see why aac files won't play, since you say they should. I've tried both .aac and .m4a with no success. The files have been added to the project and everything else is exactly the same, so it should be working.
Currently trying to load an mp3 from the root directory in my project but am being thrown this error in the react native debugger:
Object {code: 2003334207, message: "The operation couldn’t be completed. (OSStatus error 2003334207.)", nativeStackIOS: Array[16], domain: "NSOSStatusErrorDomain"}
How I'm currently attempting to use react-native-sound:
'use strict';
var React = require('react-native');
var {
  AppRegistry,
  StyleSheet,
  Text,
  Image,
  TouchableHighlight,
  View,
} = React;
var Sound = require('react-native-sound');

var ding = new Sound('ding.mp3', Sound.MAIN_BUNDLE, (error) => {
  if (error) {
    console.log(error);
  }
});

var soundtest = React.createClass({
  _handleDing: function() {
    ding.play();
  },
  render: function() {
    return (
      <View style={styles.container}>
        <TouchableHighlight onPress={this._handleDing}>
          <View style={{width: 100, height: 100, padding: 20, backgroundColor: 'red', borderRadius: 50}}>
          </View>
        </TouchableHighlight>
      </View>
    );
  }
});

AppRegistry.registerComponent('soundtest', () => soundtest);
My guess is that this has something to do with the react native/iOS file system. I've tried multiple mp3s, so I'm relatively sure it's not the file itself, but perhaps where I'm storing it. Right now it is in the root directory of my react native project (same as index.ios.js).
I'm running:
"react-native": "^0.17.0",
"react-native-sound": "^0.7.3"
Any ideas?
Why is there no support for separate left and right channels on Android?
I download mp3 files from a remote source with react-native-fetch-blob and store them in the local filesystem with react-native-fs.
However, the data is then stored base64-encoded.
Is it possible to play these files with react-native-sound? Or how can I decode the base64 into raw binary data?
That would be great.
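If the payload really is base64 text on disk, it has to be decoded to raw bytes before a player can read it. A minimal decoding sketch (Buffer here is Node's; it may also be worth checking whether react-native-fs can write with a base64 encoding so the file lands as binary in the first place):

```javascript
// Decode a base64 string into raw bytes.
function base64ToBytes(b64) {
  return Uint8Array.from(Buffer.from(b64, 'base64'));
}
```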
Is it possible to control sound played using this library via the iOS control center (the menu that is dragged from the bottom)?
Would it be possible to add a method for controlling tempo of sound file?
In some cases it may be necessary to load a file ad hoc and play it as soon as it has loaded. It is currently impossible to do this without convoluted polling of isLoaded() inside a recurring setTimeout call.
This problem could be solved by providing a functional API that passes the sound instance to a callback once it has finished loading.
So instead of doing:
const sound = new Sound(...);
You'd have something like:
Sound.load(filename, Sound.MAIN_BUNDLE, (sound, error) => {
  // Do whatever you want with sound and error here
});
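The same effect can also be approximated in user land today by wrapping the existing constructor in a Promise. A hedged sketch, where loadSound is hypothetical and the injected SoundCtor is assumed to follow the library's (filename, basePath, callback) signature with an asynchronous load callback:

```javascript
// Promise wrapper over a (filename, basePath, callback)-style constructor,
// resolving with the loaded sound instance.
function loadSound(SoundCtor, filename, basePath) {
  return new Promise((resolve, reject) => {
    const sound = new SoundCtor(filename, basePath, (error) => {
      if (error) reject(error);
      else resolve(sound); // safe because the callback fires asynchronously
    });
  });
}
```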
I would like to retrieve the current time directly and synchronously, without passing a callback. Do you think that would be feasible?
Thanks in advance,
I load the JS bundle from a remote HTTP server, and the app can't play the local sound file. If I switch to a local JS bundle, it works fine.
In this case the error argument in the callback is null, but there is just no sound. If I change the file name to a non-existent file, the error argument is an error.
I have no problems playing/manipulating the sound file on iOS, but the sound comes out of the receiver (earpiece), so it's not practical. I am using the latest version of this module, testing on an iPhone 5s.
Hi,
I tried installing on Android. It builds successfully, but I keep getting the error
"undefined is not an object (evaluating 'RNSound.IsAndroid')"
on the emulator. Can anyone help me with this issue?
Hey, thanks for great plugin - it's awesome!
Do you know if it is possible to play sound while the app is in the background using this package?
Also, for one app I will need to be able to fade one file in while another fades out.
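Cross-fading can be built on top of setVolume, which react-native-sound instances expose. A hedged sketch of a single linear fade step that a setInterval would drive (crossfadeStep is illustrative, not a library API):

```javascript
// One step of a linear crossfade: lowers one player's volume while
// raising the other's. Drive it from setInterval in a real app.
function crossfadeStep(fadeOut, fadeIn, step, totalSteps) {
  const t = step / totalSteps;
  fadeOut.setVolume(1 - t);
  fadeIn.setVolume(t);
  return step >= totalSteps; // true once the fade has completed
}
```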
Hello.
I tried to call .play(), but it does not work and gives no error. I tried wav and mp3 files. Also, .duration and .numberOfChannels return undefined; when I changed them to ._duration and ._numberOfChannels, I got the duration and -1 for numberOfChannels.
For the step below from README.md, I got errors when I went ahead and edited my MainActivity.java; the getPackages() function was not there to begin with. Instead, the function was in MainApplication.java in the same directory: android/app/src/main/java/.../MainApplication.java. I thought you might want to cover that in the readme.
Here's that step:
Edit android/app/src/main/java/.../MainActivity.java to register the native module:
...
import com.zmxv.RNSound.RNSoundPackage; // <-- New
...
public class MainActivity extends ReactActivity {
  ...
  @Override
  protected List<ReactPackage> getPackages() {
    return Arrays.<ReactPackage>asList(
      new MainReactPackage(),
      new RNSoundPackage() // <-- New
    );
  }
Hi!
Could you enable a switch for playing the audio in silent mode?
http://stackoverflow.com/a/12868879/528675
Great job with the library so far!
-Albert
I have a weird issue, the same as described here.
The loop parameter is not taken into account until the first play, even though the numberOfLoops property is correct (I logged it just before calling play).
The temporary workaround I found JS side (without touching the library) is:
const sound = new Sound('alert.m4a', Sound.MAIN_BUNDLE, () => {
  sound.setNumberOfLoops(-1);
  sound.play(() => {
    sound.play();
  });
});
The second play takes the numberOfLoops property into account. In the SO post, the OP found a native workaround by replacing the player. Maybe it's worth looking into.
I'm using react-native 0.20.0 and react-native-sound 0.7.3. The audio only plays once no matter which number of loops was given.
Hi there,
Just trying to implement a basic example with your component. Everything seems fine until I call play(), and then nothing happens. I've tried delaying the call with setTimeout(), but no luck.
Here's my code:
componentDidMount: function() {
  var snd_testShort = new Sound('mymp3.mp3', Sound.MAIN_BUNDLE, (error) => {
    if (error) {
      console.log('snd_testShort - failed to load the sound', error);
    } else {
      // all the file details come back and are logged...
      console.log('snd_testShort - duration in seconds: ' + snd_testShort._duration +
        ' number of channels: ' + snd_testShort._numberOfChannels);
      // the below line fires
      console.log("snd_testShort - play attempt");
      // I get nothing back from the play() call - I can breakpoint in Xcode
      // and the play method there fires without issue
      snd_testShort.play((test) => {
        console.log("snd_testShort - TEST");
        if (test) {
          console.log('snd_testShort - successfully finished playing');
        } else {
          console.log('snd_testShort - playback failed due to audio decoding errors');
        }
      });
    }
  });
},
As stated, in Xcode I can step through the following in RNSound (seemingly) without issue:
RCT_EXPORT_METHOD(play:(nonnull NSNumber*)key withCallback:(RCTResponseSenderBlock)callback) {
  AVAudioPlayer* player = [self playerForKey:key];
  if (player) {
    [[self callbackPool] setObject:[callback copy] forKey:key];
    [player play];
  }
}
The error is regarding the following code in sound.js:
var RNSound = require('react-native').NativeModules.RNSound;
var IsAndroid = RNSound.IsAndroid;
I have followed the instructions exactly as outlined for IOS, so I have no idea why I am getting this error.
This module works perfectly on Android.
On the AVD emulator, the "stop" method doesn't work properly.

import Sound from 'react-native-sound'

const SOUND_FILE_NAME = 'sound.mp3'
const sound = new Sound(SOUND_FILE_NAME, Sound.MAIN_BUNDLE, (error) => {
  if (error) {
    console.log('cant load sound', SOUND_FILE_NAME)
  }
})

// ok
sound.play()

// this leaves the sound playing!!
sound.stop()
Hey, I'm the author of react-native-audio.
After some discussion here: jsierles/react-native-audio#74 (comment), it seems like a good idea to integrate these two libraries.
Would you be interested? If so, I can look at a PR to add iOS recording support here.
I just want to report this exception we've encountered. Maybe others are running into it too and can give better feedback than I can.
This exception isn't directly reproducible, but it occurs from time to time when multiple sounds are played one after another in our app.
Versions:
"react-native": "0.26.2",
"react-native-sound": "^0.8.3",
The Exception:
Fatal Exception: NSGenericException
Collection <__NSDictionaryM: 0x16018f740> was mutated while being enumerated.
Callstack (small list):
-[RNSound keyForPlayer:]
-[RNSound audioPlayerDidFinishPlaying:successfully:]
The whole Callstack (taken from Crashlytics):
# Crashlytics - plaintext stacktrace downloaded by Hagen Hübel at Tue, 09 Aug 2016 14:57:40 GMT
# Platform: ios
# Version: 1.18 (73)
# Issue #: 6
# Issue ID: 57a9ee07ffcdc04250571b51
# Session ID: 82cfb8bd763a4678879093efec533e8b
# Date: 2016-08-09T14:51:31Z
# OS Version: 9.3.2 (13F69)
# Device: iPhone 6
# RAM Free: 7.3%
# Disk Free: 3.8%
#0. Crashed: com.twitter.crashlytics.ios.exception
0 MyApp 0x1000df788 CLSProcessRecordAllThreads + 4296570760
1 MyApp 0x1000df788 CLSProcessRecordAllThreads + 4296570760
2 MyApp 0x1000df644 CLSProcessRecordAllThreads + 4296570436
3 MyApp 0x1000cfe04 CLSHandler + 4296506884
4 MyApp 0x1000dd85c __CLSExceptionRecord_block_invoke + 4296562780
5 libdispatch.dylib 0x18311947c _dispatch_client_callout + 16
6 libdispatch.dylib 0x183124728 _dispatch_barrier_sync_f_invoke + 100
7 MyApp 0x1000dd300 CLSExceptionRecord + 4296561408
8 MyApp 0x1000dd134 CLSExceptionRecordNSException + 4296560948
9 MyApp 0x1000dcd54 CLSTerminateHandler() + 4296559956
10 libc++abi.dylib 0x182d26f44 std::__terminate(void (*)()) + 16
11 libc++abi.dylib 0x182d26b10 __cxa_rethrow + 144
12 libobjc.A.dylib 0x182d34120 objc_exception_rethrow + 44
13 CoreFoundation 0x1835accf8 CFRunLoopRunSpecific + 552
14 GraphicsServices 0x184e94088 GSEventRunModal + 180
15 UIKit 0x188896088 UIApplicationMain + 204
16 MyApp 0x10004bd50 main (main.m:16)
17 libdispatch.dylib 0x18314a8b8 (Missing)
--
Fatal Exception: NSGenericException
0 CoreFoundation 0x1836cedb0 __exceptionPreprocess
1 libobjc.A.dylib 0x182d33f80 objc_exception_throw
2 CoreFoundation 0x1836ce7e4 -[NSException name]
3 CoreFoundation 0x183607c64 -[NSDictionary allKeysForObject:]
4 MyApp 0x100120210 -[RNSound keyForPlayer:]
5 MyApp 0x100120380 -[RNSound audioPlayerDidFinishPlaying:successfully:]
6 Foundation 0x1840a402c __NSThreadPerformPerform
7 CoreFoundation 0x18368509c __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__
8 CoreFoundation 0x183684b30 __CFRunLoopDoSources0
9 CoreFoundation 0x183682830 __CFRunLoopRun
10 CoreFoundation 0x1835acc50 CFRunLoopRunSpecific
11 GraphicsServices 0x184e94088 GSEventRunModal
12 UIKit 0x188896088 UIApplicationMain
13 MyApp 0x10004bd50 main (main.m:16)
14 libdispatch.dylib 0x18314a8b8 (Missing)
#1. com.apple.libdispatch-manager
0 libsystem_kernel.dylib 0x1832694d8 kevent_qos + 8
1 libdispatch.dylib 0x18312c7d8 _dispatch_mgr_invoke + 232
2 libdispatch.dylib 0x18311b648 _dispatch_source_invoke + 50
#2. com.twitter.crashlytics.ios.MachExceptionServer
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 MyApp 0x1000cac64 CLSMachExceptionServer + 4296485988
3 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
4 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
5 libsystem_pthread.dylib 0x183331028 thread_start + 4
#3. com.apple.NSURLConnectionLoader
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 CoreFoundation 0x183684c60 __CFRunLoopServiceMachPort + 196
3 CoreFoundation 0x183682964 __CFRunLoopRun + 1032
4 CoreFoundation 0x1835acc50 CFRunLoopRunSpecific + 384
5 CFNetwork 0x183d2dc68 +[NSURLConnection(Loader) _resourceLoadLoop:] + 412
6 Foundation 0x1840a3e4c __NSThread__start__ + 1000
7 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
8 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
9 libsystem_pthread.dylib 0x183331028 thread_start + 4
#4. com.facebook.react.JavaScript
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 CoreFoundation 0x183684c60 __CFRunLoopServiceMachPort + 196
3 CoreFoundation 0x183682964 __CFRunLoopRun + 1032
4 CoreFoundation 0x1835acc50 CFRunLoopRunSpecific + 384
5 MyApp 0x100093094 +[RCTJSCExecutor runRunLoopThread] + 4296257684
6 Foundation 0x1840a3e4c __NSThread__start__ + 1000
7 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
8 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
9 libsystem_pthread.dylib 0x183331028 thread_start + 4
#5. JavaScriptCore::Marking
0 libsystem_kernel.dylib 0x183267f24 __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x183332ce8 _pthread_cond_wait + 648
2 libc++.1.dylib 0x182cbf42c std::__1::condition_variable::wait(std::__1::unique_lock<std::__1::mutex>&) + 56
3 JavaScriptCore 0x1870812cc JSC::GCThread::waitForNextPhase() + 144
4 JavaScriptCore 0x187081364 JSC::GCThread::gcThreadMain() + 84
5 JavaScriptCore 0x186d56f14 WTF::threadEntryPoint(void*) + 212
6 JavaScriptCore 0x186d56e24 WTF::wtfThreadEntryPoint(void*) + 24
7 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
8 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
9 libsystem_pthread.dylib 0x183331028 thread_start + 4
#6. GAIThread
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 CoreFoundation 0x183684c60 __CFRunLoopServiceMachPort + 196
3 CoreFoundation 0x183682964 __CFRunLoopRun + 1032
4 CoreFoundation 0x1835acc50 CFRunLoopRunSpecific + 384
5 Foundation 0x183fbccfc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 308
6 Foundation 0x184012030 -[NSRunLoop(NSRunLoop) run] + 88
7 MyApp 0x100142378 +[GAI threadMain:] + 4296975224
8 Foundation 0x1840a3e4c __NSThread__start__ + 1000
9 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
10 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
11 libsystem_pthread.dylib 0x183331028 thread_start + 4
#7. AVAudioSession Notify Thread
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 CoreFoundation 0x183684c60 __CFRunLoopServiceMachPort + 196
3 CoreFoundation 0x183682964 __CFRunLoopRun + 1032
4 CoreFoundation 0x1835acc50 CFRunLoopRunSpecific + 384
5 libAVFAudio.dylib 0x189d259e0 GenericRunLoopThread::Entry(void*) + 164
6 libAVFAudio.dylib 0x189cfa75c CAPThread::Entry(CAPThread*) + 84
7 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
8 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
9 libsystem_pthread.dylib 0x183331028 thread_start + 4
#8. com.apple.CFSocket.private
0 libsystem_kernel.dylib 0x183268344 __select + 8
1 CoreFoundation 0x18368b1c8 __CFSocketManager + 648
2 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
3 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
4 libsystem_pthread.dylib 0x183331028 thread_start + 4
#9. com.apple.coreaudio.AQClient
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 CoreFoundation 0x183684c60 __CFRunLoopServiceMachPort + 196
3 CoreFoundation 0x183682964 __CFRunLoopRun + 1032
4 CoreFoundation 0x1835acc50 CFRunLoopRunSpecific + 384
5 AudioToolbox 0x185ec89f4 GenericRunLoopThread::Entry(void*) + 164
6 AudioToolbox 0x185ebad70 CAPThread::Entry(CAPThread*) + 124
7 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
8 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
9 libsystem_pthread.dylib 0x183331028 thread_start + 4
#10. Thread
0 libsystem_kernel.dylib 0x183268b48 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x183331530 _pthread_wqthread + 1284
2 libsystem_pthread.dylib 0x183331020 start_wqthread + 4
#11. com.facebook.react.RNSoundQueue
0 libsystem_kernel.dylib 0x18324cfd8 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18324ce54 mach_msg + 72
2 AudioToolbox 0x185ed0fd4 AQClient_Start + 164
3 AudioToolbox 0x185ed0eb4 AudioQueueStart + 216
4 libAVFAudio.dylib 0x189cf2648 AVAudioPlayerCpp::playQueue(AudioTimeStamp const*) + 288
5 libAVFAudio.dylib 0x189cf0f94 AVAudioPlayerCpp::play() + 76
6 libAVFAudio.dylib 0x189cefc70 -[AVAudioPlayer play] + 52
7 MyApp 0x100120c58 -[RNSound play:withCallback:] + 4296838232
8 CoreFoundation 0x1836d4a60 __invoking___ + 144
9 CoreFoundation 0x1835cc488 -[NSInvocation invoke] + 284
10 CoreFoundation 0x1835d0db0 -[NSInvocation invokeWithTarget:] + 60
11 MyApp 0x100079e14 -[RCTModuleMethod invokeWithBridge:module:arguments:] + 4296154644
12 MyApp 0x10009b35c -[RCTBatchedBridge _handleRequestNumber:moduleID:methodID:params:] + 4296291164
13 MyApp 0x10009ad50 __33-[RCTBatchedBridge handleBuffer:]_block_invoke.319 + 4296289616
14 libdispatch.dylib 0x1831194bc _dispatch_call_block_and_release + 24
15 libdispatch.dylib 0x18311947c _dispatch_client_callout + 16
16 libdispatch.dylib 0x1831254c0 _dispatch_queue_drain + 864
17 libdispatch.dylib 0x18311cf80 _dispatch_queue_invoke + 464
18 libdispatch.dylib 0x18311947c _dispatch_client_callout + 16
19 libdispatch.dylib 0x183127914 _dispatch_root_queue_drain + 2140
20 libdispatch.dylib 0x1831270b0 _dispatch_worker_thread3 + 112
21 libsystem_pthread.dylib 0x183331470 _pthread_wqthread + 1092
22 libsystem_pthread.dylib 0x183331020 start_wqthread + 4
#12. Thread
0 libsystem_kernel.dylib 0x183268b48 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x183331530 _pthread_wqthread + 1284
2 libsystem_pthread.dylib 0x183331020 start_wqthread + 4
#13. Thread
0 libsystem_kernel.dylib 0x18326841c __semwait_signal + 8
1 libsystem_c.dylib 0x18318522c nanosleep + 212
2 libc++.1.dylib 0x182cfd3b4 std::__1::this_thread::sleep_for(std::__1::chrono::duration<long long, std::__1::ratio<1l, 1000000000l> > const&) + 84
3 JavaScriptCore 0x1872ce690 bmalloc::Heap::scavenge(std::__1::unique_lock<bmalloc::StaticMutex>&, std::__1::chrono::duration<long long, std::__1::ratio<1l, 1000l> >) + 188
4 JavaScriptCore 0x1872ce340 bmalloc::Heap::concurrentScavenge() + 84
5 JavaScriptCore 0x1872d0ad8 bmalloc::AsyncTask<bmalloc::Heap, void (bmalloc::Heap::*)()>::entryPoint() + 100
6 JavaScriptCore 0x1872d0a68 bmalloc::AsyncTask<bmalloc::Heap, void (bmalloc::Heap::*)()>::pthreadEntryPoint(void*) + 12
7 libsystem_pthread.dylib 0x183333b28 _pthread_body + 156
8 libsystem_pthread.dylib 0x183333a8c _pthread_body + 154
9 libsystem_pthread.dylib 0x183331028 thread_start + 4
#14. Thread
0 libsystem_kernel.dylib 0x183268b48 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x183331530 _pthread_wqthread + 1284
2 libsystem_pthread.dylib 0x183331020 start_wqthread + 4
As of react-native 0.14 (https://facebook.github.io/react-native/docs/images.html#static-image-resources), images and other assets can be bundled in the JS bundle and used by simply requiring them (e.g. <Image source={require('./my-icon.png')} />
). Is it possible to play sounds that are in the JS bundle using react-native-sound?
Example:
let sound = new Sound('filename', Sound.JS_BUNDLE, (error, props) => {
  sound.play()
})
// or
let sound = new Sound(require('./filename'), (error, props) => {
  sound.play()
})
Maybe take inspiration from the Cordova plugins?
For iOS:
https://github.com/apache/cordova-plugin-media/blob/master/src/ios/CDVSound.m
For Android:
https://github.com/apache/cordova-plugin-media/tree/master/src/android
Based on Apple's documentation, implementing this feature requires switching from AVAudioPlayer to AVPlayer.
@lostintangent did you see my comment on your PR? I wonder if you could please take a look?
thanks
16:41:56.198 ERROR: 404: error -66680
16:41:56.198 ERROR: >aq> 1600: failed (-66680); will stop (11025/0 frames)
Is it supported on iOS 8?
I have a component that loads a sound in its constructor and then plays it in some click handlers.
this.beepOn = new Sound('beep_short_on.wav', Sound.MAIN_BUNDLE, (error) => console.log(error));
However, I only hear the sound the first time this.beepOn.play() is called.
I've even tried stopping it at the end: this.beepOn.play(() => this.beepOn.stop())
Does anyone know why this might be happening?
react-native 0.30.0
react-native-sound: ^0.8.3
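A workaround sometimes suggested for a player that only fires once is to rewind before each play, so a player left at the end of the clip starts over. A hedged sketch (setCurrentTime is part of the react-native-sound API; whether this fixes this particular case is untested):

```javascript
// Rewind-then-play: a player that has finished may be stuck at the end
// of the clip, so seek back to 0 before each play.
function replay(sound) {
  sound.setCurrentTime(0);
  sound.play();
}
```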
I can get my sounds and the demo app sounds to work on the iOS simulator, but when I build either app to my phone I get no sound. There are no errors, and the app recognizes the mp3 file because I get its length just fine with getDuration(). Here is some of the code; what am I missing?
class MusicPlayer extends Component {
  constructor() {
    super();
    this.playSong = this.playSong.bind(this);
    this.pauseSong = this.pauseSong.bind(this);
    this.changeVolume = this.changeVolume.bind(this);
    this.backPress = this.backPress.bind(this);
    this.tick = this.tick.bind(this);
    this.state = {
      volume: .5,
      isPlaying: false,
      songLength: 0,
      currentTime: 0,
      interval: null,
      error: null
    }
  }
  componentWillMount() {
    var song = new Sound('collide.mp3', Sound.MAIN_BUNDLE, (error) => {
      if (error) {
        console.log('failed to load the sound', error);
        this.setState({
          error: error.message
        })
      } else { // loaded successfully
        console.log('duration in seconds: ' + song.getDuration() +
          ' number of channels: ' + song.getNumberOfChannels());
        this.setState({
          volume: .5,
          song: song,
          isPlaying: false,
          songLength: song.getDuration(),
          currentTime: 0,
          interval: null,
          error: null
        })
      }
    });
  }
  playSong() {
    this.state.song.play();
    this.setState({
      isPlaying: true,
      interval: setInterval(this.tick, 1000)
    })
  }
  pauseSong() {
    this.state.song.pause();
    this.setState({
      isPlaying: false,
      interval: clearInterval(this.state.interval)
    })
  }
  tick() {
    this.state.song.getCurrentTime((seconds) => {
      this.setState({
        currentTime: seconds
      })
    })
  }
There are some issues with the docs there:
import com.facebook.react.ReactPackage;
import com.facebook.react.shell.MainReactPackage;
import java.util.*;
There is no getPackages on ReactActivity as of RN 0.31, so the @Override should be removed.
I think this is caused by my old device while playing mp3; it finally works with a wav file. My device is an N7100 running CyanogenMod 12.1.
Below is error Log.
E/ExtMediaPlayer-JNI(28234): QCMediaPlayer could not be located....
E/MediaPlayer-JNI(28234): QCMediaPlayer mediaplayer NOT present
E/FFmpegExtractor( 1971): android-source:0x410646e0: avformat_open_input failed, err:Invalid data found when processing input
E/FFmpegExtractor( 1971): android-source:0x410646e0|file:/data/app/com.nwrnlayout-1/base.apk: avformat_open_input failed, err:Invalid data found when processing input
A volume bug was introduced in 0.7.3 by the isLoaded check, and we had to do the following to work around it:
class FixedSound extends Sound {
  play(...args) {
    this.setVolume(this.getVolume());
    return super.play(...args);
  }
}
Any chance to fix it soon?
Hello, I downloaded an mp3 file using react-native-fs. The file path on my Android device is '/data/data/com.testapp17/files/sample.mp3', but I am unable to play the file from that path.
I am using Android 5.0.1 and react-native 0.19.
After creating the object to access my file:
const notifySound = new Sound('notify.aac', Sound.MAIN_BUNDLE, (error) => {
I got the error { message: 'resource not found', code: -1 }.
I followed all the steps in the documentation, but then logged Sound.MAIN_BUNDLE with console.log, and it's undefined:
{ [Function: Sound]
I/ReactNativeJS( 9413): enable: [Function],
I/ReactNativeJS( 9413): MAIN_BUNDLE: undefined,
I/ReactNativeJS( 9413): DOCUMENT: undefined,
I/ReactNativeJS( 9413): LIBRARY: undefined,
I/ReactNativeJS( 9413): CACHES: undefined }
which comes from the sound.js file, specifically this code:
Sound.MAIN_BUNDLE = RNSound.MainBundlePath;
Sound.DOCUMENT = RNSound.NSDocumentDirectory;
Sound.LIBRARY = RNSound.NSLibraryDirectory;
Sound.CACHES = RNSound.NSCachesDirectory;
Solution: in the end, I put my files into android/app/src/main/res/raw, but it would be nice to be able to put the file directly in my service folder.
Thanks!