
fdwaveformview's Introduction

FDWaveformView

FDWaveformView is an easy way to display an audio waveform in your app. It is a nice visualization to show a playing audio file or to select a position in a file.

🐣 Virtual tip jar: https://amazon.com/hz/wishlist/ls/EE78A23EEGQB

Usage

To use it, add an FDWaveformView using Interface Builder or programmatically and then load your audio as in the example below. Note: if your audio file does not have a file extension, see this SO question.

let thisBundle = Bundle(for: type(of: self))
let url = thisBundle.url(forResource: "Submarine", withExtension: "aiff")
self.waveform.audioURL = url

Features

Set play progress to highlight part of the waveform:

self.waveform.progressSamples = self.waveform.totalSamples / 2

Zoom in to show only part of the waveform; as you zoom, the view smoothly re-renders to show progressively more detail:

self.waveform.zoomStartSamples = 0
self.waveform.zoomEndSamples = self.waveform.totalSamples / 4

Enable gestures for zooming in, panning around or scrubbing:

self.waveform.doesAllowScrubbing = true
self.waveform.doesAllowStretch = true
self.waveform.doesAllowScroll = true

Supports animation for changing properties:

UIView.animate(withDuration: 0.3) {
    let randomNumber = Int.random(in: 0..<self.waveform.totalSamples)
    self.waveform.progressSamples = randomNumber
}

Creates antialiased waveforms by drawing more pixels than are seen on screen. Also, if the view is resized (for example via Auto Layout), it will render more detail if necessary to avoid pixelation.

Supports iOS 12+ and Swift 5.

Includes unit tests (TODO: run these on GitHub Actions).

Installation

Add this to your project using Swift Package Manager. In Xcode, that is simply File > Swift Packages > Add Package Dependency... and you're done. Alternative installation options for legacy projects are shown below.
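
For projects that declare dependencies in a Package.swift manifest instead, a minimal sketch is shown below; the repository URL and version requirement are assumptions, so adjust them to the release you actually want.

// swift-tools-version:5.1
// Sketch of a Package.swift manifest; the URL and version below are assumptions.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v12)],
    dependencies: [
        .package(url: "https://github.com/fulldecent/FDWaveformView.git", from: "5.0.0")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["FDWaveformView"])
    ]
)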

Contributing

fdwaveformview's People

Contributors

andreamazz, digitalfx, evandcoleman, fulldecent, jackyoustra, jakobsa, janx2, jonandersen, mheist09, msching, ospr, pixlwave, pjay, protikhonov, reedom, rflex, rvetas, saiday, simonbs, solomon23, steryokhin, yaroslav-zhurakovskiy


fdwaveformview's Issues

`maximum` not actually used

I think you might have a small oversight in the downsampling code:

if (tallyCount == downsampleFactor) {
    sample = tally / tallyCount;
    maximum = maximum > sample ? maximum : sample;
    [fullSongData appendBytes:&sample length:sizeof(sample)];
    tally = 0;
    tallyCount = 0;
    outSamples++;
}

maximum is never actually used for anything here or above. Should the append line be amended to the following?

[fullSongData appendBytes:&maximum length:sizeof(maximum)];
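
For illustration, here is a minimal Swift sketch (hypothetical; not the library's actual code) of a per-bucket downsampler that appends each bucket's maximum rather than its average, which is roughly what the suggestion above would produce if maximum were also reset per bucket.

// Hypothetical sketch: downsample by keeping each bucket's peak instead of its average,
// so short loud spikes are not averaged away. Not FDWaveformView's actual implementation.
func downsample(_ samples: [Float], by downsampleFactor: Int) -> [Float] {
    var output: [Float] = []
    var bucketMax: Float = -.greatestFiniteMagnitude
    var tallyCount = 0
    for sample in samples {
        bucketMax = max(bucketMax, sample)
        tallyCount += 1
        if tallyCount == downsampleFactor {
            output.append(bucketMax)   // append the peak for this bucket
            bucketMax = -.greatestFiniteMagnitude
            tallyCount = 0
        }
    }
    return output
}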

Not drawing

Version 1.0.1 does not draw the waveform when trying to play a sound shorter than 2 seconds. There seems to be something wrong with the frame calculation, since the image frame printed to the console is short by a sliver of width:

(9.0, 6.0, 84.0, 30.0) -- (-0.00151425, 0.0, 0.00151425, 30.0)

The weird thing is that it is the same value as the x position.

With version 0.3.1 it works fine.

Auto layout in example app [help needed]

The example app was supposed to have Auto Layout, but I could not understand it and removed it all.

Would someone like to help place the buttons automatically so the layout works well in either orientation and on different device sizes? (A sketch of the kind of code involved follows below.)
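
For reference, a minimal sketch of the kind of constraint code that would pin a control programmatically; the playButton outlet name is a placeholder, and the example app may prefer storyboard constraints instead.

// Hypothetical sketch: pin a button to the safe area so it adapts to orientation
// and device size. "playButton" is a placeholder outlet name.
playButton.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    playButton.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor, constant: 16),
    playButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -16)
])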

Gestures

Any chance of implementing both Pinch-to-Zoom and touch to setProgressSamples?

Fix OGG File

While profiling the app, I noticed that I was getting an error: "FDWaveformView failed to load AVAssetTrack". This error was being generated when trying to load an OGG file.

Audio from Video Files

I would like to take a video file that was created using the camera and microphone, send your code the URL to that file, and have it draw the audio waveform. I have implemented this and it appears to run with no issues, but nothing is drawn into the view. What am I missing?

Add tests

Should have something like:

lines = cd_stats[:total_lines_of_code].to_f
expectations = cd_stats[:total_test_expectations].to_f
0.045 < (expectations / lines)

Profile performance

Profile the performance of the library on different actual hardware.

Maybe a wiki page with a table in it?


Hey Will.

I'm not sure exactly what you want in terms of specs, but I am running iOS 7.1.2 on an iPhone 4S.

Something I found surprising was that changing the overdraw defines to 1 for all values hardly affected the performance at all -- at most it improved it by 1-2 seconds. Is that expected behavior? I know that was a suggestion you gave to improve performance, so I was surprised it didn't have more of an effect.

A benchmark would definitely be useful. I don't have access to anything better than a 4S right now, so it'd be great to have some way to measure its performance relative to a better phone.

Thanks much,
Dennis

Won't render a short track

I have a track that's 5 seconds long and it won't render a waveform for it. It gets stuck in a loop between layoutSubviews and renderAsset. This check at line 181 is causing the problem:

if (self.image.image.size.width < self.frame.size.width * [UIScreen mainScreen].scale * horizontalMinimumOverdraw)
    needToRender = YES;

That always causes it to re-render. If I remove that check, it works fine.

M4A format compatibility

I'm no audio expert by any means, so this might be a silly question: is FDWaveformView supposed to be compatible with M4A files?

When I try to set the audioURL to my M4A file (local file, not streamed) the code crashes as the "tracks" property is an empty array.

--Joao

renderPNGAudioPictogramLogForAsset sometimes fails

I found some strange behavior using FDWaveformView in the simulator (iOS 7.1 and 8). About 1/3 of the time, a call to layoutSubviews is made before the image is loaded; it's not the call initiated from setAudioURL but from somewhere else (initial load or something?).

So it ends up calling renderPNGAudioPictogramLogForAsset with endSamples = 0
and this call

    CMSampleBufferRef sampleBufferRef = [trackOutput copyNextSampleBuffer];

hangs forever; also the renderingInProgress flag never gets set to false so no further updates ever get carried out.

At the beginning of layoutSubviews I added

if (!self.asset || self.renderingInProgress || self.zoomEndSamples == 0)
    return;

and it always works OK. I think zoomEndSamples will never be 0 in a normal situation, and from what I could find, copyNextSampleBuffer can hang if it's sent an empty time range.

FDWaveformView.bundle" failed: No such file or directory

Hi,

I tried to build a project in which I installed FDWaveformView from CocoaPods, and I got a missing file error:

/Users/joey/Documents/joeycodes/MacDevelop/joey/apps/Firecord/Pods/${BUILT_PRODUCTS_DIR}/FDWaveformView.bundle
building file list ... rsync: link_stat "/Users/joey/Documents/joeycodes/MacDevelop/joey/apps/Firecord/Pods/${BUILT_PRODUCTS_DIR}/FDWaveformView.bundle" failed: No such file or directory (2)
done

sent 29 bytes received 20 bytes 98.00 bytes/sec
total size is 0 speedup is 0.00
rsync error: some files could not be transferred (code 23) at /SourceCache/rsync/rsync-45/rsync/main.c(992) [sender=2.6.9]
Command /bin/sh failed with exit code 23

Please help.

Slow rendering on slow devices

I have an iPhone 4, and I often need to render 30-minute audio files.
I have a 28:29 file, an .m4a at 13.07 MB.
When testing it on my phone, the log says the rendering time is 136.788682 seconds.
Ideas?

Multiple colors of Progress Samples

I managed to get the waveform to use two separate wavesColor values, but I haven't managed to create multi-colored progressSamples.

For example, if I want the middle 30 samples to be gold (regardless of whether they are included in the progressSamples or not), but the wavesColor to be black and the progressColor to be blue -- how would this be done?

Thank you. I really like the project.
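
There is no built-in API for a third color; one generic workaround is to overlay a second, differently tinted waveform view and mask it to the range to be highlighted. A rough sketch follows, assuming the audio has already loaded so totalSamples is nonzero (none of this is part of FDWaveformView itself).

// Hypothetical sketch, not an FDWaveformView feature: overlay a second waveform view
// tinted gold and mask it to the middle 30 samples.
let goldRange = (waveform.totalSamples / 2 - 15)...(waveform.totalSamples / 2 + 15)
let goldOverlay = FDWaveformView(frame: waveform.frame)
goldOverlay.audioURL = waveform.audioURL
goldOverlay.wavesColor = .orange

// Convert the sample range to an x range and mask the overlay to it.
let x = CGFloat(goldRange.lowerBound) / CGFloat(waveform.totalSamples) * waveform.bounds.width
let width = CGFloat(goldRange.count) / CGFloat(waveform.totalSamples) * waveform.bounds.width
let mask = CALayer()
mask.backgroundColor = UIColor.black.cgColor
mask.frame = CGRect(x: x, y: 0, width: width, height: waveform.bounds.height)
goldOverlay.layer.mask = mask
waveform.superview?.addSubview(goldOverlay)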

Issue in change sound waves single color to Gradient color

Hello Developer,

I am using this code to create sound waves in my project and it works perfectly fine. But per my requirements, I want to change the wave color from a single color to a gradient, like this:
http://www.sciencealert.com/images/articles/processed/sound-waves_1024.jpg

I also think this method may help; please have a look at it:

- (void)plotLogGraph:(NSData *)samplesData
        maximumValue:(Float32)normalizeMax
        mimimumValue:(Float32)normalizeMin
         sampleCount:(NSInteger)sampleCount
         imageHeight:(float)imageHeight
                done:(void (^)(UIImage *image, UIImage *selectedImage))done

Please help.
Regards,
Narender kumar

Save State with Key

So I’m back with another feature request!

I’m currently trying to implement multiple waveform views in the same controller, which with 3- or 4-minute tracks results in a very slow load time on older devices. That said, the AVAssets for the views hardly ever change, so it would be much more efficient if, when the view is dismissed, the rendered PNG could be saved and then restored again on load. Presumably the view could check whether the associated AVAsset is still the same one that was rendered, just in case something has changed and it needs to re-render.

I was thinking of some settable property along the lines of "savesStateWithKey", so that if a waveform had such a key, it would save and load the fully zoomed-out image under a user-chosen key whenever it was created or destroyed.

Hope that makes sense?
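
FDWaveformView does not provide this today; here is a rough sketch of the requested behavior done at the app level, caching the fully zoomed-out image keyed by the audio URL (all names here are hypothetical).

// Hypothetical sketch: cache rendered waveform images outside of FDWaveformView,
// keyed by the audio file URL, so repeat visits can skip the expensive render.
import UIKit

let waveformImageCache = NSCache<NSURL, UIImage>()

func cachedWaveformImage(for audioURL: URL) -> UIImage? {
    return waveformImageCache.object(forKey: audioURL as NSURL)
}

func storeWaveformImage(_ image: UIImage, for audioURL: URL) {
    waveformImageCache.setObject(image, forKey: audioURL as NSURL)
}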

Can I resize the waveform view?

Hi,

I tried to resize the waveform view in the storyboard of the sample project, but the view does not resize when I build and run the app. Is this intentional? Is the size being set in FDWaveformView.m?

Thanks,
Varun

Add arbitrary selection and play only that selection

Hi,
Amazing, thanks! I had a question: are you planning to add selection? For example, I want to select a specific section of the file using a pan gesture (UIGestureRecognizerStateBegan and Ended in handlePanGesture()), so the user can select a section of the file with his finger (pan/touch), and of course the selection would be colored differently.
Thank you very much.

Using 3+ colors

I'm hoping to use colors to signify (1) progress samples; (2) the remaining samples; and (3) a few selections highlighted gold.

Obviously tasks (1) and (2) are already handled -- do you have any suggestions for how I can add a third color over various samples?

Thank you -- really love the work.

Way to show waveform being built during recording?

Is there a way to show the waveform being built during a recording? For example, in the iOS Voice Memos app, as you record, the waveform is built. You don't have to wait until after recording to render the wave form.
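
FDWaveformView renders from a finished file, so a live display would need a different data source while recording. A minimal sketch of one common approach, polling AVAudioRecorder's metering (this is not FDWaveformView functionality):

// Hypothetical sketch: sample AVAudioRecorder's metering on a timer to build a
// running level history that can be drawn while recording is still in progress.
import AVFoundation

var levelHistory: [Float] = []

func startMetering(_ recorder: AVAudioRecorder) {
    recorder.isMeteringEnabled = true
    Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { _ in
        recorder.updateMeters()
        levelHistory.append(recorder.averagePower(forChannel: 0))  // dBFS, roughly -160...0
    }
}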

Invalid context error

Getting an invalid context error (and no display of the waveform) when building against the iOS 8 SDK in the simulator. The error looks like this:

CGContextSetBlendMode: invalid context 0x0. This is a serious error. This application, or a library it uses, is using an invalid context and is thereby contributing to an overall degradation of system stability and reliability. This notice is a courtesy: please fix this problem. It will become a fatal error in an upcoming update.

Any suggestions?

Get it working for online music

How can I get this working for an online music URL? It only works with saved music, but I want it to work with online audio too. Any ideas?
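
FDWaveformView loads audio through AVFoundation from a local URL, so a remote track generally needs to be downloaded to disk first. A rough sketch of that approach (the URL and file name are placeholders):

// Hypothetical sketch: download the remote audio to a local file, then point
// audioURL at the local copy once the download completes.
let remoteURL = URL(string: "https://example.com/song.m4a")!  // placeholder URL
URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, _ in
    guard let tempURL = tempURL else { return }
    let localURL = FileManager.default.temporaryDirectory.appendingPathComponent("song.m4a")
    try? FileManager.default.removeItem(at: localURL)
    try? FileManager.default.moveItem(at: tempURL, to: localURL)
    DispatchQueue.main.async {
        self.waveform.audioURL = localURL
    }
}.resume()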

Precision

How do I change the precision of the waveform and the gaps between the lines?

Output settings are not compatible with media type

Hey,

I'm using FDWaveformView in my project to play WAV files. It seems to work fine most of the time, but I keep getting these crash reports from users of the App Store build.

NSInvalidArgumentException - *** -[AVAssetReaderTrackOutput initWithTrack:outputSettings:] Output settings are not compatible with media type

Exact crash line is +[FDWaveformView sliceAndDownsampleAsset:track:startSamples:endSamples:targetSamples:done:] FDWaveformView.m:267

Seems like it's specifying LPCM for output settings... is there a reason why this wouldn't work?

NSDictionary *outputSettingsDict = @{AVFormatIDKey: @(kAudioFormatLinearPCM),
                                     AVLinearPCMBitDepthKey: @16,
                                     AVLinearPCMIsBigEndianKey: @NO,
                                     AVLinearPCMIsFloatKey: @NO,
                                     AVLinearPCMIsNonInterleaved: @NO};

Here is which OS versions and devices the crash is happening on (screenshot attached: screen shot 2015-07-23 at 12 13 34 pm).
