
dswaveformimage's Introduction


About

This is a portfolio website I keep sporadically up to date as a hobby project. It's a collection of the different things I am currently working on and have worked on in the past. It's definitely not complete, but it does offer a good glimpse.

Usage

After installation, run yarn install and then yarn start, which will open a preview of the template in your default browser, watch for changes to core template files, and live-reload the browser when changes are saved. You can view gulpfile.js to see which tasks are included with the dev environment.

Gulp Tasks

  • gulp: the default task that builds everything
  • gulp watch: opens the project in your default browser via browserSync and live-reloads when changes are made
  • gulp css: compiles SCSS files into CSS and minifies the compiled CSS
  • gulp js: minifies the theme's JS file
  • gulp vendor: copies dependencies from node_modules to the vendor directory

You must have yarn installed globally in order to use this build environment.

Copyright and License

This page is based on Creative, a one page creative theme for Bootstrap created by Start Bootstrap. Code released under the MIT license.

dswaveformimage's People

Contributors

adamritenauer, alfogrillo, avyavya, bdolman, chakrit, dmrschmidt, jverkoey, osheroff, tapsandswipes, tewha, tikhonp, tsuyoshi84


dswaveformimage's Issues

Can you provide an example for SwiftUI?

I am trying to use the WaveformLiveView in SwiftUI, but am having a hard time since there is no direct example.

How can I incorporate these elements into a declarative SwiftUI view definition? The following fails with the error: Static method 'buildBlock' requires that 'WaveformLiveView' conform to 'View'

    var body: some View {
        NavigationView {
            VStack {
                WaveformLiveView(configuration: Waveform.Configuration(dampening: .init(percentage: 0.125, sides: .both)))
            }
        }
    }
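For reference, the usual way to embed a UIKit view such as WaveformLiveView in SwiftUI is a UIViewRepresentable wrapper. A minimal sketch (the wrapper type and its name are my own, not part of the library, and the module containing the view may differ between versions):

```swift
import SwiftUI
import DSWaveformImage // in newer versions the views may live in a separate module

// Hypothetical bridge: WaveformLiveView is a UIView, not a SwiftUI View,
// so it cannot appear directly inside a VStack. Wrapping it in a
// UIViewRepresentable makes it usable inside a view builder.
struct LiveWaveform: UIViewRepresentable {
    let configuration: Waveform.Configuration

    func makeUIView(context: Context) -> WaveformLiveView {
        WaveformLiveView(configuration: configuration)
    }

    func updateUIView(_ uiView: WaveformLiveView, context: Context) {
        // feed new samples or configuration changes here
    }
}
```

With that, `LiveWaveform(configuration: ...)` can be placed inside the VStack where the bare WaveformLiveView failed.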

Render bug?

Hello, any idea why this happens?

RPReplay_Final1662818646

I am using SwiftUI, and I wrapped the wave view into UIViewRepresentable.
The configuration of WaveformLiveView:

let configuration: Waveform.Configuration = Waveform.Configuration(
    style: .striped(.init(color: .red, width: 3, spacing: 3)),
    dampening: .init(percentage: 0.2, sides: .both),
    position: .middle,
    verticalScalingFactor: 2,
    shouldAntialias: true
)

waveform is always partially transparent

I'm trying to create a black waveform, but it turns dark grey. I tried making it white too, using style: .filled(.white), but the yellow background still seeps through, making it look light yellow:

Schermafbeelding 2021-02-10 om 11 22 31

I took a look at the drawing code but couldn't find why this happens. I sort of fixed it by just adding an extra stroke:

case .filled, .striped:
    context.setStrokeColor(configuration.color.cgColor)
    context.strokePath()
    context.addPath(path)
    context.strokePath()

The waveform is now much more black/white this way, but I wonder why this hack is necessary.

Schermafbeelding 2021-02-10 om 11 31 32

I've had this issue on 6.1.1 and now on 7.0.0 too.

Why can't I use RecordingDelegate for my own AudioRecorder?

Hi! I can't inherit from RecordingDelegate and I can't access SCAudioManager. I'm trying to use WaveformLiveCanvas with samples, but I can't figure out how this is implemented in your example. At the moment I have the following implementation, but it doesn't quite work correctly.
Thank you in advance for your help!
Screenshot 2022-12-15 at 16 52 17

Waveform disappears when you reopen the app

If I have a waveform image shown when backgrounding the app, the view will be empty when I re-foreground the app.
My theory is that this might be because storage access gets blocked when the app backgrounds. The audio URL is local, but the library logs "ERROR loading asset / audio track" upon backgrounding. If I scroll the view out of the view frame and back in, the audio wave is back. The file is never touched; it's always on disk and shouldn't lead to an error like this. I'm also not sending new URLs to the wave view; I checked that.

Crop audio

Hey @dmrschmidt! Is there a way to cut a fragment out of a finished file, i.e. select the start and end with a RangeSlider?

Release Swift 4.2 version to Cocoapods

I have another open source project on GitHub which uses DSWaveformImage, and I would like to submit that pod to CocoaPods with the Swift 4.2 changes. However, I cannot do that until this library is released to CocoaPods as well (the current version 5.0.x is Swift 4.0 only).

Set progress of WaveformImageView

Hi, I am using WaveformImageView to show the waveform of my audio file, and AVAudioPlayer to play the audio. How can I set the progress of the waveform while the audio is playing?

    guard let url = Bundle.main.url(forResource: "OriginalRecording", withExtension: "m4a") else {
        return
    }

    middleWaveformView.configuration = Waveform.Configuration(
        backgroundColor: .lightGray.withAlphaComponent(0.1),
        style: .striped(.init(color: UIColor.red, width: 2, spacing: 1)),
        verticalScalingFactor: 0.5
    )
    middleWaveformView.waveformAudioURL = url
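One common approach, sketched here rather than taken from the library's API, is to overlay a second waveform image view in a highlight color and clip it to the playback fraction, driven by a timer or CADisplayLink. `progressView` is a hypothetical UIImageView sitting exactly on top of the base waveform view:

```swift
import UIKit
import AVFoundation

// Sketch: clip the "played" waveform overlay to the current playback
// fraction. Call this periodically (e.g. from a CADisplayLink) while
// the AVAudioPlayer is playing.
func updateWaveformProgress(player: AVAudioPlayer, progressView: UIImageView) {
    let fraction = player.duration > 0 ? CGFloat(player.currentTime / player.duration) : 0
    let mask = UIView(frame: CGRect(x: 0, y: 0,
                                    width: progressView.bounds.width * fraction,
                                    height: progressView.bounds.height))
    mask.backgroundColor = .black   // any opaque color works as a mask
    progressView.mask = mask
}
```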

WaveformLiveView is slower than example project

Not sure if I am doing something wrong here, but my WaveformLiveView is much slower than the example project's. Mine is all Swift and I followed what the sample project is doing.

Mine1.mov
Example.mov
import UIKit
import DSWaveformImage
import AVFoundation

class ViewController: UIViewController, AVAudioRecorderDelegate {
    
    @IBOutlet weak var recordButton: UIButton!
    @IBOutlet weak var waveformView: WaveformLiveView!
    
    var recordingSession: AVAudioSession!
    var audioRecorder: AVAudioRecorder!
    var timer: Timer?
    
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        
        waveformView.configuration = waveformView.configuration.with(
            style: .striped(.init(color: .red, width: 3, spacing: 3))
        )
        
        recordingSession = AVAudioSession.sharedInstance()

        do {
            try recordingSession.setCategory(.playAndRecord, mode: .default)
            try recordingSession.setActive(true)
            recordingSession.requestRecordPermission() { [unowned self] allowed in
                DispatchQueue.main.async {
                    if allowed {
                        createRecorder()
                        recordButton.isHidden = false
                    } else {
                        // failed to record!
                        recordButton.isHidden = true
                    }
                }
            }
        } catch {
            // failed to record!
        }
    }
    
    func createRecorder() {
        let audioFilename = getDocumentsDirectory().appendingPathComponent("recording.m4a")

        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 1,
        ]

        do {
            audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
            audioRecorder.isMeteringEnabled = true
            audioRecorder.delegate = self
            audioRecorder.prepareToRecord()
        } catch {
            finishRecording(success: false)
        }
    }

    @IBAction func record(_ sender: Any) {
        if audioRecorder.isRecording {
            finishRecording(success: true)
            return
        }
        
        recordButton.setTitle("Tap to Stop", for: .normal)
        audioRecorder.record()
        timer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(ViewController.updateAmplitude), userInfo: nil, repeats: true)
    }
    
    func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }
    
    func finishRecording(success: Bool) {
        timer?.invalidate()
        timer = nil
        audioRecorder.stop()
        audioRecorder = nil

        if success {
            recordButton.setTitle("Tap to Re-record", for: .normal)
        } else {
            recordButton.setTitle("Tap to Record", for: .normal)
        }
    }
    
    @objc private func updateAmplitude() {
        audioRecorder.updateMeters()

        print("current power: \(audioRecorder.averagePower(forChannel: 0)) dB")
        
        let currentAmplitude = 1 - pow(10, (audioRecorder.averagePower(forChannel: 0) / 20))
        waveformView.add(samples: [currentAmplitude, currentAmplitude, currentAmplitude])
    }
}

Configuration given in WaveformLiveView initialiser not used in sampleLayer

Hi,

Unfortunately, didSet observers aren't triggered during an initialiser. Therefore the sampleLayer of WaveformLiveView never gets the message about the initial configuration given via

public init(configuration: Waveform.Configuration = defaultConfiguration, renderer: WaveformRenderer = LinearWaveformRenderer())

The same applies to the renderer.

It can be worked around by setting these properties again later.
It could be fixed by calling these property setters within a defer { } block in the initialiser, but only if the properties have default values.

sampleLayer.configuration = configuration

Amplitudes should be halved to stay within available height

Hello,

if I understand correctly, all samples between 0 and 1 should be rendered within the available height and not be cut off.
I found that they are cut off though and was wondering what's going on. Trying with more and more extreme values for verticalScalingFactor I suspected that the scaling isn't working correctly.

I think that the drawingAmplitude should be divided by 2 to respect the fact that it's drawn mirrored and pointing both up and down from the vertical middle of the view. Currently, a sample of value 0 would result in a line that stretches outside of the view by 50% on both sides.

let drawingAmplitude = max(minimumGraphAmplitude, invertedDbSample * drawMappingFactor)

WaveForm for the http local file without downloading

Hi Dennis,
I really love your library. I'm currently working on a chat application, and creating the wave sample works fine with your library. But I'm facing one problem: I save the path in SQLCipher and it returns a path of this form: "http://localhost:52421/media/B46E57FB-2B84-4939-A127-ECBDD67301C4/87BF79CC-B842-4DB2-8BEB-817D9AA43C31/A70F087F-69D5-4245-BDC9-15C8A65324A6.m4a". I don't want to download the file, so is there any other way of doing this? When I pass the asset to AVAssetReader it gives me nil.

Load a Remote URL

Has anyone attempted to build a version which could load a remote URL?
AVAssetReader currently complains: "Cannot initialize an instance of AVAssetReader with an asset at non-local URL"
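Since AVAssetReader only accepts local file URLs, one workaround is to download the audio to a temporary file first and then run the existing local-file path. A sketch, where `renderWaveform` stands in for whatever local-file rendering call you already use:

```swift
import Foundation

// Sketch: download a remote audio file to the temporary directory, then
// hand the local URL to the normal waveform-rendering code path.
func loadRemoteAudio(_ remoteURL: URL, renderWaveform: @escaping (URL) -> Void) {
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL, error == nil else { return }
        let localURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(remoteURL.lastPathComponent)
        try? FileManager.default.removeItem(at: localURL)   // replace stale copies
        try? FileManager.default.moveItem(at: tempURL, to: localURL)
        renderWaveform(localURL)
    }.resume()
}
```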

Package works fine but it's throwing 5 warnings

On my quest to keep my project free of warnings I noticed a few warnings this package is generating in TempiFFT.swift:

Inout expression creates a temporary pointer, but argument 'imagp' should be a pointer that outlives the call to 'init(realp:imagp:)' (line 112)

Initialization of 'UnsafeMutablePointer<Float>' results in a dangling pointer (line 166 and 226)

These seem to be a bit above my knowledge, but maybe I can figure it out. Could you take a look at this too?

Support clear background

If the background color config is set to clear, the waveform background is black.

I see that this commit a few years back fixed that, but I guess it got broken again since the drawing logic uses a different API now.

Is it possible to have a clear background with this version?

different colors

Screenshot 2021-02-20 at 3:02:15 PM

I want to implement a function like this: set different colors for different parts of the waveform. How can I do that? Thanks.

The waveform is offset from the actual audio.

in the func

func process(_ samples: UnsafeMutablePointer<Int16>,
                         ofLength sampleLength: Int,
                         from assetReader: AVAssetReader,
                         downsampledTo targetSampleCount: Int) -> [Float] 

There is a line:
let samplesPerPixel = max(1, sampleCount(from: assetReader) / targetSampleCount)

samplesPerPixel is an Int, so the fractional part is discarded. This offsets the waveform, and the offset accumulates with each call to process.
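One way to avoid the truncation, shown as a sketch rather than the library's actual code, is to derive each downsampling bucket's bounds with integer arithmetic over the whole range. The remainder is then distributed across buckets and the final bucket ends exactly at the last sample, so no offset can accumulate:

```swift
// Sketch: compute per-bucket sample ranges without a single truncated
// samples-per-pixel value. The division happens per bucket boundary, so
// rounding never drifts: the last range always ends at totalSamples.
func bucketBounds(totalSamples: Int, targetSampleCount: Int) -> [Range<Int>] {
    (0..<targetSampleCount).map { index in
        let start = index * totalSamples / targetSampleCount
        let end = (index + 1) * totalSamples / targetSampleCount
        // ranges may be empty when totalSamples < targetSampleCount
        return start..<end
    }
}
```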

Error on sample code "Trailing closure passed to parameter of type 'CGFloat?' that does not accept a closure"

I would like to draw an image based on an mp3.
I followed your install instructions and copy-pasted your example:

let waveformImageDrawer = WaveformImageDrawer()
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
waveformImageDrawer.waveformImage(fromAudioAt: audioURL, with: .init(
    size: topWaveformView.bounds.size,
    style: .filled(UIColor.black),
    position: .top) { image in
        // need to jump back to main queue
        DispatchQueue.main.async {
            self.topWaveformView.image = image
        }
    })

But I get this error on line 6, at the "{":
"Trailing closure passed to parameter of type 'CGFloat?' that does not accept a closure"

Can you give me advice on how to fix the problem, please?
Thanks in advance
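For anyone else hitting this: the error comes from the closing parenthesis of `.init(` being misplaced, so the compiler parses the trailing closure as the next argument of Waveform.Configuration (a CGFloat? parameter). A corrected sketch, assuming the drawer API from the README example:

```swift
// The configuration's `.init(...)` must be fully closed with "))" before
// the trailing closure, so the closure attaches to waveformImage(...)
// as its completion handler instead of to Waveform.Configuration.
waveformImageDrawer.waveformImage(fromAudioAt: audioURL, with: .init(
    size: topWaveformView.bounds.size,
    style: .filled(UIColor.black),
    position: .top)) { image in
    // need to jump back to main queue
    DispatchQueue.main.async {
        self.topWaveformView.image = image
    }
}
```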

Background thread rendering

I wonder if this (awesome!) library could be modified so the waveform is generated on a background thread (then of course displayed on the main thread)? I've had a play dropping some of the processing onto a global queue but could not get it working.

I would love to use this in a DAW style app but require 'on the fly' updating of waveforms (recording functionality etc.). Doesn't matter if the waveforms take a little while to display, but cannot hold up the main thread to maintain UI responsiveness.

Generate the wave from URL.

Hello,

I tried downloading the audio file, saving it locally and then assigning the URL to WaveformView, but I get no results.

I get this error : ERROR loading asset / audio track

Is there any other way to generate the wave from the URL?

Thanks

Width issue

Hi,

This is an otherwise amazing framework, however there is one big issue that is plaguing me, which prevents us from using this in our app.

When I set a width for the waveform, it does not properly fill out the image. It is as if it just appends a bunch of "silence" for the extra width, until I hit a certain width, where it starts rendering correctly again until it once more appends "silence" when making the image bigger.

I have attached screenshots of widths 260 (which shows only silence), 264 (which is almost perfect) and 300 (which shows "appended silence").

screenshot

screenshot

screenshot

Here is our code:

    AudioExporter.instance.exportSongToM4A(song: self.song!, completionHandler: { (success) in
        let waveformImageDrawer = WaveformImageDrawer()
        let waveformSize = CGSize(width: 264, height: sequenceHeight)
        let scale = UIScreen.main.scale

        DispatchQueue.main.async {
            print("export finished: \(success)")
            if success {
                if let trackWaveformImage = waveformImageDrawer.waveformImage(fromAudioAt: self.song!.exportUrlM4A(),
                                                                              size: waveformSize,
                                                                              color: UIColor.white,
                                                                              backgroundColor: UIColor.clear,
                                                                              style: .filled,
                                                                              position: .middle,
                                                                              scale: scale) {
                    
                    DispatchQueue.main.async {
                        let trackWaveformImageView = UIImageView(frame: CGRect(origin: CGPoint(x: x, y: y), size: waveformSize))
                        trackWaveformImageView.image = trackWaveformImage
                        trackWaveformImageView.contentMode = .scaleToFill
                        
                        self.waveformViews.append(trackWaveformImageView)
                        self.sequenceScrollView.addSubview(trackWaveformImageView)
                        self.sequenceScrollView.bringSubview(toFront: self.scrubberImageView)
                    }
                }
            } else {
                //TODO: Nothing?
            }
        }
    })

Any help would be appreciated

Cheers,

/Mikkel

Generate the waveform from URL inside Core Data

In my app you can record audio, and I've been saving that by writing the file to disk and storing a relative URL in Core Data. But I can't use those URLs in WaveformView(audioURL: $url, configuration: $configuration)

I'm using SwiftUI. Thanks!

Application crashes because of memory issue while creating waveform.

This code in AudioProcessor.swift:

            if let sampleBuffer = trackOutput.copyNextSampleBuffer(),
                let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
                let blockBufferLength = CMBlockBufferGetDataLength(blockBuffer)
                let sampleLength = CMSampleBufferGetNumSamples(sampleBuffer) * channelCount(from: assetReader)
                var data = Data(capacity: blockBufferLength)
                data.withUnsafeMutableBytes { (blockSamples: UnsafeMutablePointer<Int16>) in
                    CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: blockBufferLength, destination: blockSamples)
                    CMSampleBufferInvalidate(sampleBuffer)

                    let processedSamples = process(blockSamples,
                                                   ofLength: sampleLength,
                                                   from: assetReader,
                                                   downsampledTo: targetSampleCount)
                    outputSamples += processedSamples
                }
            }

ends up crashing the application when the selected asset is long enough. In my tests, the audio file is 13 minutes long. The crash happens once over 500MB of memory is used in the while loop. If the block buffer isn't processed, the crash doesn't happen.
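A common mitigation for this pattern, sketched here rather than taken from the library's actual fix, is to wrap each iteration of the read loop in an autoreleasepool so each CMSampleBuffer and its copied bytes can be released per chunk instead of accumulating until the loop finishes:

```swift
import AVFoundation
import CoreMedia

// Sketch: same read loop as quoted above, but each iteration runs inside
// an autoreleasepool, bounding peak memory to roughly one chunk.
func readSamples(from assetReader: AVAssetReader, trackOutput: AVAssetReaderTrackOutput) {
    while assetReader.status == .reading {
        autoreleasepool {
            if let sampleBuffer = trackOutput.copyNextSampleBuffer(),
               let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
                _ = blockBuffer // copy and downsample the chunk here, as before
                CMSampleBufferInvalidate(sampleBuffer)
            }
        }
    }
}
```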

Is there a way to cancel waveform image rendering process?

I have a scenario in my app, where user may select between multiple audios which may be 5-10 min long. Usually, it takes quite some time to render one waveform out. In case when one waveform is being rendered and user taps on the new audio, I want to stop the rendering process of the first waveform, so I can start rendering a waveform image of the new audio.

How can I stop the initial rendering process?
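As a sketch of one possible client-side approach (the library's own cancellation hooks, if any, may differ): wrap the rendering call in a Task and cancel it when a new audio file is selected. Note that cancellation is cooperative, so the rendering work itself must check Task.isCancelled to actually stop early:

```swift
import Foundation

// Hypothetical helper: keeps at most one waveform-rendering task alive.
// `render` stands in for whatever analysis + drawing call you use.
final class WaveformLoader {
    private var currentTask: Task<Void, Never>?

    func load(_ url: URL, render: @escaping (URL) async -> Void) {
        currentTask?.cancel()            // abandon the previous waveform
        currentTask = Task {
            guard !Task.isCancelled else { return }
            await render(url)
        }
    }
}
```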

Creating Waves Visualisation from recording within view frame

Here is what I am trying to do:

Wave Expected Gif

The screenshot is taken from a XR iPhone.

Wave Done Gif

I have been working with AVAudioPlayer and I would like to draw a waveform that looks like the first GIF. I am using the DSWaveformImage-main GitHub pod to draw waves while the user records sound. But I'm confused about how to draw the waves within the view frame, without automatic scrolling and without drawing outside the view. That is, while the user records, the waves should be drawn in the view but never move outside of it; all the waves should stay within the view frame, shrinking if they would otherwise exceed it.

Question: How can I draw the waves within the view frame without them going outside the view?

Can someone please explain how to draw this? I've tried to draw these waves with the above pod's code, but no results yet.

Any help would be greatly appreciated.

Thanks in advance.
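One way to keep everything inside a fixed-width view, sketched here with a hypothetical helper, is to re-bucket the entire recording into a fixed number of averaged bars each time new samples arrive, and redraw those bars, instead of appending raw samples to a scrolling live view:

```swift
// Sketch: squeeze all recorded samples into exactly `barCount` bars by
// averaging each integer-bounded bucket. Re-run this on every new sample
// batch and redraw, so the waveform compresses instead of scrolling.
func downsample(_ samples: [Float], to barCount: Int) -> [Float] {
    guard samples.count > barCount, barCount > 0 else { return samples }
    return (0..<barCount).map { index in
        let start = index * samples.count / barCount
        let end = (index + 1) * samples.count / barCount
        let bucket = samples[start..<end]
        return bucket.reduce(0, +) / Float(bucket.count)
    }
}
```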

macOS / AppKit support

I'm working on a personal music player app on macOS and I'm hoping to use the underlying waveform analysis code you've written. Everything in the package is currently bundled into a single target that's compatible only with iOS. Are you open to potentially splitting the package into a few targets, with one containing the platform-agnostic "Waveform*" classes?

Track microphone sound

Hi,

I'm running your example and the simulator asks for microphone access. But I don't see where I can couple the microphone with the wave. Is it possible to do live visualisation of the microphone sound?

Sample Count not working

I'm using WaveformAnalyzer to get X amount of samples from an audio file. Changing the count in the samples function doesn't change how many samples I get back.

Xcode Version 11.3
DSWaveformImage Version 6.1

waveform fit width?

Both with the WaveformView and WaveformLiveView, there's a bunch of space on the left or right if the audio file / number of samples isn't long enough. Is there some way to have the waveform fit the width?

Screenshot 2022-11-28 at 12 52 24 PM

Long Render times for waveform

Hi there,

I'm having an issue with getting the waveform to render in a decent amount of time in my macOS app. The waveform takes at least 30 seconds to render. I've put a high task priority on it, which hasn't really changed anything. The audio is on a server on our network and I'm connecting through wifi. That could be a bottleneck, but I can play the audio with little to no delay, and I'm running on a newer M1 MacBook Pro with newer wifi gear in the office. I can easily access the audio files from the network through the Finder without any noticeable delay. The files are about 3-8MB in size, so I'm not sure what I'm doing wrong. I am using Swift and SwiftUI to display the waveform view.

    let audioURL = URL(fileURLWithPath: eventItem?.convertedfp ?? "")
    if #available(macOS 12.0, *) {
        if eventItem != nil {
            // we need to check if it's a message or not
            if (eventItem?.eventType != 3) {
                if (eventItem?.fileExists != false) {
                    WaveformView(audioURL: audioURL, priority: .high)
                }
            }
        }
    } else {
        Text("need macos 12")
        //print(audioURL)
    }

I'm sorry if I've done something dumb, but I can't figure out what the issue is.

When it's pulling the audio from the server I also get the following errors:

2022-10-05 16:22:54.966377-0700 Nexus Playlsit Editor[76826:5249230] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x60000087b680> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2022-10-05 16:22:54.991948-0700 Nexus Playlsit Editor[76826:5249530] [AQ] AudioQueueObject.cpp:2364 Error (-4) getting reporterIDs

any help would be greatly appreciated.

Calculated waveform image different level than live view

I am using the realtime view to display the audio level when recording. If the user plays back the recording, I generate a waveform image which replaces the live view during playback.
I'm noticing that the levels that come in while recording are not the same as in the rendered image. I will attach images, but what I mean is: the image view and the realtime view are the same size, and when I'm recording in realtime I rarely get to the top of the container, yet when the waveform is generated, the visual representation looks as if most of the audio was at full level.
I'm probably doing a terrible job explaining this, but is there a way to set thresholds for the waveform output so I can match them to what the user would see when recording?

Here is a sample of a realtime waveform:
Screen Shot 2022-08-24 at 11 49 43 AM

Here is sample of the generated waveformImage:
Screen Shot 2022-08-24 at 11 50 00 AM

return NSImage instead of view

Hi,

Try as I might (I'm still fairly new to SwiftUI), I couldn't manage to return the generated waveform instead of just displaying it. The goal I'm trying to achieve is to get the generated image back as an NSImage so that I can store that data (.tiffRepresentation). Is there any way this package can do that? I've tried hacking about in the code for about 2 days now with no success. Any help is appreciated. Thank you.

Error installing with SPM

Screen Shot 2022-11-21 at 17 12 01

"Cannot find type 'SCAudioManager' in scope": I used SPM to install DSWaveformImage, but I can't find SCAudioManager. Xcode Version 14.1 (14B47b), UIKit.

Two identical waveform views don't always line up

Hey there! I'm attempting to do playback progress indication by generating two waveform views and then masking one:

let configuration: Waveform.Configuration = .init(
    style: .striped(.init(color: .gray, width: 1, spacing: 2))
)

GeometryReader { proxy in
    ZStack(alignment: .leading) {
        // Static waveform view, showing the whole file
        WaveformView(audioURL: audioURL, configuration: configuration)
            .frame(width: proxy.size.width)

        // Animated progress indicator, masked to the same waveformView, to show the progress
        let width = min(progress * proxy.size.width, proxy.size.width)
        Rectangle().frame(width: width).foregroundColor(.blue).mask(alignment: .leading) {
            WaveformView(audioURL: audioURL, configuration: configuration)
                .frame(width: proxy.size.width)
        }
    }
}

However, sometimes the masked waveform view doesn't line up with the unmasked one.

misaligned
misaligned2

I've used a view inspector to verify that both views have the exact same frame, but sometimes the blue one is misaligned to the left, sometimes to the right, and sometimes correctly. Any advice would be most appreciated! Thanks!

Memory issue

Loading a URL into waveformImageView, then popping back to the previous VC before waveformImageView has finished loading, crashes with a memory issue.
Is there an option to stop loading the waveformImageView, or to kill the object?

XCFramework with SPM support

Hi, I love using this library on iOS! It makes audio visualizations so cool!

Is there a plan to make this library support XCFrameworks with SPM? That way the library would compile for simulators on M1 Macs and run with Mac Catalyst.

Thanks!
