
MIDIKit's Introduction

MIDIKit


An elegant and modern CoreMIDI wrapper in pure Swift supporting MIDI 1.0 and MIDI 2.0.

  • Modular, user-friendly I/O
  • Automatic MIDI endpoint connection management and identity persistence
  • Strongly-typed MIDI events that seamlessly interoperate between MIDI 1.0 and MIDI 2.0
  • Automatically uses the appropriate Core MIDI API and defaults to MIDI 2.0 on platforms that support it
  • Supports Swift Playgrounds on iPad and macOS
  • Full documentation available in Xcode Documentation browser, including helpful guides and getting started information

Abstractions

Additional abstractions are available as MIDIKit extensions:

  • Reading/writing Standard MIDI Files (SMF)
  • Control Surface protocols (HUI, etc.)
  • Synchronization protocols (MTC, etc.)

Getting Started

The library is available as a Swift Package Manager (SPM) package.

Use the URL https://github.com/orchetect/MIDIKit when adding the library to a project or Swift package.
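
As a sketch, adding the dependency in a Package.swift manifest might look like the following (the package name, target name, and version requirement shown are illustrative; pin to the current MIDIKit release):

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyMIDIApp",
    dependencies: [
        // version requirement is illustrative; use the current release
        .package(url: "https://github.com/orchetect/MIDIKit", from: "0.9.0")
    ],
    targets: [
        .target(name: "MyMIDIApp", dependencies: ["MIDIKit"])
    ]
)
```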

See the getting started guide for a detailed walkthrough of how to get the most out of MIDIKit.

Documentation

See the online documentation or view it in Xcode's documentation browser by selecting the Product → Build Documentation menu.

This includes a getting started guide, links to examples, and troubleshooting tips.

System Compatibility

  • Xcode 14.0 / macOS 12.0 are minimum requirements to compile

  • Once compiled, MIDIKit supports macOS 10.12+ and iOS 10.0+.

    tvOS and watchOS are not supported (there is no Core MIDI implementation on those platforms), but MIDIKit will still build successfully on them in the event it is included as a dependency in multi-platform projects.

Author

Coded by a bunch of 🐹 hamsters in a trenchcoat that calls itself @orchetect.

License

Licensed under the MIT license. See LICENSE for details.

Sponsoring

If you enjoy using MIDIKit and want to contribute to open-source financially, GitHub sponsorship is much appreciated. Feedback and code contributions are also welcome.

Community & Support

Please do not email maintainers for technical support. Several options are available for questions and feature ideas:

  • Questions and feature ideas can be posted to Discussions.
  • If an issue is a verifiable bug with reproducible steps it may be posted in Issues.
  • The #midikit channel on the AudioKit Discord is also a place to find help if Discussions and the documentation don't contain an answer. (Invitation required.)

Contributions

Contributions are welcome. Posting in Discussions prior to submitting PRs for new features or modifications is encouraged.


MIDIKit's Issues

MIDI 2.0 Support (MIDI-CI/UMP)

MIDI 2.0 support progress:

  • Universal MIDI Packet (UMP) Events Support: added in 0.3.0
  • Additional MIDI 2.0 Events Support
    • Relative RPN/NRPN support (implement as a relative: Bool property on existing RPN/NRPN events)
    • Utility messages
    • Mixed Data Set Message (MIDI 2.0 Spec Chapter 4.6, Page 38-40 & 79)
    • Data 128 Bit UMPs (MIDI 2.0 Spec Appendix G, Page 79)

      "Does not translate to MIDI 1.0 Protocol, but may be used by a UMP MIDI 1.0 Device"

  • Extended abstractions (MIDI-CI Profiles, etc.) - may be a future post-1.0.0 feature, low priority
  • Unit tests
    • SysEx8 StreamID tests
  • Example project / Documentation

MIDI File parse bug.

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 13 Ventura

Xcode Version(s)

Xcode 14

Description

The following file does not parse. The same file parses in two other MIDI parsers and in Logic Pro, and it plays in AVAudioSequencer.
Anonim - Uzun Ince Bir Yoldayim.mid.zip

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

Split library extensions into their own repos

Restructuring: similar to AudioKit, core essential functionality of MIDIKit (I/O and Events) will live in this repo (MIDIKit), while optional abstractions will be extricated into their own repos as extensions to MIDIKit. Each will import MIDIKit as a dependency.

Incorporate MIDIKitSync into videoplayer app

Thanks a lot for sharing your great work.

I'm hoping to use MIDIKitSync to synchronise playback of an AVPlayer in my app with an external DAW. I'm planning on using the following code to schedule video playback in the future (e.g. I want video playback to occur at a particular future timecode). I'm a bit unsure how to sync with MTC, however. I'm guessing I need to ensure they're both using the same sourceClock. Any ideas how to do this? Sorry if this is already taken care of; still learning. Thanks again.

var audioClock = CMClockGetHostTimeClock()
let videoPlayer = AVPlayer(url: videoURL)
videoPlayer.sourceClock = audioClock
videoPlayer.automaticallyWaitsToMinimizeStalling = false

func schedulePlayback(videoTime: TimeInterval, hostTime: UInt64) {
    videoPlay(at: videoTime, hostTime: hostTime)
}

func videoPlay(at time: TimeInterval = 0, hostTime: UInt64 = 0) {
    let cmHostTime = CMClockMakeHostTimeFromSystemUnits(hostTime)
    let cmVTime = CMTimeMakeWithSeconds(time, preferredTimescale: 1000000)
    let futureTime = CMTimeAdd(cmHostTime, cmVTime)
    videoPlayer.setRate(1, time: kCMTimeInvalid, atHostTime: futureTime)
}

MCU Protocol Support

MCU (Mackie Control Universal) protocol is similar to HUI, but has many key differences in hardware layout, supported controls, and the MIDI messages used.

Hardware:

  • MCU Pro (main unit) - 8 channels + master fader + transport/editing controls
  • MCU XT Pro (connect up to 3 additional "fader packs" - each XT unit has 8 channel strips only)

Deflake Unit Tests

Unit Tests to Deflake

  • RoundTrip_NewCoreMIDIAPI_2_0_Protocol_Tests.testRapidMIDIEvents_NewCoreMIDIAPI_2_0_Protocol
  • RoundTrip_OldCoreMIDIAPI_Tests.testRapidMIDIEvents_OldCoreMIDIAPI
    • on macCatalyst
    • perhaps instead of wait(0.5) / wait(0.3) it should use a wait cycle that polls Core MIDI, waiting for the endpoints to appear or disappear respectively
  • MIDIManager_MIDIIONotification_Tests.testSystemNotification_Add_Remove
  • MTC_Receiver_Receiver_Tests.testMTC_Receiver_Handlers_QFMessages
    • on macCatalyst, iOS, watchOS
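
The polling idea above can be sketched as a small helper (a hypothetical stand-in; waitFor is not part of MIDIKit's test suite, and a real test would poll Core MIDI endpoint lists in the condition closure):

```swift
import Foundation

/// Polls `condition` until it returns true or `timeout` elapses,
/// instead of sleeping for a fixed duration.
@discardableResult
func waitFor(_ condition: () -> Bool,
             timeout: TimeInterval,
             pollInterval: TimeInterval = 0.01) -> Bool {
    let deadline = Date().addingTimeInterval(timeout)
    while Date() < deadline {
        if condition() { return true }
        Thread.sleep(forTimeInterval: pollInterval)
    }
    // one final check after the deadline passes
    return condition()
}

// Example: succeeds once the condition becomes true within the timeout.
var pollCount = 0
let appeared = waitFor({ pollCount += 1; return pollCount >= 3 }, timeout: 1.0)
```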

MTC API addition: `AVAudioTime` interop

User feature request:

What I would love is an MTC Generator that can start at an AVAudioTime.
That way I can use the same value that is used everywhere with AVFoundation.
It would take into account the same hostTime that is used for all schedules.

CI: Build Universal Binary 2 (x86_64, arm64)

Trying to get GitHub Actions CI (which runs on Intel hardware) to build a Universal Binary 2 (x86_64, arm64), but running into an issue where xcodebuild rejects it.

Build succeeds if ONLY_ACTIVE_ARCH = YES but only builds x86_64.

Setting ONLY_ACTIVE_ARCH = NO causes both architectures to build.

Locally, on an M1 Max machine, it works fine and compiles both architectures. But it errors out on an Intel machine.

Some hints:

Example App: Bluetooth MIDI on iOS

Need to add an example app demonstrating how to implement bluetooth MIDI device connections on iOS.

CABTMIDICentralViewController and/or CABTMIDILocalPeripheralViewController are involved:
https://developer.apple.com/documentation/coreaudiokit/cabtmidicentralviewcontroller

So it wouldn't be a feature of MIDIKit; it's an implementation in the app to get it working. Should be an iOS example app.

(macOS is different: you just set up Bluetooth MIDI devices in Audio MIDI Setup. On iOS it's per-app, and the app has to implement it.)

Example Projects

Master List of Example Projects

Build a set of barebones example projects in both AppKit/UIKit and SwiftUI

  • Create a virtual input and output and demonstrate sending and receiving events
  • Creating and managing connections
  • #100
    • macOS AppKit
    • macOS SwiftUI
    • iOS UIKit
    • iOS SwiftUI
  • Demonstrate multi-selection list for endpoints
    (similar to EndpointMenus/EndpointPickers example but using a listbox control with checkboxes or selection capability in a window/view instead of menus)
  • Parsing received MIDI events
    • macOS AppKit
    • macOS SwiftUI
    • iOS UIKit
    • iOS SwiftUI
  • Filtering received MIDI events
    • macOS AppKit
    • macOS SwiftUI
    • iOS UIKit
    • iOS SwiftUI
  • Bluetooth MIDI on iOS
    • #88 (receiving MIDI)
    • Investigate advertising iOS app's availability of MIDI over Bluetooth
    • Add test MIDI message Send button to bluetooth endpoint(s)
  • #152
  • #119

Clean up Advanced example projects

Issues with macOS Monterey on Intel

There's a potential issue with receiving MIDI using new Core MIDI API on macOS Monterey, specifically on Intel architecture.

The unit tests on Intel Monterey are passing, and the issue does not seem to appear on Apple Silicon (M1). So it needs further investigation.

A temporary workaround if needed is to revert to legacy Core MIDI API before starting the Manager, until the issue can be addressed:

midiManager.preferredAPI = .legacyCoreMIDI
try midiManager.start()

MIDI I/O To Do List

General

  • MIDIManager: Add remaining accessors for things like Core MIDI ExternalDevices, etc.
  • MIDIIOObjectProtocol: Add remaining set...() property methods (counterparts to get...() property methods)
  • New method to send multiple MIDIEventPacket in a single MIDIEventList (similar to existing MIDIPacketList method)
  • Formalize ReceivesMIDIEvents and SendsMIDIEvents protocols and how they are used internally.
    When using events: [MIDIEvent] variations of send or receive, I/O should attempt to pack all events into a single MIDIPacketList/MIDIEventList
  • getConnectionUniqueID() can be Int32 or CFData - see Core MIDI docs:

    The value provided may be an integer. To indicate that a driver connects to multiple external objects, pass the array of big-endian SInt32 values as a CFData object.

  • MIDI thru connections: add transforms that exist in Core MIDI to thru params

Legacy (non-UMP MIDI 1.0)

  • Multi-packet chunking of legacy MIDI 1.0 SysEx messages
  • Add a compoundEventParsing: Bool property to MIDI1Parser that buffers received events matching certain RPN/NRPN messages and returns compound events (i.e. .rpn() instead of a series of .cc() events) upon receipt of all messages that comprise the compound event.
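
The compound-event buffering idea can be sketched with toy types (CC, ParsedEvent, and parseCompound are hypothetical stand-ins, not MIDIKit API; a real parser would also handle interleaved events and running state):

```swift
/// Toy sketch of coalescing CC messages into compound RPN events.
struct CC: Equatable { let controller: UInt8; let value: UInt8 }

enum ParsedEvent: Equatable {
    case cc(controller: UInt8, value: UInt8)
    case rpn(parameter: UInt16, dataMSB: UInt8, dataLSB: UInt8?)
}

/// An RPN is CC#101 (param MSB), CC#100 (param LSB), CC#6 (data MSB),
/// optionally followed by CC#38 (data LSB).
func parseCompound(_ input: [CC]) -> [ParsedEvent] {
    var output: [ParsedEvent] = []
    var i = 0
    while i < input.count {
        if i + 2 < input.count,
           input[i].controller == 101,
           input[i + 1].controller == 100,
           input[i + 2].controller == 6 {
            // combine the two 7-bit parameter bytes into a 14-bit number
            let param = UInt16(input[i].value) << 7 | UInt16(input[i + 1].value)
            if i + 3 < input.count, input[i + 3].controller == 38 {
                output.append(.rpn(parameter: param,
                                   dataMSB: input[i + 2].value,
                                   dataLSB: input[i + 3].value))
                i += 4
            } else {
                output.append(.rpn(parameter: param,
                                   dataMSB: input[i + 2].value,
                                   dataLSB: nil))
                i += 3
            }
        } else {
            // pass non-matching CCs through untouched
            output.append(.cc(controller: input[i].controller,
                              value: input[i].value))
            i += 1
        }
    }
    return output
}
```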

Done

  • Carry MIDI packet timestamp through to MIDIEvent consumers? (ReceivesMIDIEvents protocol) Or at least provide it in the receive handler.
  • MIDI non-persistent thru connections:
    • do they need to be reconnected (refreshed) when a received MIDI notification indicates setup has changed or things were added/removed in the Core MIDI subsystem? or does Core MIDI handle some of that autonomously?
    • what happens when ports involved disappear and reappear in the system? Do we need to worry about a refreshConnections(in:) method like with the inputs and outputs?

midiManager.endpoints and midiManager.devices.devices only give an empty list `[]`

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 12 Monterey, macOS 11 Big Sur

Xcode Version(s)

Xcode 14, Xcode 13.3.1

Description

When using the midiManager declared via:

let midiManager = MIDIManager(
        clientName: "name",
        model: "model",
        manufacturer: "blockarchitech"
)

And then trying to list devices:

print(midiManager.devices.devices)
..or
print(midiManager.endpoints.inputs)
..or
print(midiManager.endpoints.outputs)

It will always return

[]

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

HUI Protocol Support

Features

  • HUI text value types: add static constructors (.lossy(String) etc.)

Fixes

  • HUICoreEvent.largeDisplay(slices:) is an open-ended array of chars. When sending this event, the MIDI event encoder could validate it by padding/trimming to exactly 10 chars.
  • fix: remote presence works most of the time, but it sometimes thinks the remote is present when it's not

Pre-Release

  • HUITest example project:
    • update V-Pots and faders to use AudioKit's Controls library
    • change local MIDIKit package linking over to remote SPM dependency
    • add text labels to HUITest project Fader view to match dbVU scale that is marked on the HUI device surface
  • update MIDIKit docs, publish to docs branch
  • unit tests

Done

  • add MIDI Device Inquiry SysEx response to HUISurface
  • V-Pot LEDs: add UnitInterval (0.0...1.0) value static constructors / translation methods for
  • HUITest example project: rename HUIHostHelper to HUIHostModel
  • test with ABAction/Arbiter to see if API changes make sense
  • jog wheel API
  • HUICoreEvent is slightly abstracted but should it be?
    • large display and time display cases carry their full display contents but should probably only carry the atomic changes (mirroring the actual MIDI encoded data)
      • maybe only HUIEvent (model change event) should carry these full strings, and when sending/receiving HUI only delta (partial) change messages should be sent when invoking transmitLargeDisplay() on HUIHostBank
    • should HUICoreEvent be internal and not public?
  • HUIHostBank: rename translator: HUIModel to model: and make it public? Allow more interactivity with the model for the host bank? The tricky part is causing updates to the model to be transmitted to the bank's midiOut handler.
  • how is the V-Pot absolute value (UInt7) being dealt with vs. delta change messages?
  • V-Pot encoding: surface→host sends a delta change value, but an LED preset sent host→surface is an index number
  • Add CustomStringConvertible to ChannelStripComponent, and any other enums that need it
  • HUIHost should handle MIDI system reset msg 0xFF - add a HUICoreEvent case for it
  • HUI: run HUISwitch.allCases through a forEach {} unit test to see if they all encode and decode successfully
  • HUI: factor out calls to Logger.debug() - or add a Bool setting to enable logging? It only logs in DEBUG builds anyway.
  • nomenclature consistency: all enum case/type names referring to HUI text/char displays should be named as their respective Display
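	
The allCases round-trip unit test mentioned above can be sketched with a toy stand-in (ToySwitch and its codec are hypothetical; HUISwitch's real encoding is more involved):

```swift
/// Toy stand-in for an encodable switch enum.
enum ToySwitch: UInt8, CaseIterable {
    case play = 0x5E, stop = 0x5D, record = 0x5F

    /// Encode to a wire byte (here simply the raw value).
    var encoded: UInt8 { rawValue }

    /// Decode a wire byte back to a case, or nil if unrecognized.
    static func decode(_ byte: UInt8) -> ToySwitch? { ToySwitch(rawValue: byte) }
}

// Round-trip every case through encode/decode.
var allRoundTrip = true
ToySwitch.allCases.forEach { sw in
    if ToySwitch.decode(sw.encoded) != sw { allRoundTrip = false }
}
```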

Deferred / Cancelled

  • add HUISurface full state push?
    (not to host of course, there's nothing to push. but to the handler that is subscribed to hui surface change notifications)
    • by way of generating static model change notifications for all state members?
  • HUIHostBank additions
    • add model
    • add full model state push
      (to client surface, ie: when a client appears after a break in ping message stream (loss of surface presence))
    • have to find out if HUI surfaces understand receiving 0xFF system reset and can reset themselves to a known state, then HUISurface could simply do a diff of its model state against known default state and only push the diff'd state elements

How to add MIDIKit to a project

I'd like to know where we add MIDIKit to the project. Do we add it to the project named "midi_module" or to the pods?


Simple API for attaching modules to each other

Proposal

Simple API to facilitate piping MIDI events between objects.

Initially, this would help simplify the syntax required to attach MIDI endpoints from the I/O layer to MIDIKit Extensions abstraction classes (HUI, MTC, etc.).

This would likely take one of two possible implementation approaches:

  • synthesize the required handlers to pass MIDI Events in and out of modules (internally using weak references to the module classes being attached), removing the need for the library consumer to implement manual handler or closure code

or

  • act as a mechanism to store weak references to the actual class being attached

API

Protocols

public protocol ReceivesMIDIEvents {
    func midiIn(event: MIDI.Event)
    func midiIn(events: [MIDI.Event])
}

extension ReceivesMIDIEvents {
    public func midiIn(events: [MIDI.Event]) {
        for event in events {
            midiIn(event: event)
        }
    }
}

public protocol SendsMIDIEvents {
    /// Handler used when calling `midiOut()` methods.
    typealias MIDIOutHandler = ((_ events: [MIDI.Event]) -> Void)
    /// Handler used when calling `midiOut()` methods.
    var midiOutHandler: MIDIOutHandler? { get set }
}

extension SendsMIDIEvents {
    internal func midiOut(_ event: MIDI.Event) {
        midiOutHandler?([event])
    }
    internal func midiOut(_ events: [MIDI.Event]) {
        midiOutHandler?(events)
    }
}

Protocol Adoptions

extension MIDI.IO.Input: ReceivesMIDIEvents { }
extension MIDI.IO.Output: SendsMIDIEvents { }
extension MIDI.IO.InputConnection: ReceivesMIDIEvents { }
extension MIDI.IO.OutputConnection: SendsMIDIEvents { }

Implementation

extension ReceivesMIDIEvents { // protocol
    public func attach(to module: ReceivesMIDIEvents) {
        // ...
    }
    public func removeAttachments() {
        // ...
    }
}

extension SendsMIDIEvents { // protocol
    public func attach(to module: SendsMIDIEvents) {
        // ...
    }
    public func removeAttachments() {
        // ...
    }
}

Changes to MIDI.IO.Manager

.addInput() and .addInputConnection() API would change slightly:

  • The method parameter receiveHandler: MIDI.IO.ReceiveHandler would become an Optional. If passed nil, no receive handler would initially be instantiated and received MIDI packets would not be handled (at that point).
  • A new method parameter attachTo: ReceivesMIDIEvents would be added which would facilitate the creation of the managed Input/InputConnection as usual, but also attach a receiver object at the same time.

Considerations

Multiple MIDI pipes in a module

It's conceivable that a future MIDIKit module (or a custom module developed by a library user) may make use of more than one input or output pipe. A possible future-proofing solution would be to add a parameter specifying which endpoint of a module to attach to when invoking the .attach() method.

Solution 1

A simple index system, starting at 0. If there is only one applicable endpoint, 0 or nil could be passed:

// SendsMIDIEvents protocol
func attach(to module: ReceivesMIDIEvents, index: Int? = nil)

Solution 2

A new value type, MIDIAttachment, giving type-safe names to each internal MIDI pipe.

public struct MIDIAttachment {
    public static var `default`: Self { MIDIAttachment("default") }
}

// SendsMIDIEvents protocol
func attach(to module: ReceivesMIDIEvents, port: MIDIAttachment = .default)

Ideally there would be a mechanism to strongly-type these identifiers to each specific MIDIKit module and not have them share the same global MIDIAttachment namespace.
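	
One way to scope identifiers per module is a phantom type parameter (a hypothetical sketch extending the proposal, not part of any existing API):

```swift
/// Attachment identifier scoped to a specific module type.
/// The `Module` parameter is a phantom type: it is never stored,
/// but it keeps each module's identifiers in their own namespace.
public struct MIDIAttachment<Module>: Equatable {
    public let name: String
    public init(_ name: String) { self.name = name }
}

/// Hypothetical module namespace.
enum HUISurfaceModule { }

extension MIDIAttachment where Module == HUISurfaceModule {
    static var `default`: Self { .init("default") }
}

// Identifiers for one module do not collide with another module's.
let port = MIDIAttachment<HUISurfaceModule>.default
```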

Solution 3

Modules contain as many properties as they require, each representing an internal MIDI "port".

Therefore, the existing proposal logic could still apply without requiring an index or key system to identify them when calling .attach().

And because they are properties, they carry the ability to have descriptive names and are strongly typed.

class SomeModule {
    var port1 // conforms to ReceivesMIDIEvents
    var port2 // conforms to ReceivesMIDIEvents
}

This could, however, introduce more complexity than desired: reference semantics would be required between SomeModule and those properties, which is less than desirable if the module's logic lives in the module itself (which it invariably will).

MidiKit Bluetooth Simple class example request

I wrote a MIDIKit React Native bridge module. Is there any license doc that I should include so that I can give credit to you for MIDIKit?

Also, is there a way you could provide a simple class showing how to use MIDIKit with Bluetooth? I see that you provided an example project, but I really don't understand how Bluetooth works. A simple class demonstrating MIDIKit with Bluetooth would help me understand how to create a React Native bridge that supports it.

The idea would be to have an example like this:
https://github.com/orchetect/MIDIKit/wiki/Simple-MIDI-Listener-Class-Example

Bug with payload.velocity

Describe the Bug
When receiving MIDI, there is a bug where payload.velocity provides the wrong value. This happens for noteOn, noteOff, cc, programChange, etc.

Example Code, Steps to Reproduce, Crash Logs, etc.
I am running the simple class example to receive MIDI messages. When a message comes in, the velocity values are wrong.

Screenshots
I attached a video showing the error

System Information
macOS 13.4

bug.mov

MIDI Event Filtering

Proposal

While some MIDI event filtering can be done using the native Swift .filter { } method on [MIDI.Event], more complex filtering tasks would be better served by methods implemented in MIDIKit.

Filter methods on [MIDI.Event]

Consistent with Swift's .filter { } method, these methods will produce a new [MIDI.Event] including only events that match the filter.

extension Collection where Element == MIDI.Event {
    public func filterChannelVoice() -> [Element]
    public func filterSystemExclusive() -> [Element]
    public func filterSystemCommon() -> [Element]
    public func filterSystemRealTime() -> [Element]
    
    public func filterChannelVoice(channel isIncluded: MIDI.UInt4) -> [Element]
    public func filterChannelVoice(channels isIncluded: [MIDI.UInt4]) -> [Element]
    public func filter(channel isIncluded: MIDI.UInt4) -> [Element]
    public func filter(channels isIncluded: [MIDI.UInt4]) -> [Element]
}

Inverse methods should be offered that accomplish the opposite.

Swift's .drop methods on collections are not an inverse of .filter, since they drop elements while a predicate is true rather than dropping all matching elements. A new, clear nomenclature should therefore be used; .filterOut may be the most succinct.

extension Collection where Element == MIDI.Event {
    public func filterOutChannelVoice() -> [Element]
    public func filterOutSystemExclusive() -> [Element]
    public func filterOutSystemCommon() -> [Element]
    public func filterOutSystemRealTime() -> [Element]
    
    public func filterOutChannelVoice(channel isExcluded: MIDI.UInt4) -> [Element]
    public func filterOutChannelVoice(channels isExcluded: [MIDI.UInt4]) -> [Element]
    public func filterOut(channel isExcluded: MIDI.UInt4) -> [Element]
    public func filterOut(channels isExcluded: [MIDI.UInt4]) -> [Element]
}
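	
A minimal working sketch of the filter/filterOut pair on a toy event type (ToyEvent and these methods are illustrative only; they mirror the proposal's naming, not MIDIKit's shipping API):

```swift
/// Toy event type covering three of the categories above.
enum ToyEvent: Equatable {
    case noteOn(channel: UInt8)   // channel voice
    case sysEx                    // system exclusive
    case timingClock              // system real time
}

extension Collection where Element == ToyEvent {
    /// Keeps only channel voice events.
    func filterChannelVoice() -> [Element] {
        filter { if case .noteOn = $0 { return true }; return false }
    }

    /// Inverse: drops all channel voice events, unlike Swift's
    /// `drop(while:)`, which only drops a leading run of matches.
    func filterOutChannelVoice() -> [Element] {
        filter { if case .noteOn = $0 { return false }; return true }
    }
}

let events: [ToyEvent] = [.noteOn(channel: 0), .sysEx, .timingClock]
```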

Filter Definition Enum and EventFilter Object

Additionally, new types could be introduced:

  • new Filter enum describing a filter action, mirroring all of the ones offered as [MIDI.Event] collection methods as described above
  • new EventFilter class containing one or more of the filter definitions could be introduced.

An EventFilter instance could then optionally be attached to an object that conforms to SendsMIDIEvents or ReceivesMIDIEvents, either at its midiIn or midiOut node. The filter would automatically process MIDI events before midiIn receives them, or on the way out of midiOut before it returns the events.

This brings the benefit of endpoint/module filters being applied automatically, as well as being able to stack multiple filter criteria in one encapsulated mechanism.

extension MIDI {
    open class EventFilter {
        public var filters: [MIDI.Event.Filter]
        public func filter(events: [MIDI.Event]) -> [MIDI.Event] {
            // iterate through filters array and return the result
        }
    }
}

extension MIDI.Event {
    public enum Filter {
        // filter
        case filterChannelVoice
        case filterSystemExclusive
        case filterSystemCommon
        case filterSystemRealTime
        
        case filterChannelVoice(channel: MIDI.UInt4)
        case filterChannelVoice(channels: [MIDI.UInt4])
        case filter(channel: MIDI.UInt4)
        case filter(channels: [MIDI.UInt4])

        // filter out
        case filterOutChannelVoice
        case filterOutSystemExclusive
        case filterOutSystemCommon
        case filterOutSystemRealTime
        
        case filterOutChannelVoice(channel: MIDI.UInt4)
        case filterOutChannelVoice(channels: [MIDI.UInt4])
        case filterOut(channel: MIDI.UInt4)
        case filterOut(channels: [MIDI.UInt4])
    }
}

Additional filter ideas

Channel Voice

  • Channel Voice status type filter
    • notes (note on / note off)
    • polyAftertouch
    • cc
    • programChange
    • chanAftertouch
    • pitchBend
  • Note number range limiting (lowerNote...upperNote)
  • CC filter

`InputConnection` / `OutputConnection`: add multiple endpoint support

MIDI.IO.Manager

  • Modify .addInputConnection method API:
    public func addInputConnection(
        toOutputs: [MIDI.IO.EndpointIDCriteria],
        tag: String,
        receiveHandler: MIDI.IO.ReceiveHandler
    ) throws
  • Modify .addOutputConnection method API:
    public func addOutputConnection(
        toInputs: [MIDI.IO.EndpointIDCriteria],
        tag: String
    ) throws

MIDI.IO.InputConnection

  • Alter old toOutput parameter to be an array; can take 0 or more MIDI.IO.EndpointIDCriteria
    internal init(toOutputs: [MIDI.IO.EndpointIDCriteria],
                  receiveHandler: ReceiveHandler)
  • Add .addConnection(toOutput: MIDI.IO.EndpointIDCriteria)
  • Add .removeConnection(toOutput: MIDI.IO.EndpointIDCriteria)
  • Add .connections computed property to return array of connected endpoints

MIDI.IO.OutputConnection

  • Alter old toInput parameter to be an array; can take 0 or more MIDI.IO.EndpointIDCriteria
    internal init(toInputs: [MIDI.IO.EndpointIDCriteria])
  • Add .addConnection(toInput: MIDI.IO.EndpointIDCriteria)
  • Add .removeConnection(toInput: MIDI.IO.EndpointIDCriteria)
  • Add .connections computed property to return array of connected endpoints

Broken link in docs for Event Filters

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 13 Ventura

Xcode Version(s)

Xcode 14

Description

Hi, I found a small broken link on this page:
https://orchetect.github.io/MIDIKit/documentation/midikit/midimanager-creating-ports

There is a link for Event Filters that routes to the following broken URL:
https://orchetect.github.io/MIDIKit/documentation/midikit/midikit-getting-started/Event-Filters

Crash Logs, Screenshots or Other Attachments (if applicable)

image

MIDI 1.0 MPE / MIDI 2.0 Per-Note Abstractions

MIDI 1.0 MPE / MIDI 2.0 Per-Note Abstractions

  • Explore feasibility of adding MPE / MIDI 2.0 Per-Note Abstractions.

M2-104-UM Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol, page 53:

Using Note Number Rotation, Per-Note Pitch, and Per-Note Management Message for Independent Per-Note Expression

A MIDI 2.0 Protocol Sender can have fully independent control over individual Notes, even applied to simultaneous Notes on the same pitch. MIDI Polyphonic Expression (MPE) on the MIDI 1.0 Protocol uses a Channel Rotation mechanism for this kind of flexible expressive control with up to 16 notes of polyphony. In the MIDI 2.0 Protocol, a Note Number Rotation mechanism can replace the Channel Rotation mechanism for some applications. This improves on MPE by utilizing only a single MIDI Channel while providing polyphony of up to 128 notes.

MIDI Thru Connections are not supported on this platform due to Core MIDI bugs.

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 13 Ventura

Xcode Version(s)

Xcode 14

Description

Hello, I am trying to send a MIDI message via USB to a MIDI controller.

When I try to setup a thru connection using the midiManager.addThruConnection method, it throws the following error at runtime:

Fatal error: Error raised at top level: MIDI Thru Connections are not supported on this platform due to Core MIDI bugs.

I saw that this issue was deferred in #19, but I wasn't sure which versions of macOS are compatible/incompatible at runtime vs. build time. Any help would be appreciated.

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

SwiftUI preview error when I include the MIDIKIT package

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 12 Monterey

Xcode Version(s)

Xcode 14

Description

I tried finding a solution to this but couldn't. I hope it is not a silly one!

Once I include the swift package, SwiftUI Preview stops working. I attached a screenshot and the diagnostic error message below.
Thanks

Xcode Version 14.0.1 (14A400)

SettingsError: noExecutablePath(<IDESwiftPackageStaticLibraryProductBuildable:ObjectIdentifier(0x0000600013c4c0f0):'MIDIKit'>)



Enable build on tvOS & watchOS

Actionable Items

  • Enable build on tvOS
  • Enable build on watchOS
  • Add tvOS & watchOS builds to GitHub CI

Proposal

Although most of Core MIDI is unavailable on tvOS and watchOS, it would be ideal to allow MIDIKit to build on those platforms by conditionally fencing off incompatible code or marking it unavailable with @available.

Even though the majority of functionality will be unavailable, the benefit to this would be allowing cross-platform applications or packages whose targets include tvOS and/or watchOS to use MIDIKit as a dependency. This also allows compatibility of MIDIKit extensions like MIDIKitSMF (Standard MIDI File) to function on those platforms where the unavailable Core MIDI I/O methods are not needed.

Essentially what would be left remaining and usable would be:

  • Global proprietary types (MIDI.UInt7, etc.)
  • MIDI.Event abstracts, including filters
  • MIDI.Note abstract
  • MIDI.IO.APIVersion and MIDI.IO.ProtocolVersion
  • MIDI.IO.Packet and its nested types

Implementation

It may make sense to simply remove most of the I/O module by conditional compilation. The downside is the lack of symbolication, but in theory that shouldn't be a problem.

#if !os(tvOS) && !os(watchOS)
// ...
#endif

Otherwise, individual objects, methods and properties could be marked @available. The downside would be added clutter to the MIDIKit codebase.

This may also not even be possible, because some of the Core MIDI symbols/headers are completely missing on tvOS/watchOS.

@available(macOS 10.12, iOS 10.0, tvOS 9999, watchOS 9999, *)
func abc() { }

`MIDIPacketNext` crash

Symptoms

Rare crashes seem to be caused by MIDIPacketNext with this crash log backtrace:

Thread 1 Crashed:
0   com.orchetect.TestApp         	0x00000001001c9784 MIDIPacketList.PacketListIterator.next() + 484
1   com.orchetect.TestApp         	0x0000000100146176 protocol witness for MIDIIOReceiveHandler.midiReadBlock(_:_:) in conformance MIDIReceiveHandler + 86
2   com.orchetect.TestApp         	0x00000001001b4c1c partial apply for implicit closure #2 in implicit closure #1 in MIDIIO.InputConnection.connect(in:) + 60
3   com.orchetect.TestApp         	0x00000001001b3787 thunk for @escaping @callee_guaranteed (@unowned UnsafePointer<MIDIPacketList>, @unowned UnsafeMutableRawPointer?) -> () + 23
4   com.apple.audio.midi.CoreMIDI 	0x00007fff34d0a19d LocalMIDIReceiverList::HandleMIDIIn(unsigned int, unsigned int, void*, MIDIPacketList const*) + 117
5   com.apple.audio.midi.CoreMIDI 	0x00007fff34d0a057 MIDIProcess::RunMIDIInThread() + 209
6   com.apple.audio.midi.CoreMIDI 	0x00007fff34d08d5e XThread::RunHelper(void*) + 10
7   com.apple.audio.midi.CoreMIDI 	0x00007fff34cec331 CAPThread::Entry(CAPThread*) + 77
8   libsystem_pthread.dylib       	0x00007fff6e216109 _pthread_start + 148
9   libsystem_pthread.dylib       	0x00007fff6e211b8b thread_start + 15

Repro

The issue is elusive but can generally be caused in any of the following contexts:

  • When stopping Reaper, but it can take up to 10 minutes of testing before the issue presents
    • Maybe sometimes it sends malformed packet lists / packets?
  • When receiving large MIDI packets (> 256 bytes)

Attempts to Solve

  • Swift compiler optimization appeared to play a role in this behavior, but further testing showed it to be unrelated.

MIDIEventLogger example in Advanced Examples folder won't build

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 13 Ventura

Xcode Version(s)

Xcode 14

Description

Getting quite a few 'Call can throw, but it is not marked with 'try' and the error is not handled' error messages in SendMIDIEventsSystemExclusiveView at:

Button { sendEvent(.sysEx7( // <- 'Call can throw, but it is not marked with 'try' and the error is not handled'

Also getting 'Value of type 'Int' has no member 'hexString'' errors:

let groupNumHex = $0.hexString(padTo: 1, prefix: true) // <- 'Value of type 'Int' has no member 'hexString''

hth

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

Build Fails

I have a React Native project and am trying to use this package to create a native MIDI module. The problem is that once I add this package, the build fails. It fails even without importing or using it in any of my source code - simply adding the package to the project causes my build to fail. I am not sure if there are configurations that are required or should be avoided.

I attached my Xcode logs
Build target test_react-macOS_2022-06-03T02-11-57.txt

Screen Shot 2022-06-03 at 2 25 03 AM

Example: Handling/Parsing Core MIDI Notifications

  • Demonstrate the notification handler closure in MIDIManager
  • Explain the notifications (MIDIIONotification)
  • Show how to parse the metadata associated with each notification type
  • Add the project to the examples CI workflow
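The first two items above could be sketched as follows. This is a minimal sketch assuming a MIDIKit import; the exact set of MIDIIONotification cases and their associated metadata should be confirmed against the MIDIKit documentation.

```swift
import MIDIKit

// Pass a notification handler closure when creating the MIDIManager.
// The handler receives MIDIIONotification values describing system changes.
let midiManager = MIDIManager(
    clientName: "NotificationsExample",
    model: "ExampleApp",
    manufacturer: "MyCompany"
) { notification, manager in
    switch notification {
    case .setupChanged:
        print("The MIDI system setup changed.")
    default:
        // Other cases (such as endpoints being added or removed) carry
        // metadata describing the affected object.
        print("Core MIDI notification:", notification)
    }
}
```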

AddInputConnection stopped working

Please Confirm

  • I have reviewed the MIDIKit Documentation which contains descriptive guides and extensive API reference
  • I have searched Issues and Discussions to see if the same question has already been asked

macOS Version(s) Used to Build

macOS 12 Monterey

Xcode Version(s)

Xcode 13.4.1

Description

I am having issues getting addInputConnection to work with the latest version of MIDIKit. It looks as though this worked in version 0.60 but not with the latest version.

public let midiManager = MIDIManager(
    clientName: "ELMC Midi Deck",
    model: "Midi Deck",
    manufacturer: "ELMC"
)

let connection = "inputConnection"

do {
    try midiManager.addInputConnection(
        toOutputs: [.uniqueID(88734594)],
        tag: connection,
        mode: .definedEndpoints,
        filter: .default(),
        receiver: .events { [weak self] events in
            events.forEach { self?.handleMIDI($0) }
        }
    )
} catch {
    print("error setting up connection")
}

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

🎹 MIDI 1.0 Events support

  • MIDI 1.0 Spec events parser
    • Unit tests
  • MIDI.Event types with .rawBytes generators
    • Unit tests
  • Add abstracted constructors and properties
    • CC
    • RPN/NRPN
    • Unit tests
  • New Events ReceiveHandler
  • New EventsLogging ReceiveHandler
  • API: Refactoring MIDIKit modules to consume and emit MIDI.Event instead of raw bytes
    • HUI
    • Sync: MTC (Encoder, Decoder, Generator, Receiver)
  • Documentation

💡 After completion, MIDIKit release 0.2.0 will have essential MIDI 1.0 events support.

API improvement: `*Endpoint` overloads for `Manager.add*()` and endpoint retrieval properties for those classes

Allow the direct use of MIDIKit Endpoint types in method overloads that currently accept MIDI.IO.EndpointIDCriteria.

MIDI.IO.Manager

// Add overload:
public func addInputConnection(
    toOutputs: [MIDI.IO.OutputEndpoint],
    tag: String,
    receiveHandler: MIDI.IO.ReceiveHandler
) throws

// Add overload:
public func addOutputConnection(
    toInputs: [MIDI.IO.InputEndpoint],
    tag: String
) throws

Which implies adding corresponding initializers to those classes:

MIDI.IO.InputConnection

// Add init:
internal init(toOutputs: [MIDI.IO.OutputEndpoint],
              receiveHandler: ReceiveHandler)

MIDI.IO.OutputConnection

// Add init:
internal init(toInputs: [MIDI.IO.InputEndpoint])

Additionally, it then becomes natural to add computed properties to return those MIDIKit *Endpoints.

(MIDI.IO.ThruConnection already implements MIDIKit *Endpoint arrays.)

extension MIDI.IO.Input {
    // return itself as an `InputEndpoint`
    public var endpoint: InputEndpoint { get }
}

extension MIDI.IO.Output {
    // return itself as an `OutputEndpoint`
    public var endpoint: OutputEndpoint { get }
}

extension MIDI.IO.InputConnection {
    public var connectedEndpoints: [OutputEndpoint] { get }
}

extension MIDI.IO.OutputConnection {
    public var connectedEndpoints: [InputEndpoint] { get }
}

Expose inits for CoreMIDI packets and packet lists

MIDIKit contains various internal constructors for:

  • MIDIPacketList (MIDI 1.0, old API)
  • MIDIPacket (MIDI 1.0, old API)
  • MIDIEventList (MIDI 2.0, new API)
  • MIDIEventPacket (MIDI 2.0, new API)

Currently MIDIKit obfuscates them: MIDIEvents are passed directly to an endpoint's send(event:) method, and the packet lists/packets are created internally.

There are advanced use cases where a library consumer would benefit from these methods being made public, including methods that construct packets from a MIDIEvent and return one of the types mentioned above.

The goal of MIDIKit is to avoid exporting Core MIDI types and methods where possible, in order to provide a cleaner namespace and not clutter the autocomplete space with Core MIDI types.

To that end, it would be good to investigate whether this can be done without exporting CoreMIDI.

Maybe @_implementationOnly import CoreMIDI - but will the methods become available in a context where the user imports CoreMIDI?
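To make the tradeoff concrete, here is one hypothetical shape such a public constructor could take. None of these signatures exist in MIDIKit today; the extension, parameter names, and behavior are purely illustrative.

```swift
import CoreMIDI

// Hypothetical: allow constructing a Core MIDI packet list directly from
// raw MIDI 1.0 bytes, for advanced consumers who manage I/O themselves.
// Exposing this requires either exporting Core MIDI types (against
// MIDIKit's namespacing goal) or wrapping the result in a MIDIKit type.
extension MIDIPacketList {
    init(midi1RawBytes bytes: [UInt8], timeStamp: MIDITimeStamp) {
        // Illustrative sketch only: a real implementation would use
        // MIDIPacketListInit / MIDIPacketListAdd to assemble the packets.
        fatalError("sketch - not implemented")
    }
}
```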

Add support for React Native

It would be nice if you could add support for React Native (iOS, and React Native for macOS):
a MIDIKit React Native bridge for iOS and macOS.

MIDI 2.0 RPN/NRPN

  • Refactor RPN to RegisteredController
    • need CustomStringConvertible for friendly debug output
  • Refactor NRPN to AssignableController
    • need CustomStringConvertible for friendly debug output
  • RegisteredController and AssignableController could use:
    • inits from raw param to form a proper param enum case (mostly for RPN)
    • with accompanying additional MIDIEvent static constructors .rpn(), .nrpn()
  • Add rpn and nrpn to MIDIEvent filters
  • syntax updated in old unit tests
  • Update MIDIKitSMF for new bridged RPN/NRPN MIDIFileEvent types
  • add new unit tests
  • API evolution 0.7.3 → 0.8.0 @available() entries added
  • update example projects - EventsParsing especially
  • run script to bump all example projects to use MIDIKit dependency from 0.8.0
  • make sure docs build
  • output new docs to GitHub Pages
  • Release 0.8.0

`MIDIFile`: Add structure validation

Motivation

MIDIFile allows a fair degree of freedom in how its contents are constructed. Reading the documentation is necessary to understand and apply certain conventions.

This may lead to conditions where its structure does not cause errors, but it is technically illegal in terms of MIDI File Spec convention.

Some of these conditions are minor - other 3rd-party MIDI software or libraries/parsers may slacken their error conditions to allow for them. But some of these conditions are more undesirable and may produce incompatibilities with 3rd-party software.

For example:

  • It is possible to set the MIDIFile format to Type 0, which technically requires exactly one track chunk - no more, no less. However, many parsers will allow multiple tracks, ignoring the format and treating the file as Type 1 (simultaneous multitrack) in this case.
  • In a Type 1 format file, tempo and time signature events are expected to be exclusively within the first track, but it is possible to insert them on multiple tracks
  • It is possible to insert multiple MIDI events on a track that should only be used once, such as SMPTE offset for example
  • It is possible to insert MIDI events at any time offset on a track that are conventionally expected to be at time 0, such as Track Name or SMPTE offset for example

Proposal

These actions are not prevented, and are not error conditions in and of themselves. One reason is that this allows parsing MIDI files that contain such illegalities while preserving the data intact and not aborting file parsing - in such a case, the file's data itself is not malformed, but the event definitions may be in a logical ordering that is unexpected or non-standard.

It could be beneficial to have an added validation method on MIDIFile to heuristically analyze the model contents and compile a list of warnings and/or recommendations based on all known conventions. This is a separate concept from parsing errors which are generally unrecoverable. But some parsing errors (such as a Type 0 file with more than 1 track) could be converted to a validation warning and not a fatal error condition.

This could serve both to educate the developer using MIDIKit, helping diagnose issues during development, and to provide granular information that could be acted upon or presented in a user interface to inform the user that data in the MIDI file may not be properly structured.
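A possible shape for such an API is sketched below. The type and case names are hypothetical and not part of MIDIKit's current interface; they only illustrate the warnings-not-errors design described above.

```swift
extension MIDIFile {
    /// Hypothetical: non-fatal findings from heuristic structure analysis.
    enum ValidationWarning {
        case multipleTracksInType0File
        case tempoOrTimeSignatureOutsideFirstTrack(trackIndex: Int)
        case duplicateSingleUseEvent(description: String)
        case eventNotAtTimeZero(description: String)
    }

    /// Hypothetical: analyze the file model against MIDI File Spec
    /// conventions and return warnings rather than throwing errors,
    /// so that non-standard but readable files remain parseable.
    func validate() -> [ValidationWarning] {
        var warnings: [ValidationWarning] = []
        // e.g. check format type vs. track count, event placement, etc.
        return warnings
    }
}
```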

MIDI Notification - Object `removed` may return empty metadata

Sometimes a MIDIManager produces a removed notification with invalid data.

Example notification log:

Core MIDI notification: removed(MIDIOutputEndpoint(name: "MIDI Toolbox Out", uniqueID: -1844709742))
Core MIDI notification: removed(MIDIInputEndpoint(name: "", uniqueID: 0))

The likely cause is that internally, MIDIManager is querying the data for the removed object (endpoint in this example) from Core MIDI at the time the notification is received. Depending on how fast Core MIDI purges the endpoint, it may or may not still exist at the time this information is queried.

The solution is for MIDIManager to reference its endpoint cache to supply the endpoint's metadata in the notification prior to its endpoint cache being updated.
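The proposed fix could be sketched roughly as below. The property and method names are assumptions about MIDIKit's internals, shown only to illustrate the cache-before-refresh ordering.

```swift
// Illustrative only: on receiving a removal notification, consult the
// endpoint cache captured *before* the removal, then refresh the cache.
func handleEndpointRemoved(_ removedRef: MIDIEndpointRef) {
    // 1. Look up the endpoint's metadata in the stale (pre-removal)
    //    cache instead of querying Core MIDI, which may already have
    //    purged the endpoint (yielding name: "" and uniqueID: 0).
    if let cached = endpointCache.first(where: { $0.coreMIDIObjectRef == removedRef }) {
        notificationHandler?(.removed(cached))
    }

    // 2. Only after emitting the notification, update the cache so the
    //    removed endpoint is dropped.
    refreshEndpointCache()
}
```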

Modifying `mode` and `filter` properties does not cause managed connections to update

After a managed connection is created, modifying the mode or filter property does not cause the connection to update and take the new settings into effect.

Affects InputConnection and OutputConnection.

let conn = midiManager.managedInputConnections["InputConnection1"]
conn?.removeAllOutputs()
conn?.mode = .allEndpoints
conn?.filter = .owned()
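One generic way to address this, sketched outside of MIDIKit's actual internals (all type and method names below are illustrative), is to observe the properties with didSet and re-resolve the connection when they change:

```swift
enum ConnectionMode { case definedEndpoints, allEndpoints }
struct EndpointFilter { var owned: Bool = false }

final class ManagedInputConnection {
    var mode: ConnectionMode {
        didSet { resolveAndReconnect() }
    }
    var filter: EndpointFilter {
        didSet { resolveAndReconnect() }
    }

    init(mode: ConnectionMode, filter: EndpointFilter) {
        self.mode = mode
        self.filter = filter
    }

    private func resolveAndReconnect() {
        // Re-evaluate which endpoints match the new mode/filter, then
        // disconnect removed endpoints and connect newly matching ones.
    }
}
```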

MIDISystemInfo Example Project Task List

MIDISystemInfo Example Project Task List

  • When a Device is selected in the list view, show its Entities
  • iOS and Mac Catalyst ports? These should all work with very minimal changes, as the SwiftUI code is mostly reusable.

Example code feature request

I am using your MIDIKit package and am having a little bit of an issue with it.

You provided a good example of how to use the package, but I would like to know if you could provide one more example showing how to receive messages in its simplest form. Currently it looks like you are using .eventLogging in the example.
I am trying to use the following but get an error:
Value of type '(MidiModule) -> () -> MidiModule' has no member 'receivedMIDIEvent'

receiveHandler: .events { [weak self] events in
    // Note: this handler will be called on a background thread,
    // so call the next line on main if it may result in UI updates
    DispatchQueue.main.async {
        events.forEach { self?.receivedMIDIEvent($0) }
    }
}

Can you provide an example where everything is in a single Swift class in one file? Also, can you provide an example using the receiveHandler?

import MIDIKit

class MidiModule: NSObject {
    var midiManager: MIDI.IO.Manager = {
        let newManager = MIDI.IO.Manager(
            clientName: "MIDIEventLogger",
            model: "LoggerApp",
            manufacturer: "Orchetect") { notification, manager in
                print("Core MIDI notification:", notification)
            }
        do {
            logger.debug("Starting MIDI manager")
            try newManager.start()
        } catch {
            logger.default(error)
        }
        
        // uncomment this to test different API versions or limit to MIDI 1.0 protocol
        //newManager.preferredAPI = .legacyCoreMIDI
        
        return newManager
    }()

    // ... whatever code is needed to receive MIDI messages using the receiveHandler
}
