orchetect / MIDIKit
🎹 Modern multi-platform Swift CoreMIDI wrapper with MIDI 2.0 support.
License: MIT License
Sometimes a `MIDIManager` produces a `removed` notification with invalid data.
Example notification log:

```
Core MIDI notification: removed(MIDIOutputEndpoint(name: "MIDI Toolbox Out", uniqueID: -1844709742))
Core MIDI notification: removed(MIDIInputEndpoint(name: "", uniqueID: 0))
```
The likely cause is that internally, `MIDIManager` is querying the data for the removed object (endpoint in this example) from Core MIDI at the time the notification is received. Depending on how fast Core MIDI purges the endpoint, it may or may not still exist at the time this information is queried.

The solution is for `MIDIManager` to reference its endpoint cache to supply the endpoint's metadata in the notification prior to its endpoint cache being updated.
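As a hypothetical sketch of that cache-first approach (the types and names here are illustrative, not actual MIDIKit internals):

```swift
// Illustrative sketch: cache endpoint metadata keyed by unique ID so that a
// "removed" notification can be populated from the cache even after Core MIDI
// has already purged the endpoint (which would otherwise yield "" / 0).
struct CachedEndpoint: Equatable {
    let name: String
    let uniqueID: Int32
}

final class EndpointCache {
    private var cache: [Int32: CachedEndpoint] = [:]

    // Call whenever an "added" notification or a rescan provides live endpoint data.
    func update(_ endpoint: CachedEndpoint) {
        cache[endpoint.uniqueID] = endpoint
    }

    // On a "removed" notification, look up the cached metadata *before*
    // evicting it, instead of querying Core MIDI for a possibly-purged object.
    func removeReturningCached(uniqueID: Int32) -> CachedEndpoint? {
        cache.removeValue(forKey: uniqueID)
    }
}
```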
macOS 12 Monterey
Xcode 14
I tried finding a solution to this but couldn't. I hope it's not a silly one!
Once I include the swift package, SwiftUI Preview stops working. I attached a screenshot and the diagnostic error message below.
Thanks
Xcode Version 14.0.1 (14A400)

```
SettingsError: noExecutablePath(<IDESwiftPackageStaticLibraryProductBuildable:ObjectIdentifier(0x0000600013c4c0f0):'MIDIKit'>)
```
I have a React Native project and am trying to use this package to create a native MIDI module. I am having a problem where the build fails once I add this package. It fails even without importing or using it in any of my source code. Simply adding the package to the project causes my build to fail. I am not sure if there are configurations that are required or should be avoided.
I attached my Xcode logs
Build target test_react-macOS_2022-06-03T02-11-57.txt
Roadmap to replacing AudioKit MIDI:

- `MIDIManager`: Add remaining accessors for things like Core MIDI ExternalDevices, etc.
- `MIDIIOObjectProtocol`: Add remaining `set...()` property methods (counterparts to `get...()` property methods)
- `MIDIEventPacket` in a single `MIDIEventList` (similar to existing `MIDIPacketList` method)
- `ReceivesMIDIEvents` and `SendsMIDIEvents` protocols and how they are used internally.
- For `events: [MIDIEvent]` variations of send or receive, I/O should attempt to pack all events into a single `MIDIPacketList`/`MIDIEventList`
- `getConnectionUniqueID()` can be `Int32` or `CFData` - see Core MIDI docs:
  > The value provided may be an integer. To indicate that a driver connects to multiple external objects, pass the array of big-endian SInt32 values as a CFData object.
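For illustration, decoding the `CFData` case described in that quote could look like this pure-Swift sketch (the helper name is hypothetical, not MIDIKit API):

```swift
// Hypothetical helper: decode a connection unique ID value that arrived as raw
// bytes containing an array of big-endian SInt32 values, per the Core MIDI docs.
func decodeConnectionUniqueIDs(bigEndianBytes bytes: [UInt8]) -> [Int32]? {
    // The payload must be a whole number of 4-byte SInt32 values.
    guard bytes.count % 4 == 0 else { return nil }
    var result: [Int32] = []
    for i in stride(from: 0, to: bytes.count, by: 4) {
        // Assemble each big-endian 32-bit word, then reinterpret as signed.
        let value = (UInt32(bytes[i]) << 24)
            | (UInt32(bytes[i + 1]) << 16)
            | (UInt32(bytes[i + 2]) << 8)
            | UInt32(bytes[i + 3])
        result.append(Int32(bitPattern: value))
    }
    return result
}
```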
- `compoundEventParsing: Bool` property to `MIDI1Parser` that buffers received events that match certain RPN, NRPN messages and returns compound events (ie: `.rpn()` instead of a series of `.cc()` events) upon receipt of all messages that comprise the compound event.
- `MIDIEvent` consumers? (`ReceivesMIDIEvents` protocol) Or at least provide it in the receive handler.
- `MIDIManager` (`MIDIIONotification`)

Rare crashes seem to be caused by `MIDIPacketNext`, with this crash log backtrace:
```
Thread 1 Crashed:
0 com.orchetect.TestApp 0x00000001001c9784 MIDIPacketList.PacketListIterator.next() + 484
1 com.orchetect.TestApp 0x0000000100146176 protocol witness for MIDIIOReceiveHandler.midiReadBlock(_:_:) in conformance MIDIReceiveHandler + 86
2 com.orchetect.TestApp 0x00000001001b4c1c partial apply for implicit closure #2 in implicit closure #1 in MIDIIO.InputConnection.connect(in:) + 60
3 com.orchetect.TestApp 0x00000001001b3787 thunk for @escaping @callee_guaranteed (@unowned UnsafePointer<MIDIPacketList>, @unowned UnsafeMutableRawPointer?) -> () + 23
4 com.apple.audio.midi.CoreMIDI 0x00007fff34d0a19d LocalMIDIReceiverList::HandleMIDIIn(unsigned int, unsigned int, void*, MIDIPacketList const*) + 117
5 com.apple.audio.midi.CoreMIDI 0x00007fff34d0a057 MIDIProcess::RunMIDIInThread() + 209
6 com.apple.audio.midi.CoreMIDI 0x00007fff34d08d5e XThread::RunHelper(void*) + 10
7 com.apple.audio.midi.CoreMIDI 0x00007fff34cec331 CAPThread::Entry(CAPThread*) + 77
8 libsystem_pthread.dylib 0x00007fff6e216109 _pthread_start + 148
9 libsystem_pthread.dylib 0x00007fff6e211b8b thread_start + 15
```
The issue is elusive but can generally be caused in any of the following contexts:

- `RPN` to `RegisteredController`
  - `CustomStringConvertible` for friendly debug output
- `NRPN` to `AssignableController`
  - `CustomStringConvertible` for friendly debug output
- `RegisteredController` and `AssignableController` could use:
- `rpn` and `nrpn` to MIDIEvent filters
- `MIDIFileEvent` types
- `@available()` entries added

It would be nice if you could add support for React Native (iOS and React Native for macOS).
MidiKit React Native Bridge for iOS and macOS
I have a lot of MIDI files working. But one of them gets a crash.
Originally posted by @SinanKarasu in #171 (comment)
There's a potential issue with receiving MIDI using new Core MIDI API on macOS Monterey, specifically on Intel architecture.
The unit tests on Intel Monterey are passing, and the issue does not seem to appear on Apple Silicon (M1). So it needs further investigation.
A temporary workaround if needed is to revert to the legacy Core MIDI API before starting the `Manager`, until the issue can be addressed:

```swift
midiManager.preferredAPI = .legacyCoreMIDI
try midiManager.start()
```
Need to add an example app demonstrating how to implement bluetooth MIDI device connections on iOS.
`CABTMIDICentralViewController` and/or `CABTMIDILocalPeripheralViewController` is involved:
https://developer.apple.com/documentation/coreaudiokit/cabtmidicentralviewcontroller
So it wouldn't be a feature of MIDIKit; it's an implementation in the app to get it working. This should be an iOS example app.

(macOS is different: you just set up Bluetooth MIDI devices in Audio MIDI Setup. On iOS it's per-app, and the app has to implement it.)
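As a rough sketch of what such an iOS example could contain (this is app-level code and one possible approach, not MIDIKit API; the helper type is hypothetical):

```swift
import CoreAudioKit
import UIKit

// Hypothetical app-level helper: present Apple's stock Bluetooth MIDI device
// picker. Once the user connects a peripheral here, it appears as a regular
// Core MIDI endpoint that MIDIKit's MIDIManager can then see and connect to.
final class BluetoothMIDIHelper {
    static func presentCentral(from presenter: UIViewController) {
        // CABTMIDICentralViewController scans for and connects to
        // Bluetooth MIDI peripherals on the app's behalf.
        let controller = CABTMIDICentralViewController()
        let nav = UINavigationController(rootViewController: controller)
        nav.modalPresentationStyle = .popover
        presenter.present(nav, animated: true)
    }
}
```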
Describe the Bug
When receiving MIDI, there is a bug where `payload.velocity` provides the wrong value. This happens for noteOn, noteOff, cc, programChange, etc.
Example Code, Steps to Reproduce, Crash Logs, etc.
I am running the simple class example to receive MIDI messages. When a message arrives, the velocity values are wrong.
Screenshots
I attached a video showing the error
System Information
macOS 13.4
While some MIDI event filtering can be done using the native Swift `.filter { }` method on `[MIDI.Event]`, more complex filtering tasks would be better as methods implemented in MIDIKit.
[MIDI.Event]
Consistent with Swift's `.filter { }` method, these methods will produce a new `[MIDI.Event]` including only events that match the filter.
```swift
extension Collection where Element == MIDI.Event {
    public func filterChannelVoice() -> [Element]
    public func filterSystemExclusive() -> [Element]
    public func filterSystemCommon() -> [Element]
    public func filterSystemRealTime() -> [Element]
    public func filterChannelVoice(channel isIncluded: MIDI.UInt4) -> [Element]
    public func filterChannelVoice(channels isIncluded: [MIDI.UInt4]) -> [Element]
    public func filter(channel isIncluded: MIDI.UInt4) -> [Element]
    public func filter(channels isIncluded: [MIDI.UInt4]) -> [Element]
}
```
Inverse methods should be offered that accomplish the opposite. Swift's `.drop` method on collections is not an inverse of `.filter`, since it drops while a predicate is true instead of dropping all matching elements in the array, so a new, clear nomenclature should be used. `.filterOut` may be the most succinct.
```swift
extension Collection where Element == MIDI.Event {
    public func filterOutChannelVoice() -> [Element]
    public func filterOutSystemExclusive() -> [Element]
    public func filterOutSystemCommon() -> [Element]
    public func filterOutSystemRealTime() -> [Element]
    public func filterOutChannelVoice(channel isExcluded: MIDI.UInt4) -> [Element]
    public func filterOutChannelVoice(channels isExcluded: [MIDI.UInt4]) -> [Element]
    public func filterOut(channel isExcluded: MIDI.UInt4) -> [Element]
    public func filterOut(channels isExcluded: [MIDI.UInt4]) -> [Element]
}
```
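To illustrate the distinction driving the naming choice, here is a pure-Swift comparison of `drop(while:)` against filtering out all matches (toy integer "channels" stand in for MIDI events):

```swift
// drop(while:) only drops the matching *prefix*, not every matching element,
// which is why it cannot serve as an inverse of filter.
let channels: [Int] = [0, 0, 1, 0, 2]

// Dropping stops at the first non-match, so the later 0 survives:
let dropped = Array(channels.drop(while: { $0 == 0 }))  // [1, 0, 2]

// The proposed filterOut semantics would remove *all* matches:
let filteredOut = channels.filter { $0 != 0 }           // [1, 2]
```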
Filter Definition Enum and EventFilter Object

Additionally, new types could be introduced:
- A `Filter` enum describing a filter action, mirroring all of the ones offered as `[MIDI.Event]` collection methods as described above
- An `EventFilter` class containing one or more of the filter definitions could be introduced.

An `EventFilter` instance could then optionally be attached to an object that conforms to `SendsMIDIEvents` or `ReceivesMIDIEvents`, either at its midiIn or midiOut node. The filter would automatically process MIDI events before midiIn receives them, or on the way out of midiOut before it returns the events.
This brings the benefit of endpoint/module filters being applied automatically, as well as being able to stack multiple filter criteria in one encapsulated mechanism.
```swift
extension MIDI {
    open class EventFilter {
        public var filters: [MIDI.Event.Filter]

        public func filter(events: [MIDI.Event]) -> [MIDI.Event] {
            // iterate through filters array and return the result
        }
    }
}
```

```swift
extension MIDI.Event {
    public enum Filter {
        // filter
        case filterChannelVoice
        case filterSystemExclusive
        case filterSystemCommon
        case filterSystemRealTime
        case filterChannelVoice(channel: MIDI.UInt4)
        case filterChannelVoice(channels: [MIDI.UInt4])
        case filter(channel: MIDI.UInt4)
        case filter(channels: [MIDI.UInt4])

        // filter out
        case filterOutChannelVoice
        case filterOutSystemExclusive
        case filterOutSystemCommon
        case filterOutSystemRealTime
        case filterOutChannelVoice(channel: MIDI.UInt4)
        case filterOutChannelVoice(channels: [MIDI.UInt4])
        case filterOut(channel: MIDI.UInt4)
        case filterOut(channels: [MIDI.UInt4])
    }
}
```
- `notes` (note on / note off)
- `polyAftertouch`
- `cc`
- `programChange`
- `chanAftertouch`
- `pitchBend`
Restructuring: similar to AudioKit, core essential functionality of MIDIKit (I/O and Events) will live in this repo (`MIDIKit`), while optional abstractions will be extricated into their own repos as extensions to MIDIKit. Each will import `MIDIKit` as a dependency.
macOS 13 Ventura
Xcode 14
Hi, I found a small broken link on this page:
https://orchetect.github.io/MIDIKit/documentation/midikit/midimanager-creating-ports
There is a link for Event Filters that routes to the following broken URL:
https://orchetect.github.io/MIDIKit/documentation/midikit/midikit-getting-started/Event-Filters
There is something happening I don't understand. For debugging I color Note On green and Note Off red. However, `midi1RawBytes()` seems to be returning the next message's op code. The second E4 should have been a 90, but is 80.
Originally posted by @SinanKarasu in #171 (comment)
Trying to get GitHub Actions CI (which runs on Intel hardware) to build a Universal Binary 2 (x86_64, arm64), but running into an issue with xcodebuild not liking it.
Build succeeds if ONLY_ACTIVE_ARCH = YES but only builds x86_64.
Setting ONLY_ACTIVE_ARCH = NO causes both architectures to build.
Locally, on an M1 Max machine of mine it works fine and compiles both. But it errors out on an Intel machine.
Some hints:
User feature request:
What I would love is an MTC Generator that can start at an `AVAudioTime`. That way I can use the same value that is used everywhere with AVFoundation; it would take into account the same hostTime that is used for all schedules.
It would be nice to have a macOS feature where, when a function is run, it creates input and output menus populated with menu items for all available inputs and outputs.
macOS 12 Monterey
Xcode 13.4.1
I am having issues getting addInputConnection to work with the latest version of MIDIKit. It looks as though this worked in version 0.60 but not with the latest version.
```swift
public let midiManager = MIDIManager(
    clientName: "ELMC Midi Deck",
    model: "Midi Deck",
    manufacturer: "ELMC"
)

let connection = "inputConnection"

do {
    try midiManager.addInputConnection(
        toOutputs: [.uniqueID(88734594)],
        tag: connection,
        mode: .definedEndpoints,
        filter: .default(),
        receiver: .events { [weak self] events in
            events.forEach { self?.handleMIDI($0) }
        }
    )
} catch {
    print("error setting up connection")
}
```
MIDI 2.0 support progress:

- `relative: Bool` property on existing RPN/NRPN events ("Does not translate to MIDI 1.0 Protocol, but may be used by a UMP MIDI 1.0 Device")
Originally posted by @SinanKarasu in #171 (comment)
MIDIKit could conditionally vend the custom SwiftUI views and classes that currently reside only in the Bluetooth example projects.

- `.spi.yml` manifest

Simple API to facilitate piping MIDI events between objects.
Initially, this would help simplify the syntax required to attach MIDI endpoints from the I/O layer to MIDIKit Extensions abstraction classes (HUI, MTC, etc.).
This would likely take one of two possible implementation approaches:
or
```swift
public protocol ReceivesMIDIEvents {
    func midiIn(event: MIDI.Event)
    func midiIn(events: [MIDI.Event])
}

extension ReceivesMIDIEvents {
    public func midiIn(events: [MIDI.Event]) {
        for event in events {
            midiIn(event: event)
        }
    }
}

public protocol SendsMIDIEvents {
    /// Handler used when calling `midiOut()` methods.
    typealias MIDIOutHandler = ((_ events: [MIDI.Event]) -> Void)

    /// Handler used when calling `midiOut()` methods.
    var midiOutHandler: MIDIOutHandler? { get set }
}

extension SendsMIDIEvents {
    internal func midiOut(_ event: MIDI.Event) {
        midiOutHandler?([event])
    }

    internal func midiOut(_ events: [MIDI.Event]) {
        midiOutHandler?(events)
    }
}
```
```swift
extension MIDI.IO.Input: ReceivesMIDIEvents { }
extension MIDI.IO.Output: SendsMIDIEvents { }
extension MIDI.IO.InputConnection: ReceivesMIDIEvents { }
extension MIDI.IO.OutputConnection: SendsMIDIEvents { }

extension ReceivesMIDIEvents { // protocol
    public func attach(to module: ReceivesMIDIEvents) {
        // ...
    }

    public func removeAttachments() {
        // ...
    }
}

extension SendsMIDIEvents { // protocol
    public func attach(to module: SendsMIDIEvents) {
        // ...
    }

    public func removeAttachments() {
        // ...
    }
}
```
`MIDI.IO.Manager` `.addInput()` and `.addInputConnection()` API would change slightly:

- `receiveHandler: MIDI.IO.ReceiveHandler` would become an Optional. If passed `nil`, no receive handler would initially be instantiated and received MIDI packets would not be handled (at that point).
- `attachTo: ReceivesMIDIEvents` would be added, which would facilitate the creation of the managed Input/InputConnection as usual, but also attach a receiver object at the same time.

It's conceivable that a future MIDIKit module (or custom module developed by a library user) may make use of more than one input or output pipe. A possible future-proofing solution would be to add a property to specify which endpoint of a module to attach to when invoking the `.attach()` method.
A simple index system, starting at 0. If there is only one applicable endpoint, 0 or nil could be passed:
```swift
// SendsMIDIEvents protocol
func attach(to module: ReceivesMIDIEvents, index: Int? = nil)
```
A new value type `MIDIAttachment`, giving type-safe names to each internal MIDI pipe.
```swift
public struct MIDIAttachment {
    public static var `default`: Self { MIDIAttachment("default") }
}
```

```swift
// SendsMIDIEvents protocol
func attach(to module: ReceivesMIDIEvents, port: MIDIAttachment = .default)
```
Ideally there would be a mechanism to strongly type these identifiers to each specific MIDIKit module and not have them share the same global `MIDIAttachment` namespace.
Modules contain as many properties representing each internal MIDI "port" as they require. Therefore, existing proposal logic could still apply without requiring an index or key system to identify them when calling `.attach()`. And because they are properties, they carry with them the ability to have descriptive names and are strongly typed.
```swift
class SomeModule {
    var port1 // conforms to ReceivesMIDIEvents
    var port2 // conforms to ReceivesMIDIEvents
}
```
This could, however, introduce more complexity than desired, as reference semantics would be required between `SomeModule` and those properties, which will be less than desirable if the module's logic exists in the module (which it invariably will).
macOS 13 Ventura
Xcode 14
Hello, I am trying to send a MIDI message via USB to a MIDI controller.
When I try to set up a thru connection using the `midiManager.addThruConnection` method, it throws the following error at runtime:

```
Fatal error: Error raised at top level: MIDI Thru Connections are not supported on this platform due to Core MIDI bugs.
```

I saw that this issue was deferred in #19 but I wasn't sure which versions of macOS are compatible/incompatible at runtime vs. build time. Any help would be appreciated.
macOS 13 Ventura
Xcode 14
Getting quite a few:

```
'Call can throw, but it is not marked with 'try' and the error is not handled'
```

error messages in SendMIDIEventsSystemExclusiveView at:

```swift
Button { sendEvent(.sysEx7( // <- 'Call can throw, but it is not marked with 'try' and the error is not handled'
```

Also 'Value of type 'Int' has no member 'hexString'' errors:

```swift
let groupNumHex = $0.hexString(padTo: 1, prefix: true) // <- 'Value of type 'Int' has no member 'hexString''
```
hth
`MIDI.IO.Manager` `.addInputConnection` method API:

```swift
public func addInputConnection(
    toOutputs: [MIDI.IO.EndpointIDCriteria],
    tag: String,
    receiveHandler: MIDI.IO.ReceiveHandler
) throws
```

`.addOutputConnection` method API:

```swift
public func addOutputConnection(
    toInputs: [MIDI.IO.EndpointIDCriteria],
    tag: String
) throws
```
`MIDI.IO.InputConnection`

- `toOutput` parameter to be an array; can take 0 or more `MIDI.IO.EndpointIDCriteria`
- `internal init(toOutputs: [MIDI.IO.EndpointIDCriteria], receiveHandler: ReceiveHandler)`
- `.addConnection(toOutput: MIDI.IO.EndpointIDCriteria)`
- `.removeConnection(toOutput: MIDI.IO.EndpointIDCriteria)`
- `.connections` computed property to return array of connected endpoints

`MIDI.IO.OutputConnection`

- `toInput` parameter to be an array; can take 0 or more `MIDI.IO.EndpointIDCriteria`
- `internal init(toInputs: [MIDI.IO.EndpointIDCriteria])`
- `.addConnection(toInput: MIDI.IO.EndpointIDCriteria)`
- `.removeConnection(toInput: MIDI.IO.EndpointIDCriteria)`
- `.connections` computed property to return array of connected endpoints

MIDIKit contains various internal constructors for:
- `MIDIPacketList` (MIDI 1.0, old API)
- `MIDIPacket` (MIDI 1.0, old API)
- `MIDIEventList` (MIDI 2.0, new API)
- `MIDIEventPacket` (MIDI 2.0, new API)

Currently MIDIKit obfuscates them by having `MIDIEvent`s be passed directly to an endpoint's `send(event:)` method, and the packet lists/packets are subsequently created internally.

There are advanced use cases where a library consumer would benefit from having these methods be made `public`, including methods where packets could be constructed from a `MIDIEvent` with a return type of one of the types mentioned above.
The goal of MIDIKit is to not export Core MIDI types and methods if possible, in order to provide a cleaner namespace and not cloud/intermix autocomplete space with Core MIDI types. To that end, it would be good to investigate whether this can be done without exporting CoreMIDI. Maybe `@_implementationOnly import CoreMIDI` - but will the methods become available in a context where the user imports CoreMIDI?
- `Entity`s (`.lossy(String)` etc.)
- `HUICoreEvent.largeDisplay(slices:)` is an open-ended array of chars. When sending this event, the MIDI event encoder could validate this to pad/trim to exactly 10 chars.
- `Fader` view to match dbVU scale that is marked on the HUI device surface
- `HUISurface`: `HUICoreEvent` is slightly abstracted but should it be?
- `HUIEvent` (model change event) should carry these full strings, and when sending/receiving HUI only delta (partial) change messages should be sent when invoking `transmitLargeDisplay()` on `HUIHostBank`
- Should `HUICoreEvent` be internal and not public?
- Rename `translator: HUIModel` to `model:` and make it public? Allow more interactivity with the model for the host bank? Tricky part is causing updates to the model to be transmitted to the bank's midiOut handler.
- Run `HUISwitch.allCases` through a `forEach {}` unit test to see if they all encode and decode successfully
- `Logger.debug()` - or have a Bool setting to enable logging? It only logs in DEBUG builds anyway.
- `Display`
- `HUISurface` full state push?
- `HUIHostBank` additions
Thanks a lot for sharing your great work.
I'm hoping to use MIDIKitSync to synchronise playback of an AVPlayer in my app with an external DAW. I'm planning on using the following code to schedule video playback in the future (e.g. I want video playback to occur at a particular future timecode). I'm a bit unsure how to sync with MTC, however. I'm guessing I need to ensure they're both using the same sourceClock. Any ideas how to do this? Sorry if this is already taken care of, still learning. Thanks again.
```swift
var audioClock = CMClockGetHostTimeClock()
let videoPlayer = AVPlayer(url: videoURL)
videoPlayer.sourceClock = audioClock
videoPlayer.automaticallyWaitsToMinimizeStalling = false

func schedulePlayback(videoTime: TimeInterval, hostTime: UInt64) {
    videoPlay(at: 0, hostTime: hostTime)
}

func videoPlay(at time: TimeInterval = 0, hostTime: UInt64 = 0) {
    let cmHostTime = CMClockMakeHostTimeFromSystemUnits(hostTime)
    let cmVTime = CMTimeMakeWithSeconds(time, preferredTimescale: 1000000)
    let futureTime = CMTimeAdd(cmHostTime, cmVTime)
    videoPlayer.setRate(1, time: kCMTimeInvalid, atHostTime: futureTime)
}
```
macOS 13 Ventura
Xcode 14
The following file does not parse. The same file parses in two other MIDI Parsers, Logic Pro, and plays in AVAudioSequencer.
Anonim - Uzun Ince Bir Yoldayim.mid.zip
- `MIDI.Event` types with `.rawBytes` generators
- `Events` `ReceiveHandler`
- `EventsLogging` `ReceiveHandler`
- `MIDI.Event` instead of raw bytes
- (`Encoder`, `Decoder`, `Generator`, `Receiver`)

macOS 12 Monterey, macOS 11 Big Sur
Xcode 14, Xcode 13.3.1
When using the midiManager declared via:

```swift
let midiManager = MIDIManager(
    clientName: "name",
    model: "model",
    manufacturer: "blockarchitech"
)
```

And then trying to list devices:

```swift
print(midiManager.devices.devices)
```

..or

```swift
print(midiManager.endpoints.inputs)
```

..or

```swift
print(midiManager.endpoints.outputs)
```

It will always return `[]`.
I wrote a MIDIKit React Native Bridge module. Is there any licence doc that I should include so that I can give credit to you for MIDIKit?

Also, is there a way you could possibly provide a simple class showing how to use MIDIKit with Bluetooth? I see that you provided an example project, however I really don't understand how Bluetooth works. If you could provide a simple class showing how to use MIDIKit with Bluetooth, it would help me understand how to create a React Native bridge to support Bluetooth.
Idea would be to have an example like this link
https://github.com/orchetect/MIDIKit/wiki/Simple-MIDI-Listener-Class-Example
I am using your package MIDIKit. I am having a little bit of an issue using the package.

You provided a good example of how to use the package, but I'd like to know if you could provide one more example that shows how to receive messages in its simplest form. Currently it looks like you are using .eventLogging in the example.
I am trying to use the following but get this error:

```
Value of type '(MidiModule) -> () -> MidiModule' has no member 'receivedMIDIEvent'
```
```swift
receiveHandler: .events { [weak self] events in
    // Note: this handler will be called on a background thread
    // so call the next line on main if it may result in UI updates
    DispatchQueue.main.async {
        events.forEach { self?.receivedMIDIEvent($0) }
    }
}
```
Can you provide an example where everything is in one Swift class and one file? Also, can you provide an example using the receiveHandler?
```swift
import MidiKit

class MidiModule: NSObject {
    var midiManager: MIDI.IO.Manager = {
        let newManager = MIDI.IO.Manager(
            clientName: "MIDIEventLogger",
            model: "LoggerApp",
            manufacturer: "Orchetect") { notification, manager in
                print("Core MIDI notification:", notification)
            }
        do {
            logger.debug("Starting MIDI manager")
            try newManager.start()
        } catch {
            logger.default(error)
        }
        // uncomment this to test different API versions or limit to MIDI 1.0 protocol
        //newManager.preferredAPI = .legacyCoreMIDI
        return newManager
    }()

    // ... Whatever code is needed to receive midi messages that use the receivedHandler
}
```
`MIDIFile` allows for a fair degree of freedom in how its contents are constructed. Reliance on reading the documentation is necessary to understand and apply certain conventions.

This may lead to conditions where its structure does not cause errors, but is technically illegal in terms of MIDI File Spec convention.

Some of these conditions are minor - other 3rd-party MIDI software or libraries/parsers may slacken their error conditions to allow for them. But some of these conditions are more undesirable and may produce incompatibilities with 3rd-party software.
For example:

- Setting the `MIDIFile` format to Type 0, which technically requires exactly 1 track chunk - no more, and no less. However, many parsers will allow multiple tracks, ignore the format, and treat it as Type 1 (synchronous multitrack) in this case.

These actions are not prevented, and are not error conditions in and of themselves. One such reason is that it allows parsing MIDI files that contain such illegalities while preserving the data intact and not aborting file parsing - in such a case, the file's data itself is not malformed, but the event definitions may be in a logical ordering that is unexpected or non-standard.
It could be beneficial to have an added validation method on MIDIFile
to heuristically analyze the model contents and compile a list of warnings and/or recommendations based on all known conventions. This is a separate concept from parsing errors which are generally unrecoverable. But some parsing errors (such as a Type 0 file with more than 1 track) could be converted to a validation warning and not a fatal error condition.
This could serve both the purpose of educating the developer who is using MIDIKit during development of their software to diagnose issues, as well as provide somewhat granular information that could be actionable or presentable to the user in a user interface to inform them that data in the MIDI file may not be properly structured.
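A toy sketch of what such a heuristic validation pass could look like (the types and warning text here are illustrative, not MIDIKit API):

```swift
// Illustrative model: just enough structure to demonstrate warnings-not-errors
// validation, per the Type 0 track-count convention described above.
struct ToyMIDIFile {
    enum Format { case type0, type1, type2 }
    var format: Format
    var trackCount: Int
}

// Returns non-fatal warnings instead of throwing, so illegal-but-parseable
// files can still be loaded and the issues surfaced to the developer or user.
func validate(_ file: ToyMIDIFile) -> [String] {
    var warnings: [String] = []
    if file.format == .type0 && file.trackCount != 1 {
        warnings.append(
            "Type 0 requires exactly 1 track chunk; found \(file.trackCount). "
            + "Many parsers will treat this file as Type 1."
        )
    }
    if file.trackCount == 0 {
        warnings.append("File contains no track chunks.")
    }
    return warnings
}
```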
I am thinking of using this package in my next project. I'd like to know if it is still being maintained.
M2-104-UM Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol, page 53:
Using Note Number Rotation, Per-Note Pitch, and Per-Note Management Message for Independent Per-Note Expression
A MIDI 2.0 Protocol Sender can have fully independent control over individual Notes, even applied to simultaneous Notes on the same pitch. MIDI Polyphonic Expression (MPE) on the MIDI 1.0 Protocol uses a Channel Rotation mechanism for this kind of flexible expressive control with up to 16 notes of polyphony. In the MIDI 2.0 Protocol, a Note Number Rotation mechanism can replace the Channel Rotation mechanism for some applications. This improves on MPE by utilizing only a single MIDI Channel while providing polyphony of up to 128 notes.
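A minimal pure-Swift sketch of the rotation mechanism itself (illustrative only, not MIDIKit API; the actual pitch of each note would be conveyed separately via Per-Note Pitch messages):

```swift
// Illustrative note number allocator: each new note, even on the same pitch,
// receives the next note number in a 0...127 rotation on a single channel,
// so up to 128 notes can be controlled independently.
final class NoteNumberRotator {
    private var next: UInt8 = 0

    // Returns the next note number in the rotation, wrapping at 128.
    func allocate() -> UInt8 {
        defer { next = (next + 1) % 128 }
        return next
    }
}
```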
Allow the direct use of MIDIKit `Endpoint` types in method overloads that currently support `MIDI.IO.EndpointIDCriteria`.
`MIDI.IO.Manager`

```swift
// Add overload:
public func addInputConnection(
    toOutputs: [MIDI.IO.OutputEndpoint],
    tag: String,
    receiveHandler: MIDI.IO.ReceiveHandler
) throws

// Add overload:
public func addOutputConnection(
    toInputs: [MIDI.IO.InputEndpoint],
    tag: String
) throws
```
Which implies adding corresponding initializers to those classes:
`MIDI.IO.InputConnection`

```swift
// Add init:
internal init(toOutputs: [MIDI.IO.OutputEndpoint],
              receiveHandler: ReceiveHandler)
```

`MIDI.IO.OutputConnection`

```swift
// Add init:
internal init(toInputs: [MIDI.IO.InputEndpoint])
```
Additionally, it then becomes natural to add computed properties to return those MIDIKit `*Endpoint`s. (`MIDI.IO.ThruConnection` already implements MIDIKit `*Endpoint` arrays.)
```swift
extension MIDI.IO.Input {
    // return itself as an `InputEndpoint`
    public var endpoint: InputEndpoint { get }
}

extension MIDI.IO.Output {
    // return itself as an `OutputEndpoint`
    public var endpoint: OutputEndpoint { get }
}

extension MIDI.IO.InputConnection {
    public var connectedEndpoints: [OutputEndpoint] { get }
}

extension MIDI.IO.OutputConnection {
    public var connectedEndpoints: [InputEndpoint] { get }
}
```
Although most of Core MIDI is unavailable on tvOS and watchOS, it would be ideal to allow MIDIKit to build on those platforms by conditionally fencing out incompatible code or marking it as not `@available`.
Even though the majority of functionality will be unavailable, the benefit to this would be allowing cross-platform applications or packages whose targets include tvOS and/or watchOS to use MIDIKit as a dependency. This also allows compatibility of MIDIKit extensions like MIDIKitSMF (Standard MIDI File) to function on those platforms where the unavailable Core MIDI I/O methods are not needed.
Essentially what would be left remaining and usable would be:

- (`MIDI.UInt7`, etc.)
- `MIDI.Event` abstracts, including filters
- `MIDI.Note` abstract
- `MIDI.IO.APIVersion` and `MIDI.IO.ProtocolVersion`
- `MIDI.IO.Packet` and its nested types

It may make sense to simply remove most of the I/O module by conditional compilation. The downside is the lack of symbolication, but in theory that shouldn't be a problem.
```swift
#if !os(tvOS) && !os(watchOS)
// ...
#endif
```
Otherwise, individual objects, methods and properties could be marked @available
. The downside would be added clutter to the MIDIKit codebase.
This may also not even be possible because some of the Core MIDI symbolication/headers are completely missing on tvOS/watchOS.
```swift
@available(macOS 10.12, iOS 10.0, tvOS 9999, watchOS 9999, *)
func abc() { }
```
The MCU (Mackie Control Universal) protocol is similar to HUI, but has many key differences in hardware layout, supported controls, and MIDI messages used.
Hardware:
After a managed connection is created, modifying the `mode` or `filter` property does not cause the connection to update and take the new settings into effect.

Affects `InputConnection` and `OutputConnection`.
```swift
let conn = midiManager.managedInputConnections["InputConnection1"]
conn?.removeAllOutputs()
conn?.mode = .allEndpoints
conn?.filter = .owned()
```