I have a Casio 101 keyboard that I bought in 1987 and a $15 MIDI-to-USB cable. It's hard to describe how satisfying it is to see it continue to work as an input device for a computer manufactured in 2022.
The fact that MIDI 2.0 was only released in 2020 is mind-boggling. It was just really well designed and implemented from the beginning.
I remember the first time I saw MIDI in action: I was in a mall in the mid-to-late 1980s and ComputerLand had a full sized keyboard hooked up to an Apple IIgs, which was playing The Entertainer while the notes scrolled across the screen on a color display. The real piano sound that came out was piped through some great speakers and I thought it was the most amazing technology I had ever seen. Think about that for a second: That tech remained almost untouched for 35 years. Incredible.
This video has a short bit in the middle about one of the first MIDI keyboards which still works with modern equipment (starting at 7:55): The History of the Prophet Synthesizer https://youtu.be/Ruh0B5QKBMs?t=476
Related (non-MIDI) note: I recently found this video about making electronic dance music on an Amiga in the early 1990s and learned quite a lot about how music techies geeked out back then. Super fun - I'm not a musician, so I honestly had no real idea what was possible back then.
MIDI has proved amazingly resilient, whether over 5-pin DIN, the increasingly common 3.5mm TRS jacks (common on Eurorack or guitar pedals, where space is at a premium) or the utterly ubiquitous MIDI over USB. (And Ethernet, can't forget that). Everything digitally controlled speaks MIDI - and not just music, the stage lighting world is all MIDI too. I think that there is a real case that it's the second most successful wired physical-layer protocol behind Ethernet.
A good number of stage lighting controllers have the ability to interface with and handle MIDI, but the vast majority of stage lighting equipment is controlled using DMX[1]. The two have a lot in common, being pretty much ubiquitous and rock-solid even given their age.
I think it's likely they were also confused by the physical connector - DMX also uses a 5-pin connector, though it's a 5-pin XLR rather than MIDI's 5-pin DIN, if I'm not mistaken.
Nah, I was just wrong :-) I know a lot of big touring productions drive lighting and patch changes off the same clock (usually a laptop which provides time code, possibly backing tracks, and possibly patch changes for effects and synths), and misremembered that lighting typically used a different protocol even if it’s the same program driving everything.
I love that a protocol from 1983 is still in very active use. Either it was very well thought out, or the userbase has very limited needs. (actually a little of both.)
> It is also that the domain is limited. I mean a piano only has 88 keys and a few pedals and there are maybe 8 octaves or so in Common Practice.
Yeah but MIDI works for so much more than a piano. People use MIDI to program drum VSTs, guitar VSTs, entire orchestras, guzheng VSTs... basically anything musical.
That being said, it's serviceable but not great for a lot of these things, because originally it was meant just for keyboard-controlled synthesizers. The biggest flaw (IMO) is lack of standardization for individual instruments with more varied playing styles and articulations than the piano. As a basic example, for guitar VSTs, if you want to switch between sustain notes and palm mutes, you generally have to play an out-of-range note (a "keyswitch") to alter the articulation. However this keyswitch is different for Ample Sound, Shreddage, Native Instruments guitar libraries, etc. Similarly if you want to slide between notes, some VSTs just let you overlap the note while others make you use a slide keyswitch. Every VST has its own way of MIDI programming that isn't transferrable.
It's like switching between programming languages, except you're building a single program where it uses multiple languages at once in the same file and you have to remember if the current language syntax you're writing uses `elif`, `else if` or is LISP. (I have reference manuals for most VSTs I own bookmarked for this reason.)
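The workaround many people end up with is a small translation layer of their own. A sketch below - note the keyswitch note numbers are invented placeholders, not any real library's mapping, which is precisely the problem being described:

```javascript
// Sketch of an articulation-to-keyswitch lookup. The note numbers here are
// HYPOTHETICAL placeholders, not the actual mappings of any vendor's library;
// each vendor documents its own, which is exactly the complaint above.
const KEYSWITCH_MAPS = {
  libraryA: { sustain: 24, palmMute: 25, slide: 26 },
  libraryB: { sustain: 12, palmMute: 17, slide: 19 },
};

// Translate a library-agnostic articulation name into the Note On message
// (status, note, velocity) that selects that articulation for a given library.
function keyswitchMessage(library, articulation, channel = 0) {
  const note = KEYSWITCH_MAPS[library]?.[articulation];
  if (note === undefined) {
    throw new Error(`No keyswitch for ${articulation} in ${library}`);
  }
  return [0x90 | channel, note, 100]; // Note On, fixed velocity
}
```

With something like this, a MIDI clip can say "palmMute" once and be retargeted at a different guitar library by swapping the map, instead of re-editing every keyswitch note.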
The fact that MIDI continues to be used today is, more than anything, the result of "perfect is the enemy of done" IMO. You could redesign something better than MIDI from the ground up, but since everything already supports MIDI there's little motivation when it already works well enough.
Sorry for not being clear, I used the number of piano keys as an example of how limited the domain is. MIDI wasn't designed to express ranges with hundreds or thousands of values. Twelve tone equal temperament is good enough and technology is not going to extend human hearing another four octaves.
What MIDI really got right is flexibility with regard to timbre. The standard was so open that General MIDI only came along later to establish some consistency for timbres.
MIDI doesn’t force you to use 12TET anyway. You can use MTS. Support seems widespread, at least among the equipment and software I use, some of it dating back to the 1980s.
MTS is how you specify how MIDI notes are translated to frequencies, if you desire. You can use it to specify alternative scales, if you desire. The Wikipedia list of hardware and software which supports MTS seems very conservative IMO... my personal experience is that MTS support is reasonably common.
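As a sketch of what MTS actually carries: the tuning messages encode each target frequency as a base MIDI note plus a 14-bit fraction of a semitone (a resolution of 100/16384, roughly 0.006 cents). The conversion looks like the following; the sysex framing around it (device ID, tuning program number, checksum) is omitted here:

```javascript
// Encode a frequency in Hz as the 3-byte MTS frequency triple: a base MIDI
// note number plus a 14-bit fraction of a semitone split into two 7-bit bytes.
// Assumes hz falls within the representable MIDI note range.
function freqToMts(hz) {
  const semitones = 69 + 12 * Math.log2(hz / 440); // distance from A440 (note 69)
  const note = Math.floor(semitones);
  const frac = Math.min(16383, Math.round((semitones - note) * 16384));
  return [note, (frac >> 7) & 0x7f, frac & 0x7f];
}
```

So A440 comes out as note 69 with a zero fraction, and a quarter-tone above it lands on note 69 with the fraction at the halfway point - which is why MTS has no trouble expressing non-12TET scales.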
The MIDI 2.0 spec was released in 2020 and is being adopted, but that will take time. Think how long it takes for a new USB generation to become widespread and multiply by 5 (people are not buying a new piano or synthesizer every 3 years like they do with a laptop or other digital device).
Maybe there's a market in retrofitting old synthesizers like the kits available for some pre-MIDI classics ;)
I'm fairly confident that MIDI 2.0 is already dead. I really don't like to say that.
It was designed by a large industrial committee, in secret, with no transparency and no input from other interested parties. The opposite of MIDI 1.0. And it just stinks of it. It is overwrought by a long shot, with many bells and whistles nobody is asking for (there's a very specific, and frankly short, list of critical MIDI failings). It has no reference examples and will be awful to implement and to deploy over various transports. When it came out it went over like a lead balloon: nobody seems to want to implement it.
And CCs to handle faders and knobs. And PC to handle patch recall. And separate rx/tx busses. And ability to carry clock. And support for 16 simultaneous channels. Yeah it’s actually really well thought out I’d say
Dave was a brilliant designer, both analogue and digital, but not so great a businessperson; his company, Sequential, who made some of the most beloved electronic instruments - not least the Prophet series - went bankrupt and Yamaha bought the name. Dave went to work for Yamaha and Korg, and then Seer Systems, where he designed the first ever “soft synth” (professional-standard PC synthesiser plugin), Reality. He had all of the acclaim, and I’m sure he did OK, but he did not have the financial success the quality of his work deserved.
Nevertheless, he then founded Dave Smith Instruments, maybe the start of the analogue synth revival (which is going gangbusters now; Eurorack is the coolest hardware hacking scene on the planet). DSI did pretty well, which was great, but here's what really reveals how beloved Dave was: his old friend Ikutaro had a word with his friend, the president of Yamaha, and more or less asked, "we all love Dave. You're not using the Sequential name. How about it?" and Yamaha went "you know, yeah" and gave him the Sequential name - which still had significant brand equity - for free.
> Instrumental in restoring the Sequential name was Roland’s Founder, Ikutaro Kakehashi, a longtime colleague and friend of Smith’s. “I feel that it’s important to get rid of unnecessary conflict among electronic musical instrument companies,” said Kakehashi. “That is exactly the spirit of MIDI. For this reason, I personally recommended that the President of Yamaha, Mr. Nakata, return the rights to the Sequential name to Dave Smith. And I’m glad to see such a wonderful result—a new product with the Sequential name.”
And in 2021, he sold now-Sequential again under much happier circumstances; $24m to the well-regarded British gear manufacturer Focusrite.
The music gear industry has more than its share of villains - google Uli Behringer sometime - but Dave and Ikutaro were real ones, and the best parts of the industry (including the big Japanese manufacturers, Korg, Roland/Boss, and Yamaha) move in their spirit. And a large part of that is because of the existence of MIDI.
I don't know about 'ugly gaps' but its night and day compared to moving an analog knob with CV. Professional Composer guy may not have an issue. Perhaps indeed they are prolific in their marvelous, professional compositions (perhaps using CV)... but the limited expression of 128 steps is obvious to me. Source: I make techno.
Maybe you are not affected by it. I occasionally find it frustrating, in various scenarios… very slow sweeps by hand, or things like that. I only have problems with the limitation in certain scenarios with electronic sounds. It’s never a problem with sample libraries.
MIDI has 14-bit CCs anyway, it’s just that few controllers generate the messages. I don’t know how many synths respond to them.
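For reference, the high-resolution mechanism pairs two 7-bit controllers: CC 0-31 carry the most significant 7 bits and CC 32-63 the corresponding least significant 7 bits. A minimal sketch of encoding one value into that pair:

```javascript
// Split a 14-bit controller value (0..16383) into the MSB/LSB message pair
// used by high-resolution MIDI CCs: CC n carries the MSB, CC n+32 the LSB.
function cc14(channel, controller, value) {
  const status = 0xb0 | channel; // Control Change status byte for this channel
  return [
    [status, controller, (value >> 7) & 0x7f], // MSB on CC n
    [status, controller + 32, value & 0x7f],   // LSB on CC n+32
  ];
}
```

16384 steps is far beyond audible stepping for a filter sweep, which is why it's a shame so few controllers and synths actually use the pair.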
Composers are not really affected by this. Sound engineers are.
It is impossible to create smooth filter sweeps with regular CCs without further smoothing on the receiving end, because the human ear can differentiate between many more than 128 frequencies.
Should have been more clear. Professional film/TV/game composer. By definition we are producers and audio engineers as well.
And yes it truly isn’t a problem, as plugin developers have long since interpolated the 128 steps. If it were a problem, you’d be hearing choppy sweeps in every song and film score out there.
> And yes it truly isn’t a problem, as plugin developers have long since interpolated the 128 steps.
That's the point. It is not an issue for you, because it has been solved in certain receiving software that you use. That does not mean that the issue does not exist, but that it can be mitigated if you use software that allows that and smoothes the sweeps.
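To make the mitigation concrete: the wire carries only 128 levels, and a receiver can hide the staircase with something as simple as a one-pole smoother gliding toward the latest CC value. A minimal sketch (function names are mine, not any plugin's API):

```javascript
// What the wire carries: a 0..1 parameter quantized to 7 bits (128 steps).
function toCc(x) {
  return Math.round(x * 127);
}

// Receiver-side smoothing: a one-pole lowpass that glides toward the most
// recent CC value per sample instead of jumping, hiding the staircase.
function makeSmoother(coeff = 0.01) {
  let state = 0;
  return (ccValue) => {
    state += coeff * (ccValue / 127 - state);
    return state;
  };
}
```

A plugin that applies something like `makeSmoother` never exposes the raw steps; one that maps CC values straight to the filter cutoff does, which matches the experience split in this thread.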
It is on one hand but it's also a PITA that holds back innovation and troubles its users with work-arounds.
It's time for a proper new standard using modern capabilities both for audio and control signal transport but it's a very heterogeneous market with tons of legacy, therefore almost impossible to achieve.
The baud rate chosen was extremely well thought out compared to the crazytalk that are standard serial port baud rates.
Specifically, the rate was 31250 bps. Why? Because the huge majority of synthesizer processors at the time had clocks that ran at 1 MHz or integer multiples thereof. Indeed that is still the case forty years later. Ask yourself what 31250 * 32 is.
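Spelling out the arithmetic as a quick sanity check:

```javascript
// MIDI's 31250 baud divides a 1 MHz clock by a clean power of two:
const CLOCK_HZ = 1_000_000;            // typical early-80s synth CPU clock
const MIDI_BAUD = 31250;
const divider = CLOCK_HZ / MIDI_BAUD;  // exactly 32 (2^5), trivial for a counter

// A "standard" PC serial rate like 38400 doesn't divide evenly, so a simple
// hardware divider can't derive it from that clock:
const pcDivider = CLOCK_HZ / 38400;    // ~26.04, not an integer

// Each serial frame is 10 bits (start + 8 data + stop), so a 3-byte MIDI
// message takes 3 * 10 / 31250 seconds = 960 microseconds on the wire.
const msgMicros = (3 * 10 / MIDI_BAUD) * 1e6;
```

So a plain 8-bit CPU could clock MIDI out with nothing fancier than a divide-by-32 counter, and a Note On still fits in under a millisecond.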
You know how modern 3D printers and CNC machines use "G-code" instructions to specify their movement? I learned this tidbit literally yesterday:
The first implementation of a numerical control programming language was developed at the MIT Servomechanisms Laboratory in the late 1950s... The main standardized version used in the United States was settled by the Electronic Industries Alliance in the early 1960s. A final revision was approved in February 1980 as RS-274-D.
So yeah. Remember what Gibson said about the future...
1/4" and 1/8" (3.5mm) analogue audio jacks have had a very, very long life, going back as far as the 1890s. A year or two ago I found my mother using an ancient single-ear mono headphone with her iPhone - probably one that came with a cassette player in the 80s - which struck me as quite funny. Still works fine.
Basically there are loads of different railway gauges around the world, and indeed, inside the USA itself, which historically had no single standard size.
All roads, railways, and thus tunnels and bridges everywhere are very approximately multiples of the width of a horse's pectoral/pelvic girdles. Nothing especially Roman or anything else about it.
This is really good news, finally I can stop telling users of pianojacq.com that they have to use Chrome or jump through all kinds of hoops to make this work on FF. Will test this out soon and change the code if necessary to make it work.
Edit: see other comment in this thread, no dice for now :( I'll have to do some more digging as to why it doesn't work.
Not really, but if you're taking a class, "install this plugin before we go on" is the same level of insurmountable as "please install vs code so that we're all using the same editor even if you have a different preferred editor".
> Not really, but if you're taking a class, "install this plugin before we go on" is the same level of insurmountable as "please install vs code so that we're all using the same editor even if you have a different preferred editor".
and as someone who sometimes has people install VS Code as part of classroom settings, I can tell you that this can easily become a two-hour ordeal plus some help mails after the class when you have 30 students who are entirely computer-illiterate
Fingerprinting of WebMIDI devices is already happening in the wild. It was discovered because Firefox prompted for WebMIDI access on some random e-commerce websites, whereas Chrome silently allowed websites to access MIDI devices.
There are many legitimate reasons for webapps to do so - game controllers for games (or programming USB-connected robots), sensors, or, like in this case, connecting a synthesizer - but there is little reason to allow random websites to have that.
I just wish there was a clearer distinction between them.
I think the connection from the browser vendors to the ad companies is not helping with that.
Whitelists/allowlists are tricky, because they create a bias towards existing, larger players. If you're creating some new gadget, you now have to convince the browser makers to allow it through. And the browser makers need to come up with some way to have a level of confidence that requesters are not just trojan horse manufacturers. Or so lax on security that a rogue website can use your USB connection to reprogram the device's firmware to emulate an ethernet device that will MitM your network.
You can let the user accept new entries, but then you're back having to give random nontechnical people enough context and information to make the correct choice when a random website causes the permission dialog to appear. Empirically, that doesn't go too well.
It's such a cool capability. Adds so much creative possibility. Huzzah.
I hope some day we get to a WebMIDI 2.0, to expose some of the many many new super-rad features in MIDI 2.0.
This wish sums up a lot for me: a sensible, highly desirable, fairly straightforward follow-up that I have little expectation we'll see anytime soon. I generally love the web platform, find it does a solid job, has decent API shape & decisions. Stuff gets made, & pretty well. But there's definitely a dearth of spec workers & browser maintainers out there to pitch in after the first drops, especially after some 1.0 implementations make it out the door. Just like in normal software development, there's a budget for new development, but maintenance & improvement can be severely lacking. Attention for what we have, TLC, is sparser than the places where new things happen.
WebMIDI is alas one of these great, worthy & amazing specs that at some point we need to re-up and come around to again, and I want so much to imagine that happening. I feel like WebShare (and WebShare Target) was another great example - lots of really good improvements asked for in issues, growing potential, but the interest from the top has moved on.
Not just that, with people who have used smartphones and Chromebook all their lives now leaving college they're also slowly starting to pick up steam outside education.
They're also often exceptional deals for the low end of laptops (sub 200 dollar range) because Windows and its bloat make those laptops even worse to use. I'd never recommend any laptop that cheap unless someone has no meaningful way to save up, but this segment is very much still alive.
Honestly, I'm surprised the unification of smartphone and laptop hasn't reached those circles yet. A mid range phone and a cheap laptop shell that you can slide the phone into are a much better deal than a cheap laptop and a cheap phone. Samsung's DeX, for example, is fully competitive with ChromeOS and even most basic Windows use cases.
I believe it was removed because of the high risk of fingerprinting and inability to safeguard it from sending malicious data to devices (some devices do firmware updates over MIDI)
Everything that updates firmware over MIDI should have an explicit "update" mode on the hardware itself. No MIDI instruments I've ever used are writable by-default.
Also, fingerprinting is easy to avoid here. Firefox leaves MIDI off entirely until you opt-in to avoid random websites picking it up. Safari could do the same, presumably.
Firmware over MIDI was not uncommon for older equipment because the alternative was swapping out EPROMs if the instrument did not have a floppy drive… and around Y2K, unavailability of older chips could mean replacing sockets or going without (so mostly going without).
I forget the exact device model because it’s been a decade plus, but I definitely did have a device that was upgraded over MIDI without requiring a separate mode.
That is definitely an issue with the device, but I don’t know how common or uncommon that is for much older equipment
Most Waldorf synthesizers are updated just by sending MIDI sysex to them. The same goes for nearly all DSI/Sequential machines, as well as M-Audio and Novation, among many other manufacturers.
Those are a bit more modern devices which have enough memory that they could (and probably do) use a signed firmware image, so it's not possible to brick them by flashing random garbage to the EEPROM.
Could it just be hidden behind a prompt to enable it on a given site?
I feel like they’re just using fingerprinting as an excuse to not implement functionality that people want. Of course, I don’t really understand the problem space, so it’s likely I’m missing something.
most of the APIs listed there are already gated behind an explicit per-site opt-in in the browsers that implement them, and at least some in the spec defining them
i don't understand how this is a fingerprinting risk either, and i'm pretty sure i'm not missing anything.
> I feel like they’re just using fingerprinting as an excuse to not implement functionality that people want.
Do you actually believe this? Do you not default to more practical explanations, like maybe they don't consider it worthwhile to support because of engineering cost vs people that actually use it?
For example, I'd say 9/10 people I know who aren't tech literate have no idea the Health app exists on their iPhone. This includes people with Apple watches. Similarly it should be obvious that basically nobody ever knew about or used that one feature you liked.
I think engineering/support cost might be an excellent argument against implementing MIDI support in a browser, but the claim I responded to was that MIDI support wasn't on the table due to fingerprinting concerns (which are not obviously well-founded, from my outsider's perspective).
When I said "functionality that people want", I didn't mean to imply that there was a critical mass of people that made MIDI support in a browser obviously worthwhile, I just meant that some people want it and they're being told it won't happen because of fingerprinting.
When you make a web browser engine, you don't get to choose whether a use case is marginal or not (FWIW I use WebMIDI frequently in Chrome). You implement the standards or you perish.
Nobody wants a web browser that "chooses" not to work on some % of websites. Users choose, browsers implement. They are welcome to gate this feature behind a per-site permission prompt if they think it's insecure.
Can anyone explain to me what's cool about web MIDI? I mean, for me it's totally useless.
I have a lot of sequencer, synth and MIDI helper plugins in the DAW ... what does a JS interface in the browser offer other than the next hundred "I just wrote a little useless javascript music script" Show HN posts?
For me it doesn't solve any problem or satisfy any need. It's just another thing almost nobody uses in the browser.
One use case is education, where a web app is often easiest to build and deploy.
There are lots of rooms equipped with computer, notation software and keyboard in music institutions, where webmidi seems like a great way to develop some quickie music theory/education apps.
The web is not anywhere near parity for music and audio production yet, but its proponents see webmidi as one necessary step to that direction.
(I would personally not use current webaudio/webmidi software, but have successfully deployed a lot of it for education!)
I would argue it also lowers the barrier to entry for experimenting with new techniques for generating/processing audio. For example, you could pretty quickly create a page that has touch controls for adjusting parameters, uses a neural network for generating MIDI data, and then outputs that MIDI to a VST running on your computer.
Already on it... I will likely have to tweak the startup code a bit, look for an update tomorrow.
Edit: Ok, just tried it after upgrading FF, it doesn't work at all when used locally ("secure context required", as if there is anything more secure than using a local file...), and when using it on the server it does get past the initialization but the device list isn't populated, so whatever they are doing it isn't 100% compatible with the way Chrome does it.
The 'expected behavior' would be that the permissions dialog pops up but it doesn't so for now we'll have to wait until that is fixed. Pity, but it's very close now.
Novation has a pretty impressive web midi implementation used for managing the configuration on some of their controllers and synths. nice to see that becoming a thing outside chrome
Now if only they made one that wasn't web based. If I want to configure my LaunchKey on desktop, I'm still stuck with a huge electron app instead of a normal config application.
This is the same for the Nektar Pacer, the physical interface is a bit of work to get through, but someone made an amazing web midi configuration tool allowing complex configurations to be made or transferred between units.
Are MIDI keyboards something special, or does this mean Firefox is relaxing its hardware policy? Last I checked, bugzilla entries for implementing WebSerial and WebUSB were closed with the explanation that Firefox believed it was harmful for web browsers to interact with hardware.
edit: nope, looks like their position is still the same
>Such an API would bridge the web and the physical world... Exposing that sort of capability to the web without adequate safeguards presents a significant threat to those devices.
I love what MIDI has given us, but MIDI devices are not designed for adversarial environments. I absolutely would want this to be mandatory opt-in (per session every single time) at the very least.
I'm excited to see where this goes, but last time I played around with MIDI in Chrome I remember the latency being kind of terrible (could just be my computer setup or the apps being poorly optimized).
Even ignoring MIDI, audio worklets in browsers don't have perfect latency, so it makes me nervous about the potential. Are there web MIDI apps that have good enough to actually play along with or composes music in real time? Is this just an issue with my setup?
It took SO long to get to this point, including several years of SysEx discussion, and an absolutely horrendous new addon concept that would have opened the door to Google and Microsoft going "well if Firefox is doing this, so can we" that thankfully got ditched entirely in favour of the model that's in place now.
It only took <checks bugzilla> over a decade, but we now finally, FINALLY, have proper midi support in Firefox.
And in the mean time, Chrome has had midi support for over 7 years now. Did Firefox do it better? Probably. Should it have taken this long? No. Not even remotely. The whole SysEx nonsense discussion alone took up literal years.
Can't wait until I can use Melodics in a browser, it would make it possible to use a raspberry pi or a cheap chromebox to run the app on a TV screen in my drum shed to practice.
Check it out, it's https://www.melodics.com/ and I'm not affiliated in any way, I just hope I catch the eye of the devs and they indulge me :)
Aside from the music applications, MIDI is a great protocol to tinker with for beginners who wish to exchange data between a microcontroller [1] and a computer, because of its simplicity.
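To illustrate the simplicity: most channel messages are only two or three bytes on a 31250-baud serial line, so building and parsing them takes a handful of lines on either side of the link. A sketch for Note On:

```javascript
// A Note On is three bytes: status (0x90 | channel), note number, velocity.
function noteOn(channel, note, velocity) {
  return [0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}

// A tiny parser for the same message, e.g. for bytes read off a UART.
// Returns null if the status byte isn't a Note On.
function parseNoteOn(bytes) {
  const [status, note, velocity] = bytes;
  if ((status & 0xf0) !== 0x90) return null;
  return { channel: status & 0x0f, note, velocity };
}
```

The same three-byte pattern covers Note Off, Control Change, and most other channel messages, which is why MIDI works well as a first wire protocol on a microcontroller.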
You mean the thing you install an addon for, because there is literally no reason to make that a built-in function when there's a perfectly good addon for it?
The standard reply to this complaint is "use userChrome.css", which sure, you and I can... but Mozilla has repeatedly threatened to remove support for it altogether.
I don't even have to write the css, I can just use https://github.com/ranmaru22/firefox-vertical-tabs, but in Edge, I can just select the option... not install an extension, click things in a hidden page, make a directory deep in $APPDATA, put a CSS file in that directory, and then also paste more CSS into said extension's custom CSS field, then restart browser.
You're already clearly a power user: hardly a chore to google a solution here? This is so far outside of normal browser needs that having to set an about:config value is kind of... just fine?
The WebMIDI standard has a problem, which is that only the main UI thread can talk to WebMIDI devices directly. So even if you schedule your MIDI messages in the audio thread, to emit them on the hardware device you need to pass them to the main thread and emit the messages from there, incurring whatever delays the main thread imposes as it handles user input.
You can do this fairly rapidly, as long as the tab has enough cycles at its disposal. The real problems start when you try to do a lot of display, computation and other interaction at the same time.
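One partial mitigation the spec does provide: MIDIOutput.send() accepts an optional timestamp, so the main thread can schedule a short batch of messages slightly ahead of time and let the browser's MIDI backend emit each one on schedule, rather than relying on the event loop being idle at the exact moment each message is due. A sketch (the helper name and timing choices are mine):

```javascript
// Batch-schedule a sequence of notes on a WebMIDI output using send()'s
// optional timestamp (a DOMHighResTimeStamp on the performance.now() clock).
// Each note gets a Note On at its slot and a Note Off 200 ms later.
function scheduleNotes(output, notes, startTime, spacingMs = 250) {
  notes.forEach((note, i) => {
    const t = startTime + i * spacingMs;
    output.send([0x90, note, 100], t);     // Note On, channel 1
    output.send([0x80, note, 0], t + 200); // Note Off
  });
}
```

In a page this would be used roughly as `scheduleNotes(someOutput, [60, 64, 67], performance.now() + 100)` after obtaining an output from `navigator.requestMIDIAccess()`; how tightly the backend honours the timestamps still varies by browser and platform.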
Yesterday I couldn't figure out how to get it to work, or even what was supposed to happen, but today I hooked up my controller, restarted the browser (learned that's necessary yesterday), opened settings which by default was set to "Input: Blank" and "Output: MPKMINI2", switched it to "Input: MPKMINI2" and "Output: BUILTIN AUDIO" and I can now play along with the notes step by step. It's really fun actually :)
If you don't mind AutoHotkey, people have made both MIDI in and out. I used to use an AHK MIDI keyboard for some macros but ended up not liking it in the long run.