How to create a MIDI synthesizer with the Web MIDI API and Node.js

I like to play the piano; it's my hobby. When I decided to buy a new keyboard, I chose the M-Audio Keystation 88es. As a simple MIDI keyboard, it has no sound-generation hardware of its own: you can only connect it to a computer and generate the sound there.

I typically use Apple GarageBand, software with a vast library of sounds. But that library is quite big: GarageBand needs to download more than 1.5GB of data before it can start playing, while smaller MIDI clients have smaller libraries and worse sound quality.

Using the Web MIDI API and a simple Node.js server, we can build a small web client backed by a big library of good-quality sounds.

M-Audio Keystation 88es

With the Web MIDI API we can easily receive key and sustain-pedal presses. We'll send this data to the server over a binary WebSocket stream. The server generates the audio and streams it back to the clients, where an AudioContext plays the sound in the browser.

Scheme of the MIDI API project

To emulate MIDI events without a MIDI keyboard, you can use http://vmpk.sourceforge.net/

1. Start a Node.js server

Let's start with a simple Node.js server that serves a web page and its assets.

const Express = require('express');
const path = require('path');

const app = Express();

app.set('PORT', process.env.PORT || 3000);
app.set('view engine', 'ejs');
app.set('views', path.join(__dirname, './views'));

app.get('/', (request, response) => {
  response.render('index');
});

app.listen(app.get('PORT'), (error) => {
  if (error) {
    console.log('Server started with an error', error);
    process.exit(1);
  }
  console.log(`Server started and is listening at http://localhost:${app.get('PORT')}`);
});

GitHub: https://github.com/alexeybondarenko/midi-api/commit/25281f6cfe61c79bc2192fcb4e8880ea03be9517

2. Handling MIDI events

Next, let's add some code to handle MIDI events. A MIDI event has a data property with three numbers: status, data1, and data2. For instance, a status of 0x90 is a note-on (key press) event; for this event, data1 is the tone and data2 is the velocity.

You can find more information about MIDI message formats here: http://www.songstuff.com/recording/article/midi_message_format/
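For example, a note-on message can be decoded like this. This is a standalone sketch: the byte values and the parseMidiMessage helper are made up for illustration, not part of the project.

```javascript
// A note-on message as the Web MIDI API delivers it (e.data is a Uint8Array):
// 0x90 = note-on on channel 1, 60 = middle C, 100 = velocity.
const message = new Uint8Array([0x90, 60, 100]);

function parseMidiMessage(data) {
  const status = data[0] & 0xf0;  // upper nibble: message type
  const channel = data[0] & 0x0f; // lower nibble: MIDI channel
  return {
    // velocity 0 is conventionally treated as note-off
    isNoteOn: status === 0x90 && data[2] > 0,
    channel: channel,
    note: data[1],
    velocity: data[2],
  };
}

console.log(parseMidiMessage(message));
// → { isNoteOn: true, channel: 0, note: 60, velocity: 100 }
```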

midiAccess.inputs.values() returns an iterator, so you can easily iterate over all of the inputs. We will use only the first input.

(function() {
  'use strict';

  console.log('App is running');

  var midiAccess = null;
  navigator.requestMIDIAccess().then(onMidiAccessSuccess, onMidiAccessFailure);

  function onMidiAccessSuccess(access) {
    midiAccess = access;

    var inputs = midiAccess.inputs;
    var inputIterators = inputs.values();

    var firstInput = inputIterators.next().value;

    if (!firstInput) return;
    firstInput.onmidimessage = handleMidiMessage;
  }

  function onMidiAccessFailure(error) {
    console.log('Oops. Something went wrong with requestMIDIAccess', error.code);
  }

  function handleMidiMessage(e) {
    console.log(e);
  }

})();

GitHub: https://github.com/alexeybondarenko/midi-api/commit/ea94c5976dc7d0b1da13b0ef8769fa5741e984bc

3. Add WebSocket to the client

Now we need to send the events to the server using BinaryJS.

A WebSocket is a long-lived connection between client and server. It provides faster data transfer than HTTP because the connection is not re-established for every message.

// WEBSOCKETS
var socketUrl = 'ws://' + location.hostname + ':3001';
var client = new BinaryClient(socketUrl);
var MIDIStream = null;
client.on('open', function () {
  MIDIStream = client.createStream();
  MIDIStream.on('data', handleReceiveAudioData);
  MIDIStream.on('end', handleEndAudioStream);
});
function handleReceiveAudioData(data) {
  console.log('receive audio data', data);
}
function handleEndAudioStream(data) {
  console.log('end', data);
}

And let's modify the handleMidiMessage method a little to send the handled MIDI events to the stream.

function handleMidiMessage(e) {
  console.log(e);
  if (!MIDIStream || e.data[0] !== 0x90) return;
  MIDIStream.write(e.data);
}

GitHub: https://github.com/alexeybondarenko/midi-api/commit/255c1d77d5c676e32cea1f4ba81d17c44dc21265

4. Add WebSocket to Server

We need to add a socket handler on the server side. Let's use BinaryJS on the server as well.

const binaryServer = require('binaryjs').BinaryServer;
const socket = new binaryServer({
  port: 3001,
});

socket.on('connection', (client) => {
  client.on('stream', (stream, meta) => {

    stream.on('data', (data) => {
      console.log(data);
    });

    stream.on('end', () => {
      console.log('end of stream');
    });
  });
});

5. Audio generation

To generate audio, we need a library of sounds. I've found free sounds in a public GitHub repo and renamed the files so that they match the key codes of the piano keys.

You can download the sound library from my repository: https://github.com/alexeybondarenko/midi-api/tree/master/server/wav

Next, we have to read the right sound when the server receives a key code and return the audio stream to the client.

For now, we read the sound from a file on each request, but this can be optimized with a cache.

const fs = require('fs');
const path = require('path');

const socket = new binaryServer({
  port: 3001,
});

function playTone(tone, stream) {
  if (tone > 61 || tone < 1) {
    console.log('undefined tone', tone);
    return;
  }
  const filePath = path.resolve(__dirname, 'wav', `${tone}.wav`);
  const file = fs.createReadStream(filePath);
  file.pipe(stream);
  file.on('end', () => {
    file.unpipe(stream);
  });

  return file;
}
socket.on('connection', (client) => {
  client.on('stream', (stream, meta) => {

    stream.on('data', (data) => {
      console.log(data);
      const tone = data.readInt8(1);
      playTone(tone, stream);
    });

    stream.on('end', () => {
      console.log('end of stream');
    });
  });
});

When the client receives an audio stream, it needs to handle the incoming data and pass it to the AudioContext for playback and mixing (in case several sounds are playing at the same time).

GitHub: https://github.com/alexeybondarenko/midi-api/commit/41e37989008903860043d58cbc5c1665e9a72c40

6. Latency logging

Now it's time to think about latency. Let's measure the time from sending a key code to receiving the audio.

// `context` is the AudioContext created earlier on the client
function playSound(audioBuffer) {
  var source = context.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start(0);
  console.timeEnd('send');
}
function handleMidiMessage(e) {
  if (!MIDIStream || e.data[0] !== 0x90) return;
  console.log(e);
  console.time('send');
  MIDIStream.write(e.data);
}

For a local setup, latency is around 12–20ms (depending on the device).

GitHub: https://github.com/alexeybondarenko/midi-api/commit/703003e1e5beefcb98490d35941ee577e2a7484a

7. Multi-client support

Collaboration! Let's add support for multiple clients connected to the same server. All connected clients will be able to play, and all users will hear each other. It's like an online jam.

On the server side, we need to update our play callback.

stream.on('data', (data) => {
  console.log(data);
  const tone = data.readInt8(1);
  Object.keys(socket.clients).map(
    i => playTone(tone, socket.clients[i].createStream())
  );
});

Next, we need to update the client. It now needs to be ready for new streams created by the server.

client.on('open', function () {
  MIDIStream = client.createStream();
});
client.on('stream', function (stream) {
  stream.on('data', handleReceiveAudioData);
  stream.on('end', handleEndAudioStream);
});

That's all. Now our application supports multiple clients, and users can play together.

GitHub: https://github.com/alexeybondarenko/midi-api/commit/02cfc6ed4091218c8f5a292fda1aa4ec46d0e4ea

8. Deploy on Heroku

I like Heroku for my projects because it is free and easy to use. Unfortunately, it has some limitations. One of them is that we can expose only one external port. So to deploy our application to Heroku, we need to handle HTTP and WebSocket connections on the same port.

const server = app.listen(app.get('PORT'), (error) => {
  if (error) {
    console.log('Server started with an error', error);
    process.exit(1);
  }
  console.log(`Server started and is listening at http://localhost:${app.get('PORT')}`);
});
const socket = new binaryServer({
  server: server,
  path: '/socket',
});

We updated the binaryServer options: now the Express server and the WebSocket server listen on the same port.

// /^http/ turns http:// into ws:// and https:// into wss://
var client = new BinaryClient(location.origin.replace(/^http/, 'ws') + '/socket');

GitHub: https://github.com/alexeybondarenko/midi-api/commit/64812199c818d4c494c0ade05a40720cdfe9a386

Demo

Demo of the MIDI API tutorial

Conclusion

We've created a simple web synthesizer with piano sounds. It works fine locally but shows noticeable latency when deployed to Heroku. How can we reduce that latency? We can try splitting the samples into several chunks, reducing the time needed to download the first chunk.

Useful links