GSoC 2016 - Week 10 - Raw input timings
This was my 10th week working on note entry with MuseScore for Google Summer of Code. This week I started work on a new way of storing and handling user input internally that should result in more accurate notation.
This week’s summary:
- Started a new internal representation of user input:
- Storing raw input times
- Adjusting times to cope with tempo changes
- Converting to ticks and quantising to the beat
Still to do:
- Instant note entry in automatic Real-time mode
- Testing and user feedback
New internal representation
The “semi-realtime” approach is pretty straightforward: a note is inserted for each key that is being pressed at the instant a beat is received. At low input tempos this works pretty well, but at higher tempos it becomes increasingly likely that notes will be missed because the user pressed or released the key slightly too late or too early. My plan for overcoming this limitation is to store the time at which the NOTE_ON and NOTE_OFF events were received, and decide whether a note coincides with a beat based on these. The initial implementation is quite crude, but with the raw timings available it should be possible to provide a certain amount of correction for slight tempo changes or other inaccuracies, and to support more complicated rhythmical features such as tuplets.
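As a rough sketch of what this might look like (the names here are hypothetical, not MuseScore's actual classes): each note's raw NOTE_ON/NOTE_OFF times are stored, and the onset is matched to a beat afterwards, rather than sampling which keys are held at the instant a beat arrives.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch, not MuseScore's actual code: keep the raw
// NOTE_ON/NOTE_OFF times for each note instead of only checking which
// keys are held down when a beat arrives.
struct RawNote {
    int pitch;       // MIDI pitch number
    double onTime;   // seconds at which NOTE_ON was received
    double offTime;  // seconds at which NOTE_OFF was received
};

// Quantise a raw onset time to the nearest beat index for a fixed
// beat period (seconds per beat).
int nearestBeat(double t, double beatPeriod) {
    return static_cast<int>(std::lround(t / beatPeriod));
}

// Decide whether a note belongs to a given beat: its onset must fall
// within some tolerance (a fraction of the beat period) of the beat's
// nominal time, so a slightly early or late press is still accepted.
bool coincidesWithBeat(const RawNote& n, int beat, double beatPeriod,
                       double tolerance = 0.25) {
    double beatTime = beat * beatPeriod;
    return std::fabs(n.onTime - beatTime) <= tolerance * beatPeriod;
}
```

For example, a key pressed at t = 0.9 s and released at t = 0.95 s would be missed entirely by the instantaneous approach for a beat arriving at t = 1.0 s, but with stored timings it can still be assigned to that beat.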
Comments
Could you point me to where in the code you are doing "1. Storing of raw input times"?
In reply to could you point me where in by ericfontainejazz
I think I abandoned that idea due to lack of time and other issues arising that needed more immediate attention, such as my code not working with PortMidi (which meant it only worked on Linux and not Windows or Mac).
Someone suggested that a good place to store it might be in the note offset field in the Inspector. This would have these benefits:
But it would have these drawbacks:
So it might end up being better to have separate MIDI event queues for storing human performance information. That would also allow you to keep multiple queues and pick the best one for playback.
In reply to I think I abandoned that idea by shoogle
Thanks for your response. I wasn't wanting to store the timestamps to give a human performance (although now that I think about it... an option to do that might be useful); really the main reason I want to keep track of the timestamps is to get more precise timing about when the MIDI was received by MuseScore.
In reply to thanks for your response. by ericfontainejazz
Sounds like you just want to qDebug the current time when a MIDI input event is received and again when realtimeAdvance is called.
In reply to Sounds like you just want to by shoogle
No. I want to timestamp incoming messages the moment MuseScore receives them, in order to get as much accuracy as possible, so accuracy isn't lost to potential thread context switches or cache misses between the time MuseScore receives an event and the time your real-time code actually processes it.
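A minimal illustration of that idea (hypothetical names; in MuseScore the stamp would ideally come from the audio/MIDI driver's clock rather than std::chrono): the input callback records a timestamp the moment each message arrives and queues it, so that later processing, however delayed by scheduling, still sees the original arrival time.

```cpp
#include <chrono>
#include <cstdint>
#include <deque>

// Hypothetical sketch: timestamp MIDI bytes immediately on receipt,
// before any further processing can delay them.
struct StampedEvent {
    double time;          // seconds on a monotonic clock, taken at receipt
    uint8_t status;       // MIDI status byte (e.g. 0x90 = NOTE_ON)
    uint8_t data1, data2; // e.g. pitch and velocity
};

class MidiInputQueue {
public:
    // Called from the MIDI input callback: stamp first, then store.
    void onMidiReceived(uint8_t status, uint8_t d1, uint8_t d2) {
        double now = std::chrono::duration<double>(
            std::chrono::steady_clock::now().time_since_epoch()).count();
        m_events.push_back({now, status, d1, d2});
    }

    // Called later (e.g. when the note-entry code runs): each event
    // keeps the time it actually arrived, not the time it is processed.
    bool pop(StampedEvent& out) {
        if (m_events.empty())
            return false;
        out = m_events.front();
        m_events.pop_front();
        return true;
    }

private:
    std::deque<StampedEvent> m_events;
};
```

A real implementation would also need to be thread-safe, since the input callback and the consumer typically run on different threads; that is omitted here for brevity.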
In reply to No. I want to timestamp by ericfontainejazz
For the PortAudio case, it seems we should use Pa_GetStreamTime(PaStream* stream):
Returns the current time in seconds for a stream according to the same clock used to generate callback PaStreamCallbackTimeInfo timestamps. The time values are monotonically increasing and have unspecified origin.
Pa_GetStreamTime returns valid time values for the entire life of the stream, from when the stream is opened until it is closed. Starting and stopping the stream does not affect the passage of time returned by Pa_GetStreamTime.
This time may be used for synchronizing other events to the audio stream, for example synchronizing audio to MIDI.
That way incoming MIDI gets time-stamped exactly relative to the stream time.
http://www.portaudio.com/docs/v19-doxydocs/portaudio_8h.html#a2b3fb60e6…
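Since Pa_GetStreamTime() returns seconds on a monotonic clock with an unspecified origin, only the difference between two stream times is meaningful. A sketch of how a stamp taken that way might be mapped to a beat position (the function and parameter names are hypothetical; midiTime would be the stream time captured when the event arrived, and startTime the stream time captured when recording began):

```cpp
#include <cmath>

// Hypothetical sketch: convert a PortAudio-style stream-time stamp
// (seconds, unspecified origin) into a beat position. Only the
// difference between two stream times is meaningful.
double beatPosition(double midiTime, double startTime, double bpm) {
    // Seconds since recording started, converted to beats.
    return (midiTime - startTime) * bpm / 60.0;
}

// Round to the nearest whole beat for crude quantisation.
int nearestWholeBeat(double midiTime, double startTime, double bpm) {
    return static_cast<int>(
        std::lround(beatPosition(midiTime, startTime, bpm)));
}
```

For example, an event stamped 2.5 s after recording started at 120 bpm lands at beat position 5.0.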