Music generation

Creating your own music generation engine

Overview

The generation pipeline at a glance: JJazzLab framework → RhythmProvider → Rhythm → MusicGenerator → phrases → post-processing


RhythmProvider

Your engine must implement the RhythmProvider interface and register it with the @ServiceProvider annotation, so that the RhythmDatabase implementation can automatically find it at startup. The NetBeans mechanism used (Lookup) is described here.

Example: see RhythmStubProviderImpl.java for a simple RhythmProvider implementation.
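
The registration pattern can be sketched as follows. This is a minimal, self-contained illustration using simplified stand-in interfaces, not the real JJazzLab signatures; in an actual plugin the class would carry the NetBeans `@ServiceProvider(service = RhythmProvider.class)` annotation so the RhythmDatabase discovers it via Lookup.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in types; the real JJazzLab interfaces are richer.
interface Rhythm {
    String getName();
}

interface RhythmProvider {
    String getInfo();
    List<Rhythm> getBuiltinRhythms();
}

// In a real plugin this class would be annotated with
// @ServiceProvider(service = RhythmProvider.class)
// so that the framework finds it automatically at startup.
class MyRhythmProvider implements RhythmProvider {
    @Override
    public String getInfo() {
        return "MyEngine rhythms";
    }

    @Override
    public List<Rhythm> getBuiltinRhythms() {
        List<Rhythm> res = new ArrayList<>();
        res.add(() -> "MySwing-4/4");  // Rhythm is a one-method stand-in, so a lambda works
        return res;
    }
}

public class ProviderDemo {
    public static void main(String[] args) {
        RhythmProvider rp = new MyRhythmProvider();
        System.out.println(rp.getInfo() + ": " + rp.getBuiltinRhythms().get(0).getName());
    }
}
```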

Rhythm, RhythmVoices and RhythmParameters

The engine provides Rhythm instances to the framework.

The Rhythm interface notably defines:

  • name, time signature, preferred tempo, feel, etc.

  • RhythmVoices : the supported tracks (Drums, Bass, Piano, etc.). Each RhythmVoice defines the recommended MIDI instrument and settings for its track.

  • RhythmParameters : the "control knobs" given to the user to modulate music generation. Common rhythm parameters are Variation, Intensity, Mute, etc. They appear in the Song Structure Editor, where their value can be set for each Song Part. The framework provides a default UI widget for each rhythm parameter type, but you can define your own widget if required.
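
The data carried by a Rhythm can be sketched as a plain data model. This is a simplified, hypothetical stand-in for illustration only; the real JJazzLab types have different names and signatures.

```java
import java.util.List;

// Simplified stand-ins for the concepts above (not the real JJazzLab API).
// A voice describes one track with its recommended MIDI settings.
record RhythmVoice(String trackName, int preferredMidiProgram, int midiChannel) {}

// A parameter is a user-facing "control knob" with a set of possible values.
record RhythmParameter(String name, List<String> possibleValues, String defaultValue) {}

record SimpleRhythm(String name,
                    String timeSignature,
                    int preferredTempo,
                    List<RhythmVoice> voices,
                    List<RhythmParameter> parameters) {}

public class RhythmModelDemo {
    public static void main(String[] args) {
        SimpleRhythm swing = new SimpleRhythm(
                "MediumSwing", "4/4", 140,
                List.of(new RhythmVoice("Drums", 0, 10),     // channel 10: GM drum channel
                        new RhythmVoice("Bass", 33, 11)),    // program 33 (0-based): fingered bass
                List.of(new RhythmParameter("Variation", List.of("A", "B", "C"), "A"),
                        new RhythmParameter("Intensity", List.of("low", "mid", "high"), "mid")));
        System.out.println(swing.name() + " " + swing.timeSignature()
                + " voices=" + swing.voices().size());
    }
}
```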

Note:

Rhythm instances are actually provided via RhythmInfo instances, which the RhythmDatabase uses for its cache file. They represent what the user sees in the rhythm selection dialog.

MusicGenerator

A Rhythm instance should implement the MusicGeneratorProvider interface.

Actual music generation is performed by the provided MusicGenerator instance. It receives a song context (chord leadsheet, song structure, tempo, etc.) and returns musical phrases (one per instrument) that form the backing track.

Example: see RhythmStub.java for a simple Rhythm implementation.

Example: see DummyGenerator.java for a simple MusicGenerator implementation.
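
To make the contract concrete, here is a toy generator built on hypothetical, simplified types (the real MusicGenerator receives a full SongContext and returns Phrase objects keyed by RhythmVoice). It maps a chord sequence to one phrase per track: the bass plays each chord root on the downbeat, the drums play a hi-hat on every beat.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-ins, for illustration only.
record Note(int pitch, float startBeat, float durBeats, int velocity) {}

interface SimpleGenerator {
    // One phrase (list of notes) per track name.
    Map<String, List<Note>> generate(List<Integer> chordRootPitches, int beatsPerChord);
}

// A trivial generator: bass plays each chord root held for the whole chord,
// drums play a closed hi-hat (GM pitch 42) on every beat.
class DummyGeneratorSketch implements SimpleGenerator {
    @Override
    public Map<String, List<Note>> generate(List<Integer> roots, int beatsPerChord) {
        List<Note> bass = new ArrayList<>();
        List<Note> drums = new ArrayList<>();
        float beat = 0;
        for (int root : roots) {
            bass.add(new Note(root, beat, beatsPerChord, 80));
            for (int b = 0; b < beatsPerChord; b++) {
                drums.add(new Note(42, beat + b, 0.5f, 70));
            }
            beat += beatsPerChord;
        }
        Map<String, List<Note>> phrases = new LinkedHashMap<>();
        phrases.put("Bass", bass);
        phrases.put("Drums", drums);
        return phrases;
    }
}

public class GeneratorDemo {
    public static void main(String[] args) {
        // Two chords (roots C2 and F2), 4 beats each.
        var phrases = new DummyGeneratorSketch().generate(List.of(36, 41), 4);
        System.out.println("bass notes=" + phrases.get("Bass").size()
                + " drum hits=" + phrases.get("Drums").size());
    }
}
```

A real engine would of course look at the full chord symbols and the rhythm parameter values, but the shape of the result is the same: one phrase per RhythmVoice.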

Post-processing

After generation, the framework applies engine-independent post-processing to the returned phrases:

  • muting instruments in a song part

  • per-channel velocity shift

  • volume fade-out

  • tempo factor changes

  • etc.

The post-processing tasks performed by the framework are detailed in the MusicGenerator interface comments.
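
As an illustration, a per-channel velocity shift is just a pure transformation over a phrase. This sketch reuses a hypothetical Note stand-in (the framework applies comparable passes on its own Phrase objects):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical Note stand-in, for illustration only.
record Note(int pitch, float startBeat, float durBeats, int velocity) {}

public class PostProcessDemo {
    // Engine-independent pass: shift every velocity, clamped to the MIDI 0-127 range.
    static List<Note> shiftVelocity(List<Note> phrase, int shift) {
        return phrase.stream()
                .map(n -> new Note(n.pitch(), n.startBeat(), n.durBeats(),
                        Math.max(0, Math.min(127, n.velocity() + shift))))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Note> phrase = List.of(new Note(60, 0, 1, 100), new Note(64, 1, 1, 120));
        List<Note> louder = shiftVelocity(phrase, 20);
        System.out.println(louder.get(0).velocity() + " " + louder.get(1).velocity());
    }
}
```

Because such passes are generic, they live in the framework: an engine only has to return well-formed phrases and gets muting, fades, and velocity shifts for free.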
