
Add MrKov #44

@capital-G

Description


There is already a Python implementation in the Dropbox and also a wrong implementation of the algorithm, which is interesting nonetheless IMO.
But I would be really interested in an SC implementation of this, as I am not familiar with FFT in SC, and having a live signal makes this interesting.

One basic idea that differs from the Python implementation could be the use of a locality-sensitive hash instead of a representation in a vector space - here is a sketch of a suggestion

Start

|1| 2
|-| -
|A| B

we record and hash 2 grains (A and B) and delay playback by 1 grain, so we have a look-ahead of 1 grain - the grain currently played back is indicated by | |

Hash "collision"

1  2  3 |4| 5
-  -  - |-| -
A  B  C |D| B

After recording and playing back n grains, we encounter a hash collision in our look-ahead: grain 5 has the same hash as grain 2. We can now either continue linear playback or jump back to the earlier occurrence of that hash.

Let's say we jumped back to grain 2

1  |2|  3  4  5  6
-  |-|  -  -  -  -
A  |B|  C  D  B  E
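A minimal offline sketch of this jump mechanism in Python (the function name and parameters are made up for illustration; a real SC version would build the hash table online while recording and would jump with some probability rather than always):

```python
import random

def playback_order(grain_hashes, jump_probability=0.5, max_steps=32, seed=0):
    """Simulate which grain indices get played back.

    grain_hashes: one hashable symbol per recorded grain.
    The look-ahead is one grain: while playing grain i we inspect grain i+1,
    and if its hash was already seen earlier we may jump back there.
    """
    rng = random.Random(seed)
    # first position each hash occurs at (precomputed here for simplicity)
    first_seen = {}
    for pos, h in enumerate(grain_hashes):
        first_seen.setdefault(h, pos)

    order = []
    i = 0
    while i < len(grain_hashes) and len(order) < max_steps:
        order.append(i)
        look_ahead = i + 1
        if look_ahead < len(grain_hashes):
            first = first_seen[grain_hashes[look_ahead]]
            # the look-ahead grain's hash collided with an earlier grain
            if first != look_ahead and rng.random() < jump_probability:
                i = first  # jump back to the earlier occurrence
                continue
        i += 1
    return order
```

With the hashes A B C D B from above and a jump probability of 1, playback loops through grains 2-4 forever (capped by max_steps); with probability 0 it plays linearly.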

Transitions

IIRC Markov chains were first used to determine the next character in a book given a certain character (and they are also used in the PageRank algorithm, which spawned the company Google) - in mathematical formalism we speak of the transition from one state to another. We can also consider the last n states for our prediction of state n+1 - this is called a Markov chain of order n.

In the example above we do not take the state of the current grain into account when calculating the next state, so we do not really capture the characteristics of a Markov chain.
Using the transition probability from grain A to grain D would be possible, but this has the problem that we would need to keep all possible grains in memory, and therefore needs a different design.
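To illustrate what an order-n Markov chain over hash symbols (rather than over the raw grains, so only hashes need to stay in memory) would look like, a hypothetical Python sketch:

```python
from collections import defaultdict

def build_transitions(symbols, order=1):
    """Count how often each length-`order` context is followed by a symbol."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(symbols) - order):
        context = tuple(symbols[i:i + order])
        counts[context][symbols[i + order]] += 1
    return counts

def transition_probability(counts, context, symbol):
    """Estimated probability of `symbol` following `context`."""
    successors = counts.get(tuple(context), {})
    total = sum(successors.values())
    return successors.get(symbol, 0) / total if total else 0.0
```

For the hash sequence A B C D B, the order-1 chain gives P(C | B) = 1.0 from the single observed B→C transition.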

Performance parameters

Hash bit resolution

Reducing the bit size of our hash would limit our dictionary size and would increase jumps - although it would be good to create high-resolution hashes and allow reducing them at a later stage for more performance.
It would be good to use locality-sensitive hashing, so reducing the bit size of our hash would result in a confusion of similar-sounding grains.
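A common way to get such a locality-sensitive hash is random hyperplane hashing: each bit is the sign of the dot product between the feature vector and a random vector, so truncating the bit string merges similar grains. A Python sketch under that assumption (the feature vector standing in for e.g. FFT magnitudes of a grain):

```python
import random

def make_hyperplanes(dimensions, bits, seed=0):
    """One random hyperplane (normal vector) per hash bit."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(dimensions)] for _ in range(bits)]

def lsh(features, hyperplanes):
    """Bit string: on which side of each hyperplane the feature vector lies."""
    return "".join(
        "1" if sum(f * h for f, h in zip(features, plane)) >= 0.0 else "0"
        for plane in hyperplanes
    )

def reduce_bits(hash_string, bits):
    """Drop trailing bits to trade hash resolution for more collisions."""
    return hash_string[:bits]
```

Similar feature vectors disagree only on a few bits, so dropping bits makes them collide; note the hash is invariant to positive scaling of the features, which may or may not be desirable for loudness.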

Length of Buffer

As at some point we have to clear the buffer in a FIFO way, the size of the buffer correlates with the number of available hashes, so a larger buffer allows us to jump around more of the signal.
As the playback speed is 1 (?), the erasing of our buffer cannot catch up with the playback as long as we do not jump to a grain which is currently being deleted, so it would be a good procedure to delete the hash before freeing samples from the buffer.
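A sketch of that eviction order in Python, assuming a FIFO grain buffer (the class and its methods are hypothetical): the hash entry is removed before the grain itself would be freed, so no jump target can point at a grain that is being deleted.

```python
from collections import deque, defaultdict

class GrainBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.order = deque()               # (position, hash), oldest first
        self.by_hash = defaultdict(deque)  # hash -> positions, oldest first

    def record(self, position, grain_hash):
        if len(self.order) == self.capacity:
            old_pos, old_hash = self.order.popleft()
            # delete the hash entry BEFORE the grain data would be freed,
            # so jump_target can never return a half-deleted grain
            self.by_hash[old_hash].popleft()
            if not self.by_hash[old_hash]:
                del self.by_hash[old_hash]
        self.order.append((position, grain_hash))
        self.by_hash[grain_hash].append(position)

    def jump_target(self, grain_hash):
        """Oldest grain still in the buffer with this hash, or None."""
        positions = self.by_hash.get(grain_hash)
        return positions[0] if positions else None
```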

Skewness of distribution

As described above, we used a uniform distribution for the likelihood of jumps - maybe it would be good to introduce a gradual skewness as a parameter using the Box-Muller transform, which allows us to transform a uniform distribution into a normal distribution.

BUT this contradicts the
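For reference, the Box-Muller transform itself is only a few lines: two independent uniform samples on (0, 1] become two independent standard normal samples. A Python sketch:

```python
import math
import random

def box_muller(u1, u2):
    """Map two uniforms (u1 in (0, 1], u2 in [0, 1)) to two standard normals."""
    radius = math.sqrt(-2.0 * math.log(u1))
    angle = 2.0 * math.pi * u2
    return radius * math.cos(angle), radius * math.sin(angle)

def sample_normal(rng):
    u1 = 1.0 - rng.random()  # shift into (0, 1] to avoid log(0)
    return box_muller(u1, rng.random())[0]
```

(Python's stdlib also offers `random.gauss` directly; the explicit transform is shown here because the issue names it.)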

Using characters as hashes

If we use characters as hash symbols, this allows us to interchange text and music hashes => to explore further

Exchange transition matrix

Applying the transition matrix of signal A to signal B could yield interesting results.

Stationary Markov process

One of the interesting results of Markov chains is the stationary distribution, which gives us the expected occurrences of each state if we ran the Markov chain infinitely.
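The stationary distribution of a small transition matrix can be estimated by power iteration, i.e. applying the matrix to a probability vector until it stops changing; a Python sketch:

```python
def stationary_distribution(matrix, iterations=200):
    """Estimate the stationary distribution of a row-stochastic matrix.

    matrix[i][j] is the probability of moving from state i to state j.
    Starting from a uniform vector, repeatedly multiply by the matrix.
    """
    n = len(matrix)
    dist = [1.0 / n] * n
    for _ in range(iterations):
        dist = [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]
    return dist
```

For a two-state chain with P = [[0.9, 0.1], [0.5, 0.5]] this converges to (5/6, 1/6), the long-run fraction of time spent in each state.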
