sentient
iOS synthesizer with a machine learning algorithm
A few words about my new synthesizer, built around a neural network that gradually learns from everyday sounds and builds a unique wavetable for each user.

The slider controlling the neural network’s sensitivity to new sounds will appear as eyes: the more eyes are open, the more sensitive it is; if they are all closed, it doesn’t learn. The slider for selecting saved states of the network by date will become a planet with satellites: the crescent shadow will show the day of the year, the Moon’s position will hint at the day of the month, and the satellite’s position will indicate the time of day. The neural network will “swell” while it learns.
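
As a rough illustration of the planet-with-satellites control, the sketch below maps a snapshot date onto three normalized phases (year, month, day) that could drive the crescent, the Moon, and the satellite. Everything here, including the name CelestialDial and its properties, is a hypothetical illustration, not the app’s actual code.

```swift
import Foundation

// Hypothetical sketch: mapping a saved-state date onto the planet-and-satellite
// control described above. All names are illustrative, not the app's API.
struct CelestialDial {
    /// Fraction of the year elapsed, shown as the crescent shadow (0...1).
    let yearPhase: Double
    /// Fraction of the month elapsed, shown by the Moon's position (0...1).
    let monthPhase: Double
    /// Fraction of the day elapsed, shown by the satellite's position (0...1).
    let dayPhase: Double

    init(date: Date, calendar: Calendar = .current) {
        let dayOfYear = calendar.ordinality(of: .day, in: .year, for: date) ?? 1
        let daysInYear = calendar.range(of: .day, in: .year, for: date)?.count ?? 365
        yearPhase = Double(dayOfYear - 1) / Double(daysInYear)

        let dayOfMonth = calendar.component(.day, from: date)
        let daysInMonth = calendar.range(of: .day, in: .month, for: date)?.count ?? 30
        monthPhase = Double(dayOfMonth - 1) / Double(daysInMonth)

        let seconds = date.timeIntervalSince(calendar.startOfDay(for: date))
        dayPhase = seconds / 86_400
    }

    /// Angle in radians for placing the satellite on its orbit.
    var satelliteAngle: Double { dayPhase * 2 * .pi }
}
```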

The neuron that stores a sound most similar to the incoming one will “swell” more than the others and update its stored sound. Nearby neurons will also update, but to a lesser extent, and so “swell” less. An orange dot will highlight the selected neuron, and the curve above it will show the sound fragment learned by that neuron: the wavetable the app uses to synthesize notes. For now, a logarithmic scale of frequencies and amplitudes will take the place of a keyboard.
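
The best-match-updates-most, neighbors-update-less scheme described above resembles a Kohonen self-organizing map, though the post does not name the algorithm. Below is a minimal sketch of one such update step over a row of wavetable-storing neurons; the grid layout, learning rate, and neighborhood radius are all assumptions made for illustration.

```swift
import Foundation

// Hedged sketch of a self-organizing-map-style update, assuming each neuron
// stores one wavetable (a fixed-length array of samples). The 1-D grid,
// learning rate, and neighborhood radius are illustrative, not the app's.
struct WavetableSOM {
    var neurons: [[Float]]              // one wavetable per neuron, in a row
    let learningRate: Float = 0.3
    let neighborhoodRadius: Float = 2.0

    /// Index of the neuron whose wavetable is closest to the incoming sound.
    func bestMatch(for input: [Float]) -> Int {
        neurons.indices.min { a, b in
            distance(neurons[a], input) < distance(neurons[b], input)
        } ?? 0
    }

    /// One learning step: the winner moves most, neighbors move less
    /// (they "swell" less).
    mutating func learn(from input: [Float]) {
        let winner = bestMatch(for: input)
        for i in neurons.indices {
            let gridDistance = Float(abs(i - winner))
            // Gaussian falloff: influence shrinks with distance from the winner.
            let influence = exp(-gridDistance * gridDistance
                                / (2 * neighborhoodRadius * neighborhoodRadius))
            let rate = learningRate * influence
            for s in neurons[i].indices {
                neurons[i][s] += rate * (input[s] - neurons[i][s])
            }
        }
    }

    private func distance(_ a: [Float], _ b: [Float]) -> Float {
        zip(a, b).reduce(0) { $0 + ($1.0 - $1.1) * ($1.0 - $1.1) }
    }
}
```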
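Since each neuron’s learned fragment serves as a wavetable, playing a note reduces to reading that table cyclically at a speed set by pitch. The sketch below also includes one plausible reading of the logarithmic pad: a normalized position x mapped to frequency as f = fMin · (fMax / fMin)^x. The names and the frequency range are illustrative assumptions, not taken from the app.

```swift
import Foundation

// Hedged sketch of wavetable playback: the learned fragment is read cyclically,
// with the read speed set by the note's frequency. Names are illustrative.
struct WavetableOscillator {
    let table: [Float]              // the wavetable learned by a neuron
    let sampleRate: Float = 44_100
    private var phase: Float = 0    // current read position, in table samples

    /// Render the next `count` samples at the given frequency and amplitude.
    mutating func render(frequency: Float, amplitude: Float, count: Int) -> [Float] {
        let step = frequency * Float(table.count) / sampleRate
        var out = [Float]()
        out.reserveCapacity(count)
        for _ in 0..<count {
            out.append(table[Int(phase)] * amplitude)
            // Wrap the read position so the table loops like a single cycle.
            phase = (phase + step).truncatingRemainder(dividingBy: Float(table.count))
        }
        return out
    }
}

/// Map a normalized pad position x in 0...1 to a frequency on a log scale;
/// the 27.5 Hz to 4186 Hz range (a piano's) is an arbitrary choice here.
func padFrequency(_ x: Float, fMin: Float = 27.5, fMax: Float = 4186) -> Float {
    fMin * pow(fMax / fMin, x)
}
```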