About the project

APICultor was born to realize interdisciplinary performances based on sounds from the web platform http://redpanal.org. The system can also be used with any other sound database on the internet, or even run locally.

The sound is processed digitally using different live-coding techniques. A pre-analysis based on Music Information Retrieval (MIR), stored in a database and accessed via a REST web service API, is combined with real-time processing and synthesis, random processes, and human control via external interfaces.
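As a rough illustration of that flow, the sketch below shows how a client might query such a MIR-indexed database over a REST API. The host, endpoint and descriptor names are assumptions for illustration only, not the project's actual API:

```python
# Minimal sketch: query a (hypothetical) MIR-indexed REST web service for samples.
# The base URL, endpoint path and descriptor names are assumptions for
# illustration; see the project docs for the real API.
import requests

API_BASE = "http://127.0.0.1:5000"  # hypothetical local mock web service

def find_samples(descriptor, value, limit=5):
    """Ask the service for audio samples whose pre-analyzed MIR descriptor
    is close to the requested value."""
    response = requests.get(
        f"{API_BASE}/search",
        params={"descriptor": descriptor, "value": value, "limit": limit},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. a list of sample URLs/ids to retrieve and process

if __name__ == "__main__":
    # Look for bright-sounding samples by spectral centroid (assumed descriptor name).
    for sample in find_samples("lowlevel.spectral_centroid.mean", 2500.0):
        print(sample)
```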

Examples are available for SuperCollider, Pyo and Pure Data.

Spanish version: README_es.md

News

  • 2018:
    • Migration to Python 3 and setup.py install, thanks to MarsCrop!
    • MusicEmotionMachine by MarsCrop (in development)
    • Cloud Instrument ready to play with an external controller, running on a dedicated device such as a Raspberry Pi or the Bela platform
  • 2017:
    • SuperCollider patch to play in realtime using sounds from public repositories (databases on the web). Realtime synthesis, WAV retrieval using MIR descriptors, and OSC & MIDI communication between processes and controllers.
  • 2016: First demos (MIR feature extraction, database building, MIR state machine, etc.)

Developers

See Development Guidelines.

License

Free software released under the GPL v3; see LICENSE.

Cloud Instrument

Runs on a desktop computer, a Raspberry Pi, or the Bela platform.

See cloud_instrument/README.md

Interactive DEMO: Cloud Instrument. Retrieves sounds from the cloud using MIR descriptors and processes them in realtime (using "raspicultor", i.e. a Raspberry Pi running APICultor).
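As a rough sketch of the OSC side of such a setup (the host, port and OSC address paths below are assumptions, not the patch's actual interface), a controller-side client could send descriptor targets like this:

```python
# Sketch of a controller-side OSC client sending MIR descriptor targets to the
# cloud instrument. Host, port and OSC addresses are illustrative assumptions.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("raspicultor.local", 9001)  # hypothetical host/port

# Ask the instrument to retrieve a sound near these descriptor values
client.send_message("/mir/bpm", 120.0)
client.send_message("/mir/spectral_centroid", 2500.0)
client.send_message("/retrieve", 1)  # trigger retrieval/playback
```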

UI

Open Stage Control User Interface

Custom MIDI Controller design

Yaeltex custom MIDI controllers

The controller drives a SuperCollider synthesizer/effects processor running on a Raspberry Pi, plus an external sound card for high-fidelity output.
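One possible way to glue a hardware MIDI controller to that SuperCollider processor is a small MIDI-to-OSC bridge. The sketch below is only illustrative: the MIDI port name, CC numbers and OSC addresses are assumptions, not the project's actual mapping:

```python
# Sketch of a MIDI-to-OSC bridge: read CC messages from a hardware controller
# and forward them to the SuperCollider process as OSC. Port name, CC numbers
# and OSC addresses are illustrative assumptions.
import mido
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 57120)  # SuperCollider's default language port

CC_MAP = {1: "/fx/reverb_mix", 2: "/fx/delay_time"}  # hypothetical mapping

with mido.open_input("Yaeltex Controller") as midi_in:  # assumed port name
    for msg in midi_in:
        if msg.type == "control_change" and msg.control in CC_MAP:
            # Scale 0-127 CC values to 0.0-1.0 before sending
            osc.send_message(CC_MAP[msg.control], msg.value / 127.0)
```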

Performances

Sonidos Mutantes

Interdisciplinary performances based on sounds from the web platform Redpanal.org.

Proofs of concept:

Components

  • Mock web service with a REST API that provides audio samples using MIR descriptors as query parameters
  • State machine, with each state defined by several MIR descriptors (see the sketch after this list)
  • Interaction with the free internet sound database http://redpanal.org
  • REST API
  • Web scraping by tag
  • MIR algorithms to extract mean or per-frame values from audio samples
  • Segmentation algorithms using different criteria
  • Classification and clustering algorithms for the samples in the sound database
  • OSC server
  • Examples in SuperCollider and Pyo
  • Examples with MIDI and OSC controllers, local and remote
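
To make the state machine idea concrete, here is a minimal sketch in which each state is defined by ranges of MIR descriptors and selects a matching sample at random. The descriptor names, ranges and sample pool are made up for illustration:

```python
# Minimal sketch of a MIR-driven state machine: each state is defined by ranges
# of MIR descriptors and picks a random sample whose pre-analyzed values fall
# inside those ranges. Descriptor names, ranges and samples are illustrative.
import random

STATES = {
    "calm":    {"loudness": (0.0, 0.3), "spectral_centroid": (0.0, 1500.0)},
    "intense": {"loudness": (0.6, 1.0), "spectral_centroid": (2500.0, 8000.0)},
}

# Pretend database of pre-analyzed samples (id -> descriptor values)
SAMPLES = {
    "drone.wav":  {"loudness": 0.2, "spectral_centroid": 900.0},
    "scream.wav": {"loudness": 0.8, "spectral_centroid": 4200.0},
}

def matches(descriptors, constraints):
    """True if every constrained descriptor falls inside its (low, high) range."""
    return all(low <= descriptors[name] <= high
               for name, (low, high) in constraints.items())

def pick_sample(state):
    candidates = [sid for sid, desc in SAMPLES.items()
                  if matches(desc, STATES[state])]
    return random.choice(candidates) if candidates else None

if __name__ == "__main__":
    for state in ("calm", "intense"):
        print(state, "->", pick_sample(state))
```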

Dependencies

Tested under Linux, macOS (>10.11) and Windows 10.

Supported platforms: Debian, Ubuntu 15.04 and 16.04 (and .10), Docker images, and Raspbian on Raspberry Pi.

See INSTALL.md