
Tidal Profile - Atsushi Tadokoro

Tidal Cyclist: Atsushi Tadokoro
aka: akatado, yoppa
Location: Maebashi, Japan
Years with Tidal: 7 yrs
Other LiveCoding env: SuperCollider, Sonic Pi, Hydra, KodeLife
Music available online: SoundCloud, Vimeo
Code online: GitHub
Other music/audio sw: Audacity, Pure Data, Ableton Live
Comments: Club Tidal Forum Thread

photo: Phont @phont1105 (ANGRM™)


What do you like about livecoding in Tidal? What inspires you?

What I like about live coding with TidalCycles is that I can improvise and change patterns flexibly on a per-part basis (the connections d1, d2, d3, and so on). It also combines musical and coding ideas at a high level.
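For instance (a minimal sketch using the standard SuperDirt samples), each connection holds its own pattern, so one part can be rewritten and re-evaluated while the others keep playing:

d1 $ s "bd*4"
d2 $ s "hh*8" # gain 0.8
-- re-evaluate d2 alone to vary the hi-hats while d1 keeps running
d2 $ sometimesBy 0.3 (fast 2) $ s "hh*8" # gain 0.8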

How do you approach your livecoding sessions?

In my case, I pre-code a rough flow in TidalCycles according to the time I need to perform. However, I leave as much room as possible for improvisational changes and extensions to the code, which makes for varied, improvisational performances.

What functions and coding approaches do you like to use?

The function I currently use most often is the combination of scale and polymetric sequences (the {..}%n notation) to generate various phrases. For example, the following code is used.

d1 $ s "supersaw*16"
# sustain "0.1"
# note (scale "minPent" "{-12..0}%5")

If the scale used (minPent) is changed to something else, the impression of the melody changes drastically. It is like improvisation in modal jazz.
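For example (a sketch of the same pattern with one word changed), swapping minPent for another scale name such as dorian shifts the mode of the whole phrase while the rhythm stays identical:

d1 $ s "supersaw*16"
# sustain "0.1"
# note (scale "dorian" "{-12..0}%5")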

Furthermore, by using the left and right channels effectively and by adding filters, you can add more depth to the performance.

d1 $ s "supersaw*16"
# pan (rand)
# sustain "0.1"
# note (scale "indian" "{-12..[0, 5]}%[5, 7]")
# lpf (range 200 10000 $ slow 8 $ sine) # resonance "0.2"

More complex rhythmic variations can be generated by using functions such as "jux" and "rev", which create changes on the time axis.

d1 $ sometimesBy 0.3 (jux (iter 16))
$ sometimesBy 0.1 (rev)
$ s "supersaw*16"
# pan (rand)
# sustain "0.1"
# note (scale "indian" "{-12..[0, 5]}%[5, 7]")
# lpf (range 200 10000 $ slow 8 $ sine) # resonance "0.2"

Do you use Tidal with other tools / environments?

I use TidalCycles in combination with other applications that generate visuals for audiovisual performance. Initially I used openFrameworks, but recently I have been using TouchDesigner.

However, it is difficult for one person to live code sound and visuals at the same time. So I currently use a method where the results of coding in TidalCycles are sent via OSC (Open Sound Control) to the application that generates the visuals. I do the following.

First, I decide the names of the parameters to be sent from TidalCycles to TouchDesigner. For example, let's say we want to send an integer value "td_s" that specifies the scene number in TouchDesigner. First, add the following statement to "BootTidal.hs":

let td_s = pI "td_s"
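Once defined, td_s can be attached to a pattern like any built-in control, so the scene number rides along with the musical events (a minimal sketch; 2 is an arbitrary scene number):

d1 $ s "bd*4" # td_s 2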

Next, add the following statement to the SuperCollider initialization file "startup.scd". It forwards the OSC messages that TidalCycles sends to SuperDirt on to yet another application, using the OSC address "/tidalplay" and port number 3333.

a = NetAddr.new("localhost", 3333);
OSCdef(\tidalplay, {
    arg msg;
    // forward the content of each /dirt/play message to port 3333
    a.sendMsg("/tidalplay", msg.asString);
}, '/dirt/play', n);

This OSC is parsed and used by the application generating the visuals. For example, in the case of TouchDesigner, the number can be retrieved by writing the following Python script in OSC In DAT.

def onReceiveOSC(dat, rowIndex, message, bytes, timeStamp, address, args, peer):
    # The incoming message is a space-separated string of parameter
    # names and values; take the value that follows "td_s".
    lst = message.split()
    td_s = lst[lst.index('"td_s"') + 1]
    op('scene_no').par.value0 = td_s
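The same key/value extraction can be tried outside TouchDesigner with a plain Python sketch (the message string below is illustrative of the forwarded /dirt/play content, not captured output):

```python
def extract_param(message, name):
    """Return the value that follows a quoted parameter name in a
    space-separated /dirt/play message string."""
    lst = message.split()
    return lst[lst.index('"' + name + '"') + 1]

# An illustrative message as it might appear in the OSC In DAT
msg = '/dirt/play "cps" 0.5625 "cycle" 42.0 "s" "supersaw" "td_s" 3'
print(extract_param(msg, "td_s"))  # prints: 3
```

Note that the extracted value arrives as a string, so it may need converting with int() before use.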

This allows for live-coded audiovisual performances with synchronized sound and visuals, as shown in the video below!


For more details on the code, please refer to the GitHub repository below.

Tidal Contributions

How do you contribute to Tidal Cycles? What have you worked on?

My focus is on education and the popularization of live coding with TidalCycles. I give lectures at universities on the central theme of live coding. The first half of the class covers the basics of live coding with Sonic Pi, and the second half is a full-scale live coding performance using TidalCycles. This type of lecture is rarely offered in Japan and has been well received.

What motivates you to work on Tidal?

The appeal of Tidal is its ability to generate very complex and diverse music and sound from a few lines of simple code. The scalability of samples and instruments is also attractive.


Tell us about your livecoding music.

As I mentioned in the Livecoding section, I am interested in audio-visual expression through livecoding. In addition to that, I am interested in rhythmic expressions that sound natural but are a little bit twisted. For example, I am interested in polyrhythms, polymeters, and asymmetrical rhythms.
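A polymeter of this kind can be written directly in Tidal's {..}% notation (a minimal sketch using standard SuperDirt drum samples): the three hi-hat steps run at the same step rate as the four kick/snare steps, so the two layers drift apart and realign only every three cycles.

d1 $ s "{bd sn bd sn, hh hh hh}%4"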

How has your music evolved since you have been livecoding?

Livecoding has made me more sensitive to rhythmic structure than before. I used to use a lot of simple four-beat repetitions, but I have started to create rhythms with more complexity.

What samples or instruments do you like to work with?

I use the sound samples and instruments included in SuperDirt, as well as my own original samples and instruments, which I have made available in the following GitHub repository.

What projects are you currently working on or planning? What's next?

I am currently working on live coding of laser beams. I hope to show the results of my various experiments at Algoraves. The current status is shown in the video below.


Comments: Club Tidal Forum Thread