BBC micro:bit

The long wait is over: after a number of delays, the BBC micro:bit has been released today.

Most of the following was written at the end of January 2016 and has sat in draft form since then… so from today’s perspective it starts with a link to old news, but it is what it is:

[image: IMG_1840]

Read More

KULES ex-factory

There’s a thing called Art City going on in Stoke-on-Trent (a five-year programme that launched in September 2014). The KULES Art Residency is part of that, and their ex-factory exhibition runs 8th–29th November 2014; more info is available from the supporting AirSpace Gallery.

I provided art-technology and programming support for Leslie Deere’s Laserdome installation, found on the ground floor of the disused Olympus Engineering building – a space with a tremendous amount of reverberation that exaggerates the mid-low frequency range.


Read More

Live Coding in MaxMSP

Laurence Counihan clicking a heart-shaped glyph today triggered a notification that brought one of my video recordings from 2010 to my attention… I couldn’t remember what happens in ‘live code Max MSP 20100220’, so I watched it again.

live code Max MSP 20100220 from Samuel Freeman on Vimeo.

I saw that Counihan had also clicked ‘like’ on Max/MSP Live Coding #2 from Kingsley Ash, of which I had been aware when recording in 2010:

Max/MSP Live Coding #2 from Kingsley Ash on Vimeo.

Using a buffer with variable speed record (poke~) and playback (index~) to generate sounds. Some really interesting noises using a very small number of objects, but the controls are unpredictable and sometimes slip out of range.
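For anyone curious what that buffer technique looks like outside of Max, here is a rough offline sketch of the idea in Python/numpy (my own illustration, not Ash’s patch): a signal is written into a one-second buffer by a variable-speed write pointer and read back by a different variable-speed pointer, which is roughly what driving poke~ and index~ with wobbling pointers does.

```python
import numpy as np

sr = 44100
buf = np.zeros(sr)                     # one-second buffer, as buffer~ might hold
t = np.arange(sr * 2) / sr             # two seconds of input signal
inp = 0.5 * np.sin(2 * np.pi * 220 * t)

# Writing: a pointer whose speed wobbles around 1.0 (cf. driving poke~ at a variable rate)
write_speed = 1.0 + 0.3 * np.sin(2 * np.pi * 0.5 * t)
write_idx = np.mod(np.cumsum(write_speed), len(buf)).astype(int)
buf[write_idx] = inp                   # later writes overwrite earlier ones, as with poke~

# Reading: a slower, also wobbling pointer (cf. index~), nearest-sample lookup
read_speed = 0.5 + 0.4 * np.sin(2 * np.pi * 0.25 * t)
read_idx = np.mod(np.cumsum(read_speed), len(buf)).astype(int)
out = buf[read_idx]                    # resulting signal; write to disk with e.g. soundfile
```

Unlike the patch, this sketch writes first and reads afterwards; in Max the two pointers run concurrently, which is where the unpredictable, slipping-out-of-range character comes from.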

The subject of live coding is trending in my mind at the moment as I prepare for a number of things that happen or start in the next few weeks… rambling on that and those to follow (perhaps).

I don’t think I had seen this one before today; I’ve both clicked the ‘like’ button and added it to improvithing:

MaxMSP Live Coding nr.2 from Edo Paulus on Vimeo.

A realtime performance of creating a Max-patch, starting from zero.

The Rules:
– You have 6 minutes to build a Max-patch and do a performance with it.
– Start with an empty patch.
– Only use the standard objects that are part of Max/MSP/Jitter.
– Don't use externals, pre-built external datafiles, help files, or anything of that kind.

Personal interest here extends to the following observations:

The date 20100220 (used as the title) is significant to me because it is the day before 20100221, which is discussed in my thesis in connection with the work that began in live code practice and was later given the name sub synth amp map… The video from Paulus above fits the same pattern of construction as sub synth amp map: it begins with the building of a soundmaking algorithm, a visual representation of data generated in that algorithm is then added, and the algorithm is then edited to control the (now audiovisual) output.

Read More

uke-yello (October 2013)

After some busy days, it’s time to get back to this CreativePact thing.

On the 16th (two days ago) I had a quick look back at some of the ‘improvised’ videos that had been uploaded to Vimeo (using this search url) and bookmarked some of them. One of those is no longer available; this one is from a 2009 performance and includes a number of edits; two of the others are constructed as video edits (of improvised dance) arranged to non-improvised music; and finally there are three that are of instrumentalists playing on camera (synths, guitar, guitar and bass). The third of these – because of the inset video layering (see embedded below) – reminded me of a 4-track improvised video piece that I created last year…

At Last – 1963 Gretsch Double Anniversary 6117 archtop guitar – 1978 Fender Jazz Bass Fretless from franco1953meta on Vimeo.

Here’s the thing it reminded me of:

uke-yello from Samuel Freeman on Vimeo.

26 October 2013, Marsh, Huddersfield.

Both to take a break from thesis writing (i.e. procrastinate) and to confront the fact that I have never really got on with the ukulele as an instrument, the video here is an experiment that follows a process I have employed several times before: using the built-in mic on the laptop, overdubs are recorded (with little or no rehearsal) whilst the previously recorded audio is allowed to bleed through to the new recording (headphones are not used).
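Purely as an illustration of that bleed-through idea (the takes themselves were made with QuickTime, as described below), the audio-only version of the process could be sketched in a few lines of Python using the sounddevice and soundfile libraries: play the previous take through the laptop speakers while recording the built-in mic, so the old material leaks acoustically into the new recording. The filenames here are hypothetical.

```python
import sounddevice as sd
import soundfile as sf

previous, sr = sf.read("previous_take.wav")          # hypothetical filename for the last take
new_take = sd.playrec(previous, samplerate=sr, channels=1)  # play speakers + record mic at once
sd.wait()                                            # block until the take has finished
sf.write("overdub_take.wav", new_take, sr)           # keep the bleed-through overdub
```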

For this video the QuickTime Player (10.0) software was used: first to record direct from the built-in camera and built-in mic, and then on three more takes to record the desktop (screen-casting) whilst the camera input was visible alongside the previously recorded video.

The audio heard here is a stereo mix made by taking the audio from the four videos that were recorded during the process.
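For illustration, that mixdown step could be reproduced along these lines, assuming the audio of each take has already been extracted to WAV; the filenames are hypothetical and this is a sketch rather than the actual mix:

```python
import numpy as np
import soundfile as sf

takes, sr = [], None
for i in range(1, 5):
    data, sr = sf.read(f"take{i}.wav")       # hypothetical filenames for the four takes
    if data.ndim == 1:                       # duplicate mono takes to stereo
        data = np.stack([data, data], axis=1)
    takes.append(data)

length = max(len(t) for t in takes)
mix = np.zeros((length, 2))
for t in takes:
    mix[:len(t)] += t                        # simple sum of the four takes
mix /= max(np.max(np.abs(mix)), 1e-9)        # normalise to avoid clipping
sf.write("uke-yello_mix.wav", mix, sr)
```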

…and then today, after deciding to actually post uke-yello, I repeated the recent uploads search and found this:

The Calm Night Air (Ukulele Improvisation) from Stephen Froeber on Vimeo.

This is a Tenor Ukulele Improvisation that I recorded in one take while in South Korea.

Read More

Melodica improv

A few technical hitches with rendering the video for this one… here’s a preview, posted whilst sorting that out:

[preview: stretched+melodica (wip)]

The video is now uploaded:

Showing an improvised melodica part to an experimental work in progress titled ‘stretch’.

About the work: five stereo sound recordings (of around ten seconds to two minutes in duration) have been stretched to durations of around 12 to 14 minutes; the digital audio workstation (DAW, in this case Reaper) maintains the pitch content of the original sounds, with some audible artefacts introduced by the processing. The exact final duration of each of these five tracks was chosen with some visual reference to their waveform displays in the DAW. Four of the five recordings were then pitch-shifted to roughly ‘tune’ them to the reference spectra of the first one recorded; this tuning was done quickly and with little thought, using the Photosounder Spiral plugin (shown on the right in this video) to observe the spectra of the sounds.

A melodica part was then recorded as an improvisation, performed whilst listening to the mix (on headphones) and watching the spectrum analysis, which was interpreted as a real-time score of sorts, suggesting notes that could be played. After the recording, the changing spectrum of the melodica improv was also observed, this time using the Photosounder SpiralCM plugin (seen on the left in the video). A pitch offset of -16 cents was applied to the melodica in order to better align the fundamental frequency of each note to 12TET (twelve-tone equal temperament), and some artificial reverb was applied to the melodica recording, in lieu of the artefacts introduced by the extreme time-stretching on the other parts.
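As a small worked example of that -16 cent adjustment (my own sketch, not part of the process above): a cent offset corresponds to a frequency ratio of 2^(cents/1200), and the same relation gives the offset of a measured fundamental from its nearest 12TET neighbour.

```python
import math

def cents_to_ratio(cents):
    """Frequency ratio corresponding to a pitch offset in cents."""
    return 2 ** (cents / 1200)

def nearest_12tet(freq, a4=440.0):
    """Return the nearest equal-tempered frequency and the offset from it in cents."""
    semitones = round(12 * math.log2(freq / a4))
    target = a4 * 2 ** (semitones / 12)
    offset = 1200 * math.log2(freq / target)
    return target, offset

print(cents_to_ratio(-16))   # ≈ 0.9908, the pitch-shift factor for a -16 cent offset
print(nearest_12tet(444.1))  # e.g. a sharp-ish A comes out as (440.0, ≈ +16 cents)
```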

Read More

Visually Representing CeReNeM

Several people, including Monty Adkins, who was the co-supervisor of my PhD, shared a link to an article on RedBullMusicAcademy.com which describes many of the inspiring goings on at CeReNeM – the Centre for Research in New Music at the University of Huddersfield.

[image: link_share_redbull]

The thing that first caught my eye was that one of my Max patches has been used as the featured image for the article.

The image will have been taken from the HISS website, where it is one of a few of my images originally created as part of the WofS events in 2010.

Read More

thisis drawing circles

A data matrix from sdfsys exported as a png:

[image: sdfsys_thisis_20140521_.png]

The data was drawn using the macro syntax of thisis; six slightly different macro sequences were recorded for putting points relative to points that have been put, and another macro was defined for drawing six circles at a time with those points.

The delay prefix to the macro run command for the drawing was used when moving and then drawing within the same parse of the text buffer.
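For readers unfamiliar with thisis, the following is emphatically not its macro syntax – just a generic numpy/matplotlib sketch of the idea described above: points are put relative to previously put points, and circles are then drawn using those points (here treating consecutive pairs as a centre and a point on the circumference, which is an assumption for illustration only).

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
points = [np.array([0.0, 0.0])]
for _ in range(6):                                 # six points, each put relative to the last
    points.append(points[-1] + rng.uniform(-1, 1, size=2))

fig, ax = plt.subplots()
for centre, on_circle in zip(points, points[1:]):  # six circles from consecutive point pairs
    radius = np.linalg.norm(on_circle - centre)
    ax.add_patch(plt.Circle(centre, radius, fill=False))
ax.set_aspect("equal")
ax.autoscale_view()
plt.show()
```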

Read More