The Great Gamelan Project: Making a Shopping List

Alright, so now that I’ve set myself the near-impossible goal of making a pair of gamelan using Microbits, I need to actually source said Microbits – plus a bunch of other stuff.

https://core-electronics.com.au/micro-bit-v2-go-by-bbc.html – the Microbit v2 GO. I'd need a lot of these to execute this project in its entirety, but as that would cost me a lot of money, I think I'll buy two and see how I go. (I fully intend to play with these in my own time as well.)

Other things I need are the actual audio files of Balinese gamelan. As I slowly come back down to earth, I think I’m definitely going to go with the calung, which only has 5–6 keys (depending on the instrument). Now, this project could go two ways depending on the capabilities of the Microbits and my own technological know-how – I could find a way for the Microbits to play the actual sound files (though I’m not sure whether that’s possible), or I could use a tuner to find the frequencies of the notes and get Makecode to play those frequencies, which it can definitely do. Of course, the synthesised frequencies wouldn’t sound as good, and they wouldn’t have the same lovely harmonics as the original, but they would do at a pinch.
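If I do go down the frequency route, the maths is simple enough to sanity-check in advance. Here’s a rough Python sketch – the five frequencies are pure placeholders, not real calung pitches (I’d get the real ones from a tuner) – showing the integer tone values MakeCode would need, plus how far each note sits from the nearest Western equal-tempered pitch:

```python
import math

# Hypothetical measured frequencies (Hz) for a five-key calung --
# placeholders only; the real values would come from a tuner.
measured = [275.0, 305.0, 330.0, 410.0, 450.0]

def cents_from_nearest_et(freq, a4=440.0):
    """How far (in cents) a frequency sits from the nearest
    12-tone equal-tempered pitch."""
    midi = 69 + 12 * math.log2(freq / a4)
    return round((midi - round(midi)) * 100, 1)

# MakeCode's play-tone block takes an integer frequency in Hz,
# so the measured pitches just get rounded.
makecode_tones = [round(f) for f in measured]
deviations = [cents_from_nearest_et(f) for f in measured]
```

The deviations are the interesting part – they’d show in numbers how far each key sits from anything an Orff xylophone could play.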

Now for the coding bit. I’ve found this article – Pressure switch alarm | micro:bit – on the Microbit website, which has instructions on how to make an alarm that goes off when someone steps on it. It requires two Microbits (which is why I’m buying two). I reckon I could modify this code to make it play multiple sounds. It’s the “damping” effect that I’m worried about – I might have to give up on that if I can’t manage it in Makecode. I’ll give it a shot, though.
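I haven’t pulled the alarm code apart yet, but the logic I’m imagining is simple: the “keys” Microbit radios a note number when a switch is pressed, and the “speaker” Microbit looks it up and plays the matching tone. A toy model in plain Python – the real thing would be MakeCode blocks, and every name and frequency here is invented:

```python
# Toy model of the two-Microbit setup. One unit "radios" a note
# number; the other maps it to a frequency and would play the tone.
# Frequencies are placeholders, not real gamelan pitches.
NOTE_FREQS = {1: 275, 2: 305, 3: 330, 4: 410, 5: 450}

def on_switch_pressed(note_id):
    # Sender side: broadcast which key was hit.
    return ("radio_send", note_id)

def on_radio_received(note_id):
    # Receiver side: look up the tone to start playing.
    return NOTE_FREQS[note_id]

# Hitting key 3 should end up as a 330 Hz tone on the speaker unit:
kind, payload = on_switch_pressed(3)
tone = on_radio_received(payload)
```

If that’s really all there is to it, the modification is mostly a matter of swapping the alarm sound for a lookup table.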

James’ Secret (Composition) Diary: Entry 1

Guess what? This isn’t just a technology blog. I’m also doing Composition in Music Education, and in the last few weeks we’ve been doing all sorts of fun things like workshopping mixed bag arrangements and making baby steps for scaffolding compositions. And now, our big project is following those baby steps to write our own compositions.

You can see my baby steps here: https://jameskong12345678.wordpress.com/

To recap: we’re starting with “Gather the Stars,” a beautiful piece by Daniel Brinsmead, which is all about motivic development and layering. I’m starting with a text, then finding parts of the text to set as motifs, then drawing a structural diagram composed of these motifs.

So where do we start? Well, in my baby steps website I already started following some of my own steps in order to give students some examples. So far, we’ve got:

Plus the beginning of a structural diagram:

Here, the four colours represent the four instruments of the string quartet, Quart-Ed, which will be kindly workshopping our pieces (thanks guys!) – violin 1, violin 2, viola and cello.

What I’ve done here is transfer my two motifs into recognisable shapes, which I’ve labelled on the diagram. Then I’ve layered these shapes while stretching and breaking them up. I’ve been using this process to start my compositions since I was 16, and I still use it now. It gives a clear visualisation of a composition – macro and micro structures – without being too prescriptive, and I find that you guide yourself into making key decisions about the piece as you go along.

Important things to note about the structural diagram: 1. it’s not graphic notation, so it can be vague and you don’t have to follow it line-by-line; 2. it’s there as a starting point, not an endpoint; and 3. it can take a long time. It’s as much a part of the composition process as putting notes on a page – honestly, even more so.

My next step is finishing the diagram, and then I’ll start transferring it to dots and lines on Sibelius.

Week 9: Lift Pitches and a New Thing for the Wishlist

This week, I got to hear everyone’s lift pitches – short descriptions of their TME project, summing it up and selling it in the space of a minute. It was really cool hearing everyone’s ideas – from the weird and wacky (music with body sensors, mockumentary) to the heartwarming (digital children’s books, concept album about queer identity) to those building skills for their portfolio (music videos for original songs), everyone had something they were keen to do, and everyone seemed both scared and excited to do it.

That’s definitely how I feel. I’m running with the gamelan idea, and I am terrified of how much I have to do, but I’m also looking forward to it! If I can pull this off, I think it’ll definitely be something I can use in my own teaching.

Things I have to do:

  1. Talk to someone (possibly Gary Watson) about accessing the Con gamelan so I can record samples of each note. Alternatively, source recordings elsewhere if that doesn’t work.
  2. Look into contact-sensitive robotic sensors. Find out where to source the parts.
  3. Learn how to write code for said robots. All I need is for a speaker to play a sample when a sensor is hit, and then stop playing when a different sensor is hit. Still, this is absolutely going to be hard for me.
  4. Measure some Orff xylophone frames so I can fit the gamelan over them.
  5. Make big gamelan outlines out of cardboard. I think I’m going to make either a pair of gangsa (alto-range) or calung (tenor-range) gamelan. Calung would be easier because it’s got fewer keys, but gangsa would look more impressive – plus I could showcase a faster melody.
  6. Rig up the sensors and speaker and whatever else I need.
  7. Experiment with beaters. This will probably depend on the sensor.
  8. Anything else I’ve inevitably missed.
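Item 3 on that list hides one fiddly bit: it’s really a tiny state machine, where a note rings until something else damps it. Sketching the logic out in plain Python first (key names and pitches are all made up; the real version would live in MakeCode) at least convinces me it’s not conceptually hard:

```python
# Sustain-until-damped logic: a strike starts a note ringing; any
# later strike (or an explicit damp) cuts the old note off first.
# Key names and frequencies are placeholders.
FREQS = {"key1": 275, "key2": 305, "key3": 330}

class KeyPlayer:
    def __init__(self):
        self.sounding = None  # frequency currently ringing, if any

    def hit(self, key):
        # Beater hand: damp whatever is ringing, start the new note.
        self.sounding = FREQS[key]
        return self.sounding

    def damp(self):
        # Damping hand: stop the current note.
        self.sounding = None

player = KeyPlayer()
player.hit("key1")   # key1 rings
player.hit("key2")   # key1 cut off, key2 rings
player.damp()        # silence
```

The hard part won’t be the logic – it’ll be getting the sensors and speaker to behave.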

Anyway, in class, we also got to look at some more tech. I played a few notes on the Artiphon Instrument 1 and immediately desperately wanted to buy it.

https://drive.google.com/file/d/1rDZcMKotc400EvafBbjeMpSTO1NcUbR6/view?usp=sharing

It’s a MIDI controller shaped like the neck of a guitar, but you can play it like so many different instruments. Even though it’s a sophisticated piece of tech, it feels really intuitive to use. I’d love to make music with this – it’s on the wishlist for sure.

Week 8: Unableton

I am really not a terribly tech-savvy person, and usually I don’t need to be. I can use the internet and play Minecraft badly and edit pictures on Microsoft Word. That’s about all the average 20-year-old really needs, right?

I bet there’s cool stuff you can do with Minecraft note blocks in the classroom.

But I did feel a little bit underprepared when I first looked at Ableton. It’s an industry standard when it comes to production, fantastically versatile and incredibly expensive. And it looks absolutely nothing like Soundtrap or Garageband or even Audacity.

We started the class by trying to remix some audio tracks on Soundtrap, which did not go very well for me, partially because Soundtrap, lovely as it is, has limitations, and partially because I have never remixed a song in my life.

Then we switched to Ableton, and James showed us how much better it was at remixing, as it works in loops and you can edit stuff up close. I will admit it was very hard to understand. I felt like I needed maybe six or seven hours to play around with it on my own to be able to use it, but I also didn’t want to spend those hours remixing Taylor Swift and Lorde.

I ended up going rogue and spending the lesson creating a short original track, which is not what I was supposed to be doing, but did give me a chance to play around with controls and not stress out a lot.

And then we spent some time discussing DAWless studios, most of which went way over my head. I can barely use a loop pedal, and all those lights and buttons were kind of freaking me out. But I love pop music and I would love to learn more about production, and I would love for my eyes to not blur over whenever I see a lot of dials, so I’m going to have to keep trying.

Maybe I will take some time to play around with Ableton. I’d love to be able to engage with my more technologically advanced students, even if they’ll always know more than me in this area. But for now, I think I’ll stick with Soundtrap.

Week 7: The One Where I Glare at Synthesisers Through a Screen While Surreptitiously Blowing My Nose

I was sick this week, if you couldn’t guess. I was miserable about it, too. This class began with groups building a synthesiser out of a LittleBits kit, which looked like a lot of fun.

I spent 20 minutes wishing I could touch those sweet dials.

I watched jealously as this group connected wires, fiddled with switches, and made odder and odder sounds come out of tiny pieces of metal and plastic. As previously established, I love playing with toys, especially shiny ones that make noises. My group helpfully held pieces up to the camera as I squinted.

Then James went over some stuff I vaguely remembered from Year 9 science – the physics of sound waves, which involved a very entertaining line of human dominoes.

Perks of Zooming: getting screenshots like this.

Then we got a chance to put theory into practice – on Soundtrap!

This is Soundtrap’s secret synth panel (hidden under the ambiguously named “Tweak”). I played around with different shapes of waves, which gave me different timbres, especially in combination with each other, and then combined this with effects and filters. Simple things, but I’m such a newbie at synth stuff that it was still pretty cool to me.

Then I got to glare at the analog synthesiser which I played back in Week 3 for the music video, while James demonstrated how to actually use it. The sine, square, triangle, etc. wave shapes are all on that synth, and James adjusted the dials to create different sounds.

I really wish I could’ve come to this class in person. Playing around on Soundtrap definitely helped me get a better grasp of how synths work, and I think that’s something I’d definitely do in a classroom – they’re easy to use and not too threatening for beginners. That being said, not getting to play with the LittleBits kit or the other cool stuff meant that the knowledge gap I have in this area is still there. Electronic music is not something I’m into personally, but some of my students will be, and for that reason alone I’d like to know more about this. Hopefully there will be more opportunities down the line.

Week 6: Super Cool Tiny Robots

This week we got to play with super cool tiny robots and my mind is blown. Two lovely people, Renee and Rowena, came in and showed us how to do some basic programming (I say basic, but it was still well above my pay grade) with Microbits and Kookaberries, and then let us look at and play with some more complicated setups they’d done earlier.

But what do tiny robots have to do with music, you ask? Near the start of class, I heard some guys next to me quietly ask that as well. To be fair, this is what we were looking at:

The last time I did coding was in Year 4 computer class. I had a terrible time with this.

If you clicked on the little Microbit image, you could play with the display as though it was a real robot:

Still, not exactly musical, but I figured all would be revealed later – and it was!

This code, which we were inputting (with the patient assistance of R&R) into makecode.microbit.org, would transmit a pitch pattern from one Microbit to another. The person with the second Microbit would then have to replicate the pattern exactly as they heard it, and be rewarded with a happy face (or other image of the coder’s choice) if done right. This had potential, R&R said, to aid with pitch, tone colour, and pattern recognition, depending on what sounds were coded into the Microbit.
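Out of curiosity, the comparison step of that echo game is easy to sketch. A plain-Python stand-in for the receiver’s check – note names and image names are invented, and the real code was MakeCode blocks:

```python
# Toy version of the echo game's logic: one Microbit sends a pitch
# pattern; the second compares the player's attempt against it and
# picks an image to display. Note names are placeholders.
def check_echo(sent_pattern, played_pattern):
    # An exact match wins the happy face (or whatever the coder chose).
    return "happy_face" if played_pattern == sent_pattern else "sad_face"

result_good = check_echo(["C", "E", "G"], ["C", "E", "G"])
result_bad = check_echo(["C", "E", "G"], ["C", "G", "E"])
```

Everything interesting – what the notes sound like, what counts as “the same” – lives in the sounds the coder chooses, not in the comparison itself.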

Probably not the most efficient way to do that, but cool gimmick, I thought. But there was more.

The Kookaberry is more advanced than the Microbit, and the coding is, predictably, harder. But what was hugely exciting was when R&R showed us all the cool stuff they’d done with it – intricate setups of speakers and moisture sensors, pitch bends, and, most excitingly, a Play-Doh xylophone:

https://drive.google.com/file/d/1BDv34Kto8UIapRoWtZGcgSg4AHJujRjD/view?usp=drivesdk

And with that, I had an idea. Here is my train of thought:

  1. At the end of semester, everyone in the TME course presents a project that utilises some sort of tech.
  2. I played Balinese gamelan briefly in my first semester of uni and fell so in love with it that I did a full semester’s worth of it in Sem 2. (You can see 18-year-old James’ gushy blog post about it here: https://jameskongsoundworlds.home.blog/2019/05/16/the-flowering-tree/)
  3. Gamelan are expensive and most schools don’t have access to them, so some use Orff xylophones instead. Some gamelan is better than none, so I’m not mad about this. However:
  4. Orff xylophones are tuned in C major scales. Gamelan uses a series of modes which have little to do with C major, or Western tuning in general. In addition, instruments are paired and tuned very slightly differently from each other, so a vibrating effect is achieved when notes ring. This concept is quite difficult to visualise (or audiate, rather) unless you hear it.
  5. Xylophone technique involves playing keys with alternating hands. Gamelan technique involves playing notes with one hand and damping each key with the other.
  6. What if I could make an electronic gamelan, designed to fit over a xylophone frame, that could be tuned exactly to the right pitches and would sustain notes until damped? In short, could I make these non-Western elements of gamelan playing accessible to schools?
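Point 4 is easy to put numbers on, too: two paired keys tuned a few hertz apart beat at the difference of their frequencies, which is exactly the shimmer you hear. A quick back-of-the-envelope check in Python (the frequencies are invented, not real gamelan tuning):

```python
# Paired gamelan keys are tuned slightly apart; when both ring, the
# combined sound pulses at the difference of the two frequencies.
# These numbers are placeholders, not real gamelan tuning.
def beat_rate(f1, f2):
    return abs(f1 - f2)

# A pair tuned ~6 Hz apart would shimmer about six times per second:
rate = beat_rate(330.0, 336.0)
```

So an electronic gamelan could get the paired-tuning effect for free, just by nudging one instrument’s frequencies up a few hertz.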

I’ll have to keep thinking about this.

Week 5: Music Videos (and My Face)

So after Week 3’s weirdness, I can’t say I was looking forward to looking at my face for two hours straight, but Brad and James had other plans. Our mission: to turn all the camera and phone footage into a music video, and hopefully learn a bit in the process.

Today’s new software was Adobe Premiere Rush, the cheaper app version of Adobe’s flagship video editing software, Premiere Pro. The majority of my video-making experience has come from tinkering with Windows Live Movie Maker as an eleven-year-old, which, it’s just occurred to me, the kids I’ll be teaching will never know existed.

Twenty is far too young to start feeling old. On with the task. Turns out there’s a lot you can do with Premiere Rush. We learned to change exposure and colour, layer cropped videos on top of each other, and put in all sorts of basic effects. Then, we were left to the clips and our own devices.

Yeah, that’s my face and my giant fingers.

Video editing has never been a particular interest of mine, but I must admit that this is very useful. I’m not sure whether I would get students to make their own music videos – it is a lot of work, and the musicking side is quite a small part of it, although of course music videos are a staple in the industry – but it’s good to know the option is there.

More important, I think, is how you can edit tutorial videos with this software. I’ve been very struck by Brad’s popular music education, where students learn to play through teacher-made video tutorials. That way, they can learn in their own time before putting everything together. Premiere Rush (and, to a lesser extent, other similar programs) is a great way to add extra info, such as chord charts and fingerings, to these videos, as well as just making them more polished and easy to understand.

The main drawback to this software is the limited number of exports; I can only make three videos with the free version before I have to pay roughly $14 a month, more than a Netflix account. (I don’t even have one of those, I use my partner’s.) I guess that’s the price you pay for excellence.

Week 4: Notation Software, and an Old Friend

You ever run into someone you used to know in school, and they’ve gone from being a little pimply dweeb to a model-esque figure with perfect skin? Has that ever made you feel impressed yet somehow inadequate at the same time?

Anyway, the last time I used Musescore properly was when I was eleven. 2012 Musescore was a fun plaything for me – I would input random notes through my keyboard and play them out loud, which was most of what Musescore could do at the time. Then I went to a private high school where Sibelius was on every computer, and I never looked back.

Until now, apparently. Following Brad’s instructions, I put away my Sibelius elitism for a few minutes and went to meet my old friend.

And my God, she’s hot now.

This is an SSAA arrangement of Hallelujah (Pentatonix version) I did when I was in high school. I exported it from Sibelius and opened it in Musescore for this lesson.

I’ve had a few issues with Musescore whenever I’ve had to use it in the past several years: its lack of magnetic layout (bars would crowd quite badly, and slurs would go through notes), its annoyingly unaligned lyrics, and its generally being worse than Sibelius for the things I needed to do. My principal study, before I enrolled in Honours, was composition, and Musescore just didn’t have the kinds of symbols and extended techniques I needed. Nobody in the field used it, so I didn’t even think about it.

Except, well, it’s all fixed now. The layout is magnetic, there are lots of symbols in the library, bumpy lyrics can be aligned with one shortcut (Ctrl+R), and compared to the regular paid version of Sibelius, this little free program honestly does more. You can put your own braces on staves (very useful for a multi-choir piece), and do all sorts of things that Sibelius hides behind a paywall.

I was honestly amazed by its utility, its ease of use (even compared to Sibelius, which got worse over the past few years, but that’s another story), and the fact that you can so easily toggle between basic and advanced settings. In my opinion, this is the ideal notation software for classrooms. Sure, Sibelius is still an industry standard, and it can still do stuff that Musescore can’t. But Brad’s suggestion – to have one or two Sibelius licences on hand, and have senior music students use those if Musescore doesn’t suffice – is one that I think I’ll be adopting.

We also looked at some other programs I’d never explored before, because I was so deeply loyal to Sibelius. These programs – Noteflight and flat.io – are notable because they, like Soundtrap, are not downloads, but exist entirely online.

Noteflight looks decently comprehensive for simple art music purposes, and has the added bonus of being embeddable in Canvas, so you can edit right in the website. Flat.io, on the other hand, is very much built for pop music – it has a comprehensive chord dictionary, but little else by way of complexity. The benefit of flat.io is that it is collaborative, like Soundtrap, which means multiple students can work on a lead sheet at the same time.

Marwurrumburr, by Gurrumul, arranged by Lara and me for Aboriginal and Torres Strait Islander Music last year.

As Brad said, horses for courses. I would never use flat.io for a Music 2 Core Composition, but I would happily use it in a Year 9 rock band group arrangement. For someone who grew up thinking music engraving was only for those faceless people who printed AMEB books, it’s so empowering to be able to write your own. I’m glad we can offer our students a range of options to notate their own music, whether the HSC markers see it or not.

Week 3: Hey Ya…?

This was a weird week for me.

We filmed a music video this week. The class was divided into teams: some operated the large, horrifyingly expensive cameras, some filmed B-roll footage on their phones, some were in charge of lighting, one student directed, and a band of five – vocals, guitar, bass, drums and keys – were the talent.

I was on keys. More specifically, analog synthesiser, which had to be manually tuned, and kept up a long background beep whenever I wasn’t playing. I had never played one before, and after I’d figured out how to make sound come out of it, my impromptu bandmates had already decided on the song: “Hey Ya!”, the classic, which (to my shame) I did not know very well.

So I had to do what every blindsided music student must, and figure it out as I went along. I finally settled on a groove that was nothing like the synth part in the original song, but it was the best I could come up with in the few run-throughs we had time for. We recorded a few takes, and our final product ended up sounding pretty good – it was a more contemplative take on the lyrics, which, it turns out, are devastatingly sad. Go figure.

What to learn from this? Firstly, I was pretty amazed by how much work goes into making one video. The two stationary cameras captured the entire band setup from different angles, but filming B-roll on phones meant there was additional footage that centred various performers. In addition to filming, the lighting crew had to experiment with positioning and temperature, and the band had to rejig our setup many times to get rid of all the wires on the floor.

But more importantly, I was struck by all the ingenuity and problem solving that arose from every team. Whether it was trying out mismatched lights, or fixing up our cutoffs in the performance, people were thinking on their feet. I was figuring out how to play an instrument I’d never even seen before, while learning a song I’d never played before, while standing in front of glaring lights and cameras. Even though it was nerve-wracking and frustrating, I can’t deny that making a music video is an incredible learning experience.

If I were to do this in a high school music class – and I think I would – I would definitely utilise phone footage. I doubt most high schools have a twenty-thousand-dollar video camera, let alone two, lying around, but it’s also so valuable to have footage from a lot of different perspectives. Some people could zoom in on a performer’s hands, while others could focus on a different performer’s facial expression.

The whole process is intense, but it’s a great one. With a bit more preparation time for the performers (which, over the course of a music unit, they would have), the recording day can really focus on learning to use equipment, solving problems as a team, and creating something everyone can be proud of at the end. I know I am.

Week 2: Intentionally Horrifying Vocal Effects

This week, following on from last week’s introduction to Soundtrap, we set up microphones, recorded ourselves singing directly into Soundtrap, and then made these recordings more palatable in various ways. This meant experimenting with distance from the microphone, trying out different microphones (dynamic mic and condenser), and applying effects in Soundtrap to make the raw recordings sound more beautiful and true to life.

At least that’s what we think the other groups did. Our group decided to go down a more artistic route.

We originally named this “million minion march”, but Maya Angelou deserves better.

Listen here: https://drive.google.com/file/d/1OExm1rBDYOqrFDMkTiC20UjVbzrSUF9d/view?usp=sharing

The voice files are me singing the chorus and the riff from the ageless classic, Take On Me by A-ha. We realised that Soundtrap has a lot of avenues for vocal transformations. We changed the tempo from 100 to 50, opting to slow down all the musical material with it, and then pitch shifted my voice and put a LOT of reverb on. Using the collaborative feature on Soundtrap, Charis took it home and put on some automations, as well as a bunch of SFX that I think add a lot of atmosphere. I tweaked the (horrifically slow) chorus file to make it sound buzzy, using the effects panel:

Absurdist art pieces notwithstanding, we’ve delved into a totally different aspect of Soundtrap this week: using its effects panel to enhance recordings. Again, there’s a lot of utility here – each individual panel only has a couple of relevant dials, which means students are less likely to be overwhelmed. The effects of each change are instantly identifiable by listening, and with all the options, students have lots of room to play.

Plus, I like that you can record directly into Soundtrap, using a microphone or just your computer mic. In the past I’ve noticed a bit of delay in the recording, which is annoying, but it’s easy to fix – just turn off Snap to Grid, zoom in, and nudge the clip around until it lines up.

For a production newbie like myself, I can’t imagine a more user-friendly introduction to editing. A more experienced producer would find these controls quite simple, but starting high school students off with this means that when they want to move to a more advanced DAW, such as Logic or Ableton, they’ll be more equipped to use it.

Another point to Soundtrap. I swear they’re not sponsoring me.