Patchwork

One of the modules I’m likely to be mucking in on next year is the brand new Auditory Computing. It’s not at all clear what I’ll be doing on it, and I haven’t even seen the syllabus yet, but one of the potential tasks might be the setting of coursework. In idle and probably misdirected preparation for that I’ve been having a bit of a play with generating music using deep learning. People have tended to use LSTMs for this, but I thought it would be fun to try Andrej Karpathy’s neat little implementation of the notorious GPT. Training data is from the Classical Music MIDI dataset.

The results aren’t exactly going to be headlining the Last Night of the Proms, but some are quite cute, I think:

GPT-nano, trained on Mozart, Bach & Haydn
GPT-micro, trained on the whole dataset
GPT-micro, trained on the whole dataset

You can definitely hear the model regurgitating memorised fragments. But don’t we all do that?

The task is treated as a language modelling one, with a vocabulary of chords and durations. To somewhat reduce the vocab size and increase the representational overlap I’ve pruned chords to no more than 3 notes at a time. A snippet of code for this is below — not because I expect anyone to read or reuse it, really, this is just to test out the syntax-colouring WordPress plugin that I’ve just installed.

import music21 as MU   # assumed imports: the snippet relies on an MU alias
import numpy as np     # and a module-level NumPy generator

local_rng = np.random.default_rng()

def simplify(s, limit, mode='low', rng=local_rng):
    """
    Drop notes from big chords so they have no more than `limit` notes.
    NB: operates in place.
    
    Drop strategies are pretty dumb. We always keep the highest and lowest notes
    (crudely assumed to be melody and bass respectively). Notes are dropped from
    the remainder according to one of three strategies:
    
        'low': notes are dropped from low to high (the default)
        'high': notes are dropped from high to low
        'random': notes are dropped randomly
    
    The latter could actually increase the vocab by mapping the same input
    chord to several outputs. Modes can be abbreviated to initial letters.
    """
    limit = max(limit, 2)
    
    drop_func = {
        'r': lambda d, c: rng.choice(d, c, replace=False),
        'h': lambda d, c: d[(len(d) - c):],       # the c highest
    }.get(mode.lower()[0],
          lambda d, c: d[:(c - len(d))])          # default 'low': the c lowest
    
    for element in s.flat:
        if isinstance(element, MU.chord.Chord) and len(element) > limit:
            drop_count = len(element) - limit
            # middle notes only -- the outermost two are always kept
            drops = [nn.pitch.nameWithOctave for nn in element][1:-1]
            
            if len(drops) > drop_count:
                drops = drop_func(drops, drop_count)
            
            for note in drops:
                element.remove(note)
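
The slicing in `drop_func` is terser than it needs to be, so for the record, here is what the three strategies amount to on plain lists (no music21 required; `pick_drops` is just an illustrative stand-in for the lambdas above, not code from the model):

```python
import random

def pick_drops(inner, count, mode='low', rng=random):
    """Choose `count` notes to discard from a chord's middle notes
    (the outermost two having already been set aside)."""
    if mode == 'high':
        return inner[-count:]             # discard from the top down
    if mode == 'random':
        return rng.sample(inner, count)   # discard arbitrarily
    return inner[:count]                  # 'low': discard from the bottom up

middle = ['D4', 'F4', 'A4']   # e.g. a five-note chord minus its outer voices
print(pick_drops(middle, 2, 'low'))    # ['D4', 'F4']
print(pick_drops(middle, 2, 'high'))   # ['F4', 'A4']
```

In `simplify` itself the default branch is written as `d[:(c - len(d))]`, which is equivalent to `d[:c]` whenever `c < len(d)`, a condition the caller guarantees before invoking `drop_func`.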

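As for the “vocabulary of chords and durations”: the encoding itself isn’t shown here, but a minimal sketch of the idea (the token format and the `tokenise` name are my own invention, not the code actually used) is to flatten each musical event into a single string, sorting the pitches so that the same chord always yields the same token — which is exactly the representational overlap the pruning is meant to encourage:

```python
def tokenise(pitches, duration):
    """Collapse one musical event (chord + duration) into a single token.

    pitches: pitch names, e.g. ['C4', 'E4', 'G4']
    duration: quarter-length as a string, e.g. '1.0'
    """
    # sorting means note order in the source never splits the vocabulary
    return '.'.join(sorted(pitches)) + '|' + duration

print(tokenise(['E4', 'C4', 'G4'], '1.0'))  # C4.E4.G4|1.0
```
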
Perhaps that will get more use in future, if all this coheres and I start working more of this out in public. Perhaps not.

Solidity

Today in “things that are unlikely to become clear to anyone anytime soon”:


Addendum: 3D software is always and inevitably difficult and, y’know, good luck to all who sail in her and all that, but I really don’t like SketchUp.

Comfort Zone

It’s hardly a radical or surprising observation that the last few years have been a bit challenging all round. I mean, I’ve really had it easy on nearly every level and still found the whole period pretty fucking difficult to deal with. I can barely imagine what others without my privilege and resources have been through.

I can’t pretend to have responded to those challenges with grit and fortitude. There’s been a lot of retreating into the known and comforting. An awful lot of 2020 was spent playing Animal Crossing: New Horizons, an almost aggressively anodyne and undemanding “retirement simulator” whose suspicious timeliness must surely have spawned all manner of lurid conspiracy theories blaming Nintendo for the pandemic. Even though I’d never played an AC game before, NH managed to feel like putting on an old sweater, a slightly queasy escape from our ghastly reality into some kind of undemanding and idealised alternative. Of course, this is one of the things that games and media and art are for. Escapism has kind of a bad name, but that’s a lot of puritanical bullshit. Escape is a noble pursuit.

My consumption of new games, new media, new art seems to have taken a bit of a hit as part of this retreat. Not to nothing — and sometimes novelty makes the leap directly to comforting familiarity without passing Go or collecting £200 — but it certainly feels like the proportion of rereading and rewatching and replaying has massively increased.

The leap to comforting familiarity

I’m on my fifth or sixth playthrough of Breath of the Wild, and that’s not counting my many hours of tooled-up meandering through Hyrule after defeating Ganon for the first time. I revisited all of Guy Gavriel Kay’s novels — and recently knocked off Children of Earth and Sky yet again to follow up new release All the Seas of the World, with a revisit of A Brightness Long Ago queued up next. When faced with the option of tackling something new and unknown versus something dependable, the temptation of the familiar is terribly strong.

Which may go a little of the way towards explaining why I’ve resiled from my pending retirement and will be slinking back shamefaced to UCL for more teaching.

The joke’s on me, though, since the things I am familiar and comfortable with there have long been handed over to others. Whole new vistas of terrifying newness beckon. In at the deep end once again.

Lock your doors, bolt your windows

I don’t know how apparent it might be to casual readers of this site — were there any such, which there absolutely aren’t — but I’ve been circling WT uneasily for the last few weeks, tentatively sidling up to the idea of doing this again, at least a bit, in however limited and constrained a fashion. There’s something appealing about pausing for a bit of a self-indulgent wallow in the midst of whatever local and global catastrophes happen to be continually unfolding around us, a bit of — admittedly partial and dishonest and curated — record-keeping, putting down the odd marker after such a long stretch of unmoored limbo time, that shapeless, featureless slurry of unhistory.

Glancing back at entries from Walky Talky’s heyday, while one or two are now opaque without some crucial piece of omitted and long-forgotten context, the vast majority of those jottings from the past evoke a level of detail I’m hard pressed to recall for almost any time in the last decade; least of all the indistinguishable temporal morass that has been the Age of Covid. Various things did happen in 2020 and 2021, I think — they must have, surely, there are scars — but even the most drastic and wounding are hard to place now, bereft of mnemonic infrastructure, lost in the fog.

And so, the nostalgic allure of blogging. Might a few cuneiform scratchings in the digital clay of 2022 help stave off the worst amnesiac tendencies of my sclerotic brain? Can I erect some new retentive scaffolding to buttress these failing faculties? I don’t know. I don’t even know if I want to. Lying awake in the graveyard hours of interchangeable nights I concoct mental todo lists of salient posts, but when faced with actually writing the fuckers I tend to seize up. Unconvinced of the merit. Lacking the will or commitment. Unfocused and lazy and weak.

This, obviously, is one of those posts.

Call it a gauntlet of sorts. Thrown down. Demanding satisfaction.

Let’s hope it doesn’t just lie in the dust ignored for the next two years.