Ethan Hein: “The copyright situation isn’t going to improve until the hip-hop generation gets into positions of legislative authority.” Interview by Jason Richardson


If you like reading about music, sampling and participatory culture then you might already be familiar with Ethan Hein’s writing.

In addition to blogging about music, Ethan Hein is a doctoral fellow in Music Education at NYU, an adjunct professor of music technology at NYU and Montclair State University, and a founding member of the NYU Music Experience Design Lab.

As well as writing for online publications like Slate and Quartz, Hein has contributed to the recent Oxford Handbook of Technology and Music Education. His CV also includes online teaching experience, such as a new interactive music theory course created by Soundfly. Among other roles, and of particular interest to Australian readers, he lectures via video for the Sydney Conservatorium of Music.

Cyclic Defrost writer Jason Richardson emailed Ethan Hein a series of questions to learn about his roles in developing musical interfaces and teaching.

JR: How long have you been blogging? What motivated you to start? It’s an insight into music theory and contemporary research for me but I guess it’s more academic for you?

EH: I started my WordPress blog in (checking the stats) July of 2009. I did maybe a year or two of writing before that on a plain-HTML site I built myself, and I was an active user of sites like Flickr and Delicious from their inception. I did social media for the usual reasons: a blend of fun, social connection, self-expression, self-promotion, and a general sense that it was a cool thing to be doing. 

My motivation for the WordPress blog was pretty much all self-promotional. I was doing various kinds of freelance writing and marketing, and I needed a central hub for presenting my ideas. It worked! I even got a book agent interested in me, and we put some proposals together for a multidisciplinary scientific look at music, though we were never able to sell anything. I haven’t given up on that idea, but it seems more likely that a book will eventually emerge out of my academic activities.

I came into academia in a weird, backward way, relatively late in life. I was in my mid-30s, working in marketing and hating it, doing music on the side. My wife finally convinced me that I would never be happy unless I made music my main focus, and getting a master’s seemed like the best way forward. My ambition wasn’t even to be a professor; it was to find my way into designing music software in some capacity. I quickly discovered that I’m a much better writer and teacher than I am a programmer, and academic gigs kept presenting themselves, leading up to my present doctoral fellowship in music education. Life is funny.

So actually, I was very active and well-followed as a blogger long before academia entered into it. I would recommend blogging to any scholar; it has served me incredibly well.

JR: Am I right in remembering that you had an earlier career as a musician? Also, the music software interest reminds me that you often use non-traditional representations of music notation. There are a variety of circular designs, right? The Groove Pizza but also the clock-like chord charts. How did they develop? The idea of things being round really lends itself to loop-based composition but I’d guess you’ve found other motifs in those designs.

EH: Yes, I was a musician/bum for a good chunk of my twenties and early thirties. I played in a bunch of bands, did some musical theater, taught lessons, toured a tiny bit, did some session work, some composition and production, whatever was available. I played jazz, rock, funk, country, experimental electronica, and combinations of all of the above. 

I’ve always been interested in alternative visualizations for music, probably starting when I was a kid, when I liked to listen to music and draw weird abstract shapes. There was a particular recording that Abdullah Ibrahim did of an Ellington tune called “Way Way Back” that I especially liked for that purpose. I’m a substantially self-taught musician, aside from two semesters of jazz theory in college, so I’m a lousy music reader. When I was learning my chords and scales, I drew myself diagrams of them. The diagrams have been useful when teaching my private students too.

The visualization thing kicked into high gear when I started making music on the computer. I spent a few years getting very deep into Propellerhead’s ReBirth, remember that? And then ReCycle and Reason and Pro Tools. All of those programs are incredibly powerful sound tools, but they all have terrible UI, so I felt a lot of simultaneous inspiration and frustration. Later I got into Ableton, which I love, but it was a long hard road figuring out how to use it. So some of my motivation with all the visualization was to figure out how these concepts could be presented in a more intuitive and less difficult way.

As for the circular representations, that came together from several sources: a lifelong interest in radial symmetry as an aesthetic thing, playing around with the Polar Coordinates filter in Photoshop, diagramming the chromatic circle and the circle of fifths, and thinking about how to visualize the loops I was using in my productions. The thing that really crystallized the Groove Pizza idea for me was discovering work by Godfried Toussaint on circular rhythm visualization.

Once you start looking for cool circular data visualizations, you find them everywhere. I was absolutely obsessed with this site for a while.

I have a bunch of ideas for circular melodic and harmonic interfaces, and it looks like one of them is going to get built in the near future, so that’s pretty exciting.
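[A minimal Python sketch of the circular idea, for readers who want to play with it: it generates one of Toussaint’s maximally even “Euclidean” rhythms and maps the onsets onto a clock face, with step 0 at 12 o’clock and time running clockwise. The function names and layout are illustrative assumptions, not the Groove Pizza’s actual code.]

```python
import math

def euclidean_rhythm(onsets, steps):
    """Onset at step i whenever (i * onsets) mod steps < onsets.
    This spreads the onsets as evenly as possible around the cycle,
    giving a rotation of a Euclidean rhythm (e.g. the 3-in-8 tresillo)."""
    return [(i * onsets) % steps < onsets for i in range(steps)]

def circle_positions(pattern, radius=1.0):
    """Map each active step to an (x, y) point on a circle: step 0 at
    12 o'clock, time running clockwise, so a loop reads like a clock face."""
    steps = len(pattern)
    points = []
    for i, hit in enumerate(pattern):
        if hit:
            angle = math.pi / 2 - 2 * math.pi * i / steps  # clockwise from top
            points.append((radius * math.cos(angle), radius * math.sin(angle)))
    return points

if __name__ == "__main__":
    tresillo = euclidean_rhythm(3, 8)
    print([int(hit) for hit in tresillo])   # [1, 0, 0, 1, 0, 0, 1, 0]
    for x, y in circle_positions(tresillo):
        print(f"({x:+.2f}, {y:+.2f})")
```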

JR: Do you find there are areas where teaching how to use a computer interface for music requires a history lesson? One area I was curious about recently is the piano-roll style of MIDI editing. It seems so common in DAWs but looks a lot like it has for over a century. One thing I like about Live is that it does feel like an interface that recognises that late-20th-century shift toward the studio becoming an instrument.

EH: I was born in 1975, and I’m in the last age cohort for whom tape recorders were really a thing. I was in a band in college that did a little EP on tape. Then I graduated two years later and that same year started uploading files to mp3.com – remember that site? And of course, many of my students weren’t even born at that point!

There’s still plenty of history baked into computer interfaces. We still use the floppy disk for the Save icon! I teach a lot with Logic and GarageBand, and those programs are loaded with nostalgic skeuomorphism.

To me, the MIDI piano roll isn’t an old-timey thing – it’s more like paper piano rolls were incredibly futuristic, a digital medium in analog form. The piano roll persists because we haven’t found a way to improve on it, at least not for sounds that can be abstracted down to discrete pitches and rhythms.
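[For the curious, the abstraction behind the piano roll is small enough to sketch: each note reduces to a discrete pitch plus a start time and duration in beats, and the editor is just a pitch-by-time grid. The Note class and the rendering below are invented for illustration, not any particular DAW’s internal format.]

```python
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int       # MIDI note number, e.g. 60 = middle C
    start: float     # start time, in beats
    duration: float  # length, in beats

def ascii_piano_roll(notes, steps_per_beat=4, total_beats=4):
    """Render notes on a pitch-by-time grid: one row per pitch, '#' per filled step."""
    pitches = sorted({n.pitch for n in notes}, reverse=True)
    width = total_beats * steps_per_beat
    rows = {p: ["."] * width for p in pitches}
    for n in notes:
        start = int(n.start * steps_per_beat)
        end = int((n.start + n.duration) * steps_per_beat)
        for step in range(start, min(end, width)):
            rows[n.pitch][step] = "#"
    return "\n".join(f"{p:3d} |" + "".join(rows[p]) for p in pitches)

if __name__ == "__main__":
    # A one-bar C major arpeggio, one note per beat.
    arpeggio = [Note(60, 0, 1), Note(64, 1, 1), Note(67, 2, 1), Note(72, 3, 1)]
    print(ascii_piano_roll(arpeggio))
```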

Ableton Live’s Session View is a really big deal because it’s based on a software metaphor rather than a hardware one. It’s like a spreadsheet, not a piece of studio gear. Live does still use plenty of hardware metaphors, but Session View feels like a big shift. And then there’s this generation of hardware controllers following the software metaphors, like the Push and Maschine and so forth.

This idea of the studio as instrument is definitely a central one for me. We’ve had people like the Beatles working that way forever, but it’s become so easy in the computer. So much of my musical life has been in the form of listening to recordings, and for a normal person that’s like 99% of what music is. So the ability to just dive straight into a recording and interact directly with it, it feels like the most natural and obvious way to make music in 2017, you know? I do also play instruments and love doing that, but it feels hopelessly limited by comparison. If I learn a song on the guitar, I feel like I’m only able to connect to like five percent of its musical content, especially with more current music. But with the computer I can touch the entire sound.

Most of the people I teach are conservatory students, and they’ve been in the classical pipeline their whole lives. So it takes a lot of pushing for them to understand the idea of recorded music as being so central. I get a fair bit of pushback from them (though less and less with each passing year). Even the ones who embrace DAW-based production enthusiastically tend to keep a layer of irony around it – it’s like they’re mentally walling it off from “real” music. For “non-musicians” it’s easier; they don’t have the same preconceived notions to unlearn.

As for the morality of remixing, I have some pretty radical ideas about the right of artists to determine the usage of their work. Unless you’re going to use someone’s music to advocate, like, Nazism or whatever, I think you should be free to do whatever you want with anyone’s recordings. If you’re making money, you should share it with the copyright holders, but as far as expression goes, the right to remix should hugely outweigh the right of artists to control their work. This is a selfish position, since I love remixing, and I love when people remix my stuff, even when I don’t personally enjoy the results. There’s a guy on SoundCloud who remixes everything I post, and not always for the better, but it’s just so cool that he wants to do that, you know?

I seem like a radical in the context of contemporary capitalist culture, but in the grand scheme of things, capitalism is the weird outlier. In traditional societies, no one owns music. It’s just there, like language or food recipes. As recently as Bach, there was no concept of copyrighting music or preventing “remixing”, and all the composers of that era remixed each other constantly (though they called it “writing variations”). So to me, sampling culture is just re-establishing world historical normalcy.

JR: It’s interesting that in Australia we don’t have fair use and it’s clear that, despite reviews supporting it, parliament are being lobbied heavily not to adopt it. So you get situations like the recent Men At Work case, where the flute player was found to have plagiarised a well-known melody and he then killed himself. The melody wasn’t even a note-for-note appropriation and went unnoticed for decades. You probably know the hit ‘Land Down Under’ but not the nursery rhyme ‘Kookaburra In An Old Gum Tree’.

Do you find music notation lacking? The idea of a graphic score is a good one but I wonder about other expressions. Things like the sound descriptions that appear in brackets during subtitles maybe? Another example might be a transcribed 303 bassline: without the character of the instrument it would be a strange little repetition, right?

It seems that in the days of people recreating music on their piano there was a different aesthetic, one with clear open singing for example. Whereas today there is more emphasis on individual characteristics, whether it’s the accent of a vocalist or maybe things like Kanye’s mispronunciations.

EH: I do know “Land Down Under” and was dimly aware there was some copyright controversy around it, but I had no idea the guy killed himself, that’s terrible. I feel like the copyright situation isn’t going to improve until the hip-hop generation gets into positions of legislative authority. 

In my jazz life I geeked out very hard on all the chords and scales, and I still enjoy thinking about that, but in my musical life I try to keep the harmonies as simple and static as possible. I want the listener’s attention free to focus on the beats and the sonic textures, and having a lot of tricky chords can be too much information. But I’m always happy to discover a crunchy new chord by accident via sampling or MIDI manipulation.

Music notation represents pitch very clearly, but does a poor job showing rhythm. It’s effortless to learn to read the staff, especially inside the diatonic system, and learning all the flats and sharps isn’t that onerous. But rhythms are hard. There’s no way to see the metrical function of a given beat, and there’s no relationship between a note’s horizontal placement and its time duration. And this issue of the 303 bassline is definitely a problem. I mean, the musicians who use 303s tend not to read or write anyway, but it would be nice if they had the option, you know?

I hadn’t thought about the relationship between singing around the piano and the vocal styles of pop singers, but there’s probably something to it. As I was reading about Chuck Berry over the last few days, I learned that he deliberately emulated the clear diction of Nat King Cole and country singers rather than the more casual pronunciation of blues singers, because he wanted a bigger (whiter) audience. Meanwhile, I’m struck by just how mumbly pop has become, especially in the druggier forms of hip-hop, like Future, Migos, etc. I love that sound but I can’t understand two thirds of the lyrics.

JR: The idea of a hip-hop generation is interesting. Baby boomers have maintained a tight grip on many areas, particularly popular culture. Yes, that mumbling quality is the kind of character I was writing about.

Maybe the growing trend for ambient music is a reaction to the narrative-driven music that seems to have dominated since recording began? The rise of generative apps for creating music seems to be part of ambient’s popularity too. I guess another way to look at it is through the lens of participatory culture. Where people would use sheet music or a piano roll to recreate popular music around a century ago, there is clearly still a desire to be more involved with music than as a passive consumer listening to a recording.

How do you view your role in teaching music? I’m thinking on how some classes I’ve had have teachers who promote preparing students for the future, while others inform them of the past. I guess so much of it is getting students to open their ears? There are so many ways of listening.

EH: I hadn’t thought about the connection between mumble rap and anonymous EDM DJs. But it’s worth thinking about. Kanye is the king of self-aggrandizement, but he was also an early champion of Daft Punk. And there’s Madlib who literally wears a mask. 

I see a few different things driving ambient music. One is definitely the influence of Eastern philosophy, this rejection of Western linearity in favor of timelessness and circularity. The fact that ambient is the most natural fit for generative systems is another. For me personally, the biggest attraction is how well ambient fits into my listening environment. I’m hardly ever listening in perfect quiet, and my home is usually super noisy, between my kids and all our appliances and the buses and airplanes and fire trucks and so on outside. Ambient noise (ha) obstructs my enjoyment of music played with instruments, but it blends in fine with anything electronic.

It’s an article of faith for me that technology has the potential to broaden participation in music creation, the same way that it democratized photography and video. The potential hasn’t been fully realized yet, because DAWs are so formidably complex. Even GarageBand is pretty daunting for novices. The main motivation for the Music Experience Design Lab is to fill that gap between absolute beginners and a more robust and complex tool like a DAW. I’ve also been really impressed with the explosion of iOS apps, especially the ones aimed at little kids. My daughter is thirteen months old and already enjoys playing with Bebot and Sago Mini Sound Box.

My mission as a music teacher is to get people to own their creativity, to be producers rather than consumers. If I’m working with beginners, that just means getting them expressing themselves in whatever capacity they can. If it’s the music school kids, then it’s a matter of helping them to be more active and bold in making musical choices, rather than just doing what their teachers tell them. It also means helping them embrace hip-hop and EDM, both so they can connect to people outside of music school, and because those genres support self-determination so well via bedroom producing. I think that preserving traditions is valuable and important, but the music schools have got that covered already.

As for getting people to open their ears, I have a zealous faith in the power of computer production to help with that. I didn’t start to appreciate electronic music until I tried making it. The same is true of jazz, really, but you can’t just dabble in jazz, you have to commit to it. Playing around with loops on the computer is a lower barrier to entry. Also, the loop libraries you get with things like GarageBand are so insanely eclectic now. I get these student projects combining dubstep beats with upright bass and bagpipes and oboes and traffic noises, just because that stuff is all sitting there on the computer already.


About Author

Living in regional Australia led Jason Richardson to sample landscapes instead of records.
