The Human Element

SOHN Puts the Soul in Laptop Pop with UAD

Toph Michael Taylor, aka SOHN, sings with a sweet, ethereal, neo-soul voice that injects a languid romanticism into his electronica-approved soundscapes, a fitting formula for his two fine albums — 2014’s Tremors and 2017’s Rennen. The Englishman, now living near Barcelona, Spain, grew up on Michael Jackson and Radiohead before falling under the spell of storytelling songwriters like Tom Waits and Paul Simon. But it’s his passion for, and acumen with, laptop production, as much as his majestic melancholy, that make him so singular.

Here, Taylor details how he uses his Apollo x8p interface and UAD-2 Satellite to wrap vivid, ultra-present synth, drum, and guitar sounds in swaths of lush but well-punctuated plate and digital reverbs, all the while deconstructing vocal sounds into otherworldly instruments of their own.

Reverb is often "icing" for many producers, but it seems an especially integral part of your sound.

For starters, I always record vocals and print the bus reverb, so that I’m singing into the reverb. That’s really important to me. I don’t like singing and slapping the reverb on afterward because then you’re not reacting to the way the reverb is carrying your voice.

For a long time, it was difficult to record that way without adding latency, but recording and printing in real time with UAD and Apollo, I’m able to do it all when I record and the reverb is part of the sound from the very beginning.

Your reverbs are a nice blend of fairly organic plates and more cinematic digital-sounding textures.

Right now, I’m on the Lexicon 480L Digital Reverb plug-in hard. [Laughs.] It does help that a lot of the UAD stuff is based on this legendary gear that you get used to seeing in proper studios when you’re young. For a long time I didn't really understand how any of that stuff worked. So exploring classic gear through the plug-ins means that you begin to get a real sense of what they can do.

The 480L was always this mysterious box that looked so cool and important, and I remember asking what it was, and the engineer would say, “That’s reverb.” It wasn’t even necessarily the best sounding reverb if you listened to it on its own. But in a mix it sounds incredible.

That’s been a lesson for me. I’ve gone from wanting the best-sounding reverb on its own, to understanding that a great reverb also must sit really well in the mix. To that end, I’ve been using the Lexicon 480L on absolutely everything.

"What makes UAD plug-ins shine is the inspiration they give you and the creative momentum that comes with that."

Do different reverb types have certain roles in your productions?

Plates are like an instrument in themselves. I use plates for all my vocals, and then I use other reverbs, like Lexicons, for very specific jobs, like a short gated reverb on a drum sound, for instance, or very long tails.

I find it much harder to manage the spatial characteristics of a mix if I start using rooms and halls, and stuff like that. At some point, I lose track of where everything is.

What other UAD plug-ins do you use for creating space?

I really like the Ocean Way Studios plug-in as well, especially for adding a natural ambience to drum machines. I’m also liking the EMT 140 Plate Reverberator and the AKG BX 20 Spring Reverb. Basically, I’m using all of the UAD reverb plug-ins. [Laughs.]

I have to say, though, that the Lexicon 480L is the one where, invariably, someone looks over my shoulder when I’m working and says, “What reverb is that?” A lot of people tell me that if they’re going to buy just one UAD plug-in, they’ve got to buy that one.

"UAD channel strips speak to my heart as much as my head. I use them because they make me feel good."

I like the way you use reverbs in an otherwise very dry drum mix to sort of "pop out" a second snare sound, or a cool percussion hit.

Well, I do a lot of automation for that. I tend to do a pass of every track while twisting physical knobs with a controller in real time to create the automation, and that’s how I put the human element into automation. It makes so much difference to how a track moves.

I may start by sending 70% of a spring reverb to the snare, then take it away completely, then bring it back to 30% later, then down to 10%, all of it done as I feel it in the moment. It’s what you would do anyway if you were mixing in a traditional studio with outboard gear: you’d pick the moments where you want more of this or less of that, not by drawing the automation in, but by actually listening and responding in real time. That’s a part of the process I really enjoy.
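The automation pass described above can be pictured as a breakpoint envelope riding a reverb send. Here is a minimal Python sketch of that idea; the sample rate, breakpoint times, and the `automation_envelope` helper are all invented for illustration and do not reflect any DAW's or UAD's actual automation format:

```python
import numpy as np

SR = 48_000  # assumed sample rate

def automation_envelope(breakpoints, n_samples, sr=SR):
    """Linearly interpolate (time_sec, send_level) breakpoints
    into a per-sample gain curve."""
    times = np.array([t for t, _ in breakpoints]) * sr
    levels = np.array([lvl for _, lvl in breakpoints])
    return np.interp(np.arange(n_samples), times, levels)

# Breakpoints paraphrasing the moves in the text:
# 70% send, down to nothing, back to 30%, then down to 10%.
env = automation_envelope(
    [(0.0, 0.7), (2.0, 0.0), (4.0, 0.3), (6.0, 0.1)],
    n_samples=6 * SR,
)
# A dry snare bus would then feed the reverb scaled sample by sample:
# reverb_input = snare * env
```

Recording the curve from a physical controller, as Taylor does, simply means the breakpoints come from a live fader ride instead of being typed in.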

Interestingly, you use your live band quite a bit in the studio. That’s a bit of a surprise, given your reputation as a one-man laptop virtuoso.

I’ve had a live band since the beginning of SOHN, and while I don’t always record with the full band — I still do a lot in the sort of laptop world — I think it’s important to remember what you lose by only doing laptop music by yourself, without other people.

For example, on a laptop, it’s pretty easy to construct a whole song quite quickly, and tell yourself, “Look at that, I’ve done a whole song — the drums are in there, the bass, the synths, the vocals. Done.” But you forget that this process of creating with a grid — with boxes, essentially — can be a pretty square way of working.

It's very easy to forget the human element when you’re creating by triggering parts on a keyboard, and then copying the entire chorus section to the next place in the arrangement, and so on.

So how did you start to include other players in the process?

Eventually, I began sending the songs, minus the bass parts, to my keyboard player, Albin Janoska, who also plays bass synths with the live band, and having him do a full pass of the song with my part simply as a reference. I’ll suggest he do four or five takes, then have him improvise a bit, and I can comp from that. The result is always slightly better than my straight version, and it immediately brings more of a human element to the track.

One forgets that, generally, great music is the sound of four or five different voices talking to each other. That conversational aspect of music is what makes it interesting. In this modern era of computer music, it can be very easy to neglect that.

How would you advise a laptop producer who is starting out?

The key to really improving as a laptop producer is to bring in any element that stops you from doing what you habitually would do. New instruments, new plug-ins, live players — anything that can inject an element of the unknown into your work.

The fear is going stale, which is why it’s so healthy to occasionally rotate your gear choices, work with other players, and allow some element of surprise back into your work.

"With the API 560 Graphic EQ plug-in, you quickly get something that sounds incredibly present with just a bit of added dirt."

Talk to me about those vocal stutter effects, like the one in “The Wheel.”

I remember writing “The Wheel” very clearly. Those are all cuts. I had the idea for a rhythmic melody figure, those “dit, de-dit, de-dit” rhythms. I wanted to do it with a keyboard sound of some kind, but I just couldn’t find anything appropriate. So I thought I would start by singing the two notes separately, a G# and the B a minor third above it, and chopping them up later. But when I recorded it, I spontaneously did a slide up from the G# to the B, and that became the intro of the whole song. I was still able to chop the slide in the middle and get two distinct notes for use elsewhere, and by singing another interval on top, then chopping, copying, and pasting, I could build a three-part harmony figure, arranged rhythmically to create the little riff. It’s actually a liberating way to work.
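The chop-and-rearrange workflow described here, slicing one vocal take into note segments and pasting them onto a rhythmic grid, can be sketched roughly as follows. Everything in this Python snippet (the sample rate, cut points, and the `chop`/`place` helpers) is hypothetical, standing in for edits normally done by hand in a DAW:

```python
import numpy as np

SR = 44_100  # assumed sample rate

def chop(audio, cut_points_sec, sr=SR):
    """Slice a recording at the given times, returning the segments."""
    idx = [0] + [int(t * sr) for t in cut_points_sec] + [len(audio)]
    return [audio[a:b] for a, b in zip(idx[:-1], idx[1:])]

def place(segments, pattern, total_sec, sr=SR):
    """Paste segments onto a timeline at (segment_index, start_sec) slots."""
    out = np.zeros(int(total_sec * sr))
    for seg_i, start in pattern:
        seg = segments[seg_i]
        s = int(start * sr)
        out[s:s + len(seg)] += seg[: len(out) - s]
    return out

# A stand-in for the recorded G#-to-B slide: one second of audio.
take = np.random.default_rng(0).standard_normal(SR)

# Cut the slide in the middle to get the two notes separately...
g_sharp, b = chop(take, [0.5])

# ...then arrange them into a "dit, de-dit" style rhythm.
riff = place([g_sharp, b], [(0, 0.0), (1, 0.5), (1, 0.75)], total_sec=2.0)
```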

SOHN's "The Wheel," from his 2014 album, Tremors.

So you would use vocals as a placeholder because you didn't care for the synth sounds?

Exactly. I just didn’t have many great instruments. For example, I’d have an idea for a sound in my head, but I didn’t think the software synths I had really captured what I was shooting for, so I’d do it with my voice, layering parts on top of each other, and then I’d chop it up and move it around the arrangement. All to fill in those spaces where I couldn’t find a synth sound that did the trick for me, where they were just sounding cheesy or corny or whatever.

I do that whenever there’s an idea that I can’t seem to get from an instrument. At first, it may just be to get an idea on the track and establish a foundation for it, thinking that I’ll replace it later with a synth or a guitar. But I nearly always end up using the vocal version.

What processing do you place on your master bus and individual tracks?

Well, I’ve really taken to using the Studer A800 Multichannel Tape Recorder as an insert on pretty much every individual track, in the typical multi-track tape way — just simulating the sound of having recorded all the tracks to tape. On my master bus, I generally have the API 2500 for some gentle mastering compression. And normally, when I’m bouncing out mixes and stuff, I’m using the Ampex ATR-102 Mastering Tape Recorder, and I add the Precision Limiter as well to make sure I’m not peaking, and to boost the level a little.
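The last step in that bounce chain, making sure nothing peaks while nudging the level up, is essentially peak limiting. Below is a deliberately naive Python illustration of that idea; it is not how the UAD Precision Limiter works internally, and the gain values are invented:

```python
import numpy as np

def limit(audio, makeup_db=3.0, ceiling_db=-0.3):
    """Naive peak limiter: apply makeup gain, then clamp to the ceiling.
    Real limiters use look-ahead and attack/release curves; this only
    illustrates the 'boost a little, never peak' idea."""
    gain = 10 ** (makeup_db / 20)
    ceiling = 10 ** (ceiling_db / 20)
    return np.clip(audio * gain, -ceiling, ceiling)

# A hot mix that would clip once boosted, if it weren't clamped:
mix = 0.9 * np.sin(np.linspace(0, 200.0, 44100))
out = limit(mix)
```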

Do you track through any UAD preamps or channel strips?

I love UAD channel strips. The API Vision Channel Strip, the UAD Neve 88RS, and the SSL 4000 E Channel Strip are the ones I use the most. I print them on the way in, with Unison Technology. With the SSL, I’m using the whole strip, including the preamp, for most of my vocals. I find UAD channel strips speak to my heart as much as my head; I bring these plug-ins into play partly just because they make me feel good. I think, “Ooh, I’d love to be running this sound through an SSL channel strip right now.”

Do you choose a preamp and dial it in thinking about the final mix?

Whether it makes sense in the final mix is neither here nor there. In the moment of creating, you’re putting something into the process with which you feel an emotional connection, and that connection is what makes you carry on working on that piece of music.

It’s a heart-based decision, but the look of the plug-in's user interface comes into play as well. It feels good to coexist in the creative moment with the way the plug-in feels, sounds, and looks.

What’s the one UAD plug-in that you simply couldn’t do without?

My one go-to plug-in, if I had to choose one from all the plug-ins out there, is the API 560 Graphic Equalizer plug-in. I use it on everything, because everything I record initially comes out sounding really smooth and rounded. And I quite often want to change that, at least to make it slightly edgier, more midrangey, and less hi-fi.

So I generally use the UAD API 560 to cut bass frequencies and enhance mid-range frequencies, and it can change the character of your sound in two seconds. With the API Graphic Equalizer, within three movements you’ve got something that sounds incredibly present with just a bit of added dirt. That’s my 100% couldn’t-live-without-it plug-in. It has done so much for me and my music. Honestly, I would marry that plug-in!
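The move described, cutting lows and pushing the midrange to make a smooth source edgier, can be imitated crudely in the frequency domain. This Python sketch is not the API 560's algorithm (a real graphic EQ uses fixed-band analog-style filters, and the band edges and gains here are invented); it only shows the same gesture:

```python
import numpy as np

SR = 44_100  # assumed sample rate

def tilt_eq(audio, low_cut_db=-6.0, mid_boost_db=4.0, sr=SR):
    """Crude FFT-domain EQ: pull down lows (< 200 Hz) and push up
    midrange (1-4 kHz). The hard band edges cause ringing a real
    EQ wouldn't; this is only the shape of the move."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1 / sr)
    gain = np.ones_like(freqs)
    gain[freqs < 200] *= 10 ** (low_cut_db / 20)
    gain[(freqs >= 1000) & (freqs <= 4000)] *= 10 ** (mid_boost_db / 20)
    return np.fft.irfft(spectrum * gain, n=len(audio))

# One second of a "smooth" source: a low tone plus a mid tone.
t = np.arange(SR) / SR
x = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 2000 * t)
y = tilt_eq(x)
```

After processing, the 100 Hz component is attenuated and the 2 kHz component boosted, the same less-bass, more-midrange tilt described in the answer above.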
