Posted Jan. 15, 2018, 12:46 p.m.
Over the last few years, a new retro music genre has emerged, bloomed and taken on a life of its own. Synthwave, or Retrowave, is an electronic music genre heavily influenced by the sounds and aesthetics of 1980s movies and their soundtracks (think John Carpenter, Vangelis etc.) and video games. This nostalgia-driven style of electronic music pays tribute to the style, feel and sound of the 80s. Musically, Synthwave often draws inspiration from bands that built their musical foundation on drum machines and (now) classic synthesizers.
Emerging in the late 2000s, Synthwave acts like Kavinsky, College and Com Truise were among the first to make the genre widely known and loved. Both Kavinsky and College were featured on the Synthwave-heavy soundtrack for the movie Drive, which helped many discover the sounds of Synthwave and brought the genre into the mainstream. The Netflix hit show Stranger Things also features Synthwave music in its soundtrack, and the whole series could of course be considered an homage to 80s movies.
Synthwave music is often inspired by and based around 80s style components such as drum machines (such as the Linn Drum) and analogue synthesizers like the Roland Juno and Jupiter 8, mixed with more modern production techniques like creative use of sidechain compression.
With its rich plethora of drum machines and analogue-inspired synthesizers, Reason is a perfect match for producing a Synthwave track. Here to show you how it's done is producer and musician Paul Ortiz of Synthwave group ZETA.
Producer and musician Paul Ortiz (Chimp Spanner) is a member of Synthwave group ZETA, along with Daniel Tompkins (TesseracT) and Katie Jackson. Together they fuse the retro, synth-heavy decade of the 80s with futuristic and breath-taking imagery, bringing past and future together in a Cyberpunk-esque package that is ZETA.
Follow ZETA on YouTube, Facebook, Spotify, Bandcamp.
Make a Synthwave track yourself with Reason's free trial!
Posted July 7, 2017, 8:38 a.m.
ZETA is a collaboration between Paul Ortiz (Chimp Spanner), Daniel Tompkins (TesseracT) and Katie Jackson. The UK artists seek to push their own creative boundaries by exploring epic soundscapes that intertwine with stunning visuals.
This unique project fuses the retro, synth-heavy decade of the 80s with futuristic and breathtaking imagery, bringing past and future together in a Cyberpunk-esque package. With a huge span of influences ranging from metal, future garage, retrowave, prog and classical to various game and film soundtracks, their music embraces the sounds of electronica, but with textures and layers inspired by the whole musical spectrum.
We had a chat with Paul about creating music for ZETA and how Reason plays a big role in the creative process.
Tell us a bit about how ZETA came about and what your intention was when launching the project!
I guess it kinda formed by accident! So I'd known our singer Dan for a while through the Progressive Metal scene - I was busy with my project Chimp Spanner and he sings for TesseracT. We'd always planned on working together but just never got around to it. It wasn't until I shared a song of my partner Katie's that he approached me, thinking it was a song of mine. After I explained the mixup we decided that it'd be awesome to all work together and, here we are! Originally we'd intended to make a futuristic/chill kind of album, and then for a while it was all-out Synthwave, and then it naturally settled somewhere in the middle. I think it works because we all have a shared love of influences old and new, ranging from Tears for Fears, George Michael and Vangelis to Ghost in the Shell, Future Garage, sci-fi games and all of that.
Being an (almost all) electronic album, what was your approach to producing the album, as opposed to any guitar-centered albums you've done previously?
Well the workflow was very different for me. I'm used to just writing on my own, instrumentally. With Zeta what'd usually happen is Katie would give me a MIDI file and a demo mixdown from Cubase. I'd listen a couple of times for reference and then dump the MIDI in Reason and basically start from scratch, then embellish with guitars or add new sections, chord changes, etc. So I guess it was more like re-mixing than anything. Then we'd send it off to Dan to do his thing, get the stems back and edit them in Reason, then figure out what needed to stay or go in the mix to make them fit. So yeah; for someone who's used to doing everything all at once it was a very different experience to bounce the songs around between three people. But it seems to have worked well. Of course some songs I wrote directly in Reason from start to finish but in either case the focus was on drums and bass. I found that once I nailed the rhythm section everything else fell into place, which really isn't too dissimilar to how I approach guitar music.
How did Reason help you creatively when writing music for the album?
It's just fun! We tried Cubase at first; Silent Waves is actually the only track not made in Reason, and it would've been if I had been able to find the project. But I just wasn't happy with the sounds I was getting. Everything was kind of "cold", and I found the environment kinda taxing to work in, especially when it comes to automation. So we made the decision early on to switch. With Reason it felt like I was playing with a bunch of cool toys rather than working. I've accumulated so many REs over the years that I had a device for just about every job, and where I didn't, I just made one myself in the Combinator. But yeah more than anything it's just that fun factor. And then of course on a technical level the clip based automation is just such a time-saver. You can go really crazy with it and not have to worry about setting things back to the right position afterwards. In Cubase I'd normally just leave stuff as it is because I can't be dealing with my parameters being left at the wrong value after MIDI or host automation.
OK, synth nerd alert: what was the most used synth on the album?
Tough one! I'd say Antidote, just because it's so versatile. It's great for those dark unison Future Garage style basslines, as well as pads and leads. But beyond that, I used a lot of The Legend and Viking (wanted that authentic Moog kinda feel). And I'm pretty sure Quadelectra's Jackboxes are on every track (707, 808, Linn Drum). The Kings of Kong ReFill is also fantastic if you want even more retro drum machines. That features a lot also.
Any special, secret Reason production trick used in the process?
Well there's a tonne of side-chaining haha. Kinda comes with the synthwave/future territory. Typically what I'd do is take all my melodic elements (except for lead instruments and vocals) and put them in a group channel called "SC". Then I'd either key the compressor using audio from the kick, or more often than not I'd just use Pump RE and trigger it via MIDI. Having certain instruments outside of the side-chain group keeps the mix from sounding too ducked and keeps those elements more in focus. Also Audiomatic's Tape and Bottom presets got a lot of use on the album. I have no idea what they do, but they make the mixes sound kinda warm and fuzzy, and I like that. Scream's Tape setting is also great for warming up basses and kick drums. Distortion isn't necessarily a destructive tool. It can be really musical.
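Outside the rack, the core of that pumping effect is easy to sketch. The snippet below is not Pump RE or Reason's compressor, just a minimal numpy illustration of trigger-driven ducking; all the names and numbers are invented for the example:

```python
import numpy as np

def sidechain_duck(signal, trigger_env, amount=0.8):
    """Duck `signal` by up to `amount` wherever the trigger envelope is high.

    `trigger_env` is a 0..1 control signal -- in Reason terms, the kick
    keying the compressor, or Pump RE triggered via MIDI.
    """
    gain = 1.0 - amount * trigger_env
    return signal * gain

# Hypothetical example: a sustained pad ducked by a four-on-the-floor pulse.
n = 100                                   # control samples per beat (arbitrary)
pad = np.ones(n * 4)                      # constant "pad" level
beat = np.exp(-np.linspace(0.0, 5.0, n))  # decaying envelope, one per beat
env = np.tile(beat, 4)                    # retriggered on every beat
ducked = sidechain_duck(pad, env, amount=0.8)
```

On each beat the pad drops to 20% of its level and swells back up as the envelope decays; instruments kept out of the "SC" group would simply bypass this gain stage, which is what keeps them in focus.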
Any tips and tricks for mixing vocals in Reason?
Hmm, considering this was my first time mixing vocals, I think it might be me who needs a few tips and tricks! But I mean, it was a learning experience. I'd say automate. Lots. I'm kind of a set-and-forget guy normally, but for vocals it just doesn't work. You have to really ride the faders and "play" the mix. Also try using ducking on your reverbs. So you could send a lead vocal to a nice long reverb with a compressor after it. Then use the Spider to take a copy of that dry vocal and send it to the sidechain input of the compressor. Kinda like lazy-man's automation. When there's singing there's less reverb. When there's no singing, there's more reverb. Works pretty well most of the time.
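That ducked-reverb trick can be caricatured in code, too. Below is a hedged numpy approximation, not the RV7000-plus-compressor chain itself: a crude peak-hold envelope follower on the dry vocal drives gain reduction on the wet reverb signal. The threshold, ratio and smoothing values are made up for illustration:

```python
import numpy as np

def duck_reverb(reverb_wet, dry_vocal, threshold=0.1, ratio=4.0):
    """Compress the wet reverb, keyed from the dry vocal's level.

    Loud vocal -> reverb pushed down; gaps in the vocal -> reverb
    blooms back up. "Lazy-man's automation."
    """
    level = np.abs(dry_vocal)
    env = np.empty_like(level)            # crude peak-hold envelope follower
    env[0] = level[0]
    for i in range(1, len(level)):
        env[i] = max(level[i], env[i - 1] * 0.999)
    over = env > threshold
    gain = np.ones_like(env)              # below threshold: no gain reduction
    gain[over] = (threshold + (env[over] - threshold) / ratio) / env[over]
    return reverb_wet * gain
```

With a steady loud vocal the reverb sits well below unity; when the vocal drops out, the gain returns to 1.0 and the tail comes back up on its own.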
Could you share any synth patches used on the album?
Well a lot of the patches are really not that complicated; most of the basses and pads are really sort of "naked", in that they're not dressed up with a lot of effects or complex routing. It's mostly sawtooth oscillators (either dual detuned or something with a rich/wide unison section like Korg MonoPoly or Antidote) and then a suitable amp/filter envelope depending on whether it's a bass or a pad or whatever. I've included a few patches here, although they're not much to look at!
Download Zeta's Reason presets here!
(Please note that some of the patches require Rack Extensions)
A few people have asked about the snare on The Distance. And I can tell you it's a layered 707 snare, 707 low tom, and the BBGunSnare_BSQ sample from the Reason FSB, all running into a gated reverb! Ohhh and guitars are almost entirely presets from Kuassa's excellent amp REs!
Follow ZETA on YouTube, Facebook, Spotify, Bandcamp.
Listen to ZETA's new album here:
Try Reason 9.5 free for 30 days here!
Posted June 20, 2017, 9 a.m.
Over the last months we've been posting these #ReasonQuickTip videos on our social media channels and due to popular demand, we've now compiled them in one space. This YouTube playlist will be updated whenever a new #ReasonQuickTip gets posted so be sure to bookmark this page!
If you want to share your best tip with us, just tweet us or write to us on Facebook or Instagram! Maybe your tip will be our next video?
Posted Oct. 30, 2015, 3:32 p.m.
Ever since those Portishead folks in Bristol found the magic that happens when you pitch a drum sample down and bathe it in gloomy reverb, Trip Hop has been one of the most popular genres for people learning to make beats. When we got requests to cover Trip Hop in this tutorial series, we wondered what people were really asking for. Trip Hop is a sample-loop based genre that doesn't require too much production wizardry, if you don't want it to... In this tutorial, however, we'll cover those basics but we'll also delve further into the sound design theory that lies behind those loops so that you can create your own custom Trip Hop sounds and beats.
Posted Oct. 29, 2015, 9:33 a.m.
At its most basic, a shimmer reverb is a pitch-shifted reverb tail in a feedback loop. If you've listened to much U2 since the mid-80s, then you'll have heard it. While it does work particularly well on guitars, it can also be used to great effect on other instruments. Brian Eno, who is generally credited with inventing the effect, had been using it on pianos long before it was popularised by U2's The Edge.
Here's a simple piece, played using a tweaked Radical Pianos preset, played through a shimmer reverb patch I created in Reason:
I built the shimmer effect in the Reason rack with an RV7000 Reverb and a Polar Dual Pitch Shifter. Hold down the shift key when you add these two devices to your rack though, because we don't want to use the default routing here - we're going to do things a little differently.
Connect an FX Send from the Master Section to the input of the RV7000, but instead of sending the RV7000's output back to the FX Return on the Master Section, connect it to a Spider Audio Merger & Splitter. Send one pair of outputs from the Spider to the FX Return on the Master Section, and send another to the input of the Polar Pitch Shifter. Send the output from the Polar to its own channel in the mixer.
Now that we've got the routing sorted out, let's start dialling in some settings. You're going to want a pretty evident reverb. I've used the Arena algorithm, and selected the largest size available. Crank up the diffusion to make everything as fuzzy as possible. Turn the decay *nearly* all the way up, but not quite. Do not be overly concerned with subtlety here, people. Really: go big or go bigger. If you want to start with a preset, then the EFX Kick Bomb patch is as good a place as any. Add a little pre-delay to stagger the beginning of the reverb tail.
For the pitch shift part of the sound, set both shifters to a shift of a single octave (by all means experiment with different intervals, but an octave is your safest bet). Play with the feedback level and delay of each shifter to suit. Dial back on the delay and feedback if you find things are sounding a little seasick. I've detuned the second shifter, panned it to one side, and delayed it slightly.
Because you're adding higher frequencies to the signal, it doesn't hurt to engage the Polar's LPF - you can set the cutoff frequency to match your material.
The final step is feeding the pitch shifted reverb tail back on itself. This shifts the reverb tail in pitch again and again, making for the characteristic sound of the effect.
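For the curious, that whole feedback structure can be caricatured in a few lines of Python. This is emphatically not what the RV7000 and Polar do internally - just a toy block-based sketch of "reverb, octave-up, back into the reverb", with a naive resampling shifter and a one-tap "reverb" standing in for the real devices:

```python
import numpy as np

def octave_up(block):
    """Naive +1 octave: play the block at double speed, twice over.

    (Polar uses overlapped grains; this crude version clicks at the
    join, but it shows the idea. Block length is assumed even.)
    """
    fast = block[::2]                    # double speed = half length
    return np.concatenate([fast, fast])  # repeat to refill the block

def shimmer(dry_blocks, decay=0.5, feedback=0.3):
    """Pitch-shifted reverb tail fed back into the reverb input."""
    tail = np.zeros_like(dry_blocks[0])     # toy one-tap reverb state
    shifted = np.zeros_like(dry_blocks[0])  # last pitch-shifted wet block
    out = []
    for block in dry_blocks:
        # dry signal + previous shifted tail both enter the "reverb"
        wet = tail * decay + block + feedback * shifted
        tail = wet
        shifted = octave_up(wet)  # gets shifted *again* on the next pass
        out.append(wet)
    return out

# A single burst followed by silence: the tail keeps ringing, climbing
# an octave each time it recirculates.
blocks = [np.ones(8)] + [np.zeros(8)] * 4
tailed = shimmer(blocks)
```

Note that `decay + feedback` is kept below 1.0 here; push the feedback much higher and the loop runs away - the same reason you pull the fader down before hitting play in the rack version.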
Because you have the pitch shifted reverb tail in its own mixer channel, you can feed it back through the reverb by activating the same FX return that is connected to the reverb inputs.
In the screenshot here, I'm using FX Send 5 to send the Distant Piano instrument to my RV7000 reverb. The pitch-shifted reverb tail from the Polar is routed to the Shimmer Return channel in the mixer. This channel in turn has FX Send 5 activated, which feeds the pitch-shifted reverb tail back into the RV7000.
It's a good idea to lower the fader for this channel before you hit play! The channel fader can be used to blend the amount of pitch-shifted reverb against the normal reverb, and you can use the mixer channel's filters, EQ and compressor to rein in the signal and keep things under control.
Here's the same piece without the shimmer effect:
Download the attached Reason song file and try it out! Try your own material through the shimmer effect. Try different intervals of pitch shift. What's important is to make sure the material you're running through the effect has space to breathe, allowing the sound to develop and flex. If your material is too dense, you're going to end up with some kind of sparkly celestial soup.