
How Atlanta-based MEPTIK Helps TikTok & Others Create Immersive Experiences

by Richie Watkins

When the nature of live events changed in 2020, MEPTIK co-founders Sarah Linebaugh and Nicholas Rivero were ready to bring their extended reality work center stage. 

“We focus on creating really unique environments in a digital world, and we really like to put real people inside of them,” Linebaugh told Hypepotamus. 

MEPTIK, a production and design studio in Atlanta and Nashville, centers on creating real-time content that isn’t hindered by traditionally slow rendering times. By using multi-user editing, the team can work alongside clients and give one-on-one feedback during the creation process. 

“It’s been pretty essential in our pipeline as of late,” said Joss Abaco, interactive developer for MEPTIK. Abaco, a SCAD graduate, joined the team after working a year in the gaming industry.

That pipeline has attracted a unique client base, including TikTok.

The VR and immersive experience market has changed drastically over the last decade. (And certainly since 2003, when Abaco fondly remembers seeing his first 3D movie, Spy Kids 3D: Game Over.) 

Sarah and Joss took some time to divulge what it takes to build new digital worlds and a technical team here in Atlanta. Check out a segment of our Q&A here: 


HYPEPOTAMUS: What is the difference between Virtual Reality and Extended Reality?

LINEBAUGH: So on the experiential side, a lot of what we have done is create physical experiences for people by using digital elements.  What we do on that side ranges from projecting on a building to projection mapping to interactive lobby installations.  You’ll see all this at a concert or a corporate event sometimes.  We’ve been into the real-time content side for a long time, so a lot of our work has some type of input, such as audio; in other words, content that is audio-reactive.


We’ve done work for a Super Bowl party, where we would take live input from concert artists and DJs and have all the content react to it in real-time.  Interactive touch walls are another one.  We’ve also done a lot of experiential elements around trade show booth design, where we usually sneak a digital interactive element in there as well.  We’ve also done event visuals that we design to be reactive, such as the onstage graphics at a corporate event or concert.  Virtual production is similar, since we’re still designing the real-time content, and the setup to make it all work is the same.

As for extended reality, the camera is like a VR headset. So if we’re doing this for, say, a virtual event, we’ll put a speaker inside a CG (computer-generated) environment. And then we use camera tracking to basically tie the real-world camera to a digital camera in the virtual scene. So as the real-world camera moves, it’s actually moving the digital scene along with it.  Essentially, the real camera acts like a mobile VR headset: you can look around the entire environment using the camera. But that allows you to capture everything in-camera, so you don’t have to do any post effects. Post-production, in that aspect, happens before filming.  Compositing is happening there in real-time, and the camera actually sees it. And your final output shows a real person in a digital environment.
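The core idea Linebaugh describes, tying the real camera to a digital one, can be sketched in a few lines of code. This is purely illustrative and not MEPTIK’s actual pipeline: the `Pose` class and `sync_virtual_camera` function are hypothetical names, and real tracking systems work with full transform matrices, quaternions, and lens data rather than simple offsets.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A simplified camera pose: position in meters, rotation in degrees."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

def sync_virtual_camera(tracked: Pose, stage_origin: Pose) -> Pose:
    """Map a tracked real-world camera pose into the virtual scene.

    The physical stage origin is expressed as an offset in scene
    coordinates, so every real camera move produces the identical
    move on the digital camera -- the basis of in-camera compositing.
    """
    return Pose(
        x=stage_origin.x + tracked.x,
        y=stage_origin.y + tracked.y,
        z=stage_origin.z + tracked.z,
        yaw=stage_origin.yaw + tracked.yaw,
        pitch=stage_origin.pitch + tracked.pitch,
        roll=stage_origin.roll + tracked.roll,
    )
```

Each frame, the tracker reports the real camera’s pose, the virtual camera is updated to match, and the render behind (or around) the talent shifts in perspective accordingly.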


HYPEPOTAMUS: Do you see virtual production phasing out green-screen in the next five to ten years?

LINEBAUGH: Not necessarily, because you can do the same workflow with a green screen. You can still render effects in real-time on a green screen.  You just use the green to key out the talent and still place them inside the virtual environment. So we’ve done that quite a bit.  That being said, though, the benefit of using an LED wall is that you get really nice, natural lighting on the talent from the reflections of the LED shining on them. It also makes it easier for them to envision the environment they’re in, versus having to make it up in their head with a green screen.
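The keying step Linebaugh mentions, using the green to cut out the talent, can be illustrated with a toy per-pixel test. This is a rough sketch with hypothetical function names, not production software: real keyers run on the GPU and add soft mattes, spill suppression, and edge blending.

```python
def is_green_screen(r: int, g: int, b: int, dominance: float = 1.4) -> bool:
    """Rough chroma-key test: a pixel is keyed out (treated as
    background) when green clearly dominates red and blue."""
    return g > 90 and g > dominance * max(r, b, 1)

def composite_pixel(fg: tuple, bg: tuple) -> tuple:
    """Replace keyed foreground pixels with the virtual environment;
    keep talent pixels (anything not green-dominant) as-is."""
    r, g, b = fg
    return bg if is_green_screen(r, g, b) else fg
```

Run over every pixel of a camera frame, this swaps the green backdrop for the rendered virtual scene while leaving the talent untouched; the LED-wall workflow skips this step entirely because the environment is already in the shot.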


HYPEPOTAMUS: Since you’re located in a major artery of film production, do you see MEPTIK stepping further into the film or TV realm?

LINEBAUGH: We are actually in the process of partnering with a studio here that’s going to focus primarily on films. And we are also setting up our own virtual production studio as well. So we’re gonna be focusing on commercials, music videos, and corporate virtual events.


Behind the scenes: MEPTIK’s work with Lululemon for World Yoga Day

HYPEPOTAMUS: What fascinates me about this technology is that while it is more advanced than the green screen process, it actually harkens back to the old days of rear-screen projection in films.  In other words, this is a great example of “the more things change, the more they stay the same.”  That said, how do you see virtual production evolving over the next few years?

LINEBAUGH: I think we’ll start to see this become more ubiquitous throughout the industry. Right now, it’s a few select films, commercials, and virtual events that, frankly, can afford to include this workflow. But I think we’re going to see a lot more films start budgeting for this type of thing.  I think we’ll see some shift from a lot of the post-production budget [to the production budget], because there’s so much planning on the front end, since you’re basically designing all of the digital environments, or your set environments, upfront.

That being said, the flexibility that it offers is tremendous because you can go into the studio at the golden hour or sunset, you know, for like, six hours if you want to. You can have the same actor standing there and you can go from a beach in Hawaii to the forests in Alaska all within a few seconds.  So it’s super-versatile, especially if you’re limited by location.  I also think we’re going to start to see more of the corporate sphere and the broadcast sector jump in on this because a lot of what we’ve seen is that film is primarily embracing it.


A look at the Haunted House experience

HYPEPOTAMUS: So you have a new haunted house escape game available now.  Tell us about it!

LINEBAUGH: So the haunted house started as a personal project. We always want to keep sharpening our skills at creating environments and making sure they look real.  We just play, by creating a variety of environments. And the team came up with the idea of a haunted house scene, an old Victorian-style house, and focused on creating a unique, real-time environment.  From there, we decided, ‘hey, let’s turn that into a VR game.’

The concept was that we wanted to show off this incredible environment the team built, but give it a little extra edge. So you have to find three keys to escape, and there’s a ghost that gives you hints, plus little extra spooky elements like the creaking chair.  We have a pretty cool easter egg in there: the portraits on the walls are actually each member of MEPTIK’s team.


Abaco teased the Haunted House game with a low-key, impromptu narration: “Your eyes slowly open up and you hear the crash of thunder just beyond the walls. In front of you, you see these three steel locks just caging you inside this environment, as you walk through we have this nice dining room table on your right here and the candles…then a light flickers.  At the end, there’s just this little hint of a blue apparition in the corner before lightning crashes down and then what you saw is gone–like a wisp of smoke.”

The Haunted House is just one example of how MEPTIK is paving new roads in the digital space. Linebaugh added that whether working on a film or with a corporate client, the team is “pushing the boundaries of the applications for our real-time content.”



