Last Update: July 28, 2025, 11:45 AM
⚡ Geek Bytes
  • Runway’s Act 2 lets you animate full upper-body motion, including hand gestures, using just a video and a still image.
  • Proper posing, prop use, and reference images dramatically improve the realism of your animations.
  • With a few smart tricks like green screen compositing and voice changers, you can turn simple videos into cinematic sequences.

Create Amazing Videos with Runway's Act 2 – How-To & Pro Tips

If you thought Runway’s Act 1 was cool, wait until you see what Act 2 can do. The jump from facial animation to full upper-body motion is no small leap—it completely changes how you can use AI in your videos. Whether you're a YouTuber, indie filmmaker, or just a creative tinkerer like me, Act 2 opens up some wild new possibilities.

Let’s break down how it works, how to get the best results, and how to avoid weird glitchy hand-melt moments along the way.

What's New in Act 2?

Runway’s Act 1 could animate faces pretty well—but that’s where the magic ended. Act 2 brings the rest of your upper body into the action, including arms and hands. That means you can now gesture, wave, hold objects, and even vibe your way through a video.

Oh, and it handles up to 30 seconds of motion, which is more than enough for most clips or dialogue-driven scenes.

Getting Started with Act 2

Inside Runway, just start a Generate Video session, and you’ll find the Act 2 tab. You’ll need two things:

  • A driving video – That’s you, moving around and talking.
  • A character image – This can be AI-generated or a photo you create with props, costumes, or cool poses.

Load both in, tweak a few settings (more on those below), and hit generate.
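If you'd rather script generations than click through the web UI, Runway also offers a developer API. The sketch below shows roughly what a request body for an Act 2 job could look like — the endpoint URL, model identifier, and every field name here are assumptions for illustration, so check Runway's current API documentation for the real schema before using it.

```python
import json

# Hypothetical sketch of a Runway Act 2 generation request.
# The endpoint path and all field names are assumptions for
# illustration -- consult Runway's API docs for the real schema.
API_URL = "https://api.dev.runwayml.com/v1/character_performance"  # assumed

def build_act2_request(driving_video_url, character_image_url,
                       expressiveness=3, gestures=True):
    """Assemble a JSON body for a single Act 2 generation (hypothetical)."""
    if not 1 <= expressiveness <= 5:
        raise ValueError("expressiveness must be 1-5")
    return {
        "model": "act_two",                    # assumed model identifier
        "reference": {"type": "video", "uri": driving_video_url},
        "character": {"type": "image", "uri": character_image_url},
        "expressiveness": expressiveness,      # 1 = subtle, 5 = cartoonish
        "body_control": gestures,              # False = face-only animation
    }

payload = build_act2_request("https://example.com/driving.mp4",
                             "https://example.com/character.png")
print(json.dumps(payload, indent=2))
```

The two required inputs map directly onto the bullets above: the driving video is the reference performance, and the character image is what gets animated.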

Pose Matching: The Secret Sauce

This is the #1 tip that will save you so much frustration: Match the starting pose of your character image to your video’s first frame. If your hands, head angle, or body orientation are wildly different, Act 2 will struggle—and the result might look like your character’s fighting invisible bees.

Try it yourself. Record your video. Take a still from the first frame. Then use that as a guide when generating or choosing your character image.
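Grabbing that first-frame still doesn't need an editor — ffmpeg can do it in one command. This little helper just builds the command; it assumes ffmpeg is installed and on your PATH, and the filenames are placeholders.

```python
import subprocess

def first_frame_cmd(video_path, out_path="first_frame.png"):
    """Build an ffmpeg command that saves frame 1 of a clip as a still.

    Assumes ffmpeg is installed and on PATH; paths are placeholders.
    """
    return [
        "ffmpeg", "-y",          # overwrite the output without asking
        "-i", video_path,        # the driving video you just recorded
        "-frames:v", "1",        # stop after the first video frame
        out_path,
    ]

cmd = first_frame_cmd("driving_video.mp4")
# subprocess.run(cmd, check=True)  # uncomment to actually run ffmpeg
print(" ".join(cmd))
```

Use the resulting PNG as the pose reference when generating or picking your character image.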

Hands, Props & Glitchy Fingers

Let’s talk props. Holding something in your hands? A glowing orb, a sword, maybe just a cup of coffee? Make sure the prop is visible in the character image. If it's not there, Runway will guess—and it often guesses wrong.

I tested it with a cardboard gun and a magical glowing cube. Sometimes the fingers melted into the object, or the gun handle disappeared mid-motion. But with careful posing and consistency, I got some awesome, magical results.

Pro tip: Glowing objects, like orbs, really pop if you add a bit of glow in the character image. Runway tries to keep those lighting effects in the final video.

Settings Breakdown: Expressiveness & Gestures

You’ve got a couple of toggles in Act 2 that make a big difference:

  • Facial Expressiveness (1-5): 1 = subtle, 5 = full-on cartoonish. I stick with 2-3 for most things.
  • Gestures On/Off: Turning this off gives you only facial movement. On = full body. Super useful if your character shouldn’t move too much.

Toggle them both and test what works best for your style.

Green Screen Magic & Multi-Character Shots

Want two characters in one scene? Here’s my method:

  1. Film yourself twice, doing different actions.
  2. Animate each performance in Runway with a green screen background.
  3. Composite the two clips together in a video editor and drop in your background.
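The compositing in step 3 happens outside Runway, in your editor. Conceptually, a chroma key just swaps "green enough" pixels for background pixels. Here's a minimal per-pixel sketch in plain Python — real editors use far smarter keying (spill suppression, soft edges), but the core idea is this simple:

```python
def chroma_key_composite(fg, bg, threshold=100):
    """Overlay a green-screen foreground frame onto a background frame.

    fg and bg are same-sized nested lists of (r, g, b) pixels. A pixel
    counts as "green screen" when its green channel beats red and blue
    by more than `threshold` -- a crude rule, for illustration only.
    """
    out = []
    for fg_row, bg_row in zip(fg, bg):
        row = []
        for (r, g, b), bg_px in zip(fg_row, bg_row):
            is_green = g - max(r, b) > threshold
            row.append(bg_px if is_green else (r, g, b))
        out.append(row)
    return out

# Tiny 1x2 example: first pixel is pure green (keyed out), second is red.
fg = [[(0, 255, 0), (200, 10, 10)]]
bg = [[(5, 5, 5), (9, 9, 9)]]
print(chroma_key_composite(fg, bg))  # -> [[(5, 5, 5), (200, 10, 10)]]
```

Run this over every frame of each Act 2 clip and you've effectively done what your video editor's chroma-key filter does for you.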

I used this for a park bench scene with two characters chatting. It looked completely natural—and you wouldn’t know both were just me in a hoodie and a wig.

Using Flux Kontext for Better Characters

Creating awesome characters can make or break your video. You can generate them right in Runway, but if you want more control, I recommend:

  • Higgsfield AI (with Flux Kontext Max): Perfect for stylized characters like aliens, robots, or wizards.
  • Fal AI: Another great option for detailed image creation and character design.

Just upload a still from your video and prompt the AI to turn it into whatever you want. Then, bring that image back into Runway for animation. Want your cube to glow blue? Tell it. Want a wizard hat? You got it.

Voiceovers & Sound Design

Want to change your voice? I use ElevenLabs for character voices. Just drop in your audio and choose from a wide range of voice models. It’s scarily good. You can also generate entire scripts using their text-to-speech.

It’s what I used for the Orb of Oblivion ad voiceover—and yeah, it made my goofy prop sound like a legendary artifact.

Make It Cinematic: Depth of Field & Focus Tricks

I even tried depth of field tests—putting my hand close to the camera and pulling it back. Runway followed the focus shift. That’s huge. It makes everything feel way more cinematic and alive.

Want your animations to look next-level? Add movement to your backgrounds. Animate a slow pan. Blur the background slightly. These little tweaks add depth to the whole scene.

Topaz Astra: The Final Polish

Runway’s output looks good—but upscaling it with Topaz Astra takes it into “wait, that’s AI?” territory. Sharper edges, better motion clarity, and less noise. Especially helpful if you want to go full 4K for YouTube or film projects.

Land of Geek Rating: 8.7/10

Verdict: A massive leap for creator-led AI animation. Runway’s Act 2 isn’t perfect, but it’s one of the most accessible, fun, and surprisingly powerful tools for storytelling and content creation on the market right now.

Pros:

  • Full Upper Body Animation: Huge upgrade from facial-only animation in Act 1. Now includes arms and hands.
  • Easy to Use: Straightforward interface with solid default settings and helpful templates.
  • Flexible Use Cases: Great for YouTubers, storytellers, meme-makers, and even indie filmmakers.
  • AI Character Generation Support: Use Runway or Flux Kontext platforms like Higgsfield to easily generate matching images.
  • Gesture Tracking Works Surprisingly Well: Even with complex motion or props, results often look natural and engaging.

Cons:

  • Hand Glitches Still Happen: Especially when props aren’t present in the source image.
  • Pose Matching Required: To get solid results, you really need to match your video to the character pose, which can take trial and error.
  • Limited Background Control: No built-in dynamic backgrounds—requires compositing outside Runway.
  • Gestures Sometimes Overreach: At higher expressiveness levels, things can get… wonky.
  • No Full-Body Yet: Legs and walking animation are hit-or-miss until Act 3 becomes a reality.

It’s a huge leap over Act 1 and opens up storytelling possibilities that just weren’t possible before. Whether you’re building goofy characters, emotional monologues, or something surreal and cinematic—Act 2 is a powerhouse.

Will there be a full-body Act 3 someday? Probably. And I cannot wait. But for now, this is one of the most fun and creative tools in my entire workflow.

Keep experimenting, stay creative, and make your AI videos unforgettable. For more tech tips, tutorials, and digital magic, keep exploring Land of Geek Magazine.

#RunwayML #Act2Tutorial #AIAnimation #CreativeTools #LandOfGeek

Posted Jul 28, 2025 in the Tech and Gadgets category