Categories: Image

Time-to-Move + Wan 2.2 Test

Made this using mickmumpitz’s ComfyUI workflow, which lets you animate movement by manually shifting objects or images in the scene. I shot footage with both my higher-quality camera and my iPhone, and for this demo I chose the lower-quality footage with imperfect lighting. That roughness made it feel more grounded, almost as if the movement had been captured naturally in real life. I might do another version with the higher-quality footage later, just to try a different approach. Here’s mickmumpitz’s tutorial if anyone is interested: https://youtu.be/pUb58eAZ3pc?si=EEcF3XPBRyXPH1BX

submitted by /u/enigmatic_e

Published by AI Generated Robotic Content
Tags: ai images
