Categories: Image

Time-to-Move + Wan 2.2 Test

Made this using mickmumpitz’s ComfyUI workflow, which lets you animate movement by manually shifting objects or images in the scene. I tested both my higher-quality camera and my iPhone, and for this demo I chose the lower-quality footage with imperfect lighting. That roughness made it feel more grounded, almost like the movement was captured naturally in real life. I might do another version with higher-quality footage later, just to try a different approach. Here’s mickmumpitz’s tutorial if anyone is interested: https://youtu.be/pUb58eAZ3pc?si=EEcF3XPBRyXPH1BX

submitted by /u/enigmatic_e

Published by AI Generated Robotic Content
Tags: ai images
