
Ultra-realistic cloth simulation shown by researchers (video)

by Mark Tyson on 24 July 2013, 11:45

Quick Link: HEXUS.net/qabyzz


Researchers have demonstrated Near-exhaustive Precomputation of Secondary Cloth Effects at SIGGRAPH 2013. A published research paper and a video detailing the findings are available. The researchers, from Carnegie Mellon University and UC Berkeley, took an "entirely data-driven" approach, with the animation possibilities precomputed and stored in advance.

The scientific team challenged the assumption that pre-computing "everything" about a complex space was impossible. Several thousand CPU-hours were used to calculate the cloth animation and "perform a massive exploration of the space of secondary clothing effects on a character animated through a large motion graph." The cloth animation could then be used to dress a character which was moved in real time through a variety of poses and actions. Please check out the video embedded below.
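To make the idea concrete, here is a minimal, purely illustrative sketch of the data-driven pattern described above: cloth simulation is run offline for every state of a toy motion graph, and the runtime merely looks the results up instead of simulating. The motion-graph nodes, mesh sizes and the stand-in simulation function are all invented for illustration and are not the researchers' actual pipeline.

```python
# Illustrative sketch only -- not the researchers' actual pipeline.
# A toy "motion graph" whose nodes are character poses; for each node we
# precompute a cloth mesh offline, then at runtime we simply look the
# result up instead of simulating cloth every frame.
import numpy as np

NUM_CLOTH_VERTS = 500          # toy cloth resolution (assumption)
MOTION_GRAPH = {               # node -> list of reachable next nodes
    "idle": ["idle", "walk"],
    "walk": ["walk", "run", "idle"],
    "run":  ["run", "walk"],
}

def expensive_offline_cloth_sim(node: str) -> np.ndarray:
    """Stand-in for the thousands of CPU-hours of offline cloth simulation."""
    rng = np.random.default_rng(abs(hash(node)) % (2**32))
    return rng.standard_normal((NUM_CLOTH_VERTS, 3)).astype(np.float32)

# Offline phase: exhaustively precompute cloth for every motion-graph node.
cloth_table = {node: expensive_offline_cloth_sim(node) for node in MOTION_GRAPH}

# Runtime phase: walking the motion graph costs only a dictionary lookup
# per frame, so dressing the character is essentially free at run time.
def play(start: str, steps: int) -> None:
    node = start
    for frame in range(steps):
        cloth_mesh = cloth_table[node]          # precomputed, no simulation
        print(f"frame {frame}: pose={node}, cloth verts={cloth_mesh.shape[0]}")
        node = MOTION_GRAPH[node][frame % len(MOTION_GRAPH[node])]

play("idle", 5)
```

In the real system the precomputed table covers secondary cloth motion across a large motion graph rather than a handful of poses, but the runtime principle is the same: the expensive work has already been done.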

Hoodies are particularly problematic

The team seemed happy with the cloth animation quality in the video example above, given that the precomputation time exceeded 4,500 CPU-hours and the animation ran to nearly 100,000 frames. The tens of gigabytes of data used to deliver the animation could be compressed to just tens of megabytes. As such, the team thinks it is a reasonable proposition to deliver this "high-resolution, off-line cloth simulation for a rich space of character motion" as part of an interactive application or video game.
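The article does not describe the compression scheme itself, but one common way to shrink a large bank of precomputed mesh frames is to store a low-rank (PCA-style) basis plus small per-frame coefficients. The sketch below shows that idea with toy numbers; the sizes, rank and random data are assumptions for illustration only (real cloth frames are highly correlated, so a low-rank basis reconstructs them far better than it does random data).

```python
# Illustrative sketch only: low-rank compression of a bank of precomputed
# cloth frames. Sizes and data are toy values, not the paper's numbers.
import numpy as np

num_frames, num_verts, rank = 2000, 500, 32      # toy sizes (assumptions)
frames = np.random.default_rng(0).standard_normal((num_frames, num_verts * 3))

mean = frames.mean(axis=0)
# Truncated SVD of the mean-centred frames gives a rank-`rank` basis.
U, S, Vt = np.linalg.svd(frames - mean, full_matrices=False)
basis = Vt[:rank]                                # (rank, num_verts * 3)
coeffs = (frames - mean) @ basis.T               # (num_frames, rank)

raw_bytes = frames.nbytes
compressed_bytes = basis.nbytes + coeffs.nbytes + mean.nbytes
print(f"raw: {raw_bytes/1e6:.1f} MB  compressed: {compressed_bytes/1e6:.1f} MB")

# Runtime decompression of any single frame is one small matrix product:
frame_42 = mean + coeffs[42] @ basis
print("max reconstruction error (toy random data):",
      np.abs(frame_42 - frames[42]).max())
```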

It is the "rapid growth in the availability of low-cost, massive-scale computing capability" that has allowed the team to challenge the preconception that pre-computing almost everything is impractical. Adrien Treuille, associate professor of computer science and robotics at Carnegie Mellon, told TechCrunch: "The criticism of data-driven techniques has always been that you can't pre-compute everything," he said. "Well, that may have been true 10 years ago, but that's not the way the world is anymore."

The test system used by the researchers was an Apple MacBook Pro laptop (Core i7 CPU), which easily coped with animation playback at 70fps or faster while responding to joypad input. The researchers thought that the run-time memory requirement of about 70MB for this animation is "likely too large to be practical for games targeting modern console systems (for example, the Xbox 360 has only 512 MB of RAM)"; however, the requirements are "modest in the context of today's modern PCs" and next-gen consoles.



HEXUS Forums :: 7 Comments

So, someone's published a genuine scientific research paper, saying we can stop researching clever solutions and start brute-forcing problems due to the more readily available compute power at our disposal?

I can't decide if I'm horrified or awestruck… :confused:
If I understand this correctly (and I probably haven't), does this mean that a games developer could do all of the pre-render work and just make the compressed output data set available to the game engine, and therefore deliver higher-quality gfx at lower computational expense for the game player, with increased frame rates?

If so then that seems a great way to finally start to push the boundaries of what's possible in games without the need for ever more powerful hardware that's trying to calculate and display the output on the fly.
It wouldn't really change what you could do graphics-wise, but for things like physics a lot more could be done, as long as the game developer was willing to implement it.
this is the type of thing Microsoft are targeting with cloud computing on Xbox One
4.5 thousand hours and, when the hood is down, it doesn't move convincingly; certainly not compared to the rest of the garment.

Solution: Ban hoodies in CG, and hope that life eventually imitates art. ;)