Long Road to Make Virtual Apparel Look Real

Staggering but steady leaps in computing power make Embodee’s virtual product experiences possible, as they do countless other things. Without several decades of exponential advances, our servers would take intolerably long to dynamically render images of apparel. And your web-enabled devices would take even longer to display them.

For example, our recently patented virtual try-on technology produces in seconds a 360-degree view of how a pair of jeans fits you. In the past it would have taken minutes or even hours, like trying on those jeans in a dressing room and then waiting endlessly to see anything in the mirror.

While personal computers today process hundreds of billions of instructions a second, the number in 1987 was less than 20 million. That year two computer scientists in Palo Alto, California, developed a method to simulate the shapes of flexible objects, including cloth.

They applied principles of physics to control how those virtual objects moved under the effects of gravity, wind, contact with the body, and so forth.
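To give a feel for that physics-based approach, here is a minimal mass-spring cloth sketch in Python. It is a toy illustration only, not the researchers’ 1987 method or Embodee’s technology; the grid size, constants, pinning scheme, and Verlet-style integration are all assumptions chosen for brevity.

```python
# Toy cloth simulation: a grid of particles linked by springs.
# Gravity pulls free particles down; distance constraints keep
# neighboring particles near a fixed rest length, so the sheet
# hangs and stretches like fabric. All values are illustrative.

GRID_W, GRID_H = 8, 6      # particles across and down
REST = 1.0                 # rest length between neighbors
GRAVITY = (0.0, -9.8)      # constant downward acceleration
DT = 0.016                 # time step (~60 steps per second)

def make_cloth():
    """Current and previous positions, for Verlet integration."""
    pos = {(x, y): (x * REST, -y * REST)
           for x in range(GRID_W) for y in range(GRID_H)}
    return pos, dict(pos)

def relax(pos, a, b):
    """Nudge two linked particles toward their rest distance."""
    ax, ay = pos[a]; bx, by = pos[b]
    dx, dy = bx - ax, by - ay
    dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
    corr = (dist - REST) / dist * 0.5
    if a[1] != 0:                      # top row stays pinned,
        pos[a] = (ax + dx * corr, ay + dy * corr)
    if b[1] != 0:                      # like a garment on a rod
        pos[b] = (bx - dx * corr, by - dy * corr)

def step(pos, prev):
    """One time step: apply gravity, then satisfy constraints."""
    for p in pos:
        if p[1] == 0:                  # skip pinned top row
            continue
        cx, cy = pos[p]
        px, py = prev[p]
        prev[p] = (cx, cy)
        # Verlet integration: new = 2*current - previous + a*dt^2
        pos[p] = (2 * cx - px + GRAVITY[0] * DT * DT,
                  2 * cy - py + GRAVITY[1] * DT * DT)
    for _ in range(3):                 # a few relaxation passes
        for (x, y) in list(pos):
            for n in ((x + 1, y), (x, y + 1)):
                if n in pos:
                    relax(pos, (x, y), n)

pos, prev = make_cloth()
for _ in range(100):
    step(pos, prev)
```

Real cloth simulators add wind, body collision, bending stiffness, and damping on top of this same particles-and-constraints core, which is what made them so demanding on 1987-era hardware.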

But because even the legendary Cray, the most powerful computer available at the time, couldn’t execute the required complex calculations fast enough, the simulations either failed or ran too briefly to be useful.

It would take 10 years for other scientists to improve the software and for computing power to catch up so that apparel would look and move realistically on animated characters. The public first saw the results in the 1997 Oscar-winning short Geri’s Game.

While working for the French visual effects innovator BUF Compagnie, our R&D director, Isabelle Haulin, helped create cloth simulation software later used in state-of-the-art feature films. The first was The Cell in 2000, which featured visual effects taken for granted today. But at the time, its simulation of a giant purple cape was groundbreaking.

Haulin has applied her film work to Embodee’s technology that renders high-fidelity 3D images of apparel, realistically simulating drape, wrinkles, and weave.

Integrated with the technology is cloth simulation software from Syflex. The company was started after its founder developed visual effects for the film Final Fantasy, and it now offers a widely used cloth simulation plug-in.

“Besides film production, cloth simulation is increasingly used in the gaming and apparel design industries as some user-friendly applications have become available,” Haulin said. “It also can handle a higher concentration of particles (geometric shapes) and increased garment complexity (multiple layers) at close to real time, improving rendering speed and realism.”

Virtual product experiences with apparel are sure to become even richer visually, not to mention faster, as computing power advances ever upward.