CS 184: Computer Graphics and Imaging

Henry Xu • Samuel Paradis • Ben Eisenberg

Spring 2019 • UC Berkeley

Midpoint Video • Google Slides • Final Video

Final Project: Mop Simulator

Abstract

For our final project, we wanted to augment our real-time cloth simulator and turn it into a mop. To do so, we built a simulation that improves on the Project 4 cloth simulator to (1) handle more complicated cloth structures, (2) accurately simulate water permeating throughout the cloth, and (3) be interactive by allowing the user to move the cloth. We primarily operated in the project 4 framework for the cloth/mop itself. First, we reworked the shape of the cloth so that it would look more mop-like, and modified the project 4 rendering code and self-collision handling so that both could cope with the mop's separate strands. We then implemented a physics-based water permeation simulator that propagates water throughout the cloth based on both sharing between neighbors and gravitational effects. Finally, we created an interactive UI that supports mouse click-and-drag movement, based on the ideas behind screen/world space and projection that we used in project 3. After completing these tasks, we created our own pseudo-particle simulator that adds visible water droplets, and further improved water saturation and permeation.

Technical Approach

a. Making the Mop

In terms of mop construction, we built the cloth using the project 4 cloth simulator as a backend. We modified the construction function from part 1 to form a circular shape rather than a square one, generating positions with sine/cosine (polar) coordinates. This modification let us do some interesting things with the mop's structure: the cloth now essentially forms a tiered cake, with rings of PointMasses. We pinned the top ring so that the mop would maintain a cone-like structure when it fell into its movable position.
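As a rough sketch of the construction (the type stubs and names like build_mop here are illustrative, not our exact code):

```cpp
#include <cmath>
#include <vector>

// Minimal stand-ins for the simulator's types, for illustration only.
struct Vector3D { double x, y, z; };
struct PointMass { Vector3D position; bool pinned; };

// Each ring of the "tiered cake" is laid out with sine/cosine, and the
// entire top ring is pinned so the mop falls into a cone-like shape.
std::vector<PointMass> build_mop(int num_rings, int points_per_ring,
                                 double radius, double ring_spacing) {
  const double PI = std::acos(-1.0);
  std::vector<PointMass> point_masses;
  for (int r = 0; r < num_rings; r++) {
    double y = -r * ring_spacing;  // rings descend from the pinned top ring
    for (int p = 0; p < points_per_ring; p++) {
      double theta = 2.0 * PI * p / points_per_ring;
      Vector3D pos{radius * std::cos(theta), y, radius * std::sin(theta)};
      point_masses.push_back({pos, r == 0});  // pin only the top ring
    }
  }
  return point_masses;
}
```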

Following this, we created spaces between the different strands of the mop by removing all the springs at every eighth PointMass. We chose eight as the dividing point because it produced strands large enough to support interesting textures, but small enough that runtime wasn't too heavily impacted. These gaps required us to rederive the spring counts (so the wireframe could render) and to rebuild the triangle matrix from scratch (so that textures could render). We also remade the (u, v) coordinate system so that each strand of the mop gets the same texture mapped onto it (rather than spreading one texture over the whole mop).
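A minimal sketch of the boundary rule (the width of 8 matches our choice; the names are hypothetical): a spring between two ring columns is only created when both columns lie in the same strand.

```cpp
#include <cassert>

// Springs that would cross a strand boundary (every eighth column) are
// simply never created, splitting the cloth into separate strands.
const int STRAND_WIDTH = 8;

bool same_strand(int col_a, int col_b) {
  return (col_a / STRAND_WIDTH) == (col_b / STRAND_WIDTH);
}

int main() {
  assert(same_strand(1, 2));   // interior of a strand: keep the spring
  assert(!same_strand(7, 8));  // crosses a strand boundary: skip the spring
  return 0;
}
```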

The mop before it falls, demonstrating the shape of the cloth.

We had a lot of difficulty implementing self-collision. We started with the project 4 self-collision system as a baseline and adjusted it to operate less strictly: when collisions were handled strictly, the mop would come to life on its own and wave around, so we loosened the rules. This led to some odd moments for the mop, however, so based on this paper we also experimented with categorizing collision types. That exploration wasn't very fruitful; it was too complicated and slowed down runtime considerably. Self-collisions are definitely an aspect of this project we'd improve further if we had more time.

The mop after we run the simulation (but before any movement).

b. Implementing Wetness/Shading

Implementing water propagation physics involved three main steps: integrating saturation into the cloth, implementing permeation due to adjacent PointMass sharing and gravitational effects, and finally, building an OpenGL shader to render the saturated cloth.

Integrating saturation into the cloth involved adding a saturation attribute to each PointMass and then, at each step, adjusting the mass of each PointMass to account for the water 'inside' of it. The saturation itself is also updated at each step based on the previous state.
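A one-line sketch of the mass adjustment (WATER_MASS_PER_SAT is an illustrative tuning constant, not our actual name or value):

```cpp
// Each step, a PointMass weighs its dry mass plus the water it holds.
double effective_mass(double dry_mass, double saturation) {
  const double WATER_MASS_PER_SAT = 0.5;  // mass contributed per unit of saturation
  return dry_mass + WATER_MASS_PER_SAT * saturation;
}
```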

Next, we needed the water to permeate through the cloth in a realistic way, which means updating the saturation of the current state based on the previous state. This involved adding a delta_sat attribute to each PointMass, populating delta_sat based on the current state, and then updating the saturation to saturation + delta_sat. delta_sat accounts for two effects: sharing between PointMasses, and gravity. In terms of saturation sharing, we wanted to capture the fact that water is not static within a cloth: it permeates throughout. We implemented this by iterating over all non-bending springs, which cover every pair of adjacent PointMasses, and transferring water from one PointMass to the other as a function of the saturation difference between them. The rate at which water is shared is adjustable, allowing us to simulate liquids with different internal viscosities. In terms of gravity, we take the dot product between the external forces and the vector between two PointMasses: the sign determines the direction water will flow, and the magnitude determines the amount. This lets water generally travel in the direction the external forces push or pull it. Each PointMass only shares its liquid if its saturation is above the 'lower sharing limit', simulating the fact that even when water moves to the bottom of a cloth, the rest of the cloth stays wet.
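A condensed sketch of one permeation step under these rules (all types, names, and constants are illustrative stand-ins for our actual code):

```cpp
#include <vector>

struct Vector3D {
  double x, y, z;
  Vector3D operator-(const Vector3D& o) const { return {x - o.x, y - o.y, z - o.z}; }
  double dot(const Vector3D& o) const { return x * o.x + y * o.y + z * o.z; }
};
struct PointMass { Vector3D position; double saturation = 0, delta_sat = 0; };
struct Spring { PointMass *a, *b; };

const double SHARE_RATE = 0.10;   // diffusion speed; lower simulates a more viscous liquid
const double GRAVITY_RATE = 0.05; // how strongly external forces pull water along a spring
const double LOWER_LIMIT = 0.20;  // saturation below which a PointMass stops sharing

void permeate(std::vector<Spring>& springs, std::vector<PointMass>& pms,
              const Vector3D& external_force) {
  // Pass 1: accumulate flows into delta_sat using only the previous state.
  for (Spring& s : springs) {  // bending springs assumed already excluded
    double flow = SHARE_RATE * (s.a->saturation - s.b->saturation);
    Vector3D ab = s.b->position - s.a->position;
    flow += GRAVITY_RATE * external_force.dot(ab);  // sign = direction, magnitude = amount
    PointMass* src = (flow > 0) ? s.a : s.b;        // positive flow runs a -> b
    if (src->saturation <= LOWER_LIMIT) continue;   // drier points keep their water
    s.a->delta_sat -= flow;
    s.b->delta_sat += flow;
  }
  // Pass 2: apply the accumulated deltas.
  for (PointMass& pm : pms) {
    pm.saturation += pm.delta_sat;
    pm.delta_sat = 0;
  }
}
```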

Lastly, we needed a shader to visualize the saturation of the cloth. We implemented multiple shaders, all with the same fundamental principle: the darkness of the color at a vertex is a function of its saturation. This let us simulate the cloth darkening where water is present and remaining 'dry' where it is not.
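Our real shaders live in GLSL, but the core rule is simple enough to sketch in C++ (the DARKENING constant is an illustrative tuning knob): scale the base color toward black as saturation rises.

```cpp
struct Color { float r, g, b; };

Color shade_vertex(Color base, float saturation) {
  const float DARKENING = 0.7f;                 // tunable wet-darkening strength
  float scale = 1.0f - DARKENING * saturation;  // 1 = dry, smaller = wetter/darker
  if (scale < 0.0f) scale = 0.0f;
  return {base.r * scale, base.g * scale, base.b * scale};
}
```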

Water permeating throughout the cloth, sped up to emphasize movement. Notice how the water quickly propagates throughout the cloth, but generally moves in the direction of gravity. Also note that the entire cloth stays "wet" because gravity does not pull all the water from the top to the bottom; this is due to our 'lower sharing limit'.

Part of the shader also reduces fuzziness when the cloth gets wet. Here is a drastic example, where the center of the cloth is completely saturated and the edges are completely dry: notice the displacement/fuzziness on the edges, versus the flat, non-fuzzy appearance of the saturated center.

c. Implementing User Interaction

A mop on its own cannot clean spills (at least not yet), so our next step was to implement interactivity to allow for human intervention. With their ambidexterity and eye for cleanliness, humans are uniquely suited to operating cleaning accoutrements. Unfortunately, they aren't able to neurologically interface with machines (also at least not yet), so our first course of action was to figure out how to translate their mouse movements in the viewport into locations in the actual scene.

Enter the World to Screen Space Projection (Identification)

To convert between world space and screen space, due to difficulties working with unprojection (detailed extensively in the next section), we mapped each PointMass's location in world space to the screen by multiplying each position by the model and view-projection matrices, then compared against the cursor location to approximate user intent. Although perhaps less efficient and less accurate than a screen-to-world-space approach, the benefit of an increased understanding of what occurs during the projection process outweighed the slightly decreased framerate.
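A sketch of this identification pass (assuming Eigen-style matrices, which nanogui uses; the function and parameter names are hypothetical): project every PointMass into screen space and pick the one nearest the cursor.

```cpp
#include <Eigen/Dense>
#include <limits>
#include <vector>

int nearest_point_mass(const std::vector<Eigen::Vector3f>& positions,
                       const Eigen::Matrix4f& mvp,  // viewprojection * model
                       float screen_w, float screen_h,
                       float cursor_x, float cursor_y) {
  int best = -1;
  float best_d2 = std::numeric_limits<float>::max();
  for (int i = 0; i < (int)positions.size(); i++) {
    Eigen::Vector4f clip = mvp * positions[i].homogeneous();
    if (clip.w() <= 0.0f) continue;                   // behind the camera
    Eigen::Vector3f ndc = clip.head<3>() / clip.w();  // perspective divide
    float sx = (ndc.x() * 0.5f + 0.5f) * screen_w;    // NDC -> pixel coords
    float sy = (1.0f - (ndc.y() * 0.5f + 0.5f)) * screen_h;  // y axis flips
    float dx = sx - cursor_x, dy = sy - cursor_y;
    float d2 = dx * dx + dy * dy;
    if (d2 < best_d2) { best_d2 = d2; best = i; }
  }
  return best;
}
```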

At this point, we have a rough gauge of human intention: we know what on the screen they want to interact with, but we lack a mechanism to deliver satisfaction to them. As frequently observed in nature, humans respond positively to feedback, which is what we sought to provide next.

Far From Home (Movement)

To preserve realistic cloth motion given user input, we mapped cursor movement to changes in external acceleration, as opposed to pure changes in location, which would require a not insubstantial amount of interpolation. In a sense, our scheme also interpolates, but implicitly rather than explicitly. What was just a cloth in a fixed location in project 4 is now a cloth capable of moving in 3D space. The world is its oyster! (Well, the world confined within the viewport.)

One interesting issue arose from the fact that we were dealing with a 2D input axis (x, y) but outputting a 3D force vector. For a given camera angle, we constrain movement to the scene's (x, y) axes, but which scene axes those correspond to changes as the camera rotates.
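A minimal sketch of that mapping (names and the sensitivity parameter are illustrative): the 2D cursor delta becomes an acceleration along the camera's right and up directions, so camera rotation changes which scene axes respond to a drag.

```cpp
#include <Eigen/Dense>

Eigen::Vector3f drag_to_acceleration(float dx, float dy,
                                     const Eigen::Vector3f& cam_right,
                                     const Eigen::Vector3f& cam_up,
                                     float sensitivity) {
  // Screen y grows downward, so negate dy to make upward drags lift the mop.
  return sensitivity * (dx * cam_right - dy * cam_up);
}
```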

The moving mop! Notice how the water swishes around in the cloth in response to user movement, and how the wetter parts of the mop drag along the bottom. Movement is intuitive and works great!

Our brief attempt at philosophy: what good is a mop if there’s nothing to clean up? In other words, the motivation for the next section.

d. Particle Simulation

Upon realizing that mops are usually meant to clean up messes, we needed a way to generate scenes for our mop to mop up. A fully fledged simulator would've been ideal, but due to time constraints, properly exploring and implementing a nice particle simulation was just not in the cards--we did spend a day trying to adapt the water simulation found here, but could not reconcile the JavaScript implementation with our C++ codebase. Ultimately, we settled on a naive particle simulation: we added simulation functionality to the original spheres in the scene, subjected them to external forces using a truncated form of Verlet integration, and enhanced their interaction with the cloth by visualizing absorption. Finally, like many things in life, our mop is not perfect, so we also included a way to generate particles due to leakage or oversaturation of the material. Particle simulation results can be seen later in the page.
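A sketch of the particle update (a Verlet-style step in the spirit of the project 4 cloth integrator, with the damping term dropped; the exact truncation and all names here are illustrative):

```cpp
#include <vector>

struct Vec3 { double x, y, z; };
struct Particle { Vec3 pos, last_pos; };

void step_particles(std::vector<Particle>& particles,
                    const Vec3& accel, double dt) {
  for (Particle& p : particles) {
    Vec3 cur = p.pos;
    // x_{t+1} = x_t + (x_t - x_{t-1}) + a * dt^2
    p.pos.x += (cur.x - p.last_pos.x) + accel.x * dt * dt;
    p.pos.y += (cur.y - p.last_pos.y) + accel.y * dt * dt;
    p.pos.z += (cur.z - p.last_pos.z) + accel.z * dt * dt;
    p.last_pos = cur;
  }
}
```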

Problems, Solutions, and Lessons Learned

Problems & Solutions

One of the main problems we encountered was balancing efficiency against graphics quality. Since our project is interactive, we needed it not to lag. This led us to cut down on the number of mop strands we used, as well as to pare back our self-collision and fluid algorithms.

Another difficulty was the box of mysteries that is OpenGL. Hoping to avoid the path less travelled, we looked to the many questions and answers left in the wake of new game developers also learning the ropes. For example, many of the resources suggested gluUnProject as a nifty way to convert from window to object coordinates--who knew one function could do it all! After following a tutorial (or three) to load the relevant OpenGL pointers and properly supply arguments to the function, a problem quickly became apparent: the function didn't exist. In fact, many of the newer loaders introduced didn't exist either. What happened next was a deep dive into the world of OpenGL loaders and the intricacies (and limitations) of nanogui. After playing with different glad loader configurations to little avail, we knew a new approach had to be taken.

Two novel ideas emerged. First, why not just do it manually? We knew what unproject did--invert the view and projection matrices and apply them to the screen-space coordinates. The post-processing steps were also completely within reach: dividing the x, y, and z coordinates by the fourth (w) entry of the resulting vector. A few implementation woes later, we had our second idea: since we're looking for the nearest PointMass, instead of projecting the screen into world space, why not turn the whole idea on its head and project the world-space positions of the PointMasses into screen space? At the expense of some efficiency, this gave us something significantly easier to understand. Furthermore, after digging into the guts of nanogui, we found that many of the functions found in loaders such as GLU had been reimplemented by nanogui, and we modified our existing code accordingly. Although we didn't have gluUnProject, we had alternatives that were (more importantly) functional. From there, the rest is history.

Particle simulation, given its naivety, went off mostly without trouble. However, we did experience issues with particles clipping through the cloth due to the low PointMass count imposed by performance constraints. In a similar vein, the low PointMass count also created visual quirks such as hexagonal absorption shapes.

On the subject of absorption, a lot of time was invested in making the cloth and water permeation look as realistic as possible. While the biggest challenge here was implementing a physics-based water permeation model, completing that model was only the beginning. For the water to look realistic, it required extensive parameter tuning: the rate at which external forces pull water, the rate at which water is shared between PointMasses, the mass of the water, and the effect of saturation on the output color of the shader. All of these variables interacted with each other, and it was always a challenge to find a combination that accurately depicted the cloth and fluid we desired.
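For a sense of the surface area involved, the knobs roughly correspond to a parameter block like this (the grouping, names, and the idea of a single struct are illustrative, not our actual code):

```cpp
struct FluidParams {
  double gravity_pull_rate;  // rate external forces pull water along springs
  double share_rate;         // rate water diffuses between adjacent PointMasses
  double water_mass;         // mass each unit of saturation adds to a PointMass
  double shade_darkening;    // how strongly saturation darkens the shader output
  double lower_share_limit;  // saturation floor below which sharing stops
};
```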

Lessons Learned

In short, we learned a lot about a wide gamut of subjects in computer graphics. It may have been a case of jack of all trades, master of none, but we got a chance to revisit old ideas and touch on new ones in ways we would never have anticipated when we embarked on the journey. We have a newfound respect for efficient particle simulation and hope to improve on ours if we have time in the future. The trials and tribulations of making an interactive UI, and the delicate juggling act of balancing both a movable camera and a movable object, produced a pang of regret that we didn't pay more attention during the transformation lecture. In terms of life lessons, we learned painfully that short-term solutions often lead to long-term problems, as hours of debugging can attest. Finally, we learned how to take an ambitious project and break it down into manageable chunks in order to achieve all our goals!

Results

Mop

Water permeating throughout the cloth, sped up to emphasize movement. Notice how the water quickly propagates throughout the cloth, but generally moves in the direction of gravity. Also note that the entire cloth stays "wet" because gravity does not pull all the water from the top to the bottom; this is due to our 'lower sharing limit'.

Due to the much greater complexity of the actual mop, we could only render scenes with it using a very low PointMass count--and as previously noted, a low PointMass count means particles can easily clip through the fabric when the PointMass-sphere collision goes undetected. While we did experiment with increasing the detection radius, it made for some especially unnatural-looking scenes. In order to fully demonstrate our water permeation model and our cloth-particle physics, we switched back to the original cloth structure for the final deliverables. Even in the following scenes, the PointMass count is still a little less than ideal, resulting in absorption that looks blocky--if only we were aiming for a Minecraft aesthetic.

Water Permeation & Particle Simulator

a. Wine Stain Simulator

Wine is a thin, nonviscous liquid. While the particles themselves are dark, wine quickly permeates through cloth to form light purple stains; we observe that behavior below.

Wine particles being absorbed and permeating through the cloth. At least it's not on a nice dress shirt!

b. Blood Stain Simulator

Blood, while the same color as wine, acts very differently! It is much more viscous and generally leaves dark stains. We achieve this by modifying the parameters that govern the particle movement and viscosity within the cloth. The results are quite different from wine's!

That'll be hard to get out! Blood particles are absorbed into the cloth and stick mostly to their absorption location. Better get some Tide (not sponsored)!

Included in the particle simulator is the liquid dripping from the cloth. Here is a better view of that:
Drip drip! As the blood drips out, the amount of saturation in the cloth decreases.

c. Mud Stain Simulator

While the other two particle simulations were for light liquids, the simulator also works for denser particles, such as mud!

We see the cloth act as a rag, cleaning mud off the ground. Notice how the cloth clumps up while absorbing the dirt; the mud is massive, so as the cloth absorbs more of it, the leading edge becomes heavier, causing the folding seen above.
Notice the craters the mud particles leave on the cloth when they strike it--nothing a good detergent can't fix, though.

Final Information

References

http://movement.stanford.edu/courses/cs448-01-spring/papers/volino.pdf
http://www.cs.columbia.edu/cg/wetcloth/

Team Contributions

Ben worked on the core mop construction and fundamental frameworks, ensuring we had a very solid foundation to start from. He also created both the mid-project and final videos, and extensively investigated our issues with self-collision.
Sam implemented the physics-based water permeation model, built the core water shader, created the web pages, and generated the deliverables. He also worked on the PowerPoint presentation.
Henry worked on the user interaction aspect and the fluid simulation that allowed the cloth to visually experience saturation. He also worked on deliverable generation and assorted quality-of-life improvements (e.g., leakage, enhanced JSON parsing, and slight shader tweaks).

Everyone worked on general debugging and on integrating our parts together so they didn't cause bugs. Moral support was in high supply thanks to Sam's ability to produce beverages on command--for which a mop would've been useful at times.

Last Remark

We would like to thank the CS 184 staff for an amazing semester. To show our gratitude, we would like to close with a haiku:

I am a dry mop
Please soak me in some fluid
Then wring me out nice
