Heuristics for 3D model decomposition


Background reading

I did some more rigorous reading of some decomposition papers, particularly the techniques presented in [1]. I'm using the ideas in [1] and [2] as the basis for my decomposition algorithm.

Understanding SWIFT++

I spent a considerable amount of time continuing to understand the structure of SWIFT++, and in particular the decomposition program that comes with it. Once I understood where the decomposition algorithms come into play, I began to implement my own idea for convex decomposition. A brief description of my algorithm is given below.

Previous Algorithm

As my algorithm is based on the algorithms presented in [1] and [2], I will first briefly describe those. They fall into the class of convex decomposition algorithms known as flooding heuristics. Flooding heuristics are incremental: starting from a seed face, a convex surface patch is grown by repeatedly adding adjacent faces, typically by traversing the dual graph of the surface.

In [1], Chazelle et al. divide flooding heuristics into two sub-groups: greedy flooding and controlled flooding. Greedy flooding heuristics collect faces until no adjacent face can be added without violating the convexity of the patch currently being constructed. Controlled flooding heuristics use additional stopping rules beyond simple convexity violation. The algorithm I am working on falls into the controlled flooding category.
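As a concrete sketch, greedy flooding over the dual graph might look like the following Python fragment. This is hypothetical illustration code, not taken from [1] or SWIFT++, and the convexity test is simplified to a per-dual-edge convexity flag, whereas a real implementation must test the convexity of the whole patch:

```python
from collections import deque

def greedy_flood(faces, adjacency, edge_is_convex):
    """Greedy flooding sketch: grow each patch by BFS over the dual
    graph (faces = nodes, shared edges = arcs), adding an adjacent
    face only while the patch stays convex. Convexity is approximated
    here by requiring every crossed dual edge to be convex -- a
    simplification of the true patch-convexity test."""
    unassigned = set(faces)
    patches = []
    while unassigned:
        seed = min(unassigned)      # deterministic, arbitrary seed choice
        patch = {seed}
        unassigned.remove(seed)
        frontier = deque([seed])
        while frontier:
            f = frontier.popleft()
            for g in adjacency[f]:
                if g in unassigned and edge_is_convex[frozenset((f, g))]:
                    patch.add(g)
                    unassigned.remove(g)
                    frontier.append(g)
        patches.append(patch)
    return patches
```

For example, on a strip of four faces whose middle edge is non-convex, the flood splits the strip into two patches of two faces each.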

One of the decomposition algorithms used in SWIFT++ (cresting BFS) is a flooding algorithm with a seeding technique that prioritizes the dual-graph nodes from which the search starts. Once the nodes have been prioritized, convex patches are created by adding faces to the current patch until a convexity constraint is violated (the constraints are explained in more detail in [2]). The seeding technique chooses start faces that are distant from non-convex edges of the surface, giving each patch room to grow as large as possible and thereby keeping the number of patches small.
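The seeding step can be illustrated with a multi-source BFS that assigns each face its dual-graph hop distance to the nearest face touching a non-convex edge. This is a hypothetical sketch of the idea, not SWIFT++'s actual code; cresting BFS would then start patches at the most distant faces:

```python
from collections import deque

def seed_priorities(adjacency, nonconvex_faces):
    """Multi-source BFS over the dual graph: each face's priority is
    its hop distance to the nearest face incident to a non-convex
    edge. A cresting-BFS-style scheme would seed patches at the
    faces with the *largest* distance, so they have room to grow."""
    dist = {f: 0 for f in nonconvex_faces}
    frontier = deque(nonconvex_faces)
    while frontier:
        f = frontier.popleft()
        for g in adjacency[f]:
            if g not in dist:
                dist[g] = dist[f] + 1
                frontier.append(g)
    return dist
```

On a chain of five faces with a non-convexity at face 0, the farthest face (4) would be chosen as the first seed.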

My Algorithm

The algorithm I have been implementing is similar to the cresting BFS algorithm in SWIFT++, but I use the seeding technique in the opposite direction. The goal is not necessarily to create the minimum number of convex pieces, but to create pieces of as equal a size as possible. Size can mean the number of faces or their collective area (other heuristics are possible, but these are what I have started with). The idea is to seed from the faces closest to non-convexities in the polyhedron, since patches started there are unlikely to grow large in any case. Each time faces are added to a piece, its size is checked; if it exceeds some threshold (based roughly on the number of non-convexities present in the polyhedron), the piece stops growing and a new piece is started from the next seed face. It is not yet clear how well this algorithm will perform at creating equal-sized pieces. My implementation is well under way but not yet functional, probably because I am not using some of the SWIFT++ data structures properly.
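A rough Python sketch of this controlled flooding variant, under the same simplifying assumptions as above (per-edge convexity test, size measured as face count; hypothetical names, not my actual SWIFT++-based implementation):

```python
from collections import deque

def equal_size_decompose(adjacency, edge_is_convex, dist_to_nonconvex, max_size):
    """Controlled flooding sketch: seeds are taken *closest* to
    non-convexities first, and a patch stops growing once it reaches
    max_size faces (the size threshold) or can no longer stay convex.
    Size here is face count; an area-based size would be analogous."""
    order = sorted(adjacency, key=lambda f: dist_to_nonconvex.get(f, float("inf")))
    unassigned = set(adjacency)
    patches = []
    for seed in order:
        if seed not in unassigned:
            continue                      # already absorbed by an earlier patch
        patch = [seed]
        unassigned.remove(seed)
        frontier = deque([seed])
        while frontier and len(patch) < max_size:
            f = frontier.popleft()
            for g in adjacency[f]:
                if g in unassigned and edge_is_convex[frozenset((f, g))]:
                    patch.append(g)
                    unassigned.remove(g)
                    frontier.append(g)
                    if len(patch) >= max_size:
                        break             # size threshold reached
        patches.append(patch)
    return patches
```

On a fully convex chain of six faces with the non-convexity priorities increasing along the chain and a threshold of three, this yields two pieces of three faces each, which is the equal-sized behavior the heuristic aims for.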

Future Work

An initial implementation should be done soon, after which testing can begin. I plan to compute various statistics about the decompositions, including the number of resulting convex pieces and their average size. I also hope to run timing tests using SWIFT++ similar to those done in [2]. One possible extension to the algorithm above would be to recompute the priorities after each piece finishes growing, so that the remaining pieces come out closer in size to one another.
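As a trivial example of the kind of statistics in question, a hypothetical summary function might report piece count, mean size, and a max/min size ratio as a crude measure of how equal-sized the pieces are:

```python
def decomposition_stats(patches):
    """Hypothetical summary statistics for a decomposition: number of
    pieces, mean piece size, and the max/min size ratio (1.0 means
    perfectly equal-sized pieces)."""
    sizes = [len(p) for p in patches]
    return {
        "num_pieces": len(sizes),
        "mean_size": sum(sizes) / len(sizes),
        "size_ratio": max(sizes) / min(sizes),
    }
```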

References

[1] Chazelle, B. et al. Strategies for polyhedral surface decomposition: An experimental study, Comp. Geom. Theory Appl., 7:327-342, 1997.

[2] Ehmann, Stephen A., Lin, Ming C. Accurate and Fast Proximity Queries Between Polyhedra Using Convex Surface Decomposition, EUROGRAPHICS 2001.

[3] Kim, Young J., Otaduy, Miguel A., Lin, Ming C., Manocha, Dinesh. Fast Penetration Depth Computation for Physically-based Animation, ACM Symposium on Computer Animation, July 21-22, 2002.

[4] Lin, Ming C., Canny, John F. A Fast Algorithm for Incremental Distance Calculation, Proceedings of the 1991 IEEE International Conference on Robotics and Automation, April 1991.

[5] Chazelle, B. Convex Partitions of Polyhedra: A Lower Bound and Worst-Case Optimal Algorithm, SIAM J. Comp., Vol. 13, No. 3, August 1984.

[6] Bajaj, C. L., Dey, T. K., Convex Decomposition of Polyhedra and Robustness, SIAM J. Comp., Vol. 21, No. 2, April 1992.

[7] Kim, Young J., Lin, Ming C., Manocha, Dinesh. DEEP: Dual-space Expansion for Estimating Penetration Depth Between Convex Polytopes, Proc. IEEE International Conference on Robotics and Automation, May 11-15, 2002.