
A sampling is unlikely to place a point exactly on the peak of the cones. So, most probably, the reconstructed surfaces of the cones will be connected by one edge or a chain of edges which are crucial for the correct topology.

While the “ear” on the boundary is topological noise, the connection between the two reconstructed cones is essential for a correct reconstruction. Removing the chain of singular edges results, in the first case, in a correct 2D boundary, but in the second case it destroys the topology. We call this the singular edges problem.

Definition 4.14 (Singular Edges Problem). A reconstructed boundary has the singular edges problem if it contains singular edges with non-simple adjacent vertices.

Taking (α, β)-shape-reconstruction as an example, we propose to fill the chains of singular edges with a set of triangles in such a way that no additional singular edges occur and all existing singular edges become simple. Obviously, we want the smallest triangles possible.
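As an illustration of the detection step only, the following sketch collects candidate singular edges of a simplicial 2-complex, under the hedged reading that the singular edges forming an “ear” are dangling edges, i.e. edges that are not a face of any triangle; the function name and data layout are ours, not the thesis's.

```python
from itertools import combinations

def dangling_edges(triangles, edges):
    """Edges of a 2-complex that are not a face of any triangle.

    triangles -- iterable of 3-tuples of vertex ids
    edges     -- iterable of 2-tuples of vertex ids
    Assumption: 'singular edge' is read here as a dangling edge.
    """
    covered = {frozenset(pair)
               for tri in triangles
               for pair in combinations(tri, 2)}
    return [e for e in edges if frozenset(e) not in covered]
```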

Definition 4.15 (Minimal Expansion). Let Dα,β be the result of the thinned-(α, β)-shape-reconstruction method with the singular edges problem and let D0 be the set of singular edges. Then the minimal expansion of Dα,β is the minimal reconstruction which contains no singular edges and whose simple edges all lie in D0.

We expect the minimal expansion of the boundary to fill the boundary “ears” with triangles smaller than β, and relevant connections between reconstructed regions with triangles greater than β, since such an expansion would have to cross the erosion of the original region. So, the next thinning step is to collapse on simple edges if the expansion consists of triangles smaller than β, and to drop the expansion in the other cases.

Conjecture 4.16 (Contraction on Minimal Expansion). Let D+α,β be the minimal expansion of a result of the thinned-(α, β)-shape-reconstruction method. If all simplices in D+α,β are smaller than β, then D+α,β is covered by the q-dilation of the original boundary.

Notice that the “ears” on the boundary may be knotted. The minimal expansion of the chain of singular edges is then a non-manifold 2D surface. Contraction of the minimal expansion does not necessarily solve all problems discussed previously. So, we do not state in Conjecture 4.16 that a topologically-correct reconstruction results. In fact, the expected result is a reducible reconstruction. Consult Section 2.4.5 for an introduction.


In the next chapter we introduce a locally adaptive sampling criterion as a generalization of the sampling conditions defined for thinned-(α, β)-shape-reconstruction, as well as a new reconstruction method to handle this new sampling condition.

Chapter 5

Refinement Reconstruction

5.1 Introduction

In real data sets, the sampling density often varies spatially. For instance, a laser scan captures data only where the laser “touches” the object's surface, which means that this data acquisition device does not “see” the concavities in the surface. It follows that, even though the sampling density is set globally, the resulting data set will have gaps in the concavities of the original surface.

The thinned-(α, β)-shape-reconstruction introduced in Chapter 4 extends the concept of α-shapes in order to result in a one-to-one mapping between reconstructed regions and the original space partition.

The underlying sampling conditions are expected to be globally uniform. This condition is necessary to guarantee a correct parameter-setting for the reconstruction. The provided sampling parameters are used to set up the size of the boundary simplices as well as the minimal region size.

Our new reconstruction method is based on the 3D extension of the “Gabriel graph”, first introduced in [Gabriel and Sokal, 1969]. The Gabriel graph can be computed by removing those edges from the Delaunay complex whose diametral circle (the smallest circumscribing circle) contains further Delaunay vertices. As shown in [Edelsbrunner, 2003], this imitates the “flow curves”. The resulting set of Delaunay edges separates all discrete distance maxima.
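A minimal sketch of this construction, assuming a brute-force emptiness test per edge; the helper name gabriel_edges is ours, not the thesis's. It works in 2D and 3D.

```python
import numpy as np
from itertools import combinations
from scipy.spatial import Delaunay

def gabriel_edges(points):
    """Gabriel graph edges as a subset of the Delaunay edges.

    An edge (p, q) is kept iff the ball with diameter pq contains
    no further sample point.
    """
    points = np.asarray(points, dtype=float)
    tri = Delaunay(points)
    # Collect the undirected edges of the Delaunay complex.
    edges = {tuple(sorted(pair))
             for simplex in tri.simplices
             for pair in combinations(simplex, 2)}
    kept = []
    for i, j in edges:
        center = (points[i] + points[j]) / 2.0       # center of the diametral ball
        r2 = np.sum((points[i] - center) ** 2)       # squared radius
        d2 = np.sum((points - center) ** 2, axis=1)  # squared distances of all points
        d2[[i, j]] = np.inf                          # ignore the edge's own endpoints
        if np.all(d2 >= r2):                         # ball empty -> Gabriel edge
            kept.append((i, j))
    return kept
```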

As described in Section 2.4.10, the “Geomagic WRAP” algorithm [Edelsbrunner, 2003] results in the 3D extension of the Gabriel graph. Originally, the method separated the infinite background from the interior of the original shape, assuming that the original surface is manifold. The result of the algorithm is the set of minimal boundary triangles. In [Edelsbrunner, 2003] no guarantees on original topology preservation are given, but it is proven that the result is manifold and the reconstructed shape is contractible.

Methods like “Crust” [Amenta et al., 1998] or “Cocone” [Amenta et al., 2000a], or any of their derivatives, prove topology preservation using the fact that the sampling is very dense and the maximal sample point deviation from the boundary is very low. The required sampling conditions may vary according to the curvature of the original surface, but the ratio between the sampling density and the distance to the medial axis is constant and positive. Taking a small fraction of this ratio ensures that all sampling points lie much closer to the boundary, and are sampled much denser along it, than their distance to the original medial axis. This is only possible for smooth shapes.
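For reference, this is the standard ε-sampling condition of [Amenta et al., 1998], written in our notation, where lfs(x) denotes the local feature size, i.e. the distance from x to the medial axis M(Σ): a point set S ⊂ Σ is an ε-sample of the surface Σ if

\[
  d(x, S) \;\le\; \varepsilon \cdot \mathrm{lfs}(x)
  \quad \text{for all } x \in \Sigma,
  \qquad \mathrm{lfs}(x) = d\bigl(x, M(\Sigma)\bigr).
\]

Since lfs(x) tends to zero at sharp features, the condition forces unbounded sampling density there, which is why these guarantees apply to smooth shapes only.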

Although the medial axis is a complete shape descriptor, only a subset of it is required to preserve topological properties. For local maxima separation (cf. WRAP), the relevant subset is the set of local maxima. For homotopy preservation, the subset is the homotopy axis. The local region size and the local homotopical feature size (as introduced in Section 2.2.5) are greater than zero even for non-smooth surfaces. Now, in order to ensure that the sampling density and the maximal sampling point deviation are smaller than the distance values in the region's interior, we relate the sampling conditions to a feature size which is strictly greater than zero. So, in Section 5.3 we define locally variable sampling conditions which are finite for non-smooth surfaces.

In Section 5.2, we introduce the local maxima separation as a refinement. A refinement correctly separates the original local maxima on the continuous distance transform. We apply the WRAP algorithm to all discrete local maxima. If the underlying sampling ensures that the sampling density is less than the distance value of the nearest local continuous maximum, the outcome of the WRAP algorithm is a refinement.
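As a loose illustration of discrete local maxima, here is a sketch on a voxel grid rather than on the Delaunay-based discrete distance function the thesis uses; the function name is ours.

```python
import numpy as np
from scipy import ndimage

def discrete_local_maxima(region_mask):
    """Mark voxels that are local maxima of the discrete distance transform.

    region_mask -- boolean array, True inside the region.
    A voxel is a discrete local maximum if no neighbour in its
    3x3(x3) neighbourhood carries a larger distance value.
    """
    dist = ndimage.distance_transform_edt(region_mask)
    neighbourhood = np.ones((3,) * region_mask.ndim, dtype=bool)
    local_max = ndimage.maximum_filter(dist, footprint=neighbourhood)
    return region_mask & (dist == local_max)
```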

We define corresponding sampling conditions in Section 5.4. These new sampling conditions require the original local maxima to be surrounded by sampling points in such a way that the sampling density between the points is higher than the continuous distance values at the original local maxima. This guarantees that, starting at an original local maximum and following the steepest ascents on the discrete distance transform, we reach a local discrete maximum without crossing the original boundary.

We call these discrete maxima associated. Our new sampling conditions ensure the associated maxima to be correctly separated, which - as we prove in Section 5.5 - is equivalent to the statement: each original maximum is associated with a centered Delaunay cell and the circumcenter of this cell is inside the original region. So, by deleting centered cells from the Delaunay complex, we hit each original region at least once.
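A centered Delaunay cell is one whose circumcenter lies inside the cell itself; a minimal numeric check might look as follows (the helper names are ours, and the test works for a full-dimensional d-simplex in R^d).

```python
import numpy as np

def circumcenter(simplex):
    """Circumcenter of a d-simplex given as a (d+1) x d vertex array."""
    v = np.asarray(simplex, dtype=float)
    # Solve 2(v_i - v_0) . c = |v_i|^2 - |v_0|^2 for i = 1..d.
    A = 2.0 * (v[1:] - v[0])
    b = np.sum(v[1:] ** 2 - v[0] ** 2, axis=1)
    return np.linalg.solve(A, b)

def is_centered(simplex, eps=1e-12):
    """True iff the circumcenter lies inside the (closed) simplex."""
    v = np.asarray(simplex, dtype=float)
    c = circumcenter(v)
    # Barycentric coordinates of c: c = v_0 + sum_i lam_i (v_i - v_0).
    T = (v[1:] - v[0]).T
    lam = np.linalg.solve(T, c - v[0])
    return np.all(lam >= -eps) and (1.0 - lam.sum()) >= -eps
```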

There are different ways to correctly separate the local maxima. For our framework we require the boundary of the refinement to be minimal. As we define in Section 5.6, a minimal refinement separates the local maxima by the smallest boundary simplices. In Section 5.7, we prove that the WRAP algorithm applied to every discrete local maximum results in a minimal refinement.

In our framework the reconstruction is a refinement of the original space partition. A refinement is a special case of oversegmentation which preserves the correct local maxima separation. Still, the reconstruction may be an oversegmentation. Consider a shape with a narrowing: the interior contains two local maxima in the same original region. The refinement reconstruction subdivides the space, separating the associated maxima. In Section 5.8 we introduce a criterion which rejects boundary simplices that are inappropriate for separating regions. Deleting such edges results in merging reconstructed regions and - as we prove - still in a minimal refinement. We call this region-merging step the refinement reduction.

The sampling conditions defined according to the local region size may lose the information about which local maxima belong to the same region. Furthermore, the sampling is too sparse to guarantee that the removal of certain boundary simplices reduces the refinement to a topological equivalent of the original space partition. In Section 5.9 we introduce a new stability definition. A refinement is stable if its boundary does not intersect the original homotopical axis. A refinement is reducible if it is a superset of a stable refinement.

We introduce the new sampling conditions in Section 5.10, and in Section 5.11 we prove that refinement reconstruction results in a reducible refinement if the sampling is defined using the local homotopical feature size. In Section 5.12, we undertake the particular reconstruction steps on three different samplings of the same 2D shape.

The evaluation of the new reconstruction method is done in three steps. In Section 5.13, we compare the sampling conditions and the underlying shapes. The experimental results are presented in Section 5.14. We evaluate the new reconstruction method on well-known laser range scan data from “The Stanford 3D Scanning Repository” and on data sets resulting from X-ray computed tomography imaging (using 3D Canny edge detection to extract a point cloud). In Section 5.15, we discuss the effects of very sparse sampling conditions and the amount of noise that can be handled by our algorithm, and make propositions for future work; in Section 5.16, we summarize our new results.

Refinement reconstruction is both theoretically and experimentally solid and gives a fundamental result which has advantages even over thinned-(α, β)-shape-reconstruction. The underlying shapes may be non-manifold multiregional surfaces. The method is guaranteed to result in a reducible refinement, even if the sampling is very sparse and corrupted by noise. The approach expects sampling