
9.2 Scalar Data Visualization

9.2.3 Time-Dependent Data

Since ecosystems are not static, but rather exhibit complex dynamics over time, the analysis of their evolution, based on the evaluation of data at different points in time, is a valuable source of information. We are also able to handle time-dependent data and thus visualize the evolution of an ecosystem over time. A time-dependent data set is usually given as a set of scalar or label fields:

\[
S : [t_0, t_1, \dots, t_N] \times D \subset \mathbb{R}^2 \to \mathbb{R} \qquad (9.3)
\]

It can be visualized using the same mapping technique described above, the only additional requirement being a proper interpolation function over time. We use linear interpolation on the time axis and thus tri-linear interpolation in the whole data set, as it is readily available in graphics hardware. Using this interpolation, one can compute the values of the data set at a given time and visualize them as in the previous section:

\[
S(t) = (1 - u_t)\, S_{t_i} + u_t\, S_{t_{i+1}}, \qquad t_i \le t < t_{i+1}, \quad u_t = \frac{t - t_i}{t_{i+1} - t_i} \qquad (9.4)
\]
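As an illustration only, the following minimal CPU-side sketch evaluates the temporal blend of Equation 9.4 between two consecutive time slices of the scalar field; on the GPU the same blend is obtained for free through tri-linear texture filtering. The struct and function names are hypothetical, not part of the actual system.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical container for one time slice S_{t_i} of the scalar field,
// sampled on a regular width x height grid over the domain D.
struct ScalarSlice {
    int width = 0, height = 0;
    std::vector<float> values;                       // row-major samples
    float at(int x, int y) const { return values[y * width + x]; }
};

// Linear interpolation in time (Eq. 9.4):
// S(t) = (1 - u_t) * S_{t_i} + u_t * S_{t_{i+1}},  u_t = (t - t_i) / (t_{i+1} - t_i).
float sampleScalarField(const ScalarSlice& si, const ScalarSlice& sj,
                        float ti, float tj, float t, int x, int y)
{
    assert(si.width == sj.width && si.height == sj.height);
    const float u = std::clamp((t - ti) / (tj - ti), 0.0f, 1.0f);
    return (1.0f - u) * si.at(x, y) + u * sj.at(x, y);
}
```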

9.3 Implementation

The mapping of scalar or label fields onto visual features as described above is well suited for our rendering pipeline, since all visual features can either be seamlessly controlled by parameters in the range [0.0 ... 1.0] or specified by the user as textures (hatch and silhouette strokes). However, as described in detail in Chapter 8, visual controls are implemented in different stages of the rendering pipeline, which in turn influences the way data is made available to the pipeline. Among the features shown in Table 9.1, abstraction is mostly performed in the early stage of the pipeline (vertex shader), while the others are applied in post-processing.
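As a sketch of the mapping itself (the feature names and value range below are hypothetical placeholders, not the system's actual controls), the core operation is simply rescaling a scalar sample into the [0.0 ... 1.0] control range of whichever visual feature the user has assigned to that data channel:

```cpp
#include <algorithm>

// Hypothetical set of post-processing controls, each expected in [0.0, 1.0].
struct VisualFeatureParams {
    float hatchDensity    = 0.0f;  // density of hatch strokes
    float colorBlend      = 0.0f;  // blend factor towards a data colormap
    float silhouetteStyle = 0.0f;  // selects/blends the silhouette stroke style
};

enum class Feature { HatchDensity, ColorBlend, SilhouetteStyle };

// Map a scalar sample s from its data range [sMin, sMax] onto the
// [0.0, 1.0] control range of one user-selected visual feature,
// mirroring the one-channel-per-feature idea of Table 9.1.
void mapScalarToFeature(float s, float sMin, float sMax,
                        Feature target, VisualFeatureParams& params)
{
    const float v = std::clamp((s - sMin) / (sMax - sMin), 0.0f, 1.0f);
    switch (target) {
        case Feature::HatchDensity:    params.hatchDensity    = v; break;
        case Feature::ColorBlend:      params.colorBlend      = v; break;
        case Feature::SilhouetteStyle: params.silhouetteStyle = v; break;
    }
}
```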

While, of course, feeding the scalar field at the beginning of the pipeline would ensure that all features can be used for mapping, this raises serious complications: sampling the scalar data from inside the vertex shader is not straightforward and would put a significant burden on the rest of the pipeline. Thus, also given that the goal is not an exhaustive system but rather an exemplary visualization, we decided to use only the post-processing stage for data visualization, leaving a more complete implementation to a possible future commercial system.

In this case, the data can be provided as textures in graphics card memory, benefiting from the fast sampling and processing of the graphics hardware. The workflow is presented in Fig. 9.2 and uses the depth buffer to perform a projective texturing of the scene. For each pixel in the frame buffer, the computation proceeds in the following steps:

• Using the frame buffer depth data, a 3D position in camera space is computed by applying the inverse of the projection matrix.

• The 3D location in world space is computed by applying the inverse of the modelview matrix.

• Projective texturing is used to sample the data texture at the 3D location. The projection is similar to that used for the terrain (orthogonal, along the upward-pointing z axis). A minimal code sketch of these steps is given after the figure caption below.

Figure 9.2: Embedding non-visuals in the landscape: the depth buffer is used to retrieve the world-space position at each pixel in order to projectively map a scalar field onto the scene (onto the color attribute in this example).
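The fragment below is a minimal CPU-side sketch of the three steps above, written with the glm math library for illustration. In the actual pipeline this is performed per pixel in a post-processing shader; the function and parameter names, as well as the orthographic bounds of the data field, are assumptions made for this example.

```cpp
#include <glm/glm.hpp>

// Reconstruct the world-space position of a pixel from its depth value and
// derive a projective texture coordinate into the scalar-field texture.
// Assumed inputs: pixel coordinates, depth in [0, 1] read from the depth
// buffer, viewport size, the projection and modelview matrices, and the
// world-space rectangle [fieldMin, fieldMax] covered by the data texture.
glm::vec2 scalarFieldTexCoord(glm::ivec2 pixel, float depth,
                              glm::ivec2 viewport,
                              const glm::mat4& projection,
                              const glm::mat4& modelview,
                              glm::vec2 fieldMin, glm::vec2 fieldMax)
{
    // Window coordinates -> normalized device coordinates in [-1, 1].
    glm::vec4 ndc((pixel.x + 0.5f) / viewport.x * 2.0f - 1.0f,
                  (pixel.y + 0.5f) / viewport.y * 2.0f - 1.0f,
                  depth * 2.0f - 1.0f,
                  1.0f);

    // Step 1: inverse projection -> camera (eye) space, with perspective divide.
    glm::vec4 eye = glm::inverse(projection) * ndc;
    eye /= eye.w;

    // Step 2: inverse modelview -> world space.
    glm::vec4 world = glm::inverse(modelview) * eye;

    // Step 3: orthogonal projection along the upward z axis (as for the
    // terrain): the (x, y) world position indexes the data texture in [0, 1]^2.
    return (glm::vec2(world.x, world.y) - fieldMin) / (fieldMax - fieldMin);
}
```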

Obviously, the first step introduces numerical precision issues, as both depth and 2D pixel coordinates are discrete (limited in precision). However, the introduced artifacts are practically invisible, because the subsequent processing does not change the view and projection transformations and thus the sampling density is preserved.

Using the technique described above results in scalar data being made available to the post-processing stage of the rendering pipeline which, in turn, maps this data onto visual features accordingly.

9.4 Results

In this section, we have partially explored the potential of sketchy landscape visualization to be usefully combined with scientific visualization of abstract environment data. Not only do such abstract representations leave sufficient room for additional visual information to be blended in, they also lend themselves quite well as carriers of such information.

The key property that allows this kind of usage is the independent control of the various visual features, which can thus be correlated with scalar data values. The affinity of different visual features to various data semantics has been analyzed and should be taken into account in order to obtain meaningful visualizations.

The implementation in the sketchy rendering pipeline is lightweight and could be chosen such that only the final stage is affected. The data to be visualized is fed as a texture into the post-processing pixel shaders, where it is mapped onto visual features controlled by the user. Thus, the whole process is hardware-accelerated and also supports time-dependent data visualization.

Figure 9.3: Views of a scene. From top to bottom, left column: photorealism, 3D sketch, scalar field (via a colormap). Right column: triple scalar field using different hatch stroke colors; two scalar fields using color and hatching; two scalar fields using color and silhouette style.

Of course, in a practical visualization system it is possible and desirable to also incorporate explicit, traditional visualization techniques, such as volume rendering or illuminated field lines, which also fit well into interactive sketches. We expect significant benefits from such an integrated visualization approach.

Chapter 10

Conclusion

In this research work, we have investigated alternative approaches to real-time visualization of complex ecosystem models and associated environmental data on the basis of interactive sketches. Our framework is able to render realistically complex, GIS-generated scenes according to the user's specification with respect to abstraction degree and different visual styles, which can be seamlessly integrated with one another as well as with photorealistic renditions. Furthermore, visualization of environmental data can be embedded in the same interactive visualization.

Our primary motivation has been to create a practically usable test bed for landscape professionals who need versatile visual tools and media in landscape analysis, planning, architecture, etc. In particular, the communication and implementation of environmental development projects and measures has been plagued by the inefficiency of existing computer-aided visualization means.

We have investigated the traditional media of the landscape workflow, hand-drawn sketches, which are well established and can thus serve as a source of inspiration. Their counterpart in computer graphics, namely non-photorealistic rendering, has then been reviewed, and relevant existing algorithms have been adopted or adapted. Gaps in our target workflow as well as problems of existing algorithms have been identified, and new techniques or improved versions of existing algorithms have been developed where necessary.

Specifically, we have covered the following fields:

Non-photorealistic rendering. We have presented a framework for real-time illustration of complex landscape scenes that provides the user with the ability to control the appearance and drawing style of the scene by merely changing a few parameters.

Re-modeling with high-level primitives enables meaningful simplification, as well as novel on-the-fly parametrization techniques for efficient stylization of silhouettes and real-time hatching of plants with spatial stroke coherence and a reduced "shower door" effect. The usual artistic usage of sketchy visual features becomes secondary; rather, style and features should play the role of information carriers and communication instruments in conveying ideas. Moreover, photorealism and non-photorealism are no longer separated, but naturally integrated in the same palette of rendering styles.

Geometry processing and level-of-detail. A versatile billboard representation, computed in a pre-processing step, allows efficient rendering of complex plant models in both PR and NPR.

Data visualization. The ability to handle large scenes opens interesting possibilities for the visualization of scalar data on landscapes, which is hardly possible within (too) complex photorealistic imagery.

It is in the nature of any research that, while attempting to clarify and propose solutions to certain problems, it also raises new questions in the process; our work is no exception in this respect. There are many aspects that can be improved, such as a more efficient implementation (advances in graphics hardware will help) as well as more sensible abstraction mechanisms that come closer to human-like visual abstraction. A solution to the latter problem that we plan as future work is a segmentation of the scene according to relevant vegetation characteristics. We have also recognized new open directions of research. An interesting one is the automatic generation or selection of style and artwork textures according to vegetation characteristics: this would remove the task of artwork creation from the user's responsibilities and lead to a faster workflow.

Due to its strong interdisciplinary character, this work has been performed in close cooperation with experts and potential users from the landscape professions.

However, as with any new work tool, its acceptance in practice will only be shown over time. To ensure usability, the developed techniques have been integrated into a GIS-based landscape visualization system, Lenné3D. The ultimate goal is increased efficiency and acceptance of nature and environment development processes through the enhancement of computer-aided tools, to which I therefore hope to have contributed.
