Data-Driven Appearance Modeling
Jiaping Wang
ABSTRACT
Appearance modeling and rendering is a core topic in computer graphics research and the foundation of realistic rendering. Appearance modeling aims to describe how light interacts with object surfaces and to reproduce the measured appearance of real-world materials. This dissertation proposes a data-driven approach to appearance modeling. The data-driven approach expresses the intrinsic mechanism of appearance generation in multiple ways, including model decomposition and intrinsic data models, and it allows different methods to be applied simultaneously to the decomposed sub-models according to their characteristics. The advantages of different appearance models are integrated into the data-driven framework and successfully applied to the modeling of time-variant materials, translucent materials, and surface meso-structure.
· A visual simulation technique called appearance manifolds is proposed for modeling the time-variant surface appearance of a material from data captured at a single instant in time. Our method builds on the key observation that the concurrent variations in appearance over a surface represent different degrees of weathering. By reorganizing these appearances in a manner that reveals their relative order with respect to weathering degree, our method infers spatial and temporal properties of the material’s weathering process that can be used to convincingly generate its weathered appearance at different points in time. Results with natural non-linear reflectance variations are demonstrated in applications such as visual simulation of weathering on 3D models, increasing and decreasing the weathering of real objects, and material transfer with weathering effects. The proposed appearance manifold technique generates weathering sequences that are consistent with the changing local reflectance characteristics of a material over time. It complements existing visual simulation techniques that are designed to compute weathering-degree distributions, and it leads to various weathering applications for synthetic 3D models, real weathered objects, and even single snapshots of weathered objects. With this method, the input data is simple to acquire, and natural non-linear appearance variations over time are easy to produce. A minimal sketch of the manifold construction is given after this list. This paper was published in the Proceedings of ACM SIGGRAPH 2006 and ACM Transactions on Graphics, Volume 25, Issue 3, 2006.
· A novel technique is proposed for the visual modeling of spatially varying anisotropic reflectance using data captured from a single view. Reflectance is represented by a microfacet-based BRDF that tabulates the facets’ normal distribution function (NDF) as a function of surface location. Data from a single view provides a 2D slice of the 4D BRDF at each surface point, from which we fit a partial NDF. The fitted NDF is partial because the single view direction, coupled with the set of light directions, covers only a portion of the “half-angle” hemisphere. We complete the NDF at each point by applying a novel variant of texture synthesis that uses similar, overlapping partial NDFs from other points. Our similarity measure allows azimuthal rotation of partial NDFs, under the assumption that reflectance is spatially redundant but the local frame may be arbitrarily oriented. Our system includes a simple acquisition device that collects images over a 2D set of light directions by scanning a linear array of LEDs over a flat sample. Results demonstrate that our approach preserves spatial and directional BRDF details and generates a visually compelling match to measured materials. Our microfacet synthesis technique generates anisotropic, spatially varying surface reflectance consistent with the appearance of real measured materials. A variety of materials has been modeled and reproduced successfully from data captured from a single view. Our method avoids image registration and greatly simplifies data acquisition and processing. A minimal sketch of the partial-NDF completion is given after this list. This paper was published in the Proceedings of ACM SIGGRAPH 2008 and ACM Transactions on Graphics, Volume 27, Issue 3, 2008.
· We propose techniques for the modeling and rendering of general heterogeneous translucent materials that enable acquisition from measured samples, interactive editing of material attributes, and real-time rendering. The materials are assumed to be optically dense, such that multiple scattering can be approximated by a diffusion process described by the diffusion equation. For modeling heterogeneous materials, we present the inverse diffusion algorithm for acquiring material properties from appearance measurements. This modeling algorithm incorporates a regularizer to handle the ill-conditioning of the inverse problem, an adjoint method to dramatically reduce the computational cost, and a hierarchical GPU implementation for further speedup. To render an object with known material properties, we present the polygrid diffusion algorithm, which solves the diffusion equation with a boundary condition defined by the given illumination environment. This rendering technique is based on the representation of an object by a polygrid, a grid with regular connectivity and an irregular shape, which facilitates solution of the diffusion equation in arbitrary volumes. Because of the regular connectivity, our rendering algorithm can be implemented on the GPU for real-time performance. We demonstrate our techniques by capturing materials from physical samples and performing real-time rendering and editing with these materials. A minimal sketch of a grid-based diffusion solve is given after this list. This paper was published in ACM Transactions on Graphics, Volume 27, Issue 1, 2008.
· Many translucent materials consist of evenly distributed heterogeneous elements that produce a complex appearance under different lighting and viewing directions. For these quasi-homogeneous materials, existing techniques do not address how to acquire material representations from physical samples in a way that allows arbitrary geometry models to be rendered with these materials. We propose a model for such materials that can be readily acquired from physical samples. This material model can be applied to geometric models of arbitrary shape, and the resulting objects can be efficiently rendered without expensive subsurface light transport simulation. In developing a material model with these attributes, we capitalize on a key observation about the subsurface scattering characteristics of quasi-homogeneous materials at different scales. Locally, the non-uniformity of these materials leads to inhomogeneous subsurface scattering. On a global scale, we show that a lengthy photon path through an even distribution of heterogeneous elements statistically resembles scattering in a homogeneous medium. This observation allows us to represent and measure the global light transport within quasi-homogeneous materials, as well as the transfer of light into and out of a material volume through surface meso-structures. We demonstrate our technique with results for several challenging materials that exhibit sophisticated appearance features such as the transmission of back illumination through surface meso-structures. A minimal sketch of the homogeneous global-transport approximation is given after this list. This paper was published in the Proceedings of ACM SIGGRAPH 2005 and ACM Transactions on Graphics, Volume 24, Issue 3, 2005.
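The following Python sketch illustrates the core idea behind the appearance manifolds of the first item: per-pixel appearance features are embedded in a nearest-neighbor graph, and a relative weathering degree is estimated from geodesic distances to user-marked least- and most-weathered points. The feature representation, the choice of k, and the use of SciPy/scikit-learn are assumptions of this sketch, not the dissertation's implementation.

    from scipy.sparse.csgraph import dijkstra
    from sklearn.neighbors import kneighbors_graph

    def weathering_degree(features, seeds_new, seeds_old, k=12):
        """features: (n_pixels, d) per-pixel appearance vectors (assumed already extracted).
        seeds_new / seeds_old: indices of user-marked least- / most-weathered pixels."""
        # A k-nearest-neighbor graph approximates the appearance manifold.
        graph = kneighbors_graph(features, k, mode="distance")
        # Geodesic distances from every pixel to the two seed sets along the manifold.
        d_new = dijkstra(graph, directed=False, indices=seeds_new).min(axis=0)
        d_old = dijkstra(graph, directed=False, indices=seeds_old).min(axis=0)
        # Relative position between the seed sets serves as a weathering degree in [0, 1].
        return d_new / (d_new + d_old + 1e-12)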
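For the second item, the sketch below shows how a partial NDF tabulated on a (theta, phi) half-angle grid might be completed by matching against azimuthally rotated partial NDFs from other surface points. The bin layout, the overlap threshold, and the squared-error matching criterion are assumptions made for illustration.

    import numpy as np

    def complete_ndf(target, target_mask, candidates, candidate_masks):
        """target: (T, P) partial NDF; target_mask: (T, P) bool array of measured bins.
        candidates, candidate_masks: (N, T, P) partial NDFs and masks from other points."""
        best_err, best_fill = np.inf, None
        n_phi = target.shape[1]
        for cand, cmask in zip(candidates, candidate_masks):
            for shift in range(n_phi):               # azimuthal rotation of the local frame
                rot = np.roll(cand, shift, axis=1)
                rmask = np.roll(cmask, shift, axis=1)
                overlap = target_mask & rmask
                if overlap.sum() < 16:               # require enough shared bins to compare
                    continue
                err = np.mean((target[overlap] - rot[overlap]) ** 2)
                if err < best_err:
                    best_err, best_fill = err, (rot, rmask)
        completed = target.copy()
        if best_fill is not None:
            rot, rmask = best_fill
            fill = ~target_mask & rmask              # bins the best match measured but we did not
            completed[fill] = rot[fill]
        return completed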
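For the third item, the sketch below solves a steady-state diffusion equation by Jacobi relaxation on a plain regular grid. It stands in for, but is not, the polygrid diffusion algorithm: the axis-aligned grid, the volumetric source term used in place of the illumination boundary condition, and the periodic handling of the volume boundary are all simplifying assumptions.

    import numpy as np

    def diffuse(kappa, sigma_a, source, iters=500):
        """kappa, sigma_a, source: (X, Y, Z) arrays of diffusion coefficient,
        absorption coefficient, and injected light; returns the fluence phi."""
        phi = np.zeros(source.shape)
        for _ in range(iters):
            # Sum of the six face neighbors (finite-difference Laplacian stencil);
            # np.roll wraps at the volume boundary, a periodic simplification.
            nbr = sum(np.roll(phi, s, axis) for axis in (0, 1, 2) for s in (+1, -1))
            # Jacobi update balancing diffusion against absorption and the source,
            # assuming unit voxel spacing and locally constant kappa.
            phi = (kappa * nbr + source) / (6.0 * kappa + sigma_a + 1e-12)
        return phi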
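Finally, for the fourth item, the global light transport of a quasi-homogeneous material statistically resembles scattering in a homogeneous medium. The sketch below evaluates the classical dipole diffusion approximation for such a homogeneous medium as a stand-in for that global core; the surface meso-structure entering and exiting functions of the dissertation's model are not represented, and the constants are illustrative.

    import numpy as np

    def dipole_rd(r, sigma_a, sigma_s_prime, eta=1.3):
        """Diffuse reflectance at distance r between entry and exit points on a
        semi-infinite homogeneous slab (classical dipole approximation)."""
        sigma_t_prime = sigma_a + sigma_s_prime
        alpha_prime = sigma_s_prime / sigma_t_prime
        sigma_tr = np.sqrt(3.0 * sigma_a * sigma_t_prime)   # effective transport coefficient
        fdr = -1.440 / eta**2 + 0.710 / eta + 0.668 + 0.0636 * eta
        A = (1.0 + fdr) / (1.0 - fdr)                        # boundary mismatch term
        z_r = 1.0 / sigma_t_prime                            # real source depth
        z_v = z_r * (1.0 + 4.0 / 3.0 * A)                    # mirrored virtual source depth
        d_r = np.sqrt(r**2 + z_r**2)
        d_v = np.sqrt(r**2 + z_v**2)
        c_r = z_r * (sigma_tr * d_r + 1.0) * np.exp(-sigma_tr * d_r) / d_r**3
        c_v = z_v * (sigma_tr * d_v + 1.0) * np.exp(-sigma_tr * d_v) / d_v**3
        return alpha_prime / (4.0 * np.pi) * (c_r + c_v)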
Key words: Realistic rendering, Real-time rendering, Bidirectional texture functions, Reflectance and shading models, BRDF, Subsurface scattering, Time-variant materials, Diffusion equation, Natural phenomena