Rendering of 3D Meshes by Feature-Guided Convolution

Yue Qi
Copyright © 2012 | Volume: 4 | Issue: 3 | Pages: 10
ISSN: 1937-965X | EISSN: 1937-9668 | EISBN13: 9781466610590 | DOI: 10.4018/japuc.2012070105
Cite Article

MLA

Qi, Yue. "Rendering of 3D Meshes by Feature-Guided Convolution." IJAPUC, vol. 4, no. 3, 2012, pp. 81-90. http://doi.org/10.4018/japuc.2012070105

APA

Qi, Y. (2012). Rendering of 3D Meshes by Feature-Guided Convolution. International Journal of Advanced Pervasive and Ubiquitous Computing (IJAPUC), 4(3), 81-90. http://doi.org/10.4018/japuc.2012070105

Chicago

Qi, Yue. "Rendering of 3D Meshes by Feature-Guided Convolution." International Journal of Advanced Pervasive and Ubiquitous Computing (IJAPUC) 4, no. 3 (2012): 81-90. http://doi.org/10.4018/japuc.2012070105


Abstract

The author presents a feature-guided convolution method for rendering 3D triangular meshes. In this work, feature directions are computed on the vertices of a mesh and noise is generated on its faces. After projecting the directions and the noise into 2D image space, the author performs a convolution to render the mesh. Three feature directions are used: a principal direction, the tangent of an isocurve of view-dependent features, and the tangent of an isophote curve. By controlling the noise values, the method can produce several non-photorealistic rendering effects such as pencil drawing and hatching. The rendering process is temporally coherent and can therefore be used to create artistic styled animations.
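The abstract describes convolving projected noise along projected feature directions in image space. As a rough illustration of that idea (not the paper's exact algorithm), the following minimal sketch performs a line-integral-convolution-style smoothing of a per-pixel noise image along a per-pixel direction field; the function name, parameters, and integration scheme are all assumptions for illustration:

```python
import numpy as np

def line_integral_convolution(noise, directions, length=10):
    """Illustrative sketch: average noise along streamlines of a
    2D direction field (assumed stand-ins for the projected face
    noise and projected per-vertex feature directions).

    noise:      (H, W) array of per-pixel noise values.
    directions: (H, W, 2) array of unit direction vectors per pixel.
    length:     number of integration steps on each side of a pixel.
    """
    h, w = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for y in range(h):
        for x in range(w):
            acc, count = 0.0, 0
            # Walk the streamline forward and backward from (x, y),
            # accumulating the noise sampled along the way.
            for sign in (1.0, -1.0):
                px, py = float(x), float(y)
                for _ in range(length):
                    ix, iy = int(round(px)), int(round(py))
                    if not (0 <= ix < w and 0 <= iy < h):
                        break  # streamline left the image
                    acc += noise[iy, ix]
                    count += 1
                    dx, dy = directions[iy, ix]
                    px += sign * dx
                    py += sign * dy
            out[y, x] = acc / max(count, 1)
    return out
```

With a direction field following, say, principal directions, the averaged noise forms elongated streaks along those directions, which is the visual basis of hatching-like strokes; varying the noise (as the abstract notes) changes the stroke character.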
