The Annotated VRML 97 Reference


Chapter 2
Key Concepts

2.14 Lighting model

2.14.1 Introduction

The VRML lighting model provides detailed equations which define the colours to apply to each geometric object. For each object, the values of the Material node, Color node and texture currently being applied to the object are combined with the lights illuminating the object and the currently bound Fog node. These equations are designed to simulate the physical properties of light striking a surface.

2.14.2 Lighting 'off'

A Shape node is unlit if either of the following is true:

  1. The shape's appearance field is NULL (default)
  2. The material field in the Appearance node is NULL (default)

Note the special cases of geometry nodes that do not support lighting (see "3.24 IndexedLineSet" and "3.36 PointSet" for details).
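Expressed as code, the test is simply a pair of NULL checks. The sketch below uses hypothetical Python stand-ins for the Shape, Appearance, and Material nodes; it is not a browser API, just the rule above transcribed.

    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical stand-ins for the node types; only the relevant fields are shown.
    @dataclass
    class Material:
        transparency: float = 0.0

    @dataclass
    class Appearance:
        material: Optional[Material] = None      # NULL by default

    @dataclass
    class Shape:
        appearance: Optional[Appearance] = None  # NULL by default

    def is_unlit(shape: Shape) -> bool:
        """Lighting is off when the Shape has no Appearance or no Material."""
        return shape.appearance is None or shape.appearance.material is None

    print(is_unlit(Shape()))                                 # True: default Shape is unlit
    print(is_unlit(Shape(Appearance(material=Material()))))  # False: Material present, so lit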

design note

A shape will be lit if you specify a material to be used for lighting. Shapes are unlit and bright white by default; you will almost always specify either colors (using a Color node) or a material (using a Material node). No lighting was chosen as the default because it is faster than lighting (wherever possible in VRML, default values were chosen to give maximum performance), and bright white was chosen so objects show up against the default black background.

If the shape is unlit, the colour (Irgb) and alpha (A = 1 - transparency) of the shape at each point on the shape's geometry are given in Table 2-5.

Table 2-5: Unlit colour and alpha mapping

Texture type                     | Colour per-vertex or per-face | Colour NULL
No texture                       | Irgb = ICrgb,      A = 1      | Irgb = (1, 1, 1),    A = 1
Intensity (one-component)        | Irgb = IT × ICrgb, A = 1      | Irgb = (IT, IT, IT), A = 1
Intensity+Alpha (two-component)  | Irgb = IT × ICrgb, A = AT     | Irgb = (IT, IT, IT), A = AT
RGB (three-component)            | Irgb = ITrgb,      A = 1      | Irgb = ITrgb,        A = 1
RGBA (four-component)            | Irgb = ITrgb,      A = AT     | Irgb = ITrgb,        A = AT



where:

AT    = normalized [0, 1] alpha value from 2- or 4-component texture image
ICrgb = interpolated per-vertex colour, or per-face colour, from Color node
IT    = normalized [0, 1] intensity from 1- or 2-component texture image
ITrgb = colour from 3- or 4-component texture image
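
For readers who prefer code to tables, the mapping in Table 2-5 can be transcribed as a small Python function. The parameter names and the texel representation below are illustrative assumptions, not any browser's API:

    from typing import Optional, Tuple

    RGB = Tuple[float, float, float]

    def unlit_colour(components: int,                  # 0 = no texture, 1-4 = texture components
                     texel_rgb: RGB = (1.0, 1.0, 1.0), # ITrgb, or (IT, IT, IT) for intensity maps
                     texel_alpha: float = 1.0,         # AT (2- or 4-component textures)
                     colour: Optional[RGB] = None      # ICrgb from a Color node, or None
                     ) -> Tuple[RGB, float]:
        """Transcription of Table 2-5: returns (Irgb, A) for an unlit shape."""
        if components == 0:                            # no texture
            return (colour or (1.0, 1.0, 1.0)), 1.0
        if components in (1, 2):                       # intensity (+ alpha) texture modulates
            it = texel_rgb[0]                          # IT: one grey level per texel
            base = colour or (1.0, 1.0, 1.0)
            irgb = (it * base[0], it * base[1], it * base[2])
            alpha = texel_alpha if components == 2 else 1.0
            return irgb, alpha
        # 3- or 4-component texture: the texture colour wins, the Color node is ignored
        alpha = texel_alpha if components == 4 else 1.0
        return texel_rgb, alpha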

design note

If a full-color texture is given, it defines the colors on unlit geometry (if per-vertex or per-face colors are also given, they are ignored). If an intensity map (one- or two-component texture) is given, then it is either used as a gray-scale texture or, if colors are also specified, it is used to modulate the colors. If there is no texture, then either the per-vertex or per-face colors are used, if given, or white is used. Alpha values are always either 1.0 (fully opaque) or come from the texture image, if the texture image contains transparency.

If colors are specified per vertex, then they should be interpolated across each polygon (polygon and face mean the same thing--a series of vertices that lie in the same plane and define a closed 2D region). The method of interpolation is not defined. Current rendering libraries typically triangulate polygons with more than three vertices and interpolate in RGB space, but neither is required. Pretriangulate your shapes and limit the color differences across any given triangle (by splitting triangles into smaller triangles, if necessary) if you want to guarantee similar results in different implementations. Also note that some implementations may not support per-vertex coloring at all and may approximate it by averaging the vertex colors to produce one color per polygon.

Allowing the specification of transparency values per face or per vertex was considered. While that would have made the Color node more consistent with the Material and texture nodes (which allow both colors and transparencies to be specified), it would have added complexity to an already complex part of the specification for a feature that would be rarely used.
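
For concreteness, a common (though not mandated) approach to the per-vertex colouring described above is to triangulate and then interpolate the vertex colours linearly in RGB space using barycentric weights. A minimal sketch, with an illustrative function name:

    from typing import Tuple

    RGB = Tuple[float, float, float]

    def interpolate_triangle_colour(c0: RGB, c1: RGB, c2: RGB,
                                    b0: float, b1: float, b2: float) -> RGB:
        """Gouraud-style linear interpolation of per-vertex colours at a point whose
        barycentric coordinates inside the triangle are (b0, b1, b2), b0 + b1 + b2 == 1.
        Interpolation happens in plain RGB space, as most rendering libraries do."""
        return tuple(b0 * x + b1 * y + b2 * z for x, y, z in zip(c0, c1, c2))

    # At the centroid (1/3, 1/3, 1/3) the three vertex colours are simply averaged:
    print(interpolate_triangle_colour((1, 0, 0), (0, 1, 0), (0, 0, 1), 1/3, 1/3, 1/3))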

2.14.3 Lighting 'on'

If the shape is lit (i.e., a Material and an Appearance node are specified for the Shape), the Color and Texture nodes determine the diffuse colour for the lighting equation as specified in Table 2-6.

Table 2-6: Lit colour and alpha mapping

Texture type                            | Colour per-vertex or per-face  | Color node NULL
No texture                              | ODrgb = ICrgb,      A = 1 - TM | ODrgb = IDrgb,      A = 1 - TM
Intensity texture (one-component)       | ODrgb = IT × ICrgb, A = 1 - TM | ODrgb = IT × IDrgb, A = 1 - TM
Intensity+Alpha texture (two-component) | ODrgb = IT × ICrgb, A = AT     | ODrgb = IT × IDrgb, A = AT
RGB texture (three-component)           | ODrgb = ITrgb,      A = 1 - TM | ODrgb = ITrgb,      A = 1 - TM
RGBA texture (four-component)           | ODrgb = ITrgb,      A = AT     | ODrgb = ITrgb,      A = AT



where:

IDrgb = material diffuseColor
ODrgb = diffuse factor, used in lighting equations below
TM    = material transparency

... and all other terms are as above.
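
In the same illustrative spirit, Table 2-6 transcribes to a short Python function (again with hypothetical parameter names rather than a real API):

    from typing import Optional, Tuple

    RGB = Tuple[float, float, float]

    def lit_diffuse_factor(components: int,               # 0 = no texture, 1-4 = texture components
                           material_diffuse: RGB,         # IDrgb (Material diffuseColor)
                           material_transparency: float,  # TM
                           texel_rgb: RGB = (1.0, 1.0, 1.0),
                           texel_alpha: float = 1.0,      # AT
                           colour: Optional[RGB] = None   # ICrgb from a Color node, or None
                           ) -> Tuple[RGB, float]:
        """Transcription of Table 2-6: returns (ODrgb, A) for a lit shape."""
        base = colour if colour is not None else material_diffuse
        if components == 0:                               # no texture
            return base, 1.0 - material_transparency
        if components in (1, 2):                          # intensity (+ alpha) texture modulates
            it = texel_rgb[0]
            od = (it * base[0], it * base[1], it * base[2])
            alpha = texel_alpha if components == 2 else 1.0 - material_transparency
            return od, alpha
        # full-colour texture replaces the diffuse colour entirely
        alpha = texel_alpha if components == 4 else 1.0 - material_transparency
        return texel_rgb, alpha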

design note

The rules (expressed in Table 2-6) for combining texture, Color, and Material nodes are as follows:

  1. Textures have the highest priority; texture colors will be used if a full-color texture is specified (and the colors in the Color node or diffuseColor of the Material node will be ignored). If an intensity texture is specified, it will be used to modulate the diffuse colors from either the Color or Material nodes. If the texture contains transparency information, it is always used instead of the Material's transparency field.

  2. Per-vertex or per-face colors specified in a Color node have the next highest priority and override the Material node's diffuseColor field unless a full-color texture is being used.

  3. The diffuseColor specified in the Material node has lowest priority and will be used only if there is no full-color texture or Color node. The texture and Color nodes affect only the diffuseColor of the Material; the other Material parameters (specularColor, emissiveColor, etc.) are always used as is.

2.14.4 Lighting equations

An ideal VRML implementation will evaluate the following lighting equation at each point on a lit surface. RGB intensities at each point on a geometry (Irgb) are given by:

Irgb = IFrgb × (1 - f0) + f0 × (OErgb + SUM( oni × attenuationi × spoti × ILrgb × (ambienti + diffusei + speculari) ))

where:

attenuationi = 1 / max(c1 + c2 × dL + c3 × dL², 1)
ambienti     = Iia × ODrgb × Oa
diffusei     = Ii × ODrgb × (N · L)
speculari    = Ii × OSrgb × (N · ((L + V) / |L + V|))^(shininess × 128)

and:

· = modified vector dot product: if dot product < 0, then 0.0, otherwise, dot product
c1, c2, c3 = light i attenuation
dV = distance from point on geometry to viewer's position, in coordinate system of current fog node
dL = distance from light to point on geometry, in light's coordinate system
f0 = Fog interpolant, see Table 2-8 for calculation
IFrgb = currently bound fog's color
ILrgb = light i color
Ii = light i intensity
Iia = light i ambientIntensity
L = (PointLight/SpotLight) normalized vector from point on geometry to light source i position
L = (DirectionalLight) -direction of light source i
N = normalized normal vector at this point on geometry (interpolated from vertex normals specified in Normal node or calculated by browser)
Oa = Material ambientIntensity
ODrgb = diffuse colour, from Material node, Color node, and/or texture node
OErgb = Material emissiveColor
OSrgb = Material specularColor
oni = 1, if light source i affects this point on the geometry; 0, if light source i does not affect this geometry (if farther away than radius for PointLight or SpotLight, outside of enclosing Group/Transform for DirectionalLights, or on field is FALSE)
shininess = Material shininess
spotAngle = acos(-L · spotDiri)
spotBW = SpotLight i beamWidth
spotCO = SpotLight i cutOffAngle
spoti = spotlight factor, see Table 2-7 for calculation
spotDiri = normalized SpotLight i direction
SUM: sum over all light sources i
V = normalized vector from point on geometry to viewer's position

Table 2-7: Calculation of the spotlight factor

Condition (in order)                     | spoti =
lighti is PointLight or DirectionalLight | 1
spotAngle >= spotCO                      | 0
spotAngle <= spotBW                      | 1
spotBW < spotAngle < spotCO              | (spotAngle - spotCO) / (spotBW - spotCO)



Table 2-8: Calculation of the fog interpolant

Condition                                  | f0 =
no fog                                     | 1
fogType "LINEAR",      dV < fogVisibility  | (fogVisibility - dV) / fogVisibility
fogType "LINEAR",      dV > fogVisibility  | 0
fogType "EXPONENTIAL", dV < fogVisibility  | exp(-dV / (fogVisibility - dV))
fogType "EXPONENTIAL", dV > fogVisibility  | 0



tip

The following design note is useful to both authors and implementors.

design note

These lighting equations are intended to make it easier for implementors to match the ideal VRML lighting model to the lighting equations used by their rendering library. However, understanding the lighting equations and understanding the approximations commonly made to map them to common rendering libraries can help you create content that looks good on all implementations of VRML.

Performing the lighting computation per pixel (Phong shading) is not feasible on current graphics software and hardware; the hardware and software just aren't fast enough. However, within the next couple of years per-pixel lighting will probably be a common feature of very high-performance graphics hardware, and it may be a common feature in inexpensive software and hardware in five years, so VRML specifies an ideal lighting model that can grow with hardware progress. Because 3D graphics technology is evolving so fast, it is better to anticipate future developments and allow current implementations to approximate an ideal specification, rather than choosing a least-common-denominator model that will limit future implementations.

Current implementations typically perform lighting calculations only for each vertex of each polygon. The resulting colors are then linearly interpolated across the polygon (Gouraud shading). The most noticeable effects of this approximation are fuzzy or inaccurate edges for specular highlights, spotlights, and point lights, since the tessellation of the geometry affects where lighting calculations are done. The approximation can be improved by subdividing the polygons of the geometry, creating more vertices (and therefore forcing implementations to do more lighting calculations). This will, of course, decrease performance.

Application of a texture map should ideally occur before lighting, replacing the diffuse term of the lighting equation at each pixel. However, since lighting computations are done per vertex and not per pixel, texture maps are combined with the interpolated color. That is, instead of performing the ideal lighting calculation

OErgb + SUM( oni × attenuationi × spoti × ILrgb × (ambienti + (Ii × ODrgb × (N · L)) + speculari) )

this approximation is computed when texturing

ITrgb × (OErgb + SUM( oni × attenuationi × spoti × ILrgb × (Iia × Oa + Ii × (N · L) + speculari) ))

The terms inside the parentheses are computed per vertex and interpolated across the polygon, and a color is computed from the texture map and multiplied per pixel. Note that the approximation equals the ideal equation for purely diffuse objects (objects where OErgb = speculari = 0.0), and since the diffuse term dominates for most objects, the approximation will closely match the ideal for most textured objects. Errors are caused by the texture affecting the specular and emissive colors of the object.
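
As a small numeric check of this argument, the two formulas can be compared directly for a single light whose colour, attenuation, and spotlight factors are all 1; the function and variable names below are illustrative only:

    def ideal(I_T, O_E, I_ia, O_a, I_i, n_dot_l, specular):
        """Per-pixel ideal: the texture (I_T) is folded into the diffuse factor only."""
        return O_E + I_ia * I_T * O_a + I_i * I_T * n_dot_l + specular

    def gouraud_approx(I_T, O_E, I_ia, O_a, I_i, n_dot_l, specular):
        """Per-vertex lighting with the diffuse factor left out, then multiplied by the
        texture colour per pixel (the common Gouraud + modulate-texture pipeline)."""
        return I_T * (O_E + I_ia * O_a + I_i * n_dot_l + specular)

    # For a purely diffuse surface (no emissive, no specular) the two agree exactly:
    args = (0.5, 0.0, 0.2, 0.1, 1.0, 0.8, 0.0)   # I_T, O_E, I_ia, O_a, I_i, N·L, specular
    print(round(ideal(*args), 6), round(gouraud_approx(*args), 6))   # 0.41 0.41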

Finally, implementations will be forced to quantize the ideal 0.0 to 1.0 RGB colors of the VRML specification into the number of colors supported by your graphics hardware. This is becoming less of an issue each year as more and more hardware supports millions of colors (24 bits of color--16 million colors--is near the limit of human perception), but displayed colors can vary widely on displays that support only thousands or hundreds of colors. In addition, different computer monitors can display the same colors quite differently, resulting in different-looking worlds. The VRML file format does not attempt to address any of these issues; it is meant to be only an ideal description of a virtual world.

2.14.5 References

The VRML lighting equations are based on the simple illumination equations given in [FOLE] and [OPEN].