
IMPORTANT

The texture coordinates are specified in counterclockwise order. This is a requirement imposed by the com.sun.j3d.utils.geometry.Triangulator utility, which converts the polygon created from the texture coordinates into a TriangleArray.
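A quick way to verify that a set of 2D coordinates is in counterclockwise order is to compute the polygon's signed area with the shoelace formula: a positive result means counterclockwise winding. The following is a minimal plain-Java sketch (the class and method names are illustrative, not part of TextureTest.java); the coordinate pairs are packed into a float array to keep the example self-contained:

```java
//check the winding order of a 2D polygon using the shoelace formula.
//coords holds x,y pairs; a positive signed area means the vertices
//are in counterclockwise order, as the Triangulator requires.
public class WindingCheck
{
    public static float signedArea( float[] coords )
    {
        float sum = 0;

        for( int i = 0; i < coords.length; i += 2 )
        {
            //index of the next vertex, wrapping around to the first
            int j = (i + 2) % coords.length;
            sum += coords[i] * coords[j+1] - coords[j] * coords[i+1];
        }

        return sum / 2;
    }

    public static boolean isCounterClockwise( float[] coords )
    {
        return signedArea( coords ) > 0;
    }

    public static void main( String[] args )
    {
        //unit square listed counterclockwise
        float[] ccw = { 0,0,  1,0,  1,1,  0,1 };
        System.out.println( isCounterClockwise( ccw ) );  //prints "true"
    }
}
```

Running a check like this on coordinates loaded from a file is a cheap way to catch clockwise input before the Triangulator produces inside-out geometry.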

The createTextureGeometry method performs most of the work related to assigning texture coordinates to vertices. There are eight basic steps:


1. Read the texture coordinates from the file.

2. Generate vertex coordinates by scaling and translating the texture coordinates.

3. Load the texture image using the com.sun.j3d.utils.image.TextureLoader class and assign it to an Appearance.


//load the texture image and assign to the appearance
TextureLoader texLoader = new TextureLoader( texInfo.m_szImage, Texture.RGB, this );
Texture tex = texLoader.getTexture();
app.setTexture( tex );


4. Create a GeometryInfo object to store the texture and vertex coordinates (POLYGON_ARRAY).


//create a GeometryInfo for the polygon
GeometryInfo gi = new GeometryInfo( GeometryInfo.POLYGON_ARRAY );


5. Assign the texture and vertex coordinates to the GeometryInfo object.


//assign coordinates
gi.setCoordinates( texInfo.m_CoordArray );
gi.setTextureCoordinates( texInfo.m_TexCoordArray );


6. Triangulate the GeometryInfo object.


//use the triangulator utility to triangulate the polygon
int[] stripCountArray = {texInfo.m_CoordArray.length};
int[] contourCountArray = {stripCountArray.length};
gi.setContourCounts( contourCountArray );
gi.setStripCounts( stripCountArray );

Triangulator triangulator = new Triangulator();
triangulator.triangulate( gi );


7. Generate normal vectors for the GeometryInfo object.


//generate normal vectors for the triangles,
//not strictly necessary as we are not lighting the scene,
//but generally useful
NormalGenerator normalGenerator = new NormalGenerator();
normalGenerator.generateNormals( gi );


8. Create a Shape3D object based on the GeometryInfo object.


//wrap the GeometryArray in a Shape3D and assign appearance
return new Shape3D( gi.getGeometryArray(), app );


Please refer to TextureTest.java for the full example. The important methods are listed in full next.


//create a TransformGroup, position it, and add the texture
//geometry as a child node
protected TransformGroup createTextureGroup( String szFile, double x, double y, double z, boolean bWireframe )
{
    TransformGroup tg = new TransformGroup();
    Transform3D t3d = new Transform3D();
    t3d.setTranslation( new Vector3d( x, y, z ) );
    tg.setTransform( t3d );

    Shape3D texShape = createTextureGeometry( szFile, bWireframe );

    if( texShape != null )
        tg.addChild( texShape );

    return tg;
}


//return a Shape3D that is a triangulated texture-mapped polygon
//based on the texture coordinates and name of texture image in the
//input file
protected Shape3D createTextureGeometry( String szFile, boolean bWireframe )
{
    //load all the texture data from the file and
    //create the geometry coordinates
    TextureGeometryInfo texInfo = createTextureCoordinates( szFile );

    if( texInfo == null )
    {
        System.err.println( "Could not load texture info for file: " + szFile );
        return null;
    }

    //print some stats on the loaded file
    System.out.println( "Loaded File: " + szFile );
    System.out.println( "  Texture image: " + texInfo.m_szImage );
    System.out.println( "  Texture coordinates: " + texInfo.m_TexCoordArray.length );

    //create an Appearance
    Appearance app = new Appearance();

    PolygonAttributes polyAttribs = null;

    //create the PolygonAttributes and attach to the Appearance,
    //note that we use CULL_NONE so that the "rear" side
    //of the geometry is visible with the applied texture image
    if( bWireframe == false )
    {
        polyAttribs = new PolygonAttributes( PolygonAttributes.POLYGON_FILL,
                                             PolygonAttributes.CULL_NONE, 0 );
    }
    else
    {
        polyAttribs = new PolygonAttributes( PolygonAttributes.POLYGON_LINE,
                                             PolygonAttributes.CULL_NONE, 0 );
    }

    app.setPolygonAttributes( polyAttribs );

    //load the texture image and assign to the appearance
    TextureLoader texLoader = new TextureLoader( texInfo.m_szImage,
                                                 Texture.RGB, this );
    Texture tex = texLoader.getTexture();
    app.setTexture( tex );

    //create a GeometryInfo for the polygon and populate it
    GeometryInfo gi = new GeometryInfo( GeometryInfo.POLYGON_ARRAY );
    gi.setCoordinates( texInfo.m_CoordArray );
    gi.setTextureCoordinates( texInfo.m_TexCoordArray );

    //use the triangulator utility to triangulate the polygon
    int[] stripCountArray = {texInfo.m_CoordArray.length};
    int[] contourCountArray = {stripCountArray.length};
    gi.setContourCounts( contourCountArray );
    gi.setStripCounts( stripCountArray );

    Triangulator triangulator = new Triangulator();
    triangulator.triangulate( gi );

    //generate normal vectors for the triangles, not strictly necessary
    //as we are not lighting the scene, but generally useful
    NormalGenerator normalGenerator = new NormalGenerator();
    normalGenerator.generateNormals( gi );

    //wrap the GeometryArray in a Shape3D and assign appearance
    return new Shape3D( gi.getGeometryArray(), app );
}


/*
 * Handle the nitty-gritty details of loading the input file
 * and reading (in order):
 * - texture file image name
 * - size of the geometry in the X direction
 * - Y direction scale factor
 * - number of texture coordinates
 * - each texture coordinate (X Y)
 * This could all be easily accomplished using a scenegraph loader,
 * but this simple code is included for reference.
 */

protected TextureGeometryInfo createTextureCoordinates( String szFile )
{
    //create a simple wrapper class to package our return values
    TextureGeometryInfo texInfo = new TextureGeometryInfo();

    //allocate a temporary buffer to store the input file
    StringBuffer szBufferData = new StringBuffer();

    float sizeGeometryX = 0;
    float factorY = 1;
    int nNumPoints = 0;
    Point2f boundsPoint = new Point2f();

    try
    {
        //attach a reader to the input file
        FileReader fileIn = new FileReader( szFile );

        int nChar = 0;

        //read the entire file into the StringBuffer
        while( true )
        {
            nChar = fileIn.read();

            //if we have not hit the end of file
            //add the character to the StringBuffer
            if( nChar != -1 )
                szBufferData.append( (char) nChar );
            else
                //hit EOF
                break;
        }

        //create a tokenizer to tokenize the input file at whitespace
        java.util.StringTokenizer tokenizer =
            new java.util.StringTokenizer( szBufferData.toString() );

        //read the name of the texture image
        texInfo.m_szImage = tokenizer.nextToken();

        //read the size of the generated geometry in the X dimension
        sizeGeometryX = Float.parseFloat( tokenizer.nextToken() );

        //read the Y scale factor
        factorY = Float.parseFloat( tokenizer.nextToken() );

        //read the number of texture coordinates
        nNumPoints = Integer.parseInt( tokenizer.nextToken() );

        //read each texture coordinate
        texInfo.m_TexCoordArray = new Point2f[nNumPoints];
        Point2f texPoint2f = null;

        for( int n = 0; n < nNumPoints; n++ )
        {
            texPoint2f = new Point2f( Float.parseFloat( tokenizer.nextToken() ),
                                      Float.parseFloat( tokenizer.nextToken() ) );
            texInfo.m_TexCoordArray[n] = texPoint2f;

            //keep an eye on the extents of the texture coordinates
            //so we can automatically center the geometry
            if( n == 0 || texPoint2f.x > boundsPoint.x )
                boundsPoint.x = texPoint2f.x;

            if( n == 0 || texPoint2f.y > boundsPoint.y )
                boundsPoint.y = texPoint2f.y;
        }
    }
    catch( Exception e )
    {
        System.err.println( e.toString() );
        return null;
    }

    //build the array of coordinates
    texInfo.m_CoordArray = new Point3f[nNumPoints];

    for( int n = 0; n < nNumPoints; n++ )
    {
        //scale and center the geometry based on the texture coordinates
        texInfo.m_CoordArray[n] =
            new Point3f( sizeGeometryX * (texInfo.m_TexCoordArray[n].x - boundsPoint.x/2),
                         factorY * sizeGeometryX * (texInfo.m_TexCoordArray[n].y - boundsPoint.y/2),
                         0 );
    }

    return texInfo;
}
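To make the input file format concrete, the following sketch parses the same layout (image name, X size, Y scale factor, point count, then X Y texture coordinate pairs) and applies the same scale-and-center arithmetic as the loop above. The class name and the sample data string are illustrative, not taken from TextureTest.java:

```java
import java.util.StringTokenizer;

public class TextureFileSketch
{
    //parse a string in the same layout as the input file and return
    //the scaled, centered vertex coordinates as {x,y} pairs
    public static float[][] computeCoords( String szData )
    {
        StringTokenizer tokenizer = new StringTokenizer( szData );

        tokenizer.nextToken();  //texture image name (unused here)
        float sizeGeometryX = Float.parseFloat( tokenizer.nextToken() );
        float factorY = Float.parseFloat( tokenizer.nextToken() );
        int nNumPoints = Integer.parseInt( tokenizer.nextToken() );

        float[] texX = new float[nNumPoints];
        float[] texY = new float[nNumPoints];
        float boundsX = 0, boundsY = 0;

        for( int n = 0; n < nNumPoints; n++ )
        {
            texX[n] = Float.parseFloat( tokenizer.nextToken() );
            texY[n] = Float.parseFloat( tokenizer.nextToken() );

            //track the maximum extents so the geometry can be centered
            if( n == 0 || texX[n] > boundsX ) boundsX = texX[n];
            if( n == 0 || texY[n] > boundsY ) boundsY = texY[n];
        }

        float[][] coords = new float[nNumPoints][2];

        for( int n = 0; n < nNumPoints; n++ )
        {
            //scale and center, exactly as in createTextureCoordinates
            coords[n][0] = sizeGeometryX * (texX[n] - boundsX/2);
            coords[n][1] = factorY * sizeGeometryX * (texY[n] - boundsY/2);
        }

        return coords;
    }

    public static void main( String[] args )
    {
        //hypothetical file contents: image name, X size, Y scale factor,
        //point count, then the texture coordinates (X Y), counterclockwise
        float[][] coords = computeCoords( "texture.jpg 4 1 4  0 0  1 0  1 1  0 1" );

        for( float[] c : coords )
            System.out.println( c[0] + " " + c[1] );
    }
}
```

With these sample values the unit-square texture coordinates become a 4 x 4 quad centered at the origin, from (-2,-2) to (2,2).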


As the TextureTest example illustrates, using a static mapping from texture coordinates to vertex coordinates is relatively straightforward. Texture coordinates are assigned to each vertex, much like vertex coordinates or per-vertex colors. The renderer takes care of all the messy details of interpolating the texture image between projected vertex coordinates using projection and sampling algorithms.

Texture coordinates themselves are usually calculated manually or produced by an automated texture-mapping process (such as 3D model capture or a model editor).


Note that although we have called this section static mapping, there is nothing to prevent you from modifying the texture coordinates within a GeometryArray at runtime. Very interesting dynamic effects can be achieved through reassigning texture coordinates.
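One such dynamic effect is scrolling a texture across a surface by offsetting every texture coordinate each frame. The sketch below computes the offset coordinates in plain Java; in Java 3D the resulting values would then be applied to the live GeometryArray with setTextureCoordinates, which requires the ALLOW_TEXCOORD_WRITE capability to be set before the array is made live. The class and method names are illustrative:

```java
//compute a scrolled copy of a set of texture coordinates, wrapping
//at 1.0 so the texture repeats across the surface. texCoords holds
//u,v pairs; du and dv are the per-frame offsets.
public class TexCoordScroller
{
    public static float[] scroll( float[] texCoords, float du, float dv )
    {
        float[] scrolled = new float[texCoords.length];

        for( int i = 0; i < texCoords.length; i += 2 )
        {
            //offset each (u,v) pair and wrap into [0,1)
            scrolled[i]   = (texCoords[i] + du) % 1.0f;
            scrolled[i+1] = (texCoords[i+1] + dv) % 1.0f;
        }

        return scrolled;
    }

    public static void main( String[] args )
    {
        float[] quad = { 0f, 0f,  1f, 0f,  1f, 1f,  0f, 1f };
        float[] moved = scroll( quad, 0.25f, 0f );
        System.out.println( moved[0] );  //prints "0.25"
    }
}
```

Calling a routine like this from a Behavior each frame produces a continuously flowing texture, a classic trick for water, conveyor belts, and similar effects.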


Care must be taken to ensure that texture images do not become too pixelated as they are enlarged and stretched by the sampling algorithm. The MIPMAP technique, covered in detail in Section 14.3.4, is useful in this regard because it allows different texture images to be specified for different rendered sizes.


Needless to say, texture images consume memory, and using large 24-bit texture images is an easy way to place a heavy strain on the renderer and push up the total memory footprint. Of course, the larger the texture image, the less susceptible it is to becoming pixelated, so a comfortable balance must be found between rendering quality, rendering speed, and memory footprint. You should also be very aware that different 3D rendering hardware performs texture mapping in hardware only if the texture image falls within certain criteria. Modern 3D rendering cards typically have 16 MB or more of texture memory, and 64 MB is now not uncommon. Most rendering hardware will render texture images of up to 512 x 512 pixels. You should consult the documentation for the 3D rendering cards that your application considers important.
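The memory arithmetic is easy to check: an uncompressed 24-bit RGB image uses 3 bytes per pixel, and a full MIPMAP chain (each level half the size of the previous, down to 1 x 1) adds roughly one third again. A small sketch, plain arithmetic with no Java 3D required:

```java
//estimate the memory consumed by an uncompressed texture image,
//with and without a full MIPMAP chain.
public class TextureMemory
{
    public static long baseBytes( int width, int height, int bytesPerPixel )
    {
        return (long) width * height * bytesPerPixel;
    }

    public static long mipmapBytes( int width, int height, int bytesPerPixel )
    {
        long total = 0;

        while( width >= 1 && height >= 1 )
        {
            total += (long) width * height * bytesPerPixel;

            if( width == 1 && height == 1 )
                break;

            //each MIPMAP level halves both dimensions, down to 1 x 1
            width = Math.max( 1, width / 2 );
            height = Math.max( 1, height / 2 );
        }

        return total;
    }

    public static void main( String[] args )
    {
        //a 512 x 512 24-bit image: 786,432 bytes (768 KB) on its own
        System.out.println( baseBytes( 512, 512, 3 ) );

        //with a full MIPMAP chain the total grows to roughly 4/3 of that
        System.out.println( mipmapBytes( 512, 512, 3 ) );
    }
}
```

Even a single 512 x 512 RGB texture therefore costs about three quarters of a megabyte, which makes it clear how quickly a scene full of large textures can exhaust a card's texture memory.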