Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

TRANSCRIPT

Page 1: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Discrete Techniques (Angel: Chapter 7)

OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc.

CSCI 6360

Page 2: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Review and Introduction …Last time: Implementation – CG Algorithms

• First part of course dealt mainly with geometric processing– Below focuses on role of transformations:

• Last week, looked at other elements of viewing pipeline, and algorithms:– Clipping, was one thing we looked at

• Eliminating objects (and parts of objects) that lie outside view volume - and, so, not visible in image

– Rasterization• Produces fragments (pixels) from remaining objects

– Hidden surface removal (visible surface determination)• Determines which object fragments are visible, and, so, put in frame buffer

Page 3: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Review & Intro. Implementation -- Algorithms

• Next steps in viewing pipeline:
– Clipping: eliminating objects (and parts of objects) that lie outside view volume
– Rasterization: produces fragments (pixels) from remaining objects
– Hidden surface removal (or, visible surface determination): determines which object fragments are visible, and, so, put in frame buffer
• Show objects (surfaces, pixels) not blocked by objects closer to camera

• Recall, … and next week …

• Now, it's next week!– Quick re-orientation

Page 4: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Tasks to Render a Geometric Entity1

Review and Angel Explication

• Angel introduces more general terms and ideas, than just for OpenGL pipeline…

– Recall, chapter title “From Vertices to Fragments” … and even pixels– From definition in user program to (possible) display on output device– Modeling, geometry processing, rasterization, fragment processing

• Modeling– Performed by application program, e.g., create sphere polygons (vertices)– Angel example of spheres and creating data structure for OpenGL use– Product is vertices (and their connections)– Application might even reduce “load”, e.g., no back-facing polygons

Page 5: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Tasks to Render a Geometric Entity2

Review and Angel Explication

• Geometry Processing– Works with vertices– Determine which geometric objects appear on display– 1. Perform clipping to view volume

• Changes object coordinates to eye coordinates

• Transforms vertices to normalized view volume using projection transformation

– 2. Primitive assembly
• Clipping an object (and its surfaces) can result in new surfaces (e.g., a shorter line, a polygon of different shape)
• Working with these "new" elements to "re-form" (clipped) objects is primitive assembly
• Necessary for, e.g., shading

– 3. Assignment of color to vertex

• Modeling and geometry processing called “front-end processing”– All involve 3-d calculations and require floating-point arithmetic

Page 6: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Tasks to Render a Geometric Entity3

Review and Angel Explication

• Rasterization– Only x, y values needed for (2-d) frame buffer

• … as the frame buffer is what is displayed– Rasterization, or scan conversion, determines which fragments displayed (put in

frame buffer)• For polygons, rasterization determines which pixels lie inside 2-d polygon

determined by projected vertices– Colors

• Most simply, fragments (pixels) are determined by interpolation of vertex shades & put in frame buffer

– Color can also be determined during fragment processing (more later)– Output of rasterizer is in units of the display (window coordinates)

Page 7: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Tasks to Render a Geometric Entity4

Review and Angel Explication

• Fragment Processing – will consider more about this tonight– (last time) Hidden surface removal performed fragment by fragment using depth

information– Colors

• OpenGL can merge color (and lighting) results of rasterization stage with geometric pipeline

– E.g., shaded, texture mapped polygon– Lighting/shading values of vertex merged with texture map– For translucence, must allow light to pass through fragment

• Blending of colors uses combination of fragment colors, using colors already in frame buffer

– e.g., multiple translucent objects– Anti-aliasing also dealt with

Page 8: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Architecture Views: Geometry Path & Pixel Path

[Figure: OpenGL architecture with a geometry path and a pixel path]

• Pixel path

Page 9: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Overview

• Geometry path and pixel path

• Buffers

• Digital images– Color indexing for pseudocolor

• Bitmaps and bitblt

• Mappings – texture and environment– Spinning cube example– Multipass rendering

• Accumulation buffer– Blending, scene antialiasing, depth cue, motion blur, fog

• Antialiasing and sampling theory

Page 10: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Introduction• Historically, graphics APIs have provided access to geometric objects

– Lines, polygons, polyhedra, …– … which were rasterized into frame buffer– Earliest APIs not oriented to direct manipulation of frame buffer

• However, many techniques rely on direct interactions with frame buffer - pixels– Texture mapping, antialiasing, compositing, alpha blending, …

• OpenGL allows API access to both “geometry path” and “pixel path”– Bitmaps, pixel rectangles to rasterization and texture mapping to fragment processing

[Figure: geometry path and pixel path]

Page 11: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Buffers

• Recall, frame (as in movie frame) buffer– Holds values in (video) memory that are

continuously “scanned out” by controller

• Depth of frame (color) buffer(s) depends on n colors to be displayed

– 3 bytes/pixel standard allows 16.7m colors

• Some OpenGL buffers– “Color buffers” – frame buffer

• “Front” and “back”– Depth buffer

• Hardware storage of z/depth information to implement vsd with z-buffer algorithm

• N bits / pixel determines precision of depth resolution

• Depth / resolution of frame buffer:– m x n x k-bit memory– = say, 1000 x 1000 x (32-rgba) x (front + back)– … and X many more buffers, e.g., depth– Next slide

Page 12: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

OpenGL BuffersSaw some already and more tonight

• In practice, buffer depth typically a few hundred bits and can be much more

– Color• Front: from which image is scanned out • Back: “next” image written into• 2 x 32-color + 8-alpha

– Depth• Hidden surface removal using z-buffer

algorithm• 16 min, 32 for fp arithmetic

– Color indices• Lookup tables

– Accumulation• Blending, compositing, etc.

• Pixel at mi, nj is all bits of the k bit-planes

Page 13: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Digital Images Color-Index Mode

• Digital images are pervasive– Often “conceptually simple” but practically complex

• In program, simply an array of pixels– Focus on OpenGL– Image formats another subject

• In OpenGL can represent images with wide range of sizes and data types

• RGB image - easy– Represent each color component with a byte, 0-255; so, for a 512 x 512 image– Just an array: GLubyte rgb_image[512][512][3]

• Monochromatic or luminance image– Pixel represents gray level, 0-255– GLubyte m_image[512][512]

• Color-index mode image (more follows)– Each pixel is a pointer/cursor into a table of colors– Might use unsigned bytes for the image plus three color tables– GLubyte index_image[512][512], red[256], green[256], blue[256]– Value at index_image[i][j] provides a cursor into the color arrays
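As a minimal sketch (using the arrays declared above), a color-index image is expanded to RGB by replacing each index with its three table entries:

GLubyte rgb_out[512][512][3];
int i, j;
for (i = 0; i < 512; i++)
  for (j = 0; j < 512; j++) {
    GLubyte index = index_image[i][j];   /* cursor into the color tables */
    rgb_out[i][j][0] = red[index];
    rgb_out[i][j][1] = green[index];
    rgb_out[i][j][2] = blue[index];
  }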

Page 14: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Color Indexing:Color Look Up Table

• Pixel value is indexed to color look up table (CLUT) where color is stored – vs. e.g., RGB

– In examples, pixel value = 0..255– Just an index, not a color directly

• Maps value (used for index) to color displayed • CLUT allows variety of effects

– (also used in image formats, and in old days small memory) – pseudo coloring (weather, stress diagrams, thermograms...) – fast image changes: change table rather than stored image – multiple images: select or composite/blend – color balancing and image processing

Page 15: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Lookup Tables - Pseudocolor

• Lookup tables are essential cg techniques

– e.g., color image with pseudocolor

Pseudo-color IR image of Katrina in the Gulf from NASA GOES-12; processing by University of Wisconsin

Lodha and Verma, Western Criminology Review, 1999, http://wcr.sonoma.edu/v1n2/lodha.html

Page 16: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Creating an OpenGL Image

• In fact, can directly form images in OpenGL– Mostly simple images, as shown below– Just assign a value, here, for each of R, G and B

GLubyte check[512][512][3];   // RGB image
int i, j, k;

for (i = 0; i < 512; i++)
  for (j = 0; j < 512; j++)
  {
    for (k = 0; k < 3; k++)
      if ((8*i + j/64) % 64)
        check[i][j][k] = 255;
      else
        check[i][j][k] = 0;
  }

• What does this form?

Page 17: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Digital Images

• Writing code to form images has limited, though useful, application– E.g., have an array of real numbers from an experiment– Can scale to 0-255 to form a luminance range (see the sketch after this slide)

• For many techniques we will see tonight, will get an image from the scene that OpenGL generates, then do something with it

– E.g., obtain complete scenes from shifted COPs and combine for antialiasing

• Often use an image in a standard format– GIF, TIFF, PS, EPS, JPEG, etc.– Formats include direct coding of values in some order– Compressed but lossless coding– Compressed lossy coding– Each created for a particular need

• E.g., PS for printers (character based), • GIF, low memory display, color index
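A minimal sketch of the scaling idea (the data array and its value range are hypothetical; GLubyte comes from the OpenGL headers):

/* Scale experimental values, assumed to lie in [lo, hi], to 0-255 luminance. */
void make_luminance(const float data[512][512], GLubyte lum[512][512], float lo, float hi)
{
  int i, j;
  for (i = 0; i < 512; i++)
    for (j = 0; j < 512; j++)
      lum[i][j] = (GLubyte)(255.0f * (data[i][j] - lo) / (hi - lo));
}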

Page 18: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Image Formats and OpenGL: Not Good News

• Large number of formats poses problems for graphics APIs

• Angel:– “The OpenGL API avoids the problem by not including any image formats”

• Nice feature! (or not)– “… it is the application programmer’s responsibility to read any formatted

images into processor memory and write them out to formatted files.”

• So, no support in OGL for reading even “standard” format images!– Code available on Web– opengl.org

• Code for .bmp is available at a number of places– This is an easy format that can be uncompressed– When using, be sure image and reader are of the same .bmp variant

Page 19: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Writing in Buffers: Essential notion for discrete techniques

• Have seen this, when discussed rubber banding and XOR writes …

• Conceptually, can consider all of memory large two-dimensional array of pixels

– Frame buffer is part of this memory

• Can read and write:– Rectangular block of pixels

• Bit block transfer (bitblt) operations– Individual values

• How “write” to memory depends on writing mode

– Read destination pixel before writing source– Not just write over old with new– Might have new value be, e.g., old XOR new

Page 20: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Also, Can Combine Images/Values: OpenGL Writing Modes

• Accumulation, stencil, etc. buffers can be sources and destinations

– Use OpenGL Writing Modes

• Source and destination bits combined bitwise

• 16 possible functions

[Figure: example writing modes: replace, OR, XOR]
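A minimal sketch of selecting the XOR writing mode (e.g., for rubber banding); glLogicOp and GL_COLOR_LOGIC_OP are standard OpenGL calls:

glEnable(GL_COLOR_LOGIC_OP);   /* enable logical-operation writes to the color buffers */
glLogicOp(GL_XOR);             /* destination = source XOR destination */
/* ... draw the rubber-band primitive twice to draw it and then erase it ... */
glLogicOp(GL_COPY);            /* back to the default replace mode */
glDisable(GL_COLOR_LOGIC_OP);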

Page 21: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

OpenGL – bit block transfer: glBitmap and Example

• From Blue book …

• glBitmap -- draw a bitmap– Draws at current raster position

• void glBitmap(– GLsizei width, – GLsizei height, – GLfloat xorig, – GLfloat yorig, – GLfloat xmove, – GLfloat ymove, – const GLubyte *bitmap)

• PARAMETERS– width, height - … of bitmap image. – xorig, yorig - … loc of origin of bitmap

image– xmove, ymove - … x and y offsets to be

added to the current raster position after bitmap is drawn.

– bitmap - … address of bitmap image.

• Example - Create bitmap directly and send to current buffer at current raster position

GLubyte wb[2] = {0x00, 0xff};
GLubyte check[512];
int i, j;

for (i = 0; i < 64; i++)
  for (j = 0; j < 8; j++)
    check[i*8 + j] = wb[(i/8 + j) % 2];

glBitmap(64, 64, 0.0, 0.0, 0.0, 0.0, check);

Page 22: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

The Pixel Pipeline: What? Another pipeline?

Page 23: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

The Pixel Pipeline: What? Another pipeline?

• So far, have seen vertex pipeline– Focus on coordinate systems

• OGL has separate pipeline for pixels!– Its use can in practice be costly

• Writing pixels involves (next slide)– Moving pixels from processor memory to frame buffer– Format conversions– Mapping, lookups, tests, …

• Reading pixels– Format conversion, can be slow– In practice, fragment shaders

[Figure: geometry path and pixel path]

Page 24: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

The Pixel Pipeline

• Writing pixels involves – Moving pixels from processor memory to frame buffer– Format conversions– Mapping, lookups, tests, …

[Figure: geometry path and pixel path]

Page 25: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Buffer Selection

• OpenGL can draw into or read from any of the color buffers (front, back, auxiliary)

– Default to the back buffer– Have used “swapbuffer”– Change with glDrawBuffer and glReadBuffer

• Later, will discuss use of several buffers

• Again, format of pixels in frame buffer is different from that of processor memory and these two types of memory reside in different places

– Need packing and unpacking– Drawing and reading can be slow– In practice fragment shaders
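A minimal sketch of selecting buffers explicitly (the usual target is restored at the end):

glDrawBuffer(GL_FRONT);   /* subsequent drawing goes to the front buffer */
glReadBuffer(GL_BACK);    /* subsequent glReadPixels reads from the back buffer */
/* ... draw or read ... */
glDrawBuffer(GL_BACK);    /* restore the usual target for double buffering */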

Page 26: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Bitmaps and Masks• OpenGL treats 1-bit pixels (bitmaps) differently from multi-bit pixels (pixelmaps)

• Bitmaps are masks that determine if corresponding pixel in frame buffer is drawn with the present raster color

– 0 color unchanged– 1 color changed based on writing mode

• Textures are like bitmaps, but treated differently, e.g., where in pipeline

• Bitmaps are useful for raster text– GLUT font: GLUT_BITMAP_8_BY_13

glBitmap(width, height, x0, y0, xi, yi, bitmap)
– x0, y0: offset of the bitmap origin from the current raster position
– xi, yi: increments added to the raster position after the bitmap is drawn
[Figure: the bitmap drawn at the first raster position, which then advances to a second raster position]
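A minimal sketch of raster text with a GLUT bitmap font (the helper name is hypothetical):

void draw_text(float x, float y, const char *s)
{
  glRasterPos2f(x, y);                                 /* set the current raster position */
  while (*s)
    glutBitmapCharacter(GLUT_BITMAP_8_BY_13, *s++);    /* each call advances the raster position */
}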

Page 27: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Pixel Maps

• OpenGL works with rectangular arrays of pixels called pixel maps or images

• Pixels are in one byte (8 bit) chunks– Luminance (256 level gray scale) images 1 byte/pixel– RGB 3 bytes/pixel

• Three straightforward functions to manipulate pixels:– Read pixels: frame buffer to processor memory– Draw pixels: processor memory to frame buffer– Copy pixels: frame buffer to frame buffer

glReadPixels(x, y, width, height, format, type, myimage)
– x, y: start pixel in frame buffer
– width, height: size
– format: type of image
– type: type of pixels
– myimage: pointer to processor memory
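A minimal sketch of the three operations for a 512 x 512 RGB image:

GLubyte myimage[512][512][3];
glReadPixels(0, 0, 512, 512, GL_RGB, GL_UNSIGNED_BYTE, myimage);   /* frame buffer -> processor memory */
glDrawPixels(512, 512, GL_RGB, GL_UNSIGNED_BYTE, myimage);         /* processor memory -> frame buffer, at the raster position */
glCopyPixels(0, 0, 512, 512, GL_COLOR);                            /* frame buffer -> frame buffer */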

Page 28: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Mapping – Uses Buffers

• Use of discrete data for surface rendering is powerful – Allows efficient means for

producing “realistic” images, where algorithmic techniques too slow

– E.g., “realistic” image of orange

• Texture mapping

• Environment mapping

Page 29: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Texture Mapping

• Responsible for much of today's photorealistic cg

• Puts an image on a facet (polygon)– Using some geometry– Will see example

• Lots of variations

• OpenGL texture functions and options

[Figure: a 2-d image in texture space mapped onto geometry (x, y, z) and then to the display]

Page 30: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Texture Mapping: Simple, but …1

• Angel: “Conceptually, the texture-mapping process is simple. …”

• Textures are “patterns” (i.e., created or read in image)– Brought in (or created) as array– 1,2,3, or 4 D – here, just consider 2-D

• A texture has texels (texture elements), as a display has pixels – elts of array– Texture is T(s,t) , and s and t are texture coordinates

• Texture map associates a texel with each point on a geometric object– And that object is mapped to window coordinates on the display

[Figure: texture space T(s, t), with a texel such as (0.8, 0.4) mapped to triangle ABC in object space]

Page 31: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Texture Mapping: Simple, but …2

• Angel: “On closer examination, we face a number of difficulties.”

– In fact working with all of screen, object, texture, and parametric coordinates

– From texture coordinates to object coordinates

– Actually, because of pixel-by-pixel basis of rendering process, more interested in inverse map of screen coordinates to texture coordinates (than easier texture to screen)

– Actually, need to map not texture points to screen points, but texture areas to screen areas … which is fine

• Wealth of information available about this and other pragmatic issues

– Yet again, all this just a “survival guide”

Page 32: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Textures in OpenGL: Overview again, will see example

• Here, texture is 256 x 256 image that has been mapped to a rectangular polygon which is viewed in perspective

• Three steps to applying a texture:

1. Specify the texture• A. read or generate image• B. assign to texture• C. enable texturing

2. Assign texture coordinates to vertices• Mapping function is set in application

3. Specify texture parameters• wrapping, filtering

• Will work through an example

Page 33: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Texture Mapping and OpenGL Pipeline

• Again, big idea …

• Images and geometry flow through separate pipelines that join during rasterization and fragment processing

– “Complex” textures do not affect geometric complexity– Texture application is done “at last minute”

• Can have gigabytes of textures in texture memory– Moore’s law is good

[Figure: geometry path and pixel path joining at rasterization and fragment processing]

Page 34: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

• Define texture image from array of texels (texture elements) in memory– GLubyte my_texels[512][512][3];

• Defined as any other pixel map – which we’ve seen– Scanned image, generate by application code, etc.

• Enable texture mapping– OpenGL supports 1-4 dimensional texture maps– glEnable(GL_TEXTURE_2D)

• Define image as texture:

Applying a texture:

1. Specifying a Texture Image

glTexImage2D( target, level, components, w, h, border, format, type, texels );

target: type of texture, e.g. GL_TEXTURE_2Dlevel: used for mipmapping (discussed later)components: elements per texelw, h: width and height of texels in pixelsborder: used for smoothing (discussed later)format and type: describe texelstexels: pointer to texel array

glTexImage2D(GL_TEXTURE_2D, 0, 3, 512, 512, 0, GL_RGB, GL_UNSIGNED_BYTE, my_texels);

Page 35: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Applying a texture:

May Need to Convert Texture Image …• OpenGL requires texture dimensions to be powers of 2

• If dimensions of image are not powers of 2, then make it so– gluScaleImage(

format, w_in, h_in, type_in, *data_in, w_out, h_out, type_out, *data_out );

• data_in is source image• data_out is for destination image

• Image interpolated and filtered during scaling
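A minimal sketch (sizes are hypothetical) of scaling a non-power-of-2 image for use as a texture:

GLubyte in_image[400][600][3];    /* 600 x 400 source, not a power of 2 */
GLubyte out_image[512][512][3];   /* 512 x 512 destination */
gluScaleImage(GL_RGB,
              600, 400, GL_UNSIGNED_BYTE, in_image,
              512, 512, GL_UNSIGNED_BYTE, out_image);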

Page 36: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

• “Put the texture on the polygon”– Map texture to polygon

• Based on texture coordinates

• glTexCoord*() specified at each vert

• Extraordinary flexibility

[Figure: texture coordinates (s, t), e.g. (0.2, 0.8), (0.4, 0.2), (0.8, 0.4), in texture space assigned to vertices A, B, C of a polygon in object space]

Applying a texture:

2. Mapping Texture -> Polygon

glBegin(GL_POLYGON);
  glColor3f(r0, g0, b0);    // no shading used
  glNormal3f(u0, v0, w0);   // shading used
  glTexCoord2f(s0, t0);     // using values
  glVertex3f(x0, y0, z0);
  glColor3f(r1, g1, b1);
  glNormal3f(u1, v1, w1);
  glTexCoord2f(s1, t1);
  glVertex3f(x1, y1, z1);
  ...
glEnd();

Page 37: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Applying a texture:

3. Texture Parameters• OpenGL has variety of parameters that determine how texture applied:

– Wrapping parameters • determine what happens if s and t are outside the (0,1) range

– Filter modes • Allow use of area averaging instead of point samples

– Mipmapping • Allows use of textures at multiple resolutions

– Environment parameters • Determine how texture mapping interacts with shading

– glTexEnv{fi}[v]( GL_TEXTURE_ENV, prop, param )

– GL_TEXTURE_ENV_MODE– GL_MODULATE: modulates with computed shade– GL_BLEND: blends with an environmental color– GL_REPLACE: use only texture color
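A minimal sketch of typical parameter settings (values chosen for illustration):

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);       /* wrapping outside (0,1) */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);   /* area averaging */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);  /* point sampling */
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);        /* modulate with the computed shade */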

Page 38: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Applying a texture: Magnification and Minification

[Figure: texture vs. polygon size in magnification and minification]

• Magnification• > 1 pixel can cover a texel

• Minification • > 1 texel can cover a pixel

Page 39: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Spinning Textured Cube Code

Page 40: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Spinning Textured Cube Code

• Angel demo

• Spinning cube

• Mouse to select which axis about which to rotate

• Change of rotation during idle

• Now, – Generate textures– Map textures on to cube

faces

Page 41: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Overview tex-cube.c

• 1. Data stuctures

• 2. Polygon definitions (colored)

• 3. Object definition

• 4. Display function

• 5. main – Initialize bitmap image, – standard glut elements, – texture application

• Spincube, mouse, reshape, key

Page 42: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

1. Data Structures, 2. Polygon Defs

// Data Structures
GLfloat planes[] = {-1.0, 0.0, 1.0, 0.0};
GLfloat planet[] = {0.0, -1.0, 0.0, 1.0};
GLfloat vertices[][3] = {{-1.0,-1.0,-1.0}, {1.0,-1.0,-1.0}, {1.0,1.0,-1.0}, {-1.0,1.0,-1.0},
                         {-1.0,-1.0,1.0}, {1.0,-1.0,1.0}, {1.0,1.0,1.0}, {-1.0,1.0,1.0}};
GLfloat colors[][4] = {{0.0,0.0,0.0,0.5}, {1.0,0.0,0.0,0.5},   // RGBA
                       {1.0,1.0,0.0,0.5}, {0.0,1.0,0.0,0.5},
                       {0.0,0.0,1.0,0.5}, {1.0,0.0,1.0,0.5},
                       {1.0,1.0,1.0,0.5}, {0.0,1.0,1.0,0.5}};

// Polygon definitions
void polygon(int a, int b, int c, int d)
{
  glBegin(GL_POLYGON);
    glColor4fv(colors[a]); glTexCoord2f(0.0, 0.0); glVertex3fv(vertices[a]);
    glColor4fv(colors[b]); glTexCoord2f(0.0, 1.0); glVertex3fv(vertices[b]);
    glColor4fv(colors[c]); glTexCoord2f(1.0, 1.0); glVertex3fv(vertices[c]);
    glColor4fv(colors[d]); glTexCoord2f(1.0, 0.0); glVertex3fv(vertices[d]);
  glEnd();
}

Page 43: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

3. Object Definition, 4. Display

// Object onto which to map textures
void colorcube()
{
  /* map vertices to faces */
  polygon(0,3,2,1); polygon(2,3,7,6); polygon(0,4,7,3);
  polygon(1,2,6,5); polygon(4,5,6,7); polygon(0,1,5,4);
}

static GLfloat theta[] = {0.0, 0.0, 0.0};
static GLint axis = 2;

// display, function named in main
void display()
{
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glLoadIdentity();
  glRotatef(theta[0], 1.0, 0.0, 0.0);   // have seen this spin strategy before
  glRotatef(theta[1], 0.0, 1.0, 0.0);
  glRotatef(theta[2], 0.0, 0.0, 1.0);
  colorcube();
  glutSwapBuffers();
}

Page 44: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

5. Main - Overview

// main – Initialize bitmap image, standard glut elements, texture application
int main(int argc, char **argv)
{
  GLubyte image[64][64][3];
  int i, j, c;

  // Initialize "bitmap" image – easy way to get an image!
  for (i = 0; i < 64; i++)
    for (j = 0; j < 64; j++) {
      c = ((((i&0x8)==0)^((j&0x8))==0))*255;
      image[i][j][0] = (GLubyte) c;
      image[i][j][1] = (GLubyte) c;
      image[i][j][2] = (GLubyte) c;
    }

  // Standard glut setup with hidden surface removal
  glutInit(&argc, argv);
  glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
  glutInitWindowSize(500, 500);
  glutCreateWindow("colorcube");
  glutReshapeFunc(myReshape);
  glutDisplayFunc(display);
  glutIdleFunc(spinCube);
  glutMouseFunc(mouse);
  glEnable(GL_DEPTH_TEST);

  // Enable texture mapping and specify how to apply texture
  glEnable(GL_TEXTURE_2D);
  glTexImage2D(GL_TEXTURE_2D, 0, 3, 64, 64, 0, GL_RGB, GL_UNSIGNED_BYTE, image);
  glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
  glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
  glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

  // Final setup and start loop
  glutKeyboardFunc(key);
  glClearColor(1.0, 1.0, 1.0, 1.0);
  glutMainLoop();
}

 

Page 45: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

main - Init Internal Image, Setup• // main – Init. bitmap image, std glut elements, texture application

int main(int argc, char **argv){ GLubyte image[64][64][3]; int i, j, c;

// Here, would be file read

// Create “bitmap” image – essentially mix/blend colors

// across faces of cubes, with checkerboard for(i=0;i<64;i++) { for(j=0;j<64;j++) { c = ((((i&0x8)==0)^((j&0x8))==0))*255; image[i][j][0]= (GLubyte) c; image[i][j][1]= (GLubyte) c; image[i][j][2]= (GLubyte) c; } }

// Standard glut setup with hidden surface removal glutInit(&argc, argv); glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH); glutInitWindowSize(500, 500); glutCreateWindow("colorcube"); glutReshapeFunc(myReshape); glutDisplayFunc(display); glutIdleFunc(spinCube); glutMouseFunc(mouse); glEnable(GL_DEPTH_TEST);

}

 

Page 46: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

main - Apply Texture and Run• // main – Init bitmap image, std glut elements, texture application

// Enable texture mapping and specify how to apply texture, as earlier glEnable(GL_TEXTURE_2D);

glTexImage2D(GL_TEXTURE_2D,0,3,64,64,0,GL_RGB,GL_UNSIGNED_BYTE, image);

glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_REPEAT); glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_REPEAT); glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST); glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);

// Final setup and start loop glutKeyboardFunc(key); glClearColor(1.0,1.0,1.0,1.0); glutMainLoop();}

 

Page 47: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Idle, mouse, reshape and key funcs. Named in Main – Same as in Previous Program

• // Idle function (named in main called when idle)

// Here, no delay

void spinCube(){ theta[axis] += 2.0; if( theta[axis] > 360.0 ) theta[axis] -= 360.0; glutPostRedisplay();}

// mouse function (named in main)void mouse(int btn, int state, int x, int y){ if(btn==GLUT_LEFT_BUTTON && state == GLUT_DOWN) axis = 0; if(btn==GLUT_MIDDLE_BUTTON && state == GLUT_DOWN) axis = 1; if(btn==GLUT_RIGHT_BUTTON && state == GLUT_DOWN) axis = 2;}

// reshape function (named in main)void myReshape(int w, int h){ // standard change to ortho, etc.}

// key handlingvoid key(unsigned char k, int x, int y){ if(k == '1') glutIdleFunc(spinCube); if(k == '2') glutIdleFunc(NULL); if(k == 'q') exit(0);}

Page 48: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Environment Maps1

Page 49: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Environment Maps: Introduction

• Theme: – Global models are slow for interactive cg,

so use efficient and “good enough” models

• But, have seen that many elements of photorealism can’t be captured by fast

– Interactive cg models limited by just following projectors from object to eye/cop

– Global models, e.g., follow rays from the light source, through all of its interactions with objects, to eye/cop

• Ok, actually other way around

• Environment, or reflection, maps improve on simple object to eye projects

• Use multipass rendering (a new big idea)• To capture some elements of global model,

e.g. reflection from mirror– Use texture mapping

• To (re)introduce into image that which is not captured at first, but on later pass

• Map mirror reflection onto object

Page 50: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Environment Maps2 steps to construct, multi-pass rendering, e.g., mirror in scene

• E.g., want to have mirror in scene – so far, local can’t handle

– Polygon with a highly specular surface

• Pos. of viewer and poly (mirror) orientation known– So, can calculate angle of reflection

• Can follow angle until intersect environment– And obtain shade that is reflected in mirror

• “What the mirror sees”– Shade result of shading process that involves light

sources and materials in scene

• Two step rendering pass to do this:– 1. Render scene without mirror polygon with camera

placed at center of the mirror pointed in the direction of the normal of the mirror

• “What the mirror sees” (or, is seen by an observer of mirror)• A map of the environment

– 2. Use this image to obtain shades (texture values) to place on mirror polygon for a 2nd rendering with mirror in scene

• Place texture map from 1st pass onto mirror in 2nd pass

[Figure: scene rendered from the COP, and from the "mirror COP" at the center of the mirror polygon]
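A minimal sketch of the two passes, assuming the first-pass image is captured into a texture with glCopyTexImage2D; the camera and drawing helpers are hypothetical:

GLuint mirror_tex;
glGenTextures(1, &mirror_tex);

/* Pass 1: render what the mirror sees, then copy the frame buffer into a texture. */
set_camera_at_mirror();                 /* hypothetical: COP at mirror center, looking along the mirror normal */
draw_scene_without_mirror();            /* hypothetical */
glBindTexture(GL_TEXTURE_2D, mirror_tex);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, 256, 256, 0);

/* Pass 2: render the full scene from the real COP, texturing the mirror polygon. */
set_normal_camera();                    /* hypothetical */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
draw_scene_with_textured_mirror(mirror_tex);   /* hypothetical */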

Page 51: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Sphere Mapping - Briefly

Improving Environment Mapping

• Difficulties with environment mapping:– 1. Images obtained with 1st pass are not quite

right (formed without one of objects (mirror) in the environment!)

– 2. Projection problem: Issue of what surface to project the scene in 1st pass and where camera should be placed

• Potentially, want all information in scene• For, e.g., mirror movement, in which would see

different parts of environment in successive frames, so “simple” projection not work

• Spherical (sphere) mapping– Project environment onto a sphere centered at

eye/cop• Object appears as would without this step

– projectors do not change

– So, e.g., change in position of object easy to effect

• Can read spherical image from file to use with OpenGL texture mapping

– OpenGL is supplied circle image and maps appropriately

Page 52: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Cube Maps and OpenGL - Again, briefly

Improving on sphere mapping

• Sphere mapping leads to distortions at edges

• Cube mapping uses projection to cube, rather than sphere

– Use 6 projections for sides• Reflective objects

– Then treat as environment map• With different texture mappings, of course

– Uses appropriate blending at edges

• Basic OpenGL steps– Load image or generate image
• glTexImage2D(…)
• Or, say, 6 calls to glTexImage2D, one per cube face
– Let OGL generate coordinates:
• glTexGeni(…, GL_SPHERE_MAP) for sphere maps, or (…, GL_REFLECTION_MAP) for cube maps
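A minimal sketch of letting OpenGL generate the coordinates (assumes OpenGL 1.3-style fixed-function calls; face_px is a hypothetical image pointer):

/* Sphere map: 2-D texture plus generated (s, t). */
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);

/* Cube map: six faces, then reflection-map coordinates in s, t, r. */
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGB, 64, 64, 0,
             GL_RGB, GL_UNSIGNED_BYTE, face_px);     /* ... and the other five faces */
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_REFLECTION_MAP);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_REFLECTION_MAP);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_REFLECTION_MAP);
glEnable(GL_TEXTURE_GEN_S); glEnable(GL_TEXTURE_GEN_T); glEnable(GL_TEXTURE_GEN_R);
glEnable(GL_TEXTURE_CUBE_MAP);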

Page 53: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Compositing Techniques - Blending

Page 54: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Compositing Techniques - Blending

• So far, concerned with forming single image using only opaque polygons

• Compositing techniques allow combining of objects (fragments, pixels)

• Translucent (and transparent) objects possible as well

– And done in hardware, so efficient– Makes use of pixel operations using

raster operations

• Alpha (α) blending– 4th channel in RGBA color– Can control α for each pixel– If blending enabled, the value of α controls how RGB values are written into the frame buffer

– Objects are “blended” or “composited” together

Page 55: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Writing Model for Blending - 2

• Use A component of RGBA (or RGB) color to store opacity (transparency)

• During rendering can expand writing model to use RGBA values

• Opacity– measure of how much light penetrates– Opacity of 1 (α = 1) for an opaque surface that blocks all light, and α = 0 for transparent

• Subtractive model (more later)

Page 56: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Writing Model for Blending

• Use A component of RGBA (or RGB) color to store opacity

• During rendering can expand writing model to use RGBA values

[Figure: blending writing model: the source component scaled by the source blending factor is blended with the destination component in the color buffer scaled by the destination blending factor]

Page 57: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Opacity and Blending

• Again, opacity:– measure of how much light penetrates– Opacity of 1 (α = 1) for an opaque surface that blocks all light, and α = 0 for transparent

• Middle poly opaque, front translucent ->– Middle (opaque) blocks any view of the farthest– Front translucent

• "Translucent" = partially opaque or transparent, 0 < α < 1

• Color seen at overlap is blending of its color and color of farther poly

– Easy with gray!– Red front and middle blue, would be magenta

– Were middle translucent, would see blending of three colors

• Compositing/blending of colors– Like stacking glass filters

• Gives different color and higher opacity• Subtractive color model

– Not like combining wave lengths of light• Additive color model

Joe Pardue, www.codeproject.com/KB/GDI-plus/CsTranspTutorial1.aspx

Page 58: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Clamping and Accuracy (FYI)

• All the components (RGBA) are clamped and stay in the range (0,1)

• However, in a typical system, RGBA values are only stored to 8 bits– Can easily lose accuracy if we add many components together

• Example: add together n images• Divide all color components by n to avoid clamping• Blend with source factor = 1, destination factor = 1• But division by n loses bits

Page 59: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

About Blending – Compositing: Some details, overview

• Render polygons one at a time into frame buffer– Opacity as part of fragment processing– Visible pixel-sized fragments are assigned colors based on shading model– Consider both contents of frame buffer (destination) and color of fragment (source)

• Some terminology…

• For blending source (fragment) and destination (frame buffer), let– s = [ sr sg sb sα ]– d = [ dr dg db dα ]  ---- frame buffer before compositing

• Using blending factors for:– Source, b = [ br bg bb bα ]– Destination, c = [ cr cg cb cα ]

• Compositing operation replaces d with new values, d':– d' = [ br·sr + cr·dr,  bg·sg + cg·dg,  bb·sb + cb·db,  bα·sα + cα·dα ]– After compositing

• Will see details next

Page 60: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Image Compositing – Step by Step (Optional)

• Again, source (fragment) and dest. (frame buffer)– s = [ sr sg sb sα ] and d = [ dr dg db dα ]

• Blending factors for:– Source, b = [ br bg bb bα ]; Destination, c = [ cr cg cb cα ]

• Compositing operation replaces d (current frame-buffer values):– d' = [ br·sr + cr·dr,  bg·sg + cg·dg,  bb·sb + cb·db,  bα·sα + cα·dα ]

• Image compositing – combine and display several images as pixel maps– Each image element (pixel) i has image components Ci = (Ri, Gi, Bi)  ----- just what the image pixel value is

• Say, n images contribute equally to final display – two ways– 1. Replace Ci by (1/n)Ci and αi by 1/n, and simply add each image into the frame buffer

• Just "reduce intensity" of color and opacity proportionally by changing the image values
– 2. Use a source blending factor of 1/n

• Set the α value for each pixel in each image to 1/n• Use 1 for the destination blending factor and α for the source blending factor

• Both ways work fine! – but there is often hardware support for the second– How ya gonna know …

Page 61: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

OpenGL Blending - Order Matters (Optional)

• To do blending, 1st enable blending:– glEnable(GL_BLEND)

• Set up source and destination blending factors:– glBlendFunc(source_factor, dest_factor)– E.g., GL_ONE, GL_ZERO, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, …DST

• But, order in which we render polygons affects the final image!

• E.g., if we use the source α, αs, as the source blending factor and 1-αs as the destination blending factor, the color and opacity are

– (Rd', Gd', Bd', αd') = (αsRs + (1-αs)Rd, αsGs + (1-αs)Gd, αsBs + (1-αs)Bd, αsαs + (1-αs)αd)

– Ensures that neither colors nor opacities saturate: αs + (1-αs) <= 1– But, given two polygons, A and B, either might be rendered (put in the frame buffer) 1st, and so be the destination– Then, when the 2nd is rendered and the blending occurs, it is the α of the 2nd that determines how the two are combined– The α used for blending might be that of B if A is rendered 1st, or that of A if B is rendered 1st

• Solution depends on modification to how z-buffer is used

Page 62: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Recall, Z-Buffer Alg – Buffers in OGL

• Recall, frame buffer…– Screen refreshed one scan line at a time, from

pixel information held in frame buffer– Actually, OpenGL calls “color” buffer(s) and there

are both front and back

• Additional buffers to store other information– e.g., Double buffering

• Depth buffer for hidden-surface removal– Typically a few bytes deep and stores depth (z)

values of polygons that are written into that buffer

• Hardware implementation of algorithm

• In OpenGL can choose to render with or without hidden surface removal

Page 63: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Depth Comparison using Z-BufferRecall

• Z-buffer initialized to background value:– furthest plane of view volume, e.g, 255

• Polys scan-converted in arbitrary order– If new point has z val < what in z-buff

• i.e., closer to the eye• its z-value is placed in the z-buffer and its color

placed in color buffer at same (x,y); – Else

• previous z-value and color buffer val unchanged

• Can store depth as ints or floating point– Range of vals determines precision of z

resolution

Z-buffer init. Z vals of 1st poly 1 poly in z-buffer

Also color buffer values updated

Page 64: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Blending in OpenGL: Modification to Z-buffer use to handle transparency

• Besides order, another problem when combine opaque and translucent objects

• Normally, do not enable hidden-surface removal when blending– Polys behind poly already rendered would not be rasterized and so not contribute to scene!

• I.e., poly behind translucent poly eliminated and so not have color to blend with trans one in front!• E.g., below, say green poly is translucent

• But, do want polygons behind opaque polygons not rendered– Can effect this by enabling hidden-surface removal, and making the z-buffer read-only for any poly that is translucent– For all translucent polygons (α < 1.0), set glDepthMask(GL_FALSE)

• Will render translucent polygon (including blending with existing color), and update color buffer• But, not write new values for z into z-buffer at these points

– I.e., even though the translucent polygon is in front of (smaller z) something, don’t change z-buffer value• Result is that as polys are written in, color values continue to be updated (based on alpha)

– And order matters!
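A minimal sketch of this ordering, with hypothetical drawing helpers:

glEnable(GL_DEPTH_TEST);
draw_opaque_objects();                               /* hypothetical: normal z-buffered rendering */

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);                               /* z-buffer becomes read-only */
draw_translucent_objects();                          /* hypothetical: blended; order still matters */
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);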

Page 65: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Antialiasing and Blending

Page 66: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Antialiasing and Blending

• Major use of the α-channel is antialiasing

• Default width of a line is 1 pixel– But, unless horizontal or vertical, the line covers a number of pixels

• During geometry processing of a fragment, might set an α-value for the corresponding pixel– Range 0 – 1, reflecting the amount of the pixel covered by the fragment– In line drawing, pixel-by-pixel examination anyway– Use α to modulate color as rendered to the frame buffer– Destination blending factor of 1 - α and source factor of α

Page 67: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Antialiasing and Blending – Overlap: Can just set OpenGL to handle

• No overlap, opaque background with frame buffer value C0– At start, can set α = 0, as no part of the pixel is yet covered by fragments from polygons– A fragment that covers the entire pixel (α1 = 1) will have its color assigned to the destination pixel, and the dest pixel will be opaque– For fragments not covering all of the pixel, as the polygon is rendered:
– Color of dest pixel: Cd = (1 - α1)C0 + α1C1
– α-value: αd = α1

• If fragments overlap, must blend colors– Cd = (1 - α2)((1 - α1)C0 + α1C1) + α2C2– αd = α1 + (1 - α1)α2

• To have OpenGL take care of antialiasing:– glEnable(GL_POINT_SMOOTH)– glEnable(GL_LINE_SMOOTH)– glEnable(GL_POLYGON_SMOOTH)– glEnable(GL_BLEND)– glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)

Page 68: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

More Buffer Uses

• Depth cueing • Fog• Accumulation buffer• Scene antialiasing• Motion blur• Depth of field• Stencil buffer

Page 69: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Depth Cueing

• Depth cueing early cg technique– Human depth perception in fact arises from

many elements• Occlusion, motion parallax, stereopsis, etc.

• Drawing lines farther from viewer dimmer

• Also, atmospheric haze

Page 70: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Depth Cueing and Fog

• Extend basic ideas– Create illusion of partially translucent space

between object and viewer– Blend in a distance-dependent color

• Technique: let,– f = fog factor– z = distance viewer to fragment– Cs = color of fragment– Cf = color of fog

• Then, – Cs’ = fCs + (1-f)Cf

• How f varies determines how perceived– Linearly, depth cueing– Exponentially, more like fog

• OpenGL supports linear, exponential and Gaussian fog densities and user specified color

Page 71: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Fog - OpenGL Example

• For Cs’ = fCs + (1-f)Cf

• OpenGL has straightforward calls to set parameters

– linear, exponential and Gaussian fog densities

• To set up a fog density function f = e^(-0.5 z^2):

GLfloat fcolor[4] = {…};

glEnable(GL_FOG);
glFogfv(GL_FOG_COLOR, fcolor);
glFogi(GL_FOG_MODE, GL_EXP);
glFogf(GL_FOG_DENSITY, 0.5);

Page 72: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Accumulation Buffer: to composite sets of images

• Consider problem of compositing set of images, each with typical 1 byte color

• Could simply add RGB values into usual 32-bit RGBA color buffer, …– But, this results in color components that overflow single byte storage– Vals > 255, result in clamping buffer val at 255– Image appears “washed out”

• Can avoid overflow by scaling color values before add together– E.g., for 64 images, just scale (divide) each image color component by 64– Then, adding together would not reach val > 255

• But, this technique would reduce color resolution– Scaled values have maximum range of only 0..3!– Only 2 bits color resolution– Vs. original range 0..255, 8 bits color resolution

• Solution – accumulation buffer– Same spatial resolution (m x n) as frame buffer– Greater (color) depth resolution – more bytes of storage for each pixel– Typically, enough storage for floating point values with good precision– Moore’s law is good

Page 73: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Using OpenGL Accumulation Buffer: For image compositing

• Using accumulation buffer to solve color resolution problem

• Same methods as available for any buffer

• Algorithm:– User’s function display_image below generates sequence of images into write buffer– Each added (or accumulated) into accumulation buffer

• With scale factor 1 over the number of images– At end, accumulated image is copied back to write buffer

• Code:

glClear(GL_ACCUM_BUFFER_BIT);
for (i = 0; i < num_images; i++) {
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  display_image();
  glAccum(GL_ACCUM, 1.0 / (float) num_images);
}
glAccum(GL_RETURN, 1.0);

• BTW, bitwise or above …

Page 74: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

BTW – OR of Parameters: A programming technique

• OR of parameter constants is standard method in various APIs of indicating that multiple operations are to be performed

• E.g., glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);– Goal is efficiency of function calls

• Single call to set several parameters using coded param, vs. multiple, e.g., Bool (1 bit) vals

• Let each bit of some n-bit value represent an operation, e.g.,• 100000000 LIB_CLEAR_BUF• 010000000 LIB_DEPTH_ONE_BYTE • 001000000 LIB_DEPTH_TWO_BYTES• 000100000 LIB_READ_ONLY• …• 000000001 LIB_whatever

– Then, might have call like:• libSetBuffer (LIB_CLEAR_BUF | LIB_DEPTH_TWO_BYTES | LIB_READ_ONLY);

– Evaluate OR of the values, bit by bit

• 100000000 LIB_CLEAR_BUF• 001000000 LIB_DEPTH_TWO_BYTES • 000100000 LIB_READ_ONLY• ----------------• 101100000 what passed by function
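A minimal sketch in C of the same idea, using the hypothetical LIB_ constants above:

#define LIB_CLEAR_BUF        0x100   /* 100000000 */
#define LIB_DEPTH_TWO_BYTES  0x040   /* 001000000 */
#define LIB_READ_ONLY        0x020   /* 000100000 */

void libSetBuffer(unsigned int flags)
{
  if (flags & LIB_CLEAR_BUF)       { /* clear the buffer  */ }
  if (flags & LIB_DEPTH_TWO_BYTES) { /* use 2-byte depth  */ }
  if (flags & LIB_READ_ONLY)       { /* mark it read-only */ }
}

/* Caller:  libSetBuffer(LIB_CLEAR_BUF | LIB_DEPTH_TWO_BYTES | LIB_READ_ONLY);   passes bits 101100000 */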

Page 75: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Scene Antialiasing using "jittering"

• Have seen example of antialiasing of lines using “distributed” illumination

• Also, have seen that can (today) perform multiple renderings, e.g., for environment mapping, within time constraint of ~1/30 second

• Another technique for antialiasing is to perform multiple renderings, – Each with slightly (< 1 pixel) different viewer positions (COPs)– “Jitter” the viewing position

• Easy, just change parameters in gluPerspective or glOrtho– Each time a new rendering is generated, it creates slightly different aliasing artifacts

• Supersampling, in sampling theory terms– If average together resulting images, aliasing effects are smoothed out

• Can jitter viewer (few times) and generate entire scene (a few times)

• Then, use accumulation buffer as in example to, now, average color values

• Moore's law is good– Feasibility (computational tractability) of algorithms and techniques changes with hardware advances– For computer graphics, change happens fast
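A minimal sketch of jittered multipass rendering into the accumulation buffer (assumes a double-buffered window with an accumulation buffer, a 500 x 500 viewport, and a hypothetical draw_scene()):

GLfloat jitter[4][2] = { {0.25, 0.25}, {0.75, 0.25}, {0.25, 0.75}, {0.75, 0.75} };
int i;

glClear(GL_ACCUM_BUFFER_BIT);
for (i = 0; i < 4; i++) {
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  /* shift the image by a fraction of a pixel: one pixel = 2.0/width in clip coordinates */
  glTranslatef(jitter[i][0] * 2.0 / 500.0, jitter[i][1] * 2.0 / 500.0, 0.0);
  gluPerspective(45.0, 1.0, 1.0, 100.0);
  glMatrixMode(GL_MODELVIEW);

  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  draw_scene();                              /* hypothetical */
  glAccum(GL_ACCUM, 1.0 / 4.0);
}
glAccum(GL_RETURN, 1.0);
glutSwapBuffers();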

Page 76: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Motion Blur

• Motion blur– Objects in scene are blurred along path they are

taking

• Use accumulation buffer– Jitter an object and render it multiple times– Leaving positions of other object unchanged, get

dimmer copies of jittered objects in final image– Each is at slightly different location, so sum is

less, and so illumination level lower

• If object moved along a path, rather than randomly jittered, see the trail of the object

• Can adjust constant in glAccum to render object final position with greater opacity or to create the impression of speed differences

http://www.sgi.com/products/software/opengl/examples/glut/advanced/

Page 77: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Depth of Focus (Field)

• Eye’s lens changes shape to focus objects (at different distances) on retina

– Result is other objects are out of focus / blurred• Both closer and farther

– In normal vision, attention shifts constantly, refocusing eyes

• Focus effects important in separating foreground and background objects

• Depth of field– Area in which objects a certain distance from

lens are in focus

• Cg systems fundamentally different in that have infinite depth of field

– Everything is in focus!• Nothing “blurry” unless explicitly made so

– Not photorealistic– BTW, perhaps a “fatal flaw” for stereoscopy

Page 78: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Depth of Field and OpenGL

• Creating depth of field effects can be done by having some elements (at some depth range) in focus and the rest not (or not so much)

• Can again use jitter and accumulation buffer

• Move viewer in a manner that leaves a particular plane in focus

• To keep z = zf in focus– Move viewer from origin in x direction Dx– For frustum elts, specify near clipping plane, for each:

• as x'min = xmin + (Dx/zf)(zf - znear)– As Dx and Dy increase, narrower depth of field

http://www.sgi.com/products/software/opengl/examples/glut/advanced/

Page 79: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Stencil Buffer – Last One!

• Often used for masking

• In writing to the frame buffer, stencil buffer values are compared against a reference value to determine whether the write occurs

– Sounds like the accumulation buffer use– Provides an alternative way

• Can use to mask off part of viewport– E.g., dashboard panel in car - on scene– Just 0’s and 1’s in stencil

• Just enable and write to it

• Can also draw into stencil buffer, then use what drawn in as mask

– E.g., just replace what would be drawn at some locs with what already in stencil
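A minimal sketch of stencil masking (assumes the window was created with a stencil buffer, e.g., GLUT_STENCIL; drawing helpers are hypothetical):

glEnable(GL_STENCIL_TEST);

glClear(GL_STENCIL_BUFFER_BIT);
glStencilFunc(GL_ALWAYS, 1, 1);                   /* always pass while building the mask */
glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);  /* write 1s into the stencil */
draw_dashboard_shape();                           /* hypothetical: the masked-off region */

glStencilFunc(GL_EQUAL, 0, 1);                    /* scene passes only where stencil == 0 */
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);           /* leave the stencil unchanged */
draw_scene();                                     /* hypothetical */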

Page 80: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Aliasing and Sampling

Page 81: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Aliasing and Sampling1

• Term “aliasing” comes from sampling theory

• Consider trying to discover nature of sinusoidal wave (solid line) of some frequency

• Measure at a number of different times

• Use to infer shape/frequency of wave form

• If sample very densely (many times in each period of what measuring), can determine frequency and amplitude

http://www.daqarta.com/dw_0haa.htm

Page 82: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Aliasing and Sampling2

• But, if samples taken at too low a rate, cannot capture attributes of the sine wave (underlying function)

• Examples at right illustrate– Actual function is solid line– Squares indicate when sampled– Dotted line waveforms from sample

• Dotted line waveforms are aliases of solid line (underlying function)

– Though samples creating dotted lines could have come from true waveforms of these amplitudes and frequencies, they don’t

– E.g., 14.1 hz signal sampled 14 times/sec • Result seems same as if a 0.1 Hz signal

were sampled 14 times per second, • So, 0.1 Hz said to be an "alias" of 14.1 Hz

• Nyquist sampling theorem– Samples of continuous function contain all

information in original function iff cont. function is sampled at frequency greater than twice highest frequency in function

Page 83: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Aliasing and Sampling in CG

• Aliasing in computer graphics arises from sampling effects

• For cg, samples (of visual elements) are taken spatially, i.e., at different points

– Rather than temporally, at different times– E.g., for line, a pixel is set based on pixel

center and relation of line to pixel at one point – the pixel’s center

• In effect, line sampled at this one point– How “well”, or densely, can be sampled

depends on, here, spatial resolution (dpi) – No information is obtained about the line’s

presence in other portions of the pixel• As, “not looking densely enough” within the

pixel region• So, should test more points there to see which

ones line is covering

Density of Spatial Sampling

Page 84: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Antialiasing in CG

• Yet, higher resolution does not eliminate problem of insufficient spatial sampling

– Still are “representing analog world on a discrete device”

• Antialiasing techniques involve one form or another of "blurring" to perceptually "smooth" the image for the human viewer

– E.g, jitter of scene

• Can differentially color/shade pixels as they differ from “ideal” line

• Make discontinuity of “jaggie” less noticeable

– E.g., gray pixels near the border (using α amounts)

– Decreases illumination discontinuity to which eye sensitive

Antialiasing

Page 85: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Temporal Aliasing

• What is a temporal “jaggie”?– When animation appears “jerky”– E.g., frame rate too low (<~15 frames/sec)

• How might the problem be solved?– Sample more frequently

• Motion is continuous• A single frame is discrete• To increase frame rate is increase the temporal sampling density

• However, more to temporal aliasing?– http://www.michaelbach.de/ot/mot_wagonWheel/index.html– Frequency (rate of spin) varies 0-120, and sampled at ~24 fps

• Depending on system

Page 86: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Temporal Aliasing

• Recall, spatial alias– E.g., 14.1 hz signal sampled

14 times/sec • Result seems same as if a

0.1 Hz signal were sampled 14 times per second,

• So, 0.1 Hz said to be an "alias" of 14.1 Hz

• Analog for temporal sampling– In example 24 fps is

sampling rate– “stroboscopic effect” at

multiples of sampling rate– Wheel has 0 rotational rate

when sampled at frequency of display

Page 87: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

Temporal Aliasing

• Why appear to go backwards?

• Again, sampling rate …– Say, sampled just a bit “earlier”

in original each time• Sampling frequency a bit less than

rotational frequency

– “rotates a bit less than a full revolution per sampling”

• Human perceptual system combines/integrates images and perceives motion

• In this case, resulting in a “wrong” perception

– Except that we now know that it is the “right” perception given the understanding of sampling rate

– Nature of aliases• Here, a sequence of images in time

– And the perceptual system is just integrating a series of images …


Page 88: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

FYI - A Second Factor1

• To accomplish animation (with observer perceiving smooth motion) need to

– Display element at point a– Display element at point a’– Repeat (at a pretty fast frame rate)

• Sequence of static pictures is then perceived as a smoothly moving object

• But, limitation on “throughput” (information) – How much data can be displayed to user per unit

time?– Here, amount that an object can be moved before

it becomes confused with another object in the next frame

– Correspondence problem

• Let Δ = distance between pattern elements– Distance at which a subsequent display of an element is "right on top of" the next

[Figure: rows of pattern elements a, b, c]

Page 89: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

FYI - A Second Factor2

• Let Δ = distance between pattern elements– Distance at which a subsequent display of an element is "right on top of" the next

• Δ/2 (in practice minus a bit; Δ/3 empirically) is the maximum displacement/inter-frame movement for the element before the pattern is more likely to be seen as moving in the reverse direction than intended

• When elements are identical, the brain constructs correspondences based on object proximity in successive frames

– Sometimes called the "wagon-wheel" effect; recall old western films

– With Δ/3 and a frame rate of 60 fps, have an upper bound of 20 messages per second

• Can increase by, e.g., using different colors (b) and shapes (c)

[Figure: identical elements (a), elements distinguished by color (b), and by shape (c)]

Page 90: Discrete Techniques Angel: Chapter 7 OpenGL Programming and Reference Guides, other sources, ppt from Angel, AW, etc. CSCI 6360

End