
CS-M67 – PART 2
INTRODUCTION TO GRAPHICS WITH OPENGL

Dr. Benjamin Mora, Swansea University

CONTENT

- Evolution of Graphics.
- Graphics Cards and Hardware (ATI, NVidia, Intel).
- Scene Modeling.
- OpenGL.
- Z-Buffer.
- Setting up the hardware.
- Shading Languages.
- What about DirectX?
- Advanced Rendering Techniques:
  - Multi-pass rendering for shadows and lighting.
  - Ambient Occlusion.
  - Displaced Subdivisions.
  - Global Illumination on GPUs.

EVOLUTION OF GRAPHICS


60'S

- 1959: De Casteljau algorithm. Smooth curve modelling for designing cars.

- 1960: The term "Computer Graphics" coined by William Fetter.

- 1962: Bezier curves. Mathematical redefinition of the De Casteljau algorithm, in an automobile context.

- 1962: Bresenham's algorithm. Drawing lines on raster displays.

70'S

- 1970s: Evans and Sutherland computers. Graphics-oriented computers.
- 1971: Gouraud shading. Better shading.
- 1972: Pong.

- 1973: Phong shading, published in Phong's thesis. Illumination model plus interpolation technique.
- 1974: Edwin Catmull's thesis: z-buffering, texturing, Catmull-Rom splines.
- 1976: Futureworld. Used computer graphics effects.

- 1978: Catmull-Clark and Doo-Sabin subdivisions.
- 1978: Space Invaders.
- 1979: Catmull launches the Lucasfilm computer graphics group that later becomes Pixar.

80'S

- 1980: Pac-Man.
- 1982: SGI created by Jim Clark. IRIS 1000 (1984); the Indigo, Indy and Octane series will follow; IRIX OS.
- 1980s: More computer games. 8-, 16- and 32-bit consoles. Sprites used.
- 1982: Commodore 64. 6510 @ 1 MHz.
- 1984: Amstrad CPC 464, 664, 6128. Z80 @ 4 MHz.
- 1985: Atari 520 ST. Amiga 1000, Motorola 68K @ 7.16 MHz, 16/32 bits; co-processors, blitter, line drawing, area filling.

- 1984: J. Kajiya and Von Herzen, "Ray Tracing Volume Densities". Realistic volume rendering.
- 1984: Goral et al., radiosity (global illumination for diffuse-only surfaces).
- 1986: J. Kajiya, "The Rendering Equation". First general global illumination algorithm.
- 1987: Lorensen and Cline, the Marching Cubes algorithm. Triangulation of an isosurface.
- 1988: Marc Levoy, "Display of Surfaces from Volume Data". Efficient shading for volume rendering.

90'S

- 1991: Terminator 2. CG-made characters.
- 1992: Wolfenstein 3D. Ray-casting.
- 1993: Doom. Improved graphics.
- 1990s: Graphics APIs. GDI, DirectX (1995), Glide (3Dfx).
- 1992: OpenGL 1.0 (SGI).
- 1991: S3 Graphics. 2D accelerator cards for PCs.
- 1995: 3D accelerators. S3 ViRGE, ATI Rage, Matrox Mystique.
- 1995: PCs become more powerful than consoles and other computers.

- 1996: Voodoo (3Dfx) accelerator cards for PCs. Faster than anything else.
- 1996: SGI buys Cray.
- 1997: Instant Radiosity technique by A. Keller. Hardware-accelerated global illumination.
- 1995: Toy Story by Pixar. First full CG movie.

- 1998: Antz (DreamWorks) and A Bug's Life (Pixar). Better graphics and animations.
- 1998: Voodoo 2. SLI.
- 1999: Voodoo 3, NVidia Riva TNT 2, ATI Rage 128, Matrox G400. Roughly the same power; Direct3D 5/6.
- 1999: GeForce 256. Transform and Lighting; Direct3D 7 compatible.

2000'S

- 2001: Final Fantasy. More realism than ever before.
- From 2000: Arms race between ATI and NVidia.

GRAPHICS HARDWARE

(I.E., JUST MASSIVELY SIMD CPUS ;-) )


NVIDIA SERIES: VERTICES/S AND TRANSISTORS

[Chart: millions of vertices/s and millions of transistors across the NVidia series, from the GeForce 256 (September 1999) through the GeForce 2, 3, 4, fx 5800, 6800 and 7800 to the GeForce 8800 GTX (November 2006), both axes climbing toward the high hundreds.]

NVIDIA SERIES: TEXELS/S

[Chart: millions of texels/s across the same series, from the GeForce 256 (September 1999) to the GeForce 8800 GTX (November 2006), reaching tens of thousands.]

SCENE MODELLING


MODELLING

- Mainly triangles, points and lines with OpenGL.

MODELLING

- Storage of triangles: a vertex array holds the (x, y, z) coordinates of each vertex (P1, P2, P3, …), and a triangle array holds, for each triangle, the indices of its three vertices (Vert1, Vert2, Vert3). Vertices shared by several triangles are stored only once.
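The indexed layout above can be sketched in C as follows (a minimal sketch; the struct and function names are hypothetical, not OpenGL API):

```c
#include <assert.h>

/* A vertex stores x, y, z; a triangle stores three indices into the
   shared vertex array, so a vertex used by several triangles is
   stored only once. */
typedef struct { float x, y, z; } Vertex;
typedef struct { int v0, v1, v2; } Triangle;

/* Two triangles forming a quad need only 4 vertices instead of 6. */
static const Vertex quad_verts[4] = {
    {0, 0, 0}, {1, 0, 0}, {1, 1, 0}, {0, 1, 0}
};
static const Triangle quad_tris[2] = { {0, 1, 2}, {0, 2, 3} };

/* Fetch the i-th corner (0..2) of a triangle through its indices. */
static Vertex corner(const Vertex *verts, Triangle t, int i)
{
    int idx = (i == 0) ? t.v0 : (i == 1) ? t.v1 : t.v2;
    return verts[idx];
}
```

Besides saving memory, the indexed form lets the hardware reuse the result of transforming a shared vertex.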

MODELLING

- Bezier curves: curved surfaces are now widely supported.
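The De Casteljau evaluation behind Bezier curves (mentioned in the timeline) can be sketched in C; one-dimensional coordinates for brevity, and the helper name is hypothetical:

```c
#include <assert.h>

/* De Casteljau: repeatedly lerp neighbouring control points until a
   single point -- the curve value at t -- remains.  Works for any
   degree below 16 (scratch-buffer size assumed here). */
static float bezier1d(const float *ctrl, int n, float t)
{
    float p[16];
    for (int i = 0; i < n; i++) p[i] = ctrl[i];
    for (int k = n - 1; k > 0; k--)          /* shrink the polygon */
        for (int i = 0; i < k; i++)
            p[i] = (1.0f - t) * p[i] + t * p[i + 1];
    return p[0];
}
```

Running the same recurrence per coordinate (x, y, z) evaluates a curve in space.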

MODELLING: DISPLACEMENT MAPPING

[Images © Fantasy Lab and Wizards of the Coast, from GDC'06 slides by Peter-Pike Sloan, Microsoft Corporation.]

OPENGL


WHAT IS OPENGL?

- OpenGL is a C graphics library for rendering (i.e., displaying) 3D objects such as triangles.
- Conceived in such a way that most routines are hardware-accelerated.
- Similar to Direct3D (DirectX, Microsoft) on many points.
- OpenGL 1.0 was introduced in 1992 by SGI (Silicon Graphics, Inc.).
- Current version: 4.3.
- Available now on every PC equipped with a decent graphics card (NVidia, ATI, also Intel…).

WHAT IS OPENGL?

- Pros:
  - Industry standard (not only used in games).
  - Hardware-accelerated.
    - Specially designed GPUs.
    - Software-based versions are far slower.
  - Portable (Linux, Windows, Macintosh, etc.).
  - Lots of documentation.
    - www.opengl.org
    - The OpenGL Programming Guide, 4th Edition: The Official Guide to Learning OpenGL, Version 1.4.
    - OpenGL forums.
  - Still evolving: more and more options for more realistic renderings.

WHAT IS OPENGL?

- Cons:
  - State machine: awkward programming, difficult to debug, and difficult for beginners!
    - Supposed to get better from the 3.0 release.
  - Computer graphics concepts must be learned.
  - Brute-force approach: hardware-accelerated, but not optimized.
  - Not very flexible: some computer graphics algorithms cannot take advantage of OpenGL.
  - Not the only way to display objects: processing a huge amount of data can be faster in software under some conditions.

WHAT IS OPENGL?

- OpenGL does not contain a window manager, so an OpenGL rendering context (i.e., an OpenGL window) must be created through another library (MFC, GLUT, FLTK, GTK…).
- To use OpenGL, the code should include the correct libraries and headers:
  - #include <gl/gl.h>
  - OpenGL extensions must be included separately.
- OpenGL is mainly used in a C/C++ context, but bindings of the API for other languages exist (OpenGL for Java, Delphi,…).

HOW DOES OPENGL WORK?

- Aim: rendering (i.e., displaying) a 3D scene.
- Objects are "rendered" (drawn) into an invisible frame buffer (image).
- When all the objects have been processed, the frame buffer becomes the new image appearing in your window (double buffering). Ideally at 30 fps or more.

HOW DOES OPENGL WORK?

- Rendering an image needs two main steps.
- First, setting up the parameters:
  - Telling OpenGL where the camera (viewpoint) is located.
  - Telling OpenGL about the lights: position of the lights; type of lighting, colors.
  - Many other parameters according to the needs.
- Second, telling the system where the primitives (basic shapes like triangles that represent the object) are located in space.
  - The previous parameters will be taken into account for rendering.
  - If the object is opaque, primitives can be drawn in any order.
  - Every time a primitive is declared, OpenGL rasterizes (draws) it into the frame buffer, and keeps only its visible parts using the z-buffer test.

HOW DOES OPENGL WORK?

- The vertices sent to the OpenGL machine follow a projection pipeline (primitive, then camera system, then projection), before an on-screen rasterization (area filling) using a scan-line algorithm.

HOW DOES OPENGL WORK?

- Example: primitives enter the graphics pipeline when their vertex (point) locations are specified.

[Figure: the initial frame buffer, then the frame buffer after the first primitive's rasterization, then after the second primitive's rasterization.]

DISPLAYING PRIMITIVES WITH OPENGL

OPENGL PRIMITIVES

[Figure: the OpenGL primitive types and their vertex orderings: GL_POINTS, GL_LINES, GL_LINE_STRIP, GL_LINE_LOOP, GL_TRIANGLES, GL_TRIANGLE_STRIP, GL_TRIANGLE_FAN, GL_QUADS, GL_QUAD_STRIP, GL_POLYGON.]

SENDING PRIMITIVES TO OPENGL

- A primitive mode must be set up first (e.g., glBegin(GL_TRIANGLES)).
- Vertex coordinates must then be sent to the pipeline.
  - The glVertex functions; see later for code!
- GL_TRIANGLES case: every time 3 consecutive vertices have been sent to the pipeline, a new triangle rasterization step happens.

PRIMITIVE RASTERIZATION

- Every pixel covered by the projection is called a "fragment".
- For every fragment, many parameters like the colors and transparency (RGBA), depth, and the texture coordinates are linearly interpolated.
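The per-fragment interpolation can be sketched in C for one edge of a primitive (a simplified sketch; real rasterizers interpolate over the whole triangle, and the struct name is hypothetical):

```c
#include <assert.h>

/* A few of the attributes interpolated per fragment. */
typedef struct { float r, g, b, depth; } FragAttr;

/* Linear interpolation between the attributes at two endpoints;
   t in [0,1] is the fragment's relative position along the edge. */
static FragAttr lerp_attr(FragAttr p, FragAttr q, float t)
{
    FragAttr o;
    o.r     = (1.0f - t) * p.r     + t * q.r;
    o.g     = (1.0f - t) * p.g     + t * q.g;
    o.b     = (1.0f - t) * p.b     + t * q.b;
    o.depth = (1.0f - t) * p.depth + t * q.depth;
    return o;
}
```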

PRIMITIVE RASTERIZATION

- Different types of (i.e., precisions for) colors can be used with OpenGL: [black value..white value]
  - GL_UNSIGNED_BYTE: [0..255].
  - GL_UNSIGNED_SHORT: [0..65535].
  - GL_FLOAT: [0..1.0].
- At some point, colors will be clamped to [0..1].

OPENGL PIPELINE AND MATRIX TRANSFORMATIONS

BASIC PIPELINE


Daniel Weiskopf, Basics of GPU-Based Programming,

http://www.vis.uni-stuttgart.de/vis04_tutorial/vis04_weiskopf_intro_gpu.pdf

(REGULAR) OPENGL PIPELINE

- 4D vertex coordinates are expressed in the camera coordinate system by applying a sequence of (4D) transformations.
- The pipeline and matrices are fixed before processing a flow of vertices.

[Figure: a vertex (x, y, z, w) in world coordinates goes through the Modelview Matrix (eye/camera coordinates), the Projection Matrix (clip coordinates), the Perspective Division (normalized device coordinates), and the Viewport Transformation (window coordinates).]

OPENGL PIPELINE

[Figure: an object as stored in memory, placed in the world coordinate system (origin o, axes x and z).]

OPENGL PIPELINE

1. Modelview transform: moving objects in space.
2. Projection transform: takes the camera into consideration. Coordinates are now expressed in the camera coordinate system.
3. The w division followed by the viewport transform then occur to find the vertex projections. Once this is done for a sufficient number of vertices (e.g., 3 for a triangle), object rasterization can happen.

OPENGL PIPELINE

- OpenGL is a state machine.
- The matrices must be set before sending the graphics primitives (e.g., triangles) into the pipeline.
- 2 main matrix stacks actually:
  - GL_MODELVIEW
    - Used to specify the camera position and/or the model position.
  - GL_PROJECTION
    - Used to specify the projection type, and clip coordinates.
- Before applying transformations, the relevant stack must be specified:
  - Setting the current matrix to modelview: glMatrixMode(GL_MODELVIEW);
  - Use glLoadIdentity() to reset it.

OPENGL PIPELINE

- The current matrix is the matrix on top of the stack.
- void glPushMatrix();
  - Makes a copy of the matrix on top of the stack and puts it on the top.
- void glPopMatrix();
  - Removes the top matrix from the stack.
- Loading the identity matrix: glLoadIdentity();
  - Needed before starting to apply transforms (rotation/translation/scaling/etc.).
- Stacks are used because they are useful for specifying relative coordinate systems.
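The push/pop behaviour can be sketched in C (a minimal sketch of the semantics only, with hypothetical names; the real stack lives in the driver and has an implementation-defined depth):

```c
#include <assert.h>
#include <string.h>

/* Minimal modelview-style matrix stack: push duplicates the top
   entry so it can be modified and later discarded by pop, restoring
   the parent coordinate system.  4x4 column-major, as in OpenGL. */
#define STACK_MAX 32
static float mstack[STACK_MAX][16];
static int mtop = 0;

static void load_identity(void)
{
    static const float I[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    memcpy(mstack[mtop], I, sizeof I);
}

static void push_matrix(void)   /* copy the top onto the stack */
{
    memcpy(mstack[mtop + 1], mstack[mtop], sizeof mstack[0]);
    mtop++;
}

static void pop_matrix(void)    /* discard the top */
{
    mtop--;
}
```

This is why a hierarchy (e.g., an arm attached to a body) is easy to draw: push, transform, draw the child, pop, and the parent's matrix is back.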

OPENGL PIPELINE: MATRICES

- The modelview matrix can be used both to set the camera location and to move objects in space.
- The projection matrix should be used to specify either an orthographic projection or a perspective projection.
  - Use either glOrtho or glFrustum to specify them.
  - Used for normalizing coordinates between -1 and 1.
  - Required for the quantization of the z value.

Rotation matrix created from an angle and an axis (x, y, z) through the origin, with c = cos(angle), s = sin(angle), and ||(x, y, z)|| = 1:

| x²(1-c)+c     xy(1-c)-zs    xz(1-c)+ys    0 |
| xy(1-c)+zs    y²(1-c)+c     yz(1-c)-xs    0 |
| xz(1-c)-ys    yz(1-c)+xs    z²(1-c)+c     0 |
| 0             0             0             1 |

Translation matrix:

| 1  0  0  tx |
| 0  1  0  ty |
| 0  0  1  tz |
| 0  0  0  1  |

PROJECTION MATRIX: GLORTHO

- Need to specify a 6-face box with 6 parameters.
- Normalized coordinates between -1 and 1 after this stage.
- void glOrtho(GLdouble left, GLdouble right, GLdouble bottom, GLdouble top, GLdouble near, GLdouble far);

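What the glOrtho matrix does to a point can be sketched directly in C (a sketch with a hypothetical helper; eye space looks down -z, so z in [-near, -far] maps to [-1, 1]):

```c
#include <assert.h>

/* Apply the glOrtho(l, r, b, t, n, f) mapping to an eye-space
   point: each axis of the box is scaled and translated to [-1, 1]. */
static void ortho_point(float out[3], const float in[3],
                        float l, float r, float b, float t,
                        float n, float f)
{
    out[0] = 2.0f * (in[0] - l) / (r - l) - 1.0f;
    out[1] = 2.0f * (in[1] - b) / (t - b) - 1.0f;
    out[2] = (-2.0f * in[2] - (f + n)) / (f - n);
}
```

With left = bottom = 0, right = top = 2, near = 1, far = 3, the near-bottom-left corner (0, 0, -1) lands on (-1, -1, -1).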

PROJECTION MATRIX: GLFRUSTUM

- void glFrustum(GLdouble left, GLdouble right, GLdouble bottom, GLdouble top, GLdouble near, GLdouble far);
- Non-linearity: the resulting z value is not linear in eye-space depth. [Figure.]

PERSPECTIVE DIVISION

- Once the vertices have been transformed, we need to know their projection into the image space.
- Image coordinates are still expressed in the range [-1, 1].
- (x, y, z, w) becomes (x/w, y/w, z/w, 1): x/w and y/w are the projections in [-1..1], and z/w is the fragment depth.
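The division itself is a one-liner per component (sketch with a hypothetical helper name):

```c
#include <assert.h>

/* Clip coordinates (x, y, z, w) -> normalized device coordinates
   (x/w, y/w, z/w); z/w becomes the depth used by the z-buffer. */
static void perspective_divide(float ndc[3], const float clip[4])
{
    ndc[0] = clip[0] / clip[3];
    ndc[1] = clip[1] / clip[3];
    ndc[2] = clip[2] / clip[3];
}
```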

VIEWPORT

- Maps the normalized x and y coordinates to a portion of the image.
- void glViewport(GLint x, GLint y, GLsizei width, GLsizei height);

[Figure: the viewport rectangle from (x, y) to (x+width, y+height) inside the image, origin at (0, 0).]
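The viewport mapping in C (a sketch with a hypothetical helper; the depth-range part of the transform is omitted):

```c
#include <assert.h>

/* glViewport(x, y, w, h): map NDC [-1, 1] onto the window
   rectangle [x, x+w] x [y, y+h]. */
static void viewport_map(float win[2], const float ndc[2],
                         int x, int y, int w, int h)
{
    win[0] = x + (ndc[0] + 1.0f) * 0.5f * w;
    win[1] = y + (ndc[1] + 1.0f) * 0.5f * h;
}
```

The NDC center (0, 0) lands in the middle of the viewport, e.g. (400, 300) for an 800x600 viewport at the origin.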


Z-BUFFER TEST

- Example:

[Figure: the final image and the corresponding final z-buffer.]

Z-BUFFER TEST

- Once the vertices are projected, rasterization occurs.
- Primitives can be sent to the graphics hardware in any order, thanks to the z-buffer test that keeps the nearest fragments.
- A z value is stored for every pixel (the z-buffer).
- Algorithm:
  If the rasterized z value is less than the current z value,
  then replace the previous color and z value with the new ones.

STENCIL TEST

- A stencil buffer can be used for implementing complex algorithms (e.g., shadow volumes).
- A value is associated with every pixel.
- The stencil test is performed after the z-test and compares the current stencil value with a reference value.
- The stencil value can possibly be incremented every time a fragment passes the stencil test.


http://www.opengl.org/resources/tutorials/advanced/advanced97/notes/node196.html

LIGHTING.


OPENGL LIGHTING MODEL

- In real life, every material has its own reflection properties (called its BRDF, i.e., Bidirectional Reflectance Distribution Function).
- Illumination is computed from the light source position, the surface orientation and the camera position.
- Illumination is computed at the vertices, and then interpolated for every fragment.

OPENGL LIGHTING MODEL (PHONG)

- 3 components:
  - Ambient: does not depend on the light source.
  - Diffuse: the light is evenly reflected in all directions.
  - Specular: the reflection is predominant in a given direction. The specular coefficient can be varied.
- The light intensity can decrease with distance (constant, linear or quadratic attenuation). Real world: quadratic decrease!


OPENGL LIGHTING MODEL

- OpenGL formula: [equation not recoverable from the slide]
- A "light" can be created, specifying all the required constants for the light.
- A "material" can be created in OpenGL, specifying all the constants for the surface.

MODEL LIMITATIONS

- No native shadows.
  - Shadows can be implemented by combining specific algorithms and texture mapping.
- Phong shading is just an imperfect model.
  - Global (i.e., realistic) illumination cannot be done efficiently.
- Better to interpolate normals first, and then compute shading.
- Faked normals: the actual surface derivatives do not usually match the specified normal.

TEXTURE MAPPING

- OpenGL interpolates the texture coordinates for every rasterized fragment and then fetches the pixel from the texture.
- Textures are stored on the graphics board and are highly optimized.
- Textures must be loaded first.
- Texture coordinates must be given to vertices at the same time as vertex normals and vertex coordinates.
- See next slides.

INITIALIZATION OF THE OPENGL MACHINE AND OPENGL CODING

OPENGL CODING

- First, set the parameters:
  - Viewpoint (camera).
  - Lights.
  - Materials.
  - Textures.
  - Etc.
- Second, tell OpenGL the primitives to render, possibly specifying for every vertex its:
  - Colors.
  - Normals.
  - Texture coordinates.
  - Extra properties.

OPENGL CODING

- #include <gl/gl.h>
- Enabling some specific features:
  - void glEnable(GLenum cap);
  - void glDisable(GLenum cap);
  - glEnable(GL_DEPTH_TEST); //Enable the z-buffer
  - glEnable(GL_LIGHTING); //Enable lighting
  - glEnable(GL_LIGHT0); //Enable light 0 (at least 8 lights available)
  - glEnable(GL_TEXTURE_2D); //Enable 2D textures
  - …

OPENGL CODING

- Specifying the current matrix:
  - void glMatrixMode(GLenum mode);
  - glMatrixMode(GL_MODELVIEW);
  - glMatrixMode(GL_PROJECTION);
- Initializing the current matrix:
  - void glLoadIdentity();
- Handling the matrix stack:
  - void glPushMatrix(); //New copy on the top of the stack
  - void glPopMatrix(); //Remove the top of the stack

OPENGL CODING

- Rotations (modifying the current matrix):
  - void glRotated(GLdouble angle, GLdouble x, GLdouble y, GLdouble z);
  - void glRotatef(GLfloat angle, GLfloat x, GLfloat y, GLfloat z);
- Translations (modifying the current matrix):
  - void glTranslated(GLdouble x, GLdouble y, GLdouble z);
  - void glTranslatef(GLfloat x, GLfloat y, GLfloat z);
- Scaling (modifying the current matrix):
  - void glScaled(GLdouble x, GLdouble y, GLdouble z);
  - void glScalef(GLfloat x, GLfloat y, GLfloat z);

OPENGL CODING

- Erasing the frame buffer:
  - void glClearColor(GLclampf red, GLclampf green, GLclampf blue, GLclampf alpha); //Defines the clear color
  - void glClear(GLbitfield mask); //Clears the buffers
- Example:
  - glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_ACCUM_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

OPENGL CODING

- Sending graphics primitives to the OpenGL machine:

glBegin(GL_LINES); //Specify lines. Could be GL_TRIANGLES, etc.
//First vertex
glColor3f(1.0, 0., 0.); //3 float colors for the first vertex
glNormal3f(0.707, 0.707, 0); //First normal
glTexCoord2f(0, 0); //First texture coordinate
glVertex3f(500, 100, 2); //First vertex
//Second vertex
glColor4f(1.0, 0., 0., 1.); //4 float colors (last value: opacity)
glNormal3fv(v); //Takes a vector of floats as parameter
glTexCoord2f(1, 1); //Second texture coordinate
glVertex3d(500, 100, 2); //double instead of float
glEnd(); //End of the vertex flow

OPENGL CODING

- Initializing lighting:

void GLRenderer::InitLighting()
{
    float ambient[4] = {0.2, 0.2, 0.2, 1.};
    float diffuse[4] = {0.7, 0.7, 0.7, 1.};
    float specular[4] = {1, 1, 1, 1.};
    float exponent = 8;
    // glMatrixMode(GL_MODELVIEW);
    // glLoadIdentity();
    // Be careful here: the lights go through the OpenGL transform pipeline
    float lightDir[4] = {0, 0, 1, 0};
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_AMBIENT, ambient);
    glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuse);
    glLightfv(GL_LIGHT0, GL_SPECULAR, specular);
    glLightf(GL_LIGHT0, GL_SPOT_EXPONENT, exponent);
    glLightfv(GL_LIGHT0, GL_SPOT_DIRECTION, lightDir);
}

OPENGL CODING

- Initializing texturing:

unsigned int *textureId = new unsigned int[nbOfTextures];
glGenTextures(nbOfTextures, textureId);
for (i = 0; i < nbOfTextures; i++)
{
    glBindTexture(GL_TEXTURE_2D, textureId[i]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, textureDimensionsX[i],
                 textureDimensionsY[i], 0, GL_RGB, GL_UNSIGNED_BYTE, texture[i]);
}

SHADING LANGUAGES: VERTEX AND FRAGMENT PROGRAMS

VERTEX AND FRAGMENT PROGRAMS


INTRODUCTION

- OpenGL (SGI) early on oriented the design of current graphics processors (GPUs).
- Fixed pipeline: once the different tests are passed, the old fragment color is replaced by the new (textured and interpolated) one.
  - Not realistic enough.
- The graphics pipeline is fed with primitives like triangles, points, etc., that are rasterized.
- Two main stages:
  - Vertex processing.
  - Fragment (rasterized pixel) processing.
- These 2 stages have been extended for more realism.

INTRODUCTION

- Some recent evolutions: unified shaders.
  - Automatic balancing of the graphical units between vertex and fragment programs.
  - The lower the image size, the more CPU- and vertex-bound the program is.
  - The greater the image size, the more fragment/pixel-bound the program is.
  - Anti-aliasing and texture filtering parameters also contribute to this.
- Geometry, domain and hull shaders are discussed separately.

PROGRAMMING THE GPU

- Low-level languages (pseudo-assembler):
  - Help to understand what is possible on the GPU.
  - Large code is a pain to maintain/optimize.
  - May be specific to the graphics card generation/supplier.
- High-level languages:
  - Easier to write.
  - Early compilers were not very good.
  - Code may be more compatible.
  - Loops.

CURRENT LOW-LEVEL LANGUAGES (APIS)

- DirectX class:
  - Vertex shader 2.0.
  - Pixel shader 2.0.
- OpenGL and its extensions:
  - GL_ARB_vertex_program.
  - GL_ARB_fragment_program.
- Vendor APIs.

HIGH-LEVEL LANGUAGES (APIS)

- DirectX 10, 11: compute shaders.
- Microsoft, ATI: High Level Shading Language (HLSL).
- NVidia: Cg (deprecated).
- OpenGL Shading Language (GLSL): 4.4.
  - GLSL ES 3.0 (no tessellation, limited precision, etc.).

HOW TO USE THEM?

- Assembly programs:
  - Can be loaded (and compiled) at run-time (OpenGL).
  - Several programs can be loaded at once.
    - Apply the suitable rendering style (i.e., program) to every scene primitive.
    - Prevents high latency due to pseudo-assembly compilation.
- High-level programs:
  - Are usually compiled before run-time.
  - The resulting (pseudo-)assembly code can then be used.

VERTEX AND FRAGMENT PROGRAMS

[Figure: Vertices -> Transform and Lighting -> Setup -> Rasterization -> Texture Fetch, Fragment Shading -> Tests (z, stencil…) -> Frame Buffer Blending. Vertex programs replace the fixed transform-and-lighting stage with user-defined vertex processing; fragment programs replace fragment shading with user-defined per-pixel processing.]

VERTEX PROGRAMS

- Vertex program:
  - Bypasses the T&L unit.
  - GPU instruction set to perform all vertex math.
  - Input: arbitrary vertex attributes.
  - Output: transformed vertex attributes:
    - Homogeneous clip-space position (required).
    - Colors (front/back, primary/secondary).
    - Fog coordinate.
    - Texture coordinates.
    - Point size.

VERTEX PROGRAMS

- Customized computation of vertex attributes.
- Computation of anything that can be interpolated linearly between vertices.
- Limitations:
  - Vertices can neither be generated nor destroyed.
    - Geometry shaders exist for that.
  - No information about the topology or ordering of vertices is available.

FRAGMENT PROGRAMS

- Similar to vertex programs.
  - Same way to load programs.
  - Inputs and outputs are different.
- Different set of instructions.
  - More instructions, but they tend to be the same.
- Versions available: 1.0, 2.0, and 4.0.
- 64 constant vector registers.
- 32 32-bit floating-point registers or 64 16-bit floating-point registers.

OPENGL SHADING LANGUAGE (GLSL)

GLSL INTRODUCTION

- Introduced with OpenGL 2.0.
- High-level language.
  - The real shader implementation is hidden inside drivers.
- C/C++-like coding, with some differences.
  - Stronger typing.
- The language is still heavily influenced by current hardware design.
  - Still somewhat messy…
  - Compatible with future pipelines.
- Replaces the fixed vertex and pixel pipelines.
- Geometry shader available as an extension.
- OpenGL 3.0 adds some functionality.

HOW TO CODE/DEBUG SHADING LANG.

- Understanding of the whole pipeline is needed.
- Thorough analysis of the algorithm/solution first.
  - Favor simple solutions unless there are big performance issues.
- Start with a very simple shader that executes a simple task.
- Test and iterate your code until done with your task.
- Frame rate scales with code efficiency.
  - Thoroughly analyze your problem again…
  - Check for redundancies, memory accesses, etc.
  - Use textures for emulating/pre-computing complex functions.

HOW TO CODE/DEBUG

- Debugging:
  - Again, difficult.
  - Can test variables/temporary results by returning specific colors.
- Tools:
  - RenderMonkey (ATI).
  - glslDevil: http://www.vis.uni-stuttgart.de/glsldevil/
  - gDEBugger (30-day trial version).

SHADER LOADING

- Shaders should ideally be stored in a separate text file.
  - The file needs to be loaded into your program as a string (char *) variable, and then sent to the API.
- Several different shaders can be stored by the API and interchanged during rendering.
  - Cannot be changed between glBegin(…) and glEnd() calls.

In myProgram.c:

char *myVertexProgram;
LoadText(myVertexProgram, "VertexProgram.shader");
//Use now the OpenGL 2.0 API
//to compile and enable the program

In VertexProgram.shader:

void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

SHADER LOADING

- Loading a shader object (vertex or fragment) requires several steps:
  - Create a shader object: glCreateShader.
  - Associate the source code with the shader object: glShaderSource.
  - Compile the shader: glCompileShader.
  - Attach the shader to a program object (a container for shaders): glAttachShader.
  - Link the compiled shaders into the program: glLinkProgram.
  - Replace the fixed pipeline with the program object: glUseProgram.

SHADER LOADING: EXAMPLE

char *myVertexProgram;
char *myFragmentProgram;
GLuint vShader, fShader, program;

vShader = glCreateShader(GL_VERTEX_SHADER);
fShader = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(vShader, 1, &myVertexProgram, NULL);
glShaderSource(fShader, 1, &myFragmentProgram, NULL);
glCompileShader(vShader);
glCompileShader(fShader);
//Source strings can now be deleted

SHADER LOADING: EXAMPLE

program = glCreateProgram(); //Creates a program object
glAttachShader(program, vShader);
glAttachShader(program, fShader);
glLinkProgram(program);
glUseProgram(program);
//Can come back to the fixed pipeline by passing 0 instead
//Don't forget:
//Objects must be deleted when not needed anymore

VARIABLES AND TYPES


TYPES

- Simple types.
- Structures (struct keyword) and arrays are possible.
- No pointers!
- Implicit conversion is generally not possible.
- Scalar types: float, bool, int.
  - int is at least in the range [-65535, 65535].
- Declaration: float f, g = 1.0;

TYPES

- Vector types:
  - vec2, vec3, vec4: vectors of floats.
  - bvec2, bvec3, bvec4: vectors of booleans.
  - ivec2, ivec3, ivec4: vectors of integers.
- Declaration: vec3 v = vec3(1.0, 0.0, 3.0);
- Vector components:
  - .xyzw, for vectors representing positions.
  - .rgba, for vectors representing colors.
  - .stpq, for vectors representing texture coordinates.
  - The designation is not compulsory.

TYPES

- Swizzling examples:

float f;
vec4 v;
vec2 v2 = v.ww;
vec3 v3 = v.xzy;
v2 = vec2(3.0, -1.0);
v2 = texture1D(sampler, coordinate).xy;
v = v + f; //f is added to the 4 components of v!
v += v; //Component-wise addition

TYPES

- Matrices (of floats, square): mat2, mat3, mat4.
- mat4 m;
  vec4 v = m[2];
  float f = m[2][2];
- Rows and columns are inverted in the OpenGL convention (column-major)!
  - m[2] is the third column of the matrix.
- Don't use oversized vectors and matrices if not required.

TYPES

- Structures:

struct light {
    vec3 position;
    vec3 color;
    float watt; //could actually be stored with color
};
light myLight;

- No typedef!

TYPES

- Arrays:
  - vec3 vertices[20];
  - vec3 vertices2[]; //Also possible. Size must be determinable at compilation time. See manual & specs.
- Special case: the texture coordinate array, internally declared as:
  - varying vec4 gl_TexCoord[];

TYPES

- Samplers: texture variables.
  - sampler1D, sampler2D, sampler3D, samplerCube, sampler1DShadow, sampler2DShadow.
- Declaration:

uniform sampler2D brick;
vec4 col = texture2D(brick, texCoordinate);

TYPES: SAMPLERS

� Example

� C/C++ core program:

glActiveTexture(GL_TEXTURE0);

glBindTexture(GL_TEXTURE_2D, marbleTex);

texLoc=glGetUniformLocation(myProgram, "marbleTexture");

glUniform1i(texLoc,0);

� Vertex Program:

varying vec2 coord;

coord = gl_MultiTexCoord0.st; //Get the tex coordinates.


TYPES: SAMPLERS

� Example

� Fragment Program:

varying vec2 coord;

uniform sampler2D marbleTexture; //texture object.

gl_FragColor = texture2D(marbleTexture, coord);


TYPES

� Qualifiers:

� attribute

� For frequently changing variables, typically what would be passed to OpenGL

between glBegin(…) and glEnd().

� Built-in attributes include gl_Vertex, gl_Normal,…

� uniform

� For not-so-frequently changing variables, typically what would be passed to

OpenGL outside of a glBegin(…)/glEnd() section.

� Changed at most once per primitive.

� Read-only.


TYPES

� Qualifiers:

� varying

� For variables passed from the vertex shader to the fragment shader.

These variables undergo a linear interpolation.

� Variable declaration must be consistent across the vertex and fragment

programs.

� Perspectively correct.

� const

� Variable value fixed at compilation time. Cannot be modified.

� The first 3 qualifiers can only be used on global variables.

� No qualifier means a read/write variable local to the

shader.
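A minimal sketch putting the three qualifiers together (variable names hypothetical):

```glsl
// Vertex program
attribute vec3 objectTangent; // per-vertex data, set between glBegin/glEnd
uniform float time;           // not-so-frequently changing data
varying vec3 tangent;         // interpolated, then read by the fragment program
void main() {
    tangent = objectTangent * (0.5 + 0.5 * sin(time));
    gl_Position = ftransform();
}
```

```glsl
// Fragment program: the varying declaration must match the vertex program
varying vec3 tangent;
void main() {
    gl_FragColor = vec4(normalize(tangent), 1.0);
}
```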


TYPES

� Functions:

� Functions can be written locally.

� Call by value-return.

� Parameter qualifiers:

� in

� out

� inout

� const

� In addition to previous qualifiers

� Example:

� float norm(in vec3 v) {…
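The truncated example above might continue as follows (one possible body, not the original):

```glsl
// Euclidean norm of a vector, passed by value (in qualifier)
float norm(in vec3 v) {
    return sqrt(dot(v, v)); // same result as the built-in length(v)
}
```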


BUILT-IN VARIABLES

� GLSL pre-declares many (useful) variables.

� Input/Output variables are used for communication between programs.

� Additional attributes can also be specified.

� Implementation dependent.

� A minimum number defined by OpenGL.

� glGet(GL_MAX_VERTEX_ATTRIBS);

� See later.


PREDEFINED VERTEX VARIABLES

� attribute vec4 gl_Color;

� attribute vec4 gl_SecondaryColor;

� attribute vec3 gl_Normal;

� attribute vec4 gl_MultiTexCoord0;

� attribute vec4 gl_MultiTexCoord1;

� const int gl_MaxTextureCoords;

� …

� attribute vec4 gl_FogCoord;


VERTEX OUTPUT VARIABLES

� vec4 gl_Position;

� vec4 gl_ClipVertex;

� float gl_PointSize;


VERTEX VARYING OUTPUT VARIABLES

� varying vec4 gl_FrontColor;

� varying vec4 gl_BackColor;

� varying vec4 gl_FrontSecondaryColor;

� varying vec4 gl_BackSecondaryColor;

� varying vec4 gl_TexCoord[];

� float gl_FogFragCoord;


SPECIAL FRAGMENT INPUT VARIABLES

� varying vec4 gl_Color;

� varying vec4 gl_SecondaryColor;

� varying vec4 gl_TexCoord[];

� varying float gl_FogFragCoord;


SPECIAL FRAGMENT INPUT VARIABLES

� bool gl_FrontFacing;

� vec4 gl_FragCoord;


FRAGMENT OUTPUT VARIABLES

� vec4 gl_FragColor;

� vec4 gl_FragData[gl_MaxDrawBuffers];

� float gl_FragDepth;

� //gl_FragCoord.z by default

� These variables have a global scope.


BUILT-IN CONSTANTS

� const int gl_MaxClipPlanes;

� const int gl_MaxCombinedTextureImageUnits;

� const int gl_MaxFragmentUniformComponents;

� const int gl_MaxVertexAttribs;

� const int gl_MaxVaryingFloats;

� const int gl_MaxDrawBuffers;

� const int gl_MaxTextureCoords;

� const int gl_MaxTextureUnits;

� const int gl_MaxTextureImageUnits;

� const int gl_MaxVertexTextureImageUnits;

� const int gl_MaxLights;


BUILT-IN UNIFORM VARIABLES

� uniform mat4 gl_ModelViewMatrix;

� uniform mat4 gl_ModelViewProjectionMatrix;

� uniform mat4 gl_ProjectionMatrix;

� uniform mat4 gl_TextureMatrix[gl_MaxTextureCoords];

� uniform mat4 gl_ModelViewMatrixInverse;

� uniform mat4 gl_ModelViewProjectionMatrixInverse;

� uniform mat4 gl_ProjectionMatrixInverse;

� uniform mat4 gl_TextureMatrixInverse[gl_MaxTextureCoords];


BUILT-IN UNIFORM VARIABLES

� uniform mat4 gl_ModelViewMatrixTranspose;

� uniform mat4 gl_ModelViewProjectionMatrixTranspose;

� uniform mat4 gl_ProjectionMatrixTranspose;

� uniform mat4 gl_TextureMatrixTranspose[gl_MaxTextureCoords];

� uniform mat4 gl_ModelViewMatrixInverseTranspose;

� uniform mat4 gl_ModelViewProjectionMatrixInverseTranspose;

� uniform mat4 gl_ProjectionMatrixInverseTranspose;

� uniform mat4 gl_TextureMatrixInverseTranspose[gl_MaxTextureCoords];

� uniform mat3 gl_NormalMatrix;

� uniform float gl_NormalScale;


BUILT-IN UNIFORM VARIABLES

� struct gl_LightSourceParameters {

vec4 ambient;

vec4 diffuse;

vec4 specular;

vec4 position;

vec4 halfVector;

vec3 spotDirection;

float spotExponent;

float spotCutoff;

float spotCosCutoff;

float constantAttenuation;

float linearAttenuation;

float quadraticAttenuation;

};

� uniform gl_LightSourceParameters gl_LightSource[gl_MaxLights];

� etc…
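For instance, the classic distance attenuation can be computed from these built-in parameters (a sketch for light 0 only; the helper name is hypothetical):

```glsl
// Attenuation factor for gl_LightSource[0] at distance d from the light
float attenuation(in float d) {
    return 1.0 / (gl_LightSource[0].constantAttenuation
                + gl_LightSource[0].linearAttenuation * d
                + gl_LightSource[0].quadraticAttenuation * d * d);
}
```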


VERTEX AND FRAGMENT PROCESSORS


VERTEX AND FRAGMENT PROCESSORS

� Replaces the fixed pipeline.

� Input/Output Data: Attribute or Uniform variables.

� Built-In or User defined.

� Uses “varying” data for the communication of linearly interpolated values between the vertex and the fragment program.


VERTEX PROCESSOR

� Modelview and projection matrices not applied.

� Normals not transformed to eye coordinates.

� Normals not normalized.

� Texture coordinates not processed.

� Lighting not performed.

� Color material computations not performed.

� …


VERTEX PROCESSOR

� After the vertex program, the following fixed functionalities are

still applied:

� Color clamping.

� Perspective division.

� Viewport mapping.

� Depth range scaling.

� Additional vertex attributes can be sent from the main program.

� Additional colors, tangents, curvatures…


PASSING MORE VERTEX ATTRIBUTES

Main C/C++ program:

� Texture coordinates can be used.

� Not the best approach.

� glVertexAttrib function.

� void glVertexAttrib2dv(GLuint index, const GLdouble *v);

� void glVertexAttrib4s(GLuint index, GLshort v0, GLshort v1, GLshort v2, GLshort v3) ;

� void glVertexAttrib4fv(GLuint index, const GLfloat *v);

� etc…

� Index at least in the range [0..15] (at least 16 attributes).

� Attrib 0 indicates the completion of a vertex.

� Version for normalized data available…


PASSING MORE VERTEX ATTRIBUTES

� How to associate a vertex program attribute variable with an attribute

index in the C/C++ program?

� Use glBindAttribLocation function.

� void glBindAttribLocation(GLuint program, GLuint index, const GLchar

*name);

� glBindAttribLocation(myProgram, 1, "objectTangent");

� Must be done before calling the linker.
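On the GLSL side, the bound attribute is simply declared with the same name (a sketch continuing the `objectTangent` example above):

```glsl
attribute vec3 objectTangent; // bound to index 1 via glBindAttribLocation
varying vec3 tangent;
void main() {
    tangent = gl_NormalMatrix * objectTangent; // bring the tangent into eye space
    gl_Position = ftransform();
}
```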


PASSING MORE VERTEX ATTRIBUTES

Main C/C++ program:

� glVertexAttribPointer.

� void glVertexAttribPointer(GLuint index, GLint size, GLenum type, GLboolean normalized, GLsizei stride, const GLvoid *pointer);

� Similar to vertex arrays. Arrays can now be stored in video

memory and processed optimally.

� Vertex attrib arrays possible.

� Enabling/Disabling attrib arrays:

� void glEnableVertexAttribArray(GLuint index);

� void glDisableVertexAttribArray(GLuint index);

� Arrays used when making a call to glDrawArrays,

glDrawElements, etc…


PASSING MORE VERTEX ATTRIBUTES

Uniform variables:

� Setup in a different way than attribute variables.

� After linking the program, the main application (C/C++) must query the location of the uniform variable, and then set its value.

� GLint glGetUniformLocation(GLuint program, const GLchar *name):

� Look for a specific variable.

� Returns the location.

� void glUniform{1|2|3|4}{f|i} (GLint location, TYPE v);

� Set the uniform value. Should not happen between glBegin/glEnd.


FRAGMENT PROCESSOR

� The fragment program mainly processes interpolated information

generated from the vertex program.

� e.g. gl_Color.

� The fragment program must replace/code:

� Texture mapping environments & functions.

� Texture application.

� Color application/generation.

� Shading.

� Fog application.


BUILT-IN FUNCTIONS


BUILT-IN FUNCTIONS

� Easy Shader Development.

� Readability.

� Simplicity.

� Common functions needed for graphics.

� Mask the actual hardware implementation.

� The compiler has to be efficient/clever.

� No guarantee that a function is hardware accelerated.

� Non-accelerated functions could be slower.

� Most of them available from both programs.


BUILT-IN FUNCTIONS

� genType = float | vec2 | vec3 | vec4

� Trigonometry Functions.

� genType sin( genType );

� genType cos( genType );

� genType tan( genType );

� genType asin( genType );

� genType acos( genType );

� genType atan( genType, genType );

� genType atan( genType );

� genType radians( genType );

� genType degrees( genType );


BUILT-IN FUNCTIONS

� Inverse, Exponential and square root functions.

� genType pow( genType, genType );

� genType exp( genType );

� genType log( genType );

� genType exp2( genType );

� genType log2( genType );

� genType sqrt( genType );

� genType inversesqrt( genType );


BUILT-IN FUNCTIONS

� Common functions

� Min, Max, Clamping, Linear interpolation (Mix), modulo, floor, frac, step functions.

� genType abs( genType );

� genType ceil( genType );

� genType clamp( genType, genType, genType );

� genType clamp( genType, float, float );

� genType floor( genType );

� genType fract( genType );

� genType max( genType, genType );

� genType max( genType, float );


BUILT-IN FUNCTIONS

� Common functions

� genType mix( genType, genType, genType );

� genType mix( genType, genType, float );

� genType mod( genType, genType );

� genType mod( genType, float );

� genType sign( genType );

� genType smoothstep( genType, genType, genType );

� genType smoothstep( float, float, genType );

� genType step( genType, genType );

� genType step( float, genType );
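A typical combination of these common functions in a fragment program (a sketch; the uniforms are hypothetical):

```glsl
uniform float x;
uniform vec3 colorA, colorB;
void main() {
    float t = clamp(x, 0.0, 1.0);        // force t into [0,1]
    float s = smoothstep(0.2, 0.8, t);   // 0 below 0.2, 1 above 0.8, Hermite in between
    // mix(a, b, s) computes a*(1.0-s) + b*s component-wise
    gl_FragColor = vec4(mix(colorA, colorB, s), 1.0);
}
```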


BUILT-IN FUNCTIONS

� 3D and matrix functions: dot product, length, multiplications…

� vec4 ftransform(); //Vertex ONLY. Same transform as done with the fixed pipeline. A direct ModelviewProjection multiplication may lead to a slightly different result.

� vec3 cross( vec3, vec3 );

� float distance( genType, genType );

� float dot( genType, genType );

� genType faceforward( genType N, genType I, genType Nref );

� float length( genType );

� genType normalize( genType );

� genType reflect( genType I, genType N );

� genType refract( genType I, genType N, float eta );

� mat matrixCompMult( mat, mat );


BUILT-IN FUNCTIONS

� Texture Lookup functions

� //Optional bias term is Fragment ONLY

� vec4 texture1D( sampler1D, float [,float bias] );

� vec4 texture1DProj( sampler1D, vec2 [,float bias] );

� vec4 texture1DProj( sampler1D, vec4 [,float bias] );

� vec4 texture2D( sampler2D, vec2 [,float bias] );

� vec4 texture2DProj( sampler2D, vec3 [,float bias] );

� vec4 texture2DProj( sampler2D, vec4 [,float bias] );


BUILT-IN FUNCTIONS

� Texture Lookup functions

� vec4 texture3D( sampler3D, vec3 [,float bias] );

� vec4 texture3DProj( sampler3D, vec4 [,float bias] );

� vec4 textureCube( samplerCube, vec3 [,float bias] );

� vec4 shadow1D( sampler1DShadow, vec3 [,float bias] );

� vec4 shadow2D( sampler2DShadow, vec3 [,float bias] );

� vec4 shadow1DProj( sampler1DShadow, vec4 [,float bias] );

� vec4 shadow2DProj( sampler2DShadow, vec4 [,float bias] );


BUILT-IN FUNCTIONS

� Texture Lookup functions

� //Vertex ONLY; ensure

//GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS > 0

� vec4 texture1DLod( sampler1D, float, float lod );

� vec4 texture1DProjLod( sampler1D, vec2, float lod );

� vec4 texture1DProjLod( sampler1D, vec4, float lod );

� vec4 texture2DLod( sampler2D, vec2, float lod );

� vec4 texture2DProjLod( sampler2D, vec3, float lod );

� vec4 texture2DProjLod( sampler2D, vec4, float lod );


BUILT-IN FUNCTIONS

� Texture Lookup functions

� //Vertex ONLY; ensure

//GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS > 0

� vec4 texture3DProjLod( sampler3D, vec4, float lod );

� vec4 textureCubeLod( samplerCube, vec3, float lod );

� vec4 shadow1DLod( sampler1DShadow, vec3, float lod );

� vec4 shadow2DLod( sampler2DShadow, vec3, float lod );

� vec4 shadow1DProjLod( sampler1DShadow, vec4, float lod );

� vec4 shadow2DProjLod( sampler2DShadow, vec4, float lod );


BUILT-IN FUNCTIONS

� Other functions:

� float noise1( genType );

� vec2 noise2( genType );

� vec3 noise3( genType );

� vec4 noise4( genType );

� genType dFdx( genType );

� genType dFdy( genType );

� genType fwidth( genType );

� …


APPLICATION: PHONG SHADING


Ian Fergusson, https://www.cis.strath.ac.uk/teaching/ug/classes/52.359/lect13.pdf

APPLICATION: PHONG SHADING

� How to realize a Phong interpolation?

� Pass the normal as a texture coordinate at the vertex level.

� The texture coordinates will be automatically interpolated at

the fragment level.

� Normalize the normal in the fragment program first, and then

compute a Phong shading.
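The recipe above can be sketched as a vertex/fragment pair. Here the normal is passed through a user-defined varying instead of a texture coordinate (both work); the viewer is assumed to look along +z in eye space:

```glsl
// Vertex program
varying vec3 normal, lightDir;
void main() {
    vec3 p = vec3(gl_ModelViewMatrix * gl_Vertex);
    normal = gl_NormalMatrix * gl_Normal;                 // interpolated per fragment
    lightDir = vec3(gl_LightSource[0].position) - p;
    gl_Position = ftransform();
}
```

```glsl
// Fragment program
varying vec3 normal, lightDir;
void main() {
    vec3 n = normalize(normal);   // re-normalize AFTER interpolation
    vec3 l = normalize(lightDir);
    float diff = max(dot(n, l), 0.0);
    vec3 r = reflect(-l, n);
    float spec = pow(max(r.z, 0.0), gl_FrontMaterial.shininess); // viewer along +z
    gl_FragColor = gl_FrontMaterial.ambient
                 + diff * gl_FrontMaterial.diffuse  * gl_LightSource[0].diffuse
                 + spec * gl_FrontMaterial.specular * gl_LightSource[0].specular;
}
```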


OTHER APPLICATIONS

� Bump Mapping.

� Can be done at the vertex or

at the fragment level.

� Volume Rendering.

� Use of 3D textures.

� GPGPU.

� General Processing on Graphics Processor Unit.

� A lot of GFLOPS…

� Scientific calculations like Fourier transforms.

� Geometry modification (Animation, Morphing…).


WHAT ABOUT DIRECTX 11 – WINDOWS 7


http://www.pcgameshardware.com/screenshots/medium/2009/02/DirectX-11-Compute-Shader.PNG

-Tessellator

-Compute shader

-Dynamic Shader Linkage

WHAT ABOUT DIRECTX 11.1 – OPENGL 4.4


Pipeline stages (diagram):

� Input Assembler

� Vertex Shader: VSMain {…}

� Hull Shader: HSMain {…}, PatchMain {…}

� Tessellator

� Domain Shader: DSMain {…} //Gets uv, outputs vertices

� Geometry Shader: GSMain {…} //Modify primitive shapes

� Stream Output

� Rasterizer

� Pixel Shader

ADVANCED RENDERING TECHNIQUES


ADVANCED FEATURES

�Limitations:

� OpenGL 1.1 was limited to the Phong shading model.

� Pixel precision may depend on the graphics card manufacturer (usually 8 bits per color).

� Creating real-time shadows and more complex effects is difficult.

� Power-of-two textures.

� Fixed rendering pipeline.

� Only one possible rendering per frame.

�Main manufacturers: ATI, NVidia, SGI, XGI.

�Functionalities and extensions may differ between manufacturers => specific code must often be developed for each platform.


ADVANCED FEATURES: EXTENSIONS

�Since OpenGL 1.2, extensions are supported.

�Main Extensions:

� Floating-point extensions to get a better precision at the frame buffer level.

� ATI: 24-bit precision (now 32). NVidia: 16- or 32-bit precision.

� High Dynamic Range (HDR) images, Tone Mapping.

� Non-Power-of-Two textures: saves memory.

� Render-to-Texture (Cube Maps).

� Shadow Maps.

� Bump-Mapping.

� Vertex and Fragment programs.

� Floating-point precision at the program level.


ADVANCED FEATURES: EXTENSIONS

char* extensionsList = (char*) glGetString(GL_EXTENSIONS);
// All the supported extensions are inside the string extensionsList.

// Example for getting the 3D texture functionality.
// First step: declare a function type.
typedef void (APIENTRY * PFNGLTEXIMAGE3DPROC) (GLenum target, GLint level, GLint internalFormat, GLsizei width, GLsizei height, GLsizei depth, GLint border, GLenum format, GLenum type, const GLvoid *pixels);

// Declare a function pointer, then get the pointer address.
PFNGLTEXIMAGE3DPROC glTexImage3D;
glTexImage3D = (PFNGLTEXIMAGE3DPROC) wglGetProcAddress("glTexImage3D");


ADVANCED FEATURES: RENDER-TO-TEXTURE

� Allows rendering into an intermediate image that will be re-used

in the final image.

� Useful for (cube or spherical map) environment mapping.

� Mirroring effects.

� Reflections.

� Refractions.

� Lighting effects.

� Bump mapping.

� Vertex Textures.


ADVANCED FEATURES: SHADOWS

� How to efficiently compute shadows that are compatible with

current graphics hardware?

� Two possibilities:

� Shadow Maps.

� See CS 307

� Shadow Volumes.


ADVANCED FEATURES: SHADOW VOLUME


Image created by Steve Leach using OpenOffice Draw (v1.9.95), 30 May 2005 for use in the Shadow Volumes article.

http://en.wikipedia.org/wiki/Shadow_volume

ADVANCED FEATURES: SHADOW VOLUME

� Encode the surfaces of the regions (volumes) of the scene

that are located in shadow.

� Make use of the stencil test after having rendered the

scene.

� Stencil buffer initialized to 0.

� Every fragment that passes the z-test adds +1 to the stencil

value.

� If the count is even at the end => the point is visible from the light source (each volume entry is matched by an exit).

� Exact shadow contours.

� Shadow volumes are hard to update when the scene is

complex and dynamic.


ADVANCED FEATURES: AMBIENT OCCLUSION

� Full global illumination is not possible in real time yet.

� Estimates/Fakes local inter-reflections of light.

� Cf. SIGGRAPH slides by NVidia.


Image-Space Horizon-Based Ambient Occlusion (HBAO). Siggraph 2008, Louis Bavoil & Miguel Sainz, NVIDIA


GLOBAL ILLUMINATION ON GPUS


GLOBAL ILLUMINATION ON GPUS

� Goal: simulating all the possible light paths.

� Real-time Global Illumination is not 100% possible yet,

but close enough.

� Two algorithms are more or less appropriate:

� Instant Radiosity.

� Image-Space Photon Mapping.


INSTANT RADIOSITY

� Very efficient algorithm for global illumination.

� Can be implemented through hardware shadow mapping.

� A few fps on latest hardware.

� Caustics difficult to render in the original algorithm.

� Singularities in the image.


Alexander Keller. Instant Radiosity. SIGGRAPH97 Proceedings (August 1997), pp. 49-56.

INSTANT RADIOSITY VS PATH TRACING


[Diagram: path tracing vs. instant radiosity. Primary rays leave the viewpoint; each bounce spawns secondary rays and a light (visibility) test.]

INSTANT RADIOSITY VS PATH TRACING

� Better: Sending Photons from light source.


[Diagram: instant radiosity. Photons are traced from the light source; primary rays from the viewpoint are then shaded against the resulting point lights.]

INSTANT RADIOSITY VS PATH TRACING

� Only a few photons need to be generated.


IMAGE-SPACE PHOTON MAPPING

� Image Space Photon Mapping.

� See:

� High-Performance Graphics 2009

� paper by Morgan McGuire & David Luebke.

� NVIDIA Case Studies: OptiX & Image Space Photon Mapping.

� David Luebke, NVIDIA Research


IMAGE-SPACE PHOTON MAPPING

� Goal: Dynamic Global Illumination.



http://s09.idav.ucdavis.edu/talks/11-Luebke-NVIDIA-BPS-case-study-siggraph2009.pdf

