
TRANSCRIPT

Page 1: Day 2 Unity Training Bat All n 52

DAY 2: Cameras

Page 2: Day 2 Unity Training Bat All n 52

In Unity, Cameras are used to display the game world to the player.

You will always have one camera in the scene. It is, however, possible to have more than one.

Using multiple cameras, you can pull off some pretty cool effects, like multiplayer split screen, render to texture, etc.

You can animate cameras, control them with physics, or bind them to animated rigs and watch them take off.

Practically anything you can imagine is possible with cameras, and you can use typical or unique cameras to fit your game’s style.

An Intro To Cameras

Page 3: Day 2 Unity Training Bat All n 52

The Camera Component found on a Camera GameObject is unique and has some remarkably powerful properties.

Over the next set of slides, we’ll take a look at some of these properties and how you can use them to your advantage.

Anatomy of a Camera

Page 4: Day 2 Unity Training Bat All n 52

Clear Flags
Each Camera stores color and depth information when it renders its view.

The portions of the screen that are not drawn in are empty, and will display the skybox by default.

When you are using multiple Cameras, each one stores its own color and depth information in buffers, accumulating more data as each Camera renders.

As any particular Camera in your scene renders its view, you can set the Clear Flags to clear different collections of the buffer information. This is done by choosing one of four options.

Clear Flags

Page 5: Day 2 Unity Training Bat All n 52

Skybox
This is the default setting. Any empty portions of the screen will display the current Camera's skybox. If the current Camera has no skybox set, it will default to the skybox chosen in the Render Settings (found in Edit->Render Settings). It will then fall back to the Background Color.

Solid Color
Any empty portions of the screen will display the current Camera's Background Color.

Clear Flags Options

Page 6: Day 2 Unity Training Bat All n 52

Depth Only
If you wanted to draw a player's gun without letting it get clipped inside the environment, you would set one Camera at Depth 0 to draw the environment, and another Camera at Depth 1 to draw the weapon alone. The weapon Camera's Clear Flags should be set to Depth Only. This keeps the graphical display of the environment on the screen, but discards all information about where each object exists in 3D space. When the gun is drawn, the opaque parts will completely cover anything already drawn, regardless of how close the gun is to the wall.
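A minimal C# sketch of this two-camera setup (the class name and camera fields are illustrative, not from the slides):

using UnityEngine;

public class WeaponCameraSetup : MonoBehaviour
{
    public Camera environmentCamera; // draws the world
    public Camera weaponCamera;      // draws only the gun

    void Start()
    {
        environmentCamera.depth = 0;
        environmentCamera.clearFlags = CameraClearFlags.Skybox;

        weaponCamera.depth = 1;                           // rendered after the environment
        weaponCamera.clearFlags = CameraClearFlags.Depth; // keep the colors, discard the depth buffer
    }
}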

Don't Clear
This mode does not clear either the color or the depth buffer. The result is that each frame is drawn over the last, producing a smear-looking effect. This isn't typically used in games, and would likely be best used with a custom shader.

Clear Flags Options Continued...

Page 7: Day 2 Unity Training Bat All n 52

Clip Planes
The Near and Far Clip Plane properties determine where the Camera's view begins and ends. The planes are laid out perpendicular to the Camera's direction and are measured from its position. The Near plane is the closest location that will be rendered, and the Far plane is the furthest.

The clipping planes also determine how depth buffer precision is distributed over the scene. In general, to get better precision you should move the Near plane as far as possible.
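As an illustration, a short C# sketch of adjusting these values from a script (the distances are arbitrary):

using UnityEngine;

public class ClipPlaneSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.5f;  // pushing the Near plane out improves depth buffer precision
        cam.farClipPlane = 1000f;  // nothing beyond this distance is rendered
    }
}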

Clip Planes

Page 8: Day 2 Unity Training Bat All n 52

Culling Mask
The Culling Mask is used for selectively rendering groups of objects using Layers.

Commonly, it is good practice to put your User Interface on a separate layer, then render it with a second Camera set to draw only the UI layer.

In order for the UI to display on top of the other Camera views, you'll also need to set the Clear Flags to Depth Only and make sure that the UI Camera's Depth is higher than the other Cameras'.
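A rough C# sketch of such a UI camera, assuming the UI objects live on a layer named "UI" (the layer name and depth value are assumptions):

using UnityEngine;

public class UICameraSetup : MonoBehaviour
{
    void Start()
    {
        Camera uiCamera = GetComponent<Camera>();
        uiCamera.cullingMask = 1 << LayerMask.NameToLayer("UI"); // render only the UI layer
        uiCamera.clearFlags = CameraClearFlags.Depth;            // draw on top of what is already rendered
        uiCamera.depth = 10;                                     // higher than the scene cameras
    }
}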

Culling Mask

Page 9: Day 2 Unity Training Bat All n 52

Normalized Viewport Rectangle
Normalized Viewport Rectangles are specifically for defining a certain portion of the screen that the current camera view will be drawn upon. You can put a map view in the lower-right hand corner of the screen, or a missile-tip view in the upper-left corner. With a bit of design work, you can use Viewport Rectangle to create some unique behaviors.

It's easy to create a two-player split screen effect using Normalized Viewport Rectangle. After you have created your two cameras, change player one's Ymin value to 0.5, and player two's Ymax value to 0.5. This will make player one's camera display from halfway up the screen to the top, and player two's camera start at the bottom and stop halfway up the screen.
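The same effect can be set up from a script. A minimal C# sketch (the camera field names are illustrative):

using UnityEngine;

public class SplitScreenSetup : MonoBehaviour
{
    public Camera playerOneCamera;
    public Camera playerTwoCamera;

    void Start()
    {
        // Rect takes (x, y, width, height) in normalized 0..1 screen coordinates.
        playerOneCamera.rect = new Rect(0f, 0.5f, 1f, 0.5f); // top half of the screen
        playerTwoCamera.rect = new Rect(0f, 0f, 1f, 0.5f);   // bottom half of the screen
    }
}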

Normalized Viewport Rectangle

Page 10: Day 2 Unity Training Bat All n 52

Two-player display created with Normalized Viewport Rectangle

Normalized Viewport Rectangle Example

Page 11: Day 2 Unity Training Bat All n 52

Marking a Camera as Orthographic removes all perspective from the Camera's view. This is mostly useful for making isometric or 2D games.
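A tiny C# sketch of switching a camera to orthographic from a script (the size value is arbitrary):

using UnityEngine;

public class OrthographicSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.orthographic = true;    // removes all perspective from the view
        cam.orthographicSize = 5f;  // half of the vertical view size, in world units
    }
}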

Orthographic Camera

Page 12: Day 2 Unity Training Bat All n 52

Render Texture
This feature is only available with a Unity Pro license (it is not available in any version of Unity iPhone).

It will place the camera's view onto a Texture that can then be applied to another object. This makes it easy to create sports arena video monitors, surveillance cameras, reflections, etc.
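As an illustration, a hedged C# sketch of a simple surveillance monitor (the field names and texture size are assumptions):

using UnityEngine;

public class SecurityMonitor : MonoBehaviour
{
    public Camera securityCamera;   // the camera whose view goes onto the monitor
    public Renderer monitorScreen;  // the object that displays the feed

    void Start()
    {
        RenderTexture feed = new RenderTexture(256, 256, 16); // width, height, depth buffer bits
        securityCamera.targetTexture = feed;                  // the camera now renders into this texture
        monitorScreen.material.mainTexture = feed;            // show the texture on the monitor object
    }
}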

Render Texture

Page 13: Day 2 Unity Training Bat All n 52

DAY 2: Lights

Page 14: Day 2 Unity Training Bat All n 52

Lights are an essential part of every scene.

While meshes and textures define the shape and look of a scene, lights define the color and mood of your 3D environment.

You’ll likely work with more than one light in each scene. Making them work together requires a little practice, but the results can be quite amazing.

Lights

Page 15: Day 2 Unity Training Bat All n 52

Lights can be added to your scene from the GameObject > Create Other menu option.

Lights can also be added to an existing GameObject by using Component > Rendering > Light.

There are many different options within the Light Component in the Inspector.

By simply changing the color of a light, you can give a whole different mood to the scene.
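For example, a small C# sketch of tinting a light from a script (the color and intensity values are arbitrary):

using UnityEngine;

public class MoodLighting : MonoBehaviour
{
    void Start()
    {
        Light sceneLight = GetComponent<Light>();
        sceneLight.color = new Color(1f, 0.5f, 0.2f); // warm orange tint
        sceneLight.intensity = 1.2f;                  // slightly brighter than the default
    }
}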

Adding Lights

Page 16: Day 2 Unity Training Bat All n 52

Adding Lights

This is the same scene with different lighting effects.

Page 17: Day 2 Unity Training Bat All n 52

There are three different types of lights in Unity:

- Point Lights shine from a location equally in all directions, like a light bulb.
- Directional Lights are placed infinitely far away and affect everything in the scene, much like the sun.
- Spot Lights shine from a point in a direction and only illuminate objects within a cone, like the headlights of a car.

Lights can also cast Shadows. Shadows are a Pro-only feature.

Shadow properties can be adjusted on a per light basis.

Types of Lights

Page 18: Day 2 Unity Training Bat All n 52

Lights have some pretty unique Properties, including:

Type, Color, Attenuate, Intensity, Range, Spot Angle, Shadows, Resolution, Strength, Projection, Constant, Bias, Object Size, Cookie, Draw Halo, Flare, Render Mode, Auto, Force Pixel, Force Vertex, and Culling Mask.

Needless to say, we won’t cover all of these.

Light Properties

Page 19: Day 2 Unity Training Bat All n 52

Point Light Example

Page 20: Day 2 Unity Training Bat All n 52

Spot Light Example

Page 21: Day 2 Unity Training Bat All n 52

Directional Lights

Page 22: Day 2 Unity Training Bat All n 52

Lights can impact performance in a big way. This is especially important depending on what platform you are developing for.

There are two ways lights can be rendered: vertex lighting and pixel lighting.

Vertex lighting only calculates the lighting at the vertices of the game models, and interpolates the lighting over the surface of the models.

Pixel lights are calculated at every screen pixel, and hence are much more expensive.

Some older graphics cards only support vertex lighting.

While pixel lighting is slower to render, it allows for some cool effects.

Thinking About Performance

Page 23: Day 2 Unity Training Bat All n 52

Lights have a big impact on render speeds. A tradeoff has to be made between lighting and game speed.

Since pixel lights are much more expensive than vertex lights, Unity will only render the brightest lights at per-pixel quality. The maximum number of pixel lights can be set in the Quality Settings.

You can explicitly control if a light should be rendered as a vertex or pixel light using the Render Mode property. By default, Unity will classify the light automatically based on how much the object is affected by the light.
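A hedged C# sketch of controlling this from a script (the light references and the count of 2 are assumptions):

using UnityEngine;

public class LightPerformanceSetup : MonoBehaviour
{
    public Light importantLight; // e.g. the player's flashlight
    public Light fillLight;      // a cheap background light

    void Start()
    {
        QualitySettings.pixelLightCount = 2;                     // at most two per-pixel lights
        importantLight.renderMode = LightRenderMode.ForcePixel; // always rendered per-pixel
        fillLight.renderMode = LightRenderMode.ForceVertex;     // always rendered per-vertex
    }
}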

The actual lights that are rendered as pixel lights are determined on an object-by-object case.

Rendering Lights

Page 24: Day 2 Unity Training Bat All n 52

Huge objects with bright lights could use all the pixel lights (depending on the quality settings). If the player is far from these, nearby lights will be rendered as vertex lights. Therefore, it is better to split huge objects up into a couple of smaller ones.

Examples of Vertex lights:

Rendering Lights Continued...

Page 25: Day 2 Unity Training Bat All n 52

DAY 2: Meshes

Page 26: Day 2 Unity Training Bat All n 52

Most of the objects in a game are going to be Meshes. Meshes are not built in Unity. They must be built in another 3d modeling program and exported into a Unity compatible format.

The accepted formats are: .fbx, .dae, .obj, .3ds, and .dxf.

The most common Unity compatible 3d art programs are: Maya, Cinema 4d, 3ds Max, Cheetah3d, Modo, Lightwave, and Blender. However, there are many more programs, including 3d art exporters, that allow for successful incorporation of mesh assets.

Mesh Types

Page 27: Day 2 Unity Training Bat All n 52

Animation Properties:

Generation - Controls how animations are imported. This contains several options in a drop down including:

Don't Import - does not import animations.
Store in Original Roots - maintains placement of animations in the root object of the animation package.
Store in Nodes - stores animations in the objects they animate.
Store in Root - stores the animation in the Scene's transform root object. This should be used when animating things that have a hierarchy.

Mesh Animation

Page 28: Day 2 Unity Training Bat All n 52

Animation Properties:

Bake Animations - Converts IK to FK on import.

Animation Compression - determines what type of compression will be applied to the animation.

Animation Wrap Mode - sets the Wrap Mode for the animations.

Mesh Animation Continued...

Page 29: Day 2 Unity Training Bat All n 52

Animation Properties:

Split Animations - used in instances when several animations are contained in one animation file. Split the animations by setting the Name, Start and End frame, Wrap Mode and Loop.

Name - sets the name of the animation clip.
Start - the frame on which the animation clip begins.
End - the final frame in the animation clip.
Wrap Mode - sets a wrap mode for the animation.
Loop - sets an extra frame for looping animations. If an animation that is supposed to loop does not look correct, try enabling this option.

Mesh Animation Continued...

Page 30: Day 2 Unity Training Bat All n 52

DAY 2: Audio

Page 31: Day 2 Unity Training Bat All n 52

An Audio Source takes a clip and plays it from the source location in the world. This is very useful for utilizing 3D sound. Audio Sources will not do anything without an Audio Clip. The source works as the controller for the clip, allowing stop and playback.

To create an Audio Source, first create a GameObject. Do this by going to GameObject > Create Empty. With the new object selected, go to Component > Audio > Audio Source. Then assign a Clip.
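The same setup can also be done from a script. A minimal C# sketch (the class name is illustrative; a clip must still be assigned in the Inspector):

using UnityEngine;

public class PlaySoundAtPoint : MonoBehaviour
{
    public AudioClip clip; // assign an Audio Clip in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.loop = false;
        source.Play(); // plays the clip from this object's position in the world
    }
}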

Audio Sources

Page 32: Day 2 Unity Training Bat All n 52

Properties:

Audio Clip - a reference to the clip.

Play On Awake - if enabled, the Audio Clip will begin to play as soon as the level has loaded. If disabled, Play() must be called to begin the clip.

Volume - how loud the sound is from a distance of 1 meter (1 world unit) away.

Pitch - amount of change in pitch due to speed up or slow down of playback.

Min and Max Volume - sets the lowest and highest level of volume regardless of distance.

RollOff - determines how quickly the sound fades. The lower the value, the closer the listener has to be to hear the sound.

Loop - makes the clip loop when it reaches the end.

Audio Sources Continued...

Page 33: Day 2 Unity Training Bat All n 52

Audio Clips are assets used by an Audio Source. Supported formats are: .aif, .wav, .mp3, and .ogg. Both Mono and Stereo assets are supported.

Properties:

Format - Compressed or Native. Native has a larger file size and higher quality, and is usually used for short clips.

3d Sound - plays in 3d space.

Force to mono - plays stereo as single channel.

Decompress on Load - if enabled, the clip is decompressed when the scene loads; otherwise it is decompressed in real time.

Audio Clips

Page 34: Day 2 Unity Training Bat All n 52

The Audio Listener acts as a microphone. It receives input from the Audio Sources around it in the scene and plays sounds through the speakers. By default this is attached to the Main Camera.

The Audio Listener has no properties; it simply must be in the scene in order for the audio clips to work. Audio Listeners allow for creating an aural experience in the game. When an Audio Listener is attached to a Game Object, any Sources nearby will be picked up. Sources that are in a Mono format will automatically be played through the Stereo field from the appropriate direction.

In order for the audio to work properly in the Scene, there should be only one Audio Listener.

Audio Listener

Page 35: Day 2 Unity Training Bat All n 52

DAY 2: Particle Systems

Page 36: Day 2 Unity Training Bat All n 52

Particle Systems in Unity are used to make clouds of smoke, steam, fire and other atmospheric effects. Particle Systems work by using one or two Textures & drawing them many times, creating a chaotic effect.

Particle Systems

Page 37: Day 2 Unity Training Bat All n 52

A typical Particle System in Unity is an object that contains a Particle Emitter, a Particle Animator and a Particle Renderer.

The Particle Emitter generates the particles, the Particle Animator moves them over time, and the Particle Renderer draws them on the screen.

Think of Particles as little 2D images that travel in changing directions and have varying properties that make them appear different over time.

Particle System Basics

Page 38: Day 2 Unity Training Bat All n 52

The Ellipsoid Particle Emitter spawns particles inside a sphere.

Ellipsoid Particle Emitters have a wide variety of properties. Many of these properties are “try it out” types of values.

Let’s now practice creating an Ellipsoid Particle Emitter and try changing some of these values to see what we come up with.

The Ellipsoid Particle Emitter

Page 39: Day 2 Unity Training Bat All n 52

Ellipsoid Particle Emitters (EPEs) are the basic emitter, and are included when you choose to add a Particle System to your scene.

Spawning Properties
Spawning properties like Size, Energy, Emission, and Velocity will give your particle system distinct personality when trying to achieve different effects. Having a small Size could simulate fireflies or stars in the sky.

Energy and Emission will control how long your particles remain onscreen and how many particles can appear at any one time. For example, a rocket might have high Emission to simulate density of smoke, and high Energy to simulate the slow dispersion of smoke into the air.

The Ellipsoid Particle Emitter Details

Page 40: Day 2 Unity Training Bat All n 52

Velocity will control how your particles move. You might want to change your Velocity in scripting to achieve interesting effects, or if you want to simulate a constant effect like wind, set your X and Z Velocity to make your particles blow away.

Simulate in World Space
If this is disabled, the position of each individual particle will always translate relative to the Position of the emitter. When the emitter moves, the particles will move along with it. If you have Simulate in World Space enabled, particles will not be affected by the translation of the emitter. For example, if you have a fireball that is spurting flames that rise, the flames will be spawned and float up in space as the fireball gets further away. If Simulate in World Space is disabled, those same flames will move across the screen along with the fireball.

Ellipsoid Particle Emitter Details Continued...

Page 41: Day 2 Unity Training Bat All n 52

Emitter Velocity Scale
This property will only apply if Simulate in World Space is enabled.

If this property is set to 1, the particles will inherit the exact translation of the emitter at the time they are spawned. If it is set to 2, the particles will inherit double the emitter's translation when they are spawned. 3 is triple the translation, etc.

One Shot
One Shot emitters will create all particles within the Emission property all at once, and cease to emit particles over time. Here are some examples of different particle system uses with One Shot Enabled or Disabled:

Enabled: Explosion, Water splash, Magic spell

Disabled: Gun barrel smoke, Wind effect, Waterfall

Ellipsoid Particle Emitter Details Continued...

Page 42: Day 2 Unity Training Bat All n 52

Min Emitter Range
The Min Emitter Range determines the depth within the ellipsoid at which particles can be spawned. Setting it to 0 will allow particles to spawn anywhere from the center core of the ellipsoid to the outer-most range. Setting it to 1 will restrict spawn locations to the outer-most range of the ellipsoid.

Ellipsoid Particle Emitter Details Continued...

Page 43: Day 2 Unity Training Bat All n 52

The Mesh Particle Emitter emits particles around a mesh. Particles are spawned from the surface of the mesh, which can be necessary when you want to make your particles interact in a complex way with objects.

Mesh Particle Emitters (MPEs) are used when you want more precise control over the spawn position & directions than the simpler Ellipsoid Particle Emitter gives you.

MPEs work by emitting particles at the vertices of the attached mesh.

Mesh Particle Emitter Details

Page 44: Day 2 Unity Training Bat All n 52

Interpolate Triangles

Enabling your emitter to Interpolate Triangles will allow particles to be spawned between the mesh's vertices.

Interpolate Triangles is off by default, so particles will only be spawned at the vertices.

Mesh Particle Emitter

Page 45: Day 2 Unity Training Bat All n 52

Systematic

Enabling Systematic will cause your particles to be spawned in your mesh's vertex order. The vertex order is set by your 3D modeling application.

Vertex order varies on the application you are using, but generally speaking this effect will cause your particles to be emitted sequentially among adjacent verts.

Mesh Particle Emitter

Page 46: Day 2 Unity Training Bat All n 52

Normal Velocity

Normals refer to the direction that a face on a mesh is pointing.

Normal Velocity controls the speed at which particles are emitted along the normal from where they are spawned.

For example, create a Mesh Particle System, use a cube mesh as the emitter, enable Interpolate Triangles, and set Normal Velocity Min and Max to 1. You will now see the particles emit from the faces of the cube in a straight line.

Mesh Particle Emitters Continued...

Page 47: Day 2 Unity Training Bat All n 52

Particle Animator
Particle Animators move your particles over time; you use them to apply wind, drag, and color cycling to your particle systems.

Particle Animators allow your particle systems to be dynamic. They allow you to change the color of your particles, apply forces and rotation, and choose to destroy them when they are finished emitting.

Particle Animators

Page 48: Day 2 Unity Training Bat All n 52

Animating Color
If you would like your particles to change colors or fade in/out, enable them to Animate Color and specify the colors for the cycle.

Any particle system that animates color will cycle through the 5 colors.

The speed at which they cycle will be determined by the Emitter's Energy value.

If you want your particles to fade in rather than instantly appear, set your first or last color to have a low Alpha value.

Particle Animators Continued...

Page 49: Day 2 Unity Training Bat All n 52

Rotation Axes
Setting values in either the Local or World Rotation Axes will cause all spawned particles to rotate around the indicated axis (with the Transform's position as the center). The greater the value entered on one of these axes, the faster the rotation will be.

Setting values in the Local Axes will cause the rotating particles to adjust their rotation as the Transform's rotation changes, to match its local axes.

Setting values in the World Axes will cause the particles' rotation to be consistent, regardless of the Transform's rotation.

Particle Animators Continued...

Page 50: Day 2 Unity Training Bat All n 52

Forces & Damping
You use force to make particles accelerate in the direction specified by the force.

Damping can be used to decelerate or accelerate particles without changing their direction:

A value of 1 means no Damping is applied, the particles will not slow down or accelerate. A value of 0 means particles will stop immediately. A value of 2 means particles will double their speed every second.

Particle Animators Continued...

Page 51: Day 2 Unity Training Bat All n 52

Destroying GameObjects attached to Particles
You can destroy the Particle System and any attached GameObject by enabling the AutoDestruct property.

Note that automatic destruction takes effect only after some particles have been emitted.

The precise rules for when the object is destroyed when AutoDestruct is on:

- If there have been some particles emitted already, but all of them are dead now, or
- If the emitter did have Emit on at some point, but now Emit is off.

Particle Animators Continued...

Page 52: Day 2 Unity Training Bat All n 52

The World Particle Collider is used to collide particles against other Colliders in the scene.

To create a Particle System with a Particle Collider:

1. Create a Particle System using GameObject->Create Other->Particle System
2. Add the Particle Collider using Component->Particles->World Particle Collider

Particle Colliders

Page 53: Day 2 Unity Training Bat All n 52

Messaging
If Send Collision Message is enabled, any particles that are in a collision will send the message OnParticleCollision() to both the particle's GameObject and the GameObject the particle collided with.
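A minimal C# sketch of a script that listens for this message (the damage value and logging are illustrative):

using UnityEngine;

public class BulletParticleDamage : MonoBehaviour
{
    public float damagePerHit = 10f; // illustrative value

    // Called when Send Collision Message is enabled and a particle collides.
    void OnParticleCollision(GameObject other)
    {
        Debug.Log(other.name + " was hit for " + damagePerHit + " damage.");
    }
}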

Hints

1. Send Collision Message can be used to simulate bullets and apply damage on impact.

2. Particle Collision Detection is slow when used with a lot of particles. Use Particle Collision Detection wisely.

3. Message sending introduces a large overhead and shouldn't be used for normal Particle Systems.

Particle Colliders Continued...

Page 54: Day 2 Unity Training Bat All n 52

The Particle Renderer renders the Particle System on screen.

Particle Renderers are required for any Particle Systems to be displayed on the screen.

Using this Component, you can greatly change the visual appearance of the particles you are generating.

Particle Renderer

Page 55: Day 2 Unity Training Bat All n 52

Choosing a Material
When setting up a Particle Renderer it is very important to use an appropriate material and shader.

Most of the time you want to use a Material with one of the built-in Particle Shaders. There are some premade materials in the Standard Assets->Particles->Sources folder.

Creating a new material is easy:

1. Select Assets->Create->Material from the menu bar.
2. The Material has a shader popup; choose one of the shaders in the Particles group, e.g. Particles->Multiply.
3. Now assign a Texture. The different shaders use the alpha channel of the textures slightly differently, but most of the time a value of black in the alpha channel will make the particle invisible and white will display it on screen.

Particle Renderer Continued...

Page 56: Day 2 Unity Training Bat All n 52

Distorting particles
By default particles are rendered billboarded; that is, as simple square sprites. This is good for smoke and explosions and most other particle effects.

Particles can also be made to stretch with their velocity. This is useful for sparks, lightning or laser beams. Length Scale and Velocity Scale affect how long the stretched particle will be.

Sorted Billboard can be used to make all particles sort by depth. Sometimes this is necessary, mostly when using Alpha Blended particle shaders. This can be expensive and should only be used if it really makes a quality difference when rendering.

Particle Renderer Continued...

Page 57: Day 2 Unity Training Bat All n 52

Animated textures
Particle Systems can be rendered with an animated tile texture. To use this feature, make the texture out of a grid of images. As the particles go through their life cycle, they will cycle through the images. This is good for adding more life to your particles, or making small rotating debris pieces.

Particle Renderer Continued...

Page 58: Day 2 Unity Training Bat All n 52

DAY 2: Physics

Page 59: Day 2 Unity Training Bat All n 52

Unity has the next-generation Ageia PhysX physics engine built-in.

To put an object under physics control, simply add a Rigidbody to it.

GameObjects with the Rigidbody component will be affected by gravity, and can collide with other objects in the world.

You use Rigidbodies for anything that needs to simulate physics.

Rigidbodies are most often used in combination with primitive colliders.
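As an illustration, a hedged C# sketch of putting an object under physics control and giving it an initial push (the mass and force values are arbitrary):

using UnityEngine;

public class CannonBall : MonoBehaviour
{
    void Start()
    {
        // Adding a Rigidbody puts this GameObject under physics control.
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.mass = 2f;
        body.AddForce(transform.forward * 10f, ForceMode.Impulse); // one-off push forward
    }
}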

Rigid Bodies

Page 60: Day 2 Unity Training Bat All n 52

Kinematic Rigidbodies are not affected by forces, gravity or collisions.

Kinematic Rigidbodies are driven explicitly by setting the position and rotation of the transform, or by animating them.

Kinematic Rigidbodies can interact with other non-Kinematic Rigidbodies.

Kinematic Rigidbodies are used for three purposes:

1. Sometimes you want an object to be under physics control, but in another situation to be controlled explicitly from a script or animation.

2. Kinematic Rigidbodies play better with other Rigidbodies. For example, if you have an animated platform and you want to place some Rigidbody boxes on top, you should make the platform a Kinematic Rigidbody instead of just a Collider without a Rigidbody.

3. You might want to have a Kinematic Rigidbody that is animated and have a real Rigidbody follow it using one of the available Joints.
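A minimal C# sketch of case 2, an animated platform driven as a Kinematic Rigidbody (assumes the platform already has a Rigidbody; the movement pattern and values are arbitrary):

using UnityEngine;

public class MovingPlatform : MonoBehaviour
{
    private Rigidbody body;
    private Vector3 startPosition;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        body.isKinematic = true;            // not affected by forces; we drive it ourselves
        startPosition = transform.position;
    }

    void FixedUpdate()
    {
        // Slide back and forth along the x-axis; boxes resting on top follow along.
        Vector3 target = startPosition + Vector3.right * Mathf.Sin(Time.time) * 3f;
        body.MovePosition(target);
    }
}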

Kinematic Rigidbodies

Page 61: Day 2 Unity Training Bat All n 52

Static Colliders are used for level geometry which does not move around much. You add a Mesh Collider to your already existing graphical meshes.

There are two reasons why you want to make a Static Collider into a Kinematic Rigidbody instead:

1. Kinematic Rigidbodies wake up other Rigidbodies when they collide with them.
2. Kinematic Rigidbodies apply friction to Rigidbodies placed on top of them.

Static Colliders

Page 62: Day 2 Unity Training Bat All n 52

You use a Character Controller if you want to make a humanoid character.

These Controllers don't follow the rules of physics, since that would not feel right (in Doom you run 90 miles per hour, come to a halt in one frame, and turn on a dime). Instead, a Character Controller performs collision detection to make sure your characters can slide along walls, walk up and down stairs, etc.

Character Controllers are not affected by forces, but they can push Rigidbodies by applying forces to them from a script.

Character Controllers are inherently unphysical. If you want to apply real physics to your character (swinging on ropes, getting pushed by big rocks), you have to use a Rigidbody; this will let you use joints and forces on your character. Be aware, though, that tuning a Rigidbody to feel right for a character is hard, due to the unphysical way in which game characters are expected to behave.

Character Controllers

Page 63: Day 2 Unity Training Bat All n 52

When using Character Controllers, you have to be careful:

1. Never have a parent and child Rigidbody together.

2. Never scale the parent of a Rigidbody.

Character Controller Considerations

Page 64: Day 2 Unity Training Bat All n 52

Constant Force is a quick utility for adding constant forces to a Rigidbody.

Constant Force works great for one-shot objects like rockets, if you don't want them to start with a large velocity but instead accelerate.

To make a rocket that accelerates forward set the Relative Force to be along the positive z-axis. Then use the Rigidbody's Drag property to make it not exceed some maximum velocity (the higher the drag the lower the maximum velocity will be). In the Rigidbody, also make sure to turn off gravity so that the rocket will always stay on its path.
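A hedged C# sketch of this rocket setup (the force and drag values are arbitrary):

using UnityEngine;

public class RocketSetup : MonoBehaviour
{
    void Start()
    {
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.useGravity = false; // keep the rocket on its path
        body.drag = 2f;          // the higher the drag, the lower the maximum velocity

        ConstantForce thrust = gameObject.AddComponent<ConstantForce>();
        thrust.relativeForce = new Vector3(0f, 0f, 10f); // constant push along the rocket's local z-axis
    }
}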

Constant Force

Page 65: Day 2 Unity Training Bat All n 52

The Sphere Collider is a basic sphere-shaped collision primitive.

The Sphere Collider can be resized to uniform scale, but not along individual axes. It works great for falling boulders, ping pong balls, marbles, etc.

Properties of a collider

Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow objects to collide with each other.

Colliders must be added to objects independently of Rigidbodies.

Sphere Colliders

Page 66: Day 2 Unity Training Bat All n 52

A Collider does not necessarily need a Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.

When a collision between two Colliders occurs and if at least one of them has a Rigidbody attached, collision messages are sent out to the objects attached to them.

Sphere Colliders Continued

Page 67: Day 2 Unity Training Bat All n 52

Triggers
An alternative way of using Colliders is to mark them as a Trigger. Triggers are effectively ignored by the physics engine, and have a unique set of messages that are sent out when a collision with a Trigger occurs.

Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, etc.
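For example, a minimal C# sketch of reacting to a Trigger (the door-opening behaviour is only logged here; the script assumes this GameObject has a Collider with Is Trigger enabled):

using UnityEngine;

public class DoorTrigger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        Debug.Log(other.name + " entered the trigger - open the door here.");
    }

    void OnTriggerExit(Collider other)
    {
        Debug.Log(other.name + " left the trigger - close the door here.");
    }
}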

Triggers

Page 68: Day 2 Unity Training Bat All n 52

The Box Collider is a basic cube-shaped collision primitive.

The Box Collider can be resized into different shapes of rectangular prisms. It works great for doors, walls, platforms, etc.

Box Colliders

Page 69: Day 2 Unity Training Bat All n 52

The Mesh Collider takes a Mesh Asset and builds its Collider based on that mesh. It is far more accurate for collision detection than using primitives for complicated meshes. Mesh Colliders that are marked as Convex can collide with other Mesh Colliders.

Collision meshes use backface culling. If an object collides with a mesh that will be backface culled graphically it will also not collide with it physically.

There are some limitations when using the Mesh Collider: Usually, two Mesh Colliders cannot collide with each other. All Mesh Colliders can collide with any primitive Collider. If your mesh is marked as Convex, then it can collide with other Mesh Colliders.

Mesh Colliders

Page 70: Day 2 Unity Training Bat All n 52

The Physic Material is used to adjust friction and bouncing effects of colliding objects.

To create a Physic Material select Assets->Create->Physic Material from the menu bar. Then drag the Physic Material from the Project View onto a Collider in the scene.

Physics Materials

Page 71: Day 2 Unity Training Bat All n 52

Friction is the quantity which prevents surfaces from sliding off each other. This value is critical when trying to stack objects.

Forms of Friction
Static Friction is used when the object is lying still. It will prevent the object from starting to move. If a large enough force is applied to the object, it will start moving. At this point Dynamic Friction will come into play. Dynamic Friction will now attempt to slow down the object while it is in contact with another.
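A hedged C# sketch of building a Physic Material from a script instead of the menu (the friction values are arbitrary and describe a slippery, icy surface):

using UnityEngine;

public class IcySurface : MonoBehaviour
{
    void Start()
    {
        PhysicMaterial ice = new PhysicMaterial("Ice");
        ice.staticFriction = 0.05f;  // very little resistance before sliding starts
        ice.dynamicFriction = 0.02f; // keeps sliding easily once it is moving
        ice.bounciness = 0f;

        GetComponent<Collider>().material = ice; // apply it to this object's Collider
    }
}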

Physics Materials Continued...

Page 72: Day 2 Unity Training Bat All n 52

The Hinge Joint groups together two Rigidbodies, constraining them to move like they are connected by a hinge.

- A single Hinge Joint should be applied to a GameObject. The hinge will rotate at the point specified by the Anchor property, moving around the specified Axis property.
- You do not need to assign a GameObject to the joint's Connected Body property.
- You should only assign a GameObject to the Connected Body property if you want the joint's Transform to be dependent on the attached object's Transform.

Chains
Multiple Hinge Joints can also be strung together to create a chain. Add a joint to each link in the chain, and attach the next link as the Connected Body.
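A minimal C# sketch of building such a chain from a script (the link array and the axis choice are assumptions):

using UnityEngine;

public class ChainBuilder : MonoBehaviour
{
    public Rigidbody[] links; // the chain links in order, each with a Rigidbody

    void Start()
    {
        for (int i = 0; i < links.Length - 1; i++)
        {
            HingeJoint joint = links[i].gameObject.AddComponent<HingeJoint>();
            joint.connectedBody = links[i + 1]; // attach the next link in the chain
            joint.anchor = Vector3.zero;        // pivot at this link's position
            joint.axis = Vector3.right;         // rotate around the x-axis
        }
    }
}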

Hinges

Page 73: Day 2 Unity Training Bat All n 52

DAY 2: Terrain Engine

Page 74: Day 2 Unity Training Bat All n 52

The concept of Terrain stems from the need to create vast, outdoor environments without having to use meshes for everything.

Unity created the Island Demo not by using meshes, but by using the powerful terrain engine that they built.

Terrain editors are not a new concept in the world of gaming, but they are a convenient one.

Unity’s terrain engine features a powerful editor that lets you create stunning outdoor environments very quickly.

What is the Terrain Engine...

Page 75: Day 2 Unity Training Bat All n 52

We will now walk you through some of the basics of setting up a Terrain.

Terrains are one of the more advanced tools available in Unity, with a wide variety of options to choose from.

By the end of this hands-on practice session, you might not have the best-looking terrain in the world, but you will have the tools necessary to make stunning environments given practice and time.

The Terrain Engine...

Page 76: Day 2 Unity Training Bat All n 52

Cameras

Lights

Particle Systems

Meshes

Audio

Physics

Terrain Engine

Today We Covered:

Page 77: Day 2 Unity Training Bat All n 52

Last Minute Discussion and Questions

Page 78: Day 2 Unity Training Bat All n 52

THANK YOU