Effects

This is a native Direct3D 11 implementation of the five built-in effects from XNA Game Studio, providing identical functionality and API:
  • BasicEffect supports texture mapping, vertex coloring, directional lighting, and fog
  • AlphaTestEffect supports per-pixel alpha testing
  • DualTextureEffect supports two-layer multitexturing (for lightmaps or detail textures)
  • EnvironmentMapEffect supports cubic environment mapping
  • SkinnedEffect supports skinned animation
See also EffectFactory

Initialization

The BasicEffect constructor requires a Direct3D 11 device.

std::unique_ptr<BasicEffect> effect(new BasicEffect(device));
For exception safety, it is recommended you make use of the C++ RAII pattern and manage the effect lifetime with a std::unique_ptr or std::shared_ptr.
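
With C++14 or later, std::make_unique expresses the same thing a bit more concisely (a minimal sketch; device is assumed to be a valid ID3D11Device*):

auto effect = std::make_unique<BasicEffect>(device);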

Set effect parameters

effect->SetWorld(world);
effect->SetView(view);
effect->SetProjection(projection);

effect->SetTexture(cat);
effect->SetTextureEnabled(true);

effect->EnableDefaultLighting();
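
The world, view, and projection matrices are ordinary DirectXMath matrices. As an illustration (a sketch only; the eye position, aspect ratio, and clip planes are made-up values), they might be built like this:

using namespace DirectX;

XMMATRIX world = XMMatrixIdentity();
XMMATRIX view = XMMatrixLookAtRH(
    XMVectorSet(0.f, 2.f, 5.f, 0.f),   // eye position (arbitrary)
    XMVectorZero(),                    // look-at target
    XMVectorSet(0.f, 1.f, 0.f, 0.f));  // up direction
XMMATRIX projection = XMMatrixPerspectiveFovRH(XM_PIDIV4, 800.f / 600.f, 0.1f, 100.f);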

Draw using the effect

effect->Apply(deviceContext);

deviceContext->IASetInputLayout(...);
deviceContext->IASetVertexBuffers(...);
deviceContext->IASetIndexBuffer(...);
deviceContext->IASetPrimitiveTopology(...);

deviceContext->DrawIndexed(...);
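
A filled-in version of this sequence might look like the following (a sketch only; inputLayout, vertexBuffer, indexBuffer, and indexCount are hypothetical objects created elsewhere):

UINT stride = sizeof(VertexPositionNormalTexture);
UINT offset = 0;

effect->Apply(deviceContext);

deviceContext->IASetInputLayout(inputLayout);
deviceContext->IASetVertexBuffers(0, 1, &vertexBuffer, &stride, &offset);
deviceContext->IASetIndexBuffer(indexBuffer, DXGI_FORMAT_R16_UINT, 0);
deviceContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

deviceContext->DrawIndexed(indexCount, 0, 0);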

Input Layout

To create an input layout matching the effect vertex shader input signature:

// First, configure effect parameters the way you will be using it. Turning
// lighting, texture map, or vertex color on/off alters which vertex shader
// is used, so GetVertexShaderBytecode will return a different blob after
// you alter these parameters. If you create an input layout using a
// BasicEffect that had lighting disabled, but then later enable lighting,
// that input layout will no longer match as it will not include the
// now-necessary normal vector.
void const* shaderByteCode;
size_t byteCodeLength;

effect->GetVertexShaderBytecode(&shaderByteCode, &byteCodeLength);

device->CreateInputLayout(VertexPositionNormalTexture::InputElements,
                          VertexPositionNormalTexture::InputElementCount,
                          shaderByteCode, byteCodeLength,
                          pInputLayout);

For the built-in effects, the usual triggers for needing to create a new layout are:
  • Enabling or disabling lighting (which requires a vertex normal)
  • Enabling or disabling per vertex color (which requires a vertex color value)
  • Enabling or disabling textures (which requires vertex texture coordinates)
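
If one of these settings changes after the layout was created, the safest fix is to query the vertex shader bytecode again and create a new layout to match. A sketch, assuming inputLayout is a Microsoft::WRL::ComPtr<ID3D11InputLayout>:

// Enabling lighting selects a different vertex shader permutation.
effect->SetLightingEnabled(true);

void const* shaderByteCode;
size_t byteCodeLength;
effect->GetVertexShaderBytecode(&shaderByteCode, &byteCodeLength);

// Recreate the layout so it matches the new shader input signature.
device->CreateInputLayout(VertexPositionNormalTexture::InputElements,
                          VertexPositionNormalTexture::InputElementCount,
                          shaderByteCode, byteCodeLength,
                          inputLayout.ReleaseAndGetAddressOf());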

Interfaces

The built-in effects support a number of different settings, some of which are organized into more 'generic' interfaces.
  • IEffect is the basic interface for all effects which includes applying it to the device context and obtaining the shader information needed to create a Direct3D 11 input layout with a signature that matches the effect's shader. Remember that a given Effect instance could return a different shader based on internal state.
  • IEffectMatrices is the interface for setting an effect's world, view, and projection matrices. All the built-in effects support this interface.
  • IEffectLights is the interface for controlling the effect's lighting computations and settings. This is supported by BasicEffect, EnvironmentMapEffect, and SkinnedEffect.
  • IEffectFog is the interface for controlling the effect's fog settings. This is supported by all the built-in effects.
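
These interfaces allow code to be written against any effect that supports a given feature. A sketch (assuming a using namespace DirectX declaration; whether a particular effect implements an interface is checked at runtime here with dynamic_cast):

void XM_CALLCONV SetCamera(IEffect* effect, FXMMATRIX view, CXMMATRIX projection)
{
    // Not every effect implements IEffectMatrices, so query for it first.
    auto matrices = dynamic_cast<IEffectMatrices*>(effect);
    if (matrices)
    {
        matrices->SetView(view);
        matrices->SetProjection(projection);
    }
}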

Coordinate systems

The built-in effects work equally well for both right-handed and left-handed coordinate systems. The one difference is that the fog start & end settings need to be negated for left-handed coordinate systems (i.e. SetFogStart(6), SetFogEnd(8) for right-handed coordinates becomes SetFogStart(-6), SetFogEnd(-8) for left-handed coordinates).
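
In code, the example above looks like this (a sketch; fog is assumed to already be enabled via SetFogEnabled(true)):

// Right-handed coordinates
effect->SetFogStart(6);
effect->SetFogEnd(8);

// Equivalent settings for left-handed coordinates
effect->SetFogStart(-6);
effect->SetFogEnd(-8);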

Feature Level Notes

The built-in shaders are compiled using the vs_4_0_level_9_1 and ps_4_0_level_9_1 profiles to support all feature levels.

The compiled shaders are integrated into the DirectXTK library to avoid the need for runtime compilation, shader reflection, or deploying compiled shader binary files (.cso).

Threading model

Creation is fully asynchronous, so you can instantiate multiple effect instances at the same time on different threads. Each instance only supports drawing from one thread at a time, but you can simultaneously draw on multiple threads if you create a separate effect instance per Direct3D 11 deferred context.
http://msdn.microsoft.com/en-us/library/windows/desktop/ff476892.aspx
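
For example, a renderer that records commands on worker threads might give each deferred context its own effect instance (a sketch only; command-list recording and playback are omitted):

// One deferred context and one effect instance per worker thread.
Microsoft::WRL::ComPtr<ID3D11DeviceContext> deferredContext;
device->CreateDeferredContext(0, deferredContext.GetAddressOf());

auto workerEffect = std::make_unique<BasicEffect>(device);

// On the worker thread, draw using the per-thread effect and context.
workerEffect->Apply(deferredContext.Get());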

Further reading

http://blogs.msdn.com/b/shawnhar/archive/2010/04/28/new-built-in-effects-in-xna-game-studio-4-0.aspx
http://blogs.msdn.com/b/shawnhar/archive/2010/04/30/built-in-effects-permutations-and-performance.aspx
http://blogs.msdn.com/b/shawnhar/archive/2010/04/25/basiceffect-optimizations-in-xna-game-studio-4-0.aspx
http://blogs.msdn.com/b/shawnhar/archive/2008/08/22/basiceffect-a-misnomer.aspx
http://blogs.msdn.com/b/shawnhar/archive/2010/08/04/dualtextureeffect.aspx
http://blogs.msdn.com/b/shawnhar/archive/2010/08/09/environmentmapeffect.aspx
http://blogs.msdn.com/b/chuckw/archive/2012/05/07/hlsl-fxc-and-d3dcompile.aspx
