Showing posts with label vulkan. Show all posts

2016-08-20

How a frame is rendered in Aether3D

Aether3D is a component-based game engine supporting modern rendering APIs. There is no lighting yet, but I'm porting my Forward+ implementation to it soon; directional and spot light shadows are already supported. In this post I will run through the steps needed to render one frame.

A Scene object contains GameObjects. GameObjects containing the component types SpriteRendererComponent, MeshRendererComponent, and TextRendererComponent are rendered by GameObjects that contain a CameraComponent. Cameras can render into the screen or into a render texture. If a DirectionalLightComponent or SpotLightComponent has its shadow flag enabled, a special camera renders its shadow map.
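To make the structure above concrete, here is a simplified sketch of a component-based game object. The class names match the post, but the members and the GetComponent lookup are made up for illustration; Aether3D's real implementation differs.

```cpp
#include <memory>
#include <string>
#include <vector>

// Base class for all components; concrete components derive from it.
struct Component
{
    virtual ~Component() = default;
};

struct MeshRendererComponent : Component {};
struct CameraComponent : Component {};

// A GameObject is little more than a named bag of components.
struct GameObject
{
    std::string name;
    std::vector< std::unique_ptr< Component > > components;

    // Returns the first component of type T, or nullptr if none exists.
    template< typename T > T* GetComponent() const
    {
        for (const auto& component : components)
        {
            if (T* typed = dynamic_cast< T* >( component.get() ))
            {
                return typed;
            }
        }

        return nullptr;
    }
};
```

A renderer built on this can then ask each GameObject whether it has a CameraComponent or one of the renderer components and act accordingly.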


Rendering steps:

1. Scene.Render()


This method first does the housekeeping work needed to render a frame: it resets statistics, begins a new render pass, acquires the next swapchain image (depending on the API), etc.

Then it calculates an axis-aligned bounding box for the whole scene (needed for the shadow frustum calculation) and updates the transformation matrix hierarchy.

Then it collects game objects with camera components into a container and sorts it by camera type (render-texture or normal), layer, etc.
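The camera sort described above can be sketched like this. The CameraItem struct and SortCameras function are hypothetical names for illustration; the real engine sorts its own camera container, but the ordering idea is the same: render-texture cameras go first so their results exist before screen cameras sample them, with layer as a tie-breaker.

```cpp
#include <algorithm>
#include <vector>

// Minimal stand-in for a game object with a camera component.
struct CameraItem
{
    bool rendersToTexture;
    int layer;
};

void SortCameras( std::vector< CameraItem >& cameras )
{
    std::sort( cameras.begin(), cameras.end(), []( const CameraItem& a, const CameraItem& b )
    {
        if (a.rendersToTexture != b.rendersToTexture)
        {
            return a.rendersToTexture; // render-texture cameras come first
        }

        return a.layer < b.layer; // then sort by layer
    } );
}
```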

2. Scene.RenderWithCamera()


First, render-texture cameras are looped over and this method is called once for a normal camera and six times for a cube map camera. Then shadows are rendered. Finally, cameras rendering directly into the screen are rendered. If a camera also wants to render depth and normals into a texture (which can be used later in post-processing and lighting effects), that is done after this step.

The camera's clear flag (clear color, clear depth, or don't clear) is applied at the beginning of this method.

At this point the skybox is rendered.
Then the camera's frustum is calculated.
Now game objects are looped over, and objects containing a sprite renderer or text renderer are rendered. Mesh renderer objects are collected, sorted by their distance to the camera, and then rendered. They reference a Material, which feeds the blending state, culling state, shader, and uniforms into the renderer.
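The distance sort of mesh renderers can be sketched as follows. The struct and function names are invented for illustration, not Aether3D's actual API; sorting by squared distance avoids a square root per comparison while producing the same order.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

// Minimal stand-in for a mesh renderer entry in the render queue.
struct MeshItem
{
    Vec3 position;
    float sortKey; // squared distance to the camera, filled before sorting
};

static float SquaredDistance( const Vec3& a, const Vec3& b )
{
    const float dx = a.x - b.x;
    const float dy = a.y - b.y;
    const float dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Sorts meshes front-to-back relative to the camera position.
void SortMeshesByDistance( std::vector< MeshItem >& meshes, const Vec3& cameraPosition )
{
    for (MeshItem& mesh : meshes)
    {
        mesh.sortKey = SquaredDistance( mesh.position, cameraPosition );
    }

    std::sort( meshes.begin(), meshes.end(), []( const MeshItem& a, const MeshItem& b )
    {
        return a.sortKey < b.sortKey;
    } );
}
```

Front-to-back order helps opaque geometry take advantage of early depth testing; transparent geometry would instead be sorted back-to-front.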

3. GfxDevice::Draw()


Everything that's rendered uses a method from the GfxDevice namespace:
void ae3d::GfxDevice::Draw( VertexBuffer& vertexBuffer, int startIndex, int endIndex, Shader& shader, BlendMode blendMode, DepthFunc depthFunc,
                            CullMode cullMode )


This method first calculates a pipeline state object (PSO) hash; if it's not found in the cache, it creates a new PSO.
On the Vulkan and D3D12 renderers, a descriptor set is filled with draw parameters and the actual drawing uses vkCmdDrawIndexed() or DrawIndexedInstanced().
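The PSO caching idea can be sketched like this: pack the draw state into a hash, look it up in a map, and only build a new pipeline state object on a miss. The enums, bit layout, and function names below are illustrative assumptions, not the engine's actual code.

```cpp
#include <cstdint>
#include <unordered_map>

enum class BlendMode : std::uint64_t { Off, AlphaBlend, Additive };
enum class DepthFunc : std::uint64_t { LessOrEqualWriteOn, NoneWriteOff };
enum class CullMode  : std::uint64_t { Off, Back, Front };

// Placeholder; an API-specific PSO handle would live here.
struct PipelineState {};

// Packs the small state enums into distinct bit ranges above the shader id.
std::uint64_t GetPSOHash( std::uint64_t shaderId, BlendMode blendMode, DepthFunc depthFunc, CullMode cullMode )
{
    std::uint64_t hash = shaderId;
    hash |= static_cast< std::uint64_t >( blendMode ) << 48;
    hash |= static_cast< std::uint64_t >( depthFunc ) << 52;
    hash |= static_cast< std::uint64_t >( cullMode )  << 56;
    return hash;
}

std::unordered_map< std::uint64_t, PipelineState > psoCache;

PipelineState& GetOrCreatePSO( std::uint64_t shaderId, BlendMode blendMode, DepthFunc depthFunc, CullMode cullMode )
{
    const std::uint64_t hash = GetPSOHash( shaderId, blendMode, depthFunc, cullMode );

    if (psoCache.find( hash ) == psoCache.end())
    {
        // In a real renderer this is the expensive API call
        // (vkCreateGraphicsPipelines / CreateGraphicsPipelineState).
        psoCache[ hash ] = PipelineState{};
    }

    return psoCache[ hash ];
}
```

Because PSO creation is expensive, the cache only pays off after warm-up, which is why the "Future work" list below mentions generating them before the main loop.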

Future work

There is room for improvement, as the engine is still in its early stages (v0.6 under development).

1. PSO objects are expensive to generate, so it would be better to generate them before the main loop.
2. There is no instancing support yet.
3. Little profiling has been done so far, as the main goal has been to get things working on all APIs (Vulkan, OpenGL, Metal, D3D12).
4. Transparency handling is missing; this is actually currently in development.

2016-03-12

Debugging Graphics

Intro

I develop a lot of graphics code using Vulkan, OpenGL, D3D12, and Metal, and have found that the following methods make my life easier when something doesn't render right:

Enable the debug layer

Enable your API's debug output, check for runtime errors and warnings, and fix them all.

D3D12

ID3D12Debug* debugController = nullptr;
const HRESULT dhr = D3D12GetDebugInterface( IID_PPV_ARGS( &debugController ) );

if (SUCCEEDED( dhr ))
{
    debugController->EnableDebugLayer();
    debugController->Release();
}

// ... CreateDevice() ...

ID3D12InfoQueue* infoQueue = nullptr;
HRESULT hr = device->QueryInterface( IID_PPV_ARGS( &infoQueue ) );

if (SUCCEEDED( hr ))
{
    // Break into the debugger when the runtime reports an error.
    infoQueue->SetBreakOnSeverity( D3D12_MESSAGE_SEVERITY_ERROR, TRUE );
    infoQueue->Release();
}

OpenGL


See https://www.opengl.org/wiki/Debug_Output

Vulkan


Side note: on my system RenderDoc crashes if I try to attach to a program that has the debug layer enabled.
First you need to install the LunarG Vulkan SDK from http://lunarg.com/vulkan-sdk/
I keep my debug layer code inside a namespace like this:

namespace debug
{
    PFN_vkCreateDebugReportCallbackEXT CreateDebugReportCallback = nullptr;
    PFN_vkDestroyDebugReportCallbackEXT DestroyDebugReportCallback = nullptr;
    PFN_vkDebugReportMessageEXT dbgBreakCallback = nullptr;

    VkDebugReportCallbackEXT debugReportCallback = VK_NULL_HANDLE;
    const int validationLayerCount = 9;
    const char *validationLayerNames[] =
    {
        "VK_LAYER_GOOGLE_threading",
        "VK_LAYER_LUNARG_mem_tracker",
        "VK_LAYER_LUNARG_object_tracker",
        "VK_LAYER_LUNARG_draw_state",
        "VK_LAYER_LUNARG_param_checker",
        "VK_LAYER_LUNARG_swapchain",
        "VK_LAYER_LUNARG_device_limits",
        "VK_LAYER_LUNARG_image",
        "VK_LAYER_GOOGLE_unique_objects",
    };

    VKAPI_ATTR VkBool32 VKAPI_CALL messageCallback(
        VkDebugReportFlagsEXT flags,
        VkDebugReportObjectTypeEXT,
        uint64_t, size_t, int32_t msgCode,
        const char* pLayerPrefix, const char* pMsg,
        void* )
    {
        if (flags & VK_DEBUG_REPORT_ERROR_BIT_EXT)
        {
            ae3d::System::Print( "Vulkan error: [%s], code: %d: %s\n", pLayerPrefix, msgCode, pMsg );
        }
        else if (flags & VK_DEBUG_REPORT_WARNING_BIT_EXT)
        {
            ae3d::System::Print( "Vulkan warning: [%s], code: %d: %s\n", pLayerPrefix, msgCode, pMsg );
        }

        return VK_FALSE;
    }


    void Setup( VkInstance instance )
    {
        CreateDebugReportCallback = (PFN_vkCreateDebugReportCallbackEXT)vkGetInstanceProcAddr( instance, "vkCreateDebugReportCallbackEXT" );
        DestroyDebugReportCallback = (PFN_vkDestroyDebugReportCallbackEXT)vkGetInstanceProcAddr( instance, "vkDestroyDebugReportCallbackEXT" );
        dbgBreakCallback = (PFN_vkDebugReportMessageEXT)vkGetInstanceProcAddr( instance, "vkDebugReportMessageEXT" );

        VkDebugReportCallbackCreateInfoEXT dbgCreateInfo;
        dbgCreateInfo.sType = VK_STRUCTURE_TYPE_DEBUG_REPORT_CALLBACK_CREATE_INFO_EXT;
        dbgCreateInfo.pNext = nullptr;
        dbgCreateInfo.pfnCallback = (PFN_vkDebugReportCallbackEXT)messageCallback;
        dbgCreateInfo.pUserData = nullptr;
        dbgCreateInfo.flags = VK_DEBUG_REPORT_ERROR_BIT_EXT | VK_DEBUG_REPORT_WARNING_BIT_EXT | VK_DEBUG_REPORT_PERFORMANCE_WARNING_BIT_EXT;

        VkResult err = CreateDebugReportCallback( instance, &dbgCreateInfo, nullptr, &debugReportCallback );

        if (err != VK_SUCCESS)
        {
            ae3d::System::Print( "Unable to create debug report callback.\n" );
        }
    }
}

When creating the Vulkan instance, I append VK_EXT_DEBUG_REPORT_EXTENSION_NAME to instanceCreateInfo.ppEnabledExtensionNames.
When creating the Vulkan device, I pass the validation layers like this:
deviceCreateInfo.enabledLayerCount = debug::validationLayerCount;
deviceCreateInfo.ppEnabledLayerNames = debug::validationLayerNames;

Use debug names

Debug names appear in graphics debugging tools and validation layer messages, so they help you identify the object.

OpenGL

You'll need to make sure the KHR_debug extension is available before using these functions:
glObjectLabel( GL_TEXTURE, textureHandle, nameLength, name );
glObjectLabel( GL_PROGRAM, shaderHandle, nameLength, name );
glObjectLabel( GL_FRAMEBUFFER, fboHandle, nameLength, name );

etc.

D3D12

ID3D12Resource* texture = ...;
texture->SetName( L"texture" );


If you need to convert a const char* into an LPCWSTR, you can do it like this:
wchar_t wstr[ 128 ];
std::mbstowcs( wstr, my_string.c_str(), 128 );
texture->SetName( wstr );

Metal

Many objects have a .label property:
metalTexture.label = @"texture";

Use tools

These tools can be used to verify the rendering process by inspecting textures, render targets, buffers, rasterizer state etc.
RenderDoc is a good debugger for D3D11, OpenGL and Vulkan.
For D3D12, Visual Studio's own graphics debugger is good. You can get it by installing "Graphics Tools" in Windows 10: Settings -> System -> Apps & Features -> Manage optional features
OpenGL ES and Metal users on Mac will probably use Xcode's debugger.
AMD and NVIDIA also have tools for this.

Shader debugging

GLSL shaders can be compiled and checked for errors with glslangValidator; you can make it part of your build process for extra credit. You can also use general static analysis tools like PVS-Studio or CppCheck. Using these tools I have found uninitialized variables in rendering code, among other issues.

Writing a shader hot-reloading system is not a big task, but it pays off. Imagine debugging a video blitting shader that shows wrong colors: you can pause your game on a frame, modify the shader, and see the results in that video frame instantly. You can also make the system take a screenshot before and after recompilation to more easily compare the results in an external program like Photoshop.

Test on multiple GPUs, even from the same vendor

There are differences in how textures are initialized (garbage, white/black, etc.), how resource transitions are done, how flags are handled, and so on.

Conclusion

Many things can and will go wrong when rendering but there are features like validation layers and graphics debuggers that make finding the problem easier.

2015-05-16

What I've been doing recently

I've been adding graphics features to my new engine slowly because I don't want to write a lot of code that will be replaced by newer APIs. To that end, I made an iOS branch on GitHub and am learning Metal; I've already got a textured quad to render. So far the best learning resource has been http://metalbyexample.com. I also installed the Windows 10 preview and VS 2015 RC on my secondary laptop and am learning D3D12. When I have more experience with D3D12 and Metal, I'll merge my renderers into the master branch. I'm also waiting for Vulkan and reading the Mantle documentation until it's out.

Writing only engine code would not be so productive, so I'm working on two small games at the moment. The first one is a desktop FPS made using Unity 5.

Some of the assets are downloaded from sites like https://freesound.org, pamargames.com, and cgtextures.com, but some are made by myself. While the level design/art design/balancing is amateurish, I'm paying special attention to polishing feedback: dealing/receiving damage, transitions, sounds, bullet holes, etc. When the game is ready, I'll put the player on my website along with the project folder. I also ditched the built-in MonoDevelop in favour of Xamarin Studio, which is faster, but Unity overrides my formatting options.

My other game under construction is a roguelike using my own engine.

I haven't decided on the final design or platform. If my iOS/Metal renderer advances quickly, it would be nice to test the game on my iPhone 5S. Using my own engine for game development has been productive, since it has uncovered some bugs that would otherwise have bitten me later on. The new engine's component-based game object system has also been nice to use in this game. Some of the components used are TextRendererComponent, SpriteRendererComponent, TransformComponent, and AudioSourceComponent.

Next steps in my engine will be a virtual filesystem, which enables faster load times by packing multiple files into one. While making the iOS port, I'll also be writing NEON SIMD matrix operations. I'm also making a multi-platform scene editor using Qt; its first version will be included in the next engine release.

2015-03-08

GDC 2015, Vulkan etc.

Game Developers Conference was held on March 2-6 in San Francisco.

Unity 5 was finally released. I have used it since beta 1, and I'm happy about the licensing change that allows all engine features to be used in the personal edition. Epic also dropped the subscription fee for Unreal Engine, lowering the entry barrier for casual developers, as those who are serious have already subscribed. With Valve's announcement of Source 2 being free, there's now good competition among indie-friendly engines.

John Carmack talked about mobile VR, noting that it has the potential to reach far more customers than desktop VR. I have always liked Carmack's talks; he just sits there without any materials and talks as long as he's allowed.

Khronos revealed Vulkan, the successor to OpenGL and OpenGL ES. It evolved from Mantle and its abstractions are similar to D3D12's, making the driver simpler. It also allows more control than OpenGL, in that you have to allocate memory yourself, avoid hazards, and handle threading. More control means more potential for performance but also more responsibility. There will, however, be an optional debug layer that should help the developer find problems. Shaders will finally be supported in an intermediate representation, making them load faster and, more importantly, alleviating compatibility problems caused by subtle differences in parsers, i.e. different drivers accepting broken syntax or not accepting correct syntax.

All these new APIs (Metal, Mantle, D3D12, Vulkan) provide abstractions that are different from earlier APIs. For me, that means that when I begin writing my new engine's renderer, I'll follow those abstractions. I cannot use D3D12 or Vulkan today and will probably begin writing a new renderer before they are out (I'm starting my new engine development next week), but I will try to design the renderer to allow sane usage of the new APIs, and I will implement D3D12 and Vulkan renderers as soon as they are released.