Now that Qt 3D Studio 1.0 has been released, it is time to have a sneak peek at some of the upcoming developments. As outlined in the release announcement, there is work on-going to move the runtime, meaning the engine that renders and runs the scenes, on top of Qt 3D. Let’s take a closer look at this.
Overview
Scenes and Layers
Qt 3D Studio is a design tool that allows rapid development of 3D scenes, focused on, but not limited to, 3D user interfaces. Once assets like 3D models and texture maps are imported, designers create scenes by placing and transforming 3D models, applying materials, and setting up keyframe-based animations that target properties of the models, materials and layers. The concept of layers maps naturally to what one may be familiar with from tools like Photoshop: each layer contains a 3D scene with its own camera. These are then composited together based on their position, size and blending settings, thus forming the final output of the rendering.
On the layer level there are multiple antialiasing techniques available, like multisampling, supersampling, progressive and temporal antialiasing. See the documentation for an overview of these.
Slides
All this is complemented by a slide system, not unlike presentation tools like PowerPoint. A slide could be thought of as a state: it defines the set of active (visible) objects, the property changes that get applied to the various scene objects, and the set of animations that start when entering the slide in question. This is complemented by the concept of the master slide, which allows defining a set of objects and animations that are present on all slides.
Materials
The default material provides pixel-based lighting, directional, point and area lights, shadow mapping, screen space ambient occlusion, image-based lighting and a number of other features. When it is not sufficient, custom materials can be applied. These provide custom (fragment) shader code together with a set of properties that form the input to the shaders. Such properties are editable and animatable in the editor just like the built-in ones. While many custom materials will contain a single shader, they can also contain multiple ones, thus defining multiple passes that run in order, each of them further processing the results of the previous passes.
Effects
To apply effects on the content of a given layer, post-processing effects can be used. These are similar to custom materials, but take the output of the 3D rendering from a given layer as their input. Conceptually they map to the ShaderEffect items of Qt Quick but are somewhat more powerful.
Sub-presentations
While one presentation (a single .uip file) describes a single scene (albeit with multiple layers, hence it is more like a 2D composition of multiple 3D scenes), it is possible to have multiple presentations loaded and run in parallel. Here one presentation serves as the “main” one, which is the presentation that gets rendered to the screen. The others serve as sub-presentations that are first rendered offscreen, and then used as texture maps in the materials of the main scene. They can also be used as the source for one or more of the layers of the main presentation.
Building on this, Qt 3D Studio also offers interoperation with Qt Quick. This is achieved via the familiar QQuickRenderControl. This means that interactive Qt Quick scenes can be displayed inside the Qt 3D Studio scene.
This list, while already long enough, does not cover everything. See the documentation for more details.
The screenshot shows many of the facilities mentioned above:
- the slide management pane on the left,
- the pane on the right, which displays either basic objects (that can be dragged into the scene) or the presentation and asset browser (where 3D models and texture maps are dragged and dropped in order to import assets, and then dragged into the scene or the scene browser below),
- the bottom pane, which contains the scene browser (note how the example has two layers, each with its own camera, lights and models) and the timeline used to define and edit keyframes,
- the bottom-right pane, which shows the properties of the currently selected model, material or other object.
What’s in a Runtime?
The main editor application is complemented by the so-called runtime component, which consists of the C++ and OpenGL-based engine that renders and runs the presentations created with the editor both in the viewer application shipped with Qt 3D Studio and in any other Qt applications. The APIs provided allow integrating Qt 3D Studio scenes into Qt Quick, QWidget and QWindow-based applications, and also provide facilities for rendering completely offscreen in order to generate videos for example. The rendering APIs are complemented by a set of QML and C++ APIs that allow changing properties of the scene objects at runtime and controlling the slide and animation system. See the links to the documentation for more details.
To get an impression of what the APIs available to Qt application developers look like, let's look at the source of two of the included examples. First, a straightforward pure QWindow-based application (found under examples/studio3d/surfaceviewer):
#include <QtStudio3D/Q3DSSurfaceViewer>
#include <QtStudio3D/Q3DSViewerSettings>
#include <QtStudio3D/Q3DSPresentation>
#include <QtStudio3D/Q3DSSceneElement>
#include <QtStudio3D/Q3DSElement>

...

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QWindow window;
    QSize size(1200, 800);
    window.resize(size);
    window.setSurfaceType(QSurface::OpenGLSurface);
    window.setTitle(QStringLiteral("Qt 3D Studio surface viewer example"));
    window.create();

    QOpenGLContext context;
    context.setFormat(window.format());
    context.create();

    Q3DSSurfaceViewer viewer;
    viewer.presentation()->setSource(QUrl(QStringLiteral("qrc:/presentation/circling_cube.uip")));
    viewer.setUpdateInterval(0);
    viewer.settings()->setScaleMode(Q3DSViewerSettings::ScaleModeFill);
    viewer.settings()->setShowRenderStats(true);

    Q3DSSceneElement sceneElement(viewer.presentation(), QStringLiteral("Scene"));
    Q3DSElement counterElement(viewer.presentation(), QStringLiteral("Scene.Layer.Loopcounter"));

    viewer.initialize(&window, &context);
    window.show();

    int n = 0;
    QString loopCounter = QStringLiteral("Loop %1");
    QObject::connect(&sceneElement, &Q3DSSceneElement::currentSlideIndexChanged, [&]() {
        if (sceneElement.currentSlideIndex() == 1)
            n++;
        counterElement.setAttribute(QStringLiteral("textstring"), loopCounter.arg(n));
    });

    return app.exec();
}
In practice, one will more likely use the Qt Quick integration, at the core of which stands the Studio3D element. Under the hood this is built on QQuickFramebufferObject. (The snippet below is from examples/studio3d/qmldynamickeyframes.)
import QtQuick 2.7
import QtStudio3D 1.0

Item {
    ...
    Studio3D {
        id: studio3D
        anchors.fill: parent

        // ViewerSettings item is used to specify presentation independent viewer settings.
        ViewerSettings {
            scaleMode: ViewerSettings.ScaleModeFill
            showRenderStats: false
        }

        // Presentation item is used to control the presentation.
        Presentation {
            source: "qrc:/presentation/dyn_key.uip"

            // SceneElement item is used to listen for slide changes of a scene in the presentation.
            // You can also change the slides via its properties.
            SceneElement {
                id: scene
                elementPath: "Scene"
                onCurrentSlideIndexChanged: {
                    console.log("Current slide : " + currentSlideIndex + " (" + currentSlideName + ")");
                }
                onPreviousSlideIndexChanged: {
                    console.log("Previous slide: " + previousSlideIndex + " (" + previousSlideName + ")");
                }
            }

            // Element item is used to change element attributes
            Element {
                id: materialElement
                elementPath: "Scene.Layer.Sphere.Material"
            }

            property int desiredSlideIndex: 1
            property int colorIndex: 0
            property var colorArray: [
                [1.0, 1.0, 1.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [0.0, 1.0, 1.0],
                [1.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]
            ]

            function nextSlide() {
                // A separate desiredSlideIndex variable is used to keep track of the
                // desired slide, because SceneElement's currentSlideIndex property works
                // asynchronously. This way the button click always changes the slide even
                // if you click it twice during the same frame.
                desiredSlideIndex = desiredSlideIndex != 3 ? desiredSlideIndex + 1 : 1;
                scene.currentSlideIndex = desiredSlideIndex;
                slideButtonText.text = "Change Slide (" + desiredSlideIndex + ")";
            }

            function resetTime() {
                scene.goToTime(0);
            }

            function changeColor() {
                colorIndex = colorIndex >= colorArray.length - 1 ? 0 : colorIndex + 1;
                materialElement.setAttribute("diffuse.r", colorArray[colorIndex][0]);
                materialElement.setAttribute("diffuse.g", colorArray[colorIndex][1]);
                materialElement.setAttribute("diffuse.b", colorArray[colorIndex][2]);
                changeColorButton.color = Qt.rgba(colorArray[colorIndex][0],
                                                 colorArray[colorIndex][1],
                                                 colorArray[colorIndex][2], 1.0);
            }
        }
        onRunningChanged: console.log("Presentation running")
    }
    ...
(Before anyone asks: The ability to embed the output of the various Qt UI frameworks into each other opens up the possibility for creative recursive solutions indeed. However, while the idea of embedding a Qt Quick scene into a Qt 3D Studio scene embedded again in Qt Quick which then perhaps gets embedded into a plain Qt 3D scene which in turn can be part of another Qt Quick scene, … may sound exciting at first, it is best not to overdo it.)
Of course, the editor application itself also needs to display, edit and have fine-grained control over the scene. In the current iteration (1.0) not everything is unified in this respect, meaning the way the editor renders under the hood is not necessarily based on the same code as in the viewer or in the external applications. Longer term we expect to have a more unified approach in place. For now, let’s focus on the runtime from the perspective of a typical Qt application.
Towards Qt 3D Studio 2.0
You said C++ and OpenGL. Good enough, no?
The 1.0 runtime is pretty portable already and it comes with the necessary Qt integration bits as shown above. However, as it is based on the code inherited from the original NVIDIA contribution with only certain parts Qt-ified, and with a strong dependency on OpenGL, there is room for improvement. Fortunately, it turns out we have a good way forward, while staying fully compatible with presentations created with the already released version 1.0.
Qt 3D 2.0 (not to be confused with the Qt 3D 1.0 project from Qt 4 times) has been introduced to Qt 5 by developers from KDAB. See this post for an introduction. Besides providing the necessary building blocks, like an entity-component system and the concept of framegraphs, which turn out to be an excellent tool for describing how to render multi-pass 3D scenes in a data-driven manner, its abstractions also come in handy when looking towards a future where graphics APIs other than OpenGL will play an increasingly important role.
Therefore, back in April we started a little experiment under the internal codename Dragon3 to see what it would take to load and run those .uip presentations on top of Qt 3D. The new code base is developed with the familiar Qt style and conventions, thus providing a better, more maintainable, and more future-proof component in the larger Qt ecosystem. This is the project now often referred to as the Qt 3D Studio Runtime 2.0, with a first release due around May 2018, together with Qt 3D Studio 2.0. Like Qt 3D Studio itself, the code is already available in public today, although it comes with a huge disclaimer: it is heavily work in progress.
Should I wait then?
Not at all. As mentioned before, compatibility is a priority, so for all practical purposes the 1.0 release is the thing to use. Naturally those who feel like experimenting are welcome to check out the code from the qt3d-runtime repository as well (which needs the dev (eventually 5.11) branch of qtbase and qt3d to build).
Let’s continue in the next post. In the meantime those who cannot wait and are keen on diving into the juicy details can check out my presentation at the Qt World Summit earlier this year.
That’s all for now, happy 3D hacking!
The post Towards an Improved Qt 3D Studio Runtime appeared first on Qt Blog.