
Building Virtual Objects in Meta Spark Studio


What are virtual objects (VOs) and what role do they play in augmented reality? Learn more about what they are and how to create them in Meta Spark Studio.


Meta Spark is our platform for building the augmented layer of the metaverse. Today, we're launching a beta for creating a new type of AR experience in Meta Spark Studio: virtual objects. Hi, I'm Lucia, and today I'm going to be showing you these new virtual objects, how they work, and how to create them. Virtual objects will populate the world around us. You'll be able to place them on surfaces across your home or office to access unique functionality or to make the space feel more personal. They can be as simple as an interactive 3D asset or as complex as a full-fledged game or virtual assistant. Virtual objects are a building block of the augmented layer of the metaverse. They contain logic and can interact with the real world. Virtual objects can serve a multitude of purposes. They can be utilitarian like a clock, inspiring like a piece of digital art, or entertaining like a game. They can be tokens that remind you of people, or a variety of other things that change the way you interact with the world around you. Once you create virtual objects, you'll be able to test them through the Meta Spark Player app, which is also launching in beta today. The new Meta Spark Player app allows you to test virtual objects on the Meta Quest family of devices to better approximate future AR glasses. This is particularly true on Meta Quest Pro, our newest device, which has color passthrough. Let's get started with creating your first virtual object. Those of you who are familiar with AR might be wondering: what's the difference between virtual objects and 3D models or effects? Well, virtual objects are a route towards standardizing content for AR, a first step towards interoperability. They can coexist with other virtual objects in the same scene, even if they are created by different people. This is enabled by the virtual object's ports, or inputs and outputs.
These ports allow the virtual object to receive information about the world and other objects, and to inform the app about its state or actions. Blocks in Studio are similar, but you can customize their inputs and outputs. This customization, though, means that blocks from different creators cannot coexist in the same application, since there are no standards for how they can interact. Now that you have a general understanding of what a virtual object is, I'm going to show you how to build a virtual object in Meta Spark Studio and how to test it on the Meta Quest Pro. You'll be making the timer shown here. First, we're going to open Studio, and right on the welcome window, you'll see a tile to create a new virtual object. Virtual objects are represented by an icosahedron icon. This is the new window to create virtual objects. On the left, you can see the scene tree with the virtual object node. In the Inspector, you'll notice that we have a list of inputs and outputs that are available for virtual objects. They also show up in the patch editor. You can manage them and add more by clicking on the More Ports button. This brings up the picker with the virtual object port sets, which we call interfaces. For now, we only have Core and Activation, but we plan to have more as virtual objects gain new interactions and capabilities. We are also introducing new features in the simulator to help you create virtual objects. It now has the same aspect ratio as the new Meta Quest Pro, and we have added interactive environments where you can simulate virtual objects. There are six spaces, including indoor, outdoor, domestic, and art exhibition spaces. They represent a variety of places where AR could appear in the metaverse. You can navigate around them and toggle them between day and night to test different illumination settings. You can also use them to test plane tracker and world AR effects. Let's start building the timer virtual object.
First, you can import the timer model and put it in the scene. Now let's look at the first input, Show. Show is a pulse that the AR application will trigger when a user places the object in the scene; it tells the object to appear and expects a transition or animation. We want to connect an animation with this pulse so that the timer will have a nice scale-up effect when it is placed. To do this, you can connect an animation patch and a transition patch to the object's scale. This will make the object appear in the scene by changing its scale from zero to one. You can click the Show input in the Inspector to test that it works and to view the animation. The next input we want to connect is Hide. This pulse will be triggered whenever the application wants to hide the object. With a transition, it's possible to reuse the same animation we have already used, but scaling the object down until it disappears. For this, you can connect the Hide input to Reverse on the animation patch. If you click the Hide pulse in the Inspector, you'll be able to see the virtual object scaling down to disappear. Next, you can connect the Show Complete and Hide Complete outputs. These ports will tell the AR application that the virtual object has completed its transition to its shown or hidden state. For this, you can create an option switch and an option sender that receive pulses, enabling you to send the animation's Completed pulse to both virtual object outputs. You'll need to connect the Show and Hide inputs, as well as the Completed pulses from the animation, to the option switch and sender patches. The remaining ports of the Core set, Show Instantly and Hide Instantly, need to be connected with an animation that lasts 0 seconds, so that the virtual object can be shown in its full form immediately or hidden without any transition.
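The Show/Hide wiring described above amounts to a small transition controller: one animation scales the object up for Show, the same animation reversed scales it down for Hide, and a completion pulse is reported back to the host application. Here is a minimal Python sketch of that logic (purely illustrative; the real behavior is built from Patch Editor patches, and these class and method names are hypothetical):

```python
# Illustrative sketch of the Core port wiring: Show/Hide animate the
# object's scale between 0 and 1, the Instantly variants use a
# zero-duration transition, and each returns its completion pulse.
class ShowHideController:
    def __init__(self):
        self.scale = 0.0                 # object starts hidden

    def show(self, steps=4):
        # Show pulse: animate scale from 0 up to 1, then report completion.
        for i in range(1, steps + 1):
            self.scale = i / steps
        return "show_complete"

    def hide(self, steps=4):
        # Hide pulse: reuse the same animation in reverse (scale 1 -> 0).
        for i in range(steps - 1, -1, -1):
            self.scale = i / steps
        return "hide_complete"

    def show_instantly(self):
        # Zero-duration transition: appear immediately in full form.
        self.scale = 1.0
        return "show_complete"

    def hide_instantly(self):
        # Zero-duration transition: disappear without animation.
        self.scale = 0.0
        return "hide_complete"
```

Connecting every port, as the video recommends, corresponds to implementing all four methods so any host application gets a well-defined response.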
We recommend that you always connect all virtual object inputs and outputs; that will make your virtual object complete and ready for future AR applications that may use it in different ways. You can learn more about connecting inputs and outputs in the virtual object templates as well as in the documentation. In the next section, you'll learn how to use the Activation set of inputs and outputs. Finally, over in the Inspector, you can click the port icons to check that all the ports are working well. Now you can send the virtual object to the Meta Quest Pro. Very exciting. You should make sure that you have Meta Spark Player installed on your Meta Quest Pro and that you have connected it either with a cable or over WiFi. Once connected, it will show here in the testing panel. You could even share this with another creator or collaborator by creating a link. But for now, we're just going to send it to the device by clicking Send. And here it is in Meta Spark Player. On the new Meta Quest Pro, you can place it on the desk, and you can scale it using the controllers. You can move it, and you can change its elevation. Using the left-hand controller, you can test the virtual object inputs, including Show and Hide. Now that we understand the foundations of creating virtual objects, what's the next step towards making an interactive object? Here to explain is Rodrigo Castile. Over to you, Rodrigo.

Thank you, Lucia. Hi, everyone. I'm Rodrigo, and I'm a software engineer at Meta Spark. I'll be showing you how to build a slightly more complex and dynamic virtual object. That is, we'll be adding interactivity and animations through the Patch Editor, our visual programming language. We'll first cover how to import and use textures, materials, and other assets to animate your virtual object. Then you'll learn how to use the Patch Editor to define its behavior and to integrate the logic with animation and user commands. At the end, we'll test our virtual object on Meta Spark Player.
Let's work on this digital timer together to make it fun. The timer will start the countdown only when it's activated via user input, and then it will explode once it reaches zero. Upon explosion, the state should be reset. We can model this with two states: Idle, when the timer is blinking and waiting to be activated, and Active, when the timer is counting down to zero and exploding. We'll start from a half-baked project containing some pre-imported 3D objects, materials, and an explosion animation. Before we start working on it, let's briefly go over what we currently have in this project. On the right side of the window, we have the Activate input port, which will be used to trigger the timer countdown. This input is part of the Activation interface, and it's a pulse sent by the controller layer when the user is interacting with the virtual object. On the bottom, notice that we already have a patch editor graph that uses loop animations to move and rotate the individual fragments of the timer once the explosion starts. In its current state, though, the timer doesn't display any digits, nor does it respond to user input. On the left, you can see some scene objects and assets. The 3D objects are assets that we have instantiated into the Scene Object panel in the top left corner of the window. They can be broken down into four components: the back, the front, the display, and the digits. The digits and the display share the same materials. They are named in the MMSS format, where the uppercase letter indicates which digit it refers to. Changing a material will affect its corresponding objects. So now let's begin our work by making the timer display some numbers. As mentioned previously, the 3D assets are made up of fragments that can be pulled apart by applying different transformations. However, they also have mesh groups that can display textures separately. If the texture represents an LED number from zero to nine, we can change it to display digits.
Therefore, each digit object points to its corresponding material, and each material points to a number texture. We don't have those textures in the project yet. To import them as a sequence, we can click on the plus button and choose Import Texture Animation. We can then choose the ten number PNG files. Here we use the suffix _N so that the texture frame will match its number. Once chosen, we simply click Import. This process will create one animation sequence, which can be used to choose which frame, that is, which texture, we're displaying at a given moment. Since we need one independent animation for each digit of the timer, let's create three more sequences and rename them to match the material names in the MMSS format. Each animation sequence is linked to a texture sequence from which it will collect its frames, so we can simply select all animation sequences, go to the Inspector, and choose the timer digit textures that we just imported. Then, for each material, we choose the animation sequence that will be providing the texture to be displayed. Notice how the timer is now displaying random digits. How fun. That's because we're not yet specifying which frame should be picked by the animation sequences. To do that dynamically, we have to use the patch editor. Now, I'll give you a quick overview of the patch editor. Patches are building blocks that can pass and receive different data types. Here you can see the Value patch, which receives and produces a number. Numbers can be used in math operations, or to define a logical flow or even states. They can also be used to control animations or to scale objects. This is the Boolean version of the Value patch. A Boolean data type is simple: it can only carry a value that is either true or false. We will be using it to specify whether the timer display is on or not, which gives us the blinking behavior. Another important data type is the pulse. A pulse is an instantaneous signal change emitted when something happens.
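The three Patch Editor data types just described have close analogues in ordinary programming. A rough Python illustration (the mapping is ours, not part of the product; real patches are wired visually in Studio):

```python
# Rough Python analogues of the three Patch Editor data types.

# Value patch: carries a number, usable for math, logical flow, or states.
timer_state = 0                       # e.g. 0 = Idle, 1 = Active

# Boolean patch: true or false, e.g. whether the LED display is lit,
# which is what produces the blinking behavior.
display_on = True

# Pulse: an instantaneous "something happened" signal; the closest code
# analogue is firing a registered callback when an event occurs.
events = []

def on_activate():
    events.append("activate")         # react to the Activate pulse

on_activate()                         # emitting the pulse == invoking it
```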
For example, when the user activates a virtual object, a pulse is emitted. If you're wondering what those yellow patches are: those are consumer patches, and they are associated with assets and objects from your project. When you instantiate them into the patch editor, they can dynamically change properties of your animations, textures, and other types of objects. Our next step is to create a patch graph to set the digits of the timer given the remaining countdown time. For example, if there are 67 seconds left, it should display 01:07. In order to choose which texture sequence frame will be displayed, we will instantiate the Current Frame property as a consumer patch. Notice that all the digits now become zero. We know that converting seconds into minutes and seconds requires computing multiple integer divisions. Although we have built-in patches for division and modulo, reusing them multiple times would be repetitive and tedious, so now I'll show you how to create a patch group asset that we can reuse to improve this process. To create a patch group out of your selected patches, you can simply right-click one of them and select Group. Now, all patches are hidden inside this new group. The next step is to expose patches within the group via its input and output ports. You can do so by right-clicking on the group and selecting Group Properties. In this case, we create inputs and outputs for the numerator, the denominator, the quotient, and the remainder, all of which have number as the data type. Once the ports are added, if you enter the group to edit its patches, you'll notice that there are now magenta and yellow patches representing the input and output ports. So let's simply connect the output patches representing the quotient and the remainder, and the input patches representing the numerator and the denominator. Now our patch group is complete. If you convert your patch group into a patch asset, every instance of the patch asset will be automatically updated.
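The math this patch group encapsulates is ordinary integer division. A Python sketch of the same computation, applied to the four LED digits of the timer (function names here are illustrative, not part of Meta Spark):

```python
# What the division/modulo patch group computes: one quotient and one
# remainder from a numerator and a denominator.
def div_mod(numerator, denominator):
    return numerator // denominator, numerator % denominator

# Reusing the group three times yields the four display digits (MM:SS).
def display_digits(remaining_seconds):
    minutes, seconds = div_mod(remaining_seconds, 60)
    m_tens, m_ones = div_mod(minutes, 10)
    s_tens, s_ones = div_mod(seconds, 10)
    # Each digit indexes a frame (0-9) of its texture animation sequence.
    return m_tens, m_ones, s_tens, s_ones

# 67 seconds left -> digits (0, 1, 0, 7), displayed as 01:07
```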
This happens whenever the original patch group changes. With the improved division-and-modulo patch asset completed, we can easily calculate how many minutes are contained in a given number of seconds. The following patch graph computes which decimal digits should be displayed by just doing some basic time conversion math. Here's a bonus patch graph to make the display blink if needed: if the blinking is off, then the display should always be set to visible. You can read more about the patches in our documentation. Now that you've learned how to create a patch group, let's put together another patch group to sort out the logic of the timer. The idea is that it will have three inputs, Enable, Reset, and Initial Value, all represented by magenta patches. This patch group will also have two outputs, Timer Value and Completed, the yellow patches. We combine a loop animation patch, which pulses every second, with a counter patch to increment its value. In order to calculate the remaining time, we use a subtract patch with the initial value. To trigger the Completed pulse when the timer reaches zero, we compare the value to minus one, and when the comparison becomes true, we generate the pulse with a pulse patch. In its current state, our timer is able to show digits, to blink when needed, to explode, and to count. However, those things are not connected. Let's see how we can integrate all of them, preserve the state of the timer, and use the Activate port to change states. Here, we can use an integer value to hold the state: zero for Idle and one for Active. To better organize things, we can use two Equals Exactly patches to give us Booleans for Is Idle and Is Active. We will be connecting them with the timer state patch group we just created and the virtual object inputs. Let's now bring those patches closer to the flows we created before. If the state is Idle, we can just connect it to Blink. To make the timer count, we have to enable it when the state is Active. Then, to set the timer digits, we connect the timer state output with the duration.
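The state logic described above can be summarized as a small two-state machine: an Activate pulse moves the timer from Idle to Active, a per-second tick counts down, and the Completed pulse fires when the remaining time passes zero, resetting the state. A Python sketch under those assumptions (the real logic is a Patch Editor graph; this class and its method names are hypothetical stand-ins):

```python
# Illustrative two-state timer: 0 = Idle, 1 = Active.
IDLE, ACTIVE = 0, 1

class TimerLogic:
    def __init__(self, initial_value):
        self.initial_value = initial_value   # countdown duration in seconds
        self.state = IDLE
        self.elapsed = 0

    def activate(self):
        # The Activate pulse sets the next state to Active only if the
        # timer is Idle; otherwise it's ignored.
        if self.state == IDLE:
            self.state = ACTIVE
            self.elapsed = 0

    def tick(self):
        """Per-second pulse; returns True when the Completed pulse fires."""
        if self.state != ACTIVE:
            return False
        self.elapsed += 1
        remaining = self.initial_value - self.elapsed
        if remaining == -1:
            # Countdown passed zero: fire Completed (the explosion) and
            # reset the state back to Idle.
            self.state = IDLE
            return True
        return False

    @property
    def remaining(self):
        # Remaining time drives the display digits; never negative.
        return max(self.initial_value - self.elapsed, 0)
```

Comparing against minus one rather than zero, as the video does, lets the display actually show 00:00 for one tick before the explosion fires.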
Remember that the timer should reset its count once the state becomes Idle. This can be achieved by generating a pulse once the Is Idle output becomes true, and by connecting it to the Reset port. Once the timer counts down to zero, it sends the Completed pulse. We can connect it to the animation patches to trigger the explosion. Now we can use the Delay Value patch to pass the current state from one frame to the next; via its receiver patch, the initial state is set to Idle, or zero. To preserve the state of the timer, we can instantiate the Store Value patch, which updates its value once the Store input is triggered. Then we use the option picker to select the next state based on the current state. The next state should only be set once either Activate is triggered or the timer is completed; hence, we use a Pulse Merge patch. Last, we connect the Activate input to set the next state to Active, one, only if the timer is Idle. Otherwise, it's ignored. This completes our timer. Let's now test our timer on Meta Spark Player. Once it's been instantiated in the scene, we can select it and then click Activate with the left controller. This makes the timer start the countdown. We can move it around using the transform option on the right controller. Boom. The timer just exploded. So it worked as expected. Thank you all for watching this session. As of Connect 2022, virtual object creation for Meta Spark Studio can only be accessed through our beta program. You'll need to sign up to learn more and get access. You can also watch our other Connect session, Building Advanced Virtual Objects with Scripting and Shaders, to learn more. We look forward to working with you and seeing what you build. Thank you.