Video details

Chain React 2018: Building AR Apps with React Native by Vladimir Novick

React Native

Description: With the release of ARKit and ARCore by Apple and Google, we see various augmented reality apps created for iOS and Android. Have you ever wondered how you can create such apps in React Native? In this talk we will see how it can be done fairly easily.


Hi, excited to be here. We'll talk now about really cool stuff: AR apps and how we can build them with React Native. But before we jump into coding and how we build for AR, let me introduce myself. My name is Vladimir Novick. This is my new fancy logo. I'm an independent consultant, and I work in the web, mobile, VR/AR and IoT fields. Probably I should add AI and blockchain to be on top of the buzzwords, but I don't do anything on that side. So what will we be talking about today? We will start with the history of augmented reality: how we got to the point where we are right now. We'll talk about the purpose of AR apps, we'll show some cool demos, and we'll talk about Viro React as one of the options for building AR apps. And we'll actually talk a little bit about the science behind developing for AR and 3D in general. So what is augmented reality? It's not holograms like this; it's more like having some sort of glasses, or our devices, and basically a digital overlay on the physical world. And it can go beyond just pictures: it can reach our other senses, haptic feedback, various stuff. So how did everything begin? And yeah, the last one, we are not there yet. We're at the third picture, but eventually we will get to the last one in several years. I hope so. It started at the beginning of the last century, when there was a novel that featured a "Character Marker", and that was the first introduction of AR as a concept. Fifty years later, the first machine was built where you put your head inside; it's like a huge arcade machine where, instead of looking at a monitor, you just stick your head inside. And it was also for VR. In '68 there was the Sword of Damocles, and it was actually the first head-mounted display with tracking. But it was so heavy they had a crane to hang the thing over your head. So if the crane breaks, you die. In '82 there was AR for weather broadcasters, and well, until the 90s there was no such term as AR at all.
There were concepts, there were ideas, but the actual term was born in the 90s, coined by Tom Caudell, and then the whole AR revolution began. It started with Virtual Fixtures; it's a kind of machine, you will see pictures in a bit of how it looked. Then, in '96, there was the concept of markers in AR: when you scan something like a marker, you get a 3D model on top of it. So the idea, and actually the first implementation, of markers was born in '96; it was called CyberCode. In '98, NASA used AR in their X-38 flight tests. In '99 there was the EyeTap wearable, which served as a prototype and progressed later on toward Google Glass in 2012. Commercial-wise, 2008 saw the first introduction of AR for commercial purposes, by BMW. Before that, it was more research oriented, or military, like for NASA and stuff like that. 2010 was the Kinect, then Google Glass, and in recent years, like several years ago, there was the Microsoft HoloLens. Who has seen the Microsoft HoloLens demos? Nice. So probably half of you, a little bit more than half. In these demos, when you put this headset on, you see the whole world change, right? It's not actually quite as it looks, but you really can put holograms in the real world, like digital overlays, and it remembers the actual location of the hologram in the world. In the same year, Pokémon Go was released, and back then Pokémon Go was just an image overlaid on the video, without any perception of physical context. But still, the world went crazy over Pokémon Go: 45 million daily users. It was just crazy. Then 2017, last year, brought a really breaking change for us as developers: ARKit and ARCore were introduced by Apple and Google. These frameworks give us the ability to perceive our reality, our physical world, understand the surfaces, understand how it's structured, and put things on surfaces. We can actually apply physics on these surfaces.
For instance, I can have a 3D ball bounce on this table, because I know that this is a physical table, this is a surface. And right now we're waiting for the wearables. We're waiting for the Magic Leap glasses; we're waiting to see where it goes. And if we talk about the old stuff, this is how it looked. So the first one is the Sensorama, the machine you stick your head into. Then the one that could collapse the whole ceiling on the guy; I wouldn't want to try that. And then there's the dashboard thing, and the markers, pictures and so on. And this is the Google Glass prototype. So, remember 20 years ago? "Don't sit near your TV." Right now we have it on our faces and, yeah, everything is fine, right? And we got to the point where 30-year-olds talking about the love of their life and searching for Pokémon makes sense. So the main question is: why AR? Well, not only because we want paintings to react to stuff that we do. Actually, there are lots of applications for AR. We can use it in retail; it's actually really successful in retail. We can preview products, and we can get deals based on physical markers while walking in the store; we can see a deal for like 50% off on something, stuff like that. We can get additional info on products, and we can share products and images. We are all social right now; we have a bunch of social experiences, social apps, social networks. So we can share these 3D objects, these models, in the real world with our friends and potentially make them buy the product. So yeah, it's all about ads in the end, right? We have AR games, which are really cool, because you interact with the digital world but you use the physical environment. It's much more addictive, because you don't perceive it as being in a game near your computer or headset or whatever; you're in the real world, and you have these digital characters, monsters, whatever, and you just interact with them. Social sharing, again.
And you can explore the physical world with a digital overlay. Well, it can get really problematic; we all know the stories from Pokémon Go, where people got into places they weren't supposed to go and got injured. So we need to be careful about that too. AR is already used in manufacturing. We can get valuable information on a part of the machine we're building without any disruption: instead of crawling into the machine and looking at the manual for how it's supposed to look, we can get a 3D model of the machine part and interact with it. We can get instructional videos, so if we didn't do our homework and we're not the best technicians, we can just say, okay, let me check how this works. We can use various sets of tools right now. For example, you can download apps which let you measure your room: if you want to know the dimensions of your room, you can just measure it with AR. Location-based AR is really popular: you walk down the street, you just open your phone, you look at a sign and you see some info, or a restaurant nearby, stuff like that. Again, you get information based on physical context; you can display various markers with location info, and eventually it helps users. And again, ads. So yeah, you remember the picture from the history of AR with the guy walking with the glasses: at some point we all will be walking around with glasses, with ads hanging out there. And the main purpose of this talk is how we can get started with AR and React Native. So there are various libraries that leverage ARKit and ARCore. There are wrappers, but they can be experimental. Some of them, like react-native-arcore, are a little bit experimental; react-native-arkit does support everything. There are the Unity 3D and Unreal gaming engines; it makes sense to create AR games there, with heavy 3D model interaction. But we want to create apps, right?
So we heard about React 360; it used to be called React VR, but it's VR, so it's not what we're looking for. And there is Viro React. These guys did an amazing job, and they're actually a sponsor of this conference; their booth is over there. So yeah, these guys. So what does Viro React give you? We have real-world 3D tracking. We can detect plane surfaces. It has its own powerful native renderer that renders the image, and it leverages both ARKit and ARCore, so you can use it on both Android and iOS. You can recognize images and markers. You can create various physical, real-world effects: you can do physics, animations, particle effects, stuff like that. To get started with this, you simply go to the website, you install the react-viro CLI, you get an API key on the website, and you init your app. So the basic anatomy of the app looks like this: we have the AR scene, we have a bunch of components, we have the scene, we have a light, we have text hanging out there, and we have plane selectors, so we can select various planes. And let's see live how it looks. I hope it will work. Yeah, it always happens: it didn't work. Now I'm trying to mirror all of these things. It's a cool app, but sometimes... yeah, now it works. I want to have an AR experience, and I'm in the portal. Now let me just walk on the stage. I'm trying to scan surfaces on the stage, and you see I have a surface here. So let's put a monster on the stage. So yeah, this guy here; you see his shadow is on the stage too. It's kind of cold, right? Snow. So this is particle effects. In addition to that, let me try to find another surface. Let's see if it will work; it's a bit off. But yeah, here's me just walking on the stage, searching for stuff, so you understand why it can be complex and can get awkward sometimes. I have a whole collection of people walking around randomly, not on a stage, just bumping into walls. But let me just reload the thing and we'll try it again. Yeah. Hello. And let's do another surface.
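The basic anatomy described before the demo might look roughly like this as code. This is a sketch, not the talk's actual source: the component names come from the react-viro package, while the scene name and the API-key placeholder are mine.

```jsx
// Hypothetical minimal Viro React AR app: a scene with a light, floating text,
// and a plane selector, hosted by the AR scene navigator.
import React from 'react';
import {
  ViroARSceneNavigator,
  ViroARScene,
  ViroAmbientLight,
  ViroText,
  ViroARPlaneSelector,
} from 'react-viro';

// The AR scene itself, built out of declarative components.
const HelloScene = () => (
  <ViroARScene>
    <ViroAmbientLight color="#ffffff" />
    {/* Text one meter in front of the camera (negative Z is "ahead"). */}
    <ViroText
      text="Hello Chain React!"
      position={[0, 0, -1]}
      scale={[0.5, 0.5, 0.5]}
    />
    {/* Renders selectable gray boxes on detected planes. */}
    <ViroARPlaneSelector />
  </ViroARScene>
);

// Entry point: the navigator bridges to ARKit/ARCore natively.
export default () => (
  <ViroARSceneNavigator apiKey="YOUR_API_KEY" initialScene={{ scene: HelloScene }} />
);
```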
Yeah. So this is a portal, and I hope I won't get stuck inside, but let's do it. I'm in the portal, and here you see I have the real world up there. Yeah. Hey there. So I'll get back before it closes and leaves me in the virtual world, and let's scan another surface. And this is just a dancing model. Yeah. Oops, got through the portal, sorry; better go around it. Yeah. I don't want to get stuck there. So you get the idea. It's cool, right? Now I need to find my mouse. Let's just close this thing, and yeah, after the talk everything will be open source, so you can just check the code. So AR is cool, right? You understand why it's addictive. You can get stuck inside: if the portal closes, that's the end, you are there, you cannot go out. So let's talk about how it's built. Take, for example, the code I showed you: you probably see a bunch of numbers here, and it looks a little bit weird. We are working in 3D, so we have a coordinate system, and it's called a right-handed coordinate system. You could just memorize the axes, X, Y and Z, but there is a better way. Why right-handed? Because if you take your right hand and point it at yourself: this is Y, this is X, this is Z. Meaning, if I want to put something in front of me, I put it on negative Z, and set Y and X respectively. Directions change when you apply the rotation prop. So you apply rotation; for example, -90 degrees on the X axis. And how do you determine positive or negative rotation, like, which way it rotates? There's a thumb rule: you hold your right hand with the thumb along the axis, and your fingers curl in the direction of positive rotation. So, coordinates: we pretty much understood the coordinates, but we also have lights and materials. You've probably seen in the code there's an ambient light, and there are a bunch of other lights. Who here has ever done 3D development? Nice. So for some of you, this will be a recap of the basics of 3D development.
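The coordinate and rotation props from a few sentences back look something like this in practice. An illustrative fragment: `ViroBox` is a real react-viro component, but the values are mine.

```jsx
// Inside a ViroARScene. Right-handed coordinates: X is right, Y is up,
// and negative Z is straight ahead of the camera.
<ViroBox
  position={[0, 0, -1]}     // one meter in front of me
  scale={[0.1, 0.1, 0.1]}   // a 10 cm cube
  rotation={[-90, 0, 0]}    // -90 degrees around X; curl the right hand to check the sign
/>
```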
For some of you, it will be new. So what are lights and materials? There are four types of light. There is ambient light, basically light all around us; directional light, you can say the sun is a directional light; spotlight; and point light. And there are components for each: for ambient light there is the ViroAmbientLight component, for directional light the ViroDirectionalLight component, for spotlight the ViroSpotLight component, and for point light, which is also called omni light, there is the ViroOmniLight component. So when you deal with 3D models, you need to light them up with these types of lights. And models alone are not enough; we need textures. So, textures. Viro supports different modes: it supports regular materials, and it supports PBR materials, which is physically based rendering. That's a bit of a different concept: you have more textures on an object, and the object looks more real, more realistic. But let's talk about the regular one. So we had that guy who danced over there; his face looks weird here because this is the diffuse texture. It's a flat texture that gives the color and the whole idea of how the model will look, and it's called the diffuse texture. We have the specular texture, which defines basically how shiny an object is. And we have the normal map. Sometimes we have a mesh, and when light hits the mesh, we want to get an idea of the depth of the model, but we don't want to create a complex model in a 3D program. For example, we want to create a cube and make it look like a brick wall. So we don't create each brick separately; we just apply a regular diffuse texture and apply a normal map to give the idea of a bumpy surface. 3D models in Viro are kind of a combination: the mesh, the 3D shape without textures, and the textures themselves. So you can get OBJ files, and you can specify which textures they use as PNGs. You can use VRX files which, for those of you who have done 3D modeling and 3D development in general, will sound kind of familiar, because you know there is the FBX format. With VRX, you basically import an FBX model that you can get from various places.
And you basically transform this model by running a script, and you get your model with the textures; you can load it. Everything you saw, like the monster, the portal frame, and the dancing person over there, was a model in FBX transformed to VRX. And that's what you saw on the stage. There is also the newer format glTF, which is essentially a JSON file specifying materials. It links to a .bin file specifying things like animations, keyframes, stuff like that, and there are the PNG parts, which are the textures. In terms of code, it looks like this. We have ViroAmbientLight: we specify the color, and we can specify intensity. Ambient light comes from every direction, so we don't have directional information on the light itself. On directional light, we can, and actually need to, specify the direction it comes from; so we see here it comes from negative one. And we have a bunch of props like the shadow orthographic position, shadowNearZ and shadowFarZ, stuff like that. So on the light component we can specify various shadow parameters. And later on, if we put a quad, it's called ViroQuad, basically a plane surface, into the scene and give it the prop to receive shadows, we get the shadow of the model above this quad. That's what you saw with the monster: you saw a shadow under the monster, right? It was the quad receiving the shadow from the light specified in the scene. With spotlight, for example, you can define various angle properties, like inner angle, outer angle and direction, and again you see some shadow properties. So, here's an object. Remember I told you about the textures? So here's where that is. I have the source for the Viro3DObject, which is Monster.vrx. Now, the VRX specifies which images it will use, but I can also require them in resources to specify explicitly: I have this diffuse and this normal PNG. We also have a bunch of callbacks, like onLoadEnd of the model, because I want to react when the model loads.
For example, here, on load end, I reset the AR plane selector. Why did I do that? Because once the model was presented on the selected plane, I wanted the selection plane to reset and give me the ability to place another object, and another, and another. And that's the quad I told you about. You see, it has arShadowReceiver; this prop basically gives it the ability to receive shadows. It's all declarative, kind of. And the quad is also always vertical, so you need to rotate it -90 degrees to lie on the ground. In terms of materials, we can also create materials. Materials we specify separately from our components: we call ViroMaterials.createMaterials. It's the same with animations, you can specify animations separately, and also with tracking targets and markers. There is a set of APIs that you specify not inside the component itself but outside, and then just use in the component, like here. For example, I specify that I have a "ground" material; later on in my component, when I render something, I can say that my material is "ground" by just passing the string "ground", and it will know to search the materials dictionary of the application and apply the specified material. So I have various shininess props, and a diffuse color; again, a regular color, just treat it as the color of the thing. And I have a lighting model, so I have Blinn, which sounds like a kind of weird word, but actually there are four different lighting models. If you don't specify anything, you get the Constant lighting model, but you see how they differ: this one is more glossy, kind of, and this one is more matte. So you can specify a material to use one of those lighting models. And with lights in general, like with VR, you can go further: you can create lighting environments and go really beyond just the basic materials. Another important thing in AR is detecting planes. You've seen here I have a kind of greyish surface that I clicked on, and the monster appeared, or the portal appeared. I could do that myself.
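The materials API described above takes a dictionary registered once, outside the components. Here is a minimal sketch of that dictionary as plain data, so the shape is easy to read; in a real app you would pass it to react-viro's `ViroMaterials.createMaterials`, and the material name and values here are my own illustrative choices.

```javascript
// A Viro-style materials dictionary. In the app you would register it once with
// ViroMaterials.createMaterials(materials) and then reference entries by name,
// e.g. materials={["ground"]} on a component. Values are illustrative.
const materials = {
  ground: {
    lightingModel: 'Blinn',   // one of Constant (the default), Lambert, Phong, Blinn
    diffuseColor: '#8b7d6b',  // a plain color instead of a diffuse texture
    shininess: 2.0,           // how glossy the surface looks
  },
};

// A component refers to this material just by the string "ground".
console.log(materials.ground.lightingModel);
```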
But there is a component that helps me with selecting planes. It's called ViroARPlaneSelector, and it basically renders this gray box for me to be able to select the plane. And I have here my models array, and I have this guy here, ViroNode. You can treat it like a View in React Native, because it kind of wraps the node you want to present in the AR scene, in the AR world. You also specify position; you can specify whether you can drag this thing around the world; you can do various stuff with it. In addition to the plane selector, there is the ViroARPlane component, which is much more low level. It doesn't render the gray surface for you, but it gives you the power to put things on AR planes, on surfaces, stuff like that. The portal looks really cool, so how do we do that in Viro React? We deal with several things. First of all we have ViroPortalScene; basically it's a wrapper component, it wraps the whole thing: the doorway, the image beyond the doorway, the behavior, everything. Inside the portal scene there is a portal, and there's a 360 image, because with Viro React you can also do VR with 360 images. This 360 image is basically the VR world beyond the portal, right? The Viro3DObject is basically the 3D object for the doorway, and the portal component knows what to render when: to show the 360 image through the doorway, or to show the real-world camera, and stuff like that. In addition to that, you've seen snow, right? There was snow coming from, well, we don't have a sky, but coming from the ceiling, and it's basically particles. We can do snow, we can do fire, we can do fireworks, we can do smoke, lots of things: anything you can imagine that comes in large quantities, spawns as images from a direction, gets some form and shape, and then disappears. And to do that we use ViroParticleEmitter.
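The portal-plus-snow structure just described might be sketched like this. The component names come from react-viro, but the asset paths and the numeric values are hypothetical.

```jsx
// Inside a ViroARScene. The portal scene wraps the doorway model and the
// 360 image that becomes the world beyond the doorway.
<ViroPortalScene passable={true}>
  <ViroPortal position={[0, 0, -1]} scale={[0.5, 0.5, 0.5]}>
    <Viro3DObject
      source={require('./res/portal_frame.vrx')}              // hypothetical doorway model
      resources={[require('./res/portal_frame_diffuse.png')]} // its diffuse texture
      type="VRX"
    />
  </ViroPortal>
  <Viro360Image source={require('./res/winter_360.jpg')} />
</ViroPortalScene>

// The snow: an emitter positioned above the scene, spawning particle images.
<ViroParticleEmitter
  position={[0, 4.5, 0]}                 // above the scene, snowing downward
  duration={2000}
  loop={true}
  image={{ source: require('./res/particle_snow.png'), height: 0.01, width: 0.01 }}
  spawnBehavior={{
    particleLifetime: [4000, 4000],      // how long each flake lives, ms
    emissionRatePerSecond: [150, 200],   // tune for more or less snow
    spawnVolume: { shape: 'box', params: [5, 1, 5], spawnOnSurface: false },
    maxParticles: 800,
  }}
/>
```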
We position it like a regular 3D model, with just a position, and we have various props on the emitter. We have the duration: how long it takes to spawn the whole thing; and if we want to loop, we just specify that we want to loop. We have the image that we want to create for every particle; so here we have a snow particle image for every tiny particle of snow spawned from this emitter. And you can define the spawn behavior: you see there is some kind of lifetime for these particles, and there is the emission rate, so we can configure that to be more snow or less snow. It also works with a shape: you see the shape here is a box, and it's above, so that's why you see everything just going down inside the box. But we can define it, for fire for example, as a sphere, and it will go outward, stuff like that; and the image prop, we just put the image prop here. Actually, the first thing people ask me about AR, and about 3D in general, is: where do I get 3D stuff? I don't want to go to some website and spend a huge amount of money hiring, like, a 3D modeler, stuff like that. Don't get me wrong, a 3D modeler will always be better for a long-term project, but we want to play around with things, right? So where do I get stuff? You can get things from the Unity Asset Store, which is really cool. There is the Unity 3D game engine, and it has the Asset Store. There, for, let's say, $30 you can get a whole model with animations and stuff; it would probably take much more to have the model created from scratch, in terms of time and pricing. But we still need to open these models in a 3D program and optimize them for real time. Not only in Viro, but with AR in general, we render on the phone, and we cannot render 3D models with a huge amount of polygons, really heavy stuff, on the phone. It will probably render, but performance-wise it will just eat your battery; it won't work properly, it will be lagging, stuff like that. So we need to search for models that are optimized for real time.
They are low-polygon, and that's a general rule for developing with 3D for mobile phones. We can of course go to TurboSquid; we can search for free models there, or buy models, and we can search for stuff on Sketchfab. There's another site we can go to, Mixamo; Mixamo is a really cool project by Adobe, and actually my models were from Mixamo. It's a website: you log in with your Adobe account, you can choose from a set of premade characters, and you download models from this set. And it has lots and lots of animations there, so you can just download models from there, and they actually work with Viro perfectly; you don't need to optimize anything. So just go there, grab models, put them in your app and play around with them. It's also a really cool website if you've just downloaded a model and you want to add an animation: when you animate 3D stuff, you have a sort of 3D rig on the model, so you need to define where the joints and bones are, stuff like that. Mixamo can somehow do that automatically and apply the animations that are already there, so it really speeds up the process. You can of course model in Blender, and maybe that sounds tedious; we are not 3D modelers, right? At least most of us. So why should we model in Blender? But as I told you before, if you take just a cube and apply good textures to that cube, you can get to the level of a really good 3D model with a low number of polygons, really optimized for mobile. And these textures you can get from the various websites I wrote here, and from Substance Source. Substance has a cool program for painting on 3D, Substance Painter, but you can get textures from Substance Source, which is like a library of premium textures. So the question is when to develop AR apps with Viro. I would say, if you want to develop an app, do it with Viro.
If you want to develop a game, a full-blown game with lots of interaction with 3D models and particles and stuff, and it's not connected to an app in any way, then probably you should go with Unity or Unreal, engines dedicated to games. But most apps can be built with Viro. And actually, you can take any app, and you can probably think of an AR experience you could add to it that would make your app cooler. Let's take, for instance, the Chain React app, right? There's an AR feature there; if you haven't tried it, you should. It's actually created with Viro and it's open source, so you can go to GitHub, get the code, and look at how it works; it uses the same components I showed you here during the talk. So it makes your app really cool, and you can get really exciting features in an app, like previewing products, stuff like that. So go out and start building with React Native and Viro, today or tomorrow or whenever you have time for it, but you definitely should try it. Thanks a lot. It's really a pleasure speaking here. Thanks.