Julia Beabout, Katerina Ufnarovskaia
Cassie: All right. Looks like we are ready to go. I'm so excited, because this is our first session where we have three people: one, two, three faces. Before we jump into this next session, I just want to remind everybody that if you are just coming in for the live session, don't worry! Everything is going to be playable on demand. So if you want to go back to day one or the earlier day-two sessions, if you have some time between calls or don't have much work to do, please check out some of the sessions you may have missed. Now, without further ado.
Julia: We're going to be talking to you about a particular project we did, which was really a groundbreaking project. I'm proud to say we just won an award for it, and it would not have been possible without Augmented City's system. We're going to start by showing a quick video of the project; it's hard to really understand it from just a still picture. It's a little less than a minute. Then we'll go into some slides and talk about the project qualitatively and about some of the challenges we encountered. I am not super familiar with AirMeet, so I appreciate your patience. Let me share that. Let me know when you can see it. Here we go. Can you guys see my screen?
Katherina: Yeah, okay.
Julia: And let's see. Hopefully you can hear it as well. I'm not sure.
Katherina: Not yet. All right.
Julia: I think we'll have to go without the soundtrack, but there's jazz music playing. Okay, so hopefully that gave you a bit of an idea of the project. Now I'm going to move over to some slides to talk about the specifics of what you were seeing, and a little bit of background on the project. I'm just going to ask: can you hear me okay, Katerina?
Katherina: Yep. Okay, great.
Julia: And you can see my slide presentation?
Katherina: Yep, I can see it.
Julia: Okay. So, the project: the Blackhawk. We'll just walk through it. It's a mobile-phone-based augmented reality project, and as you mentioned, 5G has really enabled its development, and certainly a better user experience. First of all, I just want to thank everybody involved in this project. I always say it takes an AR village to make these projects, and that was certainly true in this case as well. One thing about the project is that it's set in a challenging neighborhood in San Francisco known as the Tenderloin. It's a very maligned neighborhood, and people are very out of touch with its actual positive elements. The goal of the project was really to move people out of the microcosm of phone AR, to connect them with their physical environment, to help them understand their community better, and to help build relationships. One of the great things about the Tenderloin is that it has this rich, amazing, innovative history, and a lot of that innovation was around music. One of the most famous jazz bands of all time, Dave Brubeck's band, used to appear at the Blackhawk jazz club, which was located in this neighborhood at the corner of Turk and Hyde. But today that spot is occupied by a parking lot, which you can see here. It's also populated by many people facing the social challenges, substance abuse, homelessness, and mental illness, that we see on our streets today. And so we wanted to rehabilitate that spot, or draw new energy to it, as well. But what we ultimately decided to do was locate the experience in the neighborhood park, which provided a much safer location, away from traffic, and much better accessibility for the neighborhood. This is a really beloved park that had gone through a renovation, so we also had this great green that was accessible to people of all abilities. And so we decided to locate it here.
You can see some other views here, and you can see the Blackhawk on the phone there. A couple of things that may be difficult to pick up in the video: the billboards that used to populate the outside of the real Blackhawk were actually repurposed as an art-gallery space. We worked with local youth at the Boys and Girls Club in the Tenderloin to create these artworks, based on stories they wanted to tell about their neighborhood. You can tap on these billboards and get additional audiovisual content, which you can see here on the right: a narrated audio story that tells you about that artwork, the neighborhood story behind it, and the neighborhood history. There are a total of five billboards you can interact with, and that can be expanded, as you can see to the left on the striped billboards. The other thing this project incorporated: we really wanted to be inclusive and involve the community as much as possible in the creation process. So we invited a number of local artists, based in the Tenderloin or with a history of working with the Tenderloin, to create digital artworks to populate the interior of the Blackhawk jazz club, just like you would in the real world. The idea is that this becomes a rotating digital art gallery space going forward, so this was our first exhibit. We also invited local guest musicians to create the soundtrack, and again, the idea is that this gets rotated out with each exhibition. We also wanted to be historically accurate, but in a playful way. One of the most interesting things about the Blackhawk jazz club was that it was radically inclusive, and that's something we really wanted to emphasize. And one of the ways they were inclusive was not just racial.
They were inclusive in many of the ways we think about it today, but also age-inclusive. They had this separate room in the back, with a chicken-wire fence, so that if you were underage at that time, below 18, you could come and enjoy jazz and learn about it, but be safely away, I guess you could say, from the alcohol. Now, one of the most interesting things is that this project was actually conceived well before COVID, and during the changes we've all experienced over the last two years, the technology completely changed: AR really advanced. How we had planned to do the project, which was based on an older generation of AR, was really transformed, and this transformation has been brought about by a combination of 5G, advances in computer vision, which Augmented City is really going to speak about, and edge computing. Particularly when 5G is combined with edge computing, it really allowed us to utilize content of this size. The content in this particular application is actually 35 megabytes. That can take quite a while to download, and Katerina will be talking about some statistics on that, but certainly 5G made it much, much faster and a more user-friendly experience, and also enabled us to increase the resolution and quality of the textures we could utilize in the project, and to decrease the latency that people were experiencing. It also enabled us to transition to using dynamic content, which was really key: the assets are actually loaded upon app opening. That allowed us to meet app-store limitation requirements on the size of the overall app, and 5G enables us to keep those assets, change them out easily, and keep them up to date. And then, finally, the computer vision was key for this.
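The load-on-open pattern described here, keeping the installed app small while the heavy AR content lives server-side, can be sketched roughly as follows. The class, the injected fetch function, and the URL are hypothetical illustrations, not the project's actual code.

```python
import hashlib

class AssetLoader:
    """Downloads AR asset bundles at app open instead of shipping them
    inside the app binary, so the installed app stays under store size
    limits and content can be swapped out server-side (hypothetical sketch)."""

    def __init__(self, fetch):
        # `fetch` is an injected downloader (e.g. an HTTP GET over 5G).
        self._fetch = fetch
        self._cache = {}

    def load(self, url):
        # Return cached bytes if this bundle was already downloaded
        # during this session; otherwise fetch it once.
        if url not in self._cache:
            self._cache[url] = self._fetch(url)
        return self._cache[url]

    def checksum(self, url):
        # Integrity check of a downloaded bundle.
        return hashlib.sha256(self.load(url)).hexdigest()
```

With this shape, "changing out the assets" is just publishing new bytes at the same URL; no app-store resubmission is needed for content updates.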
So again, a couple of years ago, we would not have been able to do it this way. This type of environment, the park, would actually be a very challenging situation for computer vision to recognize, but there was enough detail around the edge of the park that we were able to capture it with Augmented City's system and create a scan of the area. And here you can see that: on the left are the areas we scanned. We did quite an extensive area and made sure we got a good point cloud for the scan, to insert the content. And here you can see the low-density point cloud that is actually used to identify where the person is and what they're looking at, and to position the content appropriately based on their location and what they're viewing. Certainly 5G really facilitates this rapid localization, and the frequency of it as well. With that, I will let Katerina take over from here. Katerina, would you like me to continue to manage the slides for you?
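The idea of localizing a viewer against a stored low-density point cloud can be illustrated with a deliberately tiny toy: if the device observes the same landmarks as the map, but expressed in its own local frame, the offset between the two centroids estimates the device's position. This is a pure-translation simplification for illustration only, not Augmented City's actual algorithm, which also solves for rotation and handles noisy, partial matches.

```python
def centroid(points):
    # Average of a list of 3D points given as (x, y, z) tuples.
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def estimate_offset(map_points, observed_points):
    """Toy localization: observed_points are the map landmarks seen in
    the device's local frame. Under a translation-only assumption, the
    difference of centroids recovers the device position in the map."""
    cm = centroid(map_points)
    co = centroid(observed_points)
    return tuple(cm[i] - co[i] for i in range(3))
```

In practice this correspondence-and-solve step runs repeatedly, which is why fast round-trips (5G plus a nearby edge server) make the localization feel instantaneous.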
Katherina: Yes, please, let's change horses at the crossroads. So, well, let's look behind the curtains at our technical part. To be honest, this project was quite a challenge for us, and I will explain why. Usually we do the following three steps with our technology. First of all, we scan the space with our application. A user just has to download the app and scan any space they want. It could be indoor or outdoor, but for outdoor it's maybe even better, because you can scan kilometres of space, and it's quite fast and quite easy. Then you create, let's say, a point cloud of your space, like a street or a square, whatever, and then you play with content. But that was not the only case in this project, and I'll explain why. Usually what we do is navigation with AR, or we put some kind of content on buildings, on squares, or whatever. Here, in this case, we had to put the content inside the model. So we had to do navigation not in real space, but inside the 3D model. Could you please move to the next slide, the one with the challenges? Yeah, here we are. So what kind of challenges did we face? First of all, you know that your hunger gets bigger when you start to eat, let's say: all the time you want to add some additional function, some additional small thing, to make it better, to make it more interactive. And what we had at the end of the day was a very big 3D model with a lot of textures; it was something like a 140 MB model. So we had to be very careful to still fit it inside the application. Then we had to add a lot of different interaction details: for example, to interact with the billboards, to interact with the instruments, to navigate inside the model. It looks quite easy to do, but it's not. And of course, this model should be fixed in space, so it shouldn't float from one place to another.
We needed to fix it with our GeoPose, our localization, and not keep our end users waiting to download this model. So maybe we could just move to another slide.
Julia: I think there’s some lag. So, oh, yeah. Yeah.
Katherina: I have some lag. Well, actually, here is what we ended up with. First of all, we played a lot with different audio effects, because the end user shouldn't notice the seams between parts; the end user just has to experience every detail in quite a user-friendly way. So if I point my phone at, let's say, one billboard, I would like to scroll and see certain content; if I look at another one, I interact with that billboard, and see, and maybe hear, different sound effects. Then, when we had to upload our application to the market, we found out that it was quite big. The size was quite big, and we couldn't, for example, pass the checks of the Android platform. So we took a step back: we reduced all the textures and sizes, and we optimized everything to embed it in the app in an optimized way. Then there is playing with light. I suppose we can improve that, because if you look at the picture on the slide, you'll see that we don't have a shadow. Still, we cannot really play in AR with realistic shadows: if the real light comes from one side, our museum may show its shadow on another. It means that, for our brains, it's still an artificial picture. And we are all, let's say, animals inside; we have instincts, and our brain doesn't accept such a picture in a good way. So our next step will be to play better with lighting and shadows, and we hope that 5G traffic will help us with it, because we need it. We need it faster, we need to be more efficient and stronger, to improve every detail of our spatial experience. So that was our first step. And actually, this kind of project gave life to our next projects, because now we are doing things like, for example, interactive museums on the streets. It means that different characters, in different places along quite a long avenue, can interact with the end user.
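The "reduce all textures until the package passes the store's size check" step can be sketched as a simple loop that halves every texture's resolution until the set fits a byte budget. The budget number and the uncompressed-RGBA size estimate below are hypothetical placeholders; real pipelines would use texture compression and mip levels rather than naive halving.

```python
def estimated_bytes(width, height, bytes_per_pixel=4):
    # Rough uncompressed RGBA footprint of one texture.
    return width * height * bytes_per_pixel

def fit_textures(textures, budget_bytes):
    """Halve every texture's resolution until the whole set fits the
    size budget (e.g. an app-store limit). `textures` is a list of
    (width, height) pairs; returns the reduced sizes. A crude stand-in
    for real texture compression and mip dropping."""
    sizes = list(textures)
    while sum(estimated_bytes(w, h) for w, h in sizes) > budget_bytes:
        sizes = [(max(1, w // 2), max(1, h // 2)) for w, h in sizes]
    return sizes
```

The uniform halving keeps relative texture quality consistent across the model, which matters when one 3D scene mixes billboards, instruments, and interior artwork.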
And again, we are talking about big 3D models. We need to move ahead with better graphics, I mean more realistic ones, but we still have some limits in Unity, and some technical limits of the platforms. Let's say we have a lot of things to do to be closer to the perfect metaverse. So it's challenging, it's beautiful, and, well, we are ready. Actually, that covers our results as well.
Julia: Okay, so we've stopped sharing now. Are we good?
Katherina: Oh, I already did it. Okay?
Julia: Alright, are we ready to move to Q&A or?
Katherina: Hmm? Well, actually, maybe we should wait for some audience questions, or we can give some answers.
Julia: Okay, I think I'm going to stop sharing, and then I was going to comment further, because you brought up some really good points. You were talking about the lighting and the effects and all of that, and one of the things AR has really made me appreciate is that what we're really dealing with is human perception of space and light, and that does not necessarily translate to actual space and light in the real world. As a 3D professional, somebody who's been in the 3D field for more than 30 years at this point, that can feel deeply unsatisfying, but I'm always amazed at how sophisticated our vision and perception actually are. So it's really about creating something that satisfies you from that perspective, and frequently the lighting is really key, as Katherina said. The more improvements we can get there, the better. What we find generally right now is that the automated lighting and shadow effects are not realistic enough, nor do they provide consistent results, given the quality of the models and the textures. So frequently we take an approach where we bake in the lighting. The trade-off there is that you get higher-quality textures and content, but it doesn't necessarily reflect the real ambient lighting at the time. So those are artistic questions, quality questions, choices that you make on each project.
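The baked-lighting trade-off Julia describes can be shown in miniature: compute a simple Lambert diffuse term per vertex for one fixed light direction ahead of time, and store it with the asset. The function below is an illustrative toy, not the project's pipeline; real baking tools precompute full lightmaps, but the principle is the same: the lighting is frozen at bake time, so it stays consistent and high quality, and ignores the real ambient light at view time.

```python
def dot(a, b):
    # Dot product of two 3-vectors given as tuples.
    return sum(x * y for x, y in zip(a, b))

def bake_vertex_lighting(normals, light_dir):
    """Precompute ('bake') a Lambert diffuse term per vertex for a fixed,
    unit-length light direction, so runtime shading is a cheap lookup
    rather than a dynamic light calculation."""
    return [max(0.0, dot(n, light_dir)) for n in normals]
```

A vertex facing the light bakes to full brightness, one facing away bakes to zero, and neither value changes when the real-world sun moves, which is exactly the consistency-versus-realism choice discussed above.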
Katherina: We talk all the time about immersion, so we talk about feelings, and feelings are inside our brains. We talked about impressionism, let's say: if we have a feeling, some kind of perception of something, that's just good enough. It's enough to see a very stylized character; we don't need to show it very realistically. It can be something very nice and fluffy, and that's good enough. But still, from the point of view of technology, even of perfection, we need to move ahead and bring the same graphics we can see in the movies into our augmented reality. And that's why we need instruments like 5G: we technicians know how to do it, but we don't have enough traffic and we don't have enough hardware to arrive at it yet. The biggest case is also about smart glasses, but that's another topic; let's leave it, or I will never finish. So, we have a question from Cassie: how long did this project take? Let's talk about deployment. Yeah.
Julia: Well, you know, we had COVID in between, as I mentioned, so it took years in a way. The project was conceived in, let me think about that, I think it was 2018, through working in the neighborhood a lot, working with stakeholders there, being familiar with the history and learning more about it. So the project was conceived in 2018. We had originally planned to deploy it in the spring of 2020, so about a year, not quite a year and a half, later, but then COVID hit. So ultimately we ended up deploying it this past summer, at the end of August. All told, it was really a couple of years from start to completion. In terms of actual design and production, we went into that heavily; it was probably about a six-month cycle, depending on the amount of community engagement you want to do, and that was really one of our goals with this. It really takes about a year to get meaningful community engagement, but in terms of design and production, once you know what you're going to do, you could do something like this in 12 weeks or so. It would be a push; we've done it in less, but I don't recommend that. Katerina, do you want to chime in on your side about that too?
Katherina: Well, actually, the biggest, longest part of such a project is still the part that belongs to communicating with the community and educating the community. That's why this project was quite unique: we weren't talking about millions and millions of dollars, or some trial that at the end of the day produces nothing technological. We were talking about, let's say, a community for whom technology isn't a priority, correct me if I'm wrong, people for whom it's not a primary goal to talk about the metaverse, to be in the metaverse and everything. But they believed. At the end of the day, they believed in the technology, and they contributed their soul to this project, even the children. And for us, well, it was a big case, to be honest. It was quite challenging, because working with children directly, getting some kind of feedback from them, is hard. And then, after that stress, we worked with artists, and they are very careful with technology, because there is a lot of hype around right now. Artists look at such technology with, I don't know, quite an unsure feeling, because they don't care about hype; they care about feelings, they care about the beauty of their masterpieces. At the end of the day, Julia convinced them to do it, and for us it was challenging to make it a shared experience, not only on one mobile phone but for multiple users, so that was a big case for us as well. So yeah, 12 weeks, maybe, depending on the graphics, to be honest, and years and years for the education of our audience.
Julia: So yeah, great. That was a ballpark, right? I would say, for community engagement, you really need a six-month community-engagement design cycle for sure. And then I would give myself, you know, three to six months for design, development, and creation.
Katherina: Julia, at the end of the day, design also still depends on, let's say, money.
Julia: So, absolutely!
Katherina: Well, if you have money and you need to develop a good design, you can do it even by, let's say, tomorrow.
Julia: Absolutely. Yeah, so, as they say, it's a difficult question to answer, because it's not really a question of production. This project had so many other factors: a lot of community engagement, a lot of design, a lot of research on the history, those types of things. Like, we took the kids on field trips to learn about their neighborhood and to discover what stories they wanted to tell. So, all of that. Okay: is there, or will there be, a spatial audio dimension? Great question. We do not have that right now and would love to do it. I think that would be super cool, definitely. Okay, any other questions? "What is the ideal...?" they ask. Okay, well, what is the ideal project in terms of...?
Katherina: Well, everybody who wants to look at it can look at it, so it doesn't depend on that. Now, since we're talking about the cloud, and we talked about a shared experience, it could be, I don't know, 100, 200, 300. Julia, what was it?
Julia: You know, in terms of the launch event, we had probably about 75 to 100 people there. The project is live and ongoing; the idea is that it's now a community asset. There are a number of guided tours of the neighborhood. Obviously those have been on pause because of COVID as well, but they are starting to get restarted. The idea is that it would be usable by local community stakeholders, including tour guides, merchants, and so on, for advertising. So, it's hard for me to imagine there would be more than 100 people at one time; I think it will probably be between 10 and 25 people involved in a tour. But I think it's really a question of Augmented City's technical limits on the number of users. I don't know, what is your maximum number of users?
Katherina: Actually, thanks to 5G in that location, we didn't have big problems, but there is a risk with 4G. In this case we're talking about, let's say, 4-5 seconds. Of course we didn't get to 3 seconds or less, because that's still hardly possible, but 4-5 seconds for downloading something is good enough, because with 4G we're talking about, let's say, 40 seconds for such a big model. And a good comment from Cassie, actually, about graphics without 5G: it's possible to do, because we have technicians and algorithms that can find optimized ways to show these big models, but it's much more difficult.
Julia: Well, I will say, what we did here isn't only a 5G experience; people were using it with 4G phones. That's something we're very conscious of in our projects, recognizing that not everybody has the latest phones and latest technologies, iPhones or Androids. We're always making sure our projects are accessible on as many different types of devices and networks as possible, so we designed this one to be usable by the broadest audience. But certainly, where we really noticed the biggest impact of 5G is loading content. As Katerina mentioned, with this app we tend to download our content at the time of app open, and 5G literally cuts that time in half. For this project it depends on where you are in the world and what server you're accessing, and that's part of the reason I was saying earlier that 5G with edge computing is actually a big piece of this puzzle. For instance, in San Francisco, this content happens to be on a local server there, and at the park they have the highest level of 5G available, so it's literally less than 10 seconds. Here in Seattle, where I have a test bed so we can work with the project, it does take longer; I would say it's more like 20 seconds. But compare that to the 4G download, which is like 40 seconds, if not more. And Katherina, when they're downloading it in Saint Petersburg or Italy, it takes forty seconds, right?
Katherina: We're talking about medieval streets, very narrow streets where sometimes not only 4G but even 3G doesn't work properly. That's why we have to, let's say, compensate in our own technological way, for example when we talk about navigation. We did a project in a small town called Anjera, where a small snake guides you to a castle. This snake guides you along very, very narrow streets, sometimes without any internet or Wi-Fi. So, of course, we needed to cut it into pieces, per path segment, and combine that with downloading elements inside the application, so the end user doesn't really see the gaps in between. But of course, with 5G it would be much easier to do. So, yeah.
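The cut-into-pieces approach Katherina describes, keeping the current route segment in memory while prefetching the next one so connectivity gaps stay invisible, can be sketched like this. The class name, the injected downloader, and the eviction policy are assumptions for illustration, not the production implementation.

```python
from collections import OrderedDict

class SegmentStreamer:
    """Streams a long AR route as pieces: keeps the current segment in
    memory and prefetches the following one, so the user walking through
    narrow streets with patchy connectivity never waits at a seam."""

    def __init__(self, fetch, keep=2):
        self._fetch = fetch       # injected downloader (network call in practice)
        self._loaded = OrderedDict()
        self._keep = keep         # how many segments to hold in memory

    def enter_segment(self, index):
        # Ensure the current and next segments are available, then
        # evict the oldest ones to bound memory on the phone.
        for i in (index, index + 1):
            if i not in self._loaded:
                self._loaded[i] = self._fetch(i)
        while len(self._loaded) > self._keep:
            self._loaded.popitem(last=False)
        return self._loaded[index]
```

Because the next segment is requested while the user is still inside the current one, a download that takes a few seconds over a weak connection completes before it is needed; faster networks simply widen the margin.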
Julia: Did we have problems with people not having devices that work with 5G? Yes and no. Like I said, we definitely designed and specced the models and everything to work with 4G and with the biggest number of types of phones possible. There were a few folks that had problems; it frequently boils down to whether they have the latest ARCore or ARKit on their phone. We have run into problems with some Android phones more than iPhones, but overall it was not a problem; just a few folks had some issues. That really just gets down to timing. Particularly when you have a launch event like that, it's great: everybody is there for the specific event or the specific app, so they have an event psychology in their heads, and they're willing to wait. But compare that to a person out on the street who is maybe just discovering this on their own: if you ask somebody to wait 40 seconds or longer in that kind of situation, you risk them aborting the app and the experience. So that's really where I see 5G and edge computing helping: actually helping to retain user engagement and get people into the experience.