Sunday, 6 June 2021

Zuckerberg & Bosworth Talk Neural Interfaces & Virtual Work – Video & Transcript


This week, Facebook VP of VR/AR Andrew Bosworth sat down with Facebook CEO Mark Zuckerberg to answer questions about a range of VR and AR topics.

The conversation happened over Instagram Live, with users sending in questions via the live chat over the course of the half-hour conversation. We already posted some remarks about building a reality operating system, but we’ve got a full transcript of the conversation below for those who are interested.

The following transcript features sections of the conversation that were relevant to the future of VR and AR, edited for clarity and grammar. We’ve also bolded some of the more interesting quotes throughout.


On Portal, Spatial Audio and Working Virtually

Bosworth: So, things are starting to open up. People want to know where Portal is going.

We can both take a shot at that, but Mark, you really put a lot of energy into Portal during the pandemic. It became such a lifeline for us at the company, with employees working from home. Where do you see Portal going in the long-term?

Zuckerberg: Yeah, well look, I do think a lot of norms got established during the pandemic that are going to be around for a long time.

People are more likely, I think, to just casually want to do video calls with friends and do social gatherings that way when people can’t be together. Before, it was kind of like, “All right. That’s not so fun, we’ll just hang out with the people around us. This is going to be sort of annoying if you have to do this over video.” But now it’s just a more normal thing, both for work and for hanging out over video.

I think that’s going to be a big thing. I’m trying not to mess this up because I don’t know… It’s hard for me to keep in mind exactly which of the details of what we’re working on are public and not [public].

[Bosworth laughs]

But we have an exciting roadmap of stuff that basically leans into what I think are going to be some of the new social norms around video presence. And that certainly connects to the bigger picture of all the AR and VR stuff that we do.

[When] a lot of people think about VR and AR it’s like, there are these technologies. What they really are is tools and platforms to deliver a sense of presence.

And video does that too. It’s not ideal – it’s 2D. When I see you on a screen, I can kind of get a sense of what’s going on with you. But I sort of still need to trick my mind into feeling like I’m there with you, whereas VR and AR really just make you, in a very native way, feel like you’re there with the person.

But for a lot of people, video is the best that we have for presence today. And we view this as all just kind of one extended product category in Facebook Reality Labs. Like, “Here’s all the ways that you’re gonna feel present with people.”

Bosworth: Yeah. I love that. Like locally, just looking at Portal, you’re right — connecting people during the pandemic was this great, important piece. Connecting people who work together is an ongoing piece — and I think that is part of the macro shift for us there — but it does stitch into something bigger.

This is a good example, right here. [Referring to his Instagram live video] One-on-one communication, even over this mobile phone (and I have a mini phone), is pretty good.

But as soon as you start adding more people, you need space. That’s how your brain understands group conversations. Like, side conversations are impossible on VC (video chat) – totally reasonable in real life.

I’m gonna tie into this, then I’m going to get back to the questions. You got to try it this week, Mark – I’ve teased a little bit in the past that we’re working on tools for people to collaborate at work, and you got to jump in with me this week on one of those tools.

I take meetings in VR every week and I tell people that’s something that we’re working on, that we’re developing. And it gets directly to the point you’re talking about. It’s just so much easier. We were actually able to have side conversations in a virtual meeting, and you can’t do that on VC.

Zuckerberg: Yeah.

Bosworth: You can’t do it without spatial audio.

Zuckerberg: Yeah. One of the things that really clicked for me is that I have a hard time sometimes remembering all the video conversations that I do, because there’s no sense of space. It’s like, this is how we form memories – you’re to my right and I’m to your left, and we kind of are in a space together.

But now it feels like a lot of the conversations happen over video conference. Some kind of grid is the space that you’re in, and everything sort of blends together. Which, again, is better than a lot of the other tools that we have, and we help hundreds of millions of people a month have those kinds of conversations. Between WhatsApp and Messenger, we’re probably building two of the biggest and leading video chat services in the world.

But I just think the promise is so much greater with VR and AR. The demo and the work that we’re doing over there, and some of the things that we have coming (not too far down the line), which I think are going to be really exciting, help deliver that sense of presence.

It’s not just the spatial audio. You physically feel like you’re in a place, even if you’re not actually physically together. It’s really cool.
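
[Editor’s note: to make the spatial audio point a little more concrete, here’s a minimal, purely illustrative sketch – our own, not Facebook’s code – of the core idea: each voice is attenuated and panned according to where its speaker sits relative to the listener, which is what lets a nearby side conversation stand out from the rest of the room.]

```python
import math

def spatialize(listener_pos, listener_facing_deg, source_pos, sample):
    """Toy spatializer: turn a mono sample into left/right gains based on
    the source's distance and direction relative to the listener."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    gain = 1.0 / max(distance, 1.0)                 # farther voices are quieter
    bearing = math.degrees(math.atan2(dy, dx))      # absolute direction of the source
    pan = math.sin(math.radians(listener_facing_deg - bearing))  # -1 = left, +1 = right
    left = sample * gain * (1.0 - pan) / 2.0        # simple linear pan, not constant-power
    right = sample * gain * (1.0 + pan) / 2.0
    return left, right

# A voice one meter to the listener's right is loud and lands in the right ear;
# a voice across the room is quiet and roughly centered.
print(spatialize((0, 0), 90.0, (1, 0), 1.0))
print(spatialize((0, 0), 90.0, (0, 6), 1.0))
```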

Bosworth: This is one of my favorite things to tease all my viewers with, which is [that] it’s coming sooner than they think — of course, they don’t know how soon I think they think it’s coming, but that’s what I always tell them.

Zuckerberg: Yeah. And it’s tough because we think pretty long-term, right? So when you have a ten-year roadmap, saying that something is coming kind of soon is… [jokingly] It might be easy to get people’s hopes up.

But no, I mean this one — if we’re doing demos in it, it can’t be that far off.

Bosworth: This one’s pretty close, you’re right.


On Using High Bandwidth Neural Interfaces With VR

Bosworth: This is Aqua VR (who always comes and comments and asks great questions on my AMAs) asking me a question about AssistiveTouch on the Apple Watch. Did you see that demo that came out from Apple? Their kind of assistive touch. It’s very oriented towards accessibility… it’s kind of a gesture-based control using existing Apple Watch capabilities.

Aqua VR, I can tell you right now, we can’t answer that. We haven’t used it, and we don’t know what it’s about. But they do ask a question about EMG and the bands that we’re talking about with CTRL-Labs, inside of Facebook Reality Labs…

You have, from a very long time back, had a very strong vision for neural interfaces — the ability to have more fluid controls. You’ve invested in this. You know, before we worked with CTRL-Labs, we had research in Building 8 that was oriented in this direction. We still have those teams working with UCSF.

Talk to me about how… Why is it so important that we have these higher bandwidth neural interfaces?

Zuckerberg: Well, whenever you’re designing a new platform, I think one of the most important aspects of it is input. I think in a lot of ways, how you control it is the most defining aspect of a platform, right? A lot of people think about AR and VR as sort of ‘what’s the output? Like what do you see?’

The bigger thing that defines PCs is that you have a keyboard and mouse. For phones, it was [that] you have this multi-touch and kind of swipe input.

So the question is — what are you going to use to control this natural interface around AR and VR? Our view is that it’s going to be somewhat of a combination of things, right?

You’ll have voice assistance and that’s going to be neat. But you’re not always going to want to use voice, because there are privacy issues with that – you want to sometimes control things without it telling everyone around you what you’re doing.

Hands are going to be a thing. People want to control things with their hands. But you’re not always going to be walking around through the world with your hands outstretched in front of you doing stuff. So that will work better at some times than others.

Controllers are going to be one interesting dimension of this too. Because as good as hands can get, if you’re doing something that’s really a micro movement – any gamer can tell you this – actually having a thumb pad and that kind of tactile feedback is super important. And for things like writing, you want a stylus – it’s super helpful to have something physical.

But, in some ways the holy grail of all this is a neural interface, where you basically just think something and your mind kind of tells the computer how you want it to go and that works.

There’s a bunch of research that we and others are doing into this. I think the key insight that our team has had… A lot of people, when they think about neural interfaces, they think about ‘how can we understand what you’re thinking?’

And it’s actually not about that. You don’t want to read the person’s mind. You’re not trying to understand what they’re thinking. What you’re trying to do is give the person an ability to have their brain send signals to the rest of the body about how this works.

And we have a system that does this, right? With motor neurons, your brain basically sends signals to your hands and your body telling them when you want to make movements and how to control them. And it turns out that we all have some extra redundant capacity for that, right? It’s part of neuroplasticity. If one pathway gets damaged, your brain can kind of get rewired. But you can train those extra pathways to control, for example, a second set of virtual hands. That way you just kind of think and, down the line, your virtual hands are typing and controlling what you’re doing in VR and AR, and then you don’t need to actually have a physical controller or anything like that, which is awesome.

When you get to that, we’re gonna have this whole constellation of inputs, but that is perhaps one of the more ambitious projects that we have going on. But I think it’s really promising long-term and I think the team is making good progress towards it.
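
[Editor’s note: for readers wondering what “decoding motor signals rather than reading thoughts” might look like in software, here’s a deliberately simplified sketch. It is entirely our own illustration – not CTRL-Labs’ actual pipeline – and assumes a hypothetical wristband that samples muscle activity (EMG) on a few channels: each short window of samples is reduced to per-channel features and matched against calibration templates to recover an intended action.]

```python
# Toy illustration of EMG intent decoding (not CTRL-Labs' actual system):
# wristband channels -> per-window features -> nearest-template "intent".
import math

# Hypothetical calibration templates: average rectified activity per channel,
# recorded while the user performed each gesture during a setup phase.
TEMPLATES = {
    "pinch": [0.80, 0.10, 0.05, 0.30],
    "swipe": [0.10, 0.70, 0.60, 0.05],
    "rest":  [0.05, 0.05, 0.05, 0.05],
}

def features(window):
    """Mean absolute value per channel over a short window of EMG samples."""
    n = len(window)
    channels = len(window[0])
    return [sum(abs(sample[c]) for sample in window) / n for c in range(channels)]

def decode(window):
    """Return the gesture whose template is closest to this window's features."""
    feats = features(window)
    def dist(template):
        return math.sqrt(sum((f - t) ** 2 for f, t in zip(feats, template)))
    return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name]))

# A fake three-sample window whose activity pattern resembles the "pinch" template.
window = [[0.9, 0.1, 0.0, 0.3], [0.7, 0.1, 0.1, 0.3], [0.8, 0.1, 0.05, 0.3]]
print(decode(window))  # -> "pinch"
```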


On Building A Reality Operating System

Bosworth: I enjoyed watching everyone’s minds blown as you were going through that. [Laughing] The comments were flying by, like, what is Mark talking about?

Yeah, no, I agree. As soon as you got into neuroplasticity, you had me.

It’s been pretty staggering. Sometimes, I think we do feel held back by the rate at which we can communicate with these machines. They can do so much, but can I tell it what I want with enough precision, with enough specificity? Neural interfaces do hold that promise, but of course they are a ways away. So, virtual work rooms — sooner. Neural interfaces — later. That’s just a rough sequencing over a 10-year arc… [Reading a question] ‘I thought you were building an operating system. Why are you building one?’ Do you want to take that one, Mark?

Zuckerberg: Yeah, sure. I mean, so first… We are building a reality operating system. That’s sort of how we think about it. These new platforms are so different from everything that’s come before them — not just the input, but the app model, how you’re going to discover things, how tightly they need to be optimized…

If you’re building a pair of glasses that needs to look like normal glasses… You need to have the system be so tightly optimized that you can basically do all the computation that you would expect from a modern computer, but do it on someone’s face within a thermal envelope and a power envelope that can last all day long. So that’s a very big challenge.

The team is pretty far along on this at this point. We’re building a microkernel-based operating system, which is the architecture you want if you’re going to segment the pieces to make it as secure as possible. That way you have a small [set] of pieces that you know are going to be fundamentally trustworthy that you can build on top of.

But at the end of the day, we need to basically be able to design and customize every layer of the stack in order to build out the performance and efficiency that we need in order to deliver these systems.
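
[Editor’s note: as a rough illustration of the microkernel pattern Zuckerberg describes – this is our own toy sketch, not anything from Facebook’s actual operating system – the trusted core stays tiny and does little more than route messages, while services such as drivers run as isolated processes outside it, so a fault in one service can’t take the core down with it.]

```python
# Toy sketch of the microkernel pattern: a tiny trusted core that only routes
# messages, with services isolated in their own processes. Purely illustrative.
from multiprocessing import Process, Queue

def display_service(inbox: Queue, outbox: Queue) -> None:
    """An isolated 'driver' process. If it misbehaves, the core keeps running."""
    while True:
        request = inbox.get()
        if request == "shutdown":
            break
        outbox.put(f"display service: rendered {request!r}")

def kernel(app_requests: Queue, to_display: Queue, from_display: Queue) -> None:
    """The small trusted core: no device logic, just message routing."""
    while True:
        sender, payload = app_requests.get()
        if payload == "shutdown":
            to_display.put("shutdown")
            break
        to_display.put(payload)            # forward the app's request to the service
        print(f"[kernel] reply to {sender}: {from_display.get()}")

if __name__ == "__main__":
    app_requests, to_display, from_display = Queue(), Queue(), Queue()
    service = Process(target=display_service, args=(to_display, from_display))
    service.start()
    # Apps never touch the display service directly; everything goes
    # through the kernel as a message.
    app_requests.put(("demo_app", "hand-tracking overlay"))
    app_requests.put(("demo_app", "shutdown"))
    kernel(app_requests, to_display, from_display)
    service.join()
```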

And a lot of the people… I mean, Boz… Here’s a little-known fact — when I was getting started with Facebook, a lot of what I studied at Harvard was computer systems engineering. I think it was one of the reasons why Facebook was always able to scale pretty well, because we have that deep in the DNA of the company. And Boz was actually the teaching fellow for one of the classes I took, which is how we originally got connected and how you got to know me and eventually chose to join the company.

So between the two of us — we’re not actually the ones building this thing, which is good [laughs] – but I think the DNA (in terms of empathy for these deep systems problems and how tightly you need to optimize things for such a specific problem space, one that has never really been solved before) is absolutely critical.

So we’re definitely focused on this. I think we’ll have more to share for developers and some other folks at some point in… I guess the theme of today is the ‘not too distant future’, but we’re going to leave it vague as to what that actually means.

Bosworth: Whenever this comes up – ‘why are you building it?’ – I want everyone to know: I don’t need to build everything myself. I want to build as little as possible. And Facebook really was built on top of open source. We’re big contributors to open source. When software is available for us to use, we love using it. Obviously, our Oculus and Portal systems are built on Android, which we’ve had great success with.

So, I don’t want to build it. I want to build as little as I can. What is amazing is how much you have to build to fit into these tight thermal envelopes.

And I do feel at times that mine was a generation of computer programmers who were a little bit lazy. We got to be lazy. We were at just the fattest part of Moore’s law, which was delivering tremendous gains in silicon. So you could just write high-level, inefficient code and who cares?

We’re not up against Moore’s law — it’s much tougher than that, we’re up against the first law of thermodynamics. The amount of heat that we can dissipate off of your face is not very much without burning you, which we strongly oppose. [Joking]

So any piece of work you see me doing, any piece of work you see Facebook Reality Labs doing… I don’t want to do that work – I feel like I have to do it to deliver the vision.

And building our own reality operating system is a part of that.

Zuckerberg: It’s also worth noting – it’s not just the reality operating system. We actually have to go even deeper than that in terms of technical products, right?

It’s not just making the apps efficient, the system efficient…  It’s the operating system layer and then optimizing the hardware and actually building out a bunch of our own custom silicon. It’s all the way up and down the stack.

Our hope over time, especially for both augmented and virtual reality, is that eventually a bunch of the pieces of the stack can become sufficiently… You know, when this is at a big scale, each layer of the stack will be its own industry. And then you’d want the whole thing to be more modular. So that way, we can work with other partners who are building chips and support a bunch of different people who are building different hardware or different things like that.

But for the initial version, in order to get this to be tightly optimized, you really have to go pretty deep to deliver the experience that we want and have it last throughout the day.

Bosworth: Yeah, the mobile workloads that we use on our phones – we think of those as having been highly efficient in terms of performance per watt. [But] even they’re an order of magnitude less efficient than what we need for augmented reality, which is actually pretty demanding. You’re doing 3D graphics that have to be responsive to a real-world environment. The end-to-end latency there is very tight. So it’s a tremendous technical challenge.
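
[Editor’s note: to put rough numbers on that gap – the figures below are hypothetical, chosen only to show the arithmetic – glasses resting on someone’s face can sustainably dissipate roughly a tenth of the power a phone can, so running a comparable workload demands roughly ten times better performance per watt.]

```python
# Back-of-the-envelope illustration of the "order of magnitude" gap Bosworth
# mentions. All numbers are hypothetical, chosen only to show the arithmetic.
phone_budget_watts   = 4.0    # hypothetical sustained power budget for a phone SoC
glasses_budget_watts = 0.4    # hypothetical budget for lightweight AR glasses
workload_ops_per_sec = 2e12   # hypothetical compute needed for responsive AR graphics

phone_efficiency   = workload_ops_per_sec / phone_budget_watts    # ops per watt
glasses_efficiency = workload_ops_per_sec / glasses_budget_watts  # ops per watt

# Same workload, ~10x smaller power budget => ~10x better perf/watt required.
print(f"Required efficiency gain: {glasses_efficiency / phone_efficiency:.0f}x")
```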

Michael Abrash, our chief scientist – also our chief historian, keeper of the knowledge of the history of computing, and from early on one of the pioneers of computer graphics as we know them – talks about this a lot. These are some of the hardest problems that we’ve taken on in a generation of computer scientists. And I came up in an easy period, and boy, we’re at the hard stuff now.

On The Future of Ads In VR

Bosworth: Okay. I got one that I’ll answer from Chad – ads are running in the Oculus app. Are we going to have ads in VR too?

I think you probably are. I don’t think you’ll hate it as much as your question suggests that you might. One thing that people forget about the ecosystem around applications and games is that they really rely on distribution and that’s not new. They’ve always relied on distribution.

In fact, previously, part of the reason there were these big game publishers is because they could have enough… clout with retailers to get distribution. And advertising is a big part of that.

I [have a] long history with this, long before I was working on Facebook Reality Labs, even when I was working on Facebook ads. Mobile app install ads powered casual gaming for an entire generation of players. Casual gaming at times, especially at the height of the mobile boom, was bigger than any other kind of gaming. People just playing casual, easy-to-play games.

So I do think ads are super important. You can pick your favorite indie developer, pick any game you love — go ask the developer how they feel about having the ability to promote their applications in places that give them a good return on their investment, whether it be on Facebook, Instagram, or in the Oculus app.

So promotion is actually important. It’s not just like a thing that you have to struggle through. Though I respect [that] we want those ads to be as good as possible. I don’t want to sell garbage, so if you see garbage, you can report it. Heck, you can report it to me.

Likewise, in VR, at some point when there’s an economy there — you need to power that economy. That doesn’t mean it has to be intrusive or bad. I think we all know what bad ads are. We hate bad ads as much as anybody — probably more, since they cause me to get more questions like this.

So it’s on us to make that a good experience. It’s on us to make that… to live up to your expectations, Chad. But yeah, they’re already there a little bit, and there are probably more coming.

And by the way, they also drive down the cost, I should mention, of both the content (as it gets a broader distribution base) and of the hardware upon which that content is built. And that’s an important piece for us. We do want this to be accessible. In fact, that’s my next question…


On Making Emerging VR, AR Technologies Accessible

Bosworth: How are you going to lower barriers to emerging technologies like augmented reality? Mark, one of the things that has been so important to us for all the products that we’ve built is that we get them out there and everybody can use them.

When you’re building communication tools, you’re building these utilities that give people power, that’s always been so important to us. How are we thinking about augmented reality? When these technologies are new, they’re going to start out kind of expensive. How does that play out over time?

Zuckerberg: Yeah, I think there are two big pieces here, in terms of the experience and how you get it to be accessible. One thing that I think people probably underrate is that if you’re delivering a product that’s about presence, you really can’t have wires.

That’s probably not the most obvious place to go with this question, but I do think that if you want this to be something that a lot of people are going to experience, it needs to be a good experience. If you’re trying to deliver a sense of presence, you don’t want a wire wrapped around your neck. It really breaks the whole thing.

I think that is going to be the bar for VR and AR products of high quality, going forward. I think you’ll kind of see the market split into wired experiences (which are maybe going to be less accessible to a broader number of people) and then the things that are going to be the mainstream line of technology, even if it’s a little harder to develop… I think getting on that wireless path is really important.

The other piece, that you were alluding to, is just getting the price to be as affordable as possible. If your mission as a company is to serve as many people as possible, then fundamentally you’re not trying to charge a premium for your devices.

You’re trying to drive the price down as much as possible, including potentially even doing what consoles have historically done. At the beginning, when they ship something, they know that they’re gonna be selling it for a little bit and then they’re gonna be able to make it cheaper. But they even subsidize it a little bit upfront, with the hope and expectation that they’ll basically make that up on app sales and on other experiences in the rest of the economy around it.

So that’s going to be more our plan in VR and AR, right? We’re coming at this from the perspective of, “How do we get this into as many people’s hands as possible?” Which is going to mean, “How do you make the price as low as possible for people?” We’re not going out of our way to charge folks a premium, and I think that is going to be a pretty big defining thing for how many people can use this stuff over the next five, 10 years.


Let us know what you thought of Bosworth and Zuckerberg’s conversation in the comments below.



from UploadVR https://ift.tt/3iij1O5
via IFTTT
