Note: This episode was recorded before Apple announced the preorder and launch details of Apple Vision Pro.
Oliver Weidlich is Director of Design and Innovation at Contxtual. We discuss his design work in spatial computing that predates Apple’s unveiling of visionOS by several years. We also dive into his experiences with other headsets including the HoloLens and the Magic Leap 1/2.
This episode is sponsored by Glisten. Glisten is the “Good Listen” podcast app for Language Learners. It’s all you need to immerse yourself in a language, on the path to becoming fluent. Learn more at www.glisten.ist. Download Glisten now for iPad, iPhone, and Apple Watch. Coming Soon to the Apple Vision Pro!
YouTube Version of the Podcast
Links and Show Notes
Links:
https://mastodon.social/@oliverw
https://www.contxtu.al/spatialcomputing
https://www.thisishcd.com/episode/oliver-weidlich-creating-immersive-experiences-with-spatial-computing
https://ieeexplore.ieee.org/document/10269051
https://www.aweasia.com/blog/oliver-weidlich
Chapter Markers:
00:00:00: Opening
00:01:49: Support the Podcast
00:02:32: Oliver Weidlich
00:07:00: Unlimited apps
00:10:33: What have you designed in the past?
00:11:28: What headsets have you tried?
00:18:26: Magic Leap
00:21:31: Outdoors
00:22:16: Your first experience
00:23:46: HoloLens
00:26:30: Apple’s potential
00:29:14: Sponsor – Glisten
00:30:51: What has your team been working on?
00:35:30: Unity
00:36:31: Buying AVP in Australia
00:36:49: Spatial Computing
00:38:36: API Limits
00:42:34: Year to year upgrades
00:45:54: Interaction Methods
00:50:25: ARKit apps in visionOS?
00:52:21: AR Research Paper
00:56:32: AI
00:58:57: Australia
01:00:29: What questions do you want Apple to answer?
01:02:46: Anything else?
01:07:00: Where can people follow you online?
01:08:02: Closing
Transcript of the Interview
(2m 34s) Oliver Weidlich:
Thanks Tim, it’s great to be here.
(2m 45s)
Yeah, I’ve listened to them all. It’s been really great to have them as a way of understanding who else is exploring this space and what sort of things they’re investigating and trialling.
(2m 57s)
So thank you very much for doing all of these interviews.
(3m) Tim Chaten:
Yeah, absolutely. It’s always fun at the very beginning as things are so very new and we’re trying out all these different ideas and concepts of what could be in the possibilities of this new platform we’re about to get our hands on here.
(3m 14s) Oliver Weidlich:
That’s the exciting time before everything’s really locked down, right?
(3m 16s)
Like where there’s still the opportunity to sort of really invent or explore or yeah.
(3m 17s) Tim Chaten:
Yeah.
(3m 23s)
Right, like in iPhone parlance, that’s when Tweetie invented pull-to-refresh.
(3m 30s)
A developer could invent some custom hand gesture that Apple adopts, and it becomes just how we do things.
(3m 37s) Oliver Weidlich:
Yeah, looking forward to that.
(3m 37s) Tim Chaten:
Yeah.
(3m 39s)
Yeah. So prior to getting into spatial computing, what kind of work did you do in user interface?
(3m 45s) Oliver Weidlich:
So my background’s originally in psychology and in human computer interaction.
(3m 50s)
And as that, as a basis for the last sort of 24 years,
(3m 53s)
I’ve been working and consulting in user experience.
(3m 57s)
So specifically research with end users,
(4m)
with people who are gonna use a digital service,
(4m 4s)
understanding their needs and behaviors,
(4m 5s)
and of course, designing those and doing the strategy around different digital platforms,
(4m 11s)
different interfaces, et cetera.
(4m 13s)
And one of the ones that I really focused on
(4m 16s)
in the early thousands was mobile UX.
(4m 18s)
And so, you know, did a lot of work,
(4m 20s)
even, you know, the early days of Nokia,
(4m 24s)
the wireless application protocol, PalmPilots,
(4m 27s)
I know ATP are talking about PalmPilots again at the moment and stuff.
(4m 32s)
And so that was a lot of fun.
(4m 33s)
And when the iPhone launched, obviously in 2007,
(4m 37s)
that was a momentous occasion,
(4m 39s)
especially for interaction design, right?
(4m 40s)
And for user experience,
(4m 41s)
because suddenly it drew a lot of attention to
(4m 45s)
the importance of the user experience and how to get that right, and to think through this new opportunity of direct manipulation, right, of being able to touch an app and launch it, and also prioritizing the customer experience, in that the customer could decide what apps they wanted, where the phone was just an app.
(5m 5s)
It wasn’t sort of this functionality that took over the phone, and I think that approach was very customer friendly.
(5m 15s)
And actually we see some of that in the Vision Pro today as well.
(5m 20s) Tim Chaten:
Yeah, the whole concept of the screen just transforming into whatever you want it to be, having custom keyboards, and just, you know, the world’s your oyster as far as designing what the screen contains, is exciting.
(5m 33s) Oliver Weidlich:
- Exactly.
(5m 36s)
And you know, obviously we’ll get to it,
(5m 37s)
but Vision OS is almost the polar opposite in a good way in that it shows you nothing until you want something.
(5m 43s) Tim Chaten:
Yes. So augmented reality, virtual reality, what’s your background here?
(5m 51s) Oliver Weidlich:
Yeah, so, you know, my experience with computing, I suppose,
(5m 56s)
goes back to the Apple IIe and the Apple IIc, you know, prior to the Mac even,
(6m 1s)
and really seeing that transition from command line to obviously the Mac and the graphical user interface and that different interaction type,
(6m 11s)
and then obviously the internet as well, really changing how we interacted with computing,
(6m 15s)
and then mobile, and we were sort of looking at what is the next era beyond that.
(6m 21s)
So we’re sort of really laser-focused on that, not even on mobile AR and definitely not on full immersive VR.
(6m 44s)
You know, they have their roles, we’re not saying they don’t, it’s just that’s not our focus.
(6m 49s)
So we really want to
(6m 51s)
focus on adding utility and functionality to people’s digital experiences in their everyday sort of “I want to go in and get stuff done.”
(6m 59s) Tim Chaten:
- Yeah, and the cool thing about Vision OS to me is,
Unlimited apps
(7m 4s) Tim Chaten:
it seems like you can run just unlimited apps all at the same time.
(7m 8s)
And if you have something that’s non-obtrusive,
(7m 12s)
that’s like an AR app that kind of is additive,
(7m 15s)
you could be running some productivity apps alongside this unobtrusive,
(7m 19s)
like in the Mac parlance, you have all these menu bar apps.
(7m 22s)
You might have all these just little,
(7m 24s)
just additive, delightful, just AR apps on top of your other stuff.
(7m 30s) Oliver Weidlich:
Yeah, and that’s what we are excited about is the people creating and defining their own spatial computing experiences. You know, we’re limited at the moment by these rectangles that sit in front of us, whether that’s an iPhone or an XDR display or an iPad or a Mac laptop,
(7m 50s)
that have to, by their nature, contain things in a certain space. And obviously spatial computing gives us this opportunity to put things
(8m)
in places that maybe make more sense to us or that we can group spatially, you know,
(8m 4s)
maybe a particular set of, you know, an email I’m working on plus a numbers document and my calendar will sit in one particular area of my view so I can focus on those and then I’m going and focusing on a collaborative experience that I’m having with my colleagues on a freeform board and things like that.
(8m 22s)
So arranging these things around us in places that make sense to us, I think is incredibly powerful and having…
(8m 30s)
them interact in a way that’s different to windows on a computer screen that are contained by that, with both the dimensionality of 3D and depth as well, you know, being able to put things behind things and so on. I think all of that is going to be fascinating.
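For readers who want to see what that looks like in code, here is a minimal visionOS sketch of the volumetric idea Oliver describes: a bounded 3D window that sits alongside ordinary 2D windows in the Shared Space. The “Clock” asset name and the sizes are placeholders, not anything from the episode:

```swift
import SwiftUI
import RealityKit

@main
struct ClockApp: App {
    var body: some Scene {
        // A volume: a bounded 3D window that coexists with other apps'
        // windows in the Shared Space, rather than replacing them.
        WindowGroup(id: "clock") {
            Model3D(named: "Clock") { model in
                model
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } placeholder: {
                ProgressView()
            }
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.3, height: 0.3, depth: 0.1, in: .meters)
    }
}
```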
(8m 50s) Tim Chaten:
I do wonder if there’s going to be a way to hide an app. Like, on Mac you can Command-H to hide it, on an iPad you have different stages, and it seems like you just have one big stage, and
(9m 3s)
if you want to get rid of something you’re more or less closing it.
(9m 8s) Oliver Weidlich:
Yes, and while the visibility of something can fade into the background a bit more,
(9m 14s)
if there are other things in front of it, obviously,
(9m 17s)
that can demote it visually.
(9m 19s) Tim Chaten:
Yeah, it’s still there.
(9m 19s) Oliver Weidlich:
But yeah, I think that’s interesting.
(9m 21s)
Also, command tab as well,
(9m 23s)
as an interaction of switching between apps,
(9m 26s)
or bringing them to the front.
(9m 28s)
I think there’s things that might be additive later on in Vision OS,
(9m 34s)
and I think we’re starting with the basics,
(9m 35s)
and I think that’s the perfect place to start.
(9m 39s)
But there’s a lot of opportunity to add these types of interactions as the platform develops.
(9m 43s) Tim Chaten:
Command-Tab is an interesting thing, because in visionOS you just look at an app and that’s what brings the attention to it. So I’m very curious, when I get my hands on
(9m 51s)
Apple Vision Pro, is there a Command-Tab, and what would that do exactly?
(9m 57s)
It might just bring the focus to it without you needing to look.
(10m 1s) Oliver Weidlich:
- Yeah, and to me, it’s more like almost,
(10m 4s)
maybe a space is a better analogy,
(10m 6s)
where I’m working in a particular set of windows and I have them all arranged,
(10m 11s)
and then I command tab to a different space that comes in front of me,
(10m 16s)
and is the layout that I want of those different screens or documents or, yeah.
(10m 18s) Tim Chaten:
It’d be interesting if there were gestures to, like, move your world into a different space, like, I don’t know, you’re swimming or something.
(10m 25s) Oliver Weidlich:
Yeah, yeah.
(10m 28s) Tim Chaten:
Yeah, it’ll be interesting to see how that all develops. So prior to working in spatial computing, what have been some of your favorite experiences you’ve designed for, you know, Apple Watch, iPhone, and
What have you designed in the past?
(10m 43s) Oliver Weidlich:
- Yeah, certainly, because we’ve been in there from the very start of a lot of those Apple platforms,
(10m 48s)
we’ve had the opportunity to work with, you know,
(10m 50s)
some of the telcos and the airlines and so on here in Australia and the banks.
(10m 54s)
So, you know, we’ve probably researched and designed over a hundred mobile apps and websites,
(10m 58s)
including, you know, one of the first iPhone banking apps in Australia, and as I said, for the telcos and airlines,
(11m 5s)
and we did the first iPad app for Australia’s national broadcaster and for Fox Sports here in Australia.
(11m 10s)
We worked with Salesforce on an IR.
(11m 13s)
We did a bunch of Apple Watch apps, for everything from people with epilepsy to exploring the opportunities around detecting the onset of a post-traumatic stress disorder episode for army veterans before they’re aware of it.
(11m 27s) Tim Chaten:
Very cool. So in the spatial computing world and AR/VR world
What headsets have you tried?
(11m 33s) Tim Chaten:
What headsets have you had personal experience with? Because me personally I’ve only really used PlayStation VR 1 and 2 and
(11m 40s)
then at a PAX East years ago I had, like, the first Oculus demo experience, and
(11m 46s)
that was awesome. But yeah, my understanding is you’ve tried pretty much everything.
(11m 47s) Oliver Weidlich:
- Yeah.
(11m 51s)
Yeah, and I think that’s the joy of having your own company.
(11m 55s)
You get to decide where the budget goes.
(12m 3s)
Because of my background in psychology and human-computer interaction,
(12m 8s)
I’ve long had an interest in the more academic side of things.
(12m 12s)
And that’s where a lot of the research around head-mounted augmented reality,
(12m 17s)
these headsets that we talk about,
(12m 19s)
has really taken place over the last bunch of years, right?
(12m 23s)
But because of that, they’re doing a lot of research with these types of devices as well.
(12m 27s)
And I really like that approach of getting a device and playing with it.
(12m 30s)
So we’ve got a HoloLens,
(12m 33s)
and one of my first experiences with the HoloLens was with Mark Pesce, the co-author of VRML, who’s a friend.
(12m 40s)
But that really sort of triggered this,
(12m 42s)
okay, we can put digital content in the real world,
(12m 45s)
overlaid in the real world.
(12m 47s)
And we can also interact with it through a multimodal interface, right?
(12m 52s)
With voice and with gesture and with different visuals, et cetera.
(12m 56s)
But we’ve also had the opportunity,
(12m 57s)
we’ve got the Oculus Quest 2 and 3 and the Pro.
(13m)
And while the 3 has an all right mixed reality experience,
(13m 4s)
I think it’s important for us as UX researchers and designers to understand these design patterns that the Oculus is using, that HoloLens is using,
(13m 14s)
and so on, so that we can see what works
(13m 17s)
and what doesn’t.
(13m 18s)
And I sort of, the analogy for me is,
(13m 20s)
back in the day, we used to have Windows Phone and BlackBerry and Android and iOS,
(13m 23s)
and they were all trying different things.
(13m 26s)
And that led to a better overall user experience, right?
(13m 29s)
Over time, webOS, for example, as well.
(13m 29s) Tim Chaten:
Right. I love the WebOS, yeah.
(13m 32s) Oliver Weidlich:
So, yeah, yeah, exactly, it was great.
(13m 35s)
And so devices like Snap Spectacles,
(13m 37s)
so they’re developer edition only,
(13m 39s)
but they’re a lot of fun.
(13m 41s)
And then headsets like the Magic Leap 1 and the 2,
(13m 44s)
I think those headsets are our favorite.
(13m 47s)
Very powerful, really that optical see-through approach to spatial computing or augmented reality that we really see is quite useful and interesting.
(14m)
And they really did a lot of thinking around the UX and the richness of the UX, especially on Magic Leap One, because it was pushing for a consumer experience as well as an enterprise experience back then,
(14m 14s)
whereas now it’s much more sort of clinical.
(14m 17s)
more focused on the enterprise experience.
(14m 20s)
And of course, Apple Vision Pro.
(14m 22s) Tim Chaten:
And you guys have been in a couple of labs,
(14m 23s)
which is very exciting.
(14m 25s) Oliver Weidlich:
Yes, yeah.
(14m 26s)
Unfortunately we don’t have one, but.
(14m 28s) Tim Chaten:
Right, they are very hard to come by here.
(14m 28s) Oliver Weidlich:
[laughs]
(14m 31s) Tim Chaten:
Yes, yes.
(14m 33s)
Magic Leap and being able to optically see the world around you.
(14m 38s)
What’s that experience like?
(14m 39s)
Like you have a screen there that you’re able to like glance up at,
(14m 44s)
or are you able to overlay stuff with the entire world or?
(14m 48s) Oliver Weidlich:
- Yeah, so I think that we sort of keep coming back to the Magic Leap One experience as really well-thought-through user experience design for spatial computing, in that you could take an object, let’s say, for example, it’s a clock, right?
(15m 8s)
So it’s super easy to understand.
(15m 10s)
Everybody knows what a clock is.
(15m 12s)
You could create a clock and this digital object was something that you could grab.
(15m 18s)
Magnetically, it would do the plane detection.
(15m 20s)
So it knew there was a wall that was there and you could stick it to the wall and slide it up and down and move it around.
(15m 26s)
And then you could either leave it there and so on,
(15m 30s)
or you could actually lock it there.
(15m 32s)
And so what that meant was that object would be there,
(15m 35s)
it would stay there.
(15m 36s)
You could turn the headset off, you could go away,
(15m 38s)
you could come back and it would still be there.
(15m 40s)
And I think that’s really powerful for a spatial computing experience.
(15m 43s)
And also the fact that it didn’t have additional UI attached to it.
(15m 48s)
So it just felt like an object, or looks like an object, in the real world.
(15m 52s)
And, you know, that is different from what we’re seeing with Vision OS,
(15m 56s)
where everything has a close button and a grab handle,
(15m 59s)
or the vast majority of things do.
(16m 1s)
There are some exceptions to that.
(16m 3s)
But I think they really thought through that digital object in the physical world type of experience.
(16m 10s)
And I think Apple will get there,
(16m 12s)
but it’s really impressive to see the thinking that went into these types of experiences years ago,
(16m 18s)
to create these really lovely and engaging UX.
(16m 25s) Tim Chaten:
Yeah, one thing I think will be popular in visionOS is, like, photo frames that you put on the wall or your desk.
(16m 30s)
And I think that would work and you can just leave them there and come back to them.
(16m 35s) Oliver Weidlich:
- Yes, and this is where I think there’s a lot of nuance,
(16m 39s)
especially when something lives in your world all the time,
(16m 42s)
like a photo frame.
(16m 43s)
You know, you potentially want it there all day on your work desk, or a clock on the wall, right?
(16m 48s)
It’s that if you hit the recenter button on your Apple Vision Pro,
(16m 54s)
all the windows will come in front of you, right?
(16m 56s)
I don’t want my clock to, yeah,
(16m 57s)
I don’t want my clock to pop off the wall at that point, or my photo, you know?
(17m 1s)
So there’s things like that.
(17m 3s)
There’s things like, I don’t want a grab handle
(17m 5s)
or close button on that photo frame sitting there all the time.
(17m 8s)
You know, it’s these little nuances that I think over the course of the evolution of Vision OS, we’re not even at 1.0, right?
(17m 16s)
We’ll see those types of things happen,
(17m 18s)
but to see that that thinking had gone in before,
(17m 20s)
and I think that those opportunities will arise again in the future.
(17m 24s) Tim Chaten:
- Yeah, interesting.
(17m 25s)
Yeah, I’ll be curious when we get our hands on Vision OS,
(17m 28s)
how many times a day I do need to reset the world.
(17m 31s)
I know in PSVR 2, I do it somewhat frequently when I’m in like the movie theater mode,
(17m 37s)
but not that much when I’m actually in a VR game.
(17m 40s)
So we’ll see what’s needed.
(17m 44s) Oliver Weidlich:
- Yeah, and I think certainly over the last however many years it’s been,
(17m 49s)
the Apple OSs have got better at retaining state, right?
(17m 53s)
Like, you restart your Mac now and everything pops back up seconds later exactly where it was.
(17m 59s)
My constant frustration at the moment is I’ve got my display in front of me, my studio display,
(18m 5s)
and then I’ve got two iPads next to it that I use for continuity.
(18m 10s)
But what happens is every now and then,
(18m 14s)
they’ll get out of sync and they’ll go to the other side of the display.
(18m 16s)
And it’s really frustrating, because I’ve got to rearrange them, and I’m like,
(18m 20s)
oh, I hope that doesn’t happen in visionOS,
(18m 21s)
where I’m spending half my life rearranging the screens back
(18m 25s)
to where they were.
Magic Leap
(18m 27s) Tim Chaten:
- So with Magic Leap 1 and 2, what,
(18m 30s)
for those that have not worn these or seen them,
(18m 33s)
can you describe the actual hardware by,
(18m 36s)
like with the Apple Vision Pro,
(18m 38s)
it’s like a big pair of ski goggles,
(18m 39s)
that’s the way to describe it.
(18m 40s)
You have a screen over your entire face.
(18m 42s)
Is that not the case here?
(18m 44s) Oliver Weidlich:
Yeah, that’s right. So the way that we think about it when we’re looking at all of these headsets is we divide them into sort of two main categories. The first one is
(18m 52s)
what we call optical see-through. So that’s where it’s essentially a pair of glasses, so you can see through the glass and the digital content is overlaid; usually there’s a small square of content, a small square of screen, that sits over the middle of that. And then the opposite of that, or the alternative to that, is video
(19m 12s)
pass-through.
(19m 14s)
So what that means for the Magic Leap is that the headsets are more glasses-like in that you can see through them without turning them on, etc.
(19m 26s)
But that they’re a bit more closed in, I suppose.
(19m 30s)
So you have a smaller field of view, so the area of screen that you can put digital content on tends to be a lot smaller.
(19m 40s)
So the Magic Leap 1 was what we call 50 degrees field of view.
(19m 44s)
The Magic Leap 2 was 70 degrees field of view, whereas the Apple Vision Pro, I think, is 110 or something degrees field of view. So that content area is a lot smaller, and it’s also a bit more washed out, right, because you’re trying to create light overlaid on the real world, and you need really strong light. The headsets themselves, both the Magic Leap 1 and the Magic Leap 2, are cabled as well, like the Apple Vision Pro.
(20m 14s)
But instead of having the battery on the end of the cable,
(20m 17s)
they’ve got the compute and the battery on the end of the cable.
(20m 19s)
So it’s a larger, it’s sort of like a small,
(20m 23s)
like a bread plate type size almost, and quite thick.
(20m 27s)
And they actually had an ARM chip in the Magic Leap 1
(20m 36s)
that ran quite cool.
(20m 37s)
And on the Magic Leap 2, they moved to an x86 processor.
(20m 41s)
So it gets quite hot quite quickly.
(20m 44s)
You have to find somewhere to put it, right?
(20m 45s)
And the other thing with Magic Leap 1 and Magic Leap 2 is they both have a hand controller as well.
(20m 49s)
So you can imagine you’re trying to put these goggles on,
(20m 52s)
then trying to find a place for the puck to clip onto your pocket or onto your belt.
(20m 57s)
And then at the same time, you’ve got a hand controller.
(21m)
You know, I think the AVP approach of slip the battery into your pocket and put the headset on is a lot easier, not having to worry about controllers.
(21m 9s) Tim Chaten:
Yeah, I was always wondering why the processor and stuff wasn’t in that battery pack and we could just upgrade that pack every couple years to get a faster Apple Vision Pro.
(21m 21s) Oliver Weidlich:
It’s a very interesting question, yes.
(21m 21s) Tim Chaten:
Yeah, but I’m sure Apple has their reasons.
(21m 26s) Oliver Weidlich:
Yeah, yeah, I’m sure they do.
(21m 27s) Tim Chaten:
Yeah, but with all these headsets, have you tried any of these outdoors?
Outdoors
(21m 33s) Tim Chaten:
It seems like that would be a challenging environment to work in, especially Australia.
(21m 39s) Oliver Weidlich:
Yeah, it’s a beautiful day here at the moment. No, we haven’t. We’ve walked around within buildings, but not outside. The exception to that is probably the Snap Spectacles. But yeah, the biggest issue is the amount of light coming in from the surrounding environment. You need to really power an image to make it bright enough to see in those types of environments outside.
(22m 4s) Tim Chaten:
Yeah, and I imagine you just burn through the battery even faster because, yeah, the phone, when you have that cranked up to the max, it just, yeah, you can tell it’s just like burning through the battery.
Your first experience
(22m 16s) Tim Chaten:
So what was your very first headset experience and was it like an aha moment from the first moment or did it take you a while to grasp it?
(22m 26s) Oliver Weidlich:
Yeah, the HoloLens was the first sort of mixed reality or augmented reality headset experience, and that was really powerful, A, because of the AR aspect, because there was digital content, but also that multimodal aspect. And that really resonated with me. I think
(22m 44s)
one of the things that we haven’t fully taken advantage of with user experience design,
(22m 52s)
and this is even with mobile, is the full power
(22m 56s)
of context and those different, what we call, modalities, interaction types, you know, being voice or gestures or sound, both inputs and outputs. So the HoloLens was good in that you could use your hand to gesture things, and also you could interact via voice. You could talk to Cortana embedded in HoloLens, and that was really powerful. And then that Magic Leap 1 experience was really powerful more from
(23m 26s)
just dialing everything up: the resolution’s much better, the brightness is better, the care around the graphics and the animations in the onboarding experience is phenomenal,
(23m 38s)
you know, very Apple-like in that.
(23m 41s)
So that’s where I really went, wow, okay, this is the first time I’ve…
HoloLens
(23m 46s) Tim Chaten:
- Yeah, the pitch for HoloLens always seemed interesting.
(23m 48s)
You know, you have an HVAC technician repairing something, and an overlay shows what’s going on.
(23m 55s)
And it seemed very enterprisey as far as like,
(23m 59s)
versus, you know, you have these custom overlays, versus this more general computing platform where Apple is like, you have all these apps.
(24m 7s)
And, you know, there is some augmented reality,
(24m 10s)
but Apple Vision Pro’s approach seems to be,
(24m 14s)
you know, more
(24m 16s)
app-based, versus these, you know, more niche markets.
(24m 20s) Oliver Weidlich:
Yeah, look, those headsets, like the HoloLens and even the Magic Leap 2,
(24m 27s)
you would tend to buy as an enterprise or, you know, for a vertical application, right?
(24m 33s)
Like, you would expect that the individual using them would be in that app for the vast majority of the time they were using it.
(24m 39s)
Whereas, I think Apple have taken that real spatial computing approach to,
(24m 43s)
"Hey, we want you to be in this a lot more of the time,
(24m 47s)
in your general computing experience.
(24m 51s)
And not necessarily having to be in an app all the time.
(24m 56s)
And I think one of the greatest things that Apple is doing is providing this real transition experience for people.
(25m 3s)
So, again, going from, OK, well, I’ve currently got a Mac and I use a bunch of apps on that and I’m very familiar with that and I might have an iPad, but I get what an iPad app looks like and now I can put that up in my spatial computing experience.
(25m 18s)
And they know that they haven’t got a
(25m 20s)
bunch of AR experiences and fully immersive experiences that they can go straight to out of the gate, right?
(25m 25s)
So it’s this real opportunity to transition customers from going, “Buy this thing.
(25m 30s)
You can use it from day one with the stuff that you know and love.
(25m 33s)
And by the way, there’s all these additional aspects that over time will build up, right?
(25m 39s)
And people will understand better.”
(25m 41s)
And while I think there’s obviously a lot of experience in the VR space of creating fully immersive experiences
(25m 50s)
and a lot in the 2D screen space,
(25m 52s)
I think the most interesting for us is in that middle ground,
(25m 55s)
that volumetric experience that is shared with the other windows.
(26m 1s)
Because when you’re in a fully immersive experience,
(26m 3s)
you can’t have it as a shared experience with other apps and windows.
(26m 7s)
They all close down or disappear.
(26m 9s)
But that volumetric one,
(26m 11s)
where I can have 3D objects and 3D interactions,
(26m 16s)
but alongside the other things that I’m doing,
(26m 18s)
I think is the most exciting.
(26m 20s)
And I think that’s the area that will show the most growth.
(26m 25s)
once people get these in their hands and can see the power of those types of experiences.
Apple’s potential
(26m 31s) Tim Chaten:
Yeah. In your mind, Apple Vision Pro has the potential to succeed where others have failed from a hardware perspective? You’re able to read text and not be fatigued by that experience.
(26m 45s)
Is the hardware the standout or is it the operating system? A combination of both, I guess.
(26m 51s) Oliver Weidlich:
- Yeah, it’s a good Apple product, right?
(26m 53s)
So it’s, the opportunity is in both the software and the hardware and getting those things right.
(26m 53s) Tim Chaten:
Right.
(26m 58s) Oliver Weidlich:
I think, you know, we’re already starting to see the other headset makers sort of,
(27m 5s)
I mean, I’m sure they’ve got, had plans for many years,
(27m 7s)
but they’re already looking at aspects of what Apple is doing and tuning into that.
(27m 13s)
So one of the best things, as I said,
(27m 15s)
was that Apple takes you into your reality.
(27m 17s)
It doesn’t put things in front of you by default when you first enter the
(27m 21s)
experience, and I think that’s really powerful. The quality of the visuals, you know, is widely reported as very high, the quality of the screens that people are looking at. You can get a higher-quality headset with something like a Varjo, right, but the price is even more expensive, you know, double the cost, plus you have to have it tethered to a PC. Great for vertical applications, you know, if you want to render a Ferrari or something and look at it in detail. But this is a nice middle ground of combining super-high-end visuals, a device that’s
(27m 51s)
comfortable to wear, you know, it doesn’t have controllers, they’ve taken a really good approach to that, and again, as I said, you can use it day one, because you probably use a bunch of these apps and things already.
(28m 3s)
So Apple’s picked, you know, a sweet spot
(28m 6s)
on a whole range of those aspects that I think, you know, will create an amazing v1 experience. It’s not going to be an everyday consumer experience, I think we’re years away from that, but it’s a very
(28m 21s)
strong starting position. And I think with these types of platforms, you really have to experience it and interact with it. As good as a simulator is, it’s just not the same, right? I keep using the analogy, you know, back to mobile, of trying to design a good mobile app on a desktop screen, right? You don’t have that direct manipulation. You can carry this thing
(28m 51s)
around in the world and pull it out of your pocket and things like that. And it’s very similar with the AVP, like some of the nuances in the interaction design
(29m 1s)
are going to take some getting used to, because while they’re very natural, they’re very different to how we interact with computing today, and that will open up a whole new range of…
Sponsor – Glisten
(29m 15s) Tim Chaten:
This episode of Vision Pros is sponsored by Glisten.
(29m 17s)
Glisten is a different kind of podcast app that helps immerse you in a language you are learning.
(29m 22s)
Glisten stands for “Good Listen.” It is the only podcast player app designed specifically for language learners and it’s coming soon to the Apple Vision Pro.
(29m 32s)
Listen to compelling foreign language podcasts on your way to language fluency.
(29m 36s)
Glisten makes it possible, utilizing repetitive listening workouts,
(29m 40s)
which are like a trip to the gym for your ears.
(29m 42s)
Using the latest AI technology,
(29m 45s)
Glisten determines where sentences start and end, then repeats each sentence as many times as you need to grasp it.
(29m 51s)
You can also read along with the transcript that Glisten generates automatically, learning new words and deciphering difficult passages.
(29m 57s)
The key to stepping beyond introductory language apps is to start listening to native speakers.
(30m 3s)
Glisten immerses you in a foreign language, taking your listening comprehension to the next level.
(30m 8s)
With Glisten, you can learn English, Spanish, French, German, Italian, Portuguese, Dutch, Japanese, Chinese, and more.
(30m 16s)
With more languages coming very soon.
(30m 18s)
When I was learning Japanese a few years back, I made it a point to open up a Japanese iTunes store account and purchase many big blockbusters that I knew inside and out with Japanese audio.
(30m 28s)
I also listened to the official Monster Hunter podcast from Japan.
(30m 32s)
This is an app I really wish existed back then.
(30m 35s)
This podcast app transforms the hundreds upon hundreds of hours of podcasts into a tool to help you learn a new language.
(30m 42s)
The app does a great job at helping you discover potential podcasts in the language you are working on learning, and even has a special search and directory of language learning podcasts.
What has your team been working on?
(30m 51s) Tim Chaten:
This is one of those apps that I think will truly shine on the Apple Vision Pro.
(30m 55s)
One category of apps tons of press have been talking about have been the meditation apps.
(31m 1s)
And Glisten, I think, will shine in the same way that you can have a very focused experience where you can really pay attention to the podcast, follow the transcript,
(31m 10s)
and maybe have another window open with a dictionary.
(31m 13s)
Glisten provides tons of different audio workouts aimed at either speaking or listening.
(31m 18s)
Examples include the do-over,
(31m 20s)
which repeats each sentence once.
(31m 23s)
The kitchen sink, which repeats often with pauses.
(31m 26s)
For speaking workouts, there are things included such as read my lips, which repeats each sentence twice and then gives time to repeat it yourself.
(31m 35s)
Or you can try the slow shadowing,
(31m 37s)
which plays each sentence slowly twice, and then you try to speak along with the audio, shadowing it, on the third time.
(31m 43s)
There are many more audio exercises
(31m 45s)
included, and I can’t wait to try these out in visionOS when that version launches.
(31m 50s)
To get started before Apple Vision Pro arrives, head on to the App Store and download Glisten for iPhone, Apple Watch, and iPad.
(31m 57s)
It is free to download and get started with, and if you want even more from the app, give Glisten Pro a try to unlock the full feature set.
(32m 5s)
My thanks to Glisten for sponsoring this episode of Vision Pros.
(32m 10s)
So what has your team been working on for Vision OS so far that you can share?
(32m 14s) Oliver Weidlich:
Yeah, so we’re kind of different in that, you know, because we’re a user experience consultancy, we don’t have development capability, right? We work with third-party developers, we work with our clients’ developers, et cetera, but we don’t have that capability in-house. But we do want to understand the full process of taking an idea, from looking at the customer needs and behaviors, you know, sketching that out, understanding what’s relevant,
(32m 44s)
what does AVP add to their experience, how’s it going to contribute to their everyday lives. And we do want to see the different paths and the different strengths and weaknesses of those different approaches, right? Well, I can design a 2D experience, that’s really not our thing for AVP. You can design a shared volumetric experience, as I said, that’s really interesting for us. But we also want to see what that experience would be like in an immersive
(33m 14s)
experience. And one of the reasons for that is that with the immersive experience you can leverage ARKit, whereas in the shared experiences you can’t, and to us that’s kind of unfortunate, right, because we can’t do things like plane detection in a shared space. But we want to understand what those capabilities are in a, you know, in a fully immersive space. So we’re actually building a little set of utilities, is probably the best way of describing them, and we…
(33m 44s)
think of them, or we frame them, as what we call ambient information visualization. So we’re not going to do a big app that’s going to be a car game or, you know, a word processor or anything like that. We want to create these little objects that sit in people’s spatial computing experiences and give them insight into a particular thing that they might be interested in. So there’s these little tiny things that run alongside your other windows and things like that. In fact, I’ve bought the
(34m 14s)
domain “desk accessories”, a nod to the original Mac’s early system desk accessories, like the calculator and things like that, because we think there’s this bunch of little objects that can just live around your world, right? And it gives us the opportunity to create these little things in both of these experiences. And in fact, for the volumetric in the shared space, we’re working with another iOS developer, Yarn,
(34m 41s)
who’s great with ARKit, so he’s looking at the fully
(34m 44s)
immersive view of that, just to try those things out.
(34m 48s)
And then we’ve also had a Unity prototype developer that we work with for the stuff that we do with Magic Leap.
(34m 54s)
He’s actually created the same experience in the Magic Leap 2 using the windowing, so we can use the gestures and the gaze,
(35m 1s)
not quite as high fidelity as obviously the AVP,
(35m 4s)
but we can simulate that experience in a Magic Leap.
(35m 8s)
And then we’ve taken that path of using the Unity plugin to explore how to pull that in.
(35m 14s)
Our aim as a UX consultancy is, A, we can talk to our clients and say we have experience in exploring these three paths, but also, to other UX designers, we want to explain how to design for these different experiences and what the strengths and weaknesses of each of those paths are.
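To make the ARKit constraint Oliver mentions concrete, here is a minimal sketch of visionOS plane detection, which only delivers data from inside a Full Space (an ImmersiveSpace), not the Shared Space. This is illustrative only, not Contxtual’s actual code:

```swift
import ARKit

// Plane detection runs only while the app presents an ImmersiveSpace;
// in the Shared Space this provider will not deliver updates.
final class PlaneWatcher {
    private let session = ARKitSession()
    private let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    func start() async throws {
        // Requires the NSWorldSensingUsageDescription Info.plist key
        // and user authorization for world sensing.
        try await session.run([planes])
        for await update in planes.anchorUpdates where update.event == .added {
            print("Detected \(update.anchor.classification) plane \(update.anchor.id)")
        }
    }
}
```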
Unity
(35m 32s) Tim Chaten:
So Unity’s been a pretty important tool for actually testing this stuff in Magic Leap and then porting it over to Apple’s framework.
(35m 43s) Oliver Weidlich:
Exactly. So, you know, we’ve known for a while,
(35m 48s)
we’ve expected for a while that something would be happening in this space.
(35m 51s)
So, we did a whole bunch of exploration,
(35m 55s)
talking to designers and devs about what they might want from this future device, before Apple had even announced it.
(36m 1s)
We did a whole bunch of prototypes of experiences of what most of them were an optical see-through.
(36m 8s)
So, a smart glasses type Apple experience might be.
(36m 13s)
It hasn’t eventuated, so we’ve now tuned that.
(36m 16s)
But the Magic Leap 2 is a good headset to sort of play with that because we don’t have our hands on an AVP.
(36m 24s)
And we’re in Australia, so we probably won’t get it through retail here for a while yet.
(36m 30s) Tim Chaten:
Yeah. Are you guys going to look at importing one for an astronomical price on eBay or something?
Buying AVP in Australia
(36m 39s) Oliver Weidlich:
I have the luxury of having two brothers-in-law in Los Angeles, so I might see if they can send me something.
(36m 47s) Tim Chaten:
Oh, very nice. Yes. So the term “spatial computing” was new to me in June when Apple introduced visionOS,
Spatial Computing
(36m 54s) Tim Chaten:
but this is something you’ve been talking about for years at this point. Is Apple’s vision of spatial computing the same as you guys were conceptualizing this before Apple shared their vision of what spatial computing could be?
(37m 10s) Oliver Weidlich:
- Yeah, I think the whole aspect of spatial computing is really fascinating.
(37m 14s)
It sort of goes back to the mid 90s, 1995,
(37m 17s)
I think it was Simon Greenwold who sort of coined the term.
(37m 21s)
And it was really about how can computing understand the environment.
(37m 27s)
And that real direction was around how can computing
(37m 32s)
understand the objects and the environment, so that my work and play can be enhanced,
(37m 39s)
and it can be a better partner.
(37m 40s)
And that really evolved, and, you know, there’s a great book called The Infinite Retina by Irena Cronin and Robert Scoble that really goes into spatial computing and AR in great detail.
(37m 57s)
But I think it’s this unique combination of having multimodal interaction design.
(38m 2s)
So that’s a core thing, but also understanding the physical environment.
(38m 7s)
And I think Apple have really executed that well with the AVP.
(38m 10s)
They ticked all the boxes.
(38m 11s)
I think there’s detail, as I said, like not supporting ARKit in a volumetric space.
(38m 16s)
I think, you know, that’s an opportunity for improvement.
(38m 19s)
I think, you know, playing with these volumes in the space, like locking them or not having grab handles, again, are opportunities for the future.
(38m 26s)
But if you look at all the headsets now, it’s probably, well, it is the ultimate…
(38m 35s) Tim Chaten:
Yeah, so limitations of the API. You mentioned ARKit’s a big one; I guess just using the camera itself seems like such a big
API Limits
(38m 44s) Tim Chaten:
potential limiting thing, as far as, let me identify
(38m 52s)
stuff in the room and, like, work…
(38m 52s) Oliver Weidlich:
Yeah. Yeah, and I totally understand the privacy and security approach to that.
(39m 1s)
To me, it feels like a version one thing,
(39m 4s)
like they want to lock it down as much as possible.
(39m 8s)
But in the future, I think there might be opportunities where they’ll open up certain things because they’ll understand the use case and you can’t cater to every use case straight out of the box.
(39m 18s)
It’s that sort of thing where it’s better…
(39m 22s)
…to err on the safe side than to go out with everything open.
(39m 28s)
As people who run a lot of usability testing…
(39m 31s)
…one of the things that we want to make sure we can see…
(39m 34s)
…is obviously when people are using the headset…
(39m 36s)
…that we can see their experience…
(39m 38s)
…how they’re interacting from their perspective.
(39m 41s)
So that’s critical for us, for example.
(39m 44s) Tim Chaten:
Yeah, the privacy angle of the killer use of this, you know, you’re at a party and you know who everybody is, just with a little avatar above their head or something.
(39m 57s)
But privacy would forbid that from happening.
(40m 1s) Oliver Weidlich:
- Yeah, that’s right.
(40m 2s)
And I think a lot of our sort of little concepts that we explored were around things like that, right?
(40m 8s)
And it’s fun to play with because a lot of the work
(40m 14s)
that we’ve had the opportunity to do is around wearable devices.
(40m 17s)
So tracking sports players and getting live player data,
(40m 21s)
or we worked with a company doing yoga pants that were all hooked up to sensors to detect whether you were in the right position and things like that.
(40m 31s)
Those projects give us the opportunity to explore that future state
(40m 36s)
to see where the boundaries are, right?
(40m 38s)
Because we can’t, especially as UX designers,
(40m 44s)
run into this trying to do everything from the start, because there’s so much nuance to this that we have to be careful about these things and really take our time.
(40m 53s)
And there’s the group, the XR Guild,
(40m 55s)
run by Avi Bar-Zeev and others,
(40m 58s)
who I think are doing a great job of sort of highlighting…
(41m 1s)
…of people to engage with this platform in a way that they just simply couldn’t with different…
(41m 30s) Tim Chaten:
My dream, like 20 years from now, when it is just a little tiny pair of glasses you can throw on, is my grocery store. I can put these on and, instead of spending an hour trying to find whatever spice, it can walk me through: here’s my grocery list, let me walk through the store as efficiently as possible and point out where that tiny spice jar might be on the shelf.
(41m 52s) Oliver Weidlich:
Yeah, and that’s kind of the exciting thing, right? Like, I think if people sit down, they can think of a bunch of different opportunities that this type of device will open up for them.
(42m 5s)
And that’s why, you know, we’re very excited about V1, but we’re also looking to a future state of 5, 10, 15 years or whatever that is, because the opportunities around that are even greater,
(42m 19s)
and yeah it’s fun
(42m 22s)
just exploring them and understanding people’s needs and behaviors, and how to extrapolate those and address those needs and behaviors in a different way with…
(42m 33s) Tim Chaten:
Yeah, I’ll be so curious about the hardware ramp up with generation to generation.
Year to year upgrades
(42m 40s) Tim Chaten:
Apple Watch rapidly improved year to year.
(42m 43s)
It went from the slow, cruddy thing to fast GPS, then we had cellular on the watch, then this redesign with the ECG and then always on display.
(42m 53s)
It seemed like every year is just a revolutionary upgrade.
(42m 56s)
iPhone, early years were drastic improvements as well.
(43m 1s)
…at Apple platforms early on.
(43m 3s)
And we’ll see…
(43m 14s) Oliver Weidlich:
There’s a guy in the industry called Karl Guttag,
(43m 21s)
and I think it was him who phrased it.
(43m 23s)
He said, “You can’t buy physics.”
(43m 26s)
And basically what he was saying is no matter how much money Meta or Apple have,
(43m 30s)
because of the nature of physics and light and things like that, there’s only certain things you can actually do.
(43m 40s)
So I think that’s an interesting frame of reference.
(43m 42s)
- Yep.
(43m 44s)
I try to look at it and go,
(43m 47s)
is the AVP similar to the iPhone?
(43m 49s)
And that’s different because people had phones that they used daily before the iPhone came.
(43m 55s)
It just changed the nature of interaction and so on.
(43m 57s)
So I don’t think it’s as similar to that.
(44m)
You could compare a Bondi Blue iMac, the size and depth of that, to an iMac today, right?
(44m 8s)
And maybe that’s a better analogy, I’m not sure.
(44m 11s)
But I actually think it’s kind of–
(44m 14s)
interesting, and while I’m not a huge fan of the Meta business model, I do have a great appreciation for what they’re doing at both ends of the spectrum.
(44m 23s)
So they’ve got the Meta Quest Pro,
(44m 26s)
which obviously isn’t as high end as the Apple Vision Pro,
(44m 29s)
but it’s that direction, and it’s video pass-through.
(44m 34s)
And at the other end, they’re playing around with the Meta Ray-Bans, which are going to have an AI built into them and can see the camera view and things like that.
(44m 44s)
And so they’ve sort of got both ends covered.
(44m 48s)
And I can’t get my head around how Apple doesn’t address optical see-through at some point, right?
(44m 57s)
If you want to get to smart glasses, it can’t be video cameras anymore.
(45m 2s)
And that’s going to change the field of view significantly.
(45m 4s)
It’s going to change a whole bunch of things.
(45m 6s)
And that’s why I think Meta is quite smart in that they’ve taken both ends of the spectrum and they’re going to probably work towards the middle, right?
(45m 14s)
So to me, is this the pro version of the Apple Vision?
(45m 18s)
And is there going to be more a Google Glass-type version that’s like a companion to your iPhone, like your Apple Watch is,
(45m 27s)
or some sort of satellite of devices type of approach that covers that bottom end, right?
(45m 33s)
And I think that would be fascinating.
(45m 36s) Tim Chaten:
Yeah, it’ll be interesting to see the different form factors, because I’ve said before on the podcast,
(45m 39s)
you know, macOS has laptops, desktops, all-in-one desktops, and different form factors. iOS has all these different form factors. iPod touch back in the day, yep, and iPad. And yeah, so I’ll be curious to see the different form factors that come out of this. So as far as interaction method,
Interaction Methods
(45m 58s) Tim Chaten:
how do you see the hand tracking and, you know, not needing a traditional VR controller?
(46m 4s)
I’d imagine for most day-to-day
(46m 6s)
applications that’s a pro; for gaming, it’s less good if you’re wanting it for that kind of dedicated gaming.
(46m 17s) Oliver Weidlich:
Yeah, look, I can imagine, but to be honest, you know, I have to say, I’m not a gamer at all.
(46m 22s)
So I don’t have that view on things.
(46m 24s)
But to me, not having that controller as a core element of the interaction design really enables a whole bunch of people to use it, not only because of physical dexterity, but just, as I said before, putting the thing on, and, you know, no things to lose, no things to run out of battery, no “where is it? Oh,
(46m 47s)
I can see it through the video pass-through.” But, you know, I think this approach that Apple have taken, with a very simple set of gestures and, from all reports, very accurate eye gaze for navigation, I think that combination is super powerful. Again, we’re introducing a new platform to people: get the basics right, do that really well, make it feel super natural to those people.
(47m 13s)
And then over time, you know, through custom gestures.
(47m 17s)
If you’re in a specific app, or, and I see this opportunity probably more for power users, over time, in my spatial computing experience, I do a gesture that’s sort of like my shortcut, right?
(47m 31s)
Like leveraging Apple shortcuts to do something because I make this particular gesture.
(47m 37s)
It’s going to go and get that bit of information or it’s going to present these windows or whatever it’s going to be.
(47m 41s)
So I think starting with that basic set is the perfect place to start.
(47m 47s)
start. I think it will evolve.
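For a sense of what a developer-defined gesture could look like under the current APIs, here is a hedged sketch using visionOS hand tracking (which, like plane detection, only runs in a Full Space). The pinch-distance threshold is an arbitrary illustrative value, and the shortcut invocation is left as a comment:

```swift
import ARKit
import simd

// Watching for a thumb-to-index-fingertip touch as a custom trigger.
// HandTrackingProvider does not run in the Shared Space.
let session = ARKitSession()
let hands = HandTrackingProvider()

func watchForCustomGesture() async throws {
    try await session.run([hands])
    for await update in hands.anchorUpdates {
        let hand = update.anchor
        guard hand.isTracked, let skeleton = hand.handSkeleton else { continue }
        let thumbTip = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
        let indexTip = skeleton.joint(.indexFingerTip).anchorFromJointTransform.columns.3
        // Treat fingertips within 1.5 cm as touching; tune for real use.
        if simd_distance(simd_make_float3(thumbTip), simd_make_float3(indexTip)) < 0.015 {
            print("Custom pinch on \(hand.chirality) hand: trigger the shortcut here")
        }
    }
}
```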
(47m 49s) Tim Chaten:
Yeah, I always wondered if we’ll get a glove one day that has haptics in it for Apple Vision Pro.
(47m 55s) Oliver Weidlich:
Well, there’s been patents around rings and, you know, there’s some interesting stuff around,
(48m)
you know, devices like watches, et cetera.
(48m 3s)
So I think there’s certainly seems like they’re exploring those types of things.
(48m 8s)
And there’s a lot going on in this space. You know, if we talk about spatial computing and contextual computing more broadly, we’ve just recently had, finally, the announcement of the Humane AI Pin, right, which is, you know, a lot of people from Apple in the past
(48m 25s)
looking at this whole different way of using a camera to understand the environment and creating experiences off that.
(48m 33s)
So yeah, I think it’s going to be a really interesting time as computing starts to understand the world around it, either through the video cameras and sensors on the AVP, or through AI enabling various aspects of these interactions.
(48m 52s) Tim Chaten:
Yeah, and your mind is amazing at playing tricks on you.
(48m 55s)
I remember the first Oculus Quest demo back in the day,
(49m)
it was like the original Oculus,
(49m 1s)
and there was this demo I played with just a regular controller
(49m 4s)
of this little tiny toy soldier knight,
(49m 7s)
and he jumped on my shoulder,
(49m 10s)
and I could have sworn I felt something on my shoulder.
(49m 13s)
Like, it’s, you know, even in those early days,
(49m 15s)
like your mind can sometimes, if it’s done…
(49m 16s) Oliver Weidlich:
Yeah, yeah, I listened to a great podcast ages ago and I can’t remember what it was, but it talked about this concept of what’s called umwelt, or understanding the environment,
(49m 30s)
like how an organism understands the environment that it sits within. And that ultimately
(49m 36s)
everything is an electrical signal going into the brain, right? So people like Rony Abovitz,
(49m 44s)
who created Magic Leap, have talked about
(49m 46s)
this, you know, what’s the fine line between tricking the brain through the various senses,
(49m 55s)
you know, especially the eyes, but also sound and touch and haptics and, you know, those sorts of things where it does, it overrides our understanding that this isn’t real and it becomes as good as real or, you know, it’s close to a simulation of real, that we are wholly invested and emotionally invested or engaged.
(50m 16s)
It’s a very powerful thing and coming back to the XR Guild, you know, we’ve got to be super careful with these types of things as well.
(50m 24s) Tim Chaten:
Yeah, so for many years now, it was kind of a running joke, Apple showing off augmented reality games and applications at WWDC, you know, building Legos while you’re holding an iPad in your hand.
ARKit apps in visionOS?
(50m 39s) Tim Chaten:
It’s like, how ridiculous is that?
(50m 41s)
But I’m curious, all these ridiculous things you’re doing with your iPhone or iPad in your hand,
(50m 47s)
how translatable are these things to Vision OS?
(50m 51s)
ARKit, you said, has to be fully immersive.
(50m 54s)
If you are fully immersive, can they bring all those experiences to visionOS?
(51m) Oliver Weidlich:
Yeah, to me, that should be a relatively smooth path.
(51m 4s)
And, you know, that constraint of needing to hold a device in just the right way, in just the right angle,
(51m 12s)
and move it around with this very small window to view this type of content changes.
(51m 19s)
And a lot of those assets should be, I would expect,
(51m 21s)
would be able to be reused, essentially.
(51m 24s)
Probably updated, but reused.
(51m 26s)
and again adapted to the gestures.
(51m 30s)
that we now have, but again it’s quite a simple set of gestures and because of the nature of
(51m 36s)
interacting with the screen on the iPhone and the iPad, it was a pretty simple set of gestures, generally speaking, anyway. So to me that seems like a smooth path, but yeah, we see the opportunity to think through things from scratch, right? And for us, that’s our particular perspective, because, while we did some of those,
(51m 57s)
like it wasn’t our core focus.
(52m)
The opportunity to start with a new platform and leverage the power of that platform without the constraints or the modalities,
(52m 9s)
the interaction methods, et cetera, of other ones,
(52m 12s)
I think is a unique opportunity with this headset, and to really get it right for this particular…
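For a rough sketch of that porting path: a USDZ asset carried over from an iOS ARKit app can be re-hosted in a visionOS RealityView and wired up to the system’s look-and-pinch drag. “Rocket” is a placeholder asset name, and this is a generic pattern rather than anything discussed in the episode:

```swift
import SwiftUI
import RealityKit

// Re-hosting an existing USDZ model in visionOS, with the simple
// system gesture set standing in for the old touch interactions.
struct PortedModelView: View {
    var body: some View {
        RealityView { content in
            guard let model = try? await Entity(named: "Rocket") else { return }
            // Make the entity respond to gaze-and-pinch input.
            model.components.set(InputTargetComponent())
            model.generateCollisionShapes(recursive: true)
            content.add(model)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Reposition the entity under the user's pinch-and-drag.
                    guard let parent = value.entity.parent else { return }
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: parent)
                }
        )
    }
}
```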
AR Research Paper
(52m 22s) Tim Chaten:
So you shared with me that you published a research paper on wearable augmented reality.
(52m 28s)
What were some of your findings in this paper? I believe it was, like, four years in the making. And anything we haven’t covered yet that you’d like to touch on here?
(52m 37s) Oliver Weidlich:
- Yeah, look, it comes back to that aspect of human-computer interaction.
(52m 42s)
And I think more so than any other platform that we have designed for,
(52m 48s)
there is a lot that has a deeper background in human-computer interaction because of the combination of modalities of seeing and voice and gestures and all of these types of things.
(53m)
So I’ve had an adjunct position with Sydney University for a number of years.
(53m 4s)
I’ve taught there for nine years on interaction design.
(53m 7s)
And so we did a collaborative research paper driven by Tram Tran, a PhD student there.
(53m 12s)
And we looked at the last five years of academic research into head-mounted augmented reality.
(53m 17s)
So these types of headsets,
(53m 18s)
HoloLens is the standard one.
(53m 20s)
And we’ve really tried to track how the research is changing.
(53m 24s)
And the academic research into this space is really transitioning from more about the functionality of the hardware and getting it to work,
(53m 31s)
things like tracking and SLAM and all that sort of stuff,
(53m 35s)
to more things around the consumer experience.
(53m 37s)
So, the interaction design, the ethics of these types of experiences.
(53m 42s)
There’s a great paper that looks at how skin is rendered differently in AR and what the impact is for different populations, right, and the ethics around that.
(53m 53s)
Accessibility is another one, right, as a research area.
(53m 57s)
Those research areas are really growing as they become more important as the consumer starts to uptake these types of devices.
(54m 5s)
So I think we’ll only see an
(54m 7s)
acceleration around that, but it’s a great indicator of where the industry is focusing, because a lot of these research papers are driven by companies like Meta and Apple and people like that. So it can be good insight into where things are going, and it gives you a good background on all the different types of research.
(54m 26s)
So, you know, people have even been researching that future state where a person’s name is above their head.
(54m 31s)
If that happens, how do I feel as the person viewing that, and how do I feel as the person
(54m 37s)
who has the label above their head, right? So people are trying to investigate these things, again, ahead of time, before we start designing these experiences and just rolling them out.
(54m 49s) Tim Chaten:
Yeah, I’d imagine in Apple’s ecosystem, like, if they’re in your contact book, maybe it uses, like, the AirTag chip to tell you, “Oh, you’ve already verified that it’s okay for me to be reminded of who you are,” since his or her Apple Watch may be on and tells you, “Hey, that is me. We’ve met.”
(55m 14s) Oliver Weidlich:
- And I think this is the amazing thing about the Apple ecosystem is if you look around you and all the opportunities that the platform has, right?
(55m 25s)
Because of, you know, the HomePods,
(55m 27s)
because of other IoT devices, you know,
(55m 30s)
lights and things like that.
(55m 32s)
So one of the prototypes that we played with was looking at a light and gesturing and turning the light on and off, right?
(55m 38s)
And people say, "Well, why don’t you just say,
(55m 39s)
“turn the light on?”
(55m 40s)
And I’m like, “Well, I might be on the phone or something,” right?
(55m 42s)
So we have to think through all these
(55m 44s)
different opportunities. And the strength of the Apple platform is how these all tie together in very unique ways, right? That Universal Control of me being able to slide my cursor across from my Mac onto my iPad.
(55m 57s)
You know, that’s unique to this platform because of the tight integration these devices have: things like, you know, the AirTags, things like the HomeKit devices.
(56m 7s)
These will present so many different opportunities for combination. And I think, you know,
(56m 14s)
Apple will obviously enable a bunch of those, as they’re in a very unique position to do, but that also opens up so many opportunities for other designers and developers to think: how can we connect these in interesting ways that will benefit people?
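As a concrete illustration of the light prototype Oliver describes, here is a minimal Swift sketch of toggling a HomeKit lightbulb’s power state, the kind of action a gaze-and-gesture trigger could call into. The `LightToggler` class and the “first lightbulb found” lookup are hypothetical scaffolding, not code from Contxtual’s prototype.

```swift
import HomeKit

// A minimal sketch: find a lightbulb in the first home and flip its power
// state. A spatial prototype would call toggleFirstLight(in:) from its
// gesture handler rather than a button.
final class LightToggler: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // HomeKit loads homes asynchronously; wait for this callback before querying.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.homes.first else { return }
        toggleFirstLight(in: home)
    }

    private func toggleFirstLight(in home: HMHome) {
        for accessory in home.accessories {
            for service in accessory.services where service.serviceType == HMServiceTypeLightbulb {
                guard let power = service.characteristics.first(where: {
                    $0.characteristicType == HMCharacteristicTypePowerState
                }) else { continue }
                let isOn = (power.value as? Bool) ?? false
                power.writeValue(!isOn) { error in
                    if let error { print("Toggle failed: \(error)") }
                }
                return
            }
        }
    }
}
```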
(56m 31s) Tim Chaten:
- Yeah. How impactful do you think Apple’s work in AI will be?
AI
(56m 38s) Tim Chaten:
I use that term as generally as possible,
(56m 41s)
’cause it means a whole lot of different things to different people.
(56m 44s)
But in 2024, people have said Apple’s gonna be doing a lot of stuff with their OSes that leverages that technology.
(56m 52s)
How big could this be for visionOS?
(56m 57s) Oliver Weidlich:
- Yeah, I see AI as an enabling layer, right?
(57m 2s)
Like it’s a way things can be enhanced in interesting and unique ways.
(57m 7s)
And I think particular to the Apple Vision Pro type experience, there’s a guy called Russ,
(57m 13s)
and I forget his last name, he’s at Shopify,
(57m 15s)
and he sort of does a lot of the future state prototyping.
(57m 18s)
And he’s got this amazing prototype, or demonstration, of a future state where the person is talking to
(57m 27s)
their intelligent agent, which is obviously enabled by AI. ’Cause I think it’s really important to differentiate those: the modality,
(57m 35s)
the intelligent agent, the Siri, the whatever,
(57m 39s)
and what brings that to life.
(57m 42s)
But he’s talking to that about a shopping experience.
(57m 44s)
So he wants to buy something; he’s interested in latte art.
(57m 46s)
So the intelligent agent brings up three AR models of different coffee machines and puts them on his bench,
(57m 53s)
right, in his kitchen.
(57m 54s)
So he can see them and turn them around,
(57m 56s)
and he sort of talks to the AI
(57m 57s)
and asks it about various qualities of these things, and the AI takes one away based on that, you know.
(58m 2s)
And then, you know, “Ring that up for me.
(58m 5s)
I want to find some cups that would suit this as well.”
(58m 7s)
And so this whole opportunity around intelligent agents and conversational interaction for people
(58m 15s)
with an awareness of the context of use and the objects within your world,
(58m 20s)
I think is hugely powerful.
(58m 21s)
And that’s what Meta is sort of starting to talk about for the AI in their Ray-Ban glasses.
(58m 28s)
So I think if you’re looking at a Siri version of that for AVP and enhanced in this way,
(58m 33s)
I think that’s, yeah, it will be very powerful if it’s done right.
(58m 37s)
But again, there’s so much nuance in getting that stuff done right that you have to really make it feel natural and sensible.
(58m 45s)
And all of us have had experience with intelligent agents that feel neither natural nor useful, to be honest.
(58m 54s)
But hopefully that will be enhanced.
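For a sense of the mechanics behind that coffee-machine demo, here is a minimal RealityKit sketch that anchors a few product models to a horizontal surface. The `showProducts` helper and the USDZ asset names are hypothetical; a real intelligent agent would add and remove entities as the conversation narrows the options.

```swift
import RealityKit

// A minimal sketch: anchor three hypothetical product models to the first
// sufficiently large horizontal plane (a bench or tabletop) found by ARKit.
func showProducts(in arView: ARView) {
    // Require roughly half a metre of flat surface before anchoring.
    let bench = AnchorEntity(plane: .horizontal, minimumBounds: [0.5, 0.5])

    let modelNames = ["espresso_a", "espresso_b", "espresso_c"] // hypothetical USDZ assets
    for (index, name) in modelNames.enumerated() {
        if let model = try? Entity.loadModel(named: name) {
            // Space the candidates 30 cm apart so they can be compared side by side.
            model.position = [Float(index) * 0.3 - 0.3, 0, 0]
            bench.addChild(model)
        }
    }
    arView.scene.addAnchor(bench)
}
```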
Australia
(58m 57s) Tim Chaten:
So, it just struck me as I was writing up the notes, that this is episode 7, and so far 3 out of the 7 episodes are from people in your neck of the woods, Australia and New Zealand.
(59m 6s) Oliver Weidlich:
I know we love the New Zealanders.
(59m 7s) Tim Chaten:
I know you guys don’t like to clump those two together.
(59m 11s) Oliver Weidlich:
You had Clarko, you had James. Yeah, it was fascinating.
(59m 17s) Tim Chaten:
Do you have a sense on whether that’s just a coincidence, or is there a big interest in this part of the world for AR/VR?
(59m 24s) Oliver Weidlich:
- It’s a great question.
(59m 25s)
Look, it’s a good community down here.
(59m 29s)
I think there’s a lot of,
(59m 31s)
though, you know,
(59m 32s)
Clarko’s obviously been in the US for a number of years,
(59m 34s)
so, and James, I don’t know directly,
(59m 38s)
but he obviously comes from that iOS background and that game background,
(59m 41s)
creating games for iOS.
(59m 43s)
So I don’t think there’s a,
(59m 45s)
there’s probably an overarching strong VR community that’s been there for a number of years,
(59m 50s)
but it’s certainly something that we’re looking at now.
(59m 52s)
And we’ve connected with a bunch of people
(59m 54s)
all around Australia who are particularly interested in Apple Vision Pro and augmented reality and spatial computing, and we’re trying to network those together.
(1h 2s)
So my expectation is it’s largely a coincidence, but yeah, I think it’s actually a question for you.
(1h 9s) Tim Chaten:
Yeah, I don’t know. It’s kind of wild, yeah.
(1h 9s) Oliver Weidlich:
How have you found these people?
(1h 12s) Tim Chaten:
It is funny, you know, which parts of the world people kind of pop up from.
(1h 19s) Oliver Weidlich:
Yeah, that’s great to see.
(1h 19s) Tim Chaten:
Yeah. Yeah, no, it’s awesome, yeah.
(1h 22s)
One of these days, I’ll have to get down over there. Yeah. Yeah.
(1h 26s) Oliver Weidlich:
Please do, absolutely.
(1h 28s) Tim Chaten:
So, we’re recording this at the very end of December here.
What questions do you want Apple to answer?
(1h 32s) Tim Chaten:
And we’re hearing firmer and firmer indications that February is the likely launch timeframe.
(1h 39s)
So, I suspect that we’re going to have an Apple event sometime between now and then.
(1h 47s)
What unanswered questions do you hope Apple addresses in that presentation?
(1h 54s) Oliver Weidlich:
I think Apple has only shown us, you know, a very small aspect of what this device can do,
(1h 1m)
and I think, you know, that launch will probably showcase a much broader range of things,
(1h 1m 4s)
particularly in the volumetric and potentially the fully immersive space.
(1h 1m 9s)
The questions I have are more, because I’ve had the opportunity to use it a couple of times,
(1h 1m 14s)
are less about the device and its capabilities. The questions I have are more about
(1h 1m 19s)
access, you know, to the rest of the world because it is U.S. first.
(1h 1m 24s)
And to really be almost beyond version one, right?
(1h 1m 26s)
I expect that that will be fairly locked by this stage.
(1h 1m 30s)
But it’s sort of more of the roadmap,
(1h 1m 31s)
which traditionally obviously Apple don’t talk about.
(1h 1m 34s)
So I’m excited about it.
(1h 1m 37s)
I’m very keen to get my hands on one and have it in the office.
(1h 1m 43s)
And I think, as I said before, it’s about how we can enable more ARKit functionality within that shared space,
(1h 1m 51s)
you know, and back to those other elements
(1h 1m 54s)
of, are they interested in this space? The elements of spatial computing just being around and being part of the environment, as opposed to a particular app or, you know, an in-depth experience, as we might frame it. So I think we’ll see signs of that, whether they have examples themselves or they’re showcasing third parties that they’ve, you know, identified and will be showing off. We’ll see. And price, obviously, for Australia. Yeah, we’ve got the joy of conversion to the Australian dollar, which is not very good.
(1h 2m 20s) Tim Chaten:
- Yeah, and I’ll be super curious, the price, yes.
(1h 2m 24s)
Starting at $3,499, what does it end at?
(1h 2m 28s)
Is it, is that…
(1h 2m 34s)
Yeah, I am so curious if there’ll be multiple storage SKUs and what that would look like.
(1h 2m 39s)
Like, am I gonna spend an extra thousand to get the max storage or whatnot?
(1h 2m 45s)
Yeah.
Anything else?
(1h 2m 47s) Tim Chaten:
Anything we haven’t covered on spatial computing,
(1h 2m 50s)
or other topics, before we wrap it up?
(1h 2m 52s) Oliver Weidlich:
I don’t think so. I think, you know, as I said, this is a really new platform.
(1h 2m 57s)
And one of the trickiest things with all of spatial computing and AR,
(1h 3m 3s)
and I think Apple will really, you know, be a leader in this space,
(1h 3m 11s)
is getting people to understand the opportunities, right?
(1h 3m 14s)
Like we went to the Augmented World Expo in Santa Clara in June,
(1h 3m 19s)
the week before Apple…
(1h 3m 22s)
…launched the Apple Vision Pro. But I was really impressed with the whole industry…
(1h 3m 27s)
…sort of looking forward to Apple coming in. It wasn’t like, “Oh, it’s going to be shit” and…
(1h 3m 31s)
…you know, all that sort of stuff. They were really positive…
(1h 3m 31s) Tim Chaten:
[Chuckle]
(1h 3m 36s) Oliver Weidlich:
…because they knew that would bring a lot of attention. And one of the trickiest things is…
(1h 3m 41s)
…you need to get someone in a headset for them to truly understand this experience. It’s not like…
(1h 3m 46s)
…you can show them a mobile phone and they, you know, they can just hold it for a second.
(1h 3m 51s)
They have to put this thing on.
(1h 3m 52s)
Their eyes have to be calibrated.
(1h 3m 54s)
It’s not a simple onboarding experience, necessarily, compared to other types of devices.
(1h 4m)
So, you know, Apple’s retail stores, I think, are going to play a critical part in this.
(1h 4m 6s)
But also, us using it as designers and developers to understand that nuance in interaction design,
(1h 4m 13s)
so that we can really leverage the power of the platform.
(1h 4m 17s) Tim Chaten:
Yeah, it is. As you said, you need to wear it to actually understand. It’s so hard to describe.
(1h 4m 26s)
When I watch movies on PSVR2, it’s hard to describe, but it does feel like I’m in an IMAX kind of theater, versus just holding an iPad in front of my face real close. It’s way different, even though conceptually it shouldn’t be; it’s just the way it works.
(1h 4m 47s) Oliver Weidlich:
And we heard all the reporters the other day talking about the spatial photos and the spatial videos, right?
(1h 4m 52s)
Like, it’s a whole new area, right?
(1h 4m 52s) Tim Chaten:
Yeah, I’m so excited for that. I have a one-year-old daughter, and I’ve been doing a lot of spatial video captures of her doing adorable things. So I’m very excited to see what that’s like in a month or so. Yeah.
(1h 4m 54s) Oliver Weidlich:
Even the panoramas.
(1h 5m 1s)
Brilliant. Oh that’s perfect. Yeah, yeah, exactly, yeah.
(1h 5m 8s) Tim Chaten:
And yeah, I’m so glad they brought that to the phone as quickly as they did.
(1h 5m 13s)
Yeah, I’m curious why they didn’t do any photo capture; it’s just video.
(1h 5m 19s) Oliver Weidlich:
It’s a really interesting question, because one of the things with video is movement.
(1h 5m 25s)
With an iPhone, if you’re recording the spatial video, you’re moving; but when you’re watching it wearing the headset, you’re obviously not moving. So I’m kind of interested in that disconnect between the video moving while I’m not moving.
(1h 5m 37s)
Whereas with a photo, obviously, I thought it would have been the simpler place to start, not only technically,
(1h 5m 42s)
but also because, well, it’s just a one shot and it’s from that position. That’s, you know, simple.
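On the capture side, opting a camera session into spatial recording is a small amount of code. This is a minimal sketch assuming the spatial-video properties Apple added to AVFoundation in iOS 17.2; the session wiring (inputs, camera selection) is elided.

```swift
import AVFoundation

// A minimal sketch: attach a movie output to an already-configured session
// and enable spatial video capture where the device format supports it.
func makeSpatialVideoOutput(for device: AVCaptureDevice,
                            in session: AVCaptureSession) -> AVCaptureMovieFileOutput? {
    let output = AVCaptureMovieFileOutput()
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)

    // Spatial capture is only available on supporting formats
    // (e.g. the iPhone 15 Pro main camera held in landscape).
    guard device.activeFormat.isSpatialVideoCaptureSupported,
          output.isSpatialVideoCaptureSupported else { return nil }
    output.isSpatialVideoCaptureEnabled = true
    return output
}
```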
(1h 5m 47s) Tim Chaten:
Right. And you can maybe turn your head around it to kind of look at different perspectives that way.
(1h 5m 50s) Oliver Weidlich:
Exactly, exactly. And I think I saw something from Apple’s research teams on Gaussian splatting the other day. But, you know, it’s that whole thing of how can I get a full scan, almost. Whether it’s, you know, leveraging RoomPlan or something that they’ve got from an API perspective, but more detail than that, to model the whole environment,
(1h 6m 13s)
you know, to retain a memory in that sense. Yeah, I think there’s a lot of opportunity in that way.
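The RoomPlan baseline Oliver mentions looks roughly like the following Swift sketch: run a `RoomCaptureSession`, build the parametric model when the scan ends, and export it as USDZ. RoomPlan returns walls, openings, and recognized objects rather than the photoreal detail Gaussian splatting promises, which is the “more detail than that” gap he is pointing at. The `RoomScanner` wrapper is illustrative.

```swift
import Foundation
import RoomPlan

// A minimal sketch: scan a room, build the parametric model, export USDZ.
final class RoomScanner: RoomCaptureSessionDelegate {
    private let session = RoomCaptureSession()
    private let builder = RoomBuilder(options: [.beautifyObjects])

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    // Called when scanning stops; turn the raw capture data into a model.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData, error: Error?) {
        guard error == nil else { return }
        Task {
            let room = try await builder.capturedRoom(from: data)
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("room.usdz")
            try room.export(to: url, exportOption: .parametric)
        }
    }
}
```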
(1h 6m 18s) Tim Chaten:
Yeah, it’ll be very interesting. And I know people have been dissing on the spatial video capture as they translate it to, like,
(1h 6m 27s)
Meta Quest headsets, but I have a feeling the way Apple’s gonna implement it
(1h 6m 32s)
will take advantage of that 1080p to make it really shine, because the
(1h 6m 38s)
experiences we’ve been hearing about from the press are drastically different from those of people who’ve just, like, put it through a translator.
(1h 6m 45s) Oliver Weidlich:
Yeah, look, I think that sort of experimentation is fun and great that people do it, but it’s
(1h 6m 52s) Tim Chaten:
Well, anyways, this has been a really fun chat.
(1h 6m 55s)
I’ve enjoyed learning more about what you do and this whole space.
Where can people follow you online?
(1h 6m 59s) Tim Chaten:
And where can people find you online and the work your company does?
(1h 7m 4s) Oliver Weidlich:
So for 13 years, I think, we were called Mobile Experience, but now we’re called Contxtual. So it’s
(1h 7m 11s)
c-o-n-t-x-t-u
(1h 7m 15s)
dot al is our URL. And then you can scroll down and hit the spatial computing section, and you’ll see all the video prototypes that we’ve been working on and playing with.
(1h 7m 26s)
I’m on Mastodon as Oliver W
(1h 7m 30s)
I’m not on Twitter anymore, but as Oliver W you can probably still connect there.
(1h 7m 34s)
And LinkedIn, of course.
(1h 7m 35s) Tim Chaten:
Excellent. Yes, it was great. And yeah, do check out those prototypes. We did not have time to touch on them, but there are a lot of really cool ones, including some stuff like the walking directions, which I know Apple has disallowed for navigation safety reasons at this time,
(1h 7m 36s) Oliver Weidlich:
But thanks so much for having me on Tim.
(1h 7m 38s)
It’s been great, really appreciate it.
(1h 7m 52s)
Yep.
(1h 7m 55s) Tim Chaten:
but maybe in the future they’ll figure that out. Yeah. Well, thank you so much.
(1h 7m 56s) Oliver Weidlich:
Yeah, yeah.
(1h 7m 58s)
Exactly, exactly.
Made with Transcriptionist
