Episode 3 – visionOS and Platform Launches with Ken Case

Ken Case is the CEO of the Omni Group and has been there as a developer for the launch of many platforms, including the iPhone, iPad, and Mac OS X. In this episode we dive into Ken’s thoughts on visionOS, Apple Vision Pro, and headsets in general. We also reflect on some past platform launches and how that might relate to the future of visionOS.

YouTube Version of the Podcast

Links and Show Notes

Chapter Markers:

00:00:00: Opening
00:01:31: Support the Podcast
00:02:07: Ken Case
00:05:17: The Omni Group
00:07:52: New Platforms
00:09:41: Unlimited Space
00:11:00: Successful Platforms
00:12:46: Different Form Factors
00:13:38: Other Headsets
00:15:29: iPad or Bust
00:17:11: Constraints
00:20:21: Tabletop Virtual Controls
00:23:03: 3D Graphics
00:25:29: Windows in Different Rooms
00:28:09: VR only apps?
00:31:45: Working in Different Places?
00:32:41: Ornament Windows
00:33:50: Multitasking
00:35:19: The Simulator
00:36:58: Default Distance
00:39:40: A Different Kind of Headset
00:40:47: What are you most excited for?
00:42:02: Early iPhone Days
00:43:46: OmniFocus Now in Swift
00:44:47: The Mac in visionOS
00:46:35: What visionOS could replace
00:47:59: Anything else?
00:50:13: Eye Tracking
00:51:00: Accessories
00:51:48: The First Omni app for visionOS?
00:52:58: More info?
00:53:20: https://mastodon.social/@kcase
00:53:50: Closing

Transcript

Opening

(0s) Tim Chaten:

Welcome to Vision Pros, the show all about spatial computing, VisionOS, and getting work done on the Apple Vision Pro.

(10s)

I’m Tim Chaten, host of the show.

(15s) Ken Case:

Well, I guess I’m personally most excited about having a huge canvas to spread out and getting rid of these constraints of the physical screens that we’ve been around for so long.

(28s)

And thinking back on your earlier comment about, well, would you take this outside?

(32s)

And that’s actually a place where maybe you would do that more often.

(35s)

Like right now, if I go outside to work on something, maybe I’m bringing my laptop with me, but then I’ve got a small screen.

(45s)

So I’ve got these on, and maybe I really would love to be there, or I’m out at the park or wherever.

(51s)

And I’m enjoying the physical environment that I’m in, plus I’m enjoying being able to have this huge workspace even bigger than the large screen that I’m used to for my Mac.

(1m 6s) Tim Chaten:

Welcome back to another episode of Vision Pros. I’m thrilled to be joined by Ken Case of the Omni Group.

(1m 12s)

Ken has been through many, many platform launches and in this episode we dive into what he’s excited about by VisionOS and discuss more broadly this very special time of a new platform being born and what can be learned from some of the past new platforms that have emerged over the years. As a reminder, you can support this podcast over at patreon.com/I-M-O-N-E.

Support the Podcast

(1m 36s) Tim Chaten:

With that, here’s my interview with Ken. All the

(2m 6s)

things you need to know about VisionOS. Welcome to the podcast, Ken. So, for those that don’t know who you are, can you share a bit on your background and how long you’ve been a developer and whatnot?

Ken Case

(2m 8s) Ken Case:

Thank you.

(2m 18s)

Sure, you bet. Well, I got my start with computing, really, I guess, when the Apple II arrived at my school back in ’79, and started learning to program a little bit, and so on. Fast forward, I went to the University of Washington, took computer science classes there, and quickly fell in love with Unix computers.

(2m 47s) Tim Chaten:

Mm-hmm Yes

(2m 48s) Ken Case:

And while I was there, the Macintosh came out, and I fell in love with that as well, but I couldn’t afford it, so I kind of watched it from afar, and studied all of the Inside Macintosh books, and so on, but never actually programmed it, and I went back to programming the VAX, and so on, using these old, ancient 9-track tapes, and so on.

(3m) Tim Chaten:

[gulps]

(3m 2s)

[laughs]

(3m 10s) Ken Case:

But, fast forward a bit more, after using about 17 computing platforms over a few years, I was…

(3m 18s)

I was trying to write software, you know, as I wrote code, invested time in putting it into code, I wanted it to be available to as many people as possible, so I was trying to write for really cross-platform code.

(3m 28s) Tim Chaten:

Yeah. Mhm.

(3m 33s) Ken Case:

And then, I discovered the NeXT platform, and found how much more productive it made me, and I was like, well, I guess there are two ways for the time I invest in coding to be efficient.

(3m 43s)

One is for it to be available to as many people as possible and the other is to have

(3m 46s) Tim Chaten:

  • Yeah.

(3m 46s)

Right.

(3m 48s) Ken Case:

it be as efficient an environment as possible. And so I ended up going down that NeXT path and never looked back. So NeXT led me then to Mac OS X, to iPhone and iPad and Apple Watch and so on.

(3m 51s) Tim Chaten:

[laughs]

(3m 57s)

And NeXT Computer, Steve Jobs’ brainchild,

(4m 6s)

when he was in his period away from Apple,

(4m 9s)

how’d you kind of stumble upon NeXT?

(4m 10s) Ken Case:

Yeah, so I was working for the University of Washington at the time, and NeXT went through a few different periods where they were targeting different environments, trying to figure out what market they could actually stick in, and one of the early choices was higher ed.

(4m 11s) Tim Chaten:

Was that in an educational environment or?

(4m 16s)

Okay.

(4m 32s) Ken Case:

So they offered a bunch of NeXT computers for the lab there at the computer center,

(4m 40s)

and I saw what the team that had accepted them was doing with them, and they were treating them kind of like Macintoshes, and you had to bring in your own media and reset as soon as you left, and so on.

(4m 46s) Tim Chaten:

Yeah.

(4m 49s) Ken Case:

And I’m like, "But these are great Unix machines.

(4m 51s)

Let’s hook them up to the mainframe as an NFS server, and let’s give people their own student accounts on there, and really turn these into some great Unix workstations."

(5m)

So that was how I first started.

(5m 2s)

This was NeXTSTEP 0.8, I guess, and I started learning Objective-C.

(5m 5s) Tim Chaten:

  • Yeah.

(5m 5s)

Right.

(5m 9s)

[laughing]

(5m 11s) Ken Case:

Which I still use today, so it’s worked out, yeah.

(5m 11s) Tim Chaten:

Paid off.

(5m 14s)

Yeah, back then, yeah.

(5m 16s)

Yeah.

(5m 17s)

So when did the Omni group take form in all this?

The Omni Group

(5m 21s) Tim Chaten:

Mac OS was out for a few years at this point,

(5m 23s)

or I can forget when that all started.

(5m 26s) Ken Case:

Sure, yeah. So the Omni Group, we got together and we were already kind of working together around that university environment. This was back in ’89. But then we started consulting for Next,

(5m 41s)

working for them directly and helping them with some of their clients,

(5m 43s)

like the William Morris Talent Agency and so on. And then we formed as a company in ’92.

(5m 44s) Tim Chaten:

  • Yeah.

(5m 44s)

  • Yeah.

(5m 52s) Ken Case:

or that’s when I registered omnigroup.com, which is as good a date as any, I guess.

(5m 54s) Tim Chaten:

Yes, excellent.

(5m 57s)

And for those that aren’t familiar with your company,

(6m 1s)

productivity apps,

(6m 2s)

what are kind of the apps that you guys create?

(6m 4s) Ken Case:

Yeah, you bet. One of our touchstone inspirations is Steve Jobs’ quote from when he was at NeXT in the early years about how computers help us be mentally more efficient, that he saw them as a bicycle for the mind, much like bicycles made us much more efficient animals for moving, that computers made us much more efficient animals for thinking.

(6m 6s) Tim Chaten:

[heart beating]

(6m 8s)

[heart beating]

(6m 11s)

[laughs]

(6m 14s)

Mm-hmm.

(6m 24s)

Yeah.

(6m 26s)

Mm-hmm.

(6m 32s)

Yeah.

(6m 34s) Ken Case:

And so, yes, all of the products that we build are helping people be more productive in one way or another. And those products are OmniFocus, which is software for busy professionals and helps people kind of organize their lives. OmniPlan, which helps project managers manage

(6m 35s) Tim Chaten:

Mm-hmm.

(7m) Ken Case:

projects, which seems a little redundant.

(7m 1s) Tim Chaten:

[laughs]

(7m 4s) Ken Case:

You know, by working with Gantt charts, big Gantt charts and PERT charts and so on, the sort of work that my dad used to do when he worked for Boeing, when he was planning, helping schedule out the Saturn V or the Boeing 747, big Gantt charts. That’s the sort of stuff that OmniPlan is about. And then OmniGraffle is one of our most popular products and that’s diagramming software that we’ve had since the launch of Mac OS X.

(7m 15s) Tim Chaten:

Mm-hmm.

(7m 30s)

  • Yeah, excellent, yeah.

(7m 33s)

And OmniOutliner is,

(7m 35s)

it’s kind of, yeah, and OmniOutliner kind of gave birth to OmniFocus,

(7m 37s) Ken Case:

Yeah, absolutely. I shouldn’t leave out OmniOutliner. I certainly am using it right now. I have like 50 open windows.

(7m 40s) Tim Chaten:

which is a fun kind of thing, yeah.

(7m 42s)

[laughing]

(7m 48s)

Yeah, excellent.

New Platforms

(7m 53s) Tim Chaten:

Well, VisionOS, a new platform,

(7m 56s)

brand new platform and a new paradigm in computing,

(7m 59s)

And you’ve kind of experienced.

(8m)

A bunch of them. You went from command-line interfaces to GUIs with, you know, Apple and the Mac, and then in 2007 direct interface with touch screens on the iPhone and later the iPad and Apple Watch, and now eye tracking and hand gestures and spatial computing. It feels like that kind of moment again. Does it, yeah?

(8m 14s) Ken Case:

Absolutely, yeah. Yeah, it’s exciting to be here in this moment. I feel like it’s a moment that we’ve kind of been waiting for for a long time. I think there have been some important other steps along the way. I don’t want to minimize the value of having a computer in your pocket that you have with you all the time that’s connected to the internet, and another one on your wrist later that starts tracking, you know.

(8m 27s) Tim Chaten:

[laughs]

(8m 39s)

Yes. Yeah.

(8m 43s)

Yeah.

(8m 44s) Ken Case:

your pulse is going crazy or whatever. All of these things are also exciting in their own way,

(8m 52s)

But one of the things about this past decade of changes has been that the devices have mostly been giving us smaller and smaller workspaces, and that’s of course been really valuable in terms of being able to put it in your pocket and on your wrist and take it with you on the go and be connected to the internet. So I don’t want to minimize the value in that, of course,

(9m 1s) Tim Chaten:

Yeah.

(9m 11s) Ken Case:

But one of the really exciting things about

(9m 14s)

the Vision Pro is that, in a sense, this is now giving us a much huger workspace to work with. Suddenly the canvas for our apps to build content within is much, much bigger rather than going to a smaller and smaller screen.

(9m 20s) Tim Chaten:

Yes.

(9m 22s)

us.

(9m 30s) Ken Case:

So it feels very exciting to me.

(9m 34s)

Absolutely, yeah.

(9m 35s) Tim Chaten:

[laughs]

(9m 36s)

Yeah, and it’s anywhere you wanna be with it,

(9m 38s)

which is really cool.

(9m 40s)

Yeah, and one of the cool things I realized when I’m seeing how the developers kind of think about it is, on the Mac, you have these toolbox windows and auxiliary windows.

Unlimited Space

(9m 53s) Tim Chaten:

Apple calls these in VisionOS ornament windows.

(9m 57s)

And this is something I would love to have on iPad.

(9m 59s)

This is something that they’ve like refused to do on iPad pretty much.

(10m 1s) Ken Case:

[laughs]

(10m 2s) Tim Chaten:

It’s kind of, can’t have extra,

(10m 5s)

secondary windows on iPad,

(10m 6s)

but it’s basically the app running twice and it’s,

(10m 9s) Ken Case:

Right, yeah.

(10m 9s) Tim Chaten:

but this feels like a nice merger of the Mac where you have all these extra auxiliary windows with iPad.

(10m 15s)

And it’s, so you can have this huge workspace with all the good pop-up windows and extra goodies that kind of brings everything together in a sense.

(10m 27s)

[heavy breathing]

(10m 28s) Ken Case:

Yeah, I love what they’re doing with the design of this platform, that it’s both letting you stay connected to the environment that’s around you. That’s one of the big kind of new features,

(10m 37s) Tim Chaten:

Mm-hmm.

(10m 38s)

[chuckles]

(10m 41s) Ken Case:

I think, around this particular, around their spatial computing vision. And at the same time,

(10m 49s)

giving you all of the space to work with. And as you say, having, I mean, they’ve just done a beautiful job of the way they’ve…

(10m 55s) Tim Chaten:

  • Yeah, yeah.

(10m 58s) Ken Case:

connected all of the pieces here.

Successful Platforms

(11m) Tim Chaten:

  • So you’ve been around for a lot of different platforms and some that haven’t been successful, some that have.

(11m 5s)

Do you have a sense of like,

(11m 8s)

is it possible to really tell if this will be a successful platform?

(11m 11s)

It’s kind of, Apple’s in a different era right now where they’re a much bigger company than, you know,

(11m 16s)

they were back when the Apple Pippin was a gaming console that failed and then it’s a different kind of world.

(11m 20s) Ken Case:

Yeah, that’s an interesting question.

(11m 22s) Tim Chaten:

But yeah, what’s your sense of all that?

(11m 27s) Ken Case:

One of the platforms, as I guess I just alluded to, the NeXT platform, was never really a commercial success in and of itself, but it planted the seeds that led to the success of Mac OS X,

(11m 34s) Tim Chaten:

No.

(11m 40s) Ken Case:

the iPhone, the iPad, and the Apple Watch.

(11m 44s)

So all of that, the technology and the steps we were taking and what we learned along the

(11m 50s)

way, all still matter today.

(11m 50s) Tim Chaten:

Yeah.

(11m 53s) Ken Case:

And I think of this the same way that I don’t have a crystal ball that will tell me whether this will succeed or not, but I sure hope it does.

(12m) Tim Chaten:

[laughs]

(12m 2s)

Yeah.

(12m 4s) Ken Case:

And I want to invest my time in it, and I don’t think it will be a waste of time even if it takes a while to get here.

(12m 4s) Tim Chaten:

Yes.

(12m 12s)

Mm-hmm.

(12m 12s) Ken Case:

I don’t expect that this sort of change replaces the other environments right away, much like the iPhone and the iPad haven’t.

(12m 18s) Tim Chaten:

No, right.

(12m 20s) Ken Case:

replaced my Mac. There are different things that I use in different environments in different times and I think I’m going to make a lot of use of all of these things and the balance remains to be seen and how it feels. And the balance at the very beginning of the platform when it first launches might be very different from how it feels a decade down the road when the batteries are better and everything else right.

(12m 37s) Tim Chaten:

  • Mm-hmm.

(12m 38s)

Yes, yeah.

(12m 44s)

And it’s interesting.

Different Form Factors

(12m 46s) Tim Chaten:

Each OS has spurred a lot of different form factors.

(12m 48s)

Mac has desktops, all-in-one desktops, laptops.

(12m 52s)

iPhone’s spurred iPad, Apple Watch,

(12m 56s)

and many different form factors within those.

(12m 58s)

So this, you know, Vision OS,

(13m)

I’m sure it will spur many different form factors,

(13m 2s)

including pure augmented reality and, you know,

(13m 6s)

which is more stripped down from the–

(13m 7s)

there’s more vision OS, Apple Vision Pro,

(13m 10s)

which lets you do both VR and AR.

(13m 12s)

So I’ll be curious to see the different form factors that emerge from this in the coming decades.

(13m 17s) Ken Case:

Yeah, that’ll be fascinating to see, how it gets adapted to different scenarios. And of course, you know, we’ve had, in science fiction, we’ve had things like sunglasses that you put on and carry around in your pocket. We may be a few years off from that kind of vision, but when we get there.

(13m 24s) Tim Chaten:

Do you have any experience with other headsets, HoloLens, or is this a new world to you?

Other Headsets

(13m 43s) Ken Case:

Yeah, so I’ve been interested in, I guess, in 3D graphics and this whole space since.

(13m 46s) Tim Chaten:

Yeah.

(13m 49s) Ken Case:

I remember all those UNIX platforms I was talking about.

(13m 52s)

One of them was Silicon Graphics and the IRIS workstation that I had.

(13m 57s)

One of those that I was the admin for.

(14m)

I didn’t get to have it on my desk all the time, but after everybody else went home,

(14m 2s)

I got to play around on it and have live 3D stuff when that was very unusual.

(14m 10s)

We didn’t have any 3D hardware in any other.

(14m 11s) Tim Chaten:

[laughs]

(14m 13s) Ken Case:

I’ve been at the graphics lab at the University of Washington experimenting with VR from the very, very early days, around 1990, 1989, too.

(14m 27s)

Over the years, as things come out, I’m very interested in seeing what they can do and how they work and everything else.

(14m 36s)

So, yes, I’ve done multiple generations of device, pre-ordered the audio.

(14m 43s) Tim Chaten:

Mm-hmm. Yeah. Right. Yep. Yep.

(14m 43s) Ken Case:

I’ve done a lot of Oculus, but I don’t think I’ve got Facebook. I’m not sure anymore if they can be tracking everything.

(14m 49s)

I can see sending ads based on it.

(14m 53s)

And, of course, I’m here in Seattle where a lot of friends that I know have gone to work for Microsoft.

(15m 1s) Tim Chaten:

Yeah. Mm-hmm. Yeah. Yeah. Cool. Yeah. No, it’s a fun space. And I have a PSP or two. And I know how good that looks. And we’re doubling the resolution from 2K for I to 4K for i.

(15m 2s) Ken Case:

I haven’t spent a lot of time with the HoloLens. I have played around with it just a little bit with them.

(15m 7s)

They said, “Hey, yeah, check it out.”

(15m 21s) Tim Chaten:

So, it’s gonna be like… I’m excited to actually get my hands on and see what this is all about,

(15m 26s)

hardware-wise, yeah. Yeah, so we spoke back in 2010,

iPad or Bust

(15m 31s) Tim Chaten:

the very beginning of the iPad, you wrote this amazing “iPad or Bust” blog post, which, you know, it’s now 13 years later. Do you have any reflections back at the birth of the platform? And now it’s kind of, it’s maturing quite a bit. I’ve got an external monitor up with, you know, a bunch of windows, I’m talking to you, and it feels like a whole different platform at this point.

(15m 51s) Ken Case:

Yeah, I’m really glad to see how it’s matured over the years. I still use mine every day.

(15m 58s) Tim Chaten:

[laughs]

(15m 58s) Ken Case:

I’m using it right now for this conversation. One of them. I have several of different sizes that I use for different things. I really liked the way that Tim Cook put it a few years ago at the introduction of the iPad Pro, where he said it’s this transforming piece of glass, or a piece of glass that can transform to be whatever you want it to be. Something along those lines.

(16m) Tim Chaten:

Yep. [laughs]

(16m 2s)

Yep.

(16m 14s)

Yeah Mm-hmm. Yeah

(16m 21s) Ken Case:

And I think it’s a great tool for that, you know, when you want a very focused device.

(16m 29s)

I think it’s less, for me at least, I find it less good for multitasking than a Mac.

(16m 33s) Tim Chaten:

  • Mm-hmm, sure.

(16m 34s) Ken Case:

And especially now that Apple Silicon has made its way to the Mac, you know, for a while there, it was, you know, always this balance of,

(16m 35s) Tim Chaten:

Yes.

(16m 39s)

[laughs]

(16m 44s) Ken Case:

“Well, if I take the iPad, you know, the battery life’s going to be better, it’s a fast thing,” and so on versus…

(16m 49s) Tim Chaten:

I don’t, I burned my lap, yeah.

(16m 51s) Ken Case:

And now, you know, on the latest business trip, I brought my phone and I brought my laptop and I didn’t bring the iPad because I knew I wasn’t going to need it for that short period of time.

(16m 51s) Tim Chaten:

Yeah.

(16m 58s)

Right, sure, yeah.

(17m) Ken Case:

But I think, but yeah, it’s still a very important part of my daily computing experience, I guess.

(17m 8s) Tim Chaten:

  • Yeah, yeah.

Constraints

(17m 11s) Tim Chaten:

So software, it’s about constraints a lot of times.

(17m 15s)

Each platform kind of has their own constraints.

(17m 18s)

Apple Watch, very constrained,

(17m 20s)

very focused device to what you do there.

(17m 23s)

iPhone, very constrained as far as, you know,

(17m 25s)

screen space and dealing with, you know,

(17m 27s)

iPhone SE back in the day.

(17m 29s)

And the Mac has different constraints,

(17m 32s)

but still constraints.

(17m 33s)

What do you see as, you know,

(17m 36s)

the constraints within visionOS,

(17m 38s)

and kind of like the things that aren’t constrained,

(17m 42s)

such as like screen real estate.

(17m 44s)

You can have this like 50 inch app to work within.

(17m 47s)

And you have the constraint of,

(17m 50s)

does this person just have their hands and eyes and no keyboard or mouse?

(17m 54s)

And designing around being a pro tool for when you do have the keyboard and track pad to get, you know, serious, you know, input done versus being accessible and usable still with just your hands and eyes.

(18m 8s) Ken Case:

Yeah, that’s one of the exciting things, I think, anytime there’s a new platform that comes out is kind of thinking about, “Well, now what does computing look like given this whole different set of constraints that are going on?”

(18m 14s) Tim Chaten:

Yeah.

(18m 22s) Ken Case:

And I think you covered kind of the basic changes, right, that we have the much—in some ways, the screen is more limited in that it’s enclosed around your face, and so other people can’t see it as you’re working with other people nearby unless they have one too.

(18m 32s) Tim Chaten:

Mm-hmm, right. Yeah. Yes.

(18m 38s) Ken Case:

And you’re in a shared environment. But then it is so much more freeing to be able to put your windows all over the place. At least it is in the simulator. Of course, Apple tries to ensure that we don’t require

(18m 47s) Tim Chaten:

Mm-hmm Yes

(19m 8s) Ken Case:

devices that aren’t absolutely necessary on each new thing, right?

(19m 13s)

So, you know, like famously, the iPhone ditched the stylus and the little touch tiny keyboard that the Blackberry had.

(19m 18s) Tim Chaten:

Yes.

(19m 19s)

Yeah.

(19m 22s) Ken Case:

And like, you know, we’re just going to do everything with touch and that’s it.

(19m 23s) Tim Chaten:

Mm-hmm.

(19m 25s) Ken Case:

You don’t have to bring anything else.

(19m 26s)

You’ve just got your phone in your pocket and you can use it.

(19m 28s)

And with these glasses, you know, unlike, say, the Vive controllers that you hold in your hands, or the PlayStation controllers either.

(19m 37s) Tim Chaten:

Yeah.

(19m 38s)

Yeah.

(19m 38s) Ken Case:

It’s all done just with your gestures and doing motion tracking.

(19m 44s)

I think that’s pretty exciting to have that be so independent in that way,

(19m 53s) Tim Chaten:

Right.

(19m 53s) Ken Case:

so you don’t have to carry other things around and bring them with you.

(19m 54s) Tim Chaten:

Mm-hmm.

(19m 55s) Ken Case:

That makes it more powerful and more portable than everything else.

(19m 58s)

At the same time, I do think if I’m going to want to get a lot of work done,

(20m 5s)

I probably want a real keyboard, and, you know, it depends on my activity,

(20m 6s) Tim Chaten:

Yeah? Sure. Sure. Yes.

(20m 8s) Ken Case:

maybe sometimes I’m gonna want a game controller and play in the space or whatever else. So I’m just excited to see where it goes.

(20m 19s) Tim Chaten:

  • Yeah, yeah, one thing I’m thinking about is with all these, you know, ornament windows is placing like a control surface on like a table in front of you and having a customized,

Tabletop Virtual Controls

(20m 33s) Tim Chaten:

I don’t know, a customized, say, OmniGraffle,

(20m 37s)

like here are all my different shapes and I can mess with the shapes then throw them up on the main window.

(20m 40s) Ken Case:

Right.

(20m 42s) Tim Chaten:

Or if I’m a music composer,

(20m 45s)

I have a virtual full screen piano that I can, you know, play in front of me

(20m 50s)

and like all these, you know, virtual input.

(20m 52s)

I’m not sure how feasible that is in version one of the SDK,

(20m 59s)

if you’re able to do that kind of stuff yet,

(21m)

but that seems like an option down the road where you wouldn’t need hardware as much.

(21m 5s) Ken Case:

Yeah, I’m really curious about that myself. I feel like there’s opportunity there with the skeletal hand tracking that they do and so on, that in theory, you could, for example, maybe put a virtual keyboard on the desk in front of you or on your lap and then just tap at it and not have to have a real one with you to be able to type efficiently. But I don’t know how the,
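For reference, here is a rough sketch of what the skeletal hand tracking Ken mentions looks like at the API level, using visionOS’s ARKit types (ARKitSession, HandTrackingProvider). The virtual-keyboard hit-testing itself is left as a hypothetical comment, and hand tracking only delivers data while an app’s immersive space is open.

```swift
// Rough sketch: reading the index fingertip position with visionOS hand
// tracking. Requires an open ImmersiveSpace and the hand-tracking permission.
import ARKit
import simd

struct FingertipTracker {
    let session = ARKitSession()
    let hands = HandTrackingProvider()

    func run() async throws {
        try await session.run([hands])
        for await update in hands.anchorUpdates {
            let anchor = update.anchor
            guard anchor.isTracked,
                  let tip = anchor.handSkeleton?.joint(.indexFingerTip)
            else { continue }

            // World-space fingertip transform: anchor origin * joint offset.
            let transform = anchor.originFromAnchorTransform
                          * tip.anchorFromJointTransform
            let position = SIMD3<Float>(transform.columns.3.x,
                                        transform.columns.3.y,
                                        transform.columns.3.z)

            // Hypothetical: compare `position` against the plane of a virtual
            // keyboard drawn on the desk to decide whether a key was tapped.
            _ = position
        }
    }
}
```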

(21m 18s) Tim Chaten:

Wait.

(21m 19s)

Yeah.

(21m 24s)

Accuracy.

(21m 28s) Ken Case:

I mean, you’ll see how the accuracy is, as you say, in this first version.

(21m 29s) Tim Chaten:

Yeah.

(21m 32s) Ken Case:

but I do think it’s exciting to think.

(21m 35s)

And because the device itself is so portable,

(21m 39s)

you know, I’m seeing in the social media feeds of other developers that are working in the simulator and adapting their apps.

(21m 49s)

Like one of the examples is somebody’s working on their sheet music app.

(21m 53s)

And now you can sit down at the piano and have sheet music go as wide as you want and you don’t have to worry about turning, flipping pages and about trying to find a stand that will work.

(22m 1s) Tim Chaten:

  • Yeah, yeah.

(22m 3s) Ken Case:

that’ll work in the end.

(22m 5s)

constraints and so on.

(22m 6s) Tim Chaten:

It’d be kind of interesting if a music notation app could actually see a piano and translate that into their app, but I don’t think,

(22m 16s)

I think the privacy thing locks out the camera entirely,

(22m 19s)

so it’d have to be a digital overlay of a real piano ’cause they can’t see the camera really.

(22m 25s)

  • Yeah.

(22m 26s) Ken Case:

Yeah, I don’t know. Maybe you could place things on each of the keys, though. So, yeah, as you say, you sort of have your own algorithm: it’s like, all right, here’s my keyboard, I’m gonna put it right there on the real keyboard now.

(22m 31s) Tim Chaten:

Right. Yes.

(22m 33s)

[Laughter]

(22m 39s)

Yeah. Yeah, it’d be fun with, like, keyboard shortcuts, being able to program in, like, a digital touch pad of, like, you have all your hotkeys, just big buttons on your desk that you can just hit as a thing.

(22m 50s) Ken Case:

Right. Yeah. Right. A touch bar in multiple dimensions.

(22m 51s) Tim Chaten:

Yeah. I was – the idea of the touch bar finally taking off of this huge touch bar, that’s actually useful, yeah.

(22m 59s)

[laughs]

(23m 1s)

. Yes. Yeah. So, something I’m curious about is we’ve only had to deal with like 2D graphics,

3D Graphics

(23m 10s) Tim Chaten:

really. Or at least, you know, no 3D, true 3D as far as, you know, yes. So, this shift into being able to work with 3D assets and applications, how do you see that going? And do you see any potential uses within Omni apps to make–

(23m 16s) Ken Case:

At least as a productivity developer again.

(23m 18s)

[laughs]

(23m 31s) Tim Chaten:

make use of that, like OmniGraffle and 3D assets,

(23m 33s)

and how does that translate when you’re trying to share a single document across different platforms and things like that.

(23m 38s) Ken Case:

Right. Well, I think it’s important to be able to have those sorts of options. Like, you could imagine maybe one of the stencil sets in OmniGraffle is now a chess set. You’re moving stuff around and you’re showing positions and it really looks like you’ve got a board. In that case, presumably you still want to be able to share all that content with people using OmniGraffle somewhere else. And I think you would just do a visualization and say, "Okay, well, what’s the angle that you

(23m 49s) Tim Chaten:

Yeah, right.

(24m 8s) Ken Case:

sent this at?" Or maybe let people rotate it in other contexts as well. That said, I don’t know whether that’s the right thing to do with OmniGraffle. I look forward to experimenting and kind of figuring out, “Okay, what makes sense? What doesn’t make sense? Is that really the kind of content people want to build, or do they want to use 3D in other ways?” So it still may be 2D content, but the 2D content is now being, well, of course, first of all, unconstrained by the way that you’re screaming but

(24m 20s) Tim Chaten:

  • Mm-hmm.

(24m 21s)

  • Yeah.

(24m 22s)

  • Mm-hmm.

(24m 38s) Ken Case:

also, you know, like maybe one of the requests we’ve had, people have asked about having layers in OmniGraffle that go off at angles, for example, right? And you can imagine saying, “Okay, I’m going to put this here and this here and this other thing there,” as you slide things around there at different angles and use it that way. Or sometimes people do want 3D assets, not for 3D visualization, but just to have some really great assets that look like a hard drive or whatever.

(24m 45s) Tim Chaten:

Right. Yeah.

(24m 57s)

[footsteps]

(24m 59s)

Right. Yeah. [laughs]

(25m 8s) Ken Case:

So they might pick an angle and then they might turn it in 3D for real, but then say, “Okay, now flatten it down. Now it’s just the flat image that goes into the poster they’re making,” or whatever.

(25m 16s) Tim Chaten:

Yeah.

(25m 17s)

Yeah.

(25m 18s)

Yeah.

(25m 19s)

Seeing layers is a cool idea because we deal with layers and we’re hiding layers and adding them back in.

(25m 25s)

It would be cool to actually just see them in a 3D space, yeah.

(25m 26s) Ken Case:

Right.

(25m 27s)

[laughs]

Windows in Different Rooms

(25m 31s) Tim Chaten:

So spatial computing, is the term Apple’s been using.

(25m 36s)

What do you see the potential of, you know, leave, you know, OmniFocus?

(25m 40s)

You leave one OmniFocus window at your work desk and you’re at home, you have different OmniFocus windows and it knows…

(25m 46s)

if room scanning remembers your different rooms, and do you see that as being something pretty powerful?

(25m 49s) Ken Case:

I think so. It remains to be seen where people are going to want to wear these things. I could imagine walking into where you do the laundry and seeing your list of things related to laundry that you need to get done. You walk into the kitchen and you see your shopping list or your menu or whatever. All of those windows, the way it works in the simulator.

(25m 59s) Tim Chaten:

Yes, right.

(26m 19s) Ken Case:

You can just drag them wherever you want, leave them there, and come back later and they’ll still be there. I certainly see a lot of power and flexibility there. But again,

(26m 30s)

I don’t know until we get a chance to actually live with these things. Is this something that I’m going to walk around the house with really? Or is it something that I’ll mostly use in a few specific locations and I’ll put it on there? Then when I’m done, I’m done.

(26m 31s) Tim Chaten:

Right.

(26m 41s)

  • Right, yeah.

(26m 43s)

Your office, right, your office or for entertainment.

(26m 46s)

Right, yeah.

(26m 47s) Ken Case:

  • That’s pretty interesting, yeah.

(26m 48s) Tim Chaten:

Yeah.

(26m 49s)

And something I see about this device is that it’s very much pitched as augmented reality, right?

(26m 54s)

And that seems to be its primary.

(26m 57s)

You’re still in the world and people can interact with you and come into your world and all that.

(27m 2s) Ken Case:

Right. It’s not being cut off from the world that is actually around you, generally,

(27m 3s) Tim Chaten:

Yes.

(27m 8s) Ken Case:

except when you turn that dial.

(27m 11s) Tim Chaten:

You don’t have to, like in PSVR,

(27m 13s)

the way I make sure I’m aware when people enter the room is I have, you’re able to add the mic of the headset mixed into the sound of your headphones so you can actually hear when doors open behind you and people are talking.

(27m 26s) Ken Case:

Right.

(27m 27s) Tim Chaten:

So that’s been helpful.

(27m 29s)

But outside of that, I’d be completely startled if someone came up and tapped me on the shoulder, you know?

(27m 33s) Ken Case:

I’ve experienced that before, yes.

(27m 35s) Tim Chaten:

Yeah.

(27m 35s)

Yeah.

(27m 39s)

So it’ll be nice.

(27m 40s) Ken Case:

Or a cat, sometimes.

(27m 41s) Tim Chaten:

Yeah, yeah.

(27m 42s)

I do wonder that ability of when someone enters your area,

(27m 47s)

if animals like a dog would enter your,

(27m 51s)

or is that only detecting people that would set that off?

(27m 55s)

Yeah.

(27m 55s)

Yeah.

(27m 56s) Ken Case:

I don’t know. My sense was that it’s just proximity things in general. If something’s moving and it’s near you, you can see it. Right.

(28m 2s) Tim Chaten:

Moving, yes.

(28m 3s)

If a ball is being thrown at you,

(28m 6s)

that’ll enter your, catch, yes.

VR only apps?

(28m 9s) Tim Chaten:

Yeah.

(28m 11s)

Something I’m curious about,

(28m 13s)

do you know, as far as developing an app,

(28m 17s)

like with OmniFocus, could you make a pure VR version,

(28m 21s)

a VR version of OmniFocus and you have these different environments you’re working within OmniFocus or whatnot,

(28m 28s)

and then twist that dial to scale back, you know,

(28m 33s)

the environment you created,

(28m 35s)

which is all about OmniFocus to something more suitable for multitasking and…

(28m 41s)

being in AR mode.

(28m 42s)

Can a single app do that, or?

(28m 45s)

Okay, yeah.

(28m 46s) Ken Case:

I don’t know the answer to that question. That’s a good question. I don’t know what access we have as app developers. I know we can create a completely immersive environment and that environment still doesn’t really block out the other world. If something, as you say,

(28m 48s) Tim Chaten:

  • Yeah.

(28m 56s)

Mm-hmm. Right.

(28m 58s)

Yes.

(29m 3s) Ken Case:

the ball’s coming at you, it’ll open a hole in your environment and show you, hey, here’s reality that you might want to pay attention to, or somebody who just walks over and says hello.

(29m 6s) Tim Chaten:

Yeah. Right.

(29m 12s)

Yeah.

(29m 16s) Ken Case:

But what I don’t know is whether we get to choose how much that environment is on or off the way that it’s built in. Of course, the system has this already built in with its own environments: you can go be in Mount Hood or whatever and here’s this great environment. You turn the dial, and that seems like a fine place to go do my OmniFocus review.
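For readers wondering what the app-side API surface looks like here: visionOS lets an app declare an ImmersiveSpace scene alongside its ordinary windows and choose among mixed, progressive, and full immersion styles. A minimal sketch follows; the app name, scene id, and views are invented for illustration, not Omni’s code.

```swift
import SwiftUI

@main
struct FocusSketchApp: App {
    // The current immersion style; .mixed keeps passthrough visible,
    // .full replaces the surroundings entirely.
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        // An optional immersive scene the app can open on demand.
        ImmersiveSpace(id: "reviewSpace") {
            ReviewEnvironmentView()
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter review mode") {
            Task { _ = await openImmersiveSpace(id: "reviewSpace") }
        }
    }
}

// Placeholder for whatever the immersive scene would actually show.
struct ReviewEnvironmentView: View {
    var body: some View { EmptyView() }
}
```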

(29m 29s) Tim Chaten:

Right.

(29m 37s)

Yeah, right.

(29m 44s) Ken Case:

I don’t know that we need to have a-

(29m 46s)

to build our own VR environment.

(29m 48s)

When Apple’s doing some great environments,

(29m 50s) Tim Chaten:

Right Right

(29m 50s) Ken Case:

maybe we’ll open that up for, you know,

(29m 52s)

I hope that they open that up

(29m 54s)

for third parties to do, like, just environments that maybe any app could live in.

(30m)

And so it’s not something that each app would have its own environment,

(30m 4s)

but it’s something where you can kind of pick and choose environments.

(30m 7s) Tim Chaten:

  • Yeah, right.

(30m 8s) Ken Case:

Maybe not for version one, though. I don’t know.

(30m 10s)

[laughs]

(30m 10s) Tim Chaten:

For version one, it seems to make most sense as at least a starting point of doing the more augmented reality focus where it’s the floating windows versus the more volumetric stuff with your apps just based on the need of multitasking with them and whatnot.

(30m 12s) Ken Case:

Down the road.

(30m 30s)

I think so, and I have this sense that that is more what Apple’s emphasis is as well, right?

(30m 30s) Tim Chaten:

  • Mm-hmm.

(30m 35s)

Mm-hmm.

(30m 36s) Ken Case:

That their vision for this product is not… I mean, they never said the word “VR” or “virtual reality” at all in their presentation.

(30m 37s) Tim Chaten:

[laughs] No?

(30m 43s) Ken Case:

That what they care about is not replacing the world around you, but augmenting it and making you more efficient in it.

(30m 51s) Tim Chaten:

Yeah. Yeah. Yeah.

(30m 53s) Ken Case:

Or maybe having some more enjoyable experiences with it.

(31m)

That starts to tread that line. There’s a line there: okay, at what point does this become virtual?

(31m 6s)

When you do turn on those environments, so you’re blocking out the plane in that example in the keynote.

(31m 9s) Tim Chaten:

[chuckles]

(31m 11s)

Right.

(31m 12s) Ken Case:

And you’re just watching the movie or whatever.

(31m 15s)

There are times where an immersive environment totally makes sense, of course.

(31m 15s) Tim Chaten:

Totally, yeah.

(31m 19s) Ken Case:

out [laughter]

(31m 20s) Tim Chaten:

  • And for certain, if you have a really tiny office,

(31m 23s)

you’re probably gonna need to do at least frontal VR to get the space and distance faked to you to have enough room to work in.

(31m 32s)

In my room I’m currently at,

(31m 34s)

I’d probably just wanna turn my chair around and have a little table in front of me for a keyboard

(31m 40s)

and use all the space behind me rather than being so close to the wall where I have my computer.

(31m 43s) Ken Case:

Right, that makes sense.

(31m 44s) Tim Chaten:

Yeah.

Working in Different Places?

(31m 45s) Tim Chaten:

Yeah, so it’ll be interesting how workspaces shift.

(31m 47s)

And I mean, do you see yourself,

(31m 49s)

I could see myself working on the outdoor patio outside on spring or fall days.

(31m 54s)

Are the different places you could see yourself working that you wouldn’t otherwise with this kind of setup?

(32m 2s) Ken Case:

I think it will be more comfortable visually to work in other places, but I have intentionally set up a really nice office chair and so on.

(32m 9s) Tim Chaten:

Yes, right, yes, yeah, yeah.

(32m 11s) Ken Case:

And so it’s hard for me to imagine, well, let’s see, would it be truly more comfortable outside?

(32m 16s)

Well, the whole point is I’m kind of bringing my environment with me anyway. It doesn’t matter where I am, and I might as well be in the most comfortable chair.

(32m 23s)

So I guess we’ll see. I think there’s certainly opportunity there and times where maybe you can do it.

(32m 32s)

You want to be in a pre-calculation because you are trying to augment the world around you, not just withdraw from it.

(32m 34s) Tim Chaten:

Yeah.

Ornament Windows

(32m 41s) Tim Chaten:

How do you see developers taking advantage of these ornament windows alongside the main windows?

(32m 49s)

This kind of merger of Mac and iPad, in my mind, kind of…

(32m 52s)

It really feels that way to me.

(32m 55s) Ken Case:

Yeah, so I think we’re going to see a lot of ornament windows, for sure, that they’re a great way of being able to have sort of these utilities.

(32m 58s) Tim Chaten:

Yeah.

(33m 10s) Ken Case:

It’s almost like the widgets that we have on the iPad, in a sense, except that they’re much more focused.

(33m 10s) Tim Chaten:

Yes!

(33m 15s) Ken Case:

They’re usually associated with a particular main document window, and it’ll be interesting to just see what kind of control we have over that.

(33m 24s)

we have some level of control.

(33m 25s)

As we’ve been working in the simulator with OmniFocus, for example,

(33m 29s)

now the perspective bar goes along the left edge as an ornament window just kind of detached from the main window.
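As a concrete illustration of the ornament idea, here is a minimal SwiftUI sketch using visionOS’s .ornament modifier to hang a toolbar-like strip off a window’s leading edge; the views and “perspective” names are placeholders, not OmniFocus’s actual implementation.

```swift
import SwiftUI

// Sketch of a main window with a leading-edge ornament, in the spirit of the
// perspective bar Ken mentions. Views and perspective names are invented.
struct MainWindow: View {
    @State private var selection = "Inbox"
    private let perspectives = ["Inbox", "Projects", "Tags", "Forecast", "Review"]

    var body: some View {
        TaskListView(perspective: selection)
            .ornament(attachmentAnchor: .scene(.leading)) {
                // The ornament floats just outside the window's leading edge.
                VStack(spacing: 12) {
                    ForEach(perspectives, id: \.self) { name in
                        Button(name) { selection = name }
                    }
                }
                .padding()
                .glassBackgroundEffect()   // standard visionOS material
            }
    }
}

struct TaskListView: View {
    let perspective: String
    var body: some View {
        Text("Showing \(perspective)")
    }
}
```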

(33m 32s) Tim Chaten:

Yeah, do you have a sense of multitasking yet, you know, like drag stuff from one app to another, you know, you’re clicking and then basically using your eyes to tell it

(33m 35s) Ken Case:

And it’s all exciting to work with and see.

(33m 39s)

And I can’t wait to really work with it truly on a daily basis.

(33m 47s)

Yeah.

Multitasking

(34m 2s) Tim Chaten:

what to drag this to, I guess, like, how seamless do you think working across apps will be in this environment?

(34m 7s)

Yeah.

(34m 8s) Ken Case:

Right. I certainly hope so. Drag and drop has been such an important part of, or maybe I should just say app interoperability has been such an important part of the computing experience for decades now. When we didn’t have it on the phone for a few years, or between different apps, and then it came to the iPad first, that felt very limiting.

(34m 13s) Tim Chaten:

Yeah.

(34m 20s)

Mm-hmm, yeah, yeah.

(34m 38s) Ken Case:

Of course, this is an environment that I want to be exactly the opposite of that, but I have not explored it a lot. I really should set it up with two of our apps, OmniOutliner and OmniFocus, side by side. What happens when I drag a project from OmniFocus to OmniOutliner?

(34m 40s) Tim Chaten:

[laughs]

(34m 52s)

Yeah.

(34m 54s) Ken Case:

Do all the pasteboard types come across properly, as I would expect? Does that gesture even work the way I would expect? That’s where maybe it gets a little bit hard to know for sure how it works until we have the real hardware to play with.
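For context, cross-app drag and drop in SwiftUI goes through the Transferable protocol, which is roughly where the pasteboard-type question Ken raises lives. A small sketch, with a made-up task type standing in for a real model object:

```swift
import SwiftUI
import CoreTransferable
import UniformTypeIdentifiers

// A made-up item type; a real app would export richer pasteboard
// representations (rich text, custom UTTypes, file formats, ...).
struct TaskItem: Codable, Transferable {
    var title: String

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .json)
        ProxyRepresentation(exporting: \.title)   // plain-text fallback
    }
}

// Source side of the drag.
struct TaskRow: View {
    let task: TaskItem
    var body: some View {
        Text(task.title)
            .draggable(task)
    }
}

// Destination side: accepts drags whose pasteboard types decode as TaskItem.
struct OutlineDropArea: View {
    @State private var received: [TaskItem] = []
    var body: some View {
        List(received, id: \.title) { Text($0.title) }
            .dropDestination(for: TaskItem.self) { items, _ in
                received.append(contentsOf: items)
                return true
            }
    }
}
```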

(35m) Tim Chaten:

Right.

(35m 7s)

Yes, yeah.

(35m 8s) Ken Case:

Because trying to do that in the simulator, all right, maybe it works, but then how does it actually work when you’re doing that with your eyes and gestures?

(35m 8s) Tim Chaten:

[laughter]

(35m 17s)

  • Yeah.

(35m 18s)

How’s your experience with the simulator been so far?

The Simulator

(35m 22s) Tim Chaten:

Has it been, have there been a lot of roadblocks to getting the iPad versions of your apps up and running just in the rudimentary form so far?

(35m 30s)

Yeah.

(35m 30s) Ken Case:

No, no, it’s been a really great tool. I really appreciate that they have it there and that it’s so robust. It’s sometimes maybe a little too forgiving, and so, you know, it will support things that I don’t necessarily expect the real hardware to support well. Like, it’s really easy for a mouse pointer to stay in one point position, but I don’t think it’s gonna be so easy for an eye to stay that focused. So

(35m 53s) Tim Chaten:

Yes. No. Yeah.

(36m)

Mm-hmm.

(36m) Ken Case:

You really want to, you know, simulate the sort of micro motions that your eyes do as you’re looking at something, and understand that if you’re pointing at something you might accidentally trigger the neighboring element if it’s too close, and so you should spread those things out, right? Apple gives some good specific advice about that, like, you know, that things should be, well, I think it’s 60 points apart.

(36m 15s) Tim Chaten:

Yes. Mm-hmm. Mm-hmm. Yeah.

(36m 26s) Ken Case:

And then the point system is set up so that it scales the content

(36m 30s)

based on the distance of the window. So when a window is further away,

(36m 33s)

all of the content grows so that from the eye’s point of view,

(36m 38s)

they’re all still kind of the same distance apart,

(36m 40s)

which is an interesting thing to do.
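In SwiftUI terms, the guidance Ken is paraphrasing mostly comes down to generous, well-spaced targets plus the system hover effect; a small sketch, with the 60-point figure taken from Ken’s recollection rather than verified documentation:

```swift
import SwiftUI

// Sketch of eye-friendly controls: roomy targets, space between neighbors,
// and the system hover effect so gaze highlighting comes for free.
struct ToolbarStrip: View {
    let actions = ["Flag", "Defer", "Complete"]

    var body: some View {
        HStack(spacing: 16) {                        // keep neighbors apart
            ForEach(actions, id: \.self) { name in
                Button(name) { /* perform the action */ }
                    .frame(minWidth: 60, minHeight: 60)  // generous gaze target
                    .contentShape(Rectangle())
                    .hoverEffect(.highlight)         // system gaze feedback
            }
        }
        .padding()
    }
}
```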

(36m 43s) Tim Chaten:

Yes. Yeah. Yeah. And Apple seems to have a default distance in mind. And it seems like they let you put the windows close, but they’re not encouraging it for sure. Do you see ergonomic,

(36m 44s) Ken Case:

And it’s kind of hard to accurately position windows from my point of view in the simulator, but I imagine in reality, it’s going to be pretty easy to,

(36m 52s)

you know, grab something and then put it right there and then understand what happened and where it is.

Default Distance

(37m 13s) Tim Chaten:

reasons as to why you would probably not want an app that is focused on, say, putting a digital sheet of paper on your desk and you’re just staring down at it for a long period of time.

(37m 25s)

Is that something that people you think should stay away from just eye health wise probably until we have hardware that physically moves to have a different focal distance, you know?

(37m 35s) Ken Case:

Right, right. Well, and that’s kind of an interesting question, like, when you position a window near your face, is it really at a near focal distance, or is it?

(37m 48s)

You know because the glasses are the same distance no matter what right? I mean the goggles are and so is that?

(37m 51s) Tim Chaten:

It is, yes. Yeah.

(37m 55s) Ken Case:

distance basically constant, and it doesn’t matter how far away you theoretically put it or how close you theoretically put it, it’s always gonna be kind of the same?

(38m 2s) Tim Chaten:

  • Yeah, ’cause I know like prescription,

(38m 5s)

like you need them, even if you don’t need prescriptions for looking at stuff up close,

(38m 10s)

it’s a distance prescription that you need to be able to see with clearly within here,

(38m 16s)

even for close-up stuff.

(38m 17s)

So I have to imagine there is a focal distance in mind that it’s mimicking to your eyeball and brain.

(38m 23s) Ken Case:

Yeah.

(38m 24s) Tim Chaten:

And when it gets close,

(38m 24s) Ken Case:

[laughs]

(38m 25s) Tim Chaten:

if it gets close to your face virtually,

(38m 28s)

I don’t know, I’m not sure.

(38m 32s)

Next year, if that’s uncomfortable for a long time or not.

(38m 33s) Ken Case:

Yeah, I look forward to finding out.

(38m 35s) Tim Chaten:

Yeah, right, yeah, yeah.

(38m 36s) Ken Case:

We’ll follow up on that when we get to talk next.

(38m 39s)

Yeah.

(38m 40s) Tim Chaten:

Which is why I see potential for like,

(38m 43s)

surface and control interactions at your fingertips where you’re just kind of glancing down versus perhaps doing all your work down there,

(38m 51s)

but mainly focused in the distance a bit.

(38m 54s)

[laughs]

(38m 54s) Ken Case:

Yeah, I feel like I ought to know the answer to this, because it’s not like VR is new, and I have been using, you know, other VR devices over the years. But I feel like the work, or I shouldn’t say work, the activities that I’ve been doing in the other VR devices are basically not work, right? And so, you know, it might be exercise or whatever, but they’re not actually work. And I imagine with the Vision Pro that I will be doing a lot of work, and so it’s a very different activity and

(39m 11s) Tim Chaten:

Yes. [laughs]

(39m 24s) Ken Case:

different kinds of focus. And, you know, I don’t know, I’m gonna spend hours looking at windows in this space in the way that I never would do when I’m playing Beat Saber.

(39m 36s) Tim Chaten:

  • Right, yes.

(39m 37s)

Yeah, it’s kind of interesting ’cause, you know,

A Different Kind of Headset

(39m 42s) Tim Chaten:

before the keynote, I was not too excited ’cause,

(39m 46s)

oh, Apple doesn’t get gaming and why am I gonna care about this?

(39m 49s) Ken Case:

Right.

(39m 51s) Tim Chaten:

And I was right, they don’t get gaming and I’m still gonna game on my PSVR too.

(39m 56s) Ken Case:

Right.

(39m 57s) Tim Chaten:

But this is a new computing platform that’s gonna be,

(40m)

you know, a desktop replacement for many use cases,

(40m 3s)

I think.

(40m 4s)

Um…

(40m 5s) Ken Case:

Yeah, I mean, you would never get the sort of apps that I’m expecting to see on day one here on the other platforms. It’s just not designed that way. They’re not designed for reading a bunch of text, for example. The resolution isn’t there.

(40m 6s) Tim Chaten:

No. Yeah.

(40m 13s)

No. Yeah, I’ve… Yeah, there is, you know, some games where you need to read some text in VR and it’s…

(40m 26s)

That’s where it shows its weakness. Like, the… Yeah. I mean, the HDR stuff’s there on PlayStation.

(40m 27s) Ken Case:

[laughter]

(40m 29s)

Yeah.

(40m 34s) Tim Chaten:

You can look up at the sun and it kind of

(40m 36s)

blinds you a bit. Like, that stuff sells it.

(40m 38s) Ken Case:

Right.

(40m 39s) Tim Chaten:

But when you have to get like very detailed text,

(40m 42s)

I’m excited to see what Apple can do with their hardware there.

What are you most excited for?

(40m 47s) Tim Chaten:

Yeah. What are you personally most excited about for with VisionOS in this whole new platform?

(40m 55s) Ken Case:

Well, I guess I’m personally most excited about having a huge canvas to spread out and getting rid of these constraints of the physical screens that we’ve been around for so long. And thinking back on your earlier comment about, well, would you take this outside? And that’s actually a place where maybe you would do that more often. Like right now, if I go outside to work on something, maybe I’m bringing my laptop with me, but then I’ve got a small screen. But imagine if

(41m 25s)

I put these on, and maybe I really would love to be there, or I’m out at the park or wherever,

(41m 31s)

and I’m enjoying the physical environment that I’m in. Plus, I’m enjoying being able to have this huge workspace even bigger than the large screen that I’m used to for my Mac.

(41m 39s) Tim Chaten:

Yes.

(41m 41s)

Yeah. Yeah, exactly.

(41m 45s) Ken Case:

So that’s, I guess, the thing I’m personally most excited about. I think it’s important to remember

(41m 55s)

the potential of the iPhone. Even looking back, the first year was amazing, what it could do,

(41m 57s) Tim Chaten:

  • Yeah, mm-hmm.

Early iPhone Days

(42m 2s) Tim Chaten:

Yeah, I’m trying to remember,

(42m 2s) Ken Case:

but it was nothing compared to the second year once we had all the apps.

(42m 5s)

We did. Yeah, we would start up a web server from OmniFocus on the Mac that your phone could then connect to and do a little bit of stuff on it.

(42m 6s) Tim Chaten:

did you guys have like a web app in that first year to access some, yeah, yeah, that’s right.

(42m 20s)

Yes.

(42m 22s)

Yeah.

(42m 25s) Ken Case:

I was so glad when we were able to replace that with a real app.

(42m 27s) Tim Chaten:

[laughs]

(42m 30s)

We’re…

(42m 30s) Ken Case:

But for the first year, that was Apple’s story, right?

(42m 33s) Tim Chaten:

Yeah.

(42m 34s) Ken Case:

It was, “Oh, yeah, you can build web experiences for it.”

(42m 35s) Tim Chaten:

Yeah.

(42m 38s)

Were you jailbreaking in 2007, getting OmniFocus running before it was official?

(42m 38s) Ken Case:

No, I kind of followed what Craig Hockenberry was doing and so on from a slight distance.

(42m 50s) Tim Chaten:

  • Yeah.

(42m 50s)

[chuckles]

(42m 52s) Ken Case:

and was like, all right, that’s fascinating, what they were doing with it.

(42m 55s)

I had, like, Frenzic on really early, plus Twitterrific, I think, maybe, or some early iterations of it. But I always felt like, I don’t have time for that right now, I’m busy trying to get this OmniFocus stuff done, right? We were shipping, we had just, we were shipping version 1.0 that year, right? And so it was a busy time for us, much like right now I’m not playing with the, uh, the beta version of iOS 17 because I’ve got to stay focused on, all right, what’s

(42m 56s) Tim Chaten:

Mm-hmm, yeah.

(43m 6s)

[laughs]

(43m 8s)

Yeah, right, yes, yeah.

(43m 16s)

Very.

(43m 18s)

[laughs]

(43m 23s)

Yeah.

(43m 25s) Ken Case:

actually shipping that I need to have our software ready for.

(43m 28s)

Yeah.

(43m 28s) Tim Chaten:

Well, yeah, I’m very excited and it sounds like you are as well and yeah, we got another,

(43m 34s) Ken Case:

Yeah, a lot of challenges ahead, I think, that we have.

(43m 36s) Tim Chaten:

  • A little bit less than a year to wait on this stuff.

(43m 39s)

  • Yeah.

(43m 40s)

Uh, mm-hm.

(43m 45s) Ken Case:

Well, thank goodness we already did all this SwiftUI work that we started a few years ago, right?

OmniFocus Now in Swift

(43m 51s) Ken Case:

right, that OmniFocus is positioned, you know.

(43m 55s)

The reason that it works as well as it does right now in the simulator is because all of these controls are already, you know, native controls by virtue of being written in SwiftUI.

(44m 5s)

And so that all just translated over well.

(44m 5s) Tim Chaten:

  • Yeah, and that was OmniFocus 4.

(44m 7s) Ken Case:

Yeah, exactly.

(44m 7s) Tim Chaten:

That was one of the big things you guys undertook was SwiftUI entirely, right?

(44m 12s)

Yeah, yeah.

(44m 13s) Ken Case:

And so, you know, when I press on the, like on a row in the simulator, and then it does the hover effect for much like a long tap,

(44m 23s)

which I assume is going to be…

(44m 25s)

like, I gaze at it and I pinch and then hold.

(44m 27s) Tim Chaten:

Right, yes.

(44m 29s) Ken Case:

And then what happens in the simulator is the row raises up a little bit out of the window,

(44m 35s)

you know, comes toward you, and then the context menu pops up and it’s at the same level,

(44m 36s) Tim Chaten:

  • Uh-huh.

(44m 36s)

[laughing]

(44m 39s) Ken Case:

and it’s all beautiful, and it’s just like, okay, yeah, I can’t wait.
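The behavior Ken describes falls out of standard SwiftUI modifiers on visionOS; a minimal, hypothetical row to illustrate (again, not Omni’s code):

```swift
import SwiftUI

// A plain list row that, on visionOS, highlights under the user's gaze and
// lifts out of the window when a context menu is invoked via pinch-and-hold.
// Row content and actions are placeholders.
struct ActionRow: View {
    let title: String

    var body: some View {
        Text(title)
            .padding(.vertical, 8)
            .contentShape(Rectangle())
            .hoverEffect(.lift)          // gaze highlight / lift behavior
            .contextMenu {               // appears after pinch-and-hold
                Button("Complete") { /* mark complete */ }
                Button("Defer to Tomorrow") { /* reschedule */ }
                Button("Delete", role: .destructive) { /* remove */ }
            }
    }
}
```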

(44m 43s) Tim Chaten:

  • Yeah.

(44m 46s)

Yeah.

The Mac in visionOS

(44m 47s) Tim Chaten:

And yeah, I’m super curious with the Mac interaction.

(44m 50s)

I’m curious if that evolves into like right now just showing you like an external monitor of exactly your Mac screen,

(44m 56s)

but I’d love a future where you can just like pick what Mac app you want to run as its own instance in window and you have several Mac apps.

(45m 4s) Ken Case:

Right, pull some windows out, spread them out, not have them be constrained by some virtual screen.

(45m 6s) Tim Chaten:

  • Yeah.

(45m 7s)

Yes.

(45m 8s)

Yes, yeah.

(45m 10s)

‘Cause yeah, right now the Mac experience in VisionOS will be constrained by that one screen versus each Mac app being its own screen,

(45m 17s)

which is what I think we all want.

(45m 19s) Ken Case:

Yeah, agreed. Yeah, it’s like the difference between using screen sharing to a Mac versus…

(45m 19s) Tim Chaten:

Yeah.

(45m 22s)

Yeah.

(45m 28s) Ken Case:

Oh, back in the day, I guess I don’t know if this maybe still exists in some form like in the Linux world.

(45m 37s)

X Windows was set up so that every app could project to a different display than the computer it was running on.

(45m 45s)

and so each window was kind of being positioned on the other display.

(45m 49s)

Now that had its own other problems. It did make things like drag-and-drop really super challenging or impractical or whatever.

(45m 52s) Tim Chaten:

Right, yeah.

(45m 54s)

[laughs]

(45m 58s) Ken Case:

But I could imagine a world where I can have, like, I’m never going to have the horsepower, presumably in my headset, that I can have in my Mac Studio on my desk.

(46m 11s) Tim Chaten:

No. Yeah.

(46m 12s) Ken Case:

And so I want that power, I want to be able to tap into it, and I want the flexibility of the display that I’ve got on my headset.

(46m 13s) Tim Chaten:

Yes.

(46m 19s) Ken Case:

And then have them work together and be able to throw windows around.

(46m 23s)

And again, this is just the beginning of the platform, so maybe what I’m thinking about is going to be a few years off.

(46m 29s)

But I hope that somebody does it. If not Apple, then maybe a third party can build something that does that part of it.

What visionOS could replace

(46m 36s) Tim Chaten:

  • Yeah, and yeah, I do wonder, VisionOS, it feels like,

(46m 41s)

has potential to eventually replace a Macintosh for a lot of people, but as you said, the horsepower,

(46m 48s)

that’s one reason it’ll stick around.

(46m 50s)

How, do you see it taking a while, or if,

(46m 53s)

will Apple ever get to like full Xcode running on VisionOS versus just using the Mac version within VisionOS?

(47m)

Yeah.

(47m 1s) Ken Case:

I’ve wondered that too. I mean, I’ve wondered that for years for the iPad as well.

(47m 2s) Tim Chaten:

Yeah, ’cause if you’ll, yeah.

(47m 4s)

Right, yeah.

(47m 5s) Ken Case:

It mattered, especially before that Apple Silicon transition on the Mac side.

(47m 10s)

But I’m starting to think maybe that’s not the direction they’re headed.

(47m 17s)

That instead they’re headed towards putting Xcode in the cloud. Right.

(47m 20s) Tim Chaten:

Oh, right.

(47m 21s) Ken Case:

And so you, yes,

(47m 24s)

you can do your programming all from the headset and you don’t have to have a Mac that is with you. But it’s not that,

(47m 31s)

that you’re carrying it around with you and running Xcode locally.

(47m 34s)

It’s that now it’s just like how you interact with the web.

(47m 37s)

You’re interacting with Xcode somewhere else, which is fine.

(47m 39s) Tim Chaten:

Yeah. Right. Yes. Not a subscription… Well, yeah, I guess if the $99 a year, whatever it is, already covers it, that’d be great. But if it’s anything more than that,

(47m 42s) Ken Case:

I would like it if it could be my cloud, not just Apple’s cloud.

(47m 46s)

Yeah.

(47m 48s)

Right. Right.

(47m 54s) Tim Chaten:

that’s another… Yeah. Yeah. Yeah. Well, anything else before we wrap it up? Thank you so much.

Anything else?

(48m 4s) Ken Case:

Thank you. Yeah, I really appreciate it. I think I would just say, obviously, we’re really excited about doing this.

(48m 15s)

We still have OmniFocus 4 itself to ship for the other platforms. So I’m trying to…

(48m 16s) Tim Chaten:

Yes.

(48m 21s) Ken Case:

Fortunately, I don’t have the hardware yet, so that helps me temper my enthusiasm a little bit at the distraction level.

(48m 24s) Tim Chaten:

Yeah, the request process for developers to get their own headset from Apple is now available.

(48m 34s)

So you can get in that queue.

(48m 35s) Ken Case:

Yeah, we’re talking about some recent news of a potential distraction. But I already wrote my essay and submitted it, because yes, I am that interested. We have all of the compatibility updates across all our platforms for this fall as well. Of course, with iOS 17…

(48m 40s) Tim Chaten:

Yes.

(48m 43s)

Excellent.

(48m 46s)

Yes.

(48m 58s) Ken Case:

…iPadOS 17, and macOS 14, and then get all of that…

(48m 59s) Tim Chaten:

Yeah.

(49m)

Heh.

(49m 2s)

Yeah. Heh heh.

(49m 5s) Ken Case:

…apps ready for Apple Vision Pro. So there’s some challenge. We’ll approach it with some humility about…

(49m 7s) Tim Chaten:

Heh heh heh heh heh.

(49m 11s) Ken Case:

…what we’ll be able to accomplish. But I would love, of course, to have stuff there as soon as we can get it there. And one of the things that I find really exciting, I guess, about Apple doing this platform is their focus on the humanities. I sort of already alluded to it: being connected with the outside world is an important focus of theirs. But then also their focus on privacy…

(49m 26s) Tim Chaten:

[footsteps]

(49m 28s)

Mm-hmm. Yeah.

(49m 32s)

Yeah.

(49m 35s) Ken Case:

…as a fundamental human right. And so they’re not letting you do eye tracking,

(49m 37s) Tim Chaten:

Yeah.

(49m 40s) Ken Case:

and all those sorts of things. It helps me, I guess, place more trust in the platform.

(49m 48s) Tim Chaten:

Yes. Right, yeah, yeah. Yes.

(49m 48s) Ken Case:

If I actually want to be using this as a productivity tool, not just gaming on it,

(49m 54s)

then I feel like I can do that. And that’s something that we, of course, value strongly as well, you know, we have a page about it, omnigroup.com/privacy.

(50m 5s)

See what our stance is. We believe, as well, that privacy is an important fundamental human right.

Eye Tracking

(50m 13s) Tim Chaten:

Yeah, I do really hope gaming gets an entitlement in the future

(50m 18s)

for that, to enable eye tracking, because there are some games that make great use of that, where you’re looking at an object, you’re lifting up your hand, and, like… Yeah. Yes. Yeah, the Facebook app should not have my eye data. Yes, yes.

(50m 27s) Ken Case:

Yeah, I think there are absolutely applications where it makes sense, where as long as the user is consenting to it, knows what they’re getting into, and has decided, “I think it’s appropriate here,” and it’s not just taken for granted that, in general, you have it all the time.
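
As a side note on how that consent model looks in practice today: visionOS apps never receive raw gaze data. The system draws the look-at highlight outside the app’s process, and app code only runs when the user confirms with a pinch, which arrives as an ordinary tap. Here’s a minimal SwiftUI sketch of that interaction model; the view and its label are hypothetical, not taken from any shipping app.

```swift
import SwiftUI

// Minimal sketch of the interaction model described above. The app never
// sees where the user is looking; the system renders the gaze highlight
// itself and only delivers the confirmed selection.
struct CompleteTaskButton: View {
    var body: some View {
        Button("Complete Task") {
            // Only the look-and-pinch confirmation reaches app code,
            // delivered as a normal tap with no eye-tracking data attached.
            print("Task marked complete")
        }
        // Opt into the system-drawn hover highlight; it is composited
        // outside the app's process, so the app can't observe it.
        .hoverEffect(.highlight)
    }
}
```

A game that genuinely needs gaze as a live input, the way Tim describes, would need something beyond this model, which is where a consent-gated entitlement could come in.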

(50m 46s) Tim Chaten:

Yeah

(50m 49s) Ken Case:

I don’t know how some of these other things…

(50m 53s)

Maybe you’ll use other tools for some of those things, like we used the stylus on the iPhone.

(50m 55s) Tim Chaten:

Right, yeah.

(50m 57s) Ken Case:

I don’t use the iPad, sometimes, but…

(50m 58s) Tim Chaten:

Yeah, and I will be curious if there’ll be physical tools we use with this one day, like a stylus even,

Accessories

(51m 7s) Tim Chaten:

for, like, interacting on the table in front of you. You know, to someone looking at you,

(51m 12s)

you look like a crazy person, but, you know,

(51m 14s) Ken Case:

Right.

(51m 15s) Tim Chaten:

with knobs that we turn and, like, different… yeah.

(51m 17s)

I wonder what the hardware accessory ecosystem will look like in 10 years for this thing.

(51m 22s) Ken Case:

Yeah, absolutely. I would think that you’d want to be able to have your art app and not be constrained by the size of a screen anymore. Artists’ easels are typically bigger than most people’s screens, and they’re drawing across all this stuff. Wouldn’t it be great if you could do that in virtual space, in spatial computing reality? Yeah, infinite spatial canvas. I think that’s the word I’m going to use.

(51m 30s) Tim Chaten:

Right. Yes. Yeah. Yeah.

(51m 36s)

Spatial computing, yes.

(51m 42s)

Do you have a sense yet of which Omni app would be the priority for day one, if you could only ship one on visionOS next year?

The First Omni app for visionOS?

(51m 53s) Ken Case:

So, there’s priority and there’s what’s closest to being ready.

(52m) Tim Chaten:

Ah, yes. (laughing)

(52m) Ken Case:

Because OmniFocus 4 just went through this whole rewrite into SwiftUI, I think it’s the one that’s most likely to be there on day one. Because it’s already in SwiftUI, exactly.
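
To make the “already in SwiftUI” point concrete: the same SwiftUI view can compile for iOS, iPadOS, macOS, and visionOS, with platform-specific touches added behind a compile-time check. The sketch below is purely illustrative; the view, task names, and ornament are hypothetical, not OmniFocus code.

```swift
import SwiftUI

// Hypothetical sketch: one SwiftUI view that builds for iOS, iPadOS,
// macOS, and visionOS, with a single visionOS-only addition.
struct TaskListView: View {
    let tasks = ["Write show notes", "Edit audio", "Publish episode"]

    var body: some View {
        #if os(visionOS)
        // visionOS-only touch: surface a quick action in an ornament
        // floating at the bottom edge of the window.
        taskList
            .ornament(attachmentAnchor: .scene(.bottom)) {
                Button("New Task") { /* hypothetical action */ }
                    .padding()
                    .glassBackgroundEffect()
            }
        #else
        taskList
        #endif
    }

    // The shared portion that is identical on every platform.
    private var taskList: some View {
        List(tasks, id: \.self) { task in
            Label(task, systemImage: "circle")
        }
        .navigationTitle("Inbox")
    }
}
```

The parts of an app still written against UIKit or AppKit don’t carry over this way, which is presumably why a codebase that has already moved to SwiftUI has the shortest path to day one.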

(52m 3s) Tim Chaten:

Yes.

(52m 5s)

Right, ’cause it’s already in Swift.

(52m 8s)

Yeah, yeah, yeah.

(52m 11s)

Very cool, yeah.

(52m 11s) Ken Case:

[laughs]

(52m 13s) Tim Chaten:

And it’ll be, yeah, it’ll be cool to see what that product looks like next year at some point to see this kind of, as we’ve been talking about it,

(52m 13s) Ken Case:

[cough]

(52m 19s) Tim Chaten:

this merger of, you know, iPad design language with the Mac,

(52m 22s)

you know, the extra windows and stuff.

(52m 24s)

It’ll be fun to kind of see that ’cause it’s kind of a merger of the two.

(52m 26s) Ken Case:

Yeah. Yeah, but I absolutely want, you know… The huge canvas is a big benefit to an app like OmniPlan or an app like OmniGraffle, right?

(52m 28s) Tim Chaten:

So it’ll be, it’s gonna be fun to see

(52m 38s) Ken Case:

Where you’re already working with a big canvas of stuff and you’re trying to build relationships between it, and if you can put it, you know, back on the wall, and then another one here that you’re editing, and invite other people to collaborate with you and so on… that this is all…

(52m 50s) Tim Chaten:

Yeah, yes, for sure.
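
For the mechanics behind spreading canvases around the room like this: a visionOS app can declare additional window scenes and open them on demand, and the user can then place each window wherever they like in space. The sketch below is a hypothetical illustration, not Omni’s code; the app, scene id, and views are made up.

```swift
import SwiftUI

// Hypothetical sketch of an app that opens each diagram as its own
// window, which the user can then place anywhere in the room on visionOS.
@main
struct DiagramApp: App {
    var body: some Scene {
        // The app's main window.
        WindowGroup {
            DiagramView(name: "Main Canvas")
        }

        // A second scene type: each distinct value becomes its own window.
        WindowGroup(id: "diagram", for: String.self) { nameBinding in
            DiagramView(name: nameBinding.wrappedValue ?? "Untitled")
        }
    }
}

struct DiagramView: View {
    @Environment(\.openWindow) private var openWindow
    let name: String

    var body: some View {
        VStack(spacing: 12) {
            Text(name).font(.title)
            Button("Open another diagram") {
                // Each distinct value spawns a separate window the user
                // can drag onto a wall, next to the first one, and so on.
                openWindow(id: "diagram", value: "Diagram \(UUID().uuidString.prefix(4))")
            }
        }
        .padding()
    }
}
```

The invite-others-to-collaborate piece Ken mentions would presumably layer something like SharePlay on top; the sketch only covers the multiple-independent-windows part.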

(52m 53s) Ken Case:

Yeah, there’s a lot to do over the next day

(52m 56s)

[laughs]

(52m 58s) Tim Chaten:

Well, thank you, Ken, I really appreciate your time.

More info?

(53m 2s) Tim Chaten:

Where can folks find more information about your awesome apps?

(53m 5s) Ken Case:

Oh, yeah, so we’re at omnigroup.com, and from there, you know, you can find links to our social media platforms and so on. Personally, I’m on Mastodon these days at kcase@mastodon.social,

(53m 16s) Tim Chaten:

  • Mm-hmm.

(53m 17s)

Excellent.

(53m 19s) Ken Case:

and of course, we have a bunch of Mastodon accounts for the Omni products at mastodon.omnigroup.com.

https://mastodon.social/@kcase

(53m 28s) Tim Chaten:

Thank you so much.

(53m 29s)

Really appreciate your time.

(53m 30s)

And yeah, I’d love to chat once we have Apple Vision Pros next year and your apps are,

(53m 36s)

you know, out the door, and yeah.

(53m 37s) Ken Case:

Yeah, absolutely. I’d love to do this on the other side and look back. Okay, what happened?

(53m 38s) Tim Chaten:

Yes.

(53m 43s) Ken Case:

And now that we really have it, let’s talk about now, where do we think things will go?

Closing

(53m 50s) Tim Chaten:

Well, that’s my discussion with Ken, all about visionOS.

(53m 53s)

Learn more about the Omni Group at omnigroup.com.

(53m 57s)

My thanks to Ken for his time recording,

(53m 58s)

and my thanks to you for your time and attention tuning in.

(54m 1s)

As a reminder, you can support this podcast over at patreon.com/ipadpros or by subscribing in Apple Podcasts.

(54m 10s)

My thanks to everyone that supports the podcast.

(54m 12s)

With that, I’ll talk to everyone again real soon.


Made with Transcriptionist

