Saturday, 4 May 2019

iOS 13: Developer rumors you'll actually care about

Siri intents, visual AR studio, more taptics, direct photo and scan capture, and more! It's an iOS 13 developer rumorpalooza!

Just when you think it's safe to hit publish on your iOS 13 rumor roundup, another full set of spoilers gets spilled. This one is a little different, though. A little nerdier. A little geekier. A little more from the developer side than the user side, though we can still suss out a lot of great user stuff from it.

Rather watch than read? Hit play on the video above!

New Siri Intents

Siri Shortcuts let you assign voice triggers to any simple action surfaced by an app, or to any workflow you create by linking those actions together. They can add a lot of functionality, but it's also very limited functionality. Siri Intents, on the other hand, well, those are the holy grail of voice control.

Intents are deep, robust ways for apps to surface functionality to Siri, ways that don't require specific triggers but can respond to a wide range of different sentence structures. For example: "Skype Lory," "make a Skype call to Lory," or "call Lory on Skype."

If Shortcuts are like a Siri day pass, Intents are much closer to first class Siri citizenship.

Apple announced the first, limited batch a few years ago, and a couple more have trickled out since, but nowhere near enough and nowhere near fast enough.

According to Gui Rambo of 9to5Mac, though, that's about to change, with new intents coming to iOS 13 for event ticketing, message attachments, trains, planes, and some big ones: search and media playback.

I'm not sure how either will manifest yet, but if Apple does media playback right, it'll be what pretty much all of us have been waiting years for: full-on Siri control for everything from Spotify to Netflix to Overcast and Audible. Basically, every third-party video and audio app would suddenly be as integrated with Siri as Apple's own TV, Music, and Podcasts apps.
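For the developer-curious, the closest thing shipping today is iOS 12's INPlayMediaIntent, which only works through Shortcuts. Here's a hedged guess at what handling it looks like, using only the existing API; the class name is mine, and nothing here is the rumored iOS 13 surface:

```swift
import Intents

// A sketch built on the existing iOS 12 API, not the rumored iOS 13 one.
final class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp asks the system to launch the app in the background
        // so it can start playback itself.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```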

Seriously, this alone just quadrupled, quintupled, maybe decupled my WWDC excitement.

Augmented Reality

Apple doesn't really see augmented reality — AR — as an app or even a feature. If you've listened to Tim Cook over the years, it's clear Apple sees AR as a core technology, one key to its future, and that, one day, having an AR view will be like having a display. See my previous video, link in the description.

That's why Apple's been so aggressive about iterating ARKit, their framework for AR. They've gone from relatively simple surfaces and ephemeral models to full-on, multi-person, persistent environments and Memoji-style face and expression tracking in a couple of years flat. They've even worked with Pixar on a new, portable, standardized file format, one where a director can hold a different opinion on where a virtual prop should go than the set designer who initially placed it. It's all shades of goofy cool.

According to Gui, Apple won't be slowing down with iOS 13 either, adding the ability to detect human poses — I'm guessing that means bipedal, sorry dog and cat friends… for now! Also, a new Swift-based framework and app that'll let you create AR experiences visually, and support for controllers with touchpads and… wait for it… stereo AR headsets.

No, that doesn't mean Apple has to announce its own AR glasses at WWDC. Just like with ARKit in general, Apple is using its existing devices, in the hands of hundreds of millions of people, and features and apps like Animoji and Measure to slowly get developers and customers alike familiar with AR, so they can understand and iterate as much as possible at the bit level before spitting out any atoms.

Clever Apple. But it does feel like we must be getting close, right?
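While we wait, the developer-facing shape of this is at least easy to picture: ARKit capabilities today get switched on through a session configuration, and my guess, and it is only a guess, is that pose detection would slot in as another option or anchor type. A minimal sketch using only APIs that already ship:

```swift
import ARKit

// Everything below is current (ARKit 2 era) API; none of the rumored features appear here.
func startSession(on view: ARSCNView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]   // surface detection, ARKit 1.5+
    config.environmentTexturing = .automatic           // reflections on virtual objects, ARKit 2
    view.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```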

Taptics

Confession: I'm overly smitten with the idea of tactile interfaces. Graphical user interfaces have been a thing since Xerox PARC, the original Lisa and Mac, and Windows. Voice interfaces have grown with Siri, Alexa, and Google Assistant. We've had things like the Taptic Engine in iPhones and Nintendo's Joy-Cons for a while, and everything from the really abstract head-shake feedback when you try to 3D Touch an icon without any options, to the feeling of ice shaking in your hand when playing a game, is just… beyond cool. But the technology also still feels like it's in its infancy.

According to Gui, though, Apple is working on maturing the technology with a new iOS 13 framework that'll give developers even more control over the Taptic Engine in modern iPhones.

I don't know what we'll see as a result of this. I don't know what I even want to see — I mean feel. But developers, game developers especially, have been pretty savvy about incorporating force feedback so far, so at this point, I just want to see it — feel it — dammit.
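For reference, here's roughly all the control developers get today: a handful of canned patterns through UIFeedbackGenerator. The rumored framework isn't public, so treat this as the baseline it would presumably go beyond, nothing more:

```swift
import UIKit

// The iOS 10-and-later baseline: fixed, pre-baked haptic patterns.
let impact = UIImpactFeedbackGenerator(style: .heavy)
impact.prepare()          // spins up the Taptic Engine to cut latency
impact.impactOccurred()   // the "thud" used for collisions and snaps

let notify = UINotificationFeedbackGenerator()
notify.notificationOccurred(.error)   // the head-shake-style failure buzz
```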

Direct Capture

Right now, if you want to get a photo off your camera or SD card and into an app, you have to go through Apple's Photos-based import feature. Even if you want to use it in Lightroom or whatever, you have to go through Photos. Likewise, if you want to scan a document, you have to go through Notes or an app that has its own built-in capture system.
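Here's that status quo in code, using the existing Photos framework: whatever the import process produced, apps have to fetch it back out of the shared library. The function name is just for illustration, and it needs photo library permission:

```swift
import Photos
import UIKit

// Grab the most recently added photo from the shared Photos library.
func latestImportedPhoto(completion: @escaping (UIImage?) -> Void) {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    options.fetchLimit = 1

    guard let asset = PHAsset.fetchAssets(with: .image, options: options).firstObject else {
        completion(nil)
        return
    }
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: nil) { image, _ in
        completion(image)
    }
}
```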

With iOS 13, though, Gui says Apple will be providing APIs — application programming interfaces — so that any app that implements them will be able to pull in photos and scan documents directly, with no Apple app intermediation needed.

If it works well, that should seriously speed up a lot of workflows and remove a lot of duplicate content across libraries. Hurrah.

Machine Learning

Apple is all-in on AI these days the way it was on silicon a decade ago, and we've all seen the results of that push. Now, with John Giannandrea as head of his own, ethically-focused org, the sky — instead of the Skynet — may be the limit.

How much of iOS 13 his team has had time to work on, I don't know, but Gui does mention a few new machine learning features coming our way.

CoreML will be updatable on-device, so apps will be able to learn and improve in real time. That'll help make sure the privacy benefits of not hoovering all our data to the cloud are matched by greater dynamism and, hopefully, even better results.
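For contrast, here's roughly what Core ML looks like right now: the model is compiled into the app and never changes afterward, so every prediction runs against the same frozen weights. The model and feature names below are hypothetical stand-ins:

```swift
import CoreML

// "Sentiment", "text", and "label" are placeholder names for a bundled, compiled model.
func predictLabel(for text: String) throws -> String? {
    guard let url = Bundle.main.url(forResource: "Sentiment", withExtension: "mlmodelc") else {
        return nil
    }
    let model = try MLModel(contentsOf: url)   // frozen at ship time today
    let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
    let output = try model.prediction(from: input)
    return output.featureValue(for: "label")?.stringValue
}
```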

Vision is getting a built-in image classifier, and a new API will add a sound analyzer. That'll save developers from having to roll — sorry, train — their own, and should mean more and better integration across a wider range of apps and features.
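And here's the roll-your-own path that would replace: Vision can already run a classifier, but only one you've trained or found and bundled yourself. "FlowerClassifier" is a hypothetical stand-in for that bundled model:

```swift
import Vision
import CoreML
import CoreGraphics

// Classify an image with a developer-supplied Core ML model, today's approach.
func classify(_ image: CGImage) throws -> [VNClassificationObservation] {
    guard let url = Bundle.main.url(forResource: "FlowerClassifier", withExtension: "mlmodelc") else {
        return []
    }
    let visionModel = try VNCoreMLModel(for: MLModel(contentsOf: url))
    let request = VNCoreMLRequest(model: visionModel)
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    return request.results as? [VNClassificationObservation] ?? []
}
```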

Mouse support

This one comes from Federico Viticci of MacStories fame, via his Connected podcast, and was echoed by Steve Troughton-Smith on Twitter.

https://twitter.com/stroughtonsmith/status/1120447708215554049?ref_src=twsrc%5Etfw

If you missed last week's @_connectedfm, @viticci had a pretty interesting scoop that he'd been sitting on re mouse support coming to iPad as an accessibility feature. As far as I'm aware, that *is* indeed in the works. I feel like every pro user will turn that on, day one

Why it's an accessibility feature, and whether that will impact anyone wanting to use it more generically, is impossible to say. But, you know, even Steve Jobs relented and let arrow keys and command lines onto the Mac, so letting people fall back on mice, trackpads, and pointers almost a decade post-iPad doesn't seem like a bad thing. Especially since the keyboard made its official comeback a few years ago already.

Basically, anything that lets us keep our eyes and hands in context is a huge win for productivity.

And More

Gui had a few more spoilers tucked away in his report. NFC is getting broader read support, though it still doesn't sound like there'll be any write support. Link previews, iMessage-style, will be available as a framework for any app. And there's likely still a bunch of stuff that hasn't been spoiled yet.
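For comparison, NFC reading today is limited to NDEF sessions via Core NFC, needs an entitlement, and only works on recent iPhones. The rumor is about reading more kinds of tags, not about this exact API, so take the sketch as the current baseline only:

```swift
import CoreNFC

// Today's NDEF-only read session (iOS 11 and later).
final class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    private var session: NFCNDEFReaderSession?

    func begin() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for record in messages.flatMap({ $0.records }) {
            print("Payload:", String(data: record.payload, encoding: .utf8) ?? "<binary>")
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        print("Session ended:", error.localizedDescription)
    }
}
```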

We'll only know for sure when Tim Cook hands off to Craig Federighi for the big announcement on June 3 at WWDC 2019 in San Jose. And yeah, I'll be there live to bring you back all the action as it happens.



from iMore - The #1 iPhone, iPad, and iPod touch blog http://bit.ly/2Wt1ULq
