Apple’s Plans For Developer-Specific Changes In iOS 13 And macOS 10.15 Leaked

With WWDC 2019 fast approaching, we have started to see leaks about the upcoming iOS 13 and macOS 10.15 updates. Until now the leaks have focused on user-facing features coming in the next major release of Apple's operating systems, but that changes today with a new report from 9to5Mac revealing details about the developer-related announcements we can expect from the conference.

According to the report, Apple is planning to give developers access to new Siri intents such as search, media playback, event tracking, voice calling, message attachment, flight, airport gate, train trip, and seat information. App developers previously did not have access to these intents, but after WWDC 2019 they will be able to integrate them into their apps.
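If the new intents follow the existing SiriKit pattern, adopting one should come down to implementing an intent-handling protocol. The sketch below uses the media playback intent as an example; the INPlayMediaIntent types already exist in the Intents framework, but whether third-party apps get exactly this surface is an assumption based on the report, not something it confirms.

```swift
import Intents

// Hypothetical handler for the rumored media playback intent.
// The response code and handler shape are assumptions drawn from
// how existing SiriKit intents are adopted today.
class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand playback off to the app itself rather than handling it
        // entirely inside the extension.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```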

WWDC 2019 will also bring improvements related to macOS Marzipan, giving developers access to new APIs for integrating Mac-specific features such as the Touch Bar, the menu bar, and keyboard shortcuts into their iOS apps. The report also notes that UIKit apps on the Mac will be able to open multiple windows. Apple is making it easy for developers to offer their iOS apps on the Mac: according to the report, enabling Mac support for an iOS app will be as simple as checking a checkbox.
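The report does not say how multi-window support will be exposed. Purely as a speculative sketch, if it arrives as a scene-style API on UIApplication, asking the system for a second window might look roughly like this; the requestSceneSessionActivation call, the activity type, and the user-info keys here are all assumptions.

```swift
import UIKit

// Speculative sketch: request a second window for a document, assuming a
// scene-based multi-window API. Names are assumptions, not confirmed API.
func openNewWindow(for documentURL: URL) {
    let activity = NSUserActivity(activityType: "com.example.openDocument")
    activity.userInfo = ["url": documentURL.absoluteString]

    // Passing nil for the session asks the system to create a new one.
    UIApplication.shared.requestSceneSessionActivation(nil,
                                                       userActivity: activity,
                                                       options: nil,
                                                       errorHandler: { error in
        print("Could not open a new window: \(error)")
    })
}
```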

Apps that already support iPad’s Split View feature will support it on the Mac as well. Users will be able to resize the app window by dragging the divider, just as they can with native Mac apps.

ARKit is also receiving a major update this year and will gain the ability to detect human poses. AR-based games will gain support for controllers with touchpads and for stereo AR headsets. Apple is also introducing a brand new Swift-only framework for augmented reality, along with a companion app that will allow developers to visually create AR experiences.
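The report does not describe how pose detection will be exposed. Assuming it surfaces as a new ARKit session configuration and anchor type, a minimal sketch could look like the following; the ARBodyTrackingConfiguration and ARBodyAnchor names are assumptions relative to the leak.

```swift
import ARKit

// Speculative sketch of the rumored human pose detection, assuming a
// dedicated body-tracking configuration and a body anchor with a skeleton.
final class PoseTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // The skeleton exposes joint transforms relative to the body anchor.
            let root = body.skeleton.modelTransform(for: .root)
            print("Root joint transform: \(String(describing: root))")
        }
    }
}
```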

Apple is giving developers more control over the iPhone’s Taptic Engine, making it possible to use it for feedback within their apps. Developers will also be able to offer iMessage-like link previews in their apps. Third-party apps will additionally be able to use the iPhone’s NFC chip to read ISO 7816, FeliCa, or MiFare tags, in addition to the NDEF support that is already available.
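As a rough illustration of the NFC change, the sketch below assumes the expanded tag support arrives through Core NFC's tag reader session, with the polling options mapping to the reported tag families (ISO 7816 and MiFare over ISO 14443, FeliCa over ISO 18092). The report does not name the API, so treat these types and calls as assumptions.

```swift
import CoreNFC

// Hypothetical sketch of reading the additional tag types via a
// Core NFC tag reader session. Names and flow are assumptions.
final class TagReader: NSObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func begin() {
        session = NFCTagReaderSession(pollingOption: [.iso14443, .iso18092],
                                      delegate: self)
        session?.begin()
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let tag = tags.first else { return }
        session.connect(to: tag) { error in
            if let error = error {
                session.invalidate(errorMessage: error.localizedDescription)
                return
            }
            switch tag {
            case .miFare, .iso7816, .feliCa:
                session.alertMessage = "Tag detected."
                session.invalidate()
            default:
                session.invalidate(errorMessage: "Unsupported tag.")
            }
        }
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}
    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {}
}
```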

Apple is also going to offer the document scanning functionality of iOS to third-party apps. Currently, the functionality is only available in stock apps like Notes. A new API will allow third-party apps to capture photos from external devices such as cameras and SD cards without having to go through the Photos app.
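If the scanner is exposed the way other system UI is, adoption would likely mean presenting a drop-in camera view controller and receiving the scanned pages in a delegate callback. The VisionKit types used in this sketch are assumptions; the report only says the functionality is coming to third-party apps.

```swift
import UIKit
import VisionKit

// Speculative sketch of presenting a system document scanner and
// collecting the scanned pages. Type names are assumptions.
final class ScanViewController: UIViewController, VNDocumentCameraViewControllerDelegate {

    func startScan() {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // Each page comes back as a cropped, perspective-corrected image.
        for index in 0..<scan.pageCount {
            let page = scan.image(ofPage: index)
            print("Scanned page \(index): \(page.size)")
        }
        controller.dismiss(animated: true)
    }

    func documentCameraViewControllerDidCancel(_ controller: VNDocumentCameraViewController) {
        controller.dismiss(animated: true)
    }
}
```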

Machine learning is also getting a boost: the new version of Core ML will allow developers to update their machine learning models right on the device, as opposed to the current approach, in which models are pre-trained and remain static after deployment. With this capability, apps will be able to learn on the go and adapt to user actions. A new sound analysis API backed by machine learning will also be available to developers. The Vision framework will gain a built-in image classifier, eliminating the need for developers to bundle their own machine learning model just to classify images into common categories.
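To show what the built-in classifier could mean in practice, here is a minimal sketch assuming it surfaces as a standard Vision request that returns ranked classification observations. The VNClassifyImageRequest name is an assumption on our part; the report does not name the API.

```swift
import Vision

// Hypothetical use of a built-in Vision image classifier: classify an image
// on disk and return the top labels with their confidence scores.
func classifyImage(at url: URL) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])

    let observations = request.results as? [VNClassificationObservation] ?? []
    // Keep the five most confident labels.
    return observations.prefix(5).map { (label: $0.identifier, confidence: $0.confidence) }
}
```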
