WWDC 2019: Siri To Gain More Features, Apple Opens Up Taptic Engine, CoreML, UIKit Framework, And More

Apple blog 9to5Mac has shared a batch of information about what's coming in iOS 13 and macOS 10.15. Today, the site detailed more of the new features developers could get their hands on at WWDC 2019.


Firstly, developers will be able to employ new Siri features, including media playback, search, voice calling, message attachments, and train and flight inquiries, among others. At WWDC 2019, developers will also be able to integrate their UIKit apps with more Mac-specific features, including the Touch Bar, the menu bar, and keyboard shortcuts, using Marzipan's new APIs.
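None of these APIs have been announced yet, but a Siri media-playback handler would presumably follow SiriKit's existing intent-handling pattern. Here is a minimal sketch, assuming the rumored feature builds on the existing INPlayMediaIntent type; the handler shape and response code are assumptions, not confirmed API:

import Intents

// Minimal sketch of a Siri media-playback handler, assuming the rumored
// feature follows SiriKit's existing intent pattern (INPlayMediaIntent).
class MediaIntentHandler: NSObject, INPlayMediaIntentHandling {

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp asks the system to launch the app in the background
        // so it can start playing the requested media itself.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}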

Further, AR capabilities on Apple's platforms will see significant improvements, including a new AR app and a Swift-only framework that lets developers create AR experiences, while ARKit will gain the ability to detect human movement. For developers, the system will also reportedly support controllers with touchpads as well as stereo AR headsets.
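If human-movement detection does ship, it would presumably surface through ARKit's usual configuration-and-anchor pattern. The sketch below assumes hypothetical ARBodyTrackingConfiguration and ARBodyAnchor types, which have not been confirmed:

import ARKit
import UIKit

// Speculative sketch of human-movement detection in ARKit; the
// ARBodyTrackingConfiguration and ARBodyAnchor names are assumptions.
class BodyTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Read a joint transform from the tracked skeleton.
            if let head = bodyAnchor.skeleton.modelTransform(for: .head) {
                print("Head position:", head.columns.3)
            }
        }
    }
}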

Apple will also offer a new framework that gives developers more control over the Taptic Engine; currently, only a handful of predefined vibration patterns are available to third-party developers. At the same time, developers will be able to add link previews to their own apps, similar to the previews shown in iMessage.
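For context on what "more control over the Taptic Engine" could look like in practice, here is a minimal sketch of building and playing a single custom tap. The framework and type names (CHHapticEngine, CHHapticPattern, and so on) are assumptions; Apple has not announced them:

import CoreHaptics

// Speculative sketch of a richer haptics API: build a one-event pattern
// (a sharp, full-strength tap) and play it on the Taptic Engine.
func playCustomTap() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0)

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}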

NFC will also see improvements: third-party applications will be able to read any ISO 7816, FeliCa, or MIFARE tag. Apple will additionally introduce a new version of Core ML that lets developers update a machine learning model locally on the device. Currently, a model has to be trained in advance and cannot be updated after deployment; the new Core ML will allow developers to let their models learn from the user's own behavior.
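To illustrate what reading these tag types directly could look like, here is a sketch of a tag-reader session. The NFCTagReaderSession type and the tag cases shown are assumptions about how Core NFC might be extended, not confirmed API:

import CoreNFC

// Speculative sketch of reading ISO 7816, FeliCa, and MIFARE tags directly.
class TagReader: NSObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func begin() {
        // Poll for both ISO 14443 (ISO 7816 / MIFARE) and ISO 18092 (FeliCa) tags.
        session = NFCTagReaderSession(pollingOption: [.iso14443, .iso18092],
                                      delegate: self)
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

    func tagReaderSession(_ session: NFCTagReaderSession,
                          didInvalidateWithError error: Error) {}

    func tagReaderSession(_ session: NFCTagReaderSession,
                          didDetect tags: [NFCTag]) {
        guard let first = tags.first else { return }
        switch first {
        case .iso7816(let tag):
            print("ISO 7816 tag:", tag.identifier as NSData)
        case .feliCa(let tag):
            print("FeliCa tag:", tag.currentIDm as NSData)
        case .miFare(let tag):
            print("MIFARE tag:", tag.identifier as NSData)
        default:
            session.invalidate(errorMessage: "Unsupported tag type.")
        }
    }
}

As for Core ML, on-device updating would presumably mean handing the compiled model some new training data and writing the retrained model back to disk. A minimal sketch, assuming a hypothetical MLUpdateTask-style API:

import CoreML

// Speculative sketch of updating a compiled Core ML model on the device.
func updateModel(at modelURL: URL, with trainingData: MLBatchProvider) throws {
    let task = try MLUpdateTask(
        forModelAt: modelURL,
        trainingData: trainingData,
        configuration: nil,
        completionHandler: { context in
            // Persist the retrained model so future predictions use it.
            let updatedURL = modelURL.deletingLastPathComponent()
                .appendingPathComponent("UpdatedModel.mlmodelc")
            try? context.model.write(to: updatedURL)
        })
    task.resume()
}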

Last but not least, the document scanning feature from the iOS Notes app will be opened up to developers, and a new API will let apps read photos from external devices such as cameras or SD cards without having to go through the Photos app.
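A document-scanning API opened to third parties would most likely be a system-provided camera view controller that hands back the scanned pages. A minimal sketch, assuming a hypothetical VNDocumentCameraViewController and its delegate (names not confirmed):

import UIKit
import VisionKit

// Speculative sketch of presenting the system document scanner and
// collecting the scanned pages as images.
class ScannerViewController: UIViewController, VNDocumentCameraViewControllerDelegate {

    func presentScanner() {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        for pageIndex in 0..<scan.pageCount {
            let image = scan.imageOfPage(at: pageIndex)
            print("Scanned page \(pageIndex):", image.size)
        }
        controller.dismiss(animated: true)
    }
}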
