Apple announced its latest iPhones during its September 12 event at the Steve Jobs Theater. The biggest news was the introduction of the iPhone X and its Face ID technology. As expected, consumers, analysts, and Apple fans alike are excited.
From a developer perspective, however, these updates mean increased complexity for native and hybrid iOS apps. The new OS (iOS 11) and the new iPhones introduce:
- Additional Form Factor: While the iPhone 8 and 8 Plus keep the same form factors as the iPhone 7 and 7 Plus (4.7 inches and 5.5 inches), the iPhone X moves to 5.8 inches. Developers will now need to support three form factors – and possibly more if you still support the iPhone 5 and earlier.
- New Pixel Densities: The iPhone 8, 8 Plus, and X have different pixel densities, which means the amount of content displayed on each phone varies. As a result, developers must build and test applications against each of these specifications.
- New Authentication Mechanisms, including Face ID and Touch ID: These authentication methods may change the user flow in your application, especially the login and payment experience. To accommodate them, you may need to build and display new assets and screens that help users understand how to use device-based authentication.
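The form-factor point above can be sketched in code. The snippet below is a minimal, hypothetical helper (the `FormFactor` enum and `formFactor(forScreenHeight:)` function are illustrative names, not an Apple API) that classifies a device by its logical screen height in points – 568 pt for the iPhone 5/SE class, 667 pt for the iPhone 6/7/8 class, 736 pt for the Plus class, and 812 pt for the iPhone X – so layout code can branch on one enum instead of scattering raw sizes:

```swift
import Foundation

// Hypothetical helper: classify an iPhone by its logical screen height (points)
// so layout code can branch on one value instead of hard-coded sizes.
enum FormFactor {
    case compact     // iPhone 5/SE class (568 pt)
    case regular     // iPhone 6/7/8 class (667 pt)
    case plus        // Plus-size class (736 pt)
    case edgeToEdge  // iPhone X class (812 pt, with sensor notch and home indicator)
}

func formFactor(forScreenHeight points: Double) -> FormFactor {
    switch points {
    case ..<600: return .compact
    case ..<700: return .regular
    case ..<800: return .plus
    default:     return .edgeToEdge
    }
}
```

In a real app you would feed this from `UIScreen.main.bounds.height`; on the iPhone X you would additionally rely on safe area insets rather than raw heights to avoid the notch and home indicator.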
In this complex new world where you are serving customers ranging from iPhone 5 to iPhone X, with varied form factors, processing power, authentication mechanisms and OS versions, it’s critical to have the right tools that will give you real-time visibility into app performance and user flows.
In fact, you should know exactly what users are seeing and experiencing when they use your app – especially when there are UI issues. This will help you understand user behavior on these new devices and enable you to quickly identify the root cause of UI issues.
Here are a few data points you should have at your fingertips:
- Understand user interaction and touchpoints: Developers need visibility into user interactions and touchpoints with all screens – whether it’s the homepage or checkout screen – including button clicks, text field input, and table cell selection.
- Debug issues through screen visualization: Debugging becomes far simpler when you know exactly what the user's screen looked like at the point of failure. More often than not, the issue turns out to be a UI bug. For example, if a user tries to swipe while an order is being processed (see the screenshot below), the UI may end up in a broken state.
- Understand application latency: With the improved processing speed of the A11 Bionic chip, application responsiveness will improve – and customer expectations will rise with it. As a result, any app or network slowness will be noticeable, making it even more important for developers to know exactly when the application is slow and which line of code is slowing it down.
- Continue to monitor user segmentation and crash performance: You should continue to monitor app crashes and user segmentation to understand which devices, OS versions, and form factors your users are on. In addition, segmenting performance issues by app version will help you identify how your code has improved or degraded with new releases.
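The latency point above can be sketched with a simple timing wrapper. This is a minimal illustration, not an AppDynamics API: `measure` is a hypothetical helper that times a block of work so slow paths can be logged or forwarded to whatever monitoring backend you use:

```swift
import Foundation

// Hypothetical helper: time a block of work and return its result along with
// the elapsed milliseconds, so slow operations can be reported for monitoring.
func measure<T>(_ label: String, _ block: () throws -> T) rethrows -> (result: T, milliseconds: Double) {
    let start = Date()
    let result = try block()
    let elapsed = Date().timeIntervalSince(start) * 1000
    // In a real app, forward `label` and `elapsed` to your APM agent here
    // instead of just returning them.
    return (result, elapsed)
}

// Example: time a network parse or other unit of work.
let (sum, ms) = measure("sumRange") { (1...1000).reduce(0, +) }
```

A real-user monitoring agent instruments this kind of timing automatically across network requests and UI transitions; a hand-rolled wrapper like this is useful for ad-hoc measurement of specific code paths.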
As you continue to serve customers on various mobile devices and complex applications, it’s imperative to have a comprehensive Mobile Real-User Monitoring tool you can rely on. Learn more about AppDynamics Application Performance Management solutions now.