In VoiceOver, when using the Group Navigation style, the cursor first lands on the semantic group; a two-finger swipe (left or right) then navigates inside the group. This behavior works for default containers such as the navigation bar, tab bar, and toolbar.
How can I achieve the same behavior for a custom view?
I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
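For reference, a minimal sketch of what I tried, assuming a custom UIView container (CardGroupView is a hypothetical name):

import UIKit

final class CardGroupView: UIView {
    override func didMoveToWindow() {
        super.didMoveToWindow()
        isAccessibilityElement = false              // act as a container, not a leaf element
        accessibilityContainerType = .semanticGroup // honored on Mac Catalyst, seemingly ignored by VoiceOver on iOS
        accessibilityLabel = "Card details"         // spoken as the group name
    }
}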
Hi,
Our app has a section where we show users how to activate "Silence Unknown Callers", because it is a crucial feature for our app. But we've seen that 30% of users drop out of the process here, because we can't open that setting directly in the Phone app.
We are using this URL scheme to open the Phone settings in iOS 18:
// "App-prefs:" is an undocumented scheme; in our testing on iOS 18 it opens the Phone section of Settings.
if let url = URL(string: "App-prefs:com.apple.mobilephone") {
    UIApplication.shared.open(url)
}
But we don't see any way to open the "silence" path directly, as we could in iOS 17 with this URL scheme: prefs:root=Phone&path=SILENCE_CALLS
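For completeness, the only documented alternative we know of opens our own app's page in Settings, not the Phone section, so it doesn't solve the problem:

// Documented API, but it opens this app's own Settings page only.
if let url = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(url)
}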
So, do you know if it is possible to open that option directly? We want to improve our accessibility.
Thank you!
I am invoking a UIImagePickerController with source type UIImagePickerControllerSourceTypePhotoLibrary from my view controller. I want to shift keyboard focus to the Cancel button, which is the first interactive element on the photo picker. When a user has Full Keyboard Access turned on, they should be able to press Tab and interact with the picker modal. How do I achieve this?
I have an issue in my app when it is used together with the Assistive Access feature.
For authentication, we are using the Capacitor Firebase Authentication plugin (https://www.npmjs.com/package/@capacitor-firebase/authentication), which lets users log in via Apple (FirebaseAuthentication.signInWithApple(...)), Google (FirebaseAuthentication.signInWithGoogle(...)), or email. This works just fine. However, when Assistive Access is enabled, the login fails for Apple ("The operation couldn't be completed. com.apple.AuthenticationServices.AuthorizationError error 1000") and for Google ("The user canceled the sign-in flow").
It seems the sign-in popups are blocked, so an error is returned immediately. The popups may be blocked by Assistive Access, leaving the Capacitor plugin unable to authenticate.
I have tested this on my iPhone 12 Pro running iOS 17.7.
I would appreciate any suggestions to handle this issue!
Since the last beta upgrade of my iPad to 26.3, labels have disappeared.
Going into Settings → Accessibility, the toggle for labels makes no difference whether on or off.
Labels are permanently missing.
Hello! I'm adding VoiceOver support to my app, but I'm having an issue where my accessibility value is not being spoken. I have made a helper class that creates an NSString from a double, formatted for the user's regional currency.
CurrencyFormatter.m
+ (NSString *)localizedCurrencyStringFromDouble:(double)value {
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    formatter.numberStyle = NSNumberFormatterCurrencyStyle;
    formatter.locale = [NSLocale currentLocale];
    NSString *currencyString = [formatter stringFromNumber:@(value)];
    [formatter release]; // manual retain/release; remove this line under ARC
    return currencyString;
}
View Controller
self.checkTotalLabel.accessibilityLabel = NSLocalizedString(@"Total Amount", @"Accessibility Label for Total");
self.checkTotalLabel.accessibilityValue = [CurrencyFormatter localizedCurrencyStringFromDouble: total];
I'm confused about whether the value should go into the accessibility label or not. When the currency is just USD and the language is English, it's a simple fix. But when the currency needs to be converted, I'm not sure where to go from here.
If anyone has any guidance, it would help me a lot!
Thank you!
Please excuse me if this is obvious. I'm new to Apple development.
Is there a SwiftUI Accessibility Inspector? I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that "SwiftUI" is primarily implemented using macros and that these things are either generated or boilerplate lower-level things. But if so, then why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them?
Or... is there a demangler somewhere that will translate from these names into something this human might recognize?
I'm targeting macOS, by the way, if that makes any difference.
Topic:
Accessibility & Inclusion
SubTopic:
General
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and the object wouldn't detect surfaces correctly. I then updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out.
Can someone help me fix or diagnose the issue?
Thank you.
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern: what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud?
How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
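One common pattern, as a minimal sketch (BalanceLabel and isRevealed are hypothetical): expose a masked spoken label until the user explicitly reveals the value. For password entry specifically, a UITextField with isSecureTextEntry set to true is already announced as a secure field and its characters are not read aloud.

import UIKit

// Hypothetical sketch: a label that masks its spoken text until revealed.
final class BalanceLabel: UILabel {
    var isRevealed = false // hypothetical reveal state

    override var accessibilityLabel: String? {
        get { isRevealed ? super.accessibilityLabel : "Balance hidden. Double-tap to reveal." }
        set { super.accessibilityLabel = newValue }
    }
}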
I am testing the accessibility feature available in the Settings app called "Speak Screen". The help text in the Setting app states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this feature to work. Every time I try the double finger swipe down, it behaves the same as the single finger swipe down gesture. Usually this manifests as making scroll views bounce.
I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I access the feature from the Speech Controller button, the screen's content is spoken as expected, so I know the feature is enabled. It's just the gesture that doesn't work.
Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
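For what it's worth, a small sketch of the check I'm using before instructing users about the gesture, via the public UIAccessibility query:

import UIKit

// True when Settings → Accessibility → Spoken Content → Speak Screen is on.
func speakScreenIsAvailable() -> Bool {
    UIAccessibility.isSpeakScreenEnabled
}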
Hi,
I'm trying to fix a tvOS view for the VoiceOver accessibility feature:
TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { 200 items }
    }
    ScrollView { // Continue watching
        LazyHStack { 500 items }
    }
}
When the view shows up, VoiceOver reads:
"Home tab 1 of 5, Item 2" - not sure why it reads "Item 2" of the first cell in the scroll view; maybe because it was just loaded by the LazyHStack.
VoiceOver should only read "Home tab 1 of 5".
When moving focus to the scroll view, it reads:
"Live, Item 1" and after a slight delay "Item 1, Item 2, Item 3, Item 4"
When moving focus to the second item, it reads:
"Item 2" and after a slight delay "Item 1, Item 2, Item 3, Item 4"
When moving focus to the third item, it reads:
"Item 3" and after a slight delay "Item 1, Item 2, Item 3, Item 4"
It should just read what is focused, ideally just
"Live, Item 1, 1 of 200"
then, after moving focus to item 2,
"Item 2, 2 of 200"
this time without the word "Live", because we are on the same scroll view (the same horizontal list).
Currently the app is unusable. We have visually impaired testers, and this rotor reading everything on the screen is totally confusing, because users don't know where they are and what is actually focused.
This is a video streaming app and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, there is usually a never-ending live stream playing, and the user can switch TV channels while we continue to play. VoiceOver should only read what's focused, after user interaction.
The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting. It correctly reads only the focused item in scrolling lists.
How do I disable reading content that is not focused?
I tried:
.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried at various levels of the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried at various levels of the view hierarchy
.accessibilityElement(children: .combine) - tried at various levels of the view hierarchy
.accessibilityAddTraits(.isHeader) - tried at various levels of the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried at various levels of the view hierarchy
// the last two were basically an attempt to hack it
.accessibilityRotor("", ranges: []) - another hack I tried on the ScrollView, the LazyHStack, and the top-level view
50+ other attempts at configuring accessibility modifiers attached to views.
I have watched all the accessibility videos and tried all the sample code projects. I haven't found a solution anywhere; internet search didn't turn up anything, and AI didn't help, as it can only provide code that someone else wrote before.
Any idea how to fix this?
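For reference, this is roughly the per-item labeling that would produce the desired readout (ItemCell, items, and title stand in for our real views); it shapes what is spoken per cell, but it did not stop the off-focus announcements described above:

ScrollView(.horizontal) {
    LazyHStack {
        ForEach(Array(items.enumerated()), id: \.offset) { index, item in
            ItemCell(item: item)
                .accessibilityElement(children: .combine)
                .accessibilityLabel("\(item.title), \(index + 1) of \(items.count)")
        }
    }
}
.accessibilityElement(children: .contain)
.accessibilityLabel(title) // row name, e.g. "Live"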
Thanks.
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table, using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On Mac using VoiceOver the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview. The reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data.
The documents have essentially the same coding behind them as for the web. Why do they perform so well on a PC with NVDA, but so poorly with VoiceOver on a Mac? I am a Quality Assurance Specialist; I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system?
As a Mac user, it frustrates me that I can't use my preferred system for checking documents for accessibility, because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they use a PC with NVDA or JAWS to get all the information from these documents. Unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality; this and other issues with VoiceOver seem to be a large gap in Apple's offerings and functionality.
I would appreciate a human response to the original email I sent about this on 7/30/2025.
Triple-tap for screenshot: the "triple tap detected" notification becomes part of the screenshot and obscures its top portion.
Thanks
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection.
Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
I have more than 1000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders I can no longer share a note. The note is not shared; it is the parent folder's that is shared instead.
Thank you very much.
Best regards,
Christophe
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:
ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
    Button {
        navigationCallback(paymentTool.segueID)
    } label: {
        PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
            .accessibilityElement()
            .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
    }
    if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
        Divider()
    }
}
So you can see the accessibility ID is there, and it shows up properly when I open up Accessibility Inspector with the simulator, but the testing script isn't picking up on it, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one.
If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
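One variation I haven't ruled out (a hedged guess, not a verified fix): moving .accessibilityElement() and the identifier onto the Button itself rather than its label, in case identifiers set on the label subtree aren't exposed through the hosted hierarchy:

Button {
    navigationCallback(paymentTool.segueID)
} label: {
    PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
}
.accessibilityElement()
.accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")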
Hi Community,
I'm excited to share R Helper, a speech practice app I built with accessibility as the core focus from day one.
App Store: https://apps.apple.com/app/speak-r-clearly/id6751442522
WHY I BUILT THIS
I personally struggled with R sound pronunciation growing up. It affected my confidence in school and job interviews. That experience taught me how important accessible practice tools are.
R Helper helps children and adults practice R sounds with full accessibility support.
ACCESSIBILITY FEATURES IMPLEMENTED
VoiceOver - complete navigation and feedback
Voice Control - hands-free operation
Dynamic Type - scales to large accessibility sizes
Reduce Motion - respects user preference
Dark Mode - user controllable
High Contrast compatibility
Differentiate Without Color
THE CHALLENGE
Most speech practice apps ignore accessibility. I wanted to change that and prove that specialized educational apps can be fully accessible.
KEY FEATURES
Works 100% offline, no internet needed
Zero data collection, privacy first
Generous free tier with all accessibility features included
10 story missions with gamification
7 languages supported including RTL for Arabic
LESSONS LEARNED
Accessibility is not hard when you prioritize it from the start. VoiceOver labels and hints make a huge difference. Testing with accessibility features enabled is essential. Standard SwiftUI components handle most accessibility automatically. Reducing motion significantly helps users with vestibular issues.
TECHNICAL DETAILS
Built with SwiftUI, targets iOS 17 and up. Universal app for iPhone and iPad. Fully offline using CoreData and local storage. No third party analytics, privacy focused.
QUESTIONS FOR THE COMMUNITY
What accessibility features do you find users request most? How do you test accessibility features efficiently?
WHAT'S NEXT
I'm currently working on expanding the word library, adding more story content, and improving haptic feedback.
Thanks for reading.
Nour
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed view. I have custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is kept from running. This means that when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, focus interception by the system is stealing my A-press events and preventing my custom gamepad logic from running.
Is there a way to disable the built in gamepad interaction and only allow my custom gamepad mappings?
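For context, a minimal sketch of the custom mapping, assuming the GameController framework (lockDepth is a placeholder for the ROV command); the open question is how to keep the focus system from consuming the press while a SwiftUI Button has focus:

import GameController

func configure(_ controller: GCController) {
    // Direct observation of the A button, independent of UI focus.
    controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
        guard pressed else { return }
        lockDepth() // placeholder for the drone's depth-lock command
    }
}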
When I try to get the frame of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames are shifted and rotated on the iOS simulator.
I get the same frames from the Accessibility Inspector when the Mac is selected as the host.
When I switch the host to the iOS simulator, the frames are correct.
How is the Accessibility Inspector getting the correct frames? And how can I do the same in my app?
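For reference, a sketch of how I'm reading the frame, assuming the standard kAXPositionAttribute/kAXSizeAttribute pair (this is the kind of call that returns the shifted values when the Mac is the host):

import ApplicationServices

func frame(of element: AXUIElement) -> CGRect? {
    var positionRef: CFTypeRef?
    var sizeRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &positionRef) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeRef) == .success
    else { return nil }
    var origin = CGPoint.zero
    var size = CGSize.zero
    AXValueGetValue(positionRef as! AXValue, .cgPoint, &origin) // unwrap the AXValue wrappers
    AXValueGetValue(sizeRef as! AXValue, .cgSize, &size)
    return CGRect(origin: origin, size: size)
}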
Hi everyone, I'm implementing the new Hearing Device Support API described here:
https://developer.apple.com/documentation/accessibility/hearing-device-support
I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I've added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration).
My entitlements file contains:
com.apple.developer.hearing.aid.app
xxxxx
However, the app won't even build with this entitlement.
Problem
NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires: not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids.
Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.
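For clarity, a sketch of the wiring I'm attempting; the symbol spellings are taken from the docs page above and I can't verify the exact signatures, so treat them as assumptions:

// Symbol names from the docs page above; signatures unverified.
NotificationCenter.default.addObserver(
    forName: HearingDeviceSession.pairedUUIDsDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    // Never fires in my testing.
    print("Paired devices: \(HearingDeviceSession.shared.pairedDevices)")
}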
What I expected
According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens.
Questions
Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file?
Is there a way to verify that iOS is actually honoring this entitlement?
Has anyone successfully received this notification on a real device?
Any help or confirmation would be greatly appreciated.