Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post · Replies · Boosts · Views · Activity

How to disable the default focus effect and detect keyboard focus in SwiftUI?
I’m trying to customize the keyboard focus appearance in SwiftUI. In UIKit (see the WWDC 2021 session "Focus on iPad keyboard navigation"), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus. In SwiftUI I’ve tried the following:

    .focusable() // .focusable(true, interactions: .activate)
    .focusEffectDisabled()
    .focused($isFocused)

However, I’m running into several issues:
• .focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
• .focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
• Using @FocusState prevents Space from triggering the action when the view has keyboard focus

My main questions:
• How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
• What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?

Any guidance or best practices would be greatly appreciated! Here’s my sample code:

    import SwiftUI

    struct KeyboardFocusExample: View {
        var body: some View {
            // The ScrollView is required; otherwise the custom focus value resets
            // to false after a few seconds. I also need it for my actual use case.
            ScrollView {
                VStack {
                    Text("First button")
                        .keyboardFocus()
                        .button { print("First button tapped") }
                    Text("Second button")
                        .keyboardFocus()
                        .button { print("Second button tapped") }
                }
            }
        }
    }

    // MARK: - Focus Modifier

    struct KeyboardFocusModifier: ViewModifier {
        @FocusState private var isFocused: Bool

        func body(content: Content) -> some View {
            content
                .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
                // .focusable(true, interactions: .activate) // ⚠️ Causes an infinite loop, so keyboard navigation no longer responds
                .focusEffectDisabled() // ⚠️ Has no effect on iOS
                .focused($isFocused)
                // Custom halo effect
                .padding(4)
                .overlay(
                    RoundedRectangle(cornerRadius: 18)
                        .strokeBorder(isFocused ? .red : .clear, lineWidth: 2)
                )
                .padding(-4)
        }
    }

    extension View {
        public func keyboardFocus() -> some View {
            modifier(KeyboardFocusModifier())
        }
    }

    // MARK: - Button Modifier

    /// ⚠️ Using a Button view makes no difference
    struct ButtonModifier: ViewModifier {
        let action: () -> Void

        func body(content: Content) -> some View {
            content
                .contentShape(Rectangle())
                .onTapGesture { action() }
                .accessibilityAction { action() }
                .accessibilityAddTraits(.isButton)
                .accessibilityElement(children: .combine)
                .accessibilityRespondsToUserInteraction()
        }
    }

    extension View {
        public func button(action: @escaping () -> Void) -> some View {
            modifier(ButtonModifier(action: action))
        }
    }
1 reply · 0 boosts · 532 views · Sep ’25
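One direction worth trying for the post above, as a minimal sketch rather than a confirmed fix, assuming iOS 17+: keep @FocusState for the custom highlight and restore Space activation explicitly with onKeyPress. The view name and styling are illustrative; whether the system Full Keyboard Access ring can be fully suppressed on iOS remains an open question.

    import SwiftUI

    // Sketch: detect keyboard focus with @FocusState and restore Space
    // activation via onKeyPress (iOS 17+). FocusBorderButton is a
    // hypothetical name, not part of the original post.
    struct FocusBorderButton: View {
        let title: String
        let action: () -> Void
        @FocusState private var isFocused: Bool

        var body: some View {
            Text(title)
                .padding(8)
                .focusable()            // make the view focusable first
                .focused($isFocused)    // then bind the focus state
                .onKeyPress(.space) {   // handle Space ourselves
                    action()
                    return .handled
                }
                .overlay(
                    RoundedRectangle(cornerRadius: 8)
                        .strokeBorder(isFocused ? .red : .clear, lineWidth: 2)
                )
        }
    }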
Autocomplete Select not working with VoiceOver in iOS 18.6.2
Hey folks, I would like to ask for help on this topic. I think it is exactly the same problem as "Combobox not working with VoiceOver after… - Apple Community". VoiceOver also breaks the combobox from the official W3C ARIA website: https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VoiceOver is off, I can use the up/down arrows to go through the menu items in the dropdown, but when VoiceOver is on, the up/down arrows cannot reach the dropdown menu items. Is there an official tutorial on how to control this using VoiceOver? Kind regards, Jakub
1 reply · 1 boost · 426 views · Sep ’25
HID Braille keyboard support on iPhone 6S
Hello, I am working on a Braille keyboard using the HID approach. Currently the device works with the iPhone 11 and SE 3. However, when tested on an iPhone 6s with iOS 15, although the device connects and is recognized as a Braille device in the VoiceOver screen, the phone shows no response to key-press reports. Is there any additional requirement, for example in the HID descriptor, for iPhone 6s support of Braille devices? If the iPhone 6s does not support such devices, what are the minimum system requirements? Thank you!
1 reply · 1 boost · 1.5k views · Sep ’25
How to Ensure Data Privacy with VoiceOver Reading Sensitive Information?
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern—what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud? How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
1 reply · 0 boosts · 440 views · Mar ’25
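A common pattern for the question above, shown as a sketch rather than an official recommendation: never hand the raw value to the accessibility tree; display a masked string and give VoiceOver a label that exposes only a safe suffix. For passwords specifically, SecureField already announces a secure text field without reading the characters. The type and property names here are illustrative.

    import SwiftUI

    // Sketch: expose only a masked form of a sensitive value to VoiceOver.
    struct MaskedAccountRow: View {
        let accountNumber: String // hypothetical sensitive value

        var body: some View {
            // Visually show only the last four digits.
            Text("•••• \(accountNumber.suffix(4))")
                // VoiceOver reads this label instead of the on-screen text.
                .accessibilityLabel("Account number ending in \(accountNumber.suffix(4))")
        }
    }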
VoiceOver Sound
Hello, when I listen to a title in my app with VoiceOver, it makes a strange sound. The title combines Korean characters, numbers, and Latin letters. Does this combination cause the strange sound with VoiceOver? I would like to ask whether Apple can fix this issue. Thank you.
1 reply · 0 boosts · 214 views · Mar ’25
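If the strange sound in the post above comes from mixed-language text being spoken by a single voice, tagging each run of the label with an explicit speech language is worth a try. A sketch under that assumption; the sample string and ranges are illustrative only.

    import UIKit

    // Sketch: tell VoiceOver which language to use for each run of a label.
    func makeMixedLanguageLabel() -> UILabel {
        let text = "한글 3 ABC" // hypothetical Korean + number + Latin title
        let label = UILabel()
        label.text = text

        let attributed = NSMutableAttributedString(string: text)
        // Speak the Korean run with a Korean voice...
        attributed.addAttribute(.accessibilitySpeechLanguage, value: "ko-KR",
                                range: NSRange(location: 0, length: 2))
        // ...and the trailing number/Latin run with an English voice.
        attributed.addAttribute(.accessibilitySpeechLanguage, value: "en-US",
                                range: NSRange(location: 3, length: 5))
        label.accessibilityAttributedLabel = attributed
        return label
    }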
Add VoiceOver touch gesture guidance for frames/iframes in webView and Safari
Please update the Accessibility OS settings for VoiceOver in iOS and iPadOS to include frames on the rotor, and make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad user guides on using VoiceOver for web navigation with touch gestures. Specifically: iframes.

There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS on how to access iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and internet searches, is that iframes in Safari or in an app's webView are only reachable by explore-by-touch. If explore-by-touch is the only option for some interactions, that needs to be stated in the Apple user guides. If not, the touch gestures equivalent to the Mac keyboard interactions need to be made clear to users.

VoiceOver for Mac documents a default keyboard interaction, VO-Command-F, in its extensive user guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac), and a user can add a rotor option for navigating frames. VoiceOver for iPhone and iPad has no default swipe gesture assigned to frames, and no rotor option is available. While the iPhone user guide notes that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that the command to assign, "Move to the next frame," is tucked into the advanced navigation commands in the VoiceOver accessibility settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen having a search bar.
1 reply · 0 boosts · 151 views · Apr ’25
Please consider having Name Recognition in a shortcut automation
Request: Name Recognition → Shortcut for SOS Flashlight + Vibration Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss. I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts — or expanded “Custom” to support names/words — this would open up far more practical, real-world alerts.
1 reply · 0 boosts · 635 views · Sep ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), both critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the LiDAR + Neural Engine capabilities of current iPhone devices to solve this.

The concept: skeleton-based normalization. Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input:
• Capture: use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
• Data normalization: extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
• Comparison: feed these vectors into a CoreML model trained on "reference skeletons" (recorded by native signers).
• Feedback loop: the app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees").

Why this approach?
• Solves occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
• Privacy: we are processing coordinates, not video streams.
• Efficiency: comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "skeleton first" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
1 reply · 0 boosts · 746 views · Dec ’25
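The comparison step in the proposal above is easy to prototype with ARKit's skeleton data. A minimal sketch, assuming two already time-aligned ARSkeleton3D poses (obtained from ARBodyAnchor.skeleton in an ARBodyTrackingConfiguration session); real feedback would add per-joint weighting and temporal alignment across a gesture sequence.

    import ARKit
    import simd

    // Sketch: score the distance between a user's pose and a reference pose
    // using only joint positions, as in the "skeleton first" idea above.
    func poseDistance(user: ARSkeleton3D, reference: ARSkeleton3D) -> Float {
        // Joint positions are the translation columns of the model transforms.
        let userJoints = user.jointModelTransforms.map { simd_make_float3($0.columns.3) }
        let refJoints = reference.jointModelTransforms.map { simd_make_float3($0.columns.3) }
        // Mean Euclidean distance across matching joints (lower is closer).
        let total = zip(userJoints, refJoints).map { simd_distance($0, $1) }.reduce(0, +)
        return total / Float(max(userJoints.count, 1))
    }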
I have a problem
I want to open a developer account, not as an individual but as an organization. I have an existing company, a DUNS number, a finished website, and an official email address. But when I apply, Apple emails me saying they need a public-facing website in the organization's name, even though all of these requirements have already been met. Why do they not respond to us?
1 reply · 0 boosts · 656 views · Sep ’25
Unexpected behaviour of hardware keyboard focus in UITests
Hello! I was faced with unexpected behaviour of hardware keyboard focus in UITests.

A clear description of the problem: When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" enabled, there is a noticeable delay between keyboard actions that manage focus (like pressing Tab or the arrow keys). The delay seems to increase with repeated input, suggesting that events are being queued instead of processed immediately; the last reproduction step below is why I make that assumption.

A step-by-step set of instructions to reproduce the problem:
1. Launch the iOS Simulator.
2. Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
3. Run a UITest on a target application (ideally an endless or long-running test).
4. Once the app is launched, press the Tab key several times and observe the delay in focus movement.
5. Optionally, press the Tab or arrow keys rapidly, then stop the UITest. After stopping, you'll see a burst of rapid focus changes.

What results you expected: Keyboard actions (like Tab) are handled immediately and the UI focus updates smoothly during UITests.

What results you saw: There was a delay of 4–10 seconds (and more) between pressing keys and seeing a response. All queued keyboard focus events are performed at once after stopping the UITest.

The version of Xcode you are using:
Xcode: Version 16.3 (16E140)
Simulator: iPhone 16 Pro (iOS 18.4 and 18.1)
Simulator: iPad Pro 11-inch (M4) (iPadOS 17.5)
1 reply · 2 boosts · 270 views · Apr ’25
iMessage and FaceTime error
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked. I tried going through the settings as described by Apple Support, but my phone number would not activate; sometimes I was even asked to activate iCloud. I always get a REG-RESP message. Does anyone have any ideas what the problem could be?
1 reply · 1 boost · 158 views · Jun ’25
SwiftUI PhotosPicker accessibility issue
iOS 18.3.1, iPhone 16 Pro. I pick photos from the user's photo library with a connected physical keyboard using:

    .photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images)

When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker by keyboard without tapping the view and moving focus to it manually. I noticed the same behavior in the Notes app.
1 reply · 0 boosts · 587 views · Mar ’25
Feature Request – Bionic Reading Accessibility Setting
I’d love to see Apple implement a Bionic Reading feature as a system-wide accessibility option. This type of reading aid highlights the first part of each word in bold to help guide the eyes and improve comprehension. It’s been shown to be especially helpful for people with ADHD, dyslexia, and other neurodivergent needs. Having a toggle in Settings > Accessibility would be life-changing. Ideally, it would:
• Be enabled system-wide or per-app
• Allow customization of how much of each word is bolded
• Be available in Safari, Messages, Books, News, etc.
1 reply · 1 boost · 138 views · Apr ’25
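No such system setting exists today, but the effect itself is easy to prototype inside an app. A sketch assuming "bold roughly the first half of each word"; the helper name and ratio are illustrative, not part of any Apple API.

    import SwiftUI

    // Sketch: embolden the first part of each word, as the request describes.
    func bionicStyled(_ text: String) -> AttributedString {
        var result = AttributedString()
        for (i, word) in text.split(separator: " ").enumerated() {
            if i > 0 { result.append(AttributedString(" ")) }
            var styled = AttributedString(String(word))
            let boldCount = max(1, word.count / 2)
            let end = styled.index(styled.startIndex, offsetByCharacters: boldCount)
            styled[styled.startIndex..<end].font = .body.bold()
            result.append(styled)
        }
        return result
    }

    // Usage: Text(bionicStyled("Reading aids can help comprehension"))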
The camera preview cannot be displayed in full screen
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview); it's a .swiftpm package, so I created a SwiftUI project, copied the sample code into it, built it, and ran it on an iPhone 13 for testing. I found black empty areas at the top and bottom of the interface, meaning the camera preview does not fill the screen. I have tried many approaches but cannot get a full-screen preview. How can I modify the code?
1 reply · 0 boosts · 201 views · Apr ’25
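For anyone hitting the same letterboxing: in SwiftUI this is usually the preview stopping at the safe area, plus the camera's 4:3 frame not matching the screen. A sketch of the common fix; CameraPreview here is a stand-in for the sample's viewfinder view, not a type from the sample.

    import SwiftUI

    // Sketch: let a preview view fill the whole screen.
    struct CameraPreview: View {
        var body: some View { Color.black } // stand-in for the live preview
    }

    struct FullScreenCameraScreen: View {
        var body: some View {
            CameraPreview()
                .scaledToFill()    // crop the 4:3 feed instead of letterboxing it
                .frame(maxWidth: .infinity, maxHeight: .infinity)
                .ignoresSafeArea() // extend under the status bar and home indicator
        }
    }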
FocusState Issue in iOS 18 with Keyboard Navigation
I have implemented a SwiftUI view containing a grid of TextField elements, where focus moves automatically to the next field upon input. This behavior works well on iOS 16 and 17, maintaining proper focus highlighting when Full Keyboard Access is enabled. However, on iOS 18 and above, the Full Keyboard Access focus behaves differently: it always stays behind the actual focus state, causing a mismatch between the visually highlighted field and the active text input. This leads to usability issues, especially for users navigating with an external keyboard. Below is the SwiftUI code for reference:

    struct AutoFocusGridTextFieldsView: View {
        private let fieldCount: Int
        private let columns: Int
        @State private var textFields: [String]
        @FocusState private var focusedField: Int?

        init(fieldCount: Int = 17, columns: Int = 5) {
            self.fieldCount = fieldCount
            self.columns = columns
            _textFields = State(initialValue: Array(repeating: "", count: fieldCount))
        }

        var body: some View {
            let rows = (fieldCount / columns) + (fieldCount % columns == 0 ? 0 : 1)
            VStack(spacing: 10) {
                ForEach(0..<rows, id: \.self) { row in
                    HStack(spacing: 10) {
                        ForEach(0..<columns, id: \.self) { col in
                            let index = row * columns + col
                            if index < fieldCount {
                                TextField("", text: $textFields[index])
                                    .frame(width: 40, height: 40)
                                    .multilineTextAlignment(.center)
                                    .textFieldStyle(RoundedBorderTextFieldStyle())
                                    .focused($focusedField, equals: index)
                                    .onChange(of: textFields[index]) { newValue in
                                        if newValue.count > 1 {
                                            textFields[index] = String(newValue.prefix(1))
                                        }
                                        if !textFields[index].isEmpty {
                                            moveToNextField(from: index)
                                        }
                                    }
                            }
                        }
                    }
                }
            }
            .padding()
            .onAppear {
                focusedField = 0
            }
        }

        private func moveToNextField(from index: Int) {
            if index + 1 < fieldCount {
                focusedField = index + 1
            }
        }
    }

    struct AutoFocusGridTextFieldsView_Previews: PreviewProvider {
        static var previews: some View {
            AutoFocusGridTextFieldsView(fieldCount: 10, columns: 5)
        }
    }

Has anyone else encountered this issue with FocusState on iOS 18? I really do believe this is a bug strictly connected to keyboard navigation, since I experienced a similar problem with the UIKit equivalent of this view. Any insights or suggestions would be greatly appreciated!
1 reply · 0 boosts · 615 views · Mar ’25
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mostly contains documents/books that the user can read.

Issue: VoiceOver is skipping lines given to it when they contain leading spaces. We have observed this in several languages. It happens only with line granularity; the other granularities seem to work as expected.

Implementation: We use the APIs below to provide line content to VoiceOver:
UIAccessibilityReadingContent
- accessibilityPageContent
- accessibilityFrameForLineNumber
- accessibilityContentForLineNumber

We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use the APIs below to cross element boundaries for all granular navigation:
accessibilityNextTextNavigationElement
accessibilityPreviousTextNavigationElement

We want to know whether skipping a line that has leading spaces is expected behavior or a bug in UIKit.
1 reply · 0 boosts · 427 views · Nov ’25
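For reference, a compact sketch of the protocol the post lists, storing lines as plain strings, with a hypothetical (untested) workaround of trimming leading spaces before handing a line to VoiceOver:

    import UIKit

    // Sketch of UIAccessibilityReadingContent over plain-string lines.
    final class PageElement: UIAccessibilityElement, UIAccessibilityReadingContent {
        var lines: [String] = []
        var lineFrames: [CGRect] = [] // one on-screen frame per line

        func accessibilityLineNumber(for point: CGPoint) -> Int {
            lineFrames.firstIndex { $0.contains(point) } ?? NSNotFound
        }

        func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
            guard lines.indices.contains(lineNumber) else { return nil }
            // Hypothetical workaround: strip leading spaces, since indented
            // lines are reportedly skipped at line granularity.
            return String(lines[lineNumber].drop(while: { $0 == " " }))
        }

        func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
            lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
        }

        func accessibilityPageContent() -> String? {
            lines.joined(separator: "\n")
        }
    }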
Unable to set dialect of Chinese of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it will always read Chinese content, e.g.

    AVSpeechUtterance(string: "中文") // any Chinese content

in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect I specified in AVSpeechUtterance.voice:

    AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
    AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

Setting the Chinese dialect of AVSpeechSynthesisVoice to "zh-HK" or "zh-TW" worked on iOS 17 and below. My app has a feature that reads sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time, so setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not happen. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug appears.

This bug puzzles me because I need both Chinese dialects read aloud one after the other, but as many users report, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays) my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. It is this iOS 18 bug that prevents my app from performing the expected behavior. Would Apple developers look into this and advise whether there is any workaround within the app's code, or please fix this bug in an iOS 18 update. Thank you.
1 reply · 1 boost · 122 views · Jun ’25
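A hedged workaround sketch for the bug described above, on the chance that choosing a concrete installed voice behaves differently from constructing one via AVSpeechSynthesisVoice(language:); I have not verified this against the iOS 18 behavior.

    import AVFoundation

    // Sketch: pick an installed voice whose BCP-47 code matches the wanted
    // dialect ("zh-HK" for Cantonese, "zh-TW" for Taiwanese Mandarin).
    func speak(_ text: String, dialect: String, with synthesizer: AVSpeechSynthesizer) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.language == dialect }
        synthesizer.speak(utterance)
    }

    // Usage:
    // speak("中文", dialect: "zh-HK", with: synthesizer) // Cantonese
    // speak("中文", dialect: "zh-TW", with: synthesizer) // Mandarin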
How to Implement Dynamic Type for UITextFields Without Resetting Data
Hello! I was doing some accessibility testing for my app and found out that when the user switches the text size, all of the data in the text fields is reset, which causes major disruption. I've tried looking for documentation, but all I've found is information on how to dynamically scale the UI for different text sizes, which I've already implemented. My guess is that every time Dynamic Type registers a change, it redraws my UI instead of just updating it. How can I make sure the data is not reset when the text size changes?
1 reply · 1 boost · 684 views · Dec ’25
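A common cause of the symptom above is rebuilding the text fields when the content size category changes. A sketch of the in-place alternative, assuming UIKit: give each field a scaled font and let it adjust itself, so the hierarchy is never recreated and the entered text survives.

    import UIKit

    // Sketch: a text field that tracks Dynamic Type without being rebuilt.
    func makeDynamicTextField() -> UITextField {
        let field = UITextField()
        // Scale a custom 17pt font relative to the .body text style.
        field.font = UIFontMetrics(forTextStyle: .body)
            .scaledFont(for: .systemFont(ofSize: 17))
        // The field re-renders itself on size-category changes; no manual
        // reload (and therefore no data loss) is needed.
        field.adjustsFontForContentSizeCategory = true
        return field
    }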
pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone, I'm implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support

I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I've added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration). My entitlements file contains:

    com.apple.developer.hearing.aid.app
    xxxxx

but the app won't even compile with this entitlement.

Problem: NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires: not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.

What I expected: According to the docs, the notification should be posted whenever the paired device UUIDs change, and the session should expose those devices, but nothing happens.

Questions:
1. Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file?
2. Is there a way to verify that iOS is actually honoring this entitlement?
3. Has anyone successfully received this notification on a real device?

Any help or confirmation would be greatly appreciated.
1 reply · 0 boosts · 640 views · Dec ’25