This topic area is about the programming languages themselves, not about any specific API or tool. If you have an API question, go to the top level and look for a subtopic for that API. If you have a question about Apple developer tools, start in the Developer Tools & Services topic.
For Swift questions:
If your question is about the SwiftUI framework, start in UI Frameworks > SwiftUI.
If your question is specific to the Swift Playground app, ask over in Developer Tools & Services > Swift Playground.
If you’re interested in the Swift open source effort, which includes the evolution of the language, the open source tools and libraries, and Swift on non-Apple platforms, check out Swift Forums.
If your question is about the Swift language, that’s on topic for Programming Languages > Swift, but you might have more luck asking it in Swift Forums > Using Swift.
General:
Forums topic: Programming Languages
Swift:
Forums subtopic: Programming Languages > Swift
Forums tags: Swift
Developer > Swift website
Swift Programming Language website
The Swift Programming Language documentation
Swift Forums website, and specifically Swift Forums > Using Swift
Swift Package Index website
Concurrency Resources, which covers Swift concurrency
How to think properly about binding memory Swift Forums thread
Other:
Forums subtopic: Programming Languages > Generic
Forums tags: Objective-C
Programming with Objective-C archived documentation
Objective-C Runtime documentation
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
Swift
Swift is a powerful and intuitive programming language for Apple platforms and beyond.
Posts under the Swift tag:
After upgrading to a new iPhone and restoring from an iCloud backup using the same Apple ID, I noticed an issue with Health app permissions.
■ What is happening
On my previous iPhone, an app had permission to read step count data.
After restoring to the new iPhone, the app still appears in the Health app under Sources.
However, when I tap the app, the usual data type permission toggles (such as Steps) are not displayed at all.
As a result, the app is unable to read step count data.
■ Additional details
The app itself seems to be recognized as a Health data source.
However, the data type permission screen is empty.
No ON/OFF switches are shown.
The backup was created on iOS 18, and the restore was performed on iOS 26.
I have not yet confirmed whether this also happens with other iOS version combinations.
■ Questions
Is it expected behavior that Health app permissions (per data type) are not restored via iCloud backup?
Has anyone experienced a similar situation where the app appears under Sources but the permission options are missing? If so, how did you resolve it?
Any information from users who have experienced the same issue would be greatly appreciated.
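For anyone comparing notes: one way to see what the app itself observes is to ask HealthKit whether a new authorization request would prompt. A minimal sketch, assuming the app already has the HealthKit entitlement and usage strings:

import HealthKit

let store = HKHealthStore()
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!

// .shouldRequest after a restore would suggest the read authorization
// did not survive the backup/restore cycle; .unnecessary means it did
// (or was explicitly denied at some point).
store.getRequestStatusForAuthorization(toShare: [], read: [stepType]) { status, error in
    switch status {
    case .unnecessary: print("Authorization already determined")
    case .shouldRequest: print("HealthKit would prompt again")
    default: print("Unknown:", error ?? "no error")
    }
}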
It appears that starting with macOS Sequoia, Quick Look preview extensions no longer load MapKit maps correctly. Map tiles do not appear, leaving users with a beige background.
Users report that polylines do render correctly, but annotations appear black.
This was previously working fine in prior macOS versions, including Sonoma.
STEPS TO REPRODUCE
1. Create a macOS app project with an associated document.
2. Ensure the project has a Quick Look preview extension, with the necessary basic setup.
3. Ensure that the extension from step 2 contains an MKMapView. No other cosmetic changes need to be implemented to observe the base issue. Do note that, in addition to the map tiles not loading, annotations reportedly don't render correctly either.
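For reference, a minimal preview controller of the shape described above might look like this (a sketch, not the reporter's actual code; sizes and names are illustrative):

import Cocoa
import Quartz
import MapKit

// Minimal Quick Look preview controller embedding an MKMapView.
class PreviewViewController: NSViewController, QLPreviewingController {

    let mapView = MKMapView()

    override func loadView() {
        view = NSView(frame: NSRect(x: 0, y: 0, width: 600, height: 400))
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.width, .height]
        view.addSubview(mapView)
    }

    func preparePreviewOfFile(at url: URL, completionHandler handler: @escaping (Error?) -> Void) {
        // On Sequoia, tiles reportedly fail to load at this point,
        // leaving the beige background described above.
        handler(nil)
    }
}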
Announcing the Swift Student Challenge 2026
Every year, Apple’s Swift Student Challenge celebrates the creativity and ingenuity of student developers from around the world, inviting them to use Swift and Xcode to solve real-world problems in their own communities and beyond.
Learn more → https://developer.apple.com/swift-student-challenge/
Submissions for the 2026 challenge will open February 6 for three weeks, and students can prepare with new Develop in Swift tutorials and Meet with Apple code-along sessions.
The Apple Developer team is here to help you along the way, from idea to app. Post your questions at any stage of your development here in this forum board, or be sure to add the Swift Student Challenge tag to your technology-specific forum question.
Your designs. Your apps. Your moment.
I have no idea how to do this: on macOS, in Document.init(configuration: ReadConfiguration), I decode the file and restore objects from the data, which in some cases can take a long time. The Document isn't fully initialized at that point, so I have no access to it, but I would like to show a progress bar on screen (easier than waiting blindly, for now). I know the total size and the progress value, but I have no idea how to present a view from an object during init.
I know this question may be very stupid.
init(configuration: ReadConfiguration) throws {
    guard let data = configuration.file.regularFileContents else {
        throw CocoaError(.fileReadCorruptFile)
    }
    let decoder = JSONDecoder()
    let flat = try decoder.decode(FlatDoc.self, from: data)
    print("reverting \(flat.objects.count) objects...")
    /// This takes time, a lot of time, need progress bar
    /// Total is `flat.objects.count`, current is `objects.count`
    /// `Show me a way to the (next) progress bar!`
    revertObjects(from: flat.objects)
    print("...done")
}
Update: I defined a var flatObjects in Document, and I can convert it in .onAppear.
struct TestApp: App {
    @State var isLoading = false
    ...
    ContentView(document: file.$document)
        .onAppear {
            isLoading = true
            // The Task is fired off without being awaited,
            // so the next line runs immediately.
            Task { file.document.revertObjects() }
            isLoading = false
        }
    if isLoading {
        ProgressView()
    }
    ...
}
But the progress bar never shows, only the spinning rainbow ball.
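For what it's worth, a minimal sketch of the shape a fix usually takes: await the slow work so the loading flag stays set, and report progress back to the UI. This assumes revertObjects can be made async with a progress callback (the names below are illustrative, not from the original project):

struct DocumentLoadingView: View {
    @Binding var document: Document
    @State private var isLoading = false
    @State private var progress = 0.0 // fraction 0...1

    var body: some View {
        ZStack {
            ContentView(document: $document)
            if isLoading {
                ProgressView(value: progress)
            }
        }
        .task {
            isLoading = true
            // Awaiting keeps isLoading true until the conversion finishes;
            // progress updates hop back onto the main actor.
            await document.revertObjects { fraction in
                Task { @MainActor in progress = fraction }
            }
            isLoading = false
        }
    }
}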
Hi everyone,
I have a question about showcasing my Swift Student Challenge submission.
My app includes layouts optimized for iPhone, iPad, and Mac, and I’d like to highlight that it works well across all three device sizes.
Would it be acceptable to upload one combined screenshot that shows the same screen displayed on iPhone, iPad, and Mac side-by-side, to demonstrate the responsive design?
Thanks in advance for any advice!
Context: Xcode 16.4, AppKit
In a windowController, I need to create and send a mouseDown event (newMouseDownEvent).
I create the event with:
let newMouseDownEvent = NSEvent.mouseEvent(
    with: .leftMouseDown,
    location: clickPoint,
    // all other fields
I also need to make the window key and front; otherwise the event is not handled.
func simulateMouseDown() {
    self.window?.makeFirstResponder(self.window)
    self.calepinFullView.perform(#selector(NSResponder.self.mouseDown(with:)), with: newMouseDownEvent!)
}
As I have to delay the call (0.5 s), I use asyncAfter:
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5, qos: .userInteractive) {
    self.simulateMouseDown()
}
It works as intended, but that generates the following (purple) warning at runtime:
[Internal] Thread running at User-interactive quality-of-service class waiting on a lower QoS thread running at Default quality-of-service class. Investigate ways to avoid priority inversions
I have tried several solutions.
Changing the qos in asyncAfter:
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5, qos: .default)
Calling from a global DispatchQueue:
let interactiveQueue = DispatchQueue.global(qos: .userInteractive)
interactiveQueue.async {
    self.window?.makeFirstResponder(self.window)
    self.calepinFullView.perform(#selector(NSResponder.self.mouseDown(with:)), with: newMouseDownEvent!)
}
But that's worse, it crashes with the following error:
FAULT: NSInternalInconsistencyException: -[HIRunLoopSemaphore wait:] has been invoked on a secondary thread; the problem is likely in a much shallower frame in the backtrace; {
    NSAssertFile = "HIRunLoop.mm";
    NSAssertLine = 709;
}
How can I eliminate the priority inversion risk (not just silence the warning)?
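One structured-concurrency alternative that may sidestep the QoS mismatch entirely is to stay on the main actor and delay with Task.sleep instead of asyncAfter (a sketch, not verified against this exact view hierarchy):

// Delay on the main actor without hopping between QoS classes.
Task { @MainActor in
    try? await Task.sleep(for: .seconds(0.5))
    simulateMouseDown()
}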
I am working on a remote control application for macOS where I need to maintain two "virtual" cursors:
Remote Cursor: Follows the remote user's movements.
Local Cursor: Follows the local user's physical mouse/trackpad movements.
To move the system cursor (for the remote side), I use CGWarpMouseCursorPosition as follows:
void DualCursorMac::UpdateSystemCursorPosition(int x, int y) {
    CGPoint point = CGPointMake(static_cast<CGFloat>(x), static_cast<CGFloat>(y));
    // Warp the cursor to match remote coordinates
    CGWarpMouseCursorPosition(point);
}
Meanwhile, I use a CGEventTap to monitor local physical movements to update my local virtual cursor's UI:
CGEventRef Mouse::MouseTapCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon) {
    if (remoteControlMode) {
        // We want to suppress system cursor movement but still read the delta
        const int deltaX = static_cast<int>(CGEventGetIntegerValueField(event, kCGMouseEventDeltaX));
        const int deltaY = static_cast<int>(CGEventGetIntegerValueField(event, kCGMouseEventDeltaY));
        NSLog(@"MouseTapCallback: delta:(%d, %d)", deltaX, deltaY);
        // Update local virtual cursor UI based on deltas...
        return nullptr; // Consume the event
    }
    return event;
}
The Problem:
When CGWarpMouseCursorPosition is called frequently to update the system cursor, it interferes with the kCGMouseEventDeltaX/Y values in the Event Tap.
Specifically, if the local user moves the trackpad slowly (expecting deltas of 1 or 2), but a "Warp" occurs simultaneously (e.g., jumping the cursor from (100, 100) to (300, 300)), the deltaX and deltaY in the callback suddenly spike to very large values. It seems the system calculates the delta based on the new warped position rather than the pure physical displacement of the trackpad.
This makes the local virtual cursor "jump" erratically and makes it impossible to track smooth local movement during remote control.
My Question:
Is there a way to get the "raw" or "pure" physical relative movement (delta) from the trackpad/mouse that is independent of the system cursor's absolute position or warping?
Are there alternative APIs (perhaps IOKit or different CGEvent fields) that would allow me to get consistent deltas even when the cursor is being warped programmatically?
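Not an authoritative answer, but one workaround to experiment with: record the warp vector yourself and subtract it from the next delta the tap reports, so that only the physical movement remains. A rough Swift sketch (assumes warps and tap callbacks are serialized on one queue; names are illustrative):

import CoreGraphics

// Shared between the warping code and the tap callback.
var pendingWarpDelta = CGPoint.zero

func warpCursor(from current: CGPoint, to target: CGPoint) {
    pendingWarpDelta.x += target.x - current.x
    pendingWarpDelta.y += target.y - current.y
    CGWarpMouseCursorPosition(target)
}

func physicalDelta(from event: CGEvent) -> (dx: Double, dy: Double) {
    // Subtract whatever part of the delta the warp introduced.
    let dx = Double(event.getIntegerValueField(.mouseEventDeltaX)) - Double(pendingWarpDelta.x)
    let dy = Double(event.getIntegerValueField(.mouseEventDeltaY)) - Double(pendingWarpDelta.y)
    pendingWarpDelta = .zero
    return (dx, dy)
}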
Hi,
I tried to follow this guide:
https://developer.apple.com/documentation/networkextension/filtering-traffic-by-url
And this:
https://github.com/apple/pir-service-example
I already deployed the PIR service on my server, and set the configuration in the app like this:
{
    name = SimpleURLFilter
    identifier = xxxxx
    applicationName = SimpleURLFilter
    application = com.xxxx.SimpleURLFilter
    grade = 2
    urlFilter = {
        Enabled = YES
        FailClosed = NO
        AppBundleIdentifier = com.mastersystem.SimpleURLFilter
        ControlProviderBundleIdentifier = com.xxxx.SimpleURLFilter.SimpleURLFilterExtension
        PrefilterFetchFrequency = 2700
        pirServerURL = https://xxxxx/pir
        pirPrivacyPassIssuerURL = https://xxxxx/pir
        AuthenticationToken = AAAA
        pirPrivacyProxyFailOpen = NO
        pirSkipRegistration = NO
    }
}
But I got this error when I tried to enable the service on the app:
Received filter status change: <FilterStatus: 'stopped' errorMessage: 'The operation couldn’t be completed. (NetworkExtension.NEURLFilterManager.Error error 9.)'>
What does that error mean, and how can I fix it?
I have an app that records video (and also provides a custom remote interface) so it needs to remain awake and in the foreground.
It sets:
UIApplication.shared.isIdleTimerDisabled = true
I've also tried catching willEnterForegroundNotification to ensure it resets it if the app is backgrounded and resumes:
.onReceive(
    NotificationCenter.default.publisher(
        for: UIApplication.willEnterForegroundNotification)
) { _ in
    UIApplication.shared.isIdleTimerDisabled = true
}
However, it seems that on some devices the screen will still go to sleep. This appears to happen when Adaptive Power Mode is on (or rather, I've not managed to reproduce it when Adaptive Power Mode is off), even when the battery percentage is well over 20% (I had sort of expected Low Power Mode to trigger this).
Am I missing something obvious? There must be a way to make sure media capture apps stay awake (I'm surprised AVFoundation doesn't do it anyway!).
If it is related to Adaptive Power Mode, is there any way to detect that programmatically, to at least warn the user that having it on will affect operation of the app?
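As far as I know there is no public API for detecting Adaptive Power Mode specifically; the closest public check covers Low Power Mode, which can be read and observed like this (a sketch, offered only as a partial workaround):

import Foundation

// Low Power Mode (not Adaptive Power Mode) is exposed publicly.
var lowPower = ProcessInfo.processInfo.isLowPowerModeEnabled

// Changes are announced via a notification.
NotificationCenter.default.addObserver(
    forName: .NSProcessInfoPowerStateDidChange,
    object: nil,
    queue: .main
) { _ in
    lowPower = ProcessInfo.processInfo.isLowPowerModeEnabled
    // Warn the user here if power saving may interrupt capture.
}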
Description
I've encountered a consistent hang/freeze issue in SwiftUI applications when using nested LazyVStack containers with Accessibility Inspector (simulator) or VoiceOver (physical device) enabled. The application becomes completely unresponsive and must be force-quit.
Importantly, this hang occurs in a minimal SwiftUI project with no third-party dependencies, suggesting this is a framework-level issue with the interaction between SwiftUI's lazy view lifecycle and the accessibility system.
Reproduction Steps
I've created a minimal reproduction project available here:
https://github.com/pendo-io/SwiftUI_Hang_Reproduction
To Reproduce:
Create a SwiftUI view with the following nested LazyVStack structure:
struct NestedLazyVStackView: View {
    @State private var outerSections: [Int] = []
    @State private var innerRows: [Int: [Int]] = [:]

    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading, spacing: 24) {
                ForEach(outerSections, id: \.self) { section in
                    VStack(alignment: .leading, spacing: 8) {
                        Text("Section #\(section)")
                        // Nested LazyVStack
                        LazyVStack(alignment: .leading, spacing: 2) {
                            ForEach(innerRows[section] ?? [], id: \.self) { row in
                                Text("Section #\(section) - Row #\(row)")
                                    .onAppear {
                                        // Load more data when row appears
                                        loadMoreInner(section: section)
                                    }
                            }
                        }
                    }
                    .onAppear {
                        // Load more sections when section appears
                        loadMoreOuter()
                    }
                }
            }
        }
    }

    // loadMoreOuter() and loadMoreInner(section:) append data incrementally;
    // see the linked reproduction project for their full implementations.
}
Enable Accessibility Inspector in iOS Simulator:
Xcode → Open Developer Tool → Accessibility Inspector
Select your running simulator
Enable Inspection mode (eye icon)
Navigate to the view and start scrolling
Result: The application hangs and becomes unresponsive within a few seconds of scrolling
Expected Behavior
The application should remain responsive when Accessibility Inspector or VoiceOver is enabled, allowing users to scroll through nested lazy containers without freezing.
Actual Behavior
The application freezes/hangs completely
CPU usage may spike
The app must be force-quit to recover
The hang occurs consistently and is reproducible
Workaround 1: Replace inner LazyVStack with VStack
LazyVStack {
ForEach(...) { section in
VStack { // ← Changed from LazyVStack
ForEach(...) { row in
...
}
}
}
}
Workaround 2: Embed in TabView
TabView {
NavigationStack {
NestedLazyVStackView() // ← Same nested structure, but no hang
}
.tabItem { ... }
}
Interestingly, wrapping the entire navigation stack in a TabView prevents the hang entirely, even with the nested LazyVStack structure intact.
Questions for Apple
Is there a known issue with nested LazyVStack containers and accessibility traversal?
Why does wrapping the view in a TabView prevent the hang?
Are there recommended patterns for using nested lazy containers with accessibility support?
Is this a timing issue, a deadlock, or an infinite loop in the accessibility system?
Why does this happen?
Reproduction Project
A complete, minimal reproduction project is available at:
https://github.com/pendo-io/SwiftUI_Hang_Reproduction
My team is developing an enterprise VPN application that needs to respond to Mobile Device Management (MDM) profile installations and removals in real-time. Our app uses the NetworkExtension framework and needs to update the UI immediately when VPN configurations are added or removed via MDM.
We are currently observing NEVPNConfigurationChangeNotification to detect VPN configuration changes:
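A minimal sketch of that observation (our exact production code is longer):

import NetworkExtension

// Observe VPN configuration changes and reload preferences.
NotificationCenter.default.addObserver(
    forName: .NEVPNConfigurationChange,
    object: nil,
    queue: .main
) { _ in
    NEVPNManager.shared().loadFromPreferences { error in
        // Refresh the UI from the reloaded configuration here.
    }
}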
While NEVPNConfigurationChangeNotification fires reliably when users manually remove VPN profiles through Settings > General > VPN & Device Management, it appears to have inconsistent behavior when MDM profiles containing VPN configurations are installed programmatically via MDM systems.
STEPS TO REPRODUCE
1. From MDM Admin Console: deploy a new VPN profile to the test device.
2. On Device: wait for MDM profile installation (usually silent, no user interaction required).
3. Check Device Settings: go to Settings > General > VPN & Device Management to confirm the profile is installed.
4. Return to App: check if the UI shows the new VPN profile.
When using the new RealityKit Manipulation Component on Entities, indirect input will never translate the entity, no matter what settings are applied. Direct manipulation works as expected for both translation and rotation.
Is this intended behaviour? This is different from how indirect manipulation works on Model3D. How else can we get translation from this component?
visionOS 26 Beta 2
Build from macOS 26 Beta 2 and Xcode 26 Beta 2
Attached is replicable sample code, I have tried this in other projects with the same results.
var body: some View {
    RealityView { content in
        // Add the initial RealityKit content
        if let immersiveContentEntity = try? await Entity(named: "MovieFilmReel", in: reelRCPBundle) {
            ManipulationComponent.configureEntity(
                immersiveContentEntity,
                allowedInputTypes: .all,
                collisionShapes: [ShapeResource.generateBox(width: 0.2, height: 0.2, depth: 0.2)]
            )
            immersiveContentEntity.position.y = 1
            immersiveContentEntity.position.z = -0.5

            var mc = ManipulationComponent()
            mc.releaseBehavior = .stay
            immersiveContentEntity.components.set(mc)

            content.add(immersiveContentEntity)
        }
    }
}
I have a question regarding the behavior of AVAudioSession.sharedInstance().outputVolume.
Observed behavior:
When the app is in the foreground, I read audioSession.outputVolume (for example, 0.1).
The app is then moved to the background.
While the app is in the background, the user changes the system volume using the hardware buttons (for example, to 0.5).
When the app returns to the foreground, audioSession.outputVolume still reports the previous value (0.1).
From my testing, outputVolume only seems to update when the system volume is changed while the app is in the foreground. Volume changes made while the app is in the background are not reflected when the app returns to the foreground.
According to Apple’s documentation for AVAudioSession.outputVolume:
“The systemwide output volume set by the user.”
https://developer.apple.com/documentation/avfaudio/avaudiosession/outputvolume
However, based on our testing on iOS 18.6.2 and iOS 18.1, the observed behavior seems to differ from this description.
Questions:
The documentation states that outputVolume represents the system-wide volume set by the user. In our testing, the value does not reflect volume changes made while the app is in the background and only updates when the app is in the foreground. Is this the expected behavior of AVAudioSession.outputVolume?
Is there any other recommended way in Swift to retrieve the current system volume that reflects user changes made both while the app is in the foreground and while it is in the background?
Any clarification on the intended behavior or recommended handling would be greatly appreciated.
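For what it's worth, a pattern worth testing (not confirmed to fix the stale value): re-read outputVolume after re-activating the session on foregrounding, and use key-value observing, which the property is documented to support, while the app is active. A sketch:

import AVFAudio

final class VolumeWatcher {
    private var observation: NSKeyValueObservation?

    func refreshOnForeground() {
        let session = AVAudioSession.sharedInstance()
        // Re-activating the session on foregrounding may refresh the value.
        try? session.setActive(true)
        print("Current volume:", session.outputVolume)

        // outputVolume is documented as KVO-observable.
        observation = session.observe(\.outputVolume, options: [.new]) { _, change in
            print("Volume changed to:", change.newValue ?? -1)
        }
    }
}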
In an NSTableView (AppKit), I need to colour a cell background when it is selected.
That works OK, except that the colour does not span the full cell width, nor even the text itself:
The tableView structure is very basic:
I see there is a TextCell at the end that cannot be deleted. What is this?
And the colouring as well:
func tableView(_ tableView: NSTableView, viewFor tableColumn: NSTableColumn?, row: Int) -> NSView? {
    let p = someDataSource[row]
    if let cellView = tableView.makeView(withIdentifier: NSUserInterfaceItemIdentifier(rawValue: "Cell"), owner: self) {
        // backgroundColor is assumed to come from a custom NSTableCellView
        // subclass; plain NSTableCellView has no such property.
        (cellView as! NSTableCellView).textField?.stringValue = p
        if selected[row] {
            (cellView as! NSTableCellView).backgroundColor = theLightBlueColor
        } else {
            (cellView as! NSTableCellView).backgroundColor = .clear
        }
        return cellView
    }
    return nil
}
I've tried to change size constraints in many ways, to no avail.
For instance, I changed Layout to Autoresizing.
I tried to change the TableCellView size to 170 vs 144.
Or increased the tableColumn width.
I have looked at which other objects in the NSTableView hierarchy should be coloured, without success.
Nothing works.
What am I missing?
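Not from the original post, but the standard AppKit route is to let the row view draw the selection, which spans the full row width, rather than colouring the cell view. A sketch:

import Cocoa

// Custom row view that draws the selection across the full row.
final class MyRowView: NSTableRowView {
    override func drawSelection(in dirtyRect: NSRect) {
        guard selectionHighlightStyle != .none else { return }
        NSColor.systemBlue.withAlphaComponent(0.2).setFill()
        bounds.fill()
    }
}

// In the table view's delegate:
func tableView(_ tableView: NSTableView, rowViewForRow row: Int) -> NSTableRowView? {
    MyRowView()
}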
Hello, developers. I need your help: do you know how to attach an animation to an appearance change, like a smooth transition from dark to light and vice versa? My code is here:
@main
struct The_Library_of_BabelonApp: App {
    @AppStorage("selectedAppearance") private var selectedAppearance = 0
    @StateObject private var router = AppRouter()

    var scheme: ColorScheme? {
        if selectedAppearance == 1 { return .light }
        if selectedAppearance == 2 { return .dark }
        return nil
    }

    var body: some Scene {
        WindowGroup {
            RootView()
                .preferredColorScheme(scheme)
                .environmentObject(router)
                // this doesn't work correctly
                .animation(.smooth(duration: 2), value: selectedAppearance)
        }
    }
}
And my appearance switching looks like this:
struct SettingsView: View {
    @AppStorage("selectedAppearance") private var selectedAppearance = 0

    var body: some View {
        List {
            Section(header: Text("Appearance")) {
                HStack(spacing: 20) {
                    ThemePreview(title: "Light", imageName: "lightTheme", tag: 1, selection: $selectedAppearance)
                    ThemePreview(title: "Dark", imageName: "darkTheme", tag: 2, selection: $selectedAppearance)
                    ThemePreview(title: "System", imageName: "systemMode", tag: 0, selection: $selectedAppearance)
                }
                .padding(.vertical, 10)
                .frame(maxWidth: .infinity)
            }
        }
    }
}
struct ThemePreview: View {
    let title: String
    let imageName: String
    let tag: Int
    @Binding var selection: Int

    var body: some View {
        Button {
            selection = tag
        } label: {
            VStack {
                Image(imageName)
                    .resizable()
                    .aspectRatio(contentMode: .fill)
                    .frame(width: 120, height: 80)
                    .clipShape(RoundedRectangle(cornerRadius: 12))
                    .overlay(
                        RoundedRectangle(cornerRadius: 12)
                            .stroke(selection == tag ? Color.blue : Color.clear, lineWidth: 3)
                    )
                Text(title)
                    .font(.caption)
                    .foregroundColor(selection == tag ? .blue : .primary)
            }
        }
        .buttonStyle(.plain)
    }
}
I guess my code works, but the animation behaves in a different way: it animates my Section instead, I don't know... Thank you in advance.
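For what it's worth, .animation on the scheme value generally doesn't animate a preferredColorScheme switch; one thing to try (no guarantee it helps here) is driving the change through withAnimation where the selection is set:

Button {
    // Wrap the state change so dependent views animate where possible.
    withAnimation(.smooth(duration: 2)) {
        selection = tag
    }
} label: {
    Text(title)
}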
When inputting data within a sheet, I allow the user to take a photo, so the camera is called and presented within a second sheet. However, the controls are centered within the iPhone's entire screen, cropping the top controls and not extending down to the bottom of the screen.
Any help on how to fix this?
I’m using the Network framework with UDP and calling:
connection.receive(minimumIncompleteLength: 1,
                   maximumLength: 1500) { data, context, isComplete, error in
    ... // Some logic
}
Is it possible for this completion handler to be called with data==nil if I haven't received any kind of error, i.e., error==nil and the connection is still in the .ready state?
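I don't know the definitive contract, but a defensive handler that tolerates that combination might look like this (a sketch; assumes a UDP NWConnection):

import Network

func receiveNext(on connection: NWConnection) {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 1500) { data, _, isComplete, error in
        if let error {
            print("receive failed:", error)
            return
        }
        if let data, !data.isEmpty {
            // Process the datagram here.
            print("got \(data.count) bytes")
        } else {
            // Defensively treat nil/empty data with no error as a no-op.
            print("empty completion (isComplete: \(isComplete))")
        }
        // Re-arm for the next datagram while the connection is usable.
        if case .ready = connection.state {
            receiveNext(on: connection)
        }
    }
}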
I am developing an iOS app for a Bluetooth peripheral using SwiftUI with Swift 5 or 6. I have a few past attempts that got that far (connected to a peripheral), and some downloaded examples that connect to peripherals. Lately (in the last month or so), my current attempt never gets BleManager to start, and every attempt ends at my view that says 'please enable Bluetooth'.
The Xcode console is totally blank with no print outputs.
Coding Assistant suggested the init() in my @main structure could contain print("App initializing"), but even that never prints.
Coding Assistant suggests:
"• Open your project's Info.plist in Xcode.
• Make sure UIApplicationSceneManifest is present and configured for SwiftUI, not referencing any storyboard.
• Ensure UIMainStoryboardFile is not present (or blank)."
but there is no Info.plist because it is no longer required.
Downloaded sample code runs and connects to peripherals, so Bluetooth is working on my iPhone and the Bluetooth device is accessible. My older attempts used to work, but now have the same problem.
All attempts have "Enable Bluetooth to connect to Device" in the Privacy - Bluetooth Info.plist setting.
Something is fundamentally wrong with many different code attempts.
I have searched all the various settings for mention of SwiftUI or Storyboard, but not found them in working or failing projects.
The downloaded code which works has minimum deployment iOS 14.0 and Swift Compiler Language Version Swift 5.
My latest code attempt has minimum deployment iOS 16 and Swift 5.
All code targets iPhone (I am testing on an iPhone 16e running iOS 26.2.1), developing with Xcode 26.2 on a MacBook Air M1 running the latest Tahoe.
I do a Clean Build Folder before every test, and have tried re-starting both Mac and iPhone.
How can my coding fail so spectacularly?
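For comparison, a minimal manager that should at least print a state change when Bluetooth is available (a sketch; BleManager here is illustrative, not the original class):

import CoreBluetooth
import SwiftUI

final class BleManager: NSObject, ObservableObject, CBCentralManagerDelegate {
    @Published var state: CBManagerState = .unknown
    private var central: CBCentralManager!

    override init() {
        super.init()
        // Creating the manager triggers the first state callback.
        central = CBCentralManager(delegate: self, queue: nil)
        print("BleManager initialized")
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        print("Central state:", central.state.rawValue)
        state = central.state
    }
}

@main
struct BleTestApp: App {
    @StateObject private var ble = BleManager()

    init() { print("App initializing") }

    var body: some Scene {
        WindowGroup {
            Text(ble.state == .poweredOn ? "Bluetooth ready" : "Please enable Bluetooth")
        }
    }
}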
Hello,
After upgrading to macOS 26.2, I’ve noticed a significant performance regression when calling evaluateJavaScript in an iOS App running on Mac (WKWebView, Swift project).
Observed behavior
On macOS 26.2, the callback of evaluateJavaScript takes around 3 seconds to return.
This happens not only for:
evaluateJavaScript("navigator.userAgent")
but also for simple or even empty scripts, for example:
evaluateJavaScript("")
On previous macOS versions, the same calls typically returned in ~200 ms.
Additional testing
I created a new, empty Objective-C project with a WKWebView and tested the same evaluateJavaScript calls.
In the Objective-C project, the callback still returns in ~200 ms, even on macOS 26.2.
Question
Is this a known issue or regression related to:
iOS Apps on Mac,
Swift + WKWebView, or
behavioral changes in evaluateJavaScript on macOS 26.2?
Any information about known issues, internal changes, or recommended workarounds would be greatly appreciated.
Thank you.
Test Code Swift
class ViewController: UIViewController {
    private var tmpWebView: WKWebView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        setupUserAgent()
    }

    func setupUserAgent() {
        let t1 = CACurrentMediaTime()
        tmpWebView = WKWebView(frame: .zero)
        tmpWebView?.isInspectable = true
        tmpWebView?.evaluateJavaScript("navigator.userAgent") { [weak self] result, error in
            let t2 = CACurrentMediaTime()
            print("[getUserAgent] \(t2 - t1)s")
            self?.tmpWebView = nil
        }
    }
}
Test Code Objective-C
- (void)scene:(UIScene *)scene willConnectToSession:(UISceneSession *)session options:(UISceneConnectionOptions *)connectionOptions {
    NSTimeInterval startTime = [[NSDate date] timeIntervalSince1970];
    WKWebView *webView = [[WKWebView alloc] init];
    dispatch_async(dispatch_get_main_queue(), ^{
        [webView evaluateJavaScript:@"navigator.userAgent" completionHandler:^(id result, NSError *error) {
            NSTimeInterval endTime = [[NSDate date] timeIntervalSince1970];
            NSLog(@"[getUserAgent]: %.2f s", (endTime - startTime));
        }];
    });
}