Hi :) I'm new to App Store Connect, and I just want to verify: what does it take to test subscriptions in the sandbox for a new app that hasn't been approved yet? Or is this not possible until the app is approved?
More context below:
My app is a new app; I only submitted it for review, and I linked the subscription from the app's In-App Purchases and Subscriptions section on the version page when submitting it for review. It has been rejected for now.
With the app review status both In Review and Rejected, I've tried to test my subscription. There is a button (like "Subscribe"/"Become a member") in my app that the user can tap, which calls iOS's IAPProvider.startMembershipPurchase, and I just get Error: [IAPService] Product not found: [<my_subscription_id>].
I ensured that my subscription's product ID in App Store Connect matches the one in my code.
I can see the "rejected" status both on my app and the subscription.
So can anyone clarify whether the app has to be approved first in order to test subscriptions? Or am I missing some other setup? Or might it just be my code?
Thanks in advance! Any info is super helpful!
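For what it's worth, here is a minimal StoreKit 2 sketch of the kind of lookup I'm doing (the product ID is a placeholder, and IAPProvider is assumed to wrap something like this); it at least surfaces the underlying StoreKit error instead of a generic "product not found":

```swift
import StoreKit

// Sketch: fetch one subscription product and log why the lookup failed.
// "com.example.myapp.monthly" is a placeholder product ID.
func loadSubscription() async {
    do {
        let products = try await Product.products(for: ["com.example.myapp.monthly"])
        if let product = products.first {
            print("Found product: \(product.id), price: \(product.displayPrice)")
        } else {
            // An empty result usually means the product ID doesn't match,
            // the relevant agreements aren't in place, or the product
            // hasn't propagated to the sandbox yet.
            print("No products returned for that ID")
        }
    } catch {
        print("StoreKit request failed: \(error)")
    }
}
```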
Hi everyone,
I’m building a messaging app because I’ve seen firsthand how much support and safety is overlooked for this generation online. My goal is to give teens a foundation of security, privacy, and mental health support, while still letting them connect freely. I want to leverage Apple’s platform to help this mission reach the right audience and have real impact.
The app already includes:
Community chat with message blurring for sensitive or harmful words.
Anti-shoulder surfing tools to protect private conversations.
Shake dashboard for quick access to emergency services.
In-chat locks with ML detection for grooming patterns, offering resources while respecting privacy.
Full user control: messages can be deleted anytime, blocking is permanent, and accounts can’t bypass restrictions on the same device.
User consent-first design: every feature is opt-in and controlled by the user.
At this point, I’m looking for guidance on how to position and prepare the app to reach Apple editorial or headline attention — what steps or best practices help mission-driven apps get noticed for features, WWDC spotlights, or App Store promotion? My focus isn’t just on improving the app, but on launch strategy and visibility in a way that amplifies the mission responsibly.
If it’s helpful, I can share a TestFlight build or walkthrough to illustrate the app in action.
Thank you for any insights or advice — I want to make sure this mission has the best chance to reach and support the generation it’s built for.
Topic:
App Store Distribution & Marketing
SubTopic:
General
Tags:
Privacy
SwiftUI
Prototyping
Machine Learning
I have the following snippet (you can see my entire code on GitHub if you want):
LazyVGrid(columns: columns) {
    ForEach(books) { book in
        BookView(book: book)
            .draggable(Book.BookTransferable(persistanceIdentifier: book.id))
    }
}
and BookView is:
VStack {
    Image(nsImage: book.image)
        .resizable()
        .frame(width: 150, height: 200)
        .scaledToFill()
    Text(book.title)
        .lineLimit(1)
        .font(.headline)
    HStack {
        ForEach(book.tags.sorted(), id: \.self) { tag in
            TagView(tag: tag, showText: false)
        }
    }
}
.padding()
This renders each BookView at a different baseline, because the Text view sometimes takes 1, 2, or even 3 lines (as shown).
How can I have all BookViews aligned at a common baseline (as they would be if the Text only ever took one line, for example)?
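One workaround I'm considering (a sketch; the fixed height and top alignment are assumptions, not values from my code) is to pin every card to the same frame height so varying title heights no longer shift the rest of the card:

```swift
// Sketch: give every card the same height so titles with different
// line counts don't change the card's overall size. The 260-point
// height and .top alignment are assumptions.
BookView(book: book)
    .frame(height: 260, alignment: .top)
    .draggable(Book.BookTransferable(persistanceIdentifier: book.id))
```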
I need help designing the image of a ControlWidgetToggle.
Do I understand correctly that I can only use an SF Symbol as the image, and not a custom image (unless it's set up as a custom SF Symbol)?
Is there any way to influence the size of the image? I tried multiple SwiftUI modifiers (.imageScale, .font, .resizable, .controlSize), but none of them seem to work; my image remains too tiny.
The image size of the on and off states is different, which seems to be enforced by the system. Is there any way to make both images use the same size?
The on state tints the image. Is there a way to set the tint color? .tint and .foregroundStyle seem to be ignored.
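For context, here is roughly what I have (a simplified sketch; LampState, ToggleLampIntent, the kind string, and the SF Symbol names are placeholders, not my real code):

```swift
import SwiftUI
import WidgetKit

// Simplified sketch of a control with per-state SF Symbols.
// ToggleLampIntent stands in for a real SetValueIntent implementation.
struct LampControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.lamp") {
            ControlWidgetToggle("Lamp", isOn: LampState.isOn, action: ToggleLampIntent()) { isOn in
                // The system decides the rendered symbol size and tints
                // the on state; this is where I'd want more control.
                Image(systemName: isOn ? "lightbulb.fill" : "lightbulb")
            }
        }
    }
}
```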
Thank you for your help
I had a VoiceOver user point out an issue with my app that I’ve definitely known about but have never been able to fix. I thought that I had filed feedback for it but it looks like I didn’t.
Before I do, I'm hoping someone has some insight. With Swift Charts, when I tap part of a chart, it summarizes the three hours, and then you can swipe vertically to hear the details of each hour read out. For example, the Y axis is the amount of precipitation for the hour and the X axis is the hours of the day. The units aren't being read in the summary, but they are for individual hours when you swipe vertically.
The summary says something such as "varies between 0.012 and 0.082". In the AXChartDescriptor I’ve tried everything I can think of, including adding a label to the Y axis in the DataPoint but nothing seems to work in getting that summary to include units. With a vertical swipe it seems to just be using my accessibility label and value (like I would expect).
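For reference, this is roughly how I'm building the Y axis (a simplified sketch; the title, range, and "inches" unit string are examples rather than my exact values):

```swift
import Accessibility

// Sketch of the Y axis descriptor with units in the value description.
// VoiceOver reads the provider's string for individual data points, but
// the tapped summary ("varies between ...") still omits the units.
let yAxis = AXNumericDataAxisDescriptor(
    title: "Precipitation (inches)",
    range: 0...0.1,
    gridlinePositions: []
) { value in
    "\(value) inches"
}
```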
Hi!
I would like to experiment with the Accessory Setup Kit, but I am not sure how I can add the required entitlement to my Apple Developer account.
I added the following to the plist file:
NSAccessorySetupKitSupports (array):
    WiFi
NSAccessorySetupWiFiSSIDPrefix (string):
    test_
And the following to the entitlements file:
com.apple.developer.accessory-setup-kit
But I am unable to build, as my provisioning profile doesn't include the "com.apple.developer.accessory-setup-kit" entitlement.
On the developer portal, I navigated to “Certificates, Identifiers & Profiles” > “Identifiers” > “”, but I don't see it under "Capabilities" or "App services", nor under requests.
How do I configure my profile so that I am able to make use of this kit?
Topic:
Developer Tools & Services
SubTopic:
Apple Developer Program
I need to detect whether a view controller is presented in a popover or in fullscreen mode, as on iPhone.
I checked viewController.popoverPresentationController but it returns a non-nil value even on iPhone, when it's clearly not in a popover.
I then checked viewController.presentationController?.adaptivePresentationStyle but it returns .formSheet even when it's presented in a popover!?! Why?
This whole adaptive presentation thingie is a mess. Heck, viewController.presentationController returns _UIPageSheetPresentationController even when the view controller is in a UINavigationController, so not presented at all.
Anybody got any ideas?
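One heuristic that may help (a sketch, not a documented guarantee): the adaptive style only seems to be resolved once the view is actually on screen, so checking the popover's resolved arrow direction in viewDidAppear can distinguish the two cases:

```swift
import UIKit

final class DetailViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Heuristic: once on screen, a real popover has a resolved
        // arrow direction; an adapted full-screen/sheet presentation
        // typically does not.
        if let popover = popoverPresentationController,
           popover.arrowDirection != .unknown {
            print("Displayed as a popover")
        } else {
            print("Displayed full screen or as a sheet")
        }
    }
}
```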
As the title says, on the Apple Developer site under Certificates, Identifiers & Profiles, all my devices are set to inactive, and I cannot enable them again because my device entries are glitched: I have 2 devices, and each appears there twice with the same UDID. So I cannot enable a single one. I'm hit with
"A device with number 'UDID number' already exists on this team." I have an iPad Pro and an iPhone 17 Pro Max, and I cannot sign any apps to them at all, or do anything, because my devices are set to inactive. I've contacted Apple numerous times and can't seem to get any help regarding this issue. I've called and emailed, and the developer team has yet to respond to me. If anyone has had this issue, please tell me how you solved it.
I also cannot erase any devices myself until my renewal date.
I've tried manually adding my devices, and I get the same message that my devices already exist on the team. I've signed out of my Apple account on my devices, changed my password, and turned developer mode off and back on numerous times. I really don't know what to do.
I have a tester who switched phones a few months back. Crucially, he no longer has access to the old phone. On his new phone, when I send him a TestFlight invite to his Apple account, 123@example, he receives it, but it fails to work when he clicks the link. He gets the following message:
The Apple account (123@example) you're currently signed in with does not match the one (456@example) associated with this invitation.
He has no idea whose email 456 is. Since his old phone was a hand-me-down, I'm assuming it's someone else's email entirely. I'm trying to figure out how we clear this association out. I tried deleting his App Store Connect user for my app and reinviting him all the way from the beginning. That did not work.
As a stopgap I'm giving him a public link under an external testing group but I'd really like to not have to do that.
Does anyone have any ideas on how we can fix this so I can shift him back to the internal testing group?
Hello Apple Developer Technical Support,
I’m following up on case #102807413324 and submitting this as a code-level support request.
We are integrating iOS Live Activities (ActivityKit + WidgetKit extension written in SwiftUI) into an Expo/React Native app. We’re seeing behavior where the Live Activity UI shown on the Lock Screen appears to “stick” to an older layout and ignores updated SwiftUI code and/or bundled assets, even after rebuilding, reinstalling, and removing existing Live Activities before testing again.
Environment
Device: iPhone 13
iOS: 26.2
macOS: 15.7.3 (24G419)
Xcode: 16.4 (16F6)
Expo SDK: 52
React Native: 0.76.9
expo-live-activity: ^0.4.2
Build type: Ad-Hoc signed IPA (EAS local build)
Summary
We have a WidgetKit extension target (LiveActivity.appex, bundle id: stimul8.LiveActivity) using ActivityConfiguration(for: LiveActivityAttributes.self).
The extension contains multiple SwiftUI views selected via a “route” (derived from deepLinkUrl / title / subtitle), and uses images/backgrounds from the extension asset catalog (Assets.xcassets). We also support loading images from an App Group container with a fallback to the asset catalog.
After shipping updates, the Live Activity UI shown on the Lock Screen continues to resemble an older/default layout (example: a progress-bar-like element remains visible even after removing ProgressView usage from LiveActivityView.swift). Some custom backgrounds/images also fail to display as expected.
Routing (examples)
/streak -> StreakLiveActivityView
/streak-urgent -> StreakUrgentLiveActivityView
/lesson/create -> AILessonLiveActivityView1
/lesson/reminder -> AILessonLiveActivityView2
default -> LiveActivityView
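The routing above is implemented roughly like this inside the extension (a simplified sketch; the attribute name and view initializers approximate our real code, and the required dynamicIsland closure is omitted for brevity):

```swift
import ActivityKit
import WidgetKit
import SwiftUI

// Sketch of our route-based view selection inside ActivityConfiguration.
// `deepLinkUrl` and the view signatures approximate the real project.
ActivityConfiguration(for: LiveActivityAttributes.self) { context in
    let route = context.attributes.deepLinkUrl
    switch route {
    case "/streak":          StreakLiveActivityView(context: context)
    case "/streak-urgent":   StreakUrgentLiveActivityView(context: context)
    case "/lesson/create":   AILessonLiveActivityView1(context: context)
    case "/lesson/reminder": AILessonLiveActivityView2(context: context)
    default:                 LiveActivityView(context: context)
    }
}
```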
Steps to reproduce (high-level)
Install/build and trigger a Live Activity.
Modify the SwiftUI layout in the extension (e.g., remove ProgressView and change obvious UI elements), rebuild, and reinstall.
Remove any existing Live Activities from the Lock Screen, then trigger a new Live Activity again.
Observed: Lock Screen Live Activity still renders the prior/older-looking UI and/or ignores updated assets.
Troubleshooting already done
Verified the extension (LiveActivity.appex) is included in the IPA and properly signed.
Verified Assets.car is present in the extension and PNG assets are present in the build artifacts.
Ensured SwiftUI source files used by the extension are overwritten during prebuild so the intended versions are present in ios/LiveActivity.
Cleared DerivedData related to LiveActivity builds.
Reinstalled the app and removed existing Live Activities from the Lock Screen before re-triggering new ones.
Questions
Is there any known caching behavior where Live Activities can continue to display a previous UI layout after an app/extension update, even when the activity is re-created?
Are there recommended steps to force the system to load the newest widget extension binary/UI beyond reinstalling and removing existing Live Activities?
What’s the recommended way to confirm which exact extension binary/UI version is being rendered on-device (e.g., specific Console logs, sysdiagnose signals, or other indicators)?
Are there any known constraints with Assets.xcassets usage for Live Activities that could cause bundled assets not to render even when present?
We can provide
A minimal reproduction Xcode project (preferred)
The IPA build
Build logs (Xcode/EAS)
Screenshots/video and a sysdiagnose captured after reproduction
Thank you for your guidance.
Best regards
Topic:
App & System Services
SubTopic:
Widgets & Live Activities
SKProductsRequest always returns prices in USD instead of the local currency in the debug environment, and sometimes it even fails. This only happens in debug or TestFlight builds.
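One detail worth ruling out first (a sketch): the displayed price should always be formatted with the product's own priceLocale rather than Locale.current, otherwise correct StoreKit data can still render as the wrong currency:

```swift
import StoreKit

// Sketch: format a product's price using its own priceLocale, so the
// currency shown matches what StoreKit actually returned for this
// storefront (sandbox accounts often resolve to a US storefront).
func displayPrice(for product: SKProduct) -> String? {
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.locale = product.priceLocale
    return formatter.string(from: product.price)
}
```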
Hello Apple Developer Support,
We are observing inconsistent behavior with push notification sounds routing to Bluetooth / external speakers.
Our app sends push notifications with a custom sound file using the sound parameter in the APNs payload. When an iPhone is connected to a Bluetooth speaker or headphones:
On some devices, the notification sound plays through the connected Bluetooth/external speaker.
On other devices, the notification sound plays only through the iPhone’s built-in speaker.
We also tested with native apps like iMessage and noticed similar behavior — in some cases, notification sounds still play through the phone speaker even when Bluetooth is connected.
Media playback (e.g., YouTube or Music) routes correctly to Bluetooth, so the connection itself is functioning properly.
We would like clarification on the following:
Is this routing behavior expected for push notification sounds?
Are notification sounds intentionally restricted from routing to Bluetooth in certain conditions (e.g., device locked, system policy, audio session state)?
Is there any supported way to ensure notification sounds consistently route through connected Bluetooth/external speakers?
The inconsistent behavior across devices makes it difficult to determine whether this is by design or a configuration issue.
Thank you for your guidance.
I purchased the Apple Developer Program membership on 13th February 2026, and still (17th Feb) haven't received access yet. How long does this process usually take? Is there anything I can do, or anything needed from my end, to speed this up?
Hi, this is my third time posting here, and I have already opened 3 support tickets through Apple Developer Support (so please don't ask me to open another ticket there; I have also sent them follow-ups). It has been more than 2 weeks and they are not even trying to respond. I have been having an issue with App Transfer; it says:
"You can't transfer this app because of the following reasons:
App Transfers Disabled for This Account
Due to irregular activity associated with your account, you cannot transfer or receive apps at this time. If you think this is an error, contact Developer Support.
"
I haven't received any warning or any policy violation notice, and I am able to upload apps for public release; only App Transfer and TestFlight are having issues.
Case IDs:
102817552619
20000107793589
102823905165
Please, please respond to me and resolve my issue; it is hurting my business.
I am encountering an issue where the Lock Screen Quick Action fails to visibly open my app.
My app is a camera application that utilizes a CameraCaptureIntent to launch a standalone, lightweight camera view (accessible while the device is locked), distinct from the main application.
Steps to Reproduce:
Open the lightweight camera view using the Lock Screen Quick Action.
From this view, launch the Main App.
Lock the iPhone (put it to sleep).
Attempt to launch the lightweight camera view via the Quick Action again.
A slight animation occurs, but the camera view does not appear on screen. After multiple tests, it seems the view is actually launching but remains in an "invisible state."
I suspect that the system hides the lightweight camera view when transitioning to the Main App, but fails to reset this hidden state when the Quick Action is triggered subsequently.
I would appreciate any guidance on a potential workaround or confirmation if this is a known issue awaiting a system update.
Hi @DTS Engineer
in the tvOS 26.2 beta this annoying shadow glitch is still present… 😔
I have submitted a bug report but haven't gotten an answer… FB18719371
The animation is not smooth and the shadow abruptly "jumps"…
I don't get any response from the Apple engineers, but this GUI glitch makes the otherwise very high-quality tvOS GUI appear very unprofessional.
Could you please help me? 🤔
Topic:
Community
SubTopic:
Apple Developers
Our children's educational app (COPPA-compliant) was rejected under Guidelines 5.1.1(i) and 5.1.2(i) for sharing personal data with a third-party AI service without clear disclosure and user permission.
How our app works:
Our app is an AI-powered learning assistant for kids. Children type questions (e.g., "Why is the sky blue?") and the app sends the question to Google Gemini's API to generate an age-appropriate answer. This is the core and only purpose of the app — it's an AI chat app, similar to how a search engine sends queries to its servers.
Our current setup:
Google Gemini operates as a data processor (not a data recipient) — zero data retention, no model training on user data
Our privacy policy already discloses Google Gemini as the AI provider, what data is processed, and that no data is stored
The app is clearly marketed as an AI-powered assistant — users understand they are interacting with AI
Our questions:
1. Infrastructure vs. data sharing: We use Google Gemini to process queries the same way we use Google Sign-In for authentication, MongoDB Atlas for our database, and Railway for hosting. In all cases, user data passes through a third-party service to provide core functionality. Is the expectation that AI services require additional consent beyond what's expected for other third-party infrastructure services? If so, what distinguishes them?
2. Minimum consent implementation: If in-app consent is required, what constitutes sufficient "explicit permission"? Specifically:
- Is a simple alert dialog (similar to the ATT prompt) with "Allow" / "Not Now" before first use sufficient?
- Or is a more detailed consent screen with checkboxes/toggles required?
- Since our app's sole purpose is AI-powered Q&A, what should happen if the user taps "Not Now"? The app cannot function without the AI service.
3. Privacy policy disclosure: Our privacy policy already identifies Google Gemini by name, describes what data is sent (child's questions, name, and age for personalization), and explains Google's zero-retention policy. Is updating the privacy policy alone sufficient, or is a separate in-app consent mechanism always required under 5.1.2(i)?
4. Children's apps specifically: Since parents set up the app (behind a parental gate), should the consent be presented to the parent during setup, or does it need to appear elsewhere?
Any guidance on the minimum compliant implementation would be greatly appreciated. We want to get this right.
Thank you.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
In an NSTableView (Appkit), I need to colour a cell background when it is selected.
That works OK, except that the colour spans neither the full cell width nor even the text itself:
The tableView structure is very basic:
I see there is a TextCell at the end that cannot be deleted. What is this?
And the colouring as well:
func tableView(_ tableView: NSTableView, viewFor tableColumn: NSTableColumn?, row: Int) -> NSView? {
    let p = someDataSource[row]
    if let cellView = tableView.makeView(withIdentifier: NSUserInterfaceItemIdentifier(rawValue: "Cell"), owner: self) as? NSTableCellView {
        cellView.textField?.stringValue = p
        if selected[row] {
            cellView.backgroundColor = theLightBlueColor
        } else {
            cellView.backgroundColor = .clear
        }
        return cellView
    }
    return nil // missing in the original; every path must return a value
}
I've tried to change size constraints in many ways, to no avail.
For instance, I changed Layout to Autoresizing:
I tried to change TableCellView size to 170 vs 144:
Or increased the tableColumn width.
I have looked at what other object in the NSTableView hierarchy should be coloured without success.
Nothing works.
What am I missing ?
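One approach that may be worth trying (a sketch, not from the original project): draw the selection in a custom NSTableRowView instead of colouring the cell view, since the row view spans the full row width by design:

```swift
import AppKit

// Sketch: a row view that draws its own selection fill across the
// entire row, instead of relying on the cell view's background.
final class ColoredRowView: NSTableRowView {
    override func drawSelection(in dirtyRect: NSRect) {
        guard selectionHighlightStyle != .none else { return }
        NSColor.systemBlue.withAlphaComponent(0.3).setFill()
        bounds.fill()
    }
}

// In the table view delegate, return the custom row view:
// func tableView(_ tableView: NSTableView, rowViewForRow row: Int) -> NSTableRowView? {
//     ColoredRowView()
// }
```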
Hello.
To determine whether the "AVB/EAV Mode" of an AV-capable network interface is turned on or off, I query the IO registry and evaluate the property "AVBControllerState".
I was wondering if this is the "correct" approach and if there is anything known about the values for this property?
Network interfaces without AV capability may also carry this property (e.g., my Wi-Fi adapter has the value 1), whereas the value for interfaces with AV capability can be 0 or 3, at least as far as I could observe with the limited number of test devices at hand.
Is it safe to assume that a value of 3 means this feature is turned on, 0 that it is turned off and ignore values of 1?
Is there another approach to get to know the status of the "AVB/EAV Mode"?
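For reference, this is roughly how I read the property (a sketch with error handling trimmed; the key name and the meaning of its values are my own observations, not documented API, and kIOMasterPortDefault is spelled kIOMainPortDefault on newer SDKs):

```swift
import Foundation
import IOKit

// Sketch: look up the IOKit service for a BSD interface name (e.g. "en0")
// and read the undocumented "AVBControllerState" property from it or
// its parents in the IOService plane.
func avbControllerState(bsdName: String) -> Int? {
    guard let matching = IOBSDNameMatching(kIOMasterPortDefault, 0, bsdName) else {
        return nil
    }
    let service = IOServiceGetMatchingService(kIOMasterPortDefault, matching)
    guard service != 0 else { return nil }
    defer { IOObjectRelease(service) }
    let prop = IORegistryEntrySearchCFProperty(
        service, kIOServicePlane, "AVBControllerState" as CFString,
        kCFAllocatorDefault,
        IOOptionBits(kIORegistryIterateRecursively | kIORegistryIterateParents))
    return prop as? Int
}
```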
Thanks for any insight.
Best regards,
Ingo
I purchased the Apple Developer Program 4 days ago. However, I cannot access it, and I haven't heard anything from the Apple team about where I went wrong or whether my account has any problems.