My app is a camera app that supports Picture-in-Picture (PiP) mode.
Normally, when the device rotates, I get the device orientation from iOS and use it to rotate the camera feed so that the preview stays correctly aligned.
However, when the app enters PiP mode, it is considered to be in the background, and I can no longer receive orientation updates from the system.
As a result, I can’t apply rotation corrections to the camera video in PiP mode.
Is there any way to retrieve device orientation while the app is in the background (specifically during PiP mode)?
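For context, the fallback I've been experimenting with is deriving a coarse orientation from CoreMotion accelerometer data instead of UIDevice notifications. This is only a sketch, and I haven't verified that motion updates keep arriving in every PiP scenario:

import CoreMotion
import UIKit

let motionManager = CMMotionManager()

func startOrientationUpdates() {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 0.2
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let g = data?.acceleration else { return }
        // Derive a coarse orientation from gravity; the thresholds here are simplistic.
        let orientation: UIDeviceOrientation
        if abs(g.y) > abs(g.x) {
            orientation = g.y < 0 ? .portrait : .portraitUpsideDown
        } else {
            orientation = g.x < 0 ? .landscapeRight : .landscapeLeft
        }
        // Apply the corresponding rotation to the camera preview / capture connection here.
        _ = orientation
    }
}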
Any guidance would be greatly appreciated.
Thank you!
I'm working on a media app that would like to be able to tell if the TV connected to tvOS is running at 59.94 Hz or 60.00 Hz, so it can optimize a video stream. It looks like the best I can currently do is to check whether the user has Match Content Rate enabled, and based on that, guess which rate their TV might be running at when setting displayManager.preferredDisplayCriteria to change video modes. It's not very ideal, because not all TVs support both of these rates, and my request for 59.94 might end up as 60 and vice versa.
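For reference, this is roughly what that looks like today (a sketch; asset is the AVAsset about to be played):

if let displayManager = view.window?.avDisplayManager,
   displayManager.isDisplayCriteriaMatchingEnabled {
    // Hand over the asset's preferred criteria and hope the TV lands on the rate I want.
    displayManager.preferredDisplayCriteria = asset.preferredDisplayCriteria
}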
I dug around and can't find any available method in UIScreen to get this info. The odd thing is, the data is right there in currentMode when I look in the debugger, but it seems to be in a private or undocumented class. Is there any way to get at it?
Hello, I'm trying to create a web browser, but currently, when signed into the Apple Music web player, I get the following message when I attempt to play on any version of my web browser:
Not available on the web
You can listen to this in the Apple Music app.
Is there a way to set up DRM (assuming this is the issue) with Apple to allow my web browser to play this content?
I believe Apple TV is also affected.
Thank you ahead of time.
I maintain a couple of CoreImage libraries that provide custom Metal kernel backed CIFilters. In iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:
-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.
I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS problem, not an Xcode problem.
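For context, the filter invokes the kernel in the standard way; a simplified sketch (the real filter passes 12 arguments, and the names here are illustrative):

override var outputImage: CIImage? {
    // kernel is a CIColorKernel loaded from the library's metallib.
    return kernel.apply(
        extent: CGRect(x: 0, y: 0, width: 1024, height: 1024),
        arguments: [offsetX, offsetY, offsetZ /* , ... remaining noise parameters */]
    )
}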
Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311
In iOS 18, CarPlay shows an error: “There was a problem loading this content” after playback starts. Audio works fine, but the Now Playing screen doesn’t load. I’m using MPPlayableContentManager. This worked fine in iOS 17. Anyone else seeing this error in iOS 18?
I’m building a music app using Apple Music streaming via ApplicationMusicPlayer.
My goal is to decrease the volume of the current song during the last 10 seconds, and when the next track begins, restore the volume to its normal level.
I know that ApplicationMusicPlayer doesn’t expose a volume API, and I want to avoid triggering the system volume HUD.
✅ Using Apple Music streaming (not local files)
❓ Is it possible to implement per-track fade-out/fade-in logic with ApplicationMusicPlayer?
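For context, a rough sketch of the timing side I have in mind (polling playbackTime to detect the last 10 seconds of the current song); the volume ramp itself is the part I'm unsure about:

import Foundation
import MusicKit

let player = ApplicationMusicPlayer.shared

Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
    guard let item = player.queue.currentEntry?.item,
          case .song(let song) = item,
          let duration = song.duration else { return }
    let remaining = duration - player.playbackTime
    if remaining <= 10 {
        // Start the fade-out here, but with which API, given there is no volume control?
    }
}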
Appreciate any clarification or official guidance!
My app Balletrax is a music player for people to use while they teach ballet. It used to be that you could silence notifications during use, but now the customer apparently has to know how to use Focus mode, remember to turn it on and off, and review which notifications they do and don't want to allow. Is there no way to silence all notifications while the app is in use?
I set the device format and color space to Apple Log and turn off HDR, so why is the movie output still in HDR format rather than ProRes Log?
Full runnable demo here:
https://github.com/SpaceGrey/ColorSpaceDemo
session.sessionPreset = .inputPriority

// Get the back camera.
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .back)
backCamera = deviceDiscoverySession.devices.first!

try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false

// Pick the format that supports Apple Log and apply it.
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

// Add the camera input.
do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// Add the movie file output.
output = AVCaptureMovieFileOutput()
session.addOutput(output)

let connection = output.connection(with: .video)!
print(
    output.outputSettings(for: connection)
)
/*
 ["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080, "AVVideoCodecKey": apch, <----- ProRes is enabled.
  "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
 */

previewSource = DefaultPreviewSource(session: session)

queue.async {
    self.session.startRunning()
}
}
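Two things I'm still checking on my side (possibly unrelated, just noting them here): whether the session's automatic wide-color configuration is overriding the color space I set, and whether the color space survives startRunning():

// Assumption on my part: with the default value (true) the session may pick the color space itself.
session.automaticallyConfiguresCaptureDeviceForWideColor = false

queue.async {
    self.session.startRunning()
    print("appleLog after start: \(self.backCamera.activeColorSpace == .appleLog)")
}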
We are encountering an issue where AVPlayer throws the error:
Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" > Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
This error seems to occur intermittently during video playback, especially after extended usage or when switching between different streams. We observe error -11819 (AVFoundationErrorDomain) in the Conviva platform for some of our users, but we haven't been able to reproduce it so far, and we need support to determine the root cause and/or best practices to prevent it.
Some questions we have:
What typically triggers this error?
Could it be related to memory/resource constraints, network instability, or backgrounding?
Are there any recommended ways to handle or recover from this error gracefully?
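For reference, the recovery path we're experimenting with; this assumes the error correlates with a media-services reset, which we have not confirmed:

import AVFoundation

// Rebuild the player stack if media services are reset while the app is running.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.mediaServicesWereResetNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { _ in
    // Tear down the existing AVPlayer/AVPlayerItem, reactivate the audio session,
    // recreate the item from the stream URL, and seek back to the last known position.
}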
Any insights or guidance would be greatly appreciated. Thanks!
I developed a DriverKit extension based on overriding-the-default-usb-video-class-extension, but the article doesn't give the details of the implementation. I asked DTS, who gave two tips:
1. Do you also have a CMIO extension to load in place of the default overriding-the-default-usb-video-class-extension?
2. Your DriverKit extension's Info.plist is also missing the CameraAssistantBundleID.
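My current, unconfirmed guess is that this key is supposed to point at the CMIO extension bundle, something like the following in the DriverKit extension's Info.plist (the identifier value is just a placeholder):

<key>CameraAssistantBundleID</key>
<string>com.example.MyCameraCMIOExtension</string>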
I want to know why a DriverKit extension needs a CMIO extension, and what the data and control flow between them is.
When I play an m3u8 video using AVPlayer, it plays smoothly at 2x speed. However, when I set it to 3x speed, playback is not smooth and there is no sound.
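For context, the one thing I've tried so far is switching the time-pitch algorithm before raising the rate, since my understanding is that the default algorithm only supports a limited rate range for audio; I haven't confirmed whether this actually fixes the missing sound:

let item = AVPlayerItem(url: streamURL)   // streamURL is the m3u8 URL
item.audioTimePitchAlgorithm = .varispeed // wider rate range, but pitch shifts with speed
let player = AVPlayer(playerItem: item)
player.play()
player.rate = 3.0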
I am following the Apple sample code and trying to add a manual focus lens position slider:
@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}
So there are these AVCaptureSessionControlsDelegate methods:
final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}
So when self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from auto focus and also from manual focus by the user in the app UI. Is there a way to see whether self.cameraControlFocusSlider is active or being used?
Please note that I will have more than one AVCaptureSlider in the final code.
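In case it matters, this is how I'm currently tracking lens position changes from outside the slider (a rough sketch using KVO), though it doesn't tell me whether the slider itself is being used:

private var lensPositionObservation: NSKeyValueObservation?

func observeLensPosition(on device: AVCaptureDevice) {
    lensPositionObservation = device.observe(\.lensPosition, options: [.new]) { _, change in
        guard let position = change.newValue else { return }
        // Fires for both auto focus and manual focus changes.
        print("lensPosition is now \(position)")
    }
}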
Hello! I have been following the UsingAVFoundationToPlayAndPersistHTTPLiveStreams sample code in order to test persisting streams to disk. In addition to support for m3u8, I have noticed in testing that this also seems to work for MP3 Audio, simply by changing the plist entries to point to remote URLs with audio/mpeg content. Is this expected, or are there caveats that I should be aware of?
Thank you!
I’m building a SwiftUI app whose primary job is to play audio. I manage all of the Now Playing metadata and Command Center commands manually via the available shared instances:
MPRemoteCommandCenter.shared()
MPNowPlayingInfoCenter.default().nowPlayingInfo
In certain parts of the app I also need to display videos, but as soon as I attach another AVPlayer, it automatically pushes its own metadata into the Control Center and overwrites my audio info.
What I need: a way to show video inline without ever having that video player update the system’s Now-Playing info (or Control Center).
In my app, I start by configuring the shared audio session
do {
    try AVAudioSession.sharedInstance().setCategory(.playback,
                                                    mode: .default,
                                                    options: [
                                                        .allowAirPlay,
                                                        .allowBluetoothA2DP
                                                    ])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    NSLog("%@", "**** Failed to set up AVAudioSession \(error.localizedDescription)")
}
and then set the MPRemoteCommandCenter commands and MPNowPlayingInfoCenter nowPlayingInfo like mentioned above.
All this works without any issues as long as I only have one AVPlayer in my app. But when I add other AVPlayers to display some videos (and keep the main AVPlayer for the sound) they push undesired updates to MPNowPlayingInfoCenter:
struct VideoCardView: View {
    @State private var player: AVPlayer
    let videoName: String

    init(player: AVPlayer = AVPlayer(), videoName: String) {
        self.player = player
        self.videoName = videoName
        guard let path = Bundle.main.path(forResource: videoName, ofType: nil) else { return }
        let url = URL(fileURLWithPath: path)
        let item = AVPlayerItem(url: url)
        self.player.replaceCurrentItem(with: item)
    }

    var body: some View {
        VideoPlayer(player: player)
            .aspectRatio(contentMode: .fill)
            .onAppear {
                player.isMuted = true
                player.allowsExternalPlayback = false
                player.actionAtItemEnd = .none
                player.play()

                MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
                MPNowPlayingInfoCenter.default().playbackState = .stopped

                NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                                       object: player.currentItem,
                                                       queue: .main) { notification in
                    guard let finishedItem = notification.object as? AVPlayerItem,
                          finishedItem === player.currentItem else { return }
                    player.seek(to: .zero)
                    player.play()
                }
            }
            .onDisappear {
                player.pause()
            }
    }
}
Which is why I tried adding:
MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
MPNowPlayingInfoCenter.default().playbackState = .stopped // or .interrupted, .unknown
But that didn't work.
I also tried making a wrapper around the AVPlayerViewController in order to set updatesNowPlayingInfoCenter to false, but that didn’t work either:
struct CustomAVPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let vc = AVPlayerViewController()
        vc.player = player
        vc.updatesNowPlayingInfoCenter = false
        vc.showsPlaybackControls = false
        return vc
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        controller.player = player
    }
}
Hence any help on how to embed video in SwiftUI without its AVPlayer touching MPNowPlayingInfoCenter would be greatly appreciated.
All this was tested on an actual device with iOS 18.4.1, and built with Xcode 16.2 on macOS 15.5
The app registers a periodic time observer on the AVPlayer when playback starts, and it works fine. When switching to AirPlay during playback, the periodic time observation continues working as expected.
However, when switching back to local playback, the periodic time observer does not fire anymore until a seek is performed. The app removes the periodic time observer only when the playback stops.
I can see that when switching back to local playback, the timeControlStatus successively changes
to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate)
then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls)
and finally to .playing
But the time observation does not work anymore.
Also, the issue is systematic with Live and VOD streams providing a program date (using the HLS tag #EXT-X-PROGRAM-DATE-TIME), with or without any DRM, and is never reproduced with other VOD streams.
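The workaround I'm currently testing (not a fix, just to keep the UI updating) is to drop and re-add the periodic observer when timeControlStatus settles back to .playing after the route change; a rough sketch:

// statusObservation and periodicTimeObserver are stored properties on the playback controller,
// and updatePlaybackUI(with:) is a placeholder for the app's usual handler.
statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
    guard let self, player.timeControlStatus == .playing else { return }
    if let observer = self.periodicTimeObserver {
        player.removeTimeObserver(observer)
    }
    self.periodicTimeObserver = player.addPeriodicTimeObserver(
        forInterval: CMTime(seconds: 1, preferredTimescale: 600),
        queue: .main
    ) { [weak self] time in
        self?.updatePlaybackUI(with: time)
    }
}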
We have the necessary background recording entitlements, and many users do not run into any issues.
However, there is a subset of users whose recordings routinely end early. We have narrowed this down and believe it to be the work of the watchdog.
First, we removed the entire view hierarchy when the app is backgrounded, leaving just 'Text("Recording")'.
This got the CPU usage in the profiler down to 0%, and we saw massive improvements in recording success rate.
We walked away assuming that was enough. However, we are still seeing the same sort of terminations, all in the background. We're using Observation to drive audio state changes to a Live Activity.
Are those observations causing the problem? Why doesn't Apple provide a better API for background audio? The internet is full of similar issues:
https://stackoverflow.com/questions/76010213/why-is-my-react-native-app-sometimes-terminated-in-the-background-while-tracking
https://stackoverflow.com/questions/71656047/why-is-my-react-native-app-terminating-in-the-background-while-recording-ios-r
https://github.com/expo/expo/issues/16807
This is such a terrible user experience. And we have very little visibility into what is happening and why.
Nowhere does Apple's documentation state that, in order for background recording to work, the app can only be 'Text("Recording")'.
It does not outline a CPU or memory threshold. It just kills us.
When we use AVPlayer to play DRM-encrypted streams, playback does not work normally on iOS 17.6.1, and there is a high probability of a system restart. This is the relevant core error log:
error 14:47:53.323369+0800 audiomxd [AirPlayError] carManager_copyProperty_block_invoke:499: got error -12784/0xFFFFCE10 kCMBaseObjectError_PropertyNotFound
error 14:47:53.323414+0800 audiomxd [SPEndpointManagerFactory] SidePlay Endpoint Manager creation failed with -72390/0xFFFEE53A
error 14:47:53.364949+0800 audiomxd [APBrowserCarSessionHelper] [0xF6AA] [Bonjour/WiFi] Unrecognized ConnectivityHelper event 101
error 14:47:53.375313+0800 audiomxd AddInstanceForFactory: No factory registered for id <CFUUID 0xa5c5118c0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
The device is connected to Bluetooth devices A and B, and audio currently plays through device A. When the user taps a button in the UI, how can I switch the output to device B in code?
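A minimal sketch of the fallback I'm considering if programmatic switching isn't possible (presenting the system route picker and letting the user pick device B; view is assumed to be the current view controller's view):

import AVKit

let routePicker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
routePicker.prioritizesVideoDevices = false // audio routes only
view.addSubview(routePicker)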
I am playing FairPlay + Multi-Key content (fMP4) in Safari browser.
I want to distinguish between SD and HD video quality, and play HD if HDCP is supported and SD if it is not.
I have already confirmed that HDCP support is the default, and that a black screen is output in non-HDCP environments.
What I want is to improve the user experience by appropriately switching to SD/HD depending on HDCP support when playing DRM content.
Question: Is there an API or function that can detect HDCP support in Safari through JavaScript or other methods? Or is there a way to indirectly guess it?
Short summary
When setting exposureMode to .locked or .custom, the brightness of a video stream still changes depending on the composition and contrast of the visible scene. These changes seem to come from contrast enhancements or dynamic range optimizations and totally break any analysis of the image that requires assessing absolute luminance. While exposure lock does seem to lock the physical exposure parameters of the camera (shutter speed and ISO), I cannot find any way to control these "soft" modifiers.
Details
Background
I am the developer of the app "phyphox", an educational app that makes the phone's sensors accessible to students as measurement tools in science experiments. Currently I am working on implementing photometric measurements through the camera and one very important aspect of it is luminance measurements.
This is particularly relevant since the light sensor of the phone has no publicly accessible API, and the camera could, to some extent, make experiments available to Apple users that are otherwise only possible on Android devices.
Implementation
The app uses AVFoundation and explicitly picks individual cameras since camera groups do not support custom exposure settings. This means that it handles camera switching during zoom by itself and even implements its own auto exposure routines to optimize for the use in experiments. Therefore it always stays in custom exposure mode. The app uses YUV420 color space and the individual frames are analyzed in Metal using compute shaders.
However, the effects discussed here still occur if I remove all code to control the camera and replace it with a simple sequence: set the exposure mode to custom, set custom exposure values, set a fixed white balance, and then set the exposure mode to locked, as suggested on stackoverflow. This helps neither on an iPhone 14 Pro nor on an iPhone 8, despite a report on the developer forums that it would resolve the issue for older devices.
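For reference, this is the minimal locking sequence I tested (a sketch of the stackoverflow suggestion; the duration and ISO values are placeholders, and device is the selected AVCaptureDevice):

do {
    try device.lockForConfiguration()
    device.setExposureModeCustom(
        duration: CMTime(value: 1, timescale: 100), // 1/100 s, placeholder
        iso: 100,                                   // placeholder
        completionHandler: nil
    )
    device.setWhiteBalanceModeLocked(with: AVCaptureDevice.currentWhiteBalanceGains,
                                     completionHandler: nil)
    device.exposureMode = .locked
    device.unlockForConfiguration()
} catch {
    print("Could not lock configuration: \(error)")
}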
The app is open source, so the code can be seen in our current development branch (without the changes for the tests here, though) on github.
The videos below use the implementation with the suggestion from stackoverflow, but they can be reproduced in the same way with "professional" camera apps that promise manual control over the camera (like the Blackmagic cam to quote a reputable company) as well as the stock camera app after pressing and holding on the preview to enable AE/AF lock.
Demonstration
These examples were captured on an iPhone 14 Pro. The central part of the image (highlighted by the app using Metal shaders after capture) should not change with fixed exposure settings, but significant changes are noticeable if there are changes at the edge of the frame when I move a black piece of cardboard in from above:
https://share.icloud.com/photos/0b1f_3IB6yAQG-qSH27pm6oDQ
The graph above the camera preview is the average luminance (gamma corrected and weighted based on sRGB) across the highlighted central area and as mentioned before it should not change because of something happening at the side of the frame (worst case it should get a bit darker because of the cardboard's shadow).
In my opinion, the iPhone changes its mind on the ideal contrast as soon as it has a different exposure histogram because of the dark image part from the cardboard, but that's just me guessing.
For completeness here is the same effect in the stock camera app with AE/AF lock enabled:
https://share.icloud.com/photos/0cd7QM8ucBZKwPwE9mybnEowg
Here you can also see that the iPhone "ramps" the changes. The brightness of the gray area does not change immediately but transitions smoothly, so this is clearly deliberate postprocessing.
So...
Any suggestion on how to prevent this behavior would be highly appreciated.