Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

FCPX Workflow Extension: Drag & Drop FCPXML with Titles to Timeline Not Working Despite Following File Promise Guidelines
I'm developing a Final Cut Pro X workflow extension that transcribes audio and creates a text output. I need to allow users to drag this text directly from my extension into FCPX's timeline as titles.

Current Implementation:
- Using NSFilePromiseProvider as per Apple's guidelines for drag and drop
- Generating valid FCPXML (v1.10) with proper structure: complete resources section with format and asset references; event and project hierarchy; asset clip with connected title elements; proper timing and duration calculations
- Supporting multiple pasteboard types: com.apple.finalcutpro.xml.v1-10, com.apple.finalcutpro.xml.v1-9, com.apple.finalcutpro.xml

What's Working:
- Drag operation initiates correctly
- File promise provider is set up properly
- FCPXML generation is successful (content verified)
- All required pasteboard types are registered
- Logging confirms data is being requested and provided

Current Pasteboard Types Offered:
- com.apple.NSFilePromiseItemMetaData
- com.apple.pasteboard.promised-file-name
- com.apple.pasteboard.promised-suggested-file-name
- com.apple.pasteboard.promised-file-content-type
- Apple files promise pasteboard type
- com.apple.pasteboard.NSFilePromiseID
- com.apple.pasteboard.promised-file-url
- com.apple.finalcutpro.xml.v1-10
- com.apple.finalcutpro.xml.v1-9
- com.apple.finalcutpro.xml

Questions:
- What additional requirements or considerations are needed to make FCPX accept the dragged FCPXML content?
- Are there specific, undocumented requirements for workflow extensions regarding drag and drop operations with titles?

Any insights, especially from those who have implemented similar functionality in FCPX workflow extensions, would be greatly appreciated.

Technical Details:
- macOS Version: 15.5 (24F74)
- FCPX Version: 11.1.1
- Extension built with SwiftUI and AppKit integration
- Using NSFilePromiseProvider and NSPasteboardItemDataProvider
- Full pasteboard type support for FCPXML versions
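Not a confirmed fix, but one avenue worth comparing notes on: in addition to the file promise, the FCPXML payload can be offered inline on the drag pasteboard via NSPasteboardItemDataProvider, in case the host reads the XML types directly rather than waiting for the promised file. A minimal sketch, with illustrative names:

    import AppKit

    // Illustrative sketch; the type identifiers mirror the ones registered above.
    final class FCPXMLDragSource: NSObject, NSPasteboardItemDataProvider {
        private let fcpxml: String
        init(fcpxml: String) { self.fcpxml = fcpxml }

        func makePasteboardItem() -> NSPasteboardItem {
            let item = NSPasteboardItem()
            let types = ["com.apple.finalcutpro.xml.v1-10",
                         "com.apple.finalcutpro.xml.v1-9",
                         "com.apple.finalcutpro.xml"]
                .map(NSPasteboard.PasteboardType.init)
            _ = item.setDataProvider(self, forTypes: types)
            return item
        }

        func pasteboard(_ pasteboard: NSPasteboard?, item: NSPasteboardItem,
                        provideDataForType type: NSPasteboard.PasteboardType) {
            // Serve the same XML payload for every FCPXML version type requested.
            _ = item.setData(Data(fcpxml.utf8), forType: type)
        }
    }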
Replies: 0 · Boosts: 0 · Views: 197 · Activity: Jul ’25
Error -50 writing to AVAudioFile
I'm trying to write 16-bit interleaved 2-channel data captured from a LiveSwitch audio source to an AVAudioFile. The buffer and file formats match, but I get a bad-parameter error (-50) from the API. Does this API not support the specified format, or is there some other issue?

Here is the debugger output:

    (lldb) po audioFile.url
    ▿ file:///private/var/mobile/Containers/Data/Application/1EB14379-0CF2-41B6-B742-4C9A80728DB3/tmp/Heart%20Sounds%201
      - _url : file:///private/var/mobile/Containers/Data/Application/1EB14379-0CF2-41B6-B742-4C9A80728DB3/tmp/Heart%20Sounds%201
      - _parseInfo : nil
      - _baseParseInfo : nil
    (lldb) po error
    Error Domain=com.apple.coreaudio.avfaudio Code=-50 "(null)" UserInfo={failed call=ExtAudioFileWrite(_impl->_extAudioFile, buffer.frameLength, buffer.audioBufferList)}
    (lldb) po buffer.format
    <AVAudioFormat 0x302a12b20: 2 ch, 44100 Hz, Int16, interleaved>
    (lldb) po audioFile.fileFormat
    <AVAudioFormat 0x302a515e0: 2 ch, 44100 Hz, Int16, interleaved>
    (lldb) po buffer.frameLength
    882
    (lldb) po buffer.audioBufferList
    ▿ 0x0000000300941e60
      - pointerValue : 12894608992

This code handles the details of converting the LiveSwitch frame into an AVAudioPCMBuffer:

    extension FMLiveSwitchAudioFrame {
        func convertedToPCMBuffer() -> AVAudioPCMBuffer {
            Self.convertToAVAudioPCMBuffer(from: self)!
        }

        static func convertToAVAudioPCMBuffer(from frame: FMLiveSwitchAudioFrame) -> AVAudioPCMBuffer? {
            // Retrieve the audio buffer and format details from the FMLiveSwitchAudioFrame
            guard let buffer = frame.buffer(),
                  let format = buffer.format() as? FMLiveSwitchAudioFormat else {
                return nil
            }

            // Extract PCM format details from FMLiveSwitchAudioFormat
            let sampleRate = Double(format.clockRate())
            let channelCount = AVAudioChannelCount(format.channelCount())

            // Determine bytes per sample based on bit depth
            let bitsPerSample = 16
            let bytesPerSample = bitsPerSample / 8
            let bytesPerFrame = bytesPerSample * Int(channelCount)
            let frameLength = AVAudioFrameCount(Int(buffer.dataBuffer().length()) / bytesPerFrame)

            // Create an AVAudioFormat from the FMLiveSwitchAudioFormat
            guard let avAudioFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                                    sampleRate: sampleRate,
                                                    channels: channelCount,
                                                    interleaved: true) else {
                return nil
            }

            // Create an AudioBufferList to wrap the existing buffer
            let audioBufferList = UnsafeMutablePointer<AudioBufferList>.allocate(capacity: 1)
            audioBufferList.pointee.mNumberBuffers = 1
            audioBufferList.pointee.mBuffers.mNumberChannels = channelCount
            audioBufferList.pointee.mBuffers.mDataByteSize = UInt32(buffer.dataBuffer().length())
            audioBufferList.pointee.mBuffers.mData = buffer.dataBuffer().data().mutableBytes // Directly use LiveSwitch buffer

            // Transfer ownership of the buffer to AVAudioPCMBuffer
            let pcmBuffer = AVAudioPCMBuffer(pcmFormat: avAudioFormat, bufferListNoCopy: audioBufferList)
            /* { buffer in
                // Ensure the buffer is freed when AVAudioPCMBuffer is deallocated
                buffer.deallocate() // Only call this if LiveSwitch allows manual deallocation
            } */
            pcmBuffer?.frameLength = frameLength
            return pcmBuffer
        }
    }

This is the handler that is invoked with every frame in order to convert it for use with AVAudioFile and optionally update a scrolling signal display on the screen:

    private func onRaisedFrame(obj: Any!) -> Void {
        // Bail out early if no one is interested in the data.
        guard isMonitoring else { return }

        // Convert LS frame to AVAudioPCMBuffer (no-copy)
        let frame = obj as! FMLiveSwitchAudioFrame
        let buffer = frame.convertedToPCMBuffer()

        // Hand subscribers a reference to the buffer for rendering to display.
        bufferPublisher?.send(buffer)

        // If we have an output file, store the data there as well.
        guard let audioFile = self.audioFile else { return }
        do {
            try audioFile.write(from: buffer) // FIXME: This call is throwing error -50
        } catch {
            FMLiveSwitchLog.error(withMessage: "Failed to write buffer to audio file at \(audioFile.url): \(error)")
            self.audioFile = nil
        }
    }

This is how the audio file is being set up:

    static var recordingFormat: AVAudioFormat = {
        AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44_100, channels: 2, interleaved: true)!
    }()

    let audioFile = try AVAudioFile(forWriting: outputURL, settings: Self.recordingFormat.settings)
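One likely cause, offered as a hedged sketch rather than a confirmed diagnosis: AVAudioFile.write(from:) validates the buffer against the file's processing format, not its file format, and for a file opened with AVAudioFile(forWriting:settings:) the processing format defaults to deinterleaved Float32, even when the settings describe interleaved Int16. Opening the file with an explicit matching processing format makes the two agree:

    import AVFoundation

    // Sketch: same on-disk settings as recordingFormat above, but the processing
    // format that write(from:) checks is pinned to Int16 interleaved instead of
    // the default deinterleaved Float32. outputURL/recordingFormat as defined above.
    let audioFile = try AVAudioFile(forWriting: outputURL,
                                    settings: recordingFormat.settings,
                                    commonFormat: .pcmFormatInt16,
                                    interleaved: true)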
Replies: 0 · Boosts: 0 · Views: 457 · Activity: Mar ’25
iOS Camera access issues in Developer mode on real device - PermissionStatus.permanentlyDenied
Xcode Version 16.3 (16E140). App developed in Flutter (Flutter 3.29.3). Test device: iPhone 16 Pro running iOS 18.5.

I have an app that requires camera access. This used to work with iOS 18.4.x. I have stripped my app down to just requesting camera permission, and even then it fails:

    flutter: Camera permission: PermissionStatus.denied
    flutter: Photos permission: PermissionStatus.denied
    flutter: Microphone permission: PermissionStatus.denied
    flutter: --- End Debug Info ---
    flutter: Loaded translations from asset for en_US
    container_create_or_lookup_app_group_path_by_app_group_identifier: client is not entitled (repeated six times)
    flutter: CAMERA PERMISSION STATUS: PermissionStatus.permanentlyDenied

Camera permissions don't show up in my app's settings or under Settings -> Privacy and Security -> Camera, and I am at a loss to understand why this is happening.
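As a first isolation step (a sketch, not a Flutter-specific fix): request camera access directly from native code. If the system prompt never appears and the app never shows up under Settings > Privacy & Security > Camera, the usual suspect is a missing NSCameraUsageDescription (and companion usage keys) in the Info.plist that actually ships in the built app.

    import AVFoundation

    // Diagnostic sketch: bypass the Flutter permission plugin entirely.
    func probeCameraPermission() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video) { granted in
                print("Camera access granted: \(granted)")
            }
        case .denied, .restricted:
            print("Denied/restricted; check Info.plist usage keys and Settings.")
        case .authorized:
            print("Already authorized.")
        @unknown default:
            break
        }
    }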
Replies: 1 · Boosts: 0 · Views: 171 · Activity: May ’25
Setting the capture device color space to Apple Log doesn't work.
I set the device format and color space to Apple Log and turn off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

    session.sessionPreset = .inputPriority

    // get the back camera
    let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                                  mediaType: .video,
                                                                  position: .back)
    backCamera = deviceDiscoverySession.devices.first!

    try! backCamera.lockForConfiguration()
    backCamera.automaticallyAdjustsVideoHDREnabled = false
    backCamera.isVideoHDREnabled = false
    let formats = backCamera.formats
    let appleLogFormat = formats.first { format in
        format.supportedColorSpaces.contains(.appleLog)
    }
    print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
    backCamera.activeFormat = appleLogFormat!
    backCamera.activeColorSpace = .appleLog
    print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
    backCamera.unlockForConfiguration()

    do {
        let input = try AVCaptureDeviceInput(device: backCamera)
        session.addInput(input)
    } catch {
        print(error.localizedDescription)
    }

    // add output
    output = AVCaptureMovieFileOutput()
    session.addOutput(output)
    let connection = output.connection(with: .video)!
    print(output.outputSettings(for: connection))
    /*
    ["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080,
     "AVVideoCodecKey": apch, <----- ProRes has been enabled.
     "AVVideoCompressionPropertiesKey": {
         AverageBitRate = 220029696;
         ExpectedFrameRate = 30;
         PrepareEncodedSampleBuffersForPaddedWrites = 1;
         PrioritizeEncodingSpeedOverQuality = 0;
         RealTime = 1;
     }]
    */

    previewSource = DefaultPreviewSource(session: session)
    queue.async {
        self.session.startRunning()
    }
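A likely culprit, offered as a sketch rather than a confirmed fix: AVCaptureSession.automaticallyConfiguresCaptureDeviceForWideColor defaults to true, and while it is on, the session can silently reset activeColorSpace when inputs and outputs are added, which would undo the .appleLog setting made before addInput/addOutput. Disabling it and applying the color space after the I/O is attached keeps Apple Log in effect:

    // Sketch: configure the color space after inputs/outputs are attached, with
    // the session's automatic wide-color behavior disabled so it cannot override it.
    session.automaticallyConfiguresCaptureDeviceForWideColor = false
    session.beginConfiguration()
    // ... addInput / addOutput exactly as above ...
    try backCamera.lockForConfiguration()
    backCamera.activeFormat = appleLogFormat!
    backCamera.activeColorSpace = .appleLog
    backCamera.unlockForConfiguration()
    session.commitConfiguration()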
Replies: 1 · Boosts: 0 · Views: 488 · Activity: Mar ’25
Is Phase Detection Autofocus degrading video stabilization, and can I disable it?
I'm developing a video capture app using AVFoundation, designed specifically for use on a boat pylon to record slalom water skiing. This setup involves considerable vibration. As you may know, the OIS that Apple began adding to lenses with the iPhone 7 is actually very problematic in high-vibration circumstances, ironically creating very shaky video, whereas lenses without OIS produce perfectly stable video. Because of this, up until the iPhone 14, the solution for my app was simply to use the selfie lens, which did not have OIS.

From the iPhone 14 through the iPhone 16 (non-Pro models), technical specs suggest the selfie lens still does not include OIS. However, I'm still seeing the same kind of shaky video behavior I see on OIS-equipped lenses. The one hardware change I see in this camera module is the addition of PDAF (Phase Detection Autofocus), so that is my best guess as to what is causing the unstable video.

1- Does that make any sense, that in high-vibration settings PDAF could create unstable video in the same way that OIS does? Or could it be something else that changed between the iPhone 13 and 14 selfie lens?

Thinking that the issue was PDAF, I figured that if I enabled my app to set a manual focus level, that ought to circumvent PDAF (expecting that if a lens is manually focusing, it can't also be autofocusing via PDAF). However, even with manual focus locked via AVCaptureDevice in my app, on the selfie lens of an iPhone 16 the video still comes out very shaky, basically unusable. I also tested with the built-in Apple Camera app (using press-and-hold to lock focus and exposure) and another third-party camera app that locks focus, all with the same results, so it's not that my app is failing to apply manual focus.

So I'm stuck with these questions:

2- Does the selfie camera on iPhones 14–16 use PDAF even when focus is set to locked/manual mode?
3- Is there any way in AVFoundation to disable or suppress PDAF during video recording (e.g., a flag, device format setting, or private API)?
4- Is PDAF behavior or suppression documented or controllable via AVCaptureDevice or any related class?
5- If no control of PDAF is available, are there any best practices for stabilizing or smoothing this effect programmatically?

Note that I have also set my app to use the most aggressive form of stabilization available, so it defaults to .cinematicExtendedEnhanced; if that's not available, then .cinematicExtended, and so on. On the iPhone 16 selfie lens, it uses .cinematicExtended. As an additional question:

6- Would those be the most appropriate stabilization settings for a high-vibration environment, and if not, what would be best?
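For anyone comparing notes: PDAF itself has no public switch, and the strongest focus lock the public API offers is a fixed lens position. A minimal sketch of that lock (the lensPosition value is illustrative), which matches what the post already attempted:

    import AVFoundation

    // Sketch: pin the lens at a fixed position. Whether the focus hardware still
    // runs phase detection while locked is not documented.
    func lockFocus(on device: AVCaptureDevice, lensPosition: Float = 1.0) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        if device.isLockingFocusWithCustomLensPositionSupported {
            device.setFocusModeLocked(lensPosition: lensPosition, completionHandler: nil)
        } else if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }
    }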
Replies: 0 · Boosts: 0 · Views: 227 · Activity: May ’25
AVAudioSession automatically sets the tablet audio volume to 50% when recording audio.
Environment:
- Device: iPad (10th generation)
- OS: iOS 18.3.2

I'm using AVAudioSession to record sound in my application. I recently realized that when the app starts a recording session on the iPad, the OS automatically sets the device volume to 50%, and after recording ends it does not restore the previous volume level. Is this default OS behavior or a bug? If it's default behavior, I would much appreciate a link to the documentation.
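This doesn't appear in the AVAudioSession documentation as a fixed 50% rule. As a first step, a small diagnostic sketch that logs output-volume changes around the recording session, to capture exactly when the OS adjusts it:

    import AVFoundation

    // Diagnostic sketch: outputVolume is KVO-observable, so changes made by the
    // system while the recording session is active show up here with timestamps.
    final class VolumeLogger {
        private var observation: NSKeyValueObservation?

        func start() {
            let session = AVAudioSession.sharedInstance()
            observation = session.observe(\.outputVolume, options: [.initial, .new]) { _, change in
                print("outputVolume -> \(change.newValue ?? -1)")
            }
        }
    }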
Replies: 0 · Boosts: 0 · Views: 122 · Activity: Apr ’25
Is it possible to query the TV's current refresh rate?
I'm working on a media app that would like to be able to tell whether the TV connected to tvOS is running at 59.94 Hz or 60.00 Hz, so it can optimize a video stream. It looks like the best I can currently do is check whether the user has Match Content Rate enabled and, based on that, when calling displayManager.preferredDisplayCriteria to change video modes, guess which rate their TV might be in. That's not ideal, because not all TVs support both of these rates, and my request for 59.94 might end up as 60 and vice versa. I dug around and can't find any available method in UIScreen to get this info. The odd thing is, the data is right there in currentMode when I look in the debugger, but it seems to be in a private or undocumented class. Is there any way to get at it?
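For reference, a sketch of the closest public signals on tvOS; none of them expose the exact current refresh rate, which matches the observation that the value is only visible in a private class:

    import AVKit
    import UIKit

    // Sketch (tvOS): the public API surface around display modes.
    func logDisplayState(from viewController: UIViewController) {
        if let displayManager = viewController.view.window?.avDisplayManager {
            print("Match content rate enabled: \(displayManager.isDisplayCriteriaMatchingEnabled)")
            print("Mode switch in progress: \(displayManager.isDisplayModeSwitchInProgress)")
        }
        // Integer frame rate only, so 59.94 and 60.00 are indistinguishable here:
        print("maximumFramesPerSecond: \(UIScreen.main.maximumFramesPerSecond)")
    }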
Replies: 0 · Boosts: 0 · Views: 232 · Activity: Jun ’25
When deleting photos, encountered PHPhotosError.operationInterrupted (3301).
Hi, I’ve developed a photo app that includes a photo deletion feature. Some users have reported encountering PHPhotosError.operationInterrupted (3301) when attempting to delete photos. Initially, I suspected that some of the assets might have a sourceType of typeiTunesSynced, since the documentation notes that iTunes-synced assets cannot be edited or deleted. However, after checking the logs, all of the assets involved are of typeUserLibrary. Additionally, a user mentioned that some photos in the iPhone Photos app do not show a delete button, and I'm unsure whether the absence of the delete button is related to the 3301 error.

I'd like to confirm the following:
- Under what conditions does PHPhotosError.operationInterrupted (3301) occur, and how should it be handled?
- Why do some photos in the iPhone Photos app not show a delete button?

The code for deleting photos is as follows:

    PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
    [library performChanges:^{
        PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithLocalIdentifiers:delUrls options:nil];
        if (assetsToBeDeleted) {
            [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
        }
    } completionHandler:^(BOOL success, NSError *error) {
        // (Truncated in the original post.)
    }];
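One way to probe both questions from code, offered as a diagnostic sketch (identifiers stands in for the delUrls array above): assets that report canPerform(.delete) == false would plausibly explain both the missing delete button and an interrupted change request.

    import Photos

    // Diagnostic sketch: log each asset's reported delete capability and source.
    let assets = PHAsset.fetchAssets(withLocalIdentifiers: identifiers, options: nil)
    assets.enumerateObjects { asset, _, _ in
        print("\(asset.localIdentifier) deletable=\(asset.canPerform(.delete)) sourceType=\(asset.sourceType.rawValue)")
    }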
Replies: 0 · Boosts: 0 · Views: 125 · Activity: Jun ’25
AVAssetResourceLoaderDelegate for radio stream
Hi everyone, I’m trying to use AVAssetResourceLoaderDelegate to handle a live radio stream (e.g. an Icecast/HTTP stream). My goal is to have access to the last 30 seconds of audio data during playback, so I can analyze it for specific audio patterns in near-real-time.

I’ve implemented a custom resource loader that works fine for podcasts and static files, where the file size and content length are known. However, for infinite live streams, my current implementation stops receiving new loading requests after the first one is served. As a result, playback either stalls or fails to continue.

Has anyone successfully used AVAssetResourceLoaderDelegate with a continuous radio stream? Or can you suggest a better approach for buffering and analyzing live audio? Any tips, examples, or advice would be appreciated. Thanks!
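For the analysis half of the question, one alternative to fighting the resource loader is to fetch the stream with a plain URLSession data task, keep only a rolling window of bytes, and analyze that window independently of playback. A minimal sketch of the rolling window, assuming a roughly constant bitrate (128 kbps is purely illustrative):

    import Foundation

    // Sketch: a byte-budgeted rolling window over an endless stream.
    final class RollingAudioWindow {
        private var data = Data()
        private let maxBytes: Int

        init(seconds: Int = 30, bitsPerSecond: Int = 128_000) {
            maxBytes = seconds * bitsPerSecond / 8
        }

        func append(_ chunk: Data) {
            data.append(chunk)
            if data.count > maxBytes {
                // Trim the oldest bytes so the window stays ~30 seconds long.
                data.removeFirst(data.count - maxBytes)
            }
        }

        var snapshot: Data { data }
    }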
Replies: 0 · Boosts: 0 · Views: 153 · Activity: Jun ’25
Clarification of use of `AVAssetDownloadConfiguration` and `AVAssetDownloadTask` to persist MP3 Audio downloads/streams
Hello! I have been following the UsingAVFoundationToPlayAndPersistHTTPLiveStreams sample code in order to test persisting streams to disk. In addition to support for m3u8, I have noticed in testing that this also seems to work for MP3 audio, simply by changing the plist entries to point to remote URLs with audio/mpeg content. Is this expected, or are there caveats I should be aware of? Thank you!
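For reference, a minimal sketch of the configuration-based download path the sample wraps (the URL is a placeholder; whether non-HLS audio/mpeg content is officially supported by AVAssetDownloadTask is exactly the open question):

    import AVFoundation

    // Sketch: the modern AVAssetDownloadConfiguration/AVAssetDownloadTask path.
    let asset = AVURLAsset(url: URL(string: "https://example.com/episode.mp3")!) // placeholder URL
    let configuration = AVAssetDownloadConfiguration(asset: asset, title: "Episode")
    let downloadSession = AVAssetDownloadURLSession(
        configuration: .background(withIdentifier: "audio-downloads"),
        assetDownloadDelegate: nil,
        delegateQueue: .main)
    let task = downloadSession.makeAssetDownloadTask(downloadConfiguration: configuration)
    task.resume()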
Replies: 0 · Boosts: 0 · Views: 75 · Activity: Apr ’25
Files generated using AVAudioRecorder have a constant size of only 4 KB
Hello. My app uses AVAudioRecorder to generate recording files, and for a few users these files are consistently only 4 KB in size. Most users generate audio files normally; only a few experience this phenomenon, and only occasionally. After uninstalling and reinstalling the app it works normally, but the problem reappears after a period of time. The problematic audio files are always the same size and cannot be played. I added the audioRecorderDidFinishRecording delegate method, which reports that the recording completed normally. The user also reports that recording appears normal, but there is a problem with the generated file. How should I handle this issue? I look forward to your reply.

    - (void)startRecordWithOrderID:(NSString *)orderID {
        AVAudioSession *audioSession = [AVAudioSession sharedInstance];
        // Note: the error parameters below are ignored (nil), so any session or
        // recorder failure here goes unnoticed.
        [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
        [audioSession setActive:YES error:nil];

        NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
        [settings setObject:[NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey];
        [settings setObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
        [settings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
        [settings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
        [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
        [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

        NSString *path = [WDUtility createDirInDocument:@"audios" withOrderID:orderID withPathExtension:@"wav"];
        NSURL *tmpFile = [NSURL fileURLWithPath:path];
        recorder = [[AVAudioRecorder alloc] initWithURL:tmpFile settings:settings error:nil];
        [recorder setDelegate:self];
        [recorder prepareToRecord];
        [recorder record];
    }
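A constant 4 KB file is consistent with a recorder that created the file but never wrote samples (for example, a session activation or permission failure that the nil error parameters above would hide). A sketch of the same setup in Swift with every failure surfaced, offered as a diagnostic starting point rather than a confirmed fix:

    import AVFoundation

    // Sketch: identical settings, but session errors are thrown and the Boolean
    // results of prepareToRecord/record are checked instead of discarded.
    func startRecording(to url: URL) throws -> AVAudioRecorder {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.record, mode: .default, options: [])
        try session.setActive(true)

        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 8000.0,
            AVNumberOfChannelsKey: 1,
            AVLinearPCMBitDepthKey: 16,
            AVLinearPCMIsBigEndianKey: false,
            AVLinearPCMIsFloatKey: false,
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        guard recorder.prepareToRecord(), recorder.record() else {
            throw NSError(domain: "Recorder", code: -1,
                          userInfo: [NSLocalizedDescriptionKey: "prepareToRecord/record returned false"])
        }
        return recorder
    }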
Replies: 0 · Boosts: 0 · Views: 221 · Activity: Jul ’25
Inconsistent history responses in early morning polls
I use the Apple Music API to poll my listening history at regular intervals. Every morning between 5:30 AM and 7:30 AM, I observe a strange pattern in the API responses. During this window, one or more of the regular polling intervals returns a response that differs significantly from the prior history response, even though I had no listening activity at that time.

I'm using this endpoint:
https://api.music.apple.com/v1/me/recent/played/tracks?types=songs,library-songs&include[library-songs]=catalog&include[songs]=albums,artists

Here’s a concrete example from this morning:

5:45 AM, Fetch 1 tracks (subset): 1799261990, 1739657416, 1786317143, 1784288789, 1743250261, 1738681804, 1789325498, 1743036755, ...
5:50 AM, Fetch 2 tracks (subset): 1799261990, 1739657416, 1786317143, 1623924746, 1635185172, 1574004238, 1198763630, 1621299055, ...
5:55 AM, Fetch 3 tracks (subset): 1799261990, 1739657416, 1786317143, 1784288789, 1743250261, 1738681804, 1789325498, 1743036755, ...

At 5:50, a materially different history is returned, and the next poll returns the prior history again. I've listened to all of the tracks in each set, but the 5:50 response drops some tracks and returns some from further back in history. I've connected other accounts, and the behavior is consistent and repeatable every day across them.

It appears the API is temporarily returning a different (possibly outdated or cached?) view of the user's history during that early-morning window. Has anyone seen this behavior before? Is this a known issue with the Apple Music API or the MusicKit backend? I'd love any insights into what might cause this, or recommendations on how to work around it.
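Until the root cause is known, one client-side guard, under the pure assumption that the anomalous responses are transient: reject any snapshot that fails to preserve the most recent tracks of the last accepted one, and keep polling.

    // Sketch: accept a new snapshot of track IDs only if it still contains the
    // most recent tracks from the last accepted snapshot.
    func shouldAccept(newSnapshot: [String], lastAccepted: [String], recentCount: Int = 3) -> Bool {
        lastAccepted.prefix(recentCount).allSatisfy(newSnapshot.contains)
    }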
Replies: 0 · Boosts: 0 · Views: 143 · Activity: Apr ’25
Capture session interrupted randomly
We are facing a strange issue where a small portion of our large user base cannot start the capture session in our app; it gets interrupted with the following reason: AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps.

Our users are all on iPhones; no one is using an iPad. Just to be sure, we have set session.isMultitaskingCameraAccessEnabled = true, but it does not seem to make any difference.

Another weird scenario, seen on an even smaller number of users, is that the following call returns nil: AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back). A quick look at our error reports shows this happening on iPhone XR, 13, and 14 models, all of which should support this device type.

Any help investigating these issues would be greatly appreciated!
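For narrowing this down in the field, a small diagnostic sketch that logs the decoded interruption reason whenever it fires, so reports from affected users carry the exact trigger:

    import AVFoundation

    // Diagnostic sketch: keep the returned token alive for as long as you observe.
    func observeInterruptions(of session: AVCaptureSession) -> NSObjectProtocol {
        NotificationCenter.default.addObserver(
            forName: .AVCaptureSessionWasInterrupted,
            object: session, queue: .main
        ) { note in
            if let raw = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               let reason = AVCaptureSession.InterruptionReason(rawValue: raw) {
                print("Capture session interrupted: \(reason)")
            }
        }
    }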
Replies: 0 · Boosts: 0 · Views: 126 · Activity: Jun ’25
AVPlayer periodic time observer stops notifying when switching back from AirPlay to local playback
The app registers a periodic time observer on the AVPlayer when playback starts, and it works fine. When switching to AirPlay during playback, the periodic time observation continues working as expected. However, when switching back to local playback, the periodic time observer no longer fires until a seek is performed. The app removes the periodic time observer only when playback stops.

I can see that when switching back to local playback, the timeControlStatus successively changes to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate), then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls), and finally to .playing, but the time observation no longer works. The issue is systematic with Live and VOD streams providing a program date (via the HLS tag #EXT-X-PROGRAM-DATE-TIME), with or without DRM, and is never reproduced with other VOD streams.
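A workaround sketch, under the assumption that the observer silently stops firing on the AirPlay-to-local transition: tear the observer down and re-register it whenever the audio route changes.

    import AVFoundation

    // Sketch: re-install the periodic time observer on every route change.
    final class TimeObserverKeeper {
        private var timeObserver: Any?
        private let player: AVPlayer

        init(player: AVPlayer) {
            self.player = player
            install()
            // Re-install whenever the route changes (e.g. AirPlay -> local).
            NotificationCenter.default.addObserver(
                forName: AVAudioSession.routeChangeNotification,
                object: nil, queue: .main
            ) { [weak self] _ in
                self?.install()
            }
        }

        private func install() {
            if let existing = timeObserver {
                player.removeTimeObserver(existing)
            }
            timeObserver = player.addPeriodicTimeObserver(
                forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
                queue: .main
            ) { time in
                print("time: \(time.seconds)")
            }
        }
    }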
Replies: 2 · Boosts: 0 · Views: 149 · Activity: Apr ’25