How to save JPEG/RAW image data to camera roll in iOS 11; can't access processed image data

I'm attempting to write a photo app that can take both RAW and JPEG images and save them to the camera roll. The functions jpegPhotoDataRepresentation and dngPhotoDataRepresentation appear in every example I've found, but both are deprecated in iOS 11, and the delegate callback after "capturePhoto" is now

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

The main example I've been able to find of a working RAW iOS 11 app is this: https://ubunifu.co/swift/raw-photo-capture-sample-swift-4-ios-11 It works, but it only shoots RAW, and saving is clumsy because the files don't end up in the camera roll.

I've changed my photo settings to allow for both RAW and processed capture with this line:

photoSettings = AVCapturePhotoSettings(rawPixelFormatType: availableRawFormat.uint32Value, processedFormat: [AVVideoCodecKey : AVVideoCodecType.jpeg])
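For reference, a fuller version of that setup might look like the sketch below (assuming `photoOutput` is an `AVCapturePhotoOutput` already attached to a running session; the availability guard is illustrative):

```swift
import AVFoundation

// Request RAW + processed (JPEG) capture in one shot.
// Assumes `photoOutput` is an AVCapturePhotoOutput already added to the session.
guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
    fatalError("RAW capture is not available on this device")
}
let photoSettings = AVCapturePhotoSettings(
    rawPixelFormatType: rawFormat,
    processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg]
)
photoOutput.capturePhoto(with: photoSettings, delegate: self)
```

Note that `availableRawPhotoPixelFormatTypes` is already an array of `OSType` (`UInt32`), so no `NSNumber` unwrapping is needed.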

But once I've actually captured the photo, I have no idea how to access the processed-format data. fileDataRepresentation seems to be the only way to get at the DNG data, and there appears to be no way to get the JPEG separately. Apple's pre-iOS 11 sample code suggests using PHPhotoLibrary and adding a resource, but that requires a data representation, which I can only obtain as a DNG file; saved to the library, it shows up as just white because the library can't handle RAW files. Here's my photoOutput code in case it helps.

 func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

    let dir = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first! as String
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyyMMddHHmmss"
    formatter.locale = Locale.init(identifier: "en_US_POSIX")
    let filePath =  dir.appending(String(format: "/%@.dng", formatter.string(from: Date())))
    let dngFileURL = URL(fileURLWithPath: filePath)

    let dngData = photo.fileDataRepresentation()!
    do {
        try dngData.write(to: dngFileURL, options: [])
    } catch {
        print("Unable to write DNG file.")
        return
    }

    PHPhotoLibrary.shared().performChanges( {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let creationOptions = PHAssetResourceCreationOptions()
        creationOptions.shouldMoveFile = true


        //dngData is the problem, this should be the jpeg representation
        creationRequest.addResource(with: .photo, data: dngData, options: nil)
        //This line works fine, the associated file is the correct RAW file, but the jpeg preview is garbage
        creationRequest.addResource(with: .alternatePhoto, fileURL: dngFileURL, options: creationOptions)

    }, completionHandler: nil)

}


Solution 1:[1]

Okay, following up on the comment from earlier and the Apple docs on Capturing Photos in RAW Format:

As you’ve noticed, if you want to shoot RAW and save it in the Photos library, you need to save DNG+processed versions together in the same asset so that Photos library clients that don’t support RAW still have a readable version of the asset. (That includes the Photos app itself...) Saving both RAW+processed means specifying that in the capture.

If you’re requesting RAW+processed capture (where processed is JPEG, or even better, HEIF), you’re getting two photos for every shot you take. That means your didFinishProcessingPhoto callback gets called twice: once to deliver the JPEG (or HEIF), again to deliver the RAW.

Since you need to add RAW+processed versions of the asset to Photos together, you should wait until the capture output delivers both versions before trying to create the Photos asset. You’ll notice the code snippets in that Apple doc stash the data for both versions in the didFinishProcessingPhoto callback:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if photo.isRawPhoto {
        // Write the RAW (DNG) fileDataRepresentation to a temporary file,
        // and keep that URL in a property (here, `rawURL`).
        if let dngData = photo.fileDataRepresentation() {
            try? dngData.write(to: rawURL)
        }
    } else {
        // Hold the JPEG/HEIF fileDataRepresentation in a property (here, `compressedData`).
        compressedData = photo.fileDataRepresentation()
    }
}

Then, when the didFinishCaptureFor callback fires, they make sure they have both versions, and add them together to the Photos library.
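A sketch of that callback, assuming the JPEG/HEIF data and DNG file URL were stashed in properties named `compressedData` and `rawURL` (the helper `saveToPhotoLibrary()` is hypothetical; it would wrap the `performChanges` block shown next):

```swift
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings,
                 error: Error?) {
    guard error == nil else {
        print("Capture error: \(error!)")
        return
    }
    // Proceed only once both versions have been delivered and stashed.
    guard compressedData != nil, rawURL != nil else { return }
    saveToPhotoLibrary()  // wraps PHPhotoLibrary.shared().performChanges(...)
}
```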

Notice that when you add DNG and JPEG or HEIF versions of a photo together...

  • The JPEG/HEIF needs to be the primary photo resource, and the DNG the alternatePhoto resource.
  • You can add a JPEG/HEIF resource straight from Data in memory, but DNG needs to be added from a file URL.

So the Photos library part goes like this (again, inside the didFinishCaptureFor callback):

PHPhotoLibrary.shared().performChanges({
    // Add the compressed (HEIF) data as the main resource for the Photos asset.
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo, data: compressedData, options: nil)

    // Add the RAW (DNG) file as an alternate resource.
    let options = PHAssetResourceCreationOptions()
    options.shouldMoveFile = true
    creationRequest.addResource(with: .alternatePhoto, fileURL: rawURL, options: options)
}, completionHandler: self.handlePhotoLibraryError)
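handlePhotoLibraryError isn't defined in the snippet; a minimal version matching the `(Bool, Error?) -> Void` completion-handler shape could be:

```swift
import Foundation

// Completion handler for PHPhotoLibrary.performChanges(_:completionHandler:),
// which reports (success, error) once the change request finishes.
func handlePhotoLibraryError(success: Bool, error: Error?) {
    if let error = error {
        print("Error saving photo to library: \(error)")
    }
}
```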

Solution 2:[2]

You can write a pixel-buffer extension on CGImage and then get the pixel buffer from the photo's cgImageRepresentation():

if let cgImage = photo.cgImageRepresentation() {
    let pixelBuffer = cgImage.pixelBuffer()
}
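One possible shape for such an extension (a sketch using CoreGraphics and CoreVideo; rendering into a BGRA buffer is an assumption of this sketch, not necessarily what the original linked code did):

```swift
import CoreGraphics
import CoreVideo

extension CGImage {
    // Render this CGImage into a newly created 32-bit BGRA CVPixelBuffer.
    func pixelBuffer() -> CVPixelBuffer? {
        let attrs: [CFString: Any] = [
            kCVPixelBufferCGImageCompatibilityKey: true,
            kCVPixelBufferCGBitmapContextCompatibilityKey: true
        ]
        var buffer: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                         kCVPixelFormatType_32BGRA,
                                         attrs as CFDictionary, &buffer)
        guard status == kCVReturnSuccess, let pixelBuffer = buffer else { return nil }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        // Draw the image into the buffer's backing memory via a CGContext.
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: width, height: height,
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                | CGBitmapInfo.byteOrder32Little.rawValue
        ) else { return nil }

        context.draw(self, in: CGRect(x: 0, y: 0, width: width, height: height))
        return pixelBuffer
    }
}
```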

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 rickster
Solution 2 ????? ??????