I am trying to load a JPG image together with a MOV file in Objective-C on an iOS device to display a Live Photo, and I wrote the following code snippet to do that in viewDidLoad:
- (void)viewDidLoad {
    [super viewDidLoad];
    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];
    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];
    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                     placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"]
                                           targetSize:self.view.bounds.size
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}
I have dragged the files livePhoto.jpg and livePhoto.mov into the Xcode project.
But when I build and run, Xcode logs this error:
2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler
Any idea about that? Thanks.
And another thing to ask:
Why was the resultHandler called twice, according to what is printed?
TL;DR
Here's the code to store Live Photos and upload them to a server:
1. Capturing Live Photo
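A minimal sketch of that capture step, assuming an AVCapturePhotoOutput (here self.photoOutput) already attached to a running AVCaptureSession with livePhotoCaptureEnabled set to YES; names and paths are illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

- (void)captureLivePhoto {
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
    // Tell the output where to write the paired movie file of the Live Photo.
    NSString *moviePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"livePhoto.mov"];
    settings.livePhotoMovieFileURL = [NSURL fileURLWithPath:moviePath];
    [self.photoOutput capturePhotoWithSettings:settings delegate:self];
}

// iOS 11+ delegate callback: the still image, with its metadata intact.
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    // fileDataRepresentation keeps the Apple maker-note metadata,
    // unlike re-encoding through UIImageJPEGRepresentation.
    NSData *imageData = [photo fileDataRepresentation];
    // store imageData together with the movie URL in your expectedAsset object
}

// The movie side of the Live Photo is delivered here.
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL
             duration:(CMTime)duration
     photoDisplayTime:(CMTime)photoDisplayTime
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
                error:(NSError *)error {
    // upload or persist outputFileURL here
}
```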
expectedAsset is just an object holding all required information; you can use an NSDictionary instead. And since this code snippet uses a >= iOS 11 API, here's the one for "deprecated" iOS...
2. Generating NSData
Long Answer
This is caused by wrong metadata in the video/image file. When creating a Live Photo, PHLivePhoto searches for the key 17 in kCGImagePropertyMakerAppleDictionary (which is the asset identifier) and matches this with the com.apple.quicktime.content.identifier of the mov file. The mov file also needs to have an entry for the time at which the still image was captured (com.apple.quicktime.still-image-time).
Make sure your files haven't been edited (or exported) somewhere. Even the UIImageJPEGRepresentation function will remove this data from the image.
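To illustrate the mov side, here is a hedged sketch of writing those two QuickTime keys with AVAssetWriter; the key strings and data type are the real ones, but `writer` and `assetIdentifier` are placeholders and the surrounding re-muxing (sample copying, timing) is omitted:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Top-level content identifier; must equal the "17" entry in the JPEG's maker note.
AVMutableMetadataItem *identifier = [AVMutableMetadataItem metadataItem];
identifier.key = @"com.apple.quicktime.content.identifier";
identifier.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
identifier.value = assetIdentifier;
writer.metadata = @[identifier];

// still-image-time has to live in a *timed* metadata track (mebx).
NSDictionary *spec = @{
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
        @"mdta/com.apple.quicktime.still-image-time",
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
        @"com.apple.metadata.datatype.int8"
};
CMFormatDescriptionRef desc = NULL;
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
    kCFAllocatorDefault, kCMMetadataFormatType_Boxed,
    (__bridge CFArrayRef)@[spec], &desc);
AVAssetWriterInput *metadataInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata
                                       outputSettings:nil
                                     sourceFormatHint:desc];
AVAssetWriterInputMetadataAdaptor *adaptor =
    [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:metadataInput];
[writer addInput:metadataInput];

// After the writer session has started, stamp the frame that acts as the still:
AVMutableMetadataItem *still = [AVMutableMetadataItem metadataItem];
still.key = @"com.apple.quicktime.still-image-time";
still.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
still.value = @0;
still.dataType = @"com.apple.metadata.datatype.int8";
AVTimedMetadataGroup *group =
    [[AVTimedMetadataGroup alloc] initWithItems:@[still]
                                      timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(200, 3000))];
[adaptor appendTimedMetadataGroup:group];
```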
Here's a code snippet I'm using to convert the UIImage to NSData:
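A sketch of that conversion, going through ImageIO instead of UIImageJPEGRepresentation so the metadata survives; `assetIdentifier` is a placeholder for the UUID shared with the mov file:

```objc
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

NSData *JPEGDataWithAssetIdentifier(NSData *jpegData, NSString *assetIdentifier) {
    CGImageSourceRef source =
        CGImageSourceCreateWithData((__bridge CFDataRef)jpegData, NULL);
    NSMutableDictionary *properties =
        [(__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL) mutableCopy];
    // "17" is the Apple maker-note key PHLivePhoto matches against
    // the mov's com.apple.quicktime.content.identifier.
    properties[(NSString *)kCGImagePropertyMakerAppleDictionary] = @{@"17": assetIdentifier};

    NSMutableData *output = [NSMutableData data];
    CGImageDestinationRef destination =
        CGImageDestinationCreateWithData((__bridge CFMutableDataRef)output, kUTTypeJPEG, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0,
                                         (__bridge CFDictionaryRef)properties);
    CGImageDestinationFinalize(destination);
    CFRelease(destination);
    CFRelease(source);
    return output;
}
```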
The handler gets called twice: first to tell you about the corrupt data, and a second time about the cancellation of the process (these are two different keys in the info dictionary).
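You can see which case each invocation corresponds to by checking the info dictionary keys the Photos framework provides; a sketch, reusing the URLs from the question:

```objc
#import <PhotosUI/PhotosUI.h>

[PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                 placeholderImage:nil
                                       targetSize:self.view.bounds.size
                                      contentMode:PHImageContentModeAspectFit
                                    resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
    if (info[PHLivePhotoInfoErrorKey]) {
        NSLog(@"error: %@", info[PHLivePhotoInfoErrorKey]);
    }
    if ([info[PHLivePhotoInfoCancelledKey] boolValue]) {
        NSLog(@"request was cancelled");
    }
    if ([info[PHLivePhotoInfoIsDegradedKey] boolValue]) {
        NSLog(@"degraded (preview) result; another callback may follow");
    }
}];
```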
EDIT:
Here's your mov data:
$ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
Metadata:
  major_brand     : qt
  minor_version   : 0
  compatible_brands: qt
  creation_time   : 2018-01-27T11:07:38.000000Z
  com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
Metadata:
  creation_time   : 2018-01-27T11:07:38.000000Z
  handler_name    : Core Media Data Handler
  encoder         : 'avc1'
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
Metadata:
  creation_time   : 2018-01-27T11:07:38.000000Z
  handler_name    : Core Media Data Handler
The com.apple.quicktime.still-image-time key is missing here. Here's how the metadata should look:
Metadata:
  major_brand     : qt
  minor_version   : 0
  compatible_brands: qt
  creation_time   : 2017-12-15T12:41:00.000000Z
  com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
  com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
  com.apple.quicktime.make: Apple
  com.apple.quicktime.model: iPhone X
  com.apple.quicktime.software: 11.1.2
  com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
Metadata:
  rotate          : 90
  creation_time   : 2017-12-15T12:41:00.000000Z
  handler_name    : Core Media Data Handler
  encoder         : H.264
Side data:
  displaymatrix: rotation of -90.00 degrees
Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
Metadata:
  creation_time   : 2017-12-15T12:41:00.000000Z
  handler_name    : Core Media Data Handler
Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
Metadata:
  creation_time   : 2017-12-15T12:41:00.000000Z
  handler_name    : Core Media Data Handler
Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
Metadata:
  creation_time   : 2017-12-15T12:41:00.000000Z
  handler_name    : Core Media Data Handler
And just FYI, here's your JPEG data:
$ magick identify -format %[EXIF:*] cf70b7de66bd89654967aeef1d557816.jpg
exif:ColorSpace=1
exif:ExifImageLength=960
exif:ExifImageWidth=540
exif:ExifOffset=26
exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0