objective-c - iOS AVFoundation: how to convert camera frames per second to create a timelapse using CMTime?


I have an iPhone camera attachment that captures video at 9 FPS and makes the individual UIImages available. I'm trying to stitch these images together to create a timelapse video of what the camera sees, using AVFoundation.

I'm not sure how to convert the frames and their timing to achieve the time compression I want.

For example, I want 1 hour of real-life footage converted into 1 minute of timelapse. That tells me I need to capture every 60th frame and append it to the timelapse.

Does the code below accomplish the 60-seconds-to-1-second timelapse conversion, or do I need to add more multiplication/division by kRecordingFPS?

#define kRecordingFPS 9
#define kTimelapseCaptureFrameWithMod 60

// frameCount is the number of frames the camera has output so far
if (frameCount % kTimelapseCaptureFrameWithMod == 0) {
    //...
    // Convert the image and prepare it for recording
    [self appendImage:image
               atTime:CMTimeMake(currentRecordingFrameNumber++, kRecordingFPS)];
}

Your code will make 1 frame out of every 60 go into the film, each frame lasting 1/9 s, and will increase the frame index by 1 each time.

The result should be a film of [max value of frameCount] / (60 * 9) seconds. If you have 32400 frames (1 hour of footage at 9 FPS), you keep 32400 / 60 = 540 frames, which at 9 FPS gives a film of 540 / 9 = 60 s.
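Written out as a quick check, using only the numbers already given above:

    3600 s of footage * 9 FPS       = 32400 captured frames
    32400 frames / 60 (every 60th)  = 540 appended frames
    540 frames / 9 FPS              = 60 s of timelapse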

Try printing the CMTime with CMTimeShow() at every call to appendImage:atTime: to check.
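As a rough sketch only (the asker's actual implementation isn't shown), appendImage:atTime: might look like this, assuming the recording class has writerInput and pixelBufferAdaptor ivars from its AVAssetWriter setup and a hypothetical -pixelBufferFromImage: helper that converts the UIImage into a CVPixelBufferRef:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

- (void)appendImage:(UIImage *)image atTime:(CMTime)time
{
    // Print the presentation time of every appended frame, as suggested above,
    // e.g. {5/9 = 0.556} for the sixth frame of the timelapse.
    CMTimeShow(time);

    // pixelBufferFromImage: is a hypothetical helper, not shown in the question.
    CVPixelBufferRef buffer = [self pixelBufferFromImage:image];
    if (buffer == NULL) {
        return;
    }

    // writerInput / pixelBufferAdaptor are assumed ivars of the recording class.
    if (self.writerInput.isReadyForMoreMediaData) {
        [self.pixelBufferAdaptor appendPixelBuffer:buffer withPresentationTime:time];
    }
    CVPixelBufferRelease(buffer);
}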

// At 5 FPS, 300 frames are recorded per 60 seconds.
// To compress 300 frames into 1 second at 5 FPS, you need to take every 300/5 = 60th frame.
// To compress 300 frames into 1 second at 15 FPS, take every 300/15 = 20th frame.
// To compress 300 frames into 1 second at 30 FPS, take every 300/30 = 10th frame.

#define kReceivedFPS 9
#define kStoreFPS 9
#define kSecondsPerSecondCompression 60

#define kTimelapseCaptureFrameWithMod ((kSecondsPerSecondCompression * kReceivedFPS) / kStoreFPS)
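For completeness, a sketch of how the generalized macro would slot into the capture callback, reusing frameCount, currentRecordingFrameNumber, image and appendImage:atTime: from the question:

// Keep every kTimelapseCaptureFrameWithMod-th received frame and
// stamp it at the playback rate kStoreFPS.
if (frameCount % kTimelapseCaptureFrameWithMod == 0) {
    CMTime presentationTime = CMTimeMake(currentRecordingFrameNumber++, kStoreFPS);
    CMTimeShow(presentationTime); // verify the timestamps, as suggested above
    [self appendImage:image atTime:presentationTime];
}
frameCount++;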
