Use a circular buffer for video frames on iOS

https://stackoverflow.com/questions/33581369/how-to-use-tpcircularbuffer-for-video

https://github.com/jeremytregunna/Ring

https://www.codesd.com/item/is-it-possible-to-use-a-circular-buffer-for-video-images-on-ios.html

http://atastypixel.com/blog/a-simple-fast-circular-buffer-implementation-for-audio-processing/

http://www.boost.org/doc/libs/1_39_0/libs/circular_buffer/doc/circular_buffer.html

https://github.com/lmtanco/videoDelayLine


How to use TPCircularBuffer for Video?

We have a VoIP app for the iOS platform, where we use TPCircularBuffer for audio buffering, and its performance is very good.

So I was wondering whether it's possible to use TPCircularBuffer for video buffering as well. I have searched a lot but haven't found anything useful on using TPCircularBuffer for video. Is that even possible? If so, can anyone shed some light on it? Any code sample would be highly appreciated.

I guess you could copy your video frames' pixels into a TPCircularBuffer, and you'd technically have a video ring buffer, but you've already lost the efficiency race at that point because you don't have time to copy that much data around. You need to keep references to your frames instead.

Or, if you really wanted to mash a solution into TPCircularBuffer, you could write the CMSampleBuffer pointers into the buffer (carefully respecting retain and release). But that seems heavy-handed, as you gain nothing from TPCircularBuffer's memory-mapped wraparound trick when the items stored are mere pointers.
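
For what it's worth, a minimal sketch of that pointer approach, assuming TPCircularBuffer's standard Init/ProduceBytes/Tail/Consume API (the enqueueFrame/dequeueFrame helpers are hypothetical names, not part of the library):

    #include <CoreMedia/CoreMedia.h>
    #include "TPCircularBuffer.h"

    // Sketch: the ring stores retained CMSampleBufferRef pointers (a few bytes
    // each), not pixel data, so nothing large is ever copied.
    static TPCircularBuffer frameQueue; // once: TPCircularBufferInit(&frameQueue, 32 * sizeof(CMSampleBufferRef));

    static void enqueueFrame(CMSampleBufferRef frame) {
        CFRetain(frame); // the queue owns a reference until the frame is consumed
        if (!TPCircularBufferProduceBytes(&frameQueue, &frame, sizeof(frame))) {
            CFRelease(frame); // queue full: drop this frame
        }
    }

    // Returns NULL when empty; the caller must CFRelease the result.
    static CMSampleBufferRef dequeueFrame(void) {
        int32_t availableBytes = 0;
        CMSampleBufferRef *slot = TPCircularBufferTail(&frameQueue, &availableBytes);
        if (slot == NULL || availableBytes < (int32_t)sizeof(CMSampleBufferRef)) {
            return NULL;
        }
        CMSampleBufferRef frame = *slot;
        TPCircularBufferConsume(&frameQueue, (int32_t)sizeof(CMSampleBufferRef));
        return frame;
    }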

I would simply make my own CMSampleBufferRef ring buffer. You can grab a prebuilt circular buffer or do the modular arithmetic yourself, remembering to retain each stored frame and release whatever it overwrites:

    CMSampleBufferRef ringBuffer[10] = { NULL }; // or some other number

    i = (i + 1) % 10;
    if (ringBuffer[i]) CFRelease(ringBuffer[i]);        // release the frame being overwritten
    ringBuffer[i] = (CMSampleBufferRef)CFRetain(frame); // keep a reference to the new frame
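
When it's time to drain the ring (say, the user hits record and you want the buffered pre-roll), the same modular arithmetic walks the slots oldest-first. A sketch, assuming i still indexes the most recent write and processFrame is a hypothetical consumer:

    // Slot (i + 1) % 10 holds the oldest surviving frame; walk forward from it.
    for (int k = 1; k <= 10; k++) {
        CMSampleBufferRef f = ringBuffer[(i + k) % 10];
        if (f != NULL) {
            processFrame(f); // hypothetical consumer, e.g. appending to an AVAssetWriter
        }
    }

On teardown, walk the array the same way and CFRelease every non-NULL slot so the retained frames are returned to the system.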

Of course your real problem is not the ring buffer itself, but dealing with the fact that decompressed video is very high bandwidth: each 1080p frame is about 8MB, or 200MB to store one second's worth at 24fps, so you're going to have to get pretty creative if you need anything other than a microscopic video buffer.

Some suggestions:

  • the above numbers are for RGBA, so try working in YUV, where they drop to 3MB per frame and 75MB/s (see the arithmetic sketch after this list)
  • try lower resolutions
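
For reference, the arithmetic behind those numbers, as a small self-contained C program (1080p, 4 bytes per RGBA pixel versus 1.5 bytes per pixel for 4:2:0 YUV):

    #include <stdio.h>

    int main(void) {
        const int w = 1920, h = 1080, fps = 24;
        double rgba   = (double)w * h * 4.0; // 4 bytes per pixel (RGBA/RGBX)
        double yuv420 = (double)w * h * 1.5; // 1.5 bytes per pixel (4:2:0 YUV)
        printf("RGBA:      %.1f MB/frame, %.1f MB/s\n", rgba / 1e6, rgba * fps / 1e6);
        printf("YUV 4:2:0: %.1f MB/frame, %.1f MB/s\n", yuv420 / 1e6, yuv420 * fps / 1e6);
        return 0; // prints ~8.3/199.1 and ~3.1/74.6, i.e. the 8MB/200MB and 3MB/75MB/s above
    }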

Is it possible to use a circular buffer for video images on iOS?

I'm currently looking for a way to create a "live photos"-like functionality on iOS, but using video. The goal is for the application to store the few seconds of frames captured just before the user starts recording. A circular buffer seems like a good fit here, but most of the libraries I've seen are audio-only. Given that Live Photos is exclusive to the iPhone 6S, I'm wondering whether this kind of functionality is possible on a device with the same amount of RAM, such as the iPad Air 2, or with less RAM, such as the iPhone 6.

In theory it would be possible to store and loop video, but in practice you will find it is unlikely to work for full-screen video at a fast frame rate like 30 FPS. The problem is the total amount of memory the video consumes. Each pixel is 32 bits of data (RGBX), basically one word, so multiply 4 bytes by W x H to see how large a single frame is at a given camera resolution. To make a long story short: for very large W and H values there is simply too much data for the CPU and memory bus to keep up with the reads and writes.

iOS devices do have hardware to help with this task. For example, you can encode movies using the built-in hardware encoder, and that is likely the only way to get this working with very large W x H values and fast frame rates. You also have to be careful with aspect ratios, because the camera will likely capture pictures in aspect ratios that the H.264 encoding hardware does not support.
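
To make the hardware suggestion concrete, here is a hedged sketch of creating a hardware H.264 encoder with VideoToolbox, so the buffer holds compressed samples instead of raw frames (didCompressFrame and the ring push are placeholders, not a complete implementation):

    #include <VideoToolbox/VideoToolbox.h>

    // Called by VideoToolbox once per compressed frame: retain the (now small)
    // H.264 sample buffer and push it into the ring instead of raw pixels.
    static void didCompressFrame(void *refCon, void *frameRefCon, OSStatus status,
                                 VTEncodeInfoFlags flags, CMSampleBufferRef sampleBuffer) {
        if (status == noErr && sampleBuffer != NULL) {
            // CFRetain(sampleBuffer) and store it in the ring buffer shown earlier.
        }
    }

    static VTCompressionSessionRef makeEncoder(int32_t width, int32_t height) {
        VTCompressionSessionRef session = NULL;
        VTCompressionSessionCreate(kCFAllocatorDefault, width, height,
                                   kCMVideoCodecType_H264, NULL, NULL, NULL,
                                   didCompressFrame, NULL, &session);
        if (session != NULL) {
            // Favor throughput over quality so encoding keeps up with capture.
            VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
        }
        return session;
    }

Each captured pixel buffer is then submitted with VTCompressionSessionEncodeFrame, so only the compressed samples, a small fraction of the raw size, ever sit in the ring.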





Reposted from blog.csdn.net/jeffasd/article/details/78746168