
Slow motion fotomagico

  1. #Slow motion fotomagico movie#
  2. #Slow motion fotomagico code#

#Slow motion fotomagico movie#

We have an OpenGL based slideshow application (Boinx FotoMagico) that uses AVFoundation for movie playback. Our existing code worked fine up to the last release of Yosemite, but with the release of El Capitan it broke and is really unreliable now. Movie playback sometimes works, depending on the machine: on some Macs it always works, on others almost never, while on yet other machines it works about 50% of the time. This leads me to believe we have a timing related problem.

We have a class that wraps movie playback and access to the OpenGL textures. To prepare for playback, the -load method is called on a background queue:

    // Store the movie size. Physical and logical dimensions start out identical.
    _movieSize = NSMakeSize(size.width, size.height);
    _physicalWidth  = _logicalWidth  = _movieSize.width;
    _physicalHeight = _logicalHeight = _movieSize.height;

    // Orientation: find out if the video was shot vertically and needs to be
    // rotated to be viewed correctly.
    CGAffineTransform transform = videoTrack.preferredTransform;
    double radians = atan2(transform.b, transform.a);

    // Create a video output, and add it to the playerItem.
    NSDictionary* attributes = @{ (NSString*)kCVPixelBufferBytesPerRowAlignmentKey : @16 };   // @16 is an assumed value
    self.output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
    [playerItem addOutput:self.output];

Next we make sure we are notified once the player is ready for playback. In the -observeValueForKeyPath: method we check whether the AVPlayer and the AVPlayerItem are ready for playback. Once they are, we seek to the correct starting time and set a flag that is observed by our engine. Once the flag is YES, the engine calls -play on the AVPlayer.

    - (void) observeValueForKeyPath:(NSString*)inKeyPath ofObject:(id)inObject change:(NSDictionary*)inChange context:(void*)inContext
    {
        AVPlayer* player = self.player;
        AVPlayerItem* playerItem = self.playerItem;

        // Check if enough of the movie has loaded so that we can start playing.
        if (player.status == AVPlayerStatusReadyToPlay && playerItem.status == AVPlayerItemStatusReadyToPlay)
        {
            CMTime time = CMTimeMakeWithSeconds(_inPoint, _timescale);
            [player seekToTime:time completionHandler:^(BOOL inFinished)
            {
                // ... set the flag that the engine observes ...
            }];
        }
    }
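The registration that triggers this observer isn't shown above. A minimal sketch of what it might look like, assuming the wrapper observes the status key of both the player and the item (the helper name, key paths and options are assumptions, not the original code):

    // Hypothetical registration step, matching the -observeValueForKeyPath:
    // handler above. "player" and "playerItem" are assumed properties.
    - (void) startObservingPlayer
    {
        [self.player addObserver:self
                      forKeyPath:@"status"
                         options:NSKeyValueObservingOptionInitial
                         context:NULL];

        [self.playerItem addObserver:self
                          forKeyPath:@"status"
                             options:NSKeyValueObservingOptionInitial
                             context:NULL];
    }

A matching removeObserver:forKeyPath: call would be needed when the wrapper is torn down, or KVO will message a deallocated object.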

#Slow motion fotomagico code#

Our rendering engine uses a display link to draw OpenGL content. For movies, the render method is called to get the movie frame textures. This is where the problem occurs: when movie playback doesn't work, we always get NO from hasNewPixelBufferForItemTime:, and it doesn't recover from this situation. Like I mentioned before, it sometimes works, sometimes it doesn't. It seems to be completely random and doesn't depend on the movie file itself.

    // Use CoreVideo to get the frames.
    AVPlayerItemVideoOutput* output = self.output;

    if (playerItem != nil && output != nil && self.isStarted)
    {
        CMTime itemTime = [output itemTimeForHostTime:CACurrentMediaTime()];

        // If a new image is available then copy it.
        if ([output hasNewPixelBufferForItemTime:itemTime])
        {
            CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];

            CVPixelBufferLockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
            CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, self.textureCache, buffer, 0, &texture);
            CVOpenGLTextureCacheFlush(self.textureCache, 0);
            CVPixelBufferUnlockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
        }
    }

Has anybody encountered this problem and can shed some light on it? Am I doing something fundamentally wrong? Is this a known issue in 10.11? Are there any workarounds? Hope that somebody can shed light on this issue.
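For reference, a display link that drives a render method like the one above is usually set up with the CVDisplayLink API. A minimal sketch; the MyRenderer class and its -render method are assumptions:

    #import <QuartzCore/QuartzCore.h>
    #import <CoreVideo/CoreVideo.h>

    // Called by CoreVideo on the display link thread, once per refresh.
    static CVReturn MyDisplayLinkCallback(CVDisplayLinkRef displayLink,
                                          const CVTimeStamp* inNow,
                                          const CVTimeStamp* inOutputTime,
                                          CVOptionFlags flagsIn,
                                          CVOptionFlags* flagsOut,
                                          void* displayLinkContext)
    {
        // Call back into the rendering engine, which in turn calls the
        // movie wrapper's render method shown above.
        [(__bridge MyRenderer*)displayLinkContext render];
        return kCVReturnSuccess;
    }

    // Setup, e.g. in the renderer's init:
    CVDisplayLinkRef displayLink = NULL;
    CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
    CVDisplayLinkSetOutputCallback(displayLink, &MyDisplayLinkCallback, (__bridge void*)self);
    CVDisplayLinkStart(displayLink);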

One reply pointed at texture ownership rather than at AVFoundation itself:

I believe there is a problem in the texturing that stems from a misunderstanding of who owns the GL textures created by CVOpenGLTextureCacheCreateTextureFromImage. I believe the TextureCache is the owner of the GL texture object. Therefore, when you call CVOpenGLTextureRelease shortly after taking the textureName, the GL texture is in fact returned to the pool of textures, so CoreVideo can immediately reuse it. You must hold on to the reference count of the CVOpenGLTextureRef until you are done with the GL texture it owns.

Also, I believe the call to CVPixelBufferLockBaseAddress is unnecessary, because for an OpenGL compatible CVPixelBuffer the backing will be an IOSurfaceRef, which means the pixel data is already on the GPU and the GL texture will be created via a GPU copy (blit). The call to CVPixelBufferLockBaseAddress will cause the IOSurfaceRef to be mapped into system memory, which is a very expensive operation on some GPU models. If CVOpenGLTextureCacheCreateTextureFromImage needs to lock the pixel buffer, I'm sure it will do so itself. In my implementations which use XXXTextureCacheCreateFromImage (iOS, Mac, Metal) I simply pass the CVPixelBufferRef without locking the base address, and it works fine. There is a similar misuse of the CoreVideo APIs in this Metal sample code, which uses the Metal equivalent API CVMetalTextureCacheCreate.
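Applied to the render method above, that advice amounts to dropping the lock/unlock pair and keeping the CVOpenGLTextureRef alive until the frame has been drawn. A minimal sketch, assuming a _currentTexture instance variable that holds the previous frame's texture:

    // Create the texture without locking the pixel buffer; the IOSurface
    // backing means the pixel data is already on the GPU.
    CVOpenGLTextureRef texture = NULL;
    CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                              self.textureCache,
                                                              buffer,
                                                              NULL,
                                                              &texture);
    if (err == kCVReturnSuccess)
    {
        // Release only the previous frame's texture, so the one we are about
        // to draw stays retained and its GL name stays valid.
        if (_currentTexture != NULL) CVOpenGLTextureRelease(_currentTexture);
        _currentTexture = texture;

        GLenum target = CVOpenGLTextureGetTarget(_currentTexture);   // usually GL_TEXTURE_RECTANGLE_ARB
        GLuint name   = CVOpenGLTextureGetName(_currentTexture);
        // ... bind (target, name) and draw the frame ...
    }

    // Flush once per frame so CoreVideo can recycle textures we released.
    CVOpenGLTextureCacheFlush(self.textureCache, 0);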








