You are here because you want to...
- Feed the camera stream directly to Metal kernel/fragment functions.
- Get started with basic still-image / camera Metal texture loading and processing in iOS graphics using the Metal APIs.
- Read an `MTLTexture` buffer, then use the buffer to create a `CVPixelBuffer`, a `UIImage`, or whatever you like.
- If you are struggling to understand Metal kernel functions and want to learn the basics of the Metal GPU pipeline, this official Apple document is a good place to start: https://developer.apple.com/documentation/metal/setting_up_a_command_structure
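Reading an `MTLTexture` back into a `CVPixelBuffer` can be sketched roughly as below. This assumes a `.bgra8Unorm` texture whose contents are CPU-accessible (not `.private` storage); the helper name is illustrative, not this project's API.

```swift
import Metal
import CoreVideo

// Sketch: copy an MTLTexture's pixels into a freshly created CVPixelBuffer.
// Assumes a .bgra8Unorm texture with CPU-readable storage.
func makePixelBuffer(from texture: MTLTexture) -> CVPixelBuffer? {
    let width = texture.width
    let height = texture.height

    var pixelBuffer: CVPixelBuffer?
    let attrs: [CFString: Any] = [kCVPixelBufferMetalCompatibilityKey: true]
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA,
                              attrs as CFDictionary, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    guard let baseAddress = CVPixelBufferGetBaseAddress(buffer) else { return nil }

    // Copy the texture contents directly into the pixel buffer's backing store.
    texture.getBytes(baseAddress,
                     bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                     from: MTLRegionMake2D(0, 0, width, height),
                     mipmapLevel: 0)
    return buffer
}
```

From the `CVPixelBuffer` you can go on to a `UIImage` (for example via `CIImage(cvPixelBuffer:)` and a `CIContext`).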
- Metal shaders for blending and applying effects
The basic setup is adapted from https://github.com/navoshta/MetalRenderCamera: the camera session, Metal texture cache creation, and other setup-related code come from there. I subclass around `MTLCommandEncoder`, and I subclass `AVAssetWriter` to generate video from an `MTLTexture`.
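The `AVAssetWriter` side could look roughly like the sketch below, assuming frames arrive as CPU-readable `.bgra8Unorm` textures; the class and method names here are illustrative, not this project's actual subclass.

```swift
import AVFoundation
import Metal
import CoreVideo

// Sketch: write MTLTexture frames to a .mov file through a pixel buffer adaptor.
// Names are illustrative; textures must be .bgra8Unorm with CPU-readable storage.
final class TextureVideoRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey as String: width,
                kCVPixelBufferHeightKey as String: height
            ])
        writer.add(input)
        _ = writer.startWriting()
        writer.startSession(atSourceTime: .zero)
    }

    // Append one frame by copying the texture into a pooled pixel buffer.
    func append(texture: MTLTexture, at time: CMTime) {
        guard input.isReadyForMoreMediaData,
              let pool = adaptor.pixelBufferPool else { return }
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        if let base = CVPixelBufferGetBaseAddress(buffer) {
            texture.getBytes(base,
                             bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                             from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                             mipmapLevel: 0)
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])
        _ = adaptor.append(buffer, withPresentationTime: time)
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```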
Here are some important notes about the Metal rendering process:
- The Metal texture used for rendering in `MTKView` is not suitable for generating video/still images. An `MTLComputeCommandEncoder` is used to process the camera feed in a Metal kernel function.
- The output is written to an internal texture for rendering and for later video/image generation.
- Kernel function pipeline states are subclassed to apply different effects to the live iOS camera stream.
- If you would like to add custom Metal effects, you may subclass `BaseKernelPipelineState` and override the `processArguments(computeEncode:)` method.
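The compute-pass and effect-subclassing notes above can be sketched together as below. `BaseKernelPipelineState` belongs to this project, so its exact interface is assumed here (a minimal stand-in is included to keep the sketch self-contained), and the sepia effect with its `intensity` parameter is purely hypothetical.

```swift
import Metal

// Minimal stand-in so the sketch is self-contained; the real class in this
// project has more responsibilities.
class BaseKernelPipelineState {
    func processArguments(computeEncode: MTLComputeCommandEncoder) {}
}

// Hypothetical effect: override processArguments(computeEncode:) to bind
// effect-specific kernel inputs.
class SepiaKernelPipelineState: BaseKernelPipelineState {
    var intensity: Float = 0.8  // hypothetical per-effect uniform

    override func processArguments(computeEncode: MTLComputeCommandEncoder) {
        super.processArguments(computeEncode: computeEncode)
        // Bind the effect-specific parameter for the kernel function to read.
        computeEncode.setBytes(&intensity,
                               length: MemoryLayout<Float>.size,
                               index: 0)
    }
}

// Encoding the compute pass: the camera texture goes in, and the internal
// texture (used later for rendering and video generation) comes out.
func encodeEffect(commandBuffer: MTLCommandBuffer,
                  pipeline: MTLComputePipelineState,
                  effect: BaseKernelPipelineState,
                  input: MTLTexture,
                  output: MTLTexture) {
    guard let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    encoder.setComputePipelineState(pipeline)
    encoder.setTexture(input, index: 0)
    encoder.setTexture(output, index: 1)
    effect.processArguments(computeEncode: encoder)

    // One thread per pixel, with threadgroups sized from the pipeline limits
    // and rounded up to cover the whole texture.
    let w = pipeline.threadExecutionWidth
    let h = pipeline.maxTotalThreadsPerThreadgroup / w
    let groups = MTLSize(width: (input.width + w - 1) / w,
                         height: (input.height + h - 1) / h,
                         depth: 1)
    encoder.dispatchThreadgroups(groups,
                                 threadsPerThreadgroup: MTLSize(width: w, height: h, depth: 1))
    encoder.endEncoding()
}
```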
- Any contributions and suggestions about Metal will be highly appreciated.