Android natively offers APIs that enable video processing and various video effects. MediaCodec and OpenGL ES are the main components that make video processing possible, and I have already used some of these APIs in previous projects. I'm planning to use MediaCodec and OpenGL ES in my future video editing projects as well, and one feature I thought would be cool to implement is reversing a video so that it can be played backwards. I know that it's possible to implement this feature using ffmpeg or some other external library, but I was curious how to do it using only the standard Android APIs. So I created a little test project that reverses a video in a background service. In this post I describe my experience implementing this feature. You can find the sample project on GitHub.

Reader comment:

Thanks for your reply; the method you provided is just what I'm looking for. But my situation is a bit different. On the server I do use appsrc to push external YUV data into the pipeline, and I can set the timestamp using GST_BUFFER_PTS. On the client, however, we need to use Windows 10 and hardware-accelerated decoding (omxh265dec or nvdec). I tried my best to install GStreamer with nvdec or omxh265dec support on Windows 10, but every attempt failed. Because of that failure, I have to handle the H.265 data with ffmpeg on the client. (Of course, if I had succeeded, I could use appsink to obtain the timestamp that was set in appsrc.) So the question is: if I set the timestamp in appsrc with GStreamer, how can I get that timestamp with ffmpeg, given that ffmpeg has no GstBuffer struct? If this cannot be done, can you provide me with some reliable links for installing GStreamer with omxh265dec or nvdec support on Windows 10? Thank you for your help, and I look forward to your reply.
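The core difficulty in reversing a video, regardless of the API used, is that codecs only decode forward: a P-frame depends on the frames before it, so you cannot simply read packets back to front. The usual approach is to walk the groups of pictures (GOPs) from last to first, decode each GOP forward from its keyframe, and then emit the decoded frames of that GOP in reverse. The sketch below illustrates only this scheduling logic in Python with made-up frame labels; it is not the post's actual code, and the `decode` stub stands in for the real MediaCodec decode plus OpenGL ES render step.

```python
def decode(frame):
    # Stand-in for the real decode step (MediaCodec + OpenGL ES in the
    # Android project); here it just tags the frame label.
    return f"decoded:{frame}"

def reverse_stream(gops):
    """gops: list of GOPs, in presentation order; each GOP is the list of
    frames that can only be decoded forward from its keyframe."""
    output = []
    # Walk the GOPs from last to first...
    for gop in reversed(gops):
        # ...decode each GOP forward (decoders cannot run backwards),
        decoded = [decode(frame) for frame in gop]
        # ...then emit this GOP's decoded frames in reverse order.
        output.extend(reversed(decoded))
    return output
```

With two toy GOPs, `reverse_stream([["I0", "P1", "P2"], ["I3", "P4", "P5"]])` yields the frames in fully reversed presentation order, starting from `decoded:P5`. The trade-off this buys is memory proportional to one GOP of decoded frames rather than the whole clip.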
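On the timestamp question in the comment: a GstBuffer's PTS does not cross over to ffmpeg as a struct, but the timestamp itself normally survives muxing as the stream's presentation timestamp, which ffmpeg exposes as `AVPacket.pts` in units of the stream's `time_base`. GStreamer timestamps are in nanoseconds (`GST_SECOND` = 10^9), so recovering the original value is just a change of units. The sketch below shows that arithmetic; the 90 kHz time base is an assumption (typical for H.265 over RTP or MPEG-TS), so real code should read `AVStream.time_base` instead of hard-coding it, and whether the PTS survives at all depends on the transport carrying it.

```python
GST_SECOND = 1_000_000_000  # GStreamer buffer timestamps are in nanoseconds

def gst_pts_to_ffmpeg_pts(gst_pts_ns, time_base_num=1, time_base_den=90_000):
    # seconds = gst_pts_ns / GST_SECOND; ffmpeg pts = seconds / time_base.
    # Integer arithmetic, matching how ffmpeg stores pts as an integer.
    return gst_pts_ns * time_base_den // (GST_SECOND * time_base_num)

def ffmpeg_pts_to_seconds(pkt_pts, time_base_num=1, time_base_den=90_000):
    # Convert an AVPacket.pts value back to seconds for comparison.
    return pkt_pts * time_base_num / time_base_den
```

For example, a buffer stamped at one second (`GST_BUFFER_PTS` = 1 000 000 000 ns) arrives in ffmpeg as pts 90000 under a 1/90000 time base, which converts back to 1.0 s.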