Live-Streaming Tech Notes: Playing Video with TextureView + SurfaceTexture + OpenGL ES

The steps for playing video in Unity are as follows:

Unity 5.6 introduced the VideoPlayer component, which makes video playback fairly straightforward. A project of mine needed it, so I spent some time investigating how to use it and ran into quite a few pitfalls; a quick search on Google/Baidu shows these problems really do come up for others. Some of the simpler ones are:

Before getting into the code, let me first explain what TextureView, SurfaceTexture, and
OpenGL ES actually are, and how I use them to display a video.

Note: I recently worked on a preliminary research project for live streaming,
so I'm writing down how the streaming pipeline was implemented and the thinking behind solving some problems along the way, illustrated with the Android implementation.

1. Drag the video you want to play into the Project panel. (Note: the video formats Unity
generally supports are .mov, .mpg, .mpeg, .mp4, .avi, and .asf.)

1) Playback has no sound

TextureView, as the name suggests, is just a View control extending View. The official
docs describe it like this:
A TextureView can be used to display a content stream. Such a content
stream can for instance be a video or an OpenGL scene. The content
stream can come from the application’s process as well as a remote
process.

In other words, it can display a content stream, such as a video stream or an OpenGL-rendered scene. The stream can come from the local application process or from a remote process, which sounds a bit convoluted; my understanding is that it can be, say, either a local video stream or a network video stream.
Note that TextureView renders using hardware acceleration, a bit like hardware versus
software video decoding: one relies on the GPU, the other on the CPU.
So how do we actually use this TextureView?
OK, now SurfaceTexture comes on stage. From the two class names you can tell that
TextureView is primarily a View, while SurfaceTexture is primarily a Texture. Its official description:
Captures frames from an image stream as an OpenGL ES texture. The image
stream may come from either camera preview or video decode.

That is, it captures frames from an image stream as an OpenGL ES texture. The image
stream mainly comes from a camera preview or decoded video. (I suspect this capability could be used for quite a few other things.)
At this point we have the texture, so OpenGL ES can get to work: it binds the texture and draws it frame by frame onto the TextureView, and that produces the video image we see. (For details on SurfaceTexture and TextureView, see here.)
That's enough talk; let's look at some code. Good code should read like good literature. Not that mine is that beautiful; it's just something to aim for...

Project structure

2. Add a RawImage to the scene. (Image renders with a Sprite, while RawImage renders with a Texture.)

2) Controlling playback progress with a Slider

Code

Let's start with the main class, MainActivity:

public class MainActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener,
        MediaPlayer.OnPreparedListener{
    /** Path to the local video file */
    public String videoPath = Environment.getExternalStorageDirectory().getPath()+"/aoa.mkv";
    private TextureView textureView;
    private MediaPlayer mediaPlayer;
    /**
    * Pre-draw configuration happens in this class;
    * the actual drawing is done in its subclass VideoTextureSurfaceRenderer.
    */
    private TextureSurfaceRenderer videoRenderer;
    private int surfaceWidth;
    private int surfaceHeight;
    private Surface surface;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.id_textureview);
        //register the SurfaceTextureListener so we are notified when the SurfaceTexture is ready
        textureView.setSurfaceTextureListener(this);

    }
    /**
    * Entry point for playback; called once the SurfaceTexture is available.
    */
    private void playVideo() {
        if (mediaPlayer == null) {
            videoRenderer = new VideoTextureSurfaceRenderer(this, textureView.getSurfaceTexture(), surfaceWidth, surfaceHeight);
            surface = new Surface(videoRenderer.getSurfaceTexture());
            initMediaPlayer();
        }
    }

    private void initMediaPlayer() {
        this.mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(videoPath);
            mediaPlayer.setSurface(surface);
            //set the listener before calling prepareAsync so onPrepared cannot be missed
            mediaPlayer.setOnPreparedListener(this);
            mediaPlayer.setLooping(true);
            mediaPlayer.prepareAsync();
        } catch (IllegalArgumentException | SecurityException
                | IllegalStateException | IOException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void onPrepared(MediaPlayer mp) {
        try {
            if (mp != null) {
                mp.start(); //the video starts playing here
            }
        } catch (IllegalStateException e) {
            e.printStackTrace();
        }
    }


    @Override
    protected void onResume() {
        super.onResume();
        if (textureView.isAvailable()) {
            playVideo();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (videoRenderer != null) {
            videoRenderer.onPause();  //remember to stop the video drawing thread
        }
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer =null;
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        surfaceWidth = width;
        surfaceHeight = height;
        playVideo();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }

}

That is the program's entry class. I won't go into how MediaPlayer actually plays the video source here; there is quite a lot to it, and you can look it up yourself. One thing worth pointing out: the parameter passed to MediaPlayer.setSurface(param) is usually a SurfaceView.SurfaceHolder, while here I pass a Surface directly (for background on Surface see here); that is where this video player differs from the usual ones. I'll stop here for now; the core drawing work will follow in a later post when I have time. If anything above is wrong, I hope you'll point it out. Many thanks!
The follow-up is already written: Playing Video with TextureView + SurfaceTexture + OpenGL
ES (2).

  • Unity texture plugin and video capture (video source)
    VideoSourceCamera
  • Microphone capture (audio source)
    AudioSourceMIC
  • Video encoding
    VideoEncoder
  • Audio encoding
    AudioEncoder
  • FLV muxing
    MuxerFLV
  • HTTP stream upload (publishing)
    PublisherHttp
  • Stream playback (replay)
    play
  • OpenGL graphics and image processing

3. Add a VideoPlayer component under the RawImage and assign the video to the VideoPlayer
by dragging it onto the Video Clip field.

3) Taking video screenshots (Texture -> Texture2D)

Starting from this article, I will describe the implementation details of these components and how the dependencies between them are handled.

4. Create a script, PlayVideoOnUGUI. The core line is rawImage.texture =
videoPlayer.texture: assign the VideoPlayer's texture to the RawImage and you can see the video playing.

4) Firing an event when the video finishes

(1) — The Unity texture plugin

Our live-streaming project serves Unity, and Unity is a cross-platform game engine built on
DirectX, OpenGL, or OpenGL ES depending on the platform, so the graphics plugin has to be implemented per platform.
(Unity's native plugin documentation:)
https://docs.unity3d.com/Manual/NativePluginInterface.html
For live streaming on the Android platform, the Unity graphics plugin's main job is render-thread notification,
because video capture, texture creation,
image processing (shaders), and feeding the video texture to the encoder all need to run on Unity's render thread:

  • Unity creates the texture and passes the texture ID to the streaming plugin (see the C# sketch after this list).

  • Open the camera device and prepare the capture texture:
    mCameraGLTexture =
    new GLTexture(width, height, GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
    GLES20.GL_RGBA);
    note: the camera texture is a special type of texture, created with the
    GLES11Ext.GL_TEXTURE_EXTERNAL_OES target

  • A callback notifies us when each frame of data is ready:
    public void onFrameAvailable(final SurfaceTexture surfaceTexture)
    {
        //push the image from the capture thread to the render thread for processing
        getProcessor().append(new Task() {
            @Override
            public void run() {
                surfaceTexture.updateTexImage();
            }
        });
    }

    The camera texture also needs a special declaration in the fragment shader:

      #extension GL_OES_EGL_image_external : require
      precision mediump float;
      uniform samplerExternalOES uTexture0;
      varying vec2 texCoordinate;
      void main(){
          gl_FragColor = texture2D(uTexture0, texCoordinate);
      }
    
  • Write the camera texture into Unity's texture. Copying one texture into
    another can be done in two ways:

    • Via glReadPixels, but that incurs a huge memory copy and CPU load.

    • Render to texture:
      mTextureCanvas = new
      GLRenderTexture(mGLTexture); //declare the render texture

        void renderCamera2Texture()
        {
            mTextureCanvas.begin();
            cameraDrawObject.draw();
            mTextureCanvas.end();
        }
      

      The implementation of GLRenderTexture is as follows:
        GLRenderTexture(GLTexture tex)
        {
            mTex = tex;
            int fboTex = tex.getTextureID();
            GLES20.glGenFramebuffers(1, bufferObjects, 0);
            GLHelper.checkGlError("glGenFramebuffers");
            fobID = bufferObjects[0];

            //create the render buffer
            GLES20.glGenRenderbuffers(1, bufferObjects, 0);
            renderBufferId = bufferObjects[0];
            //bind the frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            //Bind render buffer and define buffer dimension
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, tex.getWidth(), tex.getHeight());
            GLHelper.checkGlError("glRenderbufferStorage");
            //attach the texture to the framebuffer as its color attachment
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTex, 0);
            GLHelper.checkGlError("glFramebufferTexture2D");
            //attach the depth render buffer
            GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glFramebufferRenderbuffer");
            //we are done, reset
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            GLHelper.checkGlError("glBindFramebuffer");
        }
      
        void begin()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            GLES20.glViewport(0, 0, mTex.getWidth(), mTex.getHeight());
            GLHelper.checkGlError("glViewport");
        }
      
        void end()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        }
      
  • Beautification
    Real-time beauty effects (skin whitening, skin smoothing) are implemented
    with shaders.
    (For the principles behind beauty filters, see)
    http://meituplus.com/?p=101
    (For more real-time shader processing, see)
    https://github.com/wuhaoyu1990/MagicCamera
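
For reference, the Unity (C#) side of such a plugin typically does three things: create the texture, hand its native ID to the plugin, and ping the plugin on the render thread every frame. Below is a minimal sketch; the library name livePlugin and the entry points SetTargetTexture and GetRenderEventFunc are hypothetical names for illustration, not a real plugin API:

    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class LiveTextureBridge : MonoBehaviour
    {
        //hypothetical native entry points; a real plugin defines its own
        [DllImport("livePlugin")]
        private static extern void SetTargetTexture(IntPtr texId, int width, int height);
        [DllImport("livePlugin")]
        private static extern IntPtr GetRenderEventFunc();

        public Material targetMaterial;
        private Texture2D tex;

        void Start()
        {
            tex = new Texture2D(1280, 720, TextureFormat.RGBA32, false);
            targetMaterial.mainTexture = tex;
            //hand the native GL texture ID over to the plugin
            SetTargetTexture(tex.GetNativeTexturePtr(), tex.width, tex.height);
        }

        void Update()
        {
            //schedules a callback on Unity's render thread, where the plugin can
            //safely touch GL state (update the camera texture, render to texture, ...)
            GL.IssuePluginEvent(GetRenderEventFunc(), 1);
        }
    }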

By and large none of these is a big problem; the solutions to the issues above appear in the marked sections further down. First, an introduction to using the VideoPlayer, and then the problems are addressed one by one.

(1) Creating a Video Player. You can add the Video Player component under a UI object, or
simply right-click > Video > Video Player. Once added, you will see the component shown below.

(Figure: the Video Player component in the Inspector)

This article focuses on a few of the parameters. Source has two modes: Clip mode, which plays directly from a VideoClip, and URL mode, which plays from a URL. Render Mode sets how the video is rendered (Camera, Material, and so on); for UI playback choose Render Texture, which is what this article uses. Audio Output Mode has three options: None, Direct (untried), and Audio Source. This article uses Audio Source mode, in which you only need to drag an AudioSource component into the Video Player's Audio Source slot shown in the figure above, with no further handling. Occasionally, though, after dragging it in, the Audio Source slot on the Video Player turns up empty and no sound plays, so it is usually safer to add everything in code, as follows:

      //add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

 

(2) Controlling video playback is similar to controlling audio or animation playback; VideoPlayer has Play/Pause methods and so on (see the complete code further down).

About the event fired when playback finishes, loopPointReached (the name here follows other people's usage; it is not actually a "playback finished" event): as the name says, it fires when the video reaches its loop point. That is, when the VideoPlayer's isLooping is true (the video loops), it is invoked as the video ends; when the video does not loop, it is not called at the end. If you want it to fire, set the video to loop and stop playback inside the handler registered on loopPointReached.
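
A minimal sketch of that workaround, registering the handler and stopping the video there so loopPointReached behaves like an "on finished" event:

    using UnityEngine;
    using UnityEngine.Video;

    //sketch: turn loopPointReached into a "video finished" event
    public class VideoFinishWatcher : MonoBehaviour
    {
        void Start()
        {
            var player = GetComponent<VideoPlayer>();
            player.isLooping = true; //loopPointReached now fires at the end
            player.loopPointReached += OnReachedEnd;
        }

        private void OnReachedEnd(VideoPlayer player)
        {
            player.Stop(); //stop instead of looping back to the start
            Debug.Log("video finished");
        }
    }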

(3) On the UI choice for video playback: when using Render Texture you need to specify the
Target Texture.

1) In the Project panel, Create > Render Texture, then drag the new RenderTexture onto the VideoPlayer's Target Texture slot.

2) In the Hierarchy panel, create UI > Raw Image, and drag the RenderTexture from the previous step onto the RawImage's Texture field.

Actually you can skip all that: VideoPlayer exposes a texture property, and you can simply assign it to the RawImage's texture in Update, like this:

rawImage.texture = videoPlayer.texture;

For screenshots you can save the image via videoPlayer.texture, but the Texture has to be converted into a Texture2D; although Texture2D inherits from Texture, you cannot simply cast back down, so the conversion and saving code is as follows:

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        //blit the source texture into a temporary RenderTexture, read it back
        //into a Texture2D, then encode that to PNG
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        //ReadPixels copies from the currently active RenderTexture,
        //so swap it in, read, and restore the previous one
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }

 

Finally, a word on controlling playback progress with a Slider.

Driving playback with a Slider runs into a conflict: on one hand, Update keeps assigning
videoPlayer.time to the Slider; on the other hand, the Slider's value has to be fed back into time. Doing the latter in the Slider's OnValueChanged(float value) makes the two fight each other and causes problems. Instead, use the UI BeginDrag and EndDrag events: on BeginDrag, stop assigning to the Slider; on EndDrag, resume. As shown below.

(Figure: the Slider's EventTrigger with BeginDrag and EndDrag entries)
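
If you prefer wiring those events up in code rather than in the Inspector, a sketch with an EventTrigger could look like this; OnPointerDown and OnPointerUp are the handlers from the full VideoController listing below:

    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.UI;

    //sketch: hook BeginDrag/EndDrag on the progress slider from code,
    //equivalent to adding an EventTrigger component in the Inspector
    public class SliderDragHook : MonoBehaviour
    {
        public Slider videoSlider;
        public VideoController controller; //the class in the full code below

        void Start()
        {
            var trigger = videoSlider.gameObject.AddComponent<EventTrigger>();

            var begin = new EventTrigger.Entry { eventID = EventTriggerType.BeginDrag };
            begin.callback.AddListener(_ => controller.OnPointerDown());
            trigger.triggers.Add(begin);

            var end = new EventTrigger.Entry { eventID = EventTriggerType.EndDrag };
            end.callback.AddListener(_ => controller.OnPointerUp());
            trigger.triggers.Add(end);
        }
    }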

Full code

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoController : MonoBehaviour {
    public GameObject screen;
    public Text videoLength;
    public Text currentLength;
    public Slider volumeSlider;
    public Slider videoSlider;

    private string video1Url;
    private string video2Url;
    private VideoPlayer videoPlayer;
    private AudioSource audioSource;
    private RawImage videoScreen;
    private float lastCountTime = 0;
    private float totalPlayTime = 0;
    private float totalVideoLength = 0;

    private bool b_firstVideo = true;
    private bool b_adjustVideo = false;
    private bool b_skip = false;
    private bool b_capture = false;

    private string imageDir =@"D:\test\Test\bwadmRe";

    // Use this for initialization
    void Start () {
        videoScreen = screen.GetComponent<RawImage>();
        string dir = Path.Combine(Application.streamingAssetsPath,"Test");
        video1Url = Path.Combine(dir, "01.mp4");
        video2Url = Path.Combine(dir, "02.mp4");

        //add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

        VideoInfoInit(video1Url);
        videoPlayer.loopPointReached += OnFinish;
    }

    #region private method
    private void VideoInfoInit(string url)
    {
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = url;        

        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.isLooping = true;

        videoPlayer.Prepare();
    }

    private void OnPrepared(VideoPlayer player)
    {
        player.Play();
        totalVideoLength = videoPlayer.frameCount / videoPlayer.frameRate;
        videoSlider.maxValue = totalVideoLength;
        videoLength.text = FloatToTime(totalVideoLength);

        lastCountTime = 0;
        totalPlayTime = 0;
    }

    private string FloatToTime(float time)
    {
        int hour = (int)time / 3600;
        int min = (int)(time - hour * 3600) / 60;
        int sec = (int)(time - hour * 3600) % 60;
        string text = string.Format("{0:D2}:{1:D2}:{2:D2}", hour, min, sec);
        return text;
    }

    private IEnumerator PlayTime(int count)
    {
        for(int i=0;i<count;i++)
        {
            yield return null;
        }
        videoSlider.value = (float)videoPlayer.time;
        //videoSlider.value = videoSlider.maxValue * (time / totalVideoLength);
    }

    private void OnFinish(VideoPlayer player)
    {
        Debug.Log("finished");        
    }

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
    #endregion

    #region public method
    //start
    public void OnStart()
    {
        videoPlayer.Play();
    }
    //pause
    public void OnPause()
    {
        videoPlayer.Pause();
    }
    //next video
    public void OnNext()
    {
        string nextUrl = b_firstVideo ? video2Url : video1Url;
        b_firstVideo = !b_firstVideo;

        videoSlider.value = 0;
        VideoInfoInit(nextUrl);
    }
    //volume control
    public void OnVolumeChanged(float value)
    {
        audioSource.volume = value;
    }
    //video progress control
    public void OnVideoChanged(float value)
    {
        //videoPlayer.time = value;
        //print(value);
    }
    public void OnPointerDown()
    {
        b_adjustVideo = true;
        b_skip = true;
        videoPlayer.Pause();
        //OnVideoChanged();
        //print("down");
    }
    public void OnPointerUp()
    {
        videoPlayer.time = videoSlider.value;

        videoPlayer.Play();
        b_adjustVideo = false;  
        //print("up");
    }
    public void OnCapture()
    {
        b_capture = true;
    }
    #endregion

    // Update is called once per frame
    void Update () {
        if (videoPlayer.isPlaying)
        {            
            videoScreen.texture = videoPlayer.texture;
            float time = (float)videoPlayer.time;
            currentLength.text = FloatToTime(time);

            if(b_capture)
            {
                string name = DateTime.Now.Minute.ToString() + "_" + DateTime.Now.Second.ToString() + ".png";
                SaveRenderTextureToPNG(videoPlayer.texture,Path.Combine(imageDir,name));                
                b_capture = false;
            }

            if(!b_adjustVideo)
            {
                totalPlayTime += Time.deltaTime;
                if (!b_skip)
                {
                    videoSlider.value = (float)videoPlayer.time;
                    lastCountTime = totalPlayTime;
                }                
                if (totalPlayTime - lastCountTime >= 0.8f)
                {
                    b_skip = false;
                }
            }
            //StartCoroutine(PlayTime(15));   

        }
    }
}

If you use the AVPro Video plugin instead, all of these quirks become non-issues.
