Detailed explanation of WebRTC - web real-time communication technology

Introduction

WebRTC stands for Web Real-Time Communication.
Its main features are as follows:

  • It is a browser-based technology for real-time audio, video, and data communication
  • Plug-in free
  • Open source
  • Standardized by the W3C (often grouped with the HTML5 family of web APIs)
  • Cross-platform, cross-browser, and usable in mobile applications
  • Mac OS X, Windows, iOS, Android, Linux


Application Scenario

WebRTC is suited to real-time audio and video communication between web pages and to peer-to-peer data sharing; products such as QQ and Tencent Video have already adopted it.

Advantages

1. Convenience. Before WebRTC, users had to install plug-ins or client software for real-time communication, and for many of them downloading plug-ins and installing or updating software is complicated and error-prone. WebRTC is built into the browser, so users can communicate in real time through the browser without any plug-ins or extra software.
2. Free. Although WebRTC is a mature technology that integrates excellent audio/video engines and advanced codecs, Google does not charge any fees for it.
3. Strong hole-punching ability. WebRTC includes the key NAT and firewall traversal technologies STUN, ICE, TURN, and RTP-over-TCP, and it supports proxies; a hedged configuration example is shown right after this list.
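
As a minimal illustration of how NAT traversal is configured (an assumed example, not part of the official demo; the TURN entry is a hypothetical placeholder), STUN/TURN servers are handed to the browser when the peer connection is created:

var configuration = {
    iceServers: [
        // Public STUN server, used only to discover the client's public address
        { urls: 'stun:stun.l.google.com:19302' },
        // Hypothetical TURN relay; real projects supply their own server and credentials
        { urls: 'turn:turn.example.com:3478', username: 'user', credential: 'pass' }
    ]
};
var pc = new RTCPeerConnection(configuration);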

Disadvantages

1. Transmission quality is difficult to guarantee in scenarios such as cross-region or cross-carrier connections, low bandwidth, and high packet loss, which hurt the P2P connection rate and call success rate.
2. Device adaptation issues such as echo and recording failures emerge endlessly, especially on Android. Because there are many Android device manufacturers and each customizes the standard Android framework, there are numerous usability problems (e.g. failure to access the microphone) and quality problems (e.g. echo and howling).

WebRTC media session principle

Simplified diagram of the internal structure of WebRTC

WebRTC architecture diagram (screenshot from the official website https://webrtc.org/ )

The core technical points of WebRTC are briefly summarized into three parts

The following sections detail the WebRTC core APIs and the signaling server.

Detailed explanation of WebRTC core API

Using the two core APIs RTCPeerConnection and RTCDataChannel, arbitrary data can be exchanged peer to peer. The demo from the official website is as follows:

The demo does not require a server, because the caller (which sends the data) and the answerer (which receives the data) are on the same page. This makes the principle of the RTCPeerConnection API easy to see: the two RTCPeerConnection objects on the page exchange data and messages directly, without a signaling server.
You can view WebRTC statistics with the following developer tools:

  • Chrome: chrome://webrtc-internals
  • Opera: opera://webrtc-internals
  • Firefox: about:webrtc

Viewing WebRTC internals in the Chrome developer tools is shown in the following figure:

Demo code analysis

Taking the demo as an example, we analyze the process of creating a Web P2P connection, communicating, and transferring data, and examine the meaning and recommended usage of each key attribute, method, and event in the API. See GitHub for the complete source code.

function createConnection() {
    sendButton.disabled = true;
    megsToSend.disabled = true;
    var servers = null;

    bytesToSend = Math.round(megsToSend.value) * 1024 * 1024;

    // Create the connection. The servers argument can carry connection configuration;
    // since this demo connects two peers on the same page and needs no such information, null is passed.
    localConnection = new RTCPeerConnection(servers);
    // Print a log line
    trace('Created local peer connection object localConnection');

    var dataChannelParams = { ordered: false };
    if (orderedCheckbox.checked) {
        dataChannelParams.ordered = true;
    }

    // Create the data channel. Syntax: dataChannel = RTCPeerConnection.createDataChannel(label[, options]);
    // label is the channel name; options is an optional configuration object. Here ordered: true means
    // ordered (in-order) delivery and false means unordered; other options include maxPacketLifeTime, maxRetransmits, etc.
    sendChannel = localConnection.createDataChannel('sendDataChannel', dataChannelParams);
    sendChannel.binaryType = 'arraybuffer';
    trace('Created send data channel');

    // Bind onopen/onclose on the channel and onicecandidate on the connection
    // (onicecandidate fires whenever the RTCPeerConnection gathers an ICE candidate to pass to the other peer)
    sendChannel.onopen = onSendChannelStateChange;
    sendChannel.onclose = onSendChannelStateChange;
    localConnection.onicecandidate = function (e) {
        onIceCandidate(localConnection, e);
    };

    // Create the offer (caller side)
    localConnection.createOffer().then(
        gotDescription1,
        onCreateSessionDescriptionError
    );

    // Create the remote (answering) peer connection
    remoteConnection = new RTCPeerConnection(servers);
    trace('Created remote peer connection object remoteConnection');

    remoteConnection.onicecandidate = function (e) {
        onIceCandidate(remoteConnection, e);
    };
    // Fired when the remote peer has added a data channel to the connection via createDataChannel()
    remoteConnection.ondatachannel = receiveChannelCallback;
}

function receiveChannelCallback(event) {
    trace('Receive Channel Callback');
    receiveChannel = event.channel;
    receiveChannel.binaryType = 'arraybuffer';
    // Fired when a message is received on the channel
    receiveChannel.onmessage = onReceiveMessageCallback;

    receivedSize = 0;
}

function onReceiveMessageCallback(event) {
    receivedSize += event.data.length;
    receiveProgress.value = receivedSize;

    if (receivedSize === bytesToSend) {
        closeDataChannels();
        sendButton.disabled = false;
        megsToSend.disabled = false;
    }
}

function onSendChannelStateChange() {
    var readyState = sendChannel.readyState;
    trace('Send channel state is: ' + readyState);
    if (readyState === 'open') {
        sendGeneratedData();
    }
}

function sendGeneratedData() {
    sendProgress.max = bytesToSend;
    receiveProgress.max = sendProgress.max;
    sendProgress.value = 0;
    receiveProgress.value = 0;

    var chunkSize = 16384;
    var stringToSendRepeatedly = randomAsciiString(chunkSize);
    var bufferFullThreshold = 5 * chunkSize;
    var usePolling = true;
    if (typeof sendChannel.bufferedAmountLowThreshold === 'number') {
        trace('Using the bufferedamountlow event for flow control');
        usePolling = false;

        // Threshold at which the send buffer is considered full
        bufferFullThreshold = chunkSize / 2;

        // Low-water mark: the 'bufferedamountlow' event fires when bufferedAmount drops below this value
        sendChannel.bufferedAmountLowThreshold = bufferFullThreshold;
    }

    // Handler for the 'bufferedamountlow' event
    var listener = function () {
        sendChannel.removeEventListener('bufferedamountlow', listener);
        sendAllData();
    };
    var sendAllData = function () {
        // Queue chunks of data for sending and stop when the channel's buffer fills up.
        // Scheduling a timeout after every send is not recommended, as it would reduce throughput.
        while (sendProgress.value < sendProgress.max) {
            if (sendChannel.bufferedAmount > bufferFullThreshold) {
                if (usePolling) {
                    setTimeout(sendAllData, 250);
                } else {
                    sendChannel.addEventListener('bufferedamountlow', listener);
                }
                return;
            }
            sendProgress.value += chunkSize;
            // Send data with send(). The RTCDataChannel API is very similar to WebSocket:
            // both have a message event and a send method.
            sendChannel.send(stringToSendRepeatedly);
        }
    };
    setTimeout(sendAllData, 0);
}
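
The demo also references helper functions such as gotDescription1, gotDescription2, and onIceCandidate that are not shown in the excerpt above. The following is a minimal sketch of what they do, assuming the standard RTCPeerConnection offer/answer API; the bodies are reconstructions for illustration, not the original source:

function gotDescription1(desc) {
    // The caller stores its offer and, because both peers live on the same page,
    // hands it to the remote connection directly (no signaling server involved).
    localConnection.setLocalDescription(desc);
    trace('Offer from localConnection:\n' + desc.sdp);
    remoteConnection.setRemoteDescription(desc);
    remoteConnection.createAnswer().then(
        gotDescription2,
        onCreateSessionDescriptionError
    );
}

function gotDescription2(desc) {
    // The answer travels back the same way, completing the offer/answer exchange.
    remoteConnection.setLocalDescription(desc);
    trace('Answer from remoteConnection:\n' + desc.sdp);
    localConnection.setRemoteDescription(desc);
}

function onIceCandidate(pc, event) {
    // Each gathered ICE candidate is added straight to the other peer connection.
    var otherPc = (pc === localConnection) ? remoteConnection : localConnection;
    if (event.candidate) {
        otherPc.addIceCandidate(event.candidate).then(
            function () { trace('addIceCandidate success'); },
            function (err) { trace('addIceCandidate failure: ' + err.toString()); }
        );
    }
}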

WebRTC Core API Compatibility

MediaStream and getUserMedia

  • Chrome desktop 18.0.1008+; Chrome for Android 29+
  • Opera 18+; Opera for Android 20+
  • Opera 12, Opera Mobile 12 (based on Presto engine)
  • Firefox 17+
  • Microsoft Edge
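
For completeness, here is a minimal sketch of capturing a local stream with the promise-based getUserMedia API (the <video> element selector is an assumption for illustration):

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
    .then(function (stream) {
        // Attach the captured MediaStream to a <video> element for local preview
        document.querySelector('video').srcObject = stream;
    })
    .catch(function (err) {
        console.error('getUserMedia error: ' + err.name);
    });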

RTCPeerConnection

  • Chrome desktop 20+ (now ‘flagless’, i.e. no need to set about:flags); Chrome for Android 29+ (flagless)
  • Opera 18+ (on by default); Opera for Android 20+ (on by default)
  • Firefox 22+ (enabled by default)

RTCDataChannel

  • Experimental version in Chrome 25, more stable in Chrome 26+ (and with Firefox interoperability); Chrome for Android 29+
  • Stable in Opera 18+ (and interoperable with Firefox); Opera for Android 20+
  • Firefox 22+ (enabled by default)

Signaling server

Signaling is the process of coordinating communication. To establish a WebRTC session, the clients need to exchange the following information:

  • Session control information, used to start and end a call (instructions such as start video and end video).
  • Error messages.
  • Media metadata, such as each side's audio/video codecs and bandwidth.
  • Network data: the other party's public IP address and port, and internal IP address and port.

We need an intermediate server to exchange signaling messages and data between the clients. WebRTC itself does not implement this process, and the specification does not prescribe how to communicate with the server, so any transport can be used, for example WebSocket. Beginners can use Node.js to build a simple signaling server to exchange metadata between the two parties; real projects will also involve STUN and TURN servers.

The following is the source code for NodeJS to create a signaling server:
 

'use strict';

var os = require('os');
var nodeStatic = require('node-static');
var http = require('http');
var socketIO = require('socket.io');

var fileServer = new (nodeStatic.Server)();
var app = http.createServer(function (req, res) {
    fileServer.serve(req, res);
}).listen(8080);

var io = socketIO.listen(app);
io.sockets.on('connection', function (socket) {

    // Logging helper: sends server-side log messages back to the client
    function log() {
        var array = ['Message from server:'];
        array.push.apply(array, arguments);
        socket.emit('log', array);
    }

    socket.on('message', function (message) {
        log('Client said: ', message);
        // This example simply broadcasts to everyone; a real project should target a specific room
        // (Socket.IO is well suited to learning WebRTC signaling because it has a built-in 'room' concept)
        socket.broadcast.emit('message', message);
    });

    socket.on('create or join', function (room) {
        log('Received request to create or join room ' + room);

        var clientsInRoom = io.sockets.adapter.rooms[room];
        var numClients = clientsInRoom ? Object.keys(clientsInRoom.sockets).length : 0;

        log('Room ' + room + ' now has ' + numClients + ' client(s)');

        if (numClients === 0) {
            socket.join(room);
            log('Client ID ' + socket.id + ' created room ' + room);
            socket.emit('created', room, socket.id);

        } else if (numClients === 1) {
            log('Client ID ' + socket.id + ' joined room ' + room);
            io.sockets.in(room).emit('join', room);
            socket.join(room);
            socket.emit('joined', room, socket.id);
            io.sockets.in(room).emit('ready');
        } else { // At most two clients per room
            socket.emit('full', room);
        }
    });

    socket.on('ipaddr', function () {
        var ifaces = os.networkInterfaces();
        for (var dev in ifaces) {
            ifaces[dev].forEach(function (details) {
                if (details.family === 'IPv4' && details.address !== '127.0.0.1') {
                    socket.emit('ipaddr', details.address);
                }
            });
        }
    });

});
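
To show how such a server fits into the signaling flow, below is a hedged client-side sketch: it joins a room and relays SDP offers/answers and ICE candidates as 'message' events. The event names mirror the server code above; everything else (room name, message shapes, which side offers) is an assumption for illustration:

var socket = io.connect();
var room = 'demoRoom';                 // assumed room name
var isInitiator = false;
socket.emit('create or join', room);
socket.on('created', function () { isInitiator = true; });

var pc = new RTCPeerConnection(null);
pc.onicecandidate = function (e) {
    // Relay each locally gathered ICE candidate to the other peer via the signaling server
    if (e.candidate) {
        socket.emit('message', { type: 'candidate', candidate: e.candidate });
    }
};

// The server emits 'ready' once a second client joins; only the room creator sends the offer
socket.on('ready', function () {
    if (!isInitiator) { return; }
    pc.createOffer()
        .then(function (offer) { return pc.setLocalDescription(offer); })
        .then(function () { socket.emit('message', pc.localDescription); });
});

socket.on('message', function (message) {
    if (message.type === 'offer') {
        // Answerer side: accept the offer, then create and send back an answer
        pc.setRemoteDescription(new RTCSessionDescription(message))
            .then(function () { return pc.createAnswer(); })
            .then(function (answer) { return pc.setLocalDescription(answer); })
            .then(function () { socket.emit('message', pc.localDescription); });
    } else if (message.type === 'answer') {
        pc.setRemoteDescription(new RTCSessionDescription(message));
    } else if (message.type === 'candidate') {
        pc.addIceCandidate(new RTCIceCandidate(message.candidate));
    }
});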

There is plenty of room for innovation using WebRTC-related technologies. For example, some startup teams in the industry are already building Web P2P products: the core technology is WebRTC plus the DASH protocol, used to share idle resources. On top of this you can build a "fog CDN" whose nodes all sit on the user side and are fully decentralized; there is still a lot of room to explore.

Reference materials
webrtc.org/
developer.mozilla.org/zh-CN/docs/…
hpbn.co/webrtc/
webrtchacks.com/
codelabs.developers.google.com/codelabs/we…

Author: Lin Congyuan tinylin
Original text link: "Detailed explanation of WebRTC - Web real-time communication technology" on Juejin (Nuggets)

 
