The technical force behind Tencent's 1,300 NBA live broadcasts

About the Author

Li Zhendong

Deputy Director of Operations (O&M), Tencent OMG
He has successively been responsible for operations of Tencent.com, Tencent News, Tencent Video, and other businesses, and has studied smooth, large-scale, instant-start, low-latency live streaming in depth. He currently focuses on building a live-broadcast automation system, the live-broadcast monitoring system, and live-broadcast quality optimization, work that has been proven in productions such as The Voice of China, live concerts, and NBA broadcasts.

Preface

Technology improves gradually as we work through important challenges. This article covers my work on large-scale live broadcasting over the past year, the most representative of which is the NBA live broadcast, with about 1,300 games per year.

1. The rapid growth of the live-streaming business

From 2015 to 2016, the live-streaming industry was on the rise and a large number of live-broadcast businesses emerged. Why has live streaming grown so much over the past two years? It mainly comes down to four factors.


  • The first is the technology driver: smart hardware, smartphones, and greater network bandwidth have made it far more convenient to watch live broadcasts.

  • The second is the commercial driver: the emergence of content IP, including sports and concerts. Tencent, for example, runs hundreds of live concerts a year, and individual streaming anchors (including the "hanmai" shout-singing genre) have also appeared, which has enriched the whole content ecosystem.

  • The third is monetization, closely tied to the commercial driver: there are now rich ways to make money, including membership systems, tipping and rewards, and content-distribution models. When live streaming turns a profit, it injects vitality into the industry.

  • The fourth is public demand. Online live streaming enriches the forms of entertainment, and Chinese audiences in particular love to follow what is happening. If you are told something happened a few days ago, or that others already knew about it before you did, it feels uncomfortable; it is far better to watch the scene live and see for yourself what a person is doing or how something unfolds.

When people can get first-hand information and satisfy that curiosity, they are very willing to watch. And it is not only girls who love gossip; boys do too, just about different things.

Taken together, these factors, along with the supporting technology, have pushed the live-streaming industry to new heights; it is arguably hotter than it has ever been. Anything can be broadcast live, and this talk is being live-streamed right now as well. My thanks to the live broadcast team; I know they work very hard.

2. Features needed for an excellent live broadcast


What are the elements of a good live broadcast? Everyone wants to do live streaming, but how do you do it well? I was under real pressure at the time: for a large Internet company like Tencent, if a live broadcast fails and users are unhappy, it becomes a huge reputation problem.

I have listed some of the key technical points. The picture quality must be clear. When watching a Victoria's Secret show, for instance, viewers expect quality so sharp they want to press their faces to the screen; if the key moments are blurry, the bullet-screen comments erupt, complaining that the so-called 1080P looks worse than 720P.

Smoothness matters just as much. When you are watching a game and a player is about to shoot, the stream freezes at exactly that moment, and then it turns out the shot was missed, you can imagine the mood of the viewers and how thoroughly that ruins the broadcast for them.

There is also latency, which matters a lot in interactive scenarios. At one event I announced a Q&A session and invited the online audience to send in questions. Nothing came in, and I started to wonder whether nobody liked my talk; then someone messaged me on WeChat to say the signal had not reached them yet and I should wait another two minutes. By then the people waiting on site were probably about to collapse.

Then there is audio-video synchronization. During one concert, the bullet-screen comments suddenly accused the singer of lip-syncing. The singer explained that there was no lip-syncing, that this was a genuine live performance. But viewers took screenshots and recorded video: the lyrics were clearly running ahead of the singer's mouth movements. That is a classic case of audio and video being out of sync.

Of course there are many more scenarios, and I will go into the details later. Live streaming places very high demands on the viewing experience, so with all of these technical requirements, how do we do it well, reduce the complaints, and lower the probability of getting it wrong?

3. Technology choices for live broadcasting


Next I will walk through the typical scenarios of the NBA broadcast workflow. Facing so many technical requirements and so much user pressure, how do we choose our technology? Live broadcasting is a genuinely complicated technology because it contains many links in the chain.

3.1 Video live streaming process


The end-to-end live video broadcast covers capture, transmission, packaging, encoding, stream pushing, transcoding, distribution, decoding, and playback; counting everything, there are about 18 links in the chain. For example, cameras 1, 2, and 3 do simple signal processing, the signal goes out over satellite or network transmission to the program production center, where it is packaged, and after packaging it starts to be distributed to users.

On top of that, we face all kinds of user requirements and clarity requirements: users' networks and devices differ, so we need multiple resolutions and multiple container formats, such as FLV and HLS. After distribution through the content delivery network, the stream finally reaches each user's terminal, and the terminal must adapt to it. The whole process is very complicated.
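As a rough illustration of the "multiple resolutions, multiple formats" step (this is a generic sketch, not Tencent's actual transcoding pipeline; the stream URLs, bitrates, and output paths are placeholders), one source can be fanned out into an RTMP/FLV rendition and an HLS rendition by driving ffmpeg from a small script:

```python
# Hypothetical sketch: fan one source stream out into a 720p FLV/RTMP rendition
# and a 720p HLS rendition by launching two ffmpeg processes.
import subprocess

SRC = "rtmp://ingest.example.com/live/nba"      # placeholder source stream

flv_720p = [
    "ffmpeg", "-i", SRC,
    "-c:v", "libx264", "-b:v", "2500k", "-s", "1280x720",
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv", "rtmp://origin.example.com/live/nba_720p",
]

hls_720p = [
    "ffmpeg", "-i", SRC,
    "-c:v", "libx264", "-b:v", "2500k", "-s", "1280x720",
    "-c:a", "aac", "-b:a", "128k",
    "-f", "hls", "-hls_time", "4", "-hls_list_size", "6",
    "/var/www/hls/nba_720p.m3u8",
]

# Each rendition runs as its own process; a real system would supervise them
# and add more rungs to the bitrate ladder (1080p, 480p, audio-only, ...).
for cmd in (flv_720p, hls_720p):
    subprocess.Popen(cmd)
```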

3.2 Challenges in playback experience


Let me briefly introduce the technical problems we faced during the NBA live broadcasts and how we solved them. I have grouped them into four points.

  • Transmission
    The NBA is played in the United States, so the signal has to travel roughly 18,000 kilometers from a camera in the US to users in China. The production facility is in New Jersey, on the US east coast, and China is on the western side of the Pacific, so the signal must cross the entire North American continent and then the Pacific Ocean, land in Hong Kong at the southern tip of China, reach the Beijing studio, and then be distributed from Beijing to millions of households. That is roughly 18,000 kilometers of transmission, much of it undersea, and it requires crossing some real technical thresholds.

  • Production
    Once the signal arrives, the raw American feed comes with English commentary. Some viewers do prefer the original audio, but most Chinese viewers want Chinese commentary, plus player information and event information. That requires program packaging to enhance the audiovisual experience and make the broadcast more comfortable, more familiar, and easier for Chinese viewers to follow.

  • Playback
    We also need multiple viewing angles and multiple resolutions. Multiple angles satisfy different viewing preferences; for the NBA we provide three, such as an under-the-basket view and left and right sideline views, so that everyone can watch the game from a different perspective.

    There is also clarity and smoothness. Playback is a critical point: if playback is not smooth, not clear, or even shows mosaic artifacts, users will find it unacceptable. How to ensure users do not stall during playback, and how to let them see the picture faster, are technical problems that have to be solved.

  • Monitoring
    Finally, monitoring. With so many links in the chain, such long distances, and an active user base approaching 100 million across so many terminals, the probability of failure is very high. We must minimize every risk, but risks can never be eliminated entirely and failures will happen. How do we get the contingency plan running quickly once a failure occurs? Monitoring is a crucial link here.

Everyone does monitoring, and Tencent has plenty of experience with it too, but monitoring has now reached a new level. Beyond covering every necessary link, there is the impact of big data: the monitoring information we collect amounts to roughly 200 billion records a day. How to gather that data, analyze it, and find the root of a problem within an extremely short time to help us localize faults quickly is a huge challenge, and that is where big data processing comes in.

Now let me go through, one by one, how we solved these problems at the time. It was the summer of 2015, and I was quite happy when I received the assignment. The NBA was a very large investment: Tencent pays about 100 million per year, in US dollars. The company handed such an important task to us, and we had to prove our value. Do it well and a raise and promotion go without saying; do it badly and I might be collecting my last paycheck from finance. Risk and opportunity always come together.

4. Facing the transmission challenge

4.1 Challenge 1: screen corruption and stream interruption during transmission


But on the first day of broadcasting we were dumbfounded, because the picture came out full of corrupted blocks, and the pressure felt deep and helpless. The leadership asked: how are you going to solve this? We had never done live broadcasting before, so we started to analyze. To satisfy the low-latency and real-time requirements, the transmission uses UDP.

But UDP is like a convoy: it is easy to get stuck. Once a packet is held up during transmission, and because the video frames are heavily compressed, the loss of a single packet can affect a whole block or region of the picture. Over such a long distance, packet loss is inevitable, and once packets are lost the picture breaks up or freezes. At the time everyone, operations and engineering alike, insisted it had to be fixed. There are indeed ways to fix it, which I will come back to.

4.2 Challenge 2: instability of the ultra-long transmission link


The approach we chose was network transmission: from the New Jersey facility across the North American continent, across the Pacific, and then from the Hong Kong node to Beijing. We measured the distance at the time: 17,286.59 kilometers. Over such a distance the ocean brings natural factors, possibly even tsunamis, and it is very easy for the whole line to become unstable.

Some of you may ask: why not use satellite transmission? Satellite transmission is simple; it only takes two relays: the US satellite sends to a European satellite, which relays on to a Chinese satellite.

Satellite transmission is indeed simple, but it is extremely expensive: roughly 50 times the price of network transmission, and network transmission is already very expensive. Using satellite for all 1,300 games is simply not realistic.

4.3 Transmission optimization solutions


  • Error-correction (FEC) technology
    Technology only really shows its value when you hit problems. I took the task on, and packet loss can be dealt with. We adopted a forward error correction technique: the packets to be transmitted are arranged into a matrix, with an extra parity packet added for each row and each column. With a 10 x 10 matrix like this, if one packet is lost in any row or column, it can be reconstructed from the parity packets.

    If we send 100 packets plus 20 parity packets, then no matter which single packet is lost in any row or column, there is no picture disturbance at all. This raised the reliability of the original UDP transmission; at the time the fault-tolerance reliability of our dedicated network line could reach about one in a thousand.

    So for the corrupted pictures we saw earlier, the network itself was hard to improve any further; there was little room for bigger optimizations. Instead, we relaxed the packet-loss requirement all the way to 10%, although consecutive losses still defeat the scheme: losing two or three packets in the same row cannot be recovered. Overall this brought the probability of visible picture corruption down to roughly five in a thousand.

    Originally every game had some picture glitches every day; now it is roughly one brief, barely noticeable glitch every ten games. If you run into problems with long-distance transmission, this error-correction technique is worth a look; the smaller the matrix, even down to a 2 x 2 matrix, the stronger the correction capability (a minimal sketch of the row/column parity idea follows after this list).

    But it brings another cost: adding a large number of parity packets increases the bitrate in transit. With the parity packets we add roughly 20% more traffic; a stream that used to be 1 Mbps becomes about 1.2 Mbps. It is a classic case of trading space (bandwidth) for time.

    At the time we also studied the methods the US military uses for wireless transmission, which I will not go into here; with these techniques the problem is basically solved.

  • Multi-link backup
    That still leaves the roughly five-in-a-thousand probability; how do we deal with it? We built network redundancy: we deployed three dedicated lines across the North American continent and the Pacific. Each line is marked red, green, yellow, or black; the colors just denote differences in what is carried, because not every signal is backed up or carried in multiple copies; only the main signal is transmitted with multiple backups.

    This rather complex network lowers the impact of packet loss under normal conditions, and also helps us cope with complex natural conditions and low-probability events such as construction work damaging a leased line.

    Of course, for important games we still keep satellite transmission as a backup plan. So in the signal transmission stage we mainly rely on error-correction techniques and multi-path, multi-copy transmission to reduce picture corruption or interruption.
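Below is a minimal sketch of the row/column parity idea from the first bullet. The 10 x 10 block, the packet size, and the single-loss recovery path are illustrative assumptions, not the actual on-the-wire format:

```python
# Minimal sketch of row/column XOR parity FEC: 100 data packets in a 10x10
# block plus 20 parity packets (~20% overhead). Real schemes (Reed-Solomon,
# interleaving) are more sophisticated; this only recovers one loss per row.
import os

ROWS, COLS, PKT_SIZE = 10, 10, 1316   # packet size is an illustrative choice

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(block):
    """block: list of ROWS*COLS data packets -> (row_parity, col_parity)."""
    row_parity = [bytes(PKT_SIZE)] * ROWS
    col_parity = [bytes(PKT_SIZE)] * COLS
    for i, pkt in enumerate(block):
        r, c = divmod(i, COLS)
        row_parity[r] = xor(row_parity[r], pkt)
        col_parity[c] = xor(col_parity[c], pkt)
    return row_parity, col_parity

def recover_one(block, lost_index, row_parity):
    """Rebuild a single lost packet from the other packets in its row."""
    r = lost_index // COLS
    rebuilt = row_parity[r]
    for c in range(COLS):
        idx = r * COLS + c
        if idx != lost_index:
            rebuilt = xor(rebuilt, block[idx])
    return rebuilt

# Usage: lose one packet and rebuild it from the row parity.
data = [os.urandom(PKT_SIZE) for _ in range(ROWS * COLS)]
row_p, col_p = encode(data)
lost = 37
assert recover_one(data, lost, row_p) == data[lost]
```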

5. Facing the production challenge

5.1 Visual enhancement: captions


When the signal arrives at the production center intact, we can happily move on to program packaging: for example, adding captions and using a character generator to turn player information into Chinese text on screen.

5.2 Visual enhancement: AR


We can also use AR technology to overlay interactive elements from the game and data analysis onto the live picture.

5.3 Visual enhancement: multiple angles


A fairly important point is multiple angles, which make the broadcast more engaging to watch: for example, we added the original English audio as well as low-angle and right-backboard camera views. With that, the transmission and production-packaging stages of the program are complete.

6. Facing the playback challenge

6.1 Problem 1: playback smoothness

Now we come to the crucial part. The program is ready; next it has to be delivered to users, and delivery comes with a concrete requirement: smoothness.


  • The first is the two-second rule: when a user opens a video, if it takes more than two seconds to start, the probability that the user leaves grows steadily, and every extra second of startup time can increase the abandonment rate by roughly 6%. So we must let the user see the picture within two seconds. Users are our gods; we cannot test their patience, and past two seconds they start to leave.

  • The second is the impact of stalls, also drawn from data analysis: every additional second of stalling reduces the user's watch time by about 1%, and the more it stalls, the more likely the user is to leave. So how do we solve the smoothness problem during playback?

6.2 Solution: CDN technology


The first is the most universal technique, CDN. We deployed 500 CDN nodes across the country, including regions such as Xinjiang and Hong Kong and the remote Yunnan-Guizhou area.

CDN is a fairly mature technology: push the content to the point closest to the user. With 500 nodes in place, we also added techniques to speed up user access: we schedule users directly by IP, skipping DNS resolution, which saves time during connection setup. We also collect real-time statistics on the overall status.
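A toy sketch of what "IP scheduling without DNS" can look like; the node table, region rule, and load figures here are hypothetical, purely to show the player receiving a concrete node IP to connect to directly:

```python
# Hypothetical sketch of IP-direct CDN scheduling: the player asks a scheduling
# service for a concrete node IP and connects to it directly, instead of
# resolving a CDN hostname through DNS. Node table and region rule are made up.
CDN_NODES = {
    "north": ["111.0.0.10", "111.0.0.11"],
    "south": ["122.0.0.20", "122.0.0.21"],
    "hk":    ["103.0.0.30"],
}

def pick_node(client_region: str, load: dict) -> str:
    """Return the least-loaded node IP in the client's region (fallback: any node)."""
    candidates = CDN_NODES.get(client_region) or [
        ip for ips in CDN_NODES.values() for ip in ips
    ]
    return min(candidates, key=lambda ip: load.get(ip, 0.0))

# The player would then fetch e.g. http://<node_ip>/live/nba.flv directly,
# setting the Host header to the logical CDN domain.
print(pick_node("south", {"122.0.0.20": 0.7, "122.0.0.21": 0.3}))
```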

With excellent CDN technology and coverage in place, can we really meet the two-second startup requirement? Not quite, because live broadcasting has one important characteristic: what happens the moment a broadcast starts.

A live broadcast is not on 24 hours a day; when there is no signal, users have nothing to watch. But the moment a broadcast starts, say a game tips off, there is an extremely strong surge of users entering all at once.

6.3 Problem 2: guaranteeing the playback experience for a massive user base


Tencent has channels such as WeChat and QQ, so when an NBA game starts our user count can hit its peak within one minute, with roughly two million-plus users entering every minute.

When there are that many people, it gets crowded. It is not that the technology is incompetent; there are simply too many users. Think of refreshing for train tickets: everyone curses 12306, but I absolutely refuse to, because the volume really is enormous; 12306 publishes how many people it handles every day.

When a massive number of users all want in at exactly the same moment, it really is hard to support. So what do we do? Life must go on, and we would rather keep our jobs.

6.4 Solution: scheduling strategies


When users pour in rapidly, under such a powerful wave of demand, what kinds of impact does it cause? I summarize it in two aspects.

The first is that a rapid influx of users causes congestion in part of the system. The second is that there are simply too many users for the system to support at all. For local congestion we use a pre-scheduling strategy: the users arrive fast, but our response mechanism has to be faster.


The second is graceful degradation (柔性降级), a very important idea in massive-scale engineering: continuing to serve users in a deliberately lossy way.

For example, suppose only a hundred seats have been prepared but two hundred people show up. What then? If it is disorderly and nothing is done, fights may break out on the spot, causing even greater chaos.

So what do we do when the platform simply cannot support that many users and the estimate turns out to be wrong? That is when a graceful-degradation strategy is needed, which I will describe in detail shortly.

As the saying goes, in martial arts only speed is unbeatable. When users flood in quickly, they inevitably put huge pressure on part of the system. How do we dissipate that pressure quickly? We used two important methods.

  • Collecting traffic data over SNMP to cut reporting delay
    The first method is to use the Simple Network Management Protocol (SNMP) to pull traffic counters directly from the switches. Counting users as they log in can lag by three or four seconds, yet thirty thousand users may enter every thirty seconds, and live streaming is a high-bandwidth service, so tens of thousands of users can already mean tens or even hundreds of gigabits of extra traffic. So instead of reading per-server NIC statistics, we read the switch traffic counters, which keeps the delay of traffic collection to a minimum (a sketch combining this with the prediction step follows below).

  • Prediction for timely traffic diversion
    The other technique is prediction. Prediction is basically the technique of, after falling down, studying how you fell and analyzing your falling posture.

    Although users arrive very quickly, there is still a pattern. For every game we look at the curve of how users enter: if so many tens of thousands of users enter within one minute, what is the probability that a given data center will be overwhelmed?

    Once the probability of overwhelming a data center exceeds 60%, even though the actual traffic may only be at 30% rather than 60%, if the traffic growth curve already indicates a high probability of overload, we divert traffic away from that data center in advance, so new users no longer enter it.

We fell down before precisely because our data had a one-minute delay, and within that minute the users flooding in had already crushed the data center. Once we started predicting, as soon as the curve of the previous minute suggested the data center might be overwhelmed, we stopped routing new traffic to it and diverted it in advance. Pre-scheduling, being fast, and even predictive, is how we solve local congestion.
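A rough sketch of these two ideas combined: polling switch counters over SNMP and diverting when a simple extrapolation predicts overload. The IF-MIB counter OID and the net-snmp CLI are real, but the switch address, community string, interface index, link capacity, and 60% threshold are illustrative assumptions:

```python
# Sketch: poll a switch's output-octet counter via SNMP, measure the ramp-up,
# and decide to divert before the predicted traffic crosses the threshold.
import subprocess, time

SWITCH = "10.0.0.1"                    # hypothetical switch address
OID = "1.3.6.1.2.1.31.1.1.1.10.1"      # IF-MIB::ifHCOutOctets for ifIndex 1
CAPACITY_BPS = 100e9                   # assume a 100 Gbps uplink
THRESHOLD = 0.6                        # divert when predicted utilization > 60%

def out_octets() -> int:
    """Read the 64-bit output-octet counter with the net-snmp CLI."""
    out = subprocess.check_output(
        ["snmpget", "-v2c", "-c", "public", "-Oqv", SWITCH, OID], text=True)
    return int(out.split()[-1])

def should_divert(poll_seconds: int = 5, horizon: int = 60) -> bool:
    """Measure two consecutive rates and extrapolate the ramp one minute ahead."""
    c0 = out_octets(); time.sleep(poll_seconds)
    c1 = out_octets(); time.sleep(poll_seconds)
    c2 = out_octets()
    rate1 = (c1 - c0) * 8 / poll_seconds        # bits/s in the first interval
    rate2 = (c2 - c1) * 8 / poll_seconds        # bits/s in the second interval
    growth_per_sec = (rate2 - rate1) / poll_seconds
    predicted = rate2 + growth_per_sec * horizon
    return predicted / CAPACITY_BPS > THRESHOLD
```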

Scheduling strategy: graceful degradation
The other approach is graceful handling of the global congestion risk. We do have a very rich system for predicting online user numbers: before every game there is professional data analysis based on the two teams' fan bases, uncontrollable factors, and which channels the game is promoted and funneled through, estimating, say, five or six million viewers. Prediction is an important step, but not an absolutely safe one; it can never be completely accurate. It is like the September 3rd military parade: everyone predicted how many people would watch, and the result left us stunned, because everyone was watching. So prediction is not absolutely reliable; it can only serve as a theoretical basis.

Method 1: queuing
   If I planned for one table of guests and two tables show up, what then? How do we avoid chaos on the spot? There has to be a flexible mechanism, and we have several methods.

The first method is queuing. Suppose the forecast was five million users but 5.02 million arrive. We do not let the extra users squeeze straight in: coming straight in means competing for resources, and live streaming is a high-bandwidth business, so once resource contention starts, users cannot download enough data and playback stalls. Instead we hold them back and ask them to wait a moment.

Method 2: graceful degradation
Some will ask: what if they cannot wait? There is no way around it; letting them in would hurt the remaining five million and could bring everything down. We can also offer richer fallbacks: when there are especially many users, for concerts or even games, we provide alternative streams, and if we cannot provide video we provide audio. For the Faye Wong concert we provided a dedicated audio stream, so that when there were too many users and not enough bandwidth, users could still choose audio.

This is the key to graceful degradation: never let the overflow users compete, in an uncontrolled way, for resources with the users who are already being served well. If that kind of contention starts, the whole service collapses. There must be a contingency plan, an admission mechanism, and a rich set of degradation options, so that existing users' experience is protected while the users who want to get in are given a reasonable fallback.
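A toy sketch of that admission idea: protect users already being served, and queue or degrade the overflow rather than letting it contend for bandwidth. The capacity figure and the bitrate ladder are made-up numbers:

```python
# Toy admission controller: grant full quality while capacity allows, degrade
# down the bitrate ladder when it does not, and queue when nothing fits.
# (Releasing capacity when users leave is omitted for brevity.)
from collections import deque

CAPACITY_MBPS = 400_000                                   # assumed total egress
LADDER = [("1080p", 4.0), ("720p", 2.5), ("audio", 0.1)]  # Mbps per user

class Admission:
    def __init__(self):
        self.used_mbps = 0.0
        self.queue = deque()

    def admit(self, user_id: str) -> str:
        """Return the rendition granted to this user, or 'queued'."""
        for name, mbps in LADDER:            # try full quality first, then degrade
            if self.used_mbps + mbps <= CAPACITY_MBPS:
                self.used_mbps += mbps
                return name
        self.queue.append(user_id)           # nothing fits: wait in line
        return "queued"

ctrl = Admission()
print(ctrl.admit("user-1"))                  # -> "1080p" while capacity remains
```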

Those are the two scheduling strategies. If the rapid influx causes only local congestion, we fight speed with speed, obtaining the data-center traffic figures faster than the users arrive. When the load is simply more than we can carry, and moving users from data center A to data center B does not solve it, we fall back on graceful handling: queuing, or degradation options such as audio-only or lower-resolution video, serving some users in a reduced way so they do not harm the experience of everyone else.

Once the two-second rule and stalls are handled, and with techniques for all of these user scenarios, the smoothness requirement is largely solved. But users still want more: two seconds is the limit of their basic patience, yet they would like to see the picture even faster. An important technique here is "instant start" (秒开), letting users see the picture as quickly as possible; nothing is absolute, and it can always be pushed further toward the extreme.

6.5 Solution: getting the picture in front of users faster


The key lies in how frames are compressed. An I-frame removes only the spatial redundancy within the image: it is compressed entirely within the frame, with no temporal dependence, so it can be decoded on its own. A P-frame depends on the preceding I-frame; by itself it cannot produce a picture, it needs the I-frame plus motion information to fill in the background and movement, so it decodes only with those motion parameters. A B-frame is bidirectional: it cannot be decoded alone either, because it also depends on the following P-frame, so it needs both the I-frame and the P-frame before its compressed data can be decoded. That is essentially the picture compression logic.

Previously the process was unordered: the player might be handed an I-frame, a B-frame, or a P-frame first. If the first thing you download is a B-frame, it cannot be decoded; you have to finish downloading the I-frame and then the P-frame before anything can be decoded. That means downloading more data and waiting longer before the picture appears, which is intolerable for anyone pursuing the technical extreme.

So we adopted a technique to let users see the picture faster: always start by downloading an I-frame. This required modifying the player as well; as soon as the user has downloaded an I-frame, the picture appears. This cut the startup time by nearly 200 milliseconds, so our users, our gods, see the picture about 200 ms sooner.
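A simplified sketch of the "start from an I-frame" idea: the serving side keeps the most recent GOP buffered and, when a player joins, starts sending from the latest keyframe so the first frame received is immediately decodable. The frame structure here is hypothetical:

```python
# Sketch: pick the start of the frame sequence for a newly joined player so
# that it begins with a decodable keyframe rather than a P- or B-frame.
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    kind: str        # "I", "P" or "B"
    data: bytes

def start_sequence(buffer: List[Frame]) -> List[Frame]:
    """Return the frames to send to a new player: everything from the most
    recent I-frame onward, so the first frame received decodes immediately."""
    for i in range(len(buffer) - 1, -1, -1):
        if buffer[i].kind == "I":
            return buffer[i:]
    return []        # no keyframe buffered yet: wait for the next one

gop = [Frame("I", b".."), Frame("P", b".."), Frame("B", b".."), Frame("P", b"..")]
assert start_sequence(gop)[0].kind == "I"
```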

The advantage is even more obvious in big sporting events and especially for individual streamers. Speaking of these techniques, I remember an interesting question. Someone once asked me: is this stuff really that hard? I said it honestly does not feel particularly hard; once the concept is explained it is clear, and the player changes could probably be made in a week or two.

He asked: if the techniques are not hard, why can't other live-streaming platforms or industries do them? My answer was that doing technology, especially at massive scale, really involves two points. First, any single point is not difficult to solve on its own; what is difficult is building a technical system that solves all the different problems this particular business runs into.

We solved the CDN problem, the transmission problem, then scheduling on top of the CDN, then smoothness under massive load, and then fast picture startup. There are really many points to solve, and only by reviewing them all together does a methodology gradually take shape; it is not something one or two isolated points can accomplish.

So massive-scale technology is not easy. The point is not to give up along the way, to push every technical point to the extreme, and specifically to the extreme that best fits your own business and its experience.

7. Facing the massive-scale monitoring challenge

7.1 The purpose of monitoring


Finally, a word about monitoring. Full-pipeline monitoring exists to find quality problems. Basic monitoring sits at the bottom layer: CPU, memory, NIC, and disk I/O, plus the network, because these are Internet services and network monitoring is mandatory: point-to-point ping latency, UDP probes, segment-by-segment link checks, and slow-link detection. Above that is playback, which belongs to the business layer; here we need to monitor play counts, startup time, stall duration, stall rate, failure rate, and stream bitrates.

Then there is monitoring specific to the live-broadcast business itself, closer to the business semantics: the live stream, for instance. Can we detect a black screen, where the picture the user sees has gone dark? Or mosaic artifacts caused by slow links or packet loss? Or silence, where the user can no longer hear anything during the broadcast, or popping, where the user hears harsh noise? Plus the transcoding stages. It is a multi-layered model, and when all of these points are aggregated they feed the data reporting and back-end logs I mentioned earlier.
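As a toy illustration of the playback-layer metrics named above (startup time, stall duration, stall rate, failure rate), computed from client-reported events with an assumed schema:

```python
# Hypothetical playback report schema, aggregated into the quality metrics
# the playback layer monitors. Real reports carry far more dimensions.
events = [
    {"session": "a", "open_ms": 900,  "stall_ms": 0,    "failed": False},
    {"session": "b", "open_ms": 2400, "stall_ms": 1200, "failed": False},
    {"session": "c", "open_ms": None, "stall_ms": 0,    "failed": True},
]

plays          = [e for e in events if not e["failed"]]
failure_rate   = sum(e["failed"] for e in events) / len(events)
avg_open_ms    = sum(e["open_ms"] for e in plays) / len(plays)
stall_rate     = sum(e["stall_ms"] > 0 for e in plays) / len(plays)
total_stall_ms = sum(e["stall_ms"] for e in plays)

print(f"failure_rate={failure_rate:.1%} avg_open={avg_open_ms:.0f}ms "
      f"stall_rate={stall_rate:.1%} total_stall={total_stall_ms}ms")
```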

7.2 The monitoring challenge: log analysis efficiency


The logs total about 200 billion entries per day, and in the future may exceed 500 billion. At that volume, getting results half a day or a day later is useless; the dishes would long be cold. What we need is minute-level results.

The traditional approach can no longer meet the demand. We now face hundreds of billions of records per day, each of which may carry around a hundred dimensions, and the daily data volume exceeds 100 TB. We also need near-real-time responsiveness: queries should come back within about 10 seconds, and end-to-end data delay should stay within about 30 seconds. So we introduced new technologies, both analysis-oriented and search-oriented, to meet the data-volume challenges we face in monitoring.

7.3 Solution: big data processing


This is our big data processing pipeline, and it is in fact a fairly classic one. Data reported from every kind of terminal, including iOS, Android, TV, pads, and PC web, is received by the log collection system, goes through simple cleaning, is relayed by Kafka to a Spark cluster where the dimensions are computed, and once the statistics are done our data products are generated.

The Eagle Eye log system is built on Elasticsearch. To share our big-data experience: we mainly use real-time computation to monitor the playback process and CDN speeds. This architecture basically meets the requirement of handling 200 billion records and over 100 TB of data per day, across many dimensions; almost every log line carries more than a hundred fields.
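A minimal sketch of the Kafka-to-Spark step in that pipeline, aggregating a per-minute, per-CDN-node stall rate with Structured Streaming; the topic name, field names, and JSON schema are assumptions, and the Kafka connector package must be on the Spark classpath:

```python
# Sketch: read playback reports from Kafka and compute a 1-minute stall rate
# per CDN node. In practice the result would feed ES/dashboards, not the console.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType, TimestampType

spark = SparkSession.builder.appName("play-quality-monitor").getOrCreate()

schema = StructType([
    StructField("node", StringType()),     # CDN node serving the session
    StructField("stall_ms", LongType()),   # stall time reported in this heartbeat
    StructField("ts", TimestampType()),
])

reports = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "play_quality")   # hypothetical topic name
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
    .select("r.*"))

per_minute = (reports
    .withWatermark("ts", "2 minutes")
    .groupBy(F.window("ts", "1 minute"), "node")
    .agg(F.avg((F.col("stall_ms") > 0).cast("double")).alias("stall_rate")))

query = (per_minute.writeStream
    .outputMode("update")
    .format("console")
    .start())
query.awaitTermination()
```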

Once you have the monitoring data and can get it quickly, you can truly be the first to discover a problem and quickly determine what kind of problem it is. The technology here spans many areas: although it sounds simple, it rests on the foundations of massive-scale operations, streaming media, and big data.

How do you extract the data and analyze it in real time? It also covers CDN and network transmission technology: how to guarantee the network, how to accelerate within the CDN, and how to turn the original DNS-based scheduling into direct IP scheduling. There are many pieces involved, and it may not all be clear in one sitting; treat it as a brick thrown out to attract jade, a starting point for discussion.

8. Summary

Massive-scale operations technology is a very large system. I hope that when you encounter a situation like this, you can stand up bravely and face the challenge. As long as we strive for excellence and keep trying, most of us can do better. That is my small bit of experience.

