How to prepare for real-world scenario questions in big data interviews? I really can't take it anymore!


A few days ago, I published an article summarizing interviews. Real-world scenario questions account for a growing share of big data interviews. Interviewers have become "pragmatic": they lead with scenario questions and probe framework internals along the way. This is definitely progress for the big data field, and also a huge challenge!

This article summarizes the background behind such questions and how to approach them.

Background

From an interviewer's perspective, I think real-world scenario questions are being tested more and more for the following reasons:

1. Frameworks themselves are not hard to learn

Looking at how the Internet industry has developed, open source material is everywhere, so the barrier to acquiring "basic knowledge" is low. You can easily find the resources you need online, and even if your comprehension is weaker, you can still accumulate knowledge over time.

Another main reason is that questions purely about framework internals do not differentiate candidates well. Like college entrance exam questions, the easy and medium ones can be answered by almost everyone, so they cannot effectively identify the truly capable candidates, which is unfair to them. Hence the need to raise the difficulty of the final sub-question to identify the candidates who really combine theory with practice.

2. The importance of practice itself

Real-world scenario questions are the simplest and most effective way to demonstrate personal ability and vision, especially in relatively new fields with high learning costs and steep learning curves. They also filter out "embellished resumes" as much as possible; I believe everyone knows what that means.

What should we do?

We can look at this from two perspectives:

First, from the interview perspective

Our basic approach to answering such questions should have three stages:

  1. Clearly state the business/technical background

The interviewer needs to know clearly what the business scenario is: what are the specific needs of the demand side? Why does this need exist? And what are the upstream and downstream systems connected to this scenario?

  2. Express your technical solution concisely and clearly

Generally speaking, for a specific business requirement we design a technical solution around that requirement; no single solution can support every business scenario. So: what are the advantages and disadvantages of your solution in the current scenario? Is there anything worth noting? What problems did you encounter in actual development, and how did you solve them? These three questions are what the interviewer cares about most.

There is also an obvious misconception here: data volume/data size is only a minor reference point, and arguably not a reference point at all. This is worth discussing. I often see descriptions in resumes like "the platform handles xxx billion records / xxxG of data per day". Understand that the number itself is not the point; handling that scale is mostly the capability of the platform or framework. As any company's business grows more complex, data scale will increase. What the interviewer cares about is what you yourself did when the data reached that scale. For example, some candidates write "our ClickHouse table has 10 billion rows and 200 fields". What does that have to do with you? That is ClickHouse's capability. What did you do at that data scale?

  3. Clarify the benefits

Once we have designed a targeted technical solution for a specific business scenario and solved the business problem, it will surely bring benefits. These benefits come in two forms: technical and business. On the technical side, explain which technical problems were solved and what effects were achieved; on the business side, explain which business metrics improved and how many percentage points of profit were gained.

The interviewer pays close attention to these benefits as well.

Second, from the perspective of your way of thinking

Did you notice the structure of the answers above? It reflects a way of thinking: how you usually work and reason about problems, and also how you summarize and report on projects.

This is a good habit to develop, especially when interviewing for senior positions. Every time you answer one of the interviewer's questions, you are in effect telling them: this is how I think about problems, and this is how I usually work.

The interviewer will weigh this. Beyond the skills themselves, the way a person works and thinks largely determines their ability to succeed, so it is naturally a factor in the evaluation.

If this article is helpful to you, don't forget to "Like" and "Favorite"!


Origin: blog.csdn.net/u013411339/article/details/132033255