Huawei, Ant Financial and others are all hiring for big data development: what skills do you need?

Disclaimer: this is an original post by the blogger, released under the CC 4.0 BY-SA license. When reposting, please include the original source link and this statement.
This link: https://blog.csdn.net/wwdede/article/details/100547430

Today let's take a detailed look at the requirements for the "most popular" big data position: the big data development engineer.

There are two main reasons for calling it the "most popular" role. First, many graduates and working professionals who want to move into big data do not have a clear picture of what the "big data development" position actually is. Second, at least three different positions can all be called "big data development", so this "most popular" label deserves a deeper analysis.

As usual, let's start with the job requirements posted by a few companies.

 

Toutiao

1. Bachelor's degree or above in a science/engineering major such as computer science, communications, or mathematics;

2. Familiar with Hive SQL; familiar with at least one development language such as shell or Python (a small sketch of this kind of task follows the list);

3. Hands-on experience with at least one data processing platform such as Hadoop, Spark, or Flink;

4. Logical thinking, good communication skills, a sense of responsibility, and team spirit;

5. Experience building data warehouses and analyzing and optimizing business data is a plus.
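To make item 2 concrete, here is a minimal sketch, not taken from the posting itself, of the kind of daily Hive SQL plus Python task such a role typically involves: aggregating one day's events into a summary table with PySpark. The table names (ods_app_event, dws_daily_active) and the partition column ds are hypothetical.

```python
# Hypothetical daily aggregation job: count distinct active users per app for one
# partition date and write the result into a Hive summary table.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("daily_active_users")
         .enableHiveSupport()   # lets spark.sql() run Hive SQL against the metastore
         .getOrCreate())

ds = "2019-09-01"               # partition date, normally passed in by a shell scheduler

spark.sql(f"""
    INSERT OVERWRITE TABLE dws_daily_active PARTITION (ds = '{ds}')
    SELECT  app_id,
            COUNT(DISTINCT user_id) AS dau
    FROM    ods_app_event
    WHERE   ds = '{ds}'
    GROUP BY app_id
""")

spark.stop()
```

In practice a script like this would be launched by a shell wrapper (spark-submit plus a scheduler such as cron or Airflow) that passes in the partition date, which is where the shell/Python requirement comes from.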

If you are interested in big data development and want to study big data systematically, you can join the big data technology exchange QQ group (number: 189 + digits 307 + digits 522); message the admin to receive free development tools and introductory learning materials.

 

Youzan

1. Proficient in at least one of Java or Scala;

2. Solid computer science fundamentals and a strong grounding in algorithms and data structures;

3. Familiar with at least one real-time computing framework such as Storm, Spark Streaming, or Flink; working knowledge of other components of the Hadoop ecosystem such as HBase, Hadoop, Hive, and Druid;

4. Experience developing applications on large-scale server clusters is preferred;

5. BAT work experience preferred;

6. Excellent work habits and team spirit.

 

Huawei

1. Bachelor's degree or above in computer science or a related field; deep understanding of and hands-on experience with data processing, data modeling, and statistical data analysis;

2. Familiar with big data tools such as Hadoop / Spark / Hive / HBase; core builders of medium-to-large data platforms are preferred;

3. Proficient in SQL; familiar with common relational databases, non-relational databases, and data warehouses; experience with SQL performance optimization;

4. Understands the core ideas and supporting technologies of microservice design and development; familiar with common design patterns; comfortable designing and developing with the SSH framework; fluent at writing Java and Python code; familiar with multi-threaded programming;

5. Agile logical thinking, sensitivity to new technologies, and strong, diligent self-learning ability.

 

Ant Financial

1. Around 3 years of experience, including experience at medium-to-large companies;

2. Development and design experience with systems such as Hadoop / Hive / Spark / Storm / ZooKeeper, or hands-on experience designing and building distributed systems;

3. Familiar with Linux / Unix systems, with solid Java design and development experience;

4. A clear sense of responsibility, a pragmatic attitude, and strong drive.

 

Xiaoying

1. College degree or above in a computer-related major;

2. About three years of experience designing and developing enterprise-level data warehouses;

3. Familiar with data warehouse fundamentals; able to sort out complex business requirements;

4. Skilled at SQL development; proficient with MySQL or similar relational databases;

5. Comfortable developing applications with Hadoop MapReduce; fluent with big data tools such as HBase, Hive, Storm, and Spark (see the sketch after this list);

6. Familiar with Linux; scripting skills in shell, Python, etc. are a plus;

7. Strong self-learning ability; enjoys digging into new technologies and open-source systems; good team spirit; able to solve problems independently.
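To illustrate item 5, here is a hedged, minimal word-count-style Hadoop Streaming job in Python, which is the classic shape of "Hadoop MapReduce application development"; the script name and the word-count task are illustrative only and are not taken from the posting.

```python
#!/usr/bin/env python
# wordcount.py (hypothetical): a Hadoop Streaming mapper/reducer pair in one file.
# Test locally with:  cat input.txt | python wordcount.py map | sort | python wordcount.py reduce
import sys

def mapper():
    # Emit "word<TAB>1" for every word read from stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop Streaming sorts mapper output by key, so equal words arrive adjacent.
    current, count = None, 0
    for line in sys.stdin:
        word, _, value = line.rstrip("\n").partition("\t")
        if word == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

On a cluster this would be submitted with the hadoop-streaming jar, passing "python wordcount.py map" as the -mapper and "python wordcount.py reduce" as the -reducer; the point is simply that MapReduce here means key/value map plus grouped reduce, the same model that Hive and Spark build on.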

Checking the requirements against a three-part model of ability (professional knowledge, hard skills, soft skills)

The combined tally across the five postings is below; the number in parentheses is how many of the postings mention each item.

 

Professional knowledge

1. College degree or above (3)

2. Computer-related major (4)

Hard skills

1. Fluent with big data tools such as Hadoop, HBase, Hive, Storm, Spark Streaming, and Flink (5)

2. Familiar with programming languages such as shell, Python, Scala, and Java (5)

3. Familiar with Linux / Unix systems (2)

4. Work experience in data warehouse construction, data processing, data modeling, and data analysis (2)

5. Proficient in SQL; familiar with common relational and non-relational databases (1)

Soft skills

1. Good team spirit (3)

2. Strong self-learning ability and drive (3)

3. A clear sense of responsibility (2)

4. Ability to solve problems independently (1)

Analysis

Looking at the hard skills above: meeting items 1, 2, and 3 together is manageable, since they all sit at the software development level, but meeting those plus items 4 and 5 at the same time is very hard; this is clearly describing two different positions. So we also need to look at the job content and duties.

 

Job duties

1. Responsible for data warehouse construction, ETL design and development, and statistical analysis (3)

2. Responsible for computing and analyzing statistical metrics (2)

3. Responsible for real-time computation and business development on a large Internet data service platform (1)

4. Responsible for building and maintaining the underlying data platform (2)

From the four duties summarized above: duties 1 and 2 describe a data warehouse position, i.e., the data warehouse work covered in detail earlier; duty 3 mainly matches the first hard-skill requirement, the Hadoop ecosystem, with the emphasis on real-time computation and program development (a minimal sketch of this follows); duty 4 is about developing tools for the data platform, which usually includes the platform itself, a scheduling system, a metadata platform, and other small tools, and mainly matches the Java development requirement.
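As a hedged sketch of the real-time side of duty 3, here is roughly what a minimal streaming count looks like with Spark Structured Streaming reading from Kafka. The broker address, topic name, and window sizes are assumptions for illustration, and the job needs the spark-sql-kafka connector package available at submit time.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("realtime_event_count").getOrCreate()

# Read a stream of events from Kafka (hypothetical brokers and topic).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "kafka:9092")
          .option("subscribe", "app_events")
          .load())

# Count events per app per 1-minute window; Kafka keys arrive as bytes, so cast first.
counts = (events
          .selectExpr("CAST(key AS STRING) AS app_id", "timestamp")
          .withWatermark("timestamp", "5 minutes")
          .groupBy(F.window("timestamp", "1 minute"), "app_id")
          .count())

# Write updated counts to the console; in production this would feed HBase, Druid, etc.
query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```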

My take, based on work experience

From the analysis above, data warehouse development, real-time computation development, and data platform development are all commonly called "big data development", but I think they are really three different positions, each with its own requirements. I hope this helps students and graduates who want to move into the big data industry.

To broaden the picture a bit: the Hadoop ecosystem is enormous, including but not limited to HDFS, Hive, HBase, Storm, Spark, Flink, Kafka, Flume, and so on, so the self-learning bar for big data developers, and especially for those handling big data operations at large Internet companies, is very high.

For readers doing big data outside the Internet industry: Internet big data engineers are expected to cover this whole stack, because an ordinary company cannot easily hire a separate person for every system, so the requirements end up broad and comprehensive. The market for such full-stack experts or system architects does exist, it is just very small, so there is some truth to big data development being well paid.
