Big data (3) Jobs related to big data
Contents of this article:
1. Digression written in front
2. Employment status in 2022
2.1. CIER index of college graduates with different types of enterprises
2.2. CIER index of college graduates of different enterprise sizes
2.3. Top 15 Cities in Supply and Demand of College Graduates
2.4. Industry rankings with high and low prosperity index in the first quarter
2.5. Occupation rankings with higher and lower prosperity index in the first quarter
2.6. Salary by developer type
3. Jobs related to big data
3.1. Big data development engineer
3.2. Data Analyst
3.3. Data mining analysis engineer
3.4. Data product manager
3.5. Data architect
3.6. Visual R&D Engineer
3.7. Big data engineer
Big Data Engineer Job Description (1)
Big Data Engineer Job Description (2)
Big Data Engineer Job Description (3)
Big Data Engineer Job Description (4)
Big Data Engineer Job Description (5)
Big Data Engineer Job Description (6)
4. Remarks
1. Digression written in front
I once heard a recruiter at one company voice this frustration: a candidate scored very high in the interview test and even answered the competition-level questions correctly. We had great expectations for him, but in actual work, although he appears to work very hard, his results are mediocre. Why?
Well, if I guessed correctly, that candidate got his high score by grinding through practice problems, and problem-grinders are prone to high scores but low ability. There are too many temptations in society now, and fewer and fewer people can concentrate on truly learning something.
The job market now is not encouraging. Some employers insist on master's or doctoral degrees for work that undergraduates are perfectly competent to do. Meanwhile, those hired with less education handle work no less difficult than that of their highly credentialed colleagues, yet earn considerably less, which feels unfair. High education requirements raise the entry threshold, so some people who cannot find suitable jobs choose to continue studying to upgrade their credentials.
Education is explicit, ability is implicit. Academic qualifications can be quantified, but ability cannot be quantified.
Which one is more important, education or ability, is affected by multiple factors such as individuals, companies, positions, and environments. Enterprises often say that ability is important, but in practice, they make the opposite choice.
So, everyone, take time to recharge while you are young... build up both your education and your ability.
Sober in adversity
2023.8.27
2. Employment status in 2022
2.1. CIER index of college graduates with different types of enterprises
2022 CIER Index of Graduates from Colleges and Universities with Different Types of Enterprises
2.2. CIER index of college graduates of different enterprise sizes
CIER index of college graduates of different enterprise sizes in 2022
2.3. Top 15 Cities in Supply and Demand of College Graduates
Top 15 Cities in Supply and Demand of College Graduates in 2022
2.4. Industry rankings with high and low prosperity index in the first quarter
Ranking of industries with high and low prosperity index in the first quarter of 2022
2.5. Occupation rankings with higher and lower prosperity index in the first quarter
Ranking of occupations with higher and lower prosperity index in the first quarter of 2022
2.6. Salary by developer type
Salary by Developer Type
Senior positions such as executives and engineering managers tend to pay the best. But in the U.S., Germany, U.K., and Canada, we see comparable salaries for blockchain developers, even with the lowest average years of experience.
3. Jobs related to big data
As a trend, big data has attracted more and more attention. At present, from the national ministries and commissions to ordinary companies, various big data platform construction and analysis applications have been carried out one after another.
What are the current jobs related to big data? What are the job requirements and salary? Here is some information collected for your reference:
3.1. Big data development engineer
Big data development engineers, also called database development engineers, refer to professionals who design, develop, maintain and manage large databases.
Job requirements:
1. Bachelor degree or above in communication, electronic engineering, automation, computer science or other related majors;
2. Experience in the analysis, design and development of database application software;
3. Proficient in relational database principles; proficient in SQL Server; familiar with MySQL and Oracle;
4. Skilled in using mainstream database analysis and design tools; able to independently complete database system design and to design the database server for the specific application;
5. Familiar with the .NET and J2EE architectures; familiar with ASP.NET and ADO.NET;
6. Familiar with OOD/OOP concepts and the XML standard;
7. Familiar with data warehouses, OLAP and data mining;
8. Good English reading ability, able to read English technical materials;
9. A sense of time, strong independence, and teamwork spirit.
Work content:
1. Design and optimize the physical construction plan of the database;
2. Formulate database backup and recovery strategies, workflows and specifications;
3. Undertake the implementation of the database solution during project delivery;
4. Supervise the installation and operation of the database on UNIX, Tandem, NT and other systems;
5. Analyze, design and develop the spatial database to achieve effective management;
6. Supervise the implementation of database backup and recovery strategies;
7. Provide technical consulting services for application development, system knowledge, etc.
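The backup-and-recovery duty listed above can be sketched in a few lines of Python using the standard library's sqlite3 online-backup API. This is only an illustration: the database file, table and column names are invented, and a production SQL Server or Oracle deployment would use the vendor's own backup tooling instead.

```python
import sqlite3

def backup_database(src_path: str, dest_path: str) -> None:
    """Copy a live SQLite database to a backup file (online backup)."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    with dest:
        src.backup(dest)  # page-by-page copy while the source stays usable
    dest.close()
    src.close()

# Create a tiny hypothetical database, back it up, then verify the copy.
conn = sqlite3.connect("orders.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.execute("DELETE FROM orders")
conn.execute("INSERT INTO orders (amount) VALUES (19.9), (35.0)")
conn.commit()
conn.close()

backup_database("orders.db", "orders_backup.db")

restored = sqlite3.connect("orders_backup.db")
count = restored.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
restored.close()
print(count)  # 2 rows survived the backup
```

A real backup strategy would add scheduling, retention and periodic restore drills on top of this mechanism.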
3.2. Data Analyst
A data analyst is a type of data engineer, referring to professionals in different industries who specialize in collecting, sorting, and analyzing industry data, and making industry research, evaluation, and prediction based on the data.
Job Responsibilities:
1. Abstract data analysis requirements, form competitive data products, and help the product business understand and use data more deeply;
2. Based on an in-depth understanding of Internet products and business, independently undertake complex analysis tasks and provide decision-making support for product direction;
3. Responsible for data support for relevant projects and the production of business reports;
4. Establish statistical models related to the product business.
Job requirements:
1. Bachelor degree or above, majoring in computer science, statistics, mathematics, information management, sociology, etc., with more than 3 years of work experience in the consulting or Internet industry;
2. Good data analysis ability, with rich experience in data analysis, mining and cleaning;
3. Familiar with Internet data collection; capable of big data processing; proficient in SQL; familiar with Hive and other related data tools;
4. Good at communication, with good teamwork spirit;
5. Experience in data analysis or data mining in the Internet industry is preferred.
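The collect-clean-aggregate loop described in this role can be sketched with the Python standard library alone (in practice SQL, Hive or pandas would do the same at scale). The order records below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical daily order records, standing in for rows pulled via SQL/Hive.
records = [
    {"city": "Beijing",  "channel": "app", "amount": 120.0},
    {"city": "Beijing",  "channel": "web", "amount": 80.0},
    {"city": "Shanghai", "channel": "app", "amount": 200.0},
    {"city": "Shanghai", "channel": "app", "amount": 160.0},
    {"city": "Shanghai", "channel": "web", "amount": 90.0},
]

# Cleaning: drop rows with non-positive amounts (bad data).
clean = [r for r in records if r["amount"] > 0]

# Aggregation: average order amount per city, the kind of
# metric a business report would carry.
by_city = defaultdict(list)
for r in clean:
    by_city[r["city"]].append(r["amount"])

report = {city: round(mean(vals), 2) for city, vals in by_city.items()}
print(report)  # {'Beijing': 100.0, 'Shanghai': 150.0}
```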
3.3. Data mining analysis engineer
A data mining engineer is a type of data scientist: an engineering professional who uses algorithms to uncover the knowledge hidden in large amounts of data. This knowledge can make enterprise decision-making more intelligent and automated, helping enterprises improve efficiency and reduce the chance of wrong decisions so that they stay ahead in fierce competition.
Job Responsibilities:
1. Responsible for researching methods and tools of new data analysis and data mining techniques;
2. Responsible for organizing the promotion and training of big data analysis techniques in the industry.
Job requirements:
1. Bachelor degree or above, majoring in finance, mathematics, statistics, computer science or other science and engineering related majors;
2. Solid knowledge of mathematics, statistics and computer science; familiar with the basic theory of machine learning and other newer data mining and analysis methods;
3. Familiar with a traditional data mining tool such as SAS, SPSS, Matlab or Weka, or with actual project experience in at least one big data analysis tool such as R/Python, Mahout, Hadoop or Spark ML;
4. Rigorous data thinking, excellent problem-analysis ability and good report writing; able to produce professional analysis and research reports, and willing to solve challenging problems;
5. Excellent academic performance, CET-4 or above, a strong sense of responsibility, good cooperation spirit, strong learning ability, communication and coordination skills, and a strong love of technology;
6. Good character, good health, no serious medical history.
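The "algorithms that search for knowledge hidden in data" can be made concrete with a toy example. Below is a minimal 1-D k-means clustering sketch in pure Python; real work would use scikit-learn, Spark MLlib or similar, and the spending amounts here are invented. The algorithm splits customers into low and high spenders by alternating point assignment and centroid update.

```python
import random

def kmeans_1d(points, k, iters=20, seed=42):
    """Minimal 1-D k-means: alternate assignment and centroid update."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid.
            i = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[i].append(p)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups of spending amounts: low spenders vs high spenders.
spend = [10, 12, 11, 9, 200, 210, 195, 205]
print(kmeans_1d(spend, k=2))  # [10.5, 202.5]
```

The two returned centroids are the means of the low group (10.5) and the high group (202.5); a marketing team could then treat the clusters differently.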
3.4. Data product manager
In a broad sense, a data product is any product form that brings out the value of data to help users make better decisions (or even take actions), acting as an information display and value enabler in the user's decision-making process. From this perspective, search engines and personalized recommendation engines are obviously also data products; but since those product forms are mature, they are rarely grouped under the concept of data products: in them the data is wrapped in so many layers that non-professional users cannot directly perceive its existence.
Job Responsibilities:
1. Establish and optimize the model laboratory, support daily operations, and support merchants' model construction and optimization;
2. Design and develop big data risk-control products for the pain points of Internet risk-control business, including but not limited to credit reports, user profiles, risk scores, credit integration systems, marketing platforms, etc.;
3. Build a quality and value analysis system for upstream and downstream data products, and periodically track and optimize the quality of data products at each layer.
Job requirements:
1. Bachelor degree or above, majoring in statistics, computer science or related majors;
2. More than 2 years of experience in data management or in building risk strategies, risk-control systems, or marketing platforms;
3. Mastery of at least one statistical analysis tool such as Python, SAS or R, or proficiency in prototyping tools such as Axure RP;
4. Excellent communication, coordination and organizational skills;
5. Strong sensitivity to industry development and risk evolution.
3.5. Data architect
A data architect is a team leader who needs to control the whole as well as understand local bottlenecks and give solutions based on specific business scenarios. An architect needs enough imagination to expand various target requirements in different dimensions and provide target customers with a more comprehensive list of requirements. Architects play a very important role in the whole process of software development.
Job Requirements:
1. Bachelor degree or above in computer, industrial automation and other related majors.
2. Familiar with data warehouse modeling theory, more than 3 years of practical experience in related fields, and be able to independently complete the system architecture.
3. Familiar with Hadoop, Hive, Spark and other big data technologies.
4. Proficiency in using Java or other languages for complex business logic data processing, with the ability to process massive data and optimize performance.
5. Proficient in MySQL, Redis, HBase and other databases, able to optimize storage queries.
6. Clear thinking, good communication and understanding skills, strong learning ability and the ability to solve problems quickly.
7. Good exploration and curiosity about new technologies and new things.
Job Responsibilities
1. Responsible for the evaluation and design of the technical architecture of big data projects.
2. Responsible for the development of key functions, the solution of technical problems, and the key control of the output code.
3. Responsible for the pre-research and selection of the key technologies used, and complete the PoC.
4. Responsible for the development, use, performance optimization and testing of big data platform new technologies.
5. Be able to lead the team to complete the project development work and share technical experience.
6. Research mainstream technologies that can be applied to products to promote the improvement of product functions and performance.
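One storage-query optimization implied by requirement 5 (combining MySQL with Redis) is the cache-aside pattern: serve hot reads from a fast cache, falling back to the database on a miss. The sketch below is a minimal illustration in which plain dicts stand in for Redis and for a MySQL table; the key names and TTL are hypothetical.

```python
import time

class CacheAside:
    """Cache-aside pattern: check the cache first, fall back to the
    backing store on a miss, then populate the cache with a TTL."""

    def __init__(self, db, ttl=60.0):
        self.db = db          # stands in for a MySQL table
        self.ttl = ttl
        self.cache = {}       # stands in for Redis
        self.hits = 0
        self.misses = 0

    def get(self, key):
        entry = self.cache.get(key)
        if entry and entry[1] > time.monotonic():
            self.hits += 1
            return entry[0]   # fast path: served from cache
        self.misses += 1
        value = self.db[key]  # slow path: query the backing store
        self.cache[key] = (value, time.monotonic() + self.ttl)
        return value

db = {"user:1": "alice", "user:2": "bob"}  # hypothetical user table
store = CacheAside(db)
print(store.get("user:1"))       # first read: cache miss, loaded from db
print(store.get("user:1"))       # second read: cache hit
print(store.misses, store.hits)  # 1 1
```

A real system must also decide how to invalidate the cache on writes, which is where most of the architectural difficulty lives.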
3.6. Visual R&D Engineer
Data visualization is the scientific and technical study of the visual representation of data, where a visual representation is information extracted in some summary form, including the attributes and variables of the corresponding information units.
Job Responsibilities
1. Architecture and development of the big data visualization platform;
2. Participate in the technical selection, technical specifications and key feature design of big data platform data visualization;
3. Participate in the core R&D of general data visualization products, including component and template design and modular framework design;
4. Responsible for the development and maintenance of big data visualization products; improve the product's user experience based on an understanding of its business, and drive business development with technology;
5. Solve various challenges and technical difficulties such as performance and high stability;
6. Pay attention to industry technology trends, keep improving, and maintain the advanced nature of the data visualization product architecture and technology.
Job requirements:
1. Full-time bachelor degree or above in computer science, software engineering or a related major, with a solid computer and network foundation;
2. More than 3 years of data visualization experience; participated in the technical design and development of large-scale data visualization products;
3. Excellent front-end R&D capabilities and rich experience in developing interactive products; familiar with mainstream front-end frameworks such as Vue/React/Angular;
4. Familiar with mainstream data visualization libraries such as ECharts, ThingJS and DataV; having read their source code, understanding their code design, or having secondary development experience is preferred;
5. Love the field of data visualization with a deep understanding of it; a high degree of technical sensitivity and a broad technical perspective; experience in performance optimization and in cross-terminal, multi-platform development;
6. Ability to independently develop front-end Web applications, and familiarity with a back-end language (PHP/Python/Java/Node, etc.).
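The core idea behind these requirements, mapping data values to visual attributes, can be shown even without a chart library. The sketch below is a toy stand-in, in plain Python, for what ECharts-style libraries do at scale; the page-view numbers are invented.

```python
def ascii_bar_chart(data: dict, width: int = 40) -> str:
    """Render a labelled horizontal bar chart as plain text.

    Each value is scaled against the maximum so bar length
    encodes magnitude, the basic mapping every chart library performs.
    """
    peak = max(data.values())
    label_w = max(len(k) for k in data)
    lines = []
    for label, value in data.items():
        bar = "#" * round(value / peak * width)
        lines.append(f"{label.ljust(label_w)} | {bar} {value}")
    return "\n".join(lines)

# Hypothetical page-view counts per product module.
views = {"search": 1200, "feed": 3000, "profile": 600}
print(ascii_bar_chart(views))
```

The largest value ("feed") gets a full-width bar; the others are scaled proportionally, so the ranking is visible at a glance.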
3.7. Big data engineer
Big Data Engineer Job Description (1)
Job Responsibilities:
1. Responsible for the development of data analysis, processing, cleaning and processing procedures;
2. Engage in massive data analysis and mining related work;
3. Responsible for the construction, development, maintenance and optimization of big data related platforms;
4. Realize and support the data analysis needs of the business department;
5. Realize the company's "data-driven operation" business goals with big data architecture solutions.
Job requirements:
1. Computer-related major, bachelor degree or above, more than 3 years of Java development experience, and outstanding learning ability;
2. Familiar with common projects in the Hadoop ecosystem (HDFS, Hive, HBase, Spark, ZooKeeper, YARN, etc.); development experience with Python, Spark and MapReduce; actual big data project experience is preferred;
3. Familiar with mainstream databases such as Oracle and MySQL;
4. Proficient in Java; familiar with J2EE-based web architecture design and the web development process, with rich experience in web MVC development (Struts/Spring MVC, Spring, Hibernate/MyBatis, etc.);
5. Familiar with Linux/Unix system environments; familiar with the configuration and optimization of Tomcat and other application servers;
6. Have good communication skills, organizational skills and teamwork spirit, and strong ability to analyze and solve problems.
Big Data Engineer Job Description (2)
Job Responsibilities:
1. Function planning and design of big data platform;
2. Design and development of big data platform;
3. Big data platform implementation and data access.
Job requirements:
1. Bachelor degree or above, majoring in computer science, software engineering, mathematics or statistics;
2. A solid Java foundation, proficiency in SQL and MySQL, and more than 2 years of software development experience;
3. Familiar with the Hadoop platform architecture, with experience in Hadoop, HBase, Hive, Spark and Kafka;
4. Proficiency in Python for data processing and web crawling;
5. Experience in unstructured data analysis and identification is preferred.
Big Data Engineer Job Description (3)
Job Responsibilities:
1. Responsible for data warehouse construction, design, optimization and implementation;
2. Responsible for data ETL development, data platform construction, design and implementation of BI analysis, and data product development.
Job requirements:
1. Bachelor's degree or above in a computer-related major from a 985 university;
2. Have a relatively solid theoretical foundation for data warehouses, and have relatively rich experience in data model construction and application layer construction;
3. Experience in using big data technologies such as Hadoop, Hive, Spark, Impala is preferred.
Big Data Engineer Job Description (4)
Job Responsibilities:
1. Mainly responsible for the big data development of telecom business;
2. Design data analysis scenarios based on business requirements, form a background implementation plan, independently complete business data modeling, and convert the results into operational indicators;
3. Follow up the whole life cycle of the product, coordinate various resources to ensure the smooth development of the product.
Job requirements:
1. Bachelor degree or above in computer science or a related major, with 3 or more years of related work experience;
2. Proficient in the Java development language, familiar with multi-threading and thread tuning; Scala development experience is a plus;
3. Familiar with the Linux server environment and shell;
4. Familiar with distributed systems, cluster management, SOA architecture and other related technologies; familiar with Hadoop-on-YARN cluster deployment and working principles; proficient in MapReduce development; experience in large-scale cluster development and operations is preferred;
5. Familiarity with 3 or more of the following technologies is preferred: Hive, HBase, Impala, Nginx, ZooKeeper, Redis, Kafka, Storm, Spark, ES, crawlers;
6. Have good communication skills and team spirit, and have a good ability to resist pressure.
Big Data Engineer Job Description (5)
Job Responsibilities:
1. Responsible for the development of big data processing platform applications;
2. Use Hadoop YARN/Storm/Spark to implement code development, storage and distributed computing applications;
3. Tune Hadoop YARN/Storm/Spark parameters to optimize the system and meet the real-time big data processing requirements of industrial applications;
4. Learn new technologies to improve the computing power and efficiency of the entire platform;
5. Develop script programs based on Hive, HBase, ES and other data processing methods.
Job requirements:
1. More than 3 years of Hadoop/Storm/Spark development experience, with a deep understanding of distributed and parallel computing theory;
2. Proficient in at least one of Java/Scala/Python;
3. Research into Hadoop/Storm/Spark source code; big data platform development experience is preferred;
4. Good technical sensitivity, good learning ability and a hard-working attitude;
5. Have good communication and coordination skills, teamwork skills, and document writing skills.
Big Data Engineer Job Description (6)
Work content:
Build a big data analysis platform
Participate in business data construction, data thematic system construction, and data middle-platform construction
Data analysis and mining
Offline and real-time stream analysis of data
Support business data model construction and the calculation and analysis of data indicators
Data storage and query; build and operate the data analysis system
Use Hadoop/Spark/ES and other distributed computing and storage platforms; optimize the ETL process
Necessary skills:
Spark/Flink (architecture and development), ES, Flume/Filebeat, Kafka
Proficiency in SQL
Hadoop-related technologies (development, deployment, tuning); understanding of the principles and process of MapReduce
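The MapReduce principles and process mentioned in the skill list can be illustrated with the classic word-count example in plain Python, no Hadoop required. The three functions below mirror the map, shuffle and reduce phases that the framework runs at cluster scale; the input lines are invented.

```python
from collections import defaultdict
from itertools import chain

# Map phase: each "mapper" turns one input line into (word, 1) pairs.
def mapper(line: str):
    return [(word.lower(), 1) for word in line.split()]

# Shuffle phase: group all pairs by key, as the framework
# does between the map and reduce phases.
def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

# Reduce phase: each "reducer" folds the values for one key.
def reducer(key, values):
    return key, sum(values)

lines = ["big data big platform", "data platform data"]
mapped = chain.from_iterable(mapper(l) for l in lines)
counts = dict(reducer(k, v) for k, v in shuffle(mapped).items())
print(counts)  # {'big': 2, 'data': 3, 'platform': 2}
```

In real Hadoop the mappers and reducers run on different machines and the shuffle moves data over the network, but the data flow is exactly this.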
4. Remarks
Recruitment picture source: China's big data industry talent ecology status quo
Big data articles:
- Big data (1) Definition and characteristics
- Big data (2) Statistics related to big data industry
- Big data (3) Jobs related to big data
- Build big data visualization big screen based on Echarts
- Big Data (4) Mainstream Big Data Technology
Recommended reading:
- Tomcat 11 / Tomcat 10 installation and configuration (Windows environment), with detailed graphics
- Tomcat startup crash troubleshooting (eight categories in detail)