AI Business | Alibaba and other major tech companies spend $5 billion on Nvidia chips; Cambricon lays off staff, retaining only a handful of hardware employees; Xiaomi's 6.4-billion-parameter AI model surfaces | AI Weekly


AI Frontline  2023-08-13 13:30  Published in Heilongjiang

Compiled by | Li Dongmei, Liu Shaofen

Information

 Cambricon lays off staff, retaining only a handful of hardware employees

Following the "malicious layoffs" in April this year, news of the "team disbandment" came out again in the Cambrian.

Recent media reports say that, for financial reasons, Cambricon will lay off some employees at its subsidiary Xingge Technology, with the Nanjing and Shenzhen teams hit hardest.


On the hardware side, only a small number of employees have been retained to wind things down; new projects have been suspended and may be abandoned entirely.

The layoffs are reportedly driven mainly by financial considerations, with the aim of reaching break-even as soon as possible. On the personnel side, Sun Xiaoyun, head of architecture for the Xingge team, left last year.

If the reports are accurate, Cambricon, known as China's "first AI chip stock", is likely to abandon its smart-driving chip business entirely.

 Paid upgrade sparks backlash; GoodNotes announces exclusive discounts for Chinese users

According to IT Home, the note-taking app GoodNotes 6 has recently launched on the Apple App Store. Like the 5th generation, it is a paid app, offered either as an annual subscription at 68 yuan per year or as a one-time purchase at 186 yuan. However, paying users of GoodNotes 5 do not receive the new features automatically and must pay again, which has angered many longtime users.

GoodNotes' official Xiaohongshu account subsequently issued a statement: "After carefully listening to everyone's feedback on this product upgrade and its pricing, we have decided to offer exclusive discounts for Chinese users."

  • Prices for the annual membership and the one-time unlock are cut by 40% across the board

  • GoodNotes 5 paid users can additionally stack the upgrade discount

GoodNotes also stated that GoodNotes 5 paid users can choose to keep using GoodNotes 5 rather than upgrading, and that users who upgraded in the past two days can contact customer service for a refund.
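As a quick back-of-the-envelope check (our own arithmetic, not GoodNotes' official price list, and ignoring the stackable upgrade discount for GoodNotes 5 users), a flat 40% cut on the listed prices would work out as follows:

```python
# Rough calculation of the prices implied by a flat 40% cut
# (illustrative only; the stackable GoodNotes 5 upgrade discount is not included).
annual_price = 68    # yuan per year, GoodNotes 6 subscription
buyout_price = 186   # yuan, one-time unlock
discount = 0.40

print(f"Annual after discount: {annual_price * (1 - discount):.1f} yuan")  # 40.8
print(f"Buyout after discount: {buyout_price * (1 - discount):.1f} yuan")  # 111.6
```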

 Xiaomi's 6.4-billion-parameter AI model surfaces

Xiaomi did not "officially announce" to directly join the AI ​​large-scale model track before, but Xiaomi's AI large-scale model MiLM-6B has quietly appeared in the C-Eval and CMMLU large-scale model evaluation lists. As of now, the Xiaomi large-scale model is in the C-Eval -Eval ranks 10th in the overall list, and ranks 1st in the same parameter magnitude.

MiLM-6B is reportedly a large pre-trained language model developed by Xiaomi, with 6.4 billion parameters.

(Image source: MiLM-6B GitHub page)

According to the information provided by C-Eval, at the subject level the MiLM-6B model achieved high accuracy across all 20 STEM (science, technology, engineering, and mathematics) subjects, which include metrology, physics, chemistry, and biology.

Project address: https://github.com/XiaoMi/MiLM-6B

 Microsoft Edge team works with the Office team to improve web app performance

Microsoft offers many Office applications on the web, such as Word, PowerPoint, Excel, Outlook, and Teams. Even though they run in a browser, they are still complex applications. In a blog post, the Microsoft Edge team said it has been working with the Office performance team to improve the overall performance of these web apps, using the PowerPoint web app as its example.

Microsoft says that by using a more precise method of sleep timing, the team was able to reduce CPU sampling overhead by 95 percent and reduce Edge's overall CPU consumption while profiling by 71 percent. Of course, this helps the PowerPoint team investigate and improve the load performance of their application, but it also means that everyone using the performance tools in DevTools (in Edge or any Chromium browser) will now have a better experience.

 Google launches Project IDX, a browser-based AI development environment with full-stack language support, develop-anywhere access, and one-click deployment

Google recently released Project IDX, a brand-new in-browser code editor and development environment. It supports multiple frameworks such as Angular, Next.js, React, Svelte, and Flutter, with Python and Go support coming soon.

This is Google's first foray into a browser-based, AI-assisted development environment for full-stack web and multi-platform applications. It currently supports multiple frameworks and languages and integrates Codey-powered AI programming assistance. As a cloud IDE, it ties into Google Firebase, GitHub, and other services. IDX lets Google showcase how AI can be applied to programming, but whether it will become developers' IDE of choice remains to be seen.

 Nvidia partners with Hugging Face to provide cloud-based AI model training services

GPU maker and AI provider Nvidia has announced a new line of generative AI products designed to accelerate the development of large language models and other advanced AI applications.

According to reports, Nvidia is working with AI startup Hugging Face to launch a new cloud service, "Training Cluster as a Service", for training enterprises' custom AI models. The service will combine Nvidia's DGX Cloud infrastructure with Hugging Face's library of models and datasets. The partnership lets Nvidia expand its AI cloud-services business as enterprise demand for AI training soars, while giving the fast-growing Hugging Face access to cutting-edge infrastructure to support users building custom models.

 Microsoft and Aptos Labs partner to integrate AI and blockchain for Web3

Aptos Labs is a blockchain platform focused on Web3 development. Recently, the company announced a strategic partnership with Microsoft to integrate Microsoft's Azure OpenAI service into the Aptos network. This integration will enable Aptos developers and users to use the power of artificial intelligence and machine learning on the decentralized web.

Aptos Labs has built a scalable and secure blockchain network that supports multiple applications and use cases for Web3. Aptos Labs has also developed its own programming language Move, which allows developers to easily write smart contracts. Move ensures safe, secure, and verifiable code and a smooth user experience.

By partnering with Microsoft, Aptos Labs provides developers with state-of-the-art AI technologies that can enhance their applications and services on Web3. Aptos Labs will launch Aptos Assistant, a responsible, friendly, and secure assistant meant to bridge Web2 and Web3 for ordinary Internet users and organizations.

The integration of Microsoft's AI with the Aptos network is a major milestone in Web3 adoption and innovation, lowering the barriers to entry for Web3 exploration and development, and enabling more people to participate and benefit from the decentralized web.

 Stability AI releases AI programming tool: StableCode

Stability AI has announced its first generative AI product for programming: StableCode. The product is designed to help programmers with their daily tasks and to serve as a practical learning tool for novice developers.

According to the official introduction, StableCode helps developers write code through three different models, improving development efficiency.

The base model was first trained on multiple programming languages from the BigCode stack-dataset (v1.2), then further trained on popular languages such as Python, Go, Java, JavaScript, C, Markdown, and C++. In total, the model was trained on 560B tokens of code on an HPC cluster.

With the base model in place, Stability AI fine-tuned an instruction model for specific use cases to help solve complex programming tasks. To do so, it trained the base model on roughly 120,000 code instruction/response pairs in Alpaca format.
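For reference, the Alpaca format mentioned above conventionally stores each training example as a record with instruction, input, and output fields. The record below is our own illustrative sketch of that public convention, not an actual StableCode training sample:

```python
# Hypothetical Alpaca-style instruction/response record (illustration only,
# following the publicly documented Alpaca field names, not StableCode's data).
example = {
    "instruction": "Write a Python function that returns the factorial of n.",
    "input": "",  # optional extra context; empty when the instruction stands alone
    "output": (
        "def factorial(n):\n"
        "    result = 1\n"
        "    for i in range(2, n + 1):\n"
        "        result *= i\n"
        "    return result"
    ),
}
```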

Hot news in the IT industry

 Alibaba and other major Chinese tech companies spend $5 billion on Nvidia chips in response to US export restrictions

Major Chinese tech companies including Alibaba, Baidu, ByteDance and Tencent have jointly ordered about $1 billion worth of Nvidia A800 GPUs, while also planning to buy $4 billion worth of GPUs next year.

These GPUs are crucial for key AI applications such as training large language models, and U.S. restrictions on technology exports have triggered a wave of purchases by Chinese companies. Despite the restrictions, Chinese technology companies are still working hard to secure export-compliant GPUs to meet their AI needs.

 OpenAI launches the GPTBot web crawler to collect training data; site owners can opt out

OpenAI has launched GPTBot, a web crawler that identifies itself with a specific user-agent string. GPTBot crawls web content to gather the data needed to train OpenAI models such as ChatGPT, and it filters out paywalled pages, pages containing personal information, and pages that violate OpenAI policies. Allowing GPTBot to visit a website can help improve the accuracy and capabilities of the models. Site owners can block GPTBot by adding it to their robots.txt file, or allow it to access only parts of the site, as sketched below.
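Here is a minimal sketch of what such robots.txt rules look like and how they behave, using Python's standard-library robots.txt parser. The "GPTBot" user-agent token and the Allow/Disallow mechanism come from OpenAI's announcement; the example paths and domain are our own placeholders.

```python
# Check how example robots.txt rules apply to the GPTBot user agent.
# The paths below are hypothetical; "Disallow: /" alone would block GPTBot entirely.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: GPTBot
Allow: /blog/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("GPTBot", "https://example.com/blog/post-1"))   # True
print(parser.can_fetch("GPTBot", "https://example.com/private/data"))  # False
```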

The move has sparked discussion of the privacy and copyright issues surrounding the collection of AI training data. On the one hand, GPTBot expands the pool of data OpenAI can collect for model training; on the other, some websites worry their content could be misused, which is why the opt-out option is provided.

Today's recommended reads

Python out of favor! Hugging Face writes a new ML framework in Rust and quietly open-sources it

Conversation with serverless expert Luca Mezzalira: Are you really ready for Serverless X AI?

Nvidia releases the world's first super AI chip: 50% faster than the previous generation, with lower costs for training large models

Musk wins the premium domain AI.com: OpenAI reportedly paid tens of millions of dollars for it, only for it to change hands in under a year?

Meeting content used to train large models! Zoom: our AI features aren't "free"

The Python team officially announces plans to remove the GIL


Source: blog.csdn.net/sinat_37574187/article/details/132259290