Google's self-developed chip secrets exposed for the first time; hackers demand $100,000 for Razer source code and other data; Baichuan Intelligent releases the Baichuan-13B large model (source code provided)

Tech news highlights for July 12, 2023. A 10-second overview!

1. iFLYTEK: all in on the Spark large model in the second half of the year

2. Baichuan Intelligent releases Baichuan-13B (source code and installation/training tutorials are included as a reader bonus; see the end of the article for how to get them)

3. Google's self-developed chip secrets exposed for the first time

4. Hackers demand $100,000 for Razer source code and other data; Razer says it is investigating

5. A co-author of the Transformer paper is leaving Google

6. The Bcachefs file system was not merged into the Linux 6.5 kernel

Domestic news

1. iFLYTEK: all in on the Spark large model in the second half of the year

At an investor briefing held by iFLYTEK, Jiang Tao, the company's board secretary and vice president, explained the commercial rollout of the Spark large model and the company's R&D investment. Large cognitive-intelligence models are bringing new opportunities to the artificial intelligence industry, and iFLYTEK will stay focused on technological innovation and remain firmly committed to investing in key strategic directions. Jiang Tao said, "In the second half of the year, the company will go all in on large models. While continuing to increase R&D investment, we will accelerate the commercial rollout of the Spark large model, push it toward generating its own revenue, and ensure the company maintains stable cash flow."

2. Baichuan Intelligent releases Baichuan-13B

According to news on July 11, Baichuan Intelligent, founded by Sogou founder Wang Xiaochuan, officially released Baichuan-13B-Base, a general-purpose large language model with 13 billion parameters, along with the dialogue model Baichuan-13B-Chat and INT4/INT8 quantized versions. The Baichuan-13B Chinese-English model combines high performance, full open source, and free commercial use, and is billed as the best commercially usable large language model among open-source models under 33B parameters. Against the backdrop of a mature closed-source and open-source large-model ecosystem abroad, it fills a gap in high-quality, commercially usable open-source models in China.
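For readers who want to try the release, below is a minimal sketch of loading the chat model with Hugging Face transformers. It assumes the weights are published on the Hugging Face Hub under baichuan-inc/Baichuan-13B-Chat, that the model ships custom code enabled via trust_remote_code=True, and that it exposes a chat() helper; check the official repository for the authoritative instructions.

```python
# Sketch: load Baichuan-13B-Chat and run one dialogue turn.
# Assumptions: the repo id "baichuan-inc/Baichuan-13B-Chat", the
# trust_remote_code loading path, and the model.chat() helper are
# taken from the project's own documentation and may change.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

repo = "baichuan-inc/Baichuan-13B-Chat"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(repo, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # fp16 weights: roughly 26 GB of GPU memory for 13B params
    device_map="auto",
    trust_remote_code=True,
)
model.generation_config = GenerationConfig.from_pretrained(repo)

# Single-turn chat; chat() is provided by the model's remote code.
messages = [{"role": "user", "content": "Introduce Baichuan-13B in one sentence."}]
response = model.chat(tokenizer, messages)
print(response)
```

The INT4/INT8 quantized variants mentioned above are intended to shrink that memory footprint so the model can run on smaller GPUs.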

International news

1. Google's self-developed chip secrets exposed for the first time

Based on interviews with former Google chip executives and semiconductor analysts, The Information analyzed three main reasons behind the repeated delays of Google's fully custom chip: poor teamwork, a lack of R&D experience, and a relatively low return on investment. More than a decade ago, Apple blazed a new trail by designing its own custom A-series chips for the iPhone. Google saw a similar opportunity and wanted to follow Apple's lead by developing custom chips for its Pixel smartphones. According to former Google chip executives who worked on the "Redondo" program, Google's delay in bringing its fully custom Tensor chip to market is partly due to the company's difficulty dividing and coordinating the work between teams in the U.S. and India.

2. Hackers demand $100,000 for Razer source code and other data; Razer says it is investigating

On July 11, Razer issued a short tweet saying it had begun investigating a data breach. A user going by the name Nationalist posted on the dark web last Saturday, claiming to have stolen Razer.com's source code, database, encryption keys, and back-end login credentials, and asking for $100,000 worth of Monero (XMR) cryptocurrency. The poster set no restrictions or exclusivity, meaning anyone willing to pay the asking price can obtain the entire dataset. The post displays file trees, email addresses, alleged source code for anti-cheat and reward systems, API details, Razer Gold balances, and more. (IT House)

3. A co-author of the Transformer paper is leaving Google

Llion Jones, a co-author of the seminal AI paper "Attention Is All You Need", has confirmed he is leaving Google later this month and plans to start a company after taking a break. Published in 2017, the paper introduced the Transformer, an architecture that helps AI models zero in on the most important information in the data they analyze. Transformers are now a key building block of large language models. Over the years, the paper's authors have launched several high-profile startups, including Cohere, which provides large language models to corporate clients, and chatbot company Character.AI. (The Paper)

Programmer Zone

The Bcachefs file system was not merged into the Linux 6.5 kernel

Linus Torvalds has released Linux 6.5-rc1, and the most notable absence is the Bcachefs file system, which was not merged into the mainline kernel. The debate around merging Bcachefs grew so heated that Torvalds told everyone to calm down. Announced in 2015 and in development for nearly a decade, the copy-on-write file system Bcachefs is derived from bcache, a kernel block-level cache, and its developers aim to offer performance comparable to XFS/EXT4 with features similar to Btrfs and ZFS. New features in Linux 6.5 include the cachestat() system call, preliminary work on Intel Lunar Lake audio, preliminary USB4 v2 support, deprecation of the SLAB allocator, SMT and SIMD/vector support for the LoongArch architecture, AMD RDNA3 GPU overclocking support, Btrfs performance improvements, and more. (Solidot)
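As an illustration of the new cachestat() system call mentioned above, here is a minimal sketch in Python. It assumes an x86_64 machine running a 6.5 or newer kernel, that cachestat() is syscall number 451 on that architecture, and that the struct layouts mirror the kernel's UAPI definitions; with no glibc wrapper available at the time of writing, the call goes through syscall().

```python
# Sketch: query page-cache statistics for a file via the cachestat()
# system call added in Linux 6.5. Assumptions: x86_64 syscall number
# 451 and the uapi struct layouts below; verify against your kernel headers.
import ctypes
import os

NR_CACHESTAT = 451  # assumed x86_64 syscall number for cachestat()

class CachestatRange(ctypes.Structure):
    # Byte range to inspect; off=0, len=0 means the whole file.
    _fields_ = [("off", ctypes.c_uint64), ("len", ctypes.c_uint64)]

class Cachestat(ctypes.Structure):
    _fields_ = [
        ("nr_cache", ctypes.c_uint64),             # pages resident in the page cache
        ("nr_dirty", ctypes.c_uint64),             # dirty pages
        ("nr_writeback", ctypes.c_uint64),         # pages under writeback
        ("nr_evicted", ctypes.c_uint64),           # pages evicted from the cache
        ("nr_recently_evicted", ctypes.c_uint64),  # pages evicted recently
    ]

libc = ctypes.CDLL(None, use_errno=True)
libc.syscall.restype = ctypes.c_long

fd = os.open("/etc/os-release", os.O_RDONLY)
try:
    rng = CachestatRange(0, 0)
    st = Cachestat()
    ret = libc.syscall(NR_CACHESTAT, fd, ctypes.byref(rng), ctypes.byref(st), 0)
    if ret == 0:
        print("cached pages:", st.nr_cache, "dirty:", st.nr_dirty)
    else:
        print("cachestat failed, errno =", ctypes.get_errno())
finally:
    os.close(fd)
```

On kernels older than 6.5 the call returns -1 with errno set to ENOSYS, which is a convenient way to probe for the feature.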

Refer to the original link: https://blog.csdn.net/csdngeeknews/article/details/131674286
