Quickly generate unique card numbers

Spiderweb offers many kinds of shopping cards, movie coupons, and vouchers.


For example, a typical requirement is to generate a new batch of 100,000 card numbers: 30,000 for WeChat channels, 50,000 for the main site and app, and 20,000 for external channels.


So how do we generate card numbers quickly without producing duplicates?

The logic is simple: generate a random card number somehow, check the database for duplicates, insert it only if it does not already exist, and repeat until 100,000 numbers have been produced.


1. How to generate the card number?

The card number can encode the business type: vouchers start with DY, and redemption tickets start with DH. The prefix speeds up the duplicate-check query, and further dimensions can be appended to it as needed.
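
For illustration, here is a minimal sketch of the scheme, reusing the same 'DY' prefix, 8-digit random suffix, and biz_card table that appear in the stored procedure further below:

-- build a candidate number: business-type prefix plus a random 8-digit suffix
-- ('DY' marks a voucher here; 'DH' would mark a redemption ticket)
SET @candidate = CONCAT('DY', FLOOR(10000000 + RAND() * 89999999));

-- duplicate check: the prefix plus the index on cardno keeps this lookup cheap
SELECT COUNT(id) FROM biz_card WHERE cardno = @candidate;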


2. How to keep generation fast?

At first I only thought about designing the card number itself so that duplicates could be avoided without any lookups, but such a number cannot be short, and a long card number is a poor experience.

Later a colleague reminded me about stored procedures, so I wrote one.

The first stored procedure was still fairly slow: about 33 s for 1,000 numbers. I asked the colleague again, and the answer was one word: batch. That made me think of batching commits in MySQL:

-- turn off autocommit for the session so inserts can be committed in explicit batches
set autocommit=0;

-- card table: ix_cardno is a plain (non-unique) index used by the duplicate check
CREATE TABLE `biz_card` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `cardno` varchar(50) DEFAULT NULL,
  `status` tinyint(4) DEFAULT '0',
  PRIMARY KEY (`id`),
  KEY `ix_cardno` (`cardno`)
) ENGINE=InnoDB AUTO_INCREMENT=7303 DEFAULT CHARSET=utf8;

DROP PROCEDURE IF EXISTS `genCardNo`;

DELIMITER $$
CREATE PROCEDURE genCardNo(IN num int)
begin
    -- newCard holds a random 8-digit number
    DECLARE newCard int;

    set autocommit=0;
    while num>0 do

        SET newCard = FLOOR(10000000 + (RAND() * 89999999));

        select @repCount:=count(id) from biz_card where cardno=CONCAT('DY',newCard);
        -- duplicate check
        if @repCount>0 THEN
            SELECT concat('the value is repeated: ', 'DY', newCard);
        else
            insert into biz_card(cardno,status) values(CONCAT('DY',newCard), 0);
            set num = num - 1;
            -- commit in batches of 5000 successful inserts
            if num%5000=0 then
                SELECT concat('begin commit, num:', num);
                commit;
            end if;
        end if;

    end while;
    commit;
end $$
DELIMITER ;

[SQL] call genCardNo(1000);
Affected rows: 1
Time: 0.3s

[SQL] call genCardNo(100000);
Affected rows: 0
Time: 15.350s
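
As a quick sanity check (not part of the original run), one query can confirm that no duplicates slipped through:

-- total rows and distinct card numbers should be equal if generation is duplicate-free
SELECT COUNT(*) AS total_rows, COUNT(DISTINCT cardno) AS distinct_cards FROM biz_card;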


This seems good enough. If the data volume grows much larger, my next idea is to distribute the generation across several machines through a message queue and use Redis to store the numbers and handle deduplication.


Origin blog.csdn.net/penkee/article/details/52947846