MySQL: Optimizing Single-Row Queries
1. Introduction
During development of this project, a fair number of the SQL statements fetch a single row based on some condition. Whether it comes from a business requirement or simply from the shape of the data, this "fetch exactly one row" pattern comes up all the time, and such statements can be optimized.
2. Procedure
The optimization is simply to append LIMIT 1, for example:
SELECT * FROM stuinfo WHERE stuphone = '944f6d77f2dfa2cbd514';

SELECT * FROM stuinfo WHERE stuphone = '944f6d77f2dfa2cbd514' LIMIT 1;
All we have done is append LIMIT 1 to the original SQL, yet it brings a large performance gain: without LIMIT 1, MySQL keeps scanning the table even after the matching row has been found, whereas with it the scan stops at the first match.

Practical test:

a. I created a table named 'stuinfo' on my local machine:
CREATE TABLE `stuinfo` (
  `stunum`     varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL DEFAULT '',
  `stuname`    varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL DEFAULT '',
  `stusex`     varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL DEFAULT '',
  `stuage`     varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL,
  `stumajor`   varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL DEFAULT '',
  `stuphone`   varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL,
  `stuweixin`  varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL,
  `stuaddress` varchar(50) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL DEFAULT '',
  PRIMARY KEY (`stunum`) USING BTREE
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic;
b. Then I populated the table with 20,000 rows of random data through a stored procedure:
delimiter &&
CREATE PROCEDURE addRandomData2()
MODIFIES SQL DATA
BEGIN
  DECLARE counts INT DEFAULT 0;
  SET counts = 20000;
  label: LOOP
    -- stunum takes the loop counter; every other column gets a random 20-character string
    INSERT INTO stuinfo (`stunum`, `stuname`, `stusex`, `stuage`, `stumajor`, `stuphone`, `stuweixin`, `stuaddress`)
    VALUES (counts,
            substring(MD5(RAND()), 1, 20),
            substring(MD5(RAND()), 1, 20),
            substring(MD5(RAND()), 1, 20),
            substring(MD5(RAND()), 1, 20),
            substring(MD5(RAND()), 1, 20),
            substring(MD5(RAND()), 1, 20),
            substring(MD5(RAND()), 1, 20));
    SET counts = counts - 1;
    -- count down from 20000 and stop once counts reaches 10001
    IF counts = 10001 THEN
      LEAVE label;
    END IF;
  END LOOP label;
END &&
delimiter ;

CALL addRandomData2();
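An optional sanity check before timing anything is to count the rows now in the table (note that the loop above exits once counts reaches 10001, so a single run of this procedure inserts stunum values 20000 down to 10002):

SELECT COUNT(*) FROM stuinfo;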
c. Run the two queries (first make sure that the value being searched for really does match only one row):
SELECT * FROM stuinfo WHERE stuphone = '944f6d77f2dfa2cbd514';

Query time: 0.024 s

SELECT * FROM stuinfo WHERE stuphone = '944f6d77f2dfa2cbd514' LIMIT 1;

Query time: 0.003 s
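If you want to confirm that the LIMIT 1 version really does stop scanning early, rather than relying only on wall-clock time, one option is to compare the Handler_read_rnd_next status counter, which counts rows read during a table scan. This is just a sketch, run in a single session, and assumes there is no index on stuphone (there is none in the table above):

-- Compare how many rows each query actually examines.
-- FLUSH STATUS resets the session status counters to zero.
FLUSH STATUS;
SELECT * FROM stuinfo WHERE stuphone = '944f6d77f2dfa2cbd514';
SHOW SESSION STATUS LIKE 'Handler_read_rnd_next';  -- roughly the whole table

FLUSH STATUS;
SELECT * FROM stuinfo WHERE stuphone = '944f6d77f2dfa2cbd514' LIMIT 1;
SHOW SESSION STATUS LIKE 'Handler_read_rnd_next';  -- usually far smaller: the scan stops at the first match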
3. Result
Adding LIMIT 1 really does save a lot of query time, but you must first make sure that the query can only ever return one row; otherwise LIMIT 1 will silently drop rows you actually wanted.
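If you are not certain the column is unique, a simple duplicate check (sketched here against the stuphone column used above) settles it before you rely on LIMIT 1:

SELECT stuphone, COUNT(*) AS cnt
FROM stuinfo
GROUP BY stuphone
HAVING COUNT(*) > 1;

And if the column is genuinely unique, it is also worth considering a unique index on it, since that both enforces uniqueness and lets the lookup avoid the full table scan altogether.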