This post shows how to use GROUP BY in MySQL to fetch, for each group, the row that holds a field's maximum value. Many people aren't quite sure how to do this correctly, so I'm sharing my notes for reference; I hope you find them useful.
Suppose we have a business scenario that requires querying users' login records, with the following table structure:
CREATE TABLE `tb` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`uid` int(11) NOT NULL,
`ip` varchar(16) NOT NULL,
`login_time` datetime,
PRIMARY KEY (`id`),
KEY (`uid`)
);
Now add some test data:
INSERT INTO tb SELECT null, 1001, '192.168.1.1', '2016-01-01 16:30:47';
INSERT INTO tb SELECT null, 1003, '192.168.1.153', '2016-01-01 19:30:51';
INSERT INTO tb SELECT null, 1001, '192.168.1.61', '2016-01-01 16:50:41';
INSERT INTO tb SELECT null, 1002, '192.168.1.31', '2016-01-01 18:30:21';
INSERT INTO tb SELECT null, 1002, '192.168.1.66', '2016-01-01 19:12:32';
INSERT INTO tb SELECT null, 1001, '192.168.1.81', '2016-01-01 19:53:09';
INSERT INTO tb SELECT null, 1001, '192.168.1.231', '2016-01-01 19:55:34';
The table then contains:
+----+------+---------------+---------------------+
| id | uid | ip | login_time |
+----+------+---------------+---------------------+
| 1 | 1001 | 192.168.1.1 | 2016-01-01 16:30:47 |
| 2 | 1003 | 192.168.1.153 | 2016-01-01 19:30:51 |
| 3 | 1001 | 192.168.1.61 | 2016-01-01 16:50:41 |
| 4 | 1002 | 192.168.1.31 | 2016-01-01 18:30:21 |
| 5 | 1002 | 192.168.1.66 | 2016-01-01 19:12:32 |
| 6 | 1001 | 192.168.1.81 | 2016-01-01 19:53:09 |
| 7 | 1001 | 192.168.1.231 | 2016-01-01 19:55:34 |
+----+------+---------------+---------------------+
If all we need is each user's last login time, the query is simple:
SELECT uid, max(login_time)
FROM tb
GROUP BY uid;
+------+---------------------+
| uid | max(login_time) |
+------+---------------------+
| 1001 | 2016-01-01 19:55:34 |
| 1002 | 2016-01-01 19:12:32 |
| 1003 | 2016-01-01 19:30:51 |
+------+---------------------+
But if we also need other columns from the row of the user's last login, this kind of SQL won't work:
-- WRONG
SELECT uid, ip, max(login_time)
FROM tb
GROUP BY uid;
-- WRONG
This statement is not standard SQL. Although MySQL will execute it successfully, the result is indeterminate
(and if sql_mode includes only_full_group_by, it won't execute at all).
In practice the ip column tends to come from whichever row of the group MySQL happens to read first, which is clearly not the information we want.
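As an aside that goes beyond the original example: on MySQL 5.7 and later you can check the current sql_mode, and the built-in ANY_VALUE() function lets such a query run under only_full_group_by. It does not solve our problem, though, because the ip it returns is still an arbitrary row's value:
-- check whether only_full_group_by is part of the current sql_mode
SELECT @@sql_mode;
-- MySQL 5.7+: ANY_VALUE() suppresses the only_full_group_by error,
-- but the ip returned for each group remains arbitrary
SELECT uid, ANY_VALUE(ip), max(login_time)
FROM tb
GROUP BY uid;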
Approach ①
One option is a subquery:
SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE (a.uid, a.login_time) IN (
      SELECT uid, max(login_time)
      FROM tb
      GROUP BY uid);
Note that matching on the (uid, login_time) pair matters: filtering on login_time alone would also return any row whose login time merely coincides with another user's maximum.
Approach ②
Alternatively, use a correlated subquery:
SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE a.login_time = (
SELECT max(login_time)
FROM tb
WHERE a.uid = uid);
I also ran a quick comparison.
On versions before 5.6, approach ② produces a poor execution plan on large tables (the correlated subquery is re-evaluated with a full scan for every outer row), so performance suffers.
On 5.6 and later, the execution plan changes and the same SQL runs much faster.
5.5.50:
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
| id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra |
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
| 1 | PRIMARY | a | ALL | NULL | NULL | NULL | NULL | 7 | Using where |
| 2 | DEPENDENT SUBQUERY | tb | ALL | uid | NULL | NULL | NULL | 7 | Using where |
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
5.6.30:
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
| id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra |
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
| 1 | PRIMARY | a | ALL | NULL | NULL | NULL | NULL | 7 | Using where |
| 2 | DEPENDENT SUBQUERY | tb | ref | uid | uid | 4 | test.a.uid | 1 | NULL |
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
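A further optimization, my own suggestion rather than part of the original test: a composite index on both columns lets MySQL find each uid's max(login_time) directly from the index, which typically helps both approach ② and the join in approach ③ below (the index name here is just an example):
-- composite index so max(login_time) per uid can be read from the index
ALTER TABLE tb ADD KEY idx_uid_login_time (uid, login_time);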
Approach ③
Better still, rewrite it as a join, which performs even better:
SELECT a.uid, a.ip, a.login_time
FROM (SELECT uid, max(login_time) login_time
FROM tb
GROUP BY uid
) b JOIN tb a ON a.uid = b.uid AND a.login_time = b.login_time;
All of these, of course, return the same result:
+------+---------------+---------------------+
| uid | ip | login_time |
+------+---------------+---------------------+
| 1003 | 192.168.1.153 | 2016-01-01 19:30:51 |
| 1002 | 192.168.1.66 | 2016-01-01 19:12:32 |
| 1001 | 192.168.1.231 | 2016-01-01 19:55:34 |
+------+---------------+---------------------+
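One more alternative not covered in the original text: on MySQL 8.0 and later, a window function expresses "latest row per group" directly:
-- MySQL 8.0+: number each user's logins from newest to oldest,
-- then keep only the newest row per uid
SELECT uid, ip, login_time
FROM (SELECT uid, ip, login_time,
             ROW_NUMBER() OVER (PARTITION BY uid ORDER BY login_time DESC) rn
      FROM tb) t
WHERE rn = 1;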
Naturally, to take each group's minimum instead, just swap the corresponding function and comparison, as shown below.
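For instance, approach ③ with min() in place of max():
SELECT a.uid, a.ip, a.login_time
FROM (SELECT uid, min(login_time) login_time
      FROM tb
      GROUP BY uid
     ) b JOIN tb a ON a.uid = b.uid AND a.login_time = b.login_time;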
That's everything in "How to use GROUP BY in MySQL to get a field's maximum per group". Thanks for reading! I hope it gave you a solid picture; if you'd like to learn more, follow the 亿速云 industry news channel!