Fixing the "Too many connections" error in Redis

Recently, while load testing one of the system's endpoints, I noticed that a Redis-backed query endpoint was failing in large numbers. Looking at the backend logs, I found aioredis reporting Too many connections. My first assumption was that, since the endpoint reads data from Redis and the load test sends a flood of concurrent requests, the Redis server was being hammered, couldn't keep up, and returned the Too many connections error. After some investigation, however, that turned out not to be the case.

The detailed traceback:

Traceback (most recent call last):
  File "handle_request", line 83, in handle_request
    FutureStatic,
  File "/app/handlers/test_handler.py", line 67, in get
    n = await userdao.getTaskNum()
  File "/app/dao/user_dao.py", line 33, in getTaskNum
    resutl = await self.request.app.ctx.redis.get("task_num")
  File "/usr/local/lib/python3.8/site-packages/aioredis/client.py", line 1082, in execute_command
    conn = self.connection or await pool.get_connection(command_name, **options)
  File "/usr/local/lib/python3.8/site-packages/aioredis/connection.py", line 1411, in get_connection
    connection = self.make_connection()
  File "/usr/local/lib/python3.8/site-packages/aioredis/connection.py", line 1449, in make_connection
    raise ConnectionError("Too many connections")
aioredis.exceptions.ConnectionError: Too many connections

Looking at the aioredis source code:

def make_connection(self):
    """Create a new connection"""
    if self._created_connections >= self.max_connections:
        raise ConnectionError("Too many connections")
    self._created_connections += 1
    return self.connection_class(**self.connection_kwargs)

The error is raised here because _created_connections has reached max_connections. So where is max_connections initialized?

My code sets the max_connections parameter to 50 when initializing the Redis connection pool:

import aioredis

def make_redis(redisConf):
    '''
    redis://[[username]:[password]]@localhost:6379/0
    rediss://[[username]:[password]]@localhost:6379/0
    unix://[[username]:[password]]@/path/to/socket.sock?db=0
    :param redisConf: dict with host, port, db and password
    :return: an aioredis.Redis client backed by a connection pool
    '''
    password = redisConf.get("password")
    host = redisConf.get("host")
    port = redisConf.get("port")
    db = redisConf.get("db")
    address = f'redis://:{password}@{host}:{port}/{db}'
    try:
        pool = aioredis.ConnectionPool.from_url(address, max_connections=50)
        redis = aioredis.Redis(connection_pool=pool, encoding='utf-8')
        return redis
    except Exception:
        logger.error("Failed to connect to redis")
        raise Exception("Failed to connect to redis")

That is where the limit of 50 comes from: this service can hold at most 50 connections to the Redis server. The aioredis ConnectionPool instance keeps two internal collections, _available_connections and _in_use_connections. When a connection is requested, the pool pops one from _available_connections and adds it to _in_use_connections:

async with self._lock:
    try:
        connection = self._available_connections.pop()
    except IndexError:
        connection = self.make_connection()
    self._in_use_connections.add(connection)

In other words, when the backend needs to talk to Redis under heavy load, the connections created for earlier requests are all still checked out, so the pool has to create a new one. A connection only goes back into _available_connections after its command completes, so once more than max_connections commands are in flight at the same time, make_connection() raises the Too many connections error.
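This is easy to reproduce with an artificially small pool. The sketch below assumes a local Redis at redis://localhost:6379/0 and max_connections=2 (both values made up for illustration): firing more concurrent commands than the pool can hold makes some of them fail with ConnectionError("Too many connections").

import asyncio
import aioredis

async def main():
    # Tiny pool on an assumed local Redis, just to hit the limit quickly.
    pool = aioredis.ConnectionPool.from_url("redis://localhost:6379/0", max_connections=2)
    redis = aioredis.Redis(connection_pool=pool)
    # Each in-flight GET holds a connection until its reply arrives, so 50
    # concurrent GETs ask for up to 50 connections at once; beyond the first
    # two, make_connection() raises ConnectionError("Too many connections").
    results = await asyncio.gather(
        *(redis.get("task_num") for _ in range(50)),
        return_exceptions=True,
    )
    errors = [r for r in results if isinstance(r, Exception)]
    print(f"{len(errors)} of 50 commands failed, e.g. {errors[0] if errors else None}")

asyncio.run(main())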

The fix is simple: don't set the max_connections parameter at all. When it is not set, aioredis defaults it to 2 ** 31, i.e. 2 to the power of 31, which is effectively unlimited on the client side.
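Applied to the make_redis function above, the change is simply dropping the max_connections argument; a minimal sketch with everything else unchanged:

import aioredis

def make_redis(redisConf):
    password = redisConf.get("password")
    host = redisConf.get("host")
    port = redisConf.get("port")
    db = redisConf.get("db")
    address = f'redis://:{password}@{host}:{port}/{db}'
    # No max_connections here: the pool falls back to its 2 ** 31 default,
    # so the client no longer caps the number of concurrent connections.
    pool = aioredis.ConnectionPool.from_url(address)
    return aioredis.Redis(connection_pool=pool, encoding='utf-8')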

The Redis server itself cannot accept connections without limit, though. You can log in to the Redis server and check how many client connections it allows; this server-side limit (the maxclients setting) defaults to 10000.
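On the server this is CONFIG GET maxclients (and INFO clients shows the current count). The same check can be done from the client side; a sketch assuming aioredis exposes the redis-py style config_get / info helpers, with a placeholder URL:

import asyncio
import aioredis

async def show_limits():
    redis = aioredis.Redis.from_url("redis://localhost:6379/0")
    maxclients = await redis.config_get("maxclients")  # server-side cap, 10000 by default
    clients = await redis.info("clients")              # includes connected_clients
    print(maxclients, "currently connected:", clients.get("connected_clients"))

asyncio.run(show_limits())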

After removing the max_connections cap on the aioredis connection pool, I ran the load test again:

10 threads and 2000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   318.38ms  115.42ms   1.09s    70.99%
    Req/Sec   622.78    241.92     1.58k    68.12%
  Latency Distribution
     50%  309.36ms
     75%  377.44ms
     90%  453.45ms
     99%  711.74ms
  60149 requests in 10.02s, 6.48MB read
Requests/sec:   6003.59
Transfer/sec:    662.51KB

This time the throughput went up, reaching about 6K requests per second. Checking the server afterwards, handling those ~60K requests only took roughly 500 Redis connections on the server side.