We are benchmarking the maximum connection rate our server can sustain, using two machines as load-generating clients. The client on the CentOS machine reaches about 1400 connections/sec and holds that rate. The client on the Ubuntu machine, however, drops to about 1000 connections/sec after roughly two minutes of running. At that point the server's load has barely risen, and resource usage on the client machines is only around 60%. The kernel tuning parameters on both clients are identical, and the CentOS machine's hardware is actually lower-spec than the Ubuntu machine's. Why is this happening?
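To double-check the "identical tuning" claim, here is a minimal sketch (not from the original setup; it assumes the relevant knobs are the usual client-side TCP limits such as the ephemeral port range, TIME_WAIT handling, and file-descriptor limits) that dumps those parameters so the two clients' outputs can be diffed:

```python
#!/usr/bin/env python3
"""Dump kernel parameters that commonly cap a client's outbound
connection rate. Run on both machines and diff the output:

    python3 dump_tuning.py > centos.txt   # on the CentOS client
    python3 dump_tuning.py > ubuntu.txt   # on the Ubuntu client
    diff centos.txt ubuntu.txt
"""
from pathlib import Path

# Settings that typically affect how fast a client can open new TCP
# connections (ephemeral ports, TIME_WAIT recycling, fd limits).
# This list is an assumption, not taken from the original question.
PARAMS = [
    "net/ipv4/ip_local_port_range",
    "net/ipv4/tcp_tw_reuse",
    "net/ipv4/tcp_fin_timeout",
    "net/ipv4/tcp_max_tw_buckets",
    "net/core/somaxconn",
    "fs/file-max",
    "net/netfilter/nf_conntrack_max",  # absent if conntrack is not loaded
]

for name in PARAMS:
    path = Path("/proc/sys") / name
    try:
        value = path.read_text().strip()
    except FileNotFoundError:
        value = "<not present>"
    print(f"{name} = {value}")
```

If the diff is clean, the gradual drop after about two minutes would point instead at something that accumulates over time on the Ubuntu client, such as connections piling up in TIME_WAIT and exhausting ephemeral ports.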