In recent years, it has become common to employ the Kurdyka–Łojasiewicz (KL) inequality to establish the global convergence of certain descent algorithms. To apply the KL inequality, a sufficient decrease inequality must be established in advance. In this talk, we show that for some problems the conditions required to obtain such an inequality can be weakened, and hence the assumptions needed to prove global convergence can be weakened as well.
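For context, the sufficient decrease inequality referred to above is usually stated as follows (a standard formulation for a generic descent method; the objective $f$, iterates $x_k$, and constant $a$ are notation assumed here, not taken from the abstract):

```latex
% Sufficient decrease condition: each step lowers the objective f by an
% amount proportional to the squared step length, for some fixed a > 0.
f(x_{k+1}) + a\,\|x_{k+1} - x_k\|^{2} \le f(x_k), \qquad k = 0, 1, 2, \ldots
```

Combined with the KL inequality at a limit point, this condition is what typically yields convergence of the whole sequence of iterates.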