Nothing in your code stands out from a cursory look, but the diagnostic visualizer, running behind your app, is your friend for confirming what is actually happening versus what your app code thinks is happening. It also helps to draw the data you are working with. In OpenLeapKit, for example, I draw a 2D hand where I can toggle data on and off: finger tip positions, finger directions, the sphere radius as a horizontal line (positioned vertically at the sphere centre's y), and so on. That way you get quick visual feedback on exactly what your filters are operating on.
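To make that concrete, here is a minimal sketch of that style of debug drawing in Python with matplotlib. The coordinate values are made up for illustration; in a real app they would come from the SDK's hand and finger objects. Toggle flags let you isolate each signal, and the sphere radius is drawn as a horizontal line placed at the sphere centre's y:

```python
import matplotlib.pyplot as plt

# Synthetic stand-in values for one frame of hand data
# (in a real app these come from the tracker's hand/finger objects).
tip_positions = [(-40, 210), (-15, 235), (10, 240), (35, 230), (60, 200)]  # (x, y) mm
directions    = [(-0.2, 0.9), (-0.05, 1.0), (0.0, 1.0), (0.1, 0.95), (0.3, 0.8)]
sphere_center = (5, 180)   # (x, y) mm
sphere_radius = 75         # mm

# Toggles: flip these off to isolate which signal a filter is reacting to.
SHOW_TIPS, SHOW_DIRECTIONS, SHOW_SPHERE = True, True, True

fig, ax = plt.subplots()
if SHOW_TIPS:
    xs, ys = zip(*tip_positions)
    ax.scatter(xs, ys, label="finger tips")
if SHOW_DIRECTIONS:
    # Draw each finger direction as a short arrow from its tip.
    for (x, y), (dx, dy) in zip(tip_positions, directions):
        ax.arrow(x, y, dx * 20, dy * 20, head_width=3)  # scaled for visibility
if SHOW_SPHERE:
    # Sphere radius drawn as a horizontal line whose length is the radius,
    # positioned vertically at the sphere centre's y.
    cx, cy = sphere_center
    ax.hlines(cy, cx - sphere_radius / 2, cx + sphere_radius / 2,
              label="sphere radius")
ax.set_aspect("equal")
ax.legend()
plt.show()
```

Redrawing this every frame while you open and close your hand makes it obvious when, say, the sphere radius jumps around under bad lighting even though your code is doing exactly what it should.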
So if your code is working properly but isLeapHandFist still does not seem to perform well, you can see visually what is throwing it off. The parameter values I chose were a response to the characteristics I observed, which were noticeably affected by lighting conditions, and I suspect different hands will also shift the best values. Still, tuning the values for my environment got me to roughly 85% accuracy, versus less than 20% for the other approach, tested with both closed fingers and an open hand. And I believe there was more tuning headroom left.
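For illustration, this is the general shape of the heuristic being tuned. It is a hedged sketch, not the actual OpenLeapKit implementation: the function name, the thresholds, and the inputs are all hypothetical starting points. The underlying signal is real, though: a closed fist shows up as a small grip sphere with few or no tracked fingers.

```python
def is_fist(sphere_radius_mm, extended_finger_count,
            radius_threshold=70.0, max_fingers=1):
    """Crude fist test: a closed hand 'grips' a small sphere and hides
    most fingers from the tracker. Both threshold values here are
    hypothetical defaults -- tune them against your own lighting,
    mounting, and hands, ideally while watching the debug drawing."""
    return (sphere_radius_mm < radius_threshold
            and extended_finger_count <= max_fingers)

# Example: a frame reporting a 62 mm grip sphere and no visible fingers.
print(is_fist(62.0, 0))   # True  -- likely a fist
print(is_fist(95.0, 4))   # False -- likely an open hand
```

The point is that both parameters are environment-sensitive, so treat any defaults as a starting point and verify them against the visualized data rather than against assumptions about what the tracker "should" report.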