If screen size and frame rate (fps) had any relation, we could produce video at 5 fps for a 1" screen and 1 fps for an even smaller screen (which would be a single image per second).
Actually, screen size makes a big difference in where the threshold for animation occurs.
From the page:
https://en.wikipedia.org/wiki/Frame_rate
https://en.wikipedia.org/wiki/Frame_rate#/media/File:Animhorse.gif
You can just use Ctrl +/- (browser zoom) to resize the 12 fps GIF.
Sized down to about 2", it just barely clears the threshold of animation: your mind still perceives it as animated.
Sized up to about 5", it appears quite choppy.
Sized up to 12", it's 'unwatchable'.
Games that appear choppy as hell at 20 fps on a 21" screen will be quite playable at 20 fps on a 5" screen.
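One rough way to see why the same fps feels different at different sizes: an object crossing the screen in a fixed time makes a bigger frame-to-frame jump, in degrees of visual angle, on a bigger screen at the same viewing distance. A back-of-the-envelope sketch (the 40 cm viewing distance and the screen widths are my own assumptions, just to make the arithmetic concrete):

import math

# Per-frame jump, in degrees of visual angle, of an object that crosses
# the full screen width in one second. Larger jumps read as choppier.
def per_frame_jump_deg(screen_width_cm, fps, viewing_distance_cm=40.0):
    step_cm = screen_width_cm / fps  # distance covered between two frames
    return math.degrees(2 * math.atan(step_cm / (2 * viewing_distance_cm)))

# Assumed widths roughly matching the GIF sizes above (2", 5", 12").
for width_cm, label in [(5.0, '~2" wide'), (12.5, '~5" wide'), (30.0, '~12" wide')]:
    print(f'{label}: {per_frame_jump_deg(width_cm, 12):.2f} deg/frame at 12 fps')

That prints roughly 0.6, 1.5, and 3.6 degrees per frame: the same 12 fps produces jumps six times larger on the 12" version than on the 2" version, which matches the choppiness described above.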
I would argue, then, that processing cycles spent pushing a mobile device's frame rate above what is 'needed' (admittedly variable on many factors) do no 'good' and do real 'harm': they waste battery power by turning it into heat without enhancing the user experience.
Where that threshold sits will vary with the user and with the application/game being used. If we could put a user-configurable 'restriction' at the .dbp layer (so that it is application-dependent), the device user could adjust up or down how much CPU gets allocated to the application - which translates into a choice between battery life and frame rate - and choose where that balance best suits them.
That is the idea. I don't know whether it is possible to implement. But, as things stand, 'greedy apps' will draw every scrap of CPU they can, regardless of whether doing so actually improves the perceived user experience.
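For what it's worth, the core of the restriction is simple at the application level: cap the frame rate and sleep away the slack instead of spinning. A minimal sketch, assuming the per-app cap is just a number the user sets (the names and plumbing here are hypothetical, not any real OS API):

import time

# Render at most target_fps frames per second. The time not spent
# rendering is given back to the OS via sleep, letting the CPU idle
# (and clock down) instead of burning battery on imperceptible frames.
def run_capped_loop(render_frame, target_fps=20):
    frame_budget = 1.0 / target_fps
    while True:
        start = time.monotonic()
        render_frame()                      # simulate + draw one frame
        elapsed = time.monotonic() - start
        slack = frame_budget - elapsed
        if slack > 0:
            time.sleep(slack)               # idle instead of busy-waiting

The saving comes entirely from the sleep: an idle core can drop to a lower power state, whereas a greedy render loop pegs it at full speed for frames nobody can perceive. The open question is how to impose this from outside the app, on apps that won't do it voluntarily.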