Show them how wrong they are about FPS
#shorts Can you tell the difference? A video demonstrating the differences between three common FPS targets used by movies and games.
I try to play at 30 fps but even with low settings it ends up being 5-10fps 1/3 of the time
60 fps is the max a human can notice .. but playing at 120 Hz and 120 FPS your eyes will rest more
Why are all the comments variations of “human eyes can’t see more than c fps”? That is so stupid. The eye isn’t a numeric device. It is far closer to an analog device than anything else, and you’d measure that in Hz, not fps. It heavily depends on the brain processing the “data” for you. No two people will actually see the same thing or perceive it the same way. Perception will also differ depending on whether you look with one eye or two, since with one eye there is half the “data” the brain can process. You’d be surprised how bad life looks with two eyes if the brain doesn’t do its job.
There are no frames rendering in real life, it's just infinite . . . how the truly infinite God created it
Bro, they said you can't see more than 60 fps, not 30 fps.
People said all kinds of silly things 😊
2X playback speed 😏
that makes the clip 120fps, not even kidding.
Who said they can't detect more than 30 fps? "Science"? Come on man!
Human eye cant see more than 4GB of RAM
But could I download more ram?
If someone says "The human eye can't see above 30 fps" you should just tell them to go see a doctor cause something is wrong with their eye.
only console kiddos that have never played on pc would say that lol
You are right, the human eye can differentiate well over 100 fps in special cases, provided there is no "motion blur". Whether a video ("animated picture series") is perceived as smooth or not mostly depends on the inter-frame correlation, the similarity between two consecutive images. The amount of image detail also plays a role. For example, a 15 fps 'cloud flow' video can be totally smooth, while a 60 fps "shoot-by" animation of a black sphere on a white background (unblurred, maximum brightness contrast) can still be perceived as separate images. So the main reason why "traditional" 24/25 fps movies appear smooth is the tremendous amount of blur in the pictures (filmed images, not the computer-generated ones). You can see the blurriness very well on still images of those motion pictures.
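The point above about motion blur and inter-frame correlation can be sketched in code (a toy illustration, not anything from the video; all names and the 1-D "strip" model are made up for this example). A sharp dot moving fast at a low frame rate strobes between isolated positions, while averaging sub-frames within each exposure window, i.e. motion blur, fills in the path, so consecutive frames overlap and read as continuous motion:

```python
def render_sharp(pos, width=64):
    """Render a 1-pixel bright dot on a dark 1-D strip."""
    frame = [0.0] * width
    frame[int(pos) % width] = 1.0
    return frame

def render_blurred(start, end, subsamples=16, width=64):
    """Average many sub-positions across the exposure window (motion blur)."""
    frame = [0.0] * width
    for i in range(subsamples):
        pos = start + (end - start) * i / subsamples
        frame[int(pos) % width] += 1.0 / subsamples
    return frame

def lit_pixels(frames):
    """All pixels that receive any light across a sequence of frames."""
    return {i for f in frames for i, v in enumerate(f) if v > 0}

SPEED = 8  # pixels per frame: fast motion at a low frame rate
sharp = [render_sharp(t * SPEED) for t in range(4)]
blurred = [render_blurred(t * SPEED, (t + 1) * SPEED) for t in range(4)]

print(len(lit_pixels(sharp)))    # 4  — the dot strobes between isolated spots
print(len(lit_pixels(blurred)))  # 32 — blur covers the whole motion path
```

The sharp sequence lights only 4 isolated pixels with 7-pixel dark gaps between them (the "separate images" effect), while the blurred sequence lights all 32 pixels along the path, which is why blurred 24 fps footage can still look smooth.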
@LtdJorge for slomo footage you can almost never have enough xD If we're talking about games though, I'm sure we won't stop until we have at least 1000 Hz monitors (and capable hardware).
@Takis K. There was an estimate that put it physically around 800 fps, but that is only for the fovea, etc. I think upwards of 200 is already enough; beyond that we're just wasting bandwidth/storage. Another good benefit is that, as with RAW, more data is always good for editing: you can trim it down later, but you cannot get data out of nothing without faking it.
I can pretty much guarantee that you could differentiate 100 from 200 and then 200 from 1000 fps too. It's beyond the 1000 Hz point that it probably becomes nearly impossible. I have personally seen up to 240 and can easily see the difference. It's the small differences that are super hard to see, like 100 from 120 or 200 from 240, but when you're comparing doubles, it's really obvious, and I wouldn't be surprised if some can see the difference between 1000 and 2000 also.
At first, the 30 and 60 looked exactly the same to me. But when I played it in 1080p and full screen, the 30 fps does have a noticeable jitter to it that can strain the eyes a bit after a while.
Yes... and if you watch full-screen motion at 120 FPS you will see the difference from 60 fps. Clickbait title! :D
24 fps - cinematic
60 fps - orgasmic
If you don't aim for 1 FPS so every input is a "frame perfect input", you just ain't trying