General Abuse
Talk about anything in here. If you've got something newsworthy, please submit it as news. If it seems borderline, submit it anyway and a mod will either approve it or move the post back to this thread.

News submissions: https://celephais.net/board/submit_news.php
Wrath 
So, who or what are the Gilmore Girls? 
Mike 
from your response, I reckon you googled it.

you're fired. 
Wrath 
Nope. Worse than that. I thought it sounded like a Bob Hope line, or maybe Groucho, or even Mae West. But my mum said something along the lines of "so what have the Gilmore Girls got to do with Quake?" - she was waiting for Heresp2 to download at the time. And she won't say any more. So, what gives...? 
Wait 
she was waiting for Heresp2 to download at the time

Your mum plays Quake? Why does everyone else get to have cool parents? :( 
Gilmore Girls 
Hmmmmm, Lorelai............................ 
Kell 
He doesn't think I'm cool when I won't let him use HIS computer 'cause I want to play on it. His FMB series of maps should have been called FMM! 
Ignore 
Please ignore the last post - the maps really were For My Babies. (They have both left home now, so not so much babies any more.) 
 
Gilmore Girls is some TV show on Lifetime. Full of older women and drama or some shit like that.


Necros: Quake (or maybe it's Q3) is also limited to only displaying 60 fps (even if the FPS display tells you it's more). It seems silly now, but 5 years down the road, when it's possible to run D3 at 200 billion FPS, it'll keep the game's timing and speed at what it was designed for. For the here and now, it'll serve as an upper limit to even the playing field for the deathmatch part of the game. And besides, as has been pointed out millions of times before in FPS-related flamewars, humans can't even see 60 fps, so it's a high enough limit that more won't make a difference. 
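
(For what it's worth, the cap itself is easy to picture. Here's a rough sketch of a renderer-side limiter - my own assumption of the general technique, not anything from the engine source - that just sleeps out the rest of each frame's time budget:)

    /* Rough sketch of a renderer-side fps cap (an assumption of the general
     * technique, not engine source): draw a frame, then sleep out whatever is
     * left of the frame budget before starting the next one. */
    #include <time.h>

    extern void RenderFrame(void);                     /* hypothetical: draw one frame */

    void CappedLoop(int max_fps)
    {
        const long budget_ns = 1000000000L / max_fps;  /* 60 fps -> ~16.7 ms per frame */
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            RenderFrame();

            /* schedule the next frame one budget later and sleep until then */
            next.tv_nsec += budget_ns;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec  += 1;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
    }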
Some May Beg To Differ 
Quake (or maybe it's Q3) is also limited to only displaying 60 fps (even if the FPS display tells you it's more)

Some may beg to differ with that statement; not only does the FPS display tell you it's more, but you can feel it

Humans can't even see 60 fps, so it's a high enough limit that more won't make a difference

I would also beg to differ on that point. If you leave Quake at the 72 fps setting and then set it much higher, you will certainly notice/feel a difference; I know I can. I leave Quake capped to 150 FPS, and you can certainly tell the difference between that and 72 FPS. The only reason I cap it at 150 FPS is that any higher and I get the step-climbing problem, i.e. at higher FPS you are either at the bottom of a step or at the top, with no transition, which makes climbing stairs feel funny. If it wasn't for that, I would let it go much higher.

In Quake 3, the difference between the default of 85 FPS (I think that's what it is) and setting it higher is very noticeable. And if QuakeIII is limited to 60 FPS, then why do people set it to 125 FPS or 333 FPS to gain an advantage through the different player speed, jumping etc. you get at those framerates? I also cap QuakeIII to 125 FPS. I used to leave it uncapped, but when it got over 400 FPS it would basically shut down my ADSL modem and drop my connection, which was solved by capping below 400 FPS. So if QuakeIII is limited to 60 FPS, as you say, then why would I get that problem?

It may all be fine in theory, but in my experience, it just doesn't hold 
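
(As an aside on why those particular numbers like 125 and 333 keep coming up: when the physics step follows the render rate and frame times are rounded to whole milliseconds, the integration error changes with the framerate. Here's a toy jump integrator - my own illustration using assumed Quake-style constants, not id's actual code - that prints a slightly different apex height for each cap:)

    /* Toy jump integrator: shows how jump height can drift with framerate
     * when the physics step tracks the render rate and frame times are
     * rounded to whole milliseconds.  Constants are assumed Quake-style
     * values (jump velocity 270 ups, gravity 800), not taken from id code. */
    #include <stdio.h>

    static float peak_height(int msec)            /* frame time in whole ms */
    {
        float dt = msec / 1000.0f;
        float z = 0.0f, vz = 270.0f;              /* assumed jump velocity */
        const float g = 800.0f;                   /* assumed gravity */

        for (;;) {
            vz -= g * dt;                         /* apply gravity for this frame */
            if (vz <= 0.0f)                       /* passed the apex this frame */
                break;
            z += vz * dt;                         /* then move up */
        }
        return z;
    }

    int main(void)
    {
        int fps[] = { 60, 85, 125, 333 };
        for (int i = 0; i < 4; i++) {
            int msec = 1000 / fps[i];             /* integer-millisecond frame time */
            printf("%3d fps (%2d ms steps): apex ~%.2f units\n",
                   fps[i], msec, peak_height(msec));
        }
        return 0;
    }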
John Carmack On The Issue Of Frame Rate 
October 22, 2003 - At a recent NVIDIA Editors' Day, id Software CEO Todd Hollenshead announced that DOOM 3 will be capped to 60 frames per second in the rendering engine.

We checked with John Carmack himself about why DOOM 3 will be hard-capped at 60fps in the renderer, and he had this to say:

"The game tic simulation, including player movement, runs at 60hz, so if it rendered any faster, it would just be rendering identical frames. A fixed tic rate removes issues like Quake 3 had, where some jumps could only be made at certain framerates. In Doom, the same player inputs will produce the same motions, no matter what the framerate is."


It also gives a roundabout reference to the fact that QuakeIII is not limited to 60 FPS

http://pc.ign.com/articles/456/456054p1.html?fromint=1 
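
(For the curious, here's a minimal sketch of the fixed-tic idea Carmack describes - my own illustration under stated assumptions, not id's code. The simulation always advances in roughly 1/60 s steps; rendering can run as often as it likes, but extra passes just redraw the same game state, which is why capping the renderer at 60 fps loses nothing:)

    #define TIC_MSEC 16                       /* roughly 60 Hz game tics */

    extern int  Sys_Milliseconds(void);       /* hypothetical: wall-clock time in ms */
    extern void RunGameTic(void);             /* hypothetical: advance the game one tic */
    extern void RenderFrame(void);            /* hypothetical: draw the current state */

    void GameLoop(void)
    {
        int last = Sys_Milliseconds();

        for (;;) {
            int now = Sys_Milliseconds();

            /* run as many fixed tics as real time has accumulated */
            while (now - last >= TIC_MSEC) {
                RunGameTic();                 /* player movement etc. at a fixed rate, so
                                                 the same inputs give the same motion no
                                                 matter the framerate */
                last += TIC_MSEC;
            }

            RenderFrame();                    /* if no tic ran, this just redraws an
                                                 identical frame */
        }
    }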
Gist Of It 
There are q3 server settings that can limit physics frames to 60 per second. In that case the extra graphics frames are wasted. Most servers don't do that though.

If you aren't supposed to feel the difference in theory, you wouldn't in practice. There's nothing about rendering the same frame twice that's going to make it feel smoother.

Regardless, humans can perceive more than 60fps. For most people it's between 90 and 125. 
Quake And Quake III 
Both feel smoother the higher the frame rate

That is all I am saying, that is the gist of it 
For Any UT2003 People 
The release of maps, mods, etc. will be hotting up today, with the deadline for the MSU contest being the 31st; there has been some good stuff released of late. Check the official forums for news as it happens

Damned freaks 
Unconvinced 
Regardless, humans can perceive more than 60fps. For most people it's between 90 and 125.


And where did you pick up this piece of information, may I ask? 
Pt1 
There is a common misconception that our eyes can only interpret 30 Frames Per Second. This misconception dates back to the first films, in which a horse was filmed to prove that at certain points during running it was resting on a single leg. These early films evolved to run at 24 Frames Per Second, which has been the standard for close to a century.

A movie theatre film running at 24 FPS (Frames Per Second) has an explanation. A movie theatre uses a projector and a large screen, so each frame is shown on the screen all at once. Because human eyes are capable of blending motion blur, and since the frames of a movie are drawn all at once, motion blur is captured in those few frames, which results in a lifelike perceptual picture. I'll explain the human eye and how it works in detail later on in this multi-page article.

Now, since the first CRT TV was released, televisions have been running at 30 Frames Per Second. TVs in homes today use the standard 60Hz (Hertz) refresh rate. This equates to 60/2, which equals 30 Frames Per Second, since each frame is drawn as two interlaced fields. A TV works by drawing each horizontal line of resolution piece by piece, using an electron gun to excite the phosphors on the TV screen. Secondly, because the frame rate is 1/2 the refresh rate, transitions between frames go a lot smoother. Without going into detail and making this a 30-page article discussing advanced physics, I think you'll understand those points.

Moving on now with the frame rate: motion blur, again, is a very important part of making videos look seamless. With motion blur, those two refreshes per frame give the impression of two frames to our eyes. This makes a really well encoded DVD look absolutely incredible. Another factor to consider is that neither movies nor videos dip in frame rate when it comes to complex scenes. With no frame rate drops, the action is again seamless.

Computer Games and their industry driving use of Frames Per Second
It's easy to understand TV and movies and the technology behind them. Computers are much more complex, the most complex part being the actual physiology/neuro-ethology of the visual system. Computer monitors of a smaller size are much more expensive relative to a TV CRT (Cathode Ray Tube). This is because the phosphors and the dot pitch of computer monitors are much smaller and much closer together, making much greater detail and much higher resolutions possible. Your computer monitor also refreshes much more rapidly, and if you look at your monitor through your peripheral vision you can actually watch these lines being drawn on your screen. You can also observe this technology difference when a computer monitor appears in the background of a TV picture.

A frame or scene on a computer is first set up by your video card in a frame buffer. The frame/image is then sent to the RAMDAC (Random Access Memory Digital-to-Analog Converter) for final display on your display device. Liquid Crystal Displays and FPD plasma displays use a higher quality, strictly digital representation, so the transfer of information (in this case a scene) is much quicker. After the scene has been sent to the monitor it is perfectly rendered and displayed. One thing is missing, however: the faster you do this, and the more frames you plan on sending to the screen per second, the better your hardware needs to be. Computer programmers and game developers, who have been working strictly with computers, can't reproduce motion blur in these scenes. Even though 30 frames are displayed per second, the scenes don't look as smooth as on a TV. Well, that is until we get to more than 30 FPS. 
Pt2 

NVIDIA, a computer video card maker who recently purchased 3dfx (another computer video card maker), just finished a GPU (Graphics Processing Unit) for Microsoft's XBOX. Increasing amounts of rendering capability and memory, as well as more transistors and instructions per second, equate to more frames per second in a computer video game or on computer displays in general. There is no motion blur, so the transition from frame to frame is not as smooth as in movies, that is, at 30 FPS. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 fps and the other half at 60 fps. The result? There is a definite difference between the two halves, with 60 fps looking much better and smoother than 30 fps.

Even if you could put motion blur into games, it would be a waste. The human eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, an implemented motion blur would cause the game to behave erratically; the programming wouldn't be as precise. An example would be playing a game like Unreal Tournament: if motion blur were used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned; that is, the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel and each object is exactly where it should be in the set space and time.

The overwhelming solution for more realistic gameplay, or computer video, has been to push the human eye past the misconception of only being able to perceive 30 FPS. Pushing the human eye past 30 FPS to 60 FPS and even 120 FPS is possible; ask the video card manufacturers, an eye doctor, or a physiologist. We as humans CAN and DO see more than 60 frames a second.

With Computer Video Cards and computer programming, the actual frame rate can vary. Microsoft came up with a great way to handle this by being able to lock the frame rate when they were building one of their games (Flight Simulator).
 
Pt3 
The Human Eye and its real capabilities - tahDA!
This is where this article gets even longer, but read on, please. I will explain to you how the human eye can perceive much past the misconception of 30 FPS and well past 60 FPS, even surpassing 200 FPS.

We humans see light when it's focused onto the retina of the eye by the lens. Light rays are perceived by our eyes as light enters - well, at the speed of light. I must stress again that we live in an infinite world where information is continuously streamed to us. Our retinas interpret light with two types of cells: the rods and the cones. Our rods and cones are responsible for all aspects of receiving the focused light rays on our retinas. In fact, rods and cones are the cells on the surface of the retina, and a lack thereof is a leading cause of blindness.

Calculations such as intensity, color, and position (relative to the cell on the retina) are all forms of information transmitted by our retinas to our optic nerves. The optic nerve in turn sends this data through its pipeline (at the nerve impulse speed), on to the Visual Cortex portion of our Brains where it is interpreted.

Rods are the simpler of the two cell types, as they really only interpret "dim light". Since rods are light-intensity-specific cells, they respond very fast, and to this day rival the quickest response time of the fastest computer. Rods control the amount of neurotransmitter released, which basically corresponds to the amount of light that is stimulating the rod at that precise moment. Scientific study has shown, upon microscopic examination of the retina, that there is a much greater concentration of rods along the outer edges. One simple experiment taught to students studying the eye is to go out at night and look at the stars (preferably the Orion constellation) out of your peripheral vision (side view). Pick out a faint star from your periphery and then look at it directly. The star should disappear, and when you again turn and look at it from the periphery, it will pop back into view.

Cones are the second specialized retinal cell type, and these are much more complex. Cones on our retinas are the equivalent of the RGB inputs that computer monitors and graphics use. The three basic parts to them absorb different wavelengths of light and release differing amounts of different neurotransmitters depending on the wavelength and intensity of that light. Think of our cones as RGB computer equivalents: each cone has three receptors that receive red, green, or blue in the wavelength spectrum. Depending on the intensity of each wavelength, each receptor will release varying levels of neurotransmitter on through the optic nerve, and in the case of some colors, no neurotransmitter at all. Because of their three-receptor nature versus the rod's one, a cone's response time is slower than a rod's, due to the cone's more complex nature.

Our optic nerves are the visual information highway by which our lens, then retina with its specialized cells, transmit the visual data on to our brain's visual cortex for interpretation. This all begins with a nerve impulse in the optic nerve triggered by rhodopsin in the retina, which takes all of a picosecond to occur. A picosecond is one trillionth of a second, so in theory we could calculate our eyes' "response time" and from there a theoretical frames per second (but I won't even go there now). Keep reading.

The optic nerves average in length from 2 to 3 centimeters, so it's a short trip to reach our visual cortex. OK, so like data on the internet, the data traveling in our optic nerves eventually reaches its destination, in this case the visual cortex - the processor/interpreter.

Unfortunately, neuroscience only goes so far in understanding exactly how our visual cortex, in such a small place, can produce such amazing images, unlike anything a computer can currently create. We only know so much, but scientists have theorised the visual cortex to be a sort of filter and blender, streaming the information into our consciousness. We're bound to learn, in many more years' time, just how much we've underestimated our own abilities as humans once again. Ontogeny recapitulates phylogeny (history repeats itself).

There are many examples to illustrate how the human visual system operates differently than, say, an eagle's. One of these examples involves a snowflake, but let me create a new one.
 
Pt4 
You're in an airplane, flying and looking down at all the tiny cars and buildings. You are in a fast-moving object, but distance and speed place you above the objects below. Now, let's pretend that a plane going 100 times as fast quickly flies below you. It was a blur, wasn't it?

Regardless of any object's speed, it maintains a fixed position in space-time. If the plane that just flew by was only going, say, one times faster than you, you probably would have been able to see it. Since your incredible auto-focus eye had been concentrating on the ground before it flew below, your visual cortex made the decision that it was there, but, well, moving really fast, and not as important. A really fast camera with a really fast shutter speed would have been able to capture the plane in full detail. Not to limit our eyes' ability: we did see the plane, but we didn't isolate the frame, we streamed it relative to the last object we were looking at, the ground, moving slowly below.

Our eyes, technically, are the most advanced auto-focus system around - they even make cameras look weak. Using the same scenario with an eagle in the passenger seat, the eagle, due to its eyes only using rods and its distance to its visual cortex being 1/16 of ours, wouldn't have seen as much blur in the plane. However, from what we understand of the visual cortex and of rods and cones, even eagles can see dizzy, blurry objects at times.

What is often called motion blur is really how our unique vision handles motion: in a stream, not frame by frame. If our eyes only saw frames (i.e. 30 images a second), like a single lens reflex camera, we'd see images pop in and out of existence, and that would be really annoying and not as advantageous to us in our three-dimensional space and bodies.

So how can you test how many Frames Per Second we as Humans can see?
My favorite test to mention to people is simply to look around their environment, then back at their TV or monitor. How much more detail do you see versus your monitor? You see depth, shading, a wider array of colors, and it's all streamed to you. Sure, we're smart enough to take a 24-frame movie and piece it together, and sure we can make sense of video footage filmed in NTSC or PAL, but can you imagine the devices of the future?

You can also do the more technical and less imaginative tests above, including the star gazing and this TV/monitor test. A TV running at only 30 FPS that is picking up a computer monitor in the background of its view will, in its 30 FPS output, show the screen refreshes of the computer monitor running at 60 FPS. This also relates to eyestrain with computer monitors, which has everything to do with lower refresh rates, not higher ones.

Don't underestimate your own eyes Buddy...
We as humans have a very advanced visual system. Please understand that a computer, with all its processor strength, still doesn't match our own brain, or the complexity of a single Deoxyribonucleic Acid strand. While some animals out there have sharper vision than us humans, there is usually something given up for it - for eagles there is color, and for owls it is the inability to move the eye in its socket. With our outstanding human visual system, we can see in billions of colors (although it has been tested that women see as much as 30% more colors than men do). Our eyes can indeed perceive well over 200 frames per second from a simple little display device (the number is mainly so low because of current hardware, not our own limits). Our eyes are also highly movable, able to focus in as close as an inch or as far as infinity, and have the ability to change focus faster than the most complex and expensive high-speed auto-focus cameras. Our human visual system receives data constantly and is able to decode it nearly instantaneously. With our field of view being 170 degrees, and our fine focus being nearly 30 degrees, our eyes are still more advanced than even the most advanced visual technology in existence today.

So what is the answer to how many frames per second we should be looking for? If current science is a clue, it's somewhere in sync with full saturation of our visual cortex, just like in real life. That number, my friend, is - well - way up there with what we know about our eyes and brains.

It used to be, well, anything over 30 FPS is too much. (Is that why you're here, by chance?) :) Then, for a while, it was anything over 60 is sufficient. After even more new video cards, it became 72 FPS. Now, new monitors and new display types like organic LEDs and FPDs offer to raise the bar even higher. Current LCD monitors' response times are nearing the microsecond barrier, much better than milliseconds, and equating to even more FPS.

If this old United States Air Force study is any clue, we've only scratched the surface, not only in knowing our FPS limits, but in coming up with hardware that can match or even approach them.
 
Pt5 
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only demonstrates the ability to perceive 1 image within 1/220th of a second, but the ability to interpret higher FPS.

This article was updated: 7/27/2002 due to its popularity and to reflect in more detail the science involved with our eyes and their ability to interpret more than 60 FPS.
 
There Is Much More 
Would you like Pts 6, 7 and 8? 
Sorry.......... 
You can drag up information to back up any story/argument; the best way is to see for yourself. You may not be able to do high-end scientific experiments etc., but you can do what you can do. I know for a fact that I can see/feel/perceive a difference (and a large difference) in Quake going from 72 FPS to 150 FPS and higher. I know for a fact that I can see/feel/perceive a difference in QuakeIII going from 85 FPS to 125 FPS and higher. I will never believe that there is no gain in going above 60 FPS or so, because in my experience I know different. People can quote articles, figures and results till they are blue in the face; if it doesn't match my personal experience, I would be a fool to accept it. Check it out yourself and come to your own conclusions. Remember, the experts once said the world was flat. 
 
too long, didn't read.

just link to the fucking article you're quoting. 
It Would Be The Same Length 
here or there, not my point though 
Dude... 
Post a URL, not the whole fucking article.

http://amo.net/NT/02-21-01FPS.html 
Also: 
Quit being a smartass. It's just framerate; you don't need to post 50 million times to try and prove you're right. 