ok I am not good with the jargon, but I do know it says 60hz when the TV is turned on from an off state. I would imagine it's 720p upscaled, but to me it looks way better than anything I would get from my internet.
In the US, pretty much everything electronic runs on 60hz power, and our TV standard (NTSC) was historically tied to that. That's the frequency of the AC current coming from the power plant. In Europe, mains power is 50hz, which is why their TV standard (PAL) runs at 50hz.
FPS means "frames per second," and describes a COMPLETE frame, i.e., if you paused on one, it'd be a still image of what you were watching. 1080i means 1920x1080 interlaced. Interlaced means each "frame" you receive is really a field containing only half the scanlines (odd lines in one field, even lines in the next); thus, you need 2 fields in sequence to get 1 complete frame over 2x the period. That's why 60 fields per second (59.94, technically) works out to 29.97fps. The way this works is that your eyes see the rapid flickering between the two fields of frames 1 and 2 and your brain composites the image, but this can make interlaced video look softer, with less apparent contrast and detail; it also suffers from combing and judder in fast-moving scenes on progressive-scan displays unless the set deinterlaces well.
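To make the field/frame relationship concrete, here's a minimal sketch (using lists of scanline labels in place of real pixel rows) of how one progressive frame splits into two interlaced fields and how a simple "weave" deinterlacer recombines them:

```python
# One 1080-line progressive frame, represented as labeled scanlines.
frame = [f"line{n}" for n in range(1080)]

# Interlacing: field 1 carries the even scanlines, field 2 the odd ones.
field1 = frame[0::2]   # 540 lines
field2 = frame[1::2]   # 540 lines

# Weave deinterlacing: interleave the two fields back into a full frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = field1
rebuilt[1::2] = field2

assert rebuilt == frame
print(len(field1), len(field2), len(rebuilt))  # 540 540 1080
```

Weave only works cleanly when nothing moved between the two fields; motion between fields is exactly what produces the combing artifacts mentioned above.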
The alternative tradeoff is 720p; but, depending on the size of your screen, 1080i may look better than 720p thanks to the extra resolution. The difference in quality depends on the upscaler and deinterlacer your set is using. TVs can also use frame interpolation to smooth out the difference between 30fps and 60fps, but it can look awful. In general, 1080i tends to look better for mostly static, detailed content, while 720p holds up better in fast motion.
The two standards are actually close, information-wise: a single 1080i field is about 1.04Mpx vs 0.92Mpx for a full 720p frame. But the difference comes in, again, how well your set handles the upscaling vs the de-interlacing.
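The arithmetic behind those numbers is just resolution times lines, remembering that a 1080i field only carries half the scanlines:

```python
# Per-field vs per-frame pixel counts for the two broadcast formats.
field_1080i = 1920 * 1080 // 2   # one 1080i field: half the scanlines
frame_720p = 1280 * 720          # one full 720p frame

print(field_1080i)  # 1036800
print(frame_720p)   # 921600
```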
Anyway, you should try to get a hold of the FSO 4Mbit 1080p streams. My goodness, they are fantastic... much better than the crap coming through my cable box. I'm an audio/videophile, so this stuff is kind of important to me.