HDTV

grumpy

Joined: 2 Jun 99
Posts: 209
Credit: 152,987
RAC: 0
Canada
Message 224246 - Posted: 1 Jan 2006, 17:58:14 UTC
Last modified: 1 Jan 2006, 18:14:23 UTC

A lot of people are buying HDTVs these days, and that is fine because the TV system in North America is changing.
But please be aware that the TV sets available now may not be FULLY HDTV.
Some are HD ready, compatible, capable, etc., so they may not be plug and play right out of the box!
Why, you may ask?
It's because some models sold do not have a fully integrated ATSC/QAM digital tuner.
A lot of them still have an NTSC analog tuner only, so you will have to buy a separate set-top box (STB) tuner in order to receive the true digital stations.
If you do need an STB, be sure to have DIGITAL inputs on that TV, e.g. DVI-D or DVI-I, or HDMI,
not those analog RCA component inputs.
Some people are spending a lot of money on systems that may be obsolete a year from now.
Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 224460 - Posted: 2 Jan 2006, 3:14:02 UTC - in response to Message 224246.  

not those analog RCA component inputs.
Some people are spending a lot of money on systems that may be obsolete a year from now.


Component video is fully capable of transmitting an HD signal from source to destination. Its downfall is that you need to invest in very good cables, e.g. Monster, Kimber, Phoenix Gold or one of the many other good brands, which can be expensive. True, the digital interconnects have the advantage of needing two (or one, in some cases) fewer format changes (digital to analog and back again). But don't think that if you invested in a device that has only component video ins/outs that you're out of luck or got ripped off.

By the way, don't EVER use the crap analog cables that came in the box. You got them for free and they're worth every penny. This is true for both audio and video. It's the same as buying a sports car and using cheap tires on it. The car is capable of a lot more than the tires will allow.

Jim


Without love, breath is just a clock ... ticking.
Equilibrium
David@home
Volunteer tester
Joined: 16 Jan 03
Posts: 755
Credit: 5,040,916
RAC: 28
United Kingdom
Message 224611 - Posted: 2 Jan 2006, 10:19:21 UTC
Last modified: 2 Jan 2006, 10:22:35 UTC

Anybody understand all the different high-def formats?

I have heard about 720p, 1080i and now 1080p. 1080p is the ultimate high-def standard, but no content is currently available because of its bandwidth requirements.

I am puzzled in two areas:

1) Is 720p (progressive) a better bet than 1080i (interlaced)?

2) What is the link between a screen's resolution and these high-def standards? E.g. if a plasma is 1024x768 pixels, will it display both 720p and 1080i OK?

No TV shop has been able to answer these questions. My cable company are planning a 1080i service (we are a bit behind the rest of the world in high def), so I am keen to get a plasma, but I am puzzled by all the different high-def standards.




Stan Ott

Joined: 24 Dec 00
Posts: 5
Credit: 563,572
RAC: 0
United States
Message 224642 - Posted: 2 Jan 2006, 13:11:53 UTC

Just to add a few things here: stay away from the EDTV sets, as their resolution is only around 852x480, while most plasmas show 1024x768 unless you get a huge one in the 50-inch range. I have a 42-inch Samsung plasma and it is GREAT! It does have the digital tuner, which you might not need with a cable hookup, but I run it without the box and still get most of the hi-def channels. It is also CableCARD ready, but I haven't tried that. I also have a hi-def DVR made by Sony which has the digital tuner in it, so no box is needed for most hi-def channels. You do have to manually scan for the HD stations with these tuners, which takes a little time. Goodbye TiVo! Get the hi-def if you can; you won't believe the picture! But do shop around, as prices vary. I got mine at Amazon for $2,200, shipped, while BB and CC were selling it for $3,000.

Heffed
Volunteer tester

Joined: 19 Mar 02
Posts: 1856
Credit: 40,736
RAC: 0
United States
Message 224728 - Posted: 2 Jan 2006, 17:10:56 UTC - in response to Message 224460.  

By the way, don't EVER use the crap analog cables that came in the box. You got them for free and they're worth every penny. This is true for both audio and video. It's the same as buying a sports car and using cheap tires on it. The car is capable of a lot more than the tires will allow.

Same goes for digital interconnects. Don't be fooled that digital is simply ones and zeros and therefore doesn't need quality shielded cables. Sure, digital is a tad more robust, but it is still susceptible to RFI, which can cause jitter (a disruption of the clock signal). Digital is very finicky about its clock signal. Any inconsistency in the clock makes syncing the sending unit with the receiving unit problematic. This of course results in signal degradation.
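
To put a rough number on it, here's a little Python sketch. The 2 ns RMS jitter figure is only an assumed illustration, not a measurement of any real interconnect; the point is just that timing error at the clock turns directly into amplitude error, i.e. noise.

```python
import numpy as np

# Sketch: sample a 1 kHz tone with an ideal clock and with a jittered one.
# The receiver assumes the ideal sample instants, so any timing error
# shows up as amplitude error (noise). All values are illustrative.
fs = 48_000                                   # sample rate, Hz
f = 1_000                                     # test tone, Hz
n = np.arange(4096)

ideal_t = n / fs                              # perfectly spaced instants
jittered_t = ideal_t + np.random.normal(0, 2e-9, n.size)  # ~2 ns RMS jitter

ideal = np.sin(2 * np.pi * f * ideal_t)
received = np.sin(2 * np.pi * f * jittered_t)

err = received - ideal
snr_db = 10 * np.log10(np.mean(ideal ** 2) / np.mean(err ** 2))
print(f"SNR limit imposed by jitter alone: {snr_db:.1f} dB")
```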

Fiber optics are a good choice, as there is no electrical connection possible between units, so a ground loop (usually manifested as a 60-cycle hum) won't be possible unless the equipment is mounted on rack rails and not floated properly (using nylon shoulder washers and rubber gaskets on the rail face), or the feet have been removed and chassis is sitting on chassis. Fiber is also immune to RFI. However, quality applies to fiber as well. Cheap plastic cables (generally supplied with the unit) are cloudier than actual glass, and are only capable of a lower polish at the terminations, thus hindering the transmission of light.
grumpy

Joined: 2 Jun 99
Posts: 209
Credit: 152,987
RAC: 0
Canada
Message 224729 - Posted: 2 Jan 2006, 17:12:36 UTC - in response to Message 224460.  

not those analog RCA component inputs.
Some people are spending a lot of money on systems that may be obsolete a year from now.


Component video is fully capable of transmitting an HD signal from source to destination. But don't think that if you invested in a device that has only component video ins/outs that you're out of luck or got ripped off.
Jim


Component is capable of transmitting an HD signal, as you said. The problem is how it's transmitted: analog or digital. If your device has only analog in/out, or you do a conversion from digital to analog, you are not getting the full benefits of true digital, that is: bandwidth, content, speed, resolution. D/A conversion adds an extra operation, and the problem with analog is that it's subject to all kinds of interference. Analog systems cannot handle as much information or speed; that's why the system is changing.

Cable quality is very important for the best digital viewing. I know, I sell them! But don't expect to improve your analog signal to the level of digital. Good cables are not a waste of money, all the more so if you have long lengths.

Every day I see customers come in and buy cables because their brand new HDTV does not give an improved picture compared to their old TV. So they bring in their manuals and ask for better cables. Most of them have many analog inputs/outputs: S-Video, composite, RCA analog component, NTSC tuners. That's OK for now because it's needed for the old VHS decks, DVD players, etc. The NTSC input could give you a 480p EDTV resolution, which is some improvement over SDTV's 480i. A lot of customers come back with their $100 cables complaining they did nothing for them.

So where are the digital inputs/outputs? What I see is one DVI (analog and/or digital) or one HDMI, but no tuners. Wait until they want to plug in all those new HD devices (satellite and cable boxes, HD DVD players, etc.). By the way, you can't plug your HDTV's NTSC tuner into regular cable and see true HD broadcasts; they are not on your regular channels. Most of these so-called HD broadcasts are down-conversions (D/A).

What about HD resolutions? Well, not all broadcasters (networks) will use the same one. There are 18 approved ATSC digital TV broadcast formats. What about sound? It's 5.1-channel Dolby Digital.

So that's why I said some TVs may become obsolete soon; they don't support some or all of the above. We are in a transition phase!
Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 224953 - Posted: 3 Jan 2006, 4:22:57 UTC - in response to Message 224728.  

Same goes for digital interconnects. Don't be fooled that digital is simply ones and zeros and therefore doesn't need quality shielded cables.

So true. It seems strange to me that people will spend thousands on the gear but balk at the proper cables to make them perform to their potential.

Fiber optics are a good choice, as there is no electrical connection possible between units

The trick with TosLink (optical), beyond the plastic/polish issue you mentioned, is that the terminations are almost universally shoddy. I've only found a few manufacturers that offer improved ends that help keep the cable tight and perpendicular. I much prefer coaxial for the tighter fit. Though RFI-resistant and "bend-resistant" interconnects are a bit pricey, they're worth every penny. It's a matter of personal opinion, though. I like the "warm" qualities of coax; others prefer the "brightness" of optical. More than that, my Monster Cable coaxes don't pop out when I pull gear out for cleaning/moving.


Without love, breath is just a clock ... ticking.
Equilibrium
Heffed
Volunteer tester

Joined: 19 Mar 02
Posts: 1856
Credit: 40,736
RAC: 0
United States
Message 224981 - Posted: 3 Jan 2006, 5:35:47 UTC - in response to Message 224729.  

Component is capable of transmitting an HD signal, as you said. The problem is how it's transmitted: analog or digital. If your device has only analog in/out, or you do a conversion from digital to analog, you are not getting the full benefits of true digital, that is: bandwidth, content, speed, resolution. D/A conversion adds an extra operation, and the problem with analog is that it's subject to all kinds of interference. Analog systems cannot handle as much information or speed; that's why the system is changing.

You should probably read up on a few things... Even with digital, you're still hitting the CODEC. Yes, analog is more susceptible to certain interference, but digital isn't immune. Analog is generally more forgiving when harmful interference occurs; often it's manifested as oversaturated colors or slight ghosting. When digital goes bad, it goes bad. In my opinion, pixelization is more distracting than a hue change.

And what's up with the comment that analog systems can't handle the info and speed??? That's digital's biggest drawback! It requires lots of (data) bandwidth and high bit rates, and it needs lossy data compression to try to approximate an analog signal that analog equipment can effortlessly recreate. By its very nature, digital is (audio) bandwidth limited (look up the Nyquist frequency). And don't even talk about resolution. Analog has limitless resolution. Analog makes smooth sine waves. No stepping. Digital recreates a stepped interpretation of a sine wave, with the number of steps set by the sampling frequency of the recorder's converters... Until unlimited sampling frequencies are within reach, digital will always be lower resolution and lower bandwidth.
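
If you want to see the stepping for yourself, here's a quick Python sketch; the 44.1 kHz / 16-bit pair is just the familiar CD format, chosen for illustration.

```python
import numpy as np

# Sketch: a sampled-and-quantized sine versus the ideal one. Bit depth
# sets the step size; the sampling rate sets the Nyquist limit, above
# which nothing can be represented at all.
fs = 44_100                                   # sampling rate, Hz
bits = 16                                     # quantizer word length
f = 997                                       # test tone, Hz

t = np.arange(0, 0.01, 1 / fs)                # 10 ms of sample instants
smooth = np.sin(2 * np.pi * f * t)            # stand-in for the analog wave

levels = 2 ** (bits - 1) - 1                  # positive quantizer levels
stepped = np.round(smooth * levels) / levels  # the "staircase" version

print(f"Nyquist frequency: {fs / 2:.0f} Hz (hard bandwidth ceiling)")
print(f"Step size: {1 / levels:.2e} of full scale")
print(f"Max rounding error: {np.max(np.abs(stepped - smooth)):.2e}")
```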

Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 224995 - Posted: 3 Jan 2006, 6:23:06 UTC - in response to Message 224729.  
Last modified: 3 Jan 2006, 7:04:21 UTC

Component is capable of transmitting an HD signal, as you said. The problem is how it's transmitted: analog or digital. If your device has only analog in/out, or you do a conversion from digital to analog, you are not getting the full benefits of true digital, that is: bandwidth, content, speed, resolution. D/A conversion adds an extra operation, and the problem with analog is that it's subject to all kinds of interference.
Component video was specifically designed for high-resolution signal transfer. Also, it is not an issue of analog or digital component signals; component is analog only. Subject to interference and additional D/A-A/D conversion, true. But that does not automatically mean that it is inferior. Digital signals have their gremlins too. Digital does not = quality. Witness your digital answering machine. Would you listen to music at that quality? Don't get me started on MP3s.

I hope you'll forgive me if I correct you on an issue or two. The bandwidth, content (I assume you mean full signal content) and resolution are absolutely identical between the two formats, allowing for each format's flaws (digital being the worst culprit BY FAR). This, of course, predicates the use of even marginal cables; it's not worth discussing the free-to-$30-for-20-feet range of cables here. I'm not sure what you mean by speed, or even what difference it makes. It's either the speed of light (optical) or very near it (electrical). At the distances we're talking about, the difference is completely inconsequential. In what way would the issue of speed factor in?
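
For what it's worth, the speed question can be settled with a two-line calculation. The velocity factors below are typical assumed values, not figures from any manufacturer's spec sheet:

```python
# Sketch: signal travel time over a 2 m interconnect.
C = 299_792_458                      # speed of light in vacuum, m/s
LENGTH_M = 2.0

for name, vf in [("optical fiber (VF ~0.68)", 0.68),
                 ("copper coax  (VF ~0.85)", 0.85)]:
    delay_ns = LENGTH_M / (vf * C) * 1e9
    print(f"{name}: {delay_ns:.2f} ns over {LENGTH_M:.0f} m")
```

A couple of nanoseconds either way: completely invisible next to the processing delays in the gear itself.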

Every day I see customers come in and buy cables because their brand new HDTV does not give an improved picture compared to their old TV. So they bring in their manuals and ask for better cables... A lot of customers come back with their $100 cables complaining they did nothing for them.
Better cables will only make a difference under the right circumstances. Anyone upgrading from F-pin to composite will see a difference. The same is true going from composite to s-video and from s to component. However, the differences will most often be lost considering the way most folks have their picture controls set. Most people leave the color, contrast, brightness and other settings as they came from the factory. This is most commonly referred to as "torch mode". The subtleties that better cables can bring out are lost in the wash. It's like missing the subtle fingering techniques of a talented violin player from across a crowded bar. TVs are all set at the factory to "bright and eye-catching" in case they are the one that is placed on display in a bright showroom and next to other TVs. Teach your customers how to set their TVs properly and they'll see the difference in the picture. Oh, and don't forget to teach them how to look for the differences they'll be seeing - enhanced detail in shadowed areas, skin that looks real, not sunburned, whites that are white, blacks that are black, etc.
The NTSC input could give you a 480p EDTV resolution, which is some improvement over SDTV's 480i.
Which NTSC inputs are you referring to? There are no progressive (p) video formats in the NTSC system. All NTSC video formats are interlaced, from crappy old VHS through MiniDV. Even DVDs are recorded in interlaced format; progressive video output from them is made possible by circuitry inside the player. Progressive video is only possible through component cables or one of the digital video cable types (e.g. HDMI, DVI), and only if the TV is capable of rendering a progressive image. 480p does nothing for a set that can only render interlaced images. In fact, they'll get no picture at all. Maybe you're talking about someone who might have a progressive-capable source but didn't have a proper TV before to make full use of it?


So where are the digital inputs/outputs? What I see is one DVI (analog and/or digital) or one HDMI, but no tuners.
It's about manufacturers trying to hit an attractive price point. It's the same thing as car manufacturers offering several investment levels (levels within levels, as a matter of fact, with different option packages). It's your job to show them the best available so they'll know what they're missing, and why, if they step themselves down. "Would you like to make sure your new TV is ready for the next few years' technology updates? If so..." (Training session over.)

By the way, you can't plug your HDTV's NTSC tuner into regular cable and see true HD broadcasts...
There's no such thing as an NTSC digital tuner.

What about HD resolutions?
Well, not all broadcasters (networks) will use the same one.
There are 18 approved ATSC digital TV broadcast formats.
There are 18 available formats for very good reason. Each broadcaster is apportioned a chunk of digital bandwidth. It is sufficient to fit one 1080i or 720p video signal and 5.1 DD sound (plus a little more for closed captioning, emergency broadcast data, Second Audio Program (SAP) audio, or even interactive TV(!) in the future). If they choose (and here's where the 18 formats come in) they can carve their bandwidth up and offer 2, 3 or more different channels at reduced resolution. The FCC has provided them with 18 different ways they can divide their allotted bandwidth. This is common now in all major and mid-market stations in the US. They can broadcast three subchannels with different programming in the day ("Who cares if the soaps and midday news are lower resolution? We get three times the advertising revenue!") and in the evening they can reclaim it all for the big game or movie event in full HD and 5.1 Dolby Digital. This realization of potential increased revenue was one of the major factors in finally getting the broadcasters to get their keisters moving on the HD thing after sitting around since 1994.
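
As a back-of-the-envelope sketch of that carving-up: the 19.39 Mbps figure is the standard ATSC 8-VSB payload, but the per-service bit rates below are invented purely for illustration.

```python
# Sketch: dividing one ATSC channel's payload among subchannels.
ATSC_PAYLOAD_MBPS = 19.39            # usable payload of a 6 MHz channel

plans = {
    "daytime (3 SD subchannels)": {
        "news 480i": 5.0,
        "soaps 480i": 5.0,
        "talk 480i": 4.0,
        "PSIP/captions/overhead": 1.0,
    },
    "evening (one full HD service)": {
        "big game 1080i + 5.1 DD": 18.0,
        "PSIP/captions/overhead": 1.0,
    },
}

for name, services in plans.items():
    used = sum(services.values())
    assert used <= ATSC_PAYLOAD_MBPS, f"{name} exceeds the channel"
    print(f"{name}: {used:.2f} of {ATSC_PAYLOAD_MBPS} Mbps used, "
          f"{ATSC_PAYLOAD_MBPS - used:.2f} Mbps spare")
```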
What about sound? It's 5.1-channel Dolby Digital.
BEWARE!! Yes, all ATSC broadcast formats use Dolby Digital for their audio, but that DOESN'T necessarily mean 6 (5.1) channels of sound. The reduced-resolution formats will generally use 2-channel DD. Dolby Digital can be anywhere from 1 channel up to 6 (5.1) channels. Who here bought the first US release of "Top Gun" in glorious Dolby Digital only to find that it was in mono? Guilty. DD is simply the "pretty" name for the algorithm (code) that is used to digitize the sound during recording. We first heard it called AC-3, its proper name, but that was scary and hard to package and sell.

So that's why I said some TVs may become obsolete soon; they don't support some or all of the above.
No way! Composite video has been around for literally decades and we still see it on current models. Sure, analog connections (including component) will be phased out some time in the future, but certainly not anytime soon. There are still millions of people all over the world with source devices that have only analog video outs (not to mention only analog audio outs). For goodness sake, can you imagine the uproar of the consumer base? Imagine the owner of a DVD player who has to trash it and buy another simply because the new TV he wants to buy doesn't have any analog video connections. Or the TV owner who invested tens of thousands in a set over the last ten years who's told he'll have to buy a new one if he wants to take advantage of the latest source device.

Though new technology is embraced MUCH more quickly these days than ever before, people are still buying VCRs and hooking them up with coax cable: the crappiest picture possible from the crappiest source possible, with the crappiest sound possible (mono sound is all you get with coax cable, even if you have a stereo tape, stereo player and stereo TV). DVDs are much more popular and common, but VHS is still around even 10 years after the introduction and popularization of the DVD format.

Please don't tell your customers stuff like this; it's simply not true, regardless of what your boss says. I've been in the consumer electronics and pro audio businesses for 33 years now, and I've heard a lot of well-meaning salespeople spew balderdash to their customers on topics such as this. When asked about it, they generally say they heard some other salespeople talking "and that's what they said". Just as common is the person who saw/heard/read something, misunderstood, and misspoke later. It's almost never meant to be inaccurate, in my experience, and I assume this is the case with you; you took the time to post in order to help people. Rather, it's like the game "telephone", where a message becomes hopelessly scrambled after just a few retellings.

I sincerely hope you don't take this as an attack on you. It certainly wasn't meant to be.

Cheers! Jim



Without love, breath is just a clock ... ticking.
Equilibrium
Heffed
Volunteer tester

Joined: 19 Mar 02
Posts: 1856
Credit: 40,736
RAC: 0
United States
Message 225173 - Posted: 3 Jan 2006, 17:30:26 UTC - in response to Message 224995.  

Please don't tell your customers stuff like this; it's simply not true, regardless of what your boss says. I've been in the consumer electronics and pro audio businesses for 33 years now, and I've heard a lot of well-meaning salespeople spew balderdash to their customers on topics such as this.

Exactly! I'm a recording engineer and made my first digital recording over 20 years ago. I am well acquainted with the failings of both analog and digital mediums, and have encountered many misinformed salespeople over the years.

The simple fact of the matter is that digital isn't anything new. It's been around for decades, and it's still playing catch-up with analog mediums that have been around far longer. Our latest 24-bit/192 kHz Pro Tools HD systems with Apogee converters still don't sound as good as our Studer A820 2" 24-track analog tape machine biased at +6, running at 30 ips with Dolby SR noise reduction. Yet we use our digital systems every day, and only on rare occasions use the analog machines. Why? Convenience on the production end, of course! A reel of 2" tape is bulky. Reels are just over 2 inches thick and 10 inches across, weigh over 5 pounds, cost around $120 and only record 15 minutes of music. I can get a 200 GB hard drive for less money, record several hours at 24-bit/192 kHz, and when I need to take the recording to another room to work on, I simply spin down the drive and carry a book-sized package that weighs less than a pound.

And editing? Digital editing is the stuff dreams are made of! Using a razorblade to cut the chorus out of one take on the 2" master and splicing it into another isn't for the faint of heart. With digital, it's a simple copy and paste. And if the vocalist is lagging behind the band in a few places on the 2", you'll most likely leave it. People have cut out individual tracks and shifted them, but this is just insane. With digital, it's as simple as advancing the vocal a few frames in the offending portion.

And one of digital's biggest strengths is its transferability. With analog, you can't get around the fact that each generation (copy) is noisier; each pass adds more tape noise. (We won't even get into the fact that each time you play the tape, some oxide is shed and you lose a little high frequency.) High-bias tapes and noise reduction help to maximize the signal-to-noise ratio, but the loss is still readily apparent. With a digital generation, if both machines are locked to house sync (a single, stable clock reference to help ensure a jitter-free transfer) and are using high-quality cabling, the copy will be virtually identical to the master. (Regardless of what anyone tells you, it will not be exactly the same. We don't live in a perfect world.)
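
Here's a toy Python model of that difference; the -60 dB per-pass noise floor is an assumed round number for illustration, not a measured tape spec.

```python
import numpy as np

# Toy model: each analog copy adds fresh, independent tape hiss, while a
# properly clocked digital transfer is treated as bit-exact.
rng = np.random.default_rng(0)
master = np.sin(2 * np.pi * np.linspace(0, 20, 96_000))

def analog_copy(x, noise_floor_db=-60.0):
    sigma = 10 ** (noise_floor_db / 20)
    return x + rng.normal(0, sigma, x.size)   # every pass adds more hiss

copy = master
for gen in range(1, 6):
    copy = analog_copy(copy)
    err = copy - master
    snr = 10 * np.log10(np.mean(master ** 2) / np.mean(err ** 2))
    print(f"analog generation {gen}: SNR {snr:5.1f} dB")

print("digital generation: bit-exact in this idealized model")
```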

It's the exact same thing with digital video, and that is the reason for the switch. Not that analog equipment isn't up to the challenge; it's simply too convenient to have a camera with digital outs to feed into your AVID for editing/manipulation. Broadcasters love the ability to shorten the time frame from shooting to broadcast. Film makers? Why bother with the added step of delaying production while your film gets developed when you can get started on rough cuts immediately? Digital video has been around as long as digital audio, but given the sheer complexity and bandwidth required to convert images (an image contains much more information than audio), it was a very long time before any digital video source could cost-effectively produce anything anyone would consider broadcast quality. Even now, the compression algorithms need to be trickier than audio compression schemes. (And yes, MP3 is a horrid format.) These compression schemes are lossy, meaning data is simply discarded as irrelevant. How could something even remotely be considered higher resolution if it's simply throwing bits of the original image away? To put it another way, ever been to an IMAX theater? That's an analog medium. Take your highest-resolution broadcast HDTV output there and project it on that massive screen and see what you think. You'll be less than impressed. And analog not having the speed of digital? Come on... Analog is what you are seeing and hearing! How can you possibly say it isn't fast enough?

Plain and simple: digital is the future. It's not yet better. The convenience of the format far outweighs the fact that it is still inferior to analog technologies far older. Digital has come a long way, but there is too much misinformation going around about its wonders. Digital does not = quality. It's every bit as easy to make a poor-quality digital recording as it is with analog. In fact, in many ways, it's even easier. Overmodulate an analog signal the tiniest bit, and you most likely won't notice it. Slight distortion here, a little oversaturation of color there... Overmodulate a digital signal, and it's more than readily apparent: bursts of horrible digital flatulence. Digital is not a forgiving medium.

So as Jim has said, please don't tell your customers that pure digital is the only way to go because of its speed, bandwidth, content, and resolution, or that analog equipment can't handle those things, because it simply isn't true. It may sell more gear, but it isn't true.
Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 225275 - Posted: 3 Jan 2006, 22:01:32 UTC - in response to Message 225173.  
Last modified: 3 Jan 2006, 22:03:42 UTC

Our latest 24-bit/192 kHz Pro Tools HD systems with Apogee converters still don't sound as good as our Studer A820 2" 24-track analog tape machine biased at +6, running at 30 ips with Dolby SR noise reduction.
Ah, the age-old debate! It was going on even when digital audio was limited to 16 bit/20kHz. Very serious kudos to Apogee for still being the converter of choice. I have to admit I never understood what the debate was. I remember the first time I sat down to seriously compare the two. At the time my facility was the first in Minnesota to have Sound Tools. It was the heyday of the "Minneapolis Sound" with our favorite son Prince and bands like Husker Du, The Replacements, Soul Asylum and Boiled In Lead were already big or on the way. A number of us got together at my studio to make a serious A/B comparison. Well, as serious as my early hodgepodge of gear would allow for. I had my Westlakes by then and the unavoidable Yamaha NS-10Ms, a much-modded and oft-repaired MCI 32 channel in line desk, a Studer A80 refurb, Carver Magnetic Field amps wired with Mogami star-quad cable and Noel Lee's OMC (Original Monster Cable) speaker wire. I'll forgo the details of how we set up the comparison, but suffice it to say that we took every effort to be fair and unbiased. As it turns out, the digital recording played first and it took no more than a few scant seconds of the analog playback to bring us all off our seats and start asking where the beer was. The competition was over that quick. It certainly didn't mean that we weren't all still excited as hell about what this new digital package was going to do for us. In fact, my studio's success was founded on the fact that I could do things no one else in town could do. BUT the serious musicians didn't use it for years as their primary recording medium, only as a tool to fix troubled tracks, etc. I made a MINT doing small band CD pre-mastering to DAT with it. Ahhh, the good old, bad old days (read poor) when there was always a QP in the back room for the bands and me ;).
And editing? Digital editing is the stuff dreams are made of! Using a razorblade to cut the chorus out of one take on the 2" master and splicing it into another isn't for the faint of heart.
A feat I never had the nerve for. And what if someone had dropped the blade (magnetizing it)? No way. I like coffee too much. For guys like me that's when you punch in to a spare and take notes for the mix.

It's every bit as easy to make a poor-quality digital recording as it is with analog. In fact, in many ways, it's even easier. Overmodulate an analog signal the tiniest bit, and you most likely won't notice it. Slight distortion here, a little oversaturation of color there... Overmodulate a digital signal, and it's more than readily apparent: bursts of horrible digital flatulence. Digital is not a forgiving medium.
Digital flatulence ... brilliant! But the day my farts sound like that is the day I start smoking and doing drugs again, 'cause I'll already be dead, I'll just not have noticed yet.

Ever combine the two mediums? I would commonly loop the digital tracks through the Studer to warm them up. Just load a tape and patch the repro to wherever you want it. Not exactly a secret technique but I find that there are schools of thought / preference there.

The recording business has always been a wonderful marriage of art and science. It seems as if we're swinging a lot more toward the science every year. But whatever tools you use to create it's still art.


Without love, breath is just a clock ... ticking.
Equilibrium
Sir Ulli
Volunteer tester
Joined: 21 Oct 99
Posts: 2246
Credit: 6,136,250
RAC: 0
Germany
Message 225294 - Posted: 3 Jan 2006, 22:25:58 UTC

If you are looking for good HDTV demos:

http://setiathome.berkeley.edu/forum_thread.php?id=19193

Greetings from Germany NRW
Ulli

Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 225445 - Posted: 4 Jan 2006, 2:02:46 UTC - in response to Message 224611.  
Last modified: 4 Jan 2006, 2:19:13 UTC

Anybody understand all the different high-def formats?

I have heard about 720p, 1080i and now 1080p. 1080p is the ultimate high-def standard, but no content is currently available because of its bandwidth requirements.

I am puzzled in two areas:

1) Is 720p (progressive) a better bet than 1080i (interlaced)?

2) What is the link between a screen's resolution and these high-def standards? E.g. if a plasma is 1024x768 pixels, will it display both 720p and 1080i OK?

No TV shop has been able to answer these questions. My cable company are planning a 1080i service (we are a bit behind the rest of the world in high def), so I am keen to get a plasma, but I am puzzled by all the different high-def standards.
Sorry Appetiser, I forgot to answer your questions.

1) 720p vs. 1080i is a purely subjective decision. There are caveats, of course. Broadcasters can choose whichever format they wish, but display manufacturers must select what is referred to as the set's "native mode". In other words, the TV must be designed to do one or the other. This means that when a native 1080i TV sees a 720p signal, or vice versa, a format conversion process must take place. As you can imagine, manipulating this vast amount of data can be a tricky process, and the image can suffer from any number of mild to nasty side effects. Keep in mind you can never improve the image; you can only maintain its integrity. This necessitates the incorporation of higher-quality circuitry, which means more dollars invested. So the best advice I can give is to invest as much as you feel comfortable with. There is a direct correlation between dollars invested and quality realized in consumer electronics. Understand that this is with all other things (the quality of source media and devices, cables, and the viewing environment: distance, angle, ambient light) being equal. A cheap plasma display will look like a turd compared side by side with one costing 25 or even 10% more. By the way, there is no such thing as paying a premium for a brand name. Rumors abound, but logic dictates that such a thing would be fiscal suicide in today's market.

The best example I can give you is Pioneer Electronics. They make two lines of consumer plasma displays; regular Pioneer branded and their Elite by Pioneer line. They both use the same plasma panel and yet the Elite models are far superior in overall picture quality. The reason lies in the electronics inside. Everything from the power supply to the video amps to the scalers are superior and consequently result in a superior rendered image.

2) The answer here is linked to the latter portion of my response to #1. Yes, they will show you a nice image. How close that image is to what was sent into the TV is related to the overall quality of the set. No, 720p native sets aren't inherently better nor the other way around. Nor is it true that one format is easier to convert to another. My advice here is to look at a number of TVs showing the same picture and compare them in certain basic ways. This will involve setting the TVs to roughly the same color, contrast and brightness settings, but that's simple to do with the remote. If the retailer won't let you do that walk away. I'll detail below some basic steps to follow when looking at a TV.

The Advanced Television Systems Committee (ATSC) decided, among other things, that HDTV in the US was to be defined by:
- 720 progressively scanned (720p) or 1080 interlaced (1080i) vertical lines of resolution
- a 16:9 (widescreen) aspect ratio, i.e. shape.
Your typical old TV had an aspect ratio of 4:3, meaning it was a rectangle where for every 4 inches of run there were 3 inches of rise. However, *we see in widescreen*, and so the movie industry in the mid-1950s adopted new camera/projector technology that allowed cinematographers to capture a wider and thus more lifelike and engrossing image. There are now several flavors, each with a slightly different aspect ratio, but all are fairly close to 16:9. All hail Panaflex and CinemaScope lenses!!
- 6 (5.1) channels of Dolby Digital encoded sound
- Approximately 2 million pixels for the 1080-line format (720p comes in just under 1 million).
720p is 1280x720, giving us 921,600 pixels; 1080i is 1920x1080, giving us 2,073,600 pixels. A 1280x720 image will look just fine on most displays and absolutely wonderful on a high-quality display. The critical element here is the display's scaler that I referred to earlier. It's the set of chips that "up-convert" or "down-convert" an image to fit the native resolution of the display. This is where the 1s and 0s get manipulated and possibly messed up. In this case the 1280x720 image would remain almost untouched if it is sent from a graphics card through a DVI-D connection, which is why I'm typing this and gaming on a 42" plasma. My living room TV is 1080i native. This means that when I use my theater computer, the TV's scaler must up-convert the vertical lines from 720 to 1080 and the horizontal lines from 1280 to 1920. This sounds like a lot of digital hocus pocus, but the bottom line is that it too looks friggin' amazing. I'm lucky enough to have a really nice set, and that's a big factor in the equation. My best friend has a 52" rear-projection TV that he paid only $1000 for (amazingly, it has a DVI connection), and the same computer and wiring render a nice picture, but noticeably less crisp and realistic. Still, it's a hell of a lot better than any DVD (480 lines, with 520 lines of horizontal resolution on paper and less than 500 on your average sub-$300 US DVD player).
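
For the record, here's that arithmetic in a few lines of Python: the pixel counts are exact, and the scale factors are simply the ratios a 1080-native display's scaler has to bridge for a 720p frame.

```python
# Exact pixel counts for the formats discussed above.
formats = {
    "480-line (DVD-class)": (720, 480),
    "720p":                 (1280, 720),
    "1080i / 1080p":        (1920, 1080),
}
for name, (w, h) in formats.items():
    print(f"{name:22s} {w}x{h} = {w * h:>9,} pixels")

# Scaling a 720p frame onto a 1080-line native panel:
print(f"vertical:   x{1080 / 720:.2f}")
print(f"horizontal: x{1920 / 1280:.2f}")
```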

Yes, the specs tell us that 1080i should be better. But let me give you one of my Golden Nuggets of advice: specs are worthless. They tell you absolutely nothing about how good a TV looks or how faithfully a sound system recreates the original recording. They're an okay place to start but a very poor way to buy anything. For example, the use of Total Harmonic Distortion (THD) as a valuable specification in analyzing the quality of an amplifier was one of the biggest and most outrageous perpetrations of hooliganism ever visited on the consumer public by electronics manufacturers. I won't go into the tech of it, but suffice it to say that it is an easily manipulable specification that gives virtually no indication as to the accuracy of the amplifier.

Here are a few things to do when you look for a TV or adjust your own.

- Look for shades of grey in the shadowed areas by adjusting the brightness control. This will help reveal the set's ability to render a contrasty image (more shades of grey means a more realistic image including more shades of each color). Look for skin that looks like actual skin.

- Turn the color setting WAY down for this. Most TVs show proper color when set to the upper portion of the lower third of the scale.
- Contrast is a tricky one. It controls just that, the spread (scale) of the brightest white on the screen contrasted by the darkest black. Obviously we want a wide spread but this is expensive to achieve and typically the first thing to go on a cheap display.

- Try to look at scenes that have been shot outdoors on a bright day. Let your eyes tell you if it looks real or not. Now, the eyes have an incredible ability to adjust to what they're taking in. For example, turn your TV or display off. What color is it? Is it black or grey or something else? That is the blackest black that display can render. Black is made by simply not lighting up any pixels. Nonetheless, even on a grey TV tube or panel, your eyes will "believe" that they are seeing a deep black. On a display like that, it means your grey scale is severely limited (grey is your deepest black). Look for a deep black screen when the TV is powered off.

- Stand back from the display!! Stand at the distance and angle you'll be sitting at home. Look at the lapel of a man's suit coat. Does it seem to stand out from the body of the coat? Can you even distinguish the lapel? Look at a person's face. Does the nose seem to stand forward from the rest of the face? Look at hair. Does it look like a hair helmet or does it have dimension and detail? Look for loss of brightness and contrast (one of plasma's greatest benefits) and room light reflections.

- Invest in a calibration DVD!!!!!! There are a couple of them out there that are generally regarded as the best. Each one has its fortes and foibles. I won't name them out of respect for the forum rules against advertising. What they allow you to do is calibrate your display to the same brightness, color, contrast and other settings that were used in the filming and mastering of the video. The difference in the picture quality you'll see is OUT OF THIS WORLD!!!. They go for around $30 US and are worth every penny and then some.

Here is what you should generally do with the picture controls to get an accurate image:
- Start by making these adjustments under the lighting conditions that you will commonly watch them. Some TVs let you recall a set of memorized picture settings. I use these to set up a perfect picture for morning, afternoon, evening and late night viewing. Commonly labelled "Sports" "News" "Movie" or something of the sort, they can be changed by you and recalled using your remote control.
- Bring the color control all the way down to black and white and walk away for a cup of coffee or tea. Come back and s l o w l y bring the color back up until skin of any sort looks natural and not overly red. You will be astounded by how much depth you reclaim from this simple adjustment. Your picture won't look so 2D anymore. I recommend using the early street scene in Monsters, Inc., where Sully and Mike are walking to work. Pause on a frame that has bricks, trees and lots of stuff in the far distance. Look for the bricks and trees to pop out and gain texture and depth.
- Leave the tint control alone until you have a calibration disc to use.
- Bring the brightness WAY down. Look for the blackest black your TV can show you. Bring up the brightness slowly, looking for detail in the shadowed areas. There's a trash can (I think) in an alley they pass. Pause there and adjust the brightness to maintain as much deep black as possible while still seeing all of the shadow detail. This will mean you'll have to see-saw back and forth from bright to dark and back again before you find the setting that you like. You should also be looking for a sense of 3D to appear and disappear as you swing from dark to bright. This is the second of the two most impactful settings you can adjust easily on your TV.
- The contrast setting (sometimes referred to as Picture) is tricky too. Each brand tends to send its TVs out with different optimal settings. As you change this setting, look for the same things you did with color and brightness: depth, detail and overall realism.
- Sharpness is tricky since it means different things with different technologies. On CRT-based displays such as traditional tubes and rear-projection sets, sharpness changes electron beam focus. With CRTs, a focused beam of electrons comes down a narrow neck and is bent around to strike the phosphors painted on the back of the tube, causing them to light up. The sharpness control changes the beam's focal point so that it is as tiny as possible in as much of the screen area as possible. It's just like moving a magnifying glass up and down to make the smallest possible dot of sunlight (except that nothing moves in the TV). This is important so that the electron beam hits only the phosphor that it should and not adjacent ones above, below or to the sides. When set correctly, you'll see much more depth in your picture. Most displays end up with sharpness almost all the way off after calibration, so start low. Sharpness in plasma, LCD and other micro-device displays each means something a little different, so take my advice and bring it all the way down, then up one, two or three ticks.
- For any source other than VHS, turn off Velocity Scan Modulation (VSM) or Scan Velocity Modulation (much more accurately named but all the same). There are exceptions. Top quality TVs allow for 5 or 6 VSM levels for varying quality signals as well as personal taste. If your TV has this feature start with Low or Mid for VHS, Low or Off for most cable TV, Off for Direct Broadcast Satellite (DBS or small digital satellite dishes), any setting that works for standard (non-HDTV) off-air broadcast from an antenna, and definitely OFF for HDTV from any source. These may change after you use a calibration disk.

These were most of the broad strokes. There is a lot more to it, including hiring an Imaging Sciences Foundation (ISF) or other certified expert to come to your home to calibrate your set(s). If you've got the dough, that's the way to go without doubt. Between $300-$450 US per input is fair and so well worth it it's difficult to express. Just be sure to check references and certifications. It's easy if the dude's legit. If they make it tough, of course, hang up the phone.

I hope this has helped some of you.

Ciao -

Jim

Without love, breath is just a clock ... ticking.
Equilibrium
Heffed
Volunteer tester

Joined: 19 Mar 02
Posts: 1856
Credit: 40,736
RAC: 0
United States
Message 225589 - Posted: 4 Jan 2006, 4:44:56 UTC - in response to Message 225275.  

A feat I never had the nerve for. And what if someone had dropped the blade (magnetizing it)? No way. I like coffee too much. For guys like me that's when you punch in to a spare and take notes for the mix.

Unfortunately, if that's the way the client is used to working and they insist, there isn't much you can do about it. But I hated doing that more than anything. Mixes on 1/2" or 1/4", no problem. I was an editing madman, sitting there with pieces of tape labeled and taped to the wall so I wouldn't forget where I put them and get the song assembled out of order... Ah, the memories... But editing the 2" was something that would make me break out in a cold sweat. And yes, a freshly opened razorblade that you had thoroughly degaussed was the only way to go. Nothing like making a splice and playing back, only to hear the dreaded pop of a magnetized razorblade. Ugh... I was amazingly happy when I saw what digital could do. Yes, I remember Sound Tools well. We were actually a beta facility for Studer Dyaxis. We had the world's largest system at the time: two huge (at the time) 600 MB HDs, and two I/Os giving us four inputs and outputs. Although we were to discover that, regardless of what the techs at Studer told us (they were learning too, and nobody had tried what we wanted to do), you couldn't use all four inputs at once, so using it as a recorder was somewhat limited. (After we discovered that, the techs couldn't figure out why we could possibly want to record four tracks at a time.) That, and you could only record 30 stereo minutes, because unless you were doing really simple edits (such as simple album assembly), you needed to render the output, which took the same amount of disk space as the recorded material. You also needed to back up and totally reformat the drive between sessions, because, as we found out, it would pick random pieces of a previous project and stick them in the current project. That was more than annoying... We ended up splitting the I/Os and creating two separate mastering systems. Even with the headaches back then (and I'm glad things are much better now), the advantages outweighed the drawbacks.

Ever combine the two mediums?

Absolutely! As I mentioned, even a single bit of overmodulation with digital gives you nastiness, so we would regularly bounce stuff to tape and hit it a bit hard to get that crunchy tape-saturation sound. Also, tape delays. And digital signal processing has yet to achieve the effect of reel flanging. Also, trying to VSO something in the early days of digital was a death sentence. (You could get some pretty cool space-monster effects, though!) So we VSO'd using tape.

The recording business has always been a wonderful marriage of art and science. It seems as if we're swinging a lot more toward the science every year. But whatever tools you use to create it's still art.

Absolutely. It is more of a science to record these days, but if you do your job right, listening unlocks the art. :)

And back somewhat to the topic at hand (I feel a bit like I'm beating a dead horse, but I have to say it...): superior formats are quite often edged out of the marketplace, either by customer demand or by pressure from broadcasters (the convenience issue I mentioned in an earlier post). Remember Sony Betamax? It disappeared from the public eye rather quickly because VHS already had a stranglehold on the market. People simply couldn't tell, or didn't understand, that it was a superior format to VHS in just about every way. Subsequently, it was the transport of choice in professional circles for years. And even though DigiBeta is quite popular, the workhorse Beta machines are still being used today, and going strong. Too bad for the consumer. :(

So really, even though it may sound like I'm anti-digital, I'm not. I use it every day. It's simply not the be-all, end-all that marketers and many salesmen would like you to think it is. Analog is more than up to the task, and properly connected and configured, I guarantee you won't see any image degradation if you choose to use component video inputs instead of digital.
Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 225602 - Posted: 4 Jan 2006, 5:00:36 UTC

I enjoyed the reply Heffed. Happy mixing and Joyful crunching!

Jim

Without love, breath is just a clock ... ticking.
Equilibrium
David@home
Volunteer tester
Joined: 16 Jan 03
Posts: 755
Credit: 5,040,916
RAC: 28
United Kingdom
Message 226569 - Posted: 5 Jan 2006, 23:20:50 UTC - in response to Message 225445.  

Sorry Appetiser, I forgot to answer your questions.

....

I hope this has helped some of you.

Ciao -

Jim


Many thanks Jim, your reply cleared up a lot of confusion I had.



David@home
Volunteer tester
Joined: 16 Jan 03
Posts: 755
Credit: 5,040,916
RAC: 28
United Kingdom
Message 227460 - Posted: 7 Jan 2006, 10:17:46 UTC

A report from CES (the International Consumer Electronics Show) said there were 1080p TVs everywhere!


Darth Dogbytes™
Volunteer tester

Joined: 30 Jul 03
Posts: 7512
Credit: 2,021,148
RAC: 0
United States
Message 227494 - Posted: 7 Jan 2006, 12:49:06 UTC

I have a Samsung 30" CRT version which was quite reasonable in cost and has a better picture than the LCDs at a fraction of the cost. Like high-end audio devices, most consumers really do not notice the difference unless it is profound. My set is fed by a DVI cable and produces a fantastic 1080i picture. Like computers, there is no way to prevent almost immediate obsolescence, so why pay through the nose for cutting-edge technology? Probably in the not-so-distant future, all these HDTVs will be just a new pile of consumer waste when 360-degree 3D comes out (imagine the required bandwidth on that).
Account frozen...
Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 228153 - Posted: 8 Jan 2006, 23:13:36 UTC - in response to Message 227460.  

A report from CES (the International Consumer Electronics Show) said there were 1080p TVs everywhere!
Yeah, true. Now that I/P (interlaced-to-progressive) circuitry has become more common, widespread and usable, as in the Xbox 360, the cost is likely to come down at a fairly rapid pace. Manufacturers have learned that their age-old formulae for predicting R&D cost and marketing cost recovery need to be rethought.

Now we should look forward to sources that are actually encoded in 1080p. HD DVD and Blu-ray discs are the likely first sources. Most high-budget TV shows have been either filmed (a great medium for transfer to HD) or videotaped in HD for the past 5 years, so we also have a nice backlog of shows to enjoy in high resolution.

I suspect it won't be as far in the future as we might think before we "jack in" physically, a la William Gibson's brilliant "Neuromancer", "Mona Lisa Overdrive", "Burning Chrome" and others.


Without love, breath is just a clock ... ticking.
Equilibrium
Jim
Joined: 28 Jan 00
Posts: 614
Credit: 2,031,206
RAC: 0
United States
Message 228157 - Posted: 8 Jan 2006, 23:25:00 UTC - in response to Message 227494.  

I have a Samsung 30" CRT version which was quite reasonable in cost and has a better picture than the LCDs at a fraction of the cost. Like high-end audio devices, most consumers really do not notice the difference unless it is profound. My set is fed by a DVI cable and produces a fantastic 1080i picture. Like computers, there is no way to prevent almost immediate obsolescence, so why pay through the nose for cutting-edge technology? Probably in the not-so-distant future, all these HDTVs will be just a new pile of consumer waste when 360-degree 3D comes out (imagine the required bandwidth on that).
You spent your money wisely, IMO. CRT-based techs are more mature by decades in this application, though plasma and LCD have been around a lot longer than many people realize. I have my plasmas mostly because I won them in work-related contests. They're fabulous, but I am a cheapskate in a lot of ways and would have had a hard time parting with $5000-$10,000 for them (the tax was enough). You can buy them cheaper, but $2000 US only buys you a "basic to okay" plasma/LCD, whereas it buys you a state-of-the-art tube. The pictures are not comparable.

If living room space is not an issue and you sit within 15 degrees either side of perpendicular from your set, mature techs are still the way to go for many people. I might not be saying that next year. Who knows?

The stuff at CES is not generally what's on the horizon. There are other conventions for that. We're seeing stuff that for the most part is going to be on showfloors in the next 1-3 quarters.


Without love, breath is just a clock ... ticking.
Equilibrium