Jose wants to know the difference between a TV and a monitor. Leo says there are huge differences, mainly that TVs simply aren't sharp enough for desktop use. Monitors are designed for up-close viewing, which requires higher than HD resolution, even up to 4K. TVs tend to be bigger, but the resolution doesn't increase with size, so the pixels are easier to see up close. A TV runs around 100 DPI, while a monitor can be 300-400 DPI and a phone 500 and above. Monitor refresh rates are also faster, often twice that of a TV, especially in gaming monitors.
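The density difference comes down to simple geometry: pixels per inch is the diagonal pixel count divided by the diagonal size. A quick sketch (the screen sizes below are just example figures):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Example figures: the same 4K pixel count on a big TV vs. a desktop monitor.
print(f'55-inch 4K TV:      {ppi(3840, 2160, 55):.0f} PPI')  # roughly 80
print(f'27-inch 4K monitor: {ppi(3840, 2160, 27):.0f} PPI')  # roughly 163
```

The same pixel count spread across a much larger panel is why TV pixels are easy to see from a desk.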
Mike has been watching videos on YouTube and wants to screen grab various shots. But when he does, the result is terribly pixelated, like a multi-generational copy, and Windows just gives him an error. Leo says to open the file with a photo editor and see if it looks better; he recommends IrfanView. Leo also recommends trying to make a PDF of the image and then printing that, since it could be a print driver issue.
Sarafine has images that she puts on a thumb drive, and sometimes they become very pixelated. Leo says that's likely because the image is low resolution, and the metadata doesn't reveal that it isn't as sharp as it looks. Also, converting a vector image to JPEG too early is a mistake, because JPEG is a raster format and doesn't scale. So if you're using vector graphics, converting to JPEG makes it pixelated as you change the size. What you want to do is change the size to what you want FIRST, and then convert it to JPEG.
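A minimal sketch of that resize-first workflow using the Pillow library (the filenames and target size here are hypothetical):

```python
from PIL import Image

# Hypothetical filenames and target size; substitute your own.
TARGET_SIZE = (1920, 1080)

with Image.open("original.png") as img:
    # Scale to the final display size FIRST...
    resized = img.resize(TARGET_SIZE, Image.LANCZOS)
    # ...then flatten to RGB and save as JPEG, so no further scaling is needed.
    resized.convert("RGB").save("final.jpg", "JPEG", quality=90)
```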
Scott says that TV makers are leading the way to 8K TVs now. But the question is, can the human eye even see the difference between 4K and 8K? Scott says probably not. In fact, Warner Brothers tested 130 people and found that most either couldn't tell the difference or found 8K only slightly better, and some judged 4K better than 8K, which could well be blind guessing. People with 20/10 vision sitting five feet from the screen could see the difference, but only slightly. So Scott says we've reached the limit of what the human eye can resolve.
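The claim can be sanity-checked with a bit of geometry: 20/20 vision resolves roughly one arcminute, or about 60 pixels per degree of visual angle. A rough sketch (the 65-inch size and 5-foot distance are assumed figures, not from the study):

```python
import math

def pixels_per_degree(width_px: int, diag_in: float, distance_in: float) -> float:
    """Horizontal pixels subtended by one degree of vision on a 16:9 screen."""
    width_in = diag_in * 16 / math.hypot(16, 9)        # screen width in inches
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return (width_px / width_in) * inches_per_degree

# Assumed setup: 65-inch screen viewed from 5 feet (60 inches).
print(f"4K: {pixels_per_degree(3840, 65, 60):.0f} px/deg")  # ~71
print(f"8K: {pixels_per_degree(7680, 65, 60):.0f} px/deg")  # ~142
```

At those figures, 4K already delivers slightly more detail than 20/20 vision can resolve, which is consistent with only sharper-eyed viewers noticing any improvement from 8K.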
Chuck has a 5K iMac and a second LG 5K monitor, but he can't get the resolutions to match: the window ends up 50% smaller on one screen. Leo says Apple has taken a lot of control away from users. He suggests trying an app called SwitchResX, which will let him choose any resolution and frame rate his monitor supports.
Dan wants to get a VHS to DVD player. Leo says that analog VHS is really low resolution: only standard definition at 480 lines, and interlaced at that. A modern 4K screen has 2160 lines, more than four times as many. VHS also looks dimmer on an LCD, which scans progressively. The DVD side is 480p, a little brighter, and the LCD tries to upscale the resolution; that improves things, but there's only so much it can do. It's really just old technology, and it's time to move on. The reality is that VHS and DVD are both going away, as most people prefer streaming media now.
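For scale, compare raw pixel counts (treating SD as a nominal 640x480 frame, which is only an approximation since VHS is analog):

```python
# Nominal frame sizes; VHS is analog, so 640x480 is only an approximation.
sd_pixels = 640 * 480       # about 307,000 pixels
uhd_pixels = 3840 * 2160    # about 8.3 million pixels

print(f"4K has roughly {uhd_pixels / sd_pixels:.0f}x the pixels of SD")  # ~27x
```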
Arthur uses YouTube and says it's so compressed it's absolutely unwatchable. Leo says that when streaming video, quality is largely dependent on bandwidth: the less bandwidth he has, the lower his resolution is going to be. He can adjust the quality he's getting in the player settings, but at the end of the day, he may just need more bandwidth.
Carl has a 2012 MacBook Pro Retina, and when he connects it to his Vizio, the screen isn't as clear: the fonts are fuzzy and the image quality varies from app to app. Leo says the native resolution of the Vizio may not be one his MacBook understands, so it falls back to a default resolution, which is generally half the native resolution of the screen. He'll need to figure out the native resolution of the screen, divide it by two, and choose the best option based on that; if the Vizio is a 3840x2160 panel, for example, he'd look for 1920x1080. Could updating to El Capitan also be a factor?
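As a trivial sketch of the divide-by-two rule (the panel resolution is just an example):

```python
# Example: find the half-native fallback for a hypothetical 4K Vizio panel.
def half_native(width: int, height: int) -> tuple[int, int]:
    """The 'looks like' resolution a Retina Mac typically defaults to."""
    return width // 2, height // 2

print(half_native(3840, 2160))  # (1920, 1080)
```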
Kevin got the new 4K Roku and wants to know the best way to watch TWiT. Leo says that BitGravity High has the highest resolution, so that's the one he'd choose. Where can Kevin find all the channels listed for Roku? CordCutting.com has a list.
Keith streams on Netflix and it looks terrible at the beginning. Leo says that Netflix uses an adaptive algorithm that starts off at its lowest resolution and then improves once it gauges his bandwidth. If it gets worse over time, though, that means his bandwidth is inconsistent. Since Keith is using wireless, it could be the Wi-Fi dropping or pausing briefly and dropping packets. Keith should wire the TV directly to his router. If that doesn't fix it, he can look for the Quality of Service (QoS) setting, which will let him prioritize the traffic he cares about.
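That adaptive behavior can be illustrated with a highly simplified sketch of a client-side quality picker (the bitrate ladder and headroom factor are invented for illustration; real players also weigh buffer levels):

```python
# Invented bitrate ladder for illustration: (required kbps, label).
LADDER = [(235, "240p"), (1_750, "720p"), (5_000, "1080p"), (15_000, "4K")]

def pick_rung(measured_kbps: float, headroom: float = 0.8) -> str:
    """Choose the highest quality the measured throughput can sustain."""
    usable = measured_kbps * headroom          # leave some safety margin
    choice = LADDER[0][1]                      # start at the lowest rung
    for required_kbps, label in LADDER:
        if usable >= required_kbps:
            choice = label
    return choice

print(pick_rung(800))    # "240p" while measured bandwidth is still low
print(pick_rung(8_000))  # "1080p" once more throughput has been observed
```

If Wi-Fi throughput keeps swinging, the picker keeps stepping back down, which is exactly the worsening quality Keith is seeing.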