Tony wants to know five of Leo's favorite books on technology. Leo says that the Steve Jobs biography by Walter Isaacson is excellent. There's also "Hackers" by Steven Levy, about the early MIT computer hackers who created modern computing. "The Soul of a New Machine" by Tracy Kidder chronicles the creation of a minicomputer and gives you the roots. "Microcosm" by George Gilder tells how microprocessors have changed the world as rapidly as they have. And "Weapons of Math Destruction" by Cathy O'Neil.
Dwayne wants to know what we can do to avoid being overly reliant on China to build everything. Leo says that for years, the world, and the US specifically, has looked to China as the manufacturer of the world. But that kind of overreliance can be problematic when relationships become strained, as they did when the US imposed tariffs. Companies like Apple are now looking to move their manufacturing to other countries, even back to the US. Diversification is a good hedge against overreliance.
Jeff doesn't understand the obsession with privacy and why people are so skittish about technology's privacy issues. Leo says that the real issue is how far-reaching technology has become in violating privacy, and the potential hazard of being misidentified. The question is, how far down the rabbit hole will it lead us?
Leo and Dickie D got an email from their past selves, outlining their predictions from 2010 about what the world would look like in 2020. How close were they? Most of Dick's predictions were silly, except for the one that Microsoft would keep releasing Windows and it still wouldn't work right. He predicted Windows 55, but other than that, he was pretty spot on.
Today marks the beginning of the 17th year of the Tech Guy Show, and Leo says it's just plain odd. It's 2020, the distant future is here. But it looks nothing like Blade Runner. In fact, it's not that much different from 2000 or even 1995, except for slightly better technology. Science fiction promised flying cars, living on other planets, and the virtual metaverse (à la Ready Player One). But we're barely starting to crack self-driving cars, and VR has been a disappointment so far.
With what people are calling the end of the decade (it isn't), stories are coming out looking back at the decade in technology, and one rounds up its biggest disappointments. Leo says that social media has been one of the most disappointing stories of the decade, as it never really achieved its promise of bringing us together. In reality, the old saying "familiarity breeds contempt" has proven more apt. Social media has made us more fractured, as people tend to read and embrace only the things they already agree with, reinforcing uniformity of thought.
Bob is a computer guy and he has no patience with people who want to learn computers. Leo says it helps to remember when he was just starting out. Somebody took the time to teach him, so he should pay it forward. Community colleges are also a good way to learn. Local extension courses, user groups, etc. There are plenty of options out there.
Citing the trend in Silicon Valley, where computer executives won't allow their kids and loved ones to use mobile devices, the NY Times came out with an article stating that using mobile phones can basically turn you into a zombie. Leo says that while there is indeed a dark and seedy side to the use of mobile devices, is it any different a concern from the widespread adoption of books in the 1700s, or of TV in the 1950s? For good or bad, this is the world kids are growing up in, and it's important to teach them how to navigate the connected universe.
Josh would like to educate himself on consumer electronics and technology. Leo says that tablets and mobile have really moved into the game, even in the corporate arena, where Bring Your Own Device (BYOD) is a thing now. Voice technologies like the Amazon Echo and Google Assistant are really hitting the mainstream, and with them, so is home automation. Drones are also big. And looking over the horizon, AI is going to be big.
Intel has run up against the limits of Moore's Law, Gordon Moore's observation that the number of transistors on a processor doubles roughly every two years (often quoted as 18 months). In the last few years, Intel hasn't been able to keep up that doubling pace. But a recent breakthrough has created a transistor from a single atom! That could enable processors that are faster and smaller, while using very little energy.
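That doubling compounds quickly, which is why hitting the wall matters so much. A minimal sketch of the arithmetic (illustrative numbers only, not Intel's actual roadmap; the two-year doubling period is the assumption here):

```python
# Illustrative projection of Moore's Law growth (hypothetical figures).
# Assumes transistor count doubles every `doubling_period` years.

def transistor_count(start_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Project the transistor count after `years` of exponential doubling."""
    return int(start_count * 2 ** (years / doubling_period))

# A hypothetical chip with 1 billion transistors today:
decade_later = transistor_count(1_000_000_000, years=10)
print(decade_later)  # 32x growth over ten years at a two-year doubling period
```

Run in reverse, the same formula shows what a stalled cadence costs: stretching the doubling period from two years to four cuts a decade's growth from 32x to under 6x.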