As you can tell by the title of this post, I tend to ignore the common title tropes of the blogosphere. If I were in the business of attracting as many page views as possible, I’d title this post something like “The 3 Secrets of Fabulous People”. But I write for smart, witty (and already fabulous) people like you! So what gives with the title?
Behaim’s Earth Apple
Epistemology is a branch of philosophy that concerns itself with what we know, the nature of human knowledge, and what is knowable (you know?). It challenges our beliefs. How can you be sure that is true? Does our knowledge truly represent reality?
We live in an age where technical and medical leaps forward convince us that we know a lot. We carry a subtle sense of superiority when we think about the rubes from centuries past who thought leeches were an innovative form of medical treatment. Perhaps even those people thought themselves superior to the generations before them who hadn’t yet discovered “leeches 2.0”.
The idea of epistemology has been on my mind after listening to this a16z podcast on the advances in mapping technology. I’ve always had a love of maps. The contrast between “old” 2D paper maps and the digital 2D map on your phone or in your car is, to me, like the difference between reading a paper book and watching a YouTube video. The latter conveys information well and may require concentration, but is passive. Paper maps require active reading and imagination: we convert the information into something useful and have to find our route ourselves rather than having it show up as a magical blue line on a screen.
In the podcast, the guests talked about Martin Behaim’s Erdapfel Globe (“Erdapfel” being German for “Earth Apple”). Created in 1492, it was Behaim’s best approximation for the earth’s layout. Painstakingly created from his many travels as a merchant and mariner, it is the oldest existing globe known today. And if you’re like me, when your brain registered the auspicious year of 1492 you immediately thought of something else that year is famous for (when “Columbus sailed the ocean blue”).
We wouldn’t use Behaim’s globe to circumnavigate the earth today due to a few minor mistakes – for instance, the lack of North and South America. His globe represents Europe as one large landmass, with various islands around it. Japan is too big.
It was with this limited understanding of the world that Columbus took off that same year on what he believed was a voyage to India, and that’s what makes me think about us, today.
The purpose of this post is to help the non-technical understand the term “Artificial Intelligence” as well as related terms like “Machine Learning” and “Deep Learning”.
Since I work in the tech industry, I know our industry’s legendary power to hype a new idea and create new terms. For example, the idea of using off-site computing resources was once known as a Service Bureau – a term that was later supplanted by “ASP” (Application Service Provider), then SaaS (Software as a Service), the Cloud, and so forth. Embedded in these changes in terminology were changes in technology, but for as much as the tech world celebrates the power of the individual, it has a tendency to jump on the latest bandwagon en masse, which brings us to Artificial Intelligence.
A comic book prediction from 1965
You can’t swing a dead cat in the tech world these days without someone using one (or all) of the following terms:
- Artificial Intelligence
- Machine Learning
- Deep Learning
I’m going to keep things pretty high level here. Let’s get started…
When I was at my Military Intelligence Officer’s course in the mid-1980s, our coursework was almost completely focused on encountering Soviet armored divisions storming into Germany’s Fulda Gap, despite the facts that a) that battle had already been thoroughly planned and we weren’t going to add any insights, and b) it was clear by that point that the Cold War was shifting to new venues. Within four years the Berlin Wall had fallen and I was in a war in the Middle East.
Here’s another example of how organizations have been resistant to change: In the hilarious memo below, written in 1935, Col. Hoffman – a member of the Army Air Corps – objects to taking “Equitation” (horsemanship) classes in part because he fails “… to see that horses have any place in the science of aviation”. Unsurprisingly, his request was denied.
It’s not just the military, however. We humans are terrible at change. It frightens us – a fact that deserves some consideration given that change is coming faster than ever.
Luddites were people who smashed machinery they saw threatening their jobs. Now it’s a term used to describe hopelessly backwards-looking people who think they can beat back technology. Nobody wants to be called a Luddite.
When the first railway opened, detractors said that the human body was not meant to travel at 30 miles per hour and could possibly melt at that speed (similar concerns were voiced about supersonic flight several decades later).
The newly-invented telephone was claimed by some to be an instrument of the devil. Versions of this accusation have been voiced about virtually every other new communication method, except for the fax machine – which may in fact have actually been an instrument of the devil (word of the fax machine’s death has not yet reached the HIPAA-regulated medical community: “no we can’t email your health information directly to your personal, password-protected email address, but we can send images of that same medical info to a fax machine located in a high-traffic area at your place of work”).
It’s easy to be smart about bad ideas of the past. But we humans repeat our mistakes. Here are some changes that are coming:
Elon Musk. It’s hard to read any business publication and not run across some story of the Thomas Edison/Henry Ford of our time. I read two particular stories – one where Elon was used to inspire action, and the other where Elon himself pledged to take action. Let’s see what we can learn from both examples.
The first example, curiously, is a BMW employee meeting. As detailed in this article, BMW employees were brought to an airplane hangar and Elon’s sinister face was used to put the fear of …well…Elon into the assembled throng:
“Inside a bright auditorium at an abandoned airfield near Munich, rows of men and women gaze at images flashing by on a giant screen: a Mercedes sedan; Porsche and Jaguar SUVs; the face of Elon Musk. “We’re in the midst of an electric assault,” the presenter intones as the Tesla chief’s photo pops up. “This must be taken very seriously.”
The audience is composed of BMW Group employees flown in for a combination pep rally/horror film intended to make them afraid about the future of the industry. The takeaway: The market is shifting in ways that were unimaginable just a few years ago, and BMW must adapt. The subtext is a recognition that the company has gone from leader to laggard.”
The scene above sounds a bit like the horror scenes force-fed to Alex in A Clockwork Orange, or perhaps the iconic George Orwell-themed Apple television ad introducing the first Macintosh. I like to believe the film was accompanied by some thrilling Wagner music.
- BMW is smart to pick a villain. People respond better when they can envision an external threat (something, sadly, that political leaders know too well).
- If you’re Elon Musk, you know that when the competition is using you to scare their employees you’re doing something right. Not that Elon really cares.
- That said, the truest barriers to established brands’ success in the new era are seldom external. As Clayton Christensen and others have pointed out, existing market share gets eaten in small bites, enabled by business models built on new technologies and business processes.
My guess is that BMW spent quite a bit producing their internal horror flick, flying people in for the pep rally and feeding them excellent German sausages and beer. Meanwhile, back at Tesla….
As I write this I am eating a bowl of lentil soup, which reminds me of a story. Let me set the stage….
It’s winter of 1991, and I’m in Kuwait City. As you may recall, this is when coalition forces entered Kuwait City as part of a multinational effort to evict Saddam Hussein’s forces from the city – aka “Desert Storm”. Although few US forces actually entered the city itself, there were some unique US troops running around the city during and after the short ground war. I was in Civil Affairs – the part of the Army that serves as a liaison between the military and local civilians and governments. Our much cooler colleagues from Special Forces served as liaisons between our military and the local military elements on the ground.
I was part of “Task Force Freedom”, a small task force set up to work through the complex – and impossible to anticipate – challenges that arose during the operation. As the ground war quickly came to an end, our mission turned to getting Kuwait City on the road to recovery.
Our immediate concerns related to basics like security, sanitation and food distribution. The previous months of Iraqi occupation of Kuwait City had been hard on some city residents, so in the early days, along with our other missions, we were in the city making sure large trucks filled with food were distributed to neighborhoods as requested by the Kuwaiti government.
If you read my About Me page, you will see that it ends with this: “Finally, I am an optimist. It’s an exciting time to be alive”.
I thought that now would be the perfect time to revisit optimism – not just as a way of viewing external events and avoiding despair, but as a way to impact events. Let’s start with three well-known characters: Bill Gates, Melinda Gates, and Warren Buffett.
Last week, the Gates Foundation released their annual letter. Since Warren Buffett gave the bulk of his wealth – $30B or so – to the Gates Foundation in 2006, Bill and Melinda addressed this year’s letter directly to Warren. I recommend you read through the letter in its entirety.
In one part of the letter they touch upon why they remain optimistic about many of the major health challenges facing the world. This optimism runs contrary to the rampant pessimism of the moment. Take, for instance, the statistic below:
Bill puts it well here:
“One of my favorite books is Steven Pinker’s The Better Angels of Our Nature. It shows how violence has dropped dramatically over time. That’s startling news to people, because they tend to think things are not improving as much as they are. Actually, in significant ways, the world is a better place to live than it has ever been. Global poverty is going down, childhood deaths are dropping, literacy is rising, the status of women and minorities around the world is improving.”
Optimism (and pessimism) perpetuates itself. While the political world has become practiced in leveraging fear, that approach doesn’t work as well in the private sector, where leaders create outcomes based upon the shared belief and passion of their teams.
Which leads me to a related point: the importance of active optimism, or “applied hope”. Before I get into the provenance of this phrase, let me drop one last (telling) quote from the Gates letter, this line specifically from Melinda:
Every now and again I like to provide examples of cool things that are happening. Without further ado, let’s dive in….
This might be my 3rd grade picture
Say No to Bifocals. The reason people don’t usually see me with eyeglasses isn’t because I have perfect eyesight. It’s because I wear contact lenses made for the preposterously near-sighted. Lately, however, I’ve been wearing reading glasses in restaurants, my theory being that restaurants have recently decreased both their lighting and the font size used in their menus and receipts. It must be their fault, somehow.
Researchers at the University of Utah have developed eyeglasses that can sense whether you’re looking at something near or far and adjust the lens strength accordingly. A small infrared sensor embedded in the bridge of the glasses detects where your eyes are focusing, and the glasses can alter the lens correction within 14 milliseconds.
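The article doesn’t publish the control logic, but the optics behind the idea are simple enough to sketch. The toy Python below is my own illustration, not the Utah team’s algorithm, and the prescription numbers are made up: to focus on an object d meters away, the glasses need roughly 1/d diopters of extra power on top of the wearer’s distance correction.

```python
# Toy sketch of adaptive-lens arithmetic (illustrative only, not the
# Utah team's actual algorithm). Rule of thumb: focusing at a distance
# of d meters requires about 1/d diopters of added lens power.

def lens_power(distance_m, base_prescription=-4.0, max_add=3.0):
    """Return total lens power (diopters) for an object at distance_m.

    base_prescription: wearer's distance correction (hypothetical value).
    max_add: cap on near-vision add, similar to a progressive lens.
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    add = min(1.0 / distance_m, max_add)  # accommodation demand, clipped
    return base_prescription + add

# Reading a menu at 40 cm needs about +2.5 D of add:
print(round(lens_power(0.4), 2))    # -4.0 + 2.5 = -1.5
# A distant road sign needs essentially no add:
print(round(lens_power(100.0), 2))  # about -3.99
```

The hard engineering problem, of course, is doing that adjustment physically in 14 milliseconds; the arithmetic above is the easy part.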
One potential advantage is that users will only need to change their settings as their prescriptions change.
The trick here is not so much the Big Discovery, but turning the innovation into a mass-market product. Despite my poor eyesight, I see this as inevitable.
But the big breakthrough here is that unlike chess, poker is a game where participants have imperfect information. You only know your hand – not the other person’s. The resulting computing task is therefore more difficult. (Fun fact: the developers leveraged the game-theory work of Nobel laureate John Nash, he of A Beautiful Mind fame).
The speed of this breakthrough stunned even the experts. Said one scientist: “Such an event was prognosticated to be at least a decade away.”
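Modern poker bots are widely reported to build on an iterative technique called counterfactual regret minimization, whose core loop is “regret matching”: play each action in proportion to how much you regret not having played it in the past. The sketch below is my own toy illustration, not the researchers’ code: it runs regret matching on matching pennies, about the simplest zero-sum game of simultaneous, hidden choices, and each player’s average strategy approaches the 50/50 mixed Nash equilibrium.

```python
# Toy regret-matching demo (illustrative only, not a poker bot).
# Two players repeatedly play matching pennies; their time-averaged
# strategies converge toward the mixed Nash equilibrium (0.5, 0.5).

def payoff(a0, a1):
    """Payoff to player 0: +1 if the coins match, -1 otherwise."""
    return 1.0 if a0 == a1 else -1.0

def strategy(regrets):
    """Regret matching: play actions in proportion to positive regret."""
    pos = [max(r, 0.0) for r in regrets]
    total = sum(pos)
    return [p / total for p in pos] if total else [0.5, 0.5]

# Give player 0 a deliberate bias toward heads, so there is
# something for the algorithm to correct.
regrets = [[1.0, 0.0], [0.0, 0.0]]
strategy_sum = [[0.0, 0.0], [0.0, 0.0]]

for _ in range(10_000):
    s0, s1 = strategy(regrets[0]), strategy(regrets[1])
    # Expected payoff to player 0 under the current mixed strategies.
    ev0 = sum(s0[a] * s1[b] * payoff(a, b) for a in (0, 1) for b in (0, 1))
    for a in (0, 1):
        # Regret = value of always playing `a` minus the value we got.
        regrets[0][a] += sum(s1[b] * payoff(a, b) for b in (0, 1)) - ev0
        regrets[1][a] += sum(s0[b] * -payoff(b, a) for b in (0, 1)) + ev0
    for p, s in ((0, s0), (1, s1)):
        for a in (0, 1):
            strategy_sum[p][a] += s[a]

avg0 = [x / sum(strategy_sum[0]) for x in strategy_sum[0]]
print([round(x, 2) for x in avg0])  # near [0.5, 0.5]
```

Real poker systems run a far more sophisticated version of this loop over billions of game states, but the principle – learn a mixed strategy by minimizing regret – is the same.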
As we hear about automation changing the face of work, consider the future for the neighborhood pizza worker.
I’d also like a glass of Chianti with that…
Zume Pizza, a startup in Mountain View, CA, is imagining a different future for how your pizza will be made and delivered.
According to this article, Zume envisions a future where machines and robotic arms press the dough, spread the sauce in “near perfect circles”, add the ingredients, and slide the pie into an 800-degree oven for the first few minutes of baking. The pie is then transferred to a delivery vehicle that finishes the baking process in on-board ovens on its way to your house – a baking process timed to complete exactly as the vehicle arrives at your location.
Suddenly I’m getting hungry….
As you might imagine, Zume currently spends far less on labor costs than Domino’s or McDonald’s. They are leveraging that lower cost basis to provide higher pay and full benefits to their fewer employees, but I suspect that sort of California idealism won’t last long as the model expands in the market and new competitors enter. In just the past few months McDonald’s has increased their roll-out of automated ordering kiosks, which many (including McDonald’s executives) say is a response to the recent efforts to increase the minimum wage to $15 in various cities.
Given the rise of driverless cars and trucks, I absolutely envision the day when we will order our pizzas using convenient mobile apps or voice assistants, get an alert when the truck is in front of our house, walk to the (driverless) truck to tap in the code that appeared with the alert, and get the boxed, piping hot pizza we ordered right there.
Probably all of us remember that one English class (or composition class or whatever your school system called it) where the teacher was a stickler for good content and rigorous adherence to the published school writing policy. We learned to construct our arguments following accepted formats, demonstrate the difference between our work and the work of others, and to cite the sources we used by including footnotes and bibliographies which listed the metadata (tech term!) associated with each source – the author, title, publisher, the relevant pages, etc.
It turns out that in our business culture, citing your sources – or, more importantly, knowing them – may be a lost art, but it could separate the serious from the unserious.
Since I am always interested in how the world is changing and how globalization is affecting much of that change, it was with interest that I read this article about a town you’ve never heard of: Veles (population: 45,000) in the former Yugoslav Republic of Macedonia.
In the recent US presidential elections, much was written about the proliferation of “fake” news sites, which spin fake stories to excite partisans and drive social media shares, clicks, and ad revenue. It turns out that the town of Veles specialized in this cottage industry during the recent US presidential election. Here is a quote from a creator of just one of the 100 fake news sites in Veles:
This is going to make me sound like a geek, but this past weekend I spent a few minutes flipping through an old college economics textbook. I didn’t keep many of those textbooks, but I enjoyed the class and always thought having a book on the Evolution of Economic Thought (the book’s title) might be a good thing to add to my then very insignificant personal library.
I spent a bit of time in the section about Thomas Malthus, a British economist in the late 18th century. If you were to associate one word with Malthus, that word would be “overpopulation”. His name is tied to that word still today. Just to show that Malthus’s name has leapt from dusty economics textbooks into current usage, look at some of the coverage of Tom Hanks’ new movie “Inferno” (“Tom Hanks Endorses ‘Malthusian Theory’ of Overpopulation”).
Malthus viewed populations as growing geometrically but food production only increasing arithmetically – therefore, he predicted, the growing world population would lead to mass starvation.
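Malthus’s arithmetic is easy to sketch. The toy simulation below uses purely illustrative numbers (population doubling each 25-year generation, food gaining a fixed increment): even when food starts with a tenfold head start, the geometric curve overtakes the arithmetic one within a few generations.

```python
# Malthus's argument in miniature, with made-up illustrative numbers:
# population grows geometrically (doubling per 25-year generation) while
# food supply grows arithmetically (a fixed increment per generation).

population = 1.0   # relative population, arbitrary units
food = 10.0        # food supply starts with a tenfold head start
generation = 0

while population <= food:
    generation += 1
    population *= 2    # geometric: doubles every generation
    food += 2          # arithmetic: fixed gain every generation

print(generation)                 # prints 5
print(generation * 25, "years")   # prints: 125 years
```

Five generations, on these numbers, is all it takes – which is exactly why the prediction felt so inescapable, and why what actually happened next is so instructive.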
Malthus would be shocked to see that some of the chief health challenges in the populous and industrialized world of today are more associated with obesity than starvation. The fact that only about 2% of the United States’ 320M citizens work in farming is a rebuke to Malthus’s doomsday predictions.
What Malthus missed is the rise of technology and productivity. And while we might forgive Malthus for this blindspot (although not, in my mind, many of his disturbing proposed solutions to the overpopulation he so feared), it is more difficult to forgive it in ourselves. Yet I think we continue to have it – and I’m no exception.
In the era of the PC, it was hard to imagine what else there was that could be invented and create such a societal impact. Ditto the iPhone and the rise of the mobile era. Even as Moore’s Law continued, we humans have had a hard time envisioning how the future is likely to change drastically from the present.
The proliferation of personal devices has perhaps blinded us to where real innovation happens. We mistakenly conflate newer and more spectacular devices with actual innovation (see here), but realistically our personal productivity hasn’t advanced as quickly as our vibrating and chirping personal devices might have us believe.
Here are a couple thoughts to consider as we look toward the future: