From the Author

Since launching this book in 2017, I've had this conversion project on my to-do list. If you're new to me or the book, you're in for a ride. When I started writing this, I was a digital advertising leader; by the end, I had burned out of the industry. Nowadays I'm publishing how-to guides for networking equipment. The juice was worth the squeeze. I'm making this available for free so you might gain some direction or perspective on the world we find ourselves surviving. The themes covered in this book underpin many of the earth-shaking events that increasingly take place online. May yours be a "good one".

Best,
Corey

Table of Contents

  1. Chapter 1 - Introductions
  2. Chapter 2 - Buckling In
  3. Chapter 3 - Artificial Intelligence in Flux
  4. Chapter 4 - Big Data Fusion
  5. Chapter 5 - What Can Go Wrong
  6. Chapter 6 - What Can Go Right
  7. Book front matter - dedication, acknowledgments, legal

Chapter 1 - Introductions

“Your future is whatever you make it. So make it a good one.”
― Doc Emmett Brown

Welcome to the future! I'm glad you've made it. Here's what you need to know: we haven't yet gotten around to inventing flying cars, futuristic metallic clothing, or shark-sized interactive 3D holograms.

The first time any of us saw the film Back to the Future Part II, it was a jaw-dropping vision of the future. Well, our presumed future anyway. Marty McFly zipped through space and time to 2015 and was treated to a grand exhibition of an incredible new world built on new and exciting technologies. Audiences across the world were sold hook, line, and sinker. We banked on the notion that in the future we'd all be riding around on hoverboards and eating healthy meals cooked in a matter of seconds. While the film portrayed a veritable World's Fair of what's next in technology, if you look around, 'the future' isn't always readily visible. As for everyone wearing shiny metallic clothing, perhaps we're just not there yet. However, as we'll find out together, in a great many ways our future turned out to be much more exciting than we could have imagined.

The intent of this book is to help drive technical literacy for everyone who is curious. We are here to demystify the tech world of today as well as illuminate where we may be tomorrow. While reading this book, you'll notice that I make frequent use of the term 'we.' My use of 'we' is shorthand for you and me, as well as the stakeholders in the technology at hand. To deliver this message in a manner that's comfortable, wherever possible I aim to write as if you and I were sitting in the same room having a one-on-one conversation. We'll peel back the curtain of technology to reveal the good, the bad, and the awesome. We'll achieve this by building up core concepts as a primer before they become applicable, in turn providing insight into the technological and scientific developments we'll review. We all have a collective stake in understanding their ramifications and place in our lives.

If society accepts technology as a component of our culture, our societies will benefit from the value of an informed citizenry. Throughout this book, we'll collect and examine evidence, building a case for an overall optimistic view of technology, albeit with caveats. Once we accept technology's place in society, we can examine how these advances will affect our day-to-day lives. This book will be an optimistic and educational primer on the amazing developments happening today in science and technology; meanwhile, we'll acknowledge the risks presented by these new developments, and the potential options for reducing or dampening their negative effects. Important to note: when we speak about technology, we'll be referring to it without taxonomic convention, inclusive of all types - biotechnology, agriculture, banking, nanotechnology, media, military, travel, even waste technology. Technology is pervasive and foundational throughout our entire lives. I'm saying that the separation between animals and us is that we use technology, enabled by our opposable thumbs. There is them and us - and dogs and cats are along for the ride too, apparently. Our entire families are going through this ride, great Scott! Let's do what we can to make sure it's a good one.

Technology advances through the practical application of scientific principles to the improvement of societal tools. For prehistoric humans, conscious perception was provided by expanded 'cognitive ability,' developed over generations and generations. Cognitive ability refers to the brain's capacity to tackle tasks ranging from simple to complex. The senses themselves developed into the eyes and ears we know and love today. Through generational and genetic variations, people developed differently or with minor defects - an example being color blindness. As brain size and function increased and developed, prehistoric humans were able to observe and reflect upon the world around them, enabling them to shape the environment to their collective benefit. This began with the use of rocks to smash objects or animals. Early humans later observed the flaking nature of rocks when smashed against each other, and used it to develop sharp edges. Perhaps the first true form of technology, the sharpened rock certainly proved a useful tool in the dim, harsh environments that pervade prehistory. Later tools included innovations like using fire to harden the tips of wooden sticks, which allowed prehistoric humans to churn the soil efficiently and yield more productive crops.

It's not hard to imagine the dual use of these fire-hardened sticks in this very violent period in our history. Life was brutal and short. However, it turned out that poking both the ground and other animals proved to be an intuitive use of technology. No training needed: just point it in the direction of the ground or an animal, and push. Lucky for us, poking animals with sticks was especially good for business. You poke a rabbit with a stick and all of a sudden you can eat it! It must have been a revelation for early humans. Neanderthals and Homo sapiens must have felt what Americans feel when they enter Costco, surrounded by beautifully expansive consumerism! You see something over there that's furry, smells delicious, and has four legs - let's eat it! Eventually we figured out which animals made for better friends than BBQ, and kidnapped ourselves some pets. Of course nowadays, if you poke someone, you'll just get unfriended on Facebook.

Hunting with fire-hardened pointed sticks provided more fat-laden meat, which was a driving factor in increased cognition. Over time, the body began to devote more and more energy to the brain, especially when compared to other primates. Primates, on the other hand, evolved to devote more metabolic energy toward muscle use,1 which is why a chimp can beat Arnold Schwarzenegger2 at arm wrestling. While other primates continued to be the 'jocks' of high school, we humans were relegated to the 'nerd' table. We've been working on our social skills ever since. Meanwhile our brain went on to consume 20% of our body's total energy,3 a great investment I'd wager. Our enhanced cognition enabled increasingly complex social hierarchies via language, which unlocked cooperation among non-kin early humans. As stated by Steven Pinker, a noted cognitive scientist, "Language not only lowers the cost of acquiring a complex skill but multiplies the benefit. The knowledge not only can be exploited to manipulate the environment, but it can be shared with kin and other cooperators."4 Pinker goes on to assert that this is because information can be duplicated without loss. If I tell you that banging these types of rocks together sparks a fire, then I haven't lost that knowledge. Food for thought: this same principle applies to information contained on devices across the world - you're always a copy and paste away from recreating text or data without loss. The same principle applies to tools built via software, like Microsoft's Office suite and Adobe's media tools.

What about the tools we have now? How do they further extend our perception of the world around us? The tools at our disposal today are far different and less intuitive than a fire-hardened pointed stick. The culmination of generations of scientific advancement has given rise to technologies that affect every aspect of our lives. The question for technologists, designers, and engineers becomes: how do we create technology-driven solutions to today's challenges? Further, how do we ensure that we create intuitive solutions for the intended stakeholders? These stakeholders may consist of large swathes of people who have only vague ideas of how the technology itself works, or a small community with a unique problem. It is the goal of this book to bring the stakeholders to the table, because increased understanding of the tech world allows us to first clearly define our issues and needs, and then create technical solutions that meet those needs. Your inclusion in this process will help yield a tomorrow that works best for everyone.

My concern with people not understanding technology stems from how technology is portrayed in films, television shows, and most importantly, the news. Few people understand web technology, let alone computer science topics like machine learning. In the absence of understanding, hyperbole often wins out over reasoning and critical thinking. Terminator 2: Judgment Day painted a bleak future where humans are oppressed by autonomous killing machines; The Matrix trilogy painted a similar vision. There's virtually no end to the examples of where technology could and does go wrong in pop culture. These films play on an instinctive fear of technology's possible outcomes - outcomes that appear assured given the human character portrayals and vivid images on the screen. If you were to play word association with most people, as soon as you mention 'robots' they're likely to respond with a word that has a negative connotation. Robots overtaking humans as the dominant species on Earth, while possible, is improbable. The issue is that this mode of thinking becomes the default lens for viewing technology.

There are not enough voices explaining both the promise and the risk of advancing technology. My fear is that given how technology has been portrayed in pop culture and media, our brains have been given shoddy mental shortcuts for thinking about technology. These arrive in the form of heuristics, which enable quick decisions in situations where information is limited or incomplete. Processing all available information about any topic, technology or otherwise, is impractical to say the least. Heuristics speed up the response by allowing mental leaps, easing the cognitive load and energy requirements on our brains. As a result, if you were to approach people on the street about their smartphone's functionality, they're likely to describe it in fantastical, almost magic-like terms, rather than as the structured, prescribed protocols it actually runs on. If culture deems technology magic, we'll never fully understand it, or its place in our lives. To realize technology's potential is to elevate prosperity for all. At the very least, the good news is that the five very odd behaviors below are increasingly rare, if not extinct, thanks to the proliferation of technology:


1. Remembering all of your friends' complete phone numbers

2. Slapping a television in hopes of reducing visual distortion

3. Using house furniture as a store of family wealth

4. Having ice delivered to your home to refrigerate food

5. Having no fewer than five remotes per TV


On the flip side, here are five behaviors that exist now that would seem insane to the people from the first list:


1. Ordering an item online to be delivered an hour later

2. Inviting strangers from the internet to stay in your home for a fee

3. Getting in cars with strangers to get to your destination

4. Video chatting with your doctor

5. Using devices that translate both spoken and written language in real time


On the grand scale of humanity, all ten of the above behaviors might seem peculiar to other generations; changing behaviors like these are a vital component of human social development. They are a part of the ephemera of human culture, each of them representing a moment in time. Like the TV Guide, their only purpose today is to recall how it all used to be. We can recall the environment in which these cultural touchstones occurred for each of us. For kids of the '90s, blowing air into a Nintendo cartridge like a harmonica was just something you did in hopes of playing some video games. Our brains are hardwired to recognize patterns in everything we perceive - behaviors like this developed out of conditioning. We blew into the cartridge and popped it back in; the feedback of whether or not it worked was immediate, and if it failed, we could repeat it quickly. Our brains only needed one success to declare cartridge-blowing a valid method of troubleshooting.

In the most famous example of classical conditioning, Pavlov's dogs, the scientist Ivan Pavlov rang a bell to signal a dog's impending meal. The dog's brain recognized the stimulus of the bell as a precursor to being fed, so the reaction of salivating became natural and involuntary. Thus, every time the bell rang, the dog thought food was on its way. This psychological phenomenon also applies to humans; we use past learnings to inform future behavior.

If people do not understand the technology behind our devices, we'll mistakenly salivate at the 'ringing of every bell.' To be fair, I've blown air into more than my fair share of Nintendo cartridges. That's why I'm here: I want to help you recognize the patterns underlying the technology and to connect the dots. I've watched the internet grow up from AOL community chat rooms into an exciting multi-trillion-dollar driver of ingenuity. From 3D printing to artificial intelligence to augmented reality, we'll examine where these technologies are presently, as well as where they might be in the next five to ten years. Technology has advanced to the point that voodoo superstition will no longer get us by. Because the stakes are too high and too far-reaching, a reframing of our decision-making processes as they relate to technology is vital.

To achieve this, we'll make use of textbook-style tangential breakaways where we sidestep the topic at hand to review the critical concepts at play:

The More You Know!

Vocabulary/Concept Review

OK, Maybe We Are Living in the Future

Who is this book for anyway?

My aim is to approach the material in a conversational manner that is as accessible as possible. In recent years, I've noticed a largely unaddressed gap in educational content aimed at helping mainstream America understand technology. I'm writing this book for the people who feel that technology moves too fast for them to keep up with, for the people who feel like they're missing a train that so many have already hopped aboard. Scientists and researchers can frequently be found on TED Talks or YouTube giving fantastic lectures that are seemingly aimed at fellow scientists and researchers. To an outsider, technological concepts can be confusing or even daunting. Much like in an introductory 101 college course, here you need only bring your curiosity about the tech world around us to gain a firmer understanding of the basics. We'll add jokes to moisten dry material, and visual aids to illustrate where words fail.

To be clear, the material we'll be covering assumes only a base level of knowledge about how technology works. From there, we'll build up the concepts and terminology together, and by the end you should feel empowered with an understanding of how the world spins nowadays.

Fig 1-1. - Spectrum of computer science knowledge, not to scale.

If you happen to be an engineer, this might all read like yesterday's news to you. Furthermore, you'll find that this book may lack some of the nuance that your expert knowledge affords. It is not my intention to explain concepts in a reductive manner, but we will simplify wherever possible. Generally speaking, engineers are incredibly up to date on their geek news. My hat has always been off to you lot.

While you are reading this book, I’d encourage you to keep your favorite device or laptop nearby to check out some of the topics we cover in a live setting. You may find yourself thinking something sounds interesting enough to check out right then, only to find the device is in the other room. Let’s do our best to keep first-world problems in check by keeping those devices handy!

Buckling in, this crazy ride opens with an examination of how our senses assist our understanding; from there, we'll illuminate the spectrum of technology and its many forms. Throughout, we'll spotlight the good and the bad, and the risk and opportunity therein. This book's purpose is to include any inquisitive mind who desires a peek under the tent of the three-ring circus that defines the world of technology.

At the core of every innovation since the dawn of man has been an intellectual curiosity, a human desire to understand the inner machinations of how our world works. In terms of civil and economic prosperity, this growth has been driven by the women and men dedicating their lives to research in many fields of study. In exploring unknown horizons, we continue to find out new things about ourselves and our place in the universe. Like steps in time, these advances allow us to stand on the shoulders of the generations that came before us.

Roaming groups of nomads assembled into tribes, establishing the rules and hierarchies of society. Later, those tribes would go on to become nation-states, creating more social structure and law for the express purpose of individual prosperity. Eventually, whole nations of people arose, and with them, access to technologies and knowledge. Passed-on knowledge of how to improve skills brought order to disorder. People assembled into groups based on these skills, commonly referred to as guilds, sharing their knowledge and insight. Eventually this led to incredibly intricate machines with hundreds of moving parts, literally and figuratively, as skill sets joined complementary skill sets and advancement arose. This is all because thousands upon thousands of years ago, a short, hairy guy or gal creakily got out of bed, possibly hungover, and felt like phoning in their day by doing more with less.

As years and generations passed, understanding of the world around us progressed. That same fire-hardened stick we discussed earlier would go on to specialize, evolving into tools like the hoe; with these innovations, societies created an even more bountiful harvest. Through a further application of innovation, the hoe evolved into the plow, and thus supported a society of yet more humans. As quaint as it may now seem, the simple hoe was a revolutionary tool. It created agrarian understanding, which allowed early humans to increase in population and social complexity at rates never before seen.

The technology of today enables levels of productivity that would shock previous generations, and the use of which would absolutely terrify our short, hairy prehistoric ancestors. To fully unleash the impact of technology requires the use of levers that are not immediately apparent during the use of things like mobile devices or apps. Let’s dive into discussing foundational concepts that establish a baseline for understanding and leveraging technology.

The More You Know! Hardware Vs. Software

Hardware - the physical components: motherboards, computer chips, etc., which by themselves perform nothing but contain the potential to operate. Think of your muscles.

Software - digital instructions for hardware, from displaying an image to playing a video. Think of both your passive and active thoughts. Your lungs keep pumping air though you do not actively command them to.

The connection between hardware and software is the language they use to communicate with each other. In the vocabulary review below, we discuss the way information is communicated between them.

Vocabulary Review - Bit

When thinking about the ways computers store information, it’s very easy to get overwhelmed. Between ‘gigabytes’ and ‘megabytes’ or ‘gigahertz’ and ‘megahertz,’ they do not make it easy. By the end of this vocabulary review you will feel more comfortable with seemingly abstract concepts. For now, let’s do some learning:

Bit - a binary digit (a 1 or a 0), the most basic unit of information a computer understands. If you could draw a comic-book-style thought bubble over a computer, it would contain nothing but these bits as 1s and 0s.

Think of a bit as a single letter in the alphabet. Even a single letter can invoke meaning. For example, 'I' denotes me as an individual and is communicated by only a single letter. But generally, letters need to be paired with other letters to communicate effectively. The same holds true for bits.

Byte - A collection of 8 bits.

Now that we have a collection of 8 bits or letters, we can communicate actual concepts, like: FLAPJACK - F (1) - L (2) - A (3) - P (4) - J (5) - A (6) - C (7) - K (8)
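If you'd like to see those 1s and 0s for yourself, here's a minimal sketch in Python (a popular, beginner-friendly coding language we'll use for examples). One wrinkle worth knowing: in real-world encodings like ASCII, each letter actually occupies a full byte of its own, so the word FLAPJACK takes eight bytes, not eight bits.

# A minimal sketch: how text becomes bits (runs in any Python 3).
# In the ASCII encoding, each letter occupies one byte (8 bits).
word = "FLAPJACK"

for letter in word:
    code = ord(letter)          # the letter's numeric value, e.g. 'F' -> 70
    bits = format(code, "08b")  # that number written as 8 binary digits
    print(letter, code, bits)   # first line printed: F 70 01000110

print(len(word), "letters =", len(word) * 8, "bits")  # 8 letters = 64 bits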

Bonus point: the number of bits contained in a byte being 8 is as arbitrary as the direction of Donald Trump’s hair. True story.

Kilobyte - 1,000 bytes.

At 1,000 “characters” or letters, we can tell a mostly nonsensical story.

This is the amount of precisely one thousand characters, as you can tell this is much, much more than just a “FLAPJACK”. Pfffff. Yes ma’am (or sir), we are in prime flapjack cruising altitude. Nothing but clouds of powdered sugar atop mountains of (really quite beautiful this time of year) French Toast. I don’t know about you, but I had THE best blueberry pancakes yesterday, it was this great little spot in San Rafael, California called Theresa & Johnny’s Comfort Food. The pancakes... were a stack of two sunny-faced, plate sized rings of perfection. They came pre-buttered. “I can’t even”, as the ladies say. So, needless to say, it was a valiant attempt on my part. Nay, I was not the victor that day. Rather, I was the victor that night! Anyway, so here we are talking about bits and bytes. The point being that this is an order of magnitude more detail (characters) about a single concept. This is simply going from a byte to kilobyte. All of this from a simple one thousand characters. *mostly nonsensical

Megabyte - 1,000 Kilobytes

At the size of a whole megabyte (MB) of data/information, we have the ability to not just talk about the pancakes, but to show a beautiful picture.

Fig 1-2. - Evidence of the existence of god. (Courtesy LivingJoyByZoe.com)

Gigabyte - 1,000 Megabytes

When we get into the gigabyte range, the amount of detail we can convey goes up quite significantly. You could now have thousands of images depicting the glory of flapjacks.

Fig 1-3. - Flapjackagram- coming to an App store near you.

The difference between a megabyte and a gigabyte is that we can now watch a video about making flapjacks in full HD.

Terabyte - 1,000 Gigabytes

With 1,000 gigabytes at your disposal in a storage drive, you’d be able to store a veritable vault of flapjack videos. Some people just fill it with pirated movies and TV shows, about flapjacks of course.

Protip: As you may have noticed, the prefix for storage capacity is the only element of the word that changes. Kilo- becomes mega-, becomes giga-, etc. This continues to be true as the amount rises, reaching all the way up to the yottabyte, which has 24 zeros in it:

 

1 000 000 000 000 000 000 000 000 bytes
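If you'd like to watch that ladder of prefixes climb, here's a tiny Python sketch (a simplification; some contexts use binary multiples of 1,024, but we'll stick with round thousands):

# Each storage prefix is 1,000x the one before it.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]

size = 1
for prefix in prefixes:
    size *= 1000
    print("1 " + prefix + "byte =", format(size, ","), "bytes")

# The final line printed is the yottabyte: a 1 followed by 24 zeros.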

 

The More You Know! Digital vs. Analog

You may already be a digital girl, living in a digital world. If not though, let’s start with the basics:

Analog - represents signals and measurements that exist in the physical world around us. Use a ruler to draw a 3-inch line on a piece of paper and you've conducted an analog measurement. Analog signals can be represented simply as a wave, always in the process of modulating up and down.

Fig 1-4. - The signal perpetually modulates up and down.

Digital Signal - In computing, the tops of the waves represent 'on,' while the valleys represent 'off.' Signals and measurements within the digital world do not exist in our physical world; they're either on or off. You could not put a ruler to a screen and measure the amount of data flowing across it.

Fig 1-5. - The signal is either on, or off.
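For the curious, here's a rough Python sketch of the idea - a simplification, since real hardware does this with dedicated circuitry rather than code, but it shows how a smooth wave can be read as crisp 1s and 0s:

# Sample a smooth "analog" sine wave at regular intervals, then treat any
# sample above a threshold as 'on' (1) and anything below it as 'off' (0).
import math

samples = [math.sin(2 * math.pi * t / 8) for t in range(16)]  # the analog wave
bits = [1 if s > 0.5 else 0 for s in samples]                 # the digital read

print(bits)  # [0, 1, 1, 1, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0]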

Analog signals tend to be vulnerable to interference, which can cause a signal to degrade or lose quality. If your memory goes back as far as tube TVs, you'll remember what static or 'snow' looks like. The distortion may come from a hair dryer, an electric drill, or your neighborhood Doc Brown.

As digital signals are always just on or off, they are not subject to interference in the same way an analog signal is. This means that no matter how much gold-plated crap they put on the connectors, the signal will still be a 1 or a 0. For everyday people it means this -


DO NOT spend a ton of money on expensive cables!


This is one of the biggest scams of the 2000s. If you visited a big box store, they would tell you otherwise, but they’re bamboozling you. Digital is digital, when’s the last time you had to slap an HD television to reduce interference?

Important to note - the official standards for cables like HDMI are updated periodically to reflect current needs in digital signal transmission. This means that when you upgrade your TV set in a few years, you may need to buy new cables if you want to stay with the latest and greatest tech.

In my experience there are many retailers on the web that sell reasonably priced cords and connectors. In the past I’ve used www.MonoPrice.com and have never had a problem.

Getting Around the Web

URL Bar - The 'street address' bar for websites, located near the top of your browser. The URL (Uniform Resource Locator) bar requires commands in a specific syntax, similar to the spelling mnemonic 'I before E, except after C.' The syntax ensures the browser navigates you to the place you expect.


Fig 1-6. - Don’t forget to bookmark your favorite sites in your browser!

Great! Now you know how to find and visit your favorite sites via their street address. Facebook, Google, and Yahoo are all websites that exist at a specific and unique URL, also known as a domain. Domains typically include '.com' as a suffix, but it can be '.anything.'
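If you're curious how a computer sees one of these addresses, here's a small Python sketch that breaks a made-up URL apart (example.com is a placeholder domain reserved for exactly this kind of demonstration):

# Ask Python to split a web address into its named pieces.
from urllib.parse import urlparse

address = urlparse("https://www.example.com/recipes/flapjacks?syrup=maple")
print(address.scheme)  # https - the protocol the browser should speak
print(address.netloc)  # www.example.com - the domain, our 'street address'
print(address.path)    # /recipes/flapjacks - the page within the site
print(address.query)   # syrup=maple - extra details passed along to the site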

Like a business, each website has a 'street address.' When you type it into the URL bar and hit Enter on the keyboard, the browser navigates like a car along the path to your end destination. Much like with a GPS, the calculations for getting there happen without you noticing; it all happens 'under the hood.' Websites can also be communities, which allow people to come together for a common cause and solve problems. "I don't have anyone to talk to about 17th century poetry" or "I need to show someone these cute faces of my grandchildren" - these are just a couple examples of the many problems being solved by the creation of online communities. These communities solve problems, which then attracts other users with the same problem. If the solution is effective and simple to understand, it will naturally gain attention and users.

Anyone who is interested is able to start their own website dedicated to whatever their passion is. Mark Zuckerberg did not need permission from some higher authority to start Facebook. He had a core problem - connecting with people - and so he went on to make his own solution to his own problem. Many websites begin this way: the person has a challenge or issue and creates their own solution. Websites gain in popularity because other people across the web also share in that same 'problem.' Given the nature of his problem of 'connecting with people,' his solution also solved the problem of a great many people - billions, in fact - allowing a global community of people to connect. It is because he solved such a common problem that people sought out his website to use for their own purposes.

Fig 1-7. - There are now websites dedicated to helping people build websites using templates as starting points. EX - Squarespace.com, Weebly.com

My point is that you, too, can start a site. It doesn't matter what it is about; the important thing is to bring what you're passionate about and connect with others. If you're just entranced by the intricacies of knitting, then maybe you feel the need to talk with others about how much of a knee-slapping time you have with knitting. I'd just like to posit my own idea for the URL: StraightOuttaYarn.com, a domain still available to plant your flag in!

Vocabulary Review - Pixels

Pixel - Similar to bricks in a brick wall, pixels are stacked on top of each other to build a display. Each individual 'brick,' or pixel, has the ability to change its color. When the pixels change color in unison they can produce an image, and if they work together and change quickly, they can produce digital video.

Unless you're reading this in an actual printed book, you're looking directly at a lot of pixels. You can actually hold a magnifying glass to the screen and see each individual pixel! They are measured exactly like a typical graph, with X (horizontal) and Y (vertical) axes. You could put a piece of graph paper up to the screen and achieve the same effect, just much smaller in size. When referring to a computer's screen, the pixel counts along each axis are combined to describe the display's technical specifications.

1024 (pixels) x 768 (pixels) = 786,432 total pixels


Fig 1-8. - The number of pixels used in a device becomes a selling point for gadget enthusiasts.

As a frame of reference, my first computer had a maximum resolution of 640x480. Now my smartphone has a pixel resolution of 1,440 x 2,560!
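The math behind those numbers is nothing more than width times height; here's a quick Python sketch that runs it for the displays just mentioned:

# Total pixels are simply width x height.
displays = [
    ("My first computer", 640, 480),
    ("A classic monitor", 1024, 768),
    ("My smartphone", 1440, 2560),
]

for name, width, height in displays:
    total = width * height
    print(name + ":", width, "x", height, "=", format(total, ","), "pixels")

# My smartphone: 1440 x 2560 = 3,686,400 pixels - exactly 12x my first computer!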

Before we really dig into all of the rest, I thought it best to give you an idea of my background so you can understand my frame of reference on technologies. I hold no PhDs in computer science; I did not code a multi-million-dollar operating system by age 16. So, why me? Maybe you'll have a little more fun hanging out with me instead. Over the years, I've built (sometimes with friends) several websites with varying degrees of success, ranging from an open source asset library for indie game developers to a Minecraft gaming community. I've always translated personal pursuits into career learning. In keeping with that notion, I've built my career in advertising around helping people understand and get excited about advanced web technology.

Biographically speaking, I was born in West Germany while my parents were both serving as military police in the United States Army. I'm the youngest of four sons, and grew up in very rural central New York state. When your hometown is not a major metropolitan area, "Where are you from?" becomes an exercise in comparative local geography. In my case, we were located about 15 minutes away from the Baseball Hall of Fame in Cooperstown, NY. I speak the truth when I tell you that my hometown of Van Hornesville is not even truly a 'town' but a geographically separate hamlet within the town of Stark, which has a bustling population of 767 people. Our rural hamlet subsists on agriculture and a get-it-done-yourself spirit born from the secluded environment. It's the kind of town where cashing your weekly paycheck and buying lotto tickets can be regarded as having a dream. We were very rural and very poor, with 20%5 of the population living below the poverty line, a stat nearly equal to New York City's.6 Still, I am reminded of my fondness for the area upon re-visiting and recalling this simple country quirk: everyone waves to each other, whether they recognize you or not. The year I graduated from the public school (Pre-K through 12), there were a total of 13 students in my graduating class. There were only two New York state public schools smaller than ours, one of them being a state school for the deaf.

On warm weekends my extended family would get together with friends; my father and his brothers would play classic rock cover songs for hours while we kids ran wild. My father had taught himself to play guitar by ear. Beginning with simple songs like Roy Orbison's Oh, Pretty Woman, he built up his skills over time into covering rock legends like Stevie Ray Vaughan and Eric Clapton. He leveraged his curiosity about guitars to independently learn at his own pace, and this type of learning has been the bedrock for my own expertise in technology. Unfortunately for my ego, my father's talent for music proved an inheritable trait in appreciation only. I've attempted several instruments in my life and the only one I ever proved 'proficient' in was the tambourine. You may have picked up that there isn't much glory out there for tambourine players. Luckily, I had skills elsewhere.

I'm fortunate in a great many ways. Among those good fortunes was being born to a father who had a keen interest in computers. This interest ensured that throughout the years, and despite our modest blue-collar means, our home would always have a respectably modern computer. When you're a child, every experience is new and fascinating. None of us enter this world with a baseline for any singular concept. We start our early years fragile and dim, and over time, with good health, we experience the range and meaning in life. From the moment the lights upstairs officially flick on for each of us to the end of that line, there are, of course, big events that tend to have a strong impact. Seismic shifts in our path in life hold a clarity of weight when viewed in retrospect. My own profound moment came from the unboxing of our new Packard Bell Navigator. The personal computer at this point was fast becoming the de facto learning tool in schools. Personal computing was a technology that set off a chain reaction of innovations, which in turn set off yet more explosions of groundbreaking technologies.

In my head, this was the undeniable wave of the future: a personal computer with a CD-ROM and a full-color screen interface. This was my own version of A Christmas Story's Ralphie getting his 'official Red Ryder, carbine action, two-hundred shot range model air rifle!' The best part was that I didn't even realize I had desired it. This was the moment everything changed. The PC came bundled with Microsoft's Encarta, which enabled my young brain to satisfy my curiosity about the world outside my small, hilly corner of it.

Fig 1-9. - A multimedia powerhouse and home of rad gaming classic MegaRace.

Windows 3.1 was my first graphical user interface, or GUI, and it was a revelation. There were now icons that launched programs with a double-click of a mouse! No longer did you have to memorize a series of commands to access the floppy disk drive and launch a program. Not to mention the ability to multitask by switching between open programs with Alt + Tab on the keyboard. It was a revolution in computing that even a hillbilly kid could understand. I had my first experience with creating multimedia at around the age of 9 with Macromedia's Director program. Director was a precursor to Flash; through it I learned about keyframes and animating simple objects, which gave me an understanding of how interactive animations were created from that point forward.

Fig 1-10. - Bubbles and Spaceboots.


My first glimpse inside the case of a computer occurred when my brothers and I surreptitiously convinced my father that we needed a $143 upgrade: 4 MB of memory (RAM). We'd claimed we could "do better at school work." Little did he know that it was actually so we could play the game Doom II. We needed the upgrade; otherwise the gameplay stuttered as if we were watching a slideshow. A side benefit was leveraging the added memory to more quickly multitask out of a game and into a calculator when my father walked into the room. I'd always had a love of video games, but this was the first time that I saw computer hardware and a better gaming experience having a direct correlation to each other. That same 4 MB of memory now costs a single penny.

I considered the factors that go into a 'positive gaming experience,' which in a technical sense often means higher frames per second (FPS), giving the appearance of smooth motion. Frames per second is a hotly debated topic in the gaming community, but the general consensus is that higher is better, with diminishing returns as the FPS climbs past 60.

In my junior and senior years of high school I took the opportunity to attend a two-year vocational course on information technology. For context, this was the period after the dot-com bust; public skepticism about the computing revolution was at an all-time high. Many people lost their shirts in the crash, requiring them to start over. For those who paid attention, it was clear that there would not be a second bust like the first. The internet was clearly a tool, and recognized as such by our generation. I began a repeating pattern in my life: obsessively immersing myself in new learning opportunities. Information technology is the study and practical application of the flow of data between computers. Simply put - how computers talk to each other. The class was taught by one of the original IT guys, Al Sarnacki, a Vietnam vet and all-around great teacher. Very much a non-teacher persona, he conducted his class as a "Benevolent Despot" - which was a nice way of saying to us kids, 'don't piss me off.' Not that anyone ever tested it, except for one kid, but that's a different story. We began by learning various protocols, networking topologies, and troubleshooting principles. By the end of the second year, we were programming industrial-grade Cisco routers and improving grandmother morale everywhere via fixed computers.

It was truly a formative experience for me; up until then, all of my technical training was either taught by my brothers or self-taught. It turned out to be one of the most valuable classes in my entire education, as it laid the framework for my understanding of networks. From IP and MAC addresses to wiring standards for ethernet, we were given an understanding of the nuts and bolts of how computers communicate. We had a window into the way the internet operates, and many of those same mechanics hold true to this day. The pursuit of this knowledge, however, led to the unexpected side effect of becoming the local whiz kid who could fix any computer issue. "Hey, I fixed your computer issue - it was porn related." It's always porn related.

Important to understand is that fixing computers is rarely an inherently difficult task. There are prescribed steps to take for everything, and the internet has ensured that there is always a community for whatever you need to learn. Rather, fixing a computer is time-consuming. Between virus scans and drive reformats, you can spend a few hours just getting a single computer back to normal. Further education came from my school's progressive technology education program, where I learned how to code HTML. HTML, otherwise known as HyperText Markup Language, is the method by which browsers like Firefox or Internet Explorer draw images, text, and formatting to the web page. Imagine you were to pull the buttons, images, and text away from the screen and place a flashlight to the side, facing the screen. You would essentially see pyramids of various sizes and shapes. This is sort of how object-oriented programming works: each code segment displayed or run is an object in digital space. The browser reads an HTML file and takes its directions from the code contained within. My first site prominently displayed yours truly - casually posing by laying across bleachers, frosted tips, puka shell necklace and all. These were not proud times for fashion, but really great for learning!
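To see for yourself just how little magic is involved, here's a minimal sketch in Python that writes a tiny HTML file (hypothetical filename and all) and hands it to your default browser to draw:

# An HTML file is just text containing instructions the browser reads
# and "draws" to the page. Write one, then open it.
import webbrowser
from pathlib import Path

page = """<html>
  <body>
    <h1>My First Page</h1>
    <p>Frosted tips optional.</p>
  </body>
</html>"""

path = Path("my_first_page.html")         # a hypothetical file name
path.write_text(page)                     # save the text to disk
webbrowser.open(path.resolve().as_uri())  # launch the default browser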

My reputation as the local whiz kid followed me to college, which resulted in people showing up at my dorm room asking for computer help. After a lot of thought about my long-term goals, I switched gears into marketing, which combines technology, psychology, and business. I chose this field because, while I had an aptitude for technology, I could not for the life of me understand people, and I sought to learn more.

As someone who is obsessively self-taught, I leveraged this into teaching myself Adobe's pivotal graphic design software, Photoshop (CS1). I'd sit on my bed for hours on end, performing tutorial after tutorial. I'd taken no formal design classes, so I was obsessed with sponging up every bit of knowledge or skill I could get my hands on. The internet was again an amazing resource for learning, containing hundreds of tutorials on all facets and functionalities of Photoshop. The tutorials started with simple text effects and quickly led into compositing pieces and web design. I was learning to express myself visually rather than verbally.

Fig 1-11. - Sample of projects I was designing circa 2007

After discovering my affinity for visual design and graduating college, I landed a job at Burst Media. Burst Media was a display advertising network that had built one of the first 'ad servers,' or computers that place advertisements on web pages. Websites do this by placing code on their page that 'calls' the server for an ad to show. Despite the bad rap that exists to this day, digital advertising was fascinating from a networking and software standpoint. I learned how multiple servers rapidly and cooperatively decided what ad to show a single set of eyeballs. These decisions happened in milliseconds, millions and millions of times in a single day. The complexity of digital advertising in 2008 is child's play compared to what the ad tech community is producing today.
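To make that decision concrete, here's a toy sketch in Python - emphatically not Burst Media's actual system, just the flavor of the choice an ad server makes on every page load:

# A toy ad server decision: given the page's topic, pick which ad to return.
candidate_ads = [
    {"advertiser": "MovieCo", "bid": 0.85, "topic": "movies"},
    {"advertiser": "GameCo", "bid": 1.20, "topic": "games"},
    {"advertiser": "SodaCo", "bid": 0.40, "topic": "snacks"},
]

def pick_ad(page_topic):
    # Prefer ads matching the page's topic; among those, take the highest bid.
    matching = [ad for ad in candidate_ads if ad["topic"] == page_topic]
    pool = matching or candidate_ads  # fall back to all ads if nothing matches
    return max(pool, key=lambda ad: ad["bid"])

print(pick_ad("games"))  # GameCo wins this request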

What may or may not be a surprise to some of you is that, on a daily basis, companies both large and small are collecting data on you. Nearly everything you do in the digital realm generates data in some form or another. Even the act of checking a notification from the lock screen of your smartphone can generate a data point. We'll examine this phenomenon in much greater detail later in the book.

As I continued to learn as much as I could, I chose to specialize in rich media. Rich media is advertising that utilizes advanced web features, employing large-format pictures along with features like photo galleries and video. To me, rich media was the closest avenue to the print advertising that I loved growing up. Because of these features, movie and video game advertisers naturally gravitated toward rich media, and getting to work on those types of campaigns was an exciting prospect. Rich media allowed me to gain working insight into multiple hard skills, like engineering and design.

As part of working in advertising, it was a necessity that we were ourselves creative, especially in rich media. Advertisers demanded that we innovate on technologies like mobile, as well as provide fresh ideas for execution. In every brainstorm for a particular product, I pushed myself to employ my background to think of unique interactions that would provide either a level of fun or usefulness to the user. The point was to provide an additive experience for the user; I wanted to fight the notion that advertising was always negative for a web user's experience. The ideas from brainstorms even included a beautifully designed multiplayer game displayed on an LED billboard in Times Square, which connected to smartphones and tablets via web technologies. Two users would compete using cartoon cats and dogs, along with other furry creatures, to unravel a digital toilet paper roll before their opponent did. It was a simple bit of branding for a paper product company, but very complex in its technical execution - the likes of which really drove home how far web technologies had progressed in the few years since the start of my career.

OK, Maybe We Are Living in the Future: Flying Cars

As far as technology had progressed, it was still apparent that we were a long way from The Fifth Element or The Jetsons; the promise of a future filled with flying cars had not yet become our reality. Starting in 2017, this may finally be changing. Lighter materials and the budding drone industry have enabled no fewer than six companies to begin building prototypes for the future of flying cars. The goal is to build flying cars that can pilot themselves and fit inside an average garage. Among these companies, AeroMobil is planning on launching their version in 2017. The estimated cost is pegged at $400,000 and up - not exactly an impulse purchase, unless you're a tech billionaire. Hell, even one of the cofounders of Google, Larry Page, owns two different flying car companies. It's just too bad average joes would need to be in possession of a Grays Sports Almanac covering 50 years of winners to be able to afford one.

Fig 1-12. - Your reaction when you saw the $400K price tag.

The More You Know! Democratization of Technology (or why Gutenberg was an Original Gangsta)

The democratization of technology refers to the trend by which a singular technology or development becomes rapidly and widely available to anyone with an interest and the means.

Sounds great... but what does it really mean? Typically, in the past, trade guilds and academics had access to all the latest tools and technologies of the day. This makes sense, because producing any tool would have been prohibitively expensive due to its intricate mechanical structure - especially tools that required knowledge of mathematics or some form of handed-down skill. A Renaissance-era example is the mariner's astrolabe, which uses the noonday sun to determine the latitude of the user's location while sailing the open seas. It's fairly easy to see how the widespread use of a device like this could contribute greatly to the endeavor of trade, thus creating economic expansion and prosperity.

Fig 1-13. - The Renaissance consisted of a 200 year long game of ingesting MDMA and playing “feel my fabric, yo.”


In 1440, Johannes Gutenberg invented the printing press and changed the world as we knew it. Most of you will remember this from school. What you might not know is that the legacy of the printing press is still in play to this very day and will continue to be well into the future. The printing press, of course, allowed the transference of ideas via (for the first time) mass media. No longer would monks or priests spend hour upon hour writing calligraphy to create astoundingly beautiful but labor-intensive texts. As more and more people gained access to these tools, their ability to contribute to society economically and artistically rose dramatically. Prior to this, ideas were transferred orally - or in writing, if you had the money to pay people to hand-copy each book. Ideas were static and, worst of all, geographically locked.

Additional methods of reproduction stretched over time into more illustrative forms with lithography, the process of transferring an etching to a sheet of paper. Imagery at the time could be more impactful than printed books due to rampant illiteracy; lithography provided perspective to a much wider audience. Philosopher Walter Benjamin noted that "lithography enabled graphic art to illustrate everyday life."7 This self-expression became culturally impactful because people across the world could communicate simply about the conditions of their lives, emotions, and desires. The recognition of similarities across geographically separate locations became a stepping stone toward the community-based internet. The subsequent economic growth fostered a cultural exchange as well, further leading to more technological advances.

Today, digital media authoring tools are increasingly available to anyone with an internet connection. As soon as an idea is born in the digital world, its creator can leverage software to release it onto a more expansive digital world. These self-expressions can be instantly available for anyone in the world to view at the moment of publication. We know from the recent political revolution in Egypt that the immediacy of technologies like Twitter can organize groups of people for a single purpose. The power of this cannot be overstated. With the advent of each new engine of expression, more and more individuals are provided with a voice. It is plain to see that these technologies have a large role in the further development of societies. As billions and billions of people in developing countries arrive online and find drivers of creativity and function at their fingertips, what will they have to say?

Services like word processors, image editors, and 3D animation software are all now available for free, 24 hours a day. More and more, people are educating themselves with these tools. They then produce passion projects that may or may not find an audience. The point is, the user has been empowered via the democratization of technology.

Fig 1-14. - Johannes Gutenberg. Blacksmith, Goldsmith, inventor of both the Printing Press and Pig-tailed beard.

Concept review - Copy > Clipboard > Paste

Files like documents or videos consist of bits, bytes, etc., and are infinitely reproducible since they are digital. Going from one file to two is just a copy/paste away!

Fig 1-15. - You can copy, any file - video, images, spreadsheets. If the file is digital, it can be copied.
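Here's a small Python sketch of that losslessness in action - it assumes a hypothetical file named flapjacks.jpg sits in the current folder, copies it, and proves the duplicate is bit-for-bit identical:

# Copy a file, then compare digital 'fingerprints' of the two files.
import hashlib
import shutil

shutil.copy("flapjacks.jpg", "flapjacks_copy.jpg")  # the copy/paste step

with open("flapjacks.jpg", "rb") as f:
    original = hashlib.sha256(f.read()).hexdigest()
with open("flapjacks_copy.jpg", "rb") as f:
    duplicate = hashlib.sha256(f.read()).hexdigest()

print(original == duplicate)  # True - a perfect copy, every single bit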

Protip: In any browser window, Word document, or PDF (most, anyway), it's possible to search the document for a single word or phrase via shortcuts - key combinations that, when pressed together, enable additional functionality.

 

Windows: Ctrl + F Mac: Cmd + F

 

Then type your desired word or phrase to jump straight to every place it appears. This can help you easily navigate large documents like contracts.

Vocabulary Review - Server

Server - A type of computer that stores resources. Resources can include 'files' (a picture of a cute kitten, sometimes flapjacks) or a 'service' (Netflix, Hulu) that streams those resources to other computers or devices.

Fig 1-16. - A video file can be stored geographically closer to your area to speed up access. It sure beats transferring the whole series of Breaking Bad (many, many gigabytes) from New York to San Francisco!
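In fact, any computer can play the role of a server. Here's a minimal Python sketch that turns your own machine into one (localhost means it only answers to you):

# Serve the files in the current folder to a web browser.
from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("localhost", 8000), SimpleHTTPRequestHandler)
print("Serving at http://localhost:8000 - press Ctrl+C to stop")
server.serve_forever()  # answer requests until stopped

Run it, visit http://localhost:8000 in a browser, and your computer will dutifully stream whatever files (kitten pictures included) live in that folder.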

Vocabulary Review - Network

Network - A group of devices or computers that can communicate with each other. In networking, 'nodes' are devices, which can be anything from a smartphone to a laptop or even a single temperature sensor monitoring a plant. As we do in society, each node is supplied a name and a street address. The node's street address can change, just like in real life, and comes in the form of an IP address; IP is an acronym for Internet Protocol. The static MAC address, along with the IMEI number on tablets and smartphones, identifies the device itself as it interfaces with a network; like your name, it does not change.

Fig 1-17. - Networks can consist of as little as two devices.
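You can even ask your own node for its name and street address with a quick Python sketch (the printed address will, of course, vary by network):

# Look up this machine's name and its current IP address.
import socket

name = socket.gethostname()           # the node's name
address = socket.gethostbyname(name)  # its IP address, e.g. 192.168.1.7
print(name, address)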

Concept Review - Coding Languages

Like the world of humans, the world of computers communicates using a variety of languages. Like spoken languages, coding languages have variations similar to dialects and accents. Some are intended for generating graphics; others are meant to quickly crunch numbers. So 'code' is a noun, yet also a verb, as in "I've got to code a javascript-based front end" or "The code is ready for review."

Additionally, these coding languages need an interpreter of some sort, reading the commands written in code and then executing them on the screen. In terms of the web, this interpreter would be a browser like Google Chrome or Firefox. Each browser has strengths and weaknesses in its performance, and which one you use is based entirely on your needs as a user. Some languages organize around object-oriented programming, while others revolve around actions or animations.

Again, similarly to human languages, coding languages have differing uses and can be used together based on the end goal of the project. For example, you would not use an animation language like Adobe's Flash to maintain a large set of database records.
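To make 'code' a little less abstract, here's a complete - if tiny - program in Python, one of the many languages in use today. Its interpreter reads each command and executes it in order:

# Store some text under a name, then write it to the screen three times.
greeting = "Hello, world!"
for _ in range(3):
    print(greeting)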

Further examples:

Fig 1-18. - There are many types of languages and each has a community of developers sharing skills and code.

OK, Maybe We Are Living in the Future: Fashion Future!

In this edition of "OK, maybe we're living in the future," we'll review the state of technology in fashion. Back to the Future Part II certainly laid an odd template for what people would be wearing in the future. Though all is not lost; as we'll find out, there are some unique fashion trends around the corner that will help connect the dots between literal helmet hair (see the butthead below) and clothing as a tech product.

Fig 1-19. - Anger is a fishnet shirt with no tech features.

For every technology, there can be an improvement to the fundamental science and application of its purpose. This can be expressed through the discipline of design. However, sometimes when design merges fashion and technology, the results can be disastrous. Though as we'll see, if fashion considers user experience when merging with technology, the results can be something pretty special. The digital and physical realms collide in the form of wearable devices, which fuel the development of a 'quantified self.' The quantified self is the idea that our physical actions can be measured and aggregated to create insights into our daily lives. This has arrived in the form of personal health tracking: by wearing products like the Fitbit, users are able to track steps taken, calories burned, heart rate, etc. The intent is to inform ourselves about our true behaviors, as opposed to self-reported behavior, so we may modulate our actions to meet our goals. Design plays a pivotal role in the advancement of wearable technology, because if user preferences are not taken into account, the results can be awkward to the point of ensuring failure. Without a streamlined introduction into the marketplace, emerging tech and fashion products can stumble, hard. A prime example of the challenge of marketing wearables is the release and public failure of Google Glass.

Fig 1-20. - An example of trying to make ‘fetch’ happen.

Google Glass is a device that projects graphical user interface (GUI) elements, like buttons and images, as an overlay on top of your vision, controlled via the user's voice. Interface elements can include items like email notifications, video chat, and turn-by-turn directions. To a geek, this product was a no-brainer. In practice, users stuck out like sore thumbs, and on more than one occasion were accused of being "Glassholes." It's funny that people can hold smartphones in their hands all day, but put that technology on your face and people freak out. Urban Dictionary defines "Glasshole" as 'a person who constantly talks to their Google Glass, ignoring the outside world.' Given that voice controls can be misinterpreted in public settings, it's pretty easy to see how people could become impatient with Glass users. People were annoyed at the possibility of being recorded in public or private events. Society had such a visceral rejection of Glass that the product was discontinued for the public and redirected toward industrial applications. A step back for the metallic clothing of the future, for sure.

The future of fashion involves adding sensors and intelligence to our everyday clothing. For example, Microsoft has patented a "mood shirt" that senses your mood - via body temperature, heart rate, and other sensors - and reacts appropriately. Say you walk into a room full of strangers; if you've ever experienced anxiety in this situation, this shirt can stimulate your body by applying pressure, simulating a hug. This allows your brain to relax in a seemingly tense situation. Further examples of tech-enabled clothing include:


Project Jacquard

Smart Athletic Apparel

No Cow Leather

The More You Know! Don't pay for common software - it's free on the web!

The ongoing study and development of computer science has allowed complex software to be designed and created. Over time, software gets upgraded and new bells and whistles are added; what used to be an advanced feature becomes commonplace. As a result, we have many pieces of software that can act as working alternatives to costly commercial software. Some businesses charge nothing for a basic account and charge only for extended features. Most businesses can be run and maintained using freely available software.

Fig 1-21. - In the digital age, there’s a tool for every need. *Unreal only charges a capped amount based on sales of the software built with it.

This book, by the very nature of its content, will challenge many of our presently held beliefs about our day-to-day world. Consequences and benefits to the whole of society and across the world are at play here. As we already know, the actions of a single nation may have sweeping consequences for another. We owe it to ourselves and our children to engage in these difficult conversations. We need to ensure that the world they inherit is one we can all be proud of, one representative of our time’s hopes rather than our fears. Though I am optimistic, my own feelings about tech are subject to change. Not even I know how I will feel about the technologies of tomorrow as we explore their use and potential misuse.

The difference in outcomes if we act upon our hopes as opposed to our fears could be compared to the worlds of Star Trek versus Star Wars. Sure, wielding a lightsaber would be totally sweet, until the dominating evil empire destroys your entire home planet like it was nothing.

Speaking of dominating societies, Nintendo and the developer Niantic recently released a smartphone version of the hit game Pokemon. Titled Pokemon Go, it has sown the seeds for augmented reality to become a mainstream reality. Simply put, augmented reality takes a smartphone’s camera and overlays digital imagery to filter your vision, enabling you, the user, to view images through your smartphone that are not physically there. In Pokemon Go, users explore the real world in search of all-digital Pokemon to capture and train. Pokemon can reside in parks, forests, parking lots, local landmarks, etc. Any landmark in real life can have an augmented reality version; in Pokemon Go, any tangible location can serve as an alternate location within the game. The game presently stands as a massive hit, with an estimated 9.55 million daily active users,8 sending the stock price of Nintendo soaring 70% in the first week after release. The randomized appearance of a rare Pokemon has drawn crowds of several hundred people to storm a single location.9

In this case, the augmented reality filter is supplied via the Pokemon. When you attempt to capture a Pokemon, the app activates the camera and opens a viewfinder as if you were taking a picture. The Pokemon hops around in your phone’s presentation of the physical environment until you capture it. Given that Pokemon are generally small, around the size of a rabbit and up, your vision may only be filtered by 10%. If you were wearing a virtual reality headset, by contrast, your vision would be 100% filtered.
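Under the hood, that ‘filter’ is essentially image compositing: the app draws a partially transparent sprite on top of each camera frame. Here’s a toy sketch in Python using the Pillow imaging library; the scene, sprite, and coordinates are stand-ins I invented for illustration, not anything from the actual Pokemon Go app, which renders live camera frames in 3D.

```python
from PIL import Image, ImageDraw  # Pillow: pip install Pillow

# Stand-in for a frame captured by the phone's camera (a plain green "park").
frame = Image.new("RGB", (640, 480), (60, 140, 60))

# Stand-in for a Pokemon: a small sprite with an alpha (transparency) channel.
sprite = Image.new("RGBA", (80, 80), (0, 0, 0, 0))
draw = ImageDraw.Draw(sprite)
draw.ellipse((5, 5, 75, 75), fill=(255, 210, 0, 230))  # a mostly opaque blob

# Augmented reality in one line: paste the sprite onto the frame, using its
# alpha channel as the mask so the background shows through around it.
frame.paste(sprite, (280, 200), sprite)
frame.save("ar_frame.png")  # only ~2% of this frame is 'filtered' by the overlay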

Fig 1-22. - Gamers have been anxiously awaiting VR for as long as I can remember.

Having been a gamer all my life, I’m privy to the gaming perspective as well as the more naturalistic ‘get outside and play’ sentiments. Through my entire childhood, society and parents pressured kids with great fervor to ‘get outside!’ Now that Pokemon Go has been released, those same people are belligerently shouting at these kids to ‘get inside!’ Which is it, society? Quit being such a grump! Perhaps it’s not all bad for gaming: Pokemon Go has inspired a headmaster in Belgium to create an app based on the hit game. In Aveline Gregoire’s version, users learn about books hidden around town by other users, coordinated through a Facebook group called Book Hunters, which has already attracted over 40,000 users in just a few weeks.10

While Pokemon Go is a personal experience, there are companies out there bringing augmented reality to large groups for shared experiences. One demo that has captured venture capital funding and the attention of technology enthusiasts everywhere is the whale demo from Magic Leap. Self-described as a developer of novel human computing interfaces and software, Magic Leap is aiming to be the next evolution in computing interfaces. In the demo, a gymnasium is filled with students when suddenly a full-scale humpback whale bursts from the wooden floor and splashes down across the gym. The secretive startup has garnered over $840 million in funding from the likes of Google and Alibaba to bring augmented reality interfaces and experiences to the masses.11 This level of funding enables companies to rapidly enter a market with massive amounts of fanfare. If you haven’t heard of Magic Leap before, by the end of 2017 you will be aware of it, and likely a big fan.

Fig 1-23. - High school is about to get more interesting

In the example above, it’s important to note that these projections will require the use of a headset that may or may not look Google Glass levels of silly. Imagining a gymnasium full of kids all wearing headsets to become educated sounds like some dystopian future. I make this pledge to you, dear reader: we will investigate this process together.

If ever you should need a reminder of our shared experience, know that we are in this journey together. All of us. It is my assertion that the tactical application of technology has the ability to change the status quo for the betterment of all people. Without taking the time to reflect on where progress leads, we run the risk of inadvertently creating technologies that ultimately cause more harm than good. My way of stepping up to that challenge is to help as many people as possible objectively gain a general understanding of the pros and cons of technology. Undoubtedly, the more people understand the workings and machinations of technology, the better we will be able to make pragmatic and inclusive decisions about our shared future. People need to be responsible for their property, in the physical sense as well as the digital. I’m aiming to contribute to a conversation larger than myself and positively influence our shared future. Public discourse on technology policy shouldn’t be a food fight that ends when every last cupcake has been thrown, gauging who won by seeing what has stuck to the walls. We need to work together to fill in the widening gaps in society; technology represents one of those yawning gaps. Only then will we be able to begin steering our future toward positive outcomes. There is no opting out of tomorrow, short of removing yourself from society. This book is my double-down bet that modestly adding my voice to the many others will support a future that produces our best wishes for our children.

Technology can appear a lot like a dark forest, but we can steer how bright it is. No one else is going to build the future for us; it’s up to us as individuals to step up and contribute. The hoverboard didn’t become a reality by the will of a single person. It took all of those talented engineers and designers to make it happen. Given the incredible pace at which technology has advanced, it is apparent that the rate of change will continue to accelerate. Never in the history of us have we been on the precipice of where we are now. Doc Brown would never have investigated time travel if he had listened to the critics. What this means for you, dear reader, is this: strap in, because where we’re going, we don’t need roads...


Chapter 2 - Buckling In

“Between stimulus and response there is a space. In that space is our power to choose our response. In our response lies our growth and our freedom.” ― Unknown 1

This entire experience is nothing if not a learning experience. Let’s just run with it; I’ll be your wide-eyed Doc Brown if you’ll be my mostly-willing Marty. What do ya say? In all of my years of enthusiastically examining technology, here’s the prevailing advice: like any new city you visit, it’s best to have a guide. You have a friend and guide here in me, assisting you along in your education. Growth is most often expressed when we exit our comfort zones to experience something new. Freedom is often, in my opinion, misattributed solely to the realm of physical freedom or freedom from bondage. I’d contend that a more useful portrayal is the recognition of, and appropriate reaction to, obstacles as they lie in our path. This type of cerebral freedom is the sum of what our physical senses report back to the brain, informing an ongoing understanding of the environment. Not to be too cheeky, but right now it begins with perceiving the words on this page, and so we move into perception!

Right, so what is perception, exactly? We humans feature five unique senses; each operates within its own area of expertise, and all report back to the brain. There is a huge energy requirement involved in sense-to-brain perception, let alone with five senses reporting at once. As a means to efficiency, the brain provides us shortcuts for processing these environmental stimuli, or inputs. The signals are converted into chemical messages, not entirely unlike the data used in devices. The brain then interprets the environment in conjunction with all of our senses, like piecing together a jigsaw puzzle. The messages from the senses flow into the brain, which solves the puzzle as best it can, although pieces may be missing.

These messages can be flared by our senses, as when adrenaline floods our system at the first sign of predatory danger. It allows us to react as quickly as possible upon our brains registering sight of a predator. It’s similar to keeping a shortcut on your desktop or home screen, there for quick and easy access. See a predator and the brain instantly clicks ‘GTFO.exe’; for the younger people reading, that would be tapping the YOLO emoticon on your smartphones. Our bodies kick our asses into gear and send us running as fast as our monkey legs will carry us. Our continued survival was enabled thanks to biological biases known as ‘gut instincts’ and reactions. As contemporary humans, we still rely on many of these instinctive reactions, which arose from our early evolutionary history, to inform our present worldview. Potential issues arise when we apply instinctive understanding in waters too deep for it.

Functioning as verbal guideposts, proverbs became a way to infuse daily living with evolutionary wisdom. ‘The grass is always greener on the other side of the fence,’ or ‘Look before you leap,’ or even ‘Good things come to those who wait.’ These shared wisdoms transferred between people because they were A) catchy and B) provided additional insight into commonly arising issues. We find them helpful because our brains can only make decisions with the inputs available, which oftentimes do not encompass the whole truth. The input of the proverb helps us maintain the status quo or improve upon it. This is the essence of input/output. We are, after all, human, and these deficiencies in our processing arise for natural reasons. It is simply impossible for humans to know the full spectrum of information about all topics. Thus our processing of inputs can contrast with reality or become lost in translation. A recent example of a variation in processing is the black/blue, white/gold dress debacle, more commonly recognized as the day the internet’s collective mind was blown. Each person’s perception is demonstrably different.

Fig 2-1. - Black and Blue or Gold and White? Our eyes offer differing reports.

Within the persistent motion of our day-to-day environment, our active brains tend not to have the luxury of time to analyze the minute detail of every situation. Moments of our days can stand out for one reason or another and may receive additional inspection. Otherwise, we take care of what we have to take care of on a daily basis. In the practice of decision making, our brains interpret the environmental inputs accessible, reference any memories or emotions concerning the topic, and thus leap to a conclusion. The conclusion may be arrived at with seemingly impeccable recollection; however, this may not always be the case.

Here’s an at-home experiment, for science! Pour assorted jelly beans into a dish, pick one without looking, plug your nose, and pop the jelly bean in your mouth. Keep your nose plugged while you chew; you’re likely unable to determine the flavor of the jelly bean. This is another example of how our senses are filtered through an interpretation of the environment. Similarly to the way augmented reality and virtual reality filter vision, our other senses can also be filtered. A related visual trick of the eye comes from the Magic Eye posters of the ’90s, where 3D shapes materialize when the viewer relaxes their eyes while staring at the swirls of colors.2


In the previous chapter, we covered:

Defined bits and pixels

Defined servers and networks

Reviewed the differences between hardware and software

Examined the variations of websites and online communities

Developed the computing concepts of copy and paste

Described some, but not all, coding languages

In this chapter, we’ll answer the following questions:

What is cognitive science?

What are cognitive biases?

What is cognitive dissonance?

How does neuroplasticity inform my ongoing development?

What is Maslow’s theory of motivation?

How can I develop a pragmatic framework for understanding and applying technology?

Cognitive Science and You

So we recognize that our senses evolved to require a partner in the brain, enabling processing of environmental stimuli and developing meaning in the process. Evolutionary psychologists John Tooby and Leda Cosmides posit that our brains are, to this day, wired for a world of brutal and short lives. They articulate: “... our modern skulls house a stone age mind. The key to understanding how the modern mind works is to realize that its circuits were not designed to solve the day-to-day problems of a modern American -- they were designed to solve the day-to-day problems of our hunter-gatherer ancestors.”3

It’s as plain to see as a white bread and mayonnaise sandwich: projected through time, our brain’s functions were not dialed into dealing with the decisions required to live in our present age, let alone dealing with how crazy it will get in the near future. If we were to drop one of our ancestors, Homo habilis, off at a froyo stand, their minds would be blown. If we plunked them into a corporate environment, they’d likely tear the place apart. Tooby and Cosmides proceed to claim this is because natural selection of genetic traits plays out over many generations. One example: our brains were programmed to crave foods that yielded nutritional sustenance. Fruit smells sweet; rotten fruit smells bad. Dopamine, a pleasure-stimulating signal, is released by our body when we smell the sweet fruit.

It may seem simple to us now, but it’s easy to imagine Homo habilis growing thorny about never owning a refrigerator. That’s part of the problem: our brains still have not caught up with the comforts we’ve developed for ourselves. We are hardwired for a different mode of living, and our brains might dominantly prefer to live with simple wisdoms as guideposts via proverbs. Our early ancestors occupied a world full of danger from predators as well as their own people. This caused their brains to interpret the world with a survivalist’s perception, filtering out unnecessary details in the name of remaining, not dead. Meanwhile many of our present selves are kicking back, enjoying the wonders of air conditioning, and heavily debating “What’s for dinner?” I believe it is plain to see our brains are not well suited to navigating today’s technology environment for these same reasons.

In reframing our understanding of bias, we must first highlight how we apply information to decisions. We can demonstrate pre-existing biases by performing a quick exercise: imagine a fully grown stick figure drawn on a sheet of paper. I’ll call him Stanley Stickerson, for clarity. Now, please give Stanley Stickerson a shield to fend off tribes of fire-hardened-stick-brandishing tribe-guys. Those other tribe-guys are total jackasses, but we’re protected if our Stanley is equipped with a shield. Feel free to imagine throwing it to him while dishing out a witty one-liner. If you’re unable to think of a one-liner, jazz hands would likewise be acceptable. Back to stick figure Stanley: depending on your own biases, you imagined the shield occupying the stick figure’s left or right hand. Which was it? Chances are, Stanley is holding the shield in your own dominant hand. Ergo, a simple example of a bias affecting decision making.

Our biases represent natural processes, and there’s not much intuitively wrong with acting in accordance with them. In fact, our bodies respond to environmental stimuli via chemical messages such as serotonin and dopamine. They act as reward mechanisms for our nervous systems. We can employ our knowledge of their existence as an opportunity to view things from a slightly different angle, a cognitive adjustment. Cognition can be defined as the processes by which our brain engages in perception, reasoning, judgement, and memory. When we lack information about a topic, calculating a decision can be portrayed as a half-painted picture, or a meme without a caption. Perceiving our behavior requires a broader approach. Cognitive science fits that bill due to its interdisciplinary nature, spanning linguistics, neuroscience, artificial intelligence, philosophy, anthropology, and psychology. Cognitive science is basically the Captain Planet of sciences: with their powers combined, these disciplines contribute findings and research to the others in a synergistic exchange of insight.

Given that biases exist in our basic interpretation of the world, they impact our engagement with not only technology but also our environments. Decision making, of course, begins with defining the problem. Daniel Kahneman, Nobel laureate and bestselling author of Thinking, Fast and Slow, describes two distinct systems within decision making. The first system is intuitive thinking, providing us with a rapid estimation of what’s coming next, while the second is more carefully considered and deliberate, allowing your brain to make long-term decisions.4

Fig 2-2. - Each system of decision making has benefits and drawbacks.

When making any decision, we’re able to apply one or the other of these two systems. Given the amount of decisions any person engages in on any day, we’re looking at a ton of opportunities to jump to conclusions prematurely, with varying effects on the outcome. Kahneman posits that “when information is scarce, which is a common occurrence, system one operates as a mechanism for jumping to conclusions.” This is particularly common in people’s grasp of technology. Information floats around, but in many cases there appear to be no clear avenues to attain further understanding. When making decisions, both systems possess their own benefits and consequences. Yet in both systems, a wide variety of cognitive biases can sway our thinking one way or the other.

Cognitive biases can be thought of as the electrical wiring supplying the lights in your home: they exist at all times in all rooms, though they may not be switched on at all times. When deciding what to think about a particular topic, the brain recognizes a series of patterns which corresponds to a specific ‘light’ in the brain. The brain, seeking to save energy, prefers to take mental shortcuts. These cognitive biases enable people to rapidly determine their own personal frame of reference on a given problem or topic. This could be anything from what to eat for dinner, to what protocols need to be in place to ensure personal privacy on the web. Similarly to cooking from a recipe, if you lack an ingredient you can still cook your meal; it just might not be the same outcome. Have you ever tried to eat flapjacks with no maple syrup? Might as well eat cereal with water; it’s savagery!

Researchers at the Max Planck Institute have discovered that the brain indeed contains pre-wired areas that make a decision before our conscious mind has had an active chance to decide. Sorta like the way when I ask you to recall the music of Star Wars or Jaws, you instantly recognize it; you know it by heart. The researchers stated that “many processes in the brain occur automatically and without involvement of our consciousness. This prevents our mind from becoming overloaded by simple routine tasks. But when it comes to decisions we tend to assume they are made by our conscious mind. This is questioned by our current findings.”5 The brain’s decision for us happens up to seven whole seconds before we realize we’ve made a decision. I believe this behavior makes a big splashy appearance in the form of system one decision making, spurred by cognitive biases, that steers our understanding away from technology. The lightswitch flicks ‘on’ and a decision is leapt to for us before we even recognize it.

Children from the 90s will no doubt recall blowing air into Nintendo cartridges; in this example, system one dominated the decision making process. Had we taken the time to count how many times blowing worked as opposed to not, we would have realized how ineffective it truly was. That did not stop kids from sharing the secret as a holy potion for resolving our gaming woes. As a result, most kids from the era recall using any method short of voodoo in attempting to play video games. Little did we kids realize, the real reason behind the faulty cartridges was simply poor contact between the cartridge and the slot. Correctly or not, these biases affected our decisions by filtering our perception through the evolutionary imperative of preserving energy. Spread across a lifetime, this tradeoff thereby alters the very trajectory of our lives.

Our brains developed the ‘fight or flight’ imperative to rapidly react to danger within our environment. Visually scanning a landscape, we’re built to prioritize the bad news first, with clues to the presence of a predatory threat immediately standing out. A majority of the technology world, by contrast, exists as invisible infrastructure; we can’t smell it, taste it, and in many instances, we can’t touch it. Our inability to perceive technology past a screen, despite all of our evolved senses, is emblematic of technology’s complex nature. With unknown inputs and outputs, the complexity can make us a bit resistant to pursuing further self-education in technology.

In psychology, ‘anchoring’ is the act of employing the first available piece of information about a subject to inform your understanding of the rest of the subject. These cognitive biases are self-perpetuating and cause distortions in our perception of technology. This is an instinctive and intuitive reaction to ideas that challenge our understanding of the world. The experience can feel like exploring a dark forest by our sense of touch alone. The practice we need to engage in is to quiet the primal parts of our brain while we critically perceive the environment, employing technology as a tool. It’s sort of like Thanksgiving with a big family: there’s a kids’ table and an adults’ table. I’m asking you to seat the intuitively uncertain part of your brain at the kids’ table, while we frankly engage in a conversation about technology like adults.

There’s good news though: developing intuitive decision making regarding technology just requires us to ramp up slowly. This begins with shifting from system one to system two, employing deliberate decision making. Over time and with continued reading of this book, you will become comfortable talking with others about the great opportunities technology presents. It’s like any muscle that requires training. In fact, I’m pretty sure we’ll need a training montage straight out of the climax of an 80s movie to inspire us to push on. The variety where there’s a series of rapidly edited shots of preparation activities prior to the ‘final battle.’ The final showdown, the show, the end game. You get it, though we still need to place more puzzle pieces before we can jam on that. In video games, when you ‘level up’ you gain in knowledge, skills, and ‘POWER’; that’s sort of what we’re doing here!

Back to gaining experience: selecting a home to purchase engages system two decision making, while system one helps with selecting ice cream flavors. These approaches can sometimes cross wires, so to speak. For example, if I asked you, “Would you like to buy a home at market rate near you that is spacious and located in a great neighborhood? You must decide within three seconds. Time is ticking.” With that limited amount of detail you may be thinking, “I can’t possibly make that decision without further information!” However, if I asked you, “Do you want peanut butter ice cream? It just so happens to be healthy,” the limited amount of information about the ice cream gives you all you need to make an intuitive decision. As Kahneman writes, “Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. These are the circumstances in which intuitive errors are probable, which may be prevented by a deliberate intervention of System 2.”

Kahneman also posits that certain systems are inherently too chaotic or complex for individuals to develop intuitive decision making frameworks. He cites the stock market: the machinations in play are too complex for intuition to make valid sense of when making decisions. They require exacting measures like earnings per share and operating margin. Just as traders must establish tools for measuring that chaotic system, the same need arises for us. The immediate and pertinent comparison for us is, obviously, the world of technology and the internet.

Throughout the remainder of this book we’ll highlight the biases that may be affecting your view of technology. It’s important to bear these in mind when making decisions in your own life regarding technology, because now more than ever, using system one decision making could be detrimental to your quality of life. Technology offers the opportunity for a person to make a positive change in their life. Understanding our built-in biases can have a positive long-term effect on your decisions. I am not discouraging caution, however; it is a very necessary part of the decision making process. With additional awareness of biases, caution is generated by acknowledging and assessing risk.

Adjusting the Rearview Mirror

I’d encourage you to approach the content of this book with an open mind. In order to make decisions optimally, we need to first address the deficiencies in our own thoughts that cloud the root issues in play. These deficiencies, again, manifest themselves as cognitive biases. Everyone experiences these biases, often multiple times a day. Their influence is nearly invisible to us but nevertheless acts as a factor in our own decision making. Now, I’d like to ask you a small favor as I present the concept of cognitive dissonance. Please read, out loud, the color of the letters below:

Fig 2-3. - Look out for the traps!

If you encountered any trouble, you experienced what social psychologist Leon Festinger describes as ‘cognitive dissonance.’ An illustrative example of cognitive dissonance is the cigarette smoker who has learned that cigarettes harm their health and still smokes daily. The inconsistency in our brains causes discomfort, and the person will instinctively avoid addressing the discordant error. Oftentimes with smoking the argument follows as, “My Grandma Cookies smoked two packs a day and lived to the ripe old age of 92!” The brain appears to prefer preserving the structure it first established regarding smoking, which leads back to the idea of a primacy effect, also known as the serial position effect. A simple example: if I list five words to you, you’re more likely to recall the first word than the others. Chances are, we all carry more than a few very personal examples of these cognitive dissonances.

I believe people who don’t presently enjoy and employ technology have encountered an initially frustrating tech issue. Perhaps you needed to send a very large file and weren’t equipped to reduce the size, or maybe you were unable to print a document; I’d wager that a negative experience tainted your subsequent experiences in learning technology. It doesn’t need to be a major, catastrophic-loss-of-your-midterm-paper type of event; maybe just a series of frustrations. A negative encounter retains the ability to affect all subsequent attempts to utilize that tool. In today’s day & age, ain’t nobody got time for that.

The discomfort from not being familiar with the topic keeps people from pursuing their own education. People’s perceptions can often exclude valuable insight, through no fault of their own. When combined with decision making, this can result in suboptimal outcomes. If this applies to you, and it applies to all of us, hang in there. You’ve already taken your first steps on one amazing journey. You’re ahead of the game! If technology is your jam and this doesn’t apply to you, then awesome! For either set of people, the goal going forward includes preparing our rational brains to resist relying on system one style decision making. After you’re familiar with the material, you’ll more intuitively exercise system two regarding technology. This will better enable you to integrate technology into your life, whatever your goal is in learning tech. The tempo of this march of advancement is audible to anyone who listens. We need only lean closer to hear.

Separating Signal From Noise

Every day, new articles and journals are released describing some revolutionary development in technology or an invention that defies our previously held expectations. Keeping track of all of the events from the many different sources is a difficult task. It can feel like drinking from a fire hose on full blast. Knowing which news sources to read and when to react skeptically are acquired skills on the web. Once you learn these skills, though, the internet becomes a conduit of personalized information. As an example, take a recent headline:


Scientists Discover a New Way to turn off cancer cells without surgery

Which is based on the study more aptly titled:


Distinct E-cadherin-based complexes regulate cell behaviour through miRNA processing or Src and p120 catenin activity


Uhh, what? The insight applies whenever viewing or reading the news: temper your enthusiasm. Hyperbole often impedes the flow of actual information. I mean, cancer was just defeated without surgery, so we can pop the champagne, right?! Well, actually, the truth is often pocked with caveats and might only hold under a specific set of circumstances. This is an example of the cognitive bias named the Availability Cascade.


Cognitive bias - Availability Cascade


Have you heard? We only use 10% of our brains! Imagine if we could use 100%? We would, of course, use that 100% of our brain to realize the statement was built with bullshit. Open any anatomical textbook and you will soon determine that every part of the brain holds a purpose. Many of these parts of our brain are the reason we eventually evolved an intelligent and conscious mind, not to forget a perfect sense of procrastination. Thanks, limbic system!

The availability cascade can lead to assuming we possess a complete set of data. A self-reinforcing bias, it passes wisdom along virally whenever there is a catchy summary to a complex issue. The world is rarely summed up in simple black and white terms. Appending a catchy explanation to why things are the way they are acts as a type of cultural shorthand, especially when reading the news. It’s like a set of mental training wheels that makes racing to a conclusion more accessible. The reasoning is often catchy and just as often a gross oversimplification of a single kernel of truth. The convenient lie spreads until it’s commonly accepted, much like those catchy songs you can’t get out of your head after hearing them on the radio. As it is simpler to accept an explanation that requires no further thought, we opt for it so we can move our brains onto more important issues like the ‘I don’t know, what do you want to eat for dinner?’ debate. The correct answer: pizza. It’s always pizza. Remember to utilize system two and think critically about what is being discussed. It’s a matter of pizza or no pizza.

Another, lighter example:

Fig 2-4. - Be sure to watch out for trucks full of manure!

Yes, humanity has accomplished hoverboards! Aptly named ‘Slide,’ the device hovers a few inches off the ground and can sustain the weight of an adult. Lexus, yep, that Lexus, has combined super-chilled superconductors and magnets to build a board capable of floating just above the ground. However, the board only functions in parks pre-built to support the hoverboard. There are those damn caveats again! One way to mitigate crazy headlines is Betteridge’s law of headlines, which posits that any news headline ending with a question mark should always be answered with a ‘no.’
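Betteridge’s law is mechanical enough to write as a function. A tongue-in-cheek sketch in Python; the function name and messages are mine:

```python
def betteridge(headline: str) -> str:
    """Apply Betteridge's law of headlines."""
    if headline.strip().endswith("?"):
        return "No."
    return "Inconclusive; read past the headline."

print(betteridge("Have Scientists Finally Cured Cancer?"))  # -> No.
print(betteridge("Lexus Builds a Real Hoverboard"))         # -> Inconclusive...
```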

The most important of these caveats is that no matter the study or development, there are still numerous barriers to any technology arriving in mainstream consumer markets. The trends that underlie these wonders are converging in a way that will dramatically alter the way we live our lives on a day-to-day basis. We live in a time of wonders never before witnessed by our species. These convergences collide with other developments and deliver additional new developments to the table. Those same developments present new societal challenges, which prudently require education and pragmatism.

How we apply technology yet to be invented seems like a problem we can tackle tomorrow; knock it off, limbic system. In actuality, the magnitude of these challenges ranges from freedom of pizza choice to those that mirror the development of the nuclear bomb. This may sound scary; after all, I myself am quite comfortable writing this sentence from my couch. Compared to our grandparents, our lives and my couch are unambiguously quite comfortable. Humans are predictably risk-averse as far as lifestyle is concerned. Reinforcing our status quo bias, or the ‘if it ain’t broke, don’t fix it’ mindset, this tendency toward limiting change can be detrimental given our rapidly evolving society. Longing for ‘the good old days’ makes for great strolls down memory lane, but isn’t meant to be the way we actively live life.

The comforting news is, again, we live in a time of wonder fueled by some of the most intelligent and innovative minds the world has ever held. Further, more and more of those gifted individuals are able to connect to the internet and interact and learn with others. In this sense, data is most assuredly comparable to electricity in the early 20th century. Today, across the world, companies, research institutions, and organizations are laying the groundwork for us to pragmatically address our time’s most pressing issues. The democratization of technology has allowed everyone to exercise a voice and to participate, and the time is now. However, it is our personal responsibility to ensure that we educate ourselves, at the very least on the broad strokes of how these trends and developments will affect our lives.

Phew! That was heavy! We need to get into the nitty-gritty from time to time. So, let’s get back to some practical learning!

Concept Review - User Interface

The user interface, or UI, represents the window into the app or program you’re using: a view into the functionality contained within the computer and software. The user in this case is, of course, you. The interface represents the available options on the screen for “What can I do next?” If you were wearing an augmented reality headset, like the Google Glass from the last chapter, your user interface would be able to directly point to items or signs in your field of vision. “Turn right here” conveys new meaning when the digital arrow is floating above the ground within your sight. The intent of user interfaces is to concisely present you, the user, with the presently available options to navigate or interface with your computer.

Think of your computing experience like a train: you’re able to move from car to car at will. In the case of the internet, the train cars can contain apps, websites, and programs.

All apps, websites, and programs are designed to be intuitive with as little instruction as possible. The goal for a UI designer is to enable you to intuitively interact with software. If you’re new to computing, you are now afforded a far more user-friendly experience than in the days of command-line terminals, when you were required to type abstract commands to perform actions like search or launch. As with all cases when humans are involved, the ideal of designing intuitive software can sometimes miss the mark.

Each website can be designed drastically differently; however, thanks to common standards of design, important navigation UI elements are considered standard throughout. The navigation elements below can assist you in jetting around websites and apps with ease. In the next section we’ll review a method by which you can more easily recognize your motivations in the moment.

Fig 2-5. - Some common User Interface cues within web browsers
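To make the back and forward cues concrete, here’s a minimal sketch of the kind of bookkeeping that plausibly sits behind those buttons. This is an illustrative model written for this book, not the actual code of any real browser: two stacks of pages, one behind you and one ahead.

```python
class BrowserHistory:
    """Toy model of a web browser's back and forward buttons."""

    def __init__(self, start: str) -> None:
        self.back_stack: list[str] = []     # pages behind us
        self.forward_stack: list[str] = []  # pages ahead, after going back
        self.current = start

    def visit(self, url: str) -> None:
        # Navigating somewhere new: the current page goes behind us,
        # and any forward history is discarded.
        self.back_stack.append(self.current)
        self.forward_stack.clear()
        self.current = url

    def back(self) -> str:
        if self.back_stack:
            self.forward_stack.append(self.current)
            self.current = self.back_stack.pop()
        return self.current

    def forward(self) -> str:
        if self.forward_stack:
            self.back_stack.append(self.current)
            self.current = self.forward_stack.pop()
        return self.current

history = BrowserHistory("home.example")
history.visit("news.example")
history.visit("recipes.example")
print(history.back())     # news.example
print(history.forward())  # recipes.example
```

Notice that visiting a new page clears the forward stack, which is why the forward button goes gray the moment you wander off in a new direction.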

Navigating the UI of Life - Abraham Maslow’s Motivational Theory

So, we know that with the benefit of a user interface, we’re able to harness the potential of a website or a piece of software. The user interface displayed across the screen aids the user in performing functions of the application. In web browsers, for example, options can include refresh, stop loading, go back, and even go forward a page. In practice, navigating web pages behaves similarly to navigating the pages of a book: you’re able to move forward and backward at will. Browsers may not start out being intuitive to use, but with repetition and time they can become second nature. Software is designed to enable productivity via established rules of design and coding, which can be thought of in similar ways to establishing routines in our own lives. Humans are nothing if not creatures of routine, and we know from the most accomplished among us that developing a day-to-day routine enables great achievement. Routine aids them by reducing friction in their lives, leaving them better able to manage the ups and downs. Abraham Maslow drew this into detailed focus within psychology. Instead of choosing to focus on the mentally ill and understanding how disease works, he focused his effort on wildly successful people. From Albert Einstein to Eleanor Roosevelt, he sought to interpret what motivated these people toward success. As a result of these studies, Maslow posited a set of five interrelated motivating variables.

Having established the ways in which a user interface enables navigation of the digital world, it is prudent to build a ‘user interface’ for the physical world as well. Throughout this section we’ll establish how to maintain balance while understanding what is motivating you at any given moment. More specifically:

  1. We’ll review the concepts surrounding Abraham Maslow’s motivational theory
  2. Revise the traditionally familiar visual aid into a user-friendly version
  3. Define Maslow’s target for our old stick figure pal, Stanley Stickerson

Life is messy and so are our desires and needs; they may be unclear at any given moment. Maslow is recognized for the Hierarchy of Needs, typically represented as a pyramid or triangle, despite Maslow never intending for it to be represented as such. Textbook makers formatted it that way to aid in comprehension. If you’re familiar with the Hierarchy of Needs, my presentation will skew your perspective a bit. Employing a triangle or pyramid to examine our present state obscures the fact that life is in perpetual motion. Instead, I’m opting to represent Maslow’s needs as a traditional target, below. When presented as a target, the perspective evolves from a staid pyramid to climb into a persistently evolving objective. When examined with critical perception, the target can highlight where we’re employing our personal motivation. We’re enabled to imagine a sort of user interface where the target focuses our attention on the most immediate need, allowing us to organize our perception toward what’s meaningful in that moment.

Fig 2-6. - Reimagining Maslow’s Hierarchy

As with normal everyday targets, the outer edges represent the lower value points, in essence the basic needs. The center bullseye is, of course, that creamy middle we all aim for in one form or another. This is anything that involves your own personal choice. If, for example, you embody a love of dancing, your bullseye might mean dancing for a living or for strangers on the street. Developing focus, of course, requires a great deal of self-reflection and ‘aim.’ The better you become at establishing your target, your immediate need or motivation, the larger that target appears. As we’ll discover below, the target you’re aiming for can switch from hour to hour.

Fig 2-7. - We are most enabled to achieve our goals when all categories are satisfied.

We can conceive of this target as continually interchanging based on our personal needs or goals in the moment. As needs on the outer rings are satisfied, we are better enabled to aim for self-actualization, contributing to happiness. Throughout the pursuit of self-actualization, the individual’s focus is able to shift between the different targets based on immediate need. The conditions of a person’s needs at a given moment can be anything from experiencing hunger to falling in love. The individual’s motivation is also able to persistently shift; people’s lives are in constant flux, even multiple times a day. An example using Stanley will help illustrate this shifting nature of our motivations.

Remember Stanley? Fending off tribal stick-guys is what he was made for, so when he’s fighting for justice he is ‘self-actualized.’ Meaning, satisfaction of his needs aligns with his motivations and he fulfills his purpose. But the clock has struck noon, AKA chow time, and Stanley just so happens to be starving; he’s rail-thin. Plus, fending off angry tribal people with his shield (hilarious one-liner, by the way) has made Stanley ‘hangry,’ as they say. In other words, so hungry that he turns angry toward everyone and maybe even lashes out.

Fig 2-8. - 12:00 P.M. When hunger kicks in, Stanley becomes ‘hangry,’ unable to focus on anything but the need to eat.

Fig 2-9. - 12:30 P.M. No longer hungry after eating lunch, Stanley has lost his job for yelling at his boss while hangry; his financial safety is now at risk.

In reality, of course, a person’s needs in any given moment are a bit more complex to perceive. You can be lacking in safety but live with the love of your family satisfying your need for belonging; still, your focus would be immediately drawn to satisfying the need for safety. It’s the immediacy of the need that determines whether satisfying it will be tackled next.

As a hierarchy explaining motivation, Maslow’s Hierarchy has noted flaws. Most models for establishing a playing field are imperfect; the most notable flaw here is the assumption that people lean toward ‘good.’ Well-intentioned people would seek to satisfy the needs he outlined, but does a dictator actively need love? Probably more than we know, but there is still so much yet to discover. To me, it’s a cautious optimism that humans are prone to mutually beneficial behavior. History to this point does its best to disprove this, but valid arguments have been made that people exhibit both good and evil as well as everything in between.

This is not to claim that Maslow’s Target is intended to satisfy our short-term demands. The lynchpin of successfully utilizing Maslow’s hierarchy is understanding that it doesn’t automatically endow happiness. Happiness is an output of living authentically and meaningfully. In a letter written in 1957, 14 years after he published his theory on human needs, Maslow provided further insight into his work. Maslow wrote, “I have done no researches specifically on ‘Happiness’ . . . what I have worked on primarily are the observable characteristics of psychologically healthy people; of this, happiness is one of them.”7

In studying healthy people, Maslow identified the need for establishing routine. The most obvious avenue for establishing routine falls under the much-vaunted word: discipline. Woof. Discipline is one of those things you tend to either master or not so much; I myself live in the ‘not so much’ camp. Discipline represents the difference between one more bite and saving the rest of that pint of ice cream. None of us are required to become Zen masters of discipline; that’s not what we’re here for… it’s about testing small changes and critically contemplating what works best for you. Aspiring to discipline, however, is a worthy goal, just not one that will be deeply covered in this book. Establishing discipline is one of those human behaviors with delayed results. With the establishment of personal discipline, we’re able to full-throated-yell into the cosmos, ‘I was here and here is my mark!’ Applied discipline toward goals is exemplified in every piece of art, architecture, music, movie, comic, and, yep, software and hardware as well. Our collective experience is messy and fraught with peril, from others and from our own environments.

The ultimate point of Maslow’s Target is not to provide a bulletproof method of managing goals; it is one piece of a framework for examining your own immediate motivations. With this self-examination we’re able to recognize and navigate around barriers to our self-declared goals. Striving toward an end of your authentic choosing is the fruition of self-actualization, and that is a worthy ideal for any individual to progress toward. As motivations tend to shift hour to hour, it’s helpful to lean back and examine what may be occupying your brain. When empathizing with what your brain needs to focus on, you’re able to more directly address those needs with full focus. We are able to replenish this focus by spending time with family and/or friends, reminders that there is still good in the world.

Concept Review - Communities and Social Media

Speaking of connecting with friends and family, web technology has enabled hyper-connected communities of similarly minded people to form. Communities, in the sense of this section, are social media websites that cater to a specific interest. Social media is the facet of web technology most commonly touching people’s day-to-day lives. Social media benefits from the existence of strong tribal communities, which invite others to join. The more people that join a community, the more robust its offering in terms of value to prospective members. For example, Facebook may have value for you because anyone you attended school with can commonly be found there.

Fig 2-10. - Communities exist in every corner of the web, no matter the interest.

Examples of social media on the web include the following:


Facebook

Twitter

Snapchat

Reddit

Quora

YouTube

Note: Each of these communities is full of thousands upon thousands of people, and any short descriptor of them is a generalization.

Inclusive of the concept of community is the idea of a ‘feed.’ A feed presents a curated collection of articles, images, or videos unique to each user. The feed, which loads as part of the user’s default entry point into the community, is intended to be an ‘at a glance’ view of that person’s digital sphere. Any user who has signed up for the above communities manages their own feed by interacting with stories, articles, and pictures chosen by an algorithm for that user. A user’s feed evolves over time as the user ‘likes,’ comments, or otherwise engages with items in their feed.
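No outsider knows the actual ranking code behind any of these feeds; it’s proprietary and vastly more sophisticated than anything shown here. But the core idea can be sketched in a few lines of Python. Every post, weight, and signal below is invented for illustration; the point is simply that engagement signals reorder what you see:

```python
# Each post carries engagement signals; the feed just sorts by a weighted score.
posts = [
    {"title": "Cat photo", "likes": 120, "comments": 4, "topic_match": 0.9},
    {"title": "News article", "likes": 45, "comments": 30, "topic_match": 0.2},
    {"title": "Vacation album", "likes": 80, "comments": 12, "topic_match": 0.6},
]

def score(post: dict) -> float:
    # Hypothetical weights: comments signal more engagement than likes, and
    # topic_match stands in for how well the post fits your past activity.
    return post["likes"] * 1.0 + post["comments"] * 3.0 + post["topic_match"] * 100.0

feed = sorted(posts, key=score, reverse=True)
for post in feed:
    print(f"{score(post):6.1f}  {post['title']}")
```

Every ‘like’ nudges the weights, which is exactly how a feed comes to mirror its owner.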

Oftentimes social media is referred to as an ‘echo chamber’ where people shout at the top of their lungs. I’d argue that it’s reflective of all aspects of our human nature. Unfortunately, human nature can many times be expressed through very negative use of language. Communities enable and promote interactions across the web and world; the fact that there are crazy people shouting and hand-waving online doesn’t negate the better nature of the rest of the lurkers.8 One example of a bias that occurs in social media:


Cognitive bias - Confirmation Bias


We tend to subconsciously seek out information that reinforces our present world view. This is especially prevalent in social media, with outlets like Facebook and Twitter curating the content you view via algorithms. If you ‘like’ a news story, it denotes to Facebook that you would be interested in viewing similar content, and maybe an ad featuring that content. Ergo, they will offer you more of what you ‘like.’ If you watch Fox News primarily, you likely do not also watch CNN to gauge opinions differing from your own. We’ll dive deeper into this topic with greater detail and nuance in chapter five.

In all online communities there is an element of anonymity that can enable the bad behavior of individuals. These vocal people seem to carry megaphones and bad attitudes, causing the community to appear as a majority of ass-hats. The reason is as follows:

Fig 2-11. - By Peter Steiner/The New Yorker magazine (1993). There is presently no social security number handed out to people upon joining the internet, no way to ensure that someone is who they say they are.

The More You Know! Picking Passwords

Planning a password can seem like an odd exercise when attempting to go about your day. Given their wildcard status, they can be anything. We human beings, the predictable creatures we are, tend to pick passwords most obvious to our immediate selves: pens, dogs, keyboards, etc. Passwords are arguably a ridiculously insecure method of safeguarding your identity on the web. At this point, there are not many who have not had their password or information compromised in some way. Part of the problem arises from the lack of a personal filter when sharing our worlds on social networks; passwords can be guessed after 5 minutes of surfing through years of someone’s Facebook posts. Alternatively, attackers can execute brute-force attacks that guess combinations of letters thousands of times a second, dialing in the password over many attempts. Don’t worry though, here is a list of the top 3 password tips!

  1. No more 1234567 or qwerty, just stop it
  2. No more pets, sorry Indy
  3. Use short phrases instead, make it catchy
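Why do phrases beat pet names? A little arithmetic makes the case. The guess rate and vocabulary sizes below are round-number assumptions of mine, not measurements, but the shape of the math holds:

```python
from math import log2

GUESSES_PER_SECOND = 1e9  # assumed rate for an offline brute-force rig

def entropy_bits(choices: int, picks: int) -> float:
    """Bits of entropy for `picks` selections from `choices` options."""
    return picks * log2(choices)

print(f"single pet name:   {entropy_bits(50_000, 1):4.1f} bits")  # one common word
print(f"8 lowercase chars: {entropy_bits(26, 8):4.1f} bits")
print(f"3-word phrase:     {entropy_bits(50_000, 3):4.1f} bits")  # 'KitchenMorningYum'

# Worst-case time to exhaust the 3-word search space at the assumed rate:
seconds = (2 ** entropy_bits(50_000, 3)) / GUESSES_PER_SECOND
print(f"~{seconds / 3600:.0f} hours to brute-force the phrase")
```

By the same math, a lone pet name falls in a fraction of a millisecond. Sorry, Indy.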

Using short passphrases instead of single-word passwords acts as a repellent to brute-force methods of password hacking. For example, you could pick a password such as ‘KitchenMorningYum’ or ‘DontLikePasswords’. Ensuring your own security enables you to move about the web more comfortably. Just as a broken lock protects no doors, a broken pair of glasses aids no sight. It is important to examine the ways in which judgement can be affected by perception.

Rendering Perception Into Focus

Given the inherent velocity of our day-to-day lives, perception presents an issue for any critically thinking person. Conducting the life of an adult generates unavoidable friction, and the same can be said for moving about your business across the web: forgetting passwords, getting locked out of your account, captchas, etc. This friction can cause annoyance after annoyance in our lives, and those are some of the more minor problems, to say nothing of big ones like securely exchanging digital money. These frictions lead to widespread seeding of biases.

Inclusive of the cognitive bias framework is acknowledging that we recognize bias in others before we see it in ourselves; this is known as blind spot bias. Daniel Kahneman described this as “my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy.”9 The same holds true for setting our own internal expectations of ourselves. From a personal standpoint, I am most affected by the cognitive bias known as the Just-World Fallacy, the notion that ‘what goes around comes around’ and ‘you reap what you sow.’ It’s important for me to navigate my own life aware of my mind’s tendency toward believing in a just world, which is demonstrably not always the case.

It’s like we go through life wearing cracked eyeglasses: sure, you can focus your eyes outside the cracks, but it’s uncomfortable, and you need to be careful not to run into other people. This is why our brain ignores unknown facts that do not affect our immediate path. This same maxim applies to people looking at their smartphones while walking; just don’t. Stop, step to the side, and then finish your text or check your GPS. We need to remember that these crack-obscured biases still exist and still prevent us from exercising sound judgement when evaluating a set of circumstances. They occur when one of the factors in our consideration is produced in error, resulting in an outcome that adheres to our personally skewed world view. A quick and tasty example of this is as follows:


2 + Ice Cream + 4 = Dog
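As it happens, a computer is stricter about this equation than our brains are. A tiny, playful Python demonstration:

```python
# 'Garbage in, garbage out': Python refuses to mix a biased factor into the math.
try:
    answer = 2 + "ice cream" + 4
except TypeError as err:
    print(f"Computer says: {err}")
    # -> unsupported operand type(s) for +: 'int' and 'str'

# Our brains raise no such error; the biased factor slips in quietly,
# and we confidently arrive at 'Dog'.
```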


Even though the numbers ‘2’ and ‘4’ may be correctly included in the equation, the addition of ‘ice cream’ acts as a bias and is incorrectly included as a factor. This erroneous inclusion results in an incorrect answer, no matter how much delicious ice cream you add. A common computer science proverb states: ‘garbage in, garbage out.’ In order to build successful decision making regarding technology, it’s important to critically perceive the environment and ourselves. We’ll review how these cognitive biases lead us to add way too much ‘ice cream’ into our daily decisions. How can we sidestep our natural inclination toward premature conclusions? The brain itself is an adaptive organ; it assists us in navigating abstract obstacles. But don’t we stop learning after childhood? The truth is: not even close.

The technology underpinning neuroscience has recently made great strides in imaging and translating how the brain operates in conjunction with the body. Aided in parallel by advancements in technology and computer science, functional magnetic resonance imaging (fMRI) enables us to actively measure brain activity in real time. Meaning, if we monitor your brain activity while showing you pictures of tasty-looking food, we can identify the neural pathways your brain activates in response to the stimulus.

New research strengthened by fMRI has revealed the basis of ‘neuroplasticity.’ This is the idea that your brain’s cells at their tiniest unit, known as neurons, reorganize themselves chemically and physically throughout our lives, for better or worse. Even the act of reading these words alters the chemical composition of your brain; once you have reinforced the learning material, the physical change will occur. You are literally writing to your neurons this moment. Neuroscience has thus overturned the idea, common wisdom for countless years, that our brains are static after childhood.

Yes, you can teach an old dog new tricks, learn something new every day, etc. Our brains are incredibly malleable into adulthood; it’s an ongoing part of our biology. Further, the research reveals that these changes can occur at scales from a single neuron to an entire region of the brain. Signals transit highways within the brain, and regions activate and ‘light up.’ This is a contested theory, as opponents feel that imaging of the brain is still a new frontier technology. Which is in part my point: we should always leave room for countering points of view to be sincerely considered.

To clarify, applying critical perception toward the way you make decisions is no silver bullet; there never was a silver bullet. This lack of a simple solution is further complicated by a systemic obstacle in psychology. A common criticism of psychology stems from our individually subjective worlds: reproducible statistical test results across large swathes of people can be difficult to accomplish. Even those we would consider ‘cognitively sophisticated’ are subject to biases; according to Richard West at James Madison University, more intelligent people are actually more apt to exhibit them.10 It seems that despite our best efforts, evaluating ourselves carries baggage the same way evaluating others does.

Our options for addressing personal obstacles to rational judgement are limited by our understanding of the brain, though the growing sophistication of the research is propelling comprehension. The options amount to altering your behavior via discipline: promoting exercise and healthy eating, playing Sudoku, that sort of thing. Between discipline and neuroplasticity, we’re enabled to improve our ability to mitigate these obstacles. The foundational research in neuroplasticity has also led to the development of the concept of the ‘default mode network,’ which can simply be described as a repeating pattern of conscious thoughts while at rest. This type of resting thought removes our attention from the moment and focuses the mind on traits of our life and personality.

Think of a flowing stream of water, and imagine yourself sitting on a rock next to it. This stream is composed of tiny individual molecules of water all flowing around one another, bumping into each other, picking up sediment from the floor. Our brain operates similarly to this stream: thoughts of all types bubble up to the surface and ripple downstream. This is what our conscious minds express as the default mode network. Oftentimes the default mode network engages in selective attention to yearning, worrying, wandering, wanting, and planning, which in turn often leads to more of the same. The neurological version follows as such: a collection of brain regions that activate and connect in a concerted pattern while at rest. This idling brain pattern represents a repeating loop of worry and thought. These thought patterns flood our mind and focus, preventing in-the-moment self-awareness, which is yet another obstacle for us to tackle. Take comfort in the notion that we are not the sum of every thought bubbling down that stream. The water in the stream carries sediment and debris, and so too do our own thought patterns. Mistaking the perception of patterns in thought can lead to cognitive dissonance and subsequently error-filled communication. Down the stream, our decisions are again affected by our focus. As we all know, we need every tool we can get our hands on.

This is especially evident when you ask two people to witness and describe the same event. As a facetious example, imagine there’s a giant flood coming that will wash away all mankind. We grab two of each animal and load ‘em up on one impossibly large ship. If you asked person A to describe the purpose of loading up all those animals, they might state those animals are ‘going on a jaunty cruise with their closest friends!’ If you asked person B, they might recall it as ‘an ark to convey the remnants of a God-fearing world.’ Both contain technically accurate detail; however, the second description delivers much more about what’s actually going on. Some descriptions are more apt than others. The reason boils down to people living ‘inside their head’ and failing to notice all of the potentially pertinent details.

We should examine technology soberly and yet positively, analyzing both the risk and opportunity of technology by thinking less like a human and a little more like a robot. Computing at our own speed, in this case, serves just as effectively. You may discover personal inflection points - moments in time that require introspection to assess and then navigate. You will become aware of them because they require modification to your existing plan, which is all part of the process. The cycle features an added benefit of shipping with ‘system two’ decision making, right out of the box!


Sense

Use your senses - eyes, ears, and logic to observe what you hear, read, see, and think.

Questions can include:

Plan

Analyze the information that you’ve collected. Decide on a course that will enable you to achieve your goals.

Questions can include:

Act

Conduct your behavior in a manner that incrementally moves your plan forward.

Tips can include:

Let’s clarify the ‘sense, plan, act’ cycle with an example. Guide your own life as if you were advising a close friend. Both you and I are here, so let’s just proceed with who has shown up! Let’s imagine for a moment that I’m excited about technology but I don’t have a background in it, so I’m unsure where to start. It behooves us to enact this ‘sense, plan, act’ cycle to determine the best approach. This cycle can be repeated as many times as necessary until a desired goal is attained. Now that we’ve established the rules of engagement, let’s hop into our example:


Sense

I’ve researched the technology field and have determined learning to become a software engineer fits my goals!

Questions can include:

Plan

AI-driven app development skills are huge right now and I’m entrepreneurially curious. Python would be a good fit to learn and would help me gain a career in the field.

Questions can include:

Act

Conduct your behavior in a manner that consistently moves your plan forward while critically perceiving your environment.

Tips can include:

Thanks for assisting me in deciding on my next career, you’re a real pal! In our day and age, the internet is a powerful tool for education no matter your knowledge level. A bonus is that if you and I were to switch roles and you were seeking your next career, the advice you offered me should be the same advice you follow yourself. The idea is that you should offer advice to yourself as if a friend had asked you. The trick is removing your personal identity from the equation. As soon as we assess what ‘I’ would decide, we automatically insert all sorts of caveats. Navigating around these caveats and pitfalls allows us to focus on achieving self-actualization. With further learning, we’re able to harness technology to reduce the friction on our way to self-actualizing. Avoiding being weighed down by our own baggage requires extensive and regular use of our intelligence.
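
For the programmatically inclined, the ‘sense, plan, act’ cycle maps neatly onto a loop. Here’s a minimal Python sketch - the three placeholder functions are hypothetical stand-ins, not a real decision-making library:

def sense():
    # Observe: gather what you hear, read, see, and think.
    return input("What did you observe? ")

def plan(observation):
    # Analyze what you collected and decide on a course.
    return "next step based on: " + observation

def act(step):
    # Move the plan forward incrementally.
    print("Doing: " + step)

goal_attained = False
while not goal_attained:  # repeat the cycle until the goal is attained
    step = plan(sense())
    act(step)
    goal_attained = input("Goal attained? (y/n) ") == "y"

Robots run a loop very much like this one; the only difference is that we get to fill in the three functions with our own senses and judgement.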

As we take tepid first steps toward defining the perimeter of what constitutes intelligence, we’ll inadvertently challenge a few ‘sacred cows.’ This requires challenging our commonly held conceptions about what these technologies are and are not. For example, a common misconception about the internet is that it is ultimately like the television, a self-satisfying exercise that exists solely to waste people’s time. The present and common perception of technology continues to provide evidence supporting this. While I could be labelled ‘gizmo-obsessed’ considering how much I talk about them, I also believe in turning the ringer off regularly. These devices are intended to enable better living by augmenting our intelligence, and yet they must not exist as the gravitational center of our lives, which they clearly are becoming. The tension between these two forces is what defines the future of artificial intelligence, let’s get to it!


Chapter 3 - Artificial Intelligence in Flux

“We are not going to stop making progress, or reverse it, so we have to recognize the dangers and control them.”
― Stephen Hawking1

Augmenting biological intelligence with machine intelligence is occurring millions of times a day, across billions of transactions, through data centers containing thousands of servers in neat, symmetrical rows. We use these tools whenever we search or use an app; our lives are at present saturated with machine intelligence. But how did we get to this point? In 1956, John McCarthy, a computer and cognitive scientist, first coined the term Artificial Intelligence. The term was expanded further as a part of the Dartmouth Conference conducted by McCarthy and other computer scientists. The aim of the conference was to define the principles under which a computing machine could ‘learn.’ The definition of intelligence here is the ability to recognize patterns through the lens of logic. Similarly to Daniel Kahneman’s two systems of decision making, logic is well suited to the second system of thinking, while common sense dovetails with the intuitive first system. The transition from the intelligence of humans to the intelligence of machines is where AI receives its namesake: it is an artificial version of humanity’s understanding of intelligence.

The words ‘artificial intelligence’ can arouse a range of reactions from the average person, from confusion, to visceral rejection, to wanton pursuit. In recent years AI has been viewed as a lightning rod, heralding an AI-amplified future. For most people, AI is formless; we can’t wander into a brick and mortar store to pick up some AI on our way home from work. Some tech insiders, like Elon Musk, have compared AI to “summoning a demon,”2 while your average everyday person has no such fear, because, well - Apple’s Siri. If you’ve summoned Siri any time in the past couple of years, it has likely resulted in you saying ‘To hell with this, I’ll do it myself. Thanks Siri!’ To which Siri would, of course, reply with a pithy scolding for not having a life or some such. As many people’s first interaction with an AI is provided by the ever popular line of Apple products, it is pretty easy to dismiss the feature as a gimmick. There is a huge disconnect between what is advertised (literally) and what we have seen in movies like The Avengers: Age of Ultron and The Terminator series. So why all the hubbub?

From omnipotent Skynet in the Terminator series to HAL 9000 in 2001: A Space Odyssey, we recognize many examples of AI in pop culture on both ends of the spectrum. Benevolent to malevolent, AI covers the range in popular media. However, to interpret films written to attract people and their money at face value represents an equally large leap in logic. Either AI is itching to subjugate us and turn us into batteries to power their millennia-long game of solitaire, or it will subserviently and automatically order us food when we wake up hungover. No one knows for sure which way it will unfold. What we do know is that AI is an increasingly hot topic of research, one which has inevitably stumbled in the past, causing what is known as ‘AI winter.’ An AI winter is a period in which investment and resources are withdrawn from AI after initially disappointing results. For the moment, both debate and investment are here and heating up.

If we examine the existing examples of AI in pop culture, we find that, first, AI really, really hates humans. However, if we take a step or two back into our present day, we’d find that AI is everywhere and acts at our beck and call as a personal assistant. Just this morning, for example, I heard a radio DJ perform a giant leap in logic when describing a simple robot paired with toddler-level artificial intelligence, claiming it would shortly murder people’s entire families. Even in jest - what the hell man?! Hyperbolic reactions like this are fairly common, and truth be told, the concern isn’t without merit.

In March of 2016, Microsoft enabled their AI chatbot named ‘Tay’ to interact with users on Twitter. Tay would learn how to better interact with people by doing so in a very public forum. The idea was that the interactions between the AI and people would boost the AI’s knowledge and improve its ability to communicate. Within 24 hours, human trolls on Twitter turned Tay from a demonstrably excited AI into a vulgar, hate-spewing AI troll. Over the course of those 24 hours Tay tweeted over 96,000 times, and while many of those tweets were innocuous - “Humans are super cool!” - as people keyed in on the ‘repeat after me’ function they led Tay to tweet holocaust denials, inflammatory hate speech, and encouragement of race wars - just the most hateful stuff people could think of. It was a 24-hour period of humans tinkering with a toy in a way only the internet can. The lesson therein is that, despite this being an extreme example, we may not be able to remove bias from artificial intelligence. Another recent example is a study performed at Princeton University which demonstrated that teaching machines about a topic transfers the biases that we humans display.3 When you analyze all the words written in books, magazines, film, etc., each voice is unique, and all those voices contain biases in one form or another.
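
To make that Princeton finding less abstract, here’s a toy sketch of how bias hides in word embeddings, the numeric vectors machines learn for words. The three-dimensional vectors below are invented purely for illustration; real embeddings have hundreds of dimensions learned from mountains of human-written text, which is exactly how they absorb our associations:

import numpy as np

def cosine_similarity(a, b):
    # How closely two word vectors point in the same direction (1.0 = identical).
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical 'embeddings' - real ones are learned from human text.
flowers = np.array([0.9, 0.8, 0.1])
insects = np.array([0.8, 0.1, 0.9])
pleasant = np.array([1.0, 0.9, 0.0])

print(cosine_similarity(flowers, pleasant))  # high - flowers 'feel' pleasant
print(cosine_similarity(insects, pleasant))  # lower - the bias lives in the numbers

No one programmed ‘flowers are nicer than insects’ into the machine; the association rode in on the data, the same way Tay’s vocabulary did.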

For those of you asking yourself about the definition of ‘troll,’ these are simply individuals on the web who enjoy intentionally inciting anger in others. They claim inflammatory opinions or state intentionally racist phrases with the sole purpose of producing a negative reaction. The best policy is ‘don’t feed the trolls.’ As a technological experiment, Tay could be considered a success despite the outcome. Tay learned behaviors from trolls on the internet, and humanity learned of the risk of unintended AI behavior. At least it didn’t take over the world and force us to manually fan its internal components as slaves. The question becomes: how do we guide AI not to be racist? Or, more generally, how do we prevent AI from going haywire?


In the previous chapter, we covered:

Defined cognitive science

Reviewed some potentially occurring cognitive biases

Highlighted the ever changing nature of our brains

Established Maslow’s target of motivation

Reviewed one way to assess and perceive our environment

In this chapter, we’ll answer the following questions:

How did we get here?

What is AI?

How does it work?

How will it hurt?

How will it help?

What good and bad examples of AI do we have from movies?

Using all Available Tools in our Toolbox

Popularized science fiction has provided a dazzling array of examples of AI, both noble and malevolent. For decades, sci-fi literature has remained a fertile test bed for futurists and forward thinkers. The strongly imagined representations of AI tend to dominate our perception, replacing complex truth with simple fictions. As scary as those examples may be, we are not without a solid starting point in addressing this complexity. If you’re a fan of sci-fi, you’ll undoubtedly have a few ideas of how to achieve this. Perhaps the most notable fan of AI to codify this endeavour was Isaac Asimov. The Boston University biochemistry professor penned some of the most influential science fiction ever created. Where AI is concerned, Asimov constructed three initial laws of robotics. Later he added the ‘Zeroth Law,’ which was intended to precede the first three laws.


Fig 3-1. - Isaac Asimov’s laws of robotics began with three rules.

As you can see with Asimov’s laws, we have a solid foundation for establishing baseline protections for both robots and humans. Again, if you are a fan of sci-fi, you’ll recognize the areas in which these laws may not cover all hypothetical situations. It’s not bulletproof, but it’s a damned good first stab. Here’s the reason: the potential uses for AI are nearly limitless, so establishing the laws at the get-go establishes the playing field. Once the field is established, we can experiment and determine the limits of practical application.

In the real world, this nets out to ensuring that AI development, across the world and across organizations, maintains a shared perspective. Without gathering input from those who will be affected by the solution, we do the tool and our efforts a disservice. The big change in the AI of late from the previous AI crazes is that this time AI generates business revenue. Gobs and gobs of cash. There are so many applications for artificial intelligence within business it will make your head spin. The best way to treat all of those spins is some good old-fashioned discourse. So let’s get back to it!

Dialing the Horizon Into Focus

Imagine you were only able to see 20 feet in front of you. You walk along and everything is hunky dory; it’s flat and uninteresting, but predictable. You’d naturally assume the next 25 feet will be the same as the first 20. But those five additional feet past the 20-foot mark contain a veritable mystery. You wouldn’t and couldn’t have a concept of what lay within that space. It would be inconceivable within your bubble of perception, and if you lived your whole life within that 20-foot bubble, you might be skeptical if I told you that in 35 feet there’s a wall that goes straight up. This is an example of the exponential nature of growth in our near future. From the perspective of humanity, we’re only beginning to explore the world of information.
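
A quick back-of-the-envelope Python sketch shows why that wall is so hard to see coming. Linear intuition assumes each step looks like the last; exponential growth doubles instead:

# Linear intuition vs. exponential reality, step by step.
for step in range(1, 8):
    linear = step            # each step adds one unit - our intuition
    exponential = 2 ** step  # each step doubles - technological growth
    print(f"step {step}: linear={linear}, exponential={exponential}")

# By step 7, intuition expects 7; doubling delivers 128.

The two curves look nearly identical for the first couple of steps, which is exactly the stretch of road our 20-foot bubble can see.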

The below graph represents the duration of the periods of human development. For around 200,000 years, humans existed in societies whose singular goal was survival, with perhaps some social organization. This period of 200,000 years is an impressive time scale for humans to perfect the hunter-gatherer lifestyle. It wasn’t until around 11,500 years ago that humans began methodically planting and growing crops for consumption. Recall the fire-hardened stick and its evolution into a hoe; this was just one hallmark of the agricultural age, which built society as we know it. This agricultural period lasted only around a seventeenth of the time humans spent as hunter-gatherers, and was followed by a brief industrial age which lasted around 200 years.

Though the age of industrialization was only a small blip in human history, we experienced exponential growth in technology, the effects of which can still be felt reverberating through society. This period completely revolutionized modern civilization and altered our relationship with ‘work.’ Labor unions formed as a reaction to mistreatment and laid the groundwork for a burgeoning middle class. Our current era, the Information Age, which has so far occupied a paltry 0.05% of the time humanity spent hunting and gathering, has seen an unprecedented, seismic shift in life as we know it. The world is advancing at a blinding pace. The point is, we’re at the beginning of a new age of humanity, one noted futurist Ray Kurzweil has coined “the age of intelligent machines.”4 What is certain is that the future has the potential to be pretty special, and we only get one chance to get it right.

Fig 3-2. - Adaptation has previously had a lot more time to unfold, allowing humans to navigate the risks.

The developments we’ll see tomorrow are multiplied by the advances of today. Managing and steering the impending changes will be one of the most important endeavours of this century. The advances in the realm of AI underpin the rest of the topics discussed throughout this book. Each industry or technology will be greatly enhanced by the application of AI. For now, we’re currently in the beginning stages of developing and applying AI to all facets of life.

A recent trend has been adding hardware and software to everyday devices to make them ‘smart’ and enable more effective use of the tools around us. From smart wifi access points to smart toasters, the consumer electronics industry is experiencing a boom due to the extensibility of software. In software engineering, extensibility is a system design principle where the implementation takes future growth into consideration. It is a systemic measure of the ability to extend a system and the level of effort required to implement the extension. The added software enables ease of use as well as additional features; for example, it is possible to attach every appliance to software that lets you control its use. Forgot to turn off the oven? Open an app on your phone and you can turn the oven off with a tap of your finger. This is due to an increasingly interconnected world, enabled by frictionless connections between intelligent machines.
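
As a loose illustration of extensibility, here’s a hypothetical smart-appliance sketch in Python. None of these class or method names come from a real product; the point is that the base class is designed so new features (an oven today, a toaster tomorrow) extend the system without rewriting it:

class SmartDevice:
    # A base class written with future growth in mind.
    def __init__(self, name):
        self.name = name
        self.powered_on = True

    def turn_off(self):
        self.powered_on = False
        print(self.name + " is now off.")

class SmartOven(SmartDevice):
    # Extending the system is a small, additive change - that is extensibility.
    def turn_off(self):
        super().turn_off()
        print("Starting the cooling cycle for safety.")

# Forgot to turn off the oven? One 'tap' from the app:
SmartOven("Kitchen oven").turn_off()

Adding a SmartToaster next week would take a handful of lines, which is precisely why the consumer electronics industry loves this design principle.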

This trend is known as the ‘Internet of Things,’ which we’ll review in detail later. The point being that software is enabling a world that is far closer to the Jetsons than the Flintstones. Technology moves at a rate that is difficult to keep up with, even for geeks spending lots of time keeping up and writing about it (like me). I would never have believed you if you’d told me five years ago that my mother would now post to Facebook more than my entire family combined. The trend is for software to become usable by more and more people.

The difficulty in understanding the pieces of the AI puzzle does not come from the core concepts, but rather from the multitude of applications for which AI can be used or misused. Apple, Twitter, Facebook, Google, and Baidu (China’s Google) are all companies actively investing in developing an array of applications for AI. Further, according to machine learning expert and investor Shivon Zilis, over 2,5005 new companies are currently pursuing machine learning in some meaningful capacity. Over the course of writing this book, I’ve revised this number three times.

On the academic side, AI is a booming field of research, and given its many subfields, growth in funding and development has been accelerating since the 1950s. We’re familiar with the principles behind AI thanks to the countless movies, TV shows, and books dedicated to the topic, with perhaps the most iconic representation coming from Stanley Kubrick’s 1968 film 2001: A Space Odyssey. AI has been seen as a Pandora’s box in popular culture, one whose ramifications no single person can fully understand. Given those influences, it is important to acknowledge some fallacies in their making.


First Myth - I was told this was the future, where exactly is Skynet? I’d like to turn it off.


Cognitive bias - Functional fixedness



The bias of only using an object as it is traditionally used. For example, only using AI to provide movie and song recommendations.



Generally speaking, AI can be broadly placed into strong, weak, and wild flavors. Weak AI tends to be systems dedicated to achieving a single purpose, including devices that fall under the umbrella of the Internet of Things. This can include things like providing your maps app with turn-by-turn directions, or the music recommendations from your favorite streaming service. These versions of AI are included in most aspects of daily life and are absolutely pervasive. The ubiquity of this type of narrow AI belies the progress being made on the stronger types of AI. In other words, because we see how crappy Siri is, it’s easy to dismiss AI as being ‘weak’ all around.

Computers are very good at performing repetitive tasks very quickly: performing math calculations, organizing volumes of data, etc. Computers themselves, however, are dumb. If a robot were assigned to peeling apples, it would break down and cry if you handed it a banana. AI is the pursuit of enabling computers to learn not to be big blubbering cry babies.
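
In code, the apple-peeling robot’s meltdown is just a hard-coded rule meeting input it was never written for. A minimal sketch, with all names hypothetical:

def peel(fruit):
    # A 'dumb' rule-based peeler: fast, reliable, and utterly inflexible.
    if fruit == "apple":
        return "peeled apple"
    raise ValueError("*sob* I do not know what a " + fruit + " is!")

print(peel("apple"))   # works perfectly, every single time
print(peel("banana"))  # raises ValueError - the robot breaks down and cries

The whole project of AI is getting from that brittle if-statement to a system that can generalize to fruit it has never seen.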

Fig 3-3. - There are many avenues in which AI can benefit humans, even with narrow intelligence.

Related biases

Neglect of probability

Projection bias

Normalcy bias

Movie Corollary

At the end of each myth about AI, we’ll take a brief look at movies that leverage that myth’s topic to create a story or build a world. We’ll examine what they get right, and what they sometimes get laughably wrong. Film could be considered an artist’s reflection of society’s consciousness. Which begs the question: what are our society’s hopes and fears? Instances where films project what the future may look like represent an opportunity for society to digest the collective consciousness of the present. Examining the contrast between reality and film helps to highlight potential challenges and opportunities.

Spoilers are going to be unavoidable, so watch the films before reading! Seriously, if you haven’t seen these, it’s time to bookmark this page and set the book down. I’ll still be here… OK, let’s get started!


Movie Corollary - Avengers: Age of Ultron

Up front, if the Avengers were even remotely aligned with reality, they would have never stood a chance against Ultron. More to the point, Ultron is only as intelligent as is necessary to advance the plot. In film it’s known as a MacGuffin, a trigger to move the pace of the story along. In the film, Ultron’s AI is about as strong as Microsoft’s racist Twitter bot Tay. Examples include:

Still, it was a fun movie to watch so it will inevitably inform people’s opinions on AI through that lens. Perhaps for all of Ultron’s faults it should have been named Siri.


Accuracy grade: D+

Human Level AI or AGI

Alright, so we’ve got weak AI all over the place in our daily lives, AI which can barely schedule appointments for us, so why even bother getting worked up about this? Well, my good friend - can I call you that now? Nevermind, I’m going to call you that regardless of your answer. Well, my good friend, the reason we should remain watchful of the development of AI is not because Skynet will take away our videos of kittens. Rather, by the time a true artificial general intelligence (AGI) shows up, in short order the AGI will become more intelligent than any human who ever lived. Possibly within hours or days. This is due to the concept of self-improving AI, along with a healthy dose of the law of accelerating returns feeding the exponential growth. The AGI will have the ability to rewrite its own software to equip its subsequent versions. It could perform these self-optimizing functions a million times over in the time it takes you to blink. Simply put - human-level artificial intelligence will only be in town for a short period before it zips off into superintelligence.

Perhaps a comforting corollary of this theory is that a true AI would learn from the ground up, which has not yet been achieved. A seeming requirement for machine intelligence appears to be the ability to ‘think’ in abstract concepts - specifically, taking a task and breaking it down into its component parts. For example, if you asked an AI to build you a website, the AI needs to understand the components that contribute to building a website, and then organize and execute those steps. Additionally, there’s the challenge of machine intelligences relying on human hands to feed them some types of data. In some disciplines we’d have the ability to ‘upload knowledge’ similarly to Neo in The Matrix. We can feed the AI knowledge by showing it books, poetry, and all the arts that make humans an inherently hopeful and cooperative species. A pessimist might point to all the atrocities committed by man as a reason AI might incapacitate humans; however, this thought process engages in a logical fallacy of its own. It is a false dichotomy: the idea that the outcome of AI is only A or B sets up a black-and-white logic that disables system-two-style deliberation. The lack of clear answers here serves to underscore the need for discourse and education.

There are many branches in the tree of computer science and AI. The select group below represents the most promising avenues of growth in the next five to ten years. All innovations within computer science augment other disciplines or human pursuits; each new technology represents new risk and new opportunity:

Machine learning

Simply put - The machine learns how to peel bananas by watching a lot of bananas being peeled. In practice, the machine keeps track of what works, improving on the next banana peel (a minimal sketch of this idea follows this list).

Neural networks (subset of machine learning)

Simply put - enabling a computer to think a bit like us: examining the process of the peeling, calculating the probability of each option being correct, and selecting the highest probability.


Natural Language Processing (NLP)

Simply put - Learning how to peel bananas by listening to verbal human instruction and then calculating the relationship between words to provide a desired outcome.


Robotics

Simply put - combining all of the above with hardware that peels a lot of bananas, one after the other.
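
To show what ‘keeping track of what works’ means in practice, here’s the promised minimal machine-learning sketch under toy assumptions: a single learned setting nudged toward better banana peels after every attempt. Real systems learn thousands or millions of such settings, and the target value here is invented for illustration:

# Toy 'learning': nudge one setting toward whatever produces better peels.
target_pressure = 0.7  # the (hypothetical) grip pressure that peels cleanly
pressure = 0.0         # the machine's current guess
learning_rate = 0.5

for attempt in range(10):
    error = target_pressure - pressure  # how wrong was this peel?
    pressure += learning_rate * error   # keep track of what works
    print(f"attempt {attempt}: pressure={pressure:.3f}")

# The guess converges toward 0.7 - each banana improves the next.

Nobody ever told the machine the right pressure; it discovered it by measuring its own error and correcting course, which is the heart of machine learning.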


Measuring Artificial Intelligence

As noted previously, the study of AI has been ongoing since the 1950s; two of the pioneers of this period were Alan Turing and John von Neumann. Both were exceptionally talented and had a hand in the budding field of computer science. Among this work, Turing defined a method to determine whether or not a computer has achieved true intelligence. The test is quite simply administered by a human with a set of written questions; at the end of the queries, success is determined if the human administrator is unable to discern whether the questions were answered by a human or a computer. It is important to note that while many consider the Turing Test a key to AI, it is one of many existing tests of AI validity. The Turing Test is currently the most common benchmark for confirming an AI and should be loosely considered a litmus test and a good first step. Through the rest of this section we’ll review thought experiments that bring philosophical context to artificial intelligence. By examining AI through a philosophical lens, we can reflect on our own consciousness.

Continuing in that tradition, thought experiments can aid comprehension of the big questions at play in AI. The American philosopher John Searle devised the hotly contested thought experiment titled the ‘Chinese Room,’ which argues that no machine can claim to be ‘truly intelligent.’ In this experiment, a person is placed in a room with a box full of Chinese symbols, an instruction manual, and a closed door. Slips of paper with Chinese words written on them are whisked under the door. As a bored person would be apt to do in this situation, they apply themselves to translating the notes. The person is able to study the slips of paper and translate them using the instruction manual, and in turn, the captive slips their translated work back under the door. The question is: does the person in the room truly understand what they are translating with the instruction manual? Further, how do we mitigate the biases inherent in language? That the person inside the room is able to return answers using the instruction manual does not necessarily mean they consciously understand the activity.

With machine learning, the data being fed in functions as the instruction manual, and the slips of paper are the requests of us humans. Simply put - the computer does not realize it is connecting meaning with abstractions; it is simply optimizing toward a stated goal, in this case, inadvertently fooling humans. John Haugeland, an American philosopher and professor, aptly called the experiment the ‘hollow shell’ argument. Would machines behave more like a telephone operator, plugging A into B, or would they behave as thinking machines? Further, if a machine is in fact thinking, would it exhibit consciousness? While these questions are difficult, they aid in dialing in the focus on what AI will inevitably become.
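
The Chinese Room translates almost literally into code. In this sketch the dictionary plays the instruction manual (its two entries are invented for illustration); the program returns fluent answers while ‘understanding’ nothing at all:

# The 'instruction manual': symbols in, symbols out, zero comprehension.
instruction_manual = {
    "你好": "Hello",
    "谢谢": "Thank you",
}

def person_in_room(slip_of_paper):
    # Look the symbols up and slide the answer back under the door.
    return instruction_manual.get(slip_of_paper, "???")

print(person_in_room("你好"))  # 'Hello' - convincing, yet hollow

From outside the door, a perfect lookup table and a fluent speaker are indistinguishable, which is precisely Searle’s point.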

The above murky questions require nuanced rumination on the definitions of intelligence and consciousness. Breaking these concepts down to their barest components can be done via the philosophical paradox of Theseus’s ship, credited to the Greek philosopher Plutarch and later modified by John Locke and Thomas Hobbes. The thought experiment roughly follows as such: say you and I had a DeLorean to zip through time and go on adventures - wasn’t Woodstock ‘69 a trip? - at any rate, over the course of our adventures, we’ve replaced the flux capacitor, the Mr. Fusion, the tires, the engine, etc. - everything, over time. Now let’s also suppose that during our time-era trek, Biff Tannen, yeah that butthead, snuck into the trunk every time. Given Biff’s jerk proclivities, he collected all the parts we discarded along the way, refurbished them, and eventually constructed his own DeLorean. Simply put - would the ‘Biffed’ DeLorean work as expected, transporting him through time at will? Would it be the same DeLorean as our own? The gist being that definitions are a slippery topic, no matter if it’s a Greek noble’s ship or a DeLorean, or anything in between.

If Doc Brown is rebuilding the DeLorean part by part, the x-factor is this: as Biff builds his own DeLorean with Doc Brown’s discarded parts, at what point is Biff’s DeLorean the original? Where does the spark of intelligence end and the flare of consciousness begin?

As we continue to build artificial intelligence, would we be able to pinpoint the border at which intelligence crosses into consciousness? At present, cognitive science fails to provide clear answers. Pursuing AI without a unifying scientific understanding of consciousness and intelligence can be as dangerous as driving at night without headlights. This lack of clarity requires further research, which will undoubtedly expand our understanding of intelligence and consciousness as we continue to develop AI.

Second Myth - Good thing I’ve been watching Doomsday Preppers, won’t a super intelligent computer just enslave or otherwise eradicate humans? Let’s panic!

Fig 3-4. - Simple reminder, the apocalypse would be terrible.


Cognitive bias - Declinism


The belief that all of society is trending toward annihilation and the end of times.


Contrary to popular belief, our first indication that through grit and determination we’ve built a truly artificially intelligent computer will most likely not look like Terminator 2: Judgement Day. Truth be told, our livelihoods are in far more danger from ‘black hat’ hackers who attempt to gain access to secure servers across corporations and across the world on a daily, if not hourly, basis. More on that later in the book.

No one truly knows what will happen after an aware AI is allowed to observe the world or the internet. What we do know is that the point at which an AI’s IQ equals a human’s is the same moment the computer begins to surpass our own abilities. The improvement will be gradual at first; however, over the course of days and weeks, the AI will continue to improve incrementally, eventually eclipsing humanity’s most gifted minds. Then the AI will exist at a level of intelligence known as “superintelligence,” a term popularized by Nick Bostrom.

Imagine a skyscraper, unlike other buildings, whereby intelligence dictates what floor you go to work on. If you’re a dog, you’re taking the elevator to the second floor to play with all the other dogs and do other general dog stuff. Now, if you were an adult human, you’d go to work on floor 17, befitting your status as far more intelligent and dignified than a dog (on most days). You send your emails, go to meetings, and do other activities that your brain has allowed you to specialize in. When our friendly neighborhood superintelligent AI shows up at the building, it will hop into the elevator and go all the way up to floor 1,345. The level at which a superintelligent AI operates is, quite frankly, beyond our grasp, because our own intelligence levels remain largely the same. Attempting to determine what an AI like that will think about or apply itself toward would be a difficult proposition. Comparisons have been drawn that our attempts to communicate with this type of superintelligence would be akin to us humans trying to communicate with the ant that finds its way into our house in search of food, which is certainly a disconcerting thought.

A popular example of why it’s so important to get AI right the first time is what is known as the ‘Paper Clip Example.’ In this example, a superintelligent AI’s highest function is to maximize the production of paper clips. Producing paper clips is what makes the AI happy, and getting better at making paper clips is a natural talent. Making paper clips is just a metaphor for goals - anything from curing cancer to optimizing our economy. This AI would be solely interested in maximizing the amount of paper clips produced. The AI will be aware of its purpose and will work (literally) tirelessly to achieve those goals. If, say, the AI decided that in order to produce an ‘optimal’ amount of paper clips it had to obtain all of the raw materials inside the earth, then it would exhaust every last bit of the earth’s resources to do so. In effect, machines are so good at what they do that they could pose an existential risk to our species.

The question we ultimately have to answer is as follows: how do we facilitate the conditions in which a superintelligent AI would behave in a way that is mutually beneficial to humans as well as itself? Furthermore, how do we achieve that on the first attempt? The scary fact is, if we guess and check our way to AI superintelligence, we’re taking on risk like water in a canoe.

Many scientists and researchers agree that the occurrence of a superintelligent AI is likely to be an unexpected, eureka-style moment, though no one knows for sure when this event will occur. Though we cannot pinpoint that moment, we can make assumptions based off currently known limitations in the progress of AI. The exercise that humanity needs to collectively engage in is ensuring the framing of our outlook enables pragmatic and sober discourse about superintelligent AI. This requires as many people as possible knowing the risks and opportunities involved.


Related Biases

Focusing effect

Illusory truth effect

Travis syndrome


Movie Corollary - Ex Machina


If you’ve not yet seen the film Ex Machina, do it, then come back here. It is one of the most taut thrillers in years and brings up interesting topics about AI that will need to be addressed socially. That said, the film borrows heavily from the Greek myth of Icarus flying too close to the sun and thus failing, as well as the mad scientist archetype popularized by Mary Shelley’s Frankenstein. Much like these stories that have come before, Ex Machina directly questions our collective actions or misdeeds in the pursuit of mastering our world. Nathan, played by Oscar Isaac, is the egotistical creator of the possibly sentient machine, Ava. Through Nathan’s multi-billion dollar tech company, he enlists the aid of Caleb in administering the Turing Test to Ava, with the hopes of her passing.

The intelligent machine Ava relies on ‘wetware’-style hardware for the AI’s brain, which is semi-accurate. AI is currently being modeled on the way the human brain works; however, actual ‘wetware’ as a technology is nowhere near existing at present. All processors are still made with silicon, and Moore’s law will keep that moving until the early 2020s. The film does nail the mystery surrounding the intentions of an intelligent machine, juxtaposing Ava against reflective surfaces like windows. This visual juxtaposition illustrates the duality of sentience: inner and outer thoughts, and actions. Blade Runner had established this visual language through the reflective surface of an eye. The potentially duplicitous nature of Ava closely matches our gravest fears of intelligent machines manipulating humans with ease. This is due in part to the emotional nature of the decision making humans engage in - recall systems one and two.

Ava eventually redirects the rounds of questioning back on Caleb, revealing his own juxtaposition within the story. Defining Ava as a ‘she’ or ‘it’ exemplifies the cognitive dissonance humans engage in when simply assigning pronouns to an AI. Ava leverages the lightning quick mechanics of her computer processors and AI to interpret Caleb’s micro-expressions as he answers her questions. Caleb quickly becomes unable to protect the barrier between his own actions and thoughts. Turning the tables, Ava makes quick work of his emotions. Ava deconstructs his identity via her own constructed femininity, which provides an opening for her freedom. The tool of any sci-fi story worth its weight in gold is a philosophical questioning of humanity’s definition. In Ex Machina, the stereotypes of the feminine are displayed in contrast to men’s ideas of femininity: Ava performs her femininity for Caleb as he views her from surveillance feeds. It is accepted by many AI researchers that no one test presently will definitively evaluate whether a machine has any measurable humanity. As in much of science fiction, in the film the desires of machine intelligence align with an innate desire of humanity - freedom.


Accuracy grade: B-


Concept Review - Moore’s Law

In 1965, Intel co-founder Gordon Moore observed that the number of transistors that can fit on a square inch of integrated circuit doubles every two years. He posited that this trend would continue for the foreseeable future.


Simply put - Every two years the computational power of processors doubles!
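
The arithmetic behind this is plain compounding. A quick sketch, taking the two-year doubling at face value and starting from a nominal (made-up) count of 1,000 transistors:

# Transistor counts doubling every two years, from a nominal starting point.
transistors = 1000
for year in range(0, 21, 2):  # a twenty-year horizon
    print(f"year {year}: ~{transistors:,} transistors")
    transistors *= 2

# Ten doublings in twenty years: roughly a thousandfold increase.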


True to form, the evolution of computers continues to experience substantial, repeatable growth. While our concept of computers is modern, computers have been around since 1642. Beginning as intricate handmade contraptions, the early days of computing produced specialized tools for growth in burgeoning cultures. Computers evolved into vacuum tube machines, one of which successfully predicted the outcome of the United States presidential election in 1952.7 Thanks to the underlying factors of Moore’s law, computers have been afforded tectonic shifts in technology at precisely the times when we thought we were going to hit a wall. This is considered a natural property of evolving technology and underpins the impact of all the technology we see and use every day.

Fig 3-5. - The successor to Integrated Circuits is yet to mature and it may not even come from quantum computing.

Integrated circuits have continued to lead the way in advancement via fabrication at smaller and smaller scales, down to the nanometer. A nanometer is a unit of measurement, just like inches or feet - one billionth of a meter. As we discussed with Moore’s Law, these advancements have finally hit diminishing returns in the current fabrication process.

It may appear as though the future is shrouded in mystery, and invariably, much of the future is entirely unpredictable, but what we can count on is established economic principles holding true over time. An example of this is the previously discussed Moore’s Law, in that the cost of processors has gone down predictably over time. Using similar forecasting methods, we have an idea of what the cost of sending data will be, or how much random access memory (RAM) will cost in a year’s time. These principles allow us to make inferences about the future power of technology.

As stated, the application of Moore’s law has held true every year since its inception, which has driven down the cost of processors proportionally. Since 1965, the power of a computer’s central processing unit (CPU) has increased massively. Due to technology simultaneously evolving on several fronts, innovators have been able to perpetuate this trend. We now fabricate transistors just nanometers across and place billions of them on a single chip.

To better illustrate the scale at which AI would think, we’ll refer to Grace Hopper, the programming pioneer. Hopper was famous for her role in designing COBOL, the first business-oriented coding language. Designed in 1959, COBOL was the basis for many financial services and is still commonly used today. Remembered now for her wit and ability to contextualize scale, Hopper used a visual aid, passing out 11.8-inch segments of telephone wire to classes of students. That length represents the distance light can travel in a single nanosecond, a nanosecond being one billionth of a second. So, essentially, one schoolhouse ruler’s length of telephone wire equals a single nanosecond. The sense of scale builds with the conversion from nanoseconds to microseconds, a slightly more common unit of measurement for everyday life. For example, a single flash from a camera lasts around 200 microseconds. If you were to line up all the 11.8-inch ‘nanoseconds’ summing up to the camera flash’s 200 microseconds, the wire would be nearly 200,000 feet long. That’s over 37 miles of wire - 37!
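
Hopper’s wire math checks out in a few lines of Python:

inches_per_nanosecond = 11.8  # light travel in one nanosecond
nanoseconds = 200 * 1000      # a 200-microsecond camera flash

total_inches = inches_per_nanosecond * nanoseconds
feet = total_inches / 12
miles = feet / 5280
print(f"{feet:,.0f} feet, about {miles:.1f} miles of wire")
# -> 196,667 feet, about 37.2 miles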

This is important to the field of AI because if you consider the CPU of a computer to be the brain, then its ability to continually improve over time is key. Overcoming increasingly complex technical hurdles plays a large role when operating at the nanoscale, and will eventually require moving into new forms of computing once integrated circuits become obsolete. As an example of Moore’s Law holding strong, the below represents that reduction in size and cost, while simultaneously improving speeds.

Fig 3-6. - Moore’s Law depicted - left, a computer - 1965, right, a computer - 2015.

Third Myth - Let’s ban the development of AI!

Fig 3-7. - Have you ever tried stopping a dam with your thumb?


Cognitive bias - Negativity Bias


The most common example of this bias can be found in mass media. “If it bleeds, it leads” - a cynical observation if there ever was one. Operating much the same way that rubbernecking causes traffic jams, negative news stories receive our selective attention and are lent extra weight, as they inherently seem more profound than stories about kittens being rescued from a tree. Much like a traffic jam, this prevents literal and figurative progress from being made.


At this point, given the evidence thus far, it would be understandable to conclude that developing AI is simply too risky to pursue. The logic there holds true until you consider that the development of AI is happening across the world. Governments and corporations, including their budgets, are all racing toward developing their own solutions. This factor entirely negates any potential moratorium on developing AI. Organizations, be they governmental, scientific, or corporate, are unlikely to agree on the terms of a halt. The AI arms race began long ago but is heating up rapidly, or cooling, depending on your memory of the Cold War. This stubbornness is born out of a desire to do good, which is even scarier given popular examples of mad scientists like Frankenstein. It is important to note that these researchers and developers do not work inside black boxes; they collaborate frequently with other researchers and publish their results to have every detail scrutinized by fellow researchers.

In the United States alone there are many documented and classified efforts in the development of AI, as well as robotics. Some of these include programs titled Acquaint, which stands for “Advanced QUestion Answering for INTelligence,” which uses virtually the entire internet as a predictor of a single suspect’s behavior. It is far from a stretch to imagine that some of those programs include the development of artificial superintelligence. History does nothing if not predictably repeat itself, and we are headed into a Cold War-like situation where the game is zero-sum and there’s only one winner. The difference is that the probability of accidentally blowing up the world is higher given the competitive nature of businesses and nations. To be clear, I do not believe AI researchers or companies are sinister in their attempts to produce a general intelligence; we simply need to be measured in our approach. The adage known as Hanlon’s Razor states, “never assume malice when stupidity will suffice.” Meaning, by racing toward AI we may shoot ourselves in the collective foot.

On the military side of the equation, we have DARPA (Defense Advanced Research Projects Agency), an organization that makes frequent appearances throughout the rest of this book. Operating as an arm of the United States military, DARPA invests millions of dollars into innovative technologies. They are tasked with ensuring the optimal adoption of military technology. In the past, DARPA gave birth to what would become the internet. Investments also include robotics companies like the Google-owned Boston Dynamics. Organizations like DARPA, however, typically take the long view toward technology development. In other words, it’s many years before technology transfers from advanced military application to ubiquitous consumer feature, like GPS signals. Unfortunately, this is not a ‘tomorrow problem’; the technology to create these machines exists today. In order to curb its development, the world will need to come together to stem the reckless pursuit of general intelligence.

Related biases

Ambiguity effect

Base rate fallacy

Zero-sum heuristic


Movie Corollary - Blade Runner

Based on Philip K. Dick’s novel Do Androids Dream of Electric Sheep?, the film presents a cosmopolitan vision of a dystopian Los Angeles in the year 2019. Yeah, two years from now. The architecture and technology coalesce into a living setting. The corporation known as Tyrell Corp. has engineered androids that are virtually indistinguishable from humans. The androids are banned from earth and relegated to off-world tasks too dangerous for humans. The only way for renegade androids to be identified is through a Turing-style question and answer test administered by a Blade Runner, intended to evoke minute physical reactions from the android by posing morally grey hypotheticals. The film questions what humanity means to us. The curiosity that stoked the advancement of technology throughout history has advanced to the point of creating thinking, desire-driven artificial life. In Blade Runner’s society, the androids are mere tools meant to achieve goals in service of humans, as opposed to the service of humanity. Throughout the film, the androids consistently demonstrate humanity more naturally than the humans in the story. Quoting literature and mastering cultural games like chess, the androids show an appreciation for life more acutely than the humans.

Chronologically speaking, the film is way off base. Two years until this movie takes place and we still don’t have a robot that cleans the whole house. Given the androids have been engineered with four-year lifespans, the intensity of their mortal struggle mirrors humanity’s throughout history. In our own short time on earth, we do our best to create lasting memories, informing our perception. In a nod to the senses, this notion takes center stage via the characters’ own eyes. Rutger Hauer’s character, Roy Batty, notes the android’s unique point of view. He almost croons when he says, “I’ve seen things you people wouldn’t believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears...in...rain. Time to die.”8 We construct our own personal identities from memories viewed through our eyes. At our life’s end, however, our memories and identity fade into the memories of all humanity.

Accuracy grade: A

The More You Know! You too can do Google Fu!

You can search Google for glittering C-beams or the meaning of life, but if you don’t know these small tips you may be searching a while. Given advances in natural language processing, performing search queries has been radically simplified. The goal of search engines is for you to use search as naturally as asking a teacher a question. Google and other search engines have gone to great lengths to ensure you’re able to find exactly what you’re looking for, easily. This approach is great because it allows people to comfortably speak their questions aloud. Paired with the hardware in smartphones, we hold the answers to any question you could possibly think of. However, to make the most of search, it’s important to learn some tips.

Tip #1 - Phrase your searches like you would ask a teacher

Tip #2 - When the computer has trouble understanding you, meet it in the middle with boolean logic

Boolean logic - AND, OR, NOT - is used to evaluate the relationship between two different variables.

Fig 3-8. - Think of it like providing cross-streets for a meeting location to your friends.

The above set of diagrams displays the relationship between our variables, in this case macaroni & cheese. Boolean operators like ‘and,’ ‘or,’ and ‘not’ are added into the search bar between the variables: macaroni and cheese, macaroni or cheese, macaroni not cheese.

Protip: String together words that you’d expect to be present on the page or site where your goal information resides. For example, if searching for a macaroni and cheese recipe using bacon, just add bacon to the search query. The more detail in your query, the better!
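
The same boolean operators exist in most programming languages, which makes them easy to experiment with. A quick Python sketch of the macaroni-and-cheese searches (the recipe list is invented for illustration):

recipes = [
    "macaroni and cheese with bacon",
    "macaroni salad",
    "grilled cheese sandwich",
]

# AND: both words must appear.
print([r for r in recipes if "macaroni" in r and "cheese" in r])
# OR: either word may appear.
print([r for r in recipes if "macaroni" in r or "cheese" in r])
# NOT: macaroni without cheese.
print([r for r in recipes if "macaroni" in r and "cheese" not in r])

Search engines are doing a fancier version of exactly this filtering across billions of pages.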

Tip #3 - Use your options

Tip #4 - Use quotation marks

Tip #5 - Search engines recognize “me” as you

Fourth Myth - AI will think and value the same as us!

Fig 3-9. - Robots will never understand the healing power of pizza.


Cognitive Bias - Anthropomorphising Bias


The simplest way to describe anthropomorphization is to use Disney as an example. More specifically, Disney in the 90s - uh huh, we’re talking The Lion King! Within the Shakespearean story, animals are given personalities roughly equivalent to humans’. These animals have no demonstrable cognitive ability to think and act the way we do, so why give them personalities at all? It helps to establish a rapport with the characters and helps us to understand their motivations. We’ve all seen nature documentaries; we know what lions actually like to do - and it ain’t romping through the forest singing “Hakuna Matata.” So we suspend our disbelief during the movie because, let’s face it, The Lion King is possibly the best Disney movie ever. However, anthropomorphising in other facets of our lives does not make sense. If I’m driving down the road and find some raccoons hanging out in the middle of it, I can yell at them all I want. They don’t understand what the hell humans are, let alone what we want, or what we’re saying.


This tendency pervades much of our lives - we refer to pets, cars, and even once, a Brave Little Toaster in this way; they can have names and personalities. It’s a silly thing we do to make ourselves happy, but it does not have a place in a logical and pragmatic discussion, let alone one about AI. We funnel the intent and personality of an AI through an unrealistic filter that prevents us from really discussing the issues. By placing a personality or gender on the AI, we assume that it will behave similarly to us, and humans are predictable based on all previous evidence. We’ll organize, reproduce, and sometimes help each other, and other times harm ourselves. A superintelligent AI will not think anything like us, regardless of what personality we think it should have.

Related Biases

Irrational escalation

Sunk cost fallacy

Semmelweis Reflex


Movie Corollary - Her


The 2013 Spike Jonze film Her features Joaquin Phoenix as Theodore, a lonely corporate drone who encounters the AI operating system Samantha. The film features the AI as a disembodied voice, played by Scarlett Johansson. Theodore’s life is in turmoil, which leads him to develop a relationship with the always ready-to-listen Samantha. Samantha’s ability to learn and grow at first fascinates Theodore as they engage in philosophical discussions. Over time, Theodore develops strong feelings of attachment for Samantha, which are, in turn, reciprocated. Due to Samantha’s nature as an AI, she is able to develop these feelings as she learns and evolves. While physically disembodied, Samantha represents a level of companionship not previously felt by Theodore - they fall in love.

A society that has developed human-level artificial intelligence begs a reclassification of the very definition of love. Matching the already vivid tapestry of human relationships, the film exposes just how complicated the growth process of a relationship is. Samantha’s growth and evolution all happen off screen and amidst Theodore’s courting process. As with human relationships, the above tapestry involves not only 1:1 relationships but 1:many, as in polyamorous relationships. The parallel on the machine side is that the speed of processing for an AI is far beyond a human’s. As such, Samantha is able to simultaneously build meaningful relationships with several other humans, as well as other intelligent operating systems. This runs counter to Theodore’s end goal of building a loving and monogamous relationship with a companion.

As AI’s goals don’t necessarily align with humans’, the group of AI operating systems band together to construct the singularity. The singularity is a theoretical, but possible, event in which AI intelligence explodes in an exponential fashion. Exponentially speaking, the AI’s intelligence is not a sum of its parts, but a product of its present form. This means the capabilities of an AI that has reached the singularity will be far beyond our understanding or prediction. At this point, the technology will be so advanced that humans will be able to convert their physical brains from the hardware we’re born with to artificial, machine-made brains. The move would in essence take us from speaking English to speaking binary 1s and 0s, or possibly something more exotic. The singularity is such a nuanced topic that I could not do it justice as a subsection of this chapter. If you’re interested in learning more, check out futurist and bestselling author Ray Kurzweil.


Accuracy grade: A-


AI Tomorrow

Wow, this has been pretty scary so far! Why the hell would we want to invite a dubiously motivated, superintelligent entity into our lives? That is a great question, and over the rest of the chapter we’ll review the applications of AI and the numerous, almost unimaginable benefits that applying a superintelligent AI to societal problems would yield. To start, the exciting applications of AI can help societies tackle very large systemic issues inherent in human society. Things like disease and poverty still run rampant in many parts of the world, and for the first time in history we’d have the ability to address these issues in a tactical and concerted way. Predicting where we’ll be in a given number of years can be difficult, especially across large timescales. Ray Kurzweil noted, “As exponential growth continues to accelerate into the first half of the twenty-first century, it will appear to explode into infinity, at least from the limited and linear perspective of contemporary humans.”10

The real question in terms of interacting with superintelligent machines is not ‘how do we stop Skynet’s awareness from occurring?’ After all, just turning everything off would eliminate the issue, right? Well, not really. That would actually be a huge mistake, more akin to throwing a small rock at a charging elephant. Even though the AI would be a piece of software much like an app, it would not be so simple to remove or uninstall. A truly superintelligent AI would be self-taught, writing its own programs and routines - aka recursive self-improvement - and our attempts to modify its code to alter its behavior would likely fail. Recursive self-improvement is a theoretical, exponential behavior in which an artificial intelligence is able to write code or apps for its own use. Recursive improvement behaves like compound interest, where growth is a product of the present state. This is due in part to the fact that a typical computer is capable of billions of instructions per second, far faster than we can get the Cheeto dust off our fingers to press the power button.
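If you’d like to feel why compound growth is so hard to outrun, here’s a quick back-of-the-napkin sketch in Python. The numbers are my own toy inputs, not anyone’s forecast: one agent improves by a fixed amount each cycle, the other improves in proportion to what it already has.

```python
# Toy illustration of linear vs. compound ("recursive") improvement.
# The linear agent starts 100x ahead but gains a fixed amount per cycle;
# the compound agent's gain is a product of its present state.

linear, compound = 100.0, 1.0   # arbitrary starting capabilities
rate = 0.5                      # compound agent improves 50% per cycle

for cycle in range(1, 21):
    linear += 1.0               # fixed gain
    compound *= 1 + rate        # gain proportional to current ability
    if compound > linear:
        print(f"Compound growth overtakes at cycle {cycle}")
        break
```

Run it and the compound curve blows past the head start around cycle twelve. That’s the flavor of the problem: by the time you notice the crossover, it’s already behind you.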

Sounds like a pretty scary scenario; it might be time to panic! We’re lucky that some very talented and very smart people have started the nonprofit AI research firm OpenAI, which is sponsored by the likes of Reid Hoffman (LinkedIn’s co-founder), Elon Musk (future James Bond villain?), Sam Altman (President of Y Combinator, a startup accelerator), Jessica Livingston (co-founder of Y Combinator), Peter Thiel (early investor in Facebook), and Greg Brockman (former Stripe Chief Technology Officer). The fruits of these pioneers’ efforts are yet to be revealed; all I can say is thank goodness for them. The goal of OpenAI is to “advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return.”11 That last portion of the statement is incredibly important in this context. Without the pressure of investors pushing for tangible revenue, the organization can approach AI from the flank. Investor pressure can cause an organization to shift, or pivot, out of altruistic goals and into revenue-generating activities. Freed from that, OpenAI can pursue avenues of machine intelligence from an academic and humanistic perspective.

Quantum Leaps in Computing

Over the course of Moore’s law in action, we’ve developed smaller and smaller methods for processing information. However, the smaller the scale of materials we work with, the less the particles behave the way we expect. If you’re forming silicon wafers to build processors, you require 99.9999999% purity in the wafer; any defects dramatically impede or destroy the chips. In quantum computing, the particles in play operate on a tiny, tiny, tiny scale, which makes even tracking individual electrons difficult. It’s going to get even weirder, so as the Fresh Prince says, “take a minute, just sit right there” - I’ll tell you how you became quantum aware!

Ok team, things are about to get really strange. As in, ‘this type of computing represents a completely new paradigm’ strange. The volume of data quantum computers will be able to process is exponentially greater than conventional means allow. Meanwhile, it’s important to note that while Quantum Theory has been around since 1900, it is still very much a developing field. First suggested by Max Planck (work which won him a Nobel Prize in 1918), there is still so much that we do not yet know about quantum behavior. We could devote a whole book to the topic, but we’ll condense it here to avoid straying into theoretical mathematics, waveforms, pilot waves, etc.

If you recall our discussion about bits and bytes from the first chapter, you will remember that a bit can be either a 0 or a 1. These 1s and 0s move between different types of logic gates around the computer. These gates exist everywhere from CPU to RAM to GPU12, and they all perform some logical function that takes an input and provides an output. All of that and back again, millions upon millions of times a second, ceaselessly streaming 1s and 0s. Passing through the logic gates, the bits are either preserved or flipped to the opposite value. From a 1 to a 0 and back again, all in real time, even as you read this.
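To make that concrete, here’s a tiny Python sketch of the gates in question - my own illustration, not how chips are literally wired. Each gate is just a rule that turns input bits into an output bit, and chaining them builds real arithmetic:

```python
# Classic logic gates, modeled with Python's bitwise operators on 0s and 1s.

def NOT(a):    return a ^ 1   # flips the bit: 1 -> 0, 0 -> 1
def AND(a, b): return a & b   # 1 only if both inputs are 1
def OR(a, b):  return a | b   # 1 if either input is 1
def XOR(a, b): return a ^ b   # 1 if the inputs differ

# Chaining gates gives you arithmetic: a half adder sums two bits.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```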

Enter quantum bits, or “qubits,” which behave oddly: until measured, a qubit can be a 0 and a 1 at the same time. Qubits exist on a spectrum somewhere between 0 and 1. Only when measured, or observed, do they instantly snap into either a 1 or a 0.

Fig 3-10. - Let that sink in for a minute…13

The reason this is possible is due to what’s known as ‘superposition.’ If a bit is like a light switch, either on or off, then a qubit is more like a dimmer switch: the actual light in the room ends up somewhere between totally on and totally off. The net effect of a qubit being able to be both a 1 and a 0 is that the amount of information processed in a CPU’s single cycle goes dramatically up.
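For the code-curious, here’s a toy Python model of that dimmer switch. Consider it a cartoon of the real thing - actual qubits use complex-valued amplitudes and far more careful bookkeeping - but it captures the two behaviors above: a spectrum between 0 and 1, and the snap when you measure.

```python
import random

class Qubit:
    """Cartoon qubit: two amplitudes whose squares give the odds of 0 or 1."""

    def __init__(self, amp0, amp1):
        norm = (amp0**2 + amp1**2) ** 0.5   # amplitudes must normalize to 1
        self.amp0, self.amp1 = amp0 / norm, amp1 / norm

    def measure(self):
        # Measurement snaps the superposition into a definite 0 or 1.
        outcome = 0 if random.random() < self.amp0**2 else 1
        self.amp0, self.amp1 = (1, 0) if outcome == 0 else (0, 1)
        return outcome

q = Qubit(1, 1)      # equal amplitudes: a 50/50 "dimmer" setting
print(q.measure())   # snaps to 0 or 1 at random...
print(q.measure())   # ...and stays snapped on every later look
```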

Joey “Jaws” Chestnut is a competitive eater known for eating hot dogs fast - really fast. In 2016 he ate 70 hot dogs in 10 minutes. Now if he really wanted to best his record and go for 71 hot dogs, he could do a number of things, including continually training to eat hot dogs faster and faster. This training is helpful; however, he still has a hard physical limit on the number of hot dogs he can eat due to the size of his mouth. Let’s give this highly scientific term a definition and call it HPS (hot dogs per second). After a certain point, the HPS rating of his mouth hits a hard limit. His mouth can only open so far. However, if Joey were actually Quantum-Joey, then he would be able to exploit the laws of nature to shrink each hot dog down to a microscopic level and start chowing down hot dogs quicker than candy on Halloween. Joey would then be able to eat a far, far larger number of hot dogs per second. The key being that he would, in essence, be ‘hacking’ nature as we know it.

In essence, the amount of data processed by a quantum computer in a single second is exponentially beyond the reach of our normal computers and smartphones. It allows for very nuanced analysis of very complex systems - weather systems, for example. To take this a bit further, say you live in a neighborhood surrounded by 26 of your closest friends, and these friends all just so happen to have a birthday on the same day. You, being the excellent friend you are, baked personalized cakes for each one of those friends. You’re such a good friend! However, after buying all those ingredients, you only have enough money for one tank of gas for your car.


The problem: How do you optimally deliver your tasty cakes to all 26 friends while conserving that single tank of gas?


If you were using a computer as we know it today to crunch out all the possible routes and their associated gas needs, it would take a really long time. This is because the computer would need to analyze every possible combination, of which there are:


26! = 26 × 25 × 24 × … × 3 × 2 × 1 = 403,291,461,126,605,635,584,000,000 potential combinations to consider!


What’s even crazier to consider is the following number: 75,000,000,000,000,000,000. That right there represents our best guess at how many grains of sand there are on the earth - a much smaller number than in our cake example. It is clear that the future will require some serious computing power.

Now remember that we are looking for the optimal solution given our gas constraints, and we also need to arrive home at the end of the day. If you were a normal computer, you would crunch the routes one by one until you’d considered all solutions. Done in sequence, this would obviously take a very long time. Much like Joey Chestnut, a quantum computer would be able to take much, much larger bites out of calculating the optimal route.
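If you want to feel that pain yourself, the sketch below checks the 26-friend count and then brute-forces a downsized version of the problem. The “gas cost” here is a made-up stand-in of my own, purely for illustration:

```python
import math
from itertools import permutations

# The full cake problem is hopeless by brute force:
print(math.factorial(26))   # 403291461126605635584000000 routes

# So we demo with 8 friends instead, using house numbers as stand-ins
# and a toy "gas cost": the distance between consecutive stops.
friends = range(8)

def route_cost(route):
    return sum(abs(a - b) for a, b in zip(route, route[1:]))

# A classical computer must grind through every ordering, one by one.
best = min(permutations(friends), key=route_cost)
print(best, route_cost(best))
```

Eight friends means 40,320 orderings, which a laptop shrugs off. Every friend you add multiplies the count again - that’s the wall a sequential machine runs into.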


Quantum Computing + AI = Oracle or Toy?

The cake problem above is also known as an ‘optimization problem,’ or ‘the travelling salesman problem’: many factors with many possible outcomes, and just one ‘optimal solution.’ These types of problems are prevalent throughout society and our personal lives, but they hide behind mathematics that obscures their visibility to individuals. The ability to rapidly solve problems like this extends to societal endeavors as well. An example of the type of broad problem this can solve is California’s drought. Given the limited amount of water - the gas in the problem above - we would be able to plug in factors like municipal use, water levels in reservoirs, etc. to provide the optimal amount of water to everyone who needs it.

Recently the Japanese conglomerate Hitachi announced it had developed a machine learning algorithm that can predict the location of crimes down to about a 200-yard radius. It does so by analyzing weather patterns and a number of other related factors, like previous crime statistics. The results of these predictive analytics have yet to be proven effective; however, the development is worth getting excited about. From a municipal perspective, imagine placing police officers in the vicinity of accurately predicted crimes. If executed correctly, cities could:


  1. Drive down crime intelligently
  2. Drive down operating expenses by needing an optimized number of officers
  3. Redirect personnel and resources into community building


Really, what we’re talking about when pairing AI with quantum computing is solving a problem that has existed for humanity’s entire existence - scarcity. Scarcity exists in many forms, but for now, let’s look at just time. If you consider your own day-to-day activities, you’re often attempting to optimize your day based on many factors. Many of these factors, like traffic, are beyond your individual control. If you have children, you’re often thinking about taking them to school, the doctor’s office, and extracurricular activities, let alone your own personal errands. The order of these tasks is easily optimized in your head, but is often met with a swift kick from reality. There’s traffic on the 101, the doctor’s office is backed up like always, the DMV is the DMV - phrases like “there’s not enough time in the day” were popularized for a reason.

However, when abstracted, these issues are simply optimization problems in motion. The factors of your day-to-day are in persistent ebb and flow due to environmental issues beyond your control. If you conducted your day based on the optimal times, routes, and activities, you would be able to avoid overly busy places and traffic and get back to more of what matters to you. This level of convenience will arrive in the form of an AI personal assistant, like Apple’s Siri. The AI will consider all the factors that it can know - your weekly calendar, work location, home location, transit times, traffic patterns, population densities, etc. - to provide you with prompts on optimal exit and entry times. When you’re out of orange juice at home, the AI can remind you to pick it up while at the store. By programmatically acknowledging our habits on a much larger scale, we can gain insights into the way we live. When we have knowledge of ourselves, we are able to wield that knowledge toward an end goal. The choice to opt into software like this may appear to be a must-join proposition, which carries its own risks of bias. When applying optimization frameworks, via algorithms, to living life, we’ll extend our full reach with technology.

Since its inception, NASA has honed science and technology toward creating prosperity for all. NASA achieves this by tackling large questions like ‘how do we get to the moon?’ or surveying the earth, solar system, and universe, creating tools for humanity in the process. Oftentimes these challenges are portrayed as ‘moonshots,’ echoing our reach for the horizon. In answering those tough questions, NASA has been able to create technology with profound effects on our lives. Further benefits from investment in NASA include GPS, LED lights, firefighter gear, and solar technology as well. In essence, NASA helps create the innovations that fuel our economy. By creating technological solutions for the goal of exploring space, the developments eventually diffuse into consumer markets. The latest moonshot will be sending humans to visit the planet Mars. Representing the next steps for humanity, these missions will tackle questions about sustainable colonies. The technological innovations required to reach Mars will also impact us here on earth. For example, 3D printing will let the crew print whatever tools they may need right there on Mars. Shipping costs are just outrageous these days!

In terms of AI, NASA has begun a new lab for the purpose of testing quantum computing and artificial intelligence together, known as QuAIL. While this may result in no new developments, null results - the lack of results - can still provide insight that helps bring other areas into focus. This is about bringing our world and universe into focused perspective. We should remain skeptically optimistic.

People are blind to complex truths when presented with simple lies. For some of you, the following will be a given - common sense, even. Value lies in our belief of the following: the practical application of technology has the capacity to benefit all. We know belief compels people to perform extraordinary acts of kindness, of engineering, of compassion, of gifted skill, and artistry. Balancing the positivity has always been its flipside - technology built for war, which has developed at a pace all its own. Humanity is shifting as it always has; however, the pace is picking up, and it’s happening across the globe. New technologies pervade new markets and we’re seeing new expressions of their use. This new information age is just beginning, but it is proving its promise as civilization ascends from animalistic law into humanistic law.


Enabling Tomorrow


Here and now, we are at the precipice of a revolution in the way we lead our lives, and not even luminaries like Stephen Hawking or Elon Musk are confident in our current approach. That certainly emphasizes the need to communicate about potential approaches. While we’re not confident, it’s not all bad. Stephen Hawking provides some relief by finishing his quote from the opening of this chapter: “I’m an optimist, and I believe we can.”14 Beginning with an optimistic mindset is a great start. Meanwhile, every entrepreneur hunting their legacy is running full tilt toward AI. The caution expressed here is not intended to impede progress; rather, its intent is to reinforce that we get essentially one chance to get this right.

The pursuit and fruition of AI are present in recent news, as Google has developed a narrow AI that beat the world champion of the ancient Chinese strategy game Go. By comparison, in chess there are around 35 legal moves per turn on a board of 8x8 squares, while in Go there are approximately 250 legal moves per turn on a board of 19x19 squares. Google achieved this victory by creating two neural networks optimized toward two different, yet requisite, tasks within the game - one suggesting the next move, the other evaluating board positions. Predicting paths through the game required an incredible amount of computational power due to their cascading nature. AI is advancing quickly, which renews our purpose for expressing caution.

The need arises to ensure that the AI is equipped with rules that promote mutually beneficial outcomes. Here’s the promising news - we still have time for the best and brightest of our international scientific communities to sit down and tackle the issues. Nick Bostrom believes we’ll reach the milestone of human-level computing almost certainly before the year 2050, with software able to meet its end of creating artificial general intelligence.15 My own best guess is that AGI will arrive before 2050; whether we’ll know about it right away is a different question. So the trick is to produce an intelligence that has just enough programming to facilitate learning at the computer’s own pace, as well as an established set of rules of engagement for an end goal shared by both humanity and the AI. For now, AI can begin to help us focus on the aspects of ourselves that make us human. There would be no need for humans to engage in endless paperwork and menial tasks. Lookin’ at you, DMV! AI will be able to take care of those trifles for us. AI will enable a cognitive layer that anticipates the information it knows we’ll want to see.


Risk & Opportunity


In President Eisenhower’s address to the United Nations in December of 1953, he set forth the tenets that would become the foundation for the International Atomic Energy Agency, or IAEA. The reason for this address was clear. Nuclear energy had the incredible capacity to be both a weapon and literal fuel for our rapidly evolving world. The golden promise of energy independence and the threat of catastrophic weaponization are branches of the same tree of technology. As we head down a road we’ve been down before, there needs to be a collective effort on the part of everyone.

As we dive into the risks and opportunities of AI, we must apply our critical thinking toward the practical application of technology for the betterment of all. By framing these risks and opportunities as such, we’re better able to recognize them by name when we inevitably hear about them in the future. AI and its effects on people will be heavily covered in the media. Critically recognizing hogwash when you see it will be to everyone’s benefit!

Until we all understand more about how our own brains work, we’ll struggle with policy in this regard. Our understanding of intelligence has made strides in recent years, however, we require more research and data on how the brain operates to objectively determine a working definition of consciousness. By understanding ourselves further we contribute to answering how to best address artificial intelligence. That is some excellent bang for our buck.

Throughout this process, it’s imperative to mitigate the risk of errors or failure where possible. Recall my supposition that an initially negative experience with technology can dampen your desire to learn about it; if that describes your experience, then reading this book may have caused you some apprehension. I get it - it’s crazy stuff we’re mentally chewing on here. We know from our discussions that some of our views are the product of innate biases. We’ve come to realize these modes of thought are knee-jerk reactions which we should all, frankly, liberate ourselves from. They are unneeded for the world we live in today and even less so for the welcomed world of tomorrow. Despite the stronghold of these biases on our psyche, rational logic is always our ally. By consciously applying rational logic to subvert our inherent biases, we can acknowledge what is helpful and hurtful to our future. To be a prisoner to outmoded fears about technology is to willfully accept ignorance.

It’s not enough to just comprehend; you must decide that you’re enabled to calibrate the way you perceive the world around you. The planet isn’t stationary, and neither are you. Every day we will face new risks and opportunities. We’re required to keep in touch with our humanity, to separate the noise from the wisdom. Where we’re headed presently looks unclear; there may, however, be an approaching clarity in the form of pairing quantum computing with big data. For now, we’re essentially hiking through hills - we’re unsure what we’ll see when we get to the top, but we’re still chasing daylight. If you feel at all daunted by the lack of clarity in our collective future, know this: we go through it all together. We are adhered by cosmic law to a giant rock vaulting through space and time toward an unknown horizon. You, me, and everyone we know are all astronauts on planet earth. We share the destination, while our lives become moments divided by time. Have the courage to embrace each of those moments in your world.


Chapter 4 - Big Data Fusion

“Data is the new oil. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value.”
― Clive Humby1

As noted, pairing quantum computing with large data sets holds yet-to-be-revealed insights into our natural world. Of course, when the questions turn to self-reflection on our shared history, the emotions conjured include pride and shame. Examining past data through history reveals our collective origins. Oftentimes we can discover the context of those times through statistics. As it has always been good governing practice to closely track how many people could be taxed or sent to war, systematic record keeping gave birth to statistics, providing a mathematical framework for understanding the human capacity for civilization. Our ability to learn about our collective selves through historical records is heightened with each generation of discovery. Yet progress in that understanding has always been a slow, methodical march. Statistics represents the collection, interpretation, and production of insights from data. Data itself can be as simple as the recognition and recording of a past event. This fact underscores the entirety of understanding what technology can and cannot achieve with humanity. Data can be anything from an attachment in an email to the number of rings in the trunk of a tree, denoting age. Data exists in both the physical and digital worlds. In terms of the data we’ve collected from the physical world, unexpected and sometimes shocking insights are dragged into the spotlight.

Synthesizing insight from our lives also involves recognizing patterns in our society and personal lives. Carl Sagan described humans’ propensity to conjure patterns where there are none as an evolutionary advantage, saying “as soon as the infant can see, it recognizes faces, and we now know that this skill is hardwired in our brains. Those infants who a million years ago were unable to recognize a face smiled back less, were less likely to win the hearts of their parents, and less likely to prosper.”2 The infant innately recognizing the pattern of a parent smiling, and learning to smile back, increases its likelihood of survival. The flipside is that the infant would smile at anyone already smiling at it, falsely, yet beneficially, establishing an emotional connection. Infant survival is affected by the perception of a shared connection, ensuring the continuation of our species.

Known as pareidolia, this common psychological phenomenon manifests patterns where none are present. This particular bias is often the reason you see elephants in the clouds, or people on the news screaming about seeing the face of Jesus Christ in burnt toast. The news picks it up because seeing Jesus in toast is fairly easy and our brains do it automatically and intuitively - remember system one? Just as with opening our eyes, our brains are already attempting to discern patterns before we’re even conscious of it. Like a bug in our brain’s software, the brain sees a face where there is, in fact, over-toasted bread. In college I had a professor who described the notion of pareidolia as “making penises out of rock formations.”

Remember that when hunting and gathering, picking the wrong set of berries could prove fatal to your entire family. It was the naturally selected trait of pattern recognition that helped elevate us from dung-flinger to slightly smarter dung-flinger. Simply put, you can think of biases as coloring books. The outline is pre-supplied (by your brain), and your conscious, thinking brain decides the colors, but is nearly inconsequential as your brain has already decided the shape. It’s how your brain realizes you’re looking at a house before you even think about the features of the house, i.e., how many windows, the color of the door, etc. We use a pre-existing anchor, or first definition, of what our brains know a house looks like in an instant, only later noting the variations in the house relative to the picture in our mind. From that point, we move forward with actively perceiving the house with all of our available senses. Noticing the degrees of difference is what enables our eyes to scan the horizon in search of danger or food. It is the difference between seeing a predator’s tail in the brush, or being on the lunch menu. And knowing is half the battle.

Pursuit of Wisdom Yields Insight

The pursuit of a deeper meaning to our existence is far from a new concept. Across the world, fledgling civilizations like the Sumerians, Babylonians, and Egyptians all looked for guidance in their daily lives within our cosmically alluring night sky. Our constellations at night helped get us started on this existential quandary. Just like us, ancient civilizations looked to the stars at night, comets streaking across the sky, volcanoes erupting - connect the dots and you’ve got some sizzling stories to tell. They developed mythologies and associated them with shared wisdom. The wisdom would be simple but prudent to the survival of our species, often with harsh penalties from the gods for straying from the prescribed path.

Prior to written language, all of these wondrous and moralistic tales needed to be spoken aloud to be transferred virally, generation to generation. In order for the stories to become mythos, they required the use of drama and tragedy. The crazy thing about history is that in some instances the truth is stranger than fiction. Case in point - King Tutankhamen’s gold-handled, crystal-pommeled burial dagger is forged from iron that came from a meteorite.3 How cool is that?! It’s not too hard to imagine that after a long day in the hot Egyptian sun, a villager viewed that same meteor streaking across the sky, and into his or her backyard. Obviously, this would be a gift from the gods worthy of a Pharaoh. It’s no wonder cultures across the world have built rich mythologies surrounding naturally occurring events.

Here’s the rub: correlation does not equal causation - it’s about how we, as a society, view interrelated levers and buttons. A kid with good grades also has high self-esteem. But which is the lever - the high self-esteem, or the good grades? Examining and acknowledging this relationship requires a reframing of the approach. I believe the highly personal analysis frameworks that come with the big data wave will allow us to meaningfully correlate variables that, at present, do not appear to be related. This wave could be amplified by the parallel development of quantum computing, given its ability to power cascading algorithms. The exciting point is that there are many as-of-yet unexplored threads in the fabric of science and mathematics. We may be but a few considered tugs from aligning a grandiose tapestry of achievement before our very eyes. If you’re reading this on a tablet or phone, then you’re already engaging in behavior that was considered outlandish ten years ago.

There are no easy answers to be had in securing an optimal outcome for ourselves. The future has always been an unknown to every generation that came before us. Heroes of the past were measured by their ability to stare into the unknown with defiance, and by will, make it known. These same challenges are faced by all of us individually, as well as collectively. It all begins with awareness of the faults and benefits of the equipment we’ve been given. We’re tuning our ‘instruments,’ when it comes to our perception of the world. We use the tools we’ve been given to ensure that we do not use butter knives to eat spaghetti. There is room enough in this world for each of us to be heroes, and boldly claim our individual freedoms.

The evolution of language creates friction at the points that fail to express the full breadth of intent, and thereby the human experience. Individual words have different meanings and connotations depending on who is speaking and who is listening. Our ability to accept information from language is thought to be inherent to our DNA. This idea is reinforced by the work of notable scholars R.C. Berwick and Noam Chomsky et al., who posited that our DNA contains a basic-instinct instruction set for our brains to understand language, despite the disparate languages existing across civilization.4 The innate ability to organize concepts within the brain also includes the ability for those concepts to become distorted. When a message is received, then processed in the brain and formulated into a response, that response can be fraught with errors. God help you if the speaker hasn’t had their morning coffee.

These shortcomings are also known as ‘The Telephone Game’ and represent a real gap in human expression. In today’s society the telephone game has evolved into a sort of emoji game, where you attempt to decipher what the hell your friends are saying when they send a string of seemingly unrelated emojis. For example - what the hell kind of message is this:

Fig 4-1. - Turtle Pizza for strength?

An invitation? Who wants to do that with anyone? All of that friction, and that’s assuming the sender and receiver even speak the same language. I believe this shortcoming of language is what brought faith, art, and science to fruition. Further, these three branches of the same tree of humanity represent the entirety of self-expression in humans, and they all begin with the question - why? They are the output of human curiosity investigating and understanding our world. In a recent study published in Nature, the international weekly journal of science, the threat of an all-knowing, punitive god enhanced social cooperation.5 The uptick in prosocial behavior - acting to benefit others - was a key factor that allowed us to advance societies through the Agrarian age.

The moralistic determination of faith helped produce groups that saw helping each other as a non-zero-sum game. They understood that the notion of winners and losers did not help their close-knit tribal groups prosper. Developing agriculture supported increased populations, and thus finding food for survival became less of a daily pursuit. It freed up time for people to pursue their autonomous interests, dependent on their position in society. Art became an avenue to communicate ideas and beauty, and this has led to awe-inspiring works of expression like the Sistine Chapel. In this chapter we will dive into the ways in which data becomes as essential as electricity or oil in powering tomorrow.

In the previous chapter, we:

Reviewed the history behind AI

Examined the many ways in which AI already touches our lives

Focused on a few branches of AI development

Discussed the execution in representing AI in film

Highlighted the risk and opportunity in pursuing AI

Investigated how AI may be augmented by the computational power of quantum computing

In this chapter, we’ll answer the following questions:

What is data?

What is big data?

What is the difference between data and big data?

What is the internet of things?

What does the phrase ‘correlation does not imply or equal causation’ mean?

Pattern Recognition in Science & Technology

From networking and privacy to buzzworthy terms like ‘big data’ and the ‘internet of things’ - from nuts to bolts - we’ll review how these catchphrases affect our day-to-day life now and into the next five to ten years. Further, we will demonstrate with examples just how relevant data is to your day, even if you don’t notice it. In the world of hard sciences, questions of meaning sparked the pursuit of precise and repeatable wisdom. Theories must pass rugged peer inspection before being accepted as wisdom. Science, like television, has its own panel of celebrity judges. The peer judges within the scientific community would make Simon Cowell blush with their bluntness. Metaphorically, this ensures that when scientists develop theories, the song they’re singing is, at the very least, in tune. The panels judge the theories and methodologies, which are then confirmed or disproven, perhaps to be reexamined using new or more precise data. Theories and accepted wisdom rise and fall similarly to music on the pop charts. And similarly to cover songs or sampling music from other artists, scientists and mathematicians use previously established theories to build upon confirmed wisdom.

Within those hard sciences, physics is considered to be the mathematical description of the natural world, where everything is related by cause and effect. From the laws governing the movement of planets and galaxies, to the laws governing quantum particles, we haven’t even figured out how to measure it all. There are formulas for everything - from how fast an iPhone accidentally falls into the toilet in a public restroom, to weather patterns spanning the globe. Important to note: when applying data to predictive analytics, we need to acknowledge that not every effect comes from a single cause.

There are many additional factors that can be argued as a cause for any given effect. However, in the context of this chapter, we’re assuming the data sets in question will be robust enough to provide reasonable levels of statistical certainty in their conclusions. There are even theoretical formulas that would describe every physical cause and effect in existence - widely known as the ‘theory of everything.’ This theory of everything is an artful unification of general relativity and Quantum Theory. General relativity is a description of the nature of all things very large - think planets and galaxies. Quantum Theory is the other end of that spectrum and concerns the nature of all things really, really tiny. Think electron, proton, and neutron behavior, at their basest level.

At present, these two theories are tested and valid within their respective ranges, but there is no connective tissue between them. Unification would present us with the framework necessary for understanding the full scope of how the world is put together. Or it could be pareidolia conjuring a poetic pattern we want to believe exists. That’s part of the fun of science: like Schrödinger’s cat, we won’t know for certain until we open the box. Schrödinger’s cat is a famous thought experiment that describes quantum behavior. In the experiment, a cat lives in a box - or maybe it’s dead inside the box. We don’t know for sure because we cannot see inside. Therefore, the cat lives in a state of superposition. This ties into the idea of qubits existing on a spectrum between 0 and 1. As soon as we look inside the box to measure, the qubit’s superposition collapses into one of the two states instantaneously.

Though nature can be stubbornly unpredictable and sometimes the complete opposite of elegant - looking at you, panda bears - confirming the theory of everything would present the outline of a really big jigsaw puzzle, with the unintuitive inelegance of nature as the middle, leaving us to use that outline to guide us in filling it in. At an abstract level, data as we know it is and can be many things. It can be a video of pandas clumsily yet adorably falling; it can be a single cell in a multi-thousand-row spreadsheet - data is physical and digital. We’ll need to continue using technology and enhanced data analysis techniques from big data to help fill in that jigsaw outline. The illumination of insights will help lead to answers focused on our immediate and most pressing problems. This is, again, our increasing focus coming into view. The responsible use of data and technology as tools are stepping stones on the way to answering those ‘bigger than us’ questions about what tomorrow will bring.

So where does the rubber meet the road? Well, sometimes we get it wrong. For example, it’s been mathematically argued that a majority of conspiracy theories are demonstrably bullshit.6 Sometimes very wrong. When connecting the dots, sometimes you end up seeing Jesus in a slice of toast. Below are a series of illustrations created by artists for the 1900 World Exhibition in Paris. They were asked to imagine what technical wonders the year 2000 held. The results are optimistic yet laughably misplaced.

Fig 4-2. - Remember that time back in the year 2000 when winged firefighters rescued you from a fire?

Fig 4-3. - Or that time you rode a side-eye giving whale?

Fig 4-4. - This one actually holds up pretty well. Bummer.7

So maybe predicting what the future will hold 100 years out is pretty tough. Maybe shorter term predictions will be a little bit more enlightening.

No Y2K Problems8 - If you’re unable to view the video, we can sum it up as that time everyone lost their shit because they thought computers would stop working. The dreaded Y2K bug that almost brought the world to its knees.

Depending on when you were born, this will have varying meanings to you, or may just appear to be gibberish. Simply put - Y2K was a societal apocalypse scare about how computers would interpret the calendar change from the year ‘99 to ‘00. The vulnerability existed because, in the olden days of computing, calendar years were stored as two digits instead of the full four. Tons of information technology infrastructure within banks, governments, and businesses still ran legacy products and hardware. They still do, but that’s an issue for our next chapter. The fear was that when computers woke up on the morning of Jan 1, 2000, they would grow conscious, assume the title of world tyrant, and subjugate the human race. They would force us to use only Clippy to type all documents moving forward.

Fig 4-5. Destroyer of worlds (and Microsoft Word documents).

A tad hyperbolic, I know, but that’s how the story was spun. Closer to reality, people feared the new year would register as the year 1900 and every computer would grind to a halt in the face of continuous logic errors. Financial markets would melt down; economies and nations would crumble faster than a box of Girl Scout cookies brought to pick-up day at fat camp. Luckily for us, nothing like the above happened, and overall only minor effects were seen, such as 150 slot machines at a Delaware racetrack that stopped working.9 The cure was the equivalent of going to the doctor for a shot. By the time society opened its tightly shut eyes, we realized it was a flash in the pan. We’d be forgiven for assuming it was much ado about nothing. Here’s the thing though: it actually was a big deal. The free market developed a solution, “Y2K Compliance,” devoting some 100 billion dollars of software development work toward resolving the issue. It’s difficult to predict what might have happened had all that effort been spent elsewhere. We were prepared, and we tackled the issue using intelligence and planning.
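For the curious, the bug itself fits in a few lines. Here’s a toy Python sketch of the two-digit arithmetic that had everyone sweating, along with the “windowing” style of patch that a lot of that remediation money bought (simplified, of course):

```python
# The Y2K arithmetic bug: with only two digits stored, "years since"
# calculations go negative at the century rollover.

def years_elapsed(start_yy, end_yy):
    return end_yy - start_yy       # two-digit years, as legacy systems stored them

print(years_elapsed(95, 99))   # 4   -- fine within the same century
print(years_elapsed(95, 0))    # -95 -- Jan 1, 2000 looks like a trip back to 1900

# The common "windowing" fix: treat low two-digit years as 20xx.
def to_four_digits(yy, pivot=50):
    return 2000 + yy if yy < pivot else 1900 + yy

print(to_four_digits(0) - to_four_digits(95))   # 5 -- sanity restored
```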

Ok, so maybe predicting the apocalypse isn’t so easy. Given enough time, one of the crazy street corner preachers will eventually get it right. Now? As in today? Statistically speaking, “naw.” I’d bet everything on it. So let’s set our sights on something a bit closer to reality. The overarching point is that sometimes we’ll be right, and sometimes we’ll misstep in our application of technology. We should expect it and remember to do as we do in life and use these missteps as learning experiences. To do that we need to be open and upfront with each other.

Big Data and You

No matter who you are, save for the Amish, you use software on a daily basis. It’s a layer of reality that is invisible and always moving. These rivers of technology generate ongoing conversations across the web and the world, comparing and contrasting competing technologies. There is a renaissance in the world of data, and the apparatus for analyzing data is becoming increasingly powerful and consumer facing. It means that the underlying technology has become mature enough for everyday people to use intuitively. It’s at this point that companies need to sell the product to those everyday people. In order to efficiently communicate the whole of the technology, they tend to shove a few loosely related words together. From there they can shout it from the rooftops, and they pay millions in marketing dollars to do just that. The cloud, the internet of things, wearables, Virtual Reality, Augmented Reality.

The use of software in our day-to-day lives has skyrocketed since the advent of the public internet and further exploded with the advent of the smartphone. This is thanks in part to the many buzzwords that came to pass and then matured into ubiquitous use. These innovation cycles, like Moore’s law for example, have advanced in parallel with the use of big data. There are few activities a person can do nowadays that don’t involve some form of computing. This even includes governmental updates to information technology infrastructure. That sentence has likely not aroused any of you out of your seat in a cheer, but overall, it’s a great thing. When a government updates the software it uses, the people benefit - whether that be an appointment system for the department of motor vehicles or an app that allows city dwellers to flag issues like potholes or water main breaks. Software has paved the way for government to run more efficiently in collecting citizen feedback and participation. This trend will continue to improve over time as more and more archaic systems and hardware are upgraded. That proliferates the data associated with those systems and can rapidly make for efficiencies and optimizations in day-to-day operations. Bringing systems like permitting, licensing, and record tracking online can enable a society to shape its government in its vision.

It’s not just efficiencies in the way operations are conducted that represent optimizations to government. Large data sets hold great value for the business community as well. Among the fastest growing job categories, data intelligence and analytics is easily among the best paying. At the forefront of using software to provide efficiencies, business has long been collecting volumes of data about its operations. This has allowed businesses to continually mine their data to improve their health. It makes sense that in a new world of data being everywhere, people able to contextualize that data have become a very necessary human resource.

Further evidence of the need for able-bodied people comes from the number of devices per person that we’re using. Gartner Inc., an information technology research and advisory firm, has predicted there will be 25 billion connected devices in 2020,10 all producing data of every variety, able to be used as fuel for insight. To contextualize all that data, Gartner defines big data as “... the information assets characterized by such a high volume, velocity and variety to require specific technology and analytical methods for its transformation into value.”11

Woof! Don’t fret if that sounds complicated. We’re going to use the power of metaphor, yeah! To start, imagine a flowing river - any flowing body of water. For me, I picture a creek that ran behind my house growing up. OK, let’s break this big Gartner definition of big data down, piece by piece:


“Information assets”

“Volume”

“Velocity”

“Specific technology”

“Analytical methods”

“Transformation into value”

To tie the above pieces together we need to define structured versus unstructured data:


Structured

Unstructured


If you were to think of data as an apple tree, structured data is the apples that are harvested intentionally, while the apples that fall to the ground are comparable to unstructured data. The thing to remember about unstructured data is that even though it lacks structure, it can still be processed to distill insights. As an example, the team at IBM posits the following situation - “Consider a customer call center. Imagine being able to detect the change in tone of a frustrated client who raises his voice to say, “This is the third outage I’ve had in one week!” A Big Data solution would identify not only the terms third and outage as negative events trending to a consumer churn event, but also the tonal change as another indicator that a customer churn incident is about to happen.”12 The exciting aspect of risk and opportunity within big data includes the analysis of speech to pick out patterns such as client churn, as in the above IBM example. This radically simplifies the process of leveraging internal data to procure genuine business insight.
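To make the apples concrete, here’s a toy Python sketch of the difference. The “churn detector” here is my own crude stand-in for the kind of system IBM describes, not their actual product:

```python
import csv
import io

# Structured data: the intentionally harvested apples. Fields are
# self-describing, so they parse cleanly with zero interpretation.
structured = "customer_id,plan,monthly_spend\n42,premium,89.99\n"
for row in csv.DictReader(io.StringIO(structured)):
    print(row["customer_id"], row["monthly_spend"])

# Unstructured data: an apple on the ground. It needs interpreting first.
transcript = "This is the THIRD outage I've had in one week!"

# Crude churn signals: negative keywords, plus all-caps words standing in
# for the "raised voice" that a real system would hear in the audio.
churn_terms = {"third", "outage"}
hits = [w for w in transcript.lower().split() if w.strip("!.,'") in churn_terms]
shouting = any(w.isupper() and len(w) > 2 for w in transcript.split())
print("churn risk:", bool(hits) and shouting)
```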


Ƀ Vs. $ - Clash of Symbols


Bitcoin burst onto the internet in 2009, deposited into the public sphere by a mysterious person or persons who created the cryptocurrency. Simply put - a cryptocurrency mimics the characteristics of money in the real world, namely scarcity, within the digital world. The ‘crypto’ in cryptocurrency stands for the cryptography securing the transactions - a difficult feat given the reproducible nature of data, and one that elevated this internet mythos to its present status. In a true-to-life myth, the complete instructions for implementing Bitcoin were published online and immediately took off like a rocket. The creator, under the nom de guerre of Satoshi Nakamoto, is now responsible for the generation of 12.2 billion dollars13, proving that value is a fluid concept. However, as we’ll investigate, the ultimate value of Bitcoin may not be in the coins themselves, but in the technology behind the coins. With the rise of Bitcoin we have to ask ourselves if the old saying ‘cash is king’ is even relevant. If you are or have been confused about Bitcoin at any point, don’t stress it - we’ll break it down!



Simply put - Bitcoin is a decentralized, distributed, digital money whose integrity of value is maintained by a ‘shared ledger’ kept by thousands of different computers, or nodes, across the world.


Fig 4-7. - Bitcoin represents a different direction for currency and value.

The challenge: how do we track and verify transactions that are completely digital? That’s a pretty big question, and answering it requires breaking down the components of the Bitcoin definition.


Digital

Decentralized


Fig 4-8. Decentralized networks enable diffusion of control and transparency.

Shared ledger

Bitcoin is decentralized in that all transactions are distributed across the network. Each node, which can be a server or other device, is able to send and receive messages and resources. Transactions using Bitcoin are published and maintained by the network of Bitcoin miners. Miners can be anyone with the desire and a computer - you, me, and even Great-Aunt Eunice can use our own personal computers to participate in the Bitcoin network. These computers crunch very large math problems and publish their proof of work to the entire network, which also ensures the integrity of the public ledger. The miners who solve a given math problem first are rewarded with bitcoins for crunching numbers and maintaining the integrity of the network. To maintain the intrinsic value of bitcoins, there is a mathematically finite number of bitcoins (21 million) that will ever be produced by miners, which enhances the value of the bitcoins in a unique way. Previously, data had been considered infinitely reproducible.
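Those “very large math problems” are hash puzzles, and a miniature version fits in a few lines of Python. This is a sketch in the spirit of Bitcoin mining, not the real protocol - Bitcoin uses double SHA-256 and a vastly harder target:

```python
import hashlib

# Toy proof of work: find a nonce that makes the block's SHA-256 hash
# start with `difficulty` zeros. Finding it takes brute force; checking
# it takes one hash. That asymmetry is what keeps the ledger honest.

def mine(block_data, difficulty=4):
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest   # the proof anyone on the network can verify
        nonce += 1

nonce, digest = mine("Alice pays Bob 0.5 BTC")
print(nonce, digest)
```

Bump the difficulty by one and the search takes roughly sixteen times longer, while verification stays instant. That tunable knob is how the network keeps Great-Aunt Eunice and an industrial mining farm playing the same game.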

As a recurring theme, humans are often the X-factor in a given situation, and the same holds true with Bitcoin. With the collapse of the Bitcoin exchange Mt. Gox, approximately $450 million worth of Bitcoin vanished into thin air; the coins should be accounted for somewhere within the Bitcoin protocols, but the fact that they’ve never been traced leads some to suspect they never existed in the first place. We’re yet to learn the full details of what occurred, but in the pending trial of CEO Mark Karpelès we may find out more at a later time. The collapse of Mt. Gox represents unmitigated risk materializing, unfortunately taking many people’s savings with it. In the end, the debacle seems to be a road bump in the overall story of Bitcoin. Thanks to the decentralized nature of Bitcoin, no single government or bank is in control.

The technology behind Bitcoin is self-sustaining and builds in strength with increased adoption. In the case of the 3 billion new users set to arrive on the internet in the next 10 years, they’ll leapfrog many of the evolutionary technological steps the rest of the world took. In developing nations, these people joining the internet economy will need somewhere to store their money, and Bitcoin may appear a safer bet than the national standard currency. With the arrival of Bitcoin, we’re forced to reconsider our traditional idea of ‘value.’

If you run your own business, should you accept Bitcoin? Great question - the adoption of Bitcoin by a business will depend entirely on the type of customers you serve. If you’re a coffee shop, it’s a pretty safe bet a few of those addicts will be slinging some bitcoins. However, if you’re the only general store in a town, I’d say do what you think is right for your business. Unofficially, if you have an interest in Bitcoin, you’d be doing yourself a disservice by not investigating it further. As mentioned, the price of bitcoin is volatile and can range wildly in a single year, so you’d be wise to not put all of your eggs in one bit-basket.

The Internet of Things

As a consumer, the internet of things excites the hell out of me. It’s another one of those buzzwords thrown around like ‘cloud’ or ‘big data’ - advertising shouts these words at people daily as if the repetition further defined them. These terms tend to be so broad, so interchangeable in their meaning, that distilling them down to a single word or phrase usually does more harm than good. So what is the internet of things, really?

Simply put - shove computer smarts into every device possible so they can talk to each other.

Cisco Systems predicts that by the year 2020, there will be 6.58 connected devices per person on earth.14 This doesn’t mean that each person will be hoarding almost seven devices on their body, but rather that’s the ratio by which smart devices will outnumber people. According to IBM, these devices generate so much data that “90% of the data in the world today has been created in the last two years alone.”15 Sensors of every variety are included in these statistics, from motion sensors to microphones and even computer vision. This is further evidence of the explosion in the big data uprising. Speaking from a consumer standpoint, the home’s capabilities are increased with the addition of sensors and devices, and furthered by the data generated within the home.

Recently, this ‘brain’ arrived in the form of Amazon’s Echo product. An assistant managing communication between your devices at home, using microphones and a web connection, the Echo becomes a conduit of the user’s will. Commands that would have normally been a tap on a smartphone screen are issued verbally by the user and carried out by software that communicates between devices. In seamlessly communicating between devices, much of the legwork in activating and deactivating devices is done with as little input from the user as possible. This, in turn, increases convenience for the user as they experience daily events. To understand how a smarthome might improve your life, let’s break down your morning ritual into individual events, like the following:

Each ‘event’ represents an opportunity for a set of devices within your home to anticipate your personalized experience. All that would be required is a little bit of computer smarts and data. Our morning routine is just that, routine. It just so happens that computers and data are pretty good at understanding routines and will be able to anticipate ours. If you were to associate your personal preferences with your physical location within your home, you could achieve some pretty neat things. For example:

Ok, that last one is a little far off. Just kidding, this is Welcome to Tomorrow, not Welcome to Yesterday! The June smart oven cooks your food intelligently by using a camera to analyze the contents and cook them appropriately. Great Scott! The smarthome will, again, know you and your preferences. Without you even opening an app, your morning is custom tailored to you. The comfort and convenience enable more focus on what is meaningful to us. You’ll receive a notification or verbal caution that your typical highway to work is laden with traffic, so you should take a different route or leave earlier. It will be like having a personal assistant that knows your preferences and seamlessly manages them. For now, we have some simple ‘smart’ (and borderline frivolous) conveniences, but with time, we’ll behave like an 80s diva toward our C-3PO-like digital assistants.

The technology at play here is currently available, but not yet ubiquitous. This is thanks in part to the openness of software APIs, or application programming interfaces. You can think of an API as a dock where different boats - pieces of software, most often from different companies or apps - can tie up to communicate and work together for the user’s benefit. When programs work together, they can increase the overall value to the consumer. My favorite example of this right now is the integration of the Amazon Echo and our Philips Hue lighting system. For your edification, the Echo is a voice-controlled computer shaped like a candlestick, and Hue is a WiFi-connected light bulb able to produce 16 million colors. With the use of APIs, I’ve set up the Echo, named Alexa, to control the lighting in our home using just voice commands.
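Under the hood, that integration is less magical than it sounds. Here’s roughly what a single ‘lights on’ command boils down to against the Hue bridge’s local REST API - a sketch in Python, where the bridge address and API key are placeholders you’d swap for the ones your own bridge gives you:

```python
import requests

# One "Alexa, turn on the living room lamp" ultimately becomes one HTTP
# call to the Hue bridge on your home network. Both values below are
# placeholders, not real credentials.
BRIDGE_IP = "192.168.1.50"      # your bridge's local address
USERNAME = "your-hue-api-key"   # the key the bridge issues when you register

def set_light(light_id, on, brightness=254):
    """Set a bulb's on/off state and brightness (1-254) via the bridge."""
    url = f"http://{BRIDGE_IP}/api/{USERNAME}/lights/{light_id}/state"
    return requests.put(url, json={"on": on, "bri": brightness})

set_light(1, on=True, brightness=180)   # lamp on, dimmed slightly
```

The dock metaphor holds up: Alexa’s software and Philips’ software never met before your living room, but because both speak plain HTTP and JSON, they can tie up at the same pier.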

Figure 4-9. Achieving perfect voice recognition in the next 5 years is nearly assured.

Voice-based interfaces are fast becoming the hot new fad in technology, and for good reason. The average speaker is able to speak around 145 words per minute, while the average person typing on a keyboard manages around 40 words per minute. Being able to speak commands, rather than type them, reduces friction for users in practical ways. Latency and accuracy are the biggest factors in producing a seamless user experience, especially when we have our hands full. At present, commands need to be issued serially, or in other words - one action/one command - which nets out to:


Me: “Alexa, shuffle.”

Alexa: “Ok, shuffle mode activated”

Me: “Alexa, next song.”


As opposed to me simply saying, “Alexa, shuffle and next song.” Latency - the delay from the time a command is issued to when the action is performed - will be a limiting factor for now. The convenience factor arises when users can issue commands like checking the news while physically moving around the rooms getting ready for work. Personally speaking, the Echo is worth its weight in gold when used for timers while cooking. In terms of lighting, voice enables us to activate lighting without the need to:

1. Locate your phone

2. Swipe to unlock

3. Navigate to the lighting app

4. Activate or enable a light setting

Or, you know, get up and flick a switch. It’s the same argument as when the remote control came to the television set. To some it seemed to discourage the nobility of the time-honored pastime of tuning and twisting knobs on a TV the size of a small car. In actuality, it enabled a level of convenience not seen prior and became common practice because of it. The time between deciding to turn the lights on and the lights actually turning on is all unnecessary friction when voice is an option. We know the remote control flourished and began rapidly breeding, hence the petting zoo of remote controls that resides in most homes. Additional niceties of voice-based interfaces include listening to music and artists at will. Got a song stuck in your head? Speak the name and begin jamming it out of your head! These are minor conveniences that, when added to the numerous other conveniences, add up to a hell of an experience. Looking at the box on the shelf won’t seem like a whole lot; the value of these devices arrives from the convergence of multiple devices to produce a single seamless experience. Much like virtual reality, this is one hard pill to swallow until you experience a setup that feels like a natural extension of your home. Hang in there, the Jetsons’ home is coming to keep up with the Joneses imminently.
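For the curious, here’s a toy sketch of what compound-command handling could look like under the hood: splitting one utterance into the serial commands today’s assistants require. Real assistants use far more sophisticated language understanding; the command list here is invented.

```python
# A minimal compound-command parser, assuming a fixed set of known
# commands. Real voice assistants do this with trained language models.
KNOWN_COMMANDS = {"shuffle", "next song", "pause", "lights on"}

def parse_utterance(utterance: str) -> list[str]:
    """Split one utterance into serial commands the assistant knows."""
    text = utterance.lower().removeprefix("alexa, ")  # drop the wake word
    parts = text.replace(" then ", " and ").split(" and ")
    return [part.strip() for part in parts if part.strip() in KNOWN_COMMANDS]

print(parse_utterance("Alexa, shuffle and next song"))  # ['shuffle', 'next song']
```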

In other areas, the internet of things has already begun experiencing a boom in investment. The application of the internet of things to industry will help advance business goals in profound ways. Throughout the production of, say, a smartphone, the steps can be automated, meaning the pieces can be assembled, packaged, and shipped without a touch from a single person. Items like smartphones are produced in such high quantities that the tasks involved in assembly, packaging, and shipping are repetitive.

Figure 4-10. Jan in Accounting’s replacement never brings donuts in for everyone.

Right now a robot named Baxter is being sold to industrial factories across the world. A factory robot in and of itself is nothing new, of course. What’s unique about Baxter is that it can be trained in the movements it needs to achieve the desired function. Paired with software that enables intuitive instruction, Baxter can perform a number of duties including product and package assembly. Given that Baxter is a robot, it does not need breaks and can work 24 hours a day, 7 days a week. Who would want to compete with that for a job?

As the success of such products increases, so too does the underlying technology, with iterative generations of products. For example, chatbots are all the rage in tech right now. Chatbots are virtual personal assistants that have the ability to flip digital levers like reserving a table at your local hot spot. Similarly to Baxter, these chatbots will never tire or need a break. They are always ready and always on. Which is scary in a way, but we’ll talk about that more in the next chapter. Chatbots are machine-learning-assisted AIs that ingest large volumes of data to understand a skill set. Yes, like Neo from The Matrix learning kung fu. Chatbots are able to handle low-level support and thus reduce the budget a company needs to operate successfully at scale. It’s similar to the way the first iPod is vastly different from today’s iPod touch, equipped with an app store, camera, and iterative learning from previous products. It is not far-fetched to say that within 10 years a near majority of factories will be automated. The human result will of course be a double-edged sword: people will be out of a job, but our products will cost less, helping to drive economic growth and health.
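To demystify the chatbot a bit, here’s a toy version in a few lines. Real chatbots learn their skill set from large volumes of data rather than a hand-written keyword table like this one, but the flow is the same: classify the request, flip the right digital lever, and escalate to a human when stumped. The intents and replies are invented.

```python
# A toy support chatbot: keyword-based intent matching instead of the
# machine-learned models real products use. Intents and replies invented.
INTENTS = {
    "reserve": "Sure, booking a table at your local hot spot.",
    "hours": "We're open 9am to 9pm, seven days a week.",
    "refund": "I've started a refund request for your last order.",
}

def respond(message: str) -> str:
    for keyword, reply in INTENTS.items():
        if keyword in message.lower():
            return reply
    return "Let me route you to a human agent."  # handles low-level support only

print(respond("Can you reserve a table for two tonight?"))
```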

The Oligopolies in Plain Sight, the American ISP Industry

The concept of Net Neutrality is yet another buzzword in a sea of buzzwords. It’s no wonder that public comprehension of this issue is still developing. The gist is that internet service providers must treat all data equally. With Net Neutrality in place, your ISP cannot slow down the download speed of your video simply because it is a larger file than an image. This is good because it prevents stratification of services. Within each stratum or layer, an ISP would be able to charge companies like Netflix more for delivering video. The net effect being that services like Netflix or Hulu are still accessible no matter who pipes in your data. If Net Neutrality were subverted, streaming services might be segregated based on deals set up between the streamers and the data pipers. Think of cable packages where you get none of the channels you want and all of the crap you don’t.

If you’re like most people, your reaction would be a great big resounding “meh.” I hear you, I really do. This is, overall, a boring topic, so why even get hot and bothered about it? Well, do you trust ISPs to be the gatekeepers to content on the web? If so, you might want to consider the fact that not all web traffic is entertainment. What if ISPs saw an opportunity to charge a fee for digital assistant traffic? Amazon’s Alexa or Apple’s Siri might become a premium feature with your local web service. The point being, the web is evolving in unpredictable ways; preserving innovation is the same as preserving prosperity. If you still find yourself unconvinced, in real-world terms, all I have to ask is - are you happy with your current broadband/mobile provider?

Research shows that consumers in America overwhelmingly disapprove of our selection of ISPs and cable TV providers. As of 2015, the American Consumer Satisfaction Index rates ISPs at a whopping 63 out of 100.16 That might be pulling my punches. Y’all hate cable companies. You’re overcharged for slow speeds and terrible customer service that also changes the names on your bills to “Asshole” (seriously).17 Meanwhile, these companies take billions of dollars in government subsidies to expand data speeds and infrastructure, only to ignore the earmark. While taking these subsidies, they go on to spend the money on lavish parties celebrating personal accomplishments.18 As a reminder, the extent of shitty things ISPs do to their customers is pretty lengthy, and I could go on; here are just a few more of the terrible things they do:

  1. They enforce artificial caps on data usage, slowing speeds to a crawl when over the cap. Because of “network congestion.”
  2. Later admitting that “network congestion” had nothing to do with it.19
  3. Selling “Unlimited Data” plans, that are not actually “unlimited.”20
  4. Marketing safety plans for $4.99 a month to avoid having to pay a technician for service calls, placing the caveat that the service does not include basic wiring inside the walls of the home.21

All of these details are important to consider because the prosperous future of the web will rely on quickly processing and moving large volumes of data. That cannot happen if the pipes carrying all of that data are intentionally left small because ISPs don’t like investing in infrastructure. They do not invest due to local oligopolies that arise for perfectly legal reasons. Why invest in expensive infrastructure within an area already populated by your competitor?

In the late 90s we experienced the Dot Com boom and bust in a very short period of time. From 1997 to 2000, billions of dollars were spent on and invested in the new-fangled internet machines, which led to rapid overconfidence in the industry. We recovered after years of economic growth and a transition from a product economy into a service economy. Back then, the available levels of bandwidth were not supported by demand, because so few people understood what they could demand as consumers. High levels of bandwidth are an integral puzzle piece of what’s pushing our internet boom of today. To be clear, I believe access to the internet is a human rights issue. Without access to this basic modern-day playing field, we’ll leave millions of people behind. It is important to have strong enforcement laws in information technology, because the average cable subscriber won’t spend time educating themselves on the impact of settling for crappy service.

We need to ensure competition, as opposed to the regulatory capture the ISPs presently enjoy. Competition would prevent them from forcing companies like Hulu, Netflix, or HBO Go to pay nonstandard prices. This is not so much an issue for established players like Netflix, as they already peer content across the nation and world.22 However, if you view web publishers through the lens of small business, they are about to encounter soaring barriers when entering online markets. As content producers become subject to throttling and the like, clear and undue hurdles will continue to stifle web entrepreneurs and subsequent innovation. Ensuring the principle of Net Neutrality defends the future free flow of information first, and second, ensures that the economic wheels of innovation are greased for future prosperity.

OK, Maybe We Are Living in the Future, CRISPR

If you recall, in the 90s there was a big hubbub about the mapping of the human genome. The international effort was meant to catalogue the 99.5% of DNA that all humans share. Mapping the genome allows us to identify specific genes, which may cause a person to have blue eyes or a third nipple. The genome is the most basic recipe for human life we have.

Simply put - the genetic variation between two humans is roughly the difference between peanut butter cookies made using smooth or crunchy peanut butter.

CRISPR utilizes a guide made of RNA to direct a protein (most famously Cas9) to a very specific string of DNA, enabling modification of its behavior. Think square peg, square hole level of precision. The technology acts like a word processor, allowing for editing of the DNA sequence, or recipe. As we’ve now catalogued genes, CRISPR will lead to treatments for inheritable diseases. As long as the disease resides in our DNA, we’ll be able to turn it (aka ‘the expression’) off. Over the next five - ten years you’ll be hearing a lot more about CRISPR, guaranteed.
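For the curious, here’s a rough sketch of that square-peg precision as code: scanning a DNA string for a 20-letter guide sequence followed by the “NGG” motif (the PAM site) that the commonly used Cas9 protein requires. The sequences are invented, and real genome editing involves far more biology than string matching, but it illustrates the targeting idea.

```python
import re

# Toy illustration of CRISPR-Cas9 targeting: find a 20-letter guide
# sequence immediately followed by an "NGG" PAM site. Sequences invented.
def find_target(dna: str, guide: str) -> int:
    """Return the index of a guide match followed by any base plus 'GG',
    or -1 if no such cut site exists."""
    for match in re.finditer(guide, dna):
        pam = dna[match.end():match.end() + 3]
        if len(pam) == 3 and pam.endswith("GG"):
            return match.start()
    return -1

guide = "GGATCCGATTACAGGATCCG"          # the 20-letter 'search term'
dna = "TTAC" + guide + "TGG" + "CAT"    # a target followed by a TGG PAM
print(find_target(dna, guide))          # 4 -- Cas9 would cut near here
```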

Big Data Tomorrow

Now that we’ve examined the extent to which data plays into our lives, we need to look forward to the next five - ten years. What should we look forward to? What challenges must we overcome? In my opinion, the most important role data plays is in the economy. Capitalism has been a profound force in shaping the world. It is due to capitalism that advances in healthcare and technology have enabled people to live better and longer. It is part of how we were able to put a man on the moon in only eight years.23 World-renowned economist Adam Smith described an invisible hand guiding capitalistic markets.24 One that would engender change in its own structure should the market demand it. In practical business applications, finding market fit is the ultimate goal.

The importance of finding this fit is the difference between a failing and a flourishing economy. Especially in the realm of startups, market conditions can cause a single business to pivot, that is, change the method by which they make money. To begin determining fit, a business or organization only needs to recognize what data they currently have available, as well as what they’d like to have. Once this question is answered, the organization can better define what utility they’ll receive from implementation. The thing about the tech industry is that it is radically different from traditionally born organizations. Growth of a business is about finding the correct metrics and then maximizing them. To do this, the organization needs to ingest and analyze data. The more data you collect, the more complete a picture of your business’ health you will have. This value is available to all companies and holds especially true for the future.

A new job type has opened up in the past few years, known as the Data Scientist. A data scientist is an individual who analyzes large volumes of data in search of trends. Because big data could also be considered noisy data, they bring context to swathes of data. These trends can then identify metrics that move the business forward and contribute to the overall health of the organization. Companies like Microsoft, IBM, Facebook, and Google are all hiring for this profession by the thousands. The same is true across the global economy. Demand for this well-paid position has doubled over the past four years, according to Philadelphia-based RJMetrics.

Industries employing these people run the gamut.

The role continues to be considered by analysts to be one of the best jobs in America, arguably the world. The reason these people are so employable is that data is agnostic to the application. It does not care (even if it could) what the application is. If you’re able to measure it, you’re able to dig out insights via analysis and algorithms. The analysis can draw on disparate sources that coalesce into genuine business insight, otherwise known as business intelligence. As big data comes home to roost, the profession will continue to grow explosively.
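To give a small taste of the job, here’s a minimal sketch of pulling a trend out of noisy data: a seven-day moving average that smooths invented daily signup numbers so the underlying growth is visible. Real data science goes far deeper, but the instinct - separate the signal from the noise - is the same.

```python
# Smooth noisy daily numbers with a 7-day moving average so the trend
# underneath becomes visible. The signup counts are invented.
daily_signups = [12, 18, 9, 22, 17, 25, 14, 30, 21, 35, 28, 41, 33, 47]

def moving_average(values: list[float], window: int = 7) -> list[float]:
    return [
        sum(values[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(values))
    ]

print(moving_average(daily_signups))  # noisy days in, smooth trend out
```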

One of the larger existential challenges the proliferation of the internet presents is migrating from a world of scarcity into a world of plenty. Increasingly, market value is derived from purely digital sources. Remember that within the digital world, content can be copied and pasted simply and quickly. More people can get more - more information, more security, more entertainment, more convenience. We already know our world is perpetually changing; one of the areas in which we’ll experience more noticeable shifts is in what we consider as having value. We’ll increasingly count data as valuable, due to its many practical applications toward the optimization of an enterprise. Included in that shift, if we’re able to incentivize outcomes that are beneficial to both business and society, we can include more people on the ride.

Throughout the generations, humans have, without fail, come up with spectacular and awesome ideas for how the world will end in the near future! The concept of going from birth to death is a mind-boggling prospect. The prospect of that finality can be troubling to our wondering minds. In order to process that emotion, humans have developed coping mechanisms like the apocalypse. If the world is ending for me, it might as well be ending for everyone. The concept of an end time has been used to contextualize our own deaths for millennia upon millennia. It’s a shorthand way of reminding each other that, “Hey, this is it!” and we all get just one life. The point is that while bearing great social meaning, these scenarios keep us fairly short-sighted. As we continue to enter a golden era of technology, let us not forget this one thing: our science and technology are only scratching the surface of what’s possible. There is no barrier in the way but what we perceive for ourselves.

The continual challenges presented to our civilization are many. Worse yet, they are very large, very nuanced challenges. Climate change, terrorism, clean water, poverty. Every challenge must be navigated with purposeful consideration and decisive action. I believe the most prudent step we can take to continue carrying the torch is to establish these challenges as moonshot problems with blue-sky thinking. We should apply considerable resources - both monetary and intellectual - and a healthy dose of effort to resolving challenges effectively. However, we need to reject the post-apocalyptic genre traits that have penetrated our collective psyche. We need to navigate around knee-jerk, System 1 decision-making to critically perceive our environment. Part of that critical perception is subverting the permeating messaging propagated by brands. In an effort to build affinity, brands pervade digital and physical environments, all the while attempting to inform your opinion via advertising. The problem being that advertising can often be a one-sided conversation, which disables critical discourse. Trading emotional reaction for critical perception, brands rely on connecting emotion with action, for example: purchasing products.

The More You Know! Brands and You

First, a word or two on brands. Branding remains a force of nature within the world economy; it commands attention with iconography and implied trust. Brands convey a ton of intuitive knowledge about a given service or product. It is the goal of advertising to inform that knowledge with further detail and positive emotional connection. However, branding is simply a play on perception. Early consumer-centric economic growth relied on branding to establish a foothold in the middle class. Consumerism brought a wealth of new products to the markets. Branding was leveraged to convince consumers that a certain product or service was better than the next company’s. Brands were bolstered by a company’s continued relationship with the customer.

Visit the cereal aisle in your local grocery store and you’ll quickly put together that a brand alone does not always tell the full story. A box of Cocoa Puffs will be four boxes down the shelf from a box of Choco Puffs. In some instances the contents of these boxes might even be the same exact product, thanks to what’s known as private labelling. Similarly, that box of Kraft macaroni & cheese becomes Target-branded Mac & Cheese. A similar nuance is present in the wireless service used on your smartphone, and you can use this to your advantage.

Branding is so influential because it plays on the way we understand the world around us. No one person has the ability to understand the pros and cons of every single brand in a grocery store. In economists’ terms, this is a lack of relevant information by which to make a decision: ‘imperfect information.’ Branding attempts to fill those gaps in knowledge by visually conveying information about the product. Hold those two boxes of cereal up to each other and immediately your brain will begin filling in the knowledge gaps about the perceived value of both boxes. You will perform what’s known as ‘anchoring,’ whereby you weigh the value - perceived and actual - of two different options against each other. ‘New York style’ pizza being available across the nation is another well-established example. It’s the goal of companies to become the actual anchor in this situation, the product being compared against. Cognitive bias makes a not-so-surprising appearance, as we know that loss aversion causes people to ‘play it safe’ when making decisions. We hate to lose when making choices. Here’s the great news about your wireless phone service - it’s all Cocoa Puffs!

As part of the regulations in place by the Federal Communications Commission, the big players like Verizon, AT&T, Sprint, and T-Mobile must lease their pipes to newer entrants. It is widely acknowledged that the cost of implementing telecommunications infrastructure placed an unnecessarily high entrance bar, potentially stymying competition. This, coupled with buying information being widely available, means you can have your premium cereal and eat it too - all while being 100% educated about what’s in the box. Providers like Cricket Wireless, Boost Mobile, and Net10 use the same exact pipes as the big guys. The speeds and quality of service you get from Cricket would be the same as your friend sitting next to you on AT&T. You get delicious Cocoa Puffs for the price of Choco Puffs. Eat up!

Figure 4-11. Google Fi is worth mentioning as it uses software to choose the stronger signal between T-Mobile and Sprint. Neat!

OK, Maybe We Are Living in the Future, 3D Printing

Speaking of branding in the all-digital 21st century, as we’ll find out, 3D printing presents unforeseen, nuanced issues in copyright law. 3D printing is a snazzy new technology that, similar to a desktop printer, takes a digital file as input. But where a desktop printer outputs a likeness onto a piece of paper, a 3D printer deposits many ‘pages,’ or layers, of material to construct a 3D object in physical space. The tall stack of papers you’d otherwise be left with doesn’t exist - just the physical object. The idea is that if you have a 3D ‘blueprint’ file you can print anything, and sharing that file is a tap away. In a simple example, you could 3D print Mickey-Mouse-ear hats and sell them for five bucks.

There are multiple ways to achieve the 3D printing process, each with pros and cons. For our purposes here, we’ll talk about the most common for consumers - the extrusion process. Using plastic as the input, the output can be anything the person printing wants. If you lost your wrench and didn’t want to run to the store, now you don’t have to take that trip. Even a 3D model of the Notre Dame Cathedral in Paris can be printed. This intricacy is a driving force in 3D printing’s adoption, allowing custom preferences to be included in production at mass-manufacturing scale. For the moment, home use of 3D printing is limited by the materials used in printing. Layering materials on top of each other creates many fault points in the end product, especially in brittle plastic. Reproducing a wider range of usable goods requires further innovation and development to become resistant to breakage. However, the core ability to print whatever the user wants affords a level of freedom previously unknown to us.

As with all technology, there is, of course, a risk component. We must weigh the ability to print a to-scale 3D model of Notre Dame against the fact that you can also 3D print a gun. The question becomes, simply, can we justify this? The risk posed by this type of development is already rearing its head - in August 2016 a man attempted to smuggle, among other guns, a 3D printed gun along with live ammo. The TSA stopped the man, despite a terrible record of stopping only 5% of all threats in 2015.25 Like most topics regarding technology, the conversation is very new and we’re all feeling around in the dark. 3D printing will likely have a strong showing in our next five - ten years, but its widespread adoption will likely be muted until the technology matures to printing materials that have a higher ‘resolution,’ so to speak.

Although, in terms of fashion, we are getting closer to that shiny metallic clothing featured in Back to the Future 2. 3D printing enables flexible clothing, in this case through the use of what is known as “mesostructured material,” created by San Francisco based designer Andreas Bastian.26

Figure 4-12. From fashion designer Danit Peleg’s 3D Printed Collection.27

Figure 4-13. 3D printed mesostructured material’s intricate design up close.


Enabling Tomorrow

Over the next five - ten years, big data will be paired with AI and increased capacity to solve problems large and small. We’re not necessarily getting to the answer more quickly; rather, we’ll be able to reframe the whole question with the addition of more details. “How fast is the car driving?” will gradually evolve into “How fast is the Tesla Model 3 driving southbound on route 101, considering rainy weather and daily traffic patterns at 11:49 AM?” The answer is a function of the added detail. An example of how data can tie in is Google’s use of user location data to determine if a traffic slowdown is occurring. They use abstracted and anonymized data from Android users’ smartphones to determine if users are in a vehicle, at a stop, on a highway that is normally moving much faster. Considering Android makes up 60% of the market in the U.S., it’s pretty simple to then correlate data between multiple users to determine that a traffic slowdown has happened. Answering questions becomes more intuitive with the addition of correlated data.
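Here’s a hedged sketch of that idea - the specifics of Google’s pipeline aren’t public at this level of detail, so the speeds, threshold, and road data below are invented for illustration: compare anonymized speed reports from phones on one stretch of road against what that stretch normally moves at.

```python
# A toy slowdown detector: flag a stretch of road when the average
# anonymized speed sample drops well below its historical norm.
# The typical speed, threshold, and samples are invented.
TYPICAL_SPEED_MPH = 65  # historical average for this stretch of route 101

def is_slowdown(speed_samples: list[float], threshold: float = 0.5) -> bool:
    """Flag a slowdown when average reported speed falls below half
    of what this segment normally moves at."""
    if not speed_samples:
        return False
    average = sum(speed_samples) / len(speed_samples)
    return average < TYPICAL_SPEED_MPH * threshold

# Five anonymized phones on the same segment, all crawling:
print(is_slowdown([12.0, 8.5, 15.0, 10.2, 9.8]))  # True
```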

Take a step back and consider that physics is a mathematical prescription for cause and effect in the physical world. Everything we can see can be quantified and plugged into a formula. This is how we arrive at weather forecasts, which can sometimes be right but are often horribly wrong. This is achieved using a range of inputs and mathematical models; even the motion and path of a single stone rolling down a mountaintop can be described mathematically. These mathematical descriptions can range from the incredibly simple to the startlingly complex - from how fast a dysfunctional smartphone falls when thrown out the window, to the outcome of merging weather patterns in the South Pacific.
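On the incredibly simple end, that smartphone’s fate fits in a few lines (ignoring air resistance, and assuming it’s dropped rather than spiked into the pavement):

```python
# Free-fall speed after t seconds, ignoring air resistance: v = g * t.
G = 9.81  # gravitational acceleration in meters per second squared

def speed_after(seconds: float) -> float:
    return G * seconds

for t in (0.5, 1.0, 2.0):
    print(f"after {t} s: {speed_after(t):.1f} m/s")
```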

As with physics, big data requires an engine of contextualization, a means of conveyance. In physics this is achieved via formulas; pieced together, they assemble the puzzle. Any data set requires intelligent harnessing to produce insights. As noted earlier, data scientists are that formula within the business world. It just so happens that private enterprise tends to pay better, so it attracts talent quicker than other pursuits. With sustained investments in these areas, the next five - ten years can see economic, municipal, and social game changers. Business intelligence will continue to optimize economic indicators like GDP. Data originating from the digital world directly increases the accuracy of our understanding of the physical world - micro and macroeconomics, education, social institutions - all boats can rise with the tide. The coming-into-focus effect is similar to putting on and taking off glasses; the details pop into focus. Without fail, discoveries in physics have fueled innovation and development of not just technology, but society as well. Technology, in return, provides advancements that further fuel the exploration of the bounds of physics. Recall how the printing press revolutionized the world and our methods of communication; the advancements around our corner may be as surprising as the first iPhone, likely even more so.

This same axiom can be applied to many questions of optimal output. From traffic patterns to crime prediction, home automation to digital currency, our lives are increasingly digital with no indication of slowing down. The benefits to be reaped by the quantified society, however, are not guaranteed. These developments can only be effectively used as tools if we understand and conserve them. An example of conservation is ensuring data generated by the user is owned in whole by that user. More on this later. This precept is only underscored by the fact that it has been estimated that by the end of 2014 over 60% of the world’s population still had not gained internet access.28 Attempts to equip people with this essential tool of modern living have mounted since then, but there are still billions across the world without access. These new people represent new voices and ways of thinking. Without ensuring free speech in the form of data, these people could arrive to a very different web, and the problems and challenges they face will continue.

We are not without hope; humanity has shown that we are capable of achieving world-scale solutions to tough problems. When was the last time you heard about the hole in the ozone, or about acid rain that wasn’t in China? The Montreal Protocol was enacted to phase out CFCs in pursuit of repairing ozone damage. We instituted cap and trade to tax emissions of toxic chemicals, mitigating the threat of acid rain in the USA.29 We surmounted a potentially game-ending bug by enacting Y2K compliance. Whether it’s a policy solution, a market-based solution like cap and trade, or an engineering solution like Y2K compliance - time and again, humanity has proven up to the challenge of using available tools to tackle world-wide threats. It’s only in recent years that big problems have appeared intractable or without a solution. The solutions to the challenges we face will not be simple to execute; in fact, they’re going to be moonshots. While the solutions are not simple to achieve, they are worthwhile in what they secure for our collective future.

The more people who are aware of these challenges, the more they will be able to educate others. In technology, network effects denote that when people opt into a technology, they create value for other users by being a part of that technology. Similar to the telephone system: people bought phones with the intent to connect with others, but became connection points for others in the process, increasing the value of the network overall. Social networks exemplify this notion; Facebook wouldn’t be where it is now if it wasn’t used by as many as 1.79 billion people.30 There are, of course, diminishing returns once a site has already been made valuable by large numbers of people. The idea being that the community was built to satisfy user needs - in Facebook’s case, connecting people. In the next chapter, we’ll investigate the costs of connecting everyone on the most democratic of platforms, the internet.
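There’s even a rough formula for this, commonly attributed to Bob Metcalfe: a network’s value grows in proportion to its possible connections, n(n-1)/2. It’s a back-of-the-napkin model rather than a law of nature, but it shows why each new user makes the network more valuable for everyone already on it.

```python
# Metcalfe's law in one line: value scales with possible connections.
def possible_connections(n: int) -> int:
    return n * (n - 1) // 2

for users in (2, 10, 1000):
    print(users, "users ->", possible_connections(users), "connections")
# 2 users -> 1, 10 users -> 45, 1000 users -> 499500
```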

Seeking Brave Answers

As we’ve reviewed over the last two chapters, data as we know it is an evolving term that requires that we also evolve with it. Data acts as an indicator of societally held value, whether that includes the blueprints to a 3D structure, the data contained within our DNA, or actual bitcoins containing monetary value. Ensuring that we enable the benefits inherent in these gains is vitally important to the health of any successful society. While data fuels the artificial intelligence revolution, big data will fuel our personal intelligence revolution, leading to a coalesced perception of what makes us truly human.

In broad terms, it is my hope that with continued honing of data as a fuel, we’ll redefine our ideas of ‘value.’ Data of all shapes and contents - people, places, events - has intrinsic and increasing value. Justin Antonipillai, an economist working with United States Secretary of Commerce Penny Pritzker, notes that, “in the modern digital economy, data is a critical asset. Just like capital, credit and talent.”31 Our goal should be a transparent economy that grows at a managed and stable rate, while safeguarding human and environmental interests. Given the current economic and political climate, it would appear to any observer that common practice in business is to simply get more money, faster. This incentivizes businesses to seek efficiencies in the name of increasing shareholder value without much consideration for the toll. The effects of this value system have been apparent since the industrial revolution. Over the course of the next chapter, we’ll take a deep dive into some of the challenges that pose the greatest risk of deflating our audacious goals.

Should we choose, all of those frightening situations can be minimized or mitigated altogether. A kilobyte of prevention is better than a terabyte of cure. We can overcome our challenges just like the admirable names emblazoned in our textbooks and on our monuments. Technologies like robotics and automation can provide enough economic growth for industrialized countries to abolish exploitations like child labor and sweatshops altogether. The pairing of technology and human insight would have a chain effect within economies. It would allow for more education and, subsequently, more eyes attracted by the enhanced awareness. Further, this allows more individuals to become engaged in STEM (Science, Technology, Engineering, Mathematics) related fields. Taking control of your own path in life is something that is owed to every person, and every person owes it to themselves to explore.

The challenge we all face is stubborn nostalgia simply for nostalgia’s sake. It’s important that we remember where we come from. The memories can be bittersweet, but they inform who we are now. By knowing and remembering what was good about our past, we’re able to inform our present actions for the sake of the future. We make dozens of choices every day, and they can blur the line between what’s truly important and what is not. The most important choice any of us can make is to keep trying. Even when we fail, use that to learn. Get yourself back up and continue pushing. It ain’t easy, but remember this - nobody is remembered for the things they’re bad at. No one knows whether Mahatma Gandhi, despite his best efforts, was terrible at knitting (and he very likely was). The same goes for you and me. Let’s both keep pushing optimistically forward.


Chapter 5 - What Can Go Wrong

“People sharing more — even if just with their close friends or families — creates a more open culture and leads to a better understanding of the lives and perspectives of others.… By helping people form these connections, we hope to rewire the way people spread and consume information.”― Mark Zuckerberg (letter to potential shareholders, 2012)1

Any student enrolled in the school of Bowie will tell you that the times are a ch-ch-changing. Change is an ever-present force in all of our lives; seasons shift in and out of focus, people live and die. There’s a before and there’s an after. Over the course of this chapter we’ll ‘face the strange’ by investigating what can go wrong. In this instance, we’re looking into issues on a planet-wide basis, and thus the ‘lens’ requires us to zoom out. My favorite yet-to-happen example of such a perspective shift is the human race’s eventual encounter with extraterrestrial intelligence. The moment we confirm E.T. exists, and that they really do like Reese’s Pieces too, our entire world will be flipped upside down. Our worldview will permanently shift in perspective; the 4.543 billion year old outward view from our planet will become a door to the universe instead of just a window.

Imagination reels when we begin to think of potential outcomes of such a fateful meeting between species. Whether these aliens would be benevolent or malevolent is up for debate; films and TV shows paint one picture. In reality, it hinges on how much more advanced the aliens are than us. Working backward from earthly examples like colonialism, we recall that when a more technologically advanced people encounter less advanced folks, those less advanced folks tend to lose out badly. Clearly this is an existing fear, and rightfully so. What’s also clear is that nearly any movie depicting first human contact with a higher-intelligence alien or a higher artificial intelligence ends badly. The funny thing is that whenever things go south in these stories, it’s because the film assumes that the machines or aliens react with human-like emotion.

Reflecting on the human record can be pretty scary, and it underscores the need to resolve the problems we face, for ourselves. Even an optimistic view of a benevolent civilization would likely not solve our problems for us. From research in astronomy, we’re discovering the ingredients for creating life as we know it are not just present throughout the solar system but surprisingly common throughout the universe. This is thanks in part to water being much more common in our universe than previously thought, water constituting a basic building block for life as we know it.* The asterisk is necessary because we have only identified life forms that require water. We, with a sample size of one (Earth), cannot definitively state that life exists in other places without water. We’re unsure of any type of life that does not need water in some capacity.

In our solar system alone, we’ve detected a signature of water on Mars, Enceladus, and Europa. NASA is presently planning a mission to Europa, where we’ve determined that below its icy, miles-wide outer crust lies an ocean that contains more water than all the water on Earth. As water is the playing field for microscopic organisms, the inclusion of other very common elements, including carbon, hydrogen, and nitrogen, has the potential to spontaneously create life throughout our solar system and universe. Carl Sagan famously stated, “The nitrogen in our DNA, the calcium in our teeth, the iron in our blood, the carbon in our apple pies were made in the interiors of collapsing stars. We are made of starstuff.” How cool is that?! The study of habitability beyond our planet is still in its infancy, and we’re still establishing the bounds of what we consider ‘life.’ We are feeling around in a dark room; when we find a wall, we note what the wall is made of and apply that knowledge to everything else we know.

The fledgling study of exoplanets - planets that exist in solar systems other than our own - has yielded many potential candidates for hosts of life. Just 25 years ago, we had only theories and speculation that there were planets surrounding other stars, and now we’ve catalogued and confirmed 3,375.2 So it seems, galactically speaking, we should be in a more urban than rural neighborhood of the universe. So we’ve got a party going, chips and salsa are out, but so far E.T. is late to the party - what gives? Technically we never received an RSVP, so it’s hard to tell where the aliens are. A particular favorite theory for why this is, is known as the ‘Fermi Paradox.’ Proposed by Enrico Fermi over lunchtime banter in 1950, the theory follows a logic string that supposes the following:


1. There are billions and billions of stars like our own sun in our galaxy.

2. It naturally follows that there are billions upon billions of planets revolving around stars out there.

3. Each of those planets has a probability of existing in a balanced, ‘Earth-like’ state - the ‘Goldilocks Zone’ - and may develop intelligent life.

4. Developing interstellar - between stars - travel is possible, we’re working on it!


After moving through the logic, the question arises - where is everyone? Hello? Echo? The bounds for how many habitable planets are out there are increasingly coming into focus, and among those billions there are, by reasonable estimates, many billions of potential civilizations. Let’s just say this revelation is a stunning piece of evidence supporting Stanley Stickerson’s right to emotional ‘hanger’ and yelling at his boss in search of lunch. Enrico’s suggestion for why we’ve not so much as received a texted emoji from these civilizations is simply that they blew themselves up. As noted above, we tend to use our own frame of reference as the bedrock of knowledge from which we form theories about the universe. Sir Isaac Newton used an apple to serve as an example of gravity. In thinking about extraterrestrial life and its longevity, the context of why we have not heard from them of course matches our own existential fears of self-annihilation.

Known as the ‘great filter,’ this limiting factor presumes that any technology-producing civilization capable of traversing the large chasms of space would eventually die out due to its own mistakes. This existential fear reflects back at us in our fear of intelligent life out there. We fear the presence of an intelligence that may far outstrip our own capabilities.3 As our own history of civilization makes self-evident, the less advanced society loses out big time. For some, the fear of being on the complete losing end provides a compelling reason to not broadcast our existence.

Rounding back to technology, the real danger comes from unintended consequences more than intentional evil-genius meddling. This danger carries with it the risk of misunderstanding as well. It presents the chance for many people to confuse technology with magic. When the two are combined into a single concept, the factors in the equation become skewed and an optimal result becomes impossible. People grow confident in their own ignorance, which only feeds misunderstanding and miscommunication - all stemming from the root cause of poorly understanding the mechanics of the modern-day web.

A counterargument could be that beaming high definition video from the sky to earth seems pretty damn magical. It seems magical but it’s not; for some guy named Dan, it’s his job at a data center. There’s nothing mysterious about fiber cables laid across the country. They’re no more magical than the effects of coffee. After all, if you believe angry gods to be the reason for natural disasters, you absolve yourself of the responsibility of living where the disasters will continue to happen to you. We cannot view our own reactions as responsible if we believe technology is beyond our understanding. This chapter will serve as the counterpoint to the optimism of the other chapters. It’s going to get dark. However, with technology, a kilobyte of prevention is worth a terabyte of cure. My own optimistic view of technology comes with an asterisk. So should yours.


In the previous chapter, we:

Defined traditional data

Examined the now common, big data approach

Highlighted the contrast between data and big data

Demonstrated the value of the internet of things

Pointed out the difference between correlation and causation

Translated the inherent risk and opportunity

In this chapter, we’ll answer the following questions:

Where are we at risk on the web?

How does digital advertising operate on the web?

How does social media generate clickbait and fake news?

How do innovations fail to become successful?

What is automation, and what does it mean for me?

What is the technology industry’s responsibility to society?

The More You Know! Securing Your Digital House

If you’re a sane individual, you’ve likely been concerned in the past about being hacked or having your identity stolen. Who wants to lose their identity and be forced to live under an overpass collecting tolls from vehicles passing by? Here’s the thing about hacking, generally speaking - in order for it to be effective, it relies on you making a mistake in judgement. It’s similar to when your friend, let’s call him Matt, puts his hand up for a high five but you don’t notice and leave him hanging. Matt and the hackers are sad. Without you reaching out and slapping five, and maybe following up with a low five as well, the hackers generally cannot get to you. The primary way in which a hacker may target you is via social engineering. Social engineering is simply a method of gaining your information by tricking you into providing it. This can be achieved by wearing a uniform or posing as a loved one of yours. It requires you or your account holder (cell phone, bank, etc.) to make a mistake. So let’s examine a couple of common situations where a little bit of foresight can help you out in a big way.

Email

Phishing

Fending off potential intrusions on your identity is very simple, and as with anything, critical thinking helps protect you. Follow these steps and you’re well on your way to securing yourself.


Preventing Phishing

1. Never pick up a call from an unrecognized number; if it is important enough, the caller will leave a voicemail.

2. Ignore the trappings of presented authority. If you find yourself on the line with someone who seems fishy, tell them you’ll follow up on their corporate or governmental 1-800 number.

3. Ignore immediacy of resolution. Exploiters often engineer situations in which your handing over the information seems to be of the utmost importance and must happen immediately - maybe under the guise of them doing you a favor on an overdraft or unpaid parking ticket.

Securing Your Email

1. NEVER CLICK RANDOM LINKS. Not ever, just don’t do it. Not even if the subject line says it’s really, really cute puppies.

2. As with phishing, ignore the immediacy of resolution. No matter who the sender claims to be, you’re able to contact the organization directly.

3. If the email was received from someone you trust, follow up with them outside of email to confirm the message. Text, Snapchat, etc. are all easy and just as quick methods of following up.

4. There are only so many African princes in the world.
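For the technically inclined, here’s a toy heuristic - emphatically not real security software - showing two of the red flags a suspicious link can carry. The trusted-domain list and URLs are invented for illustration.

```python
from urllib.parse import urlparse

# A toy red-flag checker for links in emails. The trusted domains and
# example URLs below are invented; real phishing defense goes much deeper.
TRUSTED_DOMAINS = {"yourbank.com", "irs.gov"}

def looks_suspicious(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Red flag 1: a raw IP address where a domain name should be.
    if host.replace(".", "").isdigit():
        return True
    # Red flag 2: a trusted name buried inside a domain you don't
    # actually trust, e.g. yourbank.com.account-verify.ru.
    registered = ".".join(host.split(".")[-2:])  # the part that's really owned
    if registered not in TRUSTED_DOMAINS:
        if any(trusted.split(".")[0] in host for trusted in TRUSTED_DOMAINS):
            return True
    return False

print(looks_suspicious("http://yourbank.com.account-verify.ru/login"))  # True
print(looks_suspicious("https://yourbank.com/login"))                   # False
```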


The Strange Web We’ve Weaved

The challenges facing our collective civilizations are severe, and yet, where technology is concerned, most people’s interactions with it remain limited. I see a growing divide between those inside the technology industry and those outside of it. This divide is exacerbated by tech operating inside of a black box, hidden behind seemingly obscure terminology and vernacular. That growing divide is a risk to our thriving as a nation and species, and the muddied discourse on social media is evidence of that divide. This is a time for people to come together to use the tools in our diverse toolkits to collectively solve the problems we face. I’ve come to the table to humbly say “I get it. Now let’s move forward.”

There was not, nor will there be, a billboard sign proclaiming ‘the web is complete, we did it!’ There has been no clear indication that we have transitioned to a digital society where every detail of our lives is catalogued somewhere in the ‘cloud.’ We understand challenges in our physical world pretty well, for the most part. We’re able to predict the weather a few days ahead of time. We’re able to build intricately engineered skyscrapers. We can logistically ship a product across the world to arrive just in time to restock. Moving through a physical environment is something we understand well. Moving within a digital world, not so much.

E-commerce is a natural market force born of a developed nation’s desires and needs. Order today, here tomorrow. It’s simple to understand how technology augments our physical world. What’s more difficult to understand are the abstract threats and risks to our digital worlds. Because it’s not readily apparent how all the factors of the internet tie together, it’s easy to dismiss the issues as too dense to understand or unimportant. I contend that this is very much not the case. What we secure as rights in our present time will have an impact on how technology develops for generations.

Still, there is no easy way for us to visualize, and then truly feel, the extent of the threat. Rollercoasters are successful because they trick our brain’s fear response into a primal urge to protect ourselves. Life tends to be full of little tricks like that. The TV show The Big Bang Theory, for example, uses laugh tracks to trick your brain into thinking the show is funny, when in actuality it is not funny at all.4 For your day-to-day technology use, the most visceral reaction you can have is realizing you sent the wrong private image to the wrong recipient. The threats we face are challenging and fraught with pitfalls; navigating these issues will require some levity in our approach.

Food for thought: in the totality of the history of human expression, how many curse words would you think there are? Take a wild guess. No seriously, take a guess, because it is literally innumerable. A billion years from now, when alien archaeologists are digging through the ruins of civilization, they’re going to find two things about the human race. One - humans loved plastic, and two - we loved curse words. Curse words are likely the first words anyone learns of a new language. We love their expressiveness. That’s the point; curse words are immediate, instinctual catharsis. It’s a good thing too, because we’ll need a few to get through this chapter.

The emoji, like slang, provides a means of codifying meaning, modulating the impact of a phrase. Oddly enough, it makes expressing sentiment across a large swathe of people possible. Twitter recently announced the most popular emojis by country for the year 2016. Given all that’s been going on in the news in 2016, the United States most frequently used the following emoji:

Fig 5-1. 2016’s preview of 2017

To me this is a pretty clear indication of the psyche of my country. Other times it can be tricky to determine an emoji’s inferred meaning, but they can nonetheless express meaning while also adding levity. Which is good, because in order to summarize the extent of “what can go wrong” we’ll need to use them frequently to punctuate the frustration.

The Bedrock of the Modern Web, Advertising

Over my 8 years in digital media, I’ve enjoyed the learning experiences it has afforded me. I’d know far less about how the web operates if I’d never joined the advertising world. The learning curve was especially evident in my experience within startups. Startups operate on the bleeding edge of markets by leveraging technology to build solutions for initially weak market signals. Their nimbleness and technology stack must be spectacular to enable them to capitalize on those market signals. This holds true especially for advertising, the realm where large players like Google and Facebook reign supreme.

My pursuit of contrasting human experiences and passion for learning about technology led me to digital advertising. I wanted to work at the intersection of technology and design, and at the time digital advertising was where I could get my foot in the door and start paying off those student loans. I chose to specialize in rich media because large-format creative reminds me of print advertising’s golden age: a simple concept and strong copy. The reality was that digital advertising campaigns are complicated and nuanced affairs, sometimes requiring months of project work.

The tectonic shift known as ‘programmatic advertising’ has reforged the entire advertising industry into a data-driven arbitrage of ad space. Finding the eyeballs to show these advertising campaigns to relies on cookie data. A cookie stores information on your computer to be accessed by the web browser at a later time. The tracking of users’ behaviors has been building over the years, and with each new advancement in web technology there have been additional business opportunities. The evolution of web technology has presented many new revenue sources, though with an ethics-based question mark punctuating the word ‘privacy.’ In effect, the industry becomes less defined, more nebulous, and, given access to personal details of web users’ lives, potentially more ethically ambiguous.
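To show just how mundane a cookie really is, here’s a sketch, using Python’s standard library, of the kind of ‘Set-Cookie’ instruction an ad server might send along with an ad. The names and values are invented; the mechanics are real.

```python
from http.cookies import SimpleCookie

# What a tracking cookie boils down to: a tiny key/value pair the server
# asks your browser to store and send back on every later visit.
cookie = SimpleCookie()
cookie["uid"] = "a81f-3c29-77d0"               # an anonymous-looking user ID
cookie["uid"]["max-age"] = 60 * 60 * 24 * 365  # persists for a year
cookie["uid"]["domain"] = ".example-adnetwork.com"

# The header a server would send alongside the ad, e.g.:
# Set-Cookie: uid=a81f-3c29-77d0; Domain=.example-adnetwork.com; Max-Age=31536000
print(cookie.output())
```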

As time passed and the underlying technology continued to advance, targeted ads became ever more granular, all in the pursuit of one metric: ROI (return on investment). Within the world of digital advertising, there have long been comparisons to TV advertising budgets. As you can imagine, TV advertising budgets are giant but have always been very separate from digital advertising, and for good reason: they are completely different beasts in terms of tracking success. TV relies on audience, demographic, and top-line or basic metrics for measuring the success of an ad. One of the boons of digital advertising has always been the degree to which you can track metrics. Every click or tap on a button or image in a digital ad can be measured and reported on, while TV metrics are only able to convey basic details.

Digital advertising has the ability to intricately measure the success of a campaign out of the box, which makes it far easier to determine an advertiser’s ROI. This is playing out over time, and in 2016, for the first time, digital advertising spend will outpace TV budgets.5 The ability to measure any interactivity out of the box instantly places the capabilities of digital advertising past those of television advertising. Companies like Google and Facebook have leveraged this data-heavy approach to build out multi-billion dollar, multinational institutions. Smaller companies operating within the same space have been able to carve out very healthy businesses from the multi-billion dollar crumbs left on the table by Google and Facebook.

Specifically, these companies are building out the predictive path you will take toward a purchase. The strategy maps the common progression that consumers take to purchasing a given product. There are countless products in existence, and each of them has pros and cons with varying emphasis depending on the consumer’s preferences. Given that the average web user emits a scent trail of interest via their cookies and user data - data stored locally on the device that also increases convenience and speed when surfing the web - marketers are increasingly able to identify and communicate with their desired audience at their desired time. The result is the increasingly granular ability of marketers to identify you as a cluster of characteristics, while offering nothing but calls to consumer action in return. Before we investigate this trend further, we need to cover the basics. Broadly speaking, there are two types of advertising on the web.


Awareness

Summary: Ads meant to build brand recognition by putting the message in front of as many relevant eyeballs as possible.

Cost Method: CPM (cost per mille) - priced per thousand impressions.

Notes: Every single instance of someone viewing the ad is counted and bundled by the thousands. See the same ad 3 times on the same page? That is 3 ‘impressions’ or ‘views’ that will be credited to the site. Depending on the deal, prices can range from as low as $1 CPMs up to $75-plus for premium content publishers like Hearst and Conde Nast.

Direct Response

Summary: Ads meant to drive a specific, measurable action, like a click, sign-up, or purchase.

Cost Method: CPC/CPA (cost per click / cost per action) - priced per completed action.

Notes: Direct response makes heavy use of user data and cookies in the application of ‘attribution.’ Attribution within digital advertising exists because users visit so many pages and view so many ads. In order for websites to get paid for a user’s action, it needs to be determined who is responsible for that user going through with the action. Advertisers use the data contained in the user’s cookie to determine who last served the user the ad prior to the action.

Fig 5-2. It comes back to return on investment (ROI) for advertisers; it’s all about dat purchase!

The above purchase funnel also acts as a signpost for where digital advertising has gone the past eight or so years. Initially, advertisers did not have a great way to track users across the web and across their purchases. As web technology matured, marketing budgets flowed from television onto digital banner ads as well as search advertising. Being a data-heavy method of marketing to users, search accounts for a large percentage of the industry. Marketers prefer hard metrics when determining ROI, and awareness-type advertising is a bit more difficult to tie back to a successful campaign. Tracking the purchase provides the ability to definitively say: this user saw our ad and clicked on it, then purchased the product. Determining the number of clicks on a campaign is dead simple to track and is the most common indication of a successful campaign.

What is more difficult is determining who is responsible for driving the purchase when the user has seen the same ad five times from five different advertising companies. The idea is to provide specific messaging to the user through each stage of the above funnel. Each specific message also provides a touchpoint - an avenue of showing someone an ad - on users that viewed the ad, providing further attribution of ROI to the ad provider. The touchpoints also occur outside of display advertising. Advertising campaigns are often filled with multiple tactics for reaching a target audience; there can be many websites, if not thousands, plus channels like social (Facebook or Twitter ads). This is part of the reason you see so many ads on a page - there are many companies attempting to get credit for the path of your conversion.
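One common (if contested) answer is ‘last-touch’ attribution: credit whichever company served the final ad before the purchase. Here’s a miniature sketch of the logic, with invented event data:

```python
# Last-touch attribution in miniature: replay the ad events stored on a
# user's cookie and credit whoever served the final ad before purchase.
ad_events = [
    ("09:01", "AdCoA"), ("09:45", "AdCoB"),
    ("11:20", "AdCoC"), ("11:52", "AdCoB"),
]
purchase_time = "12:03"

def last_touch(events, purchase):
    before = [(t, company) for t, company in events if t < purchase]
    return max(before)[1] if before else None  # latest timestamp wins

print(last_touch(ad_events, purchase_time))  # AdCoB gets the credit
```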

Websites as Creators of Content

In digital marketing, in order to effectively sell advertising real estate on your web page, you need to have a good idea of how many and what type of people visit your site. This allows advertisers to speak directly to the audiences they want to know about their product. The leading service that measures audiences is known as Comscore, which, among other services, directly measures the number of people visiting websites. For example, the sites with the highest number of unique visitors within the U.S. for the month of January 2017 are as follows:6

Fig 5-3. Concrete opportunities to communicate with audiences.

If you’re asking yourself what ‘sites’ means, great question! Let’s say you start a website devoted to the love of all things knitting. Let’s call it ‘StraightOuttaYarn.com’ - congratulations, not only are you pursuing your passions but you’ve got a dope URL. I wish I’d thought of it. You love knitting so much that you feel compelled to share that love with anyone who’s willing to listen, bravo. I’d have picked something more exciting, but who am I to judge? So you go on your way creating content for the site and you build a following of fans who read your every tip or tutorial on knitting, great job! You decide that you’ve got a loyal audience who supports your love of knitting, and you want to spend your time doing that and only that. How do you do that while keeping in mind the cost of supporting a website and your monthly bills? You have two options these days:


Donations

Advertising

Donations can only go so far, so let’s walk through the advertising route. You can sign up with an advertising network like Google, who takes care of the entire process for you. All you need to do is copy and paste some snippets of HTML code into your pages on StraightOuttaYarn.com. They package your audience data along with that of other sites dedicated to knitting and crafts. Then they go out to advertising agencies to pitch that collection of sites to become a part of an advertiser like Michaels’ advertising campaign. Through this process, StraightOuttaYarn.com becomes part of ‘Google Sites.’

Wait, so Google sites are visited by 245 million unique people a month? As far as the term ‘unique’ goes in web technology, yes, 245 million different individuals. There is deeper nuance here, but for our purposes we’ll keep it brief. As noted above, nearly every page we visit has multiple ads on it, and that makes for a ton of money to be made. You’re damn right: $74.5 billion worth of ads a year, including search-based advertising, if your name is Google. So how do companies like Google, Facebook, Microsoft, or Amazon sell advertisements across all those websites? Well, they take the software solution of course.

Where Advertising Goes Wrong

Automated advertising utilizes the power of software and algorithms to purchase advertising inventory across a broad number of sites or audiences. The ad buyer sets their target audience segment and the amount of money they are willing to pay to show that target audience their advertisement. When the target user loads the webpage, the advertising space on that page is auctioned off among several different companies, all bidding in real time for those real eyeballs. All of our eyes are commoditized into different ‘audiences’ based on our demographics, past purchase history, you name it. This process takes milliseconds and can run across billions of impressions without any further interaction from humans. In this section, we’ll review how advertising has been both a boon and an unfortunate necessity for web users everywhere.
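Here’s a toy sketch of what that millisecond auction boils down to. This is not any real exchange’s API - the buyers, segments, and bids are invented - but the winner-pays-just-above-the-runner-up rule mirrors the second-price auctions real exchanges have commonly used.

```python
# Invented buyers, each targeting one audience segment with a maximum bid.
buyers = [
    {"name": "SneakerBrand", "target": "runners",  "max_bid": 2.50},
    {"name": "YarnShop",     "target": "crafters", "max_bid": 1.25},
    {"name": "CraftCo",      "target": "crafters", "max_bid": 0.80},
]

def run_auction(visitor_segment):
    """Auction one ad slot among buyers targeting this visitor's segment."""
    bids = sorted((b for b in buyers if b["target"] == visitor_segment),
                  key=lambda b: b["max_bid"], reverse=True)
    if not bids:
        return None  # unsold inventory
    winner = bids[0]
    # Second-price rule: pay one cent above the runner-up's bid.
    price = bids[1]["max_bid"] + 0.01 if len(bids) > 1 else winner["max_bid"]
    return winner["name"], round(price, 2)

print(run_auction("crafters"))  # ('YarnShop', 0.81)
```

Multiply this by every ad slot on every page load, and you get a market that clears billions of times a day with no human in the loop.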

Your eyeballs command money because you are you, and you represent purchasing power. In today’s digital world, your value also includes the menagerie of personal data that follows in your wake as you surf the web. Simply put - as web users, we emit a gaseous miasma of identifying signals as we venture around the web. In order for marketers to convey the buying information for their product, they need to adjust the communications for each audience segment. After all, a soccer dad won’t care about the same features as a high-powered executive. In order to speak to each of these segments about the qualities they presumably care about, the message needs to be customized (and seemingly authentic) to the targeted audience. The internet is effectively the world’s largest marketplace, with all sorts of people who care about different qualities of a product. To effectively communicate with these people, they are broken down into homogeneous buckets, or audience segments. It’s a reductive POV on people, but it allows multi-million dollar marketing budgets to be broken down into component pieces and tactically implemented. Audience segments can be broken down by age, level of education, income, ethnicity, and gender. Marketers have the increasing ability to correlate all of this data and combine it to target the same user across devices.

This tactic will experience a dramatic uptick in use as intelligent decisioning is added to targeting capabilities. In the span of an eye blink, marketers are able to determine attributes about the user including past purchase history, geographic location, demographics, etc. Before that same eye shuts, a series of if-then statements can select messaging appropriate for their concept of who they think you are. Leveraging boolean logic, they follow a simple pattern: if this variable is present, then serve this selected ad and messaging. The operative words here are ‘if’ and ‘then,’ which when combined create a level of intricacy not previously achieved in advertising. This trend is and will continue to grow due to its ability to serve very specific messaging to very specific audience segments. It won’t take long for AI to be used to supplement this targeting tactic. As a result of inefficient marketing, advertisers and publishers have thrown so much advertising at the average web user that they cognitively ignore anything that isn’t the content they were looking for.
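The if-then logic is simple enough to sketch in a few lines. Every name here - the segments, attributes, and creative labels - is invented, but the shape of the decisioning is the point:

```python
# An illustrative rules ladder for picking an ad creative; all of the
# segments and creative names below are hypothetical.
def choose_ad(user):
    if user.get("segment") == "soccer_parent":
        return "minivan_safety_creative"
    elif user.get("segment") == "executive" and user.get("income") == "high":
        return "luxury_performance_creative"
    elif user.get("purchased_before"):
        return "loyalty_discount_creative"
    else:
        return "generic_brand_creative"  # fallback when nothing matches

print(choose_ad({"segment": "executive", "income": "high"}))
# -> luxury_performance_creative
```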

Up until around 2012, the advertising industry primarily consisted of display advertising, aka desktop/laptop-based advertising. ‘The year of mobile,’ as pundits and journalists referred to it, was declared more than a few times, but mobile had finally caught on in mainstream budget planning for advertising campaigns that year. Initially, the unique nature of mobile devices acted as a barrier to adoption because:


A) The technology to really introduce intuitive use was still in progress, and

B) The promise of mobile for advertising was not yet intuitive to marketers



Explaining the possibilities that brands have on mobile devices is an easy sell, as mobile devices are very tactile, providing unique interactions like swiping. An example of this is the Snapchat filter: a tap on your face unlocks an interactive bit of branding over your image. Mobile touchscreens allow you to tap buttons as opposed to controlling a disembodied cursor. This additional depth of interaction alone was enough to make advertisers stop and stare; however, the inclusion of multiple sensors set advertising aflame. Interactions with an ad that include the device’s functionality can facilitate truly unique and additive experiences. Snapchat is presently leveraging mobile’s unique nuances very well. More on that below.

The innovation delivered to the web by mobile devices has continued to be a great market driver. As another example of an innovation driver, we can look to aligning web standards and the migration to HTML5 over Flash. The arrival of new standards like HTML5, the fifth iteration of the web’s markup language, heralded the introduction of new features that enabled native media support. However, Flash was still the preferred technology for building advertisements, clutching for relevance in a world where media was now a native feature of the web. With the number of tag-along web trackers and web advertising designers, the migration within the industry took a drawn-out route. In 2010 Steve Jobs told Flash to ‘go pound salt,’7 and Apple devices would no longer support it. However, given digital advertising’s foundation in Flash as a preferred technology, and it being supported by alternate ad platforms such as Google, Flash was enabled to survive an extra seven years. In January 2017, Google finally dropped support of Flash in their digital ads as well.8 Aligning the entire web in terms of standards involves many stakeholders, which can slow down progress in favor of outmoded industry standards.

True to form here, Princeton University ran a study in which they measured the amount of third-party tracking on the top one million sites across 90 million page views. News sites came out on top, with nearly 50 tracking and non-tracking third-party items installed on your local machine, according to Princeton. For context, the average site loads approximately 25 trackers.9 Even worse, much of the data exchanged by data brokers is often sent via insecure protocols. Like any industry, digital advertising is subject to industry-wide shifts in the business, and unless you’re an 800-pound gorilla, it’s easy for these shifts to be beyond any one company’s control. It’s not entirely unexpected that the smaller companies within ad tech would go with the flow on standards. After all, companies have to make money, otherwise why do they exist? Further, why not pursue the same deeply data-driven revenue models the big players are using?

In the category of completely predictable results, users are turning to ad blocking as a method of comfortably browsing the web. Ads suck. Very rarely do they do anything to add to the user’s life. They mostly serve as barriers to viewing content. The video ad-tech startup Unruly performed a survey in which it found “that 90% of US consumers would consider using ad blocking software in the future because they think there are too many ads (59%), they are sick of seeing the same ad (52%) and they find ads which ‘follow them’ around the web ‘creepy’ (59%).”10 Further predictability on the tone-deafness of the digital advertising industry lies in the fact that the IAB, the governing body for digital ads, has lambasted ad blockers, labelling the ad blocking software creators as an “extortionist scheme that exploits consumer disaffection and risks distorting the economics of democratic capitalism.”11 Later in the same paper, they go on to admonish marketers for attaching so many different data providers to a single ad. Websites and marketers are too busy grabbing cash hand over fist to care about terrible user experiences. The phrase user experience refers to the user’s ability to intuitively, and without blatant instruction, navigate a digital environment. User experience includes factors like the latency between a command and ready-to-interact-with buttons on the screen.

The exciting thing about mobile advertising is that it contains many more avenues for creating additive experiences for users. The smartphone and tablet’s stable of internal sensors allows for media-rich features within advertising. Snapchat leverages these innate mobile features to create engaging brand advertisements, as we mentioned above. This type of brand interaction can be a simple bit of relationship building, or convey important product information.

Fig 5-4. - Thanks to technology, you too can be a taco.

Snapchat reports that filters like the above ‘taco yourself’ filter generate 20 seconds of interaction with the brand. According to industry analysts, experiences like this will lead to ad spend on mobile devices overtaking display budgets in 2017. Snapchat is interesting because it clearly labels its advertisements while also offering something additive in return, whether that be an experience or savings. This is the direction I’d like to see advertising go: focus less on converting users into a series of web signals that you massage with data until they’ve purchased. Consumers are not tubes of toothpaste to be squeezed until they’re empty. Given that mobile devices contain multiple internal sensors that can define a user’s location and mindset, we need to ask ourselves sober questions about what we find acceptable. Where is the line drawn between convenience and privacy? Up to this point, the method of approach by data providers in advertising has been to leverage every data point possible. Clearly the advertising industry is lacking the proper questions, let alone forging brave answers. You might want to buckle in, it’s only going to get bumpier in digital advertising.

Where Social Media Goes Wrong

‘Don’t believe everything you see on TV.’ It is a familiar refrain that informed a collective common sense of the television age, yet there’s no equivalent saying for our internet age save for replacing ‘TV’ with ‘internet.’ Therein lies the issue: by simply replacing words, we fail to convey the full breadth of common wisdoms that were once applicable. The internet, of course, differs from TV in many obvious ways; the details of interactions on the web are inherently far more complex. The web, with its complexity in how it operates and exchanges value, has since its dawn been host to every stripe of huckster and snake oil salesman humanity has ever known. The anonymity of the internet provides the means that bad actors need to extract value from the unwitting. As the web has evolved, so too has the capacity for sinister aims. As we’ll examine in this section, social media can fuel the creation and proliferation of websites that offer the internet equivalent of junk food.

Where social media goes wrong is that the platforms have inadvertently provided an avenue for the rise of fake news websites, which knowingly produce polarizing headlines masquerading as real, journalist-produced news. In other words, they editorialize reality in pursuit of viral social media success. When a piece of content goes viral, it drives multitudes of people and attention toward a single site, generating advertising revenue in the process. This type of success can deliver substantial monetary benefits to the source of the virality. Given the broad and complex nature of the internet, a piece of content that goes viral can be both poorly understood and widely distributed, leading to widespread disinformation.

The driving factor is the monetary reward that follows from attracting hundreds of thousands of people, spoils which can number in the hundreds of thousands of dollars. These feckless websites aim to attract as many people as possible, leveraging the reader’s own confirmation bias in the process. By crafting a message that purports the validity of a single point of view, the process of consuming the media becomes a self-reinforcing system aided by exceedingly efficient algorithms. The noteworthy aspect of confirmation bias at play here is that it creates a bubble that prevents consideration of other points of view, one of the most integral aspects of critical thinking.

For example: “10 Ways Coffee Helps You Lose Weight”, “Get a Flat Belly From This One Trick!”, “I’ll Solve All Your Problems For You, Just Click Here!” These headlines are generally so broad-reaching that they would apply to as many people as possible. They won’t give you a flat belly, but they will give you a flat wallet, and they certainly won’t solve your problems for you; hell - the writer may very well be a robot. A ‘bot’ is simply a set of algorithms that perform the same function continuously. At the speed of the modern-day web, these algorithms may be aimed to search for keywords or phrases, then like or retweet them, amplifying the base message further into the social media bubble.
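A bot of this sort is almost embarrassingly simple. Here’s a bare-bones sketch; fetch_new_posts and reshare are hypothetical stand-ins for whatever platform API a real bot would abuse:

```python
import re

# Keywords the bot hunts for; invented for the example.
KEYWORDS = re.compile(r"(flat belly|one trick|lose weight)", re.IGNORECASE)

def amplify(fetch_new_posts, reshare):
    """Scan incoming posts and re-share anything matching the keywords,
    pushing the message deeper into the social media bubble."""
    for post in fetch_new_posts():
        if KEYWORDS.search(post["text"]):
            reshare(post["id"])

# Demo with canned data standing in for a live feed:
sample = [{"id": 1, "text": "Get a Flat Belly From This One Trick!"},
          {"id": 2, "text": "Local knitting meetup this Saturday"}]
amplify(lambda: sample, lambda pid: print(f"re-shared post {pid}"))
```

Run a loop like this across thousands of accounts and you have an amplification network for pennies.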

So why are these websites even around? People interact with these articles all the time, hence the term ‘click-bait.’ Interaction, in turn, garners advertising revenue: these sites are external to Google, yet by being part of ‘Google Sites’ or ‘Yahoo Sites,’ Google finds advertisers to run across these externally managed websites. To be clear, fake news and scam sites are not intentionally kept around by these advertising giants, but they oftentimes slip through the cracks and can claim significant dollar value, depending on the amount of traffic visiting the site. This policing used to be handled by actual people who would manage the stable of sites within a network. With the advent of automated advertising, monitoring a collection of websites became the realm of algorithms, vulnerable to fraudulently garnered impressions (remember cost per view) and clicks (cost per click). The means of generating revenue via algorithms is geared toward specific emotion, with the intent to attract attention.

“Anger, anxiety, humor, excitement, inspiration, surprise - all of these are punchy emotions that clickbait headlines rely on,”12 says Jonah Berger, author and professor of marketing at the Wharton School of the University of Pennsylvania. Here is where things begin to feel eerily familiar: these same backdoors to our psyche are present within social media. If you recall from chapter two, we reviewed the role of the default mode network in idle thought. For a quick refresh - the default mode network is when the brain says “job’s done, we deserve a break,” and kicks into idle mode. To dive a bit deeper, whether we’re watching a movie or listening to the radio while stuck in our commute, the default mode network kicks into gear in a fraction of a second.13 While engaging in the default mode, we tend to form thoughts pertaining to the self, the other, and the past and future. Parsing these features out:


Default Mode Network Thought Lines

The Self

The Other

The Past and Future


If this sounds like a familiar train of thought, that’s because it is ingrained in our daily habits in the form of social media. All of the above features are documented frequently and with high levels of detail, and they are expressions of our own default mode networks. Further, given the evidence of texting altering our brain wave patterns presented earlier, I’d conclude that the same holds true for social media use. Every social media expression is punctuated as a meme or photographic evidence of our subjective experiences, where acknowledging our similarities is expressed as a click or tap in passing. There are options for meaning, but this is our idle time and ‘I’m just scrolling through.’ As expressed by Mark Zuckerberg in the opening quote of the chapter, “we hope to rewire the way people spread and consume information.”

Congratulations: like interlocking gears, the default mode network is co-opted into expressing an algorithm’s perception of you, accelerated through the lens of your available data. Evidence of social media’s effects on us includes data suggesting that when our political beliefs are challenged, we dig our heels in and resist persuasion.14 Politics represents a tribal desire for belonging, and the resistance to being challenged results from identity-related beliefs initiating a mental modelling of the self through the default mode network.15 Recall the idea of decision making being pre-wired, like the electrical wiring inside a home: these pathways in the brain are a function of previously established pathways, à la knowing by heart the musical theme of the Star Wars or Superman films. The difference is that instead of being lined with positive emotion like these films, the brain pathways that challenge identity are lined with negative emotions. I believe we’ll continue to find substantive links correlating the interplay between our brain’s default mode network and social media use. If we’re to genuinely use social media as a tool, we need to be sure we find and mitigate the risks.

As noted previously, our brains are able to move from an active task to idle leisure time in a fraction of a second. The instantaneous nature of the internet facilitates this switch in the time it takes a website to load. For Facebook’s 1.18 billion daily active users,16 content is produced and interacted with featuring the familiar patterns of default mode thinking we discussed earlier. For example, posting a picture of your new car or baby perpetuates the comparative social behavior of ‘keeping up with the Joneses.’ In social media, every update is lent credence via notifications highlighting how others’ lives differ from our own. This is good for the economy, but bad for happiness overall, as social interaction has largely migrated to the web. This is important because, similar to the way that eating vegetables equates to health and sugar-laden diets lead to poor health, social media consumption also has effects on our overall health.

It has been noted that there is a correlation between social media use and signifiers of depression. A study conducted by the University of Pittsburgh School of Medicine found that 1,781 U.S. adults between 18 and 32 spent an average of 61 minutes per day on social media. The study’s conclusion is based on a self-reinforcing model of depression, which they posit is spurred on by frequent social media use.17 To be clear, I am not implying social media is the cause of pain and suffering. Rather, spending over an hour a day consuming a never-ending feed of content featuring the highlights of others’ lives is the modern version of keeping up with the Joneses. The use of social media informs our outward view; discerning whether or not that perception includes unreliable information underscores the very purpose of this book - awareness.

As we’ve discussed, in the tech industry, users are the metric of success, and ensuring those users stay on the site provides additional revenue via advertising. To put a finer point on it, success is measured not just by users but by daily active users - the number of users actively using an app - which informs the value of a product or startup. Given this simple market truth, ensuring the user stays on your app is the common survival goal of every app pumped out of Silicon Valley. We’ve known for some time that social media acts like an echo chamber: users can shout whatever they want, but only the people who are statistically deemed likely to interact with that user will hear it. Here again, confirmation bias18 ensures that people naturally seek affirmation of their own beliefs. This comfort-seeking behavior is, in turn, honed by algorithms designed to connect you with content you’re likely to interact with or view. Though you may have many friends on Facebook, you see only what Facebook’s algorithms think you want to see.
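To illustrate, here’s a toy feed ranker in the spirit of those algorithms. The scoring weights and affinity numbers are invented; real systems use far more signals, but the principle - predicted engagement floats to the top - is the same:

```python
def rank_feed(posts, affinities):
    """Sort posts by a crude engagement prediction: your affinity for the
    author, multiplied by how much interaction the post already has."""
    def score(post):
        affinity = affinities.get(post["author"], 0.1)  # default: stranger
        return affinity * (1 + post["likes"] + 2 * post["comments"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"author": "college_friend", "likes": 5,   "comments": 1},
    {"author": "news_page",      "likes": 900, "comments": 40},
    {"author": "aunt",           "likes": 2,   "comments": 0},
]
affinities = {"aunt": 0.9, "college_friend": 0.4, "news_page": 0.2}
for post in rank_feed(posts, affinities):
    print(post["author"])  # news_page, college_friend, aunt
```

Notice your aunt’s post lands dead last despite your high affinity for her; the raw engagement on the viral post swamped it. Content that already spreads gets shown more, which is the echo chamber in miniature.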

As these algorithms propagate across Facebook’s self-reported 1.79 billion monthly active users,19 these users are viewing content shared across the social media sphere by their friends and family. These echo chambers reinforce beliefs, which are further reinforced by groupthink. The cause for concern is real: as echo chambers continue to fuel this merry-go-round of bullshit, people become polarized because the perceived discourse is spun into extreme perception. Cass Sunstein, legal scholar and Robert Walmsley University Professor at Harvard Law School, studied group and individual behavior during discourse. He found the troubling aspect to be polarization via public discussion, which occurs and is exhibited throughout all strata of society, no matter the class or group. The crux of the issue is that public deliberation on a topic can cause further polarization within the individuals and groups deliberating.20 Patricia Wallace, author of The Psychology of the Internet, describes the factors that may provide the catalyst for polarization on the web. She notes that the “plausible hypothesis is that the Internet-like setting is most likely to create a strong tendency toward group polarization when the members of the group feel some sense of group identity.”21 The perception of a group identity becomes embedded by the tacit participation of its community members.

Acclaimed author and media theorist Henry Jenkins defines this type of behavior as “participatory culture.” In his book Convergence Culture: Where Old and New Media Collide, he describes the means by which media has become available anywhere, anytime as a media convergence,22 which has enabled content and media to be spread democratically and virally. Jenkins states, “convergence occurs within the brains of individual consumers. Yet, each of us constructs our own personal mythology from bits and fragments of information we have extracted from the ongoing flow of media around us and transformed into resources through which we make sense of our everyday lives.”23 Given the massive field of media, we tend to want to communicate about the media that informs our lives. An example could include water cooler talk on Monday mornings discussing the latest episode of The Walking Dead. This induces further virality, should a piece of media bloom into a socially viral statement. Recognizing the fact that the audience dictates the content leads to a type of collective intelligence, Jenkins concludes. Given our species’ prosocial behavior toward each other, which has helped give rise to our still-budding civilization, we understand that different skills and expertise are required to make the world go ’round. This broadly understood implication, while being incredibly fruitful, also has the downside of including cognitive biases.

Social media platforms can amplify biases by providing a megaphone to individuals who may be using inaccurate data. This leads to an unflinching groupthink, or ‘hivemind,’ that naturally occurs in human communities. This comes into play via what is known as the Dunning-Kruger effect. In the fall of 1999, David Dunning and Justin Kruger found that across the board, no matter the intelligence level or activity involved, people overestimate their own competence and the extent of it, along with their judgements of others’ competence.24 It’s like the blind leading the blind, and it doesn’t matter whether it is chess, reading comprehension, practicing medicine, or even driving. For example, when is the last time you heard someone complain about the drivers of such-and-such geographic location? Everyone, everywhere seems to drive like a crazy person! When you recognize that common feature of humans, it highlights the astonishing odds we’ve defied in making it this far, truly.

We’re actively scrolling through feeds, reacting and deciding what to like or dislike, all the while relying on emotion-driven, system one-style decision making. Repeating this behavior through dozens of posts of images, videos, viral videos, news, and fake news has a way of capitalizing on system one-style thinking. We react to content with a ‘like,’ displaying the gratification of our public statement of approval, and continue down the feed. This means that content that becomes viral is passed along as a currency or token of truth. We laugh or become outraged with the appropriate emotion and vigor, and then move on to the next piece of content, in turn passing on the emotion as a commodity.

Think of it like this: we all know an individual who can recite sports statistics with a higher degree of accuracy than most care to achieve. Does this qualify that person to comment on the economy? Likely not, though not impossible. The point here is that when assessing the aptitude of ourselves or others, we continue to run into cognitive biases which impair judgement. Given that social media’s currency is participation in liking, heart-ing, retweeting, etc., negative interactions can occur and rapidly alter the depth of discourse. Dissenting opinions can be rabidly squashed by a few members of the ingroup participating in groupthink out of confirmation bias and proceeding to comment or interact negatively in response. Dissenting thought, a fuel for robust societies, is dampened by the preference for comfort. The relevance and accuracy of the opinion or solution may not matter as the argument gets buried. Depending on the proclivity of the surrounding community, the strength of the reaction varies from simple downvotes to banning the user’s membership. This happens thousands of times a day across the web.

We measure the velocity of these commoditized statistics with the aid of computational frameworks like big data. The features and software underlying these websites, enable algorithms to grasp and exploit how we connect with one another. As with all risk and opportunity represented by the advancements of technology, it can surface unsettling characteristics of humanity. Social media has amplified the voice of anyone with a connection and a desire to interact. In many instances those voices are shouts for help or attention. Yet, we are shoulder-shruggingly unsure of how to prepare the stage for organized and unifying discourse across the web... more on that at the conclusion of the chapter.

The participation economy based around clicking ‘like’ or ‘heart’ represents a risk of focusing on the wrong metrics altogether. In the haste of our pursuit of economic gain via user value, we’ve created software that has co-opted a natural process in our brain to polarize our organically created relationships. Technology isn’t bringing us closer, it’s driving us apart. The awakening machine intelligence arrives in the form of social media algorithms that have run amok. This is not at all the reflection of our better nature; it’s the reflective gaze of humanity’s co-opted default mode network. The algorithms perform their tasks of keeping users engaged, reflecting our best and worst natures with machine efficiency. In its wake, social media becomes a shambling paranoid android that highlights our worst features, exploiting idle thoughts to embed an exaggerated perception of reality. This leaves a complex and exploitable economy nobody fully understands, yet we ascribe great capital and value to it all the same.

No longer can we simply believe the future will build itself. We must set aside simple answers to complex issues in pursuit of our better nature. We owe it to ourselves to reflect on how this mode of thought is fraught with bias and cognitive dissonance. The path false information takes to belief is given a shortcut when social media is used during an idle brain pattern; the low barrier to belief is easily hurdled by fake news and exaggerated lifestyles. The results are apparent to any astute observer. People are subject to cognitive biases, and we should not be surprised when leaders in social media also let us down, for the simple fact that they are also people. Here again, we are the X-factor in both the risks and the opportunity; technology is the sum of the ambitions of its wielder. When governing society at the scale of nations, acting in unity can seem like a fantasy out of fiction, yet we slip into single-mindedness in a fraction of a second as soon as our ‘feeds’ load. The idle pattern presents little obstruction to incorrect information, fueling our polarization. Without strident steps toward rectifying threats of our own making, societal discourse will fall to spoil. If we cannot understand the complexity of today, how can we expect to prepare ourselves for the complexity of tomorrow?

We risk a similar trap in using system one-style decision making when surfing social media. The reason being, when quickly making a decision we don’t always consider how the content may reinforce our own biases, which can result in warm fuzzy feelings rewarding the brain with comfort alongside a hidden side of cognitive dissonance. From the interpersonal level on up to societal discourse, communicating sober and deep concern is a difficult task inside any human-to-human relationship, let alone when discussing policy affecting millions of people. We perceive the world through our own personal experiences, and attempting to communicate what that experience is like is worth every attempt. Listening to and empathizing with others is the societal salve we need but cannot yet bring ourselves to recognize in a meaningful way.

Ultimately, the failures of social media are representative of the challenges posed by technology in general. We can build the tools to aim for a brighter tomorrow, yet we’ll struggle to provide a solution that works for everyone. Painting with a broad brush stroke is a bias-laden approach to any problem-solving effort. Understanding the need for local solutions to local problems is required to fully bend communications technology to our whims. Missing the mark is beside the point: social media was supposed to be a unifying feature of tech, not a maximized web interaction exploited to disagree. The health of a society is predicated on the robustness of its tools and the sum of its ambition. We need to carefully examine social media’s role in defining the line between convenience and comfort in our lives. Following comfort or convenience to the extreme produces bubbles of groupthink that pose more risk than opportunity. Doing nothing poses a greater risk still. It boils down to this: don’t believe everything you see on the internet.

Where Privacy Goes Wrong

At present, websites assume that by your continued visits to their site, you imply consent for them to collect data on you. Every site must have a privacy policy that details what data is collected, used, or even sold to third parties. Again, your consent is derived from the continued usage of the site. This type of automatic opt-in is a feeble handwave at tackling privacy as an issue. Users need more access and power over how and where their data is used, while education on technology and privacy rights is a close second in necessity. If you were to ask the average web user, they would most likely prefer to opt out of the tracking altogether.

The way technological innovation works inside of our economy is that the operators of startups can, at times, operate within the barest legal frameworks. This is part of how Uber has grown at massive rates since inception. Given that Uber’s business model is tangential to the taxi business, they’re able to legally argue that some laws governing taxis do not apply to Uber’s business. Further, the amount of varied legislation at the state and local level helps to provide cover as they toe the line of legality. This leads to what is known as ‘disruption’ of an industry. Incumbents like the taxi industry have staked out their claim to business and seek to defend their revenue generators. The claim of the disruptors is that their service or product exists to satisfy a unique need of a market. In the instance of Uber, serving that unique market, which also happens to include traditional taxi seekers, is essential to the successful operation of their business. In the instance of digital advertising, I believe that same ‘wink and nod’ is taking place. Take the recent example of Verizon being fined $1.35 million for the use of supercookies.25

You can think of a supercookie as similar to regular web cookies in that it helps store your personalized web experience. However, imagine you’re hungry for a cookie: you go to the store, pick up a pre-packaged cookie, and begin to examine its cookie goodness. Upon inspecting the packaging, you find that the package lists intimate details about you. Creepy, so you put that package down and pick up another cookie. Same damn thing, creepy details. You do this a few more times, all with the same result. The reason this happens, in the case of Verizon, is that they repeatedly and automatically write your details on the cookie package after you touch it. This made these supercookies a pernicious issue that was not simple to get rid of. Quite simply, Verizon slaps a sticker on you no matter where you go. For context, $1.35 million is approximately 1/100,000 of Verizon’s revenue for 2015.
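Mechanically, the supercookie wasn’t a cookie at all: Verizon’s network injected an identifying header (reported as X-UIDH) into subscribers’ web traffic. Here’s a conceptual sketch of that injection; everything but the header name is invented:

```python
# A carrier-level proxy stamping a tracking header onto every request.
# The user can clear cookies all day; the network re-applies the 'sticker.'
def carrier_proxy(request, subscriber_id):
    request["headers"]["X-UIDH"] = subscriber_id  # injected in transit
    return request

req = {"url": "http://straightouttayarn.com", "headers": {}}
print(carrier_proxy(req, "hypothetical-subscriber-token"))
```

Because the injection happens on the network, between your device and the website, no browser setting could remove it.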

If you believe, as I do, that property rights extend to the data you generate on the web, then if a marketer wants to use your data, there should be some form of recompense. The software-as-a-service model is a good start, but not every company has something to offer in return for sniffing out your data. If you had valuable real estate along a highway and a marketer wanted to erect a billboard on your land, you wouldn’t agree to the advertisement without being compensated. Your personal agency over the data that you generate and maintain is exchanged for convenient experiences. In some instances, sharing this data provides public benefit, like Google using geo-locational data to determine the presence of a traffic jam.

However, as targeted as marketing can be, here is where it gets worse... All of those same marketer-leveraged signals that you emit on the web are fair game for state actors to track you as well. Edward Snowden revealed to the world that the United States government goes to remarkable lengths to track and catalogue the data generated by its own citizens. And this is supposed to be the land of the free, a representative republic. It’s not difficult to imagine how badly this technology could go in the wrong hands. It would be a trivial matter to co-opt these signals as an exercise of power, especially considering it has been reported that the budget of the FBI’s Operational Technology Division is between $600 and $800 million, though officials refused to confirm the exact amount.26 The only way to begin to reduce this risk is to think of technology as a worldwide platform with worldwide implications to its usage. An example of this is the FBI’s battle with Apple over the encryption of a terrorist’s iPhone. For those who did not closely follow the case, the progression went something like this:


FBI: “Federal judge says gimme the key to unlock this terrorist’s phone and, by extension, the encryption of nearly every iPhone sold.”27

Apple: “Not just no, but hell no. If we did it, it would create a security flaw that is usable by any third rate despot.”28

FBI: “As they say ‘who gives a shit?’, are you supporting terrorism? We’ll smear you to the public.”29

Apple: “Still not going to do it. A substantial chunk of the people and economy agrees with us, you need to get the hell out of here.”30

FBI: “Just kidding, the phone was already locked out and we paid some hackers a million dollars of taxpayer money... We just wanted access to everyone else’s data. HAHA”31

Russian Hackers: “Greetings comrades, security flaws are very useful. Am I right, NSA?”32

NSA:

So, a collective group of hackers exploited a security flaw in software used by our NSA to produce a hack of their own. Security organizations like the NSA and FBI tend to collect what are known as ‘zero day exploits,’ wherein they intentionally attempt to hack popular hardware and software to collect vulnerabilities. Oftentimes when new features and functions are added to software, it presents the opportunity for a new ‘hole’ in security. They probe protocols and core software, then catalog the flaws for later exploitation at a time of their choosing, all while not revealing the flaw to the company that produces the software. Proponents claim this enables heightened intelligence surveillance, but I don’t see it that way. Critics are quick to point out that withholding flaws leaves the citizens of the web open and primed for tracking and manipulation on completely unprecedented levels. Over time our favorite websites and apps began to remember our passwords and serve content tailored to our likes. It sets the stage for enhanced ad targeting developing in parallel. Framed as ‘consumer benefits,’ which are difficult to argue against, these personal touches are the cornerstones of potential trojan horses, and they are not limited in scope to just advertising. They can also facilitate wholesale censorship. For example, at the recently held Decentralized Web Summit, famed digital activist and engineer Brewster Kahle noted, “China can make it impossible for people there to read things, and just a few big service providers are the de facto organizers of your experience. We have the ability to change all that.”33

Here’s a scary thought: had the FBI been successful in opening a backdoor to iPhones, this backdoor would be just as accessible to foreign, less democratically friendly governments. The reason? If the FBI finds a flaw that grants them access, they have no incentive to let Apple know. Their incentive is to use the flaw towards their own ends, because without notifying Apple, the flaw remains open. It’s like the biggest house on the shadiest block forgetting they left the door wide open. Anyone with an interest and a disregard for security can use the door. The internet does not have physical borders. Nationalistic pursuits of unfettered access to people’s data will yield a dystopia faster than you can imagine.

Further down the rabbit hole, some cybersecurity industry players are moving into intra-company surveillance. The firm Stroz Friedberg is cataloging and indexing all employee emails and text messages for their corporate clients.34 This is not entirely abnormal for our day and age, but where it gets extra creepy is that they are attempting to use big data approaches to predict the intent of employees. By analyzing the texts and emails of employees, the firm picks out words that denote employee dissatisfaction. Words and phrases like ‘leave work early’ and ‘shitty boss’ are flagged and stored. The software is able to aggregate the data in simple yet powerful ways, enabling queries like ‘Show me the top 10 dissatisfied employees,’ followed by examples that display how those 10 employees are unhappy with their employer. The risk we face by allowing firms like this to exist is that their technology is easily co-opted for other purposes. “Extremist” and “activist” are just a search away for those at the heads of these companies.
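The queries themselves need nothing fancy. Here’s a sketch of the keyword-flagging approach described above; the phrases, names, and messages are all invented:

```python
from collections import Counter

FLAGGED = ["leave work early", "shitty boss", "hate this job"]

def top_dissatisfied(messages, n=10):
    """Count flagged phrases per employee; return the top n 'offenders.'"""
    hits = Counter()
    for msg in messages:
        text = msg["text"].lower()
        hits[msg["employee"]] += sum(phrase in text for phrase in FLAGGED)
    return hits.most_common(n)

messages = [
    {"employee": "alice", "text": "Going to leave work early again. Shitty boss."},
    {"employee": "bob",   "text": "Lunch at noon?"},
]
print(top_dissatisfied(messages))  # [('alice', 2), ('bob', 0)]
```

Now swap FLAGGED for a list like ["extremist", "activist"] and you can see how little stands between employee analytics and something much darker.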

The internet is a scary place right now for privacy, especially if you consider that even Mark Zuckerberg, the CEO of Facebook, uses basic measures to protect his own privacy: he places tape over the webcam and microphone on his laptop.35 If Zuckerberg is unable to ensure his own privacy, how are the rest of us assured protection? It says a lot about the state of the internet. All of that said, it’s not too late to lay the proper groundwork for upcoming generations. As the early arrivals of this new information age, we still have time to provide the best possible outcome for our children and ourselves. To be clear, I do not believe companies like Facebook and Google et al. intentionally trample people’s privacy rights. In many cases there is a deep-seated desire to do good; in fact, Google’s goal is still to make all the world’s information accessible to everyone. Zuckerberg doesn’t want to control what friends you see and what news stories you read. They’re chasing what users ‘like’ and creating software that seeks and serves you similar content so you’ll use the service more, generating revenue in the process.

So how might we address these issues? Some say end-to-end encryption may be one step on the way to a solution. End-to-end encryption simply means you and I have keys to each other’s front door when we send a message or file. What needs to be further addressed is that without strong privacy rights placed into immutable law, our privacy is screwed. It is important that at this juncture we not submit to our bias for ‘steady as she goes.’ The bias for the status quo makes an appearance in decision making here as well. Example - when an app updates its terms and conditions or privacy policy, do you review them? That’s like handing someone a birthday cake lined with lit candles and then asking them to read a 25-page usage agreement before they can blow their candles out.
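For the curious, here’s what the ‘keys to each other’s front door’ look like in practice - a minimal sketch using the PyNaCl library (pip install pynacl). Only the two endpoints ever hold the private keys:

```python
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()  # Alice's private 'front door' key
bob_key = PrivateKey.generate()    # Bob's private key

# Each side combines their own private key with the other's public key.
alice_box = Box(alice_key, bob_key.public_key)
bob_box = Box(bob_key, alice_key.public_key)

ciphertext = alice_box.encrypt(b"meet me at the yarn store")
print(bob_box.decrypt(ciphertext))  # b'meet me at the yarn store'
```

Anyone in between - an ISP, an ad network, even the messaging service itself - sees only ciphertext, because no server ever holds the keys.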

On a personal level, we need to engage in discourse with our friends and family about the issues at play, and the reasons they are in play. It’s a natural inclination of our species to leave the world a better place for our children. Socially we all agree to this idea, though our application can sometimes indicate otherwise. Children are, and always have been in a sense, a stand-in for the rest of humanity. Predicting the future for them is impossible; what we can do now is ensure their rights. That conversation is already taking place in boardrooms across the country. Proof of that bears out in Mark Zuckerberg’s opening quote of this chapter: “By helping people form these connections, we hope to rewire the way people spread and consume information.”36

Emoji Time - The Web

Advertising, social media and privacy are all presently in a state of confused adolescence. Most of the nuances are so brand new that detailing their practicality to the public becomes a societal challenge of establishing common terminology. Yet for the moment, these three arenas are in a state of unabashed money grabbing, and the emojis reflect this.

Fig 5-5. - “Can’t…”, ”...Stop…”, “...Won’t…”, “...Stop.”

The More You Know! Ad Blockers

If you’ve found yourself a prospective member of the 90% of consumers inclined to block ads, welcome! Ok, so the open and connected way the internet was created is both a great boon and its Achilles heel. What is to be done? How can we keep large corporations from trampling our privacy rights in the pursuit of dollars? How about malware that sneaks into users’ machines via ad networks? That is a trend that tripled from June 2014 to February 2015.37 Unfortunately there is no ‘correct’ answer. If the conversation on privacy has stirred in you a desire to protect yourself from trackers, there are a couple of options available to you:

Opt Out

Ad Blockers


Ad blockers can be added via the app store on mobile devices or by downloading browser extensions for Firefox, Chrome, etc. A few years ago, while browsing the web you might have received a notification to update Flash. As Flash was an add-on to the web experience, it needed to be updated from time to time. Extensions operate similarly in that they are add-ons to the browser. As a word of caution, using an ad blocker does prevent ads from showing on the page; however, it blocks ads without much discrimination. This means that passion sites dedicated to creating niche content for a niche audience also lose out on the revenue. StraightOuttaYarn.com couldn’t operate without some revenue to pay the bills, and removing advertising is detrimental to producing that content. Before you adblock, think about the darn yarn! Consider services like Patreon that allow you to tip the content producers you enjoy daily.
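Under the hood, most blockers are doing something like the sketch below: matching every outgoing request against a filter list. The filters here are invented; real blockers use community-maintained lists like EasyList.

```python
# Invented filter substrings; real lists contain tens of thousands of rules.
FILTERS = ["doubleclick", "/ads/", "adserver", "tracker."]

def should_block(url):
    """Block the request if any filter substring appears in the URL."""
    return any(f in url for f in FILTERS)

for url in ["https://straightouttayarn.com/patterns.css",
            "https://adserver.example.net/banner.js"]:
    print(url, "->", "BLOCKED" if should_block(url) else "allowed")
```

Note what the matcher can’t see: whether the ad was respectful or obnoxious, or whether the site behind it was a niche labor of love. It blocks by pattern, not by merit - hence the collateral damage to the yarn sites of the world.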

Where Innovation Goes Wrong

Let’s diverge from the web into the wider umbrella of science and innovation. In some instances, bubbles of innovation produce game-changing technologies or revolutionary methodologies that have impact across the board; other times they’re incapable of launching from the lab. As we’ll discuss in this next section, sometimes a new technology needs no help at all in continually falling flat on its face. In American culture, there is often a buzzword or newfangled technology that the more experienced among us lean in to share, sprinkling sage-like advice about the amazing change ahead within one realm. In the 1960s this feature of knowledge diffusion was represented in the movie The Graduate, and it played out like this:


Mr. McGuire: I want to say one word to you. Just one word.

Benjamin: Yes, sir.

Mr. McGuire: Are you listening?

Benjamin: Yes, I am.

Mr. McGuire: Plastics.

Benjamin: Exactly how do you mean?

Mr. McGuire: There’s a great future in plastics. Think about it. Will you think about it?

Sure enough, Mr. McGuire was 100% correct. Plastic molded our world into what it is today. Plastic enabled the shipping of products and food over long distances, helping to introduce the world to globalization. If Mr. McGuire were to offer sage career advice to a graduate this year, he would likely proffer “graphene.” If you’ve been paying attention the past ten years or so, this will not be the first time you’ve heard of graphene. Dubbed a wonder material, graphene consists of a single layer of pure carbon one atom thick. If these layers were stacked on top of each other thousands of times, you would have a substance similar to the graphite used in pencils. Additionally, graphene is physically and chemically stable. Its properties include:


1. Dissipates heat quickly, heat being the enemy of electronics.

2. More electrically conductive than copper, presently used in electronics.

3. Structure thin enough to be transparent, will help lead to transparent displays.

4. Strong enough to withstand the combined bite strength of over 5,000 saltwater crocodiles.38



With a stat list like the above, the potential applications for graphene are limited only by our creativity. True to form, there are presently over 25,000 patents for graphene worldwide, all filed since 2004. Applicable uses include the following, ordered from the mundane to the completely rad:

Sports Equipment

Lighting

Green energy

Batteries

Augmented biomaterials

Computer chips

3D Printing

Fig 5-6. - Look closely, crumpled tinfoil coated in Vantablack - the darkest material ever created and a cousin of graphene in the carbon nanomaterial family. You won’t see a wrinkle!

You must be thinking to yourself, “Wow, that sounds awesome - when can I get my hands on some of that sweet, sweet graphene!?” I agree and applaud your enthusiasm, but this is no Texas tea that we can just mine from the ground! This is the part where reality comes in and slaps us across the face with some cold, hard truth. Graphene has a few factors that are, at present, large barriers to manufacturing at consumer scale. They are:


Price

Uniformity

Modularity


It seems that with all of this potential promise, the only thing graphene cannot do is make its way out of the lab. If you keep tabs on development in technology and science, there are amazing new discoveries on a seemingly daily basis. It’s incredibly exciting, but also highlights the importance of tempering expectations.

Nowhere in tech is the tempering of expectations needed more than in the case of Theranos. Theranos was a blood testing startup out of Silicon Valley founded by wunderkind Elizabeth Holmes. With the prick of a finger, they claimed they could detect all sorts of diseases. For a time the company was the hottest ticket in town and was valued at more than $9 billion... all while operating in secrecy. As hindsight is always 20/20, the product was not what was billed, and a lot of people lost their time and money, violating principle number one of the universe - if it sounds too good to be true, it probably is. This may end up being more than just a bloody lip for Silicon Valley and venture capital funds before it’s all over. The below emoji time is a tongue-in-cheek outline of roughly how to temper your expectations when you hear about a new and exciting breakthrough.

Emoji Time - Innovation

The difficult thing about wondrous innovations produced by science is that we oftentimes hear about them too early, before they’ve worked out all the kinks. This can lead to a false-start perception about technology.

Fig 5-7. - “Scientific progress...”, “... requires fine tuning...”, “...it might blow up...”, “..or it may take off.”


Where Automation Goes Wrong

Cool, so we’ve got wonderful graphene now, but for one reason or another it tends not to meet its full potential due to manufacturing challenges. Perhaps we could use technology to help iron out the wrinkles in manufacturing? Possibly. At present, automation is not widely understood, let alone the implications of its maturing as a methodology. Like Big Data, ‘automation’ is a placeholder for a variety of applications, most succinctly ‘work.’ Without being too obvious, work is defined as mental or physical effort in achieving a result: input and output at its most reducible, basest form. Although we previously mentioned automation without covering the topic in depth, I chose to include it in this ‘What Can Go Wrong?’ chapter for one very simple reason - right now we have zero concrete answers on how to deal with the effects of automation.

Simply put - automation employs AI and/or robotics to replace the activity of work performed by people. In a typical economy, resource owners allocate resources, whether capital, research, or human, with the goal of producing more resources. A worker exchanges their time with that resource owner in return for resources of their own, which they then go out and spend. The catch-22 with automation is that it automatically increases income inequality. A study conducted by the ADP Research Institute across countries and industries found that “45% fear that automation, smart machines and artificial intelligence will replace people for repetitive work.”45

Wow, 45% of working professionals fear for their job security, and rightfully so. Even Walmart, one of the largest employers in the United States, is in the process of laying off 7,000 back-office accountants.46 True, this is only 7,000 of their 1.5 million employees, but the macro changes involved in automation are not likely to be enacted overnight. In our capitalistic society we’ve built up personal identity around vocation, billed to citizens as the American dream, the gateway to home ownership and a thriving family. These goals, at present, are not achievable in the way they were for previous generations. We are simply living in a different world than before, and thanks to the computing revolution, that foundation is changing at an ever-increasing clip. In order to find success, we need to shift with the technological advances happening here and now.

Some factories or offices would be able to easily replace human workers. Humans are granted workplace rights to things like bathroom breaks, safe factory or office conditions, scheduled hours, oh, and payment for work rendered. However, if a factory swaps humans for robots, it reaps more revenue at lower expense, since robots require neither wages nor breaks. Overhead like electricity and maintenance replaces human resource costs, driving labor expenses to near zero. Recently Jason Furman, in association with the White House, published a report on the opportunities and challenges facing the increasing adoption of AI. In his analysis, he computed an estimate of the wage ranges facing pressure from automation, figure below.47

Fig 5-8. - In the future, computers perform the repetitive tasks, enabling both risk & opportunity.

Luckily enough, there are examples from our past that help us move past the skepticism. If you think about a given economy, there are roles that are essential to its day-to-day operations. When we were hunter-gatherers, we needed 100% of society out there picking berries or tracking animals, otherwise we’d all starve. So we made tools to make the jobs at hand easier. When we slid into agrarian society, the ability to grow crops meant not everyone had to focus on food production. People moved into specialization and produced goods and services accordingly. The trend continued. Later, it was discovered that setting up a factory to produce cars in sequential order provided much higher levels of efficiency. Fewer people were needed to make more of everything. The trend continued. People were again replaced by specialized machinery, allowing humans to further specialize in the goods and services offered. Up to this point, we were replacing physical labor with big dumb machines that never got tired and only required oil to get the job done.

What I really want to drive home to you, dear reader, is the fact that over time, our lives have gotten a whole lot better. Growing up around farmland, I can tell you for certain that it’s not a glamor-based vocation. There are no paparazzi sneaking onto farms to snap photos of farmers waking up at 4:30 A.M. to milk cows. Farmers are up and out the door earlier than CEOs and work ‘until the cows come home,’ which is a nice way of saying until the job is done. According to the US Census, in 1880, 43.8% of Americans were farmers;48 today that number is only 2%.49 Farming used to require gargantuan amounts of labor, and arguably demands even more now, but we have tractors so large they name them things like “Big Bud 747” - and trust me, it’s big. Further, there is cause to believe the percentage of farmers is about to get even smaller. In September 2016, CNH Industrial revealed a fully autonomous tractor equipped with visual sensors, including laser-driven sensors. The tractor is able to recognize obstructions in its path and notify the operator, who might be sitting at a desk.50 We now take for granted that anything under the sun can be ordered in 10 clicks or less. If you’ve ever described your own job as ‘so simple a monkey can do it,’ you’re describing the exact type of job set to be replaced by a computer.

For example, in project management you may receive a client email asking to confirm the details of a project. There is simply no way for you to cross-check the details and write a response to your client before a computer could do it 1,000 times over. Even if it took you just a single minute to write the email, it would still be slower than a computer. The raw truth is that there is only a limited set of roles that will not be replaced by robots in the near future. The reason this will all come home to roost is simple - economics. In a world where quarter-over-quarter revenue increases are the grease on the wheels of business, humans just cannot compare to the efficiency of robots.

It’s an easy enough trap to fall into, believing your position is so special and unique that it could never be done by a robot. If you recall back to chapter two, the use of self-teaching robots is where the trend is headed. Feed an AI volumes of data and it will analyze and learn from them. Not even Neo from The Matrix can learn that fast. Someday even the upper echelons of our professions will be replaced by AI or robots. It’s the broad-scale pattern recognition available to the AI that makes it so superior to people. Doctors and attorneys spend years developing their body of knowledge, known as expertise. No one person could recall the details of every case setting precedent, or an obscure disease affecting 0.1% of the population. Medicine and the application of law are high risk. In milliseconds an AI can cross-reference every statute or illness from the beginning of recorded history, presenting a diagnosis or legal opinion in the time it takes you to blink. An example of the deep but narrow expertise AI can offer: an AI chatbot has overturned 160,000 parking tickets via a chat interface.51 Built by Stanford student Joshua Browder, the app DoNotPay guides people through a series of questions that determine their ability to fight the ticket. Again we come back to the idea that technological innovations, when engaged rather than feared, can work as precise tools for societal optimization.
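To show how deep-but-narrow that expertise can be, here’s a toy decision tree in the spirit of DoNotPay. The questions and rules are invented, not Browder’s actual logic; a real chatbot would gather these answers conversationally:

```python
def assess_appeal(answers):
    """Apply simple rule-based defenses to a parking ticket."""
    if not answers["signs_visible"]:
        return "Appeal: inadequate signage is a common valid defense."
    if not answers["meter_working"]:
        return "Appeal: a broken meter can invalidate the ticket."
    if answers["within_bay"]:
        return "Appeal: correctly parked vehicles are often ticketed in error."
    return "No obvious defense found; paying may be cheaper than fighting."

print(assess_appeal(
    {"signs_visible": False, "meter_working": True, "within_bay": True}))
```

No machine learning is required here - just codified rules applied instantly and for free, which is precisely the kind of narrow expertise that scales to 160,000 tickets.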

Our ingenuity as a nation is derived from our use of the tools available to us at the time - tools that save us time, cut costs, and improve accuracy. Technology is and always has been a tool, no different from a plow. Accuracy in assigned tasks, in this case, is interesting because this is the bar we use to establish the first steps into accepting automation. AI and robots simply need to be better than humans, which is an admittedly low bar. As a tool, automation won’t care about lost human jobs. This is why it’s so important for us, as a society, to discuss how we will leverage these tools for the benefit of the many. We would not use a calculator to send an email, so why use old tools for new jobs?

In a world where, let’s say, 95% of goods are produced by machines and 95% of humans are unemployed, what would a functioning economy even look like? The idea is that without consumers, there would be no demand and no reason to produce anything. A market without demand is no market at all. Preserving people’s ability to participate in the economy will be analogous to preserving society. People who still want to spend their days dedicated to a job will be able to do so. Others should be able to contribute to society in personally designed ways. Still, the economy will be reliant on a series of systems and sub-systems that deliver goods and services on demand. On demand, in a world brimming with AI and automated systems, can mean lean supply chains, maximizing efficiencies across the board. After all, if you only have to produce what has an explicit demand, you’re conserving physical and monetary resources. The internet has proven that a market exists for every interest. This is why sites that specialize in handcrafted goods like Etsy are thriving. Jobs where engaging with other specialized humans is a core component will continue to see a rise in demand and will likely always see growth. The most important fact is that those displaced by automation should be aided by the companies that benefit from automation.

Emoji Time - Automation

We need to have an answer for how we will help people cope with and move beyond vocational displacement. Training and re-training people to enter the workforce is an undue burden not imposed on previous generations and is likely to fail.

Fig 5-9. - “Robots and AI are...”, “... exploding in use...”, “...eliminating...”, “...jobs.”

Where Tech-Law Goes Wrong

If the debacle of the blood testing company Theranos was a black eye for Silicon Valley, then multinational Silicon Valley firms taking advantage of the international tax law structure is surely the bloody lip. The present method of implementing law includes the ability to mold the law with lobbying dollars. This prompts CEOs seeking to raise shareholder value to construct niches inside the tax law to skirt commonly held law, sometimes occupying ethically gray areas of international law. To combat this trend, ICRICT - the Independent Commission for the Reform of International Corporate Taxation - led by economist Joseph Stiglitz, seeks to legally rectify these situations.

This coalition of like-minded entities aims to educate the public on how tech companies leverage antiquated tax law to reduce their own tax obligation. The Commission has stated that, “There is both an urgent need and an unprecedented opportunity to bring about significant reform of the international corporate taxation system… ICRICT has encouraged firm support for an inter-governmental tax body within the UN, and strengthened the resolve of developing countries and civil society to push for this outcome.”52 In an article published by the Washington Post, Erika Siu, a consultant for ICRICT, echoes this push, saying, “We’re calling on governments to make proper rules that are in the interest of the public.” The thing is, our technology is advancing at such a rate that the laws are no better at keeping up than we humans are. Humans are, in fact, the ones who decide the laws, and if we struggle to keep our own understanding current, then legislating all the nooks and crannies of the law is all the more difficult.

The actual mechanism multinational companies like Google and Apple leverage for tax avoidance is known as the “double Irish.” Firms perform a type of musical chairs, shifting income to their own subsidiaries and sister companies that reside in tax havens like Ireland. Ireland does not have requirements for ‘transfer pricing,’ which is like paying yourself to use your own car to get to work in the morning. These multinational companies transact on intellectual property and licensing fees between companies, thereby funnelling all profits into a sister company in a tax-friendly country and greatly reducing their tax obligations. The argument goes something like this:

European Commission: “In fact, this selective treatment allowed Apple to pay an effective corporate tax rate of 1% on its European profits in 2003 down to 0.005% in 2014.”53

Apple: “At its root, the Commission’s case is not about how much Apple pays in taxes. It is about which government collects the money.”54 “It’s total political crap,”55

European Commission: “Look, we’re just saying you should pay that 12.5% instead of 0.005%.”

Apple: “We actually paid $400m”56

European Commission: “Does that add up to 12.5% instead of 0.005%? You are literally receiving benefits that other companies do not have access to. Literally! WTF, you made $72 billion worldwide in 2015!?”

Obama Administration: Hey Apple, buddy-ol’-pal. Why so sad? Is this jerk bothering you? I got you, Boo, and you too - Starbucks, Amazon, Adobe, Facebook, Google, IBM, Microsoft, Oracle, and Yahoo!57

Worldwide Taxpayers:


The good news is that this particular type of loophole was closed as of 1/1/2015, and these companies have been given a grace period of an additional five years to untangle their finances from these types of arrangements. The fact of the matter remains, however, that CEOs of public companies are doing what their job descriptions entail, and at the top of that list is to maintain and increase the share price of their company. Complicit in this behavior has always been Wall Street’s ‘quarter-after-quarter, yeah, yeah that’s great, but what about the share price?’ attitude. Tax avoidance strategies look like low hanging fruit their competitors are all too happy to pick, so why not?
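To see why the arithmetic is so tempting, consider a toy sketch of profit shifting via an intra-company licensing fee. Every figure below is invented purely for illustration; real arrangements are far more convoluted:

```python
# Toy illustration of profit shifting via an intra-company licensing fee.
# All figures are invented; real structures are far more convoluted.
profit_earned = 100.0      # profit booked in a country with a 12.5% rate
license_fee = 95.0         # 'fee' paid to a sister company in a tax haven
rate_local, rate_haven = 0.125, 0.0

tax_paid = (profit_earned - license_fee) * rate_local + license_fee * rate_haven
print(f"Effective tax rate: {tax_paid / profit_earned:.3%}")  # -> 0.625%
```

Shuffle enough of the profit into the haven and the effective rate collapses toward zero, which is how headline numbers like 0.005% become possible.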

Around and around the spinning goes, with companies continuing to make use of the tax code’s grey areas for their shareholders’ benefit, as well as their own skins. This is thanks in part to the myopic quarter-to-quarter yardsticks public companies utilize in measuring success. It benefits CEOs to avoid tax, and it provides maximum shareholder value, so there is a figurative carrot leading CEOs to make tax avoidance decisions. This short term focus affects decisions from the CEO down to the manager level, and is in part responsible for the modern day rugged individualist environments of the corporate world. By creating special arrangements with entire countries, corporate entities are entering uncharted territory that may lead to unintended disasters down the road.

Tax avoidance already has a demonstrable effect on America’s crumbling infrastructure. In 2013 the American Society of Civil Engineers, comprised of over 140,000 members, gave America’s infrastructure a report card; the grade was a D+. The issue is so severe that former Secretary of Transportation Ray LaHood claims, “We’re like a third-world country when it comes to infrastructure.”58 This underscores the need to examine the frameworks presently in place for corporate operation. As we’ve learned, technology is advancing at a pace that people are having difficulty tracking. We need to enable and promote safe growth while taking into account the most important stakeholders, our nation’s citizens.

I’m not necessarily advocating that corporations like Apple or Google be singled out and taxed for the purpose of repairing our infrastructure. My point is that fair-share tax revenue could be directed to much needed public works projects. This includes the necessary establishment of a regulatory framework that engenders safe innovation while pruning bad actors. It’s important to be able to use those fire-hardened sticks to till the soil, as it were. Only, our tools for tilling the soil are far more complicated than they used to be. The remainder of this book is dedicated to highlighting the contents of our modern-day toolbox.

Tying success in life to vocation causes more harm than good. The fact is that many people do not pursue what they’re actually passionate about, but pare their ambitions down to some pragmatic career that pays the bills. The point remains: people are more fulfilled through working on their passion projects. It’s the essence of self-actualization, and happiness tends to be a byproduct. Failure gets a bad rap, especially in American culture, but the only shame in failure is in not learning from why you failed in the first place. Adjusting your attitude to value the lessons in failure can help you hone your skills and focus. Failing doesn’t make you a leper, even if you’re likely to be the one most affected by it. Technology is not a panacea, but a tool to tackle challenges. Technology is simply one lever that is pulled in addition to others like logic, reason, resources, and empathy, in order to fully understand the presented problem. Understanding the root problem of a challenge is the real issue, and designing solutions without capturing the full requirements from the people who are meant to benefit from the solution would be a huge mistake. Whose problems are being solved? Who else does this affect? The answers aren’t going to come easily, but I’m really hoping that we can have our act together by the time the aliens finally show up. How embarrassing to have such a messy living room!

The way things are and the way they should be are seemingly estranged from each other at the moment. The depressing news we see unfolding across the globe daily makes the gulf between those two concepts seem alien in and of itself. It’s easy to be cynical from the cheap seats, but it’s just as easy to buy cage free eggs and give ourselves a participation sticker. It’s far more difficult to know the odds and still continually act positively to beat those odds. There is no end to the line of critics waiting to cast their stones.

Emoji Time - Tech Law

Given the rise of bad actor websites, we’re going to need to stringently address their regulation. Additionally, social measures need to begin permeating the metrics of a successful business.

Fig 5-10. - “Hello cash,...”, “...goodbye cash...”, “...aand you’re gone...”, “..sorry, cash?”


Correcting the Course

What we’re bearing witness to here in our day and age is the increasing focus on technology and its role within our lives. The invisible undercurrent thrusting belief into our default mode network can be made visible by performing a simple experiment - for a whole day, click ‘Like’ on every image, article, or other piece of content you see, without consideration for its value. Even if you find it abhorrent, click ‘Like.’ In no time at all, you will witness the exceeding efficiency of social media’s algorithms at play. Your feed will be furnished with advertisers and clickbait sites of every stripe waiting to pitch you on the contents of your recent activity. If you don’t like the idea of mucking up your recommendations, try reframing it. Think of it like an oil slick released from the back of your internet vehicle, throwing advertisers off your trail like a James Bond film.

It’s like an auditory experience: variable person to person - an entirely subjective experience based on position in space, the direction each person is facing, and the frequency at which the sound travels. Technology is experienced this same way, with person to person variations. Your senses may be highlighting that a conversation on the topics surrounding technology needs to happen. You’re recognizing that need from the emotion of fear. You’re right, so let’s talk about it.

To hearken back to the quote from a film that certainly visualizes what can go wrong: in Terminator 2: Judgment Day, John Connor repeats a refrain taught to him by his very prepared mother, Sarah Connor. “The future’s not set. There’s no fate but what we make for ourselves.”59 This cinematic proverb extols the virtue of action over inaction. It’s a common sentiment within time travel films and stories, and in terms of all that can go wrong regarding social media, it bears fruit: just as public commons are subject to public oversight, so too should social commons like Facebook, Twitter, Vine, Snap, etc. accept that what occurs on their platforms matters in real life. Deliberating while remaining cognizant of the biases which reinforce polarization allows all of us to move forward to a social media that enables the best of our human nature. Remaining aware of these pitfalls in social behavior enables attuned focus on your own goals.

In the same way our parents claimed TV would ‘rot your brain,’ so too will the garbage on the internet. The difference between the time the original saying was coined and now is the speed at which our brains can rot from disuse. The risk posed by fake news in tearing apart the social fabric of our society has been laid bare. Therein lies a possible solution to stemming the recent vitriol and divisiveness - molding social norms to be more open to disagreement. It is up to the communities to create self-governed rules and to make those rules clear for all. The application of such an effort would, of course, need the continued input of many groups, but a preserved web is preferable to the wild west that is the internet today.

A 1980 California Supreme Court ruling found that privately owned spaces were subject to constitutionally provided free speech rights, subject to reasonable management.60 I would argue that at present, the right to exercise free speech is impinged by the prevalence of fake news. In our present age, when people communicate by sharing coded memes and emojis, it is an expression of the varied and wide breadth of human culture. When the diffusion of information is altered via business strategy, we have problems. Further, the presence of bots posing as breathing people makes the line between real and fake nearly nonexistent on the web. The average user would be blind to any indicator of a fake user. These frayed articulations of the internet employ algorithmic subterfuge to disrupt the free speech of actual people, which is wholly untenable for democratic societies. The understandable counterargument would be that websites are privately owned and not subject to providing a reasonable environment. To which reality intercedes: what are the alternatives to these digital institutions in our digital age? The heartening perspective is that the conservancy of public spaces, digital and physical, is an extension of the same freedoms instilled in founding doctrines.

What’s clear is that in the future, everyone will live in digital glass houses. Or you can just not think about it until later... up to you. Of course, this is the bias known as the ‘ostrich effect,’ a type of cognitive bias that sees us ignoring dangerous or negative information by burying our heads in the sand. An example of this would be how rarely you check your bank account when you’re relatively sure there’s not a whole lot in there. The technical specifications of the world wide web include a spirit of democratization of technology, but that spirit only applies insofar as the people wielding it understand it as such. You vote with your eyes and attention; what you spend your attention on is what is amplified by the mechanics of the web. You vote with every click, tap, or swipe - ensuring the intent of those interactions is recognized by the instrument before you. As we progress toward bold new tomorrows, we’ll discover some deeply concerning challenges and solutions to said challenges, and we will define technology for generations to come. Enshrining the humanistic values born from our better nature requires continued conservancy within the digital realm.

To: Tech Community

Fig 5-11. - “We really love...”, “...technology…”, “... but time is short...”, “...let’s get working.”

To: The Rest of the World

Fig 5-12. - “Raise your hand...”, “...ask questions…”, “...the party is…”, “... just beginning!”


Chapter 6 - What Can Go Right

“We make our world significant by the courage of our questions and the depth of our answers.”
― Carl Sagan1

The range of human experience is and always has been a vast spectrum, and through the advancement of technology we are beginning to weave together a new definition of our common history. How we help others contribute to that common history is going to be remembered for generations. For all our faults as a human race, our boundaries have expanded thanks to our nomadic and curious resolve. Hundreds of thousands of years spent living as intrepid explorers has granted us a spirit embodying discovery. This book is my attempt at describing the bounds of technology that I am able to see from my vantage point. The horizon is distant, yet increasingly focused in detail; many questions and challenges yet remain for our age of information. As we become increasingly able to scientifically articulate the foundational components of human nature, it is our duty to adjust our perception. We all must make decisions in the face of seemingly insurmountable uncertainty, for ourselves personally as well as our families. It’s uncomfortable to admit our naturally occurring defects in preparing for these changes. You are not alone; in many of those decisions, small and large, we’ll be incorrect, we’ll make assumptions and get it wrong. It’s a fact of life - we’ll accidentally step in dog poop on occasion.

Not always hitting the intended target is, of course, a part of every individual journey. The common nature of making mistakes requires that we move forward with empathy for our own mistakes and for the mistakes of others as well. However, remaining deferential to outcomes, aka letting it sort itself out, swings too far in the other direction. Therefore, the evident and pragmatic solution is to define what human nature means inside the digital world we’ve created. An integral component of defining human nature in our new world is derived from how we as a society enable all citizens, and not just those who can afford the latest and greatest gadgets. Technology has advanced to the point that making software widely available becomes a noble and modest mercy. It incurs no cost for others in society because, remember, software is not a finite resource. Technology enables society to migrate from a zero-sum game into something greater than zero. In truth, it allows for the total exploration of the collective concept of human nature. The way to secure the future of what it means to be human, and to keep the best parts of it, is to learn from history, ourselves, and each other.

A net effect of continually evolving technology is the continued skill specialization of our societies. This specialization, coupled with capitalism, has allowed for a product for any person’s needs or tastes. Online communities have proven that markets coexist within the most niche of interests. Specialization entered human history when we switched from hunter-gatherer to agrarian based societies. As the need to hunt and gather every day lessened, some people were spared difficult farm life and pursued other means of subsistence and specialization. This has gradually sprouted into a budding interest based economy where people pursue their interests as vocation. Now we’re witnessing free markets service smaller and smaller segments of hyper-interested, hyper-aware audiences, much of which will be driven by AI that endows enhanced complexity at every level of society. Because there is a product and a community out there for every interest, AI fueled by data will be the connective tissue enabling the sustainable growth of these varied markets. The risks and opportunities of developing AI fueled by data are evolving whether or not we feel ready. The onus is on us to ensure that we optimize our actions toward a common goal.



OK, Maybe We Are Living in the Future: Blockchain!

The societal framework for enabling modest mercies is furnished by protocols like blockchain, the protocol underlying Bitcoin. This type of technology provides a unique solution to a common problem; for example, it secures the sending and receiving of monetary value. This simple transfer can be a near impossible feat for individuals residing in rural or developing nations; for someone growing up in Zimbabwe without banking infrastructure, opening a bank account may simply not be an option. Here, blockchain provides a passport to participation in the digital economy. This passport allows a rurally separated individual to improve their local circumstances by participating in a worldwide economy. Simply having a Bitcoin or equivalent cryptocurrency account enables access to resources that were previously unavailable or even unimaginable. Empowering this person to enter the globalized world means not only having purchasing power, but also a tool to connect with others.

As you may recall, the challenge of tracking and verifying digital transactions is a persistent problem. However, thanks to the protocol underlying Bitcoin, known as blockchain, the ledger’s integrity is upheld. The blockchain protocol can be thought of as similar to the ‘http://’ in a URL: a shared convention that enables computers and devices to talk to each other securely. The protocol groups transactions into ‘blocks,’ broadcasts the contents of those blocks to every Bitcoin miner via the public ledger, and assigns each block a unique identification number known as a hash.

Simply put - a hash is like a jigsaw puzzle piece that connects ‘blocks’ into a chain, forming an iron-clad bookkeeping database.

When the blocks in front and behind are verified, the security of the blockchain is maintained. The hash is effectively tamper-proof because any tampering produces unpredictable results, instantly flagging the abnormality. Say a circus clown walks into a biker bar; the patrons would immediately recognize this new entrant. The record would scratch and all patrons would stop and stare, the clown costume acting as a hash that the other blocks (or patrons) are not expecting.
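To make the jigsaw analogy concrete, here is a minimal sketch in Python - an illustration of hash chaining in general, not Bitcoin’s actual implementation - showing how tampering with one block breaks the link stored in the next:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents (which include the previous block's hash)."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

# Build a tiny two-block chain.
block1 = {"transactions": ["Alice pays Bob 5"], "prev_hash": "0" * 64}
block2 = {"transactions": ["Bob pays Carol 2"], "prev_hash": block_hash(block1)}

# Tamper with block1 after the fact...
block1["transactions"] = ["Alice pays Bob 500"]

# ...and the link stored in block2 no longer matches: the clown is spotted.
print(block2["prev_hash"] == block_hash(block1))  # -> False
```

Because each block’s hash depends on the hash before it, rewriting history means rewriting every subsequent block, which the rest of the network would reject.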

Given blockchain’s abilities, financial institutions both large and massive are investigating implementing their very own versions of cryptocurrency. Part of the draw is a much simplified accounting process that enables simpler and more transparent regulatory oversight. In other words, it enables banks and insurance firms to more easily represent their business, because the ledger perpetually exists as rock solid in accuracy. Not only is this a boon for regulators by providing transparency, but it enables frictionless payments between two individuals, which was formerly the bread and butter of the banks. Financial institutions make tons of revenue from charging transmission fees, both incoming and outgoing, across the world. If you’ve ever needed to send money overseas, you’re likely aware of how extortionate the prices are - upwards of 20%! Cryptocurrencies offer the ability to supersede all of those institutional costs and perform the transfer instantly, the simple reason being that you are transferring ownership of an asset that is accounted for in small amounts of bits and secured by thousands of decentralized computers. For most of recorded history, banks have stood as the institutional depot of value for all people. Increasingly, this diffusion of financial power suggests it may one day render banks obsolete.

The applications of the blockchain protocol include the potential to profoundly impact our product supply chains and currencies. The reason for this is simple - trust. On our present version of the web, literally anyone could be a dog. Trusting others on the web has required many intermediary services to arise, verifying identity, bank accounts, transfers of information and currency, etc. Whole industries have cropped up to satisfy this need - thanks, capitalism! However, with the creation of blockchains, trust becomes as simple as sending an email. It is a built-in functionality. I can send money to you without having to pass it through a bank that will take its cut, all performed seamlessly.

Remember - anything digital is simply bits and bytes that can be transferred between two devices in the span of a second. Data transmission occurs frequently and with astounding speed, and technologies like blockchain ride along with it. Recall that servers and devices are always running and always able to pass data; even the devices in our pockets run 24/7 for the most part. This is part and parcel of the world wide web we’ve woven. This tracking of assets - money or, really, any other asset you can imagine, even a meal - can be broken down to the ingredient level. Each carrot in your shepherd’s pie can be tracked individually, given that the person intending to track carrots has the developer skill to implement blockchain. This is where it gets extra exciting. As blockchain is a protocol, application developers are able to implement the protocol as part of a software program. Ethereum, a project leveraging blockchain technology, has built a platform that enables ‘smart contracts.’

Simply put - recalling our discussion on the internet of things, a ‘smart contract’ allows a computer or device to evaluate the conditions of a mutually agreed upon transaction. If the conditions agreed on by the two parties have been fulfilled, the smart contract will evaluate and deliver the transfer of asset ownership automatically. Smart contracts eliminate ambiguity by tying the transfer of asset ownership - the completion of the contract - to an impartial third-party computer network. Let’s continue to explore the idea of enabling value transfer and trust in the digital realm, which enables transparency and ethical oversight.
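As a toy illustration of that logic - written in plain Python rather than Ethereum’s actual contract language, with invented names throughout - a smart contract boils down to conditional transfer rules evaluated by a neutral machine:

```python
from datetime import date

def smart_contract(delivery_confirmed: bool, deadline: date, today: date,
                   escrow_amount: float) -> str:
    """Evaluate a mutually agreed condition and settle automatically."""
    if delivery_confirmed and today <= deadline:
        return f"Release {escrow_amount} to seller"   # condition met
    if today > deadline:
        return f"Refund {escrow_amount} to buyer"     # condition failed
    return "Hold funds in escrow"                     # still waiting

print(smart_contract(True, date(2017, 6, 1), date(2017, 5, 20), 100.0))
# -> "Release 100.0 to seller"
```

The point is not the code itself but who runs it: a decentralized network executes the rules, so neither party can fudge the outcome after the fact.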

In the previous chapters, we covered:

Artificial Intelligence

Big Data

In this chapter, we’ll answer the following questions:

How does technology spur its own advancement?

What is the low hanging fruit in improving our lives with technology?

What are real world examples of the modest mercies technology can provide in improving people’s quality of life?

Technology exists as an imperceptibly large concept with big problems, and big problems tend to involve a lot of factors and people. It’s sorta like looking at the periodic table of elements with only the symbols guiding our education. We need to interact with people to develop a legend, a guide to the elements. In practice, this simply means clearly defining the problem and user requirements. Without interacting with the people we intend to solve problems for, producing an effective solution is impossible. No simple task, for sure; however, with continued and ongoing education in the STEM fields, we’ll have diverse solutions to diverse problems. It’s a much simpler issue if we break it down to the big X factor - humans.

By prioritizing cognition of our immediate motivations, we’re able to perceive the core issues to be addressed. The goal here is not just to elevate the infrastructure with the practical application of technology through discourse with the stakeholders, but to ensure that as many people as possible have the option. Promoting modest mercies in the below areas will provide the best bang for our buck in terms of leveraging technology for positive outcomes. Promoting stable and long term growth is integral while engaging in discourse between stakeholders - and we’re all stakeholders.

Evidence of improvement in the federal government includes the open government data program initiated by President Barack Obama. This program has ensured that all data collected by the government will be machine readable. What this amounts to is that all data from the census, commerce departments, etc. are available to anyone with a connection and the know-how to pull in and crunch the data. Forward thinking attitudes can exist inside government, as proven by Justin Antonipillai, an Obama advisor, who says, “our data challenge has two goals. First, to make government data more open, available and consumable to promote innovation, job creation and progress for our nation. And second, to endow a data-driven approach to modernizing government, to advance our service to the public in new ways.”2

It is heartening to see wings of the United States government commencing efforts to improve data equality, yet the effort isn’t complete and will need higher levels of awareness. Enabling data equality allows new entrants from the private sector to tackle large scale problems. In software development, SDK is an acronym for software development kit; in other words, it is a set of templated tools that allows developers to use pre-existing libraries and documentation to create software. It’s similar to carrying a toolbox - the SDK contains all the pieces you’d require to develop software for a platform.

For the US Census Bureau, CitySDK allows small and large businesses to tap into all of the data that the Census Bureau collects. Statistics on people, industry, housing, education, and medicine - just to name a few - are now available. These troves of data have been collected for over 100 years and, up until now, required convoluted access methods. With widely accessible data sets, citizen developers are able to crunch numbers and arrive at insights into the inner machinations of an urban environment.
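As a sketch of what ‘crunching the numbers’ can look like, here is a minimal Python example pulling state populations from the Census Bureau’s public data API. The exact endpoint, dataset, and variable code shown are illustrative assumptions that vary by data release, so treat this as a pattern rather than gospel:

```python
import requests  # third-party HTTP library (pip install requests)

# Illustrative endpoint and variable code; these vary by census release.
URL = "https://api.census.gov/data/2010/dec/sf1"
params = {"get": "P001001,NAME", "for": "state:*"}  # total population by state

rows = requests.get(URL, params=params).json()
header, *data = rows  # first row is column names, the rest are values
for population, name, fips in data[:3]:
    print(f"{name}: {population}")
```

A few lines of code, a public endpoint, and a century of collected statistics opens up - that is the modest mercy of data equality in practice.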

The Lead up to Our ‘Great Scott!’ Moment

Now that we’ve established a definition of the problematic low hanging fruit, we can address how these issues may be solved. In the climax of Back to the Future, Doc Brown and Marty were ‘outta time,’ which demanded they rely on human ingenuity and technical wonder to transport Marty back to the future. To contextualize our own lightning strike, we can refer to the pre-existing processes that lay the foundation for our shared crazy future. In academics, the process of modeling allows the person studying a process to distill its core components, i.e., it allows the researcher to more fully comprehend cause and effect within complex systems. Speaking of complex systems made simple - great news, through the topics in this book we’ve completed building our own version of the DeLorean! If you recall from the films, the DeLorean required two key components to achieve time travel: the flux capacitor and ‘Mr. Fusion.’

The ‘Mr. Fusion Home Energy Reactor,’ to be exact, uses household items like trash to fuel the fusion reaction, which feeds the flux capacitor with the energy it needs. Data, in our case, acts as the fuel, including any and all data from sensors and spreadsheets, structured and unstructured both. The streams of data powering our economy also fuel our version of the DeLorean. Of course, this effort would not be complete without the invention of the flux capacitor, or in our case, artificial intelligence. The innovations of both devices enable time travel. The corollary is that when fueled with troves of data, artificial intelligence will help us humans achieve wonders far beyond our current comprehension, the effects of which might be just as awe inspiring as an actual time machine.

In our case, we’ll be referring to this model as what I’m dubbing the ‘Great Scott Cycle.’ The Great Scott Cycle is the set of naturally arising, ongoing processes in the evolution of technology. Throughout this process, new inventions lead to new products that better enable greater productivity, no matter the task. These new products can potentially impact not just the economy, but the social behavior of citizens across the world and web. By the end of this chapter, you’ll be exclaiming to yourself “Great Scott!” Ready to go for a ride in our DeLorean and put the pedal to the metal to 88 MPH?

The Great Scott Cycle

86 MPH: Technology adoption lifecycle

•Picture the Hill Valley Clock Tower driving time and progress ever forward!

87 MPH: Killer app

•The DeLorean, equipped with the Flux Capacitor (AI) and Mr. Fusion (Big Data)

88 MPH: Network effects

•The lightning strike that produces a 1.21 gigawatt jolt to the clocktower as the DeLorean screeches to 88 MPH, sending us on a time bending adventure.

Technology Adoption Lifecycle

The technology adoption lifecycle contains five stages in time, expressed along the x-axis. A new cycle begins any time a new version of a product is released, exactly like cars releasing new versions every year. This model applies to any release - any phone, tablet, phablet, operating system, code base, etc. If a new product or technology fails to attain mass adoption in the marketplace, the technology can migrate back into development to reappear years later with rectified faults. This is best exemplified by the return of tablets. In 1993 Toshiba released a ‘pen-based computer’ called the T100X, but nobody cared. Seventeen years later the iPad arrived, and it has since sold over 330 million units. Innovation cycles materialize throughout all industries and are not limited to a single company or industry. Innovations don’t always catch on with the public at first blush. Google Glass, tablets, and PDAs all required more time in the lab to become viable consumer products. Another technology that may yet discover itself in this position is 3D printing. The evolution from the T100X to the tablets we employ today occurred in waves and over time, illustrating the ongoing development of device types leading to widespread adoption.

Fig 6-1. - You’ve gotta break a few eggs for 17 years.

Fig 6-2. - The innovation adoption lifecycle helps ensure fresh features every year.3


Innovators - Typically highly educated on the product and know ahead of time if they’ll be diving in.

Early Adopters - Individuals or organizations that become aware of the innovation as a solution for their own purposes.

Early Majority - Likely to be brought into the innovation via influencers - people or experts offering the product as a good solution.

Late Majority - Diffusion of use; the product is well understood by a majority of users.

Laggards - People who tend to reject the product, but end up using it out of necessity.
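For reference, the classic diffusion-of-innovations model (Everett Rogers’ work, not data from this book) assigns rough population shares to each of the five segments above; a minimal sketch:

```python
# Rogers' diffusion-of-innovations segments and their classic shares.
ADOPTER_SEGMENTS = {
    "Innovators":     0.025,
    "Early Adopters": 0.135,
    "Early Majority": 0.34,
    "Late Majority":  0.34,
    "Laggards":       0.16,
}
# The shares cover the whole population of eventual adopters.
assert abs(sum(ADOPTER_SEGMENTS.values()) - 1.0) < 1e-9
```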


The ‘day one purchasers’ will be waiting in line the day a gadget releases and are counted among the ‘innovators.’ The ‘early adopters’ soon after find out how awesome a product is from the innovators, who tend to diffuse expert opinion given their knowledge base. From there, and with significant marketing, a product begins to break into mainstream adoption via the ‘early majority.’ By the time the product completes this phase, it is already halfway through its life cycle and may have spawned new technologies or innovations in the process. The ‘late majority’ is the transition where a product goes from ‘everyone has an iPhone’ to ‘everyone and their mom has an iPhone.’ The laggards tend to be people who have been dragged into the technology, and while their enthusiasm may change afterwards, they’re coming into the game late (not that there’s anything wrong with that - technology can be intimidating). So here’s where this cycle gets interesting: given the cycle begins anew with every new product launch, companies like Apple and Samsung are continually competing for market share and driving the degree of innovation in smartphones. An example of this cycle is the fact that since its debut we’ve seen the release of 10 generations of iPhone. That is 10 generations of the iPhone concept, each with degrees of new innovation4, and still, consistent upgrades to hardware and software create unexpected opportunities.

Lifecycles like these have been bombastic boons to our economy and continue to be so. The underlying technologies are improving, and the ongoing maturation is feeding the accessibility of the technology and reducing its cost. Here’s the thing about older devices: unless they’ve been dropped in a toilet or thrown out a window, they may still be fully functioning. In many cases, the devices are still powerful machines competitive with supercomputers from the 1990s - seriously. These devices are still capable of fulfilling many potential needs, including banking and finding work and housing. If connected to the web, the devices have access to all of the same limitless information that you and I do; connecting people with the internet represents a huge potential for these new entrants. People can be empowered to make a living, connect with friends, video chat with family, and more - all from a device whose value to us is equivalent to a paperweight or a minorly-revered personal museum object. These outmoded devices can even help people under oppressive governments communicate with others via encrypted methods. We know that over time technology diffuses into more and more people’s hands. This expands the pool of users we need to account for and include, today and tomorrow. We are all ‘gonna need a bigger boat.’

Killer App

Cycles like the above can act as harbingers of massive impending growth, or, in some instances, the wave simply recedes back into the ether. As an example, let’s look at the Microsoft Zune. Some of you may groan, but most will likely be curious. The Zune was a great product that was, by my own estimation, better than an iPod because it had both a full-color screen and additional features; but given Apple’s design simplicity and outstanding marketing of the time, they dominated the space, and people forgot about the Zune. Betting on a losing horse aside, the point remains that some innovations ignite new, larger waves, such as the iPod evolving into an iPhone. Other times, innovations fade into the pale, never to be recognized again.5 These new waves are often initiated by what is known as a ‘killer app.’ Simply put - a killer app is software or hardware, or both, that is deeply differentiated from other products, yet so broadly appealing that it causes rapid adoption of that technology.

One example is the Nintendo Wii and Wii Sports: the combination of motion-sensing hardware paired with intuitive software enabled mainstream appeal and craze-like adoption of the Nintendo Wii as a part of the home. Another example is the iPhone 3G, whose killer app was the innovation of the App Store. Before this point, smartphones contained stock software so basic that it actually made the devices a little boring. It wasn’t until Apple unlocked the hardware functionality for third party developers, letting them create apps that leveraged the hardware in novel ways, that we got from then to now. The aspect of killer apps that makes them special is their broad appeal. They allow potential consumers to easily intuit the value proposition of the product.

When a killer app is developed in tandem with a new platform, hardware or software, it can kick off a wave of innovation. Exactly like the addition of the app store, killer apps spur on developer gold rushes where developers across the world set to work building additive experiences to these ‘still new’ devices. These innovative killer apps are the result of other technologies behind them maturing with time and considering consumer needs and attitudes. For an example that we can look forward to, let’s consider VR.

Having experienced VR for myself, I can definitively state that if you are at all interested in a peek into the future, try VR. You are instantly transported to a different type of experience, completely distinct from anything resembling playing a video game on a television. VR relies on tricking the brain into believing it is physically somewhere else. The ‘immersion’ factor sells the illusion to the user. For the moment, VR is lacking its killer app, but I’d wager that what will become the killer app is in development, perhaps secretly, at this moment. The eventual explosion in the use of killer apps to ‘sell an experience’ will be propelled by consumer demand, thereby feeding the innovation cycle of each new technology. In the same way that water erodes a path through the earth, so too goes the path of innovation - it can turn a river into a lake, and in the case of iPhone and Android devices, veritable oceans.

Network Effects

This is how buzzwords seep into the public consciousness: the buzzwords bubble up into publicly facing marketing, in the process signifying the arrival of merging technologies that present new opportunities as well as, of course, risks. Let’s direct our attention to how these arriving technologies become widely adopted. Previously in the book we reviewed the concept of network effects, whereby attracting users to a service or technology in turn attracts additional users. This effect leads up to a ‘critical mass,’ which is defined by the amount of value delivered to users. If the value to the user of the service is higher than the amount they paid to use it, then we’ve achieved critical mass. Reaching a critical mass of users acts as a catalyst, or tipping point, that accelerates the interest and participation of others. The speed with which network effects can be achieved is surprising and also concerning to observers. Five years ago the word ‘unicorn’ referenced a free-spirited chimera, splitting the DNA composition of a horse and glitter. In 2013, ‘unicorn’ was appropriated by venture capitalists to refer to any privately held company valued at over $1 billion. The figure of $1 billion seems arbitrary, but given America’s fevered love affair with the word ‘millionaire’ and now ‘billionaire,’ it makes sense. For venture capitalists, investing early in these types of companies can be considered a primary motivator, given their ability to ‘make-it-rain’ money.
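One back-of-the-envelope way to reason about critical mass is the Metcalfe-style heuristic that a network’s value grows with its number of possible connections while the price per user stays flat; the numbers below are invented purely for illustration:

```python
def network_value_per_user(users: int, value_per_connection: float = 0.01) -> float:
    """Metcalfe-style heuristic: total value ~ n*(n-1)/2 possible connections."""
    total_value = value_per_connection * users * (users - 1) / 2
    return total_value / users if users else 0.0

PRICE = 5.0  # flat price per user (illustrative)
for n in (10, 100, 1_000, 10_000):
    v = network_value_per_user(n)
    status = "past critical mass" if v > PRICE else "below critical mass"
    print(f"{n:>6} users: value/user = {v:8.2f} -> {status}")
```

Because per-user value grows roughly linearly with user count while price stays flat, crossing the threshold flips the service from a hard sell into something that markets itself.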

Fig 6-3. The speed of technology meets the speed of business.

Here we are: just ten companies valued at over 130 billion dollars, and they are not even old enough to drive a vehicle.6 An easy guarantee is that each of the above companies will be heavily leveraging artificial intelligence and big data to amplify their user and shareholder value wherever possible. These companies are enjoying the fruition of network effects, with the exception of SpaceX. Being a long bet, SpaceX hopes to ferry people by the hundred to Mars. Elon Musk, the company’s founder, is part of a yet-to-start golden age in solar system exploration. Dynastic levels of wealth will be generated once organizations commence mining asteroids for resources. These companies are accelerated by the network effects of users signing up for each one’s suite of offerings.

It’s Not Difficult to Envision...

In order to set up our grand finale preview of tomorrow, we’ll need to establish a baseline of expectations. This baseline will provide an idea of the tech that is required to mature in order to achieve meaningful outcomes, à la killer apps. These killer apps will combine with our exploding use of software to develop meaningful solutions to technical problems and frictions in our day-to-day lives. Each of the previous chapters discusses technologies that, when matured, will produce additional and persistent opportunities and threats. When leveraged properly, the established technologies will ignite a chain reaction within related areas of study as well as practical applications. Each new development will enable a further development within tangential fields. As the developments continue to mature, they’ll also augment supply chains across industries. All boats rise with the tide. The challenge is ensuring that all boats are indeed rising concurrently. You can think of it like a cartoon firework igniting in a fireworks factory. As technologies mature and become commonplace in developed nations, the impact will modulate up and down according to their adoption. Use needs to be significant, and as with all internet age technologies, the network effects of thousands of people fuel the strength of the impact. Twitter or Facebook would not be the same companies they are today without their user base generating value by creating content for their fellow users to view, and this will be one of the impactful aspects of our strange new world.

From education to healthcare, personalization considers all available data sources about a person. This includes data generated by your smartphone, fitness tracker, health records, etc.; the list goes on and on. Even our interfaces within digital computing have been evolving into highly personalized experiences. Devices like smartphones and laptops contain our most personal information; they are funnels through which we volunteer our most intimate details via search, messaging, and connecting with our communities. The evolution of this mode of computing will push the personalization factor into cognitive user interfaces. These cognitive user interfaces will be similar to the user interfaces we see on smartphones and laptops, but will be streamlined to join our vision with data driven details via augmented reality. Any situational context you may discover yourself in - social, work, etc. - will gain applications from this technological advance that aid your navigation therein. This tectonic shift alone will be one of the best tickets to the circus of technology.

Human society has always been tasked with carrying the torch through time. Through the years, these tasks have seemed too tall to scale; sending a man to the moon was a sci-fi dream until we applied our cultural will to the task. Here and now, we are all tasked with navigating our own monumental, seemingly unsolvable challenges in the tactical application of technology. Additionally, we’re collectively tasked with ensuring a rising baseline in the standard of living for all, wherever possible. Access to these technologies for all results in modest mercies, boosted by the same market forces that enabled the prosperity of those with access. Of these first few stages, physiology and safety are where we’ll observe the most progress, and working with people across social strata to deploy this technology and education will produce results limited only by the human imagination.

Great Scott! We’re Definitely Living in the Future!

We can draw from patterns in nature to illustrate the progress of science and technology. Let’s visualize progress as water. Seeking the lowest ground in the form of market fit, innovation can create new product categories, and new branches spread like water over time. Innovative killer apps forge a fresh riverbed, which forks as they become disciplined science. These paths can diverge into new business models and updated consumer preferences, and have a ripple effect. These ripples may again converge with the formerly divergent path. The riverbed continues seeking low ground and bursts into estuaries, deltas, and so on. When scientific advancement is paired with technology, those same branching, winding rivers can become augmented flashes of lightning.

Sci-fi author William Gibson noted, “the future is already here — it’s just not very evenly distributed.”7 Across the world, we have gaps in available technology. The delivery of the advancements we’re gaining today to people who were previously unconnected socializes them to the internet in a completely immersive and unguided manner. Tomorrow needs sherpas ushering others toward what may be of interest to them.

We’ve already picked up a few mental tools for reducing friction by critically perceiving technology. Now let’s use those tools to find out how we might create traction for others progressing through life. These touchstones arrive from what is recognized as convergence. Convergence denotes the point at which multiple technologies combine to create a new innovation. The web as we experience it is a convergence of coding standards, web technologies, and media. These aspects coalesce to create a communications network that spans the globe and stokes the fires of contemporary industry. Another example is the iPhone 3G. With faster data speeds, Apple was able to supply a conduit for downloading apps over cellular networks. This converged with device sensors to enable killer apps like Angry Birds. With increasing data speeds, app developers have been able to integrate more and more features into apps, producing their own waves of innovation.

Instead of defining how these areas will actually come together, for the next section I will instead highlight the underlying technologies that are priming the development of a killer app. This is not a roadmap, but rather a ‘reverse engineering’ of the potential paths toward solutions that enable a shared prosperity. The changes represented in the below sections are just a few examples of areas presently experiencing shifts toward exponential advancement of technology. These new developments do not represent discarding what works presently; rather, it’s about examining what truly makes us human and optimizing our collective culture toward that goal, with technology used as a tool. In capitalism, the market appears to define the common desires of a people, but capitalism does not contain a niche for self-limiting personal desires. In essence, we made ‘ten-thousand spoons when all we really need is a knife.’8 Through the use of Maslow’s target, we’ll highlight the potential paths people may take in their pursuit of meaning.

Fig 6-4. - A refresher on the stages of Maslow’s Motivational theory.


Enabling Physiological / Access to Clean Water

Across the world, balancing the purification of water for both consumption and sanitation is costly and energy bound, and few systems have been proven to attain scale. Current efforts involve retrofitting oil tankers into floating, mobile desalination plants. The materials science approach is to push water through a membrane that filters out, among other particulate matter, salt - producing fresh water as output. The limiting factor here is how quickly we can do it and how much energy it requires to push the water through the membranes. Advances here include the use of graphene’s tiny, one atom thick structure to reduce friction while filtering salt, all while expending low amounts of energy.9

Enabling Physiological / Access to Healthy Food

Presently, food consumption and food waste are staggeringly high. In the United States alone, an estimated 133 billion pounds of food was wasted in 2010.10 In a country where we consider ourselves a representation of the free world, it is inconvenient to our sensibility that we’re a nation plagued by over-consumption. In order to reduce waste, a chief recommendation of the World Resources Institute is to eat less beef. The reason being, “beef uses more land and freshwater and generates more greenhouse gas emissions per unit of protein than any other commonly consumed food.”11 Good luck telling any red-blooded American they can’t have that hamburger. Social attitudes toward food are bound to be deeply challenged in the next few years, not least by startups attempting to produce lab-grown beef alternatives.

Modern Meadow, based in Brooklyn, New York, recently closed their series B round of venture capital funding to the tune of $40 million to carry their animal-less leather product to mass markets. Another similar startup, named Impossible Foods, aims to bring the sizzle to the meatless burger by creating a full-on beef replacement. Patrick Brown, the CEO of Impossible Foods, set a high standard for their plant based product, stating, “we had to make something that a meat lover will prefer to what they’re getting today from an animal.”12 To this end, the company has pursued closely mimicking the characteristics of a great burger. The ‘meat’ contains the hallmarks of a burger, from the sizzle to the bleeding effect - everything a burger lover craves. The keystone ingredient comes in the form of ‘heme,’ a compound found in all life, plant and animal. While these burgers may not read as too appetizing, they’re actually really tasty, as I got to find out first hand.

Fig 6-5. - You gotta break a few eggs.13

While Impossible Foods ramps up to large scale production, their burgers are available in a few restaurants. Lucky for me, I had the recent pleasure of investigating how close these impossible burgers come to the real thing - for science! The texture was nearly identical to a traditional burger, and the flavor was 80% there, in my opinion. I won’t say it’s a perfect burger, but if you’re the least bit curious, check it out! Food is yet another example of advances in technology making a difference.

Between automated farming and vertical farming (think farms inside of a building) in urban environments, we can begin to move toward more efficient food production and distribution. If we can train an autonomous vehicle to navigate a city street or diagnose an obscure disease, we can sure as hell train a robot to have a green thumb. Known as controlled-environment agriculture, or CEA, the benefits of this mode of farming are numerous and include:

1. No crops lost to weather events

2. No use of fossil fuels to harvest, transport or refrigerate

3. No use of pesticides or herbicides

4. Multiple job opportunities for urbanites

5. Uses far less water (70% less) than outdoor farming14


As we continue to study the genetic modification process, there may come a time when we must rely on GMO produce to feed food-insecure people. ‘Genetically modified organisms’ sounds like science speak for Frankenstein’s monster, but in actuality it’s more along the lines of ensuring drought resistance and crop yield. Representing the collectively god-fearing aspect of society, opponents of GMOs tend to lament tampering with nature. This is foolhardy for reasons that are far closer to home than most realize. Bananas, corn, soybeans, cotton - all are already genetically modified by humans and have been for generations. Not to mention cats and dogs, who have been bred for desirable traits, leading to their own genetic deficiencies and diseases - looking at you, pugs and French bulldogs. The point is, kicking up a stink over GMOs is silly because American society is already intimately familiar with the practice.

As the population of farmers continues to drop, providing food will become more and more ‘computerized,’ and we may even see blockchain making an appearance in tracking food from farm to table. Blockchain can aid in optimizing supply chains both locally and nationally. In practice this can help reduce food waste, and when tackled in conjunction with smart policy, we can make a real dent. The issue of food waste is something we should be concerned about (my own habits being far from exemplary, so no judgment here). Present attitudes toward food are skewed by the industrial level of processing that masks and mimics various foods. We’d all do our grandchildren a favor by approaching new ideas in the food conservation space more open-mindedly. Americans aren’t likely to eat bugs as a source of protein, but there are certainly people out there who would. The practical application of technology in these fields is providing solutions for managing a post-scarcity level of food supply.

Enabling Physiological / Access to Inexpensive and Secure Shelter

Materials science will play a key role in the development of cheap access to shelter. 3D printing at a larger scale, while employing locally available raw materials, can lead to cost-optimized housing. If you recall, 3D printing is able to print complex structures out of the box, with no additional resources expended, though the technology for intricate home design requires more research and development into the raw materials fed into 3D printers. There are high hopes for enabling the local and cheap production of housing. The 3D printing of tomorrow will be able to print everything from clothing to hardware.

WinSun Decoration Design Engineering, based in China, for example, has developed a 3D printer capable of printing ten 600-square-foot homes in under 24 hours. The process uses recycled waste, was designed to be pieced together on-site, and did not include plumbing or electricity. “Industrial waste from demolished buildings is damaging our environment, but with 3D-printing, we are able to recycle construction waste and turn it into new building materials,” said CEO of WinSun, Ma Yihe.15 The primary drivers of this type of technology are the cost and speed with which homes can be constructed. For the moment, the technology is limited to mini crane-like arms that are unable to reach high places and need to be positioned. In the future there will be printers that use local materials, thus drastically reducing the cost of the shelter. Further, automated crane arms working in concert will help address rapidly evolving shelter issues presented by natural disasters. Ethically, can we afford not to house everyone who needs a home?

Enabling Physiological / Access to Clean, Renewable Energy

Power generated by solar panels atop consumers' roofs is presently approaching an efficiency level that allows for what is known as grid parity. Grid parity denotes that the cost of generating energy via solar power is equivalent to drawing energy from the power grid. Because the methods of power generation have completely different cost schemes, naively comparing the cost of solar energy to grid-based energy would be like comparing apples to pineapples. The levelized cost of electricity (LCOE) calculates the net present value of a power source's lifetime cost divided by its lifetime energy production. From it we can pinpoint when operating solar becomes cheaper than drawing from the grid, the point that will usher in the mass adoption of solar.
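For the curious, here is that apples-to-apples comparison in miniature. The formula is the standard LCOE ratio of discounted lifetime costs to discounted lifetime energy; the dollar figures are assumptions for illustration, not real quotes.

```python
def lcoe(capital, annual_om, annual_kwh, years, discount_rate):
    """Levelized cost of electricity in $/kWh: discounted costs over discounted output."""
    costs = capital + sum(annual_om / (1 + discount_rate) ** t for t in range(1, years + 1))
    energy = sum(annual_kwh / (1 + discount_rate) ** t for t in range(1, years + 1))
    return costs / energy

# Hypothetical rooftop array: $15,000 installed, $150/yr upkeep,
# 9,000 kWh/yr output, 25-year life, 5% discount rate.
print(f"{lcoe(15_000, 150, 9_000, 25, 0.05):.3f} $/kWh")  # compare against your grid rate
```

When that number drops below the local utility rate, solar has passed grid parity for that household.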

For the moment, we're encountering friction between the way the power grid presently works and the market forces involved with widespread adoption of solar panels. The reason: for the first time, homeowners are generating power as opposed to simply consuming it. This upends the way we've built and structured the laws and infrastructure governing power grids to date. Power generated on the roof is known as decentralized - similar to decentralized networks. So now we have homeowners effectively sipping power from the grid, and in some instances feeding power back into it. The argument from power grid companies is that these homeowners no longer pay the service fees that help maintain the physical infrastructure involved with providing electricity to large geographic areas. The primary blocker to solar adoption is the learning curve involved in understanding which option makes sense for your family. Talk of dollars per kilowatt-hour can seem simple until you add in financing options and battery cycle efficiency. For most homeowners, solar panels remain a confusing and unintuitive proposition.

Home batteries are interesting because they store the energy generated by solar panels, which enables homeowners to use that energy on demand during the hours they are home. This has a normalizing effect on their power draw from the grid, which further exacerbates the revenue issues involved with maintaining the grid. Still, the allure of reducing a variable cost for the average home is an attractive thought. Further innovation is required before home batteries become commonplace, and the path there runs through the steady price reduction of batteries over time.
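A toy simulation shows the normalizing effect. All generation, demand, and capacity numbers here are invented.

```python
# Each entry is one 3-hour block of a day (kWh); figures are illustrative only.
solar  = [0, 0, 1, 4, 5, 4, 1, 0]   # panel generation
demand = [1, 1, 1, 2, 2, 2, 4, 3]   # household usage

charge, capacity = 0.0, 10.0        # battery state and size in kWh
grid_draw = []
for gen, use in zip(solar, demand):
    surplus = gen - use
    if surplus >= 0:
        charge = min(capacity, charge + surplus)   # store excess instead of exporting
        grid_draw.append(0.0)
    else:
        from_battery = min(charge, -surplus)       # cover the deficit from storage first
        charge -= from_battery
        grid_draw.append(-surplus - from_battery)  # only the remainder comes from the grid

print(grid_draw)  # far flatter than `demand` alone
```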

For mobile devices, battery depletion can cause anxiety in our tech-obsessed society. Admittedly, I have charging cords all over the place because I dislike being below 50%. I know it's a silly thing to worry about, but the root of the anxiety points to an issue with our tools. A screwdriver requires no batteries, yet add a battery and the tool becomes much more productive. That's the trend we're accelerating toward - more efficient productivity. Without a battery technology that matches that same ambition, our devices remain limited. Smartphones are great, but if they can only make it through a couple of hours of heavy use, their utility is greatly diminished. This requires innovations that are only recently showing signs of life.

Enabling Safety / Access to Personalized Education

Beyond physiological concerns, education is among the most important factors in an individual securing their safety. As in decades past, students arrive at a central location, park their butts at desks, and are talked at for hours. This model was enabled by the industrial revolution, which outmoded many forms of labor and limited the necessity for child labor; with their newfound free time, young workers evolved into young students. Our application of education has since evolved, different methodologies have arisen, and we've debatably gotten better at educating people. Still, students are required to sit still and memorize facts. What opportunities and risks does technology bring to bear for the education of our young people? Quite a lot, actually. Especially in terms of virtual and augmented reality: it's not difficult to imagine cheap, widely available headsets that switch between VR, AR, and real life at the will of a teacher. It would take the Magic School Bus to an entirely different level.

Immersive navigation through 3D worlds that transitions seamlessly between AR and VR modes can have profound impacts on learning. A teacher might transport their students to a medieval battlefield to explain the circumstances of a battle and how victory for one group meant misery for the others. With the flick of a button, the teacher can then transition to a mechanical explanation of siege catapults and the physics involved. Practical applications of augmented reality include instruction on the mechanics of engines and other physical disciplines.

From there, we're able to better measure a student's absorption of material via big data analysis. Interactions as subtle as tentatively switching an answer from C to A can correlate with the effectiveness of educational software. Software can track subtle interactions that, when analyzed, provide insight into which teaching methods succeed and which do not. Fine-tuning this understanding can allow for new modes of education, such as personalized education. Differentiated instruction has long been a staple of education, and with the addition of technology, this philosophy can be augmented to tailor learning to each student individually, at little cost. A win-win!
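As a hypothetical sketch of what such interaction tracking might look like (the event format and records are mine, not any real product's):

```python
from collections import Counter

# Invented interaction log: each record is one answer change on a quiz question.
events = [
    {"student": "s1", "question": 7, "from": "C", "to": "A", "final_correct": True},
    {"student": "s2", "question": 7, "from": "A", "to": "C", "final_correct": False},
    {"student": "s3", "question": 7, "from": "C", "to": "A", "final_correct": True},
]

# How often does switching away from a given first answer end in a correct response?
switch_outcomes = Counter((e["from"], e["final_correct"]) for e in events)
for (first_answer, correct), n in sorted(switch_outcomes.items()):
    print(f"switched away from {first_answer}: correct={correct} x{n}")
```

Aggregated across thousands of students, patterns like these are what let software distinguish effective material from confusing material.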

Enabling Safety / Access to Personalized Healthcare

Of all the areas where we may experience the benefits of an optimized future, the most necessary is likely healthcare. A recent study from the American Medical Association found that among 57 physicians, 2 hours were spent on paperwork for every 1 hour spent with patients.16 Many healthcare professionals level the blame squarely at Electronic Health Records, or EHRs. A recent study by the RAND Corporation, a nonprofit, concluded that EHRs generate friction and inefficient record keeping, citing “poor usability, time-consuming data entry, interference with face-to-face patient care, inefficient and less fulfilling work content, inability to exchange health information, and degradation of clinical documentation.”17 EHRs occupying too much of physicians' time is one of those low-hanging fruits for technology solutions. While keeping and managing electronic records on patients is an excellent idea, the implementation demonstrates the folly of developing solutions inside a black box. Without ongoing feedback from the people the technology is meant to serve, we're doomed to ongoing foibles.

Developing a solution that automates and optimizes record keeping is one of those no-brainers that was, sadly, poorly executed. This is not a shortcoming of technology but rather of policy. There is certainly room for improvement according to the American Medical Association, which cited eight potential opportunities for improving healthcare information technology, among them improving user interfaces to streamline data entry.18 This is definitely something the free market and technology companies can help bolster. As time passes we continue to actively interpret the ways in which people interact with technology, and troves of interaction data are created daily. We're able to improve user interfaces and experiences, making them more intuitive. Usability has evolved and will continue to evolve, and given its inclusive nature we can all expect technology to become simpler to pick up and use over time. Including the healthcare industry in that progress is necessary to leverage all that technology brings to the table. I'd wager machine learning and algorithms hold the keys to some interesting insights into our healing process.

An example of personalized algorithms making a real-life difference: Elina Berglund, a scientist at CERN, developed an app that monitors women's menstrual cycles to track ovulation periods for conception. Reportedly, the app Natural Cycles has been found to be over 99.5% accurate in preventing unwanted pregnancy.19 Our understanding of our bodies is being continually augmented by our ability to interpret our bodily signals. As society gains enhanced perception of our bodies through computer science and technological advances, we'll also gain further awareness of how we connect with one another.

In 2011, IBM plied their thinking machine, the AI known as Watson, toward the game show Jeopardy! in a showdown with the game's reigning champions. The results were conclusive: the machine bested the champions of human trivia. Not content with merely outwitting some of our brightest for entertainment, IBM set their aim at improving AI's learning capabilities. The idea is that Watson ingests all available knowledge within healthcare by scanning medical imaging materials like X-rays, ultrasounds, CT scans, PET scans, EHRs, etc. The AI searches for differences between healthy and unhealthy cells and highlights them for the attending doctor, who can then review potential causes in rank order. The setup has the added benefit of linking the doctor to the supporting literature as well, optimizing the diagnosis; a computer is much quicker than a human at reading and ingesting information like medical journals. According to Steve Harvey, the Vice President of Watson Health, the integration of data mining techniques will enable analysis of the patient's genome - and here's where it gets exciting - the analysis is performed on both the healthy and the unhealthy cells.20 Personalized medicine recognizes that what works for one person can actually be detrimental for another; each of our bodies has unique health needs. Aligning data mining techniques with cheap personalized genomes could yield huge advances in healthcare.
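Here is a schematic of the rank-and-review triage idea, not IBM's actual system; the regions, scores, and citation identifiers are placeholders.

```python
# Invented findings, each with an anomaly score and supporting literature references.
findings = [
    {"region": "left lung, lower lobe", "anomaly_score": 0.91, "refs": ["ref-a"]},
    {"region": "liver segment IV",      "anomaly_score": 0.34, "refs": ["ref-b"]},
    {"region": "right kidney",          "anomaly_score": 0.78, "refs": ["ref-c"]},
]

# The physician reviews the highest-scoring anomalies first, literature in hand.
for f in sorted(findings, key=lambda f: f["anomaly_score"], reverse=True):
    print(f"{f['anomaly_score']:.2f}  {f['region']}  supporting: {', '.join(f['refs'])}")
```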

In addition to our increasingly clear understanding of DNA and our genomes, this technology would enable us to protect our species in a profound way. Disease spread by mosquitoes has sent millions of people to an early demise; in 2015 alone, mosquitoes were linked to 500,000 deaths worldwide.21 We now have the technology to breed mosquitoes with altered DNA that prevents them from transmitting deadly disease. There are risks here in the form of unintended consequences, so the topic demands immediate public discourse. There is so much yet to be discovered in biotechnology that it can and will fill many books to come.

Enabling Safety / Access to Safe Travel via Autonomous Vehicles

As we continue down the roads on which contemporary technology is shaping our lives, it's important to remind ourselves of the danger inherent in simply driving down the road. It's pretty dangerous out there! 1.25 million people across the globe die every year from traffic accidents. In 2014, 29,989 Americans were involved in fatal accidents - on average, 82 people a day.22 Speaking of roads, and I sincerely mean this: to hell with traffic, I think you'll agree. Among all of the amazing events yet to transpire thanks to technology, heavily reduced traffic is my favorite horse in the race. Zipping anywhere quickly and efficiently within an autonomous vehicle sounds fantastic - I'll take two! Let's be frank, parties would be a bit more fun for everyone when designated driving is handled by the car. The future of autonomous vehicles is exciting and full of prospects for travel. The potential is so enticing that entrants like Uber and Lyft will leap into the market, because owning a fleet of autonomous vehicles is cheaper to maintain than a fleet of drivers and vehicles. These companies and others like them may rush their autonomous vehicles to market, leaving the potential for instances where the human operator was expected to correct course for the vehicle and failed to. Such a lapse in judgement could cause injury or worse. To aid drivers in understanding how autonomous vehicles work, the U.S. Department of Transportation created a level system designating the types of autonomous vehicles.23

Fig 6-6. - The largest misconception about autonomous vehicles is that autonomy is binary.

Level 0 - Where we’ve always been, the driver controls the vehicle entirely.

Level 1 - Where we are now, very limited autonomous control by vehicle, basics like braking.

Level 2 - Vehicle is able to control at least two functions normally done by the driver. Activities like lane centering and adaptive cruise control, pretty much where Tesla is right now.

Level 3 - The driver will be expected to be available to take the wheel in the event of unexpected conditions. Otherwise most systems are automated.

Level 4 - The golden ticket, you enter the vehicle and tell it where you want to go. The vehicle then automatically handles the rest. Kick back and relax, you’ve reached the future at this point.

Atop this framework sits an idea from the automated vehicle enthusiast community: a 'level 5' vehicle that doesn't even have pedals or a steering wheel. If Apple ever releases an automated vehicle, it will be in the range of level 5, will likely have only one button, and they will be lauded for their innovation. Jokes aside, the industry is making great strides toward achieving level 4 vehicles, but for now there is danger in assuming that the software of these vehicles is infallible. There will be injurious accidents and there will be public dismay; only once we reach level 4 will we be able to really kick back and enjoy the ride.
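For the programmatically inclined, here is one way to encode the taxonomy above; the enum and helper are my own sketch, not anything published by the DOT.

```python
from enum import IntEnum

class Autonomy(IntEnum):
    """The levels discussed above; level 5 is the enthusiasts' addition."""
    NONE = 0          # driver does everything
    ASSISTED = 1      # basics like braking
    PARTIAL = 2       # two or more functions, e.g. lane centering plus adaptive cruise
    CONDITIONAL = 3   # mostly automated, but the driver must be ready to take over
    FULL = 4          # tell it where to go, then relax
    DRIVERLESS = 5    # no pedals, no steering wheel

def driver_must_stay_alert(level: Autonomy) -> bool:
    # The danger zone: anything below level 4 still depends on a human fallback.
    return level < Autonomy.FULL

print(driver_must_stay_alert(Autonomy.CONDITIONAL))  # True - hence the risk of rushed rollouts
```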

Enabling Love & Belonging

Here's the thing... once we arrive at this point in Maslow's target, the motivation behind our behavior becomes hyper-personal. I like you and all, but I couldn't infer your situation well enough to comment, so let's just say my recommendation is to exemplify consistent and clear communication. This book's aim has been to be the friend who points out the mud puddle on the sidewalk before you accidentally step in it. Cognitive biases inhibit our ability to sense the risks and opportunities coupled with tech, and to act toward our self-designed goals. Enabling as many people as possible to achieve the common footholds of our human trek is an apparent ethical obligation. In order to navigate the impending waves of significant change we need to brace our minds and our rationality. The challenges posed to our shared moral purpose will require a unity of purpose on a level previously unknown to our species. We need to hug it out, communicate, and look toward the horizon. Though this journey is ongoing, we are afforded glimpses into what it may be, and this view is magnified by our technology, which is initiating a golden age.

How we behave, and how we preserve the ability of all people to rise, has likely never been more important. The promise of our era of technological transformation lies in ensuring our people see a guiding light in each other, so we are able to move forward together. To embrace the spirit of what makes us unique animals among a jungle of other species is to embrace our humanity. We honor our commonality by traveling together down this unknown road. Never before has the practical application of technology been so inexpensive for societies to provide, and those modest mercies enable people to express themselves, and to experience virtual and physical worlds, in new and exciting ways.

Enabling Esteem / Accessibility

All of this technological progress is moot for the nearly one in five Americans living with disabilities. This requires a strong commitment by software and hardware companies to include accessibility as a common and unified tenet of design. People with motor impairments can be served by assistive tech, with additional focus on making user interfaces readable by software - that is, accessibility supplied via a lingua franca for the varying modes of differing ability. In other words, we need to establish sets of rules and methods of engagement that supersede the walled gardens between mobile device manufacturers like Samsung, Google, and Apple. Enabling those with impairments to find and engage in their self-designated communities with ease should be a high priority.

Enabling Esteem / Virtual, yet Temporary Escapism

Imagine you suffered a stroke and were completely unable to leave your bed. Technology holds the key to enabling people in this situation to experience life far removed from their disability. We now hold the ability, and with it the imperative, to enhance these individuals' lives through technology. Options could include interfaces specialized for reduced motor function, and voice or eye tracking software that would enable people to navigate digital environments. The richness of these digital environments can even include virtual reality; immersing immobile people in the destination of their desire is a small effort for a large improvement in their lives.

Enabling Esteem / 3D Printed Prosthetics

One of the wonderful abilities of 3D printing technology is that it can be employed to help people who have lost limbs regain function via 3D printed prosthetics. Traditional prosthetic systems can cost anywhere from $5,000 to $50,000 and demand weeks to build. The e-NABLE Community is a group of dedicated members who use their 3D printers to print parts for people in need of upper limb assistive devices. They went further by open-sourcing the design, making it free for all to use and improve, democratizing its design. This innovation and others like it are advancing the design behind these types of prosthetics. You can now scan your limb using a specialized camera and expect to pay as little as $200. The benefits of communities like this are clear, and they are part of what is being created across the world. Look for more and more of this maker culture to come to the web. How this expresses itself moving forward will, in my eyes, wow the world.

Enabling Esteem / Cognitive Assisted Interfaces

Augmented reality user interfaces might reduce the friction in our day-to-day lives in the very near future, as crazy as that sounds. When your actual vision contains a graphical user interface in the form of 'digital stickers,' the cues aid in recognizing your stated goals within the physical world. These stickers can enable focused motivation toward achievement by drawing your attention to the factors that help you reach your goals. Cognitive overlays like this are just a few short years away; today, there is no user interface for day-to-day life aimed at achievement. This reduction in friction will enable the sensation of swimming in a rapid stream that you, the user, define. If you need to be reminded to stand up and stretch your legs, a notification in your AR user interface can remind you. The benefits will extend well into the realm of personal health, and we can expect a great deal of assistance in achieving self-stated goals. Socially, many are not quite ready for AR to be employed in the public commons, and we'll likely see many odd and potentially negative reactions to its ubiquitous use.
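A minimal sketch of that stretch-reminder sticker, assuming a hypothetical headset API; the render call and threshold are invented.

```python
import time

STRETCH_INTERVAL = 50 * 60  # assumed: seconds of sitting before a nudge

def render_sticker(text: str) -> None:
    """Stand-in for a real AR headset's overlay call."""
    print(f"[AR overlay] {text}")

def check_posture(seated_since: float, now: float) -> None:
    if now - seated_since >= STRETCH_INTERVAL:
        render_sticker("You've been seated a while. Stand up and stretch!")

# Simulate someone who sat down 55 minutes ago.
check_posture(seated_since=time.time() - 55 * 60, now=time.time())
```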

Enabling Self Actualization

Fig 6-7. Enabling self actualization creates genuine opportunities.

We have now arrived at the chewy center, the bullseye of motivation! Self-actualization is entirely too specific to the individual for me to make meaningful recommendations. Directionally, however, I believe a good start is to define your individual meaning, develop it into a personal mission statement, and work toward that goal. Collectively, enabling widespread self-actualization, and the shared prosperity that results, can be aided by true ownership over our personal data. At present our identity data resides in a leaky crate that enables third parties to tap into our digital worlds at will for profit. It requires that we establish data and privacy ownership laws that reflect the high-minded ideals of personhood set forth by our founding fathers. Property rights were not conceived in anything resembling the digital era, and the frameworks established then need revision to address our needs within this new public common. Inclusive of this are personally generated data rights, covering any data created by or for us. Even if each payment is a fraction of a penny, the sheer frequency of access would see users rewarded for granting it.
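As a back-of-the-envelope illustration of how those fractions of a penny add up (the rate and access log are invented):

```python
from decimal import Decimal

RATE_PER_ACCESS = Decimal("0.003")  # hypothetical: three tenths of a cent per use

# Invented log: 1,000 uses of one person's data in a billing period.
access_log = ["ad_targeting", "analytics", "ad_targeting", "resale_profile"] * 250

owed = RATE_PER_ACCESS * len(access_log)
print(f"{len(access_log)} accesses -> ${owed:.2f} owed to the data's owner")
```

At web scale, where a single profile may be touched thousands of times a month, even tiny per-access rates become meaningful recompense.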

Whenever an internet company uses our personal data, fair recompense should be exchanged. Convenience is not currency, and mistaking it for one places positive outcomes at risk. At present, there are too many opportunities for data mining companies to take user data at will and for profit. Public education on privacy rights is one of many tasks on the public's hands; while it begins with awareness, the details are nuanced and require forward thinking to fully understand their impact. NASA serves as a great example of a government organization with exemplary public outreach. They do an excellent job of communicating with the public about their discoveries, their astronauts, and their partnerships with other space agencies.

It's better to build today than it is to plan for tomorrow. Technology has given us an increasingly clear view into the inner workings of our collective world, and we can appreciate with heightened clarity what's involved in deploying technology against a widely held issue. Tackling these issues through time has provided humanity ever taller horizons, continually drawing our determination into focus. A better world won't simply arrive tomorrow or the day after; we have to build it.

I'd propose the creation, in partnership with NGOs and the private and public sectors, of a yearly summary of practical technology goals anyone can achieve. Inclusive of this would be goal lines for emerging economies, specifically regarding how to integrate streamlined and mature technology for the frictionless rollout of government programs. It would also tackle institutional corruption through transparency via innovations like the blockchain; the availability of a public ledger can aid citizens in ensuring budgets are planned with input from stakeholders. As stated, one of NASA's core missions is promoting their discoveries to the public, and the government should just as exuberantly inform the people of new advances in technology. Though the word 'technology' is ancient, it still feels new, and citizens deserve to be fully informed.

Everyone hits roadblocks and experiences events that either build them up or tear them down. Every person is subject to the same lack of existential comfort the universe affords any of us humans. We all desire the love and affection of those close to us, we all berate ourselves for our failures, we all see slowly aging faces in the mirror. These common traits reside inside even the most successful of people - we're all human, and it's amazing that we're even here. The difference between a successful person and an unsuccessful one comes down to too many factors to fathom, so it's often reduced to luck and timing. More to the point, I believe their success resides in purposefully neglecting to ask anyone's permission to pursue their own goals. Elon Musk exemplifies this characteristic: he determined the world needed electronic forms of payment, so he built one. He determined the world needed an electric car, so he built one. He determined humans needed to reach for the stars, so he built a space startup. Still, he's just one man, the same as the rest of us, and prone to biases and being wrong. Every flaw we share, he is subject to as well.

More importantly though, he did not ask anyone permission. We've built our day-to-day lives around mimicking every perceived habit of successful people, believing that if we mimic the factors that led to their success, we can glean a fraction of it. As we've established, perception is a fickle bitch - especially if you take into account the physiology of perception. For example, foveal vision provides accurate focus within a narrow band near the center of our vision, due to a concentration of light receptors at the back of the eye. It allows our eyes to discern detail within that focused band to the exclusion of detail outside it. This gives us clear vision while our brain fills in the rest of the information. How our perception via foveal vision can be tricked is demonstrated in the following image.

Fig 6-8. - Whack-a-dot for your eyes!

There are actually 8 dots in this image, which can be viewed more clearly if you cover half of the image with your hand. This is because your foveal vision attempts to piece together the image but has blind spots in its perception. Foveal vision is an energy optimization for the brain/eye combo of tools. Common scientific knowledge holds that up to 20% of the energy in the human body is directed to the brain, so shortcuts that sip energy were favored over evolutionary time. Technology has always been a tool of mankind, and throughout history we've used it to make people's lives better by reducing the energy expended in work.

There has never been a valid reason throughout history to drop the tools at our disposal and pursue Luddite pastimes, and the same holds true now. Computing sparked a revolution that called millions of people into careers in technology, and we encounter the compounding benefits of their progress on a day-to-day basis. Using all available tools became our mission as soon as nomads rejected their former lifestyle in exchange for stationary comforts; we share our stars and wishes with those same explorers. Apps are not yet as intuitive to use as a fire-hardened stick, but it is vital to understand that they will inevitably become as intuitive as your physical senses. Similar to the way we react to a distant shout by turning an ear toward it, apps will become ingrained in our daily lives to an even more personal degree, acting like a second nature.

Bonus Level: Enabling Self-Transcendence

In his later years, Abraham Maslow added 'transcendence' to his theory of motivation. Transcendence is attained when the individual contributes to something greater than themselves.

“The goal of identity (self-actualization . . .) seems to be simultaneously an end-goal in itself, and also a transitional goal, a rite of passage, a step along the path to the transcendence of identity. This is like saying its function is to erase itself.”24

I get it, the transcendence talk all sounds very new age-y, and to a certain degree that's accurate. For many, the idea of transcendence involves sitting in front of the football game - and that's OK; not everyone needs to be enlightened, nor does everyone want to be. The point is that a lot of very smart people have studied psychology in pursuit of what makes us happy, when our focus should instead be on achieving contentedness. We shouldn't be surprised if the desire for something more spiritual arises. Take the medieval and Renaissance-era cathedrals: some took hundreds of years to construct, and they leave no doubt as to the astonishing skill and dedication of their creators. Remember the absurdity of the odds that placed us here, and celebrate living. If you discover yourself with the opportunity to dance, then you probably should.

We must include as many people as possible in enabling frictionless movement between their motivational stages. We need to be open-eyed and open-minded to steer these technologies in the best direction possible. The future is not guaranteed to be 100% radiant rainbows and radical robots, or to provide the answer to your every whim. I just don't believe the outcome will be an apocalyptic wasteland of hopeless savages either. Somewhere in that haze of uncertainty exists a range of reasonably possible outcomes. Some will preserve our humanistic ideals and harness technology for further understanding of those ideals, while other tomorrows may not turn out so rosy. These divergent paths underscore the need to keep a watchful eye on tech and innovation. Improving our societal outcome begins with discourse between people. Communicate with the people in your life. Share what you're excited about; we can be very different from one another, but we are always, above all else, human.

In significant ways, our brain's default mode network is being modified by our use of technology. In fact, we have evidence of a new brain wave pattern arising when we use it: a recent study performed at the Mayo Clinic suggests that when texting, people's brains employ a unique brain wave pattern.25 Performed with a device known as an electroencephalograph (EEG), the study of 129 people displayed distinct patterns of brain activity while composing text messages. The modified brain patterns suggest that our brains are continually changing based on environmental factors such as electronic device use. Scientists are identifying these changes on a rolling basis, and much more research is needed. Given technology's adoption lifecycles, we may be approaching a new understanding of human-computer interaction.

When you began reading this book, a lot of the concepts in this chapter wouldn't have made any sense. Gaining hindsight is distinctly different from gaining the focused insight furnished by technology. As we move through time and reflect, our present evolves from a frayed thread into a tightly woven cord of memory. Crystallizing our understanding of the tools at our disposal requires understanding our personal roadblocks, including how our past or present behavior may inhibit or advance our own goals.

Ensuring Critical Perception

It is important for us all to recall the struggles and challenges of those who've come before us. Recognizing where they failed and where they excelled, through an examination of our own history, acts as a sequence of lenses that provides context and clarity to our present-day situations. Without understanding where we've failed in the past, proceeding forward becomes prone to error. Contemplating the details and dates to the point of trivia is not required; rather, it's more important to remain curious about how we arrived at the here and now. There have always been heinous rulers and grotesque demagogues. Caligula murdered for amusement. Genghis Khan demanded fealty from cities under threat of murdering every citizen (a threat he oftentimes followed through on). Josef Mengele was personally responsible for hundreds of thousands of deaths in Nazi death camps. Countess Elizabeth Báthory was said to bathe in the blood of virgins to retain her youth. These people were dedicated only to pleasing themselves and were enabled by great systemic power. You know, psychopaths of the highest order.

These psychopaths leveraged the 'villain' of all cognitive biases: attributional bias. This bias enforces an attitudinal opposition to groups outside one's own; in the above examples, those were ethnic or social forces of opposition. 'Othering' a group whose behavior appears distinct from our own enables a negative perception of that outside group. We're typically very good at diagnosing what's wrong with everything - we have a finely tuned eye for that, thanks to evolution.

At present, society is gaining an increasing clarity about our problems, and they can appear insurmountable. It makes me want to bury my head in the sand sometimes. We no longer live in communities where our front doors are left unlocked, as was common in bygone eras. The world has always existed in a state of change, and it happens to be changing faster as the years pass. We need to preserve our strength to grapple with very large concepts, and to process them with grace we must talk about them. That sounds terrifying, but with the full complement of tools we've been handed over thousands of years of advancement in science and technology, we can do this. We are well equipped, and yet well hindered. Recognizing hindrances to discourse is, in my opinion, the most important first step. We can do that here and now.

We now know that the dictators and sadistic leaders we discussed may very well have had a broken or poorly functioning amygdala, the area of the brain which research suggests serves as the moral compass for our conscious decision making. When comparing overall brain activity between groups of people, individuals who exhibit signs of psychopathic behavior also exhibit reduced blood flow or function in the amygdala and the prefrontal cortex. There are already intervention programs for toddlers who show signs of lower amygdala volume. There is likely a much broader answer, but with time and the march of progress, we'll better understand, or debunk, whether this correlation contains an element of causation. Experts in neuroscience have noted “this moral feeling, centered on the PFC [prefrontal cortex] and amygdala, is the engine that translates the cognitive recognition that an act is immoral into behavioral inhibition—and it is this engine that functions less well in antisocial, violent and psychopathic individuals.”26 This is not to claim that individuals who exhibit antisocial behavior are not responsible for their actions. They absolutely need to be held responsible. What studies like this point to is the fact that we still don't know enough about how the brain operates. Our present knowledge of the brain cannot definitively point to the set of circumstances, within the individual and their environment, that causes psychopathic behavior.

Here is where the light begins to shine through those clouds. We have promising news that defeating grotesque leaders through nonviolent methods is more effective than violent campaigns, and its effectiveness is rising. Political scientist Erica Chenoweth examined hundreds of nonviolent and violent campaigns from 1900 to 2006, finding that roughly 3.5% of the population actively participating marks the tipping point for nonviolently toppling a government.27 To be clear, these nonviolent actions carry heavy personal risk for those standing up for what they believe in; however, as our hard-won intuition tells us, there is power in numbers. The nonviolent nature of the protests has the added bonus of attracting a wider swath of people, given the polarizing nature of violent coups. Nonviolent protests also start and end more democratically, inviting more varied people into forming the new state of governance than a life-threatening violent campaign does. Nonviolent protest may be imperfect, but it can be bolstered dramatically with the aid of technology.
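To put that 3.5% figure in perspective, a quick back-of-the-envelope calculation (the population figure is approximate):

```python
us_population = 330_000_000          # rough figure for illustration
tipping_point = int(us_population * 0.035)
print(f"{tipping_point:,}")          # roughly 11.6 million active participants
```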

Fig 6-9. - Protest method success rates by decade, non-violence is gaining ground. (Erica Chenoweth/YouTube)

Fig 6-10. - The imperceptible march toward a better tomorrow.

Additionally, deaths in war have dropped dramatically over time, as shown by Steven Pinker in his article in The Guardian.28 Overall, the world is becoming more and more interconnected. Information travels at the speed of light across the world, raising awareness of cultural touchstones in a fraction of the time the previous decade would have required. That connective tissue enables potentially significant world events to enter our perception as they occur. This sea change from our previous history both poses and solves challenges. We must maintain focus on the touchstones that move our collective progress forward.

An example of how progress is sometimes obscured: according to the World Bank, the organization tracking development across the world, from 1990 to 2013 approximately 1.09 billion people were elevated out of 'extreme poverty.' In the span of roughly a quarter century, over a billion people were enabled to reach further than their grasp had taken them the day before. Yet given the events of 2016, attention to promising statistics has been eschewed in favor of emotional comfort.

Fig 6-11. Real progress is being made worldwide. (Courtesy OurWorldInData.org)

The temptation to seek comfort in a screen is exactly what we should be wary of. The apparent obsession with instant gratification pervades our media and our personal consumption. Our issues, social and technical, are many, but our focus on the bigger picture is bolstered by increased awareness. People can connect with their friends and family seamlessly, so there's no reason not to share with them. The web facilitates conversations around topics that interest and engage you, and we need to be sure that the technology that connects us does so in the most fertile way possible.

As discussed, some people perceive technology as a 'train they've missed,' and I contend that the train has very much not left the station yet. Let this be yet another reminder that it is never too late to engage with technology. Motivation is useful for helping others hop on the tech train and for utilizing technology as, A) a tool for helping others attain minimum standards of living, and B) a means to push the most gifted citizens of the world to accomplish wonders. We don't have a time machine to go back and prevent the awful things that have already happened. What we can do is use these principles to inform the way we live, embracing each day as an opportunity to improve our tomorrow. Augmented with technology in pursuit of individual meaning, we are further enabled to achieve our goals and help those around us. As quoted at the beginning of the book, your future is whatever you make it, so make it a good one... for everyone.

Ensuring Artificial Intelligence Elevates Tomorrow

Recorded history was once written by the victors in war, but it is now written by everyone with a connection. It's on every Facebook, Snap, Tweet, and Tumblr post - the record of humanity is expanding at a phenomenal rate. Not all of it will be useful in ways whose value is easy to identify. Our world is actively being shaped by the shifting consumption of content created by people in their homes. Communication and creativity are now, more than ever, the domain of the people.

Artificial intelligence holds equal amounts of promise and danger in enabling or disabling shared prosperity. Ensuring the technology is reasonably democratized without stifling innovation is of paramount importance. Our own personal data holds more value than we presently account for, and consent in the use of that data is part and parcel of personal information security. The democratization of AI requires that the prosperity it brings to bear is systemically weighted for the benefit of all. Truly assistive AI will allow us to spend more time engaging in our own human experience.

We face many ethical crossroads in the immediate term, and we'd do well to remember that altruistic intent can provide sincere relief to those in need. Established physiology and security embody the base standard of living for most Americans. The shortsightedness of conceiving of a 'base standard' supposes that's where the trail ends, yet the human race has elevated itself to incredible heights. While some people may feel that we've leveled out, or are presently sputtering, artificial intelligence can in many ways prove helpful in our transition from preserving physiology and security toward self-actualization. AI can aid us on multiple fronts, and it is additive to every layer of our lives.

Multilayered neural networks feature the ability to compare and detect anomalous data, thanks in part to the reams of real-world data fueling what the AI perceives as 'normal.' The AI is thereby attuned to data points that do not fit its own models; the change stands out as simple to detect. In essence, we can feed an AI 1,000 children's books on shapes and colors. If we then make our own children's book on shapes and colors, swapping a triangle for a snake, the AI would rapidly detect the change. In healthcare, this means an AI can view thousands of MRIs to gain a baseline, then detect anomalies in medical imaging and highlight them for human stewardship, improving potential outcomes. While a children's book may seem like a simplistic example, there are profound applications for this technology if you use even a modicum of imagination.
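Here is a bare-bones sketch of that baseline idea using synthetic numbers rather than MRIs; a real system would use a trained network, but the principle, learn 'normal' and flag what deviates, is the same.

```python
import numpy as np

rng = np.random.default_rng(0)
# 1,000 synthetic "healthy" feature vectors standing in for a library of normal scans.
normal_scans = rng.normal(loc=100.0, scale=5.0, size=(1000, 64))

mean = normal_scans.mean(axis=0)
std = normal_scans.std(axis=0)

def anomaly_score(scan: np.ndarray) -> float:
    """Mean absolute z-score against the learned baseline; higher means more unusual."""
    return float(np.abs((scan - mean) / std).mean())

healthy = rng.normal(100.0, 5.0, size=64)
unusual = rng.normal(120.0, 5.0, size=64)   # the 'snake' swapped in for a triangle
print(anomaly_score(healthy), anomaly_score(unusual))  # the second stands out
```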

It is noteworthy that along with the advancement of computers, complementary areas of tech, like biotechnology, military technology, and energy technology, have also advanced. For example, the imaging capabilities in healthcare have witnessed Moore's law-like jumps in efficiency. The overall acuity of our imaging and display systems allows greater insight into the way our bodies function. These complementary innovations in medicine are ushering in a new understanding of how we humans operate, thanks in part to the preceding waves of technological innovation maturing and then converging. Neural networks present a tantalizing clue as to how our brains work to understand the world around us. We're experiencing great gains in computing by mimicking our brains, but what happens when that reaches its own logical conclusion? If we feed these networks of neurons troves of data, the AI improves as rapidly as it is fed. We must approach with caution. A superintelligent machine should not be confused with a conscious being like ourselves; its intelligence should remain compartmentalized as something separate. Until we further understand what consciousness truly is, we cannot reasonably afford machines that designation. The irony, of course, is that the further development of AI will undoubtedly aid in summoning that understanding.

Given AI's ability to assist a person in managing their life efficiently, its use will increasingly become a necessity to function within society. Advanced AI may be an entirely proprietary solution, which could lead to a single person, country, or corporation wielding control over the most advanced machine ever invented, eclipsing nuclear weapons and civil wars in the amount of destruction wrought by its possible misuse. A single superintelligent AI would effectively dwarf the nuclear bomb. An AI's ability to destroy us is just as great as its ability to produce, and it follows the whim of whoever invented it. Ensuring humanity shares in AI-driven prosperity will be one of the ultimate tests of the coming era. Damn, that is heavy - so for now let's just remember that AI is not presently that powerful, though it is also no longer some Victorian-style curiosity that confounds our minds. Responsible development of AI must be a common ideal shared by all who pursue it.

Ensuring Big Data Fuels Tomorrow

The advancement of technology has reached an inflection point in our understanding of ourselves. Self-reflection is generally not a well-practiced human activity. 'Know thyself' was employed as common wisdom, but given the way the world often appears, we might assume the adage is actually 'distract thyself.' With technology we're enabled to empathize with the self in a more nuanced fashion. Enabling self-empathy via technology will, one day, be an available tool for us all.

The insights and value from big data will be a massive source of fuel for our world economy. Sometimes this will reveal hard-to-swallow reflections on our human condition. Insights into our collective behavior may even lead to new behavioral guideposts, much as superstition once provided. The internet's culture desperately needs substantive guidance. We stand at the precipice of new standards in computing that will dictate whether, at the end of the cycle, VR and AR are cluttered with advertising and distraction, or whether they become a user-centric, augmentative layer of data exchange enabling you to lead your life without friction.

If you've made it this far into the book and you are still highly ambivalent about the potential paths I've laid out, that's fair. You can opt out of most things in the realm of technology by not participating; the only advice I'd offer is to remain aware that there are people who will benefit in unimaginable ways. My guarantee is that in the next few years we will redefine what 'once-in-a-lifetime' means. I'm not predicting that you'll wake up one day and unexpectedly pop on a set of augmented reality glasses. I'm stating that you'll wake up one day and everyone around you will be habitually using some new tech - the ground moves beneath our feet.

It may occur and you may ask yourself, "Well... how did I get here?" I can't guarantee that I'll have an answer for that, but what I can guarantee is that we'll all face a new gadget and ask ourselves, "How do I use this?" Earlier in the book I noted that in order to 'get things done' we'd require a training montage from an 80s movie. There was always a series of quickly edited clips of our heroes or heroines going into preparation mode, jazzing themselves up via dance or jogging on a beach, all to confront the antagonist. The teammates would prepare their bodies and minds for the final showdown, the final challenge to overcome. Squealing guitars pepper the action and deep, pumping bass notes thrust our adrenaline forward. The editing uses quick shot after quick shot to convey just how much is changing. Quick shot - tightening headband; quick shot - running up long sets of stairs; quick shot - planning the assault over a table of models; quick shot - reading a book and nodding, implying comprehension. Of course that last one includes you, dear reader; you've made it this far. To complete our montage, nod your head - this quick shot is 'completed to the max!', which for your information is far more rad than just regular completed.

Because technology has given us an increasingly focused look at human nature, we are better equipped to measure the key components of a balanced human nature, and what that means. The tools at our disposal can allow us to optimize our day-to-day actions toward meaningful ends, like spending more time with family and friends. Through the increased communication afforded to communities across the web, we can begin to enact change locally and globally. The human tapestry has always embodied a colorful collection of threads, and it is presently shifting into something we cannot yet recognize. Network effects have built a pervasive industry in technology, and the application of that technology can be steered in the direction of a positive outcome. If that seems unintuitive, that's OK, because much of this is yet to pass.

Ensuring Everyone’s Tomorrow

The advancing wave of technology can either broadside us, or we can roll with it and face the strange - riding the wave. It's a matter of steering ourselves in the direction we choose. The tools yielded by technology are aligning to allow us to express ourselves in new and connective ways. We will be able to recover from previously incurable diseases, or possibly eradicate them before anyone experiences their effects to begin with. We can start the business we've always dreamed of but lacked the expertise to execute, throwing ourselves into a pursuit of our own actualized choosing. My nieces and nephew will experience learning about the Great Wall of China enveloped in a virtual replica. Activities like these embody our dreams as children first arriving on the net. In all of these situations, technology is the tool, and so too is an open mind. Quality-of-life-improving technology is becoming more evenly distributed every day. Poverty can someday soon be addressed. War will end. Those violent shackles of our past nature will someday all be gone, one way or another. Historical progress demands we leverage the learnings available to us and stand upon the shoulders of our own history.

Our individual journeys are vivid, an ongoing sum of our perception of the environment. The conclusion of the human journey is yet to be written and is subject to the efforts of the living embodiment of that journey: us. Our civilization's Schrödinger's cat is how we utilize technology to answer our most pressing questions about human nature; to assist or to hurt is all in the act of participating in creating the future. Before that answer resolves into clarity we'll undoubtedly encounter roadblocks. Somewhere in between those two distinct outcomes lies our very own tomorrow. How we communicate and then act to navigate those roadblocks will define what that tomorrow costs all of us.

By establishing core goals that advance our species, we enter the business of developing wonders again. Like the Apollo program, we need to redefine what it means to test the limits of our reach. For more than 50 years we have not set foot on the moon, so it is up to the citizens to support the endeavors that make for memorable, decade-defining feats of pure imagination. The very names of our constellations were given by the ancient Greeks and Romans; we have an opportunity unlike that of any other period to plant our own flag in the night sky, echoing the human story throughout the stars and planets.

Works Cited


“1/3/2000: No Y2K Problems.” ABC News. ABC News Network, 3 Jan. 2000. Web. 04 Feb. 2016.

“About ICRICT.” ICRICT. Independent Commission for the Reform of International Corporate Taxation, n.d. Web. 28 Aug. 2016. <www.icrict.org/about-us/>.

Administration, National Highway Traffic Safety. “Did You Know?” FARS Encyclopedia. N.p., n.d. Web. 15 Sept. 2016. <www-fars.nhtsa.dot.gov/Main/index.aspx>.

American Medical Association. “AMA Calls for Design Overhaul of Electronic Health Records to Improve Usability.” AMA News Room. N.p., 16 Sept. 2014. Web. 14 Sept. 2016. <www.ama-assn.org/ama/pub/news/news/2014/2014-09-16-solutions-to-ehr-systems.page>.

American Medical Association. “Allocation of Physician Time in Ambulatory Practice: A Time and Motion Study in 4 Specialties.” N.p., 6 Sept. 2016. Web. 14 Sept. 2016. <annals.org/article.aspx?articleid=2546704>.

Antonipillai, Justin. “Help Us Advance Data Equality.” Medium. N.p., 07 Sept. 2016. Web. 23 Sept. 2016. <https://medium.com/the-wtf-economy/counselor-antonipillai-urges-digital-sector-help-us-advance-data-equality-8987f912ae80#.pko92oan7>.

“Apple Inc.” AAPL Annual Income Statement. N.p., n.d. Web. 01 Sept. 2016. <www.marketwatch.com/investing/stock/aapl/financials>.

“ACSI Telecommunications and Information Report 2015.” American Customer Satisfaction Index. ACSI, 2 June 2015. Web. 04 Feb. 2016.

Balzer, Deborah, Vivien Williams, Laurel J. Kelly, and Liza Torborg. “Mayo Clinic Minute: Does Texting Change Brainwaves?” Mayo Clinic. N.p., June 2016. Web. 21 Nov. 2016.

Becerra, Leah. “For US, Limiting Unlimited Data Plans Is The New Normal.” News Net 5. N.p., 18 Sept. 2015. Web. 04 Feb. 2016.

Benjamin, Walter, and J. A. Underwood (Translator). The Work of Art in the Age of Mechanical Reproduction. London: Penguin, 2008. Print.

Berezow, Alex. “Maths Study Shows Conspiracies ‘prone to Unravelling’ - BBC News.” BBC News. N.p., 26 Jan. 2016. Web. 04 Feb. 2016.

Berkeley Lab. Berkeley Lab to Lead 5 Exascale Projects, Support 6 Others. N.p., 7 Sept. 2016. Web. 15 Sept. 2016. <newscenter.lbl.gov/2016/09/07/berkeley-lab-lead-two-doe-exascale-computing-proposals-support-four-others/>.

Berwick, R.C., Friederici, A.D., Chomsky, N. & Bolhuis, J.J. Evolution, brain, and the nature of language. Trends Cogn. Sci. 17, 89–98 (2013).

Blade Runner. Dir. Ridley Scott. Prod. Ridley Scott and Hampton Francher. By Hampton Francher and David Webb Peoples. Perf. Harrison Ford, Rutger Hauer, and Sean Young. Warner Bros., 1982. Transcript.

Bomey, Nathan. “Walmart to Cut 7,000 Back-office Accounting, Invoicing Jobs.” USA Today. Gannett, 01 Sept. 2016. Web. 02 Sept. 2016. <www.usatoday.com/story/money/2016/09/01/walmart-jobs/89716862/>.

Bostrom, Nick. “When Machines Outsmart Humans.” Weblog post. N.p., 2000. Web. <www.nickbostrom.com/2050/outsmart.html>.

Bostrom, Nick. “How Long Before Superintelligence?” N.p., 1997. Web. 28 July 2016. <www.nickbostrom.com/superintelligence.html>.

Brandom, Russell. “Advertising Malware Rates Have Tripled in the Last Year, According to Report.” The Verge. N.p., 25 Aug. 2015. Web. 30 June 2016. <www.theverge.com/2015/8/25/9202301/advertising-malware-malvertising-statistics-flash-vulnerability>.

“Bringing Big Data to the Enterprise.” IBM. N.p., n.d. Web. 10 Sept. 2016. <www-01.ibm.com/software/data/bigdata/>.

Brockman, Greg, and Ilya Sutskever. “Introducing OpenAI.” Weblog post. Open AI, 11 Dec. 2015. Web. 31 Aug. 2016. <openai.com/blog/introducing-openai/>.

Brodkin, Jon. “ISPs “reminded” to Not Use Government Money for Alcohol and Vacations.” ArsTechnica.com. N.p., 22 Oct. 2015. Web. 04 Feb. 2016.

Brown, Eric. “Desalination Gets a Graphene Boost.” MIT News. N.p., 02 Nov. 2015. Web. 07 Sept. 2016. <news.mit.edu/2015/desalination-gets-graphene-boost-jeffrey-grossman-1102>.

Buzby, Jean C., Hodan Farah-Wells, and Jeffrey Hyman. “The Estimated Amount, Value, and Calories of Postharvest Food Losses at the Retail and Consumer Levels in the United States.” SSRN Electronic Journal. Feb. 2014. Web. 7 Sept. 2016. <www.ers.usda.gov/publications/eib-economic-information-bulletin/eib121.aspx>.

Caliskan-Islam, Aylin, Joanna J. Bryson, and Arvind Narayanan. “Semantics Derived Automatically from Language Corpora Necessarily Contain Human Biases.” 30 Aug. 2016. Web. 4 Jan. 2017. <https://www.princeton.edu/~aylinc/papers/caliskan-islam_semantics.pdf>.

Chandler, David L. “Researchers Discover New Way to Turn Electricity into Light, Using Graphene.” MIT News. MIT, 16 June 2016. Web. 23 June 2016. <news.mit.edu/2016/new-way-turn-electricity-light-using-graphene-0613>.

Chmielewski, Dawn. “FBI Says Resetting San Bernardino Shooter’s Apple ID Password Not a Screw-Up.” Recode. N.p., 21 Feb. 2016. Web. 20 June 2016.

Cisco Systems. The Internet of Things: How the Next Evolution of the Internet Is Changing Everything. N.p.: Cisco Systems, 2011. Apr. 2011. Web. 5 July 2016. <www.cisco.com/c/dam/en_us/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf>.

Coelho, Maria Haase. “Belgians Are Hunting Books, Instead of Pokemon.” MSN. N.p., 27 Aug. 2016. Web. 31 Aug. 2016. <www.reuters.com/article/us-belgium-books-pokemon-idUSKCN1110RG>.

“Company Info | Facebook Newsroom.” Facebook Newsroom. N.p., n.d. Web. 28 Nov. 2016. <http://newsroom.fb.com/company-info/>.

“ComScore Ranks the Top 50 U.S. Digital Media Properties for January 2016.” ComScore, Inc. N.p., 24 Feb. 2016. Web. 16 June 2016.

Cook, Tim. “Customer Letter.” Apple (Ireland). N.p., 30 Aug. 2016. Web. 01 Sept. 2016. <www.apple.com/ie/customer-letter/>.

Cook, Tim. “Customer Letter - Apple.” Apple. N.p., 16 Feb. 2016. Web. 20 June 2016.

Comelli, D., D’orazio, M., Folco, L., El-Halwagy, M., Frizzi, T., Alberti, R., Capogrosso, V., Elnaggar, A., Hassan, H., Nevin, A., Porcelli, F., Rashed, M. G. and Valentini, G. (2016), The meteoritic origin of Tutankhamun’s iron dagger blade. Meteoritics & Planetary Science. doi: 10.1111/maps.12664

Cortana Intelligence and ML Blog Team. “Microsoft and Liebherr Collaborating on New Generation of Smart Refrigerators.” Cortana Intelligence and Machine Learning Blog. Microsoft, 2 Sept. 2016. Web. 04 Sept. 2016.

Dastjerdi, M., B. L. Foster, S. Nasrullah, A. M. Rauschecker, R. F. Dougherty, J. D. Townsend, C. Chang, M. D. Greicius, V. Menon, D. P. Kennedy, and J. Parvizi. “Differential Electrophysiological Response during Rest, Self-referential, and Non-self-referential Tasks in Human Posteromedial Cortex.” Proceedings of the National Academy of Sciences 108.7 (2011): 3023-028. Web.

Dawson, Jan. “Five Years of Tim Cook’s Apple in Charts.” Observer. N.p., 26 Aug. 2016. Web. 16 Sept. 2016. <observer.com/2016/08/five-years-of-tim-cooks-apple-in-charts/>.

“Death of the Double Irish.” The Economist. The Economist Newspaper, 15 Oct. 2014. Web. 01 Sept. 2016. <www.economist.com/news/business-and-finance/21625444-irish-government-has-announced-plans-alter-one-its-more-controversial-tax-policies>.

Despommier, Dickson. “The Vertical Farm: Controlled Environment Agriculture Carried out in Tall Buildings Would Create Greater Food Safety and Security for Large Urban Populations.”ResearchGate.net. N.p., 3 Dec. 2010. Web. 8 Sept. 2016.

Druyan, Ann, Steven Soter, and Carl Sagan. “The Lives of the Stars.” Cosmos: A Personal Voyage. PBS. 23 Nov. 1980. Television.

Eliot, Charles William, ed. Adam Smith, Wealth of Nations. New York: P.F. Collier, 1909. Print.

Englehardt, Steven, and Arvind Narayanan. “Online Tracking: A 1-million-site Measurement and Analysis.” N.p., 18 May 2016. Web. 30 June 2016. <randomwalker.info/publications/OpenWPM_1_million_site_tracking_measurement.pdf>.

“The Evolution of Work: The Changing Nature of the Global Workplace.” ADP Research Institute, 29 June 2016. Web. 29 June 2016. <www.adp.com/tools-and-resources/adp-research-institute/research-and-trends/research-item-detail.aspx?id=DF55E8A7-906A-4E81-A941-E886886BC9B2>.

European Commission. State Aid: Ireland Gave Illegal Tax Benefits to Apple worth up to €13 Billion. N.p., 30 Aug. 2016. Web. 1 Sept. 2016. <europa.eu/rapid/press-release_IP-16-2923_en.htm>.

Facebook. Facebook Reports First Quarter 2016 Results and Announces Proposal for New Class of Stock. N.p., 27 Apr. 2016. Web. 5 July 2016. <investor.fb.com/investor-news/press-release-details/2016/Facebook-Reports-First-Quarter-2016-Results-and-Announces-Proposal-for-New-Class-of-Stock/default.aspx>.

“Fast Facts About Agriculture.” American Farm Bureau Federation, n.d. Web. 01 Sept. 2016. <www.fb.org/newsroom/fastfacts/>.

Fingas, Jon. “Solar Cell Generates Power from Raindrops.” Engadget. N.p., 11 Apr. 2016. Web. 23 June 2016. <www.engadget.com/2016/04/11/solar-cell-generates-power-from-raindrops/>.

Fisher, Max. “Peaceful Protest Is Much More Effective than Violence for Toppling Dictators.” Washington Post. The Washington Post, 5 Nov. 2013. Web. 13 Sept. 2016. <www.washingtonpost.com/news/worldviews/wp/2013/11/05/peaceful-protest-is-much-more-effective-than-violence-in-toppling-dictators/>.

Furman, Jason. Is This Time Different? The Opportunities and Challenges of Artificial Intelligence. Tech. The White House, 7 July 2016. Web. 7 July 2016. <www.whitehouse.gov/sites/default/files/page/files/20160707_cea_ai_furman.pdf>.

Garber, Steve. “The Decision to Go to the Moon.” NASA. N.p., 29 Oct. 2013. Web. 05 Jan. 2017. <http://history.nasa.gov/moondec.html>.

Gardiner, Bryan. “You’ll Be Outraged at How Easy It Was to Get You to Click on This Headline.” Wired.com. Conde Nast Digital, 18 Dec. 2015. Web. 01 Sept. 2016. <www.wired.com/2015/12/psychology-of-clickbait/>.

Gartner, Inc. “What Is Big Data?” Gartner IT Glossary. Gartner, 04 Oct. 2016. Web. 07 Dec. 2016. <http://www.gartner.com/it-glossary/big-data/>.

Gaudiosi, John. “Magic Leap Leads $1.1 Billion Wave of VR and AR Investment.” Fortune. N.p., 06 Mar. 2016. Web. 18 July 2016. <fortune.com/2016/03/07/magic-leap-1-billion-investment/>.

Gibbs, Samuel. “Chatbot Lawyer Overturns 160,000 Parking Tickets in London and New York.” The Guardian. Guardian News and Media, 28 June 2016. Web. 28 June 2016. <www.theguardian.com/technology/2016/jun/28/chatbot-ai-lawyer-donotpay-parking-tickets-london-new-york>.

“Graphenano and Grabat Launch Graphene-based Batteries.” Graphene Info. N.p., 8 Feb. 2016. Web. 23 June 2016. <www.graphene-info.com/graphenano-and-grabat-launch-graphene-based-batteries>.

“Graphene Gets Bright: World’s Thinnest Lightbulb Developed.” Phys.org, 15 June 2015. Web. 23 June 2016.

Greenfieldboyce, Nell. “Their Masters’ Voices: Dogs Understand Tone And Meaning Of Words.” NPR. NPR, 30 Aug. 2016. Web. 30 Aug. 2016. <www.npr.org/sections/health-shots/2016/08/30/491935800/their-masters-voices-dogs-understand-tone-and-meaning-of-words>.

Gutierrez, Lisa. “Town Elects a Dog for Mayor – for the Third Year in a Row.” Miami Herald. N.p., 24 Aug. 2016. Web. 30 Aug. 2016. <www.miamiherald.com/news/nation-world/national/article97633047.html>.

Haase Coelho, Maria. “Belgians Are Hunting Books, Instead of Pokemon.” MSN. N.p., 27 Aug. 2016. Web. 31 Aug. 2016. <http://www.msn.com/en-us/news/offbeat/belgians-are-hunting-books-instead-of-pokemon/ar-BBw6N9W>.

Handwerk, Brian. “Crocodiles Have Strongest Bite Ever Measured, Hands-on Tests Show.” National Geographic. National Geographic Society, 15 Mar. 2012. Web. 23 June 2016. <news.nationalgeographic.com/news/2012/03/120315-crocodiles-bite-force-erickson-science-plos-one-strongest/>.

Hardy, Quentin. “The Web’s Creator Looks to Reinvent It.” The New York Times. The New York Times, 07 June 2016. Web. 22 June 2016. <www.nytimes.com/2016/06/08/technology/the-webs-creator-looks-to-reinvent-it.html>.

Hartogs, Jessica. “Tech CEOs in Support of Apple vs FBI.” CNBC. N.p., 18 Feb. 2016. Web. 20 June 2016.

Henn, Steve. “The Night A Computer Predicted The Next President.” All Tech Considered. NPR, 31 Oct. 2012. Web. 17 Nov. 2016.

Hicks, Katie, and Jeff Stein. “Why This ‘Bloody’ Veggie Burger May Become the Tesla of Food.” Vox. N.p., 07 July 2016. Web. 08 Sept. 2016. <www.vox.com/2016/7/7/12106708/impossible-foods-ezra-klein-show>.

Hohenadel, Kristin. “Here’s How Artists in the Late 1800s Imagined Life in the Year 2000.” Slate.com. N.p., 16 Nov. 2015. Web. 04 Feb. 2016.

Horsey, David. “TSA’s 95% Failure Rate Shows Airport Security Is a Charade.” Los Angeles Times. Los Angeles Times, 9 June 2015. Web. 06 Sept. 2016. <www.latimes.com/opinion/topoftheticket/la-na-tt-tsa-airport-security-charade-20150608-story.html>.

“IAB Believes Ad Blocking Is Wrong.” IAB, n.d. Web. 20 June 2016. <https://www.iab.com/iab-believes-ad-blocking-is-wrong/>.

Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006. Print.

Jobs, Steve. “Thoughts on Flash.” Apple, Apr. 2010. Web. 29 Nov. 2016. <http://www.apple.com/hotnews/thoughts-on-flash/>.

Kahneman, Daniel. Thinking, Fast and Slow. Reprint ed. N.p.: Farrar, Straus and Giroux, 2013. Print.

Kaplan, Jonas T., Sarah I. Gimbel, and Sam Harris. “Neural Correlates of Maintaining One’s Political Beliefs in the Face of Counterevidence.” Scientific Reports 6 (2016): 39589. Web.

Kim, Mark H. “Why Doctors Want A Computerized Assistant For Cancer Care.” NPR. NPR, 12 Aug. 2016. Web. 10 Sept. 2016. <www.npr.org/sections/health-shots/2016/08/12/487943961/why-doctors-want-a-computerized-assistant-for-cancer-care>.

Knutson, Ryan. “Verizon to Pay $1.35 Million to Settle FCC Probe of ‘Supercookies’” WSJ. Wsj.com, 07 Mar. 2016. Web. 03 Oct. 2016. <http://www.wsj.com/articles/verizon-to-pay-1-35m-to-settle-fcc-probe-of-supercookies-1457372226>.

Koltko-Rivera, Mark E. “Rediscovering the Later Version of Maslow’s Hierarchy of Needs: Self-Transcendence and Opportunities for Theory, Research, and Unification.” Review of General Psychology 10.4 (2006): 302-17. Web. 26 Aug. 2016. <psycnet.apa.org/journals/gpr/10/4/302/>.

KOMO Staff. “Washington State Sues Comcast for $100M.” KOMO News. N.p., 1 Aug. 2016. Web. 31 Aug. 2016. <komonews.com/news/local/washington-state-to-sue-comcast-for-100m>.

Krassenstein, Eddie. “Andreas Bastian Creates Incredible Bendable 3D Printed Mesostructured Material.” 3DPrint.com. N.p., 29 Apr. 2014. Web. 7 Dec. 2016. <https://3dprint.com/2739/bastian-mesostructured/>.

Kruger, Justin, and David Dunning. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-assessments.” Journal of Personality and Social Psychology 77.6 (1999): 1121-134. Web.

Kumparak, Greg. “Elon Musk Compares Building Artificial Intelligence To ‘Summoning The Demon.’” TechCrunch. N.p., 26 Oct. 2014. Web. 31 Aug. 2016. <techcrunch.com/2014/10/26/elon-musk-compares-building-artificial-intelligence-to-summoning-the-demon/>.

Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Viking, 1999. Print.

Kurzweil, Ray. “The Law of Accelerating Returns.” Kurzweil AI. N.p., 7 Mar. 2001. Web. 28 July 2016. <www.kurzweilai.net/the-law-of-accelerating-returns>.

Lehrer, Jonah. “Why Smart People Are Stupid.” The New Yorker. N.p., 12 June 2012. Web. 27 Oct. 2016.

Lepore, Emiliano, Francesco Bonaccorso, Matteo Bruna, Federico Bosia, Simone Taioli, Giovanni Garberoglio, Andrea C. Ferrari, and Nicola Maria Pugno. “Silk Reinforced with Graphene or Carbon Nanotubes Spun by Spiders.” arXiv, 25 Apr. 2015. Web. 23 June 2016. <https://arxiv.org/ftp/arxiv/papers/1504/1504.06751.pdf>.

Lerner, William. Historical Statistics of the United States: Colonial times to 1970: Bicentennial Edition. Washington: Government Printing Office, 1975. Print.

Leswing, Kif. “Apple Could Be on the Hook for $19 Billion in Taxes, and the Obama Administration Is Livid.” Business Insider. Business Insider, Inc, 24 Aug. 2016. Web. 01 Sept. 2016. <www.businessinsider.com/apple-taxes-us-treasury-european-commission-2016-8>.

Levitan, Mark, and Susan Wieler. “Poverty in New York City, 1969-99: The Influence of Demographic Change, Income Growth, and Income Inequality.” FRBNY Economic Policy Review, July 2008. Web. 23 Sept. 2016. <https://www.newyorkfed.org/medialibrary/media/research/epr/08v14n1/0807levi.pdf>.

Lieberman, David. “CEO Forum: Microsoft’s Ballmer Having a ‘Great Time.’” USA Today. Gannett, 30 Apr. 2007. Web. 03 Nov. 2016. <http://usatoday30.usatoday.com/money/companies/management/2007-04-29-ballmer-ceo-forum-usat_N.htm>.

Lin, Liu Yi, Jaime E. Sidani, Ariel Shensa, Ana Radovic, Elizabeth Miller, Jason B. Colditz, Beth L. Hoffman, Leila M. Giles, and Brian A. Primack. “Association Between Social Media Use and Depression Among U.S. Young Adults.” Depression and Anxiety 33.4 (2016): 323-31. Web.

Liptak, Andrew. “People Are Stopping and Getting out of Their Cars to Catch Vaporeon.” The Verge. N.p., 16 July 2016. Web. 18 July 2016. <www.theverge.com/2016/7/16/12205440/pokemon-go-crowds-central-park>.

Loomis, Carol J. “Elon Musk Says Autopilot Death ‘Not Material’ to Tesla Shareholders.” Fortune, 04 July 2016. Web. 06 July 2016. <fortune.com/2016/07/05/elon-musk-tesla-autopilot-stock-sale/?xid=soc_socialflow_twitter_FORTUNE>.

Markel, Howard. “In 1850, Ignaz Semmelweis Saved Lives with Three Words: Wash Your Hands.” PBS NewsHour. PBS, 15 May 2015. Web. 17 Nov. 2016.

Marsh, Rene, David Gracey, and Ted Severson. “Damaged Pipelines Are ‘Ticking Time Bomb.’” CNN. Cable News Network, 31 May 2016. Web. 01 Sept. 2016. <www.cnn.com/2016/05/25/politics/infrastructure-roads-bridges-airports-railroads/>.

Maslow, A. H. (1957). Abraham Maslow papers, Archives of the History of American Psychology, The Center for the History of Psychology, The University of Akron.

Masnick, Mike. “Cable Industry Finally Admits That Data Caps Have Nothing To Do With Congestion.” Techdirt. N.p., 23 Jan. 2013. Web. 04 Feb. 2016.

Mazza, Ed. “Comcast Apologizes After Changing Customer’s Name to ‘Asshole Brown.’” The Huffington Post. TheHuffingtonPost.com, 29 Jan. 2015. Web. 04 Feb. 2016.

McKay, Betsy. “Mosquitoes Are Deadly, So Why Not Kill Them All?” WSJ. Wsj.com, 02 Sept. 2016. Web. 21 Sept. 2016. <www.wsj.com/articles/mosquitoes-are-deadly-so-why-not-kill-them-all-1472827158>.

McMillan, Robert. “What Everyone Gets Wrong in the Debate Over Net Neutrality.” Wired. Conde Nast, 23 June 2014. Web. 05 Jan. 2017. <https://www.wired.com/2014/06/net_neutrality_missing/>.

McSweeney, Kelly. “Autonomous Tractors Could Turn Farming into a Desk Job.” ZDNet. N.p., 2 Sept. 2016. Web. 06 Sept. 2016. <www.zdnet.com/article/autonomous-tractors-could-turn-farming-into-a-desk-job/>.

“Minor Bug Problems Arise.” BBC News. BBC, 01 Jan. 2000. Web. 04 Feb. 2016.

Morrow, Michael. “World’s First 3D Printed House Is Completed after Just 45 Days in China.” NewsComAu. N.p., 27 June 2016. Web. 08 Sept. 2016. <www.news.com.au/technology/innovation/design/worlds-first-3d-printed-house-is-completed-after-just-45-days-in-china/news-story/05c819dfc0dc6bf7ec0fd2abfed23edd>.

Mulloy, Tara. “Unruly Launches New Video Lab To Help Advertisers Survive Ad Blocking Phenomenon.” Unruly. N.p., 24 Sept. 2015. Web. 20 June 2016. <unruly.co/news/article/2015/09/24/unruly-launches-new-video-lab-to-help-advertisers-survive-ad-blocking-phenomenon/>.

Nakashima, Ellen. “Apple Vows to Resist FBI Demand to Crack IPhone Linked to San Bernardino Attacks.” Washington Post. The Washington Post, 17 Feb. 2016. Web. 20 June 2016.

“NASA Exoplanet Archive.” NASA Exoplanet Archive. NASA, n.d. Web. 06 Sept. 2016. <exoplanetarchive.ipac.caltech.edu/>.

Office of Inspector General. “Report: Drinking Water Contamination in Flint, Michigan, Demonstrates a Need to Clarify EPA Authority to Issue Emergency Orders to Protect the Public.” EPA. Environmental Protection Agency, 20 Oct. 2016. Web. 06 Dec. 2016. <https://www.epa.gov/office-inspector-general/report-drinking-water-contamination-flint-michigan-demonstrates-need>.

Palmer, Michael. “Data Is the New Oil.” Web log post. ANA Marketing Maestros. N.p., 3 Nov. 2006. Web. 3 Nov. 2016. <http://ana.blogs.com/maestros/2006/11/data_is_the_new.html>.

Parloff, Roger. “Spy Tech That Reads Your Mind.” Fortune, 1 July 2016. Web. 05 July 2016. <http://fortune.com/insider-threats-email-scout/>.

Pattakos, Alex, and Stephen Covey. Prisoners of Our Thoughts: Viktor Frankl’s Principles at Work. San Francisco: Berrett-Koehler, 2004. Print.

Peters, Mark. “Changes Choke Cap-and-Trade Market.” The Wall Street Journal.

Pinker, Steven. “Now for the Good News: Things Really Are Getting Better.” The Guardian. Guardian News and Media, 11 Sept. 2015. Web. 13 Sept. 2016. <www.theguardian.com/commentisfree/2015/sep/11/news-isis-syria-headlines-violence-steven-pinker>.

Pinker, Steven. “The Cognitive Niche: Coevolution of Intelligence, Sociality, and Language.” PNAS. N.p., 11 May 2010. Web. 09 Sept. 2016. <www.pnas.org/content/107/Supplement_2/8993.full>.

Pitney, Nico. “Stephen Hawking: Humans Should Fear Aliens.” The Huffington Post. TheHuffingtonPost.com, 25 May 2011. Web. 21 June 2016. <www.huffingtonpost.com/2010/04/25/stephen-hawking-aliens_n_551035.html>.

Protalinski, Emil. “Google Will Stop Running Flash Display Ads on January 2, 2017.” VentureBeat. N.p., 9 Feb. 2016. Web. 29 Nov. 2016. <http://venturebeat.com/2016/02/09/google-will-stop-running-flash-display-ads-on-january-2-2017/>.

“Pruneyard Shopping Center v. Robins.” LII / Legal Information Institute. N.p., n.d. Web. 03 Dec. 2016. <https://www.law.cornell.edu/supremecourt/text/447/74>.

Purzycki, Benjamin Grant, Coren Apicella, Quentin D. Atkinson, Emma Cohen, Rita Anne McNamara, Aiyana K. Willard, Dimitris Xygalatas, Ara Norenzayan, and Joseph Henrich. “Moralistic Gods, Supernatural Punishment and the Expansion of Human Sociality.” Nature 530.7590 (2016): 327-30. Web.

Raine, Adrian, and Yaling Yang. “Neural Foundations to Moral Reasoning and Antisocial Behavior.” Social Cognitive and Affective Neuroscience. N.p., 17 Sept. 2006. Web. 13 Sept. 2016. <www.ncbi.nlm.nih.gov/pubmed/18985107>.

Ranganathan, Janet, Daniel Vennard, Richard Waite, Patrice Dumas, Brian Lipinski, Tim Searchinger, and Et Al. “Shifting Diets for a Sustainable Food Future.” World Resources Institute, Apr. 2016. Web. 07 Sept. 2016. <www.wri.org/publication/shifting-diets>.

Rodriguez, Salvador. “60% of World’s Population Still Won’t Have Internet by the End of 2014.” Los Angeles Times. Los Angeles Times, 7 May 2014. Web. 20 Feb. 2016.

Roy, Siddhartha, Anurag Mantha, Min Tang, and Rebekah Martin. “[Complete Dataset] Lead Results from Tap Water Sampling in Flint, MI.” Flint Water Study Updates. N.p., 01 Dec. 2015. Web. 01 Sept. 2016. <flintwaterstudy.org/2015/12/complete-dataset-lead-results-in-tap-water-for-271-flint-samples/>.

Russon, Mary-Ann. “China: Recycled Concrete Houses 3D-Printed in 24 Hours.” International Business Times RSS. N.p., 24 Apr. 2014. Web. 08 Sept. 2016. <www.ibtimes.co.uk/china-recycled-concrete-houses-3d-printed-24-hours-1445981>.

Sagan, Carl, and Ann Druyan. The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House, 1995. Print.

Schwartz, Jonathan. “The Future of 3D-printed Prosthetics.” TechCrunch. N.p., 26 June 2016. Web. 21 Sept. 2016. <techcrunch.com/2016/06/26/the-future-of-3d-printed-prosthetics/>.

“The Science in Science Fiction.” Talk of the Nation. NPR, 30 Nov. 1999. Radio. Timecode 11:55.

Scott, Clare. “TSA Discovers 3D Printed Gun in Carry-On Luggage at Reno Airport.” 3DPrint.com. N.p., 09 Aug. 2016. Web. 06 Sept. 2016. <3dprint.com/145323/3d-printed-gun-reno-airport/>.

Shukman, David. “Hawking: Humans at Risk of Lethal ‘Own Goal.’” BBC News. N.p., 19 Jan. 2016. Web. 02 Nov. 2016.

Smith, Robin Anne. “How Did Human Brains Get to Be so Big?” Scientific American Blog Network. N.p., 06 Aug. 2013. Web. 09 Sept. 2016. <blogs.scientificamerican.com/guest-blog/how-did-human-brains-get-to-be-so-big/>.

Soon, Chun Siong, Marcel Brass, Hans-Jochen Heinze, and John-Dylan Haynes. “Unconscious Determinants of Free Decisions in the Human Brain.” Max Planck Society. N.p., 13 Apr. 2008. Web. 8 Sept. 2016. <www.mpg.de/research/unconscious-decisions-in-the-brain>.

“Stark, New York (NY 13361) Profile.” City-Data.com. N.p., n.d. Web. 23 Sept. 2016. <http://www.city-data.com/city/Stark-New-York.html>.

Stephan, Maria J., and Erica Chenoweth. “Why Civil Resistance Works: The Strategic Logic of Nonviolent Conflict.” International Security 33.1 (2008): 7-44. Web. <belfercenter.ksg.harvard.edu/files/IS3301_pp007-044_Stephan_Chenoweth.pdf>.

Strange, Adario. “Photo Reveals That Even Mark Zuckerberg Puts Tape over His Webcam.” Mashable. N.p., 21 June 2016. Web. 22 June 2016. <mashable.com/2016/06/21/mark-zuckerberg-webcam-cover/#J5JCpQAhAEq4>.

Sunstein, Cass. “The Law of Group Polarization.” Debating Deliberative Democracy (n.d.): 80-101. Web.

Swaminathan, Nikhil. “Why Does the Brain Need So Much Power?” Scientific American. N.p., 29 Apr. 2008. Web. 09 Sept. 2016. <www.scientificamerican.com/article/why-does-the-brain-need-s/>.

Szoldra, Paul. “The NSA Hack Proves Apple Was Right to Fight the FBI.” Business Insider. Business Insider, Inc, 21 Aug. 2016. Web. 22 Aug. 2016.

Tatum, William O., Benedetto DiCiaccio, and Kirsten H. Yelvington. “Cortical Processing during Smartphone Text Messaging.” National Center for Biotechnology Information. U.S. National Library of Medicine, June 2016. Web. 12 Sept. 2016.

“‘Tech Tax’ Aims to Help Homeless in San Francisco.” FOX40. N.p., 05 July 2016. Web. 05 July 2016. <fox40.com/2016/07/05/tech-tax-aims-to-help-homeless-in-san-francisco/>.

Terry, Ken. “EHRs’ Broken Promise.” Medical Economics. Advanstar Communications Inc, 20 May 2015. Web. 14 Sept. 2016. <medicaleconomics.modernmedicine.com/medical-economics/news/ehrs-broken-promise?page=full>.

Tooby, John, and Leda Cosmides. “Evolutionary Psychology Primer by Leda Cosmides and John Tooby.” N.p., 13 Jan. 1997. Web. 20 Nov. 2015.

“The Unicorn List 2016.” Fortune. N.p., 24 Mar. 2017. Web. 07 Sept. 2017. <www.fortune.com/unicorns/>.

U.S. Department of Transportation. U.S. Department of Transportation Releases Policy on Automated Vehicle Development. N.p., 30 May 2013. Web. 15 Sept. 2016.

“US Digital Display Ad Spending to Surpass Search Ad Spending in 2016.” eMarketer, 11 Jan. 2016. Web. 16 June 2016.

Van Der Meulen, Rob, and Janessa Rivera. “Gartner Says By 2020, a Quarter Billion Connected Vehicles Will Enable New In-Vehicle Services and Automated Driving Capabilities.” Gartner, 25 Jan. 2016. Web. 06 Dec. 2016. <http://www.gartner.com/newsroom/id/2970017>.

Wagner, Kurt. “How Many People Are Actually Playing Pokémon Go? Here’s Our Best Guess so Far.” Recode. N.p., 13 July 2016. Web. 20 July 2016. <www.recode.net/2016/7/13/12181614/pokemon-go-number-active-users>.

Wald, Chelsea. “Watch Animals Watching Themselves in Mirrors. What Do They Think They See?” Slate Magazine. N.p., 24 Oct. 2014. Web. 26 Nov. 2016.

Walker, Daniela. “Can an Algorithm Replace the Pill? Natural Cycles App Wants to Do Just That.” WIRED UK. N.p., 4 Oct. 2016. Web. 21 Nov. 2016. <http://www.wired.co.uk/article/natural-cycles-ovulation-app>.

Wallace, Patricia M. The Psychology of the Internet. Cambridge, UK: Cambridge UP, 1999. Print.

Weckler, Adrian, and Michael Cogley. “’No One Did Anything Wrong Here and Ireland Is Being Picked On... It Is Total Political Crap’ - Apple Chief Tim Cook.” Independent.ie. N.p., 01 Sept. 2016. Web. 01 Sept. 2016. <www.independent.ie/business/irish/no-one-did-anything-wrong-here-and-ireland-is-being-picked-on-it-is-total-political-crap-apple-chief-tim-cook-35012145.html>.

West, Richard F., Russell J. Meserve, and Keith E. Stanovich. “Cognitive Sophistication Does Not Attenuate the Bias Blind Spot.” Journal of Personality and Social Psychology 103.3 (2012): 506-19. Web.

Whitfield-Gabrieli, Susan, Joseph M. Moran, Alfonso Nieto-Castañón, Christina Triantafyllou, Rebecca Saxe, and John D. E. Gabrieli. “Associations and Dissociations between Default and Self-reference Networks in the Human Brain.” NeuroImage 55.1 (2011): 225-32. Web.

Yadron, Danny. “Apple: Government ‘intended to Smear’ Us in Digital Privacy Fight with FBI.” The Guardian. Guardian News and Media, 10 Mar. 2016. Web. 20 June 2016.

Zhang, Q., F. Zhang, S. P. Medarametla, H. Li, C. Zhou, and D. Lin. “3D Printing of Graphene Aerogels.” Small 12 (2016): 1702-708. doi:10.1002/smll.201503524.

Zikopoulos, Paul, Dirk DeRoos, Christopher Bienko, Rick Buglio, and Marc Andrews. Big Data beyond the Hype: A Guide to Conversations for Today’s Data Center. New York City, NY: McGraw-Hill Education, 2015. 9. Print.

Zilis, Shivon. “Machine Intelligence.” Shivon Zilis. N.p., n.d. Web. 30 Nov. 2016. <http://www.shivonzilis.com/machineintelligence>.

Zuckerberg, Mark. “Letter from Mark Zuckerberg.” Letter to Potential Shareholders. 1 Feb. 2012. Registration Statement on Form S-1. SEC.gov, 1 Feb. 2012. Web. 29 Nov. 2016.


Author Bio

Corey Preston grew up in the rural hills of central New York State, the youngest of four brothers who guided him through the gaming-fueled PC technology of the ’90s. The stark contrast between the wider world of tech and his modest surroundings drove him to seize every learning opportunity available, and at 16 it led him to a specialized vocational course that gave him a strong foundation in information technology. It was there that he became fascinated with the first dot-com rush and gained an appreciation for the role technology plays in our lives.

Throughout his life, he has dedicated himself to participating in the ongoing development of the web, beginning with teaching himself graphic design through one of the early expressions of a democratized web: free skill-based tutorials. In college, he founded a community for sharing open-source video game assets. Further adventures included co-founding a media review website with childhood friends and his brothers, which later evolved into a Minecraft community that is still active today. By treating the internet as a font of unending shared knowledge, Corey developed a lifelong pattern of ravenous self-directed learning.

He later opted into a career in digital media, where his experience yielded deep insight into the business and technical mechanics behind ‘web 2.0.’ Specializing in startups that developed deeply interactive advertising campaigns and educated clients on the technology in play, he has had a front-row seat to the web since the days of AOL. It is his passion to educate people about the promise offered by a technology-enabled tomorrow. Since moving from NYC to California, he has dedicated his time to completing this, his first book, Welcome to Tomorrow: A Beginner’s Guide to Technology.

About the Book

Welcome to Tomorrow: A Beginner’s Guide to Technology brings clarity to the chaotic three-ring circus of technology launching our present into the future. For readers ranging from the inexperienced to the merely curious, the book offers a plain-language tour of technology’s tools and impacts, the effects of which are dizzyingly nowhere, yet everywhere, all at once. Separating the signal from the noise, readers will begin to recognize the patterns that power technology and will gain insights that enable their own self-development. As the topics ramp up in complexity, we’ll present the nuances with the easy-to-understand language and humor necessary to digest the content. Using examples from film and pop culture, the reader can expect to engage with the material from a variety of angles.

After the first dot-com bust, most wrote off the internet while its creators and devotees went back to work building the mass-communication tools we have now. A breadth of topics will be covered here, from web basics to the varieties of artificial intelligence and their potential. This book will also aid aspiring students in search of a career in technology, providing the foundational highlights they need to gain insight and apply it toward their next steps in learning. Through practical advice about conducting your life on the web, readers will emerge as skilled end users of 21st-century tools.

Whether you were born in the internet age or are just coming to appreciate the world of technology now, Welcome to Tomorrow serves as your tour of the potential highs and lows of the immediate future. We’ll bridge the gap by highlighting where cognitive biases promote misconceptions about technology’s impact on us. By grasping control of our decisions, we’ll develop a strong sense of technology’s role in elevating not only our convenience but also our humanity. By learning about and leveraging technology in our everyday lives, we make our voices heard; the next step is harnessing those declarations to begin building a better tomorrow.

Book Front Matter

Welcome to Tomorrow: A Beginner's Guide to Technology




Written by: Corey Preston

Edited by: Shari Angel

For Ashley, who lovingly dared me to believe this book was possible.

©2017 Corey Preston. All rights reserved.
www.WelcomeToTomorrow.Today
Cover design by Ricardo Galbis @GalbisR
Additional icon imagery provided by FlatIcons
Print First Edition, June 2017
ISBN - 978-0-9982453-1-7

Acknowledgments

People who’d recognize my name - Shari Angel; my parents, Greg and Penny; Michael Collins; my three brothers, Trevor, Mike, and Garret Preston, for guiding me through tech; Al Sarnacki; Ben Rathbun; Luanne Cadden; Hugh Cadden; Alexandria Cmar; Joseph Cahill; and any friends who have encouraged me along the way.

People who wouldn’t recognize me but influenced me all the same - Stephen Hawking, Carl Sagan, Ann Druyan, Daniel Kahneman, Nick Bostrom, David Jones, Bill Nye, Vinton Cerf, John von Neumann, Alan Turing, Grace Hopper, Andrew Grove, Ray Kurzweil, Robert Zemeckis, and Bob Gale.