Where Tech Plays Out of Bounds

At present, websites assume that your continued visits imply consent for them to collect data on you. Every site must have a privacy policy detailing what data is collected, used, or even sold to third parties; again, your consent is derived from continued use of the site. This automatic opt-in is a feeble handwave at tackling privacy. Users need more access to, and power over, how and where their data is used, and education on technology and privacy rights is a close second in necessity. Ask the average web user, and they would most likely prefer to opt out of tracking altogether.

The way technological innovation works inside our economy is that startup operators can, at times, operate within the barest legal frameworks. This is part of how Uber has grown at a massive rate since its inception. Because Uber's business model is only tangential to the taxi business, it can legally argue that some laws governing taxis do not apply to it. Further, the patchwork of varied legislation at the state and local level provides cover as it toes the line of legality. This leads to what is known as 'disruption' of an industry. Incumbents like the taxi industry have staked out their claim and seek to defend their revenue generators, while the disruptors claim their service or product exists to satisfy a unique need of a market. In Uber's case, serving that unique market, which happens to include traditional taxi seekers, is essential to the successful operation of the business. In digital advertising, I believe the same 'wink and nod' is taking place. Take the recent example of Verizon, fined $1.35 million for its use of supercookies.

You can think of a supercookie as similar to a regular web cookie in that it helps store your personalized web experience. However, imagine you're hungry for a cookie: you go to the store, pick up a pre-packaged cookie, and begin to examine its cookie goodness. Upon inspecting the packaging, you find that the package lists intimate details about you. Creepy, so you put that package down and pick up another cookie. Same damn thing, creepy details. You do this a few more times, all with the same result. In Verizon's case, this happens because the carrier repeatedly and automatically writes your details onto the cookie package after you touch it, which made these supercookies a pernicious issue that was not simple to get rid of. Quite simply, Verizon slaps a sticker on you no matter where you go. For context, $1.35 million is roughly 1/100,000 of Verizon's approximately $131 billion in revenue for 2015.
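The sticker metaphor maps onto a concrete mechanism. Verizon's supercookie was an identifier header (X-UIDH) that the carrier's network injected into subscribers' unencrypted HTTP traffic, so clearing browser cookies did nothing. Here is a minimal sketch of that idea; the hashing scheme, account numbers, and function names are illustrative assumptions, not Verizon's actual implementation:

```python
# Toy sketch of carrier-side header injection. The supercookie never lives
# in your browser: the carrier re-attaches a persistent identifier to every
# unencrypted request, so deleting cookies has no effect.
import hashlib

def subscriber_id(account_number: str) -> str:
    """Derive a stable per-subscriber token (illustrative only)."""
    return hashlib.sha256(account_number.encode()).hexdigest()[:16]

def inject_supercookie(headers: dict, account_number: str) -> dict:
    """What a carrier proxy effectively does to each outgoing HTTP request."""
    tagged = dict(headers)
    # Verizon's real-world header was named X-UIDH.
    tagged["X-UIDH"] = subscriber_id(account_number)
    return tagged

request = {"Host": "example.com", "User-Agent": "Mobile Safari"}
tagged = inject_supercookie(request, account_number="555-0147")

# Clearing cookies and sending a fresh request yields the very same tag:
fresh = inject_supercookie({"Host": "example.com"}, "555-0147")
```

Because the tag is derived from the subscriber account rather than anything the user controls, every advertiser downstream sees the same sticker, request after request.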

If you believe, as I do, that property rights extend to the data you generate on the web, then a marketer who wants to use your data owes you some form of recompense. The software-as-a-service model is a good start, but not every company has something to offer in return for sniffing out your data. If you had valuable real estate along a highway and a marketer wanted to erect a billboard on your land, you wouldn't agree to the advertisement without being compensated. Your personal agency over the data you generate and maintain is exchanged for convenient experiences. In some instances, sharing this data provides public benefit, like Google using geolocation data to determine the presence of a traffic jam.

However, as targeted as marketing can be, here is where it gets worse: all of the same marketer-leveraged signals that you emit on the web are fair game for state actors to track you as well. Edward Snowden revealed to the world that the United States government goes to remarkable lengths to track and catalogue the data generated by its own citizens. And this is supposed to be the land of the free, a representative republic. It's not difficult to imagine how badly this technology could go in the wrong hands. It would be a trivial matter to co-opt these signals as an exercise of power, especially considering reports that the budget of the FBI's Operational Technology Division is between $600 and $800 million; officials refused to confirm the exact amount. The only way to begin reducing this risk is to think of technology as a worldwide platform whose usage has worldwide implications. An example is the FBI's battle with Apple over the encryption of a terrorist's iPhone. For those who did not closely follow the case, the progression went like this:

FBI: “Federal judge says gimme the key to unlock this terrorist’s phone and, by extension, the encryption of nearly every iPhone sold.”

Apple: “Not just no, but hell no. Doing it would create a security flaw usable by any third-rate despot.”

FBI: “As they say, ‘who gives a shit?’ Are you supporting terrorism? We’ll smear you to the public.”

Apple: “Still not going to do it. A substantial chunk of the people and the economy agrees with us; you need to get the fuck out.”

FBI: “Just kidding, we already got the phone unlocked; we paid some hackers a million dollars of taxpayer money… We just wanted access to everyone else’s data. HAHA”

Russian Hackers: “Greetings comrades, security flaws are very useful. Am I right, NSA?”


So, a group of hackers exploited a security flaw in software used by our own NSA to produce a hack of their own. Security organizations like the NSA and FBI tend to collect what are known as 'zero-day exploits': they intentionally attempt to hack popular hardware and software in order to collect vulnerabilities. Oftentimes, when new features and functions are added to software, the change opens a new 'hole' in security. The agencies probe protocols and core software, then catalog the flaws for later exploitation at a time of their choosing, all without revealing them to the company that produces the software. Proponents claim this enables heightened intelligence surveillance, but I don't see it that way. Critics are quick to point out that withholding flaws creates additional exposure for citizens of the web who are already open and primed for tracking and manipulation on completely unprecedented levels.

Over time, our favorite websites and apps began to remember our passwords and serve content tailored to our likes, setting the stage for enhanced ad targeting to develop in parallel. Framed as 'consumer benefits,' which are difficult to argue against, these personal touches are the cornerstones of potential trojan horses, and they are not limited in scope to advertising. They can also facilitate wholesale censorship. For example, at the recently held Decentralized Web Summit, famed digital activist and engineer Brewster Kahle noted, “China can make it impossible for people there to read things, and just a few big service providers are the de facto organizers of your experience. We have the ability to change all that.”

Here’s a scary thought: had the FBI been successful in opening a backdoor to iPhones, that backdoor would be just as accessible to less democratic foreign governments. The reason? If the FBI finds a flaw that grants it access, it has no incentive to let Apple know. Its incentive is to use the flaw toward its own ends, because without notifying Apple, the flaw remains open. It’s like the nicest house on the shadiest block leaving the door wide open; anyone with an interest in entering and a disregard for security can walk through. The internet does not have physical borders. Nationalistic pursuits of unfettered access to people’s data will yield a dystopia faster than you can imagine.

Further down the rabbit hole, some cybersecurity industry players are moving into intra-company surveillance. The firm Stroz Friedberg is cataloging and indexing all employee emails and text messages for its corporate clients. That is not entirely abnormal for our day and age, but where it gets extra creepy is that the firm is attempting to use big-data approaches to predict the intent of employees. By analyzing employees' texts and emails, it picks out words that denote dissatisfaction. Words and phrases like 'leave work early' and 'shitty boss' are flagged and stored. The software aggregates the data in simple yet powerful ways, enabling queries like 'show me the top 10 dissatisfied employees,' followed by examples of how those 10 employees are unhappy with their employer. The risk we face by allowing firms like this to exist is that their technology is easily co-opted for other purposes: 'extremist' and 'activist' are just a search term away for those at the heads of these companies.
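The unnerving part is how little machinery such a query requires. Here is a hedged sketch of keyword-based dissatisfaction scoring; the phrase list, names, and scoring scheme are illustrative assumptions, not the firm's actual method:

```python
# Toy keyword-flagging pipeline: count flagged phrases per employee,
# then answer "show me the top 10 dissatisfied employees."
from collections import Counter

# Illustrative watchlist; swapping in "extremist" or "activist" is trivial.
FLAGGED_PHRASES = ["leave work early", "shitty boss", "hate this job"]

def dissatisfaction_scores(messages):
    """messages: (employee, text) pairs. Returns flagged-phrase hits per employee."""
    scores = Counter()
    for employee, text in messages:
        lowered = text.lower()
        scores[employee] += sum(lowered.count(p) for p in FLAGGED_PHRASES)
    return scores

messages = [
    ("alice", "Going to leave work early again. Shitty boss strikes again."),
    ("bob", "Shipping the release tonight."),
    ("carol", "I hate this job some days."),
]

# The "top 10 dissatisfied employees" query is one library call away.
top10 = dissatisfaction_scores(messages).most_common(10)
```

The point of the sketch is the co-option risk in the final paragraph above: nothing about the pipeline is specific to workplace morale. Change three strings in the watchlist and the same code hunts a different category of person.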

The internet is a scary place right now for privacy, especially when you consider that even Mark Zuckerberg, the CEO of Facebook, takes basic measures to protect his own: he places tape over the webcam and microphone of his laptop. If Zuckerberg is unable to ensure his own privacy, how are the rest of us assured protection? It says a lot about the state of the internet. All of that said, it’s not too late to lay the proper groundwork for upcoming generations. As the early arrivals of this new information age, we still have time to provide the best possible outcome for our children and ourselves. To be clear, I do not believe companies like Facebook, Google, et al. are intentionally trampling people’s privacy rights. In many cases there is a deep-seated desire to do good; in fact, Google’s goal is still to make all the world’s information accessible to everyone. Zuckerberg doesn’t want to control which friends you see and which news stories you read. These companies chase what users ‘like’ and create software that seeks out and serves you similar content so you’ll use the service more, generating revenue in the process.

So how might we address these issues? Some say end-to-end encryption may be one step on the way to a solution. End-to-end encryption simply means that only you and I hold the keys to the messages and files we send each other; not even the service carrying them can read them. What needs to be further addressed is that without strong privacy rights placed into immutable law, our privacy is screwed. It is important at this juncture that we not submit to our bias for ‘steady as she goes.’ That status-quo bias shows up in everyday decisions too. Example: when an app updates its terms and conditions or privacy policy, do you review them? That’s like handing someone a birthday cake lined with lit candles and then asking them to read a 25-page usage agreement before they can blow the candles out.
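The "only we hold the keys" property rests on key agreement: two endpoints derive a shared secret while an eavesdropper sees only public values. Here is a toy Diffie-Hellman sketch of that idea; the tiny parameters are illustrative only, and real end-to-end systems use vetted constructions such as X25519, never numbers this small:

```python
# Toy Diffie-Hellman key agreement (illustration only, NOT secure crypto).
import secrets

P = 0xFFFFFFFB  # a small prime standing in for a real group modulus
G = 5           # generator

# Each endpoint keeps a private key; only the public halves cross the wire.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1

alice_pub = pow(G, alice_priv, P)   # visible to any eavesdropper
bob_pub = pow(G, bob_priv, P)       # visible to any eavesdropper

# Each side combines its own private key with the other's public key,
# arriving at the same secret without ever transmitting it.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared
```

The carrier in the middle sees `alice_pub` and `bob_pub` but never either private key, which is exactly why a service running this kind of scheme cannot read its users' messages even if compelled to try.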

On a personal level, we need to engage in discourse with our friends and family about the issues at play, and the reasons they are in play. It’s a natural inclination of our species to leave the world a better place for our children. Socially we all agree on this idea, though our application can sometimes indicate otherwise. Children are, and always have been in a sense, a stand-in for the rest of humanity. Predicting the future for them is impossible; what we can do now is ensure their rights. That conversation is already taking place in boardrooms across the country. Proof bears out in Mark Zuckerberg’s 2012 letter to shareholders: “By helping people form these connections, we hope to rewire the way people spread and consume information.”

Emoji Time — The Web

Advertising, social media, and privacy are all presently in a state of confused adolescence. Most of the nuances are so brand new that explaining their practical impact to the public becomes a societal challenge of establishing common terminology. Yet for the moment, these three arenas are in a state of unabashed money grabbing, and the emojis reflect this.

Fig 5–5. — “Can’t…”, “…Stop…”, “…Won’t…”, “…Stop.”

(This post is an excerpt from my upcoming book Welcome to Tomorrow, offering insights and practical advice on living in our digital world. If you enjoyed this piece, please follow to stay up to date as more is released!)