Information Anxiety

Have you ever woken up dreading looking at your phone to meet the waves of posts and messages?

It’s something I’ve become familiar with. Information anxiety, they call it. Richard Saul Wurman, author of the book Information Anxiety, says the condition is “produced by the ever-widening gap between what we understand and what we think we should understand. It is the black hole between data and knowledge, and what happens when information doesn’t tell us what we want or need to know.”

Information anxiety saps productivity everywhere. People cannot keep up with the constant influx of information presented to them, and are distressed by trying to reconcile countless sources practically screaming in their ears. We know there is more information out there, and yet we simply cannot absorb it all. We treat it as a shortcoming of being human.

We cannot know everything.



So why do we expect that of ourselves? With approximately 2.5 exabytes, or 2.5 billion gigabytes, of data generated every day, we cannot hope to take it all in. People are under the impression that everyone else is in the loop, and that they alone are the ones who do not understand.
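If the scale of that figure is hard to picture, the unit conversion itself is simple; here is a quick sanity check of the arithmetic (the 2.5-exabyte figure is the one quoted above, not a number I am adding):

```python
# Sanity check: 1 exabyte = 10**18 bytes, 1 gigabyte = 10**9 bytes,
# so 2.5 exabytes really is 2.5 billion gigabytes.
exabytes = 2.5
bytes_total = exabytes * 10**18
gigabytes = bytes_total / 10**9
print(f"{gigabytes:.1e} GB")  # → 2.5e+09 GB
```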

This pursuit is killing meaningful conversation and connection. Drowning in a world of facts, people have disregarded the importance of thoughts. Simple ideas and unfounded theories have gone by the wayside, and facts reign supreme in a society chasing a synthetic ideal of knowing everything.

There are many who feel this way. I myself have struggled to break into social media, let alone big data. And yet the benefits these tools bring are too great to do away with them. I am no psychologist, but I will offer my tips: the tricks that helped me overcome my crippling fear of technology and information, baffling as the entire charade can be.

  1. Set a Technology Bedtime: It’s widely known that looking at a screen late into the night is bad for your health, but absorbing more information and trying to synthesize it in a meaningful way is even more stressful, especially when you’re tired. Pick a time to unplug.
  2. Don’t Bring It When You Go Out: Unless you’re out on business, allow yourself some breathing room.
  3. Set Meetings in Person: Talking to a screen is unsettling. If you’re conversing with someone, set up a meeting in person. You’ll make a better impression, and you can communicate in a more fluid way.

Like I said, these are just some tips that have helped me. If you have more questions regarding information anxiety, I urge you to read this article from Richard Saul Wurman.


The Ethics of Big Data

I posted not too long ago about speaking with Andy Rossmeissl from Faraday about his take on the ethical issues presented by the use and collection of big data. If you didn’t get a chance to check it out, I encourage you to click here!

In the past few weeks, I have read through a very enlightening book called Big Data: A Revolution That Will Transform How We Live, Work, and Think by Kenneth Cukier and Viktor Mayer-Schönberger. The book presents a lot of the major issues with the current approaches to regulating data usage, some that I would like to reiterate and build upon:

  • Informed Consent – Given the nearly unlimited uses of data, and what can be extrapolated from the tiniest scrap, informed consent means very little when consumers don’t understand who can use their data or how.
  • Willingness – Most end-user license agreements give companies a LOT of leeway in how they gather data from you. Take the Facebook emotional-contagion fiasco or the OkCupid experiments, for example: neither breached its contract, because the terms reach beyond conventional understanding.
  • Lack of Data is Data – As described in the aforementioned book, Google Street View allowed German citizens to have their houses blurred on its maps. Those blurred houses, however, were interpreted by some as prime burglary targets.
  • Data vs. Individuality – Data is, by nature, messy. When making predictions meant to benefit the general public, that is fine, but applying data to forecasts in areas such as parole hearings is dangerous: it disregards human agency and takes power away from individuals.
  • Anonymization – It doesn’t work. Scrubbing data of personal information is a small comfort: the New York Times was able to identify an individual person from AOL’s “anonymized” search queries.

If there is so much risk that can come from data, and we as consumers cannot possibly stop the flow of it, how, then, should society proceed?

I will take a moment to quote Ben Parker, uncle of the world-famous Spider-Man: “With great power comes great responsibility.” Data has power, and logic dictates that those who hold it should be held accountable for how they collect and use it.

We have ethics and oversight committees for healthcare professionals, and data should be no different. When used responsibly, it yields great benefits for everyone. Data helped curb the spread of swine flu, makes marketing more affordable and targeted, and allows for greater standardization and collaboration in the medical and technological fields. But the uses need to be monitored so that they are in line with empowering consumers and protecting the fundamental humanity and safety of the individual.

A team of internal and external algorithmists needs to be present to ensure that the system is working properly and safely. Oversight committees must monitor the gathering of data, especially when it happens unbeknownst to consumers, to make sure that any experiments are ethical. And finally, rigorous security precautions need to be employed to ensure that no one person can be singled out by the public. Anonymized data isn’t enough; it is more important to ensure that data is aggregated, even if that makes it messier. That is simply the nature of big data.
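To make “aggregate, don’t just anonymize” concrete, here is a minimal sketch of the idea in Python. The field names, data, and threshold are my own illustration, not anything from the book: publish only group counts, and suppress any group small enough that someone could be singled out.

```python
from collections import Counter

def aggregate_counts(records, key, k=5):
    """Aggregate records into group counts, suppressing any group
    smaller than k so no individual can be singled out."""
    counts = Counter(r[key] for r in records)
    return {group: n for group, n in counts.items() if n >= k}

# Hypothetical example: release ZIP-code counts, never per-person rows.
people = [{"zip": "05401"}] * 7 + [{"zip": "05602"}] * 2
print(aggregate_counts(people, "zip"))  # → {'05401': 7}
```

The lone two-person group is dropped entirely; that loss of detail is the “messiness” the aggregated approach accepts in exchange for safety.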

The Face of Big Data

This week, I had the pleasure of meeting with Andy Rossmeissl, co-founder and leader of the Faraday team. Faraday is a big data company that first broke into the scene by gathering data for the Department of Energy on potential solar panel customers. Since then, the company has spread to helping colleges target potential students and companies target consumers for retirement plans.

“Big data” is a term that conjures up images of drones outside of windows, or data banks filled to the brim with one’s life history. I myself was on the fence about the industry, considering the usefulness and invasiveness. However, upon meeting Andy, I found that the actual face of big data is a friendly one.

Faraday is small, consisting of 15 employees. The meat and potatoes of their work is the development and upkeep of their data program. Having seen it up close, it’s truly fascinating: the entirety of the United States is broken up into hexagons, ranging in color from dark black to bright yellow. The aesthetic is very techno-punk, like something out of a Philip K. Dick novel.

Users can put in interests and demographics of their target market, and search within regions to find out which households are ripe for the picking. The brighter yellow a hexagon, the higher the population that fits a user’s parameters. Following this, users can order a mailing list based on what they’ve narrowed it down to, cutting out the clutter of the “spray and pray” marketing model.
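As a rough sketch of how a search like that might work under the hood (all field names and data here are invented; this is not Faraday’s actual system): count the households in each map cell that match the user’s parameters, and let the count drive the cell’s brightness.

```python
# Toy model of a density map: count households in each map cell that
# match a target profile, then shade cells by the match count.
# All fields and records below are invented for illustration.
homes = [
    {"cell": "A1", "income": 85000, "solar_interest": True},
    {"cell": "A1", "income": 40000, "solar_interest": False},
    {"cell": "B2", "income": 92000, "solar_interest": True},
    {"cell": "B2", "income": 88000, "solar_interest": True},
]

def cell_scores(homes, min_income):
    """Return match counts per cell; higher count = brighter hexagon."""
    scores = {}
    for h in homes:
        if h["income"] >= min_income and h["solar_interest"]:
            scores[h["cell"]] = scores.get(h["cell"], 0) + 1
    return scores

print(cell_scores(homes, 80000))  # → {'A1': 1, 'B2': 2}
```

A mailing list, in this toy model, would just be the cells above some brightness cutoff, which is exactly what makes overly narrow searches a privacy concern.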

What seems concerning is the ability to pinpoint an individual household by its interests, but Andy set my mind at ease. He explained that if a user makes a search too specific, the system flags it as too narrow and asks them to broaden it. Furthermore, he addressed concerns about health privacy, stating that the business ethics in place at Faraday would keep them from selling sensitive health information; his example was handing a list of people suffering from depression to a pharmaceutical company.

And while this is good news, this is not the case with all data companies.

Firms such as Castlight Health and Welltok contract with companies to preemptively determine employees’ health needs. Walmart uses these firms, having them directly contact employees with tips and treatment options for conditions that said employees are estimated to be at risk for.

Castlight Health recently launched a new platform that will actually predict whether a woman is pregnant based on her age, the number of children she has had, whether she recently stopped taking birth control, and fertility information. This, naturally, caused an uproar among consumers (Smith).

So where is the line drawn? Does the ability to so specifically target consumers and deliver information and promotions outweigh the blatant invasion of privacy present in many firms’ operations?

Where I draw the line, in my own ethical thinking, is at the point where information stops being given willingly. Information someone freely shares on the internet, or shopping habits tracked by a retailer, is already fairly public, with no real expectation of privacy. There is little unethical in simply compiling that information and selling it; it’s out there anyway, and these firms just make it easier to find.

However, when health information starts being distributed, information taken from a person’s insurance records, things get unsettling. There is a reason that those who work in healthcare or pharmaceuticals are bound by a confidentiality code. I work in a pharmacy myself and have to abide by HIPAA regulations. When that confidentiality is broken, it can cause shame, embarrassment, and serious safety risks.

There may be a right way to do big data, but what we see (and fear) is the wrong way to do it. I have wondered myself how to enforce ethical standards in the industry, and who gets to draw the line in the sand between what is sacred and what is not. I’ll sum it up with Andy’s answer when he was asked about how to ensure ethical behavior in big data: “Well, let’s just hope the laws catch up.”


Smith, L. J. (2016, February 17). Big data knows if you’re pregnant. Retrieved February 28, 2016, from