First in a series of pieces that examine the hidden links between apparently unrelated ideas.
Scientists at MIT's Laboratory for Information and Decision Systems have been working on a way to determine how far an idea will propagate on Twitter. The system, named Trumor, measures the reach and influence of individual Twitter users, much like the commercially available service Klout. Trumor goes further, though: it builds a list of "superstars" for each topic (e.g., soccer, automobiles, copyright law), users whose comments are especially likely to propagate through the network.
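MIT hasn't published Trumor's internals here, but the core idea of a per-topic "superstar" list can be illustrated with a toy sketch. The function, sample data, and scoring metric below are hypothetical assumptions (scoring users by aggregate retweets within a topic), not the actual Trumor algorithm:

```python
from collections import defaultdict

def topic_superstars(tweets, top_n=3):
    """Rank users by total retweets within each topic.

    `tweets` is a list of (user, topic, retweet_count) tuples.
    Returns {topic: [top users, most-retweeted first]}.
    """
    # Accumulate each user's retweet total per topic.
    reach = defaultdict(lambda: defaultdict(int))
    for user, topic, retweets in tweets:
        reach[topic][user] += retweets
    # Keep only the top_n most-retweeted users in each topic.
    return {
        topic: sorted(users, key=users.get, reverse=True)[:top_n]
        for topic, users in reach.items()
    }

# Hypothetical sample data: (user, topic, retweet count).
sample = [
    ("alice", "soccer", 120), ("bob", "soccer", 45),
    ("carol", "soccer", 200), ("bob", "copyright", 300),
    ("dave", "copyright", 80),
]
print(topic_superstars(sample))
# → {'soccer': ['carol', 'alice', 'bob'], 'copyright': ['bob', 'dave']}
```

A real system would presumably weight follower graphs and cascade depth rather than raw retweet counts, but the output shape, a ranked shortlist per topic, is the part that matters for the argument below.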
The Filter Bubble is the title of a book and the name of an idea developed by online activist Eli Pariser. The basic premise is this: online information is increasingly organized automatically by individual preference, and this algorithmic curation may not be to our benefit. As an example, Pariser compares the Google results pages two of his friends received when searching for "Egypt" during the revolution earlier this year. One friend got mostly news; the other got travel agencies and basic encyclopedia entries, but nothing on the protests. Since human beings rarely like having their ideas challenged, the filter bubble tends to deepen ideological divisions between groups of people.
The Intersection: As people self-organize by ideology, they will either choose to restrict the number of information sources they rely on, or the filter bubble will do it for them. As the bubble closes in, outside information will leak through less and less often, giving online thought leaders enormous influence over those who follow them. (Incidentally, the implications of the word "follow" in the Twitter context will only become creepier.)
As this occurs, the knowledge of who influences whom (will MIT retain this information? will it be made public?) will likely fall into the hands of corporations, political parties, hackers, and activist groups. The game becomes "Who influences the influencers?" as groups try to convince the superstars on their topics of interest to carry their content. This kind of top-down, siloed syndication would be highly vulnerable to subtle subversion and account hijacking.
In sum: As the world outside our senses becomes more important, our reliance on others' perspectives increases. As our filter bubbles close in, our worldviews become less robust and more vulnerable to manipulation via the social media superstars we rely on. Reality-hacking and reality-selling are imminent, but not in a cool cyberpunk way.