Video: Can we save the open web?

In March, I gave a presentation at SxSW that asked the audience a question I've been thinking about a lot lately: "Can we save the open web?"

The web is centralizing around a handful of large companies that control what we see, limit creative freedom, and capture a lot of information about us. I worry that we risk losing the serendipity, creativity and decentralization that made the open web great.

[Image: Open web closing]

While there are no easy answers to this question, the presentation started a good discussion about the future of the open web, the role of algorithms in society, and how we might be able to take back control of our personal information.

I'm going to use my blog to continue the conversation about the open web, since it impacts the future of Drupal. I'm including the video and slides (PDF, 76 MB) of my SxSW presentation below, as well as an overview of what I discussed.

Here are the key ideas I covered in my presentation, along with a few questions to discuss in the comments.

Idea 1: An FDA-like organization to provide oversight for algorithms. While an "FDA" in and of itself may not be the ideal solution, algorithms are nearly everywhere in society and are beginning to affect life-or-death decisions. I gave the example of an algorithm in a self-driving car having to decide whether to save the driver or hit a pedestrian crossing the street. There are many other life-or-death examples of how unregulated technology could affect people in the future, and I believe this is an issue we need to start thinking about now. What do you suggest we do to make the use of algorithms fair and trustworthy?

Idea 2: Open standards that will allow for information-sharing across sites and applications. Closed platforms like Facebook and Google are winning because they can deliver a superior user experience driven by massive amounts of data and compute power. For the vast majority of people, ease of use will trump most concerns around privacy and control. I believe we need to create a set of open standards that enable drastically better information-sharing and integration between websites and applications, so independent websites can offer user experiences that meet or exceed those of the large platforms. How can the Drupal community help solve this problem?
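To make this more concrete, here is a minimal sketch (in TypeScript) of what such information-sharing could look like: a small, self-describing profile document that a user exports from one site and imports into another. The `PortableProfile` shape and the `importProfile` function are hypothetical illustrations, not an existing standard.

```typescript
// Hypothetical portable profile document that a user could carry
// from one independent site to another. The shape is illustrative;
// a real standard would be negotiated by the community.
interface PortableProfile {
  id: string;           // a URL the user controls, e.g. their own site
  displayName: string;
  interests: string[];  // data the user chooses to share
  issuedAt: string;     // ISO 8601 timestamp
}

// A receiving site could validate and import the document instead of
// forcing the user to rebuild their profile from scratch.
function importProfile(json: string): PortableProfile | null {
  try {
    const doc = JSON.parse(json) as PortableProfile;
    if (typeof doc.id !== "string" || typeof doc.displayName !== "string") {
      return null; // reject documents that do not match the shared shape
    }
    return doc;
  } catch {
    return null; // not valid JSON
  }
}

// Example: a profile exported by one site and imported by another.
const exported = JSON.stringify({
  id: "https://example.com/alice",
  displayName: "Alice",
  interests: ["drupal", "open web"],
  issuedAt: new Date().toISOString(),
});
console.log(importProfile(exported));
```

A real standard would also need to cover authentication, consent and versioning; the point of the sketch is simply that a shared, open format lets independent sites exchange the kind of context that closed platforms keep to themselves.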

Idea 3: A personal information broker that allows people more control over their data. In the past, I've written about the idea of a personal information broker that would give people control over how, where and for how long their data is used, across every single interaction on the web. This is no small feat. An audience member asked an interesting question: who will build this personal information broker -- a private company, a government, an NGO, or a non-profit organization? I'm not sure I have the answer, but I am optimistic that we can figure it out. I wish I had the resources to build this myself, as I believe it will be a critical building block for the web. What do you think is the best way forward?
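As a purely illustrative sketch, and assuming a hypothetical broker that stores per-site, time-limited grants, the core data model could be as small as a record of who may use which data, for what purpose, and until when. The `DataGrant` shape and `isUseAllowed` function below are made-up names, not part of any existing system.

```typescript
// Hypothetical record a personal information broker might keep for each
// grant: which site may use which data, for what purpose, and until when.
interface DataGrant {
  site: string;      // e.g. "shop.example.org"
  fields: string[];  // the pieces of personal data covered
  purpose: string;   // why the site may use them
  expiresAt: Date;   // "for how long": after this, the grant is void
}

// The broker answers a single question: is this use allowed right now?
function isUseAllowed(
  grants: DataGrant[],
  site: string,
  field: string,
  now: Date = new Date()
): boolean {
  return grants.some(
    (g) => g.site === site && g.fields.includes(field) && g.expiresAt > now
  );
}

// Example: a one-week grant covering only a shipping address.
const grants: DataGrant[] = [
  {
    site: "shop.example.org",
    fields: ["shipping-address"],
    purpose: "deliver a one-time order",
    expiresAt: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000),
  },
];

console.log(isUseAllowed(grants, "shop.example.org", "shipping-address")); // true
console.log(isUseAllowed(grants, "shop.example.org", "email"));            // false
```

Whoever ends up building it -- a company, a government, an NGO, or a non-profit -- the hard part is not a data model like this but getting every site on the web to honour it.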

Ultimately, we should be building the web that we want to use, and that we want our children to use for decades to come. It's time to start rethinking the foundations, before it's too late. If we can move any of these ideas forward in a meaningful way, they will impact billions of people today and billions more in the future.

Comments

Renee S (not verified):

Re: Ideas #2 and #3: the problem is that robust privacy is in nobody's interest except citizens themselves.

I was chatting with a friend who worked in privacy at one of those companies named above; they have a mandatory privacy-review checkbox before something can ship and STILL they get a lot of pushback from inside the company. This person had a lot to say about how difficult it is to get both marketing and engineering to understand why privacy is important *at all*. They often don't appreciate concepts like how cross-linking information can literally put lives at risk, or how anonymity is related to human rights... these things aren't on the radar of a lot of implementers and decision-makers. This company sees integration partners constantly trying to push the limits of what data they can collect, and the choice is either to help them in a limited way and control the damage, or to see them leave the platform and do their own thing willy-nilly... it's a tough set of calls to have to make.

At best, privacy is seen as inconvenient, and at worst as an active impediment to business. Marketing hates it because it cuts down on the profiling they can do, and engineering hates it because building robust privacy options, meaningful choices, and timed removal of data (real removal, including from backups) into an application makes everything that much harder to build. Regulation and oversight, like what the EU is attempting with its privacy standards, is probably the only way that privacy can be guaranteed to get built into the next generation of the web. But I'm not sure that will ever happen in the countries that need it most: China and the US.

An information broker that allows people to know what data of theirs is used and where (sort of like the social graph, but for privacy and data?) sounds like an incredibly useful and powerful tool, but it might also be a tempting attack surface, and it could even provide a way to build a profile from somebody's "information shadow," the things that are missing (being super tinfoil-hatty here, but... the algorithms. They are smart). Given the NSA's known proclivities, it may be hard to get users to trust something centralized, so possibly a combination would be most fruitful: open standards with an expectation that an organization will implement them via a common interface, or some regulation that forces a minimum level of compliance... as you say, who knows what it would look like? But the need is great.

The ability for systems to build precise predictive models from online (and, more and more with the Internet of Things, offline) behaviour is, frankly, terrifying. At best it leads to self-censorship and second-guessing and dampens social risk-taking, and at worst it can lead to manipulation, a subversion of free will and (not to sound like a paranoiac here, but) democracy itself. It needs to be something that companies actively care about balancing. And if companies don't, then free nations must. But, as noted above re: the US and China, governments also have an interest in accessing personal data: law and tax enforcement come to mind as boosters for backdoors and more invasive and intensive data-collection programs. This is, perhaps, the real "hardest problem" in computer science right now.

April 27, 2016
Pierre NOEL (not verified):

One solution can be Platform Cooperativism: http://www.thenews.coop/100215/news/co-operatives/platform-cooperativis…

Of course, it's not THE solution to the whole problem, but I still think it's a good start.

July 10, 2016