
Video: Can we save the open web?

In March, I gave a presentation at SXSW that asked the audience a question I've been thinking about a lot lately: "Can we save the open web?"

The web is centralizing around a handful of large companies that control what we see, limit creative freedom, and capture a lot of information about us. I worry that we risk losing the serendipity, creativity and decentralization that made the open web great.


The open web closing

While there are no easy answers to this question, the presentation started a good discussion about the future of the open web, the role of algorithms in society, and how we might be able to take back control of our personal information.

I'm going to use my blog to continue the conversation about the open web, since it impacts the future of Drupal. I'm including the video and slides (PDF, 76 MB) of my SXSW presentation below, as well as an overview of what I discussed.

Here are the key ideas I discussed in my presentation, along with a few questions to discuss in the comments.

Idea 1: An FDA-like organization to provide oversight for algorithms. While an "FDA" for algorithms may not be the ideal solution in and of itself, algorithms are nearly everywhere in society and are beginning to impact life-or-death decisions. I gave the example of an algorithm for a self-driving car having to decide whether to save the driver or hit a pedestrian crossing the street. There are many other life-or-death examples of how unregulated technology could impact people in the future, and I believe this is an issue we need to begin thinking about now. What do you suggest we do to make the use of algorithms fair and trustworthy?

Idea 2: Open standards that will allow for information-sharing across sites and applications. Closed platforms like Facebook and Google are winning because they're able to deliver a superior user experience driven by massive amounts of data and compute power. For the vast majority of people, ease-of-use will trump most concerns around privacy and control. I believe we need to create a set of open standards that enable drastically better information-sharing and integration between websites and applications so independent websites can offer user experiences that meet or exceed those of the large platforms. How can the Drupal community help solve this problem?
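To make the idea a bit more concrete, here is a rough sketch (in Python, with entirely made-up field names) of what a portable, user-controlled profile document under such a standard might look like. Treat it as food for thought rather than a proposal for an actual format.

```python
import json

# Hypothetical sketch of a portable "profile document" a user could hand to
# any participating site, instead of each platform keeping its own silo.
# The structure and field names are illustrative only, not a real standard.
def build_profile_document(user_id, interests, share_with):
    return {
        "version": "0.1",                # version of the (hypothetical) format
        "subject": user_id,              # an identifier the user controls
        "interests": sorted(interests),  # data the user chooses to expose
        "audience": sorted(share_with),  # sites allowed to read this document
    }

def site_may_read(document, site_domain):
    # A participating site checks whether it is in the allowed audience.
    return site_domain in document["audience"]

if __name__ == "__main__":
    doc = build_profile_document(
        user_id="did:example:alice",
        interests=["running shoes", "drupal"],
        share_with=["shop.example.com", "news.example.org"],
    )
    print(json.dumps(doc, indent=2))
    print(site_may_read(doc, "shop.example.com"))    # True
    print(site_may_read(doc, "tracker.example.net")) # False
```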

Idea 3: A personal information broker that allows people more control over their data. In the past, I've written about the idea of a personal information broker that would give people control over how, where and for how long their data is used, across every single interaction on the web. This is no small feat. An audience member asked an interesting question: who will build this personal information broker -- a private company, a government, an NGO, or a non-profit organization? I'm not sure I have the answer, but I am optimistic that we can figure that out. I wish I had the resources to build this myself, as I believe it will be a critical building block for the web. What do you think is the best way forward?
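As a thought experiment, the broker's core job could boil down to answering one question per request: may this site use this piece of data, for this purpose, right now? The sketch below illustrates that with a hypothetical consent record; the fields and API are invented for illustration and are not an existing specification.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical consent record a personal information broker might store:
# which site gets which data, for what purpose, and until when.
@dataclass
class ConsentGrant:
    site: str             # where the data may be used
    fields: frozenset     # which pieces of personal data are covered
    purpose: str          # how the data may be used
    expires_at: datetime  # for how long the grant is valid

def is_allowed(grant, site, field, purpose, now=None):
    # The broker answers a single question: may this site use this field
    # for this purpose right now?
    now = now or datetime.now(timezone.utc)
    return (grant.site == site
            and field in grant.fields
            and grant.purpose == purpose
            and now < grant.expires_at)

grant = ConsentGrant(
    site="shop.example.com",
    fields=frozenset({"shoe_size", "color_preference"}),
    purpose="personalization",
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
)
print(is_allowed(grant, "shop.example.com", "shoe_size", "personalization"))  # True
print(is_allowed(grant, "ads.example.net", "shoe_size", "personalization"))   # False
```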

Ultimately, we should be building the web that we want to use, and that we want our children to be using for decades to come. It's time to start to rethink the foundations, before it's too late. If we can move any of these ideas forward in a meaningful way, they will impact billions of people, and billions more in the future.

Can we save the open web?

The web felt very different fifteen years ago, when I founded Drupal. Just 7 percent of the world's population had internet access, there were only around 20 million websites, and Google was a small, private company. Facebook, Twitter, and other household tech names were years away from being founded. In those early days, the web felt like a free space that belonged to everyone. No one company dominated as an access point or controlled what users saw. This is what I call the "open web".

But the internet has changed drastically over the last decade. It's become a more closed web. Rather than a decentralized and open landscape, many people today primarily interact with a handful of large platform companies online, such as Google or Facebook. To many users, Facebook and Google aren't part of the internet -- they are the internet.

I worry that some of these platforms will make us lose the original integrity and freedom of the open web. While the closed web has succeeded in ease-of-use and reach, it raises a lot of questions about how much control individuals have over their own experiences. And, as people generate data from more and more devices and interactions, this lack of control could get very personal, very quickly, without anyone's consent. So I've thought through a few potential ideas to bring back the good things about the open web. These ideas are by no means comprehensive; I believe we need to try a variety of approaches before we find one that really works.

Double-edged sword

It's undeniable that companies like Google and Facebook have made the web much easier to use and helped bring billions online. They've provided a forum for people to connect and share information, and they've had a huge impact on human rights and civil liberties. These are all things for which we should applaud them.

But their scale is also concerning. For example, the Chinese messaging service WeChat (somewhat like WhatsApp) recently used its popularity to limit market choice. The company banned access to Uber to drive more business to its own ride-hailing service. Meanwhile, Facebook engineered limited web access in developing economies with its Free Basics service. Touted in India and other emerging markets as a solution to help underserved citizens come online, Free Basics allows users access to only a handful of pre-approved websites (including, of course, Facebook). India recently banned Free Basics and similar services, claiming that these restricted web offerings violated the essential rules of net neutrality.

Algorithmic oversight

Beyond market control, the algorithms powering these platforms can wade into murky waters. According to a recent study from the American Institute for Behavioral Research and Technology, information displayed in Google could shift voting preferences for undecided voters by 20 percent or more -- all without their knowledge. Considering how close the results of many elections can be, this margin is significant. In many ways, Google controls what information people see, and any bias, intentional or not, has a potential impact on society.

In the future, data and algorithms will power even more grave decisions. For example, code will decide whether a self-driving car stops for an oncoming bus or runs into pedestrians.

It's possible that we're reaching the point where we need oversight for consumer-facing algorithms. Perhaps it's time to consider creating an oversight committee. Similar to how the FDA monitors the quality and safety of food and drugs, this regulatory body could audit algorithms. Recently, I spoke at Harvard's Berkman Center for Internet & Society, where attendees also suggested a global "Consumer Reports"-style organization that would "review" the results of different companies' algorithms, giving consumers more choice and transparency.

Gaining control of our personal data

But algorithmic oversight is not enough. Billions of people are using free and convenient services, often without a clear understanding of how and where their data is being used. Many times, this data is shared and exchanged between services, to the point where people don't know what's safe anymore. It's an unfair trade-off.

I believe that consumers should have some level of control over how their data is shared with external sites and services; in fact, they should be able to opt into nearly everything they share if they want to. If a consumer wants to share her shoe size and color preferences with every shopping website, her experience with the web could become more personal, with her consent. Imagine a way to manage how our information is used across the entire web, not just within a single platform. That sort of power in the hands of the people could help the open web gain an edge on the hyper-personalized, easy-to-use "closed" web.

In order for the consumer-based, opt-in data-sharing system described above to work, the entire web needs to unite around a series of common standards. This idea in and of itself is daunting, but information-sharing standards like OAuth have shown us that it can be done. People want the web to be convenient and easy to use. Website creators want to be discovered. We need to find a way to match user preferences and desires with information throughout the open web. I believe that collaboration and open standards could be a great way to decentralize power and control on the web.
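Borrowing OAuth's notion of scopes, the opt-in model above could be as simple as intersecting what a site requests with what a person has chosen to share, with everything else withheld by default. The snippet below is a minimal, hypothetical sketch of that idea; the scope names are made up, and a real standard would have to define them precisely.

```python
# A site asks for a set of data scopes; only the subset the user has
# explicitly opted into is released. Everything else stays private.
def released_scopes(requested, opted_in):
    return set(requested) & set(opted_in)

# Managed once by the user, honored across the whole web (hypothetically).
user_opt_ins = {"shoe_size", "color_preference"}
site_request = {"shoe_size", "color_preference", "location", "purchase_history"}

print(sorted(released_scopes(site_request, user_opt_ins)))
# ['color_preference', 'shoe_size'] -- location and purchase history are withheld
```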

Why does this matter?

The web will only expand into more aspects of our lives. It will continue to change every industry, every company, and every life on the planet. The web we build today will be the foundation for generations to come. It's crucial we get this right. Do we want the experiences of the next billion web users to be defined by open values of transparency and choice, or the siloed and opaque convenience of the walled garden giants dominating today?

I believe we can strike a balance between companies' ability to grow, profit and innovate and consumers' privacy, freedom and choice. Thinking critically and acting now will ensure the web's open future for everyone.

(I originally wrote this blog post as a guest article for The Daily Dot. I also gave a talk yesterday at SXSW on a similar topic, and will share the slides along with a recording of my talk when it becomes available in a couple of weeks.)

White House deepens its commitment to Open Source

Yesterday, the White House announced a plan to deepen its commitment to open source. Under this plan, new, custom-developed government software must be made available for use across other federal agencies, and a portion of all projects must be made open source and shared with the public. This plan will make it much easier to share best practices, collaborate, and save money across different government departments.

However, there are still some questions to address. In good open source style, the White House is inviting developers to comment on this policy. As a Drupal community, we should take advantage of this opportunity and comment on GitHub within the 30-day feedback window.

The White House has a long open source history with Drupal. In October 2009, WhiteHouse.gov relaunched on Drupal and shortly thereafter started to actively contribute back to Drupal -- both firsts in the history of the White House. The White House's contributions to Drupal include the "We the People" petitions platform, which was adopted by other governments and organizations around the world.

This week's policy is big news because it will push open source deeper into the roots of the U.S. government, requiring more government agencies to become active open source contributors. We'll be able to solve problems faster and, together, build better software for citizens across the U.S.

I'm excited to see how this plays out in the coming months!

The rise of Drupal in India

Earlier this week I returned from DrupalCon Asia, which took place at IIT Bombay, one of India's premier engineering universities. I wish I could have bottled up all the energy and excitement to take home with me. From dancing on stage, to posing for what felt like a million selfies, to a motorcycle giveaway, this DrupalCon was unlike any I've seen before.

DrupalCon group photo

A little over 1,000 people attended the first DrupalCon in India. For 82% of the attendees, it was their first DrupalCon. There was also much better gender diversity than at other DrupalCons.

The excitement and interest around Drupal have been growing fast since I last visited in 2011. DrupalCamp attendance in both Delhi and Mumbai has exceeded 500 participants. There have also been DrupalCamps held in Hyderabad, Bangalore, Pune, Ahmedabad, Jaipur, Srinagar, Kerala and other areas.

Indian Drupal companies like QED42, Axelerant, Srijan and ValueBound have made meaningful contributions to Drupal 8. The reason? Visibility on Drupal.org through the credit system helps them win deals and hire the best talent. ValueBound said it best when I spoke to them: "With our visibility on drupal.org, we no longer have to explain why we are a great place to work and that we are experts in Drupal."

Also present were the large system integrators (Wipro, TATA Consultancy Services, Capgemini, Accenture, MindTree, etc.). TATA Consultancy Services has 400+ Drupalists in India, well ahead of the others, who have between 100 and 200 Drupalists each. Large digital agencies such as Mirum and AKQA also sent people to DrupalCon. They are all expanding their Drupal teams in India to service growing sales in their other offices around the world. The biggest challenge across the board? Finding Drupal talent. I was told that TATA Consultancy Services allows many of its developers to contribute back to Drupal, which is why it has been able to hire faster. More evidence that the credit system is working in India.

The government is quickly adopting Drupal. MyGov.in is one of many great examples; this portal was established by India's central government to promote citizen participation in government affairs. The site reached nearly two million registered users in less than a year. The government's shifting attitude toward open source is a big deal; historically, the Indian government pushed back against open source, in part because large organizations like Microsoft funded many of the educational programs in India. The tide changed in 2015, when the Indian government announced that open source software should be preferred over proprietary software for all e-government projects. Needless to say, this is great news for Drupal.

Another initiative that stood out was the Drupal Campus Ambassador Program. The aim of this program is to appoint Drupal ambassadors in every university in India to introduce more students to Drupal and help them with their job search. It is early days for the program, but I recommend we pay attention to it, and consider scaling it out globally if successful.

Last but not least there was FOSSEE (Free and Open Source Software for Education), a government-funded program that promotes open source software in academic institutions, along with its sister project, Spoken Tutorial. To date, 2,500 colleges participate in the program and more than 1 million students have been trained on open source software. With the spoken part of their videos translated into 22 local languages, students gain the ability to self-study and continue their education outside of the classroom. I was excited to hear that FOSSEE plans to add a Spoken Tutorial series on Drupal to its offerings. There is a strong demand for affordable Drupal training and certifications throughout India's technical colleges, so the idea of encouraging millions of Indian students to take a free Drupal course is very exciting -- even if only 1% of them decide to contribute back, this could be a total game changer.

Open source makes a lot of sense for India's thriving tech community. It is difficult to grasp the size of the opportunity for Drupal in India and how fast its adoption has been growing. I have a feeling I will be back in India more than once to help support this growing commitment to Drupal and open source.

Algorithms rule our lives, so who should rule them?

Volkswagen's recent emissions scandal highlighted the power that algorithms wield over our everyday lives. As technology advances and more everyday objects are driven almost entirely by software, it's become clear that we need a better way to catch cheating software and keep people safe.

A solution could be to model regulation of the software industry after the US Food and Drug Administration's oversight of the food and drug industry. The parallels are closer than you might think.

The case for tighter regulation

When Volkswagen was exposed for programming its emissions-control software to fool environmental regulators, many people called for more transparency and oversight over the technology.

One option discussed by the software community was to open-source the code behind these testing algorithms. This would be a welcome step forward, as it would let people audit the source code and see how the code is changed over time. But this step alone would not solve the problem of cheating software. After all, there is no guarantee that Volkswagen would actually use the unmodified open-sourced code.

Open-sourcing code would also fail to address other potential dangers. Politico reported earlier this year that Google's algorithms could influence the outcomes of presidential elections, since some candidates could be featured more prominently in its search results.

Research by the American Institute for Behavioral Research and Technology has also shown that Google search results could shift voting preferences by 20% or more (up to 80% in certain demographic groups). This could potentially flip the outcome of close elections worldwide. But since Google's private algorithm is a core part of its competitive advantage, open-sourcing it is not likely to be an option.

The same problem applies to the algorithms used in DNA testing, breathalyzer tests and facial recognition software. Many defense attorneys have requested access to the source code for these tools to verify the algorithms' accuracy. But in many cases, these requests are denied, since the companies that produce the proprietary criminal justice algorithms fear a threat to their businesses' bottom line. Yet clearly we need some way to ensure the accuracy of software that could put people behind bars.

What we can learn from the FDA

So how exactly could software take a regulatory page from the FDA in the United States? Before the 20th century, the government made several attempts to regulate food and medicine, but abuse within the system was still rampant. Food contamination caused widespread illness and death, particularly within the meatpacking industry.

Meanwhile, the rise of new medicines and vaccines promised to eradicate diseases, including smallpox. But for every innovation, there seemed to be an equal amount of exploitation by companies making false medical claims or failing to disclose ingredients. The reporting of journalists like Upton Sinclair made it abundantly clear by the early 1900s that the government needed to intervene to protect people and establish quality standards.

In 1906, President Theodore Roosevelt signed the Pure Food and Drug Act into law, which prohibited false advertising claims, set sanitation standards, and served as a watchdog for companies that could cause harm to consumers' welfare. These first rules and regulations served as a foundation for our modern-day FDA, which is critical to ensuring that products are safe for consumers.

The FDA could be a good baseline model for software regulation in the US and countries around the world, which have parallel FDA organizations including the European Medicines Agency, Health Canada, and the China Food and Drug Administration.

Just as the FDA ensures that major pharmaceutical companies aren't lying about the claims they make for drugs, there should be a similar regulator for software to ensure that car companies are not cheating customers and destroying the environment in the process. And just as companies need to disclose food ingredients to prevent people from ingesting poison, companies like Google should be required to provide some level of guarantee that they won't intentionally manipulate search results that could shape public opinion.

It's still relatively early days when it comes to discovering the true impact of algorithms in consumers' lives. But we should establish standards to prevent abuse sooner rather than later. With technology already affecting society on a large scale, we need to address emerging ethical issues head-on.

(I originally wrote this blog post as a guest article for Quartz.)
