State of Drupal presentation (September 2015)

File this under "better late than never". Before the year closes out, I wanted to post my 2015 DrupalCon Barcelona keynote video and slides. I archive all my DrupalCon keynotes on my site so anyone who is interested in taking a trip down memory lane, or in studying the evolution of Drupal, can check out all my previous DrupalCon keynotes.

My DrupalCon Barcelona keynote is focused on having a realistic, open and honest conversation about the state of Drupal. In it, I broke down my thoughts on Drupal's market position, development process, and "decoupled Drupal". You can watch the recording of my keynote or download a copy of my slides (PDF, 27 MB).

In addition, here are three related blog posts I wrote about the future of decoupled Drupal, having to pick a JavaScript framework for Drupal, and a proposal for how to evolve our development process, which will allow us to ship new features every 6 months rather than every 4-5 years.

Should we decouple Drupal with a client-side framework?

As user experiences evolve from static pages to application-like experiences, end users' expectations of websites have become increasingly demanding. The Facebook newsfeed, the Gmail inbox, and the Twitter live stream are all compelling examples that form a baseline for the application-like experiences users now take for granted.

Many of Drupal's administrative interfaces and Drupal sites could benefit from a similarly seamless, instantaneous user experience. Drupal's user interfaces for content modeling (Field UI), layout management (Panels), and block management would benefit from no page refreshes, instant previews, and interface previews. These traits could also enrich the experience of site visitors; Drupal's commenting functionality, for example, could gain from these characteristics and more closely resemble the commenting experience found on Facebook.

As Drupal's project lead, I ask myself: how can our community feel enabled and encouraged to start building rich user experiences?

In recent years, the emergence of decoupled architectures with client-side frameworks and libraries has helped our industry build these experiences. Does that mean we have to decouple Drupal's front-facing site experience (for site visitors or end users) and/or the administration experience (for content editors and site builders)? If so, should Drupal decouple the administration layer differently from the front-facing site experience? By extension, should a client-side framework guide this change?

Here is my current thinking: in the short term, Drupal should work toward a next-generation user experience under progressive decoupling for both the administration layer and the end-user experience. At the same time, we should enable fully decoupled end-user and administrative experiences to be built on Drupal too. In my view, the best way to achieve this is to formally standardize on a full-featured MV* framework (e.g. Angular, Ember, Backbone, or Knockout) or a component-based view library (e.g. React, Vue, or Riot). In this blog post, I want to kick off the discussion and walk you through my current position in some more detail.

Should we decouple Drupal itself?


A framework can assume responsibility over very little, such as with Backbone, or a great deal of the stack, including the rendering process.

The question we have to answer is: do we stand to benefit from decoupling Drupal? In this context, decoupled Drupal refers to a separation between the Drupal back end and one or more front ends. In a decoupled architecture, the back end is built in a server-side language like PHP, the front end is built using a client-side framework written in JavaScript, and data is transferred between the two. Decoupling your content management system gives front-end developers greater control over an application or site's rendered markup and user experience, and using a client-side framework gives them better tools to build application-like experiences.
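To make this split concrete, here is a minimal sketch of what a decoupled front end does (the endpoint path, node shape, and function names here are hypothetical, invented for illustration): the client fetches JSON from the back end and generates the markup itself, work that Drupal's theme layer would otherwise do on the server.

```javascript
// Render a node-like object into HTML. In a decoupled architecture,
// markup generation like this moves from Drupal's server-side theme
// layer into client-side code. The object shape is assumed.
function renderArticle(node) {
  return [
    '<article>',
    `  <h2>${node.title}</h2>`,
    `  <p>${node.body}</p>`,
    '</article>',
  ].join('\n');
}

// Hypothetical usage: request data from a JSON endpoint exposed by the
// back end, then hand it to the client-side renderer.
// fetch('/api/articles/1')
//   .then((response) => response.json())
//   .then((node) => {
//     document.querySelector('#content').innerHTML = renderArticle(node);
//   });
```

The key property of this arrangement is that the back end transfers only data, so the front-end developer has full control over the rendered markup.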

There is no doubt that Drupal's administration layer is very application-like and would benefit from a more interactive experience. For the end-user experience, however, a decoupled implementation is not always the best option, since not every site needs application-like interactivity. For instance, news sites or blogs need little interactivity, custom applications need a great deal, and e-commerce sites lie somewhere in the middle. It's not clear-cut, so let's look at our options in more detail.

The first option is to preserve traditional Drupal, the status quo where there is no decoupling. This would leave the Drupal theme layer intact apart from additional JavaScript embedded freely by the developer, but as a result, the onus is on the developer to introduce client-side user experience improvements. The likely outcome is that different modules will start using different client-side frameworks. Such a scenario would steepen Drupal's learning curve, impede collaboration, make Drupal more brittle, and damage performance on pages whose display is influenced by front-end components of multiple modules.

The second option is to work toward full decoupling, in which Drupal is solely a themeless content repository for client-side applications. But fully decoupling both the Drupal administrative interface and the front end would require us to rebuild much of our page building functionality (e.g. site preview, layouts, and block placement) and to abandon many performance benefits of Drupal 8. Moreover, more JavaScript, especially on the server side, would complicate infrastructure requirements and require a large-scale overhaul of server-side code.

The third option is what I call progressive decoupling, where Drupal renders templates on the server to provide an initial application state, and JavaScript then manipulates that state. This means we can have the best of both worlds: unobtrusive JavaScript and all the out-of-the-box performance advantages of Drupal 8 — a short time to first paint (when the user sees the page materialize) via BigPipe as well as interface previews and static fallbacks during the time to first interaction (when the user can perform an action). Plus, while some additional JavaScript would enter the picture, progressive decoupling mitigates the amount of change needed across the stack by using a baseline of existing theme functionality.
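A sketch of the progressive idea (the element IDs and function names are illustrative, not Drupal APIs): the server renders the initial markup and embeds the component's state alongside it, and client-side code adopts that state and updates the component from there without a page refresh.

```javascript
// The server has already rendered the widget and embedded its state,
// e.g. <script type="application/json" id="comments-state">...</script>.
// Client-side code adopts that state instead of rebuilding the page.
function adoptInitialState(json) {
  return JSON.parse(json);
}

// A state transition applied entirely on the client: appending a new
// comment re-renders just this component, not the whole page.
function addComment(state, comment) {
  return { ...state, comments: [...state.comments, comment] };
}

// Hypothetical usage in a browser:
// const el = document.getElementById('comments-state');
// let state = adoptInitialState(el.textContent);
// state = addComment(state, { author: 'Alice', text: 'Nice post!' });
```

Because the server provides a complete initial render, the page is useful before (and even without) JavaScript, which is what preserves Drupal 8's out-of-the-box performance and fallback behavior.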


Here is how the three approaches compare on user experience, developer experience, and ease of implementation:

Traditional coupling
- User experience: page refreshes; no immediate feedback; interrupted workflow
- Developer experience: theme layer preserved; modules mostly in PHP; ad-hoc client-side code
- Ease of implementation: PHP and minimal JavaScript; no server-side change; no client-side change

Progressive decoupling
- User experience: no page refreshes; immediate feedback; uninterrupted workflow
- Developer experience: theme layer preserved; modules in PHP and JavaScript; unified client-side code
- Ease of implementation: both PHP and JavaScript; some server-side change; some client-side change

Full decoupling
- User experience: no page refreshes; immediate feedback; uninterrupted workflow
- Developer experience: theme layer replaced; module rewrites in JavaScript; unified client-side code
- Ease of implementation: primarily JavaScript; much server-side change; much client-side change

How should Drupal decouple its user experience?


This diagram shows some of the possible front-end experiences that could rely on a single Drupal back end. In full decoupling, a custom application built using a client-side framework encompasses the entire front end. In progressive decoupling, JavaScript manipulates an initial state already provided by the theme layer.

Drupal's administration layer (content editor and site builder experience) is effectively an application. Fully decoupling may be the appropriate call to achieve the best possible user experience for creating, managing and presenting content. However, rewriting the administration layer from the ground up is a monumental task, especially since its modules provide powerful interfaces that allow site builders to build robust, complex sites without a line of code.

The same expectations for application-like interactivity often hold for the end-user experience: users expect shopping carts, comments, notifications, and searches to be updated instantaneously, without having to wait for a page to load.

For both the administration layer and the end-user experience, I believe the Drupal front end should not be fully decoupled out of the box. We should advance from our traditional paradigm and default to progressive decoupling. It allows us to achieve the user experience we want without significant downsides, since not every use case would benefit from full decoupling. Through progressive decoupling, Drupal could potentially reach the ideals of the assembled web more quickly by preserving a tight link between Drupal modules and their front-end components.

Nonetheless, we should empower people building fully decoupled sites and applications. Depending on the use case, Drupal 8 is a good match for decoupled applications, but we should improve and extend Drupal's REST API, enhance contributed modules such as Services, and shore up new features such as GraphQL (demo video) so more functionality can be decoupled. Front-end developers can then use any framework of their choice, whether it is Angular, Ember, React, or something else, to build a fully decoupled administrative application.
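For example, with Drupal 8's core REST module enabled, a decoupled client can request a node as JSON by appending `_format=json` to its path. The helpers below are a hedged sketch: the URL pattern follows core REST's conventions, but the exact response shape depends on the site's configured fields, so adjust the field access for your own site.

```javascript
// Build the URL for Drupal 8 core REST's JSON representation of a node.
function nodeUrl(baseUrl, nid) {
  return `${baseUrl}/node/${nid}?_format=json`;
}

// Drupal's REST serializer wraps field values in arrays of objects
// (e.g. title: [{ value: "..." }]); this helper pulls out the first
// value. The shape is assumed from core's default serialization.
function fieldValue(node, field) {
  return node[field] && node[field][0] ? node[field][0].value : null;
}

// Hypothetical usage from any client-side framework:
// fetch(nodeUrl('https://example.com', 1))
//   .then((response) => response.json())
//   .then((node) => console.log(fieldValue(node, 'title')));
```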

Should Drupal standardize on a single framework?

All things considered, I do believe Drupal should standardize on a single client-side framework, but it should only make such an explicit recommendation for progressively decoupled Drupal, not for fully decoupled architectures. Standardizing would result in a more seamless user experience, better compatibility across interactive components in modules, maximum code reuse, a more consistent front-end developer experience, more maintainable code, and better performance, since we would not need to load multiple frameworks.

Despite the potential benefits, there are also good reasons not to embrace a single client-side framework. New front-end frameworks are being created at what feels like an unsustainable pace; every nine months there is a new kid on the block. It's hard for a project as large as Drupal to embrace a single technology when there is no guarantee of its longevity.

For instance, Backbone, with its underlying library Underscore, currently powers interactions in the toolbar and in-place editing in Drupal 8. Though Drupal could expand the scope of Backbone in core and encourage front-end developers to build with it, it means buying even further into a framework that is quite old among its peers.

To deal with the fast-evolving nature of the front-end landscape, we need to be thoughtful about which framework we choose, to reassess our choice from time to time, and to make sure we can migrate fairly easily if we so decide.

What client-side framework should Drupal standardize on?

Assuming we agree that embracing a single client-side framework makes sense for Drupal, there are actually three additional questions: what framework to standardize on, how to do it, and when to decide.

I'm the first to admit I'm not the best person to answer the first question. As I'm not a front-end developer, I'm looking at you, the Drupal community, to help answer this question. I'd hope that the chosen framework aligns well with both our architecture (modular, flexible) and community (collaborative, community-driven).

The second question, how to standardize on a framework, is one I can help answer. On one extreme, Drupal could be opinionated and ship a client-side framework with Drupal core, meaning that every installation of Drupal includes the chosen framework. This would be much like the adoptions of jQuery and Backbone.js. On the other end of the spectrum, Drupal could recommend a specific framework but not ship with it. Finally, somewhere in between, Drupal could provide a default framework in core but make it easy to replace it with a framework of the developer's choice, though in practice few would likely do so. This is akin to Drupal core shipping with a template engine (i.e. PHPTemplate) that could be, but rarely was, replaced with another. Ultimately, I think we get the best result if Drupal ships with a specific framework, much like the adoption of jQuery in Drupal 5.

The last question, when to standardize on a framework, is important too. I would recommend we experiment with possible directions as soon as possible in order to decide on a final vision sooner rather than later.

Conclusion

I believe that, for now, it makes more sense to progressively decouple Drupal sites and their administration layer by first building our pages with Drupal. Once the page is loaded, we can let a unified client-side framework take over as much of the page as needed to foster a next-generation user experience without reinventing the wheel or alienating developers.

Whatever the result of this debate, Drupal needs your help. We will need initiatives to build a foundation for better decoupled experiences. For all decoupled implementations, whether full or progressive, we will want to redouble our efforts on existing web services to create the best APIs for client-side applications. As for progressively decoupled experiences, we will need a process for selecting a client-side framework and for defining how the existing Drupal JavaScript behaviors system would need to evolve. Finally, we will need to coordinate all of these efforts to focus on offering a better administrative experience.

That is not all! We will need module developers to bring rich interactions to their user interfaces with the help of the framework we choose. We will need designers to guide module developers in building a graceful user interface. We will need front-end developers to demonstrate how they want to develop the user experiences that will define Drupal for years to come. We will need users to test and validate all of our work. It will be tough going, but together, we can ensure Drupal stays ahead of the curve well into the future.

Special thanks to Preston So and Wim Leers for contributions to this blog post and to Moshe Weitzman and Gábor Hojtsy for feedback during its writing.

Algorithms rule our lives, so who should rule them?

Volkswagen's recent emissions scandal highlighted the power that algorithms wield over our everyday lives. As technology advances and more everyday objects are driven almost entirely by software, it's become clear that we need a better way to catch cheating software and keep people safe.

A solution could be to model regulation of the software industry after the US Food and Drug Administration's oversight of the food and drug industry. The parallels are closer than you might think.

The case for tighter regulation

When Volkswagen was exposed for programming its emissions-control software to fool environmental regulators, many people called for more transparency and oversight over the technology.

One option discussed by the software community was to open-source the code behind these testing algorithms. This would be a welcome step forward, as it would let people audit the source code and see how the code is changed over time. But this step alone would not solve the problem of cheating software. After all, there is no guarantee that Volkswagen would actually use the unmodified open-sourced code.

Open-sourcing code would also fail to address other potential dangers. Politico reported earlier this year that Google's algorithms could influence the outcomes of presidential elections, since some candidates could be featured more prominently in its search results.

Research by the American Institute for Behavioral Research and Technology has also shown that Google search results could shift voting preferences by 20% or more (up to 80% in certain demographic groups), potentially enough to flip the outcome of close elections worldwide. But since Google's private algorithm is a core part of its competitive advantage, open-sourcing it is not likely to be an option.

The same problem applies to the algorithms used in DNA testing, breathalyzer tests and facial recognition software. Many defense attorneys have requested access to the source code for these tools to verify the algorithms' accuracy. But in many cases, these requests are denied, since the companies that produce the proprietary criminal justice algorithms fear a threat to their businesses' bottom line. Yet clearly we need some way to ensure the accuracy of software that could put people behind bars.

What we can learn from the FDA

So how exactly could software take a regulatory page from the FDA in the United States? Before the 20th century, the government made several attempts to regulate food and medicine, but abuse within the system was still rampant. Food contamination caused widespread illness and death, particularly within the meatpacking industry.

Meanwhile, the rise of new medicines and vaccines promised to eradicate diseases, including smallpox. But for every innovation, there seemed to be an equal measure of exploitation, with companies making false medical claims or failing to disclose ingredients. The reporting of journalists like Upton Sinclair made it abundantly clear by the early 1900s that the government needed to intervene to protect people and establish quality standards.

In 1906, President Theodore Roosevelt signed the Pure Food and Drug Act into law, which prohibited false advertising claims, set sanitation standards, and established a watchdog over companies whose products could harm consumers' welfare. These first rules and regulations served as a foundation for our modern-day FDA, which is critical to ensuring that products are safe for consumers.

The FDA could be a good baseline model for software regulation, both in the US and in countries around the world that have parallel organizations, including the European Medicines Agency, Health Canada, and the China Food and Drug Administration.

Just as the FDA ensures that major pharmaceutical companies aren't lying about the claims they make for drugs, there should be a similar regulator for software to ensure that car companies are not cheating customers and destroying the environment in the process. And just as companies need to disclose food ingredients to prevent people from ingesting poison, companies like Google should be required to provide some level of guarantee that they won't intentionally manipulate search results that could shape public opinion.

It's still relatively early days when it comes to discovering the true impact of algorithms in consumers' lives. But we should establish standards to prevent abuse sooner rather than later. With technology already affecting society on a large scale, we need to address emerging ethical issues head-on.

(I originally wrote this blog post as a guest article for Quartz.)

BigPipe, no longer just for the top 50 websites

One thing that is exciting to me is how much we appear to have gotten right in Drupal 8. The other day, for example, I stumbled upon a recent article from the LinkedIn engineering team describing how they completely changed how their homepage is built. Their primary engineering objective was to deliver the fastest page load time possible, and one of the crucial ingredients to achieve that was Facebook's BigPipe.

I discussed BigPipe on my blog before: first when I wrote about making Drupal 8 fly and later when I wrote about decoupled Drupal. Since then, Drupal 8 shipped with BigPipe support.

When a very high-profile, very high-traffic, highly personalized site like LinkedIn uses the same technique as Drupal 8, that solidifies my belief in Drupal 8.

LinkedIn supports both server-side and client-side rendering. While Drupal 8 does server-side rendering, we're still missing explicit support for client-side rendering. The advantage of client-side rendering versus server-side rendering is debatable. I've touched upon it in my blog post on progressive decoupling, but I'll address the topic of client-side rendering in a future blog post.

However, there is also something LinkedIn could learn from Drupal! At LinkedIn, every component of a page that should be delivered via BigPipe requires BigPipe-specific code, which is prone to errors and requires all engineers to be familiar with BigPipe. Drupal 8, on the other hand, has a level of abstraction that allows BigPipe to work without BigPipe-specific code. Thanks to Drupal's higher-level API, module developers don't have to understand BigPipe: Drupal 8 knows which page components are poorly cacheable or not cacheable at all, and which are renderable in isolation, and uses that information to automatically optimize the delivery of page components using BigPipe.
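To make the technique itself concrete, here is a hedged sketch of the general BigPipe delivery pattern (an illustration only, not Drupal's or LinkedIn's actual implementation; all names are invented): the server immediately flushes the cacheable page skeleton with placeholders, then streams each slow "pagelet" as it finishes, along with a small script that swaps it into place.

```javascript
// The skeleton is flushed to the browser immediately: fast, cacheable
// parts are fully rendered, while slow personalized parts are stand-in
// placeholder elements.
function pageSkeleton(pageletIds) {
  const placeholders = pageletIds
    .map((id) => `<div id="placeholder-${id}"></div>`)
    .join('\n');
  return `<html><body>\n${placeholders}\n`;
}

// As each slow pagelet finishes rendering on the server, a chunk like
// this is streamed down the same response and replaces its placeholder.
function pageletChunk(id, html) {
  return (
    `<script>document.getElementById('placeholder-${id}')` +
    `.innerHTML = ${JSON.stringify(html)};</script>\n`
  );
}

// Hypothetical streaming usage (e.g. with a Node.js http response):
// res.write(pageSkeleton(['cart', 'recommendations']));
// renderCart().then((html) => res.write(pageletChunk('cart', html)));
```

Drupal 8's contribution is deciding automatically, from cacheability metadata, which components become placeholders; in hand-rolled BigPipe implementations that decision is made per component by the developer.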

It is exciting to see Drupal support the advanced techniques that were previously only within reach of the top 50 most visited sites of the world! Drupal's BigPipe support will benefit websites small and large.

Drupal 8 milestones in Tweets

Building Drupal 8 with all of you has been a wild ride. I thought it would be fun to take a little end-of-week look back at some of our community's biggest milestones through Twitter. If you can think of other important Tweets, please share them in the comments, and I'll update the post.

Feeling nostalgic? See every single version of Drupal running!

Here is how we opened the development branch for Drupal 8: live at DrupalCon!

Drupal 8's first beta showed the power of community

We had issues ... but the queue steadily declined

We held sprints around the world: here are just a few

And we created many game-changing features

The founder of PHP said: Drupal 8 + PHP7 = a lot of happy people

We reached the first release candidate and celebrated ... a little

And, just yesterday, we painted the world blue and celebrated Drupal 8 ... a lot!


Dries Buytaert is the original creator and project lead of Drupal and the co-founder and CTO of Acquia. He writes about Drupal, startups, business, photography and building the world we want to exist in.
