The rise of Drupal in India

Earlier this week I returned from DrupalCon Asia, which took place at IIT Bombay, one of India's premier engineering universities. I wish I could have bottled up all the energy and excitement to take home with me. From dancing on stage, to posing for what felt like a million selfies, to a motorcycle giveaway, this DrupalCon was unlike any I've seen before.

DrupalCon group photo

A little over 1,000 people attended the first DrupalCon in India. For 82% of the attendees, it was their first DrupalCon. There was also much better gender diversity than at other DrupalCons.

The excitement and interest around Drupal has been growing fast since I last visited in 2011. DrupalCamp attendance in both Delhi and Mumbai has exceeded 500 participants. There have also been DrupalCamps held in Hyderabad, Bangalore, Pune, Ahmedabad, Jaipur, Srinagar, Kerala and other areas.

Indian Drupal companies like QED42, Axelerant, Srijan and ValueBound have made meaningful contributions to Drupal 8. The reason? Visibility through the credit system helps them win deals and hire the best talent. ValueBound said it best when I spoke to them: "With our visibility, we no longer have to explain why we are a great place to work and that we are experts in Drupal."

Also present were the large system integrators (Wipro, TATA Consultancy Services, CapGemini, Accenture, MindTree, etc.). TATA Consultancy Services has 400+ Drupalists in India, well ahead of the others, who have between 100 and 200 Drupalists each. Large digital agencies such as Mirum and AKQA also sent people to DrupalCon. They are all expanding their Drupal teams in India to service growing sales from their offices around the world. The biggest challenge across the board? Finding Drupal talent. I was told that TATA Consultancy Services allows many of its developers to contribute back to Drupal, which is why it has been able to hire faster. More evidence that the credit system is working in India.

The government is quickly adopting Drupal. One of many great examples is a portal established by India's central government to promote citizen participation in government affairs; the site reached nearly two million registered users in less than a year. The government's shifting attitude toward open source is a big deal: historically, the Indian government pushed back against open source, in part because large organizations like Microsoft were funding many of the educational programs in India. The tide changed in 2015 when the Indian government announced that open source software should be preferred over proprietary software for all e-government projects. Needless to say, this is great news for Drupal.

Another initiative that stood out was the Drupal Campus Ambassador Program. The aim of this program is to appoint Drupal ambassadors in every university in India to introduce more students to Drupal and help them with their job search. It is early days for the program, but I recommend we pay attention to it, and consider scaling it out globally if successful.

Last but not least there was FOSSEE (Free and Open Source Software for Education), a government-funded program that promotes open source software in academic institutions, along with its sister project, Spoken Tutorial. To date, 2,500 colleges participate in the program and more than 1 million students have been trained on open source software. With the spoken part of their videos translated into 22 local languages, students gain the ability to self-study and further their education outside of the classroom. I was excited to hear that FOSSEE plans to add a Spoken Tutorial series on Drupal to its offerings. There is strong demand for affordable Drupal training and certifications throughout India's technical colleges, so the idea of encouraging millions of Indian students to take a free Drupal course is very exciting. Even if only 1% of them decide to contribute back, it could be a total game changer.

Open source makes a lot of sense for India's thriving tech community. It is difficult to grasp the size of the opportunity for Drupal in India and how fast its adoption has been growing. I have a feeling I will be back in India more than once to help support this growing commitment to Drupal and open source.

A history of JavaScript across the stack

Did you know that JavaScript was created in just 10 days? In May 1995, Brendan Eich wrote the first version of the language while working at Netscape.

For the first 10 years of JavaScript's life, professional programmers denigrated JavaScript because its target audience consisted of "amateurs". That changed in 2004 with the launch of Gmail. Gmail was the first popular web application that really showed off what was possible with client-side JavaScript. Competing e-mail services such as Yahoo! Mail and Hotmail featured extremely slow interfaces that used server-side rendering almost exclusively, with almost every action by the user requiring the server to reload the entire web page. Gmail began to work around these limitations by using XMLHttpRequest for asynchronous data retrieval from the server. Gmail's use of JavaScript caught the attention of developers around the world. Today, Gmail is the classic example of a single-page JavaScript app; it can respond immediately to user interactions and no longer needs to make roundtrips to the server just to render a new page.
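To illustrate the pattern that made Gmail feel so responsive, here is a minimal sketch of an Ajax-style background request. The endpoint and callback names are hypothetical, not Gmail's actual API; the XHR constructor is injectable so the pattern can be exercised outside a browser.

```javascript
// A sketch of the Ajax pattern Gmail popularized: fetch data in the
// background with XMLHttpRequest and update the page in place, without
// a full reload.
function fetchInBackground(url, onData, XHR = XMLHttpRequest) {
  const xhr = new XHR();
  xhr.open('GET', url);
  xhr.onload = () => {
    if (xhr.status === 200) {
      // In a real app, this callback would update the DOM in place.
      onData(JSON.parse(xhr.responseText));
    }
  };
  xhr.send();
}
```

The key shift is that the server returns only data, and the page stays loaded while the response arrives asynchronously.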

A year later, in 2005, Google launched Google Maps, which used the same technology as Gmail to transform online maps into an interactive experience. With Google Maps, Google was also the first large company to offer a JavaScript API for one of its services, allowing developers to integrate Google Maps into their websites.

Google's XMLHttpRequest approach in Gmail and Google Maps ultimately came to be called Ajax (originally "Asynchronous JavaScript and XML"). Ajax described a set of technologies, of which JavaScript was the backbone, used to create web applications where data can be loaded in the background, avoiding the need for full page refreshes. This resulted in a renaissance period of JavaScript usage spearheaded by open source libraries and the communities that formed around them, with libraries such as Prototype, jQuery, Dojo and MooTools. (We added jQuery to Drupal core as early as 2006.)

In 2008, Google launched Chrome with a faster JavaScript engine called V8. The release announcement read: "We also built a more powerful JavaScript engine, V8, to power the next generation of web applications that aren't even possible in today's browsers." At launch, V8 improved JavaScript performance by 10x over Internet Explorer by compiling JavaScript code to native machine code before executing it. This caught my attention because I had recently finished my PhD thesis on the topic of JIT compilation. More importantly, this marked the beginning of different browsers competing on JavaScript performance, which helped drive JavaScript's adoption.

In 2010, Twitter made a move unprecedented in JavaScript's history. For its redesign that year, Twitter began implementing a new architecture in which both the server-side and client-side code were built almost entirely in JavaScript. On the server side, Twitter built an API server that offered a single set of endpoints for its desktop website, its mobile website, its native apps for iPhone, iPad and Android, and every third-party application. As a result, much of the UI rendering and corresponding logic moved to the user's browser: a JavaScript-based client fetches the data from the API server and renders the experience.

Unfortunately, the redesign caused severe performance problems, particularly on mobile devices. Lots of JavaScript had to be downloaded, parsed and executed by the user's browser before anything of substance was visible. The "time to first interaction" was poor. Twitter's new architecture broke new ground by offering a number of advantages over a more traditional approach, but it lacked support for various optimizations available only on the server.

Twitter suffered from these performance problems for almost two years. Finally in 2012, Twitter reversed course by passing more of the rendering back to the server. The revised architecture renders the initial pages on the server, but asynchronously bootstraps a new modular JavaScript application to provide the fully-featured interactive experience their users expect. The user's browser runs no JavaScript at all until after the initial content, rendered on the server, is visible. By using server-side rendering, the client-side JavaScript could be minimized; fewer lines of code meant a smaller payload to send over the wire and less code to parse and execute. This new hybrid architecture reduced Twitter's page load time by 80%!

In 2013, Airbnb was the first to use Node.js to provide isomorphic (also called universal or simply shared) JavaScript. In the Node.js approach, the same framework is identically executed on the server side and client side. On the server side, it provides an initial render of the page, with data provided through Node.js or through REST API calls. On the client side, the framework binds to DOM elements, "rehydrates" the HTML (that is, updates the initial server-side render provided by Node.js), and makes asynchronous REST API calls whenever updated data is needed.

The biggest advantage Airbnb's JavaScript isomorphism had over Twitter's approach is the notion of a completely reusable rendering system. Because the client-side framework is executed the same way on both server and client, rendering becomes much more manageable and debuggable in that the primary distinction between the server-side and client-side renders is not the language or templating system used, but rather what data is provisioned by the server and how.

Universal JavaScript

In a universal JavaScript approach utilizing shared rendering, Node.js executes a framework (in this case Angular), which then renders an initial application state in HTML. This initial state is passed to the client side, which also loads the framework to provide further client-side rendering that is necessary, particularly to “rehydrate” or update the server-side render.
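The shared-rendering idea can be sketched in a few lines of plain JavaScript. All names below are illustrative, not the actual API of Angular or any framework: one template function produces the markup on both sides of the wire, and the server embeds the initial state so the client can rehydrate without an extra round trip.

```javascript
// One template, executed identically on the server and the client.
function renderProfile(user) {
  return `<div class="profile"><h1>${user.name}</h1><p>${user.listings} listings</p></div>`;
}

// Server side: render the initial state and embed it in the page so the
// client can "rehydrate" without refetching the data.
function serverRender(user) {
  return renderProfile(user) +
    `\n<script>window.__INITIAL_STATE__ = ${JSON.stringify(user)}</script>`;
}

// Client side: start from the embedded state, then re-render with the
// very same template whenever a REST call brings back updated data.
function clientRender(initialState, update = {}) {
  return renderProfile({ ...initialState, ...update });
}
```

Because the server and client share `renderProfile`, the only difference between the two renders is which data was available when the function ran.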

From a prototype written in 10 days, JavaScript has grown to be used across the stack by some of the largest websites in the world. Long gone are the days of clunky browser implementations whose APIs changed depending on whether you were using Netscape or Internet Explorer. It took JavaScript 20 years, but it is finally considered an equal partner to traditional, well-established server-side languages.

When traffic skyrockets your site shouldn't go down

This week's Grammy Awards is one of the best examples of the high-traffic event websites that Acquia is so well known for. This marks the fourth time we hosted the Grammys' website. We saw close to 5 million unique visitors requesting nearly 20 million pages on the day of the awards and the day after. From television's Emmys to Super Bowl advertisers' sites, Acquia has earned its reputation for keeping customers' Drupal sites humming during the most crushing peaks of traffic.

These "super spikes" aren't always fun. From the developers building these sites to the producers updating them during the event, nothing compares to the sinking feeling when a site fails just when it is needed the most. During the recent Super Bowl, one half-time performer lost her website (not on Drupal), giving fans the dreaded 503 Service Unavailable error message. According to CMSWire: "Her website was down well past midnight for those who wanted to try and score tickets for her tour, announced just after her halftime show performance". Yet for Bruno Mars' fans, his Acquia-based Drupal site kept rolling even as millions flooded it during the half-time performance.

For the Grammys, we can plan ahead and expand their infrastructure prior to the event. This is easy thanks to Acquia Cloud's elastic platform capacity. Our technical account managers and support teams work with the producers at the Grammys to make sure the right infrastructure and configuration are in place. Specifically, we simulate award-night traffic as best we can, and use load testing to prepare the infrastructure accordingly. If needed, we add server capacity during the event itself. Just prior to the event, Acquia takes a 360-degree look at the site to ensure that all of the stakeholders are aligned, whether internal to Acquia or external at a partner. We have technical staff on site during the event, and remote teams that provide around-the-clock coverage before and after it.

Few people know what goes on behind the scenes during these super spikes, but our biggest source of pride is that our work is often invisible: a job well done means that our customers' best day didn't turn into their worst.

In memoriam: Richard Burford

It is with great sadness that we learned last week that Richard Burford has passed away. This is a tragic loss for his family, for Acquia, the Drupal community, and the broader open source world. Richard was a Sr. Software Engineer at Acquia for three and a half years (I still remember him interviewing with me), and was known as psynaptic in the Drupal community. Richard had been a member of the Drupal community for more than nine years. During that time, he contributed hundreds of patches across multiple projects, started a Drupal user group in his area and helped drive the Drupal community in the UK, where he lived. Richard was a great person, a dedicated and hard-working colleague, a generous contributor to Drupal, and a friend. Richard was 36 years young, with a wife and three children. He was the sole income earner for the family, so a fundraising campaign has been started to help his family during these difficult times; please consider contributing.

Should we prioritize technological advances?

This blog post is co-authored with Devesh Raj, Senior Vice President and Head of Strategy and Planning at Comcast-NBCU. Devesh and I are friends and both Young Global Leaders at the World Economic Forum. In this blog post we share some of our observations and thoughts after attending the World Economic Forum's annual meeting in Davos.

This year's World Economic Forum Annual Meeting in Davos focused on the Fourth Industrial Revolution, a term coined by Klaus Schwab to describe the new generation of technological advances – sensors, robotics, artificial intelligence, 3D printing, precision medicine – coming together to define the next wave of progress.

These new technologies have the potential to transform our lives. Beyond sci-fi like scenarios – such as each of us having our own personal R2-D2, summoning our Batmobile, or colonizing Mars – these advances also have the potential to solve many real-world problems. With more intelligent, automated technology, we could generate renewable energy, address climate change, connect billions of people to the internet, develop affordable housing solutions and cure chronic diseases.

These advances are not far into the future. A recent report on Technology Tipping Points and Societal Impact anticipates many such moments of inflection within our lifetimes – in fact, we may see major advances in transportation, artificial intelligence, and new payment technology as soon as the next decade. Yet, somewhat surprisingly, much of the discussion in Davos last month focused on the negative impacts of these technologies, rather than their positive potential.

The average year that each tipping point is expected to occur ranges from 2018 to 2026 (source: WEF report). The tipping points include:

- Storage for all
- Robots and services
- The Internet of Things
- The wearable internet
- 3D printing and manufacturing
- Implantable technologies
- Big Data for decisions
- Vision as the new interface
- Our digital presence
- Governments and the blockchain
- A supercomputer in your pocket
- Ubiquitous computing
- 3D printing and human health
- The connected home
- 3D printing and consumer products
- AI and white-collar jobs
- The sharing economy
- Driverless cars
- AI and decision-making
- Smart cities

One consistent, fearful theme was the potential for job losses. As automation continues to replace manufacturing or blue-collar jobs, artificial intelligence will subsequently do the same for skilled, white-collar jobs in banking, law or medicine. Estimates of the impact this will have on jobs vary, but many prognostications in Davos suggested a depressive effect on the global economy. While it's true that technological leaps have often eliminated older, human-powered methods of doing things, many in Davos also recognized that advances in technology create new jobs, most of which we can't even dream of today. For example, the invention of the airplane created hundreds of thousands of jobs, from pilots, to stewards, to airport personnel, to international agents and more, not to mention the transformative economic impact of billions of people traveling vast distances in a short span of time.

A second concern at Davos was growing inequality in the world between "digital haves" and "have-nots". This was reflected both as a challenge among nations – developed vs. developing – but also an issue for specific socio-economic groups within individual nations, some of which arguably are still not past the second or third industrial revolution. What does 3D printing or precision medicine do, for example, for rural parts of India and Africa that still don't have reliable electricity, while urban centers in those same countries race towards an era of smart, automated living?

A third common concern (particularly driven by robotics and artificial intelligence) was the "dehumanization" of our lives. There was a case for a renewed emphasis on qualities that make us uniquely human – empathy, sensitivity, creativity and inspiration.

Another issue centered on the ethical and moral challenges of many advances. Some conversations at Davos discussed the dangerous potential of eugenics-like scenarios in medicine, enabled by advances such as CRISPR/Cas9. On the flip side, should machines make decisions that weigh human lives, such as a self-driving car choosing between hitting a pedestrian or sacrificing its passenger?

One could argue some of these concerns are overblown Luddism. But in some ways, it doesn't matter – the march of technological progress is inevitable, as it has always been. Certainly, no one at Davos suggested slowing down the pace of technological advancement. The gist of the discussions was that we should figure out how to avoid, or address, the negative, unintended consequences of these changes.

We believe there is a major challenge with the Fourth Industrial Revolution that didn't get adequate attention in Davos – the issue of prioritization.

To date, the technological innovation that has driven the Fourth Industrial Revolution has been shaped by the commercial prospects of small or large firms in the market. After all, one definition of "innovation" is the commercial application of invention. As an example, investment in alternative-energy R&D fluctuates with oil prices, just as hybrid or electric vehicles become more or less attractive depending on gasoline prices.

What if, instead of being driven solely by commercial returns, we could focus the Fourth Industrial Revolution more directly on the big problems our world faces? What if we could prioritize technological advances that have the most beneficial impact to society?

The world has recently defined its problems very clearly in a set of 17 Sustainable Development Goals (Wikipedia), also known as the Global Goals, that were adopted by all countries last year to "end poverty, protect the planet and ensure prosperity for all". The goals cover poverty, hunger and food security, health, education, energy, and water and sanitation – to name a few. A successor list to the earlier Millennium Development Goals, the Sustainable Development Goals get quite specific.

Take Goal 3 as an example: "Ensure healthy lives and promote well-being for all at all ages". This goal is linked to 12 targets, including these top three:

  • By 2030: reduce the global maternal mortality ratio to less than 70 per 100,000 live births.
  • By 2030: end preventable deaths of newborns and children under 5 years of age, with all countries aiming to reduce neonatal mortality to at least as low as 12 per 1,000 live births and under-5 mortality to at least as low as 25 per 1,000 live births.
  • By 2030: end the epidemics of AIDS, tuberculosis, malaria and neglected tropical diseases and combat hepatitis, water-borne diseases and other communicable diseases.

Of course, technological advancement is not the only solution to all Sustainable Development Goals – there is much more to do – but it is likely one of the major contributors.

As the world thinks through how to harness the Fourth Industrial Revolution, we think it is worth questioning which technologies we should be prioritizing to meet these Sustainable Development Goals. How do we draft policies and create economic incentives to encourage the right types of technology advances? What should governments and the private sector do differently to focus technology on addressing these goals? How do we direct the energy and creativity of millions of entrepreneurs towards improving the state of the world?

The world's innovation system is powerful and has generally worked well. However, it could use a guiding hand to nudge it in a direction that benefits the planet beyond the incentives of commercial returns. Expanding our criteria of importance to include solving areas of global need is not an inherently anti-capitalist idea, but one that would channel capitalism in the best direction for humanity as a whole. That, we hope, is the real agenda initiated by the focus in Davos on the Fourth Industrial Revolution, which the world will seek to address in the coming year.


Dries Buytaert is the original creator and project lead of Drupal and the co-founder and CTO of Acquia. He writes about Drupal, startups, business, photography and building the world we want to exist in.
