File this under "better late than never": before the year closes out, I wanted to post my 2015 DrupalCon Barcelona keynote video and slides. I archive all my DrupalCon keynotes on my site so anyone interested in taking a trip down memory lane, or in studying the evolution of Drupal, can check out all my previous DrupalCon keynotes.
My DrupalCon Barcelona keynote is focused on having a realistic, open and honest conversation about the state of Drupal. In it, I broke down my thoughts on Drupal's market position, development process, and "decoupled Drupal". You can watch the recording of my keynote or download a copy of my slides (PDF, 27 MB).
As user experiences evolve from static pages to application-like experiences, end users' expectations of websites have become increasingly demanding. The Facebook newsfeed, the Gmail inbox, and the Twitter live stream are all compelling examples that form a baseline for the application-like experiences users now take for granted.
Many of Drupal's administrative interfaces, and Drupal sites themselves, could benefit from a similarly seamless, instantaneous user experience. Drupal's user interfaces for content modeling (Field UI), layout management (Panels), and block management could all gain from no page refreshes and instant, in-interface previews. These traits could also enrich the experience of site visitors; Drupal's commenting functionality, for example, could come to resemble the seamless commenting experience found on Facebook.
As Drupal's project lead, I ask myself: how can our community feel enabled and encouraged to start building rich user experiences?
In recent years, the emergence of decoupled architectures built with client-side frameworks and libraries has helped our industry deliver these experiences. Does that mean we have to decouple Drupal's front-facing site experience (for site visitors or end users) and/or the administration experience (for content editors and site builders)? If so, should Drupal decouple the administration layer differently from the front-facing site experience? And by extension, should a client-side framework guide this change?
Here is my current thinking: in the short term, Drupal should work toward a next-generation user experience based on progressive decoupling for both the administration layer and the end-user experience. At the same time, we should enable fully decoupled end-user and administrative experiences to be built on Drupal too. In my view, the best way to achieve this is to formally standardize on a full-featured MV* framework (e.g. Angular, Ember, Backbone, or Knockout) or a component-based view library (e.g. React, Vue, or Riot). In this blog post, I want to kick off the discussion and walk you through my current position in some more detail.
There is no doubt that Drupal's administration layer is very application-like and would benefit from a more interactive experience. For the end-user experience, however, a decoupled implementation is not always the best option: some sites may not need or want the application-like interactivity that a client-side framework provides. For instance, news sites and blogs need little interactivity, custom applications need a great deal, and e-commerce sites lie somewhere in the middle. It's not clear-cut, so let's look at our options in more detail.
| Approach | User experience | Developer experience | Ease of implementation |
| --- | --- | --- | --- |
| Traditional (no decoupling) | No immediate feedback | Theme layer preserved; modules mostly in PHP; ad-hoc client-side code | No server-side change; no client-side change |
| Progressive decoupling | No page refreshes | Theme layer preserved; unified client-side code | Some server-side change; some client-side change |
| Full decoupling | No page refreshes | Theme layer replaced; unified client-side code | Much server-side change; much client-side change |
Drupal's administration layer (content editor and site builder experience) is effectively an application. Fully decoupling may be the appropriate call to achieve the best possible user experience for creating, managing and presenting content. However, rewriting the administration layer from the ground up is a monumental task, especially since its modules provide powerful interfaces that allow site builders to build robust, complex sites without a line of code.
The same expectations for application-like interactivity often hold for the end-user experience: users expect shopping carts, comments, notifications, and searches to be updated instantaneously, without having to wait for a page to load.
For both the administration layer and the end-user experience, I believe the Drupal front end should not be fully decoupled out of the box. We should advance from our traditional paradigm and default to progressive decoupling, which allows us to achieve the user experience we want without significant downsides, since not every use case benefits from full decoupling. Through progressive decoupling, Drupal could reach the ideals of the assembled web more quickly by preserving a tight link between Drupal modules and their front-end components.
Nonetheless, we should empower people building fully decoupled sites and applications. Depending on the use case, Drupal 8 is already a good match for decoupled applications, but we should improve and extend Drupal's REST API, enhance contributed modules such as Services, and shore up new features such as GraphQL (demo video) so more functionality can be decoupled. Front-end developers can then use any framework of their choice, whether it is Angular, Ember, React, or something else, to build a fully decoupled administrative application.
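To make "improve and extend Drupal's REST API" concrete: Drupal 8's core REST module already exposes entities as JSON when a request appends `?_format=json`. Here is a minimal sketch of the URL a decoupled client would fetch, assuming a Drupal 8 site with the REST module enabled (the base URL and helper name are placeholders for illustration, not part of any Drupal API):

```python
from urllib.parse import urlencode, urljoin

def node_json_url(base_url, nid):
    """Build the URL a decoupled client fetches to get a node as JSON,
    using Drupal 8 core REST's ?_format=json query parameter."""
    return urljoin(base_url, f"/node/{nid}") + "?" + urlencode({"_format": "json"})

# A fully decoupled Angular, Ember, or React front end would GET this URL
# and render the JSON response entirely client-side.
url = node_json_url("https://example.com", 1)
# → "https://example.com/node/1?_format=json"
```

The same pattern extends to any entity type Drupal exposes; the framework on the other end of the request is entirely the front-end developer's choice.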
All things considered, I do believe Drupal should standardize on a single client-side framework, but it should only make such an explicit recommendation for progressively decoupled Drupal, not fully decoupled architectures. It would result in a more seamless user experience, better compatibility across interactive components in modules, maximum code reuse, a more consistent front-end developer experience, more maintainable code, and better performance as we don't have to load multiple frameworks.
Despite the potential benefits, there are also good reasons not to embrace a single client-side framework. New front-end frameworks are being created at what feels like an unsustainable pace; every nine months there is a new kid on the block. It's hard for a project as large as Drupal to embrace a single technology when there is no guarantee of its longevity.
For instance, Backbone, with its underlying library Underscore, currently powers interactions in the toolbar and in-place editing in Drupal 8. Though Drupal could expand the scope of Backbone in core and encourage front-end developers to build with it, it means buying even further into a framework that is quite old among its peers.
To deal with the fast-evolving nature of the front-end landscape, we need to be thoughtful about which framework we choose, to reassess our choice from time to time, and to make sure we can migrate fairly easily if we so decide.
Assuming we agree that embracing a single client-side framework makes sense for Drupal, there are actually three additional questions: what framework to standardize on, how to do it, and when to decide.
I'm the first to admit I'm not the best person to answer the first question. As I'm not a front-end developer, I'm looking at you, the Drupal community, to help answer this question. I'd hope that the chosen framework aligns well with both our architecture (modular, flexible) and community (collaborative, community-driven).
The second question, how to standardize on a framework, I can help answer. On one extreme, Drupal could be opinionated and ship a client-side framework with core, meaning every installation of Drupal includes the chosen framework; this would be much like the adoptions of jQuery and Backbone.js. On the other end of the spectrum, Drupal could recommend a specific framework but not ship with it. Somewhere in between, Drupal could provide a default framework in core but make it easy to replace it with a framework of the developer's choice, though in practice few would likely swap it out. This is akin to Drupal core shipping with a template engine (i.e. PHPTemplate) that could be, but rarely was, replaced with another. Ultimately, I think we get the best result if Drupal ships with a specific framework, much like the adoption of jQuery in Drupal 5.
The last question, when to standardize on a framework, is important too. I would recommend we experiment with possible directions as soon as possible in order to decide on a final vision sooner rather than later.
I believe that, for now, it makes more sense to progressively decouple Drupal sites and their administration layer by first building our pages with Drupal. Once the page is loaded, we can let a unified client-side framework take over as much of the page as needed to foster a next-generation user experience without reinventing the wheel or alienating developers.
That is not all! We will need module developers to bring rich interactions to their user interfaces with the help of the framework we choose. We will need designers to guide module developers in building a graceful user interface. We will need front-end developers to demonstrate how they want to develop the user experiences that will define Drupal for years to come. We will need users to test and validate all of our work. It will be tough going, but together, we can ensure Drupal stays ahead of the curve well into the future.
Volkswagen's recent emissions scandal highlighted the power that algorithms wield over our everyday lives. As technology advances and more everyday objects are driven almost entirely by software, it's become clear that we need a better way to catch cheating software and keep people safe.
A solution could be to model regulation of the software industry after the US Food and Drug Administration's oversight of the food and drug industry. The parallels are closer than you might think.
When Volkswagen was exposed for programming its emissions-control software to fool environmental regulators, many people called for more transparency and oversight over the technology.
One option discussed by the software community was to open-source the code behind these testing algorithms. This would be a welcome step forward, as it would let people audit the source code and see how the code is changed over time. But this step alone would not solve the problem of cheating software. After all, there is no guarantee that Volkswagen would actually use the unmodified open-sourced code.
Open-sourcing code would also fail to address other potential dangers. Politico reported earlier this year that Google's algorithms could influence the outcomes of presidential elections, since some candidates could be featured more prominently in its search results.
Research by the American Institute for Behavioral Research and Technology has also shown that Google search results could shift voting preferences by 20% or more (up to 80% in certain demographic groups), which could be enough to flip the outcome of close elections worldwide. But since Google's private algorithm is a core part of its competitive advantage, open-sourcing it is not likely to be an option.
The same problem applies to the algorithms used in DNA testing, breathalyzer tests and facial recognition software. Many defense attorneys have requested access to the source code for these tools to verify the algorithms' accuracy. But in many cases, these requests are denied, since the companies that produce the proprietary criminal justice algorithms fear a threat to their businesses' bottom line. Yet clearly we need some way to ensure the accuracy of software that could put people behind bars.
So how exactly could software take a regulatory page from the FDA in the United States? Before the 20th century, the government made several attempts to regulate food and medicine, but abuse within the system was still rampant. Food contamination caused widespread illness and death, particularly within the meatpacking industry.
Meanwhile, the rise of new medicines and vaccines promised to eradicate diseases, including smallpox. But for every innovation, there seemed to be an equal amount of exploitation by companies making false medical claims or failing to disclose ingredients. By the early 1900s, the reporting of journalists like Upton Sinclair made it abundantly clear that the government needed to intervene to protect people and establish quality standards.
In 1906, President Theodore Roosevelt signed the Food and Drug Act into law, which prevented false advertising claims, set sanitation standards, and served as a watchdog for companies that could cause harm to consumers' welfare. These first rules and regulations served as a foundation for our modern-day FDA, which is critical to ensuring that products are safe for consumers.
The FDA could be a good baseline model for software regulation in the US and in countries around the world that have parallel organizations, including the European Medicines Agency, Health Canada, and the China Food and Drug Administration.
Just as the FDA ensures that major pharmaceutical companies aren't lying about the claims they make for drugs, there should be a similar regulator for software to ensure that car companies are not cheating customers and destroying the environment in the process. And just as companies need to disclose food ingredients to prevent people from ingesting poison, companies like Google should be required to provide some level of guarantee that they won't intentionally manipulate search results that could shape public opinion.
It's still relatively early days when it comes to discovering the true impact of algorithms in consumers' lives. But we should establish standards to prevent abuse sooner rather than later. With technology already affecting society on a large scale, we need to address emerging ethical issues head-on.
(I originally wrote this blog post as a guest article for Quartz.)
One thing that is exciting to me is how much we appear to have gotten right in Drupal 8. The other day, for example, I stumbled upon a recent article from the LinkedIn engineering team describing how they completely changed how their homepage is built. Their primary engineering objective was to deliver the fastest page load time possible, and one of the crucial ingredients to achieving that was Facebook's BigPipe.
When a very high-profile, very high-traffic, highly personalized site like LinkedIn uses the same technique as Drupal 8, that solidifies my belief in Drupal 8.
LinkedIn supports both server-side and client-side rendering. While Drupal 8 does server-side rendering, we're still missing explicit support for client-side rendering. The advantage of client-side rendering versus server-side rendering is debatable. I've touched upon it in my blog post on progressive decoupling, but I'll address the topic of client-side rendering in a future blog post.
However, there is also something LinkedIn could learn from Drupal! Every component of a LinkedIn page that should be delivered via BigPipe needs its own BigPipe-specific code, which is error-prone and requires every engineer to be familiar with BigPipe. Drupal 8, on the other hand, has a level of abstraction that allows BigPipe to work without any BigPipe-specific code. Thanks to Drupal's higher-level API, module developers don't have to understand BigPipe at all: Drupal 8 knows which page components are poorly cacheable or not cacheable, and which page components are renderable in isolation, and uses that information to automatically optimize the delivery of page components using BigPipe.
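The delivery model described above can be illustrated with a toy sketch (this is not Drupal's or Facebook's actual code, just the shape of the technique): the server flushes the cacheable page skeleton immediately, leaving placeholders for the slow, personalized components, then streams a replacement chunk for each component as it finishes rendering.

```python
def bigpipe_stream(skeleton, slow_components):
    """Yield the cacheable page skeleton first, then one replacement
    chunk per placeholder as each slow component finishes rendering."""
    yield skeleton  # the browser can start rendering right away
    for placeholder_id, render in slow_components.items():
        # In real BigPipe these chunks are flushed over the same HTTP response.
        yield f'<script>replace("{placeholder_id}", "{render()}");</script>'

chunks = list(bigpipe_stream(
    '<html><div id="cart"></div><div id="feed"></div></html>',
    {"cart": lambda: "3 items", "feed": lambda: "12 new updates"},
))
```

The point of the Drupal 8 comparison: module developers never write the placeholder and streaming logic themselves; the render pipeline derives it automatically from each component's cacheability metadata.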
It is exciting to see Drupal support advanced techniques that were previously only within reach of the world's top 50 most-visited sites! Drupal's BigPipe support will benefit websites small and large.
Building Drupal 8 with all of you has been a wild ride. I thought it would be fun to take a little end-of-week look back at some of our community's biggest milestones through Twitter. If you can think of other important Tweets, please share them in the comments, and I'll update the post.
The secret sauce of #drupal isn't code or features or market share, important though they are. The secret sauce is community.
— Sean Yo (@seanyo) March 10, 2011
Drupal 8.0.0 beta 1 released! https://t.co/FwdmRYaZUx Ahh the power of COMMUNITY driven software! :-)
— Doug Vann (@dougvann) October 1, 2014
Drupal 8.0.x-rc1 release window is today. Good sign of real stability is major issue count going down for 6+ weeks. pic.twitter.com/5VnHGmL9zb
— catch (@catch56) October 7, 2015
Working on D8 Criticals at the Ghent DA critical sprint, this is how the "My issues" page looks for me right now! pic.twitter.com/y5SnavVtND
— Sascha Grossenbacher (@berdir) December 13, 2014
And.... there we go! http://t.co/ed6XtMIs MOTHER BLEEPING VIEWS IN MOTHER BLEEPING CORE!
— webchick (@webchick) October 22, 2012
With Content + Config Translation in core D8 core is more translatable than D7 with all of contrib. #drupal
— Tobias Stöckler (@tstoeckler) November 18, 2013
Amazing to see Drupal 8's multilingual capabilities explained on the multilingual release page (for example Farsi): pic.twitter.com/9owVE3xABo
— Gábor Hojtsy (@gaborhojtsy) November 19, 2015
Kudos to the 3000+ contributors and to the entire Drupal community that helped make this happen. https://t.co/FtATRtSmCU
— Leslie Glynn (@leslieglynn) October 7, 2015
We just released Drupal 8.0.0! Today really marks the beginning of a new era for Drupal. Over the course of almost five years, we've brought the work of more than 3,000 contributors together to make something that is more flexible, more innovative, easier to use, and more scalable.
Drupal 8 has been a big transformation for our community. This particular reboot has taken one-third of Drupal's lifespan to complete. In the process we've learned that reinvention doesn't come easily or quickly. There are huge market forces happening around us, and we can't exactly look away. Mobile is moving our society to near-universal, global internet access. Most companies have begun to transform themselves digitally, leaving established business models and old business processes in the dust. Digital experience builders are turning to platforms that give them greater flexibility, better usability, better integrations, and faster innovation. The pace of change in the digital world has become dizzying. If we were to ignore these market forces, Drupal would be caught flat-footed and quickly become irrelevant.
But we didn't. I'm proud to see that we've responded to these market forces with Drupal 8, and delivered a robust, solid product that can be used to build next-generation websites, web applications and digital experiences. We've implemented a more modern development framework, reimagined the usability and authoring experience, and made technical improvements that will help us build for the multilingual, mobile and highly personalized experiences of the future. From how we model content and get content in and out of the system, to how we build and assemble experiences on various devices, to how we scale to millions and millions of pageviews -- it all got much better with Drupal 8.
I'm personally incredibly proud of this release. Drupal 8 is the result of years of hard work and innovation by thousands of people, with lots of attention to detail at every level. Congratulations to everyone who stepped up to contribute; this was only possible thanks to your persistence and tireless hard work. It took a lot of learning, our best thinking and our best people to create Drupal 8, and I'm very, very proud of what we have accomplished together.
For 15 years, I have believed that Open Source offers significant advantages over proprietary solutions through superior innovation. Today, I believe that more than ever. Drupal 8 is another key milestone that helps us win and do what is best for an open web. Of course, our job is not done, but now is the time to have fun and celebrate this monumental milestone. Tonight, we'll be hosting more than 200 parties around the world! (It's also my 37th birthday today, and the release of Drupal 8 along with all those parties is pretty much the best present ever!)
A couple of weeks ago, the Chief Digital Officer (CDO) of one of the largest mobile telecommunications companies in the world asked me how a large organization such as hers should organize itself to maintain control over costs and risks while still giving its global teams the freedom to innovate.
When it comes to managing their websites and the digital customer experience, they have over 50 different platforms managed by local teams in over 50 countries around the world. Her goal is to improve operational efficiency, improve brand consistency, and set governance by standardizing on a central platform. The challenge is that they have no global IT organization that can force the different teams to re-platform.
When asked if I had any insights from my work with other large global organizations, it occurred to me that the ideal model she is seeking closely mirrors how an Open Source project like Drupal is managed (a subject I have more than a passing interest in).
Teams in different countries around the world often demand full control and decision-making authority over their own web properties and reject centralization. How then might someone in a large organization get the rest of the organization to rally behind a single platform and encourage individual teams and departments to innovate and share their innovations within the organization?
In a large Open Source project such as Drupal, contributions to the project can come from anywhere. On the one extreme there are corporate sponsors who cover the cost of full-time contributors, and on the other extreme there are individuals making substantial contributions from dorm rooms, basements, and cabins in the woods. Open Source's contribution models are incredible at coordinating, accepting, evaluating, and tracking the contributions from a community of contributors distributed around the world. Can that model be applied in the enterprise so contributions can come from every team or individual in the organization?
Reams have been written on how to incubate innovation, how to source it from the wisdom of the crowd, ignite it in the proverbial garage, or buy it from some entrepreneurial upstart. For large organizations like the mobile telecommunications company this CDO works at, innovation is about building, like Open Source, communities of practice where a culture of test-and-learn is encouraged, and sharing -- the essence of Open Source -- is rewarded. Consider the library of modules available to extend Drupal: there can be several contributed solutions for a particular need -- say embedding a carousel of images or adding commerce capability to a site -- all developed independently by different developers, but all available to the community to test, evaluate and implement. It may seem redundant (some would argue inefficient) to have multiple options available for the same task, but the fact that there are multiple solutions means more choices for people building experiences. It's inconceivable for a proprietary software company to fund five different teams to develop five different modules for the same task. They develop one and that is what their customers get. In a global innovation network, teams have the freedom to experiment and share their solutions with their peers -- but only if there is a structure and culture in place that rewards sharing them through a single platform.
Centers of Excellence (CoEs) are familiar models for sharing expertise and building alignment around a digital strategy in a decentralized, global enterprise. Some organizations form multiple CoEs around shared utility functions such as advanced data analytics, search engine optimization, social media monitoring, and content management. CoEs have also grown to include Communities of Practice (CoPs), where various "communities" of people doing similar things for different products, functions, departments, or locations coalesce to share insights and techniques. In companies I've worked with that have standardized on Drupal, I've seen internal Drupal Camps and hackathons pop up much as they do in the Drupal community at large.
My advice to her? Loosen control without losing control.
That may sound like a "have your cake and eat it too" cliché, but the Open Source model grew out of crowd-sourced collaboration, constant and transparent communication, meritocracy, and a governance structure that provides the platform and direction to keep the community pointed at a common goal. So what would my guidance be for getting started?
Drupal and Open Source were created to address a need, and from their small beginnings grew something large and powerful. It is a model any business can replicate within their organization. So take a page out of the Open Source playbook: innovate, collaborate and share. Governance and innovation can coexist, but for that to happen, you have to give up a measure of control and start to think outside the box.
I don't usually write about investing, but it is something I enjoy, so I decided to take the plunge by sharing some thoughts on MasterCard, one of my favorite companies. Although it's not an obvious technology disruptor, MasterCard is a successful technology company; when it comes to investing, boring can be beautiful. MasterCard is also relevant in the context of this blog, where I often write about the digitization of the world.
An investor who invested $10,000 in MasterCard around the time of its IPO in 2006 would have seen that investment grow to $200,000 today. MasterCard has significantly outperformed the S&P 500, since a $10,000 investment in the S&P 500 would have returned less than $20,000 over the same 9.5 year time frame. I was not fortunate enough to buy at the IPO; I only got into MasterCard 18 months ago.
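Those two figures imply very different compound annual growth rates. A quick back-of-the-envelope calculation, using the round numbers from the paragraph above over the same 9.5-year window:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by growing `start` into `end`."""
    return (end / start) ** (1 / years) - 1

mastercard = cagr(10_000, 200_000, 9.5)  # roughly 37% per year
sp500 = cagr(10_000, 20_000, 9.5)        # roughly 7.6% per year
```

A 20x return works out to roughly 37% compounded annually, versus under 8% for the index: the gap compounds into the tenfold difference in ending value.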
MasterCard, along with its rival Visa, has one of the most lucrative business models I've seen. MasterCard and Visa enjoy a virtual duopoly in payment transaction processing. Unlike other credit card companies like American Express, MasterCard and Visa don't assume any of the credit risk; the customer's bank takes on the risk of its customer not being able to pay the bill, and either the merchant or their bank takes the risk for charges that are fraudulent or unrecoverable. What makes MasterCard and Visa so lucrative is that they simply act as "digital tollbooths" that take a small interchange or "swipe fee" on every credit or debit card transaction that goes through their network, without assuming any of the risk.
When you pay $100 with your MasterCard, MasterCard takes about $2.60 in interchange fees and the retailer collects the remaining $97.40. MasterCard has a net profit margin of an astounding 52%, so of that $2.60, MasterCard gets to keep about $1.35. Now consider that MasterCard processes many billions of credit card "swipes" each year, and you start to see the beauty of their business model. Because MasterCard has minimal capital expenses, it is able to generate enormous free cash flow and maintain a pristine balance sheet with virtually no debt. It can then invest the retained profits in new technology, advertising, share buybacks, dividends, and so on.
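The arithmetic in the paragraph above, written out with the post's ~2.6% fee and 52% net margin as the only inputs:

```python
payment = 100.00
interchange_rate = 0.026  # ~2.6% "swipe fee" on the transaction
net_margin = 0.52         # MasterCard's net profit margin

fee = payment * interchange_rate  # $2.60 collected by MasterCard
profit = fee * net_margin         # ≈ $1.35 kept as net profit
merchant_keeps = payment - fee    # $97.40 to the retailer
```

Scaled across billions of swipes, a $1.35 profit on every $100 of spending, with essentially no credit risk, is what makes the "digital tollbooth" model so lucrative.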
As someone living in the United States, I take my credit cards for granted and use them to pay for almost everything: at the grocery store, at Starbucks, for my utility bills, for train and plane tickets. I almost never use cash.
But that is far from the norm; MasterCard cites global credit card penetration of just 15%. Cash usage has declined to 59.4% in developed markets, while it is still 92.7% in emerging markets. This means that MasterCard is likely to have years of growth ahead, as 85% of global transactions are still cash-based. For example, at present, the Chinese market is dominated by state-backed UnionPay, but China recently opened its domestic transactions to foreign companies like MasterCard. The company claims it is already seeing double-digit annual growth in cross-border credit card transaction volume in China, primarily fueled by e-commerce. Beyond China, the e-commerce market is growing 25% year-over-year globally, opening up even more opportunity. All things considered, I believe MasterCard is poised to continue to see tremendous revenue growth. In addition, MasterCard continues to buy back stock (3-5% of the float per year), which further adds to its earnings-per-share growth.
Technology disruption seems like the biggest risk to MasterCard. While MasterCard and Visa currently play prominent roles in both Apple and Google's digital wallet as the processing "middlemen", that could change. If Apple or Google creates a more secure payment infrastructure, there might be no need for a MasterCard or Visa. Furthermore, technologies like the Blockchain could render companies like MasterCard and other middlemen in the payments value chain obsolete. Merchants are more likely to adopt new technologies if they get some sort of benefit in the form of reduced interchange fees or risk. What better way to reduce fees than cutting out the middleman?
While emerging markets do represent the largest areas for growth for a company like MasterCard, in some countries, it will be extremely difficult to set up the same level of banking infrastructure that the US or Europe has. That is why we're seeing mobile payment technologies like M-PESA take off in Kenya, enabling the easy transfer of cash over an alternative to credit card rails. There is a chance that technologies like M-PESA could leapfrog traditional credit card infrastructure entirely.
There are also some big legal and regulatory risks. Since Visa and MasterCard operate a near-duopoly, they have a lot of government eyes watching them on behalf of merchants. For example, in 2010, the US passed the Durbin Amendment, which forced Visa and MasterCard to lower interchange fees on debit card transactions. Both Visa and MasterCard are also being investigated for price-fixing and possible collusion in a nearly $6 billion settlement lawsuit with merchants. Each of these legal and regulatory hurdles could deal a significant hit to MasterCard's bottom line.
Despite these risks, MasterCard isn't going anywhere anytime soon. The strong growth drivers, the relative lack of immediate competitive threats, and the profitable business model make me believe that MasterCard will keep outperforming the market. There are a few things to dislike about MasterCard: at 0.65% the dividend is low, and at 30 the price-to-earnings ratio is high. That premium valuation makes MasterCard somewhat risky, as richly valued stocks are more vulnerable to steep corrections. I think MasterCard is a buy-and-hold, as long as you buy into it at the right price point.
Disclaimer: I'm long MasterCard. Before making an investment in any of the companies mentioned, you should do your own proper due diligence. Any material in this article should be considered general information, and not a formal investment recommendation.
The web has done wonders to make government more accessible to its citizens. Take the State of New York; NY.gov is a perfect example of taking a people-centric approach to digital government. The site lets people easily access state government news and services, including real-time public transit updates, employment resources, healthcare information, and more.
One year ago, the State of New York redesigned and relaunched its website on Drupal in partnership with Code and Theory and Acquia. Today, one year later, the site has nearly tripled its audience to more than 6 million users. The most-visited part of the site is the services section, which aligns well with the governor's priority to provide better customer service to state residents. Annual pageviews have quadrupled since launch to a record high of more than 17 million, and mobile usage has increased 275 percent.
For more details, check out the press release that the State of New York published today.
The Industrial Revolution, which started in the middle of the 18th century, transformed the world. It marked a major turning point in history, one that would influence almost every aspect of daily life. The Industrial Revolution meant the shift from handmade to machine-made products, and it increased productivity and capacity. Technological change also enabled the growth of capitalism: factory owners and others who controlled the means of production rapidly became very rich, while working conditions in the factories were often less than satisfactory. It wasn't until the 20th century, 150 years after its beginning, that the Industrial Revolution ended, having created a much higher standard of living than had ever been known in the pre-industrial world. Consumers benefited from falling prices for clothing and household goods. The impact on natural resources, public health, energy, medicine, housing and sanitation meant that chronic hunger, famines and malnutrition started to disappear, and life expectancy started to increase dramatically.
An undesired side effect of the Industrial Revolution was that machines took the place of the artisans who had produced hand-made items. Before the Industrial Revolution, custom-made goods and services were the norm; the one-on-one relationships that guilds had with their customers sadly got lost in an era of mass production. But what excites me about the world today is that we're on the verge of being able to bring back one-on-one relationships with our customers, while maintaining increased productivity and capacity.
As the Big Reverse of the Web plays out and information and services start to come to us, we'll see the rise of a new trend I call "B2One". We're starting to hear a lot of buzz around personalization, as evidenced by companies like The New York Times making delivery of personalized content a core part of their business strategy. Another recent example is Facebook testing shopping concepts, letting users browse a personal feed of clothing and other items based on their "likes". I'd imagine these types of feeds could get smarter and smarter, refining themselves over time as a user browses or buys. And just yesterday, Facebook launched Notify, an iOS app that pushes you personalized notifications from up to 70 sites.
These recent examples are early signs of how we're evolving from B2C to B2One (or from B2B2C to B2B2One), a world where all companies have a one-on-one relationship with their customers and personalized experiences will become the norm. Advances in technology allow us to get back what we lost hundreds of years ago in the Industrial Revolution, which in turn enables the world to innovate on business models. The B2One paradigm will be a very dramatic shift that disrupts existing business models (advertising, search engines, online and offline retailers) and every single industry.
For example, an athletic apparel company such as Nike could work sensor technology into its shoes that tells you once you've run a certain number of miles and worn them out. Nike would have enough of a one-on-one relationship with you to push an alert to your smartphone or smartwatch with a "buy" button for new shoes, before you even knew you needed them. This interaction is a win-win for both you and Nike: you don't need to re-enter your sizing and information into a website, and Nike gets a sale directly from you, disrupting both the traditional and online retail supply chain (basically, this is bad news for intermediaries like Amazon, Zappos, clothing malls, Google, etc.).
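The wear-and-reorder flow described above can be sketched in a few lines. Everything here is an illustrative assumption, not a real Nike API: the `Shoe` class, the `WEAR_LIMIT_MILES` threshold, and the shape of the notification payload are all hypothetical.

```python
# Hypothetical sketch of a B2One reorder trigger: a shoe's embedded sensor
# accumulates mileage, and crossing a wear threshold produces a personalized
# push notification with a one-tap "buy" action. All names are illustrative.

WEAR_LIMIT_MILES = 400  # assumed lifespan of a running shoe


class Shoe:
    def __init__(self, model, size):
        self.model = model
        self.size = size      # stored once, so the buyer never re-enters it
        self.miles = 0.0

    def record_run(self, miles):
        """Accumulate mileage reported by the sensor; return an offer once worn out."""
        self.miles += miles
        if self.miles >= WEAR_LIMIT_MILES:
            return self.reorder_offer()
        return None

    def reorder_offer(self):
        """Build the personalized notification payload with a prefilled size."""
        return {
            "message": f"Your {self.model} has logged {self.miles:.0f} miles. "
                       "Time for a new pair?",
            "action": "buy",
            "prefilled_size": self.size,
        }


shoe = Shoe("Pegasus", size=10)
offer = None
for run in [120, 150, 140]:   # monthly mileage reported by the sensor
    offer = offer or shoe.record_run(run)

print(offer["action"])        # the third month crosses the 400-mile threshold
```

The point of the sketch is the inversion of the shopping flow: the producer initiates the transaction from its one-on-one knowledge of the customer, rather than waiting for the customer to visit a store or a search engine.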
I believe strongly in the need for data-driven personalization to create smarter, proactive digital experiences that bring back one-on-one relationships between producers and consumers. We have to dramatically improve how we deliver these personal one-on-one interactions. That means we have to get better at understanding the user's journey and context, matching the right information or service to the user, and making technology disappear into the background.