Drupal Planet

Subscribe to the Drupal Planet news feed
Drupal.org - aggregated feeds in category Planet Drupal
Updated: 2 hours 54 min ago

Web Wash: Integrate Webform and Google Sheets using Zapier in Drupal 8

Tue, 06/05/2018 - 10:30

Webform allows you to create powerful forms in Drupal without the need for any custom code. You can use it for a basic contact us form with a few fields such as name, phone and email, or it can also be used to create complex multi page forms with conditional fields.

If you want to allow your editors to create their own forms without the need for a developer, then install Webform and teach them how to use the module. If you want to learn more about Webform, we have a two-part series which will help you get started: Getting Started with Webform in Drupal 8 and Moving Forward with Webform in Drupal 8.

Collecting submissions using Webform is easy, but what if you want to integrate the module with a third-party SaaS provider? What if you want to push all contact form submissions into your CRM system, or add a row to a Google Sheets spreadsheet?

Of course, this can be done by a developer through the right APIs but you can also do it without writing any code using a service called Zapier.

In this tutorial, you’ll learn how to send Webform submissions into Zapier, which will then add each submission as a row in a Google Sheets spreadsheet.

Agiledrop.com Blog: AGILEDROP: Drupal and the Internet of Things

Tue, 06/05/2018 - 06:32
Unless you’ve been living under a rock these past few years, you’ve probably heard the term ‘The Internet of Things’. If you’ve always wondered what the Internet of Things is and you know what Drupal is, then you’ve stumbled upon the right place. In this post, I’ll take a brief look at what the Internet of Things is and how Drupal can be used to take advantage of it.   What is the Internet of Things? The Internet of Things, or IoT for short, is the next big technological leap in the networking world. If you take a look at the past few years, the growth of mobile devices has enabled constant… READ MORE

Virtuoso Performance: Migrating from an OAuth2 authenticated JSON feed

Mon, 06/04/2018 - 12:24
By mikeryan, Monday, June 4, 2018 - 10:24am

Continuing with techniques from the “Acme” project, another ongoing feed I implemented was an import from a JSON feed, protected by OAuth2 authentication, into “doctor” nodes. Let’s look first at the community contributions we needed to implement this.

Community contributions

Provide authentication plugins to HTTP fetcher - Moshe Weitzman had already suggested (and provided a patch for) adding basic and digest authentication to the HTTP fetcher plugin. I broadened the scope to add an Authentication plugin type, and implemented an OAuth2 authentication plugin.

Implement xpath-like selectors for the JSON parser - The JSON parser, from Karen Stevenson’s original JSON source plugin, used a numeric depth to retrieve data elements. The JSON feed we had here did not work with that approach, because at the top level, alongside the array containing our data, there was another array (and the depth approach would draw from both arrays). Implementing a means to select fields with a /-separated syntax made this much more flexible.
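As a hypothetical illustration (not the actual Acme feed), a depth-based selector cannot distinguish between two top-level arrays like these, while the /-separated syntax can target one explicitly:

```json
{
  "meta": ["feed generated 2018-06-04"],
  "providers": [
    {"id": "123", "name": "Jane Doe"}
  ]
}
```

Here an item_selector of /providers picks up only the provider records and ignores the sibling meta array, where a depth of 1 would have drawn items from both.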

Project implementation

So, let’s look at the source plugin implementation:

source:
  plugin: url
  # We want to reimport any doctors whose source data has changed.
  track_changes: true
  # Counting the available records requires fetching the whole feed - cache the
  # counts to minimize overhead.
  cache_counts: true
  # Until https://www.drupal.org/project/drupal/issues/2751829 is fixed, this
  # should be used in conjunction with cache_counts in most cases. It was not
  # strictly necessary in this project because this was the only cached 'url'
  # source plugin.
  cache_key: doctor
  data_fetcher_plugin: http
  data_parser_plugin: json
  item_selector: /providers
  # Note that the source .yml file does not contain the urls, or half the
  # authentication configuration - these are merged in using the configuration
  # UI (see http://virtuoso-performance.com/blog/mikeryan/configuring-migrations-form).
  # We present sample values here so you can see what the complete configuration
  # looks like.
  # The endpoint from which the data itself is fetched.
  urls: https://kservice.example2.com/providers
  # The http fetcher plugin calls the authentication plugin (if present),
  # which accepts plugin-specific configuration and returns the appropriate
  # authentication headers to add to the HTTP request.
  authentication:
    # migrate_plus also has 'basic' and 'digest' authentication plugins.
    plugin: oauth2
    # The grant type used by the feed (other grant types supported in theory,
    # but untested, are authorization_code, password, refresh_token, and
    # urn:ietf:params:oauth:grant-type:jwt-bearer).
    grant_type: client_credentials
    # The base URI for retrieving the token (provided through the UI).
    base_uri: https://kservice.example2.com
    # The relative URL for retrieving the token.
    token_url: /oauth2/token
    # The client ID for the service (provided through the UI).
    client_id: default_client_id
    # The client secret for the service (provided through the UI).
    client_secret: abcdef12345678

The ids and fields configuration operate as they do with other JSON and XML feeds I’ve blogged about.
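For completeness, a minimal sketch of what those sections might look like for this feed (the field names here are assumptions for illustration, not the project's actual configuration):

```yaml
# Unique identifier(s) for each source record.
ids:
  id:
    type: string
# Map selectors within each /providers item to source property names.
fields:
  -
    name: id
    label: 'Provider ID'
    selector: id
  -
    name: name
    label: 'Provider name'
    selector: name
```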

Tags: Planet Drupal, Drupal, Migration. Use the Twitter thread below to comment on this post:


— Virtuoso Performance (@VirtPerformance) June 4, 2018


Lucius Digital: Save time and continuously deliver higher quality | Docker and Drupal

Mon, 06/04/2018 - 06:36
As a digital agency, we develop many platforms and websites for different clients. In past years, we often encountered the following problem: a developer produces something on his laptop that does not immediately work on another developer's laptop.

Kalamuna Blog: Kalamuna Recognized as Top SF Development Company by Clutch Business Ratings

Mon, 06/04/2018 - 00:14
By The Kalamuna Team, Mon, 06/04/2018 - 22:23

Clutch recently announced their annual listing of the top creative, design, and development companies in 2018.

Categories: Articles, Community, Drupal, WordPress. Author: The Kalamuna Team

Oliver Davies: How to Use Environment Variables for your Drupal Settings with Docksal

Sun, 06/03/2018 - 21:00

Within the Docksal documentation for Drupal settings, the example database settings include hard-coded credentials to connect to the Drupal database. For example, within a settings.php file, you could add this:

$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => 'db',
  'database' => 'myproject_db',
  'username' => 'myproject_user',
  'password' => 'myproject_pass',
];

Whilst this is fine, it does mean that there is duplication in the codebase, as the database credentials can also be added as environment variables within .docksal/docksal.env - this is definitely the case if you want to use a custom database name, for example.

Also, if one of these values were to change, Drupal wouldn't be aware of that and would no longer be able to connect to the database.

It also means that the file can’t simply be re-used on another project as it contains project-specific credentials.

We can improve this by using the environment variables within the settings file.
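A minimal sketch of that approach (the variable names here are illustrative; check your .docksal/docksal.env for the names your project actually defines):

```php
// settings.php: read the credentials from the environment instead of
// hard-coding them, falling back to the Docksal default host.
$databases['default']['default'] = [
  'driver' => 'mysql',
  'host' => getenv('MYSQL_HOST') ?: 'db',
  'database' => getenv('MYSQL_DATABASE'),
  'username' => getenv('MYSQL_USER'),
  'password' => getenv('MYSQL_PASSWORD'),
];
```

The credentials then live in one place (docksal.env), and the settings file can be reused on another project unchanged.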

Wim Leers: Ode to the Drupal Association

Sat, 06/02/2018 - 18:43

This is an ode to the Drupal Association.

  1. Yesterday, I stumbled upon Customizing DrupalCI Testing for Projects, written by Ryan “Mixologic” Aslett. It contains detailed, empathic[1] explanations. He also landed d.o/node/2969363 to make Drupal core use this capability, and to set an example.
  2. I’ve been struggling in d.o/project/jsonapi/issues/2962461 to figure out why an ostensibly trivial patch would not just fail tests, but cause the testing infrastructure to fail in inexplicable ways after 110 minutes of execution time, despite JSON API test runs normally taking 5 minutes at most! My state of mind: (ノಠ益ಠ)ノ彡┻━┻
    Three days ago, Mixologic commented on the issue and did some DrupalCI infrastructure-level digging. I didn’t ask him. I didn’t ping him. He just showed up. He’s just monitoring the DrupalCI infrastructure!

In 2015 and 2016, I must have pinged Mixologic (and others, but usually him) dozens of times in the #drupal-infrastructure IRC channel about testbot/DrupalCI being broken yet again. Our testing infrastructure was frequently having troubles then; sometimes because Drupal was making changes, sometimes because DrupalCI was regressing, and surprisingly often because Amazon Web Services was failing.

Thanks to those two things in the past few days, I realized something: I can’t remember the last time I had to ping somebody about DrupalCI being broken! I don’t think I did it once in 2018. I’m not even sure I did in 2017! This shows what a massive improvement the Drupal Association contributed to the velocity of the Drupal project!


Of course, many others at the Drupal Association help make this happen, not just Ryan.

For example Neil “drumm” Drumm. He has >2800 commits on the Drupal.org customizations project! Lately, he’s done things like making newer & older releases visible on project release pages, exposing all historical issue credits, providing nicer URLs for issues and giving project maintainers better issue queue filtering. BTW, Neil is approaching his fifteenth Drupal anniversary!
Want to know about new Drupal.org features as they go live? Watch the change records; an RSS feed is available.

In a moment of frustration, I tweeted fairly harshly (uncalled for … sorry!) to @drupal_infra, and got a forgiving and funny tweet in response:

The system doesn't believe that a human could do as much as you do.

— Ryan Aslett (@ryanaslett) April 5, 2018

(In case it wasn’t obvious yet: Ryan is practically a saint!)

Thank you!

I know that the Drupal Association does much more than the above (an obvious example is organizing DrupalCons). But these are the ways in which they are most visible to me.

When things are running as smoothly as they are, it’s easy to forget that it takes hard work to get there and stay there. It’s easy to take this for granted. We shouldn’t. I shouldn’t. I did for a while, then realized … this blog post is the result!

A big thanks to everyone who works/worked at the Drupal Association! You’ve made a tangible difference in my professional life! Drupal would not be where it is today without you.

  [1] Not once is there a “Just do [jargon] and it’ll magically work” in there, for example! There are screenshots showing how to navigate Jenkins’ (peculiar) UI to get at the data you need. ↩︎

Dries Buytaert: Frontend United keynote

Sat, 06/02/2018 - 08:00

Keynoted at Frontend United in The Netherlands about our work on Drupal's web services APIs and our work toward a JavaScript-driven Drupal administration interface. Great event with lots of positive energy!

© Christoph Breidert

Mobilefish.de: Import translations on profile or module installation

Fri, 06/01/2018 - 18:11
By Peter Majmesku, Fri, 06/01/2018 - 23:11

Let's say you have a custom module and you want to attach translation files to it. You want to import the translation files after installation or after you have updated the .po translation files. Also make sure that the Interface Translation (locale) core module is installed.

Use a folder named translations inside the module, where the language files like de.po or fr.po can be found. To load the translations, you have to insert the following lines into your example_module.info.yml:

'interface translation project': example_module
'interface translation server pattern': modules/custom/example_module/translations/%language.po

Note: more details about the interface translation properties can be found here.
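Put together, a minimal example_module.info.yml might look like this (the other keys are the usual module boilerplate, not specific to translations):

```yaml
name: 'Example module'
type: module
core: 8.x
'interface translation project': example_module
'interface translation server pattern': modules/custom/example_module/translations/%language.po
```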

To update your translations use the following Drush commands:

drush locale-check && drush locale-update && drush cr

To update existing translations you should take a look at the settings page (/admin/config/regional/translate/settings). You can use local translation files only, or overwrite any existing translation.


Drupal Commerce: A May Full of Drupal Commerce Releases

Fri, 06/01/2018 - 17:36

May was one of our most productive months to date. It was full of releases for the core Commerce modules, our standalone PHP libraries, and essential contributed modules that all work together to comprise Drupal Commerce. While I outlined the highlights in the roadmap issue on drupal.org, these wins are worth sharing more broadly to keep the rest of the Drupal community in the loop.

The biggest release of the month was Drupal Commerce 2.7, which included new features for currency formatting, address form configuration, and stored payment methods. It also fixed a handful of bugs that unblocked other module releases and updated core in response to improvements in our libraries and dependent modules.

We've long discussed how our standalone PHP libraries are exporting expertise off the Drupal island. Addressing and Internationalization, which have each been downloaded over one million times, are our two shining stars. We rolled new releases for each of them in May, further improving Drupal Commerce's ability to solve the hardest parts of address entry, validation, and formatting, and of currency localization. Refer to the price formatting change record from the 2.7 release to see how the new API is more flexible and performant as a result.

Additionally, we released Address 1.4 and Inline Entity Form 1.0 RC1. The latest Address release unlocks the customer profile’s address field to support collecting less detailed billing addresses. The Inline Entity Form release includes new product information management features, letting you duplicate product variations for faster product data entry.

Thanks to generous sponsorship from Authorize.Net themselves, we've been able to dedicate several weeks to improving their integration this year. The resulting Authorize.Net RC1 release now supports eCheck, Visa Checkout, and 3DSecure payments! We also included several bug fixes related to duplicate customer and payment profiles that appear when migrating from an old system to Drupal Commerce, for example.

While not fully released yet, our Technology Partner integration for Avalara's AvaTax is nearing beta. Jace Bennest from Acro Media contributed heavily by refactoring the module to properly use a TaxType plugin while my co-maintainer Matt Glaman contributed additional fixes to our port from the Drupal 7 integration to prepare it for certification. Thanks, Jace / Acro Media!

When Matt wasn't working on the above contribs, he was collaborating with Lisa Streeter from Commerce Guys to bring Commerce Reports to its first beta release for Drupal 8. The new version takes a completely different approach from the Drupal 7 version, using lessons we learned developing Lean Commerce Reports. It denormalizes transaction data when an order is placed to support reports generation with or without the Views module, providing a better developer experience and much better performance. Check it out below! (Click to expand.)

We've also been hard at work improving the evaluator experience. The big release for that is Commerce Demo's beta1, which showcases what Drupal Commerce provides out of the box. It creates products and scaffolds out a full product catalog (pictured below). To get the full effect, try it out with our default store theme, Belgrade. The new demo module gets us closer to something like we had with Kickstart 2.x on Drupal 7 - a learning resource for site builders and a way for agencies to more easily demo and sell Drupal Commerce.

Finally, I'm very excited to announce that Lisa Streeter is our new documentation lead! Expect some great things to come. She has already done fantastic work with the Commerce Recurring documentation and is working on revising our getting started, installation, and update docs.

Looking at June, we plan on finalizing the query level entity access API, which will allow us to better support marketplace and multi-store Drupal Commerce implementations. We expect to merge user registration after checkout completion, and we will also be focusing on address reuse / copying, Buy One Get One promotion offers, and more product management experience enhancements.

Ashday's Digital Ecosystem and Development Tips: eSignatures with HelloSign and Drupal 8

Fri, 06/01/2018 - 16:00

Previously, I wrote a bit about the HelloSign eSignature platform and how it can be integrated into a Drupal 7 website. As promised, a Drupal 8 version of the integration is now available and ready for use on cutting-edge websites everywhere. But this new version is much more than a one-to-one upgrade of the original module. We've leveraged some of Drupal 8's great new features to make using HelloSign with your site even easier than it was before. Here are just some of the highlights of the new release:

ComputerMinds.co.uk: Rebranding ComputerMinds - Part 5: Development

Fri, 06/01/2018 - 09:01

Let's have a quick look through our development process on this project and pick out some of the more interesting bits. As briefly mentioned in the last article, we are using a Composer setup and all code is version controlled using Git on GitHub. All pretty standard stuff.


In the previous article I briefly discussed how we set up Pattern Lab. Before getting stuck in to the components that would make up the pages of the site, we first needed to set up some global variables and a grid. Variables allow us to reuse common values throughout the SCSS, and if we need to make a change we can do so centrally. After adding variables for each of the colours, plus a colour palette mapping which would allow us to loop through all colours throughout the project if needed, we added variables for the padding used throughout, and for font styles, after importing from Google Fonts.


CSS Grid

Although still relatively new, CSS Grid is a web standard and works in all modern browsers. It is much simpler than grid libraries like Susy, so we were keen to start using it on our projects, and this was the perfect one on which to try it out. Setup was simple, partly due to the simple grid in the designs but mostly due to the simplicity of CSS Grid itself. A few lines of SCSS and the grid wrapper was set up:

.grid {
  display: grid;
  grid-auto-rows: auto;
  grid-gap: 20px;
  grid-column-gap: 20px;
  grid-template-rows: minmax(0, auto);
}

This declares the grid, sets a consistent gap of 20px, and sets a broad size range for the rows. As well as adding the .grid class to the wrapper where we'd like a grid, we also need to add another class to define how many columns that grid should have. Defining, in SCSS, a simple mapping allowed me to create a loop to generate the column classes we needed:

// Column mapping
$columns: (
  one: 1,
  two: 2,
  three: 3,
  four: 4,
  five: 5,
  six: 6,
);

// Generate column classes
@each $alpha, $numeric in $columns {
  .grid--columns-#{$numeric} {
    grid-template-columns: repeat(#{$numeric}, 1fr);

    @include to-large {
      grid-template-columns: repeat(1, 1fr);
    }
  }
}

This loop generates a class for each of the potential numbers of columns we might need. The last @include in the above code simply resets the column definition, making all columns full width on smaller screens. Now, all we needed to do was add two classes and we'd have a grid!

Occasionally, we'd have a need for grid items to to span more than one column. Using the same mapping as before, I created a simple loop that would generate classes to define different column spans. These classes could then be applied to the immediate children of the grid wrapper.

.grid__item {
  @include from-large {
    @each $alpha, $numeric in $columns {
      &--span-#{$alpha} {
        grid-column: auto / span #{$numeric};
      }
    }
  }
}

Now we have complete control over our grid. Here's an example of how it's used.

First item Second item spanning two columns Third item spanning three columns
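The demo above could be produced by markup along these lines (a sketch using the classes generated by the loops; the article's actual markup may differ):

```html
<div class="grid grid--columns-6">
  <div class="grid__item">First item</div>
  <div class="grid__item grid__item--span-two">Second item spanning two columns</div>
  <div class="grid__item grid__item--span-three">Third item spanning three columns</div>
</div>
```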


Pattern Lab

In the previous article I mentioned the setup of Pattern Lab and Emulsify but didn't look into the actual development, so let's do that now! Although we're used to coding SCSS in a modular way here at CM, with Pattern Lab's standalone components essentially working like modules, we actually don't need to take too much care to produce nice clean code. Each SCSS file caters only for a small component on the page and as such is usually small and specific.

But, as well as including our pattern-specific code within each component's directory, we needed to ensure that we also worked in a SMACSS-y way to reduce the CSS we were generating. We didn't want multiple classes applying the same styling, so any rules that would be reused and consistent, like padding, were placed in a base SCSS file inside the Base folder.

Of course, once we had defined our classes we needed to get them into the Pattern Lab Twig templates. As components will have variations, we can't just hard-code the classes into the templates; we need to pass them in as variables. Passing variables to Twig files is super simple, and with Emulsify 2.x there's now even Drupal Attributes support with the addition of the BEM Twig extension. As we are likely to want to pass multiple classes to the same element, we can pass in a simple array of modifiers and render it out in the Twig template. So in a Drupal preprocess we can prepare some modifiers (we'll look at passing these on to the Pattern Lab Twig files later):

$variables['heading_modifiers'] = ['centered', 'no-space'];

And then in our Twig file we pass this through the BEM function:

{% set heading_base_class = heading_base_class|default('h' ~ heading_level) %}
{{ heading }}

Which renders the markup as:
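The rendered output is missing from this excerpt; with the modifiers prepared above, Emulsify's bem function would produce something along these lines (a sketch, not the article's exact output):

```html
<h2 class="h2 h2--centered h2--no-space">Heading text</h2>
```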




The beauty of using Pattern Lab is the ability to work simultaneously on frontend and backend development. Before bringing more hands on deck, I was able to begin the backend of the site before getting even close to completing the frontend. As mentioned earlier, the codebase was set up before the frontend work began, so we could jump straight into the Emulsify theme. Using Composer allowed us to quickly get Drupal 8 and the bunch of contrib modules we needed, so when we were ready to start on the backend we could jump straight in.

This site required nothing too complex in terms of backend development and the work was more a task of building content types and views to display content as per the designs. That said, we did utilise the Paragraphs module allowing us to create reusable entities, or tiles as we're used to calling them, as they are used extensively throughout the designs.



Something that has been standard in our Drupal 8 builds since the release is configuration management. Gone are the days of bundling settings into features; Drupal 8 core comes with configuration management tools. In the early days, one of our senior developers created cm_config_tools, a module to give developers precise control over what config to export. Drupal 8 has progressed since then, and the timing of this project allowed us to use a new module, Configuration Split.

Configuration Split builds on Drupal core's ability to export a whole set of a site's configuration by allowing us to define sets of configuration to be exported to separate directories. It's then possible to define in settings.php which directories to include when importing/exporting. As we were committing settings.php to Git, we could include the main config directory here and then have a local.settings.php (not committed to Git) to define the database connection and any other config directories to include:

## Enable config split settings
$config['config_split.config_split.local_dev']['status'] = TRUE;
$config['config_split.config_split.local_overrides']['status'] = TRUE;

This means we can have configuration solely for use when developing (things like Devel and Field_UI). It's also possible to override settings that are included in the main config export, locally. This allows us to run local environments without fear of interfering with live functionality, like affecting comments by changing the Disqus Domain, for example.
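One way to wire this up (the paths and file names here are illustrative, not necessarily the project's) is to include the local file from the end of settings.php whenever it exists:

```php
// settings.php: pull in local, non-committed overrides if present.
if (file_exists(__DIR__ . '/local.settings.php')) {
  include __DIR__ . '/local.settings.php';
}
```

Developers without a local.settings.php simply get the committed defaults, so the check is safe on every environment.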

Importing and exporting works the same way as Core's configuration management, by using Drush commands:

drush cim
drush cex



In a normal Drupal project, the markup (Twig files) would live within Drupal's templating system, with prepared variables being rendered out where needed. With our component-based Pattern Lab, all of our markup was within the Pattern Lab structure, away from Drupal's /templates directory. Fortunately, including them is simple enough. First we needed to download and install the Component Libraries module. This allowed us to specify a different directory for our Twig files and also register Twig namespaces for those files. We do this in the theme's .info file:

component-libraries:
  base:
    paths:
      - components/_patterns/00-base
  atoms:
    paths:
      - components/_patterns/01-atoms
  molecules:
    paths:
      - components/_patterns/02-molecules
  organisms:
    paths:
      - components/_patterns/03-organisms
  templates:
    paths:
      - components/_patterns/04-templates
  pages:
    paths:
      - components/_patterns/05-pages

Now that our Pattern Lab Twig files were included, we could begin to link them up to Drupal's templating system. Linking them is as simple as choosing which components you want to display and then calling that Twig file from your Drupal template. When you call the component's Twig file you just need to pass in the variables from Drupal.

So if we wanted to display a page title as an H1, within page-title.html.twig inside Drupal's template directory we would call our Pattern Lab's heading component passing in the title and heading level:

{{ title_prefix }}
{% if title %}
  {% include "@atoms/02-text/00-headings/_heading.twig" with {
    "heading": title,
    "heading_level": 1,
  } %}
{% endif %}
{{ title_suffix }}

If we wanted to change the style of the heading we could pass in an array of modifiers, as shown in the example further up the page, too. For more complex page components we can also pass in an array to be looped over inside the component's Twig file. For example, if we wanted a listing of cards we could pass an array to a listing component Twig template and within that loop through the array each time calling another component's Twig template:

{% for item in content_array %}
  {% include "@molecules/card/01-card.twig" with {
    "card_img_src": item.image,
    "card_title": item.title,
    "card_body": item.body,
    "card_button_content": item.button_text,
    "card_button_url": item.button_url,
    "card_button_modifiers": item.button_mods,
    "card_url": item.url,
    "card_img_alt": item.image_alt,
  } %}
{% endfor %}

This is just a brief overview and a look at some interesting parts; there was obviously a lot more work that went into the site build! Now, as this website was being built to replace our old site, we needed the content from the old site to be moved over. In the next article Christian is going to talk through this process.

Third & Grove: A Year Later and Drupal Commerce is Still in Existential Crisis

Fri, 06/01/2018 - 09:00
By justin, Fri, 06/01/2018 - 08:00

Agiledrop.com Blog: AGILEDROP: Top Drupal blog posts from May

Fri, 06/01/2018 - 04:00
Each month, we revisit our top Drupal blog posts of the month, giving you the chance to check out some of our favourites. This month was all about decoupled Drupal and JavaScript, check it out!   First one on the list is Nightwatch in Drupal Core by Sally Young from Lullabot. In this blog post, she introduces us to Nightwatch, a functional testing framework that has been integrated into Drupal, so we can test JavaScript with JavaScript itself. She explains what the features are and how you can try it out.  We continue our list with Working toward a JavaScript-driven Drupal administration… READ MORE

Virtuoso Performance: Disabling functionality temporarily during migration

Thu, 05/31/2018 - 12:25
By mikeryan, Thursday, May 31, 2018 - 10:25am

Continuing with techniques from the “Acme” project, the location content type had an address field and a geofield, with field_geofield configured to automatically determine latitude and longitude from the associated field_address - a fact I was initially unaware of. Our source data contained latitude and longitude already, which I mapped directly in the migration:

field_geofield:
  plugin: geofield_latlon
  source:
    - latitude
    - longitude

However, while testing location migrations by repeatedly running the import, I soon started getting messages from the Google Maps API that my daily quota had been exceeded, and quickly tracked the problem down to the integration with field_address. Clearly, the calls out to Google Maps were both unnecessary and hazardous - how to prevent them? Fortunately, the migration system provides events which fire before and after each migration is executed. So, we subscribe to MigrateEvents::PRE_IMPORT to save the current settings and disable the external call:

public function onMigrationPreImport(MigrateImportEvent $event) {
  if ($event->getMigration()->id() == 'location') {
    $fields = \Drupal::entityTypeManager()->getStorage('field_config')->loadByProperties(['field_name' => 'field_geofield']);
    if ($fields) {
      /** @var \Drupal\field\Entity\FieldConfig $field */
      if ($field = $fields['node.location.field_geofield']) {
        $this->originalSettings = $field->getThirdPartySettings('geocoder_field');
        $field->setThirdPartySetting('geocoder_field', 'method', 'none');
        $field->save();
      }
    }
  }
}

And we subscribe to MigrateEvents::POST_IMPORT to restore the original settings:

public function onMigrationPostImport(MigrateImportEvent $event) {
  if ($event->getMigration()->id() == 'location') {
    $fields = \Drupal::entityTypeManager()->getStorage('field_config')->loadByProperties(['field_name' => 'field_geofield']);
    if ($fields) {
      /** @var \Drupal\field\Entity\FieldConfig $field */
      if ($field = $fields['node.location.field_geofield']) {
        foreach ($this->originalSettings as $key => $value) {
          $field->setThirdPartySetting('geocoder_field', $key, $value);
        }
        $field->save();
      }
    }
  }
}
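For context, these handlers live in an event subscriber class, which must declare the events it listens to (the method names match the handlers shown above; the class itself is not shown in the post):

```php
public static function getSubscribedEvents() {
  $events[MigrateEvents::PRE_IMPORT][] = 'onMigrationPreImport';
  $events[MigrateEvents::POST_IMPORT][] = 'onMigrationPostImport';
  return $events;
}
```

The class would then be registered in the module's services.yml as a service tagged with event_subscriber so that Drupal's event dispatcher picks it up.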

The thoughtful reader may note a risk here: what if someone were adding or editing a location node while this was running? The geofield would not be populated from the address field. In this case, that is not a problem - this is a one-time bulk migration (and no one should be making changes on a production website at such a time). In cases involving an ongoing feed where the feed data is used as-is on the Drupal site, it would also not be a problem, although if there were a practice of manually editing imported content there would be some risk.

Tags: Drupal Planet, Drupal, Migration. Use the Twitter thread below to comment on this post:


— Virtuoso Performance (@VirtPerformance) May 31, 2018


OpenSense Labs: How to strategize web personalization with Drupal

Thu, 05/31/2018 - 09:33
By Shankar, Thu, 05/31/2018 - 18:03

You might have listened to the new album of your favourite band on a music application. Or you might have streamed a critically acclaimed movie on a video streaming platform. In both cases, you will notice suggestions curated especially for you based on your taste in music and movies. Personalized content is the way to go for providing a better user experience. Drupal has provisions for building personalization features into your site to tailor the content to the interests of the user, thereby enhancing user engagement.

Source: Getty Images

Proper analysis of web personalization criteria and strategies should prove vital for digital firms. According to research from Econsultancy, 94 percent of in-house marketers agree that web personalization is significant for the current as well as future growth of their business.

What is web personalization anyway?

Source: Marketo

Web personalization means creating dynamic, personalized content based on attributes such as the profile, behaviour and location of internet users, in order to provide them with a relevant website experience.

It refers to understanding the interests of users, tailoring the website to accommodate their profiles, and offering them the content most relevant to them.

Because of the intricacies and traffic volume of some websites, digital marketers may find web personalization convoluted and assume it will occupy a lot of their time. They may also think it is only for large enterprises with an enormous global presence, a huge team and a growing budget. But with the right tools and strategies, it can be incorporated into any website, whatever the business, making the site more effective and boosting return on investment.

What are the merits of web personalization?

Digital marketers constantly work on personalizing the way they interact with customers, pursuing their objectives through customer satisfaction and retention. There are various ways you can reap the merits of web personalization:

  • Strengthens customers’ loyalty to your brand: You can use it to firm up how customers think about you and your brand. Research from Invespcro found that 45 percent of online users are more likely to shop on a website that shows personalized suggestions. You can compile the data collected from a user’s interactions with your website and curate messages for your cross-channel marketing, which helps develop brand value and customer loyalty.
  • Enhances lead generation: Hubspot research found that personalized calls-to-action (CTAs) help generate leads: CTAs targeted at specific users had a 42 percent higher view-to-submission rate than those that were the same for every user.
  • Offers insight into visitors’ preferences: You gain a deep insight into who your online users are and what they prefer, which leads to a higher conversion rate. You can then channel your messages based on what a particular user wants to know.
  • Shores up sales and revenue: Through loyalty programs, web personalization can accentuate your sales. It is not restricted to offering customers discounts and merchandise; it also promotes deeper engagement. For instance, you can alert customers when a product is back in stock, or encourage future purchases by notifying them when a brand-new product launches. A report by McKinsey and Company states that personalization can reduce acquisition costs by almost 50 percent and increase business revenue by 5-15 percent.
  • Increases conversion rate: Web personalization helps track demographics and behavioural patterns and can convert an anonymous visitor into a customer.
  • Improves user engagement: Engagement and acquisition are two of the most important goals digital marketers pursue. Personalizing the website lets you cultivate engagement through steps like cross-selling, upselling and customer loyalty. The E-Tailing Group reports that extending personalization to multiple channels can increase customer spending by up to 500 percent.
  • Disseminates targeted ads: Personalization helps broadcast targeted cross-channel promotional campaigns. By compiling user preferences and the advertisements they click on, your website can deliver targeted messages that enhance engagement. Constellation Research (https://www.monetate.com/blog/constellation-research-declares-ai-driven-personalization-the-answer) says that content lacking relevancy produces response rates 83 percent lower than those of average promotional campaigns.
Web personalization with Drupal

Drupal 8 provides the perfect foundation for incorporating technologies that enable personalization, from marketing automation to web analytics. Drupal 8’s web services initiative has streamlined the process of sharing meaningful information with external systems. There is no restriction on which marketing tools you can apply, as Drupal-managed content can be exposed in standardized data-sharing formats.

A session at DrupalCon New Orleans 2016 threw light on how personalization can be effectively implemented in a Drupal site to increase user engagement.

The session discussed how the Acquia Lift module can deliver unprecedented insight into what customers do and don’t want, so that they can be served personalized content. With such a system incorporated into the site, digital marketers get more control over the automation, testing and measurement of marketing activities.

The Acquia Lift module unifies content with the customer insight collected from several sources, in order to deliver in-context, personalized experiences across multiple platforms. Its features include a drag-and-drop user interface for targeting messages, content syndication, behavioural targeting, A/B testing, unified customer profiles, and the merging of anonymous and known visitor profiles.

Let’s see how we can strategize the integration of web personalization into a Drupal site.

How to implement web personalization?

1. Understand the classifications of web personalization
  • Feedback: Implicit and explicit feedback can be used to send personalized messages to target segments. Data is implicit when it is inferred from user interactions with the website: assessing page visits, responses to CTAs and menu navigation patterns gives you implicit data about user behaviour, as does Google Analytics data on a user’s geolocation, browser or device. Data is explicit when the user supplies it directly, for example by filling out a form.
  • Data sources: You can gain insight into user behaviour from first-party data sources such as user interactions with the website, email marketing and marketing automation. Second-party data sources are first-party data held by a different entity; you can obtain this information from trusted partners willing to share it under some kind of agreement. Third-party data sources are external data providers that can come in handy for building user profiles.
  • Identity: Attributes like a user’s IP address, location, device and browser are automatically detected, non-personal identification traits. A user’s age, gender and interests are personal identification traits.
  • Online visitor profile: A user’s interactions with the website, such as onsite searches, the date and time of site visits and responses to online forms, help build their user profile.
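Pulled together, the classes of data above might end up in a visitor profile shaped roughly like the following illustrative PHP structure. Every field name and value here is hypothetical; it is only a sketch of how non-personal, explicit and implicit data could sit side by side.

```php
<?php
// Hypothetical visitor profile combining the data classes described above:
// automatically detected non-personal traits, explicit (user-supplied)
// data, and implicit data inferred from on-site behaviour.
$visitorProfile = [
  'non_personal' => [
    'ip_address' => '203.0.113.7',
    'location'   => 'Berlin, DE',
    'device'     => 'mobile',
    'browser'    => 'Firefox',
  ],
  'explicit' => [
    'age'       => 34,
    'gender'    => 'female',
    'interests' => ['health', 'finance'],
  ],
  'implicit' => [
    'page_visits'   => ['/blog/personalization', '/products/wellness'],
    'cta_clicks'    => 3,
    'site_searches' => ['wellness program'],
    'last_visit'    => '2018-05-31T09:33:00+00:00',
  ],
];
```

A segmentation rule can then be expressed as a simple predicate over this array, e.g. "in the health segment if 'health' appears in the explicit interests or the implicit page visits".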
2. Creation of a content plan
  • Segmentation of audience: Personalizing your website lets you target your messages accordingly, and content that needs to reach different kinds of audience calls for segmenting that audience. You might have many whitepapers, case studies and ebooks on your website that are useful for B2B marketing, or many advertisements promoting discounted prices on popular products that suit consumer marketing. Sending the right content to the right audience is of paramount importance.
  • Efficacy in mapping out content: Once you have decided which audience segment to target, figure out how to map content to it effectively. In B2B marketing, for instance, you can build on the metrics of awareness, interest, evaluation and commitment through infographics, case studies, live demos and advanced solutions respectively. In consumer marketing, you can build on awareness, interest and decision through product highlights, videos and special offers respectively.
Source: Marketo
  • Betterments to the existing content: Smart planning of how content is disseminated is essential; creating a cornucopia of content alone won’t solve your trials and tribulations. Personalized campaigns may require you to develop new content, but it is just as important to improve the content already on the site. Repurpose content for each audience segment, revisit titles and CTAs to make them appropriate to the segment, add industry-specific research studies that resonate with your content, and convert large reports into short ebooks, infographics and the like.
  • Choosing the right place on your website: You can customize your homepage based on the region from which the user is accessing your website. Sometimes an internal page with product details has higher SEO rankings than the homepage; in that case it is fruitful to personalize that page with live demos relevant to the visitor.
3. Testing the web personalization efforts
  • A/B testing: Also known as split testing, this is the comparison of two or more versions of a promotional campaign or advertorial message across marketing channels, to understand which approach works and which doesn’t.
  • Importance of A/B testing: Not only does it enhance the engagement of online users and the efficacy of promotional campaigns, it also improves digital marketers’ awareness and expertise in understanding user preferences.
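As a concrete illustration of how split-test results are compared, here is a small, self-contained PHP sketch of a two-proportion z-test. The function name, the sample numbers and the 1.96 threshold are our own choices for the example; this is standard statistics, not part of any Drupal module.

```php
<?php
// Two-proportion z-test for an A/B test: compares the conversion rates of
// variants A and B and reports how far apart they are in standard errors.
function ab_test_z_score(int $visitorsA, int $conversionsA,
                         int $visitorsB, int $conversionsB): float {
  $rateA = $conversionsA / $visitorsA;
  $rateB = $conversionsB / $visitorsB;
  // Pooled conversion rate under the null hypothesis (no real difference).
  $pooled = ($conversionsA + $conversionsB) / ($visitorsA + $visitorsB);
  $stderr = sqrt($pooled * (1 - $pooled) * (1 / $visitorsA + 1 / $visitorsB));
  return ($rateB - $rateA) / $stderr;
}

// Example: 1000 visitors per variant, 100 vs 130 conversions.
$z = ab_test_z_score(1000, 100, 1000, 130);
// |z| > 1.96 corresponds to p < 0.05 for a two-sided test.
printf("z = %.2f, significant at 95%%: %s\n", $z, abs($z) > 1.96 ? 'yes' : 'no');
```

In practice you would let a testing tool (Acquia Lift, Google Optimize, etc.) do this arithmetic, but the sketch shows why sample size matters: with small visitor counts the standard error grows and even a sizeable lift fails to reach significance.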
4. Measuring the success rate

Source: Marketo

You will notice early signs of success in increased time on site, the amount of content consumed and the number of returning visitors. Thereafter, contact quality improves. Finally, you see improvement in return on investment.

Source: Marketo

Case study

Drupal has a proven track record in the healthcare industry. What does it mean for content to be “personalized”? Humana, a Fortune 500 healthcare company, worked with a digital agency to personalize a Drupal microsite for one of its customer groups and deliver relevant content.

The objective was to send out targeted content to users and discover their interests. Humana wanted to personalize the site based on data such as user demographics and click paths. They directed their existing online users to a personal wellness analysis that provided insight into users’ behavioural patterns and interests.

Source: Acquia

When users completed the personal wellness registration form, Humana obtained their demographic details. Users could also manually adjust the settings through sliders, letting them choose the type of content they wanted to see on the website. This paved the way for mapping users with similar demographics and preferences.

Source: Acquia

They could then disseminate content by constructing different segments, such as health and finance. In the health segment, for instance, they could categorise users by attributes like demographics (gender and age) and site activity (users who mostly clicked on health-related content).


Content strategies built on the assumption that one size fits everyone will leave your site banal, trite and shorn of any fanfare. Digital strategies have come full circle and embraced web personalization tactics to provide relevant and meaningful content to users. Drupal 8 provides a magnificent platform for personalizing a website. With the right strategies and plans in place, you can build a bonhomie between online users and your website.

Ping us at hello@opensenselabs.com to personalize your site and build a colossal online presence.


LakeDrops Drupal Consulting, Development and Hosting: Own your data (again)

Jue, 05/31/2018 - 09:03
By Jürgen Haas, Thu 05/31/2018 - 14:03

My personal #gdpr today, May 25th 2018: completed my project to get back all my data from @Google, @evernote et al and host it all by myself with @Nextclouders, #joplin and dozens of other @OpenSourceOrg tools that come with the same convenience but with real privacy. Check!

Promet Source: Should I Fix my Existing Site or Build a New Site from Scratch?

Mié, 05/30/2018 - 23:57
Does an accessibility issue on my website mean I need to build a brand new one? This might be one of many questions rolling around in your head as you read the email or letter informing you that your site has an accessibility problem. Don’t panic just yet. It could be something simple, but you need to have all the facts. You need a plan of attack, and that starts with a site audit.

OSTraining: How to Use Google Webfonts in Your Drupal 8 Site

Mié, 05/30/2018 - 15:07

Although Drupal has a reputation for being a developers’ platform, lots of users rely on Drupal’s admin area for key tasks.

For typography in Drupal sites, the best way to change your site’s fonts via the admin is a module called @font-your-face.

The @font-your-face module allows you to work with webfonts like Google Fonts or Font Squirrel. It also provides the ability to work with paid font services like Typekit or fonts.com.

In this tutorial, you’ll learn how to configure and use this module in Drupal 8.