News feed aggregator

Appnovation Technologies: Simple Website Approach Using a Headless CMS: Part 1

Drupal Planet - Wed, 02/06/2019 - 06:00
Simple Website Approach Using a Headless CMS: Part 1 I strongly believe that the path for innovation requires a mix of experimentation, sweat, and failure. Without experimenting with new solutions, new technologies, new tools, we are limiting our ability to improve, arresting our potential to be better, to be faster, and sadly ensuring that we stay rooted in systems, processes and...

Hook 42: Being Drupal Adjacent

Drupal Planet - 9 hours 6 mins ago
What It Means to be Drupal Adjacent

One of the major objectives of Drupal 8 is the desire to “get off the island.” Drupal opened its doors to using and contributing to a broader ecosystem of projects instead of replicating functionality in Drupal. We now see tools like Twig and various Symfony components in Core. Contributed modules are also able to integrate projects and Software Development Kits (SDKs). For example, Password Strength, Digital Ocean, and Hubspot API integrate through Drupal’s native use of Composer. This philosophy helps those of us that more closely identify as “Drupal Adjacent.”

I consider being “Drupal Adjacent” as having some degree of experience in Drupal but maintaining expertise in a breadth of other technologies. You are capable of leveraging Drupal’s strengths along with the strengths of other tools. Drupal Adjacent architects use “the right tool for the right job.” Drupal can serve as a bi-directional tool or framework in a broader architecture of systems, while other tools serve a discrete purpose.

As I consider myself Drupal Adjacent, I want to share my experience working in the Drupal community.

Axelerant Blog: Enterprise Digital Transformation (DX) + Outsourcing

Drupal Planet - 10 hours 6 mins ago

Can you outsource Digital Transformation (DX)?

Let's set this off in the right direction. What people think digital transformation is, often isn’t. Adopting the latest feature is not digital transformation, and neither is a basic migration or new functional enhancements made to a site.

DrupalEasy: DrupalEasy Podcast 210 - Stefanie Gray - DKAN Open Data Platform

Drupal Planet - 11 hours 45 mins ago

Direct .mp3 file download.

Stefanie Gray (stefaniegray), Engineer and Open Data Specialist for CivicActions, joins Mike Anello to discuss all things DKAN.

Tagged Interview, DrupalEasy News, News

Sponsors
  • Drupal Aid - Drupal support and maintenance services. Get unlimited support, monthly maintenance, and unlimited small jobs starting at $99/mo.
  • WebEnabled.com - devPanel.
Follow us on Twitter

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Drupal Europe: Community at Drupal Europe

Drupal Planet - 14 hours 19 mins ago
Amazee labs @flickr

Drupal Europe brings a unique opportunity to connect, share and learn from the Drupal community and to talk about what holds us together. We grew to be the biggest open source community under the tagline “Come for the code and stay for the community” which we strive to uphold.

Join us on September 10–14, 2018 in Darmstadt, Germany to discuss and learn about growing and strengthening communities and the challenges that come with that.

Drupal has been a historic example of how Open Source communities can thrive, and to maintain this leading position we need to learn from each other, include others and inspire everybody to be an active contributor. This might bring its challenges from time to time, so please come and share your stories, expertise and lessons learned with us. This is the only way to keep our community strong, diverse and open-minded.

Who should attend?

You! This vertical topic will be the meeting place for everyone in Drupal and other communities.

Whether you want to organise events, you’re new to the community and want to know where you can get involved, or you want to share a success story from your community, you are welcome.

Target groups:

  • Members of the Drupal community
  • Other open source communities
  • Organisations and those interested in how communities work and prosper

Example talks:

  • Being Human
  • Challenges of contribution
  • Community help
  • Community retention
  • Growing leaders & influencers (by empowering, enabling and adding trust)
  • Growing the Drupal Community
  • Improving diversity
  • Mentorship, sponsorship and allies
  • Organizing events
  • Succession planning for organizers and leaders

As you’ve probably read in one of our previous blog posts, industry verticals are a new concept being introduced at Drupal Europe, replacing the summits that typically took place on Monday. These industry verticals are integrated with the rest of the conference (same location, same ticket) and provide more opportunities to learn and exchange within each vertical throughout the three days.

Industry vertical icons by @sixeleven

Now is the perfect time to buy your ticket for Drupal Europe. Session submission is already open so please submit your sessions and encourage others who have great ideas.

Please help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

To recommend speakers or topics please get in touch at program@drupaleurope.org.

About Drupal Europe Conference

Drupal is one of the leading open source technologies empowering digital solutions in the government space around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — which has a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Amazee Labs: So, you want to run a Drupal Camp. Here's what you should know.

Drupal Planet - 14 hours 53 mins ago
So, you want to run a Drupal Camp. Here's what you should know.

I’ve been running events since college, for work and for fun, and for groups of 3 to 3,000. You’d think there’d be a difference, but the amount of energy it takes to run an event, surprisingly, is the same. It’s crazy how well these things scale.

Stephanie El-Hajj Mon, 06/18/2018 - 10:18

Regardless of size, an event planner goes through a very predictable flow from event conception to event end.

We started planning Texas Camp in September of 2017. Knowing we were going to organize the event again for 2018, we scrambled to finalize the venue and update the sticker. By the time BADCamp rolled around, we had shiny new Texas Camp stickers to distribute at the nation’s largest gathering of Drupal people - all potential camp attendees.  

Because we knew when companies do their budget planning, we were ready with a brand new sponsor prospectus by December. By the second week, a cheerful call to sponsor was in many Drupal company inboxes.  

We worked to get the site launched in January, so attendees could plan ahead and to get everyone excited. Let me tell you: when building a spankin’ new React + Drupal site, plan for extra time.

By the time we did launch in February, we had missed a few big camps, but still had plenty of time to get the word out on the call for sessions.

From February to April, we worked hard to get the word out about all the different ways people could get involved with camp: sponsorship, speaking, volunteering, or simply just attending. Early-bird tickets were on sale and session submissions were trickling in.

Texas Camp organizers attended DrupalCon Nashville and spread the good word of Texas Camp to anyone who would spare a few minutes. Those who promised to submit sessions were gifted a Texas Camp sticker, along with lavish promises of fame and glory.

Because we want Texas Camp to be known as an inclusive camp, we reached out to different groups, including the Drupal Diversity and Inclusion group, to help get the word out to a broader, and more diverse, audience. I’d like to think our efforts here helped us pick up more diverse speakers than we might have gotten through our usual channels.

At the end of April, the craziness began. Although I am a seasoned session selection overseer, this was my first time actively participating in the selection as a team. It’s not an easy task, and not just because of the length of time it takes to read the sessions!

We had a few mandates: no repeat speakers, diverse topics, variety in experience levels, and oh yeah, the selection was done fully blind to the presenter. All personally identifiable information (pronouns, speaker names, company names, etc.) was painstakingly struck from the submission pile.

At the end of the two-week selection process, the team gathered and made the final selection. Some speakers had multiple sessions ranked high enough to make the cut; since only one could be accepted, the other session, or any session with too much topical conflict with other highly ranked sessions, was made into a backup.

After session selection, things started moving really fast. We had one week to confirm speakers and another week to make a schedule. Once that newsletter went out announcing the final schedule, the official countdown to Texas Camp had begun.

Week 4: Guess what you’ll need and order everything. This gives you enough time to re-order if anything goes wrong. It’s too early for real attendance numbers, so any amount you order is the best guess.

Week 3: Things will start to arrive. Your office will be filled with an insane number of soda flats and bizarre equipment. We had a silver 4-foot metal trough we had to explain on a few client calls. Speakers will begin canceling. New sponsors will appear out of the woodwork - which is a GREAT thing. Last minute sponsors allowed us to blow the budget on breakfast tacos!

Week 2: You’ve printed everything you can think to print and pray the sizes match and the colors turn out right. The final “Texas Camp is next week!” notice has gone out to attendees. Speakers are thoroughly annoyed at the number of reminders to RSVP we’ve sent.

Week 1: The blessed “eye of the storm”. The week before the event. It’s too late to do anything meaningful. All you can do is hope you’ve done enough ahead of time and remembered everything. Especially if the week of ends in a 3-day weekend for Memorial Day. An unexplained spike in registrations. It looks like we’ll hit 150!

The week of: It’s time for final inventory audits, calling and confirming with all the venues and updating catering counts with vendors. Always add more vegan meals than you have data for! Rally the organizing team and caravan the soda flats and registration supplies to the venue.

Make eye contact and remind each other that you can do it and that there will be coffee in the morning. Charge the iPads. Remember to print the special diet food tents for the morning.  

During camp: Have a stupid amount of fun. See people you haven’t seen in a year. Celebrate the CMS that drew us all together. So many people, at Texas Camp we nearly hit 200! Eat an inordinate amount of food. Watch some amazing talks. Sing karaoke.

After camp: Go home. Swear to never do it again. Take a vacation. Get a sunburn. Reconsider.

The week after camp: Begin researching venues for the next year.  

PreviousNext: How to create and expose computed properties to the REST API in Drupal 8

Drupal Planet - 18 hours 20 mins ago

In Drupal 8.5.0, the "processed" property of text fields is available in REST which means that REST apps can render the HTML output of a textarea without worrying about the filter formats.

In this post, I will show you how you can add your own processed fields to be output via the REST API.

by Jibran Ijaz / 18 June 2018

The "processed" property mentioned above is what is known as a computed property on the textarea field.

The ability to make computed properties available to the REST API like this can be very helpful, for example when the user inputs a raw value and Drupal performs some complex logic on it before showing the output.
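To see what a computed property looks like from code, here is a minimal sketch (not from the original post) that reads the existing "processed" property of a body field; it assumes a node with ID 1 that has a standard "body" text field. The "value" property holds the raw stored text, while "processed" is computed on read by running that text through the field's filter format:

// Minimal illustrative sketch (assumes node 1 has a 'body' text field).
$node = \Drupal\node\Entity\Node::load(1);

// The raw text exactly as the user entered it.
$raw = $node->body->value;

// The computed property: raw text run through the configured filter format.
$html = $node->body->processed;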

Drupal fieldable entities can also have computed properties, and those properties can also be exposed via REST. I used the following solution to expose the data of an entity field which takes raw data from the users and performs some complex calculations on it.

First of all, we need to implement hook_entity_bundle_field_info() to add the property. Because it is a computed field, we don't need to implement hook_entity_field_storage_info().


<?php

// my_module/my_module.module

/**
 * @file
 * Module file for my_module.
 */

use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\my_module\FieldStorageDefinition;
use Drupal\my_module\Plugin\Field\MyComputedItemList;

/**
 * Implements hook_entity_bundle_field_info().
 */
function my_module_entity_bundle_field_info(EntityTypeInterface $entity_type, $bundle, array $base_field_definitions) {
  $fields = [];
  // Add a property only to nodes of the 'my_bundle' bundle.
  if ($entity_type->id() === 'node' && $bundle === 'my_bundle') {
    // It is not a base field, so we need a custom field storage definition. See
    // https://www.drupal.org/project/drupal/issues/2346347#comment-12206126
    $fields['my_computed_property'] = FieldStorageDefinition::create('string')
      ->setLabel(t('My computed property'))
      ->setDescription(t('This is my computed property.'))
      ->setComputed(TRUE)
      ->setClass(MyComputedItemList::class)
      ->setReadOnly(FALSE)
      ->setInternal(FALSE)
      ->setDisplayOptions('view', [
        'label' => 'hidden',
        'region' => 'hidden',
        'weight' => -5,
      ])
      ->setDisplayOptions('form', [
        'label' => 'hidden',
        'region' => 'hidden',
        'weight' => -5,
      ])
      ->setTargetEntityTypeId($entity_type->id())
      ->setTargetBundle($bundle)
      ->setName('my_computed_property')
      ->setDisplayConfigurable('form', FALSE)
      ->setDisplayConfigurable('view', FALSE);
  }
  return $fields;
}

Then we need the MyComputedItemList class to perform some magic. This class will allow us to set the computed field value.


<?php

// my_module/src/Plugin/Field/MyComputedItemList.php

namespace Drupal\my_module\Plugin\Field;

use Drupal\Core\Field\FieldItemList;
use Drupal\Core\TypedData\ComputedItemListTrait;

/**
 * My computed item list class.
 */
class MyComputedItemList extends FieldItemList {

  use ComputedItemListTrait;

  /**
   * {@inheritdoc}
   */
  protected function computeValue() {
    $entity = $this->getEntity();
    if ($entity->getEntityTypeId() !== 'node' || $entity->bundle() !== 'my_bundle' || $entity->my_some_other_field->isEmpty()) {
      return;
    }
    $some_string = some_magic($entity->my_some_other_field);
    $this->list[0] = $this->createItem(0, $some_string);
  }

}

The field we are adding is not a base field, so we can't use \Drupal\Core\Field\BaseFieldDefinition. There is an open core issue to address that (https://www.drupal.org/project/drupal/issues/2346347), but in the meantime we can use the workaround found in core's tests: a copy of \Drupal\entity_test\FieldStorageDefinition:


<?php

// my_module/src/FieldStorageDefinition.php

namespace Drupal\my_module;

use Drupal\Core\Field\BaseFieldDefinition;

/**
 * A custom field storage definition class.
 *
 * For convenience we extend from BaseFieldDefinition although this should not
 * implement FieldDefinitionInterface.
 *
 * @todo Provide and make use of a proper FieldStorageDefinition class instead:
 *   https://www.drupal.org/node/2280639
 */
class FieldStorageDefinition extends BaseFieldDefinition {

  /**
   * {@inheritdoc}
   */
  public function isBaseField() {
    return FALSE;
  }

}

Last but not least we need to announce our property definition to the entity system so that it can keep track of it. As it is an existing bundle we can write an update hook. Otherwise, we'd need to implement hook_entity_bundle_create.


<?php

// my_module/my_module.install

/**
 * @file
 * Install file for my_module.
 */

use Drupal\my_module\FieldStorageDefinition;
use Drupal\my_module\Plugin\Field\MyComputedItemList;

/**
 * Adds my computed property.
 */
function my_module_update_8001() {
  $fields['my_computed_property'] = FieldStorageDefinition::create('string')
    ->setLabel(t('My computed property'))
    ->setDescription(t('This is my computed property.'))
    ->setComputed(TRUE)
    ->setClass(MyComputedItemList::class)
    ->setReadOnly(FALSE)
    ->setInternal(FALSE)
    ->setDisplayOptions('view', [
      'label' => 'hidden',
      'region' => 'hidden',
      'weight' => -5,
    ])
    ->setDisplayOptions('form', [
      'label' => 'hidden',
      'region' => 'hidden',
      'weight' => -5,
    ])
    ->setTargetEntityTypeId('node')
    ->setTargetBundle('my_bundle')
    ->setName('my_computed_property')
    ->setDisplayConfigurable('form', FALSE)
    ->setDisplayConfigurable('view', FALSE);

  // Notify the storage about the new field.
  \Drupal::service('field_definition.listener')->onFieldDefinitionCreate($fields['my_computed_property']);
}

The beauty of this solution is that I don't have to write a custom serializer to normalize the output. The Drupal Typed Data API does all the heavy lifting.

Tagged jsonapi, REST, XML, JSON, hal_json, Normalizers

Ashday's Digital Ecosystem and Development Tips: Better Admin Interfaces with ReactJS and Drupal 8

Drupal Planet - Fri, 06/15/2018 - 16:00

As you may or may not have noticed, we’re having a lot of fun over here at Ashday building Drupal sites with React. Check out our own site, for example. We are really digging this new direction for front-end, and you can learn more about why we did it and how we approached it in other articles, but here we are going to talk about how we approached the Drupal editorial experience, because honestly, we just didn’t find a lot of great resources out there discussing how this might be done well in a decoupled experience.

Drupal blog: A plan for Drupal and Composer

Drupal Planet - Fri, 06/15/2018 - 12:51

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

At DrupalCon Nashville, we launched a strategic initiative to improve support for Composer in Drupal 8. To learn more, you can watch the recording of my DrupalCon Nashville keynote or read the Composer Initiative issue on Drupal.org.

While Composer isn't required when using Drupal core, many Drupal site builders use it as the preferred way of assembling websites (myself included). A growing number of contributed modules also require the use of Composer, which increases the need to make Composer easier to use with Drupal.

The first step of the Composer Initiative was to develop a plan to simplify Drupal's Composer experience. Since DrupalCon Nashville, Mixologic, Mile23, Bojanz, Webflo, and other Drupal community members have worked on this plan. I was excited to see that last week, they shared their proposal.

The first phase of the proposal is focused on a series of changes in the main Drupal core repository. The directory structure will remain the same, but it will include scripts, plugins, and embedded packages that enable the bundled Drupal product to be built from the core repository using Composer. This provides users who download Drupal from Drupal.org a clear path to manage their Drupal codebase with Composer if they choose.

I'm excited about this first step because it will establish a default, official approach for using Composer with Drupal. That makes using Composer more straightforward, less confusing, and could theoretically lower the bar for evaluators and newcomers who are familiar with other PHP frameworks. Making things easier for site builders is a very important goal; web development has become a difficult task, and removing complexity from the process is crucial.

It's also worth noting that we are planning the Automatic Updates Initiative. We are exploring if an automated update system can be built on top of the Composer Initiative's work and provide an abstraction layer for those who don't want to use Composer directly. I believe that could be truly game-changing for Drupal, as it would remove a great deal of complexity.

If you're interested in learning more about the Composer plan, or if you want to provide feedback on the proposal, I recommend you check out the Composer Initiative issue and comment 37 on that issue.

Implementing this plan will be a lot of work. How fast we execute these changes depends on how many people will help. There are a number of different third-party Composer related efforts, and my hope is to see many of them redirect their efforts to make Drupal's out-of-the-box Composer effort better. If you're interested in getting involved or sponsoring this work, let me know and I'd be happy to connect you with the right people!

Agiledrop.com Blog: AGILEDROP: Introduction to Drupal Commerce

Drupal Planet - Thu, 06/14/2018 - 20:17
Drupal and Commerce. These are two words that aren’t usually associated with each other. But did you know that Drupal can become a great eCommerce solution thanks to dedicated software called Drupal Commerce? If you didn’t, then well, you are in for a treat. Let’s take a look at what Drupal Commerce is and how it can be used to create an eCommerce store using Drupal. Drupal Commerce at its core is a set of modules that enable a host of eCommerce functionalities for Drupal, which I’ll be highlighting further in the post. It was developed and is maintained by The Commerce… READ MORE

Lullabot: Not your grandparent’s Drupal, with Angie “Webchick” Byron

Drupal Planet - Thu, 06/14/2018 - 19:39
Mike and Matt talk with Angie "Webchick" Byron on what she's been up to, various Drupal initiatives, and what Drupal needs to do to succeed.

Acro Media: Web to Print with Drupal Commerce

Drupal Planet - Thu, 06/14/2018 - 11:45
Empower your customers to customize products.


There is a high likelihood that the t-shirt on your back or in your closet started life as someone’s idea uploaded to an online tool. The idea that a person could not only buy t-shirts, but design them in a tool and approve the proof before payment, seems almost commonplace. Why aren’t more people talking about this? Your customers are expecting more tailored experiences when buying decorated apparel, signage and personalized promotional products from small to medium web storefronts. Getting the “Web to Print” toolset just right on Drupal is not easy.

Here’s just a few of the expectations for ordering printed materials from the web on Drupal:

  • Drupal integration: Full integration with existing Drupal website
  • Intuitive editor experience: Drag and drop toolset, uploading of files (jpg, png, tiff, pdf, eps, ai, psd), cropping and quick fixes to pictures, lots of fonts, pop-over text formatting, white labelled branding with plenty of customizations, low resolution upload warnings, and mobile friendly web to print tool.
  • Proof and checkout workflow: Print-quality PDF proof, edit before purchase, edit after purchase, CMYK color space, super large files that need processing
Getting off the bespoke product editor island

An example of a bespoke web to print tool Acro Media built with Drupal and jQuery UI.

Like many Drupal agencies, we rarely face a problem that can’t be solved with in-house open source tools. Before we decry the problems, we should say that we are very proud of what we accomplished in the past given the budget and tools available. With jQuery UI and html-to-pdf experience, we’ve built these kinds of tools before, to varying degrees of success. Every time we tackled a project like web-to-print, the struggle became very real. With minimal hours, the tools we knew and loved created a functional experience that was hard to maintain and very error prone.

More often than not, we had trouble converting HTML to PDF reliably enough for high-resolution print quality, especially with customer-supplied imagery and layout. Offering fonts in a customized product builder is challenging to get right, especially when you’re creating a PDF that has to have the font attached. The RGB colorspace doesn’t translate easily to CMYK, the most common four-color process for printing. And all of our experience in software revolved around pixels, not these things called picas. In this crazy world, resolution could go as high as 3200 dpi on standard printers, and dimensions suddenly couldn’t be determined based on pixels.

When one of our clients, who had a tool we had built with existing technologies, asked for some (not all) of the features mentioned at the beginning of this article, we also wanted to solve all the technical challenges that we had grappled with over a year ago. As the planning stage was coming to an end, it was clear the budget wasn’t going to support such a complicated software build.

Product Customization is not the right phrase

Example screenshot of keditor in action.

We started to look for product customization tools and found nada. Then we looked for web layout tools which would maybe give Drupal a better HTML editing experience, but found a disappointing lack of online web to print solutions. We did find grapejs, innovastudio, and keditor.

But, almost universally, these javascript-based libraries were focused on content and not on editing products that would be printed. We needed something that had the goal of creating a printable image or PDF with a tight integration around the editor experience. We had nearly convinced ourselves there wasn’t a vertical for this concept; it seemed like nearly all product builders in the wild were powered by one-off conglomerations of toolsets.

Web to Print using Customer’s Canvas works with Drupal, right?

Finally, via a project manager, an industry phrase was discovered that opened the floodgates: web to print. After a bit of sifting through the sales pitches of all the technologies, almost all tools were found to be cumbersome and hard to integrate into an existing Drupal website, save one. Customer’s Canvas checked all the boxes and then some:

  • SAAS (so we don’t have to host customer’s images, or maintain the technology)
  • White label
  • More than fully featured
  • Completely customizable
  • Iframe-friendly, meaning we could seamlessly plop the product customization tool into an existing or new layout.

Example of Customers Canvas running in Drupal Commerce.

To make an even longer story short, we jumped on board with Customer’s Canvas and built the first (to our knowledge) third party web to print Drupal 7 module. We might make a Tech Talk regarding the installation and feature set of the module. Until then, here’s what you can do:

  1. Download and install the module
  2. Provide some API credentials in the form of a javascript link
  3. Turn on the Drupal Commerce integration
  4. Provide some JSON configuration for a product via a field that gets added to your choice of product types.
  5. Click on Add to Cart for a Customer’s Canvas product
  6. Get redirected to a beautiful tool
  7. Click “Finish” and get directed to a cart that can redirect you back to edit or download your product.
  8. As a store administrator, you can also edit the product from the order view page.

Drupal 8 and Web to Print and the Future

Currently, the module is built for Drupal 7. Upgrading to Drupal 8 Commerce 2 is definitely on our roadmap and should be a straightforward upgrade. Other things on the roadmap:

  • Better B2B features
    You can imagine a company that needs signs for all of its franchisee partners and would want the ability to create stores of customizable signage. With Commerce on Drupal 8, that would be pretty straightforward to build.
  • More download options
    Customer’s Canvas supports lower-res watermarked downloads for customers as well as high-res PDF downloads. Currently, the module provides the high-resolution version to all parties.
  • Better administrative interface
    If you’re using Drupal 7, the integration for this module is pretty easy, but the technical experience required for creating the JSON formatting for each product is pretty cumbersome. So it would be awesome (and very possible) to build out the most common customizations in an administration interface so you wouldn’t have to manage the JSON formatting for most situations.
  • Improve the architecture
    Possibly support Customer’s Canvas templates as referenced entities, so that you could create a dozen or so customizable experiences and then link them up to thousands of products.
  • Webform support
    The base module assumes your experience at least starts with an entity that has fields and gets rendered. We could build a webform integration that would allow the webform to have a Customer’s Canvas build step. T-shirt design content, anyone?
Integration can be a game changer

One of the big reasons we work with Drupal and Drupal Commerce is that anything with an API can be integrated. This opens the doors to allow the platform to do so much more than any other platform out there. If an integration needs to be made, we can do it. If you need an integration made, talk to us! We're happy to help.

Evolving Web: Resizing Fields in Drupal 8 Without Losing Data

Drupal Planet - Thu, 06/14/2018 - 10:00

Drupal's field system is awesome and it is one of the reasons why I started using Drupal in the first place. However, there are some small limitations in it which surface from time to time. Say, you have a Text (Plain) field named field_one_liner which is 64 characters long. You created around 30 nodes and then you realized that the field size should have been 255. Now, if you try to do this from Drupal's field management UI, you will get a message saying:

There is data for this field in the database. The field settings can no longer be changed.

So, through the field UI, the only way to resize it is to delete the existing field first! This doesn't make much sense because it's indeed possible to increase a field's size using SQL without saying goodbye to the data.

In this tutorial, we'll see how to increase the size of an existing Text (Plain) field in Drupal 8 without losing data using a hook_update_N().

Assumptions
  • You have intermediate / advanced knowledge of Drupal.
  • You know how to develop modules for Drupal.
  • You have basic knowledge of SQL.
Prerequisites

If you're going to try out the code provided in this example, make sure you have the following field on any node type:

  • Name: One-liner
  • Machine name: field_one_liner
  • Type: Text (Plain)
  • Length: 64

After you configure the field, create some nodes with some data on the One-liner field.

Note: Reducing the length of a field might result in data loss / truncation.

Implementing hook_update_N()

Reference: Custom Field Resize module on GitHub

hook_update_N() lets you run commands to update the database schema. You can create, update and delete database tables and columns using this hook after your module has been installed. To implement this hook, you need to have a custom module. For this example, I've implemented this hook in a custom module which I've named custom_field_resize. I usually prefix all my custom module names with custom_ to namespace them. In the custom module, we implement the hook in a MODULE.install file, where MODULE is the machine name of your module.

/**
 * Increase the length of "field_one_liner" to 255 characters.
 */
function custom_field_resize_update_8001() {
}

To change the field size, there are four things we will do inside this hook.

Resize the Columns

We'll run a set of queries to update the relevant database columns.

$database = \Drupal::database();
$database->query("ALTER TABLE node__field_one_liner MODIFY field_one_liner_value VARCHAR(255)");
$database->query("ALTER TABLE node_revision__field_one_liner MODIFY field_one_liner_value VARCHAR(255)");

If revisions are disabled then the node_revision__field_one_liner table won't exist. So, you can remove the second query if your entity doesn't allow revisions.
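If you would rather not track which tables exist by hand, a defensive variation (an illustrative sketch, not from the original tutorial) is to guard each ALTER statement with a schema check:

// Only alter the tables that actually exist (e.g. when revisions are disabled).
$database = \Drupal::database();
$tables = ['node__field_one_liner', 'node_revision__field_one_liner'];
foreach ($tables as $table) {
  if ($database->schema()->tableExists($table)) {
    $database->query("ALTER TABLE " . $table . " MODIFY field_one_liner_value VARCHAR(255)");
  }
}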

Update Storage Schema

Resizing the columns with a query is not sufficient. Drupal maintains a record of what database schema is currently installed. If we don't do this then Drupal will think that the database schema needs to be updated because the column lengths in the database will not match the configuration storage.

$storage_key = 'node.field_schema_data.field_one_liner';
$storage_schema = \Drupal::keyValue('entity.storage_schema.sql');
$field_schema = $storage_schema->get($storage_key);
$field_schema['node__field_one_liner']['fields']['field_one_liner_value']['length'] = 255;
$field_schema['node_revision__field_one_liner']['fields']['field_one_liner_value']['length'] = 255;
$storage_schema->set($storage_key, $field_schema);

The above code will update the key_value table to store the updated length of the field_one_liner in its configuration.

Update Field Configuration

We took care of the database schema data. However, there are other places where Drupal stores the configuration. Now, we will need to tell the Drupal config management system that the field length is 255.

// Update field configuration.
$config = \Drupal::configFactory()
  ->getEditable('field.storage.node.field_one_liner');
$config->set('settings.max_length', 255);
$config->save(TRUE);

Finally, Drupal also stores info about the actively installed configuration and schema. To refresh this, we will need to re-save the field storage configuration to make Drupal detect all our changes.

// Update field storage configuration.
// Requires: use Drupal\field\Entity\FieldStorageConfig;
FieldStorageConfig::loadByName('node', 'field_one_liner')->save();

After this, running drush updb or running update.php from the admin interface should detect your hook_update_N() and it should update your field size. If you're committing your configuration to git, you'll need to run drush config-export after running the database updates to update the config in the filesystem and then commit it.

Conclusion

Though we've talked about resizing a Text (Plain) or varchar field in this tutorial, we can resize any field type which can be safely resized using SQL. In certain rare scenarios, it might be necessary to create a temporary table with the new data structure, copy the existing data into that table with queries and, once all the data has been copied successfully, replace the existing table with the temporary table. This is the case, for example, if you want to convert a Text (Plain) field to a Text (Long) field or some other type.
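As a rough illustration of that temporary-table approach, a sketch might look like the following. This is hypothetical code using the same node__field_one_liner table and MySQL syntax, not part of this tutorial; test it on a copy of your database first.

// 1. Create a temporary table with the new data structure.
$database = \Drupal::database();
$database->query("CREATE TABLE node__field_one_liner_tmp LIKE node__field_one_liner");
$database->query("ALTER TABLE node__field_one_liner_tmp MODIFY field_one_liner_value TEXT");

// 2. Copy the existing data into the temporary table.
$database->query("INSERT INTO node__field_one_liner_tmp SELECT * FROM node__field_one_liner");

// 3. Replace the existing table with the temporary one.
$database->query("RENAME TABLE node__field_one_liner TO node__field_one_liner_old, node__field_one_liner_tmp TO node__field_one_liner");
$database->query("DROP TABLE node__field_one_liner_old");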

Maybe someday we'll have a resizing feature in Drupal where Drupal will intelligently allow us to increase a field's size from its field UI and only deny reduction of field size where there is a possibility of data loss. But in the meanwhile, we can use this handy trick to resize our fields. Thanks for reading! Please leave your comments / questions below and I'll get back to them as soon as I have time.


myDropWizard.com: CiviCRM secrets for Drupalers: Drupal 8 + CiviCRM June 2018 Update

Drupal Planet - Thu, 06/14/2018 - 02:15

We're Drupalers who only recently started digging deep into CiviCRM and we're finding some really cool things! This series of videos is meant to share those secrets with other Drupalers, in case they come across a project that could use them. :-)

In the screencast below, I'll walk you through the new demo of Roundearth! Roundearth is our Drupal 8 + CiviCRM Distribution.

Watch the screencast to see the progress so far on the Roundearth project:

Video of Roundearth June 2018 Update

Some highlights from the video:

  • Drupal 8.5
  • CiviCRM + Bootstrap based Shoreditch theme
  • Quick demo of adding Contacts, using a Group, and sending a Bulk Mailing
  • Quick demo of a Public Event

Please leave a comment below!

PreviousNext: Patch Drupal core without things ending up in core/core or core/b

Drupal Planet - Thu, 06/14/2018 - 02:08

If you've ever patched Drupal core with Composer you may have noticed patched files can sometimes end up in the wrong place like core/core or core/b. Thankfully there's a quick fix to ensure the files make it into the correct location.

by Saul Willers / 14 June 2018

When using cweagans/composer-patches, it's easy to include patches in your composer.json:

"patches": { "drupal/core": { "Introduce a batch builder class to make the batch API easier to use": "https://www.drupal.org/files/issues/2018-03-21/2401797-111.patch" } }

However, in certain situations patches will get applied incorrectly. This can happen when the patch is only adding new files (not altering existing files), like the patch above. The result is that the patched files end up in a subfolder core/core. If the patch is adding new files and editing existing files, the new files will end up in core/b. This is because composer-patches cycles through the -p levels trying to apply them: 1, 0, 2, then 4.

Thankfully there is an easy fix!

"extra": { ... "patchLevel": { "drupal/core": "-p2" } }

Setting the patch level to p2 ensures any patch for core will get applied correctly.

Note that until composer-patches has a 1.6.5 release, specifically this PR, you'll need to use the dev release like:

"require": { ... "cweagans/composer-patches": "1.x-dev" }

The 2.x branch of composer-patches also includes this feature.

Big thanks to cweagans for this great tool and jhedstrom for helping to get this into the 1.x branch.

Tagged Drupal Development, Composer

Acro Media: Drupal Point of Sale 8 Released!

Drupal Planet - Wed, 06/13/2018 - 21:28
Official 8.0 Version Now Available


The Drupal Point of Sale provides a point of sale (POS) interface for Drupal Commerce, allowing in-person transactions via cash or card, returns, multiple registers and locations, and EOD reporting. It’s completely integrated with Drupal Commerce and uses the same products, customers, and orders between both systems. You can now bring your Drupal 8 online store and your physical store locations onto the same platform; maintaining a single data point.

The Drupal 7 version has been in the wild for a while now, but today marks the official, production ready release for Drupal 8.

Release Highlights

What features make up the new version of Drupal Point of Sale 8? There are so many that it will probably surprise you!

Omnichannel

Omnichannel is not just a buzzword, but a word that describes handling your online and offline stores with one platform, connecting your sales, stock and fulfillment centers in one digital location. Drupal Commerce has multi-store capabilities out of the box that allow you to create unique stores and share whatever product inventory, stock, promotions, and more between them. Drupal Point of Sale gives you the final tool you need to handle in-person transactions in a physical storefront location, all using your single Drupal Commerce platform. That’s pretty powerful stuff. Watch these videos (here and here) to learn more about how Drupal Commerce is true omnichannel.

Registers

Set up new registers with ease. Whether you have 1 or 1000 store locations, each store can have as many registers as you want. Because Drupal Point of Sale is a web-based solution, all you need to use a register is a web browser. A touch screen all-in-one computer, a laptop, an iPad; if it has a web browser, it can be your register. The Point of Sale is also fully open source, so there are no licensing fees and costs do not add up as you add more registers.

Customer Display


While a cashier is ringing through products, the Customer Display uses WebSocket technology to display the product, price, and current totals on a screen in real-time so the customer can follow along from the other side of the counter. Your customers can instantly verify everything you’re adding to the cart. All you need for the Customer Display is a web browser, so you can use an iPad, a TV or second monitor to display the information in real-time as the transaction progresses.

Barcode Scanning

Camera based barcode scanning
Don’t have a barcode scanner? No problem. With this release, any browser connected camera can be used to scan barcodes. Use a webcam, use your phone, use an iPad, whatever! If it has a camera, it works. This is helpful when you’re at an event or working a tradeshow and you don’t want to bring your hardware along.


Traditional barcode scanning
A traditional barcode scanner works too. Simply use the barcode scanner to scan the physical product’s barcode. The matching UPC code attached to one of your Drupal Commerce product variations will instantly add the product to your cashier’s display.

Labels

Generate and print labels complete with barcodes, directly from your Drupal Point of Sale interface. Labels are template based and can be easily customized to match any printer or label size so you can prep inventory or re-label goods as needed.

Receipts

Easily customize the header and footer of your receipts using the built in editor. Add your logo and contact information, return/exchange policy, special messaging or promotions, etc.

When issuing receipts, you can choose to print the receipt in a traditional fashion or go paperless and email it to your customer. You can do either, both, or none… whatever you want.

Returns

Whether online or in store, all of your orders are captured in Drupal Commerce and so can be returned, with or without the original receipt. A return can be an entire order or an individual product.

End of Day (EOD) Reports

When closing a register, your cashiers can declare their totals for the day. You can quickly see if you’re over or short. When finished, an ongoing daily report is collected that you can look back on. On top of this, Drupal Point of Sale is integrated with the core Drupal Commerce Reporting suite.

Hardware

Use Drupal POS 8 with anything that supports a browser and has an internet connection.

Technical Highlights

Adding to all of the user highlights above are a number of important technical improvements. It’s the underlying architecture that really makes Drupal Point of Sale shine.

Themable

Cashiers log in to Drupal Point of Sale via a designed login page. Once logged in, the theme used is the default Drupal admin theme. However, like any other part of Drupal, your admin theme can be modified as much as you like. Keep it default or customize it to your brand; it’s yours to do with as you please.

Search API Enabled

The Search API is a powerful search engine that lets you customize exactly what information is searchable. Using the Search API, your cashiers are sure to quickly find any product in your inventory by searching for a product’s title, SKU, UPC code (via barcode scanner), description, etc. Search API is completely customizable, so any additional unique search requirements can be easily added (brand, color, weight, etc.). The Search API references the products on your site, and at any other store or multi-warehouse location, allowing you to serve customers in real time.

Fully Integrated with Drupal Commerce

The Drupal Point of Sale module seamlessly integrates into the existing Drupal Commerce systems and architecture. It shares products, stock, customers, orders, promotions and more. This makes Drupal Point of Sale plug-and-play while also making sure that the code base is maintainable and can take advantage of future Drupal Commerce features and improvements.

Permissions and Roles

When Drupal Point of Sale is installed, a “cashier” user role is created that limits the access users of this type have with your Drupal Commerce backend. Use Drupal’s fine grained permissions and roles system to manage your cashiers and give different permissions to employees, managers, marketers, owners, IT, etc. Any way you want it.

Custom Hardware

As mentioned above, all you need to use Drupal POS 8 is anything that supports a browser and has an internet connection. This opens the door for all kinds of custom Point of Sale hardware such as branded terminals, self-serve kiosks, tradeshow-ready hardware, and more.

We’ve been having fun prototyping various Raspberry Pi based POS hardware solutions. You can see some of them here and stay tuned for more. Drupal Point of Sale is open source, so why not open up the hardware too?

Drupal Point of Sale 8, Ready for your Drupal Commerce platform

We’re excited to finally release the production-ready version of Drupal Point of Sale 8.0. There are many ecommerce-only platforms out there, but almost none of them can ALSO run in your physical store. This is a BIG DEAL. Drupal Point of Sale gives you the last piece needed to run your entire store using Drupal Commerce, allowing for centralized data and a single system for your team to learn and manage.

One admin login, one inventory list, one user list, one marketing platform, ONE. True omnichannel, without the fees.

Next Step

Commerce Kickstart
Starting a Drupal Commerce project from scratch? Use Commerce Kickstart to configure your install package (including Drupal Point of Sale).

Install with Composer
Already using Commerce for Drupal 8? Install Drupal Point of Sale with Composer.

$ composer require drupal/commerce_pos

Let Acro Media help
Acro Media is North America’s #1 Drupal Commerce provider. We build enterprise commerce using open source solutions. Unsure if Drupal Commerce and Drupal Point of Sale meet your business requirements? A teammate here at Acro Media would be happy to walk you through a replatforming evaluation exercise and provide you with the Point of Sale workbook to help you make your decision.

More from Acro Media

Lullabot: A Hierarchy of Software Quality

Drupal Planet - Wed, 06/13/2018 - 17:12

Our sales team often refers to our Hierarchy of Qualification when evaluating projects. This pyramid, inspired by Maslow’s hierarchy of needs, gives us the tools not just to evaluate the business needs of a project, but the human needs that are “encoded” in the project and team.

I’ve been thinking about how this applies to software development, particularly in the enterprise space. Enterprise implies that the software is maintained by large teams over long periods of time. Most developers have encountered internal enterprise software that leaves us shaking our heads, asking “how was this ever released?” Alternatively, we’ve used an internal tool that is quite good, but where the business has trouble repeating that success with new projects.

If we can describe the path towards self-actualizing software quality, we can elevate our teams from solving a series of one-off problems towards building value for our businesses and ourselves. It’s important to acknowledge that these steps are additive, meaning a team may benefit by mastering the lower rungs of the pyramid before moving on to the next.

undefined Describing software and how humans will use it

This is the base of the pyramid and undergirds everything else. Before writing a line of code, a team needs to have a good handle on who the audience is and how the software will affect them, along with the overall goals of the new project. Often, enterprise teams are given work to do by their customers or their management with the explanation: “a big internal department has told us this is a blocker, so don’t ask why and get to coding, so we aren’t late.”

This leaves project managers and developers in the dark, and unable to make reasonable guesses when requirements and technology conflict. Knowing the audience lets teams prioritize their work when time is short. Is the audience a group of editorial users? What’s their day-to-day workflow like? In that case, time may be best spent on testing and iterating the user interface, at the expense of implementing every last feature request. Is the audience another group of developers? Then perhaps the project doesn’t require a user interface initially, so developers can focus on great APIs and documentation. Not knowing the audience may lead to otherwise avoidable mistakes.

Like audience, project or product goals are another important piece of the puzzle. When goals are fuzzy, it is hard to know when a software project is done, or at least at a “1.0” release. Teams tend to leave projects at a nebulous “0.x” version, making future developers uncertain about the quality or robustness of a system. For example, perhaps a client asks to implement Apple News and Facebook Instant Articles on their website. It’s a reasonable request. Nevertheless, prematurely ending requirements gathering at “implement the feature” deprives your team of critical information.

There’s a business reason behind the request. Is the analytics team seeing declining traffic on their website, and worried the website audience is defecting to social networks? It may be good to suggest working on Facebook integration first, assuming you have some analytics data from existing Facebook posts to back it up. Or, perhaps the sales team is hearing from advertisers that they want to purchase ads against your content inside of the Apple News app. In that case, finding out some rough estimates for the additional ad revenue the sales team expects can help with qualifying estimates for the integration effort. If the estimated development budget eclipses the expected revenues, the team can investigate different implementation methods to bring costs down, or even rework the implementation as a one-off proof of concept.

Using audiences and goals to guide your development team makes it possible to discover opportunities beyond the immediate code, elevating the team from an “IT expense” to a “technical partner.”

Technical architecture and documentation

Writing the right software for today is one thing. Writing maintainable software that future developers can easily iterate on is another. Spaghetti architecture can leave you with software that works for now but is very expensive to maintain. For Drupal sites, that means avoiding god-objects (and services), writing true object-oriented code (and not procedural code that happens to live in classes), and avoiding side effects such as global and public variables. It’s important to document (preferably in code) not just what you built but why you built it, and how the architecture works to solve the original business problems. Think of it as writing a letter to your future self, or to whatever team inherits your code.

Determining the effort to put into this work is tricky, and time pressures can lead to quickly written, but poorly architected software. It’s useful to think about the expected life of the application. Look at similar applications in the company and ask about their initial and current development. For example, in our CMS projects, we often find we are replacing systems that are over a decade old. The code has gone through many teams of developers but is in a decrepit state as the architecture doesn’t follow common design patterns and the documentation is non-existent. No one wants to touch the existing application or infrastructure because experience has shown that may lead to a multi-day outage.

In cases like this, we know that following good design patterns and effectively documenting code will pay dividends later. A marketing site for an event in 3 months? You can probably take a lot of shortcuts without risk. Treat the project for what it is—a startup within an enterprise.

A culture of iterative feedback

Defining the software to write and following good architecture in its execution is important, but we can do more. Now that our efforts are well documented, it’s time to create a proper harness for testing. Without one, it’s impossible to know if your team is meeting your quality goals. It’s easy for teams to fall into a fragile, untrustworthy process for testing. For sites, that typically means not defining and following deployment best practices, or creating a QA bottleneck through infrastructure.

The typical git model of development implies not just a few branches that are merge destinations (like master and develop) but many feature branches, too. It’s not uncommon for developers to create multiple branches breaking a single ticket into smaller code components that can be independently reviewed and tested.

For developers, “waiting for a QA build” is one of the biggest motivation killers they face. For QA, trying to coordinate testing of various features onto a single QA server is complex and leaves them questioning the validity of their test plans.

Just as “smaller, chunkier” pull requests are a best practice for developers, a similar guideline helps QA analysts feel confident when certifying a feature or release.

Tugboat is one tool that lets teams implement automatic, lightweight, and disposable testing environments. When a developer opens a pull request, a complete working copy of the website is built and available for anyone with access to poke at. It’s easy to rebuild new testing environments because the setup is automated and repeatable—not manual. A process like this extends the effective QA team by including both developers (who can reproduce and take ownership of bugs “live” with QA, before code is merged), and the actual patrons of the build—project managers and business stakeholders—who will eventually need to sign off on the product.

These tools also change the culture of communication between stakeholders and implementation teams. Even with two-week sprints, there still can be an information gap stakeholders face between sprint planning and the sprint demo. Instead, getting their feedback is as frictionless as sending a link. There are fewer surprises at demos because stakeholders have already seen the constituent parts of the work that compose the whole demo.

Automated tests and quality metrics

Frictionless QA is great, but it’s still a manual process. It’s incredibly frustrating for an entire team to be going through manual QA for regression testing. Sometimes, regression tests uncover the sort of bugs that mean “rewrite the feature from scratch,” leading to failed sprints and missed demos. That’s especially likely when regression testing occurs at the end of a sprint, leaving precious little time for rework and another round of QA.

In contrast, developers love when automated tests alert them to issues. Not only does it save developers and QA time (they can work on a new ticket while waiting for a test suite to run), but it reduces the cognitive load of developing a new feature in a complex application. Instead of developers having to become the application experts, and maintain that knowledge in perpetuity, the automated tests themselves become the legal description of how the code should work.

Comprehensive test coverage, regardless of the actual tool used (like PHPUnit or Behat), also helps ensure that software quality remains constant as underlying dependencies are upgraded. For a Drupal site, teams should expect at least two significant upgrades of Drupal per year, along with an update to PHP itself. Automated testing means the initial work is as simple as “upgrade Drupal, run tests, and see what breaks,” instead of throwing a bunch of time and money at manual testing. It also means that testing of security patches is much faster, saving time between vulnerability disclosure and deployment.
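As a trivial, self-contained illustration (hypothetical code, not from this article), even a tiny PHPUnit test pins down behaviour that would otherwise need manual regression testing after every core or PHP upgrade; the SlugGenerator class here is invented purely to keep the example runnable:

<?php

use PHPUnit\Framework\TestCase;

// Hypothetical class under test, included only to make the example self-contained.
class SlugGenerator {

  public function fromTitle(string $title): string {
    // Lowercase, replace non-alphanumeric runs with hyphens, trim stray hyphens.
    $slug = strtolower($title);
    $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);
    return trim($slug, '-');
  }

}

class SlugGeneratorTest extends TestCase {

  public function testGeneratesLowercaseHyphenatedSlugs(): void {
    $generator = new SlugGenerator();
    $this->assertSame('hello-world', $generator->fromTitle('Hello World!'));
  }

}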

Of course, there’s a cost to automated testing. Tests are still code that needs to be maintained. Tests can be buggy themselves (like that time I accidentally tested for the current year in a string, which broke when 2018 rolled around). Someone has to pay for a continuous integration service or maintain custom infrastructure. Reducing these costs is a particular interest of ours, leading to templates like Drupal 8 CI and the Drupal Testing Container.

Again, knowing the expected life of the software you write helps to inform how deep into testing to go. I don’t think I’ve ever written automated tests for a microsite but for a multi-step payment form expected to endure for 5 years, they were critical.

Technical architecture, for all of its focus on “clean code,” can rapidly devolve into unresolvable conflicts within a team. After all, everyone has their idea of what “good design” looks like. While automated tools can’t tell you (yet) if you’re using the right design pattern to solve a given problem, they can tell you if your app is getting better or worse over time.

Start with enforcing basic code standards within your team. Code standards help team members to read code written by others, which is “step 0” in evaluating if a given codebase meets quality goals or not.

For example:

As a technical      architect, you       want reading the          code to feel as natural as             reading written text — so you            can focus on the meaning, and not       the individual words.

Think about how much harder it is to focus on the idea I’m trying to communicate. Even though you can understand it, you have to slow down and focus. Writing code without code standards sacrifices future comprehension at the altar of expediency. I’ve seen teams miss critical application bugs even though they were hiding in plain sight due to poor code formatting.

Luckily, while different projects may have different standards, they tend to be internally consistent. I compare it to reading different books with different fonts and layouts. You may have to context switch between them, but you would rarely have the paragraphs of the books interspersed with each other.

While code standards tend to either be “right” or “wrong,” code quality metrics are much more of an art than a science—at least, as far as interpreting and applying them to a project goes. I like to use PhpMetrics as it has a very nice user interface (and works perfectly within continuous integration tools). These reports inherently involve some subjective measure of what a failure is. Is the cyclomatic complexity of a method a fail at 15, or 50, or not at all? Sometimes, with difficult codebases, the goals come down to “not making things any worse.” Whatever your team decides, using these tools daily will help ensure that your team delivers the best product it can.

Continuous delivery

As a team addresses each new layer of the hierarchy, the lower layers become habits that don’t require day-to-day focus. The team can start to focus on solving business problems quickly, becoming an IT partner instead of an IT expense. Each person can fully participate in the business instead of the narrow field in front of them.

With all of these prerequisites in place, you will start to see a shift in how your team approaches application development. Instead of a conservative and fragile approach to development, your team will start to design and develop applications without fear. Many teams find themselves rotating between the bottom two levels of the pyramid. With careful planning and hard work, teams can work towards the upper tiers. The whole software delivery process—from ideas to releases to hotfixes to security releases—becomes a habit rather than a scramble every time.

Hero Image Photo by Ricardo Gomez Angel on Unsplash

Appnovation Technologies: Blazing Fast Drupal Workflow

Drupal Planet - Wed, 06/13/2018 - 15:34
Blazing Fast Drupal Workflow Why we need to improve our tools We are living exciting times as web developers. Nowadays, many great tools are available to develop and deploy Drupal websites. However, it is still hard to find your own way through the countless software and platforms available out there. Productivity depends on your tools and your workflow. Carefully picking softwa...

Electric Citizen: Electric Citizen Presents...

Drupal Planet - Wed, 06/13/2018 - 12:20

Twin Cities Drupal Camp 2018 just happened, and Electric Citizen was able to present no fewer than four sessions on subjects ranging from Drupal search and configuration management to local development environments and dealing with emerging tech.

Just some of the folks who showed up for camp this year
