Drupal Planet

Subscribe to the Drupal Planet news feed
Drupal.org - aggregated feeds in category Planet Drupal
Updated: 59 min 16 sec ago

Valuebound: Continuous integration using Jenkins and GitHub to automate deployment

Fri, 10/27/2017 - 09:35
Continuous Integration (CI) is rapidly becoming an integral part of the software development process, as it makes our monotonous and repetitive tasks a little less grindy. CI is a development practice in which developers integrate code into a shared repository frequently. Each integration is then verified by an automated build that allows the team to detect problems at an early stage.

This post will walk you through continuous integration with Jenkins and GitHub. We will install Jenkins, create a Jenkins job, and then configure it with GitHub.

Let’s get our arms around why Jenkins is so popular and why this is not just a hot topic, but an important best practice for developers…

InternetDevels: Supercharge your Drupal 8 website with special powers of Views

Fri, 10/27/2017 - 09:25

The Views module lets even non-developers organize and present the website's content in the desired ways. Website administrators love it, Drupal newbies start with it, and Drupal ninjas perform miracles with it. No wonder it used to be the most downloaded contributed module. Well, it still is — in Drupal 7.


Colan Schwartz: Representing Drupal at the GSoC 2017 Mentor Summit

Thu, 10/26/2017 - 23:25
I've been mentoring students as part of Drupal's Google Summer of Code (GSoC) program for the last two years, where we guide students in working on Drupal projects over the summer. (For the projects I've been involved in, see User-friendly encryption now in Drupal 8! and Client-side encryption options now available in Drupal.) This year, our organization administrator, Matthew Lechleider, invited me to the Mentor Summit.

The Google-provided summit creates a forum for members of free/libre and open-source software (FLOSS) organizations to come together to discuss GSoC, mentoring, and FLOSS in an unconference format. I met attendees from all over the world, who flew in from far-reaching places to interact as part of a wider community. Generally, two mentors are invited from each organization, but some had more and some had fewer.

I arrived late Friday night, having missed that day's introductory sessions due to some trouble at the US border. Historically, in my experience, we Canadians haven't had too much trouble getting across the border for technology conferences. This has recently changed so it's now necessary to provide proof of intent for being in the country (a signed invitation from the organizers) as well as proof of business activities (corporate and tax documents). Needless to say, all of this took a significant amount of time to prepare. Eventually though, I was allowed through and made my way to Sunnyvale, California.

On Saturday morning, the day started with Lightning Talks, where attendees gave presentations on their student projects having only a few minutes each to speak. There were so many presentations that it was necessary to split the session into two, continuing after dinner that same evening. While there were several interesting projects highlighted, the most interesting to me was Jitsi's speech-to-text service. Besides making video conferences accessible through textual media, it also allows for automated note-taking. This was one of the truly amazing projects completed by a student over the summer.

In talking about Drupal with other folks, I was surprised to hear that many other delegates do not have paying day jobs associated with their organizations. They work on these projects on the side, and generally don't get paid for them. For example, nobody in the Kodi contributor community gets paid; it's all volunteer work. While there are volunteer contributions to Drupal, many of those contributors eventually turn that knowledge into paid work. I suppose we're a lucky bunch, being able to work on an open-source project and get paid for it. And speaking of Kodi, I'm happy to report that they're using Drupal for their Web site!

There were quite a few conversations about messaging applications, with a large XMPP delegation. There were also folks from the Zulip and Rocket.Chat communities. It was interesting to hear from a former XMPP developer who's shifted completely to Matrix with the Riot client, exactly as I've done. I use that client and the federated protocol to bridge with other communications networks such as proprietary closed-source Slack and classic Internet Relay Chat (IRC) whenever possible. Matrix already integrates with these two protocols, and has built-in support. The goal is to eventually use only one messaging client, instead of the many applications we all have installed on all of our devices. Rocket.Chat has already started working on Matrix integration, while Zulip hasn't. They're open to it, and may move in this direction eventually, but for now they're focused on user-experience innovations. In the Drupal community, we've had a very long discussion about using Matrix for our communications alongside IRC, and have finally put a plan into place to make this happen. For those eager to jump in, it's now possible to use Matrix as an always-on IRC bouncer client to connect to Drupal's IRC channels for communication within the community. To sum up, it looks as though everything is moving towards Matrix.

Alongside Drupal, representatives from other content management systems (CMSes) also attended. There were folks from both the Joomla and Plone communities. It would have been great to connect with them, but I didn't get a chance. I was hoping that Airship would have representation as that crew has been doing a lot of excellent security work with PHP projects (including helping us).

All in all, it was an excellent conference. In my humble opinion, it's really important to stay in touch with this greater community, cross-pollinate with folks doing similar work in the public interest, and keep contributing!

This article, Representing Drupal at the GSoC 2017 Mentor Summit, appeared first on the Colan Schwartz Consulting Services blog.

PreviousNext: Sending Drupal entities to dialogflow with Chatbot API module

Thu, 10/26/2017 - 20:16

Services like dialogflow (formerly api.ai) do a much better job of natural language parsing (NLP) if they're aware of your entity names in advance.

For example, it can recognize that "show me the weather in Bundaberg" is a request for weather in Bundaberg, if you've told it ahead of time that Bundaberg is a valid value for the City entity.

Having the entity values automatically update in your service of choice when they're created and changed in Drupal makes this much more efficient.

This article will show you how to achieve that.

by Lee Rowlands / 27 October 2017

This is where the chatbot_api_entities sub-module comes in.

When you enable this module you can browse to Admin -> Config -> Web Services -> Entity Collections to create a collection.

The UI looks something like this:

Adding an entity collection to send to dialogflow in Drupal

Each collection comprises an entity-type and bundle as well as a push handler and a query handler.

By default Chatbot API Entities comes with a query handler for each entity-type and a specific one for Users to exclude blocked users.

The api_ai_webhook module comes with a push handler for pushing entities to your dialogflow/api.ai account.

By default, these plugins query based on available entities and the push handler pushes the entity labels.

Writing your own query handler

If, for example, you don't want to extract entities from entity labels (you might wish to collect unique values from a particular field instead), you can write your own query handler.

Here's an example that will query speaker names from a session content type. The collection handed to the push handler will contain all published sessions.

<?php

namespace Drupal\your_module\Plugin\ChatbotApiEntities\QueryHandler;

use Drupal\chatbot_api_entities\Entity\EntityCollectionInterface;
use Drupal\chatbot_api_entities\Plugin\QueryHandlerBase;
use Drupal\Core\Entity\EntityTypeManagerInterface;

/**
 * Defines a query handler that just uses entity query to limit as appropriate.
 *
 * @QueryHandler(
 *   id = "speakers",
 *   label = @Translation("Query speakers from sessions"),
 * )
 */
class SpeakerQuery extends QueryHandlerBase {

  /**
   * {@inheritdoc}
   */
  public function query(EntityTypeManagerInterface $entityTypeManager, array $existing = [], EntityCollectionInterface $collection) {
    $storage = $entityTypeManager->getStorage('node');
    return $storage->loadMultiple($storage->getQuery()
      ->condition('type', 'session')
      ->exists('field_speaker_name')
      ->condition('status', 1)
      ->execute());
  }

  /**
   * {@inheritdoc}
   */
  public function applies($entity_type_id) {
    return $entity_type_id === 'node';
  }

}

Writing your own push handler

Whilst we've written our own query handler to load entities that we wish to extract values from, we need to write our own push handler to handle sending anything other than the label.

Here's an example push handler that will push field values as entities to Api.ai/dialogflow.

<?php

namespace Drupal\your_module\Plugin\ChatbotApiEntities\PushHandler;

use Drupal\api_ai_webhook\Plugin\ChatbotApiEntities\PushHandler\ApiAiPushHandler;
use Drupal\chatbot_api_entities\Entity\EntityCollection;
use Drupal\Core\Entity\EntityInterface;

/**
 * Defines a handler for pushing entities to api.ai.
 *
 * @PushHandler(
 *   id = "api_ai_webhook_speakers",
 *   label = @Translation("API AI entities endpoint (speakers)")
 * )
 */
class SpeakerPush extends ApiAiPushHandler {

  /**
   * {@inheritdoc}
   */
  protected function formatEntries(array $entities, EntityCollection $entityCollection) {
    // Format for API.ai/dialogflow.
    return array_map(function ($item) {
      return [
        'value' => $item,
        'synonyms' => [],
      ];
    },
    // Key by name to remove duplicates.
    array_reduce($entities, function (array $carry, EntityInterface $entity) {
      $value = $entity->field_speaker_name->value;
      $carry[$value] = $value;
      return $carry;
    }, []));
  }

}

Learn more

If you're interested in learning more about Chatbots and conversational UI with Drupal, I'm presenting a session on these topics at Drupal South 2017, the Southern Hemisphere's biggest Drupal Camp. October 31st is the deadline for getting your tickets at standard prices, so if you plan to attend, be sure to get yours this week to avoid the price hike.

I hope to see you there.

Tagged AI, Natural Language Parsing, Chatbot, Drupal 8

Posted by Lee Rowlands
Senior Drupal Developer

Dated 27 October 2017


Dries Buytaert: Shopping with augmented reality

Thu, 10/26/2017 - 15:34

Last spring, Acquia Labs built a chatbot prototype that helps customers choose recipes and plan shopping lists with dietary restrictions and preferences in mind. The ability to interact with a chatbot assistant rather than having to research and plan everything on your own can make grocery shopping much easier. We wanted to take this a step further and explore how augmented reality could also improve the shopping experience.


The demo video above shows how a shopper named Alex can interact with an augmented reality application to remove friction from her shopping experience at Freshland Market (a fictional grocery store). The Freshland Market mobile application not only guides Alex through her shopping list but also helps her make more informed shopping decisions through augmented reality overlays. It superimposes useful information, such as price, user ratings, and recommended recipes, over shopping items detected by a smartphone camera. The application can personalize Alex's shopping experience by highlighting products that fit her dietary restrictions or preferences.

What is exciting about this demo is that the Acquia Labs team built the Freshland Market application with Drupal 8 and augmented reality technology that is commercially available today.

The first step in developing the application was to use an augmented reality library, Vuforia, which identifies pre-configured targets. In our demo, these targets are images of product labels, such as the tomato sauce and cereal labels shown in the video. Each target is given a unique ID. This ID is used to query the Freshland Market Drupal site for content related to that target.

The Freshland Market site stores all of the product information in Drupal, including price, dietary concerns, and reviews. Thanks to Drupal's web services support and the JSON API module, Drupal 8 can serve content to the Freshland Market application. This means that if the Drupal content for Rosemary & Olive Oil chips is edited to mark the item on sale, this will automatically be reflected in the content superimposed through the mobile application.
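To make this concrete, here is a rough sketch of the kind of JSON API request a consumer application could issue to pull product content from the Drupal back-end. The base URL, the "product" content type, and the "field_target_id" field are assumptions for illustration, not details from the actual demo.

<?php
// Hypothetical consumer-side lookup: fetch the product node matching a
// recognized AR target, using any HTTP client (Guzzle shown here).
require 'vendor/autoload.php';

use GuzzleHttp\Client;

$client = new Client(['base_uri' => 'https://freshland.example.com']);
$response = $client->get('/jsonapi/node/product', [
  'query' => ['filter[field_target_id]' => 'target-42'],
  'headers' => ['Accept' => 'application/vnd.api+json'],
]);
$result = json_decode((string) $response->getBody(), TRUE);
// Attributes such as price, dietary flags, and review scores would then be
// available under $result['data'][0]['attributes'].

Because the application reads this data live from Drupal, an edit such as marking an item on sale shows up in the overlay without touching the mobile app.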

In addition to information on price and nutrition, the Freshland Market site also stores the location of each product. This makes it possible to guide a shopper to the product's location in the store, evolving the shopping list into a shopping route. This makes finding grocery items easy.

Augmented reality is building momentum because it moves beyond the limits of a traditional user interface, or in our case, the traditional website. It superimposes a digital layer onto a user's actual world. This technology is still emerging, and is not as established as virtual assistants and wearables, but it continues to gain traction. In 2016, the augmented reality market was valued at $2.39 billion and it is expected to reach $61.39 billion by 2023.

What is exciting is that these new technology trends require content management solutions. In the featured demo, there is a large volume of product data and content that needs to be managed in order to serve the augmented reality capabilities of the Freshland Market mobile application. The Drupal community's emphasis on making Drupal API-first in addition to supporting distributions like Reservoir means that Drupal 8 is prepared to support emerging channels.

If you are ready to start reimagining how your organization interacts with its users, or how to take advantage of new technology trends, Acquia Labs is here to help.

Special thanks to Chris Hamper and Preston So for building the Freshland Market augmented reality application, and thank you to Ash Heath and Drew Robertson for producing the demo video.

Acro Media: Video: Reporting in Drupal Commerce 2.x is Going to be Great!

Thu, 10/26/2017 - 13:08


The good news is that Commerce 2.x has the potential to handle tons of different reports and display the data any way you want. The dashboard is complete and the framework is impressive. The catch is that many of the reports don’t technically exist yet, so you need to do a little configuring to make sure you’re looking at the data that’s most important to you.

What kind of reports are we talking about?

You could have a whole suite of point-of-sale reports, for instance (in Commerce 1, they were their own set of reports; in Commerce 2, they just build on Commerce reporting). If you need reports for checkout, or cart, or analytics, you can have them all in the Commerce reporting suite, even if they are vastly different types of reports. So you can have reports for different people who manage different metrics, but you can build them all using the same framework.

Appnovation Technologies: Meet the Appnovation Fall 2017 Co-ops

Thu, 10/26/2017 - 05:00
Get to know Appnovation's Fall 2017 cohort of post-secondary co-op students. This September 2017 has been both busy and exciting here at Appnovation! We've relocated to a brand new office in the Railtown area of Vancouver, BC, we've hopped into a brand new fiscal year, and we've hired a super cool group of co-op students to help break in the n...

PreviousNext: Lightning talk: Database Deadlocks & Render caching - A case study

Thu, 10/26/2017 - 00:46

In this week's Lightning talk, I go through a case study on an investigation into Deadlocks and Render caching and why cache contexts are so important to get right. Check out the video below to find out how we were able to withstand 10x the throughput with smarter caching.
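For readers unfamiliar with cache contexts: in Drupal 8, a render array declares the contexts its output varies by, and getting that list wrong either over-fragments the render cache or serves the wrong cached copy. Here is a minimal illustrative sketch; the function name, markup, and the particular contexts chosen are assumptions, not taken from the case study.

<?php
// Illustrative only: a render array whose cached output varies per role and
// per query string, with a one-hour maximum age.
function mymodule_build_greeting() {
  return [
    '#markup' => t('Hello!'),
    '#cache' => [
      'contexts' => ['user.roles', 'url.query_args'],
      'max-age' => 3600,
    ],
  ];
}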

by Adam Bramley / 26 October 2017

Tagged Cache Contexts, Drupal 8

Posted by Adam Bramley
Senior Drupal Developer

Dated 26 October 2017


Agiledrop.com Blog: AGILEDROP: History of the Druplicon, the famous Drupal symbol

Wed, 10/25/2017 - 23:54
Does everybody know the story of how Drupal was created? It's quite interesting. Dries Buytaert and Hans Snijder were students at the University of Antwerp back in 2000. Back then, a broadband internet connection was a luxury, so Hans and Dries set up a wireless bridge among the student dorms to share Hans's ADSL modem connection among eight students. Dries made an online forum where they shared news, like where they were meeting, having dinner, etc. This software was nameless for a while. Then Dries graduated and left the dorm. They wanted to stay in touch, so the internal site had to go…

Lullabot: Decoupled Drupal Hard Problems: Image Styles

Wed, 10/25/2017 - 20:52

As part of the API-First Drupal initiative, and the Contenta CMS community effort, we have come up with a solution for using Drupal image styles in a decoupled setup. Here is an overview of the problems we sought to solve:

  • Image styles are tied to the designs of the consumer, and therefore belong to the front-end. However, there are technical limitations in the front-end that make it impossible to handle them there.
  • Our HTTP API serves an unknown number of consumers, but we don't want to expose all image styles to all consumers for all images. Therefore, consumers need to declare their needs when making API requests.
  • The Consumers and Consumer Image Styles modules can solve these issues, but they require some configuration from the consumer development team.
Image Styles Are Great

Drupal developers are used to the concept of image styles (aka image derivatives, image cache, resized images, etc.). We use them all the time because they are a way to optimize performance on our Drupal-rendered web pages. At the theme layer, the render system detects the configured image size and crops the image appropriately if the design requires it. We can do this because the back-end is informed of how the image is presented.

In addition to this, Drupal adds a token to the image style URLs. With that token, the Drupal server is saying "I know your design needs this image style, so I approve the use of it." This is needed to prevent a malicious user from filling up our disk by manually requesting all the combinations of images and image styles. With this protection, only the combinations that are in our designs will be possible, because Drupal is giving its seal of approval. This is transparent to us, so our server is protected without our even realizing this was a risk.
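For reference, generating a derivative URL through the image style API is what appends that token. A minimal sketch, assuming a style named 'thumbnail' and an example file URI (both purely illustrative):

<?php
use Drupal\image\Entity\ImageStyle;

// Assumed style name and file URI, for illustration only.
$uri = 'public://photos/example.jpg';
$style = ImageStyle::load('thumbnail');
// buildUrl() returns the derivative URL including the itok token that marks
// this image/style combination as approved by the back-end.
$url = $style->buildUrl($uri);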

The monolithic architecture allows us to have the back-end informed about the design. We can take advantage of that situation to provide advanced features.

The Problem

In a decoupled application your back-end service and your front-end consumer are separated. Your back-end serves your content, and your front-end consumer displays and modifies it. Back-end and front-end live in different stacks and are independent of each other. In fact, you may be running a back-end that exposes a public API without knowing which consumers are using that content or how they are using it.

In this situation, we can see how our back-end doesn't know anything about the front-end(s) design(s). Therefore we cannot take advantage of the situation like we could in the monolithic solution.

The most intuitive solution would be to output all the image styles available when requesting images via JSON API (or core REST). This will only work if we have a small set of consumers of our API and we know their designs. Imagine that our API serves three, and only three, consumers: A, B, and C. If we did that, then when requesting an image from consumer A we would output all the variations of all the image styles for all the consumers. If each consumer has 10 to 15 image styles, that means 30 to 45 image style URLs, of which only one will be used.


This situation is not ideal, because a malicious user can still generate 45 images on our disk for each image available in our content. Additionally, if we consider adding more consumers to our digital experience, we risk making this problem worse. Moreover, we don't want the presentation from one consumer seeping into another consumer. Finally, if we can't know the designs for all our consumers, then this solution is not even on the table, because we don't know what image styles we need to add to our back-end.

On top of all these problems regarding the separation of concerns of front-end and back-end, there are several technical limitations to overcome. In the particular case of image styles, if we were to process the raw images in the consumer we would need:

  • An application runner able to do these operations. The browser is capable of this, but other, more constrained devices won't be.
  • Powerful hardware to compute image manipulations. APIs often serve content to hardware with low resources.
  • A high bandwidth environment. We would need to serve a very high-resolution image every time, even if the consumer will resize it to 100 x 100 pixels.

Given all this, we decided that this task was best suited to a server-side technology.

In order to solve this problem as part of the API-First initiative, we want a generic solution that works even in the worst case scenario. This scenario is an API served by Drupal that serves an unknown number of 3rd party applications over which we don't have any control.

How We Solved It

After some research about how other systems tackle this, we established that we need a way for consumers to declare their presentation dependencies. In particular, we want to provide a way to express the image styles that consumer developers want for their application. The requests issued by an iOS application will carry a token that identifies the consumer where the HTTP request originated. That way the back-end server knows to select the image styles associated with that consumer.


For this solution, we developed two different contributed modules: Consumers, and Consumer Image Styles.

The Consumers Project

Imagine for a moment that we are running Facebook's back-end. We defined the data model, we have created a web service to expose the information, and now we are ready to expose that API to the world. The intention is that any developer can join Facebook and register an application. In that application record, the developer does some configuration and tweaks some features so the back-end service can interact optimally with the registered application. As the managers of Facebook's web services, we are not going to take special requests from any of the possible applications. In fact, we don't even know which applications integrate with our service.

The Consumers module aims to replicate this feature. It is a centralized place where other modules can require information about the consumers. The front-end development teams of each consumer are responsible for providing that information.

This module adds an entity type called Consumer. Other modules can add fields to this entity type with the information they want to gather about the consumer (a hedged sketch of adding such a field follows the list below). For instance:

  • The Consumer Image Styles module adds a field that allows consumer developers to list all the image styles their application needs.
  • Other modules could add fields related to authentication, like OAuth 2.0.
  • Other could gather information for analytic purposes.
  • Maybe even configuration to integrate with other 3rd party platforms, etc.
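As a rough illustration of that extension point, a module could attach an extra field to the consumer entity type with a standard base-field hook. The module name, the field, and the assumption that the entity type's machine name is "consumer" are all hypothetical here, not taken from the projects above.

<?php
use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Implements hook_entity_base_field_info().
 */
function mymodule_entity_base_field_info(EntityTypeInterface $entity_type) {
  $fields = [];
  // Assumes the Consumers module exposes an entity type with ID 'consumer'.
  if ($entity_type->id() === 'consumer') {
    $fields['mymodule_analytics_key'] = BaseFieldDefinition::create('string')
      ->setLabel(t('Analytics key'))
      ->setDescription(t('An identifier this consumer uses for analytics reporting.'));
  }
  return $fields;
}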
The Consumer Image Styles Project

Internally, the Consumers module takes a request containing the consumer ID and returns the consumer entity. That entity contains the list of image styles needed by that consumer. Using that list, Consumer Image Styles integrates with the JSON API module and adds the URLs for the image after applying those styles. These URLs are added to the response, in the meta section of the file resource. The Consumers project page describes how to provide the consumer ID in your request.

{
  "data": {
    "type": "files",
    "id": "3802d937-d4e9-429a-a524-85993a84c3ed",
    "attributes": { … },
    "relationships": { … },
    "links": { … },
    "meta": {
      "derivatives": {
        "200x200": "https://cms.contentacms.io/sites/default/files/styles/200x200/public/boyFYUN8.png?itok=Pbmn7Tyt",
        "800x600": "https://cms.contentacms.io/sites/default/files/styles/800x600/public/boyFYUN8.png?itok=Pbmn7Tyt"
      }
    }
  }
}

To do that, Consumer Image Styles adds an additional normalizer for the image files. This normalizer adds the meta section with the image style URLs.
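Purely as a sketch of that idea (the class name, the hard-coded style list, and the omitted service registration are assumptions, not the module's real code), a simplified normalizer could look like this:

<?php
namespace Drupal\example\Normalizer;

use Drupal\file\FileInterface;
use Drupal\image\Entity\ImageStyle;
use Drupal\serialization\Normalizer\NormalizerBase;

/**
 * Sketch of a normalizer that appends derivative URLs to file output.
 */
class FileDerivativesNormalizer extends NormalizerBase {

  protected $supportedInterfaceOrClass = FileInterface::class;

  public function normalize($object, $format = NULL, array $context = []) {
    // In the real module this list comes from the consumer entity; it is
    // hard-coded here only to keep the sketch short.
    $style_ids = ['200x200', '800x600'];
    $derivatives = [];
    foreach ($style_ids as $style_id) {
      if ($style = ImageStyle::load($style_id)) {
        $derivatives[$style_id] = $style->buildUrl($object->getFileUri());
      }
    }
    return ['meta' => ['derivatives' => $derivatives]];
  }

}

In practice such a class would also need to be registered as a tagged serializer normalizer service with a higher priority than the default file normalizer; that wiring is omitted here.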

Conclusion

We recommend having a strict separation between the back-end and the front-end in a decoupled architecture. However, there are some specific problems, like image styles, where the server needs to have some knowledge about the consumer. Even on these few occasions, the server should not implement special logic for any particular consumer. Instead, we should have the consumers add their configuration to the server.

The Consumers project will help you provide a unified way for app developers to include this information in the server. Consumer Image Styles and OAuth 2.0 are good examples of where that is necessary, and of how to implement it.

Further Your Understanding

If you are interested in alternative ways to deal with image derivatives in a decoupled architecture, there are other options that may incur extra costs but are still worth checking out: Cloudinary, Akamai Image Converter, and Origami.

Hero Image by Sadman Sakib

Drupal Commerce: Beta release for Commerce Discount 7.x-1.0

Wed, 10/25/2017 - 19:30

Commerce Discount improves Commerce 1.x by providing a custom entity type for managing Product and Order level discounts, including more complicated discounts like free shipping upgrades and BOGO offers. The module makes it easier for merchants to create promotions that would otherwise require the use of the Rules UI or even custom code, tasks that are similarly beyond the reach of most casual Drupal users.

Even as we've worked to improve the user experience further in Commerce 2.x by making Promotions a core module, we continue to work to improve the experience for 1.x users. Today, after a month of focused contrib time from the Commerce Guys team and review from end users like Thomas Jonas at the University of Minnesota, we're proud to announce the release of a long-overdue beta version of the module.

Mediacurrent: DrupalCamp Atlanta 2017 Highlights

Wed, 10/25/2017 - 18:32

It's official: the countdown to DrupalCamp Atlanta is on. In just two weeks (November 2 - November 4), Mediacurrent will proudly sponsor another great camp in Buckhead, the tech center of ATL. Known for being a top Drupal event in the southeast, DCATL isn't one to miss. It's not too late to register!

Bay Area Drupal Camp: BADCamp videos now available on the website!

Wed, 10/25/2017 - 18:25
Grace Lovelace, Wed, 10/25/2017 - 1:25pm

Thank you! We had so much fun with all of you at BADCamp that we're already excited for next year!

Review what you learned and see what you missed!

Are there sessions you weren't able to attend at BADCamp this year? Or maybe you're back at work ready to apply what you learned and wishing you had better notes? Never fear! We took video of the slides from each presentation at BADCamp that includes audio from our expert speakers! Just visit our event schedule and click on the sessions you'd like to view. Videos are posted at the top of each session page. 

Share your feedback.

Please take a moment to let us know what you thought about BADCamp—it's just a few questions and will help us improve our future events.

Send Your Feedback

Join us at next year's BADCamp, October 24th through 27th, 2018! 

BADCamp Organizing Collective


Elevated Third: Lessons Learned: Component Based Design with Paragraphs

Wed, 10/25/2017 - 14:04
Anthony Simone, Wed, 10/25/2017 - 10:04

 

 

 

The ideas of Atomic Design and component based design allow one to create an established structure within which a large scale front end project can be built. The CMS space hasn’t always been the most friendly toward implementing these types of patterns. Whether it’s difficulty in creating a content architecture that models your front end design system within Drupal or the feeling of lack of control over generated markup, it can feel like an uphill battle.

The Paragraphs module gives us the tools to create much more well defined and structured component based architectures upon which modular front end systems can be built. The Paragraphs module, however, comes with no rules. As a site architect and front end developer, you must decide how to implement Paragraphs. There is definitely a lot of room for flexibility in implementation, but there are many best practices that can be followed to allow for a very clean, scalable, and extendable front end design system to be built within Drupal 8.

The goals of this session will be the following:

  • Review the basic concepts and benefits of component based design
  • Discuss the paragraphs module and how to create an implementation based on a well defined content architecture 
  • Explore some Drupal best practices that allow for a successful component based design system implementation

Acquia Lightning Blog: Lightning migration to core media

Wed, 10/25/2017 - 12:57
Adam Balsam, Wed, 10/25/2017 - 10:57

It's here! Lightning 2.2.1 provides a migration to the core media system that was introduced in Drupal 8.4.0.

This is a major milestone for us. One of the big advantages of using Lightning over vanilla Drupal or a roll-your-own solution is that as underlying modules evolve, Lightning maintains an update/migration path. This effectively creates a facade in front of media, workflow, and layout functionality. That functionality remains stable no matter what. Of course, this is in addition to the fact that Lightning provides all of that functionality out of the box. (Even though Media is now a part of core, it still doesn't provide the out of the box configuration, experience, and add-ons that Lightning does.)

Core Media migration was #2 in our list of major migrations. It was preceded by a migration from Layout Plugin to the core Layout Discovery module. Next up is Workflow which will involve migrating from Workbench Moderation to core's Workflows and Content Moderation modules.

Special thanks to phenaproxima who is at the intersection of the core, contrib, and Lightning work. To say the migration wouldn't have been possible without him is an understatement.

Want to try it out?

Update your existing codebase:

composer update acquia/lightning --with-dependencies
composer update drupal/core

Then check out our 2.2.0 -> 2.2.1 update instructions.

Or build a fresh codebase:

composer create-project acquia/lightning-project MY_PROJECT

 

ADCI Solutions: Top free good-looking Drupal themes

Wed, 10/25/2017 - 08:13

Professional design is half of a website's successful performance. Every text field, button, and picture should be placed purposefully.

We keep exploring Drupal contributions, and here's a selection of free, good-looking Drupal themes available for immediate use. Download any of them and start working.

 

Check awesome free Drupal themes

 

marvil07.net: A release plan for contributed drupal extensions

Tue, 10/24/2017 - 23:51

tl;dr: Review the plan at the end directly.

Software has a changing nature; Drupal and its extensions are no exception.
To be useful for most users, changes need to land in full releases, not only in the version control system; indeed, the problem is not new, and there is even a well-known phrase for one of its solutions: release early, release often.
Therefore it is important to have a release plan.
After some context and reasoning, I propose a couple of practical release-schedule guidelines for contributed drupal extensions that I intend to use: release weekly until stable, then once a month following core's schedule.

On the changing nature of software

Software inherently tends to change; there are exceptions, like embedded systems or really purpose-specific software.
Even really solid software like the GNU coreutils project, started in 1992, which provides tools that I consider among the most mature in daily use, has had 253 commits and three point releases in the last 12 months[1].

How much a piece of software changes depends on many factors.
I would hypothesize that the most relevant factors are the age of the project, the environment around it, and the number of people behind it.
In this way, new projects change more than well-established projects, and projects in dynamic environments, which are also influenced by the number of people around them, will change more than the ones in environments with fewer participants or fewer technology changes.

How changing are contributed drupal extensions?

Drupal contributed extensions are naturally influenced mainly by drupal core, so let us examine a bit how changing Drupal core is.
It is definitely in a dynamic environment, and I will argue that each major release can be considered a new project, making it change a great deal.

On the dynamic side, even if web standards change slowly, and for good reasons, technologies around web tools are still constantly changing.
The stack has changed a lot over the years, and even if some tools like apache and mysql/mariadb are still around, other parts of the stack have been changing a lot, especially around client-side javascript.

The Drupal core project's code history is now 17 years old, which seems like enough time to get into a stable state, especially if you are not yet part of the drupal community.
But the drupal project has a history of rewriting the way its internals work, which has been argued to be one of the reasons why drupal can keep up with the changing environment around web technologies.
It may also be a consequence of its amazingly collaborative community.
And because of this rewriting between major versions, at least internally, each major release can be considered a new project, especially with 8.x.x.
A hint of this may be reflected in the fact that the major contributors across different drupal core versions are mainly different people; only a few are as active across releases.

In consequence, drupal core is still a highly changing project, and in the same way its extensions inherit part of that changing nature; but a contributed drupal extension is not really influenced only by core.
Given the amazingly high number of extensions that have been written, it is only natural to start depending on other pieces of software to make maintenance more effective.
For instance, there are currently 13432 D7-compatible and 4069 D8-compatible modules.

In this way, one of the factors that will clearly influence a contributed extension is its dependencies, both inside and outside of drupal, and how changing they are.

Another factor is the number of people behind it, not only developers, but also users reporting bugs.
For drupal contributed extensions this varies a lot, but it is usually not that big.

For all this, contributed drupal extensions are usually in a changing environment.

Commits are not releases: release early, release often

As a contributed module developer myself, I will start with a mea culpa.
Sometimes I wrongly assume that when a change is in git the work is done, but that may only be true for people willing to take the extra effort to get the changes from git, or to accept the consequences of using a development release.
Commits are a developer tool inside the version control system being used, but not necessarily something that is visible or usable for everyone.

As on many occasions, the problem is not new, and I found a pretty good answer for it in "The Cathedral and the Bazaar", chapter 2: release early, release often.
It mainly proposes that, to be able to tackle enough bugs to make the software usable, releases need to come as fast as the pace of development, even at the cost of some stability.
I definitely recommend reading it fully for more context, and for a lot more inspiring insights for any open source developer.

How often is often? A release strategy plan

Granted, the answer is not a recipe, and it makes sense that it is that way, because it really depends on the project.
In the following lines I will propose a specific release strategy for drupal contributed extensions.

Drupal core already has a release plan; it is really detailed, so please review it if you have not done so yet.
Minor releases are available approximately every six months, but security and bugfix releases for a given minor version branch are available monthly, on the third and first Wednesday respectively.

Security releases for drupal contributed extensions are published in coordination with the security team, so there is no need to plan them here; they also happen on Wednesdays.

Making it simple to remember can help maintainers stick to it, so I will also be using Wednesday as the weekday for releases.

The plan

I propose the following for each supported major branch in contributed extensions:

  • release alpha/beta/rc weekly on Wednesdays, until a stable is ready
  • once a stable release has been reached, publish bugfix releases on the same schedule as core, i.e. the first Wednesday of the month

Looking back, it seems obvious and really simple, but if it is not documented somewhere, I will probably forget about it.
Hopefully someone else finds this useful or, even better, wants to do the same.
Having a more predictable schedule always helps to make better planning decisions.

I will start this week by applying these two guidelines and releasing a new version of the modules I maintain that have pending changes to be released.

Auto-notify maintainers

Notifications may help us maintainers stick to this, but I guess the plan itself was relevant enough to keep as the focus of this post.
I may be exploring some solutions around it in the future.

[1] To reproduce the statistics you can retrieve the main repository from https://git.savannah.gnu.org/cgit/coreutils.git and then run a couple of commands:
git log --oneline --all --since="1 year ago" | wc -l
git log --oneline --all --since="1 year ago" --decorate | grep tag


Agiledrop.com Blog: AGILEDROP: First days on board with A-team

Tue, 10/24/2017 - 23:05
When new developers arrive in our team, our mission is to help them as much as we can in every aspect of adapting to the new job environment, so they can show their potential and shine in their best light. First days at a new job are really important. Based on first impressions, developers form a picture of their new coworkers and of the company itself, and that can have a long-term impact. We pay great attention to the first days with us, so we prepared a brief insight into the first work days at our company. First day: We show the new developer around, show the desk and her…

Agiledrop.com Blog: AGILEDROP: Why should agencies partner with companies rather than hire freelancers?

Tue, 10/24/2017 - 21:38
Digital agencies sometimes get into a position where they have more work than their internal team can handle. For many, outsourcing is not an option, as they still wish to keep the project in-house, but they are open to working with external developers. Agencies can hire freelancers or work with teams like AGILEDROP. In this post, I will highlight some of the advantages of working with a team that agencies often overlook when making a decision. No more job posts and screening interviews: Hiring a freelancer is practically the same as hiring a full-time employee. First you have to write a job ad and…
