Drupal Planet

Subscribe to the Drupal Planet news feed
Drupal.org - aggregated feeds in category Planet Drupal
Updated: 37 min 25 sec ago

Acquia Developer Center Blog: Building Usable Conversations: Conversational Content Strategy

Mon, 02/11/2019 - 12:34

In this fourth installment of our series on conversational usability, we're turning our attention to conversational content strategy, an underserved area of conversational interface design that is rapidly growing due to the number of enterprises eager to convert the text trapped in their websites into content that can be consumed through voice assistants and chatbots.

Tags: acquia drupal planet

Dries Buytaert: Headless CMS: REST vs JSON:API vs GraphQL

Mon, 02/11/2019 - 11:59

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or following a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the table below and discuss each of the four categories (rows in the table) in more depth afterwards. Aggregating the rankings in the table, we rank JSON:API above GraphQL and GraphQL above REST.

| | REST | JSON:API | GraphQL |
| --- | --- | --- | --- |
| Request efficiency | Poor; multiple requests are needed to satisfy common needs. Responses are bloated. | Excellent; a single request is usually sufficient for most needs. Responses can be tailored to return only what is required. | Excellent; a single request is usually sufficient for most needs. Responses only include exactly what was requested. |
| Documentation, API explorability and schema | Poor; no schema, not explorable. | Acceptable; generic schema only; links and error messages are self-documenting. | Excellent; precise schema; excellent tooling for exploration and documentation. |
| Operational simplicity | Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required. | Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful. | Poor; extra infrastructure is often necessary; client-side libraries are a practical necessity; specific patterns are required to benefit from CDNs and browser caches. |
| Writing data | Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request. | Excellent; how writes are handled is clearly defined by the spec; one write per request, though support for multiple writes is being added to the specification. | Poor; how writes are handled is left to each implementation and there are competing best practices; it is possible to execute multiple writes in a single request. |

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article requests (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.
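As a rough sketch, a single JSON:API-style request for both articles and their authors might be assembled like this (the collection URL and filter syntax are illustrative, not Drupal's exact routes; `include` is the spec-defined way to embed related resources):

```javascript
// Build one request that fetches articles 42 and 72 plus their authors.
// The base URL and filter parameter names here are hypothetical.
const params = new URLSearchParams();
params.set('filter[id][operator]', 'IN');      // match a set of IDs
params.append('filter[id][value][]', '42');
params.append('filter[id][value][]', '72');
params.set('include', 'author');               // embed each article's author
const url = `https://example.com/jsonapi/articles?${params}`;
```

A basic REST API would need four round trips (two articles, then two authors) to gather the same data.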

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's field selections; however, when they're omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, sparse fieldsets can be omitted so that the request remains cacheable.
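A minimal illustration of sparse fieldsets, with assumed resource type and field names:

```javascript
// fields[TYPE] lists the only fields the server should return for that
// resource type (per the JSON:API spec); everything else is omitted.
const fieldsParams = new URLSearchParams({
  'fields[articles]': 'title,body', // trim article resources to two fields
  'fields[people]': 'name',         // trim the included author resources
  include: 'author',
});
const fieldsUrl = `https://example.com/jsonapi/articles?${fieldsParams}`;
```

Dropping the two `fields[...]` parameters is always safe: the response simply grows back to full resources, while the URL stays cacheable.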

| | REST | JSON:API | GraphQL |
| --- | --- | --- | --- |
| Multiple data objects in a single response | Usually; but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed). | Yes | Yes |
| Embed related data (e.g. the author of each article) | No | Yes | Yes |
| Only needed fields of a data object | No | Yes; servers may choose sensible defaults; developers must be diligent to prevent over-fetching. | Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability. |

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

| | REST | JSON:API | GraphQL |
| --- | --- | --- | --- |
| Auto-generated documentation | Depends; only if using the OpenAPI standard. | Depends; only if using the OpenAPI standard (formerly Swagger). | Yes; various tools available. |
| Interactivity | Poor; navigable links rarely available. | Acceptable; observing the available fields and links in responses enables exploration of the API. | Excellent; autocompletion, instant results or compilation errors, complete and contextual documentation. |
| Validatable and programmable schema | Depends; only if using the OpenAPI standard. | Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available. | Yes; a complete and reliable schema is provided (with very few exceptions). |

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above) — a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching as with REST or JSON:API without persisted queries. Persisted queries are not part of the official GraphQL specification, but they are a widely adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add more operational complexity, and they also mean the architecture is no longer fully decoupled: if a client wants to retrieve different data, server-side changes are required.
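The persisted-query pattern can be sketched in a few lines (the registry shape and names are illustrative; real servers typically derive the ID from a hash of the query):

```javascript
// Build step: full GraphQL documents are stored server-side under an ID.
// Runtime: the client sends only the ID, so the GET URL is short and stable.
const persistedQueries = new Map();

function persistQuery(id, query) {
  persistedQueries.set(id, query);
}

function handlePersistedGet(id, execute) {
  const query = persistedQueries.get(id);
  if (!query) throw new Error(`Unknown persisted query: ${id}`);
  return execute(query); // `execute` runs the stored GraphQL document
}

// e.g. GET /graphql?queryId=articleTeasers-v1 is now cacheable by a CDN.
persistQuery('articleTeasers-v1', '{ articles { title author { name } } }');
```

Note the coupling the article describes: changing what the client fetches means registering a new query on the server.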

| | REST | JSON:API | GraphQL |
| --- | --- | --- | --- |
| Scalability: additional infrastructure requirements | Excellent; same as a regular website (Varnish, a CDN, etc.). | Excellent; same as a regular website (Varnish, a CDN, etc.). | Usually poor; only the simplest queries can use GET requests; to reap the full benefit of GraphQL, servers need their own tooling. |
| Tooling ecosystem | Acceptable; lots of developer tools available, but for the best experience they need to be customized for the implementation. | Excellent; lots of developer tools available; tools don't need to be implementation-specific. | Excellent; lots of developer tools available; tools don't need to be implementation-specific. |
| Typical points of failure | Fewer; server, client. | Fewer; server, client. | Many; server, client, client-side caching, client and build tooling. |

Writing data

For most REST APIs and JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of using GET HTTP requests, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations. There is just one way to do things, which enables better, generic tooling and less time spent on server-side details.
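A JSON:API write mirrors the read format; a hypothetical update of an article's title might look like this (route and field names assumed):

```javascript
// The PATCH body is a single `data` object with `type`, `id` and the
// changed `attributes`, sent with the JSON:API media type.
const patchBody = {
  data: {
    type: 'articles',
    id: '42',
    attributes: { title: 'Updated title' },
  },
};

const patchRequest = {
  method: 'PATCH', // POST to the collection would create a new resource
  headers: { 'Content-Type': 'application/vnd.api+json' },
  body: JSON.stringify(patchBody),
};
```

Because the spec fixes this shape, the same client code works against any conforming server.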

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling write operations to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests; one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

| | REST | JSON:API | GraphQL |
| --- | --- | --- | --- |
| Writing data | Acceptable; every implementation is different. No bulk support. | Excellent; JSON:API prescribes a complete solution for handling writes. Bulk operations are coming soon. | Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement. There are competing conventions. |

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
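For instance, a consumer could craft a filtered, sorted, paged listing purely in the URL (field names are illustrative):

```javascript
// Everything a Drupal View would configure server-side lives in the query
// string here, fully under the client's control.
const listParams = new URLSearchParams({
  'filter[status]': '1', // only published content
  sort: '-created',      // newest first
  'page[limit]': '10',   // first ten results
});
const listUrl = `https://example.com/jsonapi/articles?${listParams}`;
```

Changing the listing later (say, sorting by title instead) is a front-end-only change.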

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

| | REST | JSON:API | GraphQL |
| --- | --- | --- | --- |
| Ease of installation and configuration | Poor; requires the contributed REST UI module; easy to break clients by changing configuration. | Excellent; zero configuration! | Poor; more complex to use; may require additional permissions, configuration or custom code. |
| Automatically generated documentation | Acceptable; requires the contributed OpenAPI module. | Acceptable; requires the contributed OpenAPI module. | Excellent; GraphQL Voyager included. |
| Security: content-level access control (entity and field access) | Excellent; content-level access control respected. | Excellent; content-level access control respected, even in queries. | Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access. |
| Decoupled filtering (client can craft queries without server-side intervention) | No | Yes | Depends; only in some setups and with additional tooling/infrastructure. |

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. While Drupal's GraphQL implementation is fantastic, I no longer recommend that we add GraphQL to Drupal 8 core.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module for addition to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.

Agiledrop.com Blog: Our blog posts from January 2019

Mon, 02/11/2019 - 06:16

We've prepared an overview of all our blog posts from January 2019; have a look!


OpenSense Labs: Small enterprises, Rooting for Goliath? Here is How to Bag Big projects

Mon, 02/11/2019 - 02:29
By Vasundhra

Wisdom doesn't automatically come with old age. Nothing does - except wrinkles. It's true that some wines improve with age, but only if the grapes were good in the first place.
-Abigail Van Buren

Reflecting on life experiences adds generously to one's box of wisdom because, let's face it, wisdom and savvy can come from anyone and anywhere. So yes, the famous quote "Age is just a number" does justice to the whole scenario of erudition.

In much the same way, the common misconception that "bigger is better" is being disproved by small agencies handling big projects. Gone are the days when large enterprises used to rule the market kingdom, bagging all the big projects. Today, small agencies are winning big-name accounts and cool projects far more often, and the trend is forecast to continue.


For a Drupal agency with big aspirations, deciding which projects to opt for can be a bit of a task, but attaining the trust of the CxOs of big organizations is an even bigger one.

So, to solve this issue of winning and handling big work, here are some ways to help you seize those big projects.

First things First - How to meet big clients?

Just because you are a small agency or organization does not mean your clients have to be small. Landing a large organization not only boosts a small business's revenue but also increases efficiency among your team members and organization.

  • Use client references to introduce your process

A big company may seem like one grand entity, but you should not forget that it is made up of hundreds or thousands of individuals, many of whom have the power to make decisions.

So it is really important that your research is top-notch and accurately tells you who to contact within the company you've targeted. Existing contacts or references may help with this, and some companies also publish details of at least one senior employee on their websites.

But you need to be really creative to figure out exactly who the right person is. Look through the company's publications or newspaper mentions to see whose name comes up. You can also tag along with people who can introduce you to the big tech giants.

  • Indulge in cold calling

Telemarketing and cold calling continue to be essential disciplines in sales. In many business sales organizations, old-school "door knocking" might not be that productive, but when it comes to big organizations, especially those with large territory assignments, cold calling becomes the hero. Prospecting via phone calls remains a great complement to your overall lead-generation efforts.

  • Be an expert and then try to be a solution to their needs. 

If you want the big giants to trust you with their projects, then a sense of what the work means to you must be established, along with a clear vision for the future. In fact, according to the Employee Job Satisfaction and Engagement survey, nearly 77% of employees said it was important to their job satisfaction and engagement to have a clear understanding of their organization's vision and mission.

Start with your team 

Now that you have big names on your list, start by developing a strong team and skill set, starting with:

  • A team of Generalists 

Generalists are people who have a particular skill but are flexible enough to adapt to any situation and are ready to learn new skills. In the case of Drupal websites, a generalist should be able to handle both the backend and the frontend.

In other words, having a generalist is beneficial for your organization: he or she can effectively handle many tasks.

 

  • Services are important 

Focus on the set of services and assistance you will be providing to the client. Your team will become specialists with time and experience. Treat a big enterprise like royalty.

A giant enterprise is a customer that always expects great service and will not put up with long waits or poor responses from your representatives.

Be honest about your projects and their goals. If your customers find that you are dishonest about your services, they will lose faith in you and may even spread negative feedback about your business.

  • Categorizing your projects

To handle the complexity of a project, categorize it into the following:

Small projects: These can easily be tracked just by getting updates. A project is classified as small when the relationships between tasks are basic and detailed planning or organization is not required.

Charter required projects: These are projects that require some level of approval other than the first line manager, but do not include significant financial investment. A summary of major deliverables is usually enough for management approval.

Large projects: The project network is broad and complicated. There are many task interdependencies. With these projects, simplification where possible is everything. 

  • Planning 

Planning a project helps in achieving objectives and meeting deadlines on time. It pushes team members to keep working hard until the goals are achieved. Planning also gives the organization a clear sense of direction.

Increases efficiency: Planning helps in the maximum utilization of all available resources. It helps reduce the wastage of precious resources and avoids duplication, aiming to give the greatest returns at the lowest possible cost.

Reduces risks: Large projects carry many associated risks. Planning serves to forecast these risks and to take the necessary precautions to avoid them.

Facilitates coordination: The plans of all departments of an organization should be well coordinated with each other. Similarly, the short-term, medium-term and long-term plans of an organization should be coordinated with each other.

Aids in organizing: Organizing brings together all available resources, and it is not possible without planning, since planning tells us how many resources are needed and when they are needed. In this way, planning aids effective organizing.

Keeps good control: Actual performance is compared with the plans, and deviations (if any) are found and corrected. Such control is impossible without the right planning, so planning is necessary to keep good control.

 

  • The scope of the Project 

Perhaps the most difficult part of managing a large project with a small team is distinguishing between a task and an actual project. For small project teams to be successful with large projects, the manager should always know the status of the project and how much of its scope has been achieved.

  • Excellent Relationship with the vendor

The most important part of managing big projects with small teams is to establish a meaningful relationship across the organization.

A solid relationship can make the difference between a project that becomes reality and one that remains conceptual. If your business doesn't concentrate on a product or service that is important for reaching your clientele, you need a vendor that does.

Next come the methodologies

Large organizations usually follow classical methodologies, which involve a lot of unnecessary documentation. For small agencies, the following methodologies help greatly in handling large projects:

  • Agile 
Agile was developed for projects that require both speed and flexibility. The method is split into sprints: short cycles for producing certain features.

Agile is highly interactive, allowing for fast adjustments throughout a project. It is mostly applied in software development projects, in large part because it makes it simpler to identify issues quickly.

Agile is essential because it allows changes to be made early in the development process, rather than having to wait until testing is complete.

  • Scrum

Scrum is an iterative variation of the agile framework that relies on scrum sessions for evaluating priorities. Scrum meetings and 30-day sprints are used to limit work to the prioritized tasks.

Small teams may be gathered to concentrate on a particular task independently and then sync up with the scrum master to assess progress or results and reprioritize backlogged tasks.

 

  • Waterfall 

This is a basic, sequential methodology from which Agile and similar concepts evolved. It has been a staple project management methodology for years and is still used by many project managers across industries, most commonly in software development. It consists of static phases (analysis, design, implementation, testing, and maintenance) that are executed in a specific order.

  • Critical Path Method 

CPM is an orderly, systematic method that breaks down project development into specific but related actions. 

This methodology can be used to prioritize a project's activities, assess risks and allocate resources accordingly. It encourages teams to identify milestones, task dependencies, and deadlines efficiently.

The critical path is the longest sequence of dependent tasks in a project, all of which have to be completed on time for the project to meet its deadline.

Culture is Fundamental to Succeed

How do you explain to your client that the team won't work this week because of DrupalCon, a DrupalCamp or another community event?

You can only explain it by being clear with your thoughts and ideas. The community here plays a vital role in everything. 

Explain to your team members that it is beneficial for them to improve Drupal as a platform, and introduce them to the community's culture. Help your team members create profiles on drupal.org and make sure they receive credit for their work on patches and modules.

Closing the project 

Yes, it is possible that project closing might look like an insignificant and unimportant task in your project management journey, but, in fact, it is a critical part of producing a successful project. To help you get this step right, here are 4 things you need to know about how to close a project effectively.

Trace project deliverables: Effective closure means that you have completed all the deliverables to the satisfaction of the project's sponsor.

Reward team members: As your project comes to a close, always make sure to acknowledge, recognize and appreciate the contribution of your team members.

Closeout reports: A detailed close-out report should contain details about the process used during the project, the mistakes, the lessons learned, and how successful the project was in achieving the initial goals.

Finance: Big clients are usually slow to pay, so try to adopt an agile budget for large projects.

Turning From Technical provider to strategic solution partner 

As with any investment portfolio, an organization’s investment in Run, Optimise and Innovate initiatives must be balanced and aligned with the organization’s risk tolerance and the role expected of IT. If an organization considers itself to be more conservative, it is expected to see a higher ratio of Run to Optimise and Innovate spending. More progressive organizations will have more Optimise spending, and “leading edge” organizations will have more Innovate spending.

Conclusion 

Yes, Goliath the Gittite is, and always will be, the best-known giant in the Bible. He is described as 'a champion out of the camp of the Philistines, whose height was six cubits and a span'.

Befriending Goliath not only gave a sense of power to anyone at his side but also granted them security.

Latching on to large enterprises with big projects is the very first step to success; the right moves and follow-through will bring that success soon enough.

OpenSense Labs' development methodologies focus specifically on approaches that involve Drupal development, enhancing efficiency and speeding up project delivery.

Contact us at hello@opensenselabs.com to accomplish those large projects you have always desired.

Tags: Drupal, Drupal 8, Project Management, Project Management Methodologies, Agile, Scrum framework, Waterfall, Critical Path Method, CMS

OpenSense Labs: Leveraging NightwatchJS in Drupal

Dom, 02/10/2019 - 14:34
By Shankar, Sun, 02/10/2019 - 22:04

The night watchman earns his reputation as an alert guard when he correctly senses that something does not look right. Imagine a man dressed like a stockbroker, carrying a soft leather briefcase and a valise. His walk is tentative and squirrelish, which is not the way a stockbroker walks. Asked for an ID by the watchman, he scurries off towards the entry gate of a building and drops his valise on the floor before finally being captured by the watchman, who later finds out that he was trying to rob someone in the building.


Much like the night watchman and his brilliance in judgement that makes sure that the building is kept safe from any such robbers, there is another solution in the digital landscape that helps in writing browser tests easily and safely. NightwatchJS is an exceptional option to run browser tests and its inclusion in the Drupal core has only made things easier for the Drupalists.

Understanding NightwatchJS


NightwatchJS is an automated testing framework for web applications and websites. It is written in Node.js and uses the W3C WebDriver API (the standardised successor to Selenium WebDriver) for performing commands and assertions on DOM elements.

Nightwatch.js is an integrated, easy to use End-to-End testing solution for browser-based apps and websites, written on Node.js. - Nightwatchjs.org

As a complete browser (end-to-end) testing solution, NightwatchJS aims to streamline the process of setting up continuous integration and writing automated tests. It can also be used for writing Node.js unit tests. Its clean syntax helps you write tests rapidly using Node.js and CSS or XPath selectors. Its built-in command-line test runner can execute tests sequentially or in parallel, by group, by tag or one at a time. It also supports the Mocha runner out of the box.

NightwatchJS has its own cloud testing platform, NightCloud.io, in addition to supporting other cloud testing providers like SauceLabs and BrowserStack. It manages Selenium and WebDriver services automatically in a separate child process and has great support for working with the Page Object Model. Moreover, with its out-of-the-box JUnit XML reporting, you can integrate your tests into the build process with systems like Jenkins. 

NightwatchJS in Drupal

The JavaScript Modernisation Initiative paved the way for the addition of NightwatchJS to Drupal core (in version 8.6) as the new standard framework for unit and functional testing of JavaScript. It ensures that alterations made to the system do not break expected functionality, and it lets you write tests for your contributed modules and themes. It can be included in the build process to ensure that regressions rarely creep into production.

You can try NightwatchJS in Drupal 8.6 by following the instructions on GitHub, which show how to test core functionality and how to test your existing sites, modules and themes with your own custom commands, assertions and tests. It is also worth checking out the Nightwatch API documentation and the NightwatchJS developer guide for creating custom commands and assertions.

NightwatchJS tests will be run by Drupal CI and will be viewable in the test log for core developers and module authors. For your own projects, tests can easily be run in, for instance, CircleCI, which gives you access to artefacts like screenshots and console logs.

Conclusion

While Drupal 8 has extensive back-end test coverage, NightwatchJS offers a more modern platform that will make Drupal more familiar to PHP and JavaScript developers. 

Offering amazing digital experiences has been our biggest objective, and we have been doing that with a suite of services.

Contact us at hello@opensenselabs.com and let us know how we can help you achieve your digital transformation dreams.


OpenSense Labs: Improving remote communications: Rocket.Chat for Drupal

Dom, 02/10/2019 - 13:54
By Shankar, Sun, 02/10/2019 - 21:24

Virtual private servers are fantastic for running your own cloud applications and give you authority over your private data. That data can potentially be leaked when you communicate via services like text messaging. One way to ensure greater privacy is to host your own messaging system, and this is where Rocket.Chat comes into the picture.


Rocket.Chat has the provision for an actual open source implementation of an HTTP chat solution that provides convenience and gives greater freedom at the same time. It can be a marvellous solution for remote communications, especially for open source communities.

Rocket.Chat: A close look


The official site of Rocket.Chat states that it is open source team communication software which offers an alternative for remote communications by replacing the likes of email, HipChat and Slack. It aids in improving productivity via efficacious team communication and collaboration. It helps in sharing files with the team, chatting in real time, or even holding audio or video conference calls with screen sharing. Being an open source solution, it gives you the option of customising, extending or adding new functionality to meet your expectations.

Rocket.Chat is an open source team communication software which offers an alternative to remote communications by replacing the likes of email, HipChat and Slack

 


With Rocket.Chat, you can do away with cc/bcc and make use of Rocket.Chat channels and private groups, bringing more transparency to team communication. By using @username, you can include relevant participants and apprise them swiftly; to inform all the members of a group about an important matter, @all can be used. Participants can join or leave at any time, with access to the full chat history. Moreover, Rocket.Chat offers a secure workspace with restrictions on usernames and greater transparency for admins. You can join leading blockchain propellants like Hyperledger, Brave and Aragon, among others, in migrating from Slack and Atlassian to Rocket.Chat.

Essential features

Following are some of the major features of Rocket.Chat:

  • Unlimited: Rocket.Chat has the provision for unlimited users, channels, searches, messages, guests and file uploads.
  • Authentication mechanism: It offers different authentication mechanisms like LDAP Group Sync, 2-factor authentication, end-to-end encryption, single sign-on and dozens of OAuth providers.
  • Real-time Chat: Its Live Chat feature allows you to add real-time chat widgets to any site or mobile applications. This brings more efficacy in team communication and also ensures top-notch customer service.
  • Message translation: It utilises machine learning for automatically translating messages in real-time.
  • Use across platforms: It can be used on all platforms with its web, desktop and mobile applications, LiveChat clients, and SDK (Software Development Kit).
  • Marvellous customisation: You can alter it to meet your requirements. Incoming and outgoing WebHook integrations can be added, and the user interface can be personalised by overriding any of the built-in styles. You can also take advantage of its REST API, LiveChat API or Real-time API.
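As a sketch of what the REST API integration point looks like, the snippet below assembles a request to post a chat message. The endpoint and auth headers follow Rocket.Chat's documented REST API; the server URL, token, user id and channel are placeholders.

```javascript
// Sketch: assembling a Rocket.Chat REST API request to post a message.
// Server URL, token, user id and channel below are placeholders.
function buildPostMessage(serverUrl, authToken, userId, channel, text) {
  return {
    url: serverUrl + '/api/v1/chat.postMessage',
    method: 'POST',
    headers: {
      'X-Auth-Token': authToken,          // returned by /api/v1/login
      'X-User-Id': userId,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ channel: channel, text: text })
  };
}

const req = buildPostMessage('https://chat.example.com', 'TOKEN', 'USER_ID',
                             '#general', 'Build finished');
console.log(req.url); // https://chat.example.com/api/v1/chat.postMessage
```

The resulting object could be handed to any HTTP client; the same header pair authenticates every other REST endpoint.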
Rocket.Chat in Drupal

The Rocket.Chat module, available for both Drupal 7 and Drupal 8, helps a Drupal site integrate Rocket.Chat. It consists of a base module that holds the configuration and a LiveChat submodule that provides a block for placing the LiveChat widget on a page.
 
The maintainers of this module recommend running Drupal and Rocket.Chat behind a TLS (Transport Layer Security) proxy or a web server with TLS capabilities, and taking care with HTTPS/HTTP crossovers. Enabling LiveChat on the Rocket.Chat instance allows you to use the LiveChat feature.

Conclusion

One of the quintessential aspects of open source communities is remote communication. Rocket.Chat offers a great alternative to the likes of Slack and gives you a module to enable it in Drupal.
 
We have a strong inclination towards digital innovation and have been delivering it with our expertise in Drupal development.

Contact us at hello@opensenselabs.com to understand more on Rocket.Chat and transform your team communication and collaboration.


OpenSense Labs: Filling your Organization’s Traffic Canvas with the Colors of CRO.

Sáb, 02/09/2019 - 05:59
By Vasundhra, Sat, 02/09/2019 - 13:29

Ever attended an art gallery and witnessed how modern artists use canvas to speak their thoughts?

Looking at the art and paying attention to its creation, the “Ohhs”, “Ahhs” and expressions of awe and wonder tell us whether it has sufficiently aroused engagement or not.

But at the core of this practice, the owner is always thinking of the endless number of ways to draw visitors into the gallery. 

More visitors mean more conversions, which results in better progress. 

Customer experience and engagement - two things strived for by every art gallery owner.

Right?


Today most marketing teams are structured to drive traffic towards websites, in the hope that the traffic converts into leads and, eventually, profit. 

Yes, it might be an oversimplification of the trend, but that’s the standard marketing playbook. That’s where Conversion Rate Optimization (CRO) comes in. 

But, What exactly is CRO?

Let's discover. 

Everything about CRO

In internet marketing, conversion optimization, or conversion rate optimization, is a system for increasing the percentage of visitors to a website who convert into customers or, more generally, take any desired action on a webpage. The process involves understanding how users move through a particular website, what actions they are taking, and what's stopping them from completing their goals. 
 


Importance of CRO 
  • Pay per click

The general idea of pay-per-click advertising was to target the audience quickly by selecting what they could see. Nowadays, pay-per-click (Google AdWords) prices have risen to the point where they directly affect conversions. With the increase in digital devices and people's growing engagement with technology, more businesses have gone digital. 

  • Rising online competition 

Now that more people are becoming tech-savvy, competition among retailers has increased a lot, with some of them simply eating away at small retailers. If you want to convert your visitors into customers, you need a website that is easy to use and easy to customize.  

You can take the help of Drupal as your website platform. It is one such CMS that allows you to set up your website easily and customize it as desired. 

Conversion Optimization has the benefit of allowing you to stay ahead of the curve in terms of competition, by providing you with insights on what's going on with the competitor's website.
  • Combating the rising cost of digital marketing

Let’s face it: pay-per-click isn’t the only cost that is rising. Digital marketing in this area is giving good competition to every “traffic source” ever known. 

The whole point of marketing is to direct users towards your website. But how do you make sure that most of them actually make a purchase?

This is where CRO comes to the rescue.

By increasing the number of page visitors who make purchases, CRO improves conversion rates while offsetting the cost of digital marketing.
  • Streamlining the business 

A website that is continuously being optimized looks more legitimate than one that is not. 

Why?

Well, maybe because the ones that are not being optimized do not provide a clear path to the landing pages. Clear landing pages for an online retailer mean an inventory that can be easily searched and a clearer view of the categories. 

  • Saving a large amount of money 

So how can spending a large amount of money on your website result in saving money? 

You’ll find that spending less money on each customer actually produces more money. Maybe you are not necessarily saving a lot, but you are definitely making a lot more, which eventually balances out. 

  • Improving the efficiency and layouts 

If you work with affiliate organizations or marketers, you will find that many online retailers see CRO as a good way to get the word out about their products through a platform that already has an engaged audience. CRO makes your website more valuable to your affiliates and to any other marketing platform.
When a higher number of users who click through to your webpage actually make a purchase, your affiliates, pay-per-click advertisers, social media marketing campaigns, etc., earn more because you are making more.  

Common misconceptions related to CRO 

Some businesses see CRO as an unnecessary expense, one that doesn’t really move their business ahead.

Others see it as a silver bullet for their marketing woes, a key to high traffic and more leads. 

Whether you use it or not, it is better to avoid misconceptions that misguide you and waste resources and time. Some of them are:

  • CRO is a single skill 

One of the biggest misconceptions among businesses is the entrenched belief that CRO is a single skill set. In reality, CRO is a broad practice that encompasses a wide range of skills. To be effective at conversion rate optimization, you need three skill sets:

Copywriting: Whether or not you can write persuasive copy has a great impact on conversion rates.

Design: Everything from UI/UX to the choice of graphics strongly influences the rate of conversions.

Analytics: It is important to have someone with the necessary skills to analyze your results. 

  • It is all about best practices

Stumbling upon blog posts and articles that tell you the best practices for boosting your conversion rate has become standard now. 

Going ahead and implementing those practices as written is conventional. But do these tricks really work?

The truth is that there are no one-size-fits-all best practices that can lead you to the path of better conversions. Your focus should always be on removing the barriers that hinder the flow of conversions.  

  • Making small changes that lead to big and better rewards
     

 

The pictures above describe how changing the content length affected conversions and resulted in 90% more clicks. 

Going by this case study, you might be tempted to hunt for the silver bullet, where making a minor change reaps great rewards. 

In truth, case studies like these are misleading and provide you with only partial information. They don’t tell you:

How long were the tests run?

Did the traffic remain constant throughout the testing period?

What other changes were made to the website?

  • It is all about split testing 

Most people think that CRO is all about split testing site elements. 

The truth is that CRO is all about measuring your customers' actions and removing the conversion barriers. 

To do this, start with basic user needs and an understanding of the psychology of your customers. This model will help you focus on the things that should be worked on:

 

  • Focusing on CRO alone builds a successful business 

Because of the immense love showered on CRO in digital marketing, companies believe that it alone means they are winning the game of online business. 

True, CRO might increase your conversions from 1% to 2%, which does have a great impact on sales, but to reach those heights of success you also need to take a closer look at traffic, brand, and customers. 

So What is the Structured Process in CRO?
  • Defining your conversion actions 

Conversion actions can be defined based on the business goals and then implemented in web analytics. Promoting and producing content is one of the actions that should be implemented as soon as possible. The content technique requires you to engage in practices like:

  1. Targeting email marketing
  2. Marketing automation 
  3. Producing demo
  4. Live events
  5. Case studies

The content at this stage revolves around customer-relationship management through segmentation. When you segment your audience based on age, gender, geographical position, professional role, etc., you are better equipped to offer them targeted content that interests them.

  • Understand the prospects

A better understanding of your prospects helps in converting offers. This is the stage where you look at indirect customer acquisition and brand awareness. Begin by mapping out the current situation and forming a clearer idea of your target audience, objectives, market, KPIs and current results. Knowing in advance where you stand and what you want to achieve gives you more context.

  • Research and Analytics

Once you have insights into your current situation and objectives, it is time to analyze them. In the analysis phase, you employ web analytics and other sources of information to form a hypothesis that can then be tested. This could include heatmaps, tests, insights, etc.

This ensures that your insights are based on the actual behavior of users rather than on guesswork.

  • Implementing a challenger and Testing the hypothesis 

To implement a good challenger, you need to choose a suitable testing method and then run a test. This involves turning backlogs and metrics into a measurement plan, and it is one of the most crucial parts of CRO. 

Examining the hypothesis across multiple tests requires a good number of visitors and page views. 

  • Validating 

After setting up and running the tests, you analyze the results. This generates insights that you then subsequently test. CRO is a continuous process of learning and refining. 

Testing for CRO 

Conversion rate optimization tests refer to the various testing methodologies used to identify the version of a site that brings in the most valuable traffic.

  • Principles of CRO 

Speed

Back in 2009, Google ran experiments showing that slowing down the search results page by under half a second resulted in 0.2% to 0.6% fewer searches.

These might not sound like big numbers, but for Google (which processed roughly 40,000 searches every second) a 0.6% drop meant 14,400 fewer searches every minute. According to Google itself, people have only become more impatient since, making speed an important factor.
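The arithmetic behind that figure is easy to check; this snippet simply assumes the quoted numbers (40,000 searches per second, a 0.6% drop).

```javascript
// Arithmetic behind the figure above: 0.6% (= 6/1000) of roughly
// 40,000 searches per second, expressed per minute.
const searchesPerSecond = 40000;
const searchesPerMinute = searchesPerSecond * 60;      // 2,400,000
const fewerPerMinute = (searchesPerMinute * 6) / 1000; // 0.6% of that
console.log(fewerPerMinute); // 14400
```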
 
Singularity 
 
A single takeaway, “less is more”, is just about the right mantra for every website owner. 
Many landing pages contain multiple offers.
 
They shouldn’t.
 
Why?
 
Having just one clear offer per campaign helps increase conversion rates.

Identification

Identifying your audience and their aspirations and desires means high conversions. In other words, what they want, what matters to them, and what their sources of friction are all come under identification.  

To identify people, you must know them. After all, sales and marketing will only see those conversions when you identify your own customers. 

Landing Pages

Take some advice and do not clutter your pages or emails with “what if” moments. 

What if the user is to like my page?

What if more audience like the information that is being served on the page? 

What if they want to read my testimonials and case studies? 

If the goal of creating a particular page is to get likes and subscribers, then that is all you should focus on. Your landing pages should be as precise and simple as they can be, giving your audience and customers a clearer idea of what you are selling. 

  • Testing Methods 

A/B testing

Businesses want visitors to take an action (a conversion) on the website, and the rate at which a website is able to drive such actions is called its "conversion rate."

A/B testing is the practice of showing 2 variants of the same webpage to different segments of website visitors at the same time and comparing which variation drives more conversions. 
 
The one that gives higher conversions wins!
 
The metric for conversion is different for every site. For e-commerce it might be a product sale, whereas for B2B it might be a qualified lead. Well-planned, data-driven A/B testing makes your marketing plan more profitable by narrowing it down to its most important elements, testing them, and combining them. 

Note that every element on your website that could influence visitor behavior and conversion rate should be A/B tested. 
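As a sketch of the mechanics, A/B assignment is often done with a deterministic hash of a visitor id, so a returning visitor always sees the same variant. The helper names here are hypothetical, not the API of any testing tool mentioned later.

```javascript
// Sketch: deterministic 50/50 split for an A/B test. `bucket` and
// `conversionRate` are hypothetical helpers, not a real tool's API.
function bucket(visitorId) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple string hash
  }
  return hash % 2 === 0 ? 'A' : 'B'; // stable: same visitor, same variant
}

function conversionRate(conversions, visitors) {
  return visitors === 0 ? 0 : conversions / visitors;
}

// The variant with the higher conversion rate wins:
const rateA = conversionRate(30, 600); // 5%
const rateB = conversionRate(48, 600); // 8%
console.log(rateB > rateA ? 'B' : 'A'); // B
```

Deterministic bucketing matters because mixing variants for the same visitor would contaminate the comparison between the two pages.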


Multivariate Tests 

In a multivariate test, you identify a few key areas/sections of a page and then create variations for those sections specifically (as opposed to creating variations of the whole page in an A/B split test). So for example, in a multivariate test, you choose to create different variations for 2 different sections: headline and image. A multivariate testing software will combine all these section specific variations to generate unique versions of the page to be tested and then simply split traffic amongst those versions.

Multivariate testing looks to provide the solution. You can change a title and an image at the same time. With multivariate tests, you test a hypothesis for which several variables are modified and determine which combination from among all possible solutions performed the best. If you create 3 different versions of 2 specific variables, you then have nine combinations in total (number of variants of the first variable X number of variants of the second). 
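The combinatorics described above can be sketched directly: a multivariate test crosses every variant of one section with every variant of another.

```javascript
// Sketch: a multivariate test generates every combination of section
// variants, e.g. 3 headlines x 3 images = 9 page versions.
function combinations(headlines, images) {
  const versions = [];
  for (const h of headlines) {
    for (const img of images) {
      versions.push({ headline: h, image: img });
    }
  }
  return versions;
}

const versions = combinations(['H1', 'H2', 'H3'], ['img1', 'img2', 'img3']);
console.log(versions.length); // 9
```

A multivariate testing tool would then split traffic among these nine versions, which is why this method needs far more visitors than a simple A/B split.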

 
  • Some Testing tools 

Now we know that Conversion Rate Optimization (CRO) focuses on simple tests that allow you to compare and contrast layouts, calls to action, design features, content and even personalized marketing features to nudge your conversion rate to new heights.   

Therefore, here are some wicked yet sweet CRO tools that will help you with testing.

Optimizely: Optimizely requires a single line of lightweight code added to any website. The user then has the power to change any on-page element of the website.

Google Analytics: Most websites have this tool built in, and it is one of the most popular testing tools. Google Analytics helps you split your traffic between two pages you have developed and lets you know which version performs best. 
  
Visual Website Optimizer: This tool helps you figure out what your testing windows will look like after you plug in a few variables. Visual Website Optimizer is great for company projects and client work. 

Adobe Target: Adobe Target is a popular enterprise tool that combines targeting, testing and personalization. It walks you through a three-step workflow where you first create a variant, then target a variant base and lastly customize your goals. 

Apptimize: A testing tool that focuses entirely on mobile optimization, making it a perfect choice for mobile apps and businesses. It offers full control over the visual editor and rapidly creates new variants and targets. 

Conclusions 

Now we know that the most important goal for an organization is to create conversions. Conversions are how you measure your progress and growth. 

OpenSense Labs is aware of how important it is to apprehend visitors' preferences and interests. Our services in Drupal personalization and CRO bind us to our clients and help accelerate conversions.

Ping us at hello@opensenselabs.com and let us help you navigate the road to success and its hurdles. 


Craft of Coding: Drupal on OpenShift: Deploying your first Drupal site

Vie, 02/08/2019 - 17:30

Learn how to deploy your first Drupal 8 site on OpenShift. We saw the business value of running OpenShift in the last post. Now we will look at how to build and deploy your first Drupal 8 site on OpenShift. Docker vs OpenShift First, we have to understand the relationship between Docker containers and OpenShift. […]

The post Drupal on OpenShift: Deploying your first Drupal site appeared first on Craft of Coding.

Lullabot: Announcing the New Lullabot.com

Vie, 02/08/2019 - 13:40

Welcome to the latest version of Lullabot.com! Over the years (since 2006!), the site has gone through at least seven iterations, with the most recent launching last week at the 2019 Lullabot team retreat in Palm Springs, California.

wishdesk.com: Drupal JSON:API 2.x module: new release to build high-performance APIs

Vie, 02/08/2019 - 08:43

Drupal 8 is known for the extensive third-party integration opportunities it gives to websites. One of the tools for this is the contributed Drupal module JSON:API. It helps developers build high-performance APIs for various purposes, including multi-channel content or decoupled Drupal and JSON:API setups (one of our Drupal team’s areas of expertise). This winter saw a new release: Drupal JSON:API 2.x. Let’s take a look at what the module does, what makes it useful, and how it has changed in the 2.x version.

JSON:API: principle and benefits

JSON:API is a specification, a set of rules, for REST APIs. It defines how data is exchanged between server and client in the JSON format: how the client requests resources, how they are fetched, which HTTP methods are used, and so on. 
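As a sketch of those rules, a single resource in a JSON:API document has a fixed top-level shape (`type`, `id`, `attributes`). The resource name below follows Drupal's "entity--bundle" convention, but the id and values are made up for illustration.

```javascript
// Sketch of a JSON:API resource document, per the jsonapi.org spec.
// The id and attribute values are placeholders for illustration.
const doc = {
  data: {
    type: 'node--article',   // resource type, Drupal-style name
    id: '11111111-1111-1111-1111-111111111111',
    attributes: {
      title: 'Hello world'
    }
  }
};

// A minimal structural check of the shape above:
function isJsonApiResource(d) {
  const r = d && d.data;
  return !!r && typeof r.type === 'string' && typeof r.id === 'string';
}

console.log(isJsonApiResource(doc)); // true
```

Because every resource shares this shape, clients can consume any JSON:API server with the same generic code, which is a large part of the specification's appeal.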

OpenSense Labs: DeGov is here for all the Government websites

Vie, 02/08/2019 - 03:43
By Vasundhra, Fri, 02/08/2019 - 11:13

Today, our interactions with the digital world have surpassed our human interactions, so an appealing and friendly user interface plays an important role in progress. 

Government websites are no different.

The first connection they have with their citizens is most likely an engaging website, one of the most essential tools for meeting citizens' needs. 

So, it has to be the best. Right?


Creating a functional website with easy navigation not only helps officials connect better with their constituents but also ensures that the public stays well informed at all times. 

Drupal is one such platform which helps you achieve all of this in one go. 

How is Drupal performing in the government sector?  

Drupal is gaining popularity in the government sector all over the world. The solidity and flexibility of the platform are the primary reasons why governments are moving their online portals to Drupal. Government websites like WhiteHouse.gov, the Federal IT Spending Dashboard and Data.gov have specifically chosen Drupal for its efficiency. Most government websites choose Drupal over other platforms because it is:

  • Secure

Drupal has an excellent track record when it comes to solving and maintaining security issues. The security team of the Drupal community, working together with other councils, watches over the platform and ensures that its users get the best security practices.

The fact that the White House entrusted Drupal as its platform is enough to prove that it is a highly secure CMS. Several security modules make it a highly reliable platform, including:

Login Security: Drupal sites that are available over both HTTP and HTTPS can lock down the login page and submit other forms securely via HTTPS, preventing passwords and other sensitive user data from being transmitted in the clear.

Password Policy: This module forces users to create strong passwords. 

Captcha: A challenge-response test constructed to determine whether an action is being performed by a human and not by a bot.

Security Kit: It provides Drupal users with various security hardening options, keeping the website from exposing data and information to hackers and other unauthorized users. Security Kit offers mitigations for cross-site request forgery, cross-site scripting, and clickjacking, among other issues.

Two-factor verification: A procedure that applies two authentication steps, performed one after the other, to verify the user requesting access. 

 

  • Accessible

Government websites are used by everyone, including the visually impaired. Each citizen should be able to access a government website quickly and seamlessly, and according to WCAG 2.0 (Web Content Accessibility Guidelines), every website should provide equal standards to all people.
Drupal is one such platform that adheres to every WCAG guideline through its various modules and provides accessibility to everyone. 

Alt text: This is one of the most important modules for providing accessibility on a website. With it, search engines understand the text describing an image or page, and screen readers read it aloud to the user.

Content accessibility: This module checks all the types of fields where the user can enter formatting. Below the content section of the accessibility page, the user is given the option to turn the tests on or off using the tab that appears below the page. 

Accessibility in WYSIWYG: This kind of module integrates the accessibility configuration with the WYSIWYG or CKEditor modules, providing the user with a new button while editing content that checks the work for accessibility issues. 
 

 

  • Economical

The government budget for software development cannot be compared to that of large enterprises. The government needs to opt for a high-quality solution that does not cost a fortune, as high development and maintenance costs can be an obstacle to finding high-quality website solutions.

Drupal meets these and many other requirements, which is why governments choose it as their CMS. The price of Drupal development is relatively low compared with other CMSs on the market. 

  • Scalable

Another great feature of Drupal 8 is its scalability. The CMS is suitable both for a small survey website and for content-rich websites holding gigabytes of information. 

At the same time, Drupal is capable of handling high-traffic situations: a web solution built on this tool stays available even when the traffic volume jumps sky high. Great examples are the UNESCO and CERN web channels; these Drupal-based websites offer a great user experience to the thousands of people who use them on a daily basis. 

  • Easily Customized

When we say government websites, we automatically imagine something grey, black, white, or otherwise boring. But remember that these websites serve both political and non-political purposes, so user interaction and engagement become crucial.

Drupal and its modules make better user engagement possible. The administrator can add blogs, articles, and write-ups that open up opportunities and solutions. Drupal also gives its users the power to personalize their website according to their needs and requirements, making it a flexible and reliable CMS.

  • Has superb integration capabilities 

A powerful web solution should integrate seamlessly with third-party applications. Publishing tools, data repository, and other features belong to the list of necessary interactions. 

The integration capabilities offered by Drupal are enormous. It provides numerous options, letting the user integrate virtually any type of third-party service.

Drupal Distribution: DeGov to the rescue 

DeGov is the first Drupal 8 open source distribution focusing entirely on the needs of governmental organizations. It is intended to provide a comprehensive set of functionalities commonly used in government-related applications.

What DeGov is not?

The DeGov distribution is not a commercial CMS (developed and owned by a single company, with users usually needing to buy a license), and it is not a finalized product. In other words, DeGov is not a ready-made solution: there are many functionalities and ideas in the backlog, so it is a continuous work in progress.

DeGov is an effort to bring the benefits of open source to the public sector.

Use cases related to DeGov

DeGov covers six use cases, extending its functions to meet the following scenarios:

  • Publishing information websites for government organizations at all levels.
  • Service-oriented e-government portals that close the gap between citizens and the administration.
  • Citizen engagement portals to discuss and decide online.
  • Open311 portals for civic issue tracking.
  • Open Data portals for publishing and creating community data.
  • Intranets and extranets for government employees.

Beneath the Canopy of DeGov

DeGov is a sub-profile of the Lightning distribution.

Lightning allows you to create sub-profiles that are based entirely on the default Lightning distribution profile. Creating a sub-profile enables you to customize the installation process to meet specific needs and requirements.

By combining DeGov with the Lightning distribution, it delivers specialized configuration for starting new projects. Building on top of Lightning lets DeGov work in a true open source manner and focus on functionalities for the public sector.

Which problems does DeGov solve?

Editor 

DeGov solves issues related to complex backends, missing workflows, painful multi-language support, lack of modernity, and so on. It gives editors the highest design flexibility in maintaining the website, along with simple editing of content.

With the help of the drag-and-drop function, you can easily structure your page without any programming. Predefined page types ease maintenance. The central media library manages all types of media, whatever their form (pictures, PDFs, videos, posts).

The DeGov distribution also makes it easy to integrate social media. Likewise, you can let users share content easily in the form of articles, blogs, and other write-ups. DeGov uses a privacy-compliant integration of social media share buttons.

Customers and clients 

DeGov solves issues related to limited features and the high cost of ownership of old, proprietary technologies. It offers a user-friendly web frontend with attractive designs, and it is responsive, adapting to any device and screen size. The use of semantic HTML/CSS makes a website accessible. Likewise, pages can be translated into several languages; a language switcher and screen-reader support ease that workflow.

DeGov websites receive a powerful search function for both website content and documents. This is particularly convenient for the authorities: the NRW administration search engine can be connected to the service so that users can also search content outside their own area of responsibility.

Companies and developers 

It solves the issue of high-cost updates and the functionalities related to them. The distribution is based on open source software; as a result, you pay neither license nor purchase fees.

Drupal is used in numerous projects, so its basic functions are constantly updated and expanded, and bugs and vulnerabilities are fixed quickly.

Implementations with DeGov

Websites

The DeGov distribution can realize federal and state portals, as well as internet sites for ministries, authorities, districts, and municipalities. Web pages and target-group-specific portals (or topic pages) can be created easily with DeGov.

Participation Portals

With participation portals that let citizens weigh in on decisions and proposed measures, the DeGov distribution has opened doors in terms of communication. Here you can notify users, discuss with them, gather valuable suggestions, or create polls. From participatory budgeting through bills to construction projects, portals built with DeGov give citizens a voice.
 
E-Government Portals

The DeGov distribution is a great way to implement entire e-government portals. The focus here is less on editorial content than on complete specialist procedures. In this way, DeGov enables the digital handling of administrative processes. The project Gewerbe.NRW was implemented with the help of the DeGov distribution.

Why DeGov with Drupal 8?

DeGov modules are Drupal modules that are carefully fitted together, so the whole system benefits from their functionality. The reasons to choose DeGov with Drupal 8:

  • It is based on Symfony and uses Composer.
  • Its config files are Git-accessible.
  • It is cloud-ready and runs like a modern PHP solution.

Better projects with DeGov

A case study on Gewerbe-Service-Portal.NRW

The new "Gewerbe-Service-Portal.NRW" provides citizen-friendly services by allowing company founders in the German federal state of North Rhine-Westphalia (NRW) to electronically register a business from home. Its main aim was to provide a clearly arranged online form through which a commercial registration can be transmitted to the responsible authorities.

In addition to business registration, the portal provides information on the topic of founding an enterprise. Furthermore, all users have access to the Einheitliche Ansprechpartner NRW. The online service also supports specialized staff in taking up a service occupation.

The portal was developed on the basis of the Drupal-based content management distributions DeGov and nrwGOV. Drupal was chosen because it was cost-effective and because new technologies could be adopted from the Drupal community. Apart from this, Drupal provided:

  • Higher safety and better quality
  • Independence
  • Comprehensive options
  • Accessibility

The portal aims to give more flexibility to entrepreneurs eligible to start their own business, saving them time through digitization. Electronic forwarding and processing of applications by the authorities ensures effective handling. The result is effective, user-friendly communication between citizens and authorities. In the near future, Gewerbe-Service-Portal.NRW will develop into a comprehensive service platform so that administrative processes can be carried out from home.

In a Nutshell

So how important is the government website to you?

The answer might have been crystal clear by now. 

As important as it is to have a website, maintaining it is a whole different task. Yes, Drupal makes it easy for you with its functionality and distributions, but the art of maintaining a website is as important as creating it.

Ping us at hello@opensenselabs.com; our services would not only get the best out of Drupal but would also help in enhancing your development and industrial standards.


Lullabot: Lullabot Podcast: Developing the latest version of Lullabot.com

Vie, 02/08/2019 - 02:00

Matt and Mike talk with the team that just pushed the button on the new Lullabot.com. 

ARREA-Systems: Coil

Jue, 02/07/2019 - 23:42
Coil

Web Monetization is a browser API that allows the creation of (micro)payments between the reader (user agent) and the content provider (website).

This is one way of getting paid for valuable content.

Today, Coil provides a web monetization service using the Interledger Protocol (ILP).

We have built a simple module to integrate Coil monetization with a Drupal website.
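For illustration, a sketch of how such an integration can work under the hood (the module name and payment pointer below are hypothetical, not our module's actual code; Web Monetization works by printing a monetization meta tag containing your ILP payment pointer on every page):

```php
<?php

/**
 * Implements hook_page_attachments().
 *
 * Adds a Web Monetization meta tag to every page. The payment pointer
 * here is a placeholder; a real module would read it from configuration.
 */
function mymodule_page_attachments(array &$attachments) {
  $attachments['#attached']['html_head'][] = [
    [
      '#tag' => 'meta',
      '#attributes' => [
        'name' => 'monetization',
        'content' => '$wallet.example.com/my-pointer',
      ],
    ],
    'web_monetization',
  ];
}
```

This renders as a meta tag named "monetization" in the page head, which Coil's browser extension detects in order to stream ILP micropayments to the site owner.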

The simple settings for adding a payment pointer are demonstrated in this video:

 

(Video embedded in the original post.)

Drupal blog: Drupal helps rescue ultra marathon runner

Jue, 02/07/2019 - 16:12

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

I'm frequently sent examples of how Drupal has changed the lives of developers, business owners and end users. Recently, I received a very different story of how Drupal had helped in a rescue operation that saved a man's life.

The Snowdonia Ultra Marathon website

In early 2018, Race Director Mike Jones was looking to build a new website for the Ultra-Trail Snowdonia ultra marathon. He reached out to a good friend and developer, Rob Edwards, to lead the development of the website.

© Ultra-trail Snowdonia and No Limits Photography

Rob chose Drupal for its flexibility and extensibility. As an organization supported heavily by volunteers, open source also fit the Snowdonia team's belief in community.

The resulting website, https://apexrunning.co/, included a custom-built timing module. This module allowed volunteers to register each runner and their time at every aid stop.

A runner goes missing

Rob attended the first day of Ultra-Trail Snowdonia to ensure the website ran smoothly. He also monitored the runners at the end of the race to certify they were all accounted for.

Monitoring the system into the early hours of the morning, Rob noticed one runner, after successfully completing checkpoints one and two, hadn't passed through the third checkpoint.

© Ultra-trail Snowdonia and No Limits Photography

Each runner carried a mobile phone with them for emergencies. Mike attempted to make contact with the runner via phone to ensure he was safe. However, this specific area was known for its poor signal and the connection was too weak to get through.

After some more time eagerly watching the live updates, it was clear the runner hadn't reached checkpoint four and more likely hadn't ever made it past checkpoint three. The Ogwen Mountain Rescue were called to action.

Due to the terrain and temperature, searching for the lost runner on foot would be too slow. Instead, the mountain rescue volunteers used a helicopter to scan the area and locate the runner.

How Drupal came to the rescue

The area covered by runners in an ultra marathon like this one is vast. The custom-built timing module helped rescuers narrow down the search area; they knew the runner passed the second checkpoint but never made it to the third.

After following the fluorescent orange markers in the area pinpointed by the Drupal website, the team quickly found the individual. He had fallen and become too injured to carry on. A mild case of hypothermia had set in. The runner was airlifted to the hospital for appropriate care. The good news: the runner survived.

Without Drupal, it might have taken much longer to notify anyone that a runner had gone missing, and there would have been no way to tell when he had dropped off.

NFC and GPS devices are now being explored for these ultra marathon runners to carry with them to provide location data as an extra safety precaution. The Drupal system will be used alongside these devices for more accurate time readings, and Rob is looking into an API to pull this additional data into the Drupal website.

Stories about Drupal having an impact on organizations and individuals, or even helping out in emergencies, drive my sense of purpose. Feel free to keep sending them my way!

Special thanks to Rob Edwards, Poppy Heap (CTI Digital) and Paul Johnson (CTI Digital) for their help with this blog post.

Specbee: Drupal 8 websites in a Flash - 5 reasons to choose Acquia Lightning

Jue, 02/07/2019 - 07:53

Drupal 8 gives developers and content authors full flexibility to shape websites and applications that meet their vision. It is packed with thousands of powerful features, required to support a wide variety of content-rich applications. Acquia Lightning is a lean, ready-to-use starter kit that encompasses just the tools needed to develop and manage your enterprise-grade digital experiences. Lightning is built specifically to empower your marketing and editorial teams to build better, easier, and faster. Here are 5 reasons why choosing Acquia Lightning could be a great decision for your organization's editorial and content teams.

Agiledrop.com Blog: Druplicon.org: In Search of the Lost Druplicon

Jue, 02/07/2019 - 05:24

In this post, we present the story behind druplicon.org, a site for exploring the various different versions of the well-known Drupal logo.

READ MORE


Lullabot: Why Programmers Should Read Good Fiction

Mié, 02/06/2019 - 19:17

If you are a programmer looking to improve your professional craft, there are many resources toward which you will be tempted to turn. Books and classes on programming languages, design patterns, performance, testing, and algorithms are some obvious places to look. Many are worth your time and investment.

Agaric Collective: Pass variables without escaping nor sanitizing to t() in Drupal 8

Mié, 02/06/2019 - 15:12

In Drupal 7 it was useful to do things like this: 

function mymodule_content() {
  $links = array();
  $links[] = l('Google', 'http://www.google.com');
  $links[] = l('Yahoo', 'http://www.yahoo.com');
  return t('Links: !types', array('!types' => implode(', ', $links)));
}

In this case, we are using the exclamation mark to pass the $links into our string. Unfortunately, Drupal 8 doesn't have this option in FormattableMarkup::placeholderFormat(). The good news is that even without it, there is a way to accomplish the same thing.
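One such approach, as a minimal sketch (keeping the hypothetical mymodule from above): placeholderFormat() skips escaping for values that implement MarkupInterface, so wrapping the joined links in Markup::create() lets them through a @ placeholder intact.

```php
<?php

use Drupal\Core\Link;
use Drupal\Core\Render\Markup;
use Drupal\Core\Url;

function mymodule_content() {
  $links = [];
  $links[] = Link::fromTextAndUrl('Google', Url::fromUri('http://www.google.com'))->toString();
  $links[] = Link::fromTextAndUrl('Yahoo', Url::fromUri('http://www.yahoo.com'))->toString();
  // Markup::create() returns a MarkupInterface object, which t()
  // substitutes without escaping, even with the @ placeholder.
  return t('Links: @types', ['@types' => Markup::create(implode(', ', $links))]);
}
```

Note that this shifts the responsibility for sanitization onto you: only wrap strings you know are safe, such as the output of Drupal's own Link builder here.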

Read more and discuss at agaric.coop.

Mass.gov Digital Services: Introducing Drupal Test Traits

Mié, 02/06/2019 - 12:34
Mass.gov dev team releases open source project

The Mass.gov development team is proud to release a new open source project, Drupal Test Traits (DTT). DTT enables you to run PHPUnit tests against your Drupal web site, without wiping your database after each test class. That is, you test with your usual content-filled database, not an empty one. We hope lots of Drupal sites will use DTT and contribute back their improvements. Thanks to PreviousNext and Phase2 for being early adopters.

Mass.gov is a large, content-centric site. Most of our tests click around and assert that content is laid out properly, the corresponding icons are showing, etc. In order to best verify this, we need the Mass.gov database; testing on an empty site won’t suffice. The traditional tool for testing a site using an existing database is Behat. So we used Behat for over a year and found it getting more and more awkward. Behat is great for facilitating conversations between business managers and developers. Those are useful conversations, but many organizations are like ours — we don’t write product specs in Gherkin. In fact, we don’t do anything in Gherkin beside Behat.

Meanwhile, the test framework inside Drupal core improved a lot in the last couple of years (mea culpa). Before Drupal Test Traits, this framework was impossible to use without wiping the site’s database after each test. DTT lets you keep your database and still test using the features of Drupal’s BrowserTestBase and friends. See DrupalTrait::setUp() for details (the bootstrap is inspired by Drush, a different open source project that I maintain).

Zakim Bridge at Night, North End Boston. Photo by David Fox.

Using DTT in a Test

(Code sample embedded in the original Medium post.)
  • Our test cases extend ExistingSiteBase, a convenience class from DTT that imports all the test traits. We will eventually create our own base class and import the traits there.
  • Notice calls to $this->createNode(). This convenience method wraps Drupal’s method of the same name. DTT deletes each created node during tearDown().
  • Note how we call Vocabulary::load(). This is an important point — the full Drupal and Mink APIs are available during a test. The abstraction of Behat is happily removed. Writing test classes more resembles writing module code.
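Putting the points above together, a minimal sketch of such a test class might look like this (the class name, the 'page' content type, and the 'tags' vocabulary are assumptions for illustration; ExistingSiteBase and createNode() are the DTT pieces described above):

```php
<?php

use Drupal\taxonomy\Entity\Vocabulary;
use weitzman\DrupalTestTraits\ExistingSiteBase;

class ExamplePageTest extends ExistingSiteBase {

  public function testPageContent() {
    // createNode() wraps Drupal's helper of the same name; DTT deletes
    // the created node automatically during tearDown().
    $node = $this->createNode([
      'type' => 'page',
      'title' => 'DTT example page',
    ]);

    // Browse to the node on the existing, content-filled site.
    $this->drupalGet($node->toUrl());
    $this->assertSession()->statusCodeEquals(200);
    $this->assertSession()->pageTextContains('DTT example page');

    // The full Drupal API is available during the test: load a config
    // entity directly instead of going through a browser abstraction.
    $vocabulary = Vocabulary::load('tags');
  }

}
```

Because the test runs against the real database, assertions can lean on existing content and configuration rather than rebuilding fixtures from scratch.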
More Features

(Further examples embedded in the original Medium post.)

Misc
  • See the DTT repo for details on how to install and run tests
  • Typically, one does not run tests against a live web site. Tests can fail and leave sites in a “dirty” state so it’s helpful to occasionally refresh to a pristine database.

If you have questions or comments about DTT, please comment below or submit issues/PRs in our repository.

More from Moshe: Our modern development environment at Mass.gov

Interested in a career in civic tech? Find job openings at Digital Services.
Follow us on Twitter | Collaborate with us on GitHub | Visit our site

Introducing Drupal Test Traits was originally published in MA Digital Services on Medium, where people are continuing the conversation by highlighting and responding to this story.
