Using Composer for Drupal Modules and Private Bitbucket Repos

The next installment in my ongoing series of posts creating a public record of things I couldn’t learn in one Google search: a process for using Composer to track a Drupal 8 module in a private repository.

It’s pretty common for Drupal agencies to have a small collection of modules they have built in-house and use on nearly all client sites, or to build a module for one client that has many sites. We are all becoming adept at managing our projects with Composer, but the vast majority of resources focus on managing publicly available code via Packagist. Sometimes these internally shared modules cannot be made fully public (for example, they may contain IP belonging to the client). We have one such client that needs a module deployed to dozens of sites, so I sat down a few weeks ago to figure out a solution.

We use Bitbucket for our private repositories. I am sure there is a similar solution using GitHub, but I haven’t worked out its details.

  1. Create private repo for module on Bitbucket.
  2. Clone that repo locally, and structure it to match Drupal.org’s conventions (this probably isn’t required, but should allow your module to blend into the rest of the project more smoothly).
  3. Create an OAuth token for your account in Bitbucket. Make sure to include a dummy callback URL; you can literally use http://www.example.com. If you see references to auth.json, don’t worry about that part yet.
  4. Add a composer.json file to the module’s repo (it only requires module name, type, and the branch alias, but it’s good to include the rest):
    {
      "name": "client/client_private_module",
      "type": "drupal-module",
      "description": "A very important module to our very important client.",
      "keywords": ["Drupal"],
      "homepage": "https://www.bitbucket.org/great_agency/client_private_module",
      "license": "proprietary",
      "minimum-stability": "dev",
      "extra": {
        "branch-alias": {
          "8.x-1.x": "1.x-dev"
        }
      },
      "require": {
        "drupal/diff": "~1.0"
      }
    }
    
  5. Add a reference to the project’s composer.json repositories section:
    {
      "type": "package",
      "package": {
        "name": "client/client_private_module",
        "version": "dev",
        "type": "drupal-module",
        "dist": {
          "url": "https://www.bitbucket.org/great_agency/client_private_module/get/8.x-1.x.zip",
          "type": "zip"
        }
      }
    }
  6. Now just run composer require client/client_private_module, and provide the OAuth credentials from step 3 (note: the first time you do this, Composer will create the needed ~/.composer/auth.json; a sketch of that file follows).
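
For reference, Composer stores those credentials in auth.json under a bitbucket-oauth key, roughly like this (the key and secret values are placeholders):

{
  "bitbucket-oauth": {
    "bitbucket.org": {
      "consumer-key": "your-oauth-key",
      "consumer-secret": "your-oauth-secret"
    }
  }
}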

A Process to create a Drupal 8 module’s Config

One of the best practices for Drupal 8 that is still emerging is how to create modules with complex deployable configuration. In the past we often abused the Features module to do this, and while that continues to be an option, with Drupal 8’s vastly improved configuration management and the ability to install configuration easily, I have been looking for something better. I particularly want to build modules that don’t carry unnecessary dependencies but still reliably include all the needed configuration. After a few tries I think I’ve struck on an effective process.

Let’s start with a quick refresher on installing configuration for a Drupal 8 module. During module installation Drupal will load any YAML files in your module’s config/install directory that match configuration patterns it already knows about. In theory this is great, but if you want to include configuration that comes with other modules you have to figure out which files are needed; if you want to include configuration from core modules you will probably need a fairly large collection of files to get all the required elements. Finding all those files and copying them quickly and easily is the challenge I set out to solve.

My process starts with a local development sandbox site that exists just to support this work, and I create a local git repository for the site’s configuration (I don’t need to connect it to a remote like Bitbucket or GitHub, or track all of the site’s code, since it’s just there to help me find changes to config files). Once installation and any base configuration are complete, I export the site’s config to the directory covered by the repo (here I used d8_builder/config/sync; the site itself was at d8_builder/pub) and make sure all changes in the repository are committed.
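
A minimal sketch of that first export and commit, assuming Drupal Console is installed and the site’s sync directory is configured to point at d8_builder/config/sync:

cd d8_builder/pub
drupal config:export --remove-uuid --remove-config-hash
cd ../config/sync
git init
git add .
git commit -m "Initial configuration export"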

Now I create my module and a second repository just for it. The module’s repository is linked to a remote since this is the actual product I’m creating.

With that plumbing in place I can make whatever configuration changes I need included in the module. Lately I’ve been creating a custom moderation workflow with several user roles and edge cases that will need to be deployed on a dozen or so sites, so you’ll see that reflected below, but this process should work for just about any project with lots of interrelated configuration.

Once I have completed a set of changes, I export the site’s configuration again, making sure to avoid UUIDs and hashes that will cause trouble on import: drupal config:export --remove-uuid --remove-config-hash

Now git can easily show which configuration files were changed, added, or removed:
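
For example, from inside the configuration repository:

cd d8_builder/config/sync
git status --short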

Next I use git, xargs, and cp to copy those files into the module (hat tip on this detail to Andy Gregorowicz):
git ls-files -om --exclude-standard --exclude=core.extension.yml | xargs -I{} cp "{}" pub/modules/custom/fancy_workflow/config/install/

Notice that I skip the core.extension.yml file. If your module has dependencies you’ll still need to update your module’s info.yml file to list them.
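
As a hypothetical example, a fancy_workflow.info.yml for the moderation scenario above might list its dependencies like this (the module name matches the path in the copy command; the dependencies are placeholders):

name: Fancy Workflow
type: module
description: Deployable moderation workflow configuration.
core: 8.x
dependencies:
  - content_moderation
  - workflows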

Now a quick commit and push of the changes to the module’s repo, and I’m ready to pull the module into other projects. I also commit the builder repo to ensure it’s easy to track any future changes.

This isn’t a replacement for tools like Configuration Installer, which are designed to handle an entire site; this is intended just for module development.

If you think you have a better solution, or that I’m missing something important, please let me know.

Drupal 8: Remote Database Services

I recently completed a Drupal 8 project that required pulling data from a remote database. The actual data is not terribly complicated, so Drupal’s role in this case is mostly to provide an abstraction layer that converts the database into a (cacheable) JSON response. Pulling all the pieces together took a little more research and guessing than I expected, so I figured I might save a few people time by writing it up. This is more of an intermediate than a beginner project, so I’m going to skip over lots of detail that is important to making it all really work. To really understand what’s happening here you’ll want a basic understanding of Drupal 8’s controllers and database services.

What we’re doing here is creating a database service and a controller to provide a JSON endpoint. We’ll define the database connection, the Drupal service, and then the controller.

Drupal allows us to define additional database connections in the main settings file, which gives us easy access to Drupal’s database services and query classes.

Connection Definition

The first step is to define the database connection in settings.php:

<?php
$databases['remote']['default'] = [
  'database' => 'extra_data',
  'username' => 'accessingUser',
  'password' => 'UseGoodPasswordsIn2017&Beyond',
  'prefix' => '',
  'host' => '10.10.1.1',
  'port' => '',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
  // I'll explain this detail later.
  'pdo' => [
    \PDO::ATTR_TIMEOUT => 5,
  ],
];

Services

Next we need to create a new database service using the new connection information. Drupal core’s database service can be leveraged to connect to additional databases by just defining your new connection as a service in your module’s services.yml file.

services:
  mydataservice.database:
    class: Drupal\Core\Database\Connection
    factory: 'Drupal\Core\Database\Database::getConnection'
    arguments: ['default', 'remote']

Notice the arguments from the database service are the array keys (in reverse order) from the settings.php definition of the connection.

That’s all it takes to create a new service that wraps around your database.

The Controller

That service is all well and good as far as it goes, but if we want to actually send the data to the browser we need a controller that leverages our new service and sends the response.

Using dependency injection, I attached the service to the controller, and at first it looked like it worked great.

<?php
  /**
   * Constructs a new DataSearchController object.
   */
  public function __construct(ConfigFactory $config_factory, Connection $dataservice) {
    $this->config = $config_factory->get('mydataservice.datasettings');
    $this->database = $dataservice;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('config.factory'),
      $container->get('mydataservice.database')
    );
  }

  public function getData(Request $request) {
    $query = $this->database->select('mytable', 'mt');
    // From here you gather your data and send your response...
  }

In fact it did work flawlessly all the way through initial testing. Using the database service to get the data and the cacheable JSON response technique I’d worked out previously, everything came together quickly. You could make a request of the controller with your search terms and the browser would get back a list of objects for display.

Then, just before launch, the client physically relocated the database server and didn’t tell us it would be offline. It turns out we hadn’t tested what happens to an injected database service when there is no response from the remote database server. The request would wait for the database connection to time out and then throw an exception that didn’t get handled in my code at all. There was no place to add a nice error message, and it was incredibly slow since the timeout was 30 seconds.

So at the last minute I had two more problems to solve: trap the error and shorten the timeout on PDO connections.

Drupal 8 database services attempt their connection when the service itself is created. If you use dependency injection, that means exceptions need to be caught in create(). But create() cannot send a response to the browser; that has to happen later, when the function that corresponds to the active route is called by the kernel.

My solution was to make the database service an optional parameter on the controller, and adjust the static object returned by create() based on whether the connection threw an exception:

<?php
  /**
   * Constructs a new DataSearchController object.
   */
  public function __construct(ConfigFactory $config_factory, Connection $dataservice = NULL) {
    $this->config = $config_factory->get('mydataservice.datasettings');
    $this->database = $dataservice;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    try {
      return new static(
        $container->get('config.factory'),
        $container->get('mydataservice.database')
      );
    }
    catch (\Exception $e) {
      return new static(
        $container->get('config.factory')
      );
    }
  }

  public function getData(Request $request) {
    if (!$this->database) {
      throw new HttpException(404, $this->config->get('database_offline_message'));
    }
    $query = $this->database->select('mytable', 'mt');
    // From here you gather your data and send your response...
  }

The other way to handle the exception would be to load the connection from Drupal’s service container when you need it, instead of using dependency injection for that service.
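
A rough sketch of that alternative, reusing the hypothetical service and config names from above:

<?php
public function getData(Request $request) {
  try {
    // Look the connection up only when it is actually needed; a failed
    // connection surfaces here, where we can turn it into a response.
    $database = \Drupal::service('mydataservice.database');
  }
  catch (\Exception $e) {
    throw new HttpException(404, $this->config->get('database_offline_message'));
  }
  $query = $database->select('mytable', 'mt');
  // From here you gather your data and send your response...
}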

Also, notice we’re throwing a 404 error. Ideally it would return a 5xx error, but those trigger other behaviors that prevented me from providing nice errors for the JavaScript application to process easily. Our controller also had a page display (to send the JavaScript libraries and base markup for the actual interface on application startup), which meant we needed to create a reasonably well-themed response in that function as well:

<?php
  // If the database is offline, then send error message.
  if (!$this->database) {
    \Drupal::service('page_cache_kill_switch')->trigger();
    $message = $this->config->get('database_offline_message');

    $error = [
      '#theme' => 'dataservice_error_page',
      '#attributes' => [
        'class' => ['dataservice', 'database-offline'],
        'id' => 'dataservice-error',
      ],
      '#message' => [
        '#type' => 'processed_text',
        '#text' => $message['value'],
        '#format' => $message['format'],
        '#filter_types_to_skip' => [],
      ],
      '#title' => $this->t('Database Offline'),
    ];

    return $error;
  }

Settings revisited

So that fixed the errors, but it still meant a really long wait for the database connection to time out before the connection error was even thrown in the first place. And now we come back to that PDO section of the database connection definition.

<?php
$databases['remote']['default'] = [
  'database' => 'extra_data',
  'username' => 'accessingUser',
  'password' => 'UseGoodPasswordsIn2017&Beyond',
  'prefix' => '',
  'host' => '255.255.255.255',
  'port' => '',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
  // Now is when I explain this
  'pdo' => [
    \PDO::ATTR_TIMEOUT => 5,
  ],
];

PDO expects you to override settings at connection time, not in a settings file. So I couldn’t just update my php.ini and call it a day, but I also couldn’t find any documentation on how to change PDO settings for these connections. Leveraging the power of open source, I started to follow the code paths, actually reading through how Drupal establishes MySQL connections, and found it.

If you add a sub-array to your connection definition keyed ‘pdo’, Drupal will load and apply those settings instead of the defaults. There are a couple of settings that Drupal insists on being right about for performance, stability, and security reasons, but the timeout and many others are fair game.

Controlling Block Visibility with a Custom Field in Drupal 8 (updated for 9)

A while back I wrote up a pattern for creating static blocks on Drupal 8 sites. This week I was working on a site where one of those blocks needs to be enabled or disabled on specific nodes at the discretion of the content author. To make this happen, I’m adding a new feature to my pattern.

In older versions of Drupal there were a number of ways to go, like the PHP Filter or custom handling in the block’s view hook, but I figured there were probably more appropriate tools for this in Drupal 8. I found what I needed in the Condition plugin system (more evidence that plugins are addictive). According to the change record, conditions were designed to centralize a lot of the common logic used for controlling blocks, and I found the system works quite nicely in this case as well (although a more generalized version might be useful).

I have the complete condition plugin at the end so you don’t have to get all the details exactly right as we go.

I started by adding a boolean field named field_enable_sidebar to the content type. Then, using Drupal Console, I generated the stub condition plugin: drupal generate:plugin:condition. In doing this the first time I also looked at the condition defined in the core Node module to handle block visibility by content type.

The console will ask you a couple of questions. Obviously you can attach the plugin to any module you’d like and call it whatever you’d like; for this example I have it in a fake module called my_blocks and the condition is named SidebarCondition. But the next couple of questions are less obvious and more important.

Context Type should be entity

The context type should be set to entity since we are looking to work based on the node being displayed.

Context Entity Type should be Content

Next it’s trying to filter between entity types, and since we’re doing this based on the node content entity type, select “Content” to get a list of content entities on your site.

Context Definition ID should be Content

Finally select “Content” again since that’s the label for node entities in Drupal 8. If you have your field on another content entity type (like a taxonomy term or a file), pick that entity here instead and the rest of this should still work with minor editing.

Once you run through the wizard you’ll have a new file in your module: my_blocks/src/Plugin/Condition/SidebarCondition.php

The condition plugin contains two main elements: a form that’s attached to all block settings forms, and an evaluate function that is called by blocks to determine if this condition applies in their current context.

buildConfigurationForm() defines the form array elements you need. In this case that means a simple checkbox to indicate that this condition applies to this block. We also need to define submitConfigurationForm() to save the values on block save.

public function buildConfigurationForm(array $form, FormStateInterface $form_state) {
  $form['sidebarActive'] = [
    '#type'          => 'checkbox',
    '#title'         => $this->t('When Sidebar Field Active'),
    '#default_value' => $this->configuration['sidebarActive'],
    '#description'   => $this->t('Enable this block when the sidebar field on the node is active.'),
  ];
  return parent::buildConfigurationForm($form, $form_state);
}

public function submitConfigurationForm(array &$form, FormStateInterface $form_state) {
  $this->configuration['sidebarActive'] = $form_state->getValue('sidebarActive');
  parent::submitConfigurationForm($form, $form_state);
}

In the complete example you’ll also see summary(), which provides the human-friendly description of the values that have been set for this condition.
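
A minimal sketch of what that method can look like for this condition:

/**
 * {@inheritdoc}
 */
public function summary() {
  if (!empty($this->configuration['sidebarActive'])) {
    return $this->t('Enabled when the sidebar field is active on the node.');
  }
  return $this->t('Not restricted by the sidebar field.');
}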

Now let’s jump back to the top of the plugin and review the annotation. Conditions are annotated plugins, and the questions I guided you through above were used to generate the annotation at the start of the file:

/**
* Provides the 'Sidebar condition' condition.
*
* @Condition(
*   id = "sidebar_condition",
*   label = @Translation("Sidebar block condition"),
*   context_definitions = {
*     "node" = @ContextDefinition(
*        "entity:node",
*        required = TRUE ,
*        label = @Translation("node")
*     )
*   }
* )
*/

This is defining the context you’ll want passed to the condition for evaluation. In this case we are requiring that a node entity labeled “node” is provided when we need it.

The real work of the plugin is handled by evaluate():

/**
 * Evaluates the condition and returns TRUE or FALSE accordingly.
 *
 * @return bool
 *   TRUE if the condition has been met, FALSE otherwise.
 */
public function evaluate() {
  if (empty($this->configuration['sidebarActive']) && !$this->isNegated()) {
    return TRUE;
  }
  $node = $this->getContextValue('node');
  if ($node->hasField('field_enable_sidebar') && $node->field_enable_sidebar->value) {
    return TRUE;
  }
  return FALSE;
}

The first conditional ensures that this plugin doesn’t disable all the blocks that aren’t using it. Next we ask for the node context value we defined in the annotation, which gives us the node currently being displayed. Since not all node types are guaranteed to have our sidebar field, we first check that it exists and then check its value, returning the correct status for block display.

Now every time a content author checks the box on a node, any blocks with this condition enabled will be displayed along with that node. And the best part is that the author doesn’t even need block display permissions; we’ve allowed them to bypass that part of the system entirely.

T-Shirts Revisited

A few weeks ago I wrote about not taking free t-shirts from vendors at DrupalCon (or other tech conferences). Well, DrupalCon North America 2017 has come and gone, so I thought I’d report back on this year’s t-shirts.

My free shirts from DrupalCon 2017

I ended up with seven new free shirts, all from places that also offered them in women’s sizes. In addition to the official conference t-shirt I picked up shirts from Lingotek, Linode, Pantheon, Optasy, Chef.io, and Kanopi Studios. There were a couple of companies that appeared to be giving out shirts whom I didn’t talk to, so there may be a couple worthy of complimenting that I missed. The big prize for the year goes to Linode, whose job ad was a postcard in the shape of a (men’s) t-shirt and read:

Want a career in the cloud hosting industry? We’ve got a shirt in YOUR size.

Pantheon, as usual, had the most popular t-shirts, with their custom-printed shirts in a wide variety of sizes. That continues to be an amazing way to get people to watch their demo and collect contact information (although, to be fair, the demo itself is pretty amazing).

There were also a few companies that tried to convince me that bringing “unisex” shirts was the same as if they had brought women’s sizes.

Unisex shirts are, of course, just men’s shirts with a different label. And there are, of course, women who prefer the fit of that cut to shirts sold as women’s. But suggesting they work for everyone is just finding new language for cheaping out. Arguably it’s worse than just forgetting that people come in different shapes, since it shows someone thought about women looking for shirts but couldn’t be bothered to realize that most people want clothes that take their body shape into account. Ever met a guy who says he prefers how a women’s cut shirt fits his body?

At least I didn’t hear anyone recommending using them as pajamas.

Cached JSON responses in Drupal 8

This week I was working on a Drupal 8 project that includes a page using Drupal as a simple proxy to convert a weak XML API into a simple JSON response. To keep that page fast I wanted the JSON responses to be cached.

When I first built the site I hadn’t yet gotten my head around Drupal 8 caching, so the JSON responses weren’t cached and the page was slow. After some issues with the site forced me to look at this part of the project again, I decided it was time to do something about it.

Drupal 8’s use of Symfony means we have great tools for simple tasks like providing JSON responses. Originally, all the controller had to do was hand the JsonResponse class an array of data and it would handle the rest.
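
Something along these lines, where getFeed() and loadData() are hypothetical names standing in for the controller method and whatever gathers the data from the XML API:

<?php
use Symfony\Component\HttpFoundation\JsonResponse;

// Inside the controller:
public function getFeed(Request $request) {
  // loadData() is a hypothetical helper that queries the XML API and
  // returns a plain PHP array.
  return new JsonResponse($this->loadData($request));
}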

That’s all well and good if you don’t want your response to ever be cached, but Drupal provides a CacheableJsonResponse class that links up to the rest of the Drupal 8 caching engine to provide much better performance than a standard Symfony JsonResponse. It turns out the docs kinda suck at explaining how to use it; after a great deal of digging I found this question on StackExchange, which gave me what I needed.
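
A sketch of how CacheableJsonResponse can be wired up, continuing with the hypothetical names from the previous sketch (the cache contexts and max-age here are illustrative, not prescriptive):

<?php
use Drupal\Core\Cache\CacheableJsonResponse;
use Drupal\Core\Cache\CacheableMetadata;

// Inside the controller:
public function getFeed(Request $request) {
  $response = new CacheableJsonResponse($this->loadData($request));

  // Tell the caching system what this response varies on and how long it
  // may live; adjust both to match your data.
  $metadata = new CacheableMetadata();
  $metadata->setCacheContexts(['url.query_args']);
  $metadata->setCacheMaxAge(3600);
  $response->addCacheableDependency($metadata);

  return $response;
}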

Edit: You must enable the Internal Dynamic Page Cache module to get CacheableJsonResponse to work correctly. Thank you for the correction.

Early Thoughts on Drupal Governance Change

One of the things the Drupal community has learned in the last few weeks is that our current governance structures aren’t working in several ways. Having spent a lot of time at DrupalCon talking about these issues, I figured I’d share a few initial thoughts for those working on our new processes.

This isn’t the first time I’ve been part of a community that was changing how it organizes itself. In my religious life I am a Quaker, and for a long time I was a member of Philadelphia Yearly Meeting, the regional organizing body for Quakers in the greater Philadelphia area, and I served for a time on several of its leadership committees. I’ve seen that 300+ year old group pass through at least three different governance structures, and while many of the fundamentals stay the same, the details that matter to people change a lot.

My great aunt put it into perspective during one of the long discussions about change. When my wife asked her for her opinion about a then-pending proposal, she responded that it didn’t matter much to her as long as it worked for those willing to take leadership roles at the moment.

So as the Drupal community grows through a process to change our leadership structure, here are the things I think it is important for all of us to remember.

  1. It will not be perfect.  We’re human, we will make mistakes, that’s okay.
  2. It will change again. I don’t know when or why, but whatever we do will serve us for a time, and then we’ll replace it again.
  3. Most of the community won’t care most of the time. Most of the time, most of us don’t notice what Dries, the Drupal Association, Community Working Group, and all the other groups that provide vision and leadership are doing.

I think we can all agree my first point is a given. I mention it mostly because some of us will find fault in anything done going forward. We should remember that the people doing this work are doing the best they can, and give them the support to do it well.

On the plus side, whatever mistakes we make will be temporary, because Drupal and its community will outlive whatever we create this time. We’ll outgrow it, get annoyed with the flaws, or just plain decide to change it again. Whatever we build needs to be designed to be changed, improved, and replaced in the future. Think about it like the clauses in the U.S. Constitution designed to allow amendments to the Constitution itself.

Finally, we should remember that community and project governance is inside baseball. Understanding how and why we have the leadership we do is like watching a pitching duel on a rainy day: most baseball fans don’t enjoy those kinds of games. Most of our community wants to use Drupal, and they don’t want to have to think about how DrupalCon, Drupal.org, and other spaces and events are managed. That will not prevent them from complaining next time there are problems, but it is a fact of life that those who do care should acknowledge.

Our community is stronger than we have been giving it credit for in the last few weeks. We need to be patient and kind with each other, and we’ll get through this and the divisions that will come in the future.

DrupalCon Baltimore Notes

I’m using this post as a place to store and share notes from DrupalCon Baltimore. The part of conference notes I tend to find most useful is the links and stray ideas I pick up talking with people. I don’t take detailed notes anymore since I rarely, if ever, go back to them, although on rare occasions I do re-watch a session I found particularly useful.

Basically this is a dump of links, pictures from various things, and a few stray thoughts. I’ll edit as the week progresses and probably add more thoughts and ideas.

Prenote: https://events.drupal.org/baltimore2017/balti-more-prenote-balti-most-fun-drupalcon (unmitigated silliness)

DriesNote: https://events.drupal.org/baltimore2017/driesnote

Important things to call out:

  • Claims we’re launching 15,000 D8 sites per month!
  • Field Layout module (experimental in Drupal 8.3 core) appears to be DS in core: https://www.drupal.org/node/2795833
  • BigPipe is ready for production.
  • Quick edit can do drag and drop image upload.

Greg Anderson’s PHP OSS Workflow tools: https://events.drupal.org/baltimore2017/sessions/development-workflow-tools-open-source-php-libraries

  • Some time is more important than others, like outages.
  • To find Drupal plugins on Packagist: https://packagist.org/search/?type=drupal-drush
  • poser.pugx.org provides badges on packagist pages.
  • https://scrutinizer-ci.com/ “Sometimes any static analysis tools will give you answers you don’t like so just ignore it.”
  • https://www.versioneye.com/ uses the licenses from composer.json to check both for out-of-date packages and for license compatibility. Not terribly useful internally, but useful for open projects since you can link a badge to the result. An alternative is `composer licenses`, which is actually smarter.
  • Sami: A Symfony-based tool that generates JavaDoc-like API documentation. https://github.com/FriendsOfPHP/Sami

Baby Steps, Lessons Learned & Big plans for Drupal Diversity and Inclusion: https://events.drupal.org/baltimore2017/sessions/year-diversity-initiatives (Core conversations style 30 minute talk with 30 minutes questions).


Launching Online Stores with Commerce 2.x on Drupal 8 https://events.drupal.org/baltimore2017/sessions/launching-online-stores-commerce-2x-drupal-8

Drupal is Changing, Quickly: How and Why https://events.drupal.org/baltimore2017/sessions/drupal-changing-quickly-how-and-why

Composer Resources:


Project Estimates:

My free shirts from DrupalCon.

Technology and its workforce at ethics crossroad

https://events.drupal.org/baltimore2017/keynote-technology-and-its-workforce-ethics-crossroad

  • Humans don’t panic properly! We panic too late instead of when we can do something about it.
  • Programmers shouldn’t trust themselves since they don’t know what will happen with their work later. @zeynep keynote #drupalcon
  • Everything is multi-causal.
  • Toolmakers’ ideals don’t rule their tools.

  • Surveillance is Baked into Everything.
  • Dismantling structures of accountability
  • Labor Realities of New Economy

Raising the bar with Guardr

https://events.drupal.org/baltimore2017/sessions/raising-security-bar-guardr

Watch later:

Additional resources to check out:

A Pattern for Drupal 8 Blocks

For Drupal 7, I have a pattern to help simplify creating and managing custom blocks for a site, since the standard hook_block_info() and especially hook_block_view() implementations tend to get messy and junked up with markup in the code. My solution was to use the block delta key from the block info array as a helper function name called from hook_block_view(), and a theme function in hook_theme() to make sure I could easily create a template for each block. I’m not going into detail about it since it was hardly original, and I’ve seen several variations from other developers.

When I started to work on Drupal 8 I wanted to develop a similar pattern to create simple conventions for integrating with the front-end work on a project. While the new plugin system helps avoid the ugliness of old hook_block_view() implementations, how to create a Twig file for your custom block isn’t obvious and no naming convention is imposed. Worse, I’ve seen lots of example code with blocks that return render arrays of type markup, meaning they have HTML in strings in PHP.

Most projects involve a collection of nearly static blocks that provide basic information like copyright text, a disclaimer, a link to the firm that built the site, decorative flourishes, and other similar elements that don’t benefit from being managed as content. After a couple of experiments I’ve come up with a solution that works pretty well:

Outline of the custom blocks pattern.

  1. Create a module for simple blocks (although I use the rest of the pattern anytime I’m creating a custom block).
  2. Create the block class for each block within the module at src/Plugin/Block/CopyrightBlock.php.
  3. Create a theme function for each block in the main .module file.
  4. Create a Twig file to implement the theme function within the module at templates/custom-blocks-copyright.html.twig.

For a copyright block the class itself is simple. We implement the variables we want rendered by the block, and set the cache to expire at midnight (one of the places a timed cache makes more sense than cache tags).
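
Here’s a sketch of what that class can look like; the module name custom_blocks and the block ID are assumptions, and the #theme key matches the theme function defined next:

<?php

namespace Drupal\custom_blocks\Plugin\Block;

use Drupal\Core\Block\BlockBase;

/**
 * Provides a simple copyright block.
 *
 * @Block(
 *   id = "custom_blocks_copyright",
 *   admin_label = @Translation("Copyright block")
 * )
 */
class CopyrightBlock extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    return [
      '#theme' => 'custom_blocks_copyright',
      '#attributes' => ['class' => ['copyright']],
      '#year' => date('Y'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function getCacheMaxAge() {
    // Let the cached copy expire at midnight so the year rolls over on time.
    return strtotime('tomorrow') - time();
  }

}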

With the block created, we need to define the custom_blocks_copyright theme function we referenced in the block, using hook_theme() in the .module file. Remember, any custom variables you used in the block class need to also be defined here (in this case attributes and year).
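
A minimal hook_theme() implementation for that, again assuming the module is named custom_blocks:

<?php

/**
 * Implements hook_theme().
 */
function custom_blocks_theme($existing, $type, $theme, $path) {
  return [
    'custom_blocks_copyright' => [
      'variables' => [
        'attributes' => [],
        'year' => NULL,
      ],
    ],
  ];
}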

Finally, create the Twig file to provide the actual markup.
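
Something like this in templates/custom-blocks-copyright.html.twig, with the markup itself obviously up to your front end:

{# templates/custom-blocks-copyright.html.twig #}
<div{{ attributes }}>
  <p>&copy; {{ year }} Example Organization. All rights reserved.</p>
</div>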

Drupal 8 Custom Plugins are addictive

In December I gave a talk at the SCDUG meeting on Drupal 8 plugins.  This is based on that talk with a few improvements and additions from things I’ve learned since. 

Drupal 8 provides many great improvements, but the one I’ve found most exciting is how easy it is to create my own plugins. They are actually a bit addictive, and it’s easy to start seeing projects as an excuse to create a new plugin manager. At first it can take a little effort to get your head around the process; I got great details from Unraveling the Drupal 8 Plugin System from Drupalize.me. But the problem with that article, and all the examples I could find, was that they were either too deep (the Drupalize.me article should be considered required reading, but the example is too abstract for most people) or too shallow (the drupal.org Plugin API examples currently lack context about why I would want to do this in practice).

So my goal is to straddle that gap.

What’s a plugin?

Plugins are a design pattern that lets you extend a module or service. Drupal 8 has several plugin discovery systems:

  • Annotated ← I’m mostly talking about this one; it’s used for blocks, views, rules, the Queue API, and more.
  • YAML ← These are used for menus, routes, and services.
  • Hook ← D7 carry over hook system, still used in many places.
  • Discovery Decorators ← Wrap around other plugin systems to replace/improve on the classic info alter hooks from previous Drupal versions.
  • Static ← These are used for test code, and I’m ignoring them here.

Basic Commonly Used Plugins

As a Drupal 8 developer you probably use several plugins when building a site, particularly if there is any custom functionality required. When creating controllers you’ll need a route, and likely a menu item, which means you’ll use YAML files to define those aspects. For blocks, the Queue API, or Rules you’ll create an annotated plugin. Theme functions are still commonly defined with hook_theme().

Why would I create a new plugin?

Between core and major contrib modules (like Rules and Search API), plugin options abound, but it’s not always obvious why you might want to create one for a more focused project. For large projects that have a large number of related functions, poorly defined use cases, or use cases that are expected to expand over time, plugins provide three major advantages over other approaches:

  • Plugins make it easy to keep classes small and focused.
  • Plugins make it easy to extend a service.
  • Plugins allow one module to easily extend the function of another.

Actual Examples

Cyberwoven has a client who needed an internal training portal. They wanted to provide some basic gamification, particularly badges, and they wanted more detailed tracking of user behavior than the core Statistics module provides. The badges present a function that has many related, but different, use cases and is likely to grow over time. So I created a pluggable service that allows us to easily define a set of conditions for creating a new badge. Each plugin’s annotation lists the events it cares about, and when one of those events fires the service runs the plugin’s test function. If the badge has been earned, the plugin returns the information needed to generate that entity. All the event listening, entity generation, and error checking happens in the service, so each plugin is focused only on the details of the one badge. The event tracker is similar, with each plugin defining a filter for the major system events to determine what details should be recorded, and the service handling the logging of those details and the reporting. As the client’s needs have evolved we’ve been able to add and modify plugins to meet them.

The second place I’ve created a custom plugin system was an internal project at Cyberwoven (it was actually the project I used to learn how to create custom plugins). Our testing server has always had a service to perform some basic operations on the server, like pulling repos via webhooks, and similar tools to help developers. Setting up our development environments for D8 meant it was time for a new test server, and therefore new tools to manage it. From the start I wanted to create a tool that helped both developers and account managers. So I created a D8 site to manage the sites, which provides various tools to perform the task operations we all need (not only code handling, but also checking whether security releases apply to any sites). All the various operations we want exposed through that site are created as plugins to our custom manager. Each task plugin is focused on just that one task. All the other pieces needed to track the sites, route requests to run a task, and render their responses are handled elsewhere. It’s also now easy to add features that wrap around the plugins, since the plugins all have a consistent interface (like adding new displays and running some tasks through cron).

Introducing Site Manager

The listing page for the site manager tool. This runs on our server and can be run in developers’ sandboxes as well.

The main module defines an entity for each site, a couple displays (like the listing page displayed above), and the annotated plugin base so we can add new features easily. Plugins can be enabled and disabled on a site-by-site basis, and can be run from the plugins dropbutton.

Once we had the basics in place (like getting the git and drush statuses via simple plugins) we started to add more powerful features to turn the tool into a dashboard. We added the module search block you see on the left as a separate module that searches all sites for a specific Drupal module (nice when security updates are released). The search module also provides a plugin for each site that lists the modules for just that site.

The last site in the screenshot has a reference to scanning the HTTPS certificate for a specific site. This is another feature we added recently as the tool increasingly serves as a dashboard. The HTTPS scanner module again provides both a batch job to scan all sites and a task plugin that scans just one.

As of this writing there are nine working task plugins across four modules, with a couple more underway. Tasks can also have their results displayed in a block when you look at the details for a site.

Building your own manager

Hopefully by now you’re convinced that it’s worth having your own plugin manager, at least sometimes. So the next question is how to build them.

Elements of a custom plugin

Plugins have several parts, most of which are small and simple:

  • Annotation Plugin Definition
  • Plugin Manager Service
  • Plugin Interface
  • Plugin Base
  • Example Implementation
  • Controller (optional)

Annotation Plugin

The first step is to define the annotation for your plugin. This is a simple class that extends Drupal\Component\Annotation\Plugin to define the variables you want listed in the annotation comments of the plugin. This file goes in custom_module/src/Annotation, and as always the file name and class name match:
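
The original file isn’t reproduced in this post, so here is a sketch using the sitemanager names implied by the service definition below (the exact annotation properties are assumptions):

<?php

namespace Drupal\sitemanager\Annotation;

use Drupal\Component\Annotation\Plugin;

/**
 * Defines a Task plugin annotation object.
 *
 * @Annotation
 */
class TaskPlugin extends Plugin {

  /**
   * The plugin ID.
   *
   * @var string
   */
  public $id;

  /**
   * The human-readable name of the task.
   *
   * @var \Drupal\Core\Annotation\Translation
   *
   * @ingroup plugin_translatable
   */
  public $label;

  /**
   * A short description of what the task does.
   *
   * @var \Drupal\Core\Annotation\Translation
   *
   * @ingroup plugin_translatable
   */
  public $description;

  /**
   * A grouping key used to organize tasks in the UI.
   *
   * @var string
   */
  public $category;

}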

Plugin Manager Service

The plugin manager itself is a service, but the vast majority of its function comes from the parent classes. The convention is to place this file in custom_module/src/Plugin (as with the rest of the base definitions we’re about to provide). In this case we call it TaskPluginManager.php.
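
A stripped-down sketch of that manager, assuming task plugins live in each module’s src/Plugin/Task directory and use the TaskPlugin annotation above:

<?php

namespace Drupal\sitemanager\Plugin;

use Drupal\Core\Cache\CacheBackendInterface;
use Drupal\Core\Extension\ModuleHandlerInterface;
use Drupal\Core\Plugin\DefaultPluginManager;

/**
 * Manages discovery and instantiation of task plugins.
 */
class TaskPluginManager extends DefaultPluginManager {

  /**
   * Constructs a TaskPluginManager object.
   */
  public function __construct(\Traversable $namespaces, CacheBackendInterface $cache_backend, ModuleHandlerInterface $module_handler) {
    parent::__construct(
      'Plugin/Task',
      $namespaces,
      $module_handler,
      'Drupal\sitemanager\Plugin\TaskPluginInterface',
      'Drupal\sitemanager\Annotation\TaskPlugin'
    );
    $this->alterInfo('task_plugin_info');
    $this->setCacheBackend($cache_backend, 'task_plugins');
  }

}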

The full version of this class provides a few more improvements (I overrode the getDefinitions() method to provide some filtering options to support disabling plugins), but those aren’t actually required for it to work well.

 

In short, this class just provides some general definitions to separate it from all the other annotated plugins.

And since the plugin manager is a service, we add it to our module’s services.yml file:

services:
  plugin.manager.task_plugin.processor:
    class: Drupal\sitemanager\Plugin\TaskPluginManager
    parent: default_plugin_manager

Plugin Interface

Next we define the plugin’s interface.

 

For this plugin I defined four public functions: three just provide basic information about the plugin, but the fourth, run(), will be the interesting one in a minute.
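
The original interface isn’t included here, so treat the informational method names in this sketch as stand-ins:

<?php

namespace Drupal\sitemanager\Plugin;

use Drupal\Component\Plugin\PluginInspectionInterface;

/**
 * Defines the interface all task plugins must implement.
 */
interface TaskPluginInterface extends PluginInspectionInterface {

  /**
   * Returns the human-readable label of the task.
   *
   * @return string
   */
  public function getLabel();

  /**
   * Returns a short description of what the task does.
   *
   * @return string
   */
  public function getDescription();

  /**
   * Returns the grouping the task belongs to.
   *
   * @return string
   */
  public function getCategory();

  /**
   * Runs the task against a site and returns the result.
   *
   * @param object $site
   *   The site entity the task should operate on.
   *
   * @return array
   *   A render array describing the task output.
   */
  public function run($site);

}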

Plugin Base

Finally we’re getting to something interesting (or at least something that requires code).  The plugin base is an abstract class which all the actual plugins will extend. This goes in custom_module/src/Plugin/TaskPluginBase.php.

The things worth noticing here: I inject several services into the base class so each plugin has them available (translation, the main sitemanager module configuration, and a wrapper around Symfony’s Process component to make running drush and git commands easier). The three informational functions are fully defined here, but run() stays abstract since it’s the function that actually justifies creating a plugin at all.
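
A simplified sketch of that base class; to keep it short it only injects the configuration factory, where the full version also injects translation and the process wrapper:

<?php

namespace Drupal\sitemanager\Plugin;

use Drupal\Core\Config\ConfigFactoryInterface;
use Drupal\Core\Plugin\ContainerFactoryPluginInterface;
use Drupal\Core\Plugin\PluginBase;
use Drupal\Core\StringTranslation\StringTranslationTrait;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Base class for task plugins.
 */
abstract class TaskPluginBase extends PluginBase implements TaskPluginInterface, ContainerFactoryPluginInterface {

  use StringTranslationTrait;

  /**
   * The sitemanager module settings.
   *
   * @var \Drupal\Core\Config\ImmutableConfig
   */
  protected $config;

  /**
   * Constructs a task plugin.
   */
  public function __construct(array $configuration, $plugin_id, $plugin_definition, ConfigFactoryInterface $config_factory) {
    parent::__construct($configuration, $plugin_id, $plugin_definition);
    $this->config = $config_factory->get('sitemanager.settings');
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
    return new static(
      $configuration,
      $plugin_id,
      $plugin_definition,
      $container->get('config.factory')
    );
  }

  /**
   * {@inheritdoc}
   */
  public function getLabel() {
    return (string) $this->pluginDefinition['label'];
  }

  /**
   * {@inheritdoc}
   */
  public function getDescription() {
    return (string) $this->pluginDefinition['description'];
  }

  /**
   * {@inheritdoc}
   */
  public function getCategory() {
    return (string) $this->pluginDefinition['category'];
  }

  /**
   * {@inheritdoc}
   */
  abstract public function run($site);

}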

In theory I could stop right there. I’ve created a plugin manager and defined all the things someone else needs to be able to use it. But there are two technically optional parts that are useful if you like to work and play well with others:  an example implementation and a controller for use or testing purposes.

Example Implementation

We went through all the trouble of creating the plugin manager; before you stop, you should create at least one plugin to prove all the work we just did actually works.

This is a simple task that uses drush to run cron for a site. The site entities know the location of the Drupal root on the file system, and the simple task service handles the extra settings that control where commands are run (plus timeouts and error trapping), so we can safely feed it the command and the location and get back the response.

The method returns a render array that can be used as part of a response.
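
A sketch of that plugin; the site entity’s getDocroot() method and the direct shell_exec() call are stand-ins for the real entity API and the wrapped process service described above:

<?php

namespace Drupal\sitemanager\Plugin\Task;

use Drupal\sitemanager\Plugin\TaskPluginBase;

/**
 * Runs cron for a site using drush.
 *
 * @TaskPlugin(
 *   id = "run_cron",
 *   label = @Translation("Run cron"),
 *   description = @Translation("Runs cron for the selected site using drush."),
 *   category = "drush"
 * )
 */
class RunCron extends TaskPluginBase {

  /**
   * {@inheritdoc}
   */
  public function run($site) {
    // The site entity knows where its Drupal root lives.
    $root = $site->getDocroot();

    // A real implementation wraps this call with timeouts and error trapping.
    $output = shell_exec(sprintf('cd %s && drush core-cron 2>&1', escapeshellarg($root)));

    return [
      '#type' => 'container',
      '#attributes' => ['class' => ['task-result', 'task-result-cron']],
      'output' => ['#plain_text' => trim((string) $output)],
    ];
  }

}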

Controller

The controller is truly 100% optional.  There are two reasons you might want to create one: to actually run the plugin if that makes sense (it does for the site manager), and to provide for easy testing of a plugin.

For site manager I created the controller because I actually needed it to run the tasks from the dropbutton (it’s the route behind all the links on those buttons). But when I created plugin managers for clients, I realized the plugins can be very hard to test: they are buried inside several layers of complexity, which makes a test suite a challenge to create correctly. And since part of the reason to use plugins is a project with growing definitions, the test suite is often of limited use for rushed additions. But by having a carefully built (and secured) controller and route, you can create an endpoint to use to test a plugin.

Because it’s optional, and well covered elsewhere (or heck, just use Drupal Console to generate it), I’m going to skip over most of the details of actually building the route and linking it to the controller, and focus on the single function that runs the task itself. To keep this short I’ve removed lots of extraneous details like security and error trapping; creating a tool like this isn’t for beginners, so I’m assuming you know how to do those things.

The plugin manager service was injected into the controller (code not shown), so I just have it create an instance of the requested task. The controller calls the run() function. If the task is successful, the output (a render array) is used for the response; otherwise a simple message is sent. This sends an AjaxResponse to work with Drupal’s standard JS handlers, but it could generate a full page just as easily.
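
A sketch of that controller method; the property name, route parameters, and CSS selector are assumptions, and the real version includes the access checks and error trapping noted above:

<?php
use Drupal\Core\Ajax\AjaxResponse;
use Drupal\Core\Ajax\HtmlCommand;

// Inside the controller; $this->taskManager is the injected plugin manager.
public function runTask($site, $task_id) {
  $response = new AjaxResponse();

  try {
    $task = $this->taskManager->createInstance($task_id);
    $output = $task->run($site);
  }
  catch (\Exception $e) {
    $output = ['#plain_text' => $this->t('The task could not be completed.')];
  }

  // HtmlCommand accepts a render array and handles rendering it for us.
  $response->addCommand(new HtmlCommand('#task-results-' . $site->id(), $output));

  return $response;
}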

Final thoughts

Being able to create your own plugins easily is really nice. In Drupal 7, custom plugins were generally considered an advanced developer task. The improvements in the abstraction (and the fact that true plugin support is now in core, not pulled from ctools or other critical modules) mean that while it’s still not a beginner task, it is something all Drupal developers should be learning.

But remember you don’t always need or want the complexity of your own plugins. I’ve found the idea a bit addictive, and I get unreasonably excited when I find a place plugins are used or where it makes sense for me to create a new plugin type. For custom work you can almost always do the same thing with extra functions on a service, controller, or event listener, so this is a judgement call in the end.