Popular WordPress Plugins Get Low OpenSSF Scorecard Security Scores, But Does it Matter?
We recently introduced a Plugin Security Scorecard tool to promote better handling of security by the developers of WordPress plugins. A direct inspiration for it is the Open Source Security Foundation’s (OpenSSF) Scorecard. That is marketed as allowing you to “quickly assess open source projects for risky practices” and is supposed to have information on “over 1 million of the most used OSS projects.” While the marketing makes the solution sound impressive, there is a decided lack of evidence that it produces useful results. That seems important, considering that the broad scope of the project raises questions about how reliable the results can be across such divergent software. As we look to improve our own tool, we wanted to better understand what the OpenSSF Scorecard delivers for WordPress plugins. The results were not great.
Limited Breadth of WordPress Plugins Covered
While it is claimed that over 1 million of the most used OSS projects are covered, the website doesn’t provide further details on what is covered. So there is no breakdown of how many WordPress plugins are covered and what those are. As best we can tell, the project checks software hosted on GitHub and GitLab, so WordPress plugins hosted on the WordPress Plugin Directory could only be checked if they are also hosted on one of those.
We looked at all the plugins in the WordPress Plugin Directory that had 2 million or more active installs to see if they were on GitHub. We found 22 that were. Of those, only 6 had received OpenSSF Scorecard assessments. So about a third of those plugins. We didn’t see any clear pattern in which ones had received them. For example, all four of the 10+ million install plugins have GitHub projects, but only two of them had assessments.
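Whether a given GitHub-hosted plugin has received an assessment can be checked programmatically against the Scorecard project’s public API. Here is a minimal sketch in Python; the exact error behavior for repositories without a published assessment is an assumption here (we treat any HTTP error status as “no score”):

```python
import json
import urllib.error
import urllib.request

SCORECARD_API = "https://api.securityscorecards.dev/projects"


def scorecard_url(org: str, repo: str) -> str:
    """Build the Scorecard API URL for a GitHub-hosted repository."""
    return f"{SCORECARD_API}/github.com/{org}/{repo}"


def extract_score(payload: dict):
    """Pull the aggregate score out of an API response, if present."""
    return payload.get("score")


def fetch_score(org: str, repo: str):
    """Return the repository's Scorecard score, or None when no assessment exists."""
    try:
        with urllib.request.urlopen(scorecard_url(org, repo), timeout=10) as resp:
            return extract_score(json.load(resp))
    except urllib.error.HTTPError:
        # Assumed: the API responds with an error status for repositories
        # that have no published assessment.
        return None
```

For example, `fetch_score("elementor", "elementor")` would return the current aggregate score for the Elementor repository, if one has been published.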
Even if the results of the assessments were useful, that limited coverage makes them of limited value for those managing WordPress websites.
Scoring a Project, Not the Plugin
One issue we noticed while looking into the assessments was that a project and not the plugin might receive a score. While there is a score for WooCommerce, it actually is a score for the project the plugin is included in, not the plugin. So even if there is a score, it might not be relevant to the plugin.
Questionable Metric and Measurement
In looking over the components of the score and the results for the 2+ million install WordPress plugins, we easily found some questionable elements. One of them is whether the software has “defined a license.” The explanation for that check raises further questions:
This check tries to determine if the project has published a license. It works by using either hosting APIs or by checking standard locations for a file named according to common conventions for licenses.
A license can give users information about how the source code may or may not be used. The lack of a license will impede any kind of security review or audit and creates a legal risk for potential users.
An actual lack of a license could create legal risk, but we don’t see how it would impede a security review or audit. Could a license restrict doing a security review? Possibly, but that wouldn’t stop hackers, since they are already doing something illegal.
With WordPress plugins, it appears they are checking for a separate license file and not looking at information in the readme.txt file or the plugin’s main file. So the All in One SEO plugin gets the worst score, as it doesn’t have a separate file:
While Elementor gets the best score:
In between is Yoast SEO, where the assessment states the “project license file does not contain an FSF or OSI license:”
It, like Elementor, uses the GNU General Public License Version 3. So the claim that one is an FSF or OSI license and the other isn’t doesn’t make sense.
Another Questionable Metric
Another metric that is problematic, especially given the breadth of what the system is meant to assess, is whether the software has had activity in the last 90 days. The documentation for the scoring system acknowledges this, stating:
Some software, especially smaller utility functions, does not normally need to be maintained. For example, a library that determines if an integer is even would not normally need maintenance unless an underlying implementation language definition changed.
And:
There is no remediation work needed from projects with a low score; this check simply provides insight into the project activity and maintenance commitment. External users should determine whether the software is the type that would not normally need active maintenance.
Despite that, the metric is “high” risk, as opposed to the license metric, which is “low” risk.
The Devil Is in the Details
Another metric is if a plugin has a published security policy. That seems like a very good metric to check, but with an automated system, you only understand so much. The Elementor plugin gets a 10 out of 10:
The security policy that Elementor’s score is based on looks impressive, but the reality is far different. Elementor redirects vulnerability reports for its WordPress plugin to a known unreliable security provider that sells information on vulnerabilities. That arrangement hasn’t produced good results for Elementor.
With our own scoring system, we now provide a warning for plugins that redirect vulnerability reports away from the developer, because it is such a bad idea.
The Scores
With the OpenSSF Scorecard, the higher the score the better, and the highest possible score is 10. Here are the scores for the six plugins we looked at that had scores:
- Advanced Custom Fields (ACF) 3.6
- All in One SEO 2.7
- Elementor 4.1
- Redirection 2.9
- Site Kit by Google 6.1
- Yoast SEO 4.7
Those are not great, with the average being only 4.0. At a quick glance, those scores don’t seem like a great measure of how secure those plugins are relative to the others mentioned.
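For reference, the 4.0 average works out from the published scores listed above:

```python
# OpenSSF Scorecard scores for the six plugins listed above.
scores = {
    "Advanced Custom Fields (ACF)": 3.6,
    "All in One SEO": 2.7,
    "Elementor": 4.1,
    "Redirection": 2.9,
    "Site Kit by Google": 6.1,
    "Yoast SEO": 4.7,
}

average = round(sum(scores.values()) / len(scores), 1)
print(average)  # prints 4.0
```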
The highest score there stands out, as it is for a plugin from Google, which has had significant involvement in the OpenSSF Scorecard project. But even with that involvement, the score isn’t great, which makes us wonder how useful they believe the scores are.
Incorporating OpenSSF Scorecard Into Plugin Security Scorecard
Given the various limitations of both the scoring and access to scores for WordPress plugins, we are not incorporating these scores into our own grades, but when we find that a score is available, we include a link to it in our listing for the plugin. You can also see what those plugins are and compare them to our grades. So far, two of those plugins have been checked:
- Elementor Website Builder F
- Yoast SEO A
Our results are not at all in line with the OpenSSF Scorecard results, which gave both plugins scores in the 4 range.
Your Thoughts?
We would love to hear other people’s perspectives on the OpenSSF Scorecard results for WordPress plugins.