Thanks for posting the video, wish I could have made it. Loved the discussion and the way it developed. I think Jake nailed it toward the end, proposing the addition of a social aspect to .org, enabling .org users to endorse other .org users. I think there’s a tremendous amount of value in being able to search/browse all of the favorited plugins of developers who you’ve endorsed. Also, if .org profiles would show who a user has endorsed, it would be a nice way of discovering other great developers who you may not be familiar with yet. Also, I agree with Shane that there’s a lot more value in showing the # of active installs of a plugin over downloads and would love to see this implemented in .org.
+100. I might be endorsing the endorsement of my own idea here, but I really think this is one of the most immediately actionable, useful ideas that we came up with.
And, as a side note, this might also discourage dishonest or lazy voting.
Also, we may want to intentionally not display the # of endorsements a user has to avoid FB & Twitter-like popularity contests.
A lot was said on this topic. Here are some of the ideas I’ve picked up from rewatching the video:
1) reviews for ratings (perhaps forced reviews with poor ratings)
2) connect ratings / reviews to actual users (perhaps favorites integrations)
3) relate ratings on a time scale in order to prevent older-is-better idea
4) have ability to change review / unfavorite
5) figure out way to make reviews not be worthless (rate reviews themselves, amazon style?)
What have I missed, and which of these are “sure thing” good ideas?
Regarding reviews: buddypress.org has supported plugin reviews for BuddyPress plugins for >1 year. Take a look at http://buddypress.org/community/groups/achievements/reviews/ for an example of this approach.
Is it possible to have a full discussion on these reviews though? I personally think anything more than a one-line response should be split out into a forum topic (linked from the review).
@Ryan Agreed. Rotten Tomatoes is probably the most useful review system out there and their reviews are just a sentence or two. I think the other component to that review system that makes it useful is that the reviews are by film critics, not just anyone. In this case, instead of critics, the reviews written by .org users who I’ve endorsed could be separated from the reviews written by users I haven’t endorsed. Similar to how Flixster has reviews by Flixster users separated from the reviews by Rotten Tomatoes critics.
@otto – is there a pattern right now for .org to have reviews or should I wireframe something up? We have great ideas floating around this forum. I’m staring at zappos right now and this is a really solid experience > http://cl.ly/Inf4 and could easily see it applied to plugins.
You can if you like, but it’s a bit early in the discussion to be making wireframes I think. It still has to be decided what exactly is going to be done. I expect such a discussion to take the better part of a month or two.
Some initial thoughts discussed on the call.
Metrics: # of downloads is deceiving.
It hints at popularity but is deeply biased by age and by users just trying things out with no commitment. Working with George at PressTrends, I was able to get a side-by-side comparison of downloads vs. active installs (across about 80k sites – so it's not perfect) for event calendar plugins in the repo. http://cl.ly/ImfT. The left column is the # of downloads as of a few months ago and the right column is the plugin's rank among all active plugins tracked by PressTrends. You'll notice that "Events Calendar" has 2-3x more downloads than the top plugins and yet is the least used of the bunch. The statistic is deceiving.
1) show # active installs – that is a real number people can count on
2) if we keep showing downloads – maybe limit within X time period (6 months?) along with total
Metrics: Stars have no transparency.
Admit it, you've given yourself 5 stars. And maybe even, in a fit of frustration, given a plugin 1 star that it might or might not have deserved. The challenges with stars are well known: people value a star differently; people judge based upon different criteria; people's ratings are not worth the same… so here is the question. What are things we can do to bring better meaning?
Second, you can’t rate from the dashboard. 100% of the time I abandon a plugin, I personally do not go hunt it down on .org to give it a low rating. Only a few plugins get a high rating from me and that is after a lot of repeated use. Why? Simply because it is inconvenient.
Stars can be gamed. Not that many people are doing it, but Otto mentioned it is a legit issue.
1) Ditch the stars. Replace it with favorite count, which will integrate with profiles and core.
2) Integrate stars vote option upon update or uninstall (increases problem with gaming the system)
3) Do the research to see what the big bad world outside of WP is doing to address this challenge (volunteer please)
Promotion: Screenshots, FAQ & Good Content
When I go to the Apple App Store, the first thing I look at is screenshots. Same thing with the .org repo. The difference is, on our repo I am continually disappointed. Far too often I have to download a plugin and figure out how to make it work just to see what it does. Strong screenshots, a FAQ and a description play a huge role in building trust and expressing value. While we can't require people to create great marketing content, it will be the difference between a good initial impression and mistrust. Look at the awesome impact something as simple as the top banner had.
1) I’m not sure we can require screenshots, but I wouldn’t be opposed to it.
2) Find a way to increase the prominence of plugins which have taken the time to create a strong presence. Reward the effort.
One of the primary reasons I shop at Amazon is to read reviews. It puts ratings into perspective. It gives a story. The ratings are measured by "Was this review helpful: y/n" and then the best and worst rated reviews with high helpfulness are displayed first. The only thing I wish they did was find a way to display the credibility of the reviewer. The cool thing is that we have the ability to do that with author profiles (# of reviews, # of plugins, core contributions and other cool badges).
1) get reviews going and tie them to rating & user
An extension of activation in the downloads section – not all installs are created equal. I have no idea how to do this, but what if there was a way to show that a plugin is used by sites/brands people respect? If CNN is using a plugin on their site, that is a lot more meaningful than if Joe's bar and pub is. Just curious if anyone has any ideas aside from self-reporting in a review.
ok, I have more but need to crash…
I completely agree that active installs would be hugely beneficial and a concrete indicator. +1 for that.
I do think downloads over a period of time can be useful. Maybe it could be integrated into the current stats page. But so much can go into downloads, like counting it as a download every time the plugin updates, which could be a lot.
I believe stars can add value. Here's how the Mac App Store displays them: http://cl.ly/image/2U0j053X2c2G
Just showing the # of each rating is helpful. The app store shows stars for the current version. I think this can create enormous issues. A dev could update to eliminate poor ratings, or may not update in order to preserve good ratings. I think total stars in a span of time is the way to go here, perhaps with the ability to view star history too (dreamy wish).
I view favorites differently than stars. To me, favorites are like my own little vetted list. They’re my “go-to” place. I think favorites could even be a big part of the socialization of the repository, and be one of the metrics used to measure when a plugin is ready to be vetted.
Screenshots and plugin documentation
I completely agree screenshots should be mandatory. I’d like to see nearly every admin page a plugin has (if any) in the screenshots section.
For documentation, I found that to be a difficult part of submitting a plugin. I’d like to see documentation pages for every plugin auto-generated that are basically the same as the codex. This way, even if a plugin author does a poor job, users can document the plugin as they see fit. Plus, a lot of devs do a poor job of understanding what a user needs to know about a plugin anyway.
The app store shows stars for the current version. I think this can create enormous issues. A dev could update to eliminate poor ratings, or may not update in order to preserve good ratings.
It’s not just that this can happen; it actively does happen on the App Store. I think a better way would be to have the ratings not count after x months after an update. So, a new update could be brought out to try and game this, but the previous ratings would still last for a while before disappearing.
Another possibility could be that the ratings decay over time. So, a 3 month old rating would be worth less than a rating made 1 second ago. I know Hacker News (http://news.ycombinator.com/) uses a rating system for comments based on this: http://amix.dk/blog/post/19574
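To make the decay idea concrete, here's a rough sketch of a gravity-style decay like the Hacker News formula linked above, applied to star ratings. The gravity constant and weighting function are invented for illustration, not a proposal for the actual numbers:

```python
from datetime import datetime, timedelta

def decayed_average(ratings, now=None, gravity=1.8):
    """Weight each (stars, timestamp) rating by age so newer
    ratings count more; gravity controls how fast old ones fade."""
    now = now or datetime.utcnow()
    num = den = 0.0
    for stars, when in ratings:
        age_days = max((now - when).total_seconds() / 86400, 0)
        weight = 1.0 / (age_days + 2) ** gravity  # HN-style decay
        num += stars * weight
        den += weight
    return num / den if den else None

now = datetime(2012, 8, 20)
ratings = [
    (5, now - timedelta(days=1)),    # fresh 5-star
    (1, now - timedelta(days=365)),  # year-old 1-star
]
# the recent 5-star dominates the year-old 1-star
print(round(decayed_average(ratings, now=now), 2))
```

With this shape, an old pile of 5-star ratings can't carry an abandoned plugin forever, which addresses the "older-is-better" problem raised earlier in the thread.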
A bit late to the party, but +1 on including some sort of history type system for ratings. Maybe a two color chart line would help. Show ratings based on the last 1 (or 2) years in the primary color, and then the total stars in secondary color.
Specifically regarding "# of downloads is deceiving":
Our stats tracking for active installs isn’t quite up to par yet to get those numbers. And that’s a hard problem, but one that we might be able to solve in the long run.
But, real *numbers* aren't something we'd be willing to share, because raw numbers are usually crap without solid interpretation. Percentages, sure.
Here’s the thing: if I said 1000 sites ran your plugin, then all you know is that I said 1000 sites ran your plugin. You have no idea that that is a correct number. You don’t know my measurement methods. You don’t know my crazy assumptions. You have no way to gauge the reliability of my data. But, if I say that your plugin runs on 2.5% of measured sites, then that’s at least reliable, when compared against other plugins. It’s a useful metric.
Think about this problem: how many sites run WordPress.org software? Well, how do you define “run”? How do you define “site”, even?
If a site is setup on a server and nobody ever visits it, is it really running? At what point does a site disappear from our “count”? Not an easy question to answer, is it? This sort of thing is why raw numbers are misleading.
What about something along the lines of a "report anonymous usage data to WordPress.org" option, which could even be a setting on the General Settings page. This simply pings back to .org every so often with the WP version and plugin/theme versions.
This hits the following:
Sites with no visitors never report, because the process does not run.
Greater reliability of tracking: when users opt out, a final ping back to .org could record that. This way opt-outs are tracked and we know what % of the user base we are getting usage info from.
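A minimal sketch of what such an opt-in ping might contain. The endpoint does not exist; all field names here are made up for illustration:

```python
import json

def build_usage_ping(site):
    """Assemble the anonymous payload a site could POST to a
    hypothetical WordPress.org stats endpoint on a cron tick."""
    return json.dumps({
        "wp_version": site["wp_version"],
        "plugins": [{"slug": s, "version": v}
                    for s, v in sorted(site["plugins"].items())],
        "theme": {"slug": site["theme"], "version": site["theme_version"]},
        "opted_out": site.get("opted_out", False),  # final ping on opt-out
    })

payload = build_usage_ping({
    "wp_version": "3.4.1",
    "plugins": {"akismet": "2.5.6", "buddypress": "1.6"},
    "theme": "twentyeleven", "theme_version": "1.4",
})
print(payload)
```

Note there is deliberately no site URL or other identifying data in the payload, which is the "anonymous" half of the proposal.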
regarding @patrickgarman’s comment, getting that kind of data would be a huge boon to plugin authors as well as users. Many of us are operating blind and good data would really have an impact. For plugins that affect the front end, knowing the top XX themes used in conjunction with the plugin would help smooth layout conflicts. Same with knowing the top XX plugins to test for conflicts.
@otto, I totally hear you that the #s get convoluted.
% could be a reasonable approach, although I wonder how much share even the largest niche plugins have. Seeing 4% could imply low adoption when the plugin is in fact the largest in its niche by 50%.
Maybe something like plugin install rank could be another. Especially if it could be cut by some type of category or tag. I.e., like Brian was asking: show me all SEO plugins sorted by active install base (rank). That would dramatically improve search. The only challenge is that it will naturally bury anything new and noteworthy.
There is the interesting challenge of MU installs: does a 10k-instance MU build count as 1 or 10k?
Also, how long does a site stay a credible statistic? If your plugin was installed 3 years ago and hasn't been updated, nor has the site been updated and it still runs WP 2.8, is this a useful & legit count? We know that very few WP sites ever upgrade.
Can you weigh one site to be "worth" more than another? Personally, knowing that someone like TechCrunch runs your plugin means more to me than Joe's diner. Perhaps that is more about exposing top users or something. Not sure how to do this, but it would be a big deal in terms of plugin credibility.
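The "sort SEO plugins by active install base" query suggested above is straightforward once the data exists. A sketch with made-up slugs and numbers:

```python
plugins = [
    {"slug": "seo-alpha", "tags": {"seo"}, "active_installs": 120_000},
    {"slug": "seo-beta", "tags": {"seo"}, "active_installs": 450_000},
    {"slug": "gallery-x", "tags": {"gallery"}, "active_installs": 900_000},
]

def by_install_rank(plugins, tag):
    """Filter to one tag, then rank by active install base."""
    matches = [p for p in plugins if tag in p["tags"]]
    return sorted(matches, key=lambda p: p["active_installs"], reverse=True)

# gallery-x has the most installs overall but is excluded from the niche
print([p["slug"] for p in by_install_rank(plugins, "seo")])
```

The hard part isn't the sort; it's collecting trustworthy active-install counts in the first place, per Otto's comments above.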
I think that low %'s are just something each niche will need to weigh. In a specific niche, the active developers will probably know that 2% is actually really high compared to all the others getting 0.2%.
I think that MU sites should be treated as if each site was its own, though, as far as stats go. Each site would at least have a unique home_url.
Even old sites that have not been updated are still technically credible, which is why version numbers are so important. We don't like seeing sites running WP 2.X, but it happens, and if the site is active it should still be counted, but counted appropriately. Just like the plugins currently do with active versions, although I'm not sure where this metric comes from currently? – http://cl.ly/image/353w0r2b3o09
As far as weighing some sites more, that might be asking for too much information from a site. Part of reporting anonymous data is that it's being reported anonymously. It'd be awesome to know that kind of information, but I'm not sure how well it would play out.
Usage rank is invaluable for plugin/theme developers. Bonus: show us the trend plotted over time. But I'm not sure it's a good measure of quality either. Some of the most-used plugins are well known for their lack of quality. Also, plugins that do too much will tend to be better ranked than plugins that do just one thing but very, very well. Citing Jake: "Britney Spears is really popular".
Agreed that usage != quality, I think this is going to be more along the lines of tracking popularity and stats for developers more so than a usable stat for finding a good plugin.
Maybe once a vetting/review process is in place a plugin that gets released or has substantial growth over X period of time automatically gets flagged to check out.
A couple thoughts on this… Some already covered by others here.
I don’t see the importance of number of downloads as a measure of quality. It could be a measure of age or the reach of the plugin developer more than anything; neither is a good way to tell if something is coded well. Another reason against “vetted” authors: even if every other plugin of theirs was great, their latest one might be lacking.
Star ratings have never had much meaning to me unless there is a corresponding review explaining why. A Yelp-like review and comment system directly attached to the plugin would be amazing. When we find a plugin that breaks something we could leave a comment saying “This broke my site because of x, y, z.”
However, all of these things may matter to different people. So instead of trying to figure out the one thing to sort plugins by, or build an algorithm that takes it all into account, wouldn’t it be better to have sortable listings? You could sort by star ratings, author rating, # of downloads, etc…
Oh… And screenshots… Some plugins simply add functionality, like a simple filter, and have no UI or anything that COULD be a screenshot. Encouraging them is great. Requiring them does not make sense.
I think one of the major problems with ratings is, as discussed, their one-dimensionality. Ratings give you an overall picture of the plugin, but a one-star could mean “makes your server explode” or simply “I didn’t like this plugin but I can’t be bothered thinking of a proper rating”.
Someone mentioned in the Hangout about adding subratings to give more granularity to this. While this could work, I think it would add unneeded complexity to the system. I think a better way would be to introduce reviews and link them to a rating, with the review part optional.
How do you handle reviews though? Personally, I think reviews would serve a dual-purpose: to inform others of the plugin’s qualities, and also to start a conversation with the plugin author. A “helpfulness” metric would help here, much like Amazon’s: http://www.amazon.com/Kindle-Wi-Fi-Ink-Display-international/product-reviews/B0051QVF7A/ e.g. – Reviews that are rated “helpful” the most would bubble to the top, and those could help users make an informed decision on the plugin.
Trying to tie that into a social aspect, a flair system similar to Amazon’s system could be incorporated. Amazon shows “top 100 reviewer” for example; WP.org could show “core contributor”, “core committer”, “plugin author (42 plugins, average rating 4.9)” for example. This would introduce a trustworthiness metric into the reviews.
(Tying this into the badges idea: when a plugin is submitted for review, the reviewer could post their findings back as a review, with their flair including “official security review team”, e.g.)
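One common way to implement the "helpful reviews bubble to the top" behavior without letting a single early vote dominate is to sort by the Wilson score lower bound rather than the raw helpful ratio. This is just one candidate ranking, sketched here:

```python
from math import sqrt

def wilson_lower_bound(helpful, total, z=1.96):
    """Lower bound of the 95% confidence interval for the
    'helpful' proportion; rewards both ratio and vote volume."""
    if total == 0:
        return 0.0
    p = helpful / total
    denom = 1 + z * z / total
    centre = p + z * z / (2 * total)
    spread = z * sqrt((p * (1 - p) + z * z / (4 * total)) / total)
    return (centre - spread) / denom

reviews = [
    ("1 of 1 helpful", wilson_lower_bound(1, 1)),
    ("95 of 100 helpful", wilson_lower_bound(95, 100)),
]
# the well-sampled review outranks the single lucky vote,
# even though 1/1 is a "perfect" ratio
reviews.sort(key=lambda r: r[1], reverse=True)
print([name for name, _ in reviews])
```

A naive helpful/total sort would rank the 1-of-1 review first; the confidence-interval approach waits for evidence before promoting a review.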
I like the idea of showing “core contributor”, etc on reviews.
If you extend that to show it against the plugin authors on the plugin page itself, that would a) give users another useful signal about the quality of the plugin and b) give plugin authors additional motivation to contribute to core. Everyone wins.
All of the above suggestions are great (Shane – awesome list). That said, I can’t underscore enough that popularity is NOT a measure of quality. I don’t even care whose calendar plugin is the most *used* (vs. downloaded). I care which one is best.
I really don’t check CNET for reviews much, but we could do something along the lines of what they have: there are user reviews, and then separate reviews from core contributors or however you want to separate those users out.
Although I do agree with Jake, there are a couple of things going on in the entire conversation here. Enhancing user reviews and that functionality is great, and I think a lot is already planned for this. I still would check (albeit not as deeply) plugins that have good “core contributor” reviews. This combined with a “vetted” status would be good, though. Of the 50 plugins in a niche, 6 are vetted, and 2 have core contributor reviews. This leaves you with a variety of information to go on.
BestBuy has some cool UX for its users reviews. Easy to filter for the things you’re interested about: http://screenshots.mzaweb.com/fqPi
@Ryan: +1 for “Helpfulness”
@Daniel: +1 for Best Buy’s description of relevant attributes.
@Patrick: IMO a “vetted” status is too binary and will favor the common use-cases which still won’t be relevant enough when the use-case requires more detailed evaluation.
@Jake: “Best” is too subjective. I’d suggest a review system that lets the motivated 1% give detailed reviews that 5-star rank relevant attributes, and use those reviews/ratings to drive reviewers’ reputations, which drives how much their ratings count in the overall score presented to the end user. This will allow a single simple number for end-users but allow advanced people to drill down more. Here’s a mockup (sorry for pasting this URL a third time): http://screenshots.newclarity.net/skitched-20120817-224118.png
Of the three topics, I feel that this one is the most important, and also the most realistic, since it doesn’t rely on a team of reviewers.
While reviews can be just as unhelpful as star ratings, they also have the potential to hugely improve things. There have been a huge number of times that I, as a user and developer, have wished I could leave a review on a quality plugin, in part to help it stand out from the crowd and in part just because I thought the developer deserved it.
By tying the reviews into .org user accounts and having their stats (# of plugins, core contributions, etc, etc) displayed (as badges or some other form), we can dramatically improve the worth of reviews. These stats won’t necessarily mean a lot to the typical WP user that doesn’t even know what the concept of Core is, but it will be a huge step in the right direction.
With reviews, there is always the possibility that users will leave abusive reviews, even when the plugin doesn’t deserve it, so I’d suggest a system for flagging reviews as poor also be implemented. Once a review has three negative flags, it gets hidden.
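The flagging threshold suggested above could be as simple as the following sketch (the three-flag cutoff and field names are just illustrative; a real system would also want moderator override):

```python
FLAG_THRESHOLD = 3  # per the suggestion above; tunable

def visible_reviews(reviews):
    """Hide any review that has accumulated enough abuse flags."""
    return [r for r in reviews if r["flags"] < FLAG_THRESHOLD]

reviews = [
    {"id": 1, "text": "Great plugin, solved my problem", "flags": 0},
    {"id": 2, "text": "Abusive rant", "flags": 3},
]
print([r["id"] for r in visible_reviews(reviews)])  # → [1]
```

Hiding rather than deleting keeps the review available for appeal, and a hard threshold like this would probably need rate limiting so a handful of users can't silence legitimate criticism.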
“Favorites” is not really a good seal of approval. You may add a plugin to favorites to remember to review it later. If the label were “Recommended” it’d imply much more for me. If I see that, for instance, Otto has a plugin in his “Recommended” list, it’s a no-brainer decision for me to use it. Then, we can have a curated list of the “Recommended” plugins by the best devs (best as in # of core contributions, popularity of their plugins, etc). That seems fairly easy to implement, and seems to be a good solution for what @jakemgold proposed in the first place.
In short, if 5 core devs and 3 big names in the community “Recommended” a plugin, just go for it.
This will also not discriminate against new devs, because even though I can trust that “Recommended” list, it’s not an official dotorg label. It also eliminates the favouritism factor altogether.
I think a “Recommend this Plugin” option is an excellent idea.
+1 on recommend as well. This is a wonderful idea. Really it is just a favorite but with a different label.
+1 I think if each .org user was able to maintain a list of “Recommended” plugins and a list of other .org users they’ve “Endorsed”, this would make searching and browsing the .org repo a pleasure.
Agree that Favorites is not good, but “Recommend” is too simple. People recommend plugins for criteria that differ from the criteria I might care about. They may love its UI, whereas I might care that it uses Custom Post Types and provides hooks I can integrate with. Here’s a mockup of a potential rating module that I propose anyone should be able to use, like anyone can review a product on Amazon: http://screenshots.newclarity.net/skitched-20120817-224118.png
The trick here is knowing who recommended a plug-in.
Knowing who recommended a plugin is useful, but knowing *why* they recommended it is even more useful. I’d be happy to recommend a plugin to one client and recommend against the same plugin to another, depending on the situation. Also, knowing *who* isn’t useful unless we have some validated way for people to establish their reputation as being a good reviewer. Unless we just want to create a cult of personality, which I sincerely hope is not the plan.
Mike – I think that’s where user profiles or “badges” come into play. Reviews from core contributors, for instance, might carry more weight, and it’s an earned credential.
I don’t really follow the “cult of personality” point. People will respect certain reviewers based on earned “cred”, not, say, personal likability (I think).
I think our views are coalescing. By “cult-of-personality” I was referring to what would happen if selected people were “blessed” to be the reviewers, not because of their cumulative reputation score created by their ongoing efforts but because they were hand picked.
Being a core developer doesn’t mean you will be a great plugin reviewer. I believe if we have something similar to StackExchange’s reputation model but for reviews then awesome reviewers will emerge, people who will take a lot of their own time to write great reviews. The core developers will never have time for that, and shouldn’t. Yes they can have an opinion and people may care about it, but we shouldn’t build a review system around their opinions.
The idea of badges is perfect which assumes people will be able to drill down to see all the different reviews and who made them but also have an aggregate rollup of reviews. So if Mark Jaquith’s opinion is important to someone they should be able to see that, but my point which I think we agree on is we should not limit the reviewers to a hand selected group; we should empower everyone in the community to review if they are motivated to.
In topic #3 I suggested a reputation system based in part on plugin reviews that were voted to be helpful. Here’s a mockup of what such a plugin review might look like:
Jake, people said the rating system needs to be simple, but I think we won’t get value until the ratings capture more valuable information. End users need a simple 9-of-10 stars, but people like me need to know: does it use custom post types when it should, does it have hooks I can use, can you override styling with your own CSS, how many HTTP requests does it make, does it run query_posts() 10 times on every page load, etc. Having ratings based on criteria (any one of which a rater could ignore) would give WordPress.org the ability to show what other developers have said and thus make it easier for us to evaluate and to rate the same plugins ourselves.
We could create a handful of “vetting” criteria and then let reviewers add any criteria they want, and over time the core community could vet more criteria based on what people are using a lot in their plugin ratings.
This will be better than the 5 star ratings because only motivated people will rate, and ones who aren’t motivated will get poor reputations for their poor ratings and thus either won’t rate or their ratings won’t affect the single score displayed for the end-user.
Capturing this kind of criteria-based rating would allow WordPress.org to present weighted rankings of plugins for different personas; a non-technical end-user blogger will care about “Easy to Install”, “Visually Appealing UI” and “Unlikely to Break Site”, whereas an agency building a site for a Fortune 100 will probably care a lot more about “Highly Secure”, “Able to Scale” and “Unlikely to Break Site.” Criteria-weighted rating algorithms could be pre-developed for common personas, and if a query UI is created it would let the motivated among us do custom queries with our own weightings based on what we care about for any given site we might be working on.
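The persona-weighting idea above reduces to a weighted average over whichever criteria a persona cares about. A sketch, where all criteria names, star values and weights are invented for the example:

```python
def persona_score(criteria_ratings, persona_weights):
    """Weighted average of per-criterion star ratings, using only
    the criteria a given persona assigns weight to."""
    total = weight_sum = 0.0
    for criterion, weight in persona_weights.items():
        if criterion in criteria_ratings:  # raters may skip criteria
            total += criteria_ratings[criterion] * weight
            weight_sum += weight
    return total / weight_sum if weight_sum else None

# one plugin's averaged per-criterion ratings (hypothetical)
plugin = {"easy_to_install": 5, "has_hooks": 2, "secure": 4, "scales": 3}

blogger = {"easy_to_install": 3, "secure": 1}          # persona weights
agency = {"secure": 3, "scales": 3, "has_hooks": 2}

print(round(persona_score(plugin, blogger), 2))  # blogger-facing score
print(round(persona_score(plugin, agency), 2))   # agency-facing score
```

The same underlying ratings yield a high score for the blogger and a mediocre one for the agency, which is exactly the "different personas, different rankings" outcome described above.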
Oops, P2 ate my <img> tag. Here’s the URL to the screenshot: http://screenshots.newclarity.net/skitched-20120817-224118.png
That looks perfect to me.
Yet another idea that would push this all to a more social atmosphere… I’d love to see a “Request a review” option where a plugin author could request a review from others. I imagined this on a small scale where I request it from people I know… However, I could see it getting ugly for the more well-known developers, who could get inundated with requests. But maybe a dev could opt in to be a reviewer and set a cap on requests? I could see this getting messy, but the idea of requesting a review is interesting.
I like the sentiment but agree it could get ugly, fast. Everyone would ask @nacin and @markjaquith to review their plugin and they would get overwhelmed and very grumpy very quickly (and rightly so!)
Instead of requests for reviews why not provide incentives for people who write great reviews by adding a reputation system and the best reviewer and their reviews could bubble up to the top?
Agreed – lots of people will want reviews from credible devs, for the exposure if nothing else.
Yep. But I also think great reviewers could emerge, like great question answerers did on WordPress Answers, if a reputation system incents them properly. Those emergent highly rated reviewers would probably have more clout with the 90%+ of the user base who don’t pay enough attention to who the core developers even are, especially since many end users struggle with things that are easy for the core developers.
How about a “support available” flag? Let users know that if they download this plugin, they’re on their own. Or, on the contrary, the developer can indicate that they will support/maintain their creation.