Reviewing the Review Process

This is my first post, so you might not know who I am. I’m Justin Scott (fligtar) and I’m an AMO developer, currently working on the developer/admin pages of the Remora project.

Recently, there’s been a lot of feedback through various channels on the AMO reskinning, “spyware” toolbars, and other issues, and I get the feeling that users think their comments are falling upon deaf ears, or not even falling within audible range. This is not the case. AMO 1.0/2.0 is frozen so that our development efforts can focus on Remora, which may give the impression that we don’t care about fixing its problems. Issues raised in bug reports, on IRC, and through other channels are definitely being considered and implemented in Remora.

Last week at the Firefox 2 Summit, we came up with a solution to many of these problems that will raise the quality bar and increase the transparency of the add-on review process. The new system will be divided into two sections: a public area (similar to the current AMO) and a sandbox area. The graphic below is a basic summary of how it will work.

We hope that this process will:

  • Eliminate approval delays for updates to trusted extensions
  • Keep add-ons that are not ready for prime-time out of the general public area
  • Automatically run security checks on all submitted add-on code and flag for security review if triggered
  • Provide a home for Mozilla Labs creations not ready for mass use

While an add-on is in the sandbox, regular users will be able to install and test it and then write reviews on whether it works and should be pushed to the public site. After several of these reviews, an AMO editor will review the add-on along with the public reviews and decide whether to send it to the public area or keep it in the sandbox. We expect that some add-ons will stay in the sandbox forever, such as those automatically generated by certain websites.
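
The flow just described can be sketched as a simple state machine. This is a hypothetical illustration in Python – the class and status names are my own invention, not actual Remora code:

```python
from enum import Enum

class Status(Enum):
    SANDBOX = "sandbox"
    PUBLIC = "public"

class AddOn:
    """Toy model of an add-on moving through the proposed review flow."""

    def __init__(self, name, trusted=False):
        self.name = name
        self.trusted = trusted
        self.status = Status.SANDBOX  # every submission starts sandboxed
        self.user_reviews = []

    def submit_update(self):
        # Updates to already-public trusted add-ons skip the queue entirely.
        if self.trusted and self.status is Status.PUBLIC:
            return "published immediately"
        self.status = Status.SANDBOX
        return "queued in sandbox"

    def add_user_review(self, works: bool):
        # Regular users install, test, and report whether the add-on works.
        self.user_reviews.append(works)

    def ready_for_editor(self, min_reviews=3):
        # An AMO editor only looks once several user reviews exist.
        return len(self.user_reviews) >= min_reviews

    def editor_decision(self, approve: bool):
        # The editor either promotes the add-on or leaves it sandboxed.
        if approve:
            self.status = Status.PUBLIC
```

So a new add-on collects user reviews in the sandbox, an editor then weighs those reviews and decides on promotion, and a trusted add-on’s subsequent updates go live without waiting in the queue.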

Please see our meeting notes in the Meeting Notepad or append to the Idea Dump for any ideas you may have regarding the development of Remora.

  • thericochet

    Thank you for acknowledging the complaints and shedding light on the process.

  • Ian

    Having that diagram somewhere on the site, when done, might be useful too.

  • zmanzero

    thank you for taking a look and trying to fix what has become a horrible mess.

  • Wow. I might consider using AMO now instead of hosting my extension myself. I’m going to keep watching this progress!

  • Nice to see you blogging! Thanks for showing us this scheme. With the editor review in place it should be stable enough to actually work. I have only one concern, and I think it has been raised on morgamic’s blog before – what about all the less popular extensions? There is, for example, my Xpoint Sidebar extension (which should be removed from AMO, but that isn’t the point). It was written for one specific site, is promoted on that site, and is used by probably a few hundred users. Getting enough reviews together is difficult, even when begging people to write one – most users aren’t comfortable enough with English. Are these specialized extensions not supposed to be published on AMO? And before you ask, the reason it was submitted to AMO in the first place: it was supposed to strengthen the then-new local XUL development group.

  • Hey trev 🙂

    Extensions don’t start out popular – they start with 0 downloads and have varying degrees of success or failure, as I’m sure you know. When users or editors review extensions in the sandbox, it shouldn’t matter whether the extension previously had 0 downloads, 500 downloads, or 500,000 downloads. If it’s an extension that does what it says and enriches the web, and if the author put a legitimate effort into its creation (didn’t click a “Develop” button), it should be on the site, in my opinion.

    This new process is still in its infancy, and we may very well end up changing certain aspects as it is implemented in code over the coming weeks, but the above criteria alone raise the quality bar from AMO’s current selection. If users want that bar even higher, it will be evident in their reviews, but that is where we’re starting.

  • trev, looking at what the sandbox provides, it would still enable the site to point to that extension and have users install it. That may be a bug; could we check the referrer on the actual download?

    I wonder whether we should install extensions without displaying a “sandbox disclaimer”; it would be too easy to create a malware extension and hook it up at a spoof site or something.

  • Hi Axel. We planned on having users log in in order to see the sandbox (or possibly only to download/install from the sandbox). They would have previously agreed that sandbox add-ons are not tested by a person and may do harmful things. Obviously, if we find out that an add-on is malicious, we can immediately remove it and add it to the Firefox blocklist if necessary (which will cause it to be disabled and uninstallable in all Firefox installations).

  • flanagan

    What concerns me most is the current plethora of spyware toolbars. They make the current AMO completely unusable. In the new process, the “Editor Review” process must have a means of identifying extensions that do so-called “usage-tracking”. IMO such toolbars should be rejected immediately and thrown out. They should not appear in either Sandbox or Public. At the very least, such spyware needs to be relegated to its own classification so that it is completely out of sight of the normal user.

  • DonGato

    Well, at least you’re working but you should speed up as the mess is getting bigger. 😉

  • That looks great, but I have one concern.

    The proposed system provides a means for add-ons to be added to a popular, trusted site with no review process (for trusted add-ons). That is fine, unless someone malicious gains access to the account associated with a trusted add-on, at which point they can upload their own, spyware-infected version.

    You could say the same would be true if they got the admin username/password to the server, but that is known by far fewer people, who have a closer relationship with the project. I’d suggest at the very least having some form of email verification scheme (on upload, send an email to the registered address with a link to activate the upload), or a better verification scheme if anyone thinks of one.

  • name

    Sounds Great!
    I love the “Trusted Add-on” system; it’s going to make updating extensions so much faster 🙂

  • Amanda Russet

    Sounds like a good plan for going forward. I only have one main concern — relying on user reviews to be a trigger for deciding whether an add-on should be moved to the public area may be intrinsically flawed:

    Negativity Battles…
    From what I’ve seen, happy users usually won’t take the time to post their ‘happy’ comments, whereas angry users _always_ do. So the balance is perpetually skewed (toward the negative), which all too easily could harm the otherwise ‘good reputation’ of a great add-on. Also, I have seen a lot of ‘turf wars’, where competing extensions beat each other up unfairly in the comments and ratings department. I would not expect any of that to magically disappear with the arrival of AMO3.

    Unreliable and Incomplete Reviews…
    Nearly all user “reviews” are unreliable, completely subjective, and 99% incomplete because they only address one or two issues. That is, many users will stick a truly remarkable add-on with a rating of 1 or 0 just because they had a minor issue while trying to use one of its features and didn’t bother to seek help on a user forum – which happens a lot. Just imagine if Car&Driver magazine completely trashed the latest BMW with a 0 rating because the reviewer couldn’t figure out how to turn on the windshield wipers and didn’t bother to read the instructions. At other times, they will assign disproportionately high ratings to a tiny little add-on that does little more than paint a new button to invoke an existing Firefox option. It’s pure silliness.

    Thus, I believe regular user ratings & comments are an unreliable source of information and don’t provide a truly unbiased and comprehensive assessment of an add-on. Trying to create an approval process that is loosely based on them is asking for trouble.

    Perhaps there should be a ratings system for the reviewers themselves? Good reviewers could earn a reputation for being accurate, complete, and fair in their assessments, and poor reviewers would get weeded out quickly (similar to Mozilla drivers). In other words, more weight would be given to the opinions of good reviewers (during the add-on review process), and the general user population could vote on whether they agree/disagree with a reviewer’s comments, boosting or dropping his/her rating.
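
    A reviewer-weighted rating like the one proposed here could be computed as follows. This is a minimal sketch, assuming each review carries a reputation score earned from “helpful” votes; the function and field names are illustrative, not anything AMO implements:

```python
def weighted_rating(reviews):
    """Average add-on rating weighted by reviewer reputation.

    `reviews` is a list of (rating, reputation) pairs, where reputation
    grows as other users mark a reviewer's comments helpful.
    """
    total_weight = sum(rep for _, rep in reviews)
    if total_weight == 0:
        return 0.0  # no reputable reviews yet
    return sum(rating * rep for rating, rep in reviews) / total_weight
```

    A trusted reviewer’s 5-star review then outweighs a drive-by 1-star: `weighted_rating([(5, 10), (1, 1)])` is about 4.6, rather than the unweighted average of 3.0.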

  • WRT Amanda’s comments above, which are valid,
    I have three metamoderation tricks to suggest:

    1) Take the “Was this comment helpful” link, make it bigger, and stick it at the top. 🙂

    2) Create a list of the “most helpful” reviews in the last 3 months, and link it from the menu.

    3) Also Create a list of “best reviewers”, and give the reviewers some form of ranking.

    Or, in other words, think like amazon 🙂

  • I would suggest that in the sandbox area you don’t publicly show the number of downloads per add-on. This way, every add-on will get its fair share of usage among users.

  • There are so many extensions in the sandbox area, people will have to go to a certain amount of effort to find the ones that might be worth trying out. It might in fact be necessary to allow the number of downloads to be used as a way to let the good stuff rise to the top and get recognition. I guess you could also use the average rating, or a weighted combination of both.

    It looks to me like the way the sandbox has been set up (hard to find!), and the fact that it’s so very full, will mean that most of the marginal XPIs will now be hosted on people’s own sites until they’re ready for primetime. Which seems OK, really. Setting the quality bar higher to get into the official add-ons list will inspire people to put a bit more effort into their extensions. (You should see ours! We’ve really done the hard yards, and are looking forward to seeing what people think of it.)

    I think an improvement to the system above might be to only allow developers to submit/nominate their plugins for review a limited number of times per month, to stop the review queue from getting crowded.
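
    The “weighted combination” of downloads and rating suggested above could look like this hypothetical scoring function; the weights and scaling here are made up purely for illustration:

```python
import math

def sandbox_score(downloads, avg_rating, w_rating=0.7):
    """Hypothetical sandbox ranking score mixing quality and popularity.

    Downloads are log-scaled so established add-ons don't drown out new
    ones; the 0-5 rating is normalized to 0-1. The weights are invented.
    """
    popularity = min(math.log10(downloads + 1) / 6, 1.0)  # ~1.0 at a million downloads
    quality = avg_rating / 5
    return w_rating * quality + (1 - w_rating) * popularity
```

    With these weights, a brand-new add-on rated 5.0 with zero downloads (score 0.70) still outranks a heavily downloaded one rated 2.0 (score at most 0.58), which addresses the concern about new extensions being buried.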

  • Thanks for the post! Martin

  • max1million

    I understand trying to improve the listings. But from what I’ve been reading, instead of having perfectly good or new extensions at the end of the most-popular list because of fewer downloads and fewer update-triggered downloads, they are now hidden away in an “if you find this section, don’t bother – they haven’t been tested or just aren’t any good” section. And a developer should wait for “several” users (how many?) to find, test, and rate it (if they are ambitious enough) before it goes to a normal reviewer for approval to be added to the regular site. It seems to me that you’re shoving the oldest, most popular extensions even farther ahead of the rest, instead of trying to even out the playing field, which I thought was one of the goals.

    I think actually having them reviewed, and possibly adding reviewer comments to the add-on’s page above the “User Comments” (ratings), would be better. My recent addition (new) has just over 3,000 downloads this week and over 6,000 total on the old site, but only one user comment (just added). Since it is sandboxed on the new site (along with the rest of my older but current add-ons), the common user probably won’t even find it after the switchover. Of the 6 that I’ve uploaded, my current average rating is 4.14, with the worst being 2.75 (fair or not). Should I even bother to update if they just get thrown in the junk bin anyway?

  • Hi, I develop a toolbar (spyware-free 😉) for a small audience. The extension is not “great,” but it’s far better than the first version I got accepted at addons.moz. (what were you thinking? It was 99% based on the Thinkgeek toolbar and had like 3 Google buttons and a search field… *heh*).

    Well, I really don’t need to get my extension _listed_ at AMO – I have a small Swedish-speaking audience – BUT due to all the restrictions (installing from sites other than addons.moz. is a no-no and gives a warning, and I like the update function/comment system), wouldn’t it be possible to just host, but not list, smaller extensions and “auto-generated toolbars”? If a user at a site would like to install a spyware-free extension, why not host it? Even if it’s not interesting for some users, others may like those extensions. I have no problem with auto-generated extensions, etc. I would love to have my small, crappy extensions hosted at AMO for _MY_ users, not to get downloads (or maybe a little). How ’bout it?

    And which extensions are trusted?

  • Oh, I know, my English is sooo good. Well, the point is: host those auto-generated toolbars, etc., let the sites link to ’em, but don’t list them.

    I would like to think that mozilla is a place for all kinds of addons…


  • Daniel

    I’ve gotten into the sandbox, but I can’t install anything. The install button is red instead of green, and add-ons don’t install.
