A typical performance review is a dull, predictable and dreaded ritual; this can be fixed by including real-time, ongoing crowdsourced input and data, according to Eric Mosley in his book The Crowdsourced Performance Review: How to Use the Power of Social Recognition to Transform Employee Performance. The performance review can then be informed not just by the opinions of one manager but by a year-long narrative of the employee’s accomplishments, skills and behaviour.
So, how do you set up a system to collect the crowdsourced information? Most of the book is about the benefits of such a system, rather than how to actually do it, and it is not until the appendix that the author discusses implementing a social recognition system. The attributes of such a system seem to include:
The author suggests that such a system needs to be accepted by at least 80 percent of the organisation’s workforce, with 5 percent of employees receiving awards each week. A simple calculation (5 percent of a 50-week working year) shows that on these figures the average employee would receive 2.5 awards per year. If the organisation has five core values, then 2.5 data points spread across five values hardly constitutes statistically valid crowdsourced data for an annual performance review. It seems to me that, to provide enough feedback for the performance review, each employee would need to give an award every week, so that the average number of awards received is 50, or 10 per company value. If the average value of an award is $100, the cost to the company is then $5,000 per employee per year.
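The back-of-envelope figures above can be checked with a short script. This is purely illustrative arithmetic based on the numbers quoted in the review; the 50-week working year and one-award-per-week rate are the assumptions stated in the text, not figures from the book itself.

```python
# Assumptions taken from the review: 5% of the workforce receives an
# award each week, a 50-week working year, five core values, and an
# average award value of $100.
WEEKS_PER_YEAR = 50
AWARD_RATE = 0.05        # fraction of workforce receiving an award each week
CORE_VALUES = 5
AVG_AWARD_VALUE = 100    # dollars

# Scenario 1: awards handed out at the suggested 5%-per-week rate.
awards_received = AWARD_RATE * WEEKS_PER_YEAR
print(awards_received)                    # 2.5 awards per employee per year
print(awards_received / CORE_VALUES)      # 0.5 data points per core value

# Scenario 2: every employee gives one award per week.
awards_received = 1 * WEEKS_PER_YEAR
print(awards_received)                    # 50 awards per employee per year
print(awards_received / CORE_VALUES)      # 10.0 data points per core value
print(awards_received * AVG_AWARD_VALUE)  # 5000 dollars per employee per year
```

Even the more generous scenario yields only ten data points per core value, which underlines the review’s point about statistical validity.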
I am not sure how to prevent “gaming” of a social recognition system (such as by people conspiring to award each other), but in my view the author has made a convincing case that crowdsourced data improves the quality of a performance review and that social recognition significantly increases positivity in the workplace.