The Boston Globe Ideas section on Sunday highlighted research by Ethan Bernstein of Harvard Business School, "The Transparency Paradox: A Role for Privacy in Organizational Learning and Operational Control" (abstract), published in Administrative Science Quarterly, June 2012.
Abstract: Using data from embedded participant-observers and a field experiment at the second largest mobile phone factory in the world, located in China, I theorize and test the implications of transparent organizational design on workers' productivity and organizational performance. Drawing from theory and research on learning and control, I introduce the notion of a transparency paradox, whereby maintaining observability of workers may counterintuitively reduce their performance by inducing those being observed to conceal their activities through codes and other costly means; conversely, creating zones of privacy may, under certain conditions, increase performance. Empirical evidence from the field shows that even a modest increase in group-level privacy sustainably and significantly improves line performance, while qualitative evidence suggests that privacy is important in supporting productive deviance, localized experimentation, distraction avoidance, and continuous improvement. I discuss implications of these results for theory on learning and control and suggest directions for future research.
The Globe highlighted the paradox at the heart of the research: in some situations, workers hid their improvements from management, which meant those improvements couldn't spread beyond a very small group. The circumstances seem to involve heavy management control of work procedures: if people are found deviating from the procedure, they are punished (or they believe they will be), so when management is around they follow the procedure to the letter. But when management isn't around, people use workarounds that improve efficiency.
While this paradox is curious, I think it highlights something even more important: you get what you measure. If you hold people accountable for following the script, that is what you get, even when that script can be improved.
One of the principles that continuous improvement relies upon is that the people doing the work often know best how to improve it. (Lean calls this element "respect for people.") But if you don't give them credit for knowing this and don't encourage them to show the way, you will not get the improvements that are there for the taking. You may even get dis-improvements, as people shift things to their own liking.
In fairness to the research, there is also the fact that people naturally want to experiment alone or with trusted colleagues; it's only after the experiments bear fruit that we want to share the ideas with a wider audience. From that perspective, being open to oversight all the time could be detrimental. I may need to get the full article to understand the details of the context and the experiments described in the research.