During its investigation of Enron – one of the biggest corporate collapses in history – the Federal Energy Regulatory Commission gathered around 500,000 emails generated by employees, and has since made this dataset publicly available. The signs had not been picked up at the time, but data analysis of the content of these emails, carried out years after the company filed for bankruptcy in 2001, tells a salutary tale.
When data analysis company KeenCorp tested its own software on the email dataset, which captured communications between the company’s top 150 executives, its algorithm assigned index scores to various points in time. The lowest occurred when the company filed for bankruptcy, which made sense. But there was also a high level of tension in 1999, 30 months before the bankruptcy filing. On further investigation, this spike was found to be linked to the company setting up partnership companies that would mask its losses – a transaction that eventually sparked an investigation into accounting fraud and triggered Enron’s demise. What executives were prepared to say in emails clearly diverged from what they would say on the record. The former chief financial officer Andy Fastow has since said that “with the benefit of the KeenCorp Index, Enron’s board of directors would have been alerted to reconsider its decision and prevent a cultural and financial meltdown.”
This may be an extreme example, but this type of sentiment analysis can be a way of understanding employees’ moods and perceptions, pre-empting reputational crises, gauging whether an initiative is landing well or even detecting health and safety risks. The practice has been around for some time in consumer circles, combing through unstructured customer feedback and determining whether it is positive or negative based on a score. Using text from multiple data sources such as emails, social media or review sites, the tools use natural language processing to break down what individuals are saying, placing the words into context and detecting whether they reflect positively or negatively on a brand. So a “sick burn” might be concerning if you sell medical supplies, but a vote of confidence in the gaming community.
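The idea of context-dependent polarity can be sketched in a few lines. This is a minimal, illustrative lexicon-based scorer – the word lists and the “sick burn” domain override are invented for illustration, not drawn from any real product’s rules:

```python
from typing import Dict

# Hypothetical sentiment lexicons (real systems use far larger, learned ones).
POSITIVE = {"great", "love", "confidence", "happy"}
NEGATIVE = {"terrible", "unsafe", "angry", "worried"}

# The same phrase can flip polarity depending on domain context.
DOMAIN_OVERRIDES: Dict[str, Dict[str, int]] = {
    "gaming":  {"sick burn": +1},   # slang: a compliment
    "medical": {"sick burn": -1},   # literal: a safety concern
}

def score(text: str, domain: str = "general") -> int:
    """Return a crude sentiment score: positive words add, negative subtract."""
    text_lower = text.lower()
    total = 0
    # Apply multi-word, domain-specific phrases first, then strip them out.
    for phrase, value in DOMAIN_OVERRIDES.get(domain, {}).items():
        if phrase in text_lower:
            total += value
            text_lower = text_lower.replace(phrase, "")
    words = text_lower.split()
    total += sum(1 for w in words if w in POSITIVE)
    total -= sum(1 for w in words if w in NEGATIVE)
    return total

print(score("that was a sick burn", domain="gaming"))   # 1
print(score("that was a sick burn", domain="medical"))  # -1
```

Production tools replace the hand-written lists with statistical or neural models, but the core decision – score words and phrases in context – is the same.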
In the workplace, sentiment analysis could be used to identify several issues, such as:
- Clusters of disenchantment in teams, when linked to performance figures
- Reputational risk, such as rogue employees spreading rumours or sharing confidential information
- Pre-empting health and safety breaches, for example if the number of people stating they do not feel safe increases
- Perceptions of leaders, perhaps after a change in senior management
- Flight risks, as in determining if high value employees may be about to leave
- Diversity and inclusion, for example whether certain groups dominate particular departments and the impact this has on others
- Detecting fraud and misconduct, or compliance breaches
Data could be drawn from multiple sources, including:
- Internal social media or networking platforms such as Slack, Workplace by Facebook and Microsoft Teams
- Annual employee surveys and other questionnaires where there is a free-text option
- Chats on WhatsApp or other employee communication channels such as Skype and Zoom
- Exit interviews or feedback from candidates on the recruitment process
- External social media such as Twitter or websites such as Glassdoor
A key benefit of using sentiment analysis with employees is that – unlike with annual or pulse surveys – HR teams and managers can make changes based on immediate feedback and address concerns that might not have been aired explicitly. This could be in response to a new policy, a change in employee benefits or to give a general feel for where the workplace culture is heading. “All the indicators of behaviour can be found in language, helping you to detect an employee problem before it gets worse, or the opportunity to do something about it,” explains Ed Juline, head of business development at KeenCorp. “We tend to measure employee engagement in the rear-view mirror rather than in real time, and predictive analytics means you can see if the signs are there.” KeenCorp’s technology is based on psycho-linguistic research and can analyse organisations’ data sources against business priorities. All the data is analysed at aggregate level so no-one is personally identifiable, and machine learning ensures interpretations become more accurate over time.
One of the issues with traditional engagement surveys, even if carried out on a regular basis, is that employees often say what they think the management wants to hear, adds Juline. Sentiment analysis circumvents this by working in the background, mining multiple data streams for comments and trends, rather than asking employees outright. This was particularly important during the pandemic, according to John Sumser, principal analyst at HRExaminer1. “More so than ever, employees feel pressure to tell employers what they want to hear,” he says. “To really understand what’s going on in the workforce, you need multiple data streams. Asking employees is a good start. But we need different questions, to be willing to listen to what employees have to say and to be committed to doing something to address the issues we find. Interpreting and validating survey responses requires a level of depth that is best executed with intelligent tools.”
The Covid crisis has also underscored the importance of listening tools such as this, according to Patrick Couroyner, chief evangelist at employee engagement company Peakon. An analysis across its customer base showed a significant uplift in employee comments around wellbeing as countries were forced into lockdown in March 2020. “Organisations using the platform were able to see this and move fast to address the problem. Our artificial intelligence functionality meant that vast numbers of comments could be quickly and anonymously analysed to reveal the real problems. Leaders could then implement initiatives to better support their employees,” he explains.
Algorithms can also be deployed to predict certain behaviours – for example if an employee is showing signs that they could leave. Peakon’s platform has an ‘attrition prediction’ algorithm that can analyse employee communications for keywords commonly shared before exit interviews, meaning HR can predict someone’s flight risk up to nine months before they hand in their notice. Couroyner adds: “This provides HR leads with an accurate forecast of turnover risk among different employee groups – such as departments, teams and office locations. They can then adapt their retention and recruitment strategies before attrition becomes a bigger problem.” In compliance-focused sectors such as financial services and healthcare, sentiment analysis can provide an early warning sign of risk, raising flags if certain keywords are triggered that could point to fraud or harassment, for example.
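The keyword-triggered flagging described above can be sketched as follows. This is a simplified illustration – the keyword lists are invented for the example, whereas a real system such as Peakon’s would learn its signals from language observed before actual departures:

```python
from collections import Counter

# Hypothetical trigger lists; a production system would derive these
# from historical data rather than hand-pick them.
ATTRITION_KEYWORDS = {"recruiter", "interview", "resignation", "notice period"}
COMPLIANCE_KEYWORDS = {"off the books", "delete this", "shred"}

def flag_messages(messages):
    """Count how many messages in a batch trip each risk theme.

    Counts are kept at batch level only, so no individual message
    or author is singled out in the output.
    """
    flags = Counter()
    for msg in messages:
        text = msg.lower()
        if any(k in text for k in ATTRITION_KEYWORDS):
            flags["attrition_risk"] += 1
        if any(k in text for k in COMPLIANCE_KEYWORDS):
            flags["compliance_risk"] += 1
    return flags

batch = [
    "Had a call with a recruiter yesterday",
    "Please delete this thread after reading",
    "Sprint review moved to Friday",
]
print(flag_messages(batch))
```

A rising `attrition_risk` count for a team over successive batches would be the kind of early signal HR could act on months before resignations land.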
It is possible to perform some level of sentiment analysis without artificial intelligence – keyword searches can be used to sort conversations into themes for review – but automation and machine learning make the process much quicker and more reliable. Even with a level of automation, organisations need to feed data into the sentiment analysis algorithm so it can ‘learn’ how certain words and statements are affected by context, explains David Godden, vice president of sales and marketing at Thymometrics. “It can be a challenge as companies don’t always have the time for this, and reactions to things are happening in real time,” he says. Thymometrics’ system, which allows employees to share unstructured feedback about their workplace at any time, asks users to categorise their own comments to enhance the algorithm’s understanding. This also reduces the risk of someone receiving an automated message because their comment was misinterpreted. “This brings background to the feedback rather than relying on the technology to do it. We’re dealing with human beings and that personal touch is so important. People could be spilling their heart out and that warrants more than an automated message,” he adds.
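The two ideas in that paragraph – keyword-based theming and letting employees’ own categorisations refine the system – can be sketched together. The theme names and seed keywords here are hypothetical, not Thymometrics’ actual taxonomy:

```python
from collections import defaultdict

# Seed themes; employee self-categorisation extends these lists over time.
THEMES = {
    "wellbeing": {"stress", "burnout", "tired"},
    "management": {"manager", "leadership", "decision"},
}

def sort_into_themes(comments):
    """Group free-text comments under any theme whose keywords they contain."""
    themed = defaultdict(list)
    for comment in comments:
        words = set(comment.lower().split())
        for theme, keywords in THEMES.items():
            if words & keywords:
                themed[theme].append(comment)
    return dict(themed)

def learn_from_user(comment, user_theme):
    """When an employee categorises their own comment, fold its words
    into that theme's keyword list so future matching improves."""
    THEMES.setdefault(user_theme, set()).update(comment.lower().split())
```

This crude word-overlap approach is what the article means by analysis “without artificial intelligence”; machine learning replaces the fixed keyword sets with models that generalise beyond exact matches.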
The benefits of being able to mine employee communications for clues about their levels of engagement are clear, but what about employees’ right to privacy and potential data protection issues? A recent report by the Trades Union Congress found that just 31% of staff had been consulted by their employer about the introduction of monitoring technology, and 56% felt this was damaging the trust between workers and their employers2. One in seven said monitoring at work had increased during the pandemic, leading to calls from shadow digital minister Chi Onwurah to update official guidance for employers on electronic surveillance.
On the whole, sentiment analysis tools tend to aggregate and anonymise data so individuals cannot be identified, or will hide insights from smaller teams where a sharp change in behaviour could be linked to certain people. Your electronic communication policies or data protection policies may already include information on how the organisation will collect and use employee data, but even so, a general rule is that more transparency and consultation is better. For example, Microsoft launched a tool in 2020 called Productivity Score, which boasted that it could deliver insights on how employees use Teams, such as who they mention in chat functions. However, it removed the ability to see usernames after accusations that these insights bordered on employee surveillance. That said, acceptance that organisations may use the content of workplace communications tools to look at engagement is growing: a survey by analyst company Gartner in 2018 found that 30% of employees were comfortable with their employer monitoring their email, compared to only 10% of employees in 2015. When an employer explained the reasons for the monitoring, more than 50% of workers were comfortable with it3.
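The “hide insights from smaller teams” safeguard is typically implemented as a minimum group size below which results are suppressed. A minimal sketch, assuming a hypothetical threshold of five (vendors choose their own cut-offs):

```python
from collections import defaultdict

MIN_GROUP_SIZE = 5  # illustrative threshold; real tools set their own

def aggregate_scores(records):
    """records: iterable of (team, score) pairs.

    Returns the average score per team, suppressing any team small
    enough that a sharp change could be traced to individuals.
    """
    by_team = defaultdict(list)
    for team, score in records:
        by_team[team].append(score)
    report = {}
    for team, scores in by_team.items():
        if len(scores) < MIN_GROUP_SIZE:
            report[team] = None  # suppressed: group too small
        else:
            report[team] = sum(scores) / len(scores)
    return report

data = [("sales", 4), ("sales", 2), ("sales", 3), ("sales", 5), ("sales", 4),
        ("legal", 1), ("legal", 2)]
print(aggregate_scores(data))  # {'sales': 3.6, 'legal': None}
```

Only the per-team averages ever leave this function, which is the property that lets vendors claim no individual is personally identifiable.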
Similarly, employees’ response to sentiment analysis will depend on what the data is to be used for. Research into employee monitoring by Lynn Bartels and Cynthia Nordstrom4 found that employees tend to assess the consequences of being monitored and “perform accordingly”. They found that performance did not improve when employees were monitored with no explanation, when surveillance was not tied to performance measures or rewards, or when surveillance was used for vague purposes. There are also limitations to how much automated tools for sentiment analysis will understand. As with employee surveys, there may be an element of ‘saying what the manager or company wants to hear’ if employees know their communications are being monitored. And what about sarcasm? If the tool is powered by machine learning, the algorithm will memorise certain datasets and contexts in which certain words appear. A study of social media sentiment analysis presented at a computational linguistics conference found that these tools could memorise that (for example) a tweet was sarcastic because an operator told them so, but might not recognise sarcasm again as the context would be different5. This also raises questions about how inclusive sentiment analysis is: if the team creating the ‘rules’ behind the analysis does not come from a diverse range of backgrounds, the algorithm may be tuned to the language common to that group, but not sensitive to the nuances of other cultures, generations or those who could be considered neurodiverse.
Simon Rubin, co-founder of health insurance and benefits platform YuLife, argues that communicating to employees that the data insights are for their benefit can increase buy-in. “Generally people feel that if they are getting rewarded, they are happy to share their data,” he says. YuLife customers receive anonymised reports of how employees are using services so they can identify trends, but these reports would never show up personal mental health issues, for example. “These are a trigger for a conversation rather than a definitive answer. People downloading mindfulness apps may mean people are talking more about mental health or that work is too stressful. It just means you can investigate more,” Rubin adds.
As with any engagement tool, sentiment analysis works best in conjunction with data from other sources such as pulse surveys, focus groups or one-to-one performance conversations. As the Enron story shows, however, its key selling point can be in spotting shifts in feeling that employees might never reveal explicitly – providing a depth of analysis that other methods might not. With organisations moving to longer-term remote and distributed working, they can identify issues and respond quickly, holding on to talented employees and minimising risks.
1 It’s time for new approaches to engagement, John Sumser, Human Resources Executive, May 2020
2 Technology Managing People: The Worker Experience Report 2020, TUC, November 2020. https://www.tuc.org.uk/sites/default/files/2020-11/Technology_Managing_People_Report_2020_AW_Optimised.pdf
4 Examining Big Brother’s Purpose for Using Electronic Performance Monitoring, Lynn K Bartels and Cynthia Nordstrom, Wiley, 2012. https://onlinelibrary.wiley.com/doi/epdf/10.1002/piq.20140
5 Identifying sarcasm in Twitter: a closer look, The 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference, 2011 https://www.researchgate.net/publication/220874376_Identifying_Sarcasm_in_Twitter_A_Closer_Look