Posts in User Stories
Listening to Employees Where It Really Matters

World-Renowned Hospitals Use OdinText to Listen to Valuable Employee Feedback and Prioritize Resources


Once again this year, OdinText was honored to be voted among the Most Innovative Marketing Research Suppliers in the World (2018 Greenbook Research Industry Trends GRIT Report).

As is custom, each of the Top Marketing Research Suppliers is invited to submit a case study for inclusion in the annual e-book showcasing the best of the best in consumer insights.

Last year our case study highlighted how OdinText can use customer comment data to better understand drivers of customer satisfaction and NPS (Net Promoter Score), as well as predict return behavior and revenue! This year we chose a case that highlights how OdinText can be used for workforce analytics, leveraging employee feedback for continuous improvement and for greater employee engagement and satisfaction.

As the Director of Human Resources at Greene Memorial Hospital and Soin Medical Center comments on the experience of using OdinText in health care and human resources:


“The magnitude and detail of OdinText is amazing! [OdinText] pinpoints exactly areas that we can really work on. Other vendors just give us material and we have to hunt and peck. For not knowing anything about our industry, this software is amazing! You know atmosphere, what’s changing and what’s not… This blows me away.”

You can view an abbreviated case study in the e-book or on the Greenbook site tomorrow. However, we are also happy to share a slightly more detailed case with you. To find out how world-class hospitals improve through stakeholder feedback, follow the link below:


Turning staff feedback into action: OdinText artificial intelligence platform reveals what drives employee satisfaction at Kettering Health Network’s Greene Memorial Hospital and Soin Medical Center!

Again, thank you to Greenbook and to everyone who voted us most innovative, and congratulations to all the other great companies this year!

If you are curious how YOUR data + OdinText can provide more powerful insights, guaranteed, you can request more information here.

Tom H. C. Anderson, Chief Research Officer @OdinText

Predicting Your KPIs Has Never Been Easier & 29 Other Case Studies

As you may recall, this year OdinText was honored to be voted Most Innovative in North America and Third Most Innovative Globally in the Greenbook Research Industry Trends (GRIT) Report. This year Greenbook offered something special: each of the top 50 marketing research firms was invited to submit a case study for inclusion in an e-book showcasing the best of the best in consumer insights.

The e-book was released yesterday, with 29 of the top 50 market research firms contributing a brief case study. We’d be happy to share the entire e-book with you; you can request it here.

The case we submitted is one of my favorites. Shell Oil was able to use OdinText to predict three business performance indicators: stated satisfaction/WOM (i.e., NPS), actual return behavior, and sales, something they had never been able to do before.

While that case was run using very big data, several of our users have conducted the same powerful analysis with much smaller survey data.
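For readers wondering what this kind of text-to-KPI prediction looks like in principle, below is a minimal sketch in Python. It is not OdinText’s proprietary method; the file name, column names, and model choice are hypothetical placeholders. It simply illustrates the general pattern of turning open-end comments into features that predict a numeric KPI such as likelihood to recommend, then inspecting which terms move the score.

```python
# Hypothetical sketch only -- not OdinText's method. Predicts a 0-10 KPI
# (e.g., likelihood to recommend) from open-ended comment text using
# TF-IDF features and ridge regression (requires pandas and scikit-learn).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Placeholder survey export: one row per respondent.
df = pd.read_csv("survey_open_ends.csv")   # hypothetical columns: comment, ltr_score
X_text, y = df["comment"].fillna(""), df["ltr_score"]
X_train, X_test, y_train, y_test = train_test_split(X_text, y, random_state=0)

vectorizer = TfidfVectorizer(min_df=3, ngram_range=(1, 2), stop_words="english")
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

model = Ridge(alpha=1.0).fit(X_train_vec, y_train)
print("Holdout R^2:", r2_score(y_test, model.predict(X_test_vec)))

# Terms with the largest coefficients hint at which topics move the KPI.
terms = vectorizer.get_feature_names_out()
top = sorted(zip(model.coef_, terms), key=lambda t: abs(t[0]), reverse=True)[:15]
for coef, term in top:
    print(f"{term:25s} {coef:+.3f}")
```

The same basic pattern applies whether the input is millions of transactional comments or a few thousand survey open ends; mostly the vocabulary thresholds and the validation approach need to change.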

Again, we’d be happy to send you the entire booklet with all the companies listed. If you’re interested in something similar, we’d also be glad to schedule a brief informational call and/or a demo with your own data.

Again, thank you to Greenbook and to everyone who selected us for the honor, and congratulations to all the other great companies listed in the report!

Tim Lynch, VP, OdinText

Brandtrust Uses OdinText to Quantify Qual at Scale and Unearth Dormant Brand Equities

Editor’s note: Today’s post was contributed by Brandtrust, an OdinText client, as part of a new ongoing series from our users. We felt Brandtrust was an outstanding candidate for a use case because they already have a set of sophisticated, proprietary methodologies, which were made even more valuable by easily incorporating the OdinText platform.

Case Study: Realizing the Untapped Potential of Stories

A long-time client asked us to help determine the equity of an overlooked legacy sub-brand, as they were interested in how that sub-brand relates to the parent brand, both from the perspective of legacy sub-brand consumers and younger prospective consumers. They wondered if there was untapped potential in this neglected property.

Utilizing our Narrative Inquiry approach — text analytics with a unique take on unearthing human truth — Brandtrust asked legacy and prospective consumers to share their memories and experiences with the sub-brand and parent brand via an open-ended survey tool. We exposed prospective customers, who by definition lack experience with the sub-brand, to representative sub-brand stimulus, and then had them reflect on their exposure experience.

By tapping into stories around actual experiences, our team was able to elicit language around the relationships consumers had or have with the brand and sub-brand. Utilizing OdinText to analyze the unstructured data (a.k.a. stories) we received, we looked for narrative and emotional patterns across and between the legacy and prospective consumers.

Dormant Brand Equities at the Intersection

Our client’s operating hypothesis was that the perceptions and emotions of the two targets would vary dramatically. As it turned out, legacy consumers did express more nostalgia for the sub-brand and recalled their past experiences with it fondly, associating the sub-brand with family connection and memorable special events. Prospective consumers, not surprisingly, expressed a greater sense of trepidation toward the unfamiliar-but-established sub-brand.

Interestingly, and most useful to our client, however, there were important areas of intersection between the two consumer groups, both in perception and experience of the sub-brand and the parent brand.

The sub-brand and parent brand elicited joyful emotions and communicated the concept of care, a key tenet of the parent brand. Additionally, the sub-brand reflections of both consumer groups contained elements of enjoyable education (think “learning is fun!”) and heartwarming interactions — equities that were well aligned with recent parent brand initiatives.

All in all, the client was pleased with the outcome and benefited greatly from the knowledge obtained: their quest to determine next steps in this endeavor was finally realized.

Methodological Review

The development and execution of this branch of methods at Brandtrust could have been daunting, but with OdinText at our fingertips, it was far less manual and labor-intensive than it would have been in the days of building code frames, buckets, and nets. And yet, there is still a great deal of merit in the manual means by which text analysis was originally performed.

At this point in technological advancement, machines cannot, and likely never fully will, replace humans, which is why Brandtrust employs a distinctive approach called Lateral Pattern Analysis: parallel machine and human analysis, with a combined synthesis between the two, to determine the final outcome of our Narrative Inquiry studies.

Narrative Inquiry questionnaires are built on Brandtrust’s key research pillars, including grounded theory, phased dialogue, behavioral framing, narrative pattern identification, and priming reduction. Reliance on these key elements ensures that our Narrative Inquiry respondents — through a process of recall and reflection — can share with us the rational and emotional makeup of their perceptions and behaviors.

OdinText’s built-in emotional framework assists our team with the “machine” side of processing a vital but squishy element of human understanding through story: Emotion. Brandtrust draws from years of experience in processing story and emotion qualitatively, and OdinText’s features have helped us extend the reach and statistical certainty of that expertise.

 

Why Your HR Survey is a Lie and How to Get the Truth

OdinText Discovers Job Satisfaction Drivers in Anonymous Employee Data

Employee satisfaction surveys go by different names – “stakeholder satisfaction,” “360-degree surveys,” “employee engagement.” I blog a lot about the shortcomings of survey data and problems with respondent and data quality due to bad samples, long surveys, and poorly structured questions (which assume the researcher already knows the best questions to ask), but I haven’t talked much about human resources/employee surveys.

HR surveys have a different and bigger problem than consumer surveys. It’s not the sample; we know exactly who our employees are. We know their email addresses, phone numbers, and where they sit. Heck, we can even mandate 100% survey participation (and some companies do). In fact, I’ve spoken to human resources directors who survey their employees once per week. The reasoning goes something like, “our millennial employees are in high demand and we want to keep them happy.” But that’s not a problem, per se; in fact, I’m a believer in frequent data collection.

The Problem with Employee Satisfaction Surveys


NO ONE taking an employee survey trusts that their data will be kept anonymous and confidential. This is the case even when third parties promising to keep all data at an aggregate level are hired to conduct the fieldwork.

It really doesn’t matter that this mistrust may be unfounded or invalid, only that it exists. And as it happens, it isn’t entirely unfounded. In fact, I know a former senior HR manager at a Fortune 500 who admitted to successfully pressuring vendors into providing de-anonymized, individual-level data.

Even if you as an employee believed your data would remain anonymous, you might nonetheless be wary of being completely forthcoming. For instance, if you were one of three or four direct reports asked to rate your satisfaction with either your company or your manager on a five-point Likert scale, it might feel foolhardy to answer with anything less than a top-3-box score. There would be a lot of interest in who the ‘disgruntled’ employee was, after all.

This is a data problem, and there are two solutions:

  1. Text Analysis of Employee Satisfaction Comment Data

Unstructured, free-form comment data can be a window into the soul! I might never risk giving my company or supervisor anything below a top-2-box satisfaction rating on a Likert scale, but there are other ways to unearth the truth. For example, consider these open-ended questions:

Q: “What do you think about the prospects for your company’s success in the next two years?”

Or maybe a specific question about a boss I didn’t like? Such as:

Q: “Tell us about your relationship with your boss. Does he/she provide you with adequate X?”

While the respondent would obviously still be very careful about how he/she answers – probably even more so – it would be nearly impossible not to divulge important clues about how he/she really feels.

Why? Because we really aren’t very good at lying. Whether we speak or write, our word choices leave clues that reveal our hidden emotions.

Even in the absence of any negative terms or emotions, just the appearance of significantly lower levels of basic positive sentiment within a department or within a specific manager’s group might signal a serious problem.
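To make that concrete, here is a minimal sketch of such a check, again not OdinText itself: it scores each comment with an off-the-shelf sentiment lexicon (VADER, via the open-source vaderSentiment package) and flags any department whose average sentiment runs significantly below everyone else’s. The file and column names are hypothetical.

```python
# Hypothetical sketch only -- not OdinText. Flags departments whose average
# comment sentiment is significantly lower than the rest of the company.
import pandas as pd
from scipy.stats import ttest_ind
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

# Placeholder export: one row per comment, tagged with a department name.
df = pd.read_csv("employee_comments.csv")   # hypothetical columns: department, comment
df["sentiment"] = df["comment"].apply(
    lambda c: analyzer.polarity_scores(str(c))["compound"]
)

for dept, group in df.groupby("department"):
    rest = df.loc[df["department"] != dept, "sentiment"]
    t_stat, p_value = ttest_ind(group["sentiment"], rest, equal_var=False)
    if t_stat < 0 and p_value < 0.05:
        print(f"{dept}: mean sentiment {group['sentiment'].mean():+.2f} "
              f"vs {rest.mean():+.2f} elsewhere (p = {p_value:.3f}) -- worth a closer look")
```

The compound score here is a crude proxy for emotion, but the point stands: relative differences between groups can surface problems that a top-box Likert rating never would.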

  2. Anonymizing Employee Satisfaction Data

The other solution is to collect data that truly is more anonymous. This is a second unmet opportunity to improve employee satisfaction and engagement research. The trick is not just providing an option for anonymous feedback such as a suggestion box, but making it top-of-mind and encouraging more frequent anonymous feedback.

Obviously, many companies know their customers are discussing them and giving them ratings both anonymously and with their real profiles on various sites—Amazon.com, BazaarVoice.com, Airbnb, TripAdvisor and Yelp, to name just a few.

But what about employee data? Back during the dotcom boom, when I worked for Snowball.com/Ign.com, I recall everyone at the company and at other dotcoms regularly visiting F*ckedCompany.com (the asterisk was added by me), where anonymous feedback about management, impending layoffs, etc., would be shared. This isn’t necessarily a good thing for anyone except investors wanting to short a certain company.

Today there are sites like GlassDoor.com where employees rate companies on both overall work satisfaction and even the interview process. The problem with this site is that it tends to focus on larger companies (think Home Depot and Lowe’s), though there are many ratings for middle-market and smaller companies, too.

I think in the future there will be even more opportunities to access public reviews of satisfaction at companies, but also hopefully more private ways to collect unbiased data on your company’s employee satisfaction.

What to Expect from Text Analysis of Good/Anonymous Employee Data?

While I’ll be writing more on this topic in the future, what prompted the idea for this blog post was one of our most recent clients, TruckersReport.com. As the name suggests, TruckersReport.com is a trucking industry professional community that includes an anonymous discussion board.

Recently, OdinText was deployed to analyze anonymous, unstructured comments as well as structured ratings data from TruckersReport.com. Some rather unexpected findings were covered in a joint press release. For example, OdinText discovered 11 features driving job satisfaction, and salary didn’t make the top three! You may request a more detailed report here.

I look forward to your thoughts and questions related to either the above study or human resources analytics in general.

 

@TomHCAnderson

Of Tears and Text Analytics

An OdinText User Story - Text Analytics Tips Guest Post (AI Meets VOC)

Today on the blog we have another first in what will soon be an ongoing series: we’re inviting OdinText users to participate more on the Text Analytics Tips blog. Today we have Kelsy Saulsbury guest blogging. Kelsy is a relatively new user of OdinText, though she’s jumped right in and is doing some very interesting work.

In her post she ponders an apropos topic: whether automation via artificial intelligence may make some tasks too easy, and what, if anything, might be lost by not having to read every customer comment verbatim.

 

Of Tears and Text Analytics
By Kelsy Saulsbury, Manager, Consumer Insights & Analytics

“Are you ok?” the woman sitting next to me on the plane asked.  “Yes, I’m fine,” I answered while wiping the tears from my eyes with my fingers.  “I’m just working,” I said.  She looked at me quizzically and went back to reading her book.

I had just spent the past eight hours in two airports and on two long flights, which might make anyone cry.  Yet the real reason for my tears was that I had been reading hundreds of open-end comments about why customers had decided to buy less from us or stop buying from us altogether.  Granted, eight hours of hand-coding open ends wasn’t the most accurate way to quantify the comments, but it did allow me to feel our customers’ pain, from the death of a spouse to the financial hardship of a lost job.  Other reasons for buying less food weren’t quite as sad — children off to college or eating out more after retirement and a lifetime of cooking.

I could also hear the frustration in their voices on the occasions when we let them down.  We failed to deliver when we said we would, leaving the dessert missing from a party.  They took off work to meet us, and we never showed.  Anger at time wasted.

Reading their stories allowed me to feel their pain and better share it with our marketing and operations teams.  However, I couldn’t accurately quantify the issues or easily tie them to other questions in the attrition study.  So this year, when our attrition study came around, I used a text analytics tool (OdinText) to analyze the open ends around why customers were buying less.

It took 1/10th of the time to see more accurately how many people talked about each issue.  It allowed me to better see how the issues clustered together and how they differed based on levels of overall satisfaction.  It was fast, relatively easy to do, and directly tied to other questions in our study.

I’ve seen the benefits of automation, yet I’m left wondering how we best take advantage of text analytics tools without losing the power of the emotion in the words behind the data.  I missed hearing and internalizing the pain in their voices.  I missed the tears and the urgency they created to improve our customers’ experience.

 

Kelsy Saulsbury, Manager, Consumer Insights & Analytics, Schwan's Company

 

A big thanks to Kelsy for sharing her thoughts on OdinText's Text Analytics Tips blog. We welcome your thoughts and questions in the comment section below.

If you’re an OdinText user and have a story to share, please reach out. In the near future we’ll be sharing more user blog posts and case studies.

@OdinText