Usage Page Metrics
Workers Retained 30 Days After Reveal
This measures the number of drivers who have remained with the company for at least 30 days after choosing to reveal their identity in response to a reachout request. We use the updated driver lists provided by the company to inform this metric. This is one reason we must encourage clients to reliably provide updated lists; otherwise, we cannot accurately measure the success of WorkHound as a retention tool.
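As a rough illustration only (hypothetical names and data structures, not WorkHound's actual code), the calculation amounts to counting workers whose reveal date is at least 30 days in the past and who still appear on the most recent driver list:

    from datetime import date, timedelta

    def retained_30_days_after_reveal(reveals: dict[str, date],
                                      current_roster: set[str],
                                      today: date) -> int:
        """Count workers still on the latest driver list 30+ days after revealing."""
        cutoff = today - timedelta(days=30)
        return sum(
            1
            for worker_id, reveal_date in reveals.items()
            if reveal_date <= cutoff and worker_id in current_roster
        )

This is also why a stale driver list skews the metric: in a calculation like the one above, workers who have left but never drop off the uploaded list would continue to count as retained.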
Reachouts Per Starred Comment
This metric shows what percentage of the comments marked as “starred” by a CSM during coding have received a reachout from the client.
Average Worker List Upload Frequency
This metric is an average over time, calculated by dividing the total number of days the client has been on the platform by the total number of driver list uploads. Only one upload per day counts toward the frequency, so uploading hires and terms separately will not skew this metric as long as both are uploaded on the same day.
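As a rough sketch of this arithmetic (hypothetical function and parameter names, not the production implementation):

    from datetime import date

    def average_upload_frequency(start_date: date, today: date,
                                 upload_dates: list[date]) -> float:
        """Average number of days between driver list uploads."""
        days_on_platform = (today - start_date).days
        distinct_upload_days = len(set(upload_dates))  # at most one upload counted per day
        return days_on_platform / distinct_upload_days

For example, a client who has been on the platform for 90 days and has uploaded driver lists on 30 distinct days averages one upload every 3 days.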
Worker Reveal Percentage
This shows, in real time, the percentage of reachouts initiated by the client in which the worker has revealed their identity. Unlike the 30-day post-reveal metric, these reveals are not yet counted as retention opportunities.
Metrics Page
Satisfaction Score
A client’s satisfaction score is calculated as a rolling 30-day average of the self-reported 1-10 score gathered when collecting feedback. In other words, a company’s satisfaction score on any given day is the average of the scores received over the previous 30 days. Feedback collected outside of the prompt link form is not counted toward this score, as workers do not provide a 1-10 satisfaction score in those instances.
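As an illustration (the names below are hypothetical, not WorkHound's actual code), the rolling average can be sketched as follows, assuming feedback is stored as (date, score) pairs from prompt link submissions:

    from datetime import date, timedelta

    def satisfaction_score(scores: list[tuple[date, int]], today: date) -> float | None:
        """Rolling 30-day average of the 1-10 scores collected via prompt link forms."""
        window_start = today - timedelta(days=30)
        recent = [score for scored_on, score in scores
                  if window_start <= scored_on <= today]
        return sum(recent) / len(recent) if recent else None

Feedback without a 1-10 score simply never enters the list, which is why it cannot affect the average.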
Total Workers
This is the total number of unique individuals who have left feedback over the life of the program or within the selected time segment. This is not to be confused with the total number of workers receiving WorkHound communications.
Total Comments
This is the total number of comments received, whether through feedback prompts, pushed to feedback from conversations, or as orphaned comments.
Weekly Summary Email
The metrics in this weekly summary email sent to clients are representative of the entire company, not segmented by tag. This is important to note for any client stakeholders receiving this email who may only have dashboard access to certain tags.
Also provided in this email are all pieces of feedback that have been “starred” by the CSM during coding.
General FAQ
How frequently are these metrics updated?
Metrics are calculated the first time any user accesses the dashboard for a given time frame (30 days / 90 days / All Time) each day, and the results are cached for all users for the remainder of the day.
How does caching work?
On the Metrics and Usage pages of our dashboards, we cache the displayed data on the backend. As datasets grow, the calculations needed to populate these pages take longer and longer; a calculation might take several seconds for a customer with a substantial amount of data.
Since we don't want our customers to wait that long for the screen to update, we only run the calculations the first time these pages are visited on a given day. The result of the calculation then lives in our cache for the rest of the day so the pages can load quickly.
The default timeframe for the Metrics and Usage pages is "All Time". So the data for the "All Time" timeframe is what is cached on initial page load. When "30 Day" or "90 Day" filters are selected, a new calculation is run for those timeframes and that result is placed in our cache.
Since the "30 Day" and "90 Day" filters are not selected as frequently as the default "All Time" timeframe, it's possible that values are cached early in the day for the default time range and then later in the day for the other time ranges. This is rarely ever noticeable except in the case of a new customer, particularly on the day of the initial prompt. If a lot of comments come in after the initial 'All Time' caching of the day, the other time ranges can look pretty different, but new values will be calculated on first visit the next day.
How were the goal lines determined?
We set them using a combination of data analysis and our own insight and judgment about what would create the best program results.