18 results found
-
Ability to edit Participation list for Calibration
Participation lists for Calibrations have been implemented (yay!) but they cannot be edited or deleted after creation. We need this to be added ASAP
2 votes -
Calibration Export Option
Allow an option to export completed calibration scores. This should include both the scores of all participants and the coaching commentary.
1 vote -
Calibration Automation
The ability to automate calibrations, similar to workloads. Currently, calibrations can be created from Interactions, but this is a manual, time-consuming process. It would be great if there were an option to 'build' a calibration (which filter it pulls from, date range, who's included, etc.) and have the system pull the evaluations for you.
1 vote -
Restrict the result view for participants of a calibration to their own result only
Currently, calibration results form part of our auditors' KPIs. Being able to see how everyone else is performing in calibrations is causing problems and could mean we are unable to continue moving our calibrations from our current platform into Playvox.
Please can there be the ability within roles management to set what any role can view, e.g. own, team, all. Thank you.
5 votes -
The ability to edit a participant's calibration answers
Sometimes participants submit their answers for a calibration and realize they have made a mistake or wish to change or add something thought of afterwards. If the participant reaches out to the expert before the calibration call happens, it would be nice to be able to edit said question or to add something that could be useful. This could also affect the overall results in the end and prevent misalignment between the participant's answers and the expert's answers. Perhaps this could be added under the calibration options.
2 votes -
Calibration score based on complete evaluations in a calibration session
Currently, Playvox gives a 0% score to an evaluation that was not completed by the deadline. The 0% counts towards the final calibration score, even though there are many other completed evaluations.
We would like an enhancement that takes only the completed evaluations in a calibration session into account when generating the score.
6 votes -
Option of replacing completed evaluations with Expert feedback/score after calibrations
Currently, reviewing the outcomes of a calibration may result in score adjustments to the original evaluation, and this is a manual action today. If we had a button to replace the original evaluation score with the expert's updated feedback/scoring, along with a banner noting that it was updated due to calibration #, that could be more efficient. Ideally the original evaluation timestamp/date/analyst would stay associated with it.
4 votes -
Calibration Due Time
Currently the calibration feature only allows you to set a due date (end of day). We would like to be able to set a date and time when calibration results are due (e.g. 9/29/2022 @ 4pm). This would give us a bit more flexibility in how long we allow users to provide their input.
1 vote -
"Evaluate the analyst" workload reassignment
Take the reassignment feature that's available for the "evaluate the agent" workloads and extend it to "evaluate the analyst", so I'm able to reassign a new analyst to help with calibrations.
2 votes -
Calibration comparison granular data
Exporting data from calibrations: we need the granular data from our calibrations, such as a comparison between the analyst evaluation and the expert evaluation, as well as the analyst answers and expert answers separately.
4 votes -
Expert in a calibration session
The client would like the expert in a calibration session to be able to see the answers of all the participants in the session before sending his own evaluation.
2 votes -
Edit permission on Calibrations
It would be great to allow calibration participants the option to edit their answers up until the due date. I understand not wanting them to be able to keep changing answers to try to get the right score.
But maybe allow one edit; I've had situations as the expert where users second-guess themselves or go back and listen to a call, etc., and then they want to edit but cannot.
Maybe it could even be an option/permission for certain roles?
2 votes -
Calibration Results
Add a "close" button on calibrations and remove any incomplete evals to avoid including them in reporting details.
2 votes -
Workloads for Calibrations
Create Workloads for Calibration (multiple interactions) and assign to multiple QA’s (similar to current QA workloads). Select team, randomize “x” number of cases within criteria (criteria to define), assigned quality analyst, scorecard, end date, Calibration name.
4 votes -
calibrations to represent which feedback item was selected/entered
In a calibration the system shows the answer and the comment, but there is no visibility of the selected feedback, so calibrations do not represent the full picture of the evaluation.
It would be useful to have an option to include feedback as part of the calibration, in order to cover all of the areas.
1 vote -
Scorecard Question results for Calibrations
It would be very helpful to see the average scorecard question results based on Calibrations.
Example:
Feature the Analyst's average scorecard question results for a specific period of time
VS.
The Expert's avg. scorecard question results for the same period of time.
Similarly, it would be very helpful to expose said data in the API for easier exporting.
1 vote -