-
Peer calibration feature for analyst evaluations
We are looking to implement a peer calibration process where analysts evaluate each other's completed evaluations/reviews to assess scoring consistency and alignment with quality standards.
The process should involve:
Selection: Choose a completed evaluation for each evaluator.
Assignment: Assign other team members to evaluate the same case.
Comparison: Compare the results to identify scoring misalignments across quality standards.
Currently, this process is manual and requires significant effort to manage and analyze. We request a dedicated feature in Playvox that allows us to:
Select and assign evaluations for peer review.
Facilitate the comparison of results among evaluators.
Identify and address scoring… 2 votes -
Customize notifications for calibrations
It would be great to be able to customise the notifications related to calibration sessions. At the moment this setting is not customisable, and every time an interaction is added to a calibration session, the participants receive an email.
1 vote -
Add a confirmation button
It would be great to add a second confirmation button that must be acknowledged/clicked before any calibration is officially submitted, rather than having the calibration sent immediately upon clicking "Send Evaluation".
1 vote -
Custom Fields on Calibration
We would like to have the option to include custom fields when creating a Calibration (for example, a text box where we can enter the week number to easily filter those calibration sessions), which could then be displayed on the Calibration details page (under Summary).
1 vote -
Ability to archive calibration categories
Right now I have calibration categories I would like to delete because they are no longer relevant, but I can't because they are attached to calibrations, and calibrations can only be deleted one by one. Being able to either archive or delete a category would be helpful for new calibrations.
1 vote -
Save changes made to filters and list of calibration page
I'm loving the new layout and the ability to edit the list on the main Calibration page, but it would be great if edits made to the column sizes were saved, so the changes remain when we leave the page and come back.
Additionally, saving specific filters on any page (reports, workloads, calibrations, etc.) would be helpful.
Could you also bring back the option to 'open in a new tab' when right-clicking the view button on calibrations?
1 vote -
Adding Agent in Columns under Calibrations
Hi PlayVox Team,
We would like to request an additional column for the agent name under Calibrations. This addition would make it easier for team leaders and quality assurance personnel to quickly identify which agents have already undergone calibration.
We hope this feature can be added to Playvox.
1 vote -
Increase Number of Participants
The ability to increase the number of users in calibrations, up to 500, through an upload feature. Currently, users have to be selected manually, one by one.
2 votes -
Ability to edit Participation list for Calibration
Participation lists for Calibrations have been implemented (yay!), but they cannot be edited or deleted after creation. We need this to be added ASAP.
5 votes -
Calibration Export Option
Allow an option to export completed calibration scores. The export should include both the scores of all participants and the coaching commentary.
3 votes -
The ability to edit a participant's calibration answers
Sometimes participants submit their answers for a calibration and then realize they have made a mistake, or wish to change or add something they thought of afterwards. If the participant reaches out to the expert before the calibration call happens, it would be nice to be able to edit the answer in question or to add something that could be useful. This could also affect the overall results in the end and prevent misalignment between the participant's answers and the expert's answers. Perhaps this could be added under the calibration options.
5 votes -
Restrict the result view for participants of a calibration to their own result only
Currently, calibration results form part of our auditors' KPIs. Being able to see how everyone else is performing in calibrations is causing problems, and could mean we are unable to move our calibrations from our current platform into Playvox.
Please could there be the ability within roles management to set what each role can view, e.g. own, team, or all. Thank you.
7 votes -
Calibration score based on complete evaluations in a calibration session
Currently, Playvox gives a 0% score to an evaluation that was not completed by the deadline. That 0% counts towards the calibration's final score, even when many other evaluations were completed.
We would like an enhancement that takes only completed evaluations in a calibration session into account when generating the score.
10 votes -
Calibration Due Time
Currently the calibration feature only allows you to set a due date (end of day). We would like to be able to set a date and time by which calibration results are due (e.g. 9/29/2022 at 4pm). This would give us a bit more flexibility in how long we allow users to provide their input.
3 votes -
evaluate the analyst workload reassignment
Include the reassignment feature that's available for "evaluate the agent" workloads and extend it to "evaluate the analyst" workloads, so I'm able to reassign a new analyst to help with calibrations.
5 votes -
Option of replacing completed evaluations with Expert feedback/score after calibrations
Currently, reviewing the outcomes of calibrations may result in score adjustments to the original evaluation, and applying those adjustments is a manual action today. If we could have an optional button to replace the original evaluation score with the expert's updated feedback/scoring, along with a banner noting that it was updated due to calibration #, that could be more efficient. Ideally the original evaluation's timestamp/date/analyst would remain associated with it.
4 votes -
Calibration comparison granular data
Exporting data from calibrations: we need the granular data from our calibrations, such as a comparison between the analyst evaluation and the expert evaluation, as well as the analyst's answers and the expert's answers separately.
5 votes -
expert in a calibration session
The client would like the expert in a calibration session to be able to see the answers of all the participants in the session before sending his own evaluation.
2 votes -
Edit permission on Calibrations
It would be great to allow calibration participants the option to edit their answers up until the due date. I understand not wanting them to be able to keep changing their answers to try to get the right score.
But maybe allow one edit; as the expert, I've had situations where users second-guess themselves or go back and listen to a call, etc., and then want to edit but cannot.
Maybe it could even be an option/permission for certain roles?
2 votes -
Scorecard Question results for Calibrations
It would be very helpful to see the average scorecard question results based on Calibrations.
Example:
Feature the Analyst's average scorecard question results for a specific period of time
VS.
The Expert's avg. scorecard question results for the same period of time.
Similarly, it would be very helpful to expose said data via the API for easier exporting.
2 votes