Quality

23 results found

  1. We are looking to implement a peer calibration process where analysts evaluate each other's completed evaluations/reviews to assess scoring consistency and alignment with quality standards.

    The process should involve:

    Selection: Choose a completed evaluation for each evaluator.
    Assignment: Assign other team members to evaluate the same case.
    Comparison: Compare the results to identify scoring misalignments across quality standards.

    Currently, this process is manual and requires significant effort to manage and analyze (a sketch of the comparison step appears after the list below). We request a dedicated feature in Playvox that allows us to:

    Select and assign evaluations for peer review.
    Facilitate the comparison of results among evaluators.
    Identify and address scoring…
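
    A rough sketch of the comparison step described above, to make the intended analysis concrete. This is a hand-rolled Python illustration, not a Playvox feature or API; the evaluator names, quality standards, scores, and tolerance threshold are all hypothetical.

        # Compare evaluator scores on the same case, per quality standard,
        # and flag standards where scoring diverges beyond a tolerance.
        # All names and values are illustrative assumptions.
        from statistics import mean, pstdev

        # evaluator -> {quality standard -> score on a 0-100 scale}
        scores = {
            "analyst_a": {"accuracy": 100, "tone": 80, "process": 60},
            "analyst_b": {"accuracy": 100, "tone": 60, "process": 100},
            "analyst_c": {"accuracy": 80, "tone": 80, "process": 60},
        }

        TOLERANCE = 15  # assumed maximum acceptable spread per standard

        for standard in next(iter(scores.values())):
            values = [s[standard] for s in scores.values()]
            spread = pstdev(values)  # population std dev across evaluators
            status = "MISALIGNED" if spread > TOLERANCE else "aligned"
            print(f"{standard}: mean={mean(values):.1f} stdev={spread:.1f} -> {status}")

    With the sample data above, only "process" exceeds the tolerance and is flagged.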

    2 votes

  2. It would be great to be able to customise the notifications related to calibration sessions. At the moment this setting is not customisable, and every time an interaction is added to a calibration session, the participants receive an email.

    1 vote

    0 comments  ·  Calibrations
  3. It would be great to add a second confirmation button that must be acknowledged/clicked before any calibration is officially submitted, rather than having it sent immediately upon clicking "Send Evaluation".

    1 vote

    Acknowledged  ·  1 comment  ·  Calibrations
  4. We would like the option to include custom fields when creating a Calibration (for example, a textbox where we can enter the week number, to easily filter those calibration sessions), and have them displayed in the Calibration details (under Summary).

    1 vote

    Acknowledged  ·  1 comment  ·  Calibrations
  5. Right now I have calibration categories I would like to delete, as they are no longer relevant. I'm unable to because they are attached to calibrations, and I can only delete calibrations one by one. Being able to either archive or delete the category would be helpful for new calibrations.

    1 vote

    Acknowledged  ·  0 comments  ·  Calibrations
  6. I'm loving the new layout and the ability to edit the list on the main Calibration page, but it would be great if edits made to the column sizes were saved, so the changes remain when we leave the page and come back.

    Additionally, saving specific filters on any page (reports, workloads, calibrations, etc.) would be helpful.

    Can we also bring back the option to 'open in a new tab' when right-clicking the view button on calibrations?

    1 vote

    Acknowledged  ·  0 comments  ·  Calibrations
  7. Hi PlayVox Team,

    We would like to request an additional column for the agent name under Calibrations. This filter will make it quicker for team leaders and quality assurance personnel to identify which agents have already undergone calibration.

    Hoping to see this feature added to PlayVox.

    1 vote

    Acknowledged  ·  0 comments  ·  Calibrations
  8. The ability to increase the number of users in calibrations, up to 500, through an upload feature. Currently, users have to be selected manually, one by one.

    2 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  9. Participation lists for Calibrations have been implemented (yay!), but they cannot be edited or deleted after creation. We need this capability added ASAP.

    5 votes

    Acknowledged  ·  3 comments  ·  Calibrations
  10. Allow an option to export completed calibration scores. This should include both the scores of all participants and the coaching commentary.

    3 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  11. Sometimes participants submit their answers for a calibration and realize they have made a mistake, or wish to change or add something they thought of afterwards. If the participant reaches out to the expert before the calibration call happens, it would be nice to be able to edit the answer in question or add something that could be useful. This could also affect the overall results in the end and prevent misalignment between the participant's answers and the expert's answers. Perhaps this could be added under the calibration options.

    5 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  12. Currently, calibration results form part of our auditors' KPIs. Being able to see how everyone else is performing in calibrations is causing problems, and could mean we are unable to continue moving our calibrations from our current platform into Playvox.
    Please can there be the ability within roles management to set what any role can view, e.g. own, team, all. Thank you.

    7 votes

    Acknowledged  ·  1 comment  ·  Calibrations
  13. Currently, Playvox gives a 0% score to an evaluation that was not completed because the deadline was missed. The 0% counts towards the calibration's final score, even when there are many other completed evaluations.
    We would like an enhancement so that only completed evaluations in a calibration session are taken into account when generating the score.
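
    A worked example of the arithmetic described above, assuming a simple average across evaluations; the numbers and the averaging rule are illustrative assumptions, not Playvox's documented scoring formula.

        # Worked example: how one missed evaluation scored as 0% drags down
        # the calibration's final score under a simple average (assumed rule).
        completed = [92, 88, 95]  # completed evaluations in the session
        missed = [0]              # evaluation missed by the deadline -> 0%

        # Current behaviour (as described): the 0% is averaged in.
        current = sum(completed + missed) / len(completed + missed)

        # Requested behaviour: only completed evaluations count.
        requested = sum(completed) / len(completed)

        print(f"current: {current:.2f}%")      # 68.75%
        print(f"requested: {requested:.2f}%")  # 91.67%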

    10 votes

  14. Currently, the calibration feature only allows you to set a due date (end of day). We would like to be able to set a date and time that calibration results are due (e.g. 9/29/2022 @ 4pm). This would give us a bit more flexibility in how long we allow users to provide their input.

    3 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  15. Include the reassignment feature that's available for the "evaluate the agent" workloads and extend it to "evaluate the analyst", so I'm able to reassign a new analyst to help with calibrations.

    5 votes

    Acknowledged  ·  2 comments  ·  Calibrations
  16. Currently, reviewing the outcomes of calibrations may result in score adjustments to the original evaluation, and applying those adjustments is a manual action right now. If we could have a button, as an option, to replace the original evaluation score with the expert's updated feedback/scoring, with a banner noting it was updated due to calibration #, that could potentially be more efficient. Ideally, the original evaluation's timestamp/date/analyst would be kept associated with it.

    4 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  17. Exporting data from calibrations: we need the granular data from our calibrations, such as a comparison between the analyst evaluation and the expert evaluation, with the analyst's answers and the expert's answers separated.

    5 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  18. The client would like the expert in a calibration session to be able to see the answers of all the participants in the session before sending their own evaluation.

    2 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  19. It would be great to allow calibration participants an option to edit up until the due date. I understand not wanting them to be able to keep changing answers to try to get the right score.

    But maybe allow one edit; I've had situations as the expert where users second-guess themselves, or go back and listen to a call, and then want to edit but cannot.
    Maybe it could even be an option/permission for a certain role?

    2 votes

    Acknowledged  ·  0 comments  ·  Calibrations
  20. It would be very helpful to see the average scorecard question results based on Calibrations.

    Example: the Analyst's average scorecard question results for a specific period of time vs. the Expert's average scorecard question results for the same period of time.

    Similarly, it would be very helpful to expose this data in the API for easier exporting.
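
    A minimal sketch of the requested comparison, assuming calibration answers can be flattened into (question, role, score) rows; the record layout, names, and sample numbers are hypothetical, not Playvox's data model or API.

        # Per-question average scorecard results, analyst vs. expert, over a
        # period. The rows below stand in for exported calibration answers.
        from collections import defaultdict

        rows = [
            ("greeting", "analyst", 80), ("greeting", "expert", 100),
            ("accuracy", "analyst", 90), ("accuracy", "expert", 90),
            ("greeting", "analyst", 60), ("accuracy", "analyst", 70),
        ]

        totals = defaultdict(lambda: [0, 0])  # (question, role) -> [sum, count]
        for question, role, score in rows:
            totals[(question, role)][0] += score
            totals[(question, role)][1] += 1

        for question in sorted({q for q, _ in totals}):
            parts = []
            for role in ("analyst", "expert"):
                total, count = totals.get((question, role), (0, 0))
                parts.append(f"{role} avg {total / count:.1f}" if count else f"{role} n/a")
            print(f"{question}: " + " vs ".join(parts))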

    2 votes

    Acknowledged  ·  0 comments  ·  Calibrations