Quality

304 results found

  1. There should be the ability for our evaluators to challenge evaluation reviews, similar to how agents can dispute evaluations.

    11 votes

    2 comments  ·  Evaluations

    Hi there!

    Thank you for sharing your ideas with us! 

    Your idea will be carefully evaluated. If it gains traction, it could significantly influence our development roadmap.

    Thanks for being a valuable part of our community. 

    Karina from Product

  2. We can currently dispute scores, which is great, but we rely heavily on the Feedback Options for our reporting to understand what went wrong.

    We also report on this level of error per advisor in many markets, and this means we get disputes that don't necessarily impact the score, but impact the feedback options selected.

    Currently, we can't accept a dispute that changes only the feedback options; we have to change the score.

    So we currently 'reject' the dispute, which is misleading, and then manually edit the original audit so the reporting data is correct.

    Ideally we…

    9 votes

    Acknowledged  ·  0 comments  ·  Disputes

  3. We think the new insights are helpful and love the new graphics, tables, and figures we get from this new setup on the Reporting tab in Playvox.

    We have noticed that it is missing some basic insights, such as:
    • Seeing the individual evaluations rather than just the number of evaluations, and being able to click on them to open and review them
    • Being able to select more than one Agent, Team, and Scorecard

    We also noticed that our Team Lead role doesn't have access to the Scorecard report option. I couldn't see anything under…

    7 votes

    Acknowledged  ·  0 comments  ·  Reports

  4. Reviews function under Quality (review the reviewer/analyst):
    When composing reviews, there is no option to edit the font, highlight text, add links, etc., and there are no paragraph breaks, so the review looks messy and untidy. It would be better to have the same formatting options as Evaluations, for easier organization and follow-up.

    1 vote

    Acknowledged  ·  2 comments  ·  Evaluations

  5. As the QA manager, I want to share the outcome of accepted disputes with the analyst whose evaluation was disputed, so that the analyst can receive information that will help them improve their personal evaluation process.

    The method by which the results are shared could be:
    - An email
    - A coaching session that is triggered by an automatically accepted dispute
    - A coaching session associated with the dispute that is created manually

    If you see this idea and feel an affinity with it, please leave in the comments how you would like to share the dispute results with…

    5 votes

    Acknowledged  ·  0 comments  ·  Disputes

  6. We need to be able to evaluate agents and hide only some evaluations from their view (i.e., to select which evaluations are visible to agents and which are not). That way, if agents check their profile, they won't be able to see the evaluations we don't want them to see. For example: we want to run audits for investigation purposes, but we don't want to give agents visibility, so these audits won't have any impact on them.

    I know we have the option when doing an evaluation to mark if we want to send the notification to agent…

    6 votes

    Acknowledged  ·  1 comment  ·  Evaluations

  7. Hi Team,

    This request is related to the scoring mechanism. In our practice, we use a corresponding rating for a range of scores in our current scorecard, and we would like to have Playvox QM detect the rating according to the total points a particular evaluation gets.

    Here we have attached a screenshot showing how the scoring works for us.

    Thanks, please let us know if it makes sense and if you need more clarification.
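
    The range-based rating this request describes can be sketched as a simple lookup from an evaluation's total points to a rating label. The band boundaries and labels below are illustrative assumptions only; the actual ranges were in the attached screenshot and are not reproduced here.

```python
# Hypothetical rating bands (lower bound, label), highest first.
# These thresholds are made up for illustration, not Playvox's actual values.
RATING_BANDS = [
    (90, "Excellent"),
    (75, "Good"),
    (60, "Needs Improvement"),
    (0, "Unsatisfactory"),
]

def rating_for(total_points: float) -> str:
    """Return the first rating whose lower bound the total score meets."""
    for floor, label in RATING_BANDS:
        if total_points >= floor:
            return label
    # Anything below every floor falls into the lowest band.
    return RATING_BANDS[-1][1]
```

    The idea is that QM would compute `total_points` from the scorecard as it does today, then display the matching rating automatically instead of requiring evaluators to look it up manually.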

    2 votes

    Acknowledged  ·  0 comments  ·  Scorecards

  8. I'd like to select multiple agents in the new agent report and multiple teams in the team report.
    I'd like to compare the selected items whenever I need to.

    3 votes

    Acknowledged  ·  0 comments  ·  Reports

  9. There needs to be an option to hide/omit the evaluation results colors on questions and sections, hide/omit the evaluation section percentage score, hide/omit the question answer number score. See attached screenshot for the items referenced.

    The colors, section percentage score, and question number score are distracting and confusing to end users.

    The staff person receiving the results fails to look at the actual rating, focusing instead on the colors, percentages, or numbers, which can be misleading. If you aren't using a binary answer option, there is a high probability of running into this situation.

    Our business use case…

    3 votes

    Acknowledged  ·  2 comments  ·  Evaluations

  10. Clone filters from one Playvox instance to multiple instances.

    Rather than creating filters from scratch every time for each instance, it would be helpful if we could clone them to other instances. This would save us time and work.

    1 vote

    Acknowledged  ·  0 comments  ·  Filters

  11. In the evaluation tab, it would be incredibly useful to have a feature that identifies whether a completed evaluation originated from an assigned workload or from the interaction tab. Currently, there is no clear way to determine where the evaluation was initially triggered, which can lead to confusion and difficulty in tracking the source of evaluations.

    Implementing a feature that tags or highlights the origin of the evaluation—whether it came from a workload or directly from the interaction tab—would enhance clarity and improve the management of evaluations. This would provide evaluators and managers with better insights and allow for more…

    2 votes

    0 comments  ·  Workloads

    Hi there!

    Thank you for sharing your idea with us! 

    We will thoroughly evaluate it, and if it gains traction, it could greatly impact our roadmap.

    We appreciate your contribution to our community.

    Karina from Product

  12. We don't want to show questions marked N/A to agents on the evaluation results page, so there's less noise on the page.

    2 votes

    Acknowledged  ·  0 comments  ·  Evaluations

  13. It would be great to add a second confirmation button that must be acknowledged/clicked before any calibration is officially submitted, rather than just clicking "Send Evaluation" and it is sent right away.

    1 vote

    Acknowledged  ·  1 comment  ·  Calibrations

  14. In the overview report, I'd like to see all agents across all teams and their main KPIs.
    Also, I'd like to export this information, not just from the agents displayed in the graph but the entire list.

    2 votes

    Acknowledged  ·  0 comments  ·  Reports

  15. We would like the option to include custom fields when creating a Calibration (for example, a textbox where we can enter the week number, to easily filter those calibration sessions), which could then be displayed in the Calibration details (under Summary).

    1 vote

    Acknowledged  ·  1 comment  ·  Calibrations

  16. Have an N/A option for dispute validity, in addition to Accept and Reject. This option would apply when a dispute has to be nullified because it was a gray area and wasn't valid for either of the stakeholders involved.

    2 votes

    Acknowledged  ·  0 comments  ·  Disputes

  17. Hi Team,

    In the current Evaluations tab, the headers are not sufficient for our users to identify the characteristics of those evaluations; we need more information on this tab. There are filters for the custom header fields we created for the scorecard, but they are not convenient, since users need extra steps every single time to find a particular interaction.

    For example, could we display the interaction date, interaction name, and other variables there, according to the metadata we want to see by default?

    Another option is for Playvox to capture the metadata of the…

    1 vote

    0 comments  ·  Evaluations

    Hi!

    Thank you for sharing your ideas with us! 

    Your idea will be carefully evaluated. If it gains traction, it could significantly influence our development roadmap.

    Thanks for being a valuable part of our community. 

    Karina from Product

  18. Allow filtering by satisfaction date from Zendesk. We need this because we report based on satisfaction date for CSAT performance. Currently, we can only filter by created date, solved-at date, or updated-at date.

    2 votes

    Acknowledged  ·  0 comments  ·  Filters

  19. It would be highly beneficial to introduce a feature that allows the identification, reporting, and exporting of skipped evaluations. Currently, if an evaluation is skipped, there is no straightforward way to track or understand the reasons behind it.

    By implementing a system that records when an evaluation is skipped and allows managers and analysts to generate reports or export this data, they can gain valuable insights into patterns or issues that may be causing evaluations to be skipped. This feature would enhance the overall evaluation process by providing transparency and enabling better decision-making.

    1 vote

    Acknowledged  ·  0 comments  ·  Workloads

  20. Hi Team, we noticed that when the Interactions page has multiple pages, each page contains 50 interactions. However, it is inconvenient to count manually when a page is not full, or when there is only one page of interactions after filters are applied.
    Could you add the total interaction count at the bottom of the Interactions page?

    1 vote

    Acknowledged  ·  0 comments  ·  Interactions