Sample Reports

There are more than two dozen reports that can be run within ComQuest. When you consider that most reports allow you to sort or filter by a variety of parameters, including date range, demo, gender, ethnicity, partisan station, and more, the possibilities for creating semi-custom reports on the fly are virtually limitless.

Here are examples of several of the reports in ComQuest:

Song By Song Report

This report details four weeks of trending for Familiarity, Positive Acceptance, Dislike and Burn. These attributes are also detailed for the five demo groups, as well as for P1 listening of the first four stations you're tracking and the first three ethnic groups. Color bar graphs for Familiarity, Positive Acceptance and Burn are featured, as well as a color pie chart for overall attributes.

Song By Song Report

Familiarity Report

The Familiarity Report displays four weeks of trend data, based on the end date you have specified for the report. Results can be filtered by specific demos and genders, and are sorted in descending order based on This Week's test scores.

Familiarity Report

Positive Acceptance

The Positive Acceptance Report displays four weeks of trend data, based on the end date you have specified for the report. Results can be filtered by specific demos and genders, and are sorted in descending order based on This Week's test scores.

Positive Acceptance Report

Burn Factor

The Burn Factor Report displays four weeks of trend data, based on the end date you have specified for the report. Results can be filtered by specific demos and genders, and are sorted in descending order based on This Week's test scores.

Burn Factor Report

Unfamiliarity Report

The Unfamiliarity Report displays unfamiliarity for all songs, sorted in descending order. Positive Acceptance, Burn and Dislike scores are also shown on this report. Results can be filtered by specific demos and genders.

Unfamiliarity Report

Weekly Trends

The Weekly Trends Report displays the five attributes for each song for the last four weeks, based on the end date you specify when the report is run. Results can also be filtered by specific demos and genders.

Weekly Trends Report

Mean Score Report

The Mean Score Report displays the mean score for each song tested, based on a scale of 1-5. This report can be run in a 4-week trend format, as shown below, or in a Rolling Averages format, which lets you define the start and end date parameters.
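
As a rough illustration of the difference between the two formats (the scores below are invented, and this is only a sketch of the arithmetic, not ComQuest's own code), the trend format averages each week separately, while the Rolling Averages format pools every score in the date range into a single mean:

# Hypothetical 1-5 scores for one song, keyed by test week.
weekly_scores = {
    "Week 1": [4, 5, 3, 4],
    "Week 2": [3, 4, 4, 5, 4],
    "Week 3": [4, 4, 5],
    "Week 4": [5, 4, 4, 4],
}

# 4-week trend format: one mean score per week.
for week, scores in weekly_scores.items():
    print(week, round(sum(scores) / len(scores), 2))

# Rolling Averages format: all scores in the range pooled into one mean.
pooled = [s for scores in weekly_scores.values() for s in scores]
print("Rolling average:", round(sum(pooled) / len(pooled), 2))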

Mean Trend Report

Mean Score Report

Compatibility Report

The Compatibility Report allows you to select a target song; perhaps a song that is testing quite well for your station. All other songs tested during that same week are then ranked in descending order, based on how compatible they are with the target song. Essentially, respondents who gave the target song a positive score are isolated. Then the scores this sub-group of respondents gave to all other songs are compiled, and the results are ranked in descending order.
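
A minimal sketch of that logic, assuming per-respondent scores live in a pandas DataFrame with hypothetical respondent, song and score columns, and that a score of 4 or 5 on the 1-5 scale counts as positive (ComQuest's actual thresholds and implementation aren't published):

import pandas as pd

# Hypothetical per-respondent score data for one test week.
scores = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "song":       ["Target", "Song A", "Song B"] * 3,
    "score":      [5, 4, 2, 2, 3, 5, 4, 5, 1],
})

TARGET = "Target"
POSITIVE = 4  # assume a 4 or 5 counts as a positive score

# Isolate respondents who rated the target song positively.
fans = scores.loc[(scores["song"] == TARGET) & (scores["score"] >= POSITIVE), "respondent"]

# Compile the scores this sub-group gave every other song, then rank descending.
compatibility = (
    scores[scores["respondent"].isin(fans) & (scores["song"] != TARGET)]
    .groupby("song")["score"].mean()
    .sort_values(ascending=False)
)
print(compatibility)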

Compatibility Report

Potential Acceptance Report

The Potential Acceptance Report "levels the playing field" for all songs, based on Familiarity. The Positive Acceptance of each song is "weighted" up, as though each song were 100% Familiar. This way, songs that might be eliciting a favorable score, but are not yet totally Familiar, can be easily compared with those songs that are more familiar.
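
The report doesn't spell out the exact weighting formula, but the description suggests scaling each song's Positive Acceptance by its Familiarity, i.e., projecting the positive score it would earn if it were 100% familiar. A minimal sketch of that interpretation, with invented percentages:

# Hypothetical weekly summary figures (percentages of all respondents).
songs = [
    {"title": "New Single",  "familiar": 40.0, "positive": 30.0},
    {"title": "Established", "familiar": 95.0, "positive": 60.0},
]

for s in songs:
    # Weight Positive Acceptance up as though the song were 100% Familiar.
    s["potential"] = s["positive"] / s["familiar"] * 100.0

# Rank descending: the newer, less familiar song can now compete fairly.
for s in sorted(songs, key=lambda s: s["potential"], reverse=True):
    print(s["title"], round(s["potential"], 1))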

Potential Acceptance Report

Crosstab Reports

Any question that is asked of respondents once they are transferred to the Fileserver to take the music test can later be cross-tabbed with the song results. In this sample report, we're cross-tabbing the results of the question "How often do you listen to this (CHR) montage?" against the Positive Acceptance scores of the songs. Crosstab reports can also be run against the Familiarity and Burn Factor scores.
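
As a rough illustration of what such a crosstab computes (the column names and data are hypothetical, and ComQuest's own layout will differ), pandas can pivot respondents' answers to the screener question against the mean score they gave each song:

import pandas as pd

# Hypothetical respondent-level data: screener answer plus a song score.
data = pd.DataFrame({
    "montage_listening": ["Often", "Often", "Sometimes", "Never", "Sometimes"],
    "song":              ["Song A", "Song B", "Song A", "Song A", "Song B"],
    "score":             [5, 4, 3, 2, 5],
})

# Mean score for each song, broken out by answer to the montage question.
crosstab = data.pivot_table(index="song", columns="montage_listening",
                            values="score", aggfunc="mean")
print(crosstab)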

Crosstab Report

Demo Break Reports

There are four Demo Break Reports (Familiarity, Positive Acceptance, Burn and Favorite). Each report breaks down the scores for the desired attribute by the five demo groups you're using, as well as by P1 (partisan) listening patterns for the top 6 stations you're testing. The reports can be sorted to display results in descending order for any of the demos or stations indicated.

Demo Break Report

Ethnicity Reports

Similar to the Demo Break Reports, the results for the four leading attributes (Familiarity, Positive Acceptance, Burn and Favorite) can be displayed for each of the six ethnic groups. A total score for all respondents and a gender breakdown are also displayed. The results can be filtered by date range, demographics and gender, and can be sorted in descending order by any of the ethnic groups. (In this sample report, the station is using "ethnicity" to track whether respondents listen to the radio more than an hour a day or less than an hour a day.)

Ethnicity Report

Partisans Report

The Partisans Report details the Positive Acceptance scores for each of the top 6 stations' P1 listeners, and for the top 3 stations' P2 listeners. This report can be filtered by date range, demographics and gender, and can be sorted by any of the stations.

Partisans Report

Raw Counts Report

The Raw Counts Report is helpful if you want to see exactly how many people gave each score for each song. In this sample report, for example, you can see how many respondents gave each song tested a "1", a "2", a "3", and so on.
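
In code terms (with invented scores), the Raw Counts view is simply a tally of how many respondents chose each point on the 1-5 scale for a song:

from collections import Counter

# Hypothetical raw scores collected for one song during the test week.
raw_scores = [5, 4, 4, 3, 5, 2, 4, 1, 5, 3, 4, 5]

counts = Counter(raw_scores)
for score in range(1, 6):
    print(f"Score {score}: {counts.get(score, 0)} respondents")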

Raw Counts Report

Rolling Averages Reports

While many reports in ComQuest automatically sort your test results into discrete weeks (This Week, Last Week, Two Weeks Ago, etc.), the Rolling Averages Reports are used when you want to see how songs have tested over a specified period. For example, you might run a report, such as the one shown below, for two weeks. This gives you a larger sample, reducing the margin of error and smoothing out weekly wobble in the results. (Some stations even run this report for the entire year, at the end of the year, to get their "year-end" countdown order!) The Rolling Averages Reports can be run sorted by Title, Familiarity, Positive Acceptance, Burn Factor, Dislike or Favorite.
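
The sample-size argument can be made concrete with the standard 95% margin-of-error formula for a proportion (a simplification for illustration; the weekly completes figure here is invented, and ComQuest's reports don't necessarily print this number): pooling weeks grows n, which shrinks the error.

import math

def margin_of_error(n, p=0.5, z=1.96):
    # 95% margin of error for a proportion p with sample size n.
    return z * math.sqrt(p * (1 - p) / n)

weekly_completes = 100  # hypothetical sample per test week
for weeks in (1, 2, 4, 52):
    moe = margin_of_error(weekly_completes * weeks)
    print(f"{weeks} week(s) pooled: +/- {moe * 100:.1f} points")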

Rolling Averages Report

Song History Report

Week-to-week results for a specific song can be reviewed by running the Song History Report. Based on the date range, demographic and gender parameters you've specified, you can see the week-to-week results of any song you've been testing on a regular basis.

Song History Report

Question Detail Report

The results of any closed-ended question asked of respondents once they're transferred to the Fileserver can be reviewed by printing the Question Detail Report. This report details results by demo, gender, ethnicity, and cume and preference listening. Results can be filtered by specific demos and genders.

Question Detail Report

System Reports

Aside from the various reports that display song and question score results, there are many reports in ComQuest used to monitor efficiency and productivity. This is an example of the Downtime Report, which displays interviewer lulls between calls for the test week underway.

Downtime Report

Cume & Preference Reports

One of the nice "by-products" of doing weekly call-out music research is that extensive listening information is also collected. The Cume and Preference Reports display the results of the screening information (stations cumed/preferred, gender, demo) for any date range you specify. You can monitor the results daily, weekly, monthly, quarterly, etc. (Up to 45 stations can be tracked in all Cume/Preference reports.) There are 7 different cume and preference reports in ComQuest. Here is one: a sample Cume Report with demo breakdowns.

Cume Report

6-Week Rolling Cume/Preference

You can also print or view Cume and Preference over a 6-week period. In addition, you can define how many weeks to roll together for each of the six cycles. In this example, we have rolled together four weeks of 18-34 Female sample into each cycle. This report helps you spot changes in listening patterns and other trends developing in your market before they're detected by Arbitron.
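
A sketch of how those cycles could be assembled (the weekly cume counts are invented, and this only illustrates the windowing, not ComQuest's code): each of the six reported cycles averages the most recent four weeks of sample available at that point.

# Hypothetical weekly 18-34 Female cume counts for one station, oldest first.
weekly_cume = [52, 48, 55, 60, 58, 61, 57, 63, 59]
ROLL = 4  # weeks rolled together per cycle

# Build the six most recent cycles, each averaging up to ROLL weeks.
cycles = []
for end in range(len(weekly_cume), len(weekly_cume) - 6, -1):
    window = weekly_cume[max(0, end - ROLL):end]
    cycles.append(sum(window) / len(window))

for i, value in enumerate(reversed(cycles), start=1):
    print(f"Cycle {i}: {value:.1f}")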

6-Week Cume Report

Incidence Report

Overall system efficiency can be monitored through the Incidence Report. Many stations print this report each day, or at least weekly, to keep tabs on the demo/gender/ethnic quotas, and how close they are to being met.

Incidence Report

Interviewer Incidence

In addition to overall Incidence, efficiency for each Interviewer can also be monitored. ComQuest keeps track of the calling time logged by each Interviewer, and details total numbers of calls attempted, as well as the results of those calls (Not Qualified, Completes, Terminates, Call Backs, etc.). The results are also displayed on a per-hour basis, as well as a percentage of all calls made.

Interviewer Incidence

Perceptual Results

Perceptual Surveys are a powerful bonus to the ComQuest system. There are eight different Perceptual Reports, allowing you to change crosstab and sort-order specs for four types of Perceptual questions: Pick List, Yes/No, Numeric and Verbatim. Here is an example of the print-out of a basic Pick List question. The results are ranked by responses, and an overall percentage is also displayed.

Verbatim Responses

There are many other reports available in ComQuest, not displayed on this page, including: Positive (No Unfamiliar), High Negatives, Gender Variance, Zip Code Reports, Question History and Interviewer Time Sheets.

All ComQuest reports can be run individually, or in batch mode. The batch feature allows you to define the parameters for any number of reports and print them all consecutively, at the click of a button!

That concludes our little tour of ComQuest reports. You can receive a complete set of actual ComQuest reports, simply by clicking on the Send Me More Info! link here, or below.

Send Me More Info! | Frequently Asked Questions | Introduction & Overview

ComQuest, LLC.