Conducting a large-scale survey, particularly one involving minors and their parents, requires meticulous planning, rigorous execution, and transparent reporting of methodology. This report details the approach employed for a study conducted from September 25 to October 9, 2025, which gathered insights from 1,458 dyads, each consisting of a U.S. teen (ages 13-17) and one of their parents. Understanding the survey's architecture, from participant recruitment and data collection to ethical oversight and statistical weighting, is essential for interpreting its findings and assessing its reliability.
Rigorous Ethical Oversight and Participant Protection
Central to the survey's design was a commitment to the ethical treatment of all participants, especially minors. The research plan underwent thorough scrutiny by Advarra, an external institutional review board (IRB). An IRB is a specialized committee of experts dedicated to safeguarding the rights and welfare of research subjects. This particular study received a full board review due to the inherent sensitivities and potential risks associated with surveying minors. The approval, bearing the ID Pro00089395, signifies that the IRB vetted the research protocols before any data collection commenced, ensuring that participant protection was paramount. This ethical framework provides a foundational layer of trust and credibility for the study's outcomes.

The Foundation: Ipsos’ KnowledgePanel and Recruitment Strategy
The survey leveraged Ipsos Public Affairs’ KnowledgePanel, a distinguished, nationally representative online research panel. KnowledgePanel is designed to overcome the limitations of traditional online surveys by ensuring inclusivity across different demographics and internet access levels. Panel members are recruited through sophisticated probability sampling methods. Critically, KnowledgePanel proactively addresses the digital divide by providing internet access and, if necessary, devices to individuals who lack them at the time of recruitment. This commitment to comprehensive coverage is vital for achieving a sample that accurately reflects the broader U.S. population.
The recruitment methodology of KnowledgePanel has evolved over time to enhance its representativeness. Initially, it relied exclusively on a national random-digit-dialing (RDD) sampling methodology. In 2009, Ipsos transitioned to an address-based sampling (ABS) recruitment methodology. This approach utilizes the U.S. Postal Service's Computerized Delivery Sequence File, which is estimated to cover approximately 98% of U.S. household addresses, though some studies suggest coverage may be closer to the low 90% range for certain populations. The shift to ABS further strengthens the panel's ability to capture a diverse and geographically dispersed sample.
Targeted Invitation and Dyadic Completion
The selection process for this specific survey was highly targeted. Panelists were invited to participate if their profiles indicated they were parents of teens aged 13 to 17. A random sample of 3,516 panel members received invitations. To ensure eligibility, these individuals were rescreened. Those who reconfirmed being the parent of at least one child within the specified age range were deemed eligible.

The survey design captured data from both the parent and the teen in each dyad. Eligible parents were first asked a series of questions about their teenage child. They were then asked for permission to contact their teen to complete a separate questionnaire. In households with more than one eligible teen, one teen was randomly selected, and the parent's questions referred to that teen. The teen questionnaire was completed by the teen, following the parent's consent and initial screening. A survey was considered complete only when both the parent and the designated teen finished their respective portions of the questionnaire. This dyadic completion is a cornerstone of the study's ability to compare the perspectives of both generations on shared topics.
Response Rates and Data Integrity
Participation proceeded in stages, each with its own response rate. Of the 3,516 invited panelists, 2,331 responded to the invitation, and 2,067 of those were confirmed as eligible. From these eligible households, 1,458 parent-teen dyads completed both parts of the survey, yielding an eligibility rate of 89% and a final-stage completion rate of 71%.
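The stage rates above follow directly from the stated counts; a quick arithmetic check (values rounded as in the text):

```python
# Counts reported in the methodology
invited = 3516
responded = 2331   # responded to the invitation
eligible = 2067    # reconfirmed as parents of a teen aged 13-17
completed = 1458   # dyads where both parent and teen finished

eligibility_rate = eligible / responded   # ~0.887, reported as 89%
completion_rate = completed / eligible    # ~0.705, reported as 71%

print(f"eligibility: {eligibility_rate:.1%}, completion: {completion_rate:.1%}")
```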
The overall study-level response rate, calculated using the American Association for Public Opinion Research (AAPOR) RR1 standard, stood at 45%. This figure represents the proportion of completed interviews out of all eligible units surveyed. Furthermore, accounting for nonresponse during the initial recruitment phases and any attrition from the panel over time, the cumulative response rate was 1.2%. While a 45% response rate for a complex study involving two distinct participants per unit is respectable, the lower cumulative rate underscores the challenges inherent in maintaining participation across a broad panel over extended periods. Understanding these rates is vital for assessing the potential for nonresponse bias.

Incentives for Participation
To encourage participation and acknowledge the time and effort involved, qualified respondents received incentives. Parents and teens who successfully completed their respective survey portions were awarded a cash-equivalent incentive valued at $10. Recognizing the importance of diverse representation, a higher incentive of $40 was offered to non-Hispanic Black panelists. This strategy aims to boost response rates within specific demographic groups that might be historically underrepresented in survey research, thereby enhancing the study’s equity and generalizability.
Communication and Field Period
The survey was administered in both English and Spanish to accommodate a wider range of participants. Communication with potential respondents was primarily through email invitations, with nonresponders receiving timely reminders. The entire data collection period, referred to as the "field period," spanned from September 25 to October 9, 2025, ensuring a defined timeframe for data gathering.
Advanced Weighting for Representative Analysis
A critical component of ensuring the findings accurately represent the U.S. teen and parent population is the sophisticated weighting methodology employed. Separate weights were constructed for parents and teens to reflect their distinct sampling probabilities and to correct for potential nonresponse biases.

Parent Weighting: The process began with a base design weight for each parent, calculated to represent their initial probability of being selected for the KnowledgePanel. These weights were then adjusted to account for the probability of selection for this specific survey, which included oversampling of Black and Hispanic parents to ensure adequate representation from these groups. Following this, an iterative raking technique was applied. This method systematically adjusted the parent design weights to align with established population benchmarks for parents of teens aged 13-17. The raking process targeted key demographic dimensions (as detailed in accompanying tables), thereby compensating for any differential nonresponse that might have occurred across various subgroups.
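The raking step can be illustrated with a minimal sketch. Raking (iterative proportional fitting) repeatedly rescales each respondent's weight so that weighted category shares match external benchmarks, one dimension at a time, until the adjustments converge. The dimensions and targets below are toy values for illustration only; they are not the survey's actual benchmarks.

```python
def rake(weights, categories, targets, iterations=50):
    """Iterative raking: adjust unit weights so weighted category shares
    match target population shares on each dimension in turn.

    weights    : list of design weights, one per respondent
    categories : dict mapping dimension name -> each respondent's category
    targets    : dict mapping dimension name -> {category: target share}
    (Assumes every target category appears in the sample.)
    """
    w = list(weights)
    total = sum(w)
    for _ in range(iterations):
        for dim, cats in categories.items():
            # Current weighted total of each category on this dimension
            sums = {}
            for wi, c in zip(w, cats):
                sums[c] = sums.get(c, 0.0) + wi
            # Scale each respondent's weight toward the benchmark share
            w = [wi * (targets[dim][c] * total) / sums[c]
                 for wi, c in zip(w, cats)]
    return w

# Toy example: rake four uniform weights to two marginal benchmarks
categories = {"region": ["N", "N", "S", "S"],
              "sex":    ["F", "M", "F", "M"]}
targets = {"region": {"N": 0.6, "S": 0.4},
           "sex":    {"F": 0.5, "M": 0.5}}
raked = rake([1.0] * 4, categories, targets)
```

After raking, the weighted share of "N" respondents is 60% and of "F" respondents is 50%, matching the targets while the total weight is preserved.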
Teen Weighting: To create the teen weight, an adjustment factor was applied to the final parent weight to reflect the selection of a single teen within each household. The teen weights were then further refined through raking to match the demographic distribution of U.S. teens aged 13-17 who live with parents. The teen weighting dimensions mirrored those used for parents, with the exception of education, which was not used in the teen weighting process.
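The report does not specify the exact within-household adjustment factor. A common form of this adjustment, shown here as an assumption rather than the study's confirmed method, multiplies the parent's final weight by the number of age-eligible teens, i.e. the inverse of the selected teen's probability of being chosen:

```python
def teen_base_weight(parent_weight, n_eligible_teens):
    """Hedged sketch of a within-household selection adjustment:
    with one teen drawn at random, P(selection) = 1 / n_eligible_teens,
    so the weight is inflated by n_eligible_teens. This factor is an
    illustrative assumption, not taken from the report."""
    if n_eligible_teens < 1:
        raise ValueError("household has no eligible teen")
    return parent_weight * n_eligible_teens

# A household with two eligible teens doubles the selected teen's base weight
w = teen_base_weight(1.5, 2)
```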
Understanding Sampling Error and Limitations
The analysis of survey data is always subject to sampling error, the variation that arises from using a sample rather than the entire population. For this study, the margin of sampling error for the full sample of 1,458 teens was plus or minus 3.3 percentage points; the same margin applied to the full sample of 1,458 parents. These margins indicate the range within which the true population value is likely to fall, at a 95% level of confidence.
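The standard formula for the margin of error on a proportion, inflated by a design effect to account for weighting, reproduces the reported figure. The design effect value of roughly 1.65 below is back-derived from the reported margin for illustration; the report does not state it.

```python
import math

def moe(n, deff=1.0, p=0.5, z=1.96):
    """95% margin of error in percentage points for a proportion p,
    sample size n, and design effect deff (deff=1 means simple random
    sampling; p=0.5 gives the most conservative margin)."""
    return 100 * z * math.sqrt(deff * p * (1 - p) / n)

srs_margin = moe(1458)                  # ~2.6 points under simple random sampling
weighted_margin = moe(1458, deff=1.65)  # ~3.3 points with an illustrative design effect
```

The gap between the two values shows why weighted surveys report wider margins than the raw sample size alone would suggest.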

It is imperative to acknowledge that sampling error is not the sole source of potential error in survey research. Practical difficulties in conducting surveys, such as question wording, respondent interpretation, and the inherent complexities of data collection, can introduce additional error or bias into the findings. The report explicitly notes that sampling errors and tests of statistical significance take into account the effect of weighting. Detailed tables outlining unweighted sample sizes and the expected margins of error for various subgroups at the 95% confidence level are provided to offer further transparency into the data’s precision. Subgroup-specific sample sizes and sampling errors are available upon request, allowing for more granular analysis where needed.
Broader Implications and Future Research
The meticulous methodology employed in this 2025 teen and parent survey underscores a commitment to generating reliable and representative data. By prioritizing ethical oversight, utilizing a robust national panel, employing a targeted recruitment strategy, and implementing advanced weighting techniques, the study aims to provide valuable insights into the perspectives of American teenagers and their parents. The detailed reporting of sampling error and potential limitations allows for a critical assessment of the findings.
The success of such research hinges not only on the data collected but also on the clarity with which its methodological underpinnings are communicated. This overview reflects the rigor involved in contemporary survey research and provides a solid foundation for the subsequent analysis and interpretation of the study's substantive findings. The insights gleaned from such well-executed studies are valuable for policymakers, educators, parents, and anyone seeking to understand the evolving landscape of adolescent life and family dynamics in the United States. The availability of detailed disposition and response rate tables further solidifies the transparency of the research process, allowing for a deeper understanding of the study's reach and representativeness.
