The methodology behind the latest findings from Pew Research Center’s American Trends Panel (ATP) reflects a carefully designed survey process. Wave 189 of the ATP, conducted from March 16 to March 22, 2026, completed interviews with 3,524 of 4,053 sampled U.S. adults, a survey-level response rate of 87%. This high response rate supports the reliability of the data and provides a solid foundation for understanding public opinion and experiences.
Overview of the Survey and Its Rigor
The American Trends Panel (ATP) is Pew Research Center’s flagship survey panel, built for national representativeness and methodological rigor. Wave 189, the most recent iteration, was fielded over one week in mid-March 2026. The 87% survey-level response rate reflects strong panel engagement and effective data collection.
Beyond the wave-level response, the ATP also tracks cumulative response rates, which account for nonresponse at initial recruitment and attrition over the life of the panel. The cumulative response rate for Wave 189 is 3%. This figure is low not because the current survey performed poorly but because nonresponse compounds across stages: households that never joined the panel and panelists who later dropped out all count against it, even though active panelists respond at high rates. The break-off rate, the share of panelists who began the survey but did not complete it, was 1%, indicating the instrument was easy to navigate.
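To illustrate why the cumulative figure is so much lower than the wave-level rate, the stages can be multiplied together. In the sketch below, only the 87% wave rate and the roughly 3% cumulative result come from the report; the recruitment and retention rates are hypothetical placeholders:

```python
# How a cumulative response rate compounds across stages.
# Only wave_rate (87%) and the ~3% product are reported figures;
# recruitment_rate and retention_rate are assumed for illustration.
recruitment_rate = 0.05  # share of sampled households that joined the panel (assumed)
retention_rate = 0.70    # share of recruits still active in the panel (assumed)
wave_rate = 0.87         # Wave 189 survey-level response rate (reported)

cumulative = recruitment_rate * retention_rate * wave_rate
print(f"cumulative response rate: {cumulative:.1%}")  # → cumulative response rate: 3.0%
```

Any combination of component rates whose product is about 0.03 would be consistent with the reported figure; the point is the multiplication, not the specific inputs.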

The margin of sampling error for the full sample of 3,524 respondents is plus or minus 1.8 percentage points at the 95% confidence level. That is, if the survey were repeated many times, the reported percentages would fall within 1.8 points of the value obtained by interviewing all U.S. adults in 95% of samples. This precision is particularly important for making nuanced observations about public opinion.
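The reported margin can be approximated from the standard formula for a proportion, inflated by a design effect for weighting. The design effect below is not reported; it is an assumption back-solved so the result matches the published plus or minus 1.8 points:

```python
import math

n = 3524     # completed interviews (reported)
p = 0.5      # most conservative proportion for the margin
z = 1.96     # critical value for 95% confidence
deff = 1.19  # design effect from weighting (assumed, back-solved)

moe_srs = z * math.sqrt(p * (1 - p) / n)  # simple-random-sample margin
moe = moe_srs * math.sqrt(deff)           # weighting-adjusted margin
print(f"SRS: {moe_srs * 100:.2f} pts, adjusted: {moe * 100:.1f} pts")
# → SRS: 1.65 pts, adjusted: 1.8 pts
```

The gap between the two figures shows why weighted surveys report margins somewhat larger than the textbook simple-random-sample formula would suggest.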
To represent smaller demographic groups accurately, the survey included an oversample of non-Hispanic Asian adults. Oversampling yields more precise estimates for a group that would otherwise appear in small numbers in a standard national sample. Oversampled groups are subsequently weighted back to their correct proportions of the U.S. adult population, preserving the overall representativeness of the data.
Data collection was conducted by SSRS on behalf of Pew Research Center in two modes. Most interviews, 3,383, were administered online; the remaining 141 were conducted by live telephone interviewers. The dual-mode design includes adults who lack consistent internet access or prefer the phone. Interviews were conducted in both English and Spanish. Detailed documentation of the ATP’s operational framework is available on Pew Research Center’s website under "About the American Trends Panel."
Panel Recruitment: A Foundation of Trust and Reach
The integrity of any survey rests on its recruitment methodology. Since 2018, the ATP has recruited primarily through address-based sampling (ABS): study cover letters and pre-incentives are mailed to a stratified, random sample of households drawn from the U.S. Postal Service’s Computerized Delivery Sequence File, which is estimated to cover 90% to 98% of the nation’s residential population. Within each sampled household, the adult with the next birthday is asked to participate. The core ABS protocol has remained consistent, though specific details of the recruitment process have evolved over time and are available upon request. Before 2018, the ATP recruited through landline and cellphone random-digit-dial surveys conducted in English and Spanish.

The ATP has added to its national sample of U.S. adults every year since 2014. In some years, recruitment included targeted oversampling to improve the precision of estimates for underrepresented groups: Hispanic adults were oversampled in 2019, Black adults in 2022, and Asian adults in 2023.
Sample Design: Ensuring Representativeness
The target population for Wave 189 was noninstitutionalized adults ages 18 and older living in the United States. The wave used a stratified random sample drawn from the existing ATP. Non-Hispanic Asian adults were selected with certainty to ensure adequate representation; all other panelists were sampled at rates set, to the greatest extent feasible, to match their shares of the U.S. adult population. Respondent weights are later adjusted for any differential probabilities of selection, as described in the weighting section.
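The selection logic described above, certainty strata plus proportional rates elsewhere, can be sketched as follows. The stratum labels, sampling rates, and toy panel are hypothetical illustrations, not the ATP’s actual design:

```python
import random

def select_wave_sample(panel, rates, certainty_strata, seed=0):
    """Stratified random selection: certainty strata are taken in full,
    other strata are sampled at their specified rates."""
    rng = random.Random(seed)
    sampled = []
    for person in panel:
        stratum = person["stratum"]
        if stratum in certainty_strata:
            sampled.append(person)  # selected with probability 1
        elif rng.random() < rates.get(stratum, 0.0):
            sampled.append(person)  # selected at the stratum's rate
    return sampled

# Toy panel: every 20th panelist belongs to the certainty stratum.
panel = [{"id": i, "stratum": "asian_nh" if i % 20 == 0 else "other"}
         for i in range(1000)]
sample = select_wave_sample(panel, rates={"other": 0.5},
                            certainty_strata={"asian_nh"})
```

Because certainty-selected panelists enter the sample with probability 1 while others enter at lower rates, their selection probabilities differ, which is exactly what the later weighting adjustment corrects.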
Questionnaire Development and Testing: A Commitment to Clarity
The questionnaire was developed jointly by Pew Research Center and SSRS to ensure the questions were relevant to the research objectives, clear, unbiased, and easily understood by a diverse range of respondents. Before launch, the SSRS project team and Pew Research Center researchers tested the web program on both personal computers and mobile devices. Testing included generating test data and analyzing it in SPSS to confirm that all skip logic and randomizations functioned as intended.

Incentives: Encouraging Participation
All Wave 189 participants were offered a post-paid incentive, delivered either as a check or as a digital gift code redeemable at Amazon.com, Target.com, or Walmart.com. Amounts ranged from $5 to $15, with larger amounts directed to demographic groups that are harder to reach. This tiered structure is designed to raise participation among groups with historically lower response propensities, strengthening the representativeness of the data.
Data Collection Protocol: A Multi-Channel Approach
Data collection for Wave 189 ran from March 16 to March 22, 2026, using two modes: self-administered web surveys and live telephone interviewing.
For the online survey, notification proceeded in stages. Postcard notifications were mailed to a subset of participants on March 16. Invitations then went out in two phases: a soft launch on March 16 to 60 panelists, allowing an initial check of the invitation system, and a full launch on March 17 to all remaining English- and Spanish-speaking sampled online panelists.

Online panelists received an email invitation, followed by up to two email reminders for nonrespondents. Panelists who had previously consented to SMS messages also received an SMS invitation with a direct link to the survey, followed by up to two SMS reminders. This layered approach maximizes participation without being overly intrusive.
Phone panelists followed a parallel protocol. Prenotification postcards were mailed on March 13. A telephone soft launch began on March 16, with interviewers dialing until nine interviews were completed; for the remainder of the field period, all remaining English- and Spanish-speaking sampled phone panelists were dialed. Panelists could receive up to six calls from trained SSRS interviewers.
Data Quality Checks: Upholding Integrity
Researchers conducted data quality checks to identify respondents showing signs of "satisficing," that is, answering without fully engaging with the questions. Checks included flagging respondents with very high rates of unanswered items and respondents who consistently selected the first or last option across a series of questions. As a result of these checks, eight ATP respondents were removed from the dataset before weighting and analysis.
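A minimal version of those two checks might look like the function below. The thresholds and data layout are assumptions for illustration, not Pew’s actual rules:

```python
def flag_satisficing(responses, n_questions, refusal_cutoff=0.5):
    """responses: one entry per question, either None (skipped) or a
    (chosen_index, n_options) pair. Returns a flag string or None."""
    skipped = sum(1 for r in responses if r is None)
    if skipped / n_questions > refusal_cutoff:
        return "high item nonresponse"
    answered = [r for r in responses if r is not None]
    if answered and all(idx == 0 for idx, _ in answered):
        return "straightlined first option"
    if answered and all(idx == n_opts - 1 for idx, n_opts in answered):
        return "straightlined last option"
    return None

# A respondent who picks the first option on every question is flagged.
print(flag_satisficing([(0, 4), (0, 5), (0, 3)], n_questions=3))
# → straightlined first option
```

Flagged cases would then be reviewed and, as in Wave 189, removed before weighting rather than silently corrected.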
Weighting: Aligning Data with Reality

ATP data are weighted in a multistep process that begins with a base weight reflecting each panelist’s probability of recruitment into the panel. These weights are then calibrated to population benchmarks, presented in the accompanying tables, to correct for nonresponse in the recruitment surveys and for panel attrition. When a wave invites only a subsample of panelists, the base weight is further adjusted for any differential probabilities of selection into that subsample.
After the survey, respondents’ weights are calibrated again to population benchmarks so the analyzed sample matches the demographic composition of the U.S. adult population on key characteristics. To limit the loss of precision that extreme weights can cause, weights are trimmed at the 1st and 99th percentiles. All calculations of sampling error and tests of statistical significance account for the weighting, so the reported margins of error reflect these adjustments.
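The two steps described above, calibration to benchmarks and percentile trimming, can be sketched with a simple raking loop. This is a minimal illustration under simplifying assumptions; the ATP’s actual weighting uses many more dimensions and specialized survey software:

```python
def rake(weights, groups, targets, n_iters=50):
    """Iterative proportional fitting (raking): adjust weights so each
    dimension's weighted category shares match the target shares.
    groups:  dim -> list of category labels (one per respondent)
    targets: dim -> {category: target share}"""
    w = list(weights)
    total = sum(w)
    for _ in range(n_iters):
        for dim, labels in groups.items():
            sums = {}
            for wi, lab in zip(w, labels):
                sums[lab] = sums.get(lab, 0.0) + wi
            for i, lab in enumerate(labels):
                w[i] *= targets[dim][lab] * total / sums[lab]
    return w

def trim(weights, lo_pct=1, hi_pct=99):
    """Clip weights at approximately the given percentiles."""
    s = sorted(weights)
    lo = s[round(lo_pct / 100 * (len(s) - 1))]
    hi = s[round(hi_pct / 100 * (len(s) - 1))]
    return [min(max(wi, lo), hi) for wi in weights]

# Toy example: four equal weights raked to a 49/51 benchmark.
groups = {"sex": ["m", "m", "f", "f"]}
targets = {"sex": {"m": 0.49, "f": 0.51}}
raked = rake([1.0] * 4, groups, targets)
```

Trimming trades a small amount of bias for variance reduction: clipping the heaviest weights keeps a handful of respondents from dominating the estimates.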
The accompanying tables report the unweighted sample sizes and the expected sampling error at the 95% confidence level for the demographic groups analyzed in the survey, so readers can gauge the precision of estimates for each segment of the population. Sampling error is not the only source of error: question wording and practical difficulties in conducting surveys can also introduce error or bias into the findings of any opinion poll.
Dispositions and Response Rates: A Comprehensive View
Final dispositions and detailed response rates for Wave 189 of the American Trends Panel are presented in accompanying tables, which show how many sampled individuals completed the survey, refused, were unreachable, or were deemed ineligible, among other categories. The cumulative response rate is also detailed there. This level of reporting allows researchers and interested readers to evaluate the survey’s methodology and the confidence that can be placed in its results.
