
The Project Ownership Survey: Measuring Differences in Scientific Inquiry Experiences

    Published Online: https://doi.org/10.1187/cbe.13-06-0123

    Abstract

    A growing body of research documents the positive outcomes of research experiences for undergraduates, including increased persistence in science. Studies of undergraduate laboratory learning experiences have demonstrated that the design of the experience influences the extent to which students report ownership of the project and that project ownership is one of the psychosocial factors involved in student retention in the sciences. To date, methods for measuring project ownership have not been suitable for collecting larger data sets. The current study aims to rectify this by developing, presenting, and evaluating a new instrument for measuring project ownership. Eighteen scaled items were generated based on prior research and theory related to project ownership and combined with 30 items shown to measure respondents' emotions about an experience, resulting in the Project Ownership survey (POS). The POS was analyzed to determine its dimensionality, reliability, and validity. The POS had a coefficient alpha of 0.92, indicating high internal consistency. Known-groups validity was assessed through the instrument's ability to differentiate between students who studied in traditional versus research-based laboratory courses. The POS scales differentiated between the groups, and the findings paralleled previous results in relation to the characteristics of project ownership.

    INTRODUCTION

    Numerous calls for reform in undergraduate biology education emphasize the value of undergraduate research (e.g., American Association for the Advancement of Science, 2011). These calls are based on a growing body of research documenting how students benefit from research experiences (Kremer and Bringle, 1990; Kardash, 2000; Rauckhorst et al., 2001; Hathaway et al., 2002; Bauer and Bennett, 2003; Lopatto, 2004, 2007, 2010; Seymour et al., 2004; Hunter et al., 2007; Russell et al., 2007; Laursen et al., 2010; Thiry and Laursen, 2011). Undergraduates report cognitive gains, such as learning to think and work like a scientist; affective gains, such as finding research enjoyable and exciting; and behavioral outcomes, such as intentions to pursue further education or careers in science. Studies of undergraduate research experiences have focused primarily on internship-style research, in which individual undergraduates participate in research as apprentices to graduate, postdoctoral, or faculty mentors.

    Many colleges and universities lack the research infrastructure to involve undergraduates in internship-style research experiences or simply cannot accommodate large undergraduate populations in internships (Wood, 2003; Desai et al., 2008). As a result, an increasing number of faculty members are employing “alternatives to the apprenticeship model” (Wei and Woodin, 2011)—scalable ways of involving students in research. Course-based undergraduate research experiences, or CUREs, involve whole classes of students in research projects that build on current science knowledge and involve students in the range of scientific practices, from asking questions to collecting, analyzing, and interpreting data, to building models and communicating their findings. In many of these projects, such as the Genomics Education Partnership or the Partnership for Research and Education in Plants (Dolan et al., 2008; Shaffer et al., 2010), students’ findings have been published or have the potential to be published. Such courses can also engage introductory students or others who have chosen not to pursue internship-style research. Thus, CUREs may influence students’ academic and career paths more than summer research experiences, which typically serve to confirm students’ prior academic or career choices (Lopatto, 2004, 2007; Seymour et al., 2004; Hunter et al., 2007). Students who participate in CUREs report many of the same gains as students who participate in more intensive but less scalable lab- or field-based internships (Goodner et al., 2003; Hatfull et al., 2006; Drew and Triplett, 2008; Lopatto et al., 2008; Caruso et al., 2009; Shaffer et al., 2010).

    To date, most assessment of CUREs has made use of the Classroom Undergraduate Research Experiences (CURE) survey (Lopatto, 2010). The CURE survey comprises three elements: 1) an instructor report of the extent to which the learning experience resembles the practice of science research (e.g., outcomes of the research are unknown, students have some input into the focus or design of the research), 2) a student report of learning gains, and 3) a student report of attitudes toward science. The student portions of the CURE survey include a series of Likert-type items about students' attitudes toward science and their educational and career interests, as well as students' perceptions of the learning experience, the nature of science, their own learning styles, and the science-related skills they developed as a result of participating in a CURE. Use of the CURE survey is an important first step in understanding the impacts of these educational experiences, but information concerning this instrument's dimensionality, validity, or reliability is not readily available. Another instrument used to measure the outcomes of research experiences is the Undergraduate Research Student Self-Assessment (URSSA; Hunter et al., 2009; Laursen et al., 2010). A large qualitative study provided the empirical evidence for URSSA development, serving as the basis for its construct validity. The URSSA and the CURE survey, as well as the CURE survey's sister instrument, the SURE (Survey of Undergraduate Research Experiences; Lopatto, 2004, 2007), aim to document outcomes of research experiences, rather than measure or discriminate between elements of CUREs that lead to particular student outcomes. In fact, for both CUREs and undergraduate research experiences in general, the design elements that lead to such desired outcomes as persistence in the sciences have yet to be elucidated (Sadler et al., 2010; Adedokun et al., 2013).

    Project ownership has been proposed as one of these elements (Lopatto, 2003). Previous research on project ownership (Kennedy, 1994; Chung et al., 1998; Downie and Moore, 1998; Mason et al., 2004; Nail, 2007; Wiley, 2009; Hanauer et al., 2012) has explored the concept as part of educators’ increasing interest in understanding and measuring how social interactions influence students’ psychological development. The most developed of these studies, from a measurement perspective, was Hanauer et al. (2012), which utilized a computational linguistic and content analysis approach to measure project ownership, differentiate among undergraduate research experiences, and show connections to student retention. The aim of the present study is to evaluate the reliability and validity of a new instrument for assessing project ownership in undergraduate research experiences. The instrument is called the Project Ownership survey (POS), and it was developed from the linguistic and content categories presented in Hanauer et al. (2012).

    Ownership in education has been related to student choices, engagement, emotional involvement, and personal connectivity (Kennedy, 1994; Chung et al., 1998; Downie and Moore, 1998; Mason et al., 2004; Nail, 2007; Wiley, 2009). Ownership as a concept integrates personal responsibility with commitment to and identification with the work conducted in the educational setting (Wiley, 2009). As such, measuring project ownership offers the potential to capture a particular orientation toward work conducted within the sciences. Hatfull (2010), in a discussion of his CURE dedicated to bacteriophage isolation and genomic description, makes the connection explicit by relating scientific discovery, positive emotion, a sense of accomplishment, motivation, and ownership. This approach emphasizes the relationship between the design components of the experience (such as facilitating the option of discovering a novel organism) and student emotive responses. Milner-Bolotin (2001) has also specified that the development of ownership is sensitive to the specific components manifested in an education program.

    Hanauer and colleagues (2012) specifically defined the construct of project ownership and its relationship to educational experiences. This study involved interviewing students who had participated in different undergraduate research experiences and then carefully coding their expressed experiences for indicators or counterindicators of ownership. Three undergraduate research experiences were explored: a research-based field and laboratory course (Scott Strobel's Rainforest Expedition and Laboratory [REAL] program), internship-style independent research experiences, and more traditional laboratory courses in biochemistry and chemistry. The result of this analysis was a set of content statements that characterize these different experiences in terms of aspects of project ownership. The following five categories of project ownership statement were found to differentiate between the three educational experiences:

    1. Constructing connections between personal history and scientific inquiry: This category includes both statements and narratives that describe significant moments in a student's life that have shaped, influenced, and given rise to the scientific work done in the laboratory and in the field. These statements and narratives involve a student bringing past experience, in the form of personal stories and prior educational experiences, into current and future research.

    2. Agency combined with mentorship: This category identifies moments when the student actively sought advice, assistance, or direction from professors, teachers, and other students in order to overcome an issue or fulfill an aim in the student's research project. These moments represent co-contributions and knowledge building among students and between students and educators.

    3. Expressions of excitement toward scientific inquiry: This category identifies moments of genuine excitement about the process of scientific inquiry. The code reveals when students show real emotional connections to the work they are performing, and these statements express positive emotional engagement with involvement in science.

    4. Overcoming challenging moments in science: This category identifies statements that address strategies for overcoming frustrating moments or problems encountered in research. The students discussed how they approached problems by adjusting their work or predicted how they could develop an entirely different approach to work around the problem.

    5. Expressions of a sense of personal scientific achievement: This category describes positive emotional expressions upon achieving a specific goal. It captures a particular moment in student work in which students reference a specific finding or discovery and describe how that finding resulted in pride, happiness, or satisfaction.

    For each of these categories, frequencies of usage were higher for students who participated in the research-based course than for those in the independent research experiences or the traditional laboratory courses. Furthermore, students from the independent research group had higher frequencies of usage for these categories than the traditional laboratory group, but lower frequencies than the research-based course group. In this sense, these categories operationalized the concept of project ownership in a way that allowed differences in expressed educational experience to emerge.

    An additional finding of the Hanauer et al. (2012) study was the presence of increased levels of emotive language for students who studied in the research-based laboratory course. This suggested that project ownership included not only personal connectivity, agency, problem solving, social interaction, and a sense of personal achievement, but also increased emotional valence for the educational experience. The study also presented data on the long-term outcomes of these educational experiences and suggested that increased project ownership resulted in long-term persistence in the sciences.

    A limitation of the work conducted so far on project ownership is that, methodologically, it has been inappropriate for larger-scale research. Although theoretical description, qualitative observation, interview, and content and linguistic analysis have all been important for developing an understanding of the concept of project ownership, they are limited by the time-consuming and personnel-intensive nature of data collection and analysis. However, the findings from this work are sufficient to formulate a hypothesis that enhanced project ownership is positively related to longer-term retention in science careers and to offer an operational definition of project ownership. The task now is to develop an instrument based on existing research that is suitable for larger-scale implementation. Broadly, the aim of this paper is to present and evaluate a survey-based instrument that measures project ownership, with the hope that this tool will facilitate larger-scale studies of project ownership. To facilitate this process, the current study addressed the following research questions:

    1. What components of project ownership can be measured in a valid and reliable way?

    2. Are there differences in degrees of project ownership for students who studied in traditional and research-based educational laboratory experiences?

    METHODS

    Participants

    The participants in this study were 114 undergraduate students enrolled in 24 different laboratory courses at 21 different institutions of higher education across the United States. The average number of participating students at each institution was 2.29 (range of 1–8 students). Student participation in the survey was requested through course instructors who were members of the Course-based Undergraduate Research Experiences Network (CUREnet; www.curenet.franklin.uga.edu). Students were not offered any incentive to participate in the survey. Of the 114 participants, 68 completed the full survey; analyses were conducted only on complete surveys. Demographic information for the students is provided in Table 1. The request to participate in the survey and the Web-based informed consent process were conducted in accordance with Indiana University of Pennsylvania's IRB approval (log no. 13–185). The request to complete the survey was sent in the last 2 wk of classes, and all responses were collected by the end of that semester.

    Table 1. Demographic characteristics of participants (n = 68)a

    Characteristic                                   n     %
    Gender
     Female                                          34    50
     Male                                            34    50
    Class
     First year                                       7    10
     Sophomore                                       20    29
     Junior                                          15    22
     Senior                                          24    35
     Prefer not to respond                            2     3
    Race/ethnic identification
     Asian                                            5     7
     African American                                11    16
     Hispanic or Latino                              11    16
     Native Hawaiian or Other Pacific Islander        1     1
     White                                           39    57
     Other                                            3     4
     Prefer not to respond                            4     6
    Institution type
     Research university                             22    32
     Master's-granting institution                    1     1
     Four-year school                                43    63
     Community college                                2     3

    aTotals may be greater than 68 because some respondents identified more than one race or ethnicity.

    Development of the Survey Instrument

    The POS was designed to have three main components: specification of the undergraduate research experience, assessment of degrees of project ownership, and emotive scales. The specification of the educational experience included several scales dealing with degrees of autonomy and a written description of the specific course. The project ownership component was developed as an extension of the qualitative and quantitative findings of the Hanauer et al. (2012) study presented above. In the development of the survey, the content categories of project ownership were rewritten as statement prompts in a five-point Likert scale (strongly agree–strongly disagree) format. For the emotional scales, a standardized set of self-reported, discrete emotion scales was used (Izard, 1993). The complete survey was piloted with a small group of 10 undergraduate students to verify that all questions and scales were comprehensible and elicited data relevant to the intent of the survey. Revisions were made in wording and organization. Finally, the survey was migrated to a Web-based interface for ease of data collection.

    Data Analysis for Survey Instrument Validation

    Data analysis was conducted in accordance with procedures utilized when validating a new survey instrument. The process of scale development has been described as involving four stages of development: 1) construct definition through theoretical and literature review; 2) generating measurement items; 3) refining scales through field-testing; and 4) finalizing the scale (Netemeyer et al., 2003). Both stages three and four involve field testing and, in a sense, include a repetition of the same set of procedures designed to explore the dimensionality, reliability, and validity of the new scale. Dimensionality in relation to survey instruments refers to the homogeneity of the items on the scale and evaluates the presence of underpinning constructs (Netemeyer et al., 2003). The core assumption is that specific items that vary in a systematic correlational manner underpin and together offer an operational definition of the construct being measured. An evaluation of the dimensionality of a scale is important, in that it facilitates an understanding of the ways in which the different items on the scale relate and of the number of underpinning factors present within the scale. Dimensionality is usually analyzed through the statistical procedure of factor analysis (Rietveld and Van Hout, 1993). Both stages three and four of scale development involve factor analysis; however, in stage three, an exploratory factor analysis is conducted to refine the scale and gain understanding of the relationship between the items in relation to underpinning factors. In stage four, a confirmatory factor analysis is conducted to confirm the structure of the scale.

    Both reliability and validity are well-established concepts for any usage of a research tool. In relation to scale development, reliability is established through the evaluation of the interrelatedness of the items on the scale and the stability of scores across administrations (Bachman, 2004). Cronbach's alpha is the most widely used reliability coefficient for establishing levels of acceptable internal consistency (Netemeyer et al., 2003). For validity, both theoretical and empirical measures need to be taken to ensure the construct validity of the scale. These can include close evaluation of the sources of the scale, its relationship to existing literature, qualitative analyses of participant responses, and empirical comparisons of the new scale with existing measures. The aim, as with any evaluation of validity, is to make sure that the new scale measures what it is intended to measure.
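    As a minimal illustration of the reliability computation described here (the paper does not specify the software used for its own analyses), Cronbach's alpha can be computed directly from a respondent-by-item matrix. The pandas-based function and the file name below are assumptions for illustration only.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondent-by-item matrix of Likert scores.

    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: one row per student, one column per item, values 1-5.
# responses = pd.read_csv("pos_item_responses.csv")
# print(round(cronbach_alpha(responses), 2))
```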

    For assessing the dimensionality, reliability, and validity of the POS, the following analytical procedures were performed and are reported in the next section. An exploratory factor analysis was conducted to assess dimensionality and establish relationships between items. Cronbach's alpha was calculated to establish internal consistency of the scales. A known-groups comparative study was conducted to assess the validity of the revised instrument.

    RESULTS

    The Psychometric Properties of the POS

    Statistical analyses were conducted to investigate the psychometric properties of the POS. The internal consistency of the whole instrument was evaluated using Cronbach's alpha with the result α = 0.91, which indicates high levels of consistency for the tool. However, it is important to note that 30 of the 48 items on the survey came from a standardized and established tool for evaluating emotional profiles (Izard, 1993), and were therefore already highly consistent. Accordingly, it was decided to evaluate the internal consistency of the 18 new items without the emotional scales. Cronbach's alpha for the new items was α = 0.74, which is an acceptable level of consistency. To further understand the internal consistency of these 18 scales, we computed item-total correlations by correlating each item with the sum of the items (total score) to identify particular items that might be reducing reliability (Guilford, 1953). In this analysis, four items were found not to contribute to the instrument's reliability and were thus identified for potential removal from the scale, depending on the outcomes of the factor analysis. The four items were: “I designed my own research project,” “I found my research experience to be frustrating,” “My research experience was boring,” and “I did not care about the findings of my research project.” Table 2 presents the means and SDs for all scales and item-total correlations for the 18 new scales.

    Table 2. Item descriptions, means, and SDs for all scales and item-total correlations for 18 POS scales

    Item                                                                                  Mean   SD     Item-total correlation
    1. I designed my own research project.                                                3.48   1.38    0.08
    2. I was responsible for the outcomes of my research.                                 2.02   1.11    0.45
    3. I was in control of my research project.                                           2.48   1.37    0.32
    4. The research question I worked on was important to me.                             2.57   1.05    0.60
    5. I had a personal reason for choosing the research project I worked on.             3.23   1.17    0.46
    6. In conducting my research project, I actively sought advice and assistance.        1.95   0.91    0.66
    7. My research project was exciting.                                                  2.28   0.99    0.57
    8. I faced challenges that I managed to overcome in completing my research project.   2.17   0.91    0.59
    9. The findings of my research project gave me a sense of personal achievement.       2.08   0.93    0.67
    10. My findings were important to the scientific community.                           2.57   0.98    0.56
    11. My research will help to solve a problem in the world.                            2.77   1.05    0.55
    12. I found my research experience to be frustrating.                                 3.15   1.07    0.30
    13. In conducting my research I faced unexpected difficulties.                        2.35   0.94    0.27
    14. In conducting my research I was told what results to expect.                      3.32   1.16   −0.55
    15. In conducting my research I was told what procedures to follow.                   2.35   1.16    0.24
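    The item-total correlations above were obtained by correlating each item with the summed total score; low or negative values (such as the −0.55 for item 14) flag items that weaken reliability. A small sketch of that computation follows; the DataFrame name is hypothetical, and the corrected variant (excluding an item from its own total) is an optional refinement not specified in the paper.

```python
import pandas as pd

def item_total_correlations(items: pd.DataFrame, corrected: bool = False) -> pd.Series:
    """Correlate each item with the scale total score.

    With corrected=True, the item itself is removed from the total before
    correlating, which avoids inflating the correlation for short scales.
    """
    total = items.sum(axis=1)
    result = {}
    for column in items.columns:
        reference = total - items[column] if corrected else total
        result[column] = items[column].corr(reference)
    return pd.Series(result)

# Hypothetical usage with the 18 new POS items in `pos_items`:
# print(item_total_correlations(pos_items).round(2))
```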

    Because the POS is a new instrument, an exploratory factor analysis was conducted to establish the internal structure of the tool and the dimensions of the project ownership construct (Thompson, 2004). As reported above, only 68 participants completed the full survey. Because this is usually considered too small a sample for a factor analysis, a Kaiser–Meyer–Olkin measure of sampling adequacy was calculated with the result of 0.653. This result, which is above the 0.5 benchmark, indicates an adequate sample, and a full exploratory factor analysis was therefore conducted. Descriptive statistics for each of the measures to be used in the factor analysis were calculated to make sure that the assumption of multivariate normality was not violated. A maximum likelihood factor analysis with oblimin with Kaiser normalization rotation was conducted to determine the internal structure of the survey and the dimensions of project ownership. For determining the number of factors to enter into the analysis, a scree plot of eigenvalues was graphed (see Figure 1). Based on this plot, a three-factor solution was specified.

    Figure 1. Scree plot of eigenvalues.
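    The statistical package used for the factor analysis is not named in the paper; the sketch below shows how the same sequence (KMO check, scree plot, maximum-likelihood extraction with oblimin rotation) could be reproduced with the open-source factor_analyzer and matplotlib libraries. The DataFrame and file names are assumptions.

```python
import matplotlib.pyplot as plt
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

# Hypothetical input: one row per respondent, one column per survey item.
items = pd.read_csv("pos_full_survey.csv")

# 1. Kaiser-Meyer-Olkin measure of sampling adequacy (the study reports 0.653 overall).
kmo_per_item, kmo_overall = calculate_kmo(items)
print(f"Overall KMO: {kmo_overall:.3f}")

# 2. Scree plot of eigenvalues to decide how many factors to extract.
unrotated = FactorAnalyzer(rotation=None)
unrotated.fit(items)
eigenvalues, _ = unrotated.get_eigenvalues()
plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.xlabel("Factor")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()

# 3. Three-factor maximum-likelihood extraction with oblique (oblimin) rotation.
fa = FactorAnalyzer(n_factors=3, rotation="oblimin", method="ml")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["Factor 1", "Factor 2", "Factor 3"])
print(loadings.round(2))                     # pattern loadings, as in Table 3
print(fa.get_factor_variance()[1].round(3))  # proportion of variance per factor
```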

    Three components were extracted, accounting for 51.56% of the total variance of the observed variables. Table 3 presents the factor pattern matrix and the regression coefficients of each variable on each of the factors. The first factor, which accounted for 28% of the total variance, was constructed of emotional items with a negative orientation. Accordingly, this factor was labeled "negative emotion." The second factor, which accounted for 18.2% of the total variance, was constructed from the items dealing with aspects of project ownership and positive emotion categories. Accordingly, this factor was labeled "project ownership." Three items in this factor had low pattern loadings, suggesting they should be considered for removal from the instrument: "To what extent does the word alert describe your experience of the laboratory course?" "To what extent does the word concentrating describe your experience of the laboratory course?" and "In conducting my research I faced unexpected difficulties." The third factor, which accounted for 5.3% of the total variance, was constructed from items that dealt with student control over the research and its outcomes. Accordingly, this factor was labeled "degrees of agency." Interestingly, the emotional category item of being worried factored with this group of items. It should be noted, though, that the pattern loadings for this factor were low. Broadly, the factor analysis parallels the original design of the POS and differentiates between the sections that deal directly with project ownership, emotion, and course design.

    Table 3. Pattern matrix and regression coefficients for the POS

    Factora and pattern loadings

    Factor 1 (emotion)
     To what extent does the word sad describe your experience of the laboratory course?  0.90
     To what extent does the phrase feeling of distaste describe your experience of the laboratory course?  0.89
     To what extent does the word disgust describe your experience of the laboratory course?  0.89
     To what extent does the phrase feeling of revulsion describe your experience of the laboratory course?  0.87
     To what extent does the word disdainful describe your experience of the laboratory course?  0.86
     To what extent does the word mad describe your experience of the laboratory course?  0.85
     To what extent does the word angry describe your experience of the laboratory course?  0.84
     To what extent does the word downhearted describe your experience of the laboratory course?  0.81
     To what extent does the word enraged describe your experience of the laboratory course?  0.80
     To what extent does the word blameworthy describe your experience of the laboratory course?  0.78
     To what extent does the word guilty describe your experience of the laboratory course?  0.76
     To what extent does the word bashful describe your experience of the laboratory course?  0.75
     To what extent does the word contemptuous describe your experience of the laboratory course?  0.72
     To what extent does the word scared describe your experience of the laboratory course?  0.71
     To what extent does the word scornful describe your experience of the laboratory course?  0.71
     To what extent does the word discouraged describe your experience of the laboratory course?  0.68
     To what extent does the word fearful describe your experience of the laboratory course?  0.64
     To what extent does the word sheepish describe your experience of the laboratory course?  0.61
     To what extent does the word shy describe your experience of the laboratory course?  0.59
     To what extent does the word repentant describe your experience of the laboratory course?  0.58
     I found my research experience to be frustrating.  0.53
     To what extent does the word tentative describe your experience of the laboratory course?  0.41

    Factor 2 (project ownership)
     My research project was interesting.  0.88
     My research project was exciting.  0.83
     The findings of my research project gave me a sense of personal achievement.  0.78
     To what extent does the word delighted describe your experience of the laboratory course?  0.76
     The research question I worked on was important to me.  0.75
     To what extent does the word happy describe your experience of the laboratory course?  0.74
     To what extent does the word joyful describe your experience of the laboratory course?  0.72
     In conducting my research project, I actively sought advice and assistance.  0.69
     To what extent does the word astonished describe your experience of the laboratory course?  0.68
     To what extent does the word surprised describe your experience of the laboratory course?  0.65
     To what extent does the word amazed describe your experience of the laboratory course?  0.65
     My research experience was boring.  −0.60
     I faced challenges that I managed to overcome in completing my research project.  0.57
     My findings were important to the scientific community.  0.55
     My research will help to solve a problem in the world.  0.52
     I did not care about the findings of my research project.  −0.52
     I was responsible for the outcomes of my research.  0.49
     I had a personal reason for choosing the research project I worked on.  0.41
     To what extent does the word alert describe your experience of the laboratory course?  0.39
     To what extent does the word concentrating describe your experience of the laboratory course?  0.30
     In conducting my research I faced unexpected difficulties.  0.18

    Factor 3 (course type)
     To what extent does the word worried describe your experience of the laboratory course?  0.57
     I was in control of my research project.  −0.46
     I designed my own research project.  −0.37
     In conducting my research I was told what procedures to follow.  0.36
     In conducting my research I was told what results to expect.  0.30

    aExploratory factor analysis with maximum likelihood extraction and oblimin with Kaiser normalization rotation.

    Based on the pattern loadings, item-total correlations, and reliability analysis, a revised version of the POS could be proposed. For the scales of project ownership, factor 2 was modified by removing items with low reliability ("My research experience was boring" and "I did not care about the findings of my research project"). Cronbach's alpha was calculated for the remaining scales on this factor with the result of α = 0.92, which indicates a very high level of reliability. Cronbach's alpha was also calculated for the variables of factor 3, dealing with degrees of agency. The resultant alpha value (α = 0.27) was very low, and given this, these scales were deemed unreliable and removed from the survey. Table 4 presents the final version of the survey, which consists of 16 items dealing with project ownership and positive emotion.

    Table 4. Final version of the POS

    Response scale: Strongly agree / Agree / Neither agree nor disagree / Disagree / Strongly disagree
    My research will help to solve a problem in the world.
    My findings were important to the scientific community.
    I faced challenges that I managed to overcome in completing my research project.
    I was responsible for the outcomes of my research.
    The findings of my research project gave me a sense of personal achievement.
    I had a personal reason for choosing the research project I worked on.
    The research question I worked on was important to me.
    In conducting my research project, I actively sought advice and assistance.
    My research project was interesting.
    My research project was exciting.
    Response scale: Very strongly / Considerably / Moderate / Slightly / Very slightly
    To what extent does the word delighted describe your experience of the laboratory course?
    To what extent does the word happy describe your experience of the laboratory course?
    To what extent does the word joyful describe your experience of the laboratory course?
    To what extent does the word astonished describe your experience of the laboratory course?
    To what extent does the word surprised describe your experience of the laboratory course?
    To what extent does the word amazed describe your experience of the laboratory course?
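    The study analyzes the POS items individually rather than as a composite score. Purely as an illustration of how the two response formats in Table 4 map onto five-point numeric codes (lower values indicating stronger agreement or stronger emotion), a minimal and entirely assumed scoring sketch is shown below; the function and column names are hypothetical.

```python
import pandas as pd

# Assumed numeric coding consistent with the paper's reporting
# (1 = strongly agree ... 5 = strongly disagree; 1 = very strongly ... 5 = very slightly).
LIKERT = {"Strongly agree": 1, "Agree": 2, "Neither agree nor disagree": 3,
          "Disagree": 4, "Strongly disagree": 5}
EMOTION = {"Very strongly": 1, "Considerably": 2, "Moderate": 3,
           "Slightly": 4, "Very slightly": 5}

def code_pos_responses(raw: pd.DataFrame, ownership_cols: list,
                       emotion_cols: list) -> pd.DataFrame:
    """Convert raw text responses into numeric codes for the 16 final POS items."""
    coded_ownership = raw[ownership_cols].replace(LIKERT)
    coded_emotion = raw[emotion_cols].replace(EMOTION)
    return pd.concat([coded_ownership, coded_emotion], axis=1)

# Hypothetical usage: a per-respondent mean across the 16 coded items.
# coded = code_pos_responses(survey_df, OWNERSHIP_ITEMS, EMOTION_ITEMS)
# pos_mean = coded.mean(axis=1)
```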

    Differences between Educational Experiences on the POS

    Previous research on the concept of project ownership had demonstrated differences in the expression of project ownership between types of scientific inquiry educational experience (Hanauer et al., 2012). As reported above, the scales of the current survey were developed on the basis of this research, and as such, the survey has some degree of construct validity. However, to further assess the validity of the tool, a known-groups validity approach was taken. If the original study utilized project ownership categories to differentiate between educational experiences, then a basic measure of the validity of the POS is to assess whether the new tool also systematically differentiates between educational experiences. In completing the survey, participants were asked to specify whether the laboratory course they were responding about was a traditional laboratory course or a research-based laboratory experience, and they were also asked to describe the course in their own words. Students' answers specifying course type were compared with their written descriptions of the course they had taken in order to establish the presence of clear groupings. This analysis established two types of educational research experience that could be compared: traditional laboratory courses and research-based laboratory courses. The traditional laboratory course was characteristically described by students as being a "required introductory course" and involving "various small experiments; not real research." The research-based course was characteristically described by students in terms of the scientific question they explored, such as "Our current work focuses on the molecular basis of the unique colonizing ability of the D-genotype strains. Genomic sequence of L51-96 will reveal unique features of the special relationship between these strains and wheat."

    Thirty-two students specified they took a traditional laboratory course (12 different specified courses), and 33 specified they took a research-based laboratory course (12 different specified courses). These specifications were corroborated through a content analysis of their written descriptions. Based on previous research (Hanauer et al., 2012), a comparison between these two groups on the POS scales should provide some insight into the ability of this tool to differentiate between types of educational experience. The basic hypothesis was that undergraduate research-based courses would elicit systematically more positive ratings on the individual project ownership rating scales than the traditional undergraduate laboratory course.

    Table 5 presents the means and SDs for each of the items in the revised POS. As can be seen from the descriptive statistics, for every item, the research-based group has more positive ratings (indicated by lower means on five-point Likert scales, with 1 = strongly agree and 5 = strongly disagree) than the traditional laboratory group. To further evaluate the hypothesis of a difference between educational experiences on the POS, we conducted one-way analyses of variance (ANOVAs) with each item as the dependent variable and group (traditional; research-based) as the independent variable. As seen in Table 5, nine of the 16 items indicated significant differences between the traditional and research-based lab courses at the 0.05 or 0.01 levels. Cohen's d as a measure of effect size for scales with significant differences ranged from 0.51 to 0.90, specifying medium-to-large effect sizes (Cohen, 1992). Two additional items showed a trend toward difference, with p levels below 0.10 and Cohen's d in the medium range (0.45–0.55). The remaining five items were not found to be significantly different.
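    As a sketch of the known-groups comparison reported in Table 5 (the software actually used is not named in the paper), a one-way ANOVA and a pooled-standard-deviation Cohen's d for a single item could be computed as follows; the array and variable names are hypothetical.

```python
import numpy as np
from scipy import stats

def compare_groups(traditional: np.ndarray, research: np.ndarray):
    """One-way ANOVA across two groups plus Cohen's d using the pooled SD."""
    f_stat, p_value = stats.f_oneway(traditional, research)
    n1, n2 = len(traditional), len(research)
    pooled_sd = np.sqrt(((n1 - 1) * traditional.var(ddof=1) +
                         (n2 - 1) * research.var(ddof=1)) / (n1 + n2 - 2))
    d = abs(traditional.mean() - research.mean()) / pooled_sd
    return f_stat, p_value, d

# Hypothetical usage for one POS item, with group membership stored in `group`:
# trad = ratings[group == "traditional"]
# res = ratings[group == "research"]
# print(compare_groups(trad, res))
```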

    Table 5. Means, SDs, and one-way ANOVA comparison for traditional laboratory and research laboratory groups

    POS scale  |  Traditional lab M (SD)  |  Research lab M (SD)  |  F(1,63)  |  d
    My research project was interesting.  2.41 (0.87)  1.94 (0.83)  4.89**  0.55
    My research project was exciting.  2.47 (0.84)  2.06 (1.05)  2.97*  0.43
    The findings of my research project gave me a sense of personal achievement.  2.47 (1.02)  1.70 (0.68)  12.98***  0.90
    To what extent does the word delighted describe your experience of the laboratory course?  2.94 (1.10)  2.27 (1.09)  5.92***  0.61
    The research question I worked on was important to me.  2.72 (1.02)  2.33 (1.10)  2.12  0.36
    To what extent does the word happy describe your experience of the laboratory course?  2.75 (1.14)  2.03 (0.85)  8.41***  0.72
    To what extent does the word joyful describe your experience of the laboratory course?  3.06 (1.16)  2.33 (1.11)  6.70***  0.64
    In conducting my research project, I actively sought advice and assistance.  2.28 (1.02)  1.61 (0.70)  9.65**  0.77
    To what extent does the word astonished describe your experience of the laboratory course?  3.63 (1.26)  3.58 (1.09)  0.28  0.04
    To what extent does the word surprised describe your experience of the laboratory course?  3.28 (1.22)  3.15 (1.00)  0.22  0.11
    To what extent does the word amazed describe your experience of the laboratory course?  2.94 (1.24)  2.64 (1.02)  1.14  0.26
    I faced challenges that I managed to overcome in completing my research project.  2.41 (0.95)  1.94 (0.86)  4.32**  0.51
    My findings were important to the scientific community.  2.88 (1.07)  2.21 (0.74)  8.48***  0.74
    My research will help to solve a problem in the world.  3.03 (1.10)  2.42 (0.91)  5.98***  0.60
    I was responsible for the outcomes of my research.  2.25 (1.27)  1.76 (0.87)  3.35*  0.45
    I had a personal reason for choosing the research project I worked on.  3.34 (1.18)  3.18 (1.23)  0.29  0.13

    Five-point Likert scale: 1 = strongly agree to 5 = strongly disagree.

    *p < 0.10.

    **p < 0.05.

    ***p < 0.01.

    Five project ownership items ("The findings of my research project gave me a sense of personal achievement," "In conducting my research project, I actively sought advice and assistance," "I faced challenges that I managed to overcome," "My findings were important to the scientific community," and "My research will help to solve a problem in the world") and the positive emotional scales of delight, happiness, and joy systematically differentiated between the groups, with the research-based laboratory group having more positive ratings on each of these scales. Interestingly, three of the five nonsignificant items came from the same emotional domain (astonishment, surprise, and amazement), suggesting that, while this domain might measure some aspects of project ownership, it does not differentiate between the groups and might be considered for removal in a future iteration of the POS. Overall, based on the findings of the tests of difference, the POS would seem to operate in the manner expected of a tool of this type.

    CONCLUSIONS

    The aim of this study was to present and evaluate a new instrument for measuring project ownership that is appropriate for the collection of larger-scale data. The revised 16-item POS was found to be highly reliable (α = 0.92). Construct validity was addressed by developing the items based on a 2-yr study that defined and operationalized project ownership in the context of undergraduate laboratory learning experiences. To bolster the evaluation of validity, we utilized a known-groups validity approach based on the prediction of group differences between types of undergraduate research experience. The findings revealed that the tool significantly differentiated between the groups on a majority of the items of the POS. Accordingly, it seems that the psychometric properties of the instrument allow it to be used for larger-scale data collection on project ownership.

    Previous research had specified five categories of student statement that signified the presence of project ownership (Hanauer et al., 2012). These consisted of connections between personal history and scientific inquiry, agency and mentorship, expressions of excitement, overcoming challenging moments, and expressions of personal scientific achievement. The findings presented here on the POS items that significantly differentiate between traditional and research-based laboratory courses replicate some but not all of the original findings. Interestingly, the items dealing with the category of personal connections to scientific inquiry (“I had a personal reason for choosing the research project I worked on” and “The research question I worked on was important to me”) were not significantly different between the groups. It is possible that this reflects the realities of most laboratory courses and that, even in research-based courses, research questions and projects more often are assigned than student defined. In the previous study, the special nature of the Strobel REAL project may indeed be very different from the research-based courses in the current study, and this may account for this difference. All the other categories from the original study were supported by significant differences in the items of the POS. The usage of discrete emotional scales did provide clear evidence of the emotional valence of different undergraduate research experiences. The items dealing with interest, excitement, delight, happiness, and joy were all significantly more positive for the research-based course.

    In the process of tool validation, several items of the original survey were removed. Importantly, items under the heading of degree of agency, involving ratings of the degree of control and design of the research project, were not found to have psychometric properties that would allow them to be used in the current version of the POS. However, from a theoretical perspective, these types of items would seem to have some significance, and as such, it is possible that new versions of these items need to be formulated and psychometrically validated in a new version of the POS.

    As with any instrument development process, the current study has some limitations. The main limitation concerns the size and type of sample used. The sample was small and not random. Students came from 21 different institutions across the United States, but each student independently decided whether to complete the survey. In accordance with IRB criteria, participation was voluntary, and this may have produced a self-selected group of participants (e.g., those who have an interest in ownership). However, the adequacy of the sample size was statistically evaluated and found to be sufficient to conduct a factor analysis. As with the development of any new instrument, we suggest that future usage of the revised POS involve an additional iteration of the analysis of the tool's psychometric properties.

    Underpinning the research on project ownership is the idea that it is important to understand, facilitate, and measure both the psychosocial and cognitive aspects of undergraduate research experiences. Project ownership is one of several potential measures that could be used to further explore the elements of undergraduate research experiences that influence a student's decision to stay in the sciences and become an active researcher (Adedokun et al., 2013; Eagan et al., 2013). Project ownership may be particularly important, in that it is tied directly to the research project and educational experience of the student. The evidence presented here demonstrates that the POS is a useful tool for measuring project ownership. At this stage, it is important for the instrument to be used by a broad group of researchers and evaluators of undergraduate research and laboratory experiences to yield further insights into its validity and reliability as a measure of project ownership. The POS should also be used to characterize a broad range of research experiences in order to elucidate the relationship between students' sense of ownership and the gains they make from participating in research. Results of this work will be useful for identifying design features of undergraduate research experiences that enhance project ownership. Ultimately, the aim is to understand the factors that enrich student research experiences and help more students stay in the sciences for the very best reason: the joy of being a researcher.

    ACKNOWLEDGMENTS

    D.I.H. was funded through a subaward (IUP RI log no. 0910-028) from a Howard Hughes Medical Institute professorship award to Graham Hatfull. CUREnet is supported by the National Science Foundation under grant no. 1554681.

    REFERENCES

  • Adedokun OA, Bessenbacher AB, Parker LC, Kirkham LL, Burgess WD (2013). Research skills and STEM undergraduate research students' aspirations for research careers: mediating effects of research self-efficacy. J Res Sci Teach 50, 940-951.
  • American Association for the Advancement of Science (2011). Vision and Change in Undergraduate Biology Education: A Call to Action. Washington, DC.
  • Bachman L (2004). Statistical Analyses for Language Assessment. Cambridge, UK: Cambridge University Press.
  • Bauer KW, Bennett JS (2003). Alumni perceptions used to assess undergraduate research experience. J High Educ 74, 210-230.
  • Caruso SM, Sandoz J, Kelsey J (2009). Non-STEM undergraduates become enthusiastic phage-hunters. CBE Life Sci Educ 8, 278-282.
  • Chung H, Rodes P, Knapczyk D (1998). Using web conferencing to promote ownership in distance education coursework. Paper presented at the Third Annual World Conference of the WWW, Internet, and Intranet, held 7-12 November 1998, in Orlando, FL.
  • Cohen J (1992). A power primer. Psychol Bull 112, 155-159.
  • Desai KV, Gatson SN, Stiles TW, Stewart RH, Laine GA, Quick CM (2008). Integrating research and education at research-extensive universities with research-intensive communities. Adv Physiol Educ 32, 136-141.
  • Dolan EL, Lally DJ, Brooks E, Tax FE (2008). PREPping students for authentic science. The Science Teacher 75, 38-43.
  • Downie M, Moore P (1998). Closing the gap: schools forge a bridge to community—in California: comprehensive services and realistic planning. Perspect Educ Deafness 16, 15-27.
  • Drew JC, Triplett EW (2008). Whole genome sequencing in the undergraduate classroom: outcomes and lessons from a pilot course. J Microbiol Biol Educ 9, 3-11.
  • Eagan MK, Hurtado S, Chang MJ, Garcia GA (2013). Making a difference in science education: the impact of undergraduate research programs. Am Educ Res J 50, 463-713.
  • Goodner BW, Wheeler CA, Hall PJ, Slater SC (2003). Massively parallel undergraduates for bacterial genome finishing. Am Soc Microbiol News 69, 584-585.
  • Guilford JP (1953). The correlation of an item with a composite of the remaining items in a test. Educ Psychol Measurement 13, 87-93.
  • Hanauer DI, Frederick J, Fotinakes B, Strobel SA (2012). Linguistic analysis of project ownership for undergraduate research experiences. CBE Life Sci Educ 11, 378-385.
  • Hatfull GF (2010). Bacteriophage research: gateway to learning science. Microbe 4, 243-250.
  • Hatfull GF, et al. (2006). Exploring the mycobacteriophage metaproteome: phage genomics as an educational platform. PLoS Genet 2, e92.
  • Hathaway RS, Nagda BA, Gregerman SR (2002). The relationship of undergraduate research participation to graduate and professional education pursuit: an empirical study. J Coll Stud Dev 43, 614-631.
  • Hunter A-B, Laursen SL, Seymour E (2007). Becoming a scientist: the role of undergraduate research in students' cognitive, personal, and professional development. Sci Educ 91, 36-74.
  • Hunter A-B, Weston TJ, Laursen SL, Thiry H (2009). URSSA: evaluating student gains from undergraduate research in the sciences. Counc Undergrad Res Q 29, 15-19.
  • Izard CE (1993). The Differential Emotions Scale: DES IV: A Method of Measuring the Meaning of Subjective Experience of Discrete Emotions. Newark: University of Delaware Press.
  • Kardash CM (2000). Evaluation of an undergraduate research experience: perceptions of undergraduate interns and their faculty mentors. J Educ Psychol 92, 191-201.
  • Kennedy M (1994). The Ownership Project: an experiment in student equity. Soc Stud Rev 33, 24-31.
  • Kremer JF, Bringle RG (1990). The effects of an intensive research experience on the careers of talented undergraduates. J Res Dev Educ 24, 1-5.
  • Laursen S, Hunter A-B, Seymour E, Thiry H, Melton G (2010). Undergraduate Research in the Sciences: Engaging Students in Real Science. San Francisco, CA: Jossey-Bass.
  • Lopatto D (2003). The essential features of undergraduate research. Counc Undergrad Res Q 24, 139-142.
  • Lopatto D (2004). Survey of Undergraduate Research Experiences (SURE): first findings. Cell Biol Educ 3, 270-277.
  • Lopatto D (2007). Undergraduate research experiences support science career decisions and active learning. CBE Life Sci Educ 6, 297-306.
  • Lopatto D (2010). Science in Solution: The Impact of Undergraduate Research on Student Learning. Tucson, AZ: Research Corporation for Science Advancement. www.rescorp.org/gdresources/downloads/Science_in_Solution_Lopatto.pdf.
  • Lopatto D, et al. (2008). Undergraduate research: Genomics Education Partnership. Science 322, 684-685.
  • Mason CY, McGahee-Kovac M, Johnson L (2004). How to help students lead their IEP meetings. Teach Except Child 36, 18-25.
  • Milner-Bolotin M (2001). The effects of topic choice in project based instruction on undergraduate physical science students' interest, ownership, and motivation. Unpublished doctoral dissertation, University of Texas at Austin.
  • Nail MH (2007). Reaching out to families with student-created newsletters. Kappa Delta Pi Record 44, 39.
  • Netemeyer RG, Bearden WO, Sharma S (2003). Scaling Procedures: Issues and Applications. Thousand Oaks, CA: Sage.
  • Rauckhorst WH, Czaja JA, Baxter Magolda M (2001). Measuring the impact of the undergraduate research experience on student intellectual development. Paper presented at the Project Kaleidoscope Summer Institute, held 18-21 July, in Snowbird, UT.
  • Rietveld T, Van Hout R (1993). Statistical Techniques for the Study of Language and Language Behaviour. Berlin: Mouton de Gruyter.
  • Russell SH, Hancock MP, McCullough J (2007). The pipeline: benefits of undergraduate research experiences. Science 316, 548-549.
  • Sadler TD, Burgin S, McKinney L, Ponjuan L (2010). Learning science through research apprenticeships: a critical review of the literature. J Res Sci Teach 47, 235-256.
  • Seymour EL, Hunter AB, Laursen S, De Antoni T (2004). Establishing the benefits of research experiences for undergraduates: first findings from a three-year study. Sci Educ 88, 493-594.
  • Shaffer CD, et al. (2010). The Genomics Education Partnership: successful integration of research into laboratory classes at a diverse group of undergraduate institutions. CBE Life Sci Educ 9, 55-69.
  • Thiry H, Laursen SL (2011). The role of student-advisor interactions in apprenticing undergraduate researchers into a scientific community of practice. J Sci Educ Technol 20, 771-784.
  • Thompson B (2004). Exploratory and Confirmatory Factor Analysis: Understanding Concepts and Applications. Washington, DC: American Psychological Association.
  • Wei A, Woodin T (2011). Undergraduate research experiences in biology: alternatives to the apprenticeship model. CBE Life Sci Educ 10, 123-130.
  • Wiley J (2009). Student ownership of learning: an analysis. Unpublished master's thesis, University of Hawaii, Manoa.
  • Wood WB (2003). Inquiry-based undergraduate teaching in the life sciences at large research universities: a perspective on the Boyer Commission report. Cell Biol Educ 2, 112-116.