This is a piece of analytical research extending a descriptive approach to examine the influences on students when deciding whether to study science subjects post-16. The main data collection used quantitative methods, utilising an online descriptive survey to gather the data. A qualitative approach, in the form of a small focus group, was used to generate the questions for the survey, and a paper-based version was trialled with a small group of students to check for any issues with the questions concerning readability and reliability. This mixed-method pragmatic approach gives a fuller understanding of the research questions and allows the most appropriate research methods to be employed for the type of questions I am trying to answer, rather than picking one approach and polarizing the two paradigms (Cohen et al.).
I used a positivistic approach as I wanted to identify and measure the major influencing factors in student subject choice (University of Bradford, Introduction to Research and Research Methods, http://www.brad.ac.uk/acad/management/external/els/pdf/introductiontoresearch.pdf) and rank their relative importance.
A focus group was used to generate possible survey questions, utilising the background research as a starting point for identifying the key influences on student choice. Focus groups allow a greater range of discussion to take place as interactions between the participants can occur, all views can be heard and a consensus can be established, along with the potential to triangulate ideas (Bell, 2007). They can also generate a large amount of data in a relatively short period of time, which is valuable given that time was a major constraint of this research, although the data would not be as in-depth as that from one-to-one interviews (Cohen et al.). As this research looks at what students' influences are rather than the reasons for them, this limitation of focus groups is not a major concern.
A survey was used for the main data collection; a mixture of closed questions, Likert scales and open questions was employed to best answer the research questions. Questionnaires are widely used and allow quantitative data to be collected without the presence of the researcher, and the data can be analysed easily (Wilson and McLean, 1994). As my research questions focus on identifying what students say their influences are, rather than on explaining why they feel that way, a survey was employed as a reflection of the positivistic nature of the research questions (University of Bradford). Two of the pieces of background research used surveys for their data collection (Harvard, 1996 & James, 2007), as did many of the research papers in the Wellcome Trust's systematic review, which adds further justification for my choice of a survey. The operationalization of the questions was very important, especially as the survey was to be completed online, so I could not help the participants when they were answering the questions. It was therefore essential that the survey was trialled on paper first to check for errors and for the readability and answerability of the questions before publishing it online.
The closed questions at the beginning of the survey were designed to be non-threatening and to gain basic information about the participants, e.g. sex, type of institution and subjects studied. This allowed some of the subsidiary research questions to be answered, as well as placing the participants in a good mindset and building their confidence with questions they could easily answer before moving on to the more personalized questions for the main research question (Cohen et al.).
The main questions, examining what students' influences were, used a 7-point Likert scale to help reduce the number of extreme answers by providing a range of possible responses and a middle value allowing the option of sitting on the fence. The 7-point scale also improves reliability by ensuring respondents can best discriminate between the values of the scale (Schwarz et al., 1991). As there may have been a named influence that a student had never considered before taking the survey, an 8th option of 'Don't Know' was added so students did not feel forced to give an answer about something they honestly had not thought about; this option was introduced as a result of feedback from the paper pilot survey.
The 15 possible influences on student choice were identified through the background research and focus group results. In order to reduce bias from the primacy effect, which would arise from placing the expected main influencing factors (interest, university course requirement and career prospects) at the beginning of the list, each answer was assigned a random number using the spreadsheet package Excel and then ranked in ascending numerical order to determine the order in which the influences appeared in the survey. This order was repeated for all three questions so as not to confuse students who answered all three. The final two open questions allowed participants to elaborate further on their answers if they wished. It is possible that a student had an influence that was not in the list of 15, so an opportunity to add it to the research was important. A participant may also have chosen 'very important' on the Likert scale for more than one influence, either because they had multiple influences they considered very important or because they answer survey questions in an extreme way, so a question asking what the major influence was would force the student to decide. The outline of the survey was designed to follow the recommendations of Cohen et al. (Chapter 20).
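The randomisation described above (assigning each influence a random number in Excel and sorting by it) can be illustrated with a short Python sketch. The influence names here are placeholders standing in for the 15 survey items, not the actual survey wording, and fixing the seed is one way to reproduce the same order across all three questions:

```python
import random

# Hypothetical subset of the influences (placeholder names,
# not the actual survey wording).
influences = [
    "Interest in the subject",
    "University course requirement",
    "Career prospects",
    "Family",
    "Friends",
    "The Media",
]

# Fix the seed so the same order can be reused for all three questions.
random.seed(42)

# Mimic the Excel approach: pair each influence with a random number,
# then sort ascending by that number to get the presentation order.
keyed = [(random.random(), item) for item in influences]
ordered = [item for _, item in sorted(keyed)]

print(ordered)
```

The same effect could be achieved with `random.shuffle`, but pairing with random keys and sorting mirrors the Excel procedure described above.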
The survey was administered online to enable ease of data collection and analysis. As the participants were 16-19 year old students, they would be familiar with online surveys and be web-savvy, so the problem of participants being inexperienced with the internet would be minimal. An online survey can be completed in the participants' own time, which allows for greater authenticity as they are volunteering to take the survey; in addition, lesson time does not have to be given up to carry out the research, which could make participants resent taking the survey if they felt it was taking time out of their education. Participants may also not be present in class when a paper survey is administered, so an online survey can reach a greater number of students. Even though the response rate of online surveys is not as high as that of paper-based surveys administered in person, this survey was relatively short, hopefully reducing the dropout rate, and it allowed the survey to potentially reach a greater number of participants.
The survey was administered through the website Bristol Online Surveys (BOS). This site was chosen because, even though there are easier ways to generate an online survey, BOS allows a greater range of question types to be asked and places no limitation on the number of questions, a restriction of free survey sites such as surveymonkey.com. As BOS is administered by the University of Bristol, there were no charges involved in the construction and administration of the survey, and having the University logo associated with the survey added to its legitimacy, improving the participant response rate as well as the honesty of answers, and so increasing the reliability of the data collected. BOS also has a simple, straightforward look when completing the survey, reducing some of the technical and software problems associated with online surveys (Cohen et al.).
The survey was emailed to all the students I teach, and also to all the other science students in my workplace, allowing students who do not study A Level Biology to be included in the sample. The survey could then also be easily passed on to other teachers in the UK to increase the number of respondents. The use of the social networking website Twitter (www.twitter.com) has increased the potential for teachers to talk to one another, discuss and share lesson ideas and take part in Continuing Professional Development (Shaha, 2012). In the field of science education, two UK hashtag discussions (using the symbol # to identify a tweet as part of a particular discussion) take place regularly: #ASEChat, which takes place every Monday evening and is run by the Association for Science Education, and #SciTeachJC, which takes place on alternate Tuesdays, is run by teachers interested in discussing science education research, and was started in response to #twitjc, a journal club for medics. I decided it would be interesting to make other teachers aware that I was carrying out this survey by using the hashtags to send out the link. Opening up the survey to a wide and diverse range of participants provides a rich data set of possible responses, improving the generalizability of the results. Though it is a self-selecting sample, the teachers who expressed an interest in helping could have been from any type of institution in any area of the UK, allowing for a wide range of potential students. The time frame for the survey meant that a random sample of UK schools was not possible, but this snowball-type approach allows for a quick and effective way of collecting valid data in order to determine a general trend.
The focus group consisted of five AS students (three female, two male) from a Biology class; this mix was to ensure that there were enough participants to create a meaningful discussion while ensuring that no one participant took over the discussion. The participants were told about the nature of the research and given the opportunity to withdraw. Simple open questions were used to elicit the greatest level of response from the participants; I wanted to hear what they thought rather than bias them with what I had learnt from the background research. The session was short (five minutes) due to the constraint of taking place during a break time in the college timetable. The participants were asked to be open and honest, and were told that their responses would not be attributed to them by name.
When asked why they wanted to study a science subject at A Level, the initial responses were either that they were interested in the subject or that they needed it for a university course or career. The courses and careers mentioned were all medically based, but this was probably due to all the students studying Biology. Their interest in science was hard for some students to pinpoint: "it's just something I've always been interested in." When asked to elaborate, one student mentioned their experiences at school, describing how one of their "…teachers enthusiasm, rubbed off on me."
A desire to understand how the world works was also agreed upon by all the students when one of them mentioned it. One student explained how they did not think about studying science until after they had their GCSE results; this was a new concept that I had not read about in the previous research. Perhaps students decide which A Levels they want to take after their GCSE results, rather than having always wanted to study the subject. I asked the students when they decided they wanted to study science at A Level. All of them had decided at or after GCSE; when asked about KS3 Science, their opinion of it was mixed, but for most it was not a hugely positive experience of science, and it was not until they were studying it at GCSE that they felt they were "studying science properly."
When asked about the influence of family and friends, the students did not feel it was a major one, except for one student whose parents wanted them to work in a particular field and so to study a particular university course. Finally, I asked about the influence of the media, citing Professor Brian Cox and The Big Bang Theory. This was not something the students had considered, and they did not think of it as an influence; however, they did watch both of the TV shows mentioned, said that they found them interesting, and felt that they gave a positive view of science in the media. One student also suggested that perhaps the shows had not been around long enough to influence current A Level students yet, but that in a few more years they might be more of an influence on student choice.
To ensure that no one voice took over the focus group, I made sure that all students had an opportunity to speak by using prompting questions to bring the other students into the discussion, e.g. "So what do you think about that…"
The online survey was written using BOS, ensuring that it was not too long and that the questions were worded unambiguously. In response to the focus group, a question asking students at what point they decided to study science at A Level was added, and the influence 'The Media' was given some extra explanation to ensure that students considered all forms of media when answering the question. The pilot survey did not raise any issues with the questions except for the addition of 'Don't Know' on questions 7, 8 and 9, as previously discussed.
The link to the survey was emailed to 32 students that I teach and an additional 8 whom I do not teach at the College where I work. Nine teachers and a lecturer in Initial Teacher Training responded to my request for help on Twitter, as did a colleague studying the same MSc course as myself. Four of the respondents told me they were not able to send the link to their students due to time constraints or simply forgetting; four of those who did send the survey to their students gave me the number of students they sent it to, increasing the potential sample size to 143. In order to ensure that the survey was not placed on the internet for anyone to click on, which could impair the results, the link was placed in a password-protected section of my blog, and the password was communicated via Twitter's Direct Message facility or by email. The survey was available for two weeks and I expected to gain 50 respondents; in total, 63 students completed the questionnaire and an extra 12 students started the survey but did not complete it, giving a response rate of 52% (44% completed).
To ensure that all students carried out the survey with informed consent, the first page of the survey gave a brief explanation of the purpose of the study and identified me as a practising teacher studying at the University of Bristol; this way the research was identified as bona fide and not a marketing ploy. I also told my own students personally that I would be sending them a link to the survey, explaining why I was doing it. I cleared the use of the students as participants with my line manager to check that there were no conflicts of interest with the College's policies. Participants were told that their responses would be anonymous and confidential, and the right to withdraw was explained. A data protection statement was also written to reassure participants about the security of their personal data as well as web-based security issues such as cookies. Any questions that required students to reveal more personal information were optional. The anonymity also meant that if any answers to the open questions revealed anything controversial, they could not be attributed to a particular student. All the teachers who sent the link to their students were responding to a general request for help; I did not ask any individual personally, ensuring that no one was coerced into taking part.