Questioning

The Questioning Learning Hub page and resources are managed by Hub Leader, Andi Clarke.

Hub members:

  • ACL (Hub Leader, Maths)
  • REM (Maths)
  • SCR (English)
  • LRI (English)
  • PGO (Science)
  • JBO (Humanities)
  • SBR (Humanities)
  • JCO (Performing Arts)
  • MBL (Humanities)

Objectives

  • To develop deep and probing questioning for teaching/memory that gets students thinking hard, supporting a culture of ‘growth mindset’
  • To develop questioning for assessment that informs teaching, e.g. hinge questions

Initial research

I decided initially to focus on some of the common issues that people face with questioning. I identified some of these to be:

  • Calling on high achievers disproportionately
  • The same students always volunteering answers (hands up)
  • Depth of questioning (too simple / complex?)
  • Not giving enough wait time
  • Responding to a student answer and just moving on (IRE)
  • Dealing with the response “I don’t know”
  • Dealing with wrong answers

Calling on high achievers / the same students always volunteering answers

Dylan Wiliam is an advocate of using ‘no hands up’ techniques in the classroom. He believes that a hands-up approach (where students volunteer answers) is one of the most damaging practices currently found in our classrooms, since only around a quarter of students consistently volunteer answers. Research by Mercer et al. (2004) and Mercer (2011) has shown that engagement in high-quality classroom dialogue through questioning can improve IQ (using Raven’s test as a measure). Wiliam therefore argues that by allowing students to choose whether or not to answer, we are in fact widening the achievement gap.

Wiliam suggests using a ‘no hands up’ rule and name randomisers in order to avoid always calling upon the same students in a class for responses.

Doug Lemov also champions a no hands up approach, basing ‘Cold Call’ – one of the 49 techniques from his book Teach Like a Champion – on it. Lemov believes that asking students who know an answer to raise their hands, before calling on one of them to answer, teaches students that they will never have to participate if they don’t raise their hands.

Instead, Cold Call involves asking a question, giving wait time and then randomly calling on any student. The expectation is that all students should be trying to answer the question in their head and be ready with an answer when you ‘Cold Call’.
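Name randomisers can be as low-tech as lolly sticks, but the underlying idea is simple enough to sketch in a few lines of code. Purely as an illustration (Python, with an invented class list – not any particular tool), the following picker reshuffles only once the whole class has been called, so nobody can opt out and nobody is called twice before everyone has had a turn:

```python
import random

class NameRandomiser:
    """Pick students in random order; everyone is called once per cycle."""

    def __init__(self, names):
        self.names = list(names)
        self.pool = []

    def next_student(self):
        if not self.pool:                  # everyone has been called, so
            self.pool = self.names.copy()  # refill and reshuffle the pool
            random.shuffle(self.pool)
        return self.pool.pop()

# Invented class list for illustration
picker = NameRandomiser(["Amira", "Ben", "Chloe", "Dev", "Ella"])
for _ in range(7):
    print(picker.next_student())
```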

Cold Call is being covered in our TLAC sessions this half term, so I will be trialling it.

Depth of questioning

William Wilen’s work (1991) highlights the issue that despite teachers being aware of the importance of questioning, the vast majority of questions asked by teachers are low-level cognitive questions that require students to focus on the memorisation and recall of factual information (rather than questions which foster deeper student understanding).

Cognitive scientist Daniel Willingham states that your memory is a product of what you think about and not what you want to remember – in other words, if your students aren’t actually thinking and making meaning then the content won’t be learnt.

Memories are created by the release of chemicals in the brain. If we pitch the challenge just right, we create an emotional response that releases dopamine; too little challenge offers too little reward, and students won’t engage emotionally.

With this in mind it is apparent that questioning needs to be deep, meaningful and really make students think.

Tom Sherrington refers in his blog to the need for probing and deep questioning to be practised until it becomes habit – an ingrained part of our classroom routine, not an added extra or afterthought. He proposes the following ways that good questioning promotes learning:

  • Good questions stimulate thinking, and often generate more questions to clarify understanding.
  • Good questions generate informative responses often revealing not only misconceptions and misunderstanding, but understanding and experience beyond that expected.
  • Good questions encourage learners to make links.
  • Good questions push learners to the limit of their understanding.
  • Good questions from pupils push teachers to the limits of their understanding too, and challenge them to find better ways of explaining.
  • Good questions offer opportunities for learners to hear others’ answers, which helps them to reflect on their own understanding.

In his blog, Alex Quigley discusses how, despite being naturally inclined to ask questions, students ask relatively few in the classroom: it takes six to seven hours of class time for a typical student to ask a single question (Graesser and Person). This makes it even more essential that we ensure students ask the right questions. Most questions in the classroom are closed questions that don’t elicit the deeper comprehension provoked by open questions such as “why…”, “how…” and “what if…”.

Quigley suggests that we monitor our questions to ensure we are asking many more open questions that generate deeper thinking. Possible ways he suggests to do this are:

  • Use students as ‘question monitors’ to note and evaluate such questions.
  • Use video technology, like IRIS, and tally your question types to reflect on your own questioning.
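For the tallying itself, here is a deliberately crude sketch of the idea (the stems and the transcript are invented; a real classification would need more care than matching question stems):

```python
# Crude open/closed tally for questions transcribed from a lesson
# (e.g. captured via IRIS). The stem list is illustrative only.
OPEN_STEMS = ("why", "what if", "how might", "how far", "to what extent")

def tally_question_types(questions):
    tally = {"open": 0, "closed": 0}
    for q in questions:
        kind = "open" if q.strip().lower().startswith(OPEN_STEMS) else "closed"
        tally[kind] += 1
    return tally

print(tally_question_types([
    "How many sides does a hexagon have?",   # closed
    "Why does this method always work?",     # open
    "What if the shape were irregular?",     # open
]))  # -> {'open': 2, 'closed': 1}
```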

In his blog, Andy Tharby discusses the importance of responsive and probing questioning, arguing that questioning should help students formulate new perceptions and should challenge lazy preconceptions. He believes that, when successful, questioning will lead to discussion, and it is in these episodes that we build our relationships with our classes and cement the ethos of our classroom. Perhaps the crucial point is that questioning is about initiating and sustaining a high level of academic rigour: the more we probe, the more we push the discussion forward and the less we leave unchallenged, the better our students learn.

Alex Quigley also suggests using a ‘question ladder’: a strategy for planning lessons by devising a sequence of questions with an escalating degree of challenge and complexity. By thoughtfully constructing such questions, beginning with closed, recall-style questions before utilising more open, conceptually challenging ones, you can lead students through any given topic. The questions form a ladder because each one builds upon students’ specific knowledge of the text or topic at hand.

Wait Time

An issue mentioned in a variety of sources was practising “wait time”. Research shows that on average teachers wait between 0.7 and 1.4 seconds for pupils to respond to questions, and less than 0.7 seconds if they believe their students might not know the answer. “Wait time” has been shown to have a huge impact on classroom dynamics. Mary Budd Rowe first described the positive outcomes associated with it in 1972.

Rowe’s research indicated that when teacher-directed questions were followed by at least three seconds of undisturbed silent time for students to formulate responses, students answered more successfully.

A synthesis of studies of Wait Time by Tobin and Capie (1980) confirms the following benefits of Wait Time when used by teachers:

  • The length of student responses increased.
  • More frequent, unsolicited contributions (relevant to the discussion) were made.
  • An increase in the logical consistency of students’ explanations occurred.
  • Students voluntarily increased the use of evidence to support inferences.
  • The incidence of speculative response increased.
  • The number of questions asked by students increased.
  • Greater participation by all learners occurred.

‘Wait Time’ is another of the techniques covered by Doug Lemov in Teach Like A Champion.

Responding to answers

Dylan Wiliam advocates the need to move away from the IRE pattern (Initiation, Response, Evaluation), and to think more carefully about the way in which we ask questions and respond to pupils’ answers. For example:

Teacher: “How many sides does a hexagon have?” (Initiate)

Pupil: “6?” (Response)

Teacher: “Well done.” (Evaluate)

Instead, in the following example, the teacher poses a question, pauses to allow pupils time to think, pounces on any pupil (keeping them all on their toes) and then bounces that pupil’s response onto another pupil – the ‘Pose, Pause, Pounce, Bounce’ (PPPB) approach.

Teacher: “How might you describe a hexagon?”

Pupil: “It’s a shape with 6 sides”

Teacher (to second pupil): “How far do you agree with that answer?”

(Example from www.fromgoodtooutstanding.com/2012/05/ofsted-2012-questioning-to-promote-learning)

Bouncing the answer of a student to another student (or students) helps to ensure that more students are engaged with the question.

ABC Feedback is a similar technique where after a question has been answered students are required to Agree with, Build Upon or Challenge another student’s response.

By selecting students in an escalating order of challenge, we can bounce these questions around the room, building differentiated progress into the discussion.

Dealing with wrong answers or “I Don’t Know”

Doug Lemov proposes the strategy ‘No Opt Out’ when questioning students. This is a sequence that begins with a student unable (or unwilling) to answer a question, and ends, as often as possible, with that student answering the question.

Format 1: The teacher provides the answer, the student then repeats the answer.

Format 2: Another student provides the answer; the initial student repeats the answer.

Format 3: The teacher provides a clue; the student uses the clue to work out the answer.

Format 4: Another student provides a clue; the initial student uses it to work out the answer.

This is another technique being covered by the TLAC group in school, so I will be practising it throughout the year.

Memory

After looking at some of the issues experienced with questioning, I have started to research ways to improve students’ memory and recall. One of the key ideas is that regular testing (where students are required to retrieve information) helps commit content to their long-term memory.

One suggested method of regular testing is using multiple-choice questions and there is much discussion surrounding this at the moment.

Research by Little, Bjork and Bjork tested whether multiple-choice tests could trigger productive retrieval processes, provided the alternatives were made plausible enough that test takers had to retrieve both why the correct alternatives were correct and why the incorrect alternatives were incorrect. In two experiments, they found that properly constructed multiple-choice tests could indeed trigger productive retrieval processes, and that they had one potentially important advantage over cued-recall tests.

Both testing formats fostered retention of previously tested information, but multiple-choice tests also facilitated recall of information pertaining to incorrect alternatives, whereas cued-recall tests did not. Thus, multiple-choice tests can be constructed so that they exercise the very retrieval processes they have been accused of bypassing.

The key insight is that these alternatives must be plausible enough to enable pupils to retrieve why correct alternatives are correct and incorrect options are incorrect.

In his blog, Joe Kirby argues that using MCQs makes assessment more reliable, makes marking less labour-intensive, makes pupil understanding and misconceptions more visible, and allows a wider breadth of knowledge to be assessed across a unit than holistic end-of-unit assessments alone. He believes that, as long as they are constructed properly, they get pupils thinking deeply about subject content.

The seven principles that Kirby suggests considering when designing MCQs are as follows:

  1. The proximity of options increases the rigour of the question
  2. The number of incorrect options increases rigour

Three options give pupils a 33% chance of guessing the correct answer; five options reduce that chance to 20%, so always create five rather than three or four options for multiple choice questions. A ‘don’t know’ option discourages blind guessing, allowing pupils to flag questions they’re unsure about rather than getting lucky with a correct guess. (The sketch after this list makes the arithmetic explicit.)

  3. Incorrect options should be plausible but unambiguously wrong

If options are too implausible, this reduces rigour as pupils can too quickly dismiss them.

  4. Incorrect options should be frequent misconceptions where possible
  5. Multiple correct options make a question more rigorous.

Not stating how many correct options there are makes pupils think harder.

  6. The occasional negative question encourages students to read the question more carefully.

Once they get a question like “Which of these is NOT a cause of World War 1?” wrong, and realise why, they’ll work out that they need to read questions carefully to double-check exactly what they’re being asked.

  7. Stretch questions can be created with comparisons or connections between topics.
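The arithmetic behind principles 2 and 5 is worth making explicit. A quick sketch, assuming a pupil guessing blindly with every response equally likely:

```python
def guess_chance(n_options):
    # One correct option out of n: 3 options -> 33%, 5 options -> 20%.
    return 1 / n_options

def guess_chance_unknown_count(n_options):
    # If any non-empty subset of the options could be correct and pupils
    # are not told how many, there are 2**n - 1 possible responses.
    return 1 / (2 ** n_options - 1)

for n in (3, 4, 5):
    print(f"{n} options: {guess_chance(n):.0%} single answer, "
          f"{guess_chance_unknown_count(n):.0%} if the number correct is unknown")
```

With five options, not stating how many are correct cuts the odds of a lucky guess from 20% to around 3%, which is why principle 5 adds so much rigour.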

Learning Hub 1 Autumn Term

Hub 1 presentation is available to download below:

Questioning Hub 1 – 11th September 2014

What next?

  • Hub members to trial one or two of these questioning techniques (no hands up / PPPB / ABC feedback)
  • Use some pre-planned and differentiated questions as part of the trial
  • Make some reflection notes on the effectiveness of the technique, including any pros / cons / possible improvements.
  • Make comparisons with lessons where the techniques have not been used, considering student engagement and quality of responses.

Future Questioning Hub meetings

  • Challenging ‘I don’t know’ – a growth mindset
  • Multiple Choice Questions
  • Hinge Questions
  • Collaborative / cooperation strategies

Learning Hub 2 Autumn Term

Since our first Learning Hub, members have trialled different ‘no hands up’ questioning techniques. The feedback has been overwhelmingly positive regarding the impact teachers noticed on their classes.

Common positive comments were that ‘no hands up’:

  • enables us to obtain responses from quieter students who would not normally volunteer
  • stops the same students answering all the time
  • can increase the pace of a lesson, avoiding lengthy periods waiting for volunteers
  • is a good assessment tool for checking on learning and progress of all students
  • makes questioning accessible for all, as questions can be differentiated to suit the student being asked
  • allows other students, through ABC feedback or PPPB, to add to and develop prior responses

Negatives mainly centred on students getting used to the new format of questioning – not calling out or putting their hands up. We expect these instances to decrease as students become familiar with the techniques around school.

After my initial Hub Leaders meeting, it became apparent that our focus needed to centre more on helping students to develop a Growth Mindset in relation to questioning and to enjoy being challenged with questions.

Hub 2 focused on two areas:

  1. Helping students develop a Growth Mindset in relation to questioning.
  2. Multiple Choice Questions

A Growth Mindset – the power of belief

Genius?

Research studying geniuses and great creative contributions suggests that talent alone cannot explain these phenomena.

Instead the one thing that appears to set those who become geniuses or who make great creative contributions apart from their other talented peers is the deliberate practice they devote to their field (Ericsson, Charness, Feltovich, & Hoffman, 2006).

In other words, genius often appears to be developed over time through focused, extended effort. This is precisely the kind of effort fostered by a Growth Mindset.

The research

The research demonstrates that changing students’ mindsets can have a substantial impact on their achievement, with the effect of Growth Mindset workshops enduring long enough to boost end-of-term measures of achievement.

Interventions that change mindsets (Blackwell et al., 2007)

In addition, teachers – who were blind to whether students were in the control group or the Growth Mindset (experimental) group – singled out three times as many students in the experimental group as showing notable changes in their motivation (27% in the experimental group vs. 9% in the control group).

It will be important to follow students over longer periods of time to see whether the gains last, but it is likely that environmental support (e.g. teachers who subscribe to a Growth Mindset) will be necessary for them to do so.

Developing a Growth Mindset to support challenging questioning

How do we get students to be willing to accept being challenged and not fear making mistakes or being wrong? Helen Hindle comments that we need to model the Growth Mindset for students.

Using language to model a Growth Mindset

  • Show students how to recognise Fixed Mindset thoughts, how to stop them, and how to replace them with Growth Mindset thoughts.
  • Make it a rule that Fixed Mindset thoughts spoken aloud in your class will be stopped, and that the student must rephrase the idea as a Growth Mindset thought; doing so will help students recognise Fixed Mindset thoughts.
  • You will also help students monitor each other and shift their thoughts toward growth.

We decided to begin designing an aid/display to be used in the classroom to help students change the language they use, and hopefully their thought process, when they are ‘stuck’ – for example, sentences to use instead of ‘I don’t get it’. We discussed examples from other schools and decided on the elements we did and did not want to include. An initial version is currently being designed and will be critiqued in Hub 3, before being produced for trial in Hub members’ classrooms.

Multiple Choice Questions

There has been a wealth of research into Multiple Choice Questions (MCQs) and their impact upon memory and retention; however, it is also well documented that effective MCQs are tricky to create.

Some people consider multiple choice exams easier than essay or open-ended exams because:

  • The correct answer is guaranteed to be among the possible responses, so a student can score points with a lucky guess.
  • Many multiple choice exams tend to emphasise basic definitions or simple comparisons, rather than asking students to analyse new information or apply theories to new situations.
  • Multiple choice exams usually contain many more questions than essay exams, so each question has a lower point value and thus carries less risk.
  • Because students can see the correct answer rather than having to retrieve it, a poorly designed question does little for retrieval and memory.

The second part of Hub 2 was used to introduce hub members to the blogs of Joe Kirby cited in my initial research above. Hub members then spent time using these principles to create some MCQs to assess a unit they would be teaching soon. The outcome will be shared by members in the next Hub.

Measuring impact

Initial ideas were:

  • Peer observations
  • IRIS self observations/evaluations
  • Student Voice
  • Data comparison with other classes

After attending #TMBelmont I was introduced to the work of Sarah Gott. She presented her ‘Basic Impact Spreadsheet’ as a way to measure the impact of a new teaching strategy. I will share this with hub members in Hub 3, as I think it will be an extremely useful way to measure the impact of MCQs on retention. This could be done either with parallel groups or by splitting a group in half.

Hub 2 presentation is available to download below:

Questioning Hub 2 – 6th November 2014


Learning Hub 3 Spring Term

Since our second learning hub, members have continued to trial ‘no hands up’ techniques and also the use of Multiple Choice Questions.

Review of MCQ

The main feedback so far from hub members on MCQs is the amount of time that designing them takes (something highlighted in much of the research written around MCQs). For this reason most hub members (myself included) don’t feel that they’ve used MCQ assessments regularly enough to be able to assess any impact on student retention. We are therefore going to continue with MCQs and will look to assess their impact later in the year.

Hub 3

Hub 3 had a real focus on getting to grips with designing good-quality MCQs and on how we are going to assess whether our Questioning Hub has had any impact by the end of the year. As previously mentioned, I was introduced to the work of Sarah Gott and her ‘Basic Impact Calculator’ last year.

I shared this with the hub members as a way to measure the impact of MCQs. Four hub members (including myself) have suitable parallel classes, so we can use one class as a control group and apply the Basic Impact Calculator. We will complete assessments between now and Hub 4 and keep the raw scores for each group for analysis.
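The calculator itself is a spreadsheet, so it is not reproduced here. Purely as a sketch of the kind of comparison involved – I am assuming a standard effect-size calculation, and the function and scores below are invented rather than Sarah Gott’s actual formulae:

```python
from statistics import mean, stdev

def effect_size(experimental, control):
    """Cohen's d: difference in group means over the pooled standard deviation."""
    n1, n2 = len(experimental), len(control)
    pooled_var = ((n1 - 1) * stdev(experimental) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)
    return (mean(experimental) - mean(control)) / pooled_var ** 0.5

# Invented end-of-term scores: a class quizzed regularly with MCQs vs. a
# parallel control class taught identically but without the quizzes.
mcq_class = [68, 72, 75, 61, 70, 74, 66, 73]
control_class = [64, 68, 62, 70, 66, 63, 69, 65]
print(round(effect_size(mcq_class, control_class), 2))
```

Effect sizes of this kind are what such calculators translate into months of additional progress.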

A student voice survey was also designed in Hub 3, to assess student perceptions of questioning in their classes. The survey will provide a base measure, so it will be used with classes where the teacher has not yet trialled the techniques, and it will be completed again later in the year for comparison. The survey focuses on the types of questions students are asked, how their teacher takes answers (e.g. hands up / no hands up) and how their teacher responds to student answers (e.g. when someone’s answer could be improved or when someone doesn’t know an answer).

As well as this, the learning aid posters were finalised in Hub 3, and they are currently being printed for hub members to have and refer to in their classrooms.

The aim of the posters is to help develop resilience in students when they are stuck or ‘don’t know’ what to do. Peer observations and the student voice survey will be used to assess whether the posters are being used and whether they are having any impact in changing student mindset.

Since Hub 3, I have started to read Mr Thomas’ blog. Mr Thomas is a maths teacher and a huge advocate of the Quick Key mobile phone app as an in-class AfL tool.

He cites Quick Key as one of his top three techniques from 2014, and I have just downloaded the app to try. The app enables users to scan student responses to MCQs (which takes around 1–2 minutes) and get an analysis of the responses immediately. I will trial this before Hub 4 and share my findings with hub members then.

Overall, I am happy with the progress that we are making. I think we are now clear about where we are going and, most importantly, how we are going to assess whether there has been any change to questioning in our classrooms.

Hub 3 presentation is available to download below:

Questioning Hub 3 – 15th January 2015


Learning Hub 4 Spring Term

Since Hub 3, members have continued to trial the questioning techniques from Hubs 1 and 2, and we have also started to use the Growth Mindset questioning learning aids with our classes. As well as this, a student voice survey has been completed to assess the current state of questioning.

The main feedback on the learning aids is that, although they are useful, they are a little too small, so I am in the process of having larger versions created. I am also having small bookmarks made for students to keep and refer to during lessons. We realise that for these aids to be effective they need to be used regularly, not just sit as a nice poster on the wall.

Student voice results show a lot of positive outcomes, which is encouraging; however, with hindsight we should have completed this survey at the very beginning of the year to get an accurate baseline measure of questioning. Individual teachers did choose classes where they had not consciously used the trialled techniques, but the fact that we have focused solely on questioning for over a term will undoubtedly have had some impact on all of our classes, not just those where we consciously trialled strategies. As a result I don’t feel this is a wholly accurate baseline, and it means we will not be able to assess the impact of our hub and strategies as easily.

Three Hub members are continuing to trial the use of MCQ assessments/quizzes with an experimental group to compare with a control group later in the year.  This data will be collected at the end of the year ready for analysis in our final Hub.

Hub 4

Hub 4 had two main focus areas:

  • Analyse our student voice results
  • Understand how to design and use good quality hinge questions

Analysis of student voice results

Positives:

  • Teachers are using ‘no hands up’ in lessons (84% of students agreed)
  • Teachers are asking students questions which challenge them to think hard (77% of students agreed)
  • Everyone gets the opportunity to answer questions (82% of students agreed)
  • Teachers prompt students to improve others’ answers (74% of students agreed)

Focus areas:

  • Giving more thinking time (only 14% of students stated that this happened when someone didn’t know the answer to a question)
  • Encouraging students to refer to learning aids to help them when they ‘don’t know’ (only 4% of students said this happened in lessons)
  • Despite a high percentage of students stating that everyone did get the chance to answer questions, 38% of students said they felt it was often the same people who answered questions.

What are they really thinking? Hinge Questions

A hinge question is based on an important concept in a lesson that is critical for students to understand before you move on. Dylan Wiliam suggests that:

  • The question should fall roughly midway during the lesson.
  • Every student must respond to the question within two minutes.
  • You must be able to collect and interpret the responses from all students in 30 seconds.

Designing / Using Hinge Questions

The characteristics of good hinge questions, as identified by Dylan Wiliam, are:

  • A multiple choice question.
  • An immediate check of every student’s understanding.
  • Something which tells us:
    • Have they understood?
    • If not, what has been misunderstood (and needs re-teaching)?
  • Relate to important learning outcomes necessary for progression in learning
  • Can be used at any point in a learning sequence
    • Beginning (range-finding)
    • Middle (mid-course correction)
    • End (e.g., “exit pass”)
  • When used in “real time”, the teacher must be able to collect and interpret the responses of all students within 30 seconds
  • Teacher can then respond accordingly

It is the final bullet point which I believe is most important. Many people (myself included) have used tools such as whiteboards for whole-class questioning, and so when first introduced to the idea of hinge questions may instinctively feel that they already use them. However, the key difference with a well-designed hinge question is the teacher’s reaction to the responses they receive. Once the responses are gathered from students (via whiteboards, finger voting etc.), the essential part of a hinge question is that the direction the lesson takes is completely dependent upon the student answers. Jason Buell points out that it is important, before asking the question, to decide what percentage of the class needs to answer correctly for you to judge that they have understood enough to move on – and then to ‘lock yourself in’ to that threshold.
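To make the ‘lock yourself in’ idea concrete, here is a minimal sketch (the 80% threshold and the option labels are illustrative, not a recommendation): the pass rate is fixed before the question is asked, and the tally of responses then dictates the route the lesson takes.

```python
from collections import Counter

def hinge_decision(responses, correct_option, threshold=0.8):
    """Route the lesson from hinge-question responses.

    The threshold is committed to before the question is asked, so the
    answers, not in-the-moment optimism, decide whether to move on.
    """
    tally = Counter(responses)
    if tally[correct_option] / len(responses) >= threshold:
        return "move on"
    # The most popular wrong option points at the misconception to re-teach.
    wrong = {opt: n for opt, n in tally.items() if opt != correct_option}
    return f"re-teach: most students chose option {max(wrong, key=wrong.get)}"

# e.g. 30 mini-whiteboard responses to a question whose answer is 'C'
responses = ["C"] * 19 + ["B"] * 8 + ["A"] * 3
print(hinge_decision(responses, "C"))  # 63% correct -> re-teach option B
```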

Examples of Hinge Questions

Important points:

  • Student misconceptions live, grow & prosper
    • Student learning requires a redrawing of mental maps
  • Hinge questions are hard work
    • But they offer a powerful way to bring misconceptions out
  • It’s not what you’ve got, it’s what you do with it
    • The best hinge question is useless without action

Dylan Wiliam

Our hub members are now designing hinge questions that can be attached to their Schemes of Learning and shared with their departments during our next departmental meetings.

Recommended further reading on hinge questions:

Do they understand this well enough to move on? Introducing hinge questions and Hinge questions hub by Harry Fletcher-Wood

Does everyone get it now? by Jason Buell

Useful source of Hinge Questions:

Diagnostic Questions

Hub 4 presentation is available to download below:

Questioning Hub 4 – 26th February 2015


Learning Hub 5 Summer Term

Since Hub 4, members have really focused on designing and using hinge questions in lessons to help inform their teaching. As well as using hinge questions ourselves, we have each shared the information with our departments during department time.

Hub 5

Hub 5 had the following focus areas:

  • Review the use of hinge questions
  • Questioning to aid revision (including question level analysis of tests)
  • Pre-testing
  • Start to consider what evidence/examples we have to show what impact the Hub has had on our teaching practice.

Throughout our discussion around using hinge questions, it was agreed that their use was very informative, sometimes with surprising results. Most hub members are currently using them towards the end of a lesson, as an exit-ticket style question, and then using the information to inform the direction of the next lesson. Now that I am feeling more confident with them, I am going to try using them mid-lesson. I have been using the Quick Key app with my Year 8 class to analyse the feedback from a hinge question, but I have not yet been able to scan responses quickly enough for this to be done mid-lesson, so I will use mini-whiteboards where the hinge question falls within the lesson.

Since it is exam season, it seemed odd not to consider questioning in relation to revision, so I shared Shaun Allison’s blog ‘Supporting Learning Through Effective Revision Techniques’. The blog summarises Dunlosky et al.’s research, which highlights effective and ineffective revision techniques.

Practice testing and elaborative interrogation are the two strategies which link most directly with questioning, and they also support our previous research. Elaborative interrogation requires students to think very deeply about a topic and to understand why something is true or why a method works, as opposed to just learning facts and methods by heart.

As well as this, I shared with hub members the question level analysis (QLA) our Maths department carries out on mock GCSE exams to inform planning.

This means that each KS4 class has a bespoke SoW which regularly revisits the focus topics for that group.
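Our spreadsheet is not reproduced here, but the calculation behind a QLA is straightforward to sketch (topics, questions and marks below are all invented): aggregate each student’s marks by the topic each question tests, then rank topics from weakest to strongest to choose the focus topics for that group’s SoW.

```python
# Each exam question maps to (topic, marks available); marks achieved
# are recorded per student per question. All data invented.
questions = {
    "Q1": ("fractions", 3), "Q2": ("algebra", 4),
    "Q3": ("fractions", 2), "Q4": ("geometry", 5),
}
class_marks = {
    "student_1": {"Q1": 3, "Q2": 1, "Q3": 2, "Q4": 4},
    "student_2": {"Q1": 2, "Q2": 0, "Q3": 1, "Q4": 5},
}

totals = {}  # topic -> (marks achieved, marks available)
for marks in class_marks.values():
    for q, achieved in marks.items():
        topic, available = questions[q]
        got, out_of = totals.get(topic, (0, 0))
        totals[topic] = (got + achieved, out_of + available)

# Weakest topics first: candidates for the group's focus topics.
for topic, (got, out_of) in sorted(totals.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{topic}: {got}/{out_of} ({got / out_of:.0%})")
```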

Finally, we started to move towards evaluating the impact of our Hub, and spent time thinking about the different evidence and examples we can collate to exemplify our Hub’s aims:

  • Questioning for assessment that informs teaching.
  • Deep and probing questioning for teaching/memory that gets students thinking hard, supporting a culture of ‘growth mindset’
  • Embedding a culture of ‘growth mindset’ across our learning community in order to raise aspirations and expectations of what students can achieve.

By 21st May, each hub member will provide examples of how the Hub has impacted their teaching practice and complete the follow-up student voice surveys, ready for our final evaluation in Hub 6.

Hub 5 presentation is available to download below:

Questioning Hub 5 – 23rd April 2015


Learning Hub 6 Summer Term

Initially, the aim for Hub 6 had been solely to collate and review examples of our impact and to discuss possible next steps for the hub going into next year. However, after reading Richard Donnelly’s recent blog ‘Assessing knowledge and understanding using google forms’, I felt it was too useful not to share with everyone – so Hub 6 did include one final new strategy/tool.

Hub 6

Hub 6 had the following focus areas:

  • Using Google Forms and the Flubaroo add-on to create self-marking MCQ quizzes
  • Review of final student voice results
  • Organising evidence of our Hub’s impact
  • Possible next steps, where next?
  • Complete Hub evaluation survey

In Hub 5 I shared with members how useful Question Level Analysis can be when designing SoWs. One of the concerns (perhaps justified) had been the amount of time this can take to complete manually. So, when I watched the video shared on Richard’s blog, demonstrating how to set up self-marking, self-analysing MCQ quizzes, I knew I needed to share it with everyone. Although the quizzes take some time to set up at the front end, as the video demonstrates, the process is relatively quick. If you haven’t already seen it, then I would definitely recommend setting aside 20 minutes to give it a watch.
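Flubaroo itself runs as an add-on inside a Google Sheet, so there is nothing of it to reproduce here. Purely as an illustration of what ‘self marking’ boils down to (the answer key and responses are invented, and this is not Flubaroo’s actual mechanism or API), the core is comparing each response against a key and aggregating per student and per question:

```python
# Minimal self-marking MCQ quiz, in the spirit of a Flubaroo-style grader.
answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}

responses = {
    "student_1": {"Q1": "B", "Q2": "D", "Q3": "C"},
    "student_2": {"Q1": "B", "Q2": "A", "Q3": "A"},
}

question_facility = {q: 0 for q in answer_key}  # correct counts per question
for student, answers in responses.items():
    score = 0
    for q, correct in answer_key.items():
        if answers.get(q) == correct:
            score += 1
            question_facility[q] += 1
    print(f"{student}: {score}/{len(answer_key)}")

# Low-facility questions flag content that needs re-teaching.
for q, n_correct in question_facility.items():
    print(f"{q}: {n_correct}/{len(responses)} correct")
```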

Review of follow-up student voice:

Our follow-up student voice results show improvements in each area that we assessed. I am sure these improvements would have been more significant had the first student voice survey been carried out at the very start of the year; since the initial survey was completed after our second Hub, I don’t think we really got a true measure of our starting point.

 


Teachers are using ‘no hands up’ in lessons

  • 97% said yes (84% initial survey)

Teachers are asking students questions which challenge them to think hard

  • 84% of students agreed / strongly agreed (77% initial survey)

Everyone gets the opportunity to answer questions

  • 86% of students agreed/strongly agreed (82% initial survey)

Teachers prompt students to improve others’ answers

  • 85% of students agreed / strongly agreed (74% initial survey)

Focus areas from first survey analysis:

Giving more thinking time

  • Only 14% of students stated that this happened when someone didn’t know the answer to a question. (Review = 22%)

Encouraging students to refer to learning aids to help them when they ‘don’t know’

  • Only 4% of students said this happened in lessons. (Review = 10%)

Reviewing the impact our hub has had

Members have provided evidence of where and how our Questioning Hub has impacted their individual teaching practice. Examples include:

  • Hinge questions & reflections on their use
  • Multiple Choice Assessments
  • MCQ unit pre-tests
  • Sarah Gott’s Basic Impact Calculator
  • Deep questions within SoW and shared with departments to increase challenge

Next Steps

The National Teacher Enquiry Network states that powerful CPD is:

    • Challenging
    • Evaluated and monitored
    • Draws on external expertise
    • Theoretical and practical
    • Sustained
    • Collaborative
    • Focused on valued outcomes for students

We considered each of the seven features in Hub 6, and it is the last point for which we feel we have the least evidence. It is apparent that we can much more easily assess and evidence outcomes for our own practice than outcomes for students. This is not to say that we have no evidence, however. We have measured outcomes for students using two measures: the student voice surveys and, in relation to MCQs, the ‘Basic Impact Calculator’ from Sarah Gott of CEM. Both measures show a positive impact, with the Basic Impact Calculator showing a class impact of 0.7 (8 months’ progress) for those students who had repeatedly completed MCQ quizzes. I aim to repeat this study myself next year to see if I obtain similar results.

Going into next year, assessing the impact on students should be our main focus. When we discussed this, it was suggested that a good way to do so would be to work collaboratively and conduct Lesson Studies. This would allow hub members to assess the impact of different techniques on small key groups of students, and it would also strengthen the collaborative aspect of the Learning Hub.

Hub 6 presentation is available to download below:

Questioning Hub 6 – 4th June 2015
