Reflections and “Results” on an In-Class Survey

I decided to administer my first full-scale survey at the end of this semester.  I had previously asked students random questions on handouts, and while I thumbed through student responses in the past, I never really looked at the data in a (somewhat) more systematic fashion.


I gave three of my (so-called) high-level classes (Grades 1/2/3, one survey for each grade level) a 12-question survey using Likert-type items.  The survey was divided into two parts: the first surveyed student attitudes in general, in addition to student perceptions of their in-class experience with the NET this semester.  The second part asked students to rate the effectiveness of the main activities used during my generally set-format lesson plans.


Here are the items:


A (general attitudes and perceptions)

I feel I learned something in this class.

I wish I could’ve learned more during this class.

I need English to succeed.

I like group work.

I like the teacher to teach in front of the class.

I like working alone.


B (list of routine classroom activities)


Warm-up activities

Visual aids

Sentence stems/structured speaking opportunities

Worked examples

Dialogues

Textbook activities

I also provided students a space to “write any additional comments.”


Students provided three comments, which I transcribe verbatim: “Watch more TV,” “I want to other [warm-up activities] (ex: puzzle),” and the illuminating comment: “Boston.”


Anyhow, I looked through one class somewhat systematically: Grade 2 (a cursory glance at Grades 1 and 3 suggested similar patterns).


Here are the findings:


Part A


Item #   Number Answering 3 or Below   Number Answering 4 or Above
1        3                             9
2        6                             6
3        4                             8
4        4                             7
5        5                             7
6        7                             5


Part B


Item Type                 Number Answering 3 or Below   Number Answering 4 or Above
Warm-up                   2                             7
Visual Aid                1                             8
Sentence Stems/Speaking   1                             8
Worked Examples           5                             4
Dialogues                 1                             9
Textbook                  3                             6


26 out of 32 students completed the survey in some fashion.  I discarded 14 of the surveys because the student circled only one number (i.e., 1 or 5 for every item).  This level of apathy is consistent with student speaking-test performance in my classes.  There were some positives.  The majority of respondents felt they learned something (8/11 responding in the 4-5 range) and thought English was necessary (8/12).  Curiously enough, a sizable number responded in an ambivalent fashion about wishing to have learned more (5/12).  Direct teaching appears to be the most favored method, though not by a large margin.
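For anyone curious about the mechanics, the tallying procedure I describe above can be sketched in a few lines of code. This is a hypothetical illustration, not my actual workflow (which was pen and paper): the sample responses below are invented, and the function simply drops “straight-lined” surveys (the same number circled for every item) before counting answers of 3 or below versus 4 or above per item.

```python
# Sketch of the tallying described in the post: discard surveys where the
# student circled a single number throughout, then bin each item's
# remaining responses into "3 or below" vs. "4 or above".

def tally(surveys):
    """surveys: list of lists of 1-5 ratings, one inner list per student.
    Returns (kept_surveys, low_counts, high_counts)."""
    kept = [s for s in surveys if len(set(s)) > 1]  # drop straight-liners
    n_items = len(kept[0]) if kept else 0
    low = [sum(1 for s in kept if s[i] <= 3) for i in range(n_items)]
    high = [sum(1 for s in kept if s[i] >= 4) for i in range(n_items)]
    return kept, low, high

# Invented example data for illustration only:
surveys = [
    [5, 5, 5, 5, 5, 5],  # straight-lined: ignored
    [4, 3, 5, 4, 2, 4],
    [5, 4, 4, 3, 4, 2],
    [2, 2, 4, 5, 5, 3],
]
kept, low, high = tally(surveys)
```

With this toy data, one of four surveys is discarded and each item's low/high counts always sum to the three surveys kept.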




The second part of the survey was somewhat gratifying.  Students indicated favorable attitudes concerning the effectiveness of all the main activities in my class, with the exception of worked examples.  I suppose putting in those three hours on one ridiculously detailed PowerPoint-based lesson was worth it!  Perhaps not surprisingly, dialogues, as indicated by this survey, were the most popular activity.


Of course, such a survey has a million limitations, which should be apparent: small sample, questionable questions, the closed nature of the items, et al.  To which I must respond: it was a first effort.  This survey reminded me of an old sports magazine that occasionally published the results of fan telephone surveys: “sampling error: +/- 100%.”


I suppose the big lesson is how apathetic the majority of my students are: approximately 60% couldn’t be bothered to reflect on a question and circle a number.  However, the flip side is that 40% did, so there may exist some willingness to engage in learning tasks among a sizable minority of my students.


I hope to continue administering surveys, ideally twice a semester.  I presume my approach will become more refined as experience accumulates.


Thus, I leave with a few questions: Any suggestions for tweaking my questions to get more valuable data?  How can I promote higher completion rates in the future (bribery?  Students didn’t put their names on this particular survey)?  Any personal experience administering surveys in the past?  Opinions?  What value did they yield?




p.s. Here is a reference which inspired me to finally implement this into my practice:


Davies, Alun (2006).  What do learners really want from their EFL course?  ELT Journal, 60(1), 3-12.




One Response to Reflections and “Results” on an In-Class Survey

  1. Hi Chris,
    I applaud you for your willingness to ask the questions of your students that might help you improve your practice. I am also impressed by your courage to admit that this might not have worked. I hope you won’t stop trying to get your students’ feedback.

    I also collect student feedback. I have done this in a variety of ways: asking them to write three things they like and three things they hated, giving them partial sentences to complete (I want to do more _____. My least favorite activity was ______. ________ was very useful to me. Etc.). Sometimes I have asked them to do this in groups and sometimes individually. I never ask for their names on the papers and I try to make it clear that my reason for collecting feedback is to inform my own teaching and improve the course for future participants.

    Some other teachers who have written about student feedback and might provide more information and answers to your questions are:
    Rose Bard
    Alex Walsh
    Josette LeBlanc
    Laura Phelps

