I decided to administer my first full-scale survey at the end of this semester. I had previously asked students random questions on handouts, and while I thumbed through those responses in the past, I never really looked at the data in a (somewhat) more systematic fashion.
I gave three of my (so-called) high-level classes (Grades 1, 2, and 3, with one survey for each grade level) a 12-question survey using Likert-type items. The survey was divided into two parts: the first surveyed student attitudes in general, along with student perceptions of their in-class experience with the NET this semester. The second part asked students to rate the effectiveness of the main activities used during my generally set-format lesson plans.
Here are the items:
I feel I learned something in this class.
I wish I could’ve learned more during this class.
I need English to succeed.
I like group work.
I like the teacher to teach in front of the class.
I like working alone.
B (list of routine classroom activities)
Sentence stems/structured speaking opportunities
I also provided students with a space to “write any additional comments.”
Students provided three comments, which I transcribe verbatim: “Watch more TV,” “I want to other [warm-up activities] (ex: puzzle),” and the illuminating comment: “Boston.”
Anyhow, I looked through one class somewhat systematically: grade 2 (cursory glances at grades 1 and 3 suggested similar results).
Here are the findings:
| Item | Number Answering 3 or Below | Number Answering 4 or Above |
| --- | --- | --- |
26 out of 32 students completed the survey in some fashion. I ignored 14 of the surveys because the student circled only one number (i.e., 1 or 5 for every item). This level of apathy is consistent with student speaking-test performance in my classes. There were some positives. The majority of respondents felt they learned something (8/11 responding in the 4-5 range) and thought English was necessary (8/12). Curiously enough, a sizable number responded ambivalently about wishing to have learned more (5/12). Direct teaching appears to be the most favored method, though not by a large margin.
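For anyone curious, the tallying procedure I used can be sketched in a few lines of code. This is a minimal sketch, not my actual analysis, and the response lists below are hypothetical data for illustration: drop “straight-lined” surveys (the same number circled for every item), then count how many usable respondents answered 3 or below versus 4 or above on each item.

```python
# Hypothetical 1-5 ratings, one list per student (not real survey data).
surveys = [
    [4, 5, 4, 3, 5, 2],
    [1, 1, 1, 1, 1, 1],  # straight-lined: same number for every item, excluded
    [5, 4, 4, 4, 3, 4],
    [3, 2, 4, 5, 3, 1],
]

# Keep only surveys with more than one distinct answer.
usable = [s for s in surveys if len(set(s)) > 1]

n_items = len(usable[0])
low = [sum(1 for s in usable if s[i] <= 3) for i in range(n_items)]
high = [sum(1 for s in usable if s[i] >= 4) for i in range(n_items)]

for i in range(n_items):
    print(f"Item {i + 1}: {low[i]} answered 3 or below, {high[i]} answered 4 or above")
```

With a spreadsheet export, the same filter-then-tally logic applies; the straight-line check is a crude but cheap proxy for the apathetic responders described above.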
The second part of the survey was somewhat gratifying. Students indicated favorable attitudes concerning the effectiveness of all of the main activities in my class, with the exception of worked examples. I suppose putting in those three hours for one ridiculously detailed PowerPoint-based lesson was worth it! Perhaps not surprisingly, dialogues, as indicated by this survey, were the most popular activity.
Of course, such a survey has a million limitations, which should be apparent: a small sample, questionable questions, the closed nature of the items, et al. To which I must respond: it was a first effort. This survey reminded me of an old sports magazine that occasionally published the results of fan telephone surveys: “sampling error: +/- 100%.”
I suppose the big lesson is how apathetic the majority of my students are; approximately 60% couldn't be bothered to reflect on a question and circle a number. The flip side, however, is that 40% did, so there may exist some willingness to engage in learning tasks among a sizable minority of my students.
I plan to continue administering surveys, ideally twice a semester. I presume my approach will become more refined as experience accumulates.
Thus, I leave off with a few questions: Any suggestions for tweaking my questions to get more valuable data? How can I promote greater completion rates in the future (bribery? Students didn't put their names on this particular survey)? Does anyone have personal experience administering surveys? What value did they yield?
p.s. Here is a reference which inspired me to finally implement this into my practice:
Davies, Alun (2006). What do learners really want from their EFL course? ELT Journal, 60(1), 3-12.