
How Babbel Built an Online English Test

Posted on December 21, 2017

Babbel’s partnership with Cambridge English brings language assessment into the digital age


Ben, originally from the UK, is project manager for English in Babbel’s Didactics team, the language experts who create and optimise our courses. In the past, he’s trained and worked as an English teacher and assessor in both Germany and Spain, and he delights in learning more unusual languages as far afield from English as possible, including Swahili and Tongan. Here, he writes about how Babbel and Cambridge English, experts in language assessment, partnered to release the Babbel English Test…


Babbel already provides a platform for learning languages online and quickly gaining the confidence to hold real conversations, and for a while, we’ve wanted to give our learners the means to prove their language skills. So, we recently partnered with Cambridge English, whose language exams are taken by 5 million people each year, to build the Babbel English Test.

In the field of English learning and teaching, Cambridge English is renowned worldwide for providing quality assessments, while Babbel enables you to learn how to speak a new language anywhere, anytime. As Babbel’s co-founder Thomas Holl explains: “Our aims as companies are complementary — to provide a quality experience for language learners. So for us, it seemed perfect to combine Cambridge’s unrivalled knowledge of assessment with Babbel’s expertise in digital language learning!”

The result? The Babbel English Test powered by Cambridge English is now live! It’s possible to take the entire test online, and it offers our learners the chance to accurately assess their reading and listening skills in English and certify levels from A1 to B1 and above. The result is displayed in a personalised certificate with your photo, which you can add to your CV or share on social and professional networks.

Over the past decade, online language education and assessment have grown in tandem. It’s now possible to sit many of the best-known English exams on computers – albeit only in specially designated test centres – and traditional paper exams have increasingly fallen by the wayside. The enormous potential of an affordable, high-quality and widely accessible digital assessment was clear, but Babbel also recognised that we needed a partner with the right expertise to ensure this test met our extremely high standards. Enter: Cambridge English.

Prototyping the First Version

At the beginning of the process, we decided to create a short and simple A1 (Beginner) prototype test to get an idea of how Babbel and Cambridge would work together, and what each could contribute. More importantly, we wanted to know what Babbel learners thought about the overall experience. Together with the representatives for Listening and Reading from Cambridge, we created a 37-question A1 test to assess listening and reading skills.

For those of us working in Babbel’s Didactics team, customer feedback is already essential to constantly improving our content, so at the end of the A1 prototype, we asked the test-takers to complete a short survey. Some feedback was expected: for example, the large majority found the test far too easy, which is fairly standard at A1 level. However, we also found that a large portion of learners didn’t understand what A1 meant. We see the same with our courses, which are based on the levels of the CEFR (the Common European Framework of Reference for Languages), and to remedy this, we explain what the level means within each course description.

Babbel English Test Certificate

For a test, however, especially one assessing whether a person has attained a specific level of English, it can be hard to even decide whether to attempt it without knowing which level to aim for. So, we used this feedback to design a model that covers a range of levels rather than offering one exam per level. This means that our test-takers don’t need to worry about pre-assessing their level, as they would with individual level exams.
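To make the banded model concrete, here’s a minimal sketch in code of how a single score might be mapped to a result band such as A1, A2 or B1 and above, instead of a pass/fail result for one level. The thresholds, band labels and function name below are invented for illustration; they are not the actual cut-offs used in the Babbel English Test.

```python
# Illustrative sketch only: mapping a raw score to a CEFR result band.
# The thresholds and band labels are hypothetical, not the actual
# cut-offs used in the Babbel English Test.

CEFR_BANDS = [
    (0.85, "B1 and above"),
    (0.60, "A2"),
    (0.35, "A1"),
]

def cefr_band(correct: int, total: int) -> str:
    """Return the highest band whose threshold the score reaches."""
    ratio = correct / total
    for threshold, band in CEFR_BANDS:  # checked from highest band down
        if ratio >= threshold:
            return band
    return "Below A1"

print(cefr_band(28, 40))  # -> "A2" with these illustrative thresholds
```

The appeal of a banded design is that every test-taker receives a meaningful result wherever they land on the scale, which is exactly why they no longer need to pre-assess their own level before starting.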

Another important theme in the feedback was – rather surprisingly – that people enjoyed taking the exam. I obviously can’t speak for everyone, but when I’ve taken traditional language exams in the past, I’ve always found them rather stressful, leaning more towards the definitely-not-fun end of the scale. What we discovered, however, is that when people can take the test online, in the comfort of their own home and at their own pace, they actually enjoy the experience. In fact, several people commented that they would even appreciate a longer test.

Designing the Test

Now we were ready to build a more comprehensive test covering levels A1 to B1 and above, following a new test model that we designed in collaboration with Nir, Babbel’s Senior Product Owner for New Business Initiatives. He also helped evaluate how much engineering effort each of the models we’d designed would require to actually build. After all, when a test is online, the content creation team is just one of several teams involved.

For this iteration, the teams for Listening and Reading from Cambridge collected content for the various levels. To determine which topics should be tested, I collected content from hundreds of progressive Babbel lessons for English so that our partners at Cambridge could see exactly what vocabulary and grammar our English learners would be expected to know at the various CEFR levels, according to the Babbel syllabus. Following this, the team at Cambridge commissioned specialist test content writers to write questions in keeping with Babbel’s content.

We also used other content: existing questions that had already appeared in other Cambridge assessments and had been properly calibrated. Calibration means that responses to these questions are monitored to ensure that they really do assess the level they were designed for, and adjusted where necessary. By including these so-called anchor items, we can compare responses to them with responses to the new questions, ensuring that our test accurately assesses the test-taker’s level.
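As a toy illustration of the anchor-item idea – not Cambridge’s actual methodology, which relies on statistical models such as item response theory – the sketch below checks whether a new question separates stronger and weaker test-takers, as judged by their performance on the calibrated anchors. All of the data and the 0.7 cut-off are invented.

```python
# Toy illustration of the anchor-item idea described above. Real
# calibration uses statistical models (e.g. item response theory);
# all data here is invented.

from statistics import mean

# Each test-taker's proportion of correct answers on the calibrated
# anchor items, and whether they answered one new, uncalibrated
# question correctly (1) or not (0).
anchor_scores = [0.9, 0.7, 0.5, 0.8, 0.4]
new_item_correct = [1, 1, 0, 1, 0]

# If the new question works as intended, test-takers who do well on
# the anchors should answer it correctly more often than those who
# do poorly on them.
strong = [c for a, c in zip(anchor_scores, new_item_correct) if a >= 0.7]
weak = [c for a, c in zip(anchor_scores, new_item_correct) if a < 0.7]

discrimination = mean(strong) - mean(weak)
print(f"discrimination: {discrimination:.2f}")  # near zero would flag the item
```

Here the new question cleanly separates the stronger test-takers from the weaker ones; a question that didn’t would be adjusted or replaced, which is precisely what the monitoring described above is there to catch.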

Throughout the whole process, I’ve learned an enormous amount about language assessment and about just how much thought and work goes into designing a fair and accurate exam. In fact, it gives me a whole new appreciation of all those delightful German exams I sat in the past…

Since the release of the test, we’ve been working on optimisation. For security and validity, we’ve implemented an identity-verification system to ensure a fairer testing process, and we monitor the feedback we receive from test-takers with an eye to how they feel about the results and the variety of topics assessed. We hope to expand the test over the coming months to include higher levels, up to B2 (Upper-Intermediate). We’ll also start examining different modes of assessment, for example assessing free text input, i.e. expanding the test to cover test-takers’ writing ability in English. So watch this space!

In the future, it is entirely conceivable that all traditional language exams will ditch their paper formats and embrace the digital future, with even speaking assessments taking place via webcam. It’s also likely that learners will have a much wider range of assessment options, with newer, non-traditional companies building their own assessments. With this shift, however, it’ll be more important than ever to ensure that online tests are accurate, valid and just as rigorous as their in-person counterparts. Combining thorough knowledge of assessment with experience in online language instruction, as Babbel and Cambridge English have done, is absolutely essential. It’ll also be key to continue shaping online assessment to suit the needs of the language learner – which, for a digital test, may well look altogether distinct from those of a traditional assessment.
