
Measuring EdTech Impact with MEII
When you work in EdTech long enough, you realize everyone talks about impact — but few actually measure it. We celebrate engagement rates, dashboards, and user growth. But what happens after students log off? Did they learn something that lasts? Did our tools actually make a difference — or just keep them entertained?
At Sirius Game, this isn’t a philosophical concern — it’s our foundation. Our pedagogical DNA was built on evidence, first through university research supervision, and later through classroom testing with real teachers and students. Every product we design begins with a research question, not a feature list: What learning outcomes do we want to observe? How do we measure them? What does the literature tell us about motivation, play, and knowledge retention?
Since our early prototypes, we’ve been fortunate to collaborate with Harvard, NYU, the Free University of Bolzano, Riga Technical University, and Göttingen University — building bridges between theory and classroom practice. This combination of academic mentorship and field validation shaped our unique approach: Playful Learning with measurable impact.
That’s why the paper “Evaluating Educational Technology: Consolidating Across Multiple Impact Indicators and Rating Systems”, which introduces the Multiple EdTech Impact Index (MEII), immediately caught my attention. The MEII is a framework developed by the academic advisory board of EduEvidence – The International Certification of Evidence of Impact in Education. It consolidates more than twenty existing global evaluation rubrics — from the UK’s EdTech Evidence Group to UNESCO’s digital learning frameworks — into a single multi-dimensional model.
The MEII evaluates EdTech solutions across five complementary dimensions:
- Efficacy — measurable learning outcomes and knowledge retention
- Effectiveness — real-world adoption, teacher satisfaction, sustained engagement
- Ethics — privacy, bias detection, transparency
- Equity — accessibility, inclusion, representation
- Environment — sustainability and digital well-being
Together, these lenses create a 360° view of what “impact” should mean in the digital education era. Instead of a single score, the MEII uses a weighted index combining quantitative data with expert peer review. It helps startups, researchers, and institutions position tools transparently along a continuum — from early prototypes to evidence-backed interventions.
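To make the idea of a weighted index concrete, here is a minimal sketch of how five per-dimension scores could be combined into one number. This is purely illustrative: the dimension names come from the list above, but the function name, the 0–100 scale, and the equal default weights are my assumptions — the actual MEII weighting and peer-review aggregation are defined by the framework itself, not reproduced here.

```python
# Illustrative sketch only: the real MEII weighting scheme is defined
# by the EduEvidence advisory board; the weights and scale here are
# placeholder assumptions for demonstration.

DIMENSIONS = ["efficacy", "effectiveness", "ethics", "equity", "environment"]

def weighted_index(scores, weights=None):
    """Combine per-dimension scores (assumed 0-100) into one weighted value."""
    if weights is None:
        weights = {d: 1.0 for d in DIMENSIONS}  # equal weighting by default
    total_weight = sum(weights[d] for d in DIMENSIONS)
    return sum(scores[d] * weights[d] for d in DIMENSIONS) / total_weight

# Hypothetical example: a prototype strong on ethics and equity,
# but still early in gathering efficacy evidence.
scores = {
    "efficacy": 40.0,
    "effectiveness": 55.0,
    "ethics": 85.0,
    "equity": 80.0,
    "environment": 70.0,
}
print(round(weighted_index(scores), 1))  # with equal weights, a simple mean
```

Under equal weights the index is just the mean of the five scores; shifting weight toward, say, efficacy would reward tools with stronger learning-outcome evidence — which is exactly the kind of transparent positioning the framework aims for.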
At Sirius Game, we’ve begun integrating this mindset directly into our development cycle. Every release includes an impact validation stage, testing both usability and learning outcomes in partner schools. The result is a model we call Pedagogical Supervision by Design: continuous observation, evidence collection, and iteration.
One sentence from the MEII paper stayed with me:
“Impact without evidence is just a good story.”
In EdTech, there’s no shortage of stories — but real transformation requires data, transparency, and responsibility.

That’s why at Sirius Game, impact research isn’t a task — it’s part of our creative identity. Every mission, narrative, and classroom activity is designed to be joyful, meaningful, and measurable.
The MEII pushes the entire EdTech ecosystem toward a more equitable, sustainable, and accountable future. Innovation without evidence is noise — and education deserves more than noise.

