Funding
Sparkli Raises $5M Pre-Seed to Build an AI-Native Learning Engine for Children

Sparkli has raised a $5 million pre-seed round to develop a new kind of learning platform designed specifically for children aged five to twelve. Founded by former engineers from Google's Area 120 (the internal incubator where Google employees build and test experimental, startup-style products), YouTube, and Search, the startup is emerging from stealth with an ambition that goes beyond digitizing textbooks or automating worksheets. Sparkli is positioning itself around a bigger question: how artificial intelligence might help children learn by doing, not just consuming.
The funding will be used to scale Sparkli's multimodal learning engine and prepare for a private beta planned for early 2026. The company is already piloting the platform with a large private school group, giving it a real-world environment to test how AI-driven learning behaves in classrooms rather than in demos.
From passive screen time to active exploration
Much of today’s educational screen time is either passive—videos, games, or short-form content—or rigid, with predefined lessons that leave little room for curiosity. Sparkli is attempting to sit in a different space. Instead of asking children to work through linear material, the platform lets them start with a question and then builds an interactive “learning expedition” around it.
If a child wants to design a city on Mars, for example, Sparkli doesn’t respond with paragraphs of text. It generates a multi-step experience that blends visuals, voice, simulations, and decision-making. Children experiment with ideas, test constraints, debate trade-offs, and reflect on outcomes. The aim is to turn curiosity into structured exploration rather than flattening it into answers.
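Sparkli has not published technical details about how these expeditions are represented, so the sketch below is purely illustrative: a question-driven expedition modeled as a sequence of mixed-modality steps. Every class, field, and value here is a hypothetical assumption, not Sparkli's actual design.

```python
# Purely hypothetical sketch: Sparkli has not published its data model.
# It illustrates how a question-driven "learning expedition" could be
# represented as a sequence of mixed-modality steps.
from dataclasses import dataclass, field

@dataclass
class ExpeditionStep:
    modality: str                    # e.g. "visual", "voice", "simulation", "decision"
    prompt: str                      # what the child is asked to explore or decide
    choices: list[str] = field(default_factory=list)  # populated only for decision steps

@dataclass
class Expedition:
    question: str                    # the child's own starting question
    steps: list[ExpeditionStep] = field(default_factory=list)

# Example loosely based on the "city on Mars" scenario above.
mars_city = Expedition(
    question="Can we design a city on Mars?",
    steps=[
        ExpeditionStep("visual", "Explore what the Martian surface looks like"),
        ExpeditionStep("simulation", "Balance oxygen, water, and power for 100 settlers"),
        ExpeditionStep("decision", "Pick a site for the habitat",
                       choices=["lava tube", "polar ice cap", "equatorial plain"]),
        ExpeditionStep("voice", "Explain the trade-offs behind your choice"),
    ],
)
```

Even a toy structure like this makes the contrast with linear lessons visible: the steps branch from the child's own question rather than from a fixed syllabus.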
This approach reflects a broader shift happening across education technology, where AI is increasingly being used to adapt learning to the learner, rather than forcing learners to adapt to fixed content.
What research suggests about AI and learning
Over the past few years, research into AI in education has pointed to several consistent benefits when systems are used thoughtfully. Personalized learning is one of the most frequently cited: AI systems can adjust difficulty, pacing, and presentation based on how a learner responds, helping maintain engagement and reduce frustration. This is especially relevant for children, whose developmental stages and interests vary widely even within the same age group.
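The research summarized here does not prescribe any particular mechanism, and nothing is known about Sparkli's adaptation logic. The toy sketch below only illustrates the general idea of adjusting difficulty from recent responses; every threshold is an arbitrary assumption made for illustration.

```python
# Toy sketch of response-based difficulty adjustment; all thresholds are
# arbitrary and nothing here reflects Sparkli's actual adaptation logic.
def next_difficulty(current: float, recent_correct: list[bool],
                    avg_response_seconds: float) -> float:
    """Nudge difficulty (0.0-1.0) up when a learner is cruising and
    down when they are struggling, to keep engagement steady."""
    accuracy = sum(recent_correct) / max(len(recent_correct), 1)
    if accuracy >= 0.8 and avg_response_seconds < 10:
        current += 0.1   # fast and mostly correct: raise the challenge
    elif accuracy < 0.5:
        current -= 0.1   # frequent mistakes: ease off to reduce frustration
    return min(max(current, 0.0), 1.0)

# A learner answering 4 of the last 5 items correctly in about 7 seconds.
print(next_difficulty(0.5, [True, True, False, True, True], 7.0))  # -> 0.6
```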
There is also evidence that interactive and exploratory learning—particularly when it involves simulation and problem-solving—can lead to stronger conceptual understanding than memorization-based approaches. When learners are asked to make decisions, explain reasoning, or defend outcomes, they tend to retain knowledge longer and develop transferable skills.
At the same time, educators and researchers emphasize that AI works best as an augmentation tool. The most successful implementations support teachers, parents, and curricula rather than replacing them. Platforms that treat AI as a creative collaborator, rather than an answer machine, tend to align more closely with these findings.
Addressing the risks of AI for children
The use of AI with younger users comes with real concerns. Open-ended AI systems can overwhelm children, surface inappropriate content, or encourage over-reliance on automated answers. Privacy, data usage, and emotional attachment are also active topics of debate in child-focused technology.
Sparkli’s design appears to be shaped by these risks. Rather than exposing children to a general-purpose chatbot, the platform constrains interactions into guided, age-appropriate environments. Learning experiences are structured, goals are explicit, and progression is designed to encourage reflection and agency rather than instant gratification.
This guarded approach mirrors a growing consensus in education: the question is not whether AI belongs in learning, but how narrowly and responsibly it should be applied—especially during formative years.
Early signals from classroom pilots
In early pilots, Sparkli has been tested in both structured classroom settings and more open-ended sessions. Teachers have observed students debating budgeting, sustainability, and design choices while running simulations of small businesses or infrastructure projects. In less structured "free exploration" periods, children initiated their own learning paths, moving between topics like game design, cosmology, and environmental planning.
Parents involved in early testing have noted a shift in how children talk about what they learned, often returning from sessions eager to explain ideas or propose solutions rather than simply describing what they watched.
While anecdotal, these signals align with what education research suggests about active learning: when children feel ownership over the process, motivation tends to increase.
A longer-term vision for AI in childhood learning
Sparkli’s longer-term goal is to evolve beyond exploration into creation, giving children tools to prototype ideas directly within the platform. Over time, the system builds an interest and knowledge graph for each child, allowing learning experiences to adapt as interests mature.
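No implementation details of this interest and knowledge graph have been shared. The following is a minimal, hypothetical sketch of how such a graph might accumulate a child's topics and connections over time; all names and structures are assumed for illustration only.

```python
# Minimal hypothetical sketch of a per-child interest graph; the article says
# such a graph exists but gives no implementation details.
from collections import defaultdict

class InterestGraph:
    """Accumulates topics a child has explored and how they connect,
    so later expeditions can build on earlier ones."""

    def __init__(self):
        self.weights = defaultdict(float)   # topic -> accumulated engagement
        self.links = defaultdict(set)       # topic -> related topics

    def record_session(self, topic: str, related: list[str], engagement: float) -> None:
        self.weights[topic] += engagement
        for other in related:
            self.links[topic].add(other)
            self.links[other].add(topic)

    def strongest_interests(self, n: int = 3) -> list[str]:
        return sorted(self.weights, key=self.weights.get, reverse=True)[:n]

# Sessions loosely drawn from the pilot scenarios described above.
graph = InterestGraph()
graph.record_session("mars city design", ["sustainability", "budgeting"], engagement=0.9)
graph.record_session("game design", ["storytelling"], engagement=0.6)
print(graph.strongest_interests())  # -> ['mars city design', 'game design']
```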
The broader implication is a move toward AI systems that grow alongside learners—remembering what captured their attention years earlier and helping them develop those interests into skills. If successful, this model could influence how educational platforms think about continuity, personalization, and the role of AI as a long-term learning companion.
The $5 million pre-seed round gives Sparkli the runway to test whether this vision can work at scale. As AI becomes more embedded in education, experiments like this will help define whether the technology deepens curiosity—or merely digitizes old habits in new ways.