
A Neuroscience-Based Assistive Technology for Neurodivergent Learners
Re-imagining Learning Through Cognitive Maps
Neurodivergent children, especially those with autism, process information differently from their neurotypical peers. Standard classroom instruction often relies on linear, verbal teaching, which may not align with the spatial and pattern-based reasoning strengths of autistic students.
This project proposes a neuroscience-driven approach: a tool that transforms text into structured cognitive maps—visual representations of ideas that match how many autistic learners think. By merging computational neuroscience, NLP, and machine learning, it turns static mind maps into dynamic, adaptive learning supports.
The Framework
Data Collection & Preprocessing
A dataset of 35,000 image-text pairs was selected to balance efficiency and depth.
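As one illustration, a loading and cleanup step might look like the sketch below; the CSV filename and the caption/image_path column names are assumptions for the example, not details from the original write-up.

```python
import pandas as pd

def load_pairs(csv_path: str) -> pd.DataFrame:
    """Load caption/image pairs and apply light text cleanup."""
    df = pd.read_csv(csv_path)  # assumed columns: caption, image_path
    df["caption"] = (
        df["caption"]
        .astype(str)
        .str.strip()                           # drop stray whitespace
        .str.replace(r"\s+", " ", regex=True)  # collapse repeated spaces
    )
    return df.dropna(subset=["caption", "image_path"]).drop_duplicates("caption")

pairs = load_pairs("image_text_pairs.csv")     # ~35,000 rows in the project described here
print(len(pairs), "pairs loaded")
```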
Training and Testing
The data was split into 90% for training and 10% for testing.
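Continuing from the loading sketch above, the 90/10 split can be reproduced with scikit-learn; the random seed is an arbitrary choice for illustration.

```python
from sklearn.model_selection import train_test_split

# Hold out 10% of the pairs for testing; the remaining 90% are used for training.
train_df, test_df = train_test_split(pairs, test_size=0.10, random_state=42, shuffle=True)
print(f"train: {len(train_df)}  test: {len(test_df)}")
```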
Embedding Generation
Each text-image pair was converted into a high-dimensional vector representation using OpenAI embeddings.
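The write-up does not name the exact embedding model, so the sketch below uses the current OpenAI Python client with text-embedding-3-small as a placeholder; it embeds the captions from the training split above.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed_texts(texts: list[str], model: str = "text-embedding-3-small") -> list[list[float]]:
    """Return one embedding vector per input text."""
    response = client.embeddings.create(model=model, input=texts)
    return [item.embedding for item in response.data]

# Embed a small batch of training captions (the full dataset would be sent in batches).
train_vectors = embed_texts(train_df["caption"].tolist()[:100])
print(len(train_vectors), "vectors of dimension", len(train_vectors[0]))
```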
Vector Database Integration
Embeddings were stored and indexed in Pinecone, enabling quick retrieval of semantically similar concepts.
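A sketch of the upsert-and-query flow with the current Pinecone Python client, continuing from the embedding step above; the index name, API key placeholder, and metadata layout are assumptions, and the index dimension must match the embedding model (1536 for text-embedding-3-small).

```python
from pinecone import Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")   # placeholder key
index = pc.Index("cognitive-maps")      # assumed index name, created with dimension 1536

# Store each caption embedding together with a pointer to its paired image.
index.upsert(vectors=[
    {"id": str(i), "values": vec, "metadata": {"image_path": path}}
    for i, (vec, path) in enumerate(zip(train_vectors, train_df["image_path"].tolist()[:100]))
])

# Retrieve the visuals whose captions are semantically closest to a new query.
query_vec = embed_texts(["the water cycle"])[0]
result = index.query(vector=query_vec, top_k=3, include_metadata=True)
for match in result.matches:
    print(match.id, round(match.score, 3), match.metadata["image_path"])
```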
Prediction and Evaluation
Outputs were validated through cosine-similarity scores between input queries and generated visuals.
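Cosine similarity itself is straightforward to compute; a minimal, self-contained NumPy version with toy vectors:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# In the pipeline, `a` would be the query embedding and `b` the embedding
# behind the retrieved or generated visual; toy vectors are used here.
a = np.array([0.12, 0.80, -0.33, 0.41])
b = np.array([0.10, 0.75, -0.30, 0.45])
print(f"cosine similarity: {cosine_similarity(a, b):.2f}")
```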
Web Application Deployment
A lightweight interface was tested in Codespaces, allowing users to input text and receive real-time visualizations.
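The original post does not say which web framework was used, so the sketch below assumes Flask as one lightweight option; it reuses the embed_texts helper and Pinecone index from the earlier sketches and exposes a single endpoint that returns the closest stored visual for a text query.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/visualize", methods=["POST"])
def visualize():
    """Accept {"text": ...} and return the closest stored visual plus its similarity score."""
    query = request.get_json()["text"]
    query_vec = embed_texts([query])[0]  # helper from the embedding sketch above
    result = index.query(vector=query_vec, top_k=1, include_metadata=True)
    best = result.matches[0]
    return jsonify({"image_path": best.metadata["image_path"], "score": best.score})

if __name__ == "__main__":
    app.run(port=8000)  # in Codespaces, the forwarded port provides the public URL
```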
Results
The prototype achieved high cosine similarity between queries and generated visuals:
Test-set cosine similarity: 0.98
Real-time cosine similarity: 0.86
These results show the system can reliably create images that closely match the intended meaning of textual input, giving educators a tangible way to support comprehension in real time.
Why It Matters
Visual and spatial reasoning are powerful learning channels for many autistic students. By linking neuroscience insights with AI, this platform:
Enhances comprehension through visual representation of abstract text.
Reduces cognitive overload by translating language into pattern-based visuals.
Supports personalized education, adapting to each learner’s cognitive strengths.
Future iterations could integrate audio, video, and multimodal feedback, making classrooms more inclusive for both neurodivergent and neurotypical learners.
Looking Forward
This work demonstrates how LLMs and vector databases can bring neuroscience into education. The long-term goal is to develop a real-time assistive platform that educators can use to personalize instruction—bridging the gap between human empathy and technological intelligence.
At NeuroKind, we believe that when learning tools adapt to every kind of mind, education becomes not only inclusive, but truly transformative.
References
(Adapted from source Sanskriti_sf_25)
Chung S. & Son J.-W., Visual Perception in Autism Spectrum Disorder: A Review of Neuroimaging Studies, 2020.
Klorfeld-Auslender S. et al., A Distinct Route for Efficient Learning and Generalization in Autism, Current Biology, 2022.
Minshew N.J. & Williams D.L., The New Neurobiology of Autism, Archives of Neurology, 2007.
Cockerham D. & Malaia E., Neuroscience-Based Approaches to Teaching Students on the Autism Spectrum, Zeitschrift für Psychologie, 2016.


