A citizen science collaboration between students at the University’s National Software Academy (NSA) and academics from the Cardiff University Brain Research Imaging Centre (CUBRIC) has developed NeuroSwipe, a new app that allows scientists to sort through thousands of brain scans using a swiping motion inspired by dating apps.
The app trains non-scientists to become experts in recognising poor-quality brain scans that are unsuitable for inclusion in a scientific study, saving researchers valuable time.
The app presents scans taken from study participants, which would otherwise take researchers considerable time to review. Scans vary in quality and accuracy, making some unfit for inclusion in an analysis; app users learn to distinguish the differences and swipe accordingly.
In large-scale studies involving hundreds of thousands of participants, every image must be checked by a trained expert to ensure consistent data quality before scientific analysis can begin.
“The image filtering process has been largely automated in recent years but training an artificial-intelligence program to detect poor quality scans is challenging,” said Dr. Judith Harrison from CUBRIC, who is leading the project. “The human eye is exquisitely sensitive to subtle differences in size, shape, colour and appearance, so that’s why we wanted to get the public involved.”
Dr. Harrison contacted students from the NSA, who within 10 weeks learned how brain scanning works, gathered the requirements for such an app, and built it.
“It was hugely beneficial; by working on the real project it provided the opportunities to encounter problems that I wouldn’t normally have met in a typical learning environment. Not only this, but it meant I had the opportunity to work on a project from its design to its integration and see it be used and deployed in a real-world environment,” said Jack Light, a student from the NSA who worked on the project.
People can use a mobile or web-based app, where they are first trained to differentiate between brain scans and then asked to classify each image as a “good image” by swiping right or a “bad image” by swiping left.
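The swipe-to-label mapping described above amounts to a simple binary classification collected from users. The sketch below illustrates that idea only; the function and label names are hypothetical assumptions, not NeuroSwipe’s actual code:

```python
# Minimal sketch of the swipe-to-label workflow: a right swipe marks
# a scan as a good image, a left swipe marks it as a bad image.
# All names here are illustrative, not taken from NeuroSwipe itself.

def classify_swipe(direction: str) -> str:
    """Map a swipe direction to a quality label."""
    labels = {"right": "good image", "left": "bad image"}
    if direction not in labels:
        raise ValueError(f"unknown swipe direction: {direction}")
    return labels[direction]

# Collect user verdicts for a batch of (scan ID, swipe direction) pairs.
swipes = [("scan_001", "right"), ("scan_002", "left"), ("scan_003", "right")]
verdicts = {scan_id: classify_swipe(d) for scan_id, d in swipes}
```

In a real deployment, many users’ verdicts per scan would be aggregated (for example by majority vote) before a scan is excluded from analysis.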
“The images come from a diffusion MRI scan, which creates pictures of the brain by detecting the motion of water molecules travelling through axons – important nerve fibres which are essentially the wiring of the brain that carry information about our environment, our vital organs and even our memories,” said Harrison.
The images currently on the app are for a study that focuses on genetic risks of Alzheimer’s disease. Users are asked to look out for the correct shape of the fornix – a bundle of nerve fibres that is vital for storing new memories and is affected in early Alzheimer’s disease.
“NeuroSwipe is currently in the early stages of development and is purely proof-of-concept at the moment, but we would love the public to get involved and give us feedback so that the process can be fine-tuned and the app can ultimately be scaled up for use in large-scale studies involving thousands of patients,” said Harrison.