From intelligent data acquisition via smart data management to confident predictions
Images contain very rich information, and digital cameras combined with image processing and analysis can detect and quantify a wide range of patterns and processes. The valuable information is, however, often sparse, and the ever-increasing speed at which data are collected results in data volumes that exceed the available computational resources.
The HASTE project takes a hierarchical approach to acquisition, analysis, and interpretation of image data. We develop computationally efficient measurements for data description, confidence-driven machine learning for determination of interestingness, and a theory and framework to apply intelligent spatial and temporal information hierarchies, distributing data to computational resources and storage options based on low-level image features.
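As a toy illustration of routing data by low-level image features, the sketch below scores image tiles by pixel entropy (a cheap texture proxy) and assigns them to storage tiers. The tile format, the entropy feature, and the hard-coded thresholds are ours for illustration only; HASTE's actual features and policies are learned and far richer.

```python
import math

def tile_entropy(tile):
    """Shannon entropy of an 8-bit grayscale tile (a list of pixel rows).
    A cheap low-level feature: near-uniform background tiles score low,
    textured tiles that may contain cells score high."""
    counts, n = {}, 0
    for row in tile:
        for px in row:
            counts[px] = counts.get(px, 0) + 1
            n += 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def assign_tier(tile, hot=4.0, warm=1.0):
    """Map a tile to a storage/compute tier. Thresholds are hypothetical."""
    e = tile_entropy(tile)
    if e >= hot:
        return "hot"    # fast storage, analyze immediately
    if e >= warm:
        return "warm"   # keep, analyze later
    return "cold"       # downsample or discard

flat = [[10] * 8 for _ in range(8)]                                  # uniform background
noisy = [[(i * 37 + j * 91) % 256 for j in range(8)] for i in range(8)]  # textured
print(assign_tier(flat), assign_tier(noisy))  # → cold hot
```

The point of the sketch is that the routing decision uses only features that are much cheaper to compute than the downstream analysis, so tiering can keep up with the acquisition rate.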
Following their win at the Adipocyte Imaging Challenge organized by AstraZeneca, two PhD students from the team, Ankit Gupta and Håkan Wieslander, were asked to comment on virtual staining in a Nature technology feature. The feature can be found here: https://www.nature.com/articles/d41586-021-00812-7
Fluorescence imaging is a valuable tool for biological analysis, but it is time-consuming and toxic to the cells. Using deep learning to virtually stain bright-field images is an active field of research that can alleviate these problems. Phil Harrison, a PhD student in the HASTE group, presented a poster at the Swedish Symposium on Deep Learning …
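At its core, virtual staining learns a mapping from bright-field pixel intensities to fluorescence intensities. Real models are convolutional networks (often U-Net-style architectures); the toy below fits a single linear unit by gradient descent on made-up pixel pairs, purely to illustrate the learned-mapping idea, not the methods used in the group.

```python
# Synthetic (bright-field, fluorescence) pixel pairs generated from a known
# linear relation y = 0.6 * x + 0.1, with intensities normalized to [0, 1].
pairs = [(k / 255.0, 0.6 * k / 255.0 + 0.1) for k in range(0, 256, 16)]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in pairs:
        err = (w * x + b) - y          # prediction error for this pixel
        gw += 2 * err * x / len(pairs)  # gradient of mean squared error w.r.t. w
        gb += 2 * err / len(pairs)      # gradient w.r.t. b
    w -= lr * gw
    b -= lr * gb

print(round(w, 3), round(b, 3))  # recovers roughly 0.6 and 0.1
```

A real virtual-staining model replaces this per-pixel linear map with a deep network that also uses spatial context, but the training loop (predict, compute error against the true stain, update) has the same shape.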
We are happy to present the newest member of HASTE: Dan Rosén! Dan is joining Ola Spjuth's group as a Data Engineer. In his projects he will work with data pipelines and interact closely with microscopes to help reach the goals of HASTE to act on collected image streams and make …
The HASTE project takes a holistic approach to new, intelligent ways of processing and managing very large amounts of microscopy images to leverage the imminent explosion of image data from modern experimental setups in the biosciences. One central idea is to represent datasets as intelligently formed and maintained information hierarchies, and to prioritize data acquisition and analysis to certain regions/sections of data based on automatically obtained metrics for usefulness and interestingness. To arrive at such smart systems for scientific discovery in image data, we will pursue a range of topics such as efficient data mining in image data, machine learning models with quantifiable confidence that learn an object’s interestingness, and development of intelligent and efficient cloud systems capable of mapping data and compute to a variety of cloud computing and data storage e-infrastructure based on the quality and interestingness of the data.
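One minimal way to picture prioritized analysis is a priority queue over incoming data items, ordered by an interestingness score. In this hypothetical sketch the tile identifiers and scores are invented; in HASTE, such scores would come from the learned, confidence-aware models described above.

```python
import heapq

class StreamPrioritizer:
    """Route incoming image tiles to analysis, most interesting first."""

    def __init__(self):
        self._heap = []

    def push(self, tile_id, interestingness):
        # heapq is a min-heap, so negate the score for highest-first order.
        heapq.heappush(self._heap, (-interestingness, tile_id))

    def pop_most_interesting(self):
        neg_score, tile_id = heapq.heappop(self._heap)
        return tile_id, -neg_score

q = StreamPrioritizer()
for tile, score in [("t1", 0.2), ("t2", 0.9), ("t3", 0.5)]:
    q.push(tile, score)
print(q.pop_most_interesting())  # → ('t2', 0.9)
```

Under a fixed compute budget, processing the queue front-first means the most interesting regions of the data stream are analyzed early, while low-scoring items can wait in cheaper storage.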
We will focus our efforts on microscopy data, and work in three specific areas where image collection results in data volumes difficult to handle with today’s computational resources, namely:
Large-scale time-lapse experiments exploring the dynamics of cells and drug-delivery particles, in collaboration with AstraZeneca.
Nanometer-resolution transmission electron microscopy data, in collaboration with Vironova AB.
Multi-modal digital pathology data from SciLifeLab Sweden.
We expect the resulting methodologies and frameworks to be highly relevant also for other scientific and industrial applications, including surveillance, predictive maintenance, and quality control.
The project is a collaboration between the Wählby lab (PI) and the Hellander lab (co-PI), both at the Department of Information Technology, Uppsala University; the Spjuth lab (co-PI) at the Department of Pharmaceutical Biosciences, Uppsala University; the Nilsson lab at the Department of Biochemistry and Biophysics, Stockholm University and SciLifeLab; Vironova AB; and AstraZeneca AB.
The HASTE project is funded by the Swedish Foundation for Strategic Research (SSF), under the call “Big Data and Computational Science”. See the press release here. The publications arising from the project are solely the responsibility of the authors and do not necessarily reflect the views of this agency.