We are happy to welcome Xiaobo Zhao as the newest member of HASTE. Xiaobo is joining the group of Andreas Hellander as a Postdoctoral Researcher. He will work on the research and development of intelligent stream-data-processing pipelines, and on intelligent, efficient cloud systems that can map data and compute to a variety of cloud-computing and data-storage e-infrastructures based on the quality and interestingness of the data.
Xiaobo received his M.Sc. in Communications and Information Systems from Northwestern Polytechnical University, Xi’an, China, in 2015, and his Ph.D. in Electrical and Computer Engineering from Aarhus University, Denmark, in 2020. Before joining the Hellander lab, he was a Research Assistant at Aarhus University, focusing on the efficient offloading of ML/DL services to edge and cloud servers.
We are happy to announce that our paper “Deep learning models for lipid-nanoparticle-based drug delivery” is now available ahead of print and open access in the journal Nanomedicine.
Authors: Harrison PJ, Wieslander H, Sabirsh A, Karlsson J, Malmsjö V, Hellander A, Wählby C & Spjuth O.
Abstract: Background: Early prediction of time-lapse microscopy experiments enables intelligent data management and decision-making. Aim: Using time-lapse data of HepG2 cells exposed to lipid nanoparticles loaded with mRNA for expression of GFP, the authors hypothesized that it is possible to predict in advance whether a cell will express GFP. Methods: The first modeling approach used a convolutional neural network extracting per-cell features at early time points. These features were then combined and explored using either a long short-term memory network (approach 2) or time series feature extraction and gradient boosting machines (approach 3). Results: Accounting for the temporal dynamics significantly improved performance. Conclusion: The results highlight the benefit of accounting for temporal dynamics when studying drug delivery using high-content imaging.
The figure below shows a schematic of the modelling approach used in the paper that combined convolutional and recurrent neural networks (long short-term memory, LSTM). The model predicts information present only in the GFP channel at the end of the experiment from the other imaging channels captured during the early time points, prior to any GFP expression.
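As a rough illustration of this kind of model, the sketch below runs a single LSTM layer over a sequence of per-cell feature vectors (standing in for CNN features extracted at the early time points) and reads out a probability of eventual GFP expression. This is a minimal NumPy sketch, not the paper’s actual architecture: all dimensions, weights, and the linear read-out head are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(features, Wx, Wh, b):
    """Run a single-layer LSTM over a (T, D) sequence; return the final hidden state."""
    T, D = features.shape
    H = Wh.shape[0]
    h = np.zeros(H)  # hidden state
    c = np.zeros(H)  # cell state
    for t in range(T):
        gates = features[t] @ Wx + h @ Wh + b          # all four gates, (4H,)
        i, f, o, g = np.split(gates, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input / forget / output gates
        g = np.tanh(g)                                 # candidate cell update
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

# Hypothetical dimensions: T early time points, D CNN features per cell.
T, D, H = 5, 16, 8
cnn_features = rng.normal(size=(T, D))   # stand-in for per-cell CNN features

Wx = rng.normal(scale=0.1, size=(D, 4 * H))
Wh = rng.normal(scale=0.1, size=(H, 4 * H))
b = np.zeros(4 * H)

w_out = rng.normal(scale=0.1, size=H)    # hypothetical linear read-out head
p_gfp = sigmoid(lstm_forward(cnn_features, Wx, Wh, b) @ w_out)
print(round(float(p_gfp), 3))  # predicted probability of eventual GFP expression
```

In the paper, the weights are of course learned from labelled time-lapse data rather than drawn at random; the sketch only shows how per-timepoint features flow through the recurrent part of the model.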
Following the win at the Adipocyte Imaging Challenge organized by AstraZeneca, two PhD students from the team, Ankit Gupta and Håkan Wieslander, were asked to comment in a technical report in Nature on the topic of virtual staining.
Fluorescence imaging is a valuable tool for biological analysis, but it is time-consuming and toxic to the cells. Using deep learning to virtually stain bright-field images is an active field of research that can alleviate these problems. Phil Harrison, a PhD student in the HASTE group, presented a poster at the Swedish Symposium on Deep Learning (SSDL) 2021 based on the HASTE team’s winning solution for the Adipocyte Cell Imaging Challenge, covering our approach and results.
We are happy to present our newest member of HASTE: Dan Rosén! Dan is joining the group of Ola Spjuth as a Data Engineer. In his projects he will build data pipelines and interact closely with microscopes, helping HASTE reach its goal of acting on collected image streams, making intelligent decisions, and controlling microscopes to prioritize collecting the most interesting data.
AI Sweden and AstraZeneca organised the Adipocyte Cell Imaging Challenge, a two-week hackathon to help AstraZeneca accelerate the drug development process. The task was to use machine learning to label cell images without toxic preprocessing of the cell cultures, by predicting the content of fluorescence images from the corresponding bright-field images.
Our solution used the Learning Under Privileged Information (LUPI) paradigm. LUPI lets machine learning models use additional (privileged) information during training: data that is not available when making predictions. In this case, segmentation masks of the nuclei served as the privileged information during training. Our solution will help AstraZeneca speed up the drug discovery process and bring drugs to market more quickly.
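One common instantiation of LUPI is generalized distillation: a teacher model is trained on the privileged representation, and a student that sees only the ordinary inputs is trained to imitate a blend of the ground truth and the teacher’s predictions. The linear sketch below illustrates that idea on synthetic data; it is not the team’s actual solution (which used nuclei segmentation masks with deep networks), and all names, dimensions, and data are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: x = bright-field features; s = privileged nuclei-mask
# features (available only at training time); y = fluorescence target.
N, D = 200, 8
x = rng.normal(size=(N, D))
s = x @ rng.normal(size=(D, 3)) + 0.05 * rng.normal(size=(N, 3))
y = s @ np.array([1.0, -0.5, 0.25]) + 0.1 * rng.normal(size=N)

def ridge(A, t, reg=1e-3):
    """Least-squares fit with a small ridge penalty for stability."""
    return np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ t)

# 1) Teacher: learns from the privileged representation.
w_teacher = ridge(s, y)
soft_targets = s @ w_teacher

# 2) Student: sees only x, imitates a blend of ground truth and teacher.
alpha = 0.5  # weighting between true labels and teacher predictions
w_student = ridge(x, alpha * y + (1 - alpha) * soft_targets)

# 3) At prediction time, no privileged data is needed.
y_hat = x @ w_student
print(float(np.mean((y_hat - y) ** 2)))  # student's mean squared error
```

The key property is in step 3: once trained, the student makes predictions from bright-field features alone, just as the challenge setting requires.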
We are looking for a skilled Data Engineer to join the HASTE team!
In collaboration with other researchers, you will develop, implement and test systems for AI-controlled automated microscopes. The task includes interacting directly with the microscope and establishing pipelines where models trained on previously acquired images decide where the microscope should image next in order to reach a specific goal. We are looking for a candidate with a genuine interest in technology and automation who enjoys solving problems involving both practical interaction with hardware (robots, microscopes) and different types of software. Since our microscopes generate large amounts of images, the position will also include large-scale data management and analysis. You will work with researchers in AI modeling and the biological laboratory sciences, and contribute to implementing and evaluating methods for different types of biological problems.
This is a 2-year position within the HASTE project, funded by the Swedish Foundation for Strategic Research (SSF), which aims to develop new, intelligent ways of processing and managing very large amounts of microscopy images in order to leverage the imminent explosion of image data from modern experimental setups in the biosciences. Industry collaborators are Vironova AB and AstraZeneca AB.
A master’s degree in engineering or a university degree in a relevant field is required, as are good programming skills in Python (additional programming languages are a plus). Experience with AI modeling, Linux systems, and developing REST services and APIs is also required. Experience with AI modeling on image data, practical handling of automated microscopes, and software containers (e.g. Docker/Singularity) is a merit.
We are currently looking for an ambitious, highly motivated Postdoc with a good background in AI and imaging to join the HASTE project.
This is a 2-year postdoc position. Assignments include the development and application of methods for large-scale analysis of microscopy images using AI/machine learning within the framework of the HASTE project. The project focuses on AI/machine learning with quantifiable confidence or probability, based on methods such as Active Learning, Conformal Prediction, Probabilistic (Venn) Prediction, and Deep Learning. Applicants are expected to collaborate with other project members and take part in regular research visits to the industry partners AstraZeneca and Vironova.
A PhD degree, or a foreign degree equivalent to a PhD, in a relevant field is required. The PhD must have been obtained no more than three years prior to the application deadline; this three-year period can be extended due to circumstances such as sick leave, parental leave, or duties in labour unions. Documented experience with AI/ML methods and/or computerized image analysis is required, as is experience programming in e.g. Python. Applicants should have excellent communication skills and be keen to actively interact with other team members, including biologists, systems developers and researchers in AI/ML. Furthermore, applicants should be curious and creative, take initiative and build relationships. They should have good organizational ability, be able to structure work across multiple projects, and solve both anticipated and unexpected problems. The applicant must be able to express themselves very well in written and spoken English.
Unfortunately, we didn’t get a chance to enjoy the good food and beautiful surroundings of Noor Castle, but we had a productive project meeting anyway. The meeting started with report reading and with Salman Toor presenting his docent lecture, “Distributed Computing e-Infrastructures (DCI): Challenges and Opportunities for Application Experts and Service Providers”. This was followed by intense brainstorming on the continued research in the project, a discussion of the almost-complete half-time report, and planning for the coming Tuesday seminars, which have become an important part of project communication now that most of us work from home.