Sharing large data streams for Optical Coherence Tomography (OCT) is crucial for building a better understanding of pathological processes and for developing and improving tools for diagnostics and treatment. Because of the large data volumes associated with OCT's high resolution, the analysis requires high-performance computing solutions.
Like other imaging modalities, OCT generates large amounts of data: around 8 GB per volume scan, with several (and often dozens of) scans required for every patient. The data can therefore quickly reach hundreds of gigabytes per patient, and at higher resolutions this could exceed 1 terabyte per patient. These data need to be processed, stored, visualised and analysed efficiently and quickly.
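A back-of-envelope calculation makes these volumes concrete. The sketch below uses the 8 GB per-scan figure from the text; the number of scans per patient and the resolution scaling factor are assumed example values, not measured figures.

```python
# Rough per-patient storage estimate for OCT data.
# 8 GB per volume scan is the figure quoted in the text;
# 24 scans per patient is an assumed example for "often dozens".
GB_PER_SCAN = 8
scans_per_patient = 24

total_gb = GB_PER_SCAN * scans_per_patient
print(f"~{total_gb} GB per patient")  # hundreds of gigabytes

# Doubling the resolution along each of the three spatial dimensions
# multiplies the data volume by 8 (an assumed illustrative scaling),
# pushing the total past 1 TB per patient.
high_res_gb = total_gb * 8
print(f"~{high_res_gb / 1024:.1f} TB per patient at doubled resolution")
```

With these assumed numbers, a single patient already lands at roughly 192 GB, and a modest resolution increase pushes the total past a terabyte, which is why ordinary workstation storage and processing quickly become insufficient.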
Read about how SURF and the Netherlands eScience Center offered a combined platform to tackle this big data problem on the In The Field Blog.