eprintid: 29234
rev_number: 17
eprint_status: archive
userid: 5607
dir: disk0/00/02/92/34
datestamp: 2020-12-18 07:33:16
lastmod: 2021-01-19 10:24:06
status_changed: 2020-12-18 07:33:16
type: doctoralThesis
metadata_visibility: show
creators_name: Petersen, Jens
title: Learning Distributions of Functions on a Continuous Time Domain
subjects: ddc-004
subjects: ddc-500
subjects: ddc-530
subjects: ddc-570
subjects: ddc-610
divisions: i-130001
adv_faculty: af-13
cterms_swd: Maschinelles Lernen
cterms_swd: Maschinelles Sehen
cterms_swd: Deep learning
abstract: This work presents several contributions on the topic of learning representations of function spaces, as well as on learning the dynamics of glioma growth as a particular instance thereof. We begin with two preparatory efforts, showing how expert knowledge can be leveraged efficiently in an interactive segmentation context, and presenting a proof of concept for inferring non-deterministic glioma growth patterns purely from data. The remainder of our work builds upon the framework of Neural Processes. We show how these models represent function spaces and discover that they can implicitly decompose the space into different frequency components, not unlike a Fourier transform. In this context we derive an upper bound on the maximum signal frequency Neural Processes can represent and show how to control the learned representations to only contain certain frequencies. We continue with an improvement of a more recent addition to the Neural Process family called ConvCNP, which we combine with a Gaussian Process to make it non-deterministic and to improve generalization. Finally, we show how to perform segmentation in the Neural Process framework by extending a typical segmentation architecture with spatio-temporal attention. The resulting model can interpolate complex spatial variations of segmentations over time and, applied to glioma growth, it is able to represent multiple temporally consistent growth trajectories, exhibiting realistic and diverse spatial growth patterns.
date: 2020
id_scheme: DOI
id_number: 10.11588/heidok.00029234
ppn_swb: 1744104247
own_urn: urn:nbn:de:bsz:16-heidok-292346
date_accepted: 2020-12-09
language: eng
bibsort: PETERSENJELEARNINGDI2020
full_text_status: public
place_of_pub: Heidelberg
citation: Petersen, Jens (2020) Learning Distributions of Functions on a Continuous Time Domain. [Dissertation]
document_url: https://archiv.ub.uni-heidelberg.de/volltextserver/29234/1/Jens_Petersen-Learning_Distributions_of_Functions_on_a_Continuous_Time_Domain.pdf