Invertible Neural Network utilizing Glow

Issue Date

2020-07-01

Language

en

Abstract

Normalizing flows are rising in popularity, but a disadvantage is that information is often lost during the transformation between distributions. This work tackles that problem with marginalizing flows, which are added on top of a normalizing flow. A marginalizing flow augments the data with an auxiliary variable epsilon, which captures the information that would otherwise be thrown away, before the result is passed through the normalizing flow. Marginalizing over epsilon should then retain more information than using a normalizing flow alone, and this is what is tested here. The specific normalizing flow used is Glow, combined with a marginalizing flow that applies a padding of 4 to the right and bottom borders. The results indicate that the performance improvement offered by marginalizing flows is insignificant.
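
As a rough illustration of the augmentation step described above, the following PyTorch-style sketch pads a batch of images with a Gaussian variable eps in a 4-pixel border on the right and bottom, then estimates the marginal likelihood by averaging over sampled paddings. This is not the author's implementation: flow_log_prob is a hypothetical stand-in for a trained Glow model's log-density, and the importance-sampling estimator is only one simple way to marginalize over eps.

import math
import torch
import torch.nn.functional as F

PAD = 4  # border width added to the right and bottom, as stated in the abstract

def augment(x):
    """Pad a batch x of shape (N, C, H, W) with Gaussian noise eps in a
    PAD-wide border along the right and bottom edges."""
    x_pad = F.pad(x, (0, PAD, 0, PAD))                  # zero-filled border
    mask = F.pad(torch.ones_like(x), (0, PAD, 0, PAD))  # 1 on data, 0 on border
    eps = torch.randn_like(x_pad)
    x_aug = mask * x_pad + (1 - mask) * eps             # eps lives only in the border
    return x_aug, eps, mask

def marginal_log_prob(x, flow_log_prob, n_samples=8):
    """Importance-sampled estimate of the log of the integral of p(x, eps)
    over eps, using the standard normal that draws eps as the proposal."""
    estimates = []
    for _ in range(n_samples):
        x_aug, eps, mask = augment(x)
        # log q(eps) over the border elements only (standard normal density)
        log_q = (-(0.5 * eps ** 2 + 0.5 * math.log(2 * math.pi))
                 * (1 - mask)).flatten(1).sum(dim=1)
        estimates.append(flow_log_prob(x_aug) - log_q)  # per-sample (N,) terms
    return torch.logsumexp(torch.stack(estimates), dim=0) - math.log(n_samples)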

Faculty

Faculteit der Sociale Wetenschappen