
Q: Incorrect neural network dimensionality? #1

@giladturok


I think the input dimension of the scale and translation networks $s$ and $t$ may be incorrect.

The original RealNVP paper states that the networks $s$ and $t$ map $R^d \rightarrow R^{D-d}$ for some $d < D$. In this code, the data (moons and normal) has dimension $D=2$. However, the code defines the input layers of networks $s$ and $t$ with dimension $2$, i.e. $d = D = 2$, instead of $d = 1 < D$. I believe this is a mistake that has gone undetected despite the popularity of this repo.

Screenshot from this code example:

[screenshot]

Screenshot from the paper:

[screenshot]
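For concreteness, here is a minimal sketch of the paper's convention for this dataset ($D = 2$, $d = 1$). This is hypothetical PyTorch code for illustration only, not this repo's actual implementation; the helper names and layer sizes are made up.

```python
import torch
import torch.nn as nn

D = 2  # data dimension (moons / normal samples)
d = 1  # coordinates that pass through the coupling layer unchanged

# Per the paper, s and t map R^d -> R^(D-d), so with D=2 and d=1
# their input layer should have 1 unit, not 2.
def make_net(in_dim=d, out_dim=D - d, hidden=64):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )

s, t = make_net(), make_net()

def coupling_forward(x):
    # x: (batch, D); x1 is the first d coordinates, x2 the remaining D-d
    x1, x2 = x[:, :d], x[:, d:]
    y1 = x1                                # identity on the first block
    y2 = x2 * torch.exp(s(x1)) + t(x1)     # affine transform conditioned on x1
    return torch.cat([y1, y2], dim=1)
```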

This mistake is easy to make because a later equation in the paper suggests that the input dimension of networks $s$ and $t$ should be $D$ rather than some $d < D$. In the screenshot below, the terms $s(b \cdot x)$ and $t(b \cdot x)$ are misleading: although $b \cdot x \in R^D$, we actually want to pass in only the non-masked elements of $x$, which lie in $R^d$.

[screenshot]
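For comparison, here is a sketch of the literal reading of that equation, where the full masked vector $b \cdot x$ is fed to the networks, so their input (and output) layers end up with dimension $D$. Again this is hypothetical illustrative code, not taken from this repo, but I suspect it is how a $D$-dimensional input layer arises.

```python
import torch
import torch.nn as nn

D = 2
b = torch.tensor([1., 0.])  # binary mask: 1 marks the pass-through coordinates

# Read literally, s and t receive b * x, which is still D-dimensional,
# so the input (and output) layers get D units even though the
# zeroed-out coordinates carry no information.
def make_net(hidden=64):
    return nn.Sequential(
        nn.Linear(D, hidden), nn.ReLU(),
        nn.Linear(hidden, D),
    )

s, t = make_net(), make_net()

def coupling_forward(x):
    bx = b * x  # masked-out coordinates are zeroed but still passed in
    # only the masked-out coordinates are actually transformed
    return bx + (1 - b) * (x * torch.exp(s(bx)) + t(bx))
```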

If I'm making a mistake, please let me know!
