
Question about using PReLU activation after convolutional layer #19

@mhy9989

Description


Your work is outstanding! But I still have a question. In the kan_conv code of this repository, I noticed that an extra PReLU layer is added after the convolution and normalization layers. The original KAN layer usually does not need an extra activation layer, since the learnable spline functions already act as nonlinearities. So for kan_conv, is there a rationale for adding the PReLU layer, or does it empirically improve kan_conv?
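For clarity, here is a minimal sketch of the block structure being asked about: a convolution, then normalization, then an extra PReLU. This is a hypothetical stand-in, not the repository's code; `nn.Conv2d` is used in place of the actual KAN convolution, and the layer names (`ConvNormPReLU`, `conv`, `norm`, `act`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ConvNormPReLU(nn.Module):
    """Hypothetical sketch of the pattern described in the question:
    convolution -> normalization -> extra PReLU activation.
    nn.Conv2d stands in for the repo's KAN convolution layer."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.norm = nn.BatchNorm2d(out_ch)
        self.act = nn.PReLU()  # the extra activation the question asks about

    def forward(self, x):
        return self.act(self.norm(self.conv(x)))

x = torch.randn(2, 3, 8, 8)
y = ConvNormPReLU(3, 16)(x)
print(y.shape)  # torch.Size([2, 16, 8, 8])
```

In a standard KAN layer, each edge already applies a learnable univariate function, which is why a separate activation is normally redundant; the question is whether the PReLU here is a deliberate design choice or an empirical tweak.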
