Confusing logic for reset_parameters() in Embedding and Linear layer #194

@aeryskyB

Description

Problem 1

[Screenshot: reset_parameters() code]
Source: LoRA/loralib/layers.py
Issue: lora_B (instead of lora_A) is sampled from a normal distribution.

Problem 2

[Screenshot: initialization comment in reset_parameters()]
Source: LoRA/loralib/layers.py
Issue: The comment is misleading, because lora_A is supposed to get Kaiming initialization, whereas lora_B is supposed to be initialized to zero.
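For reference, here is a minimal sketch of the initialization the LoRA paper intends, where lora_A gets a random (Kaiming) init and lora_B starts at zero so the low-rank update B @ A is initially a no-op. This is an illustrative layer, not the library's actual code; the class name and helper method are made up for the example.

```python
import math

import torch
import torch.nn as nn


class LoRALinear(nn.Linear):
    """Illustrative Linear layer with LoRA factors, showing the intended init."""

    def __init__(self, in_features, out_features, r=4, **kwargs):
        super().__init__(in_features, out_features, **kwargs)
        # Low-rank factors: update is lora_B @ lora_A (out x r) @ (r x in).
        self.lora_A = nn.Parameter(torch.empty(r, in_features))
        self.lora_B = nn.Parameter(torch.empty(out_features, r))
        self.reset_lora_parameters()

    def reset_lora_parameters(self):
        # lora_A: Kaiming-uniform, same scheme nn.Linear uses for its weight.
        # lora_B: zeros, so lora_B @ lora_A == 0 at the start of training.
        nn.init.kaiming_uniform_(self.lora_A, a=math.sqrt(5))
        nn.init.zeros_(self.lora_B)


layer = LoRALinear(8, 16, r=4)
assert torch.all(layer.lora_B == 0)   # update starts as a no-op
assert torch.any(layer.lora_A != 0)   # A carries the random init
```

Zeroing exactly one of the two factors is what guarantees the adapted model starts identical to the pretrained one; swapping which factor is zeroed (as Problem 1 describes for the Embedding layer) still yields a zero initial update, but it then contradicts the comment and the convention used in the Linear layer.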
