canceling while hotswap loading occurs can leave model in invalid state #43

@anotherjesse

Description

The instance can think it has loaded a fine-tune even though it hasn't.

  • assume you have 2 fine-tunes
  • they are run by passing a URL to the fine-tuned LoRA weights via replicate_weights

Because load_trained_weights sets self.tuned_weights to the passed-in URL before actually loading the weights, a prediction that is canceled while the download (or any subsequent step) is still in progress leaves the model in an invalid state: self.tuned_weights claims the new weights are loaded, so the next request with the same URL hits the skip path and runs with the wrong weights.

    def load_trained_weights(self, weights, pipe):
        from no_init import no_init_or_tensor

        # weights can be a URLPath, which behaves in unexpected ways
        weights = str(weights)
        if self.tuned_weights == weights:
            print("skipping loading .. weights already loaded")
            return

        self.tuned_weights = weights
        
        ### SNIP - now we actually load the weights ###

We need to ensure that load_trained_weights leaves the model in a recoverable/correct state even if it is canceled mid-prediction.
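One possible shape of the fix, sketched below: only record self.tuned_weights after the load actually succeeds, and clear it before starting, so a cancellation or error mid-download cannot leave stale bookkeeping behind. This is a minimal sketch, not the repo's actual code; the Predictor class shape and the _download_and_load helper are assumptions standing in for the snipped download/load logic.

```python
class Predictor:
    def __init__(self):
        self.tuned_weights = None

    def load_trained_weights(self, weights, pipe=None):
        # weights can be a URLPath, which behaves in unexpected ways
        weights = str(weights)
        if self.tuned_weights == weights:
            print("skipping loading .. weights already loaded")
            return

        # Invalidate the cached state *before* starting, so that a
        # cancellation or exception mid-load forces a reload next time.
        self.tuned_weights = None

        # Hypothetical stand-in for the snipped download + LoRA load;
        # this is the part that may raise or be canceled.
        self._download_and_load(weights, pipe)

        # Only record success once the load has fully completed.
        self.tuned_weights = weights

    def _download_and_load(self, weights, pipe):
        # placeholder: download the weights and apply them to the pipe
        pass
```

With this ordering, a canceled load leaves self.tuned_weights as None, so the skip-path check can never match a URL whose weights were not fully loaded.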
