After fine-tuning a base model (e.g. LLaMA-7B) with the PEFT library, you obtain a LoRA adapter module. The base model and the LoRA weights must be merged before you have the complete fine-tuned model.

```python
# Copyright 2023 Rohan Taori, Ishaan Gulrajani, Tianyi Zhang, Yann Dubois, Xuechen Li
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from typing import Optional

from peft import PeftModel

import fire
import torch
import tqdm
import transformers


@torch.inference_mode()
def merge(
    path_zhixi,
    path_lora,
    p
```
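
The snippet above breaks off inside the function signature, so the body of `merge` is not shown. Below is a minimal sketch of how such a merge is commonly done with PEFT: load the base model, attach the LoRA adapter with `PeftModel.from_pretrained`, fold the adapter into the base weights with `merge_and_unload()`, and save the result. The third parameter name `path_output` and the file name `merge_lora.py` are assumptions for illustration, not the original script's names, and the actual merge logic in the truncated source may differ.

```python
from peft import PeftModel

import fire
import torch
import transformers


@torch.inference_mode()
def merge(
    path_zhixi,   # directory of the base model (e.g. LLaMA-7B)
    path_lora,    # directory of the LoRA adapter produced by PEFT
    path_output,  # hypothetical: directory to save the merged model into
):
    # Load the base model in half precision to keep memory usage reasonable.
    base_model = transformers.AutoModelForCausalLM.from_pretrained(
        path_zhixi, torch_dtype=torch.float16
    )
    tokenizer = transformers.AutoTokenizer.from_pretrained(path_zhixi)

    # Attach the LoRA adapter on top of the frozen base weights.
    lora_model = PeftModel.from_pretrained(base_model, path_lora)

    # Fold the LoRA deltas into the base weights and drop the adapter
    # wrappers, leaving a plain transformers model.
    merged_model = lora_model.merge_and_unload()

    # Save the merged weights and tokenizer as a standalone checkpoint.
    merged_model.save_pretrained(path_output)
    tokenizer.save_pretrained(path_output)


if __name__ == "__main__":
    fire.Fire(merge)
```

With `fire`, the sketch would be invoked from the command line along the lines of `python merge_lora.py --path_zhixi ./llama-7b --path_lora ./lora-adapter --path_output ./merged-model`; the resulting directory can then be loaded directly with `transformers` without PEFT.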
