
Conversation

@jm12138 (Contributor) commented May 14, 2023

PR types

Bug fixes

PR changes

Fix max_position_embeddings support in the LLaMA model.

Description

Fix max_position_embeddings support in the LLaMA model.
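
For reviewers unfamiliar with the parameter: max_position_embeddings is the config field that bounds how many positions the model's position-dependent caches (for example, the rotary embedding tables used by LLaMA) are built for. The sketch below is not this PR's diff; it only illustrates, against plain Paddle, the general pattern of sizing such a cache from the configured value instead of a hard-coded default. The RotaryEmbedding class, its signature, and the numbers are illustrative assumptions.

```python
import paddle
import paddle.nn as nn


class RotaryEmbedding(nn.Layer):
    """Illustrative rotary embedding whose cache length is taken from
    max_position_embeddings rather than a hard-coded constant."""

    def __init__(self, dim, max_position_embeddings=2048, base=10000):
        super().__init__()
        # Precompute cos/sin tables up to the configured maximum length.
        inv_freq = 1.0 / (base ** (paddle.arange(0, dim, 2, dtype="float32") / dim))
        t = paddle.arange(max_position_embeddings, dtype="float32")
        freqs = paddle.outer(t, inv_freq)             # [max_pos, dim // 2]
        emb = paddle.concat([freqs, freqs], axis=-1)  # [max_pos, dim]
        self.cos_cached = emb.cos()
        self.sin_cached = emb.sin()

    def forward(self, seq_len):
        # Slice the precomputed tables to the current sequence length.
        return self.cos_cached[:seq_len], self.sin_cached[:seq_len]


# A configured value larger than the 2048 default now actually takes effect.
rope = RotaryEmbedding(dim=128, max_position_embeddings=4096)
cos, sin = rope(seq_len=3000)
print(cos.shape)  # [3000, 128]
```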

@paddle-bot bot commented May 14, 2023

Thanks for your contribution!

@codecov bot commented May 14, 2023

Codecov Report

Merging #5914 (6e51284) into develop (9ed2d66) will increase coverage by 0.43%.
The diff coverage is 95.59%.

@@             Coverage Diff             @@
##           develop    #5914      +/-   ##
===========================================
+ Coverage    61.93%   62.37%   +0.43%     
===========================================
  Files          491      491              
  Lines        69136    69245     +109     
===========================================
+ Hits         42822    43190     +368     
+ Misses       26314    26055     -259     
Impacted Files                                      Coverage Δ
paddlenlp/transformers/convbert/tokenizer.py        100.00% <ø> (ø)
paddlenlp/transformers/convbert/modeling.py         85.62% <94.87%> (+64.85%) ⬆️
paddlenlp/transformers/__init__.py                  100.00% <100.00%> (ø)
paddlenlp/transformers/convbert/configuration.py    100.00% <100.00%> (ø)
paddlenlp/transformers/llama/modeling.py            72.13% <100.00%> (ø)

... and 1 file with indirect coverage changes

@ZHUI (Contributor) left a comment

LGTM

@ZHUI merged commit 28208c7 into PaddlePaddle:develop on May 15, 2023
