[Inference] Make inference_model on by default. #9879
base: develop
Conversation
Thanks for your contribution!
@@ -1361,6 +1361,13 @@ def create_predictor(
     paddle.set_device(predictor_args.device)
     paddle.set_default_dtype(predictor_args.dtype)
 
+    if not is_paddlenlp_ops_available():
+        if predictor_args.inference_model:
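The hunk is truncated above (the header says seven lines were added). A plausible continuation, assuming the PR warns and falls back to the dynamic-graph predictor when the fused `paddlenlp_ops` kernels are not installed; the helper name, warning text, and the stubbed availability check below are illustrative assumptions, not code from the PR:

```python
import logging

logger = logging.getLogger(__name__)


def is_paddlenlp_ops_available() -> bool:
    # Stand-in for the helper referenced in the diff above; the real check
    # lives elsewhere in the PaddleNLP codebase.
    try:
        import paddlenlp_ops  # noqa: F401
        return True
    except ImportError:
        return False


def _maybe_disable_inference_model(predictor_args):
    # Sketch only: if the fused custom ops are missing, disable the
    # high-performance inference_model path and warn, rather than failing.
    if not is_paddlenlp_ops_available() and predictor_args.inference_model:
        logger.warning(
            "paddlenlp_ops is not installed; disabling inference_model and "
            "falling back to the dynamic-graph predictor."
        )
        predictor_args.inference_model = False
    return predictor_args
```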
Shouldn't this also check that the model is on the whitelist, e.g. Llama/Qwen/Deepseek?
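The commit that addressed this is referenced below but not shown here. A hypothetical sketch of the whitelist check the reviewer is asking for; the set contents and the helper name are illustrative, not taken from the PR:

```python
# Architectures assumed to have a fused inference_model implementation.
INFERENCE_MODEL_WHITELIST = {"llama", "qwen2", "deepseek_v2"}


def supports_inference_model(config) -> bool:
    # model_type is the usual architecture tag on a transformers-style config;
    # using it as the whitelist key is an assumption in this sketch.
    return getattr(config, "model_type", "").lower() in INFERENCE_MODEL_WHITELIST
```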
Updated in 8fcf282.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff            @@
##           develop    #9879   +/-   ##
========================================
  Coverage    51.66%   51.66%
========================================
  Files          739      739
  Lines       117426   117426
========================================
+ Hits         60668    60670       +2
+ Misses       56758    56756       -2

☔ View full report in Codecov by Sentry.
Before submitting
Add test cases into the tests folder. If there are codecov issues, please add test cases first.
PR types
Others
PR changes
Others
Description
Make inference_model on by default.
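The net effect of the change can be summarized with a small sketch; the argument-class shape and field names below are simplified stand-ins assumed from context, not copied from the PR:

```python
from dataclasses import dataclass


@dataclass
class PredictorArgument:
    # Before this PR, inference_model presumably defaulted to False and had to
    # be enabled explicitly; this PR flips the default to True. Users can still
    # pass inference_model=False, and create_predictor falls back automatically
    # when paddlenlp_ops is not installed, per the diff above.
    inference_model: bool = True
    device: str = "gpu"
    dtype: str = "float16"
```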