Describe the problem the feature is intended to solve
Recently I found that building TensorFlow Serving with tcmalloc and setting a soft memory limit can mitigate memory issues like #2142 and #1664. I also got slightly better performance.
Here's what I did:
1. Built the tensorflow_model_server cc_binary with tcmalloc via the Bazel malloc attribute.
2. Launched a background thread before the server starts using tcmalloc::MallocExtension::ProcessBackgroundActions (reference).
3. Set a soft limit and added a tcmalloc_soft_limit argument for it (a sketch of steps 2 and 3 follows below).
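For step 1, the cc_binary rule can point its Bazel malloc attribute at the tcmalloc library (e.g. malloc = "@com_google_tcmalloc//tcmalloc", assuming the tcmalloc repository is set up in the workspace). For steps 2 and 3, here is a minimal C++ sketch of what I mean; the helper names and the --tcmalloc_soft_limit flag are just illustrative, and the exact MallocExtension signatures depend on the tcmalloc version:

```cpp
// Sketch only: StartTcmallocBackgroundThread, ApplyTcmallocSoftLimit, and the
// --tcmalloc_soft_limit flag are hypothetical names; the tcmalloc calls come
// from tcmalloc/malloc_extension.h.
#include <cstddef>
#include <thread>

#include "tcmalloc/malloc_extension.h"

// Run tcmalloc's periodic background work (e.g. releasing free memory back to
// the OS) on a dedicated thread. ProcessBackgroundActions loops for the
// lifetime of the process, so the thread is detached.
void StartTcmallocBackgroundThread() {
  std::thread([] { tcmalloc::MallocExtension::ProcessBackgroundActions(); })
      .detach();
}

// Apply a soft memory limit, e.g. taken from a --tcmalloc_soft_limit flag.
// Note: the SetMemoryLimit signature differs across tcmalloc releases; newer
// ones use the LimitKind enum shown here, older ones take a MemoryLimit
// struct with a `hard` flag instead.
void ApplyTcmallocSoftLimit(size_t limit_bytes) {
  tcmalloc::MallocExtension::SetMemoryLimit(
      limit_bytes, tcmalloc::MallocExtension::LimitKind::kSoft);
}
```

Both helpers would be called during server startup, before serving traffic, so the background thread and the limit are in place for the whole process lifetime.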
Describe the solution
How about providing a TensorFlow Serving build compiled with tcmalloc? I know I could use jemalloc instead, but jemalloc exposes many configuration options, while tcmalloc needs very little tuning. It is easy to use and performs well.