@@ -34,6 +31,12 @@ Note that the benchmarks run on an 8xA100-80GB, power limited to 330W with a hyb
 For more details about Mixtral 8x7B, please check [this page](./mixtral-moe) or this [note](https://thonking.substack.com/p/short-supporting-mixtral-in-gpt-fast).
+
+## Examples
+
+In the spirit of keeping the repo minimal, here are various examples of extensions you can make to gpt-fast as PRs.