Questions on sub-ePDG depth and normalization #9
Comments
In my opinion, question 1 was insufficiently answered. Is the reason you didn't go deeper into the problem that the sampling step may contain a malicious sub-ePDG, because you trust the …
The question is not really "is this graph buggy"; the question is "is there a bug at this manifestation point?" So I don't think it matters if the bug is only in a subgraph of the graph in consideration: it's buggy either way.
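To make the subgraph relationship concrete, here is a minimal sketch of extracting a manifestation point's sub-ePDG by walking predecessor edges backwards, with an optional depth limit. The function name, signature, and adjacency representation are hypothetical illustrations of the idea being discussed, not the actual hector implementation.

```python
from collections import deque

def sub_epdg(predecessors, manifestation_point, depth_limit=None):
    """Collect the sub-ePDG of a manifestation point via backward BFS.

    `predecessors` maps each node to a list of its direct predecessors
    in the ePDG (a hypothetical adjacency representation). With
    depth_limit=None, every transitive predecessor is included.
    """
    seen = {manifestation_point}
    frontier = deque([(manifestation_point, 0)])
    while frontier:
        node, depth = frontier.popleft()
        # Stop expanding once the depth limit is reached.
        if depth_limit is not None and depth >= depth_limit:
            continue
        for pred in predecessors.get(node, []):
            if pred not in seen:
                seen.add(pred)
                frontier.append((pred, depth + 1))
    return seen
```

With no depth limit, the sub-ePDG of a downstream node fully contains the sub-ePDG of any of its predecessors, which is exactly the containment situation raised in question 1: a malicious sub-ePDG can be a subgraph of a larger one labelled benign.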
That's right. Anyway, the s2v captures the difference between them; I think that is an awesome point of ML.
Hi, thank you for the excellent paper and for open-sourcing the project. I am working on a new project that modifies and uses your code, and I have two questions.

1. There is a `depth_limit` option for controlling the depth of sub-ePDGs at runtime, but this option is unused in the `hector train` command line, which means that the sub-ePDG of a manifestation point always includes all of its predecessor nodes. There might be situations where a malicious sub-ePDG is a subgraph of another sub-ePDG labelled benign. Are these situations expected, or should we avoid them early in the data preprocessing step?

2. `model.py` has an orphan `BatchNorm` module left unused. Was there any reason behind that change? Or has it been replaced because the normalization performed by `hector feature_stats` achieves a similar goal?
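For contrast with a per-batch `BatchNorm`, here is a minimal sketch of dataset-level standardization of the kind a feature-statistics pass could precompute. The function names are hypothetical, and this is a guess at the intent, not the actual behavior of `hector feature_stats`.

```python
import numpy as np

def fit_feature_stats(features, eps=1e-8):
    """Compute per-feature mean and std once over the whole training set.

    Hypothetical sketch: analogous in spirit to a precomputed
    feature-statistics pass, not the actual hector implementation.
    """
    mean = features.mean(axis=0)
    std = features.std(axis=0)
    # Guard against zero variance so division below stays finite.
    return mean, np.maximum(std, eps)

def standardize(features, mean, std):
    """Apply the fixed statistics to a feature matrix.

    Unlike BatchNorm, the result does not depend on which examples
    happen to share a batch, and there is no train/eval discrepancy.
    """
    return (features - mean) / std
```

If the fixed statistics achieve a similar effect, that train/eval consistency could be one reason to drop the `BatchNorm` module, but that is speculation on my part.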