Weight Pruning on FPGA #1294
              
  
Closed

Smitashree-code started this conversation in General

Replies: 0 comments
  
I applied weight pruning with the sparsity API and synthesized both the pruned and unpruned models with hls4ml, targeting a Xilinx FPGA. The pruned model used 11% fewer DSP48 slices, consistent with fewer multiply-accumulate (MAC) operations.
However, flip-flop (FF) and look-up table (LUT) usage dropped by less than 2%, and BRAM usage slightly increased.
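For context, the "sparsity API" here presumably refers to a magnitude-based pruning flow (such as the one in TensorFlow Model Optimization). The core idea it relies on can be sketched in plain NumPy: zero out the smallest-magnitude fraction of a weight tensor so that hls4ml can skip the corresponding multiplications. The function name and the 50% target below are illustrative, not taken from the post.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights
    (unstructured magnitude pruning). Illustrative sketch only."""
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)       # number of weights to zero
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep only larger-magnitude weights
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned = magnitude_prune(w, sparsity=0.5)
achieved = 1.0 - np.count_nonzero(pruned) / pruned.size
print(f"achieved sparsity: {achieved:.2f}")
```

Because hls4ml unrolls multiplications and can drop multiply-by-zero terms at synthesis time, this kind of sparsity maps fairly directly to DSP48 savings, whereas control logic, I/O buffering, and weight storage (FFs, LUTs, BRAM) are largely unaffected, which matches the numbers reported above.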