Hey everyone,
Thanks for the previous suggestions on tackling the inference timeout issue in my vibration anomaly detection project. I implemented quantization to optimize the model, but now I’m encountering a new error:
**Error Message:**
Quantization Error: Unsupported Layer Type in INT8 Conversion - Layer 5 (Depthwise Conv2D)
It seems like the quantization process is failing specifically at Layer 5, which uses a Depthwise Conv2D operation.
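I confirmed the layer type by listing the layers, roughly like this (Keras-style model assumed; the file name below is just a placeholder):

```python
import tensorflow as tf

# Placeholder file name; load the actual trained model here.
model = tf.keras.models.load_model("vibration_anomaly_model.h5")

# Print each layer's index and type to see which index maps to the
# Depthwise Conv2D that the converter is complaining about.
for i, layer in enumerate(model.layers):
    print(i, layer.__class__.__name__)
```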
What's the best approach to handle layers that aren't compatible with INT8 quantization? Should I consider retraining with a different architecture, or is there a workaround to manually adjust these layers?
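One workaround I've read about is letting the converter keep unsupported ops in float instead of forcing full-integer conversion. Something like this is what I have in mind, assuming the TensorFlow Lite converter (the model path, input shape, and calibration generator below are just placeholders):

```python
import numpy as np
import tensorflow as tf

# Placeholder model; substitute the real trained network.
model = tf.keras.models.load_model("vibration_anomaly_model.h5")

def representative_data_gen():
    # Placeholder calibration data; should yield real vibration windows
    # shaped like the model's input.
    for _ in range(100):
        yield [np.random.rand(1, 128, 128, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Allow float kernels for any op the INT8 path can't handle,
# rather than forcing every layer (including the Depthwise Conv2D) to INT8.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS_INT8,  # quantized kernels where possible
    tf.lite.OpsSet.TFLITE_BUILTINS,       # float32 fallback for the rest
]
tflite_model = converter.convert()

with open("vibration_anomaly_mixed.tflite", "wb") as f:
    f.write(tflite_model)
```

Is that kind of mixed INT8/float32 conversion a reasonable compromise on a microcontroller, or does the float fallback usually give back most of the latency savings I was after?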
Thanks in advance for your help!