700% higher concurrency, 50% memory savings, 10× faster startup, and 90% smaller packaging; it also supports Java 8 through Java 25 and a native runtime.
Abstract: Artificial neural networks have led to a higher computational burden, complicating inference tasks on low-power edge devices. Spiking neural networks (SNNs), which leverage sparse spikes for ...