Paper Title

Tiny Eats: Eating Detection on a Microcontroller

Paper Authors

Nyamukuru, Maria T., Odame, Kofi M.

Abstract

There is a growing interest in low-power, highly efficient wearable devices for automatic dietary monitoring (ADM) [1]. The success of deep neural networks in audio event classification problems makes them ideal for this task. Deep neural networks are, however, not only computationally intensive and energy inefficient but also require a large amount of memory. To address these challenges, we propose a shallow gated recurrent unit (GRU) architecture suitable for resource-constrained applications. This paper describes the implementation of the Tiny Eats GRU, a shallow GRU neural network, on a low-power microcontroller, the Arm Cortex M0+, to classify eating episodes. Tiny Eats GRU is a hybrid of the traditional GRU [2] and eGRU [3], making it small and fast enough to fit on the Arm Cortex M0+ while retaining accuracy comparable to the traditional GRU. The Tiny Eats GRU utilizes only 4% of the Arm Cortex M0+ memory and identifies eating or non-eating episodes with 6 ms latency and an accuracy of 95.15%.

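For readers unfamiliar with gated recurrent units, the sketch below shows one time step of a standard GRU cell in plain C, the structure that both the traditional GRU [2] and the eGRU [3] variant start from. This is not the authors' Tiny Eats GRU: the layer sizes (`IN_DIM`, `HID_DIM`), the float arithmetic, and the weight layout are illustrative assumptions, and the paper's hybrid further modifies the gates and activations to fit the Arm Cortex M0+.

```c
/* Minimal sketch of one standard GRU step in plain C.
 * Illustrative only: dimensions, weight layout, and float math are
 * assumptions, not the paper's Tiny Eats GRU implementation. */
#include <math.h>

#define IN_DIM  16   /* hypothetical input feature size */
#define HID_DIM  8   /* hypothetical hidden state size  */

typedef struct {
    /* Input weights W, recurrent weights U, and bias b for the
     * update gate z, reset gate r, and candidate state. */
    float Wz[HID_DIM][IN_DIM], Uz[HID_DIM][HID_DIM], bz[HID_DIM];
    float Wr[HID_DIM][IN_DIM], Ur[HID_DIM][HID_DIM], br[HID_DIM];
    float Wh[HID_DIM][IN_DIM], Uh[HID_DIM][HID_DIM], bh[HID_DIM];
} gru_params_t;

static float sigmoidf(float x) { return 1.0f / (1.0f + expf(-x)); }

/* One time step: reads input x[IN_DIM], updates hidden state h[HID_DIM] in place. */
void gru_step(const gru_params_t *p, const float x[IN_DIM], float h[HID_DIM])
{
    float z[HID_DIM], r[HID_DIM], hc[HID_DIM];

    /* Update and reset gates. */
    for (int i = 0; i < HID_DIM; i++) {
        float az = p->bz[i], ar = p->br[i];
        for (int j = 0; j < IN_DIM;  j++) { az += p->Wz[i][j] * x[j]; ar += p->Wr[i][j] * x[j]; }
        for (int j = 0; j < HID_DIM; j++) { az += p->Uz[i][j] * h[j]; ar += p->Ur[i][j] * h[j]; }
        z[i] = sigmoidf(az);   /* update gate */
        r[i] = sigmoidf(ar);   /* reset gate (simplified or dropped in variants such as eGRU [3]) */
    }

    /* Candidate state, computed from the reset-gated previous state. */
    for (int i = 0; i < HID_DIM; i++) {
        float ah = p->bh[i];
        for (int j = 0; j < IN_DIM;  j++) ah += p->Wh[i][j] * x[j];
        for (int j = 0; j < HID_DIM; j++) ah += p->Uh[i][j] * (r[j] * h[j]);
        hc[i] = tanhf(ah);
    }

    /* Interpolate between the old state and the candidate. */
    for (int i = 0; i < HID_DIM; i++)
        h[i] = (1.0f - z[i]) * h[i] + z[i] * hc[i];
}
```

Note that the Cortex M0+ has no floating-point unit, so a deployment like the one the paper describes would likely rely on fixed-point arithmetic and cheaper activation functions; the float version above is kept only for clarity.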