[Arm IoT Total Solutions](https://www.arm.com/solutions/iot/total-solutions-iot) provides complete solutions designed for specific use cases, leaving developers to focus on what really matters: innovation and differentiation. It has everything needed to simplify the design process and streamline product development, including hardware IP, software, real-time OS support, machine learning (ML) models, advanced tools such as the new Arm Virtual Hardware, application-specific reference code and support from the world's largest IoT ecosystem.
# Overview
This repo contains Arm's first [IoT Total Solution](https://www.arm.com/solutions/iot/total-solutions-iot), "Keyword Detection". It provides general-purpose compute and ML workload use-cases, including an ML-based keyword recognition example that leverages the DS-CNN model from the [Arm Model Zoo](https://github.com/ARM-software/ML-zoo).
The software supports multiple configurations of the Arm Corstone™-300 subsystem, incorporating the Cortex-M55 processor and Arm Ethos™-U55 microNPU. This total solution provides the complex, non-differentiated secure platform software on behalf of the ecosystem, thus enabling you to focus on your next killer app.
## Keyword detection application
The keyword detection application runs the DS-CNN model on top of [AWS FreeRTOS](https://docs.aws.amazon.com/freertos/). It detects keywords and informs the user of which keyword has been spotted. The audio data to process is injected at run time using the [Arm Virtual Hardware](https://www.arm.com/virtual-hardware) audio driver.
The Keyword application connects to [AWS IoT](https://docs.aws.amazon.com/iot/latest/developerguide/what-is-aws-iot.html) cloud to publish recognised keywords. AWS IoT cloud is also used for OTA firmware updates. These firmware updates are securely applied using [Trusted Firmware-M](https://tf-m-user-guide.trustedfirmware.org/). For more information, refer to the keyword detection [Readme](./kws/README.md).
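As a point of reference (not part of the application code), the AWS IoT data endpoint that the device configuration must point at can be retrieved with the AWS CLI, assuming it is installed and configured for your account:

```sh
# Print the account-specific AWS IoT data endpoint used for MQTT connections.
aws iot describe-endpoint --endpoint-type iot:Data-ATS
```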

## Blinky application
The blinky application demonstrates blinking LEDs using Arm Virtual Hardware. AWS FreeRTOS is already included in the application to kickstart new developments.
# Quick Start
...
#### Launching the instance in EC2 [(AWS on getting started)](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EC2_GetStarted.html)
1. Go to [EC2](https://console.aws.amazon.com/ec2/v2/) in the AWS Web Console.
1. Select **Launch Instances** which will take you to a wizard for launching the instance.
1. **Step 1: Choose an [Amazon Machine Image (AMI)](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AMIs.html)** - In the Search box, type `Arm Virtual Hardware` then find the item called "Arm Virtual Hardware" published by Arm, and press **Select** for that item. This image contains all the software necessary to build and run the Arm IoT Total Solutions.
This will raise a subscription page/pop-up titled **Arm Virtual Hardware**. You will note that the subscription is free from Arm, but AWS does charge for the costs of the instances themselves according to the pricing chart provided.
> Arm Virtual Hardware for Corstone-300 is available as a public beta on AWS Marketplace. To help you get started, AWS are offering more than 100 hours of free AWS EC2 CPU credits for the first 1,000 qualified users. Click here to find out more: https://www.arm.com/company/contact-us/virtual-hardware.
You must select Continue if you want to move forward.
1. **Step 2: Choose an Instance Type** - Select one of the instance types from the list. We recommend the **c5.large**. Keep in mind that there are charges that accrue while the instance is running.
From here you may select **Review and Launch** to move directly to the launch page or select **Next: Configure Instance Details** if you need to set any custom settings for this instance.
...
Whichever way you choose, find your new instance and select its instance ID to open the page to manage the instance.
#### Connecting to the instance:
1. Select **Connect** to open an SSH terminal session to the instance in your browser.
1. Ensure the User name field is set to `ubuntu`.
1. Select the **Connect** button to open the session. This will put you in a browser window where you will have an SSH terminal window ready for your input.
### Launch using a local terminal
1. Install [AWS CLI 2](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) on your machine.
2. [Configure](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html) the access key, secret key and region that AWS CLI will use. If your organization uses AWS Single Sign-On, the [configuration process](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-sso.html) is slightly different. Make sure the region selected matches the region of the SSO service.
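For reference, a minimal sketch of the non-SSO configuration; the values shown are placeholders, not real credentials:

```sh
# Interactive configuration of the default AWS CLI profile.
aws configure
# AWS Access Key ID [None]: <your access key id>
# AWS Secret Access Key [None]: <your secret access key>
# Default region name [None]: eu-west-1
# Default output format [None]: json
```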
3. Create a new key pair.
```sh
aws ec2 create-key-pair --key-name MyKeyPair
```
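Alternatively, this step and the next one can be combined: the AWS CLI's `--query` output filter can write the key material straight into the `.pem` file (a sketch using the same key name):

```sh
# Create the key pair and save the private key material directly to a file.
aws ec2 create-key-pair --key-name MyKeyPair \
  --query 'KeyMaterial' --output text > MyKeyPair.pem
```

The file permissions then still need to be set as described in the next step.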
4. When the AWS CLI displays the new key pair, save the key material in a `.pem` file. The file permissions must be set to `400`.
```sh
chmod 400 MyKeyPair.pem
```
...
```sh
./scripts/vht_cli.py -k MyKeyPair status
```
2. Connect to the instance using SSH and the private key saved locally.
```sh
ssh -i "MyKeyPair.pem" ubuntu@<instance ip address>
```
...
## Build and execute the application
To update the application, a set of scripts is included to set up the environment, build the applications, run them and test them. These scripts must be executed in the AVH AMI.
Open your favorite terminal program or Linux shell application and connect to the AVH AMI instance:
...
## Updating audio data
The audio data streamed into the Arm Virtual Hardware is read from the file `test.wav` located at the root of the repository. It can be replaced with another audio file with the following configuration:
- Format: PCM
- Audio channels: Mono
- Sample size: 16 bits
- Sample rate: 16 kHz
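An existing recording can be converted to this format with a tool such as `ffmpeg` (a sketch; `ffmpeg` is not part of this repository and `input.wav` is a placeholder for your own file):

```sh
# Convert to mono, 16-bit PCM sampled at 16 kHz and overwrite test.wav at the repository root.
ffmpeg -i input.wav -ac 1 -ar 16000 -c:a pcm_s16le test.wav
```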
# Continuous integration setup
Step by step instructions on how to setup a continuous integration pipeline in GitHub are provided in a mirror of this repository located at https://github.com/ARM-software/ATS-Keyword.
...
1. Go to [EC2](https://console.aws.amazon.com/ec2/v2/) in the AWS Web Console.
2. Select the instance to stop.
3. Click on `Instance state` and select `Stop Instance` in the drop down menu.
## Stopping the instance using a local terminal
```sh
...
```
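Alternatively, the instance can be stopped directly with the AWS CLI (a sketch; replace the placeholder with your instance ID):

```sh
# Stop (not terminate) the EC2 instance so it can be restarted later.
aws ec2 stop-instances --instance-ids <instance id>
```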
# Source code overview
- `bsp`: Arm Corstone™-300 subsystem platform code and AWS configurations.
- `lib`: Middleware used by the IoT Total Solution.
- `lib/mcuboot`: MCUboot bootloader.
- `lib/tf-m`: Trusted Firmware-M: Platform Security Architecture (PSA) for Armv8-M.
- `lib/mbedcrypto`: Mbed TLS and PSA cryptographic APIs.
- `lib/ml-embedded-evaluation-kit`: Arm® ML embedded evaluation kit for the Ethos NPU. It includes [TensorFlow](https://www.tensorflow.org/).
- `lib/VHT`: Virtual streaming solution for Arm Virtual Hardware.
- `blinky`: Blinky application.
- `blinky/main_ns.c`: Entry point of the blinky application.
- `kws`: Keyword detection application.
- `kws/source/main_ns.c`: Entry point of the kws application.
- `kws/source/blinky_task.c`: Blinky/UX thread of the application.
- `kws/source/ml_interface.c`: Interface between the virtual streaming solution and TensorFlow.
# ML Model Replacement
All the ML models supported by the ML Embedded Eval Kit are available to applications. The first step to use another model is to generate source files from its labels and `.tflite` model.
The available models are present in `./lib/ml-embedded-evaluation-kit/resources_downloaded`.
Pre-integrated source code is available from the `ML Embedded Eval Kit` and can be browsed from `./lib/ml-embedded-evaluation-kit/source/use_case`.
Integrating a new model means integrating its source code and requires updating the build files.
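For example, a new `.tflite` model is typically optimised for the Ethos-U55 with the Vela compiler before its source files are generated (a sketch; it assumes Vela is installed from the `ethos-u-vela` pip package and `my_model.tflite` is a placeholder for your own model):

```sh
# Install the Vela compiler (skip if it is already available in your environment).
pip install ethos-u-vela
# Optimise the model for an Ethos-U55 configuration; results are written to ./output by default.
vela my_model.tflite --accelerator-config ethos-u55-128
```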
# ML Embedded Eval Kit
The Arm ML Embedded Evaluation Kit is an open-source repository enabling users to quickly build and deploy embedded machine learning applications for the Arm Cortex-M55 CPU and Arm Ethos-U55 NPU.
With the ML Eval Kit you can run inferences either with a custom neural network on the Ethos-U microNPU or using the available ML applications such as [Image classification](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ml-embedded-evaluation-kit/+/c930ad9dc189d831ac77f716df288f70178d4c10/docs/use_cases/img_class.md), [Keyword spotting (KWS)](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ml-embedded-evaluation-kit/+/c930ad9dc189d831ac77f716df288f70178d4c10/docs/use_cases/kws.md), [Automated Speech Recognition (ASR)](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ml-embedded-evaluation-kit/+/c930ad9dc189d831ac77f716df288f70178d4c10/docs/use_cases/asr.md), [Anomaly Detection](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ml-embedded-evaluation-kit/+/c930ad9dc189d831ac77f716df288f70178d4c10/docs/use_cases/ad.md), and [Person Detection](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ml-embedded-evaluation-kit/+/HEAD/docs/use_cases/visual_wake_word.md), all using the Arm Fixed Virtual Platform (FVP) available in Arm Virtual Hardware.
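As an illustration, a built image can be launched on the Corstone-300 FVP from inside the AVH AMI along these lines (a sketch; the executable name `VHT_Corstone_SSE-300_Ethos-U55` and the image path are assumptions about the AMI and your build output, not commands documented by this repository):

```sh
# Run an application image on the Corstone-300 (Cortex-M55 + Ethos-U55) fixed virtual platform.
VHT_Corstone_SSE-300_Ethos-U55 -a <path to application image>
```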
# Known limitations
- Arm Compiler 6 is the only compiler supported.
- Accuracy of the ML detection running alongside cloud connectivity depends on the performance of the EC2 instance used. We recommend using at least a t3.medium instance.
- The Arm Corstone™-300 subsystem simulation is not time accurate. Performance will differ depending on the performance of the host machine.