The No Code TinyML Book

Machine learning models now permeate nearly every aspect of our lives. You use them throughout the day, more often than you may realize.

Machine learning models are used in daily tasks, including surfing social media, snapping pictures, and checking the weather.

We know how computationally expensive it is to train these models. What is less appreciated is that running inference on them is frequently costly as well.

We need compute platforms that can keep up with the rate at which machine learning is being used. That is why most of these models run in massive data centers on clusters of CPUs and GPUs (and in some cases TPUs).

Bigger is not necessarily better.

When you capture an image, machine learning should run instantly. You don’t want to wait while the image is transmitted to a data center, processed, and sent back.

Your machine learning model should be able to operate locally.

Your devices should respond the moment you say “Alexa” or “Ok, Google,” not after they have sent your voice to remote servers for processing and information retrieval.

That round trip takes time, and the user experience suffers. Here, too, running the machine learning model locally is preferable.

What Is TinyML?

TinyML is a field at the intersection of machine learning and embedded systems that studies models that can run on small, low-power devices like microcontrollers. It enables low-latency, low-power, and low-bandwidth model inference on edge devices.
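
To make that concrete, here is a minimal sketch of on-device inference using TensorFlow Lite for Microcontrollers, one common TinyML runtime (used here as an illustration, not something prescribed by this article). The model array `g_model`, the op list, and the arena size are placeholder assumptions you would adapt to your own exported .tflite model, and exact API signatures can vary between library versions.

```cpp
// Minimal sketch of TinyML inference with TensorFlow Lite for Microcontrollers.
// Assumes a small classifier exported as a C array named g_model.
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model[];       // model exported as a C array (assumed)

constexpr int kArenaSize = 10 * 1024;       // working memory for tensors (placeholder size)
alignas(16) static uint8_t tensor_arena[kArenaSize];

void RunInference(const float* features, int n_features) {
  const tflite::Model* model = tflite::GetModel(g_model);

  // Register only the ops this (hypothetical) model needs, to keep the binary small.
  static tflite::MicroMutableOpResolver<2> resolver;
  resolver.AddFullyConnected();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter interpreter(model, resolver,
                                              tensor_arena, kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) {
    return;  // arena too small or model/ops mismatch
  }

  // Copy the input features into the model's input tensor and run inference locally.
  TfLiteTensor* input = interpreter.input(0);
  for (int i = 0; i < n_features; ++i) {
    input->data.f[i] = features[i];
  }
  interpreter.Invoke();

  // output->data.f now holds the class scores, ready to act on without any network round trip.
  TfLiteTensor* output = interpreter.output(0);
  (void)output;
}
```

Everything happens on the microcontroller itself: the model weights, the working memory, and the inference loop fit in a few kilobytes, which is what makes the low-latency, low-bandwidth behavior described above possible.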

A typical consumer microcontroller draws power on the order of milliwatts or microwatts, whereas a conventional consumer CPU consumes 65 to 85 watts and a consumer GPU 200 to 500 watts. That is on the order of a thousand times less power, and often far less.

Thanks to this low power consumption, TinyML devices can run ML applications at the edge while operating unplugged for weeks, months, and in some cases even years.

The Future of Machine Learning Is Tiny and Bright!

Because TinyML is a new field, there are not yet many learning resources available. A few excellent ones exist, though, such as Rohit Sharma’s “No Code TinyML Book.” The book goes deeper into the technology than most introductions while staying approachable for readers with different backgrounds, including students, enthusiasts, managers, market researchers, and developers.

It introduces a no-code and low-code TinyML platform for building production-ready solutions, such as predictive maintenance, American Sign Language recognition, audio wake words, and visual wake words.

Conclusion

Microcontrollers are ubiquitous and generate a ton of data. With TinyML, we can use that data to improve our products.

More than 250 billion microcontrollers are already in use, and that number keeps growing. That scale will keep driving prices down (duh, economics!).

Microcontrollers that support machine learning will unlock even more of that potential.
