Taking AI To The Edge For Smaller, Smarter, And More Secure Applications
AI continues to spark debate and demonstrate remarkable value for businesses and consumers. As with many emerging technologies, the spotlight often falls on large-scale, infrastructure-heavy, power-hungry applications. However, as the use of AI grows, large data centers are putting mounting pressure on the grid, making the most intensive applications less sustainable and less affordable.
As a result, demand is soaring for nimbler, product-centric AI solutions. Edge AI is leading this trend by bringing data processing closer to, or embedding it within, devices themselves, so that basic inference tasks can be performed locally. By not sending raw data off to the cloud via data centers, industrial and consumer AI applications gain significant security improvements, while devices become more performant and efficient at a fraction of the cost of cloud processing.
But, as with any new opportunity, there are fresh challenges. Product developers must now consider how to build the right infrastructure and the required expertise to capitalize on the potential of the edge.

The importance of local inference

Taking a step back, we can see that AI largely encompasses two fields: machine learning, where systems learn from data, and neural networks, models loosely inspired by the structure of the human brain. These are complementary ways to program machines, training them to perform a task by feeding them relevant data so that outputs are accurate and reliable. These workloads are typically carried out at huge scale, requiring comprehensive data center installations to function.
For smaller industrial use-cases and consumer applications – whether a smart toaster in your kitchen or an autonomous robot on a factory floor – it is not economically (or environmentally) feasible to push the data and analysis required for AI inference to the cloud.

Instead, with edge AI offering local inference, ultra-low latency, and smaller transmission loads, we can realize massive improvements in cost and power efficiency while building new AI applications. We are already seeing edge AI contribute to significant productivity improvements in smart buildings, asset tracking, and industrial applications. For example, industrial sensors can be accelerated with edge AI hardware for faster fault detection, as well as predictive maintenance capabilities that anticipate changes in a device's condition before a fault occurs.
Taking this further, the next generation of hardware products designed for edge AI will introduce specific adaptations so that AI sub-systems are part of the security architecture from the start. This is one area where embedding the edge AI capability within systems comes to the fore.

Embedding intelligence into the product

The next stage in the evolution of embedded systems is introducing edge AI into the device architecture, and from there into the "tiny edge". This refers to tiny, resource-constrained devices that run AI and ML models directly on the edge, including microcontrollers, low-power processors, and embedded sensors, enabling real-time data processing with minimal power consumption and low latency.
A new class of software and hardware is now emerging on the tiny edge, making it possible to execute AI operations within the device itself. By embedding this capability in the architecture from the start, the signal itself becomes the data, rather than wasting resources transforming it. For example, tiny edge sensors can gather data from a device's environment and leverage an in-chip engine to produce a result. In the case of solar farms, sensors within a solar panel can detect nearby arc faults across power management systems; when extreme voltages occur, the system can automatically trigger a shutdown failsafe and avoid an electrical fire. With applications such as arc fault detection, battery management, and on-device face or object recognition driving growth in this space, the market for microcontrollers capable of supporting AI on the tiny edge is expected to grow at a CAGR of over 100%, according to ABI Research. To realize this potential, more work is needed to bridge the gap between the processing capabilities of cloud-based AI and targeted applications on devices that are capable of working on, or being, the edge.

However, as with any new technology: where there is a demand, there is a way.
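The arc-fault failsafe described above can be sketched in a few lines. This is a purely illustrative toy, not an actual STMicroelectronics implementation: the threshold value, function names, and shutdown hook are all assumptions, and a real device would run logic like this in firmware on the microcontroller itself.

```python
# Hypothetical sketch of an on-device arc-fault failsafe: inspect a window of
# local voltage readings and trigger a shutdown when any reading exceeds a
# limit, without ever sending raw data to the cloud.

VOLTAGE_LIMIT = 600.0  # assumed threshold in volts; real limits are device-specific


def check_arc_fault(samples, limit=VOLTAGE_LIMIT):
    """Return True if any reading exceeds the limit, signalling a fault."""
    return any(abs(v) > limit for v in samples)


def trigger_shutdown():
    # Placeholder for the hardware power-cut routine on a real microcontroller.
    pass


def on_sample_window(samples):
    """Process one window of readings; cut power if a fault is detected."""
    if check_arc_fault(samples):
        trigger_shutdown()  # failsafe: cut power before an electrical fire
        return "SHUTDOWN"
    return "OK"
```

The point of the sketch is the architecture, not the threshold test: the decision is made in-chip, on the signal itself, so the only data that ever needs to leave the device is the verdict.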
We are already seeing meaningful R&D results focused on this challenge, and tiny AI is starting to become embedded in all kinds of systems – in some cases, consumers already take this technology for granted, literally talking to devices without thinking 'this is AI'.
Building edge AI infrastructure

To capitalize on this emerging opportunity, product developers must first consider the quality and type of data going into edge devices, as this determines the level of processing, and the software and hardware, required to handle the workload. This is the key difference between typical edge AI, which runs on more powerful hardware capable of handling complex algorithms and datasets, and tiny AI, which focuses on lightweight models that perform basic inference tasks.
For example, audio and visual information – especially visual – is extremely complex and needs a deep neural architecture to analyze. On the other hand, data from vibrations or electric current measurements recorded over time is far less demanding to process, so developers can use tiny AI algorithms for it within a resource-constrained, ultra-low-power, low-latency device.
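The contrast above can be made concrete with a sketch of the lightweight case. The pipeline below is an illustrative assumption, not a specific product's algorithm: it extracts one cheap feature (RMS energy) from a window of vibration samples and classifies it against a pre-computed threshold – exactly the kind of workload that fits a microcontroller, where a deep network would not.

```python
import math

# Illustrative tiny-AI style pipeline for vibration data: extract a cheap
# scalar feature from a window of samples, then classify it with a fixed
# threshold. The threshold value and sample data are assumptions.


def rms(window):
    """Root-mean-square energy of one window of vibration samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))


def classify(window, threshold=1.5):
    """Flag a window as anomalous when its RMS energy exceeds the threshold."""
    return "anomaly" if rms(window) > threshold else "normal"
```

A healthy machine producing small oscillations stays below the threshold, while a strongly vibrating one crosses it – a single multiply-accumulate loop and a comparison, rather than millions of neural network operations.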
It is important to consider the class of device and microcontroller unit needed at the development stage, based on the specific computational power requirements. In many cases, less is more: running a lighter, tiny AI model improves the power efficiency and battery life of a device. That said, whether dealing with text or audio-visual information, developers must still undertake pre-processing, feeding large quantities of sample data into learning algorithms to train the AI.
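As a minimal sketch of the pre-processing step just mentioned – and only a sketch, with the window size and scaling scheme chosen arbitrarily for illustration – raw sensor readings are typically split into fixed-size windows and normalised before being fed to a learning algorithm:

```python
# Minimal pre-processing sketch: split a raw 1-D signal into fixed-size
# windows and min-max normalise each window before training. Window size
# and the [0, 1] scaling are assumptions for illustration.


def make_windows(signal, size=4):
    """Split a 1-D signal into non-overlapping windows of `size` samples."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, size)]


def normalise(window):
    """Scale a window to the range [0, 1]; constant windows map to zeros."""
    lo, hi = min(window), max(window)
    if hi == lo:
        return [0.0] * len(window)
    return [(x - lo) / (hi - lo) for x in window]
```

Steps like these run identically on a workstation during training and on the device at inference time, which is what keeps the trained model's inputs consistent with what the deployed sensor actually produces.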
What's on the horizon?

The development of devices that embed AI into the tiny edge is still in its infancy, meaning there is scope for businesses to experiment, be creative, and figure out exactly what their success factors are. We are at the beginning of a massive wave that will accelerate digitalization in every aspect of our lives.
The use-cases are vast, from intelligent public infrastructure, such as the sensors required for smart, connected cities, to remote patient monitoring through non-invasive wearables in healthcare. Users can improve their lives and ease daily tasks without even realizing that AI is the key factor.
The demand is there: edge AI and tiny AI are already transforming product development, redefining what counts as a great piece of technology, and enabling more personalized predictive features, stronger security, and contextual awareness. In just a few years, this type of AI will become vital to the everyday utility of most technologies – without it, developers will quickly see their innovations become obsolete.
This is an important step forward, but it does not come without challenges, and overcoming them can only happen through a broader ecosystem of development tools and software resources. It is just a matter of time. The tiny edge is the linchpin through which society will unlock far greater control over, and usefulness from, its data and environment, leading to a smarter, AI-driven future.
The author is Marc Dupaquier, Managing Director, Artificial Intelligence Solutions, STMicroelectronics.
Disclaimer: The views expressed are solely of the author and ETCIO does not necessarily subscribe to it. ETCIO shall not be responsible for any damage caused to any person/organization directly or indirectly.
IMF: AI's Economic Benefits Will Likely Outweigh Its Emissions Costs
The International Monetary Fund (IMF) said that the benefits of artificial intelligence would boost global output by 0.5% per year between 2025 and 2030, outweighing the rising costs of the carbon emissions from the data centres required to run AI models.
In a report released at its annual spring meetings in Washington, the IMF noted that these output gains are not shared equally around the globe, and called on governments and businesses to minimize costs to society.
The report stated that despite challenges associated with higher electricity prices and greenhouse gas emissions, the economic gains from AI are likely to offset the costs of the additional emissions.
The report, "Power Hungry: How AI Will Drive Energy Demand", stated that "the social cost of the extra emissions is minor in comparison to the economic gains expected from AI. However, it still contributes to the alarming build-up of emissions."
The adoption of AI will drive a surge in demand for energy-intensive data processing power in the coming years, even as many countries are still struggling to meet their carbon emission reduction promises.
The IMF's report noted that in Northern Virginia, home to the world's largest concentration of data centers, the area dedicated to warehouses filled with servers is equivalent to eight Empire State Buildings.
AI-driven global energy needs are estimated to triple to 1,500 terawatt hours (TWh) by 2030 – about the same as India's current electricity consumption, and 1.5 times more than the expected demand from electric vehicles over the same period.
This rise in carbon emissions will partly depend on whether tech companies can deliver on their promises to reduce data centre emissions through greater use of renewable energy and other methods.
Could AI lead to energy efficiency gains?
According to the IMF, a strong uptake of AI under current energy policies would result in a cumulative global increase in greenhouse gas emissions of 1.7 gigatonnes (Gt) between 2025 and 2030. It estimated that greener energy policies could limit this increase to 1.3 Gt.
The cost of these extra emissions was put at between $50.7 billion and $66.3 billion – less than the income gains associated with the 0.5% annual boost to global GDP that AI is expected to bring.
Analysts say that the impact of AI on the economy and the environment will depend heavily on how it is used – including whether it leads to more efficient energy use or more sustainable consumption patterns.
The Grantham Research Institute on Climate Change and the Environment has stated that AI could reduce overall carbon dioxide emissions if it accelerates advances in low-carbon technology in the power, transportation, and food sectors.
Roberta Pierfederici, Grantham Policy Fellow, said: "Market forces alone will not be able to drive AI's application towards climate action."
She said that governments, tech companies, and energy companies all need to play a role in ensuring AI is used in an intentional, equitable, and sustainable way, and cited the importance of R&D funding and policy in combating inequalities caused by AI advancements.

(Writing and reporting by Mark John in London; editing by Ros Russell)
(source: Reuters)
DIEZ Reaffirms Impact Of Its AI Strategy With Over 700 AI Specialized Companies Operating Across Its Economic Zones