Artificial intelligence (AI) has traditionally lived in the cloud, because AI algorithms crunch large amounts of data and consume massive computing resources. But AI doesn't live only in the cloud. In many situations, AI-based data crunching and decisions need to be made locally, on devices close to the edge of the network.
Edge AI allows mission-critical and time-sensitive decisions to be made faster, more reliably, and with greater security. The rush of AI to the edge is being fueled by the rapid growth of smart devices at the edge of the network: sensors in smartphones, smartwatches, machines, and infrastructure. Earlier this month, Apple spent a reported $200 million to acquire Xnor.ai, a Seattle-based AI startup focused on low-power machine learning software and hardware. Microsoft offers a comprehensive toolkit called Azure IoT Edge that allows AI workloads to be moved to the edge of the network.
Will AI continue to move to the edge? It is useful to look back at how the pendulum has swung between centralized and distributed intelligence over four paradigms of computing.
Centralization and decentralization
Since the dawn of computing, one of the perennial design challenges has been where intelligence should live in a network. As I described in a 2001 Harvard Business Review article, we have repeatedly seen an "intelligence migration" between centralized and decentralized intelligence.
The first era of computing was the mainframe, with intelligence concentrated in a massive central computer that held all of the computing power. At the other end of the network sat a terminal consisting of essentially a green screen and a keyboard, with so little intelligence of its own that it was called a "dumb terminal".
The second era of computing upended the mainframe paradigm with the desktop or personal computer (PC). The PC held all of its storage and computing intelligence locally and did not even need to be connected to a network. This decentralization of intelligence drove the rise of Microsoft and Intel, whose vision of democratizing computing put a PC on every desk and in every home.
The third era of computing, called client-server computing, was a compromise between the two extremes. Large servers did the heavy lifting on the back end, while "front-end intelligence" resided in client hardware and software on the network.
The fourth era of computing is the cloud computing paradigm, pioneered by companies like Amazon with Amazon Web Services, Salesforce.com, and Microsoft with its Azure cloud platform. The cloud offers tremendous computing power and very cost-effective data storage. It made sense to place AI applications in the cloud, since the computing power consumed by AI algorithms increased 300,000-fold between 2012 and 2019, doubling every three and a half months.
However, cloud-based AI has its problems. For one, it suffers from latency: data must travel to the cloud for processing, and the results must be sent back over the network to the local device. In many situations, latency can have serious consequences. If a sensor in a chemical plant predicts an explosion, the plant must be shut down immediately. Surveillance cameras at airports and factories must detect intruders and respond instantly. An autonomous vehicle cannot afford even a tenth of a second of delay in activating the emergency brakes when its AI algorithm predicts an impending collision. In such situations, AI must be at the edge, where decisions can be made faster without relying on network connectivity or moving large volumes of data back and forth over the network.
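The edge argument above can be sketched as a tiny local decision loop. The sensor and model below are hypothetical stand-ins, not any real device API; the point is that the critical path contains no network round trip.

```python
import time

# A minimal sketch of an on-device (edge) decision loop.
# read_sensor and local_model_predict are illustrative placeholders.

def read_sensor():
    # In a real plant this would poll the hardware.
    return {"pressure": 9.8}

def local_model_predict(reading):
    # A trained model deployed on-device; a simple threshold stands in here.
    return reading["pressure"] > 9.0  # True means imminent failure

start = time.monotonic()
danger = local_model_predict(read_sensor())
elapsed = time.monotonic() - start  # microseconds, not a cloud round trip

if danger:
    print("trigger emergency shutdown")  # acts immediately, offline if need be
```

A cloud-based version of the same loop would add the network's round-trip time to `elapsed` on every single decision, which is exactly the latency the text describes.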
Just as the pendulum swung from mainframe computing to desktop computing 40 years ago, it is now swinging from centralized intelligence in the cloud to decentralized intelligence at the edge.
But as we learned with the PC, life at the edge isn't easy. The computing power that can be packed into a camera, a sensor, or a smartphone is limited. In addition, many devices at the edge of the network are not connected to a power source, which raises problems of battery life and heat dissipation. Companies such as Tesla, ARM, and Intel are tackling these challenges by developing more efficient processors and leaner algorithms that consume less power.
Still, there are times when AI is better off in the cloud. When decisions require tremendous computing power and do not need to be made in real time, AI should stay in the cloud. For example, when AI is used to interpret an MRI scan or analyze geospatial data collected by a drone over a farm, the full power of the cloud can be exploited even if the results take minutes or hours.
Training and inference
One way to determine where AI should live is to understand the difference between training and inference in AI algorithms. Creating and training an AI algorithm requires enormous amounts of data and computing power: to teach an autonomous vehicle to detect pedestrians and traffic lights, millions of images must be fed into the algorithm. Once the algorithm is trained, however, it can perform "inference" locally, looking at a single object and determining whether it is a pedestrian. In inference mode, the algorithm leverages its training to make far less computationally intensive decisions at the edge of the network.
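The training/inference split can be shown in miniature. The threshold "model" below is a deliberately toy stand-in for a real neural network, but the division of labor is the same: training scans the whole dataset once, while each inference touches only the learned parameter.

```python
# Cloud side: "training" scans the full dataset to learn a decision threshold.
# (A toy stand-in for fitting a real model on millions of images.)
def train(samples):
    pos = [x for x, label in samples if label == 1]
    neg = [x for x, label in samples if label == 0]
    return (min(pos) + max(neg)) / 2  # the learned parameter

# Edge side: inference needs only the learned parameter, one cheap comparison.
def infer(threshold, x):
    return 1 if x >= threshold else 0

data = [(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)]  # (feature, label) pairs
threshold = train(data)        # heavy step, done once in the cloud
print(infer(threshold, 0.85))  # light step, repeated on-device → 1
```

In practice the trained model's weights are serialized and shipped to the device, after which each prediction is a single inexpensive forward pass.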
AI in the cloud works in synergy with AI at the edge. Consider an AI-powered car like a Tesla. At the edge, onboard AI makes countless real-time decisions, such as braking, steering, and lane changes. At night, when the car is parked and connected to a Wi-Fi network, its data is uploaded to the cloud to further train the algorithm. The smarter algorithm can then be downloaded to the vehicle from the cloud, a virtuous cycle that Tesla has repeated hundreds of times through cloud-based software updates.
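That virtuous cycle can be sketched as a simple loop. All names here are illustrative, not Tesla's actual pipeline, and the threshold classifier again stands in for a real model: drive at the edge, log hard examples, retrain in the cloud overnight, download the updated model.

```python
# A toy sketch of the edge-cloud training loop described above.

cloud_dataset = [(0.1, 0), (0.9, 1)]  # (feature, label) pairs held in the cloud

def retrain(dataset):
    # Cloud: refit the model on the growing dataset (threshold stand-in).
    pos = [x for x, y in dataset if y == 1]
    neg = [x for x, y in dataset if y == 0]
    return (min(pos) + max(neg)) / 2

def drive(model, observations):
    # Edge: real-time inference; log examples the model got wrong.
    # (In reality, labels would come from later annotation or other signals.)
    hard_examples = []
    for x, label in observations:
        decision = 1 if x >= model else 0
        if decision != label:
            hard_examples.append((x, label))
    return hard_examples

model = retrain(cloud_dataset)                 # initial over-the-air model
day_log = drive(model, [(0.4, 1), (0.8, 1)])   # a day of edge decisions

# Overnight on Wi-Fi: upload the log, retrain, download the smarter model.
cloud_dataset.extend(day_log)
model = retrain(cloud_dataset)
```

Each pass through the loop shifts the model using data the edge could never have learned from on its own, which is why the two locations complement rather than replace each other.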
Embracing the wisdom of "and"
Just as there are good reasons to put AI at the edge, there are good reasons to keep AI in the cloud. The answer is not either/or; it is "and". Intelligence will live wherever it needs to live, and AI will be located wherever it is needed. AI is evolving into ambient intelligence: distributed, ubiquitous, and connected. In this vision of the future, intelligence at the edge complements intelligence in the cloud, balancing the need for centralized computing with localized decision-making.