Artificial Intelligence (AI) solutions, particularly those based on deep learning in areas such as computer vision, typically run in cloud-based environments that require heavy computing capacity.
Inference is less compute-intensive than training, but latency matters more, since a deployed model must deliver real-time results. Most inference is still performed in the cloud or on a server, but as the diversity of AI applications grows, the centralized training-and-inference paradigm is coming into question.
It is possible, and becoming easier, to run AI and machine learning with analytics at the Edge today, depending on the size and scale of the Edge site and the particular system being used. While Edge computing systems are much smaller than those found in central data centers, they have matured and now successfully run a surprising range of workloads, thanks to immense growth in the processing power of today's x86 commodity servers. Read More
NVIDIA, Amazon Web Services (AWS) Partner on AI, IoT
SAN JOSE, Calif., March 18, 2019 (GLOBE NEWSWIRE) — GPU Technology Conference—NVIDIA today announced a collaboration with Amazon Web Services (AWS) IoT on NVIDIA® Jetson™ to enable customers to deploy AI and deep learning to millions of connected devices.
This joint solution enables models to be easily created, trained and optimized on AWS, then deployed to Jetson-powered edge devices using AWS IoT Greengrass.
The NVIDIA Jetson platform offers AI at the edge with high-performance and power-efficient computing. Applications include autonomous machines and smart cameras for industries such as retail, manufacturing, agriculture and more. Read More
GPUs usher in a new era in government analytics
Decades ago, graphics processing units were used mostly for rendering ninja fighters and Formula One racecars. Since their days in game systems, however, GPUs have experienced an amazing evolution in processing power. Today they sit at the very center of enterprise computing — and in doing so, are ushering in a new era of capability and insight for government agencies.
The unique ability of these more advanced GPUs to handle artificial intelligence, machine learning and other high-performance tasks has made them a vital component of digital transformation. Many agency IT departments are discovering the power of GPUs to make better use of massive amounts of data. With data volumes growing at ever-increasing rates (40% annually by many accounts), processing power is becoming essential to better public service and improved national security.
Currently, CPU performance is growing by just 10-20% per year, not nearly fast enough to keep pace with data generated from billions of active and passive data sources. GPU performance, on the other hand, is growing by an impressive 50% per year. In the latest TOP500 supercomputer rankings, GPUs contributed more of the added performance than CPUs for the first time ever. Read More
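To see why the gap between those annual growth rates matters, it helps to compound them over time. The sketch below uses the rates quoted above (taking 15% as the midpoint of the 10-20% CPU range); the 10-year horizon is an assumption chosen purely for illustration.

```python
# Compound the reported annual growth rates to compare cumulative gains.
# Rates come from the article (CPU 10-20%/yr, GPU ~50%/yr); the 15%
# CPU midpoint and the 10-year horizon are illustrative assumptions.

def compound(rate: float, years: int) -> float:
    """Cumulative performance multiplier after `years` of annual `rate` growth."""
    return (1 + rate) ** years

years = 10
cpu_gain = compound(0.15, years)  # ~4x over a decade
gpu_gain = compound(0.50, years)  # ~58x over a decade

print(f"CPU: {cpu_gain:.1f}x, GPU: {gpu_gain:.1f}x over {years} years")
```

Under these assumptions, a decade of 50% annual growth yields roughly a 58x gain versus about 4x for CPUs, which is why the article argues GPU performance is better matched to data growing 40% per year.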