Ever since radar proved a game-changer for air defence in World War II, increasingly advanced systems have enabled the accurate detection of enemy aircraft, ships, land systems, missiles and more. But adversaries have equivalent systems that pose a risk to military operations, and not knowing where they are could prove disastrous.
“So we created some business rules and put those manual verification and validation processes into software rules in our system, which we’ve called Moonlight.” Read More
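The excerpt doesn’t show what Moonlight’s rules actually look like. Purely as a hypothetical sketch of what “putting a manual verification process into software rules” can mean, here is a toy validation rule in Python; every field name and threshold below is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    # Hypothetical fields a detection record might carry.
    signal_strength_db: float
    confidence: float
    source: str

def validate_detection(d: Detection) -> list[str]:
    """Codified verification-and-validation rules; returns a list of failures."""
    failures = []
    if d.confidence < 0.8:                        # invented threshold
        failures.append("confidence below 0.8")
    if d.signal_strength_db < -90.0:              # invented threshold
        failures.append("signal too weak to verify")
    if d.source not in {"sensor_a", "sensor_b"}:  # invented allow-list
        failures.append(f"unknown source: {d.source}")
    return failures

record = Detection(signal_strength_db=-75.0, confidence=0.92, source="sensor_a")
print(validate_detection(record) or "record passes all rules")
```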
AI/ML Lessons for Creating a Platform Strategy – Part 2
In Part 1 we described these lessons:
- Information-centric businesses are obvious targets (e.g., insurance, mortgage lending, media, telecom, real-estate brokerage).
- Since network size is the measure of success, these are more likely to be B2C.
- Fragmented industries are good targets.
- Best of all, find fragmented markets that are underserved.
- Yes, there is competition among emerging platforms, so first movers who execute well are favored.
- Don’t wait to get started, or you could end up as a commodity in someone else’s network platform.
Now let’s continue:
- In addition to fragmented markets, look for markets with a large imbalance of knowledge between buyer and seller.
What happens when AI falls into the wrong hands
There are three types of attacks in which an attacker can use AI:
1. AI-boosted/based cyber-attacks – In this case, the malware uses AI algorithms as an integral part of its business logic. For example, it might use AI-based anomaly-detection algorithms to identify irregular user and system activity patterns.
2. AI-based attack infrastructure and frameworks – In this case, the malicious code and malware running on the victim’s machine do not include AI algorithms; instead, AI is used elsewhere in the attacker’s environment and infrastructure – on the server side, in the malware-creation process, etc.
3. Adversarial attacks – In this case, “malicious” AI algorithms are used to subvert the functionality of “benign” AI algorithms. This is done with the same algorithms and techniques used in traditional machine learning, but this time applied to “break” or “reverse-engineer” the algorithms of security products. Read More
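The excerpt stops short of a concrete example of the third category, so here is a minimal sketch under stated assumptions: a gradient-based (FGSM-style) perturbation computed against a toy logistic-regression classifier standing in for a “benign” security model. The weights, inputs and epsilon are all invented for illustration:

```python
import numpy as np

# Toy "benign" classifier: logistic regression with fixed, invented weights.
w = np.array([1.5, -2.0, 0.5])
b = 0.1

def predict_proba(x: np.ndarray) -> float:
    """Probability that x belongs to class 1."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def input_gradient(x: np.ndarray, y: float) -> np.ndarray:
    """Gradient of the loss -log p(y|x) w.r.t. the input x.
    For logistic regression this reduces to (p - y) * w."""
    return (predict_proba(x) - y) * w

def fgsm(x: np.ndarray, y: float, eps: float) -> np.ndarray:
    """Fast Gradient Sign Method: step the input in the direction
    that increases the classifier's loss."""
    return x + eps * np.sign(input_gradient(x, y))

x = np.array([2.0, -1.0, 0.0])      # confidently classified as class 1
x_adv = fgsm(x, y=1.0, eps=2.0)     # eps chosen large so the flip is visible
print(predict_proba(x))             # ~0.994
print(predict_proba(x_adv))         # ~0.05 -- same attacker goal, opposite verdict
```

The point of the sketch is the mechanism, not the toy model: the attacker needs only gradient information (or an approximation of it) about the defender’s model to steer inputs past it.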
The next wave of Artificial Intelligence: on-device AI
Artificial Intelligence is no longer a science-fiction term. It has become a basic part of our reality through devices such as smartphones, smartwatches and tablets, to name a few. Our lives today revolve around these devices to a remarkable degree. Use of virtual personal assistants like Siri and Cortana is on the rise, and many of us would be lost without Google Maps to guide us. In short, AI is progressing quickly and changing the way we live. Today’s smart devices are far ‘more intelligent’ than their predecessors, and rapid advances in hardware and software have set off an era in which intelligence is moving from the cloud onto the device.
By adding new capabilities to existing solutions, on-device AI makes them smarter and faster. That could mean more intelligent assistants, safer vehicles, enhanced security, leaps in robotics, advances in healthcare solutions, and much more. ML and data processing in the cloud are not going away, but on-device AI delivers personalized experiences with some major advantages, including greatly improved performance, especially for AI use cases that cannot afford even a microsecond of lag, such as automotive safety. On-device AI also strengthens security and privacy, keeping sensitive data such as voice IDs and face scans, which could be compromised in the cloud, on the device. And when your AI power is in your hand or at your fingertips, reliability is no longer a question of network availability or bandwidth.
It is a common belief that AI is all about big data and the cloud. On the contrary, AI can also be localized, right in the palm of our hands in our phones. There has been a steady migration of AI towards edge devices, made possible by an increase in computing power combined with improvements in AI algorithms and the development of robust software and hardware. These advances have made it feasible to run machine-learning solutions on phones and cars instead of in the cloud, and the trend is growing. Read More
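None of this requires exotic tooling. As a hedged sketch of what on-device inference looks like in practice, here is a minimal example using TensorFlow Lite’s Python interpreter; the file name model.tflite is an assumption, standing in for any converted model already on the device:

```python
import numpy as np
import tensorflow as tf  # tf.lite ships with the standard TensorFlow package

# Load a converted model from local storage -- no network round-trip involved.
interpreter = tf.lite.Interpreter(model_path="model.tflite")  # assumed file name
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input shaped and typed to whatever the model expects.
x = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], x)

interpreter.invoke()  # inference runs entirely on the device
y = interpreter.get_tensor(output_details[0]["index"])
print(y.shape)
```

Because the model file, the input and the computation all live on the device, latency, privacy and offline availability follow directly, which is the trade the article describes.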
One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority
The Chinese government has drawn wide international condemnation for its harsh crackdown on ethnic Muslims in its western region, including holding as many as a million of them in detention camps.
Now, documents and interviews show that the authorities are also using a vast, secret system of advanced facial recognition technology to track and control the Uighurs, a largely Muslim minority. It is the first known example of a government intentionally using artificial intelligence for racial profiling, experts said.
The facial recognition technology, which is integrated into China’s rapidly expanding networks of surveillance cameras, looks exclusively for Uighurs based on their appearance and keeps records of their comings and goings for search and review. The practice makes China a pioneer in applying next-generation technology to watch its people, potentially ushering in a new era of automated racism. Read More
Is Artificial Intelligence the New Productivity Paradox?
In the 1970s and 80s, investments in computer technology were increasing by more than 20% per year. Strangely though, productivity growth had decreased during the same period. Economists found this turn of events so strange that they called it the productivity paradox to underline their confusion.
Productivity growth would take off in the late 1990s, but then mysteriously drop again during the mid-aughts. At each juncture, experts would debate whether digital technology produced real value or if it was all merely a mirage and that debate continued even as industry after industry was disrupted.
Today, that debate is over, but a new one is likely to begin over artificial intelligence. Much like in the early 1970s, we have increasing investment in a new technology, diminished productivity growth and “experts” predicting massive worker displacement. What’s different is that now we have history and experience to guide us and can avoid making the same mistakes. Read More
DataOps is NOT Just DevOps for Data
One common misconception about DataOps is that it is just DevOps applied to data analytics. While a little semantically misleading, the name “DataOps” has one positive attribute. It communicates that data analytics can achieve what software development attained with DevOps. That is to say, DataOps can yield an order of magnitude improvement in quality and cycle time when data teams utilize new tools and methodologies. The specific ways that DataOps achieves these gains reflect the unique people, processes and tools characteristic of data teams (versus software development teams using DevOps). Here’s our in-depth take on both the pronounced and subtle differences between DataOps and DevOps. Read More
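The excerpt leaves the “specific ways” abstract. One hedged illustration of the DevOps-style automation that DataOps borrows is gating each pipeline stage with automated data tests, the analogue of unit tests in a CI pipeline; the DataFrame, column names and rules below are all invented for the sketch:

```python
import pandas as pd

# Hypothetical output of an upstream pipeline stage.
df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [19.99, 5.00, 42.50],
})

def run_data_tests(frame: pd.DataFrame) -> list[str]:
    """Automated checks run on every pipeline execution, playing the role
    unit tests play in a DevOps CI pipeline. Returns the failed checks."""
    failures = []
    if frame["order_id"].duplicated().any():
        failures.append("order_id must be unique")
    if frame[["order_id", "amount"]].isna().any().any():
        failures.append("no nulls allowed in key columns")
    if (frame["amount"] < 0).any():
        failures.append("amount must be non-negative")
    return failures

failed = run_data_tests(df)
if failed:
    raise RuntimeError(f"data tests failed, halting pipeline: {failed}")
print("data tests passed; promoting stage output")
```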
7 Steps to Go From Data Science to Data Ops
The DataOps Manifesto
Whether referred to as data science, data engineering, data management, big data, business intelligence, or the like, through our work we have come to value in analytics:
Individuals and interactions over processes and tools
Working analytics over comprehensive documentation
Customer collaboration over contract negotiation
Experimentation, iteration, and feedback over extensive upfront design
Cross-functional ownership of operations over siloed responsibilities
Read More
Emerging Data Center Trends: From DevOps To DataOps
If asked to list the top trends that are shaping the enterprise data center today, most technologists and tech investors would likely agree on a core set. The list would include technologies such as cloud computing, containers and virtualization, microservices, machine learning and data science, flash memory, edge computing, NVMe and GPUs. These technologies are all important for organizations pushing digital transformation.
The harder question: What’s coming next? Which emerging technologies or paradigm shifts are poised to be the next big thing? And what effects will they have on the hardware and software markets? Read More
