The Tango of AI and Big Data
By Monica Khurana, CTO/CIO in Financial Services, Guardian Life Insurance and Mouli Nagarajan, MNM Partners - AI Consulting Services
We are all witnesses to a staggering explosion in digitally stored data. According to the 2014 IDC Digital Universe Study, there were only around 100 exabytes of data on the internet in 2006. Today, that number is about 10,000 exabytes. We now measure data in zettabytes, each equal to one thousand exabytes. To put that scale in perspective, one zettabyte is more than four million times the size of the entire U.S. Library of Congress.
More than 80 percent of this data is raw and unstructured. Most enterprises want to store these large datasets and quickly derive useful information from them. With the evolution of storage, distributed computing, and cognitive computing solutions, it is now far easier to store all the data generated and to analyze it statistically: predicting outcomes, producing estimates, understanding client needs and market sentiment, and keeping a pulse on the business.
How did AI/ML awaken from its slumber?
AI/ML concepts date back to the 1950s, and many of the key algorithmic breakthroughs happened in the 1980s and 1990s. This raises the question: why has the "tango" of AI/ML and data only now gained so much momentum? What has changed is the availability of vast amounts of digital data (audio, video, images, text) and the computational power needed to make neural networks efficient at pulling useful information from these large unstructured datasets.
ML has always been a subset of AI, focused on training machines to learn from their environments and use that information to gain knowledge. With massive computing power now available, along with large datasets to train on, the value proposition has changed.
It takes two to Tango!
The benefits of AI/ML span industries. Machine learning can benchmark normal behavior from large troves of training data and identify anomalous behavior that could indicate cyber security breaches or money laundering; it can also model customer behavior and help with regulatory and compliance issues. With legacy rule-based systems, as soon as criminals changed their behavior and moved money in a new way, the rules would fall apart and need to be updated to reflect the change. Machine learning systems, by contrast, constantly learn from the digital footprint of data, so they can identify shifts in patterns and adapt to them rapidly.
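The benchmarking idea above can be sketched in a few lines. This is a toy illustration, not a production fraud model: it learns a baseline from historical transaction amounts (synthetic data here) and flags values far outside it using a simple z-score, where a real system would train an ML model over many behavioral features.

```python
import numpy as np

# Illustrative "normal" behavior: 500 historical transaction amounts
# clustered around $50 (synthetic data, for the sketch only).
rng = np.random.default_rng(0)
history = rng.normal(loc=50.0, scale=5.0, size=500)

# Learn the baseline from the data rather than hand-coding rules.
baseline_mean = history.mean()
baseline_std = history.std()

def is_anomalous(amount, threshold=4.0):
    """Flag amounts more than `threshold` standard deviations from the learned baseline."""
    return abs(amount - baseline_mean) / baseline_std > threshold

print(is_anomalous(52.0))    # a typical transaction
print(is_anomalous(5000.0))  # a wildly different movement of money
```

Because the baseline is learned from the data, retraining on fresh history is how such a system "adapts" when behavior shifts, rather than waiting for someone to rewrite the rules.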
The financial industry is built on data. Stock markets generate enormous volumes of it, from exchanges, company financial reports, SEC filings, economic reports, and analyst predictions. AI systems ingest this data to identify trading correlations, analyze market trends, and make trading decisions.
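As a deliberately simplified illustration of the correlation analysis mentioned above, the sketch below computes the Pearson correlation between the daily returns of two hypothetical instruments; the price series are invented for the example, and real systems would work across thousands of instruments and many more signals.

```python
import numpy as np

# Hypothetical daily closing prices for two instruments (illustrative data).
prices_a = np.array([100.0, 101.5, 103.0, 102.0, 104.5, 106.0])
prices_b = np.array([50.0, 50.6, 51.4, 51.0, 52.1, 52.9])

# Daily returns: percentage change between consecutive closes.
returns_a = np.diff(prices_a) / prices_a[:-1]
returns_b = np.diff(prices_b) / prices_b[:-1]

# Pearson correlation of the two return series; values near 1.0
# indicate the instruments have been moving together.
correlation = np.corrcoef(returns_a, returns_b)[0, 1]
print(round(correlation, 2))
```

Correlations are computed on returns rather than raw prices because prices trend over time, which would make almost any two rising series look spuriously correlated.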
Artificial intelligence and machine learning are transforming these piles of data into a competitive advantage that any data-driven business can tap to run more efficiently, make smarter decisions, and boost profits. AI already drives consumer tools from Amazon Echo to Google Translate, putting AI and big data within everybody's reach.
The growth in AI has fueled fears of machines replacing humans and debate about the relevance of humans in an AI world. Alan Turing, the father of modern computing, devised a test for AI: if a computer could hold a convincing conversation and persuade a panel of human judges that they were talking to a human, that would indicate AI had advanced to the point of being indistinguishable from human intelligence. So far, no machine has solved the problem of context retention: understanding what has been said before, referring back to it, and crafting responses based on the point the conversation has reached. Start-ups like Semantic Machines hope their AI assistant will interact with you just like a secretary, but with an unparalleled ability to retrieve information from the internet.
Will computers ever be able to think and respond like a human brain? The machine's ability to understand, assess, and interact with the world is growing at a rapid pace, and that pace only increases with the volume of data available to help it learn, understand, and decide faster. Big data is the fuel that powers AI, which in turn helps machines approach human levels of intelligence.