Artificial Intelligence vs. Machine Learning vs. Deep Learning

The world as we know it is moving towards machines in a big way. But most machines still cannot do useful work without a great deal of human interaction. To reduce that dependence, we need to give machines some kind of intelligence, and this is where artificial intelligence comes in: the concept of machines being smart enough to carry out numerous tasks without human intervention.
The terms artificial intelligence and machine learning often cause confusion, and many of us do not know the exact difference between them, so we end up using them interchangeably. Machine learning is the practice of making machines learn from data, and it is one way to achieve artificial intelligence. Deep learning is the latest development in the field; it is one way of implementing machine learning in pursuit of AI.
Most of us have seen movies featuring machines with intelligence of their own, like the Terminator series or I, Robot. In real life, however, AI is nowhere near capable of handling such open-ended situations; most AI implementations are just situation-specific, hardcoded logic. Machine learning was introduced to handle large amounts of data and to make machines learn from inputs/examples so that they can solve new problems.

Artificial Intelligence

Artificial intelligence, as the name suggests, involves creating intelligence artificially. It is a term we have been repeating for more than half a century: it was coined in the 1950s and caught everyone's eye very quickly. The goal of AI is to reduce the human intervention a machine needs to do its work properly.
AI has been implemented in several ways, and not all of them are smart. Many implementations are just hardcoded functions that run according to a given choice or situation. But real-world scenarios involve many variables, and the right action must be chosen based on all of them. In those scenarios, hardcoding does not give good results. This is when machine learning comes into the picture.
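The hardcoded style of "AI" described above can be sketched with a toy example (a hypothetical thermostat controller, not from any real system). Every behavior is anticipated by the programmer at write time, which is exactly why this approach breaks down once the number of variables grows:

```python
# A hardcoded "AI": every situation must be anticipated by the programmer.
def thermostat_action(temperature_c):
    """Rule-based controller: its behavior is fixed when the code is written.

    It can never handle a situation the programmer did not foresee,
    unlike a learned model that generalizes from examples.
    """
    if temperature_c < 18:
        return "heat"
    elif temperature_c > 24:
        return "cool"
    else:
        return "off"

print(thermostat_action(15))  # heat
print(thermostat_action(30))  # cool
print(thermostat_action(21))  # off
```

Adding a second variable (say, humidity) would require rewriting the rules by hand; a machine learning approach would instead re-learn from new examples.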

Machine Learning

Machine learning is an approach to implementing artificial intelligence. It is the study of algorithms that parse and ingest vast datasets as examples and, based on those examples, solve further problems.
Machine learning means making a machine learn to solve problems by providing enough examples/inputs, just as a human learns something from examples and then applies that knowledge to new problems.
Several algorithms are commonly used for machine learning. For example:
  • Find-S
  • Decision trees
  • Random forests
  • Artificial neural networks
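Find-S, the first algorithm in the list above, is simple enough to sketch in a few lines. It learns the most specific hypothesis consistent with the positive training examples, generalizing an attribute to "?" whenever two positive examples disagree on it. The "EnjoySport"-style data below is a toy illustration, not a real dataset:

```python
def find_s(examples):
    """Find-S: learn the most specific hypothesis that covers
    all positive training examples.  Each example is (attributes, label)."""
    hypothesis = None
    for attributes, label in examples:
        if label != "yes":                  # Find-S ignores negative examples
            continue
        if hypothesis is None:
            hypothesis = list(attributes)   # first positive example: copy it
        else:
            for i, value in enumerate(attributes):
                if hypothesis[i] != value:
                    hypothesis[i] = "?"     # generalize mismatching attributes
    return hypothesis

# Toy training data: (sky, temperature, humidity, wind) -> enjoy sport?
training = [
    (("sunny", "warm", "normal", "strong"), "yes"),
    (("sunny", "warm", "high",   "strong"), "yes"),
    (("rainy", "cold", "high",   "strong"), "no"),
    (("sunny", "warm", "high",   "strong"), "yes"),
]
print(find_s(training))  # ['sunny', 'warm', '?', 'strong']
```

The learned hypothesis says: the target concept holds when the sky is sunny, the temperature warm, and the wind strong, regardless of humidity. This is the "learning from examples" idea described above in its simplest form.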

Deep Learning

Deep learning is the newest development in machine learning and is one way of implementing it. It is built on the artificial neural network family of algorithms. Neural networks are inspired by our understanding of the biology of our brains and the interconnections between neurons. But unlike a biological brain, where any neuron can connect to any other neuron within a certain physical distance, artificial neural networks have discrete layers, connections, and directions of data propagation. We'll learn more about deep learning in future posts.
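The "discrete layers, connections, and directions of data propagation" mentioned above can be made concrete with a minimal forward pass. This is an illustrative sketch with hand-picked weights, not a trained model: data flows in one direction, layer by layer, with each neuron computing a weighted sum followed by a sigmoid activation.

```python
import math

def forward(inputs, layers):
    """Forward pass through a tiny fully connected network.

    `layers` is a list of (weights, biases) pairs.  Data propagates in one
    direction only: the output of each layer is the input to the next.
    """
    activations = inputs
    for weights, biases in layers:
        activations = [
            # weighted sum of the previous layer, plus bias, through a sigmoid
            1.0 / (1.0 + math.exp(-(sum(w * a for w, a in zip(row, activations)) + b)))
            for row, b in zip(weights, biases)
        ]
    return activations

# Hand-picked weights (illustrative only): 2 inputs -> 2 hidden -> 1 output
network = [
    ([[0.5, -0.6], [0.9, 0.2]], [0.1, -0.3]),  # hidden layer: 2 neurons
    ([[1.2, -0.8]],             [0.05]),       # output layer: 1 neuron
]
print(forward([1.0, 0.0], network))
```

In a real deep learning system the weights are not hand-picked; they are learned from examples by training algorithms such as backpropagation, which is what ties neural networks back to machine learning.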

Conclusion

Artificial intelligence is a broad concept that is implemented through machine learning (which provides many efficient algorithms for learning from real data). Deep learning, in turn, is a neural network-based approach within machine learning.
Deep learning has given a new level of possibilities to the AI world. Currently, deep learning is being used in the research community and in the industry to help solve many big data problems such as computer vision, speech recognition, and natural language processing.
Still, what we have today is narrow AI. Narrow AI (or weak AI) means the AI we build is tied to specific tasks. For example, vehicle automation (such as the Google self-driving car) and image classification/face recognition (such as Facebook's deep learning algorithms) are specific tasks that deep learning has made possible.
The aim of AI from the start was to create a general AI (strong AI): one that matches the functionality of the human brain, not tied to any specific task but able to perform general tasks and respond well to new situations, i.e., to mimic human cognition. So we still have a long way to go.
