AI isn't novel... and it isn't a passing trend. It has been simmering at the forefront of technological advancement for years. However, for it to truly become mainstream, it had to become something tangible that the general public could wield and experience. And I sit in wide-eyed amazement at the possibilities it offers to enhance productivity, efficiency, consistency, and quality across every part of my clients' businesses. So... what happened to create the giant inflection point we are living through?
Let's start our exploration of artificial intelligence with the progression of three vital components:
Data, data, and more data... big, bigger, biggest. Sufficient data volume renders anomalies statistically insignificant across entire industries. High-quality, diverse data is as valuable as iron, coal, or oil were in their respective technological eras. Wars will be fought over access to data because it is the fuel that makes artificial intelligence possible.
MIT collaborated with Mass General to scrutinize huge numbers of breast exams, aiming to detect patterns within the pixels that indicate aggressive forms of cancer. Mass General possessed both the images and the outcomes. MIT built the model to extract the results. The model learned by gobbling up the data and surfaced associations that human researchers had never imagined. Now the model can quickly assess the risk of certain types of breast cancer, eliminating unnecessary surgeries while reducing mortality.
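That project is proprietary, but the underlying pattern, supervised learning from labeled outcomes, is easy to sketch. The snippet below is a minimal illustration using scikit-learn's bundled Wisconsin breast cancer dataset; it is an assumption chosen for demonstration, not the MIT/Mass General model or data.

```python
# A minimal sketch of the supervised-learning pattern described above:
# labeled outcomes plus measurements in, a predictive model out.
# This is NOT the MIT/Mass General model; it uses scikit-learn's
# bundled Wisconsin breast cancer dataset purely for illustration.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Each row holds measurements from an imaged breast mass;
# each label records the outcome (malignant or benign).
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a simple classifier on the labeled examples...
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# ...then see how well it predicts outcomes it has never seen.
print("held-out accuracy:", model.score(X_test, y_test))
```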
What constitutes a significant volume of data? IBM estimates that approximately 2.5 quintillion bytes of data are generated daily. This amount is equivalent to 2.5 exabytes, or 2,500,000 terabytes, of data. For perspective, that is roughly 625 million DVDs (at about 4 GB each) or approximately 312,500 years of HD video streaming.
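Those conversions are easy to sanity-check. The Python sketch below reproduces the figures under explicit assumptions of my own (a DVD capacity of about 4 GB and HD streaming at roughly 8 TB per viewer-year, about 0.9 GB per hour); the exact numbers shift with those assumptions.

```python
# Back-of-the-envelope check of the figures above.
# Assumptions (mine, not IBM's): 1 EB = 1e18 bytes, 1 TB = 1e12 bytes,
# a DVD holds ~4 GB, and HD streaming consumes ~8 TB per viewer-year.
daily_bytes = 2.5e18                      # ~2.5 quintillion bytes per day

exabytes  = daily_bytes / 1e18            # 2.5 EB
terabytes = daily_bytes / 1e12            # 2,500,000 TB
dvds      = daily_bytes / 4e9             # ~625 million DVDs
hd_years  = terabytes / 8                 # ~312,500 years of HD streaming

print(f"{exabytes:.1f} EB = {terabytes:,.0f} TB")
print(f"~{dvds:,.0f} DVDs, ~{hd_years:,.0f} years of HD video")
```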
Algorithms and models. Algorithms and models are designed to analyze and derive valuable information from data, and the latest breakthrough is GPT-3 (Generative Pre-trained Transformer 3), developed by OpenAI. This language model has had a profound effect because it amplifies people's ability to communicate and work with information. It can generate coherent, contextually appropriate responses to plain-language "prompts." While it takes skill to formulate good prompts and to verify the accuracy of the results, it puts a vast wealth of knowledge within immediate reach.
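In practice, "prompting" a model like GPT-3 is only a few lines of code. The sketch below assumes the legacy OpenAI Python SDK (pre-1.0) and its Completion endpoint; model names and interfaces change over time, so treat it as illustrative rather than definitive.

```python
# A minimal sketch of prompting a GPT-3-family model, assuming the
# legacy OpenAI Python SDK (pre-1.0) and an API key in the
# OPENAI_API_KEY environment variable.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# The "prompt" is plain language; crafting it well is the skill
# described above.
response = openai.Completion.create(
    model="text-davinci-003",   # a GPT-3-family model
    prompt="Explain, in two sentences, why training data quality matters.",
    max_tokens=100,
    temperature=0.7,
)

# The generated text comes back in choices[0]; it still needs human
# review for accuracy before it is used.
print(response.choices[0].text.strip())
```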
Primary AI algorithms include Supervised Learning, Unsupervised Learning, Reinforcement Learning, Deep Learning, Natural Language Processing (NLP), Computer Vision, Genetic Algorithms, Decision Trees, Clustering Algorithms, and Bayesian Networks. These algorithms enable tasks like prediction, pattern recognition, language analysis, image processing, and optimization. They form the basis of diverse AI applications and constantly evolve with advancements.
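To make a couple of those families concrete, the toy sketch below contrasts supervised learning (a decision tree trained on labeled examples) with unsupervised learning (k-means clustering with no labels), using scikit-learn's bundled iris dataset purely for illustration.

```python
# A small illustration of two of the algorithm families named above:
# supervised learning (decision tree) and unsupervised learning
# (k-means). Toy example, not a production pipeline.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: learn from labeled examples, then predict labels.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print("decision tree training accuracy:", tree.score(X, y))

# Unsupervised: find structure (clusters) with no labels at all.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("k-means cluster sizes:",
      [int((kmeans.labels_ == k).sum()) for k in range(3)])
```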
Processing Power. Processing data using those models and algorithms demands substantial computing power, and the advancement of cloud computing and distributed computing has emerged as the pivotal third element in unlocking the potential of artificial intelligence.
Cloud Computing and Distributed Computing: Cloud computing platforms offer accessible and robust computing resources, democratizing access to high-performance computing. This empowers researchers and developers to leverage scalable resources on demand, eliminating the need for expensive local infrastructure. Additionally, distributed computing frameworks facilitate the parallelization of AI tasks across multiple machines, reducing training time and enabling the processing of extensive datasets.
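The parallelization idea is easy to see in miniature. The sketch below splits independent chunks of work across local worker processes; real distributed frameworks (Spark, Ray, Horovod, and the like) add scheduling, fault tolerance, and networking on top, but the core divide-and-combine pattern is the same.

```python
# A toy sketch of parallelizing independent chunks of work across
# worker processes instead of running them one after another.
from concurrent.futures import ProcessPoolExecutor
import math

def process_chunk(chunk):
    # Stand-in for an expensive per-chunk computation
    # (e.g., one shard of a training or scoring job).
    return sum(math.sqrt(x) for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # 4 roughly equal shards

    # Each shard runs in its own worker process, in parallel.
    with ProcessPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(process_chunk, chunks))

    # Combine the partial results, just as a distributed job would.
    print("combined result:", sum(partials))
```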
Quantum Computing: Despite being in its early stages, the exploration of quantum computing's potential to revolutionize AI is underway. Quantum computers have the capability to execute specific AI calculations exponentially faster than classical computers, enabling rapid optimization of complex algorithms and solving currently intractable computational problems.