AI Today Podcast: AI Glossary Series – Backpropagation, Learning Rate, and Optimizer

AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion - A podcast by AI & Data Today - Wednesdays


Backpropagation was one of the innovations from Geoff Hinton that made deep learning networks a practical reality. But have you heard the term before, and do you know what it means at a high level? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Backpropagation, Learning Rate, and Optimizer, explain how these terms relate to AI, and discuss why it's important to know about them. A brief illustrative sketch of how the three terms fit together follows the show notes below.

Show Notes:
FREE Intro to CPMAI mini course
CPMAI Training and Certification
AI Glossary
Glossary Series: Artificial Intelligence
Glossary Series: Artificial General Intelligence (AGI), Strong AI, Weak AI, Narrow AI
Glossary Series: Heuristic & Brute-force Search
AI Glossary Series - Machine Learning, Algorithm, Model
Glossary Series: (Artificial) Neural Networks, Node (Neuron), Layer
Glossary Series: Bias, Weight, Activation Function, Convergence, ReLU
Glossary Series: Perceptron
Glossary Series: Hidden Layer, Deep Learning
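For listeners who want a concrete picture, here is a minimal, hypothetical sketch (not from the episode, and deliberately simplified) of where backpropagation, the learning rate, and the optimizer each appear when training a single linear "neuron" with plain gradient descent:

```python
# Illustrative sketch only: a one-parameter-pair model y = w*x + b trained
# on a tiny synthetic dataset where the true relationship is y = 2x + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

w, b = 0.0, 0.0          # model parameters (weight and bias)
learning_rate = 0.05     # learning rate: how big each update step is

for epoch in range(200):
    grad_w, grad_b = 0.0, 0.0
    for x, y_true in data:
        # Forward pass: compute the prediction and its error.
        y_pred = w * x + b
        error = y_pred - y_true
        # Backpropagation: apply the chain rule to get the gradient of the
        # mean squared error loss with respect to each parameter.
        grad_w += 2 * error * x / len(data)
        grad_b += 2 * error / len(data)

    # Optimizer step (plain gradient descent): nudge each parameter against
    # its gradient, scaled by the learning rate.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w={w:.2f}, b={b:.2f}  (target: w=2, b=1)")
```

In practice, frameworks compute the gradients automatically and offer more sophisticated optimizers than plain gradient descent, but the division of labor shown here (backpropagation supplies the gradients, the optimizer applies them, and the learning rate controls the step size) stays the same.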
