
Archive for the ‘mathematics’ category: Page 55

Mar 13, 2023

Building a neural network FROM SCRATCH (no Tensorflow/Pytorch, just numpy & math)

Posted in categories: mathematics, robotics/AI

Kaggle notebook with all the code: https://www.kaggle.com/wwsalmon/simple-mnist-nn-from-scratch-numpy-no-tf-keras.

Blog article with more/clearer math explanation: https://www.samsonzhang.com/2020/11/24/understanding-the-mat…numpy.html
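As a taste of what “from scratch” means here, below is a minimal sketch of the kind of two-layer network the notebook builds: a 784-pixel MNIST input, a small ReLU hidden layer, and a softmax output trained by plain gradient descent. The layer sizes, initialization, and names are illustrative assumptions, not code copied from the notebook.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in=784, n_hidden=10, n_out=10):
    # Small random weights, zero biases (sizes are assumptions, per the lead-in).
    W1 = rng.normal(0, 0.1, (n_hidden, n_in))
    b1 = np.zeros((n_hidden, 1))
    W2 = rng.normal(0, 0.1, (n_out, n_hidden))
    b2 = np.zeros((n_out, 1))
    return W1, b1, W2, b2

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=0, keepdims=True)

def forward(X, W1, b1, W2, b2):
    Z1 = W1 @ X + b1              # hidden pre-activation
    A1 = relu(Z1)                 # hidden activation
    A2 = softmax(W2 @ A1 + b2)    # output class probabilities
    return Z1, A1, A2

def backward(X, Y, Z1, A1, A2, W2):
    # Gradients of the cross-entropy loss w.r.t. all parameters (Y is one-hot).
    m = X.shape[1]
    dZ2 = A2 - Y                      # softmax + cross-entropy shortcut
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)     # ReLU derivative is the 0/1 mask
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2

# One gradient-descent step on random stand-in data (real code would load MNIST):
X = rng.random((784, 64))
Y = np.eye(10)[:, rng.integers(0, 10, 64)]   # one-hot labels, shape (10, 64)
W1, b1, W2, b2 = init_params()
Z1, A1, A2 = forward(X, W1, b1, W2, b2)
dW1, db1, dW2, db2 = backward(X, Y, Z1, A1, A2, W2)
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
```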

Mar 13, 2023

But what is a neural network? | Chapter 1, Deep learning

Posted in categories: mathematics, robotics/AI

What are the neurons, why are there layers, and what is the math underlying it?
Help fund future projects: https://www.patreon.com/3blue1brown.
Written/interactive form of this series: https://www.3blue1brown.com/topics/neural-networks.
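In brief, each neuron in the video's network computes a weighted sum of the previous layer's activations plus a bias, squashed through a sigmoid. Written compactly for a whole layer:

```latex
a^{(l+1)} = \sigma\!\left(W^{(l)} a^{(l)} + b^{(l)}\right),
\qquad
\sigma(z) = \frac{1}{1 + e^{-z}}
```

Here each row of W holds one neuron's weights and b the biases; the matrix form is just the per-neuron weighted sums the video draws, stacked.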

Additional funding for this project provided by Amplify Partners.


Mar 13, 2023

Microsoft Proposes MathPrompter: A Technique that Improves Large Language Models (LLMs) Performance on Mathematical Reasoning Problems

Posted in categories: information science, mathematics, robotics/AI

LLM stands for Large Language Model. These advanced machine learning models are trained on massive volumes of text, often billions of words, to comprehend language and generate natural-sounding prose; examples include GPT-3 (Generative Pre-trained Transformer 3) and BERT (Bidirectional Encoder Representations from Transformers). Having developed a broad understanding of language, they can then be fine-tuned for tasks such as text classification, machine translation, or question answering, making them highly adaptable to a wide range of language-based applications.

LLMs struggle with arithmetic reasoning tasks and frequently produce incorrect responses. Unlike many natural-language tasks, a math problem usually has exactly one correct answer, which makes it harder for LLMs to generate precise solutions. As far as is known, no current LLM indicates a confidence level in its responses, which undermines trust in these models and limits their adoption.

To address this issue, researchers proposed ‘MathPrompter,’ which improves LLM performance on mathematical problems and increases confidence in the model’s predictions. MathPrompter helps users solve math problems by generating step-by-step solutions: it uses deep learning and natural-language-processing techniques to interpret a math problem, then produces a solution that explains each step of the process.
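The core idea, as described, is to have the model produce the answer to the same templated problem in several independent ways (for instance as an algebraic expression and as Python code) and to cross-check them on random variable assignments before trusting the result. Below is a minimal sketch of that consistency check; the function names and the example problem are illustrative stand-ins for what an LLM would generate, not code from the paper.

```python
import random

# Templated problem: "Each adult meal costs A dollars and kids eat free.
# A group of B people includes C kids. What does the group pay?"

def solution_algebraic(A, B, C):
    return A * (B - C)            # what an LLM's algebraic answer evaluates to

def solution_python(A, B, C):
    adults = B - C                # what an LLM's generated code computes
    return adults * A

def consistent(n_trials=5):
    """Cross-check the two solutions on random assignments of A, B, C."""
    for _ in range(n_trials):
        C = random.randint(0, 5)
        B = C + random.randint(1, 10)    # keep B >= C so the counts make sense
        A = random.randint(1, 20)
        if solution_algebraic(A, B, C) != solution_python(A, B, C):
            return False
    return True

if consistent():
    # Only after the solution paths agree do we plug in the question's values.
    print(solution_python(A=5, B=15, C=8))   # -> 35
```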

Mar 12, 2023

How Einstein tried to model the shape of the Universe

Posted in categories: cosmology, information science, mathematics, quantum physics

To keep his Universe static, Einstein added a term into the equations of general relativity, one he initially dubbed a negative pressure. It soon became known as the cosmological constant. Mathematics allowed the concept, but it had absolutely no justification from physics, no matter how hard Einstein and others tried to find one. The cosmological constant clearly detracted from the formal beauty and simplicity of Einstein’s original equations of 1915, which achieved so much without any need for arbitrary constants or additional assumptions. It amounted to a cosmic repulsion chosen to precisely balance the tendency of matter to collapse on itself. In modern parlance we call this fine-tuning, and in physics it is usually frowned upon.
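In modern notation, the term Einstein added appears as the constant Λ on the left-hand side of the field equations:

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

With Λ = 0 the original 1915 equations are recovered; a small positive Λ supplies exactly the repulsion described above.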

Einstein knew that the only reason for his cosmological constant to exist was to secure a static and stable finite Universe. He wanted this kind of Universe, and he did not want to look much further. Quietly hiding in his equations, though, was another model for the Universe, one with an expanding geometry. In 1922, the Russian physicist Alexander Friedmann would find this solution. As for Einstein, it was only in 1931, after visiting Hubble in California, that he accepted cosmic expansion and discarded at long last his vision of a static Cosmos.

Einstein’s equations provided a much richer Universe than the one Einstein himself had originally imagined. But like the mythic phoenix, the cosmological constant refuses to go away. Nowadays it is back in full force, as we will see in a future article.

Mar 11, 2023

I believe chatbots understand part of what they say. Let me explain

Posted in categories: mathematics, quantum physics, robotics/AI

Finally, a rational exploration of what ChatGPT actually knows and what that means.


Try out my quantum mechanics course (and many others on math and science) on Brilliant using the link https://brilliant.org/sabine. You can get started for free, and the first 200 will get 20% off the annual premium subscription.


Mar 10, 2023

Long-Sought Math Proof Unlocks More Mysterious ‘Modular Forms’

Posted in categories: mathematics, physics

Using “refreshingly old” tools, mathematicians resolved a 50-year-old conjecture about how to categorize important functions called modular forms, with consequences for number theory and theoretical physics.

Mar 9, 2023

Perovskite nanocrystal computer components inspired by brain cells

Posted in categories: computing, mathematics, neuroscience

Researchers at Empa, ETH Zurich and the Politecnico di Milano are developing a new type of computer component that is more powerful and easier to manufacture than its predecessors. Inspired by the human brain, it is designed to process large amounts of data fast and in an energy-efficient way.

In many respects, the human brain is still superior to modern computers. Although most people can’t do math as fast as a computer, we can effortlessly process complex sensory information and learn from experiences, while a computer cannot, at least not yet. And the brain does all this while consuming less than half as much energy as a laptop.

One of the reasons for the brain’s energy efficiency is its structure. The individual brain cells—the neurons and their connections, the synapses—can both store and process information. In computers, however, the memory is separate from the processor, and data must be transported back and forth between these two components. The speed of this transfer is limited, which can slow down the whole computer when working with large amounts of data.

Mar 9, 2023

A Brief History of Dirac Delta Function

Posted in categories: engineering, mathematics, quantum physics

From Cauchy to Dirac, a century-long journey: “A Brief History of Dirac Delta Function,” by Areeba Merriam, published in Cantor’s Paradise.
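For context, the delta is not an ordinary function but a distribution, usually characterized by its sifting property and constructed as the limit of ever-narrower peaks, for instance Gaussians:

```latex
\int_{-\infty}^{\infty} f(x)\,\delta(x - a)\,dx = f(a),
\qquad
\delta(x) = \lim_{\varepsilon \to 0^{+}} \frac{1}{\varepsilon\sqrt{2\pi}}\, e^{-x^{2}/(2\varepsilon^{2})}
```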

Mar 6, 2023

An Applied Mathematician Strengthens AI With Pure Math

Posted in categories: mathematics, robotics/AI

Lek-Heng Lim uses tools from algebra, geometry and topology to answer questions in machine learning.

Mar 5, 2023

Mathematicians Discovered a New, Much Faster Way to Multiply Large Numbers

Posted in categories: information science, mathematics

Two mathematicians from Australia and France have come up with a new, faster way to multiply extremely long numbers together.

In doing so, they have cracked an algorithmic puzzle that had stumped some of the world’s best-known mathematical minds for almost fifty years.
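The result described here is David Harvey and Joris van der Hoeven’s O(n log n) integer-multiplication algorithm, whose FFT machinery is far beyond a short sketch. To illustrate the genre it caps off, here is Karatsuba’s classic 1960 divide-and-conquer method, the first algorithm to beat schoolbook long multiplication; this is an illustrative stand-in, not the algorithm from the article.

```python
def karatsuba(x: int, y: int) -> int:
    """Multiply two non-negative integers with Karatsuba's 1960 trick:
    three half-size multiplications instead of four."""
    if x < 10 or y < 10:                          # base case: a single digit
        return x * y
    n = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> n, x & ((1 << n) - 1)       # split x into high/low halves
    hi_y, lo_y = y >> n, y & ((1 << n) - 1)
    z0 = karatsuba(lo_x, lo_y)
    z2 = karatsuba(hi_x, hi_y)
    # The trick: both cross terms come from one multiply plus subtractions.
    z1 = karatsuba(hi_x + lo_x, hi_y + lo_y) - z0 - z2
    return (z2 << (2 * n)) + (z1 << n) + z0

assert karatsuba(12345678901234567890, 98765432109876543210) == \
       12345678901234567890 * 98765432109876543210
```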
