Archive for the ‘information science’ category

Nov 21, 2021

How we could Time Travel through a (special) black hole — Back to the PAST!

Posted by in categories: cosmology, information science, singularity, space travel, time travel

Get your SPECIAL OFFER for MagellanTV here: https://try.magellantv.com/arvinash — It’s an exclusive offer for our viewers! Start your free trial today. MagellanTV is a new kind of streaming service run by filmmakers with 3,000+ documentaries! Check out our personal recommendation and MagellanTV’s exclusive playlists: https://www.magellantv.com/genres/science-and-tech.

Chapters.
0:00 — You are a time traveler.
2:32 — Spacetime & light cone review.
6:15 — Flat Spacetime equations.
7:03 — Schwarzschild radius, metric.
8:42 — Light cone near a black hole.
10:15 — How to escape black hole.
10:39 — Kerr-Newman metric.
11:34 — How to remove the event horizon.
11:50 — What is a naked singularity.
12:20 — How to travel back in time.
13:26 — Problems.
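For background on the quantities named in the chapter list (this is standard textbook material, not part of the video description): the Schwarzschild radius and the Schwarzschild metric for a non-rotating, uncharged black hole are

```latex
r_s = \frac{2GM}{c^2}, \qquad
ds^2 = -\left(1 - \frac{r_s}{r}\right) c^2\, dt^2
     + \left(1 - \frac{r_s}{r}\right)^{-1} dr^2
     + r^2\, d\Omega^2 .
```

Inside $r = r_s$ the signs of the $dt^2$ and $dr^2$ terms flip, which is why light cones tip over near the horizon — the effect the 8:42 chapter describes.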

Nov 21, 2021

Is God in Physics? Fine Tuning Scrutinized

Posted by in categories: alien life, information science, mathematics, particle physics

Signup for your FREE TRIAL to The GREAT COURSES PLUS here: http://ow.ly/5KMw30qK17T. Until 350 years ago, there was a distinction between what people saw on Earth and what they saw in the sky. There did not seem to be any connection.

Then, in 1687, Isaac Newton showed that planets move due to the same forces we experience here on Earth. If such things could be explained with mathematics, then to many people this called into question the need for a God.

Nov 21, 2021

China unveils detailed goals for 5G-aided Industrial Internet of Things development

Posted by in categories: chemistry, information science, internet, robotics/AI

China’s Ministry of Industry and Information Technology (MIIT) on Saturday released its second batch of extended goals for promoting the usage of China’s 5G network and the Industrial Internet of Things (IIoT).

IIoT refers to the interconnection between sensors, instruments and other devices to enhance manufacturing efficiency and industrial processes. With a strong focus on machine-to-machine communication, big data and machine learning, the IIoT has been applied across many industrial sectors and applications.

The MIIT announced that the 5G IIoT will be applied in the petrochemical industry, building materials, ports, textiles and home appliances as the 2021 China 5G + Industrial Internet Conference kicked off Saturday in Wuhan, central China’s Hubei Province.

Nov 19, 2021

‘Deepfaking the mind’ could improve brain-computer interfaces for people with disabilities

Posted by in categories: information science, robotics/AI

Researchers at the USC Viterbi School of Engineering are using generative adversarial networks (GANs)—technology best known for creating deepfake videos and photorealistic human faces—to improve brain-computer interfaces for people with disabilities.

In a paper published in Nature Biomedical Engineering, the team successfully taught an AI to generate synthetic brain-activity data. The data, known as spike trains, can be fed into machine-learning algorithms to improve the usability of brain-computer interfaces (BCIs).

BCI systems work by analyzing a person’s brain signals and translating them into commands, allowing the user to control devices such as computer cursors using only their thoughts. These devices can improve quality of life for people with motor dysfunction or paralysis, even those struggling with locked-in syndrome—when a person is fully conscious but unable to move or communicate.
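As a rough illustration of the GAN idea in this article — not the team’s actual model, which is described in the Nature Biomedical Engineering paper — here is a minimal sketch in which a linear generator learns to produce synthetic traces that a logistic-regression discriminator cannot tell from “real” ones. All data, shapes, and hyperparameters below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 16          # time bins per trace (assumed)
LATENT = 4      # generator noise dimension (assumed)
LR = 0.02       # SGD step size
STEPS = 2000
BATCH = 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # Stand-in for recorded neural data: noisy traces around a rate of 1.0
    return 1.0 + 0.1 * rng.standard_normal((n, T))

# Generator: linear map from noise z to a trace x = z @ Wg + cg
Wg = 0.01 * rng.standard_normal((LATENT, T))
cg = np.zeros(T)

# Discriminator: logistic regression D(x) = sigmoid(x @ wd + bd)
wd = np.zeros(T)
bd = 0.0

def generate(n):
    z = rng.standard_normal((n, LATENT))
    return z, z @ Wg + cg

for _ in range(STEPS):
    # --- discriminator step: push D(real) -> 1, D(fake) -> 0 ---
    xr = real_batch(BATCH)
    _, xf = generate(BATCH)
    pr = sigmoid(xr @ wd + bd)       # D's output on real traces
    pf = sigmoid(xf @ wd + bd)       # D's output on fake traces
    # gradient step on the standard GAN discriminator loss
    wd += LR * ((1 - pr) @ xr - pf @ xf) / BATCH
    bd += LR * ((1 - pr).sum() - pf.sum()) / BATCH

    # --- generator step (non-saturating loss): push D(fake) -> 1 ---
    z, xf = generate(BATCH)
    pf = sigmoid(xf @ wd + bd)
    dx = np.outer(1 - pf, wd)        # dLoss/dx for each fake sample
    Wg += LR * z.T @ dx / BATCH
    cg += LR * dx.mean(axis=0)

_, samples = generate(256)
print("fake mean:", samples.mean(), "(real mean is ~1.0)")
```

The study’s generator targets spike trains rather than smooth traces, but the adversarial training loop has this same two-step shape: the discriminator learns to separate real from synthetic data, and the generator learns to fool it.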

Nov 19, 2021

Why This Lab Is Slicing Human Brains Into Little Pieces

Posted by in categories: information science, robotics/AI

There’s a multibillion-dollar race going on to build the first complete map of the brain, something scientists are calling the “connectome.” It involves slicing the brain into thousands of pieces, and then digitally stitching them back together using a powerful AI algorithm.

Presented by Polestar.

Nov 19, 2021

Researchers Find Human Learning Can be Duplicated in Synthetic Matter

Posted by in categories: information science, robotics/AI

Rutgers researchers and their collaborators have found that learning — a universal feature of intelligence in living beings — can be mimicked in synthetic matter, a discovery that in turn could inspire new algorithms for artificial intelligence (AI).

The study appears in the journal PNAS.

One of the fundamental characteristics of humans is the ability to continuously learn from and adapt to changing environments. But until recently, AI has been narrowly focused on emulating human logic. Now, researchers are looking to mimic human cognition in devices that can learn, remember and make decisions the way a human brain does.

Nov 18, 2021

Understanding Bias in AI: What Is Your Role, and Should You Care?

Posted by in categories: information science, robotics/AI

There are billions of people around the world whose online experience is shaped by algorithms that use artificial intelligence (AI) and machine learning (ML). Some form of AI or ML is employed almost every time people go online, whether they are searching for content, watching a video, or shopping for a product. Not only do these technologies increase the efficiency and accuracy of consumption, but service providers in the online ecosystem also innovate upon and monetize behavioral data captured directly from a user’s device, from a website visit, or by third parties.

Advertisers are increasingly dependent on this data and the algorithms that adtech and martech employ to understand where their ads should be placed, which ads consumers are likely to engage with, which audiences are most likely to convert, and which publisher should get credit for conversions.

Additionally, the collection and better utilization of data helps publishers generate revenue, minimize data risks and costs, and provide relevant consumer-preference-based audiences for brands.

Nov 17, 2021

A computer algorithm that speeds up experiments on plasma

Posted by in categories: biotech/medical, computing, information science, nuclear energy

A team of researchers from Tri Alpha Energy Inc. and Google has developed an algorithm that can be used to speed up experiments conducted with plasma. In their paper published in the journal Scientific Reports, the group describes how they plan to use the algorithm in nuclear fusion research.

As research into harnessing nuclear fusion has progressed, scientists have found that some of its characteristics are too complex to be solved in a reasonable amount of time using current technology. So they have increasingly turned to computers to help. More specifically, they want to adjust certain parameters in a device created to achieve fusion in a reasonable way. Such a device, most in the field agree, must involve the creation of a certain type of plasma that is not too hot or too cold, is stable, and has a certain desired density.

Finding the right parameters that meet these conditions has involved an incredible amount of trial and error. In this new effort, the researchers sought to reduce the workload by using a computer algorithm to reduce some of the needed trials. To that end, they created what they call the “optometrist’s algorithm.” In its most basic sense, it works like an optometrist attempting to measure the visual ability of a patient by showing them images and asking if they are better or worse than other images. The idea is to combine the crunching power of a computer with the intelligence of a human being—the computer generates the options and the human tells it whether a given option is better or worse.
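The loop described above can be sketched in a few lines. This is a toy illustration, not the researchers’ implementation: the plasma “quality” function, the parameter names, the step size, and the simulated expert are all invented for the example.

```python
import random

random.seed(42)

def plasma_quality(params):
    # Hidden stand-in for the expert's judgment: quality peaks near
    # temperature=5.0, density=2.0. The search never reads this number
    # directly; it only gets the pairwise "better or worse?" answer.
    t, d = params
    return -((t - 5.0) ** 2 + (d - 2.0) ** 2)

def human_prefers(a, b):
    # Simulated expert answering "is setting A better than setting B?"
    return plasma_quality(a) > plasma_quality(b)

def optometrist_search(start, step=0.5, iterations=300):
    current = start
    for _ in range(iterations):
        # the computer proposes a nearby alternative setting
        candidate = tuple(x + random.uniform(-step, step) for x in current)
        # one binary question to the human, like "1 or 2?" at the optometrist
        if human_prefers(candidate, current):
            current = candidate
    return current

best = optometrist_search(start=(0.0, 0.0))
print("best parameters found:", best)
```

Each iteration costs only one yes/no judgment from the expert, which is the point: the machine explores the parameter space while the human supplies the evaluation a simulation cannot.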

Nov 17, 2021

Do Androids Dream of Electric Sheep? Dr. Ben Goertzel with Philip K. Dick at the Web Summit 2019

Posted by in categories: bitcoin, information science, internet, robotics/AI, singularity

Dr. Ben Goertzel with Philip K. Dick at the Web Summit in Lisbon 2019.

Ben showcases the use of OpenCog within the SingularityNET environment, which is powering the AI of the Philip K. Dick robot.

Nov 17, 2021

Mathematicians derive the formulas for boundary layer turbulence 100 years after the phenomenon was first formulated

Posted by in categories: information science, mathematics

Turbulence makes many people uneasy or downright queasy. And it’s given researchers a headache, too. Mathematicians have been trying for a century or more to understand the turbulence that arises when a flow interacts with a boundary, but a formulation has proven elusive.

Now an international team of mathematicians, led by UC Santa Barbara professor Björn Birnir and University of Oslo professor Luiza Angheluta, has published a complete description of boundary layer turbulence. The paper appears in Physical Review Research and synthesizes decades of work on the topic. The theory unites empirical observations with the Navier-Stokes equation—the mathematical foundation of fluid dynamics—into a single mathematical formulation.
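For reference, the Navier-Stokes equation mentioned here is, in its standard incompressible form (shown for background, not reproduced from the paper):

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0,
```

where $\mathbf{u}$ is the velocity field, $p$ the pressure, $\rho$ the density, and $\nu$ the kinematic viscosity.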

This phenomenon was first described around 1920 by Hungarian physicist Theodore von Kármán and German physicist Ludwig Prandtl, two luminaries in fluid dynamics. “They were honing in on what’s called boundary layer turbulence,” said Birnir, director of the Center for Complex and Nonlinear Science. This is turbulence caused when a flow interacts with a boundary, such as the fluid’s surface, a pipe wall, the surface of the Earth and so forth.
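The classical result of the von Kármán–Prandtl line of work is the logarithmic law of the wall, shown here for background (it predates and motivates the new paper rather than coming from it):

```latex
u^{+} = \frac{1}{\kappa}\,\ln y^{+} + B,
```

where $u^{+}$ is the mean streamwise velocity and $y^{+}$ the distance from the wall, both in viscous “wall” units, $\kappa \approx 0.41$ is the von Kármán constant, and $B \approx 5$ for smooth walls.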