
Archive for the ‘information science’ category: Page 226

Mar 4, 2020

Invisible Headlights

Posted in categories: information science, robotics/AI, transportation

Autonomous and semi-autonomous systems need active illumination to navigate at night or underground. Switching on visible headlights or some other emitting system like lidar, however, has a significant drawback: It allows adversaries to detect a vehicle’s presence, in some cases from long distances away.

To eliminate this vulnerability, DARPA announced the Invisible Headlights program. The fundamental research effort seeks to discover and quantify information contained in ambient thermal emissions in a wide variety of environments and to create new passive 3D sensors and algorithms to exploit that information.

“We’re aiming to make completely passive navigation in pitch dark conditions possible,” said Joe Altepeter, program manager in DARPA’s Defense Sciences Office. “In the depths of a cave or in the dark of a moonless, starless night with dense fog, current autonomous systems can’t make sense of the environment without radiating some signal—whether it’s a laser pulse, radar or visible light beam—all of which we want to avoid. If it involves emitting a signal, it’s not invisible for the sake of this program.”
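
DARPA hasn’t said what the new sensors will look like; that is the research question. But any passive 3D approach ultimately leans on geometry rather than emitted signals. As a rough illustration under assumed numbers (not the program’s design), here is the stereo-triangulation relation that lets two passively captured images — thermal ones included — yield depth:

```python
# Passive 3-D sensing in general rests on geometry rather than emitted
# signals. A minimal example: stereo triangulation, where depth follows
# from the disparity between two passively captured (e.g., thermal) images.
# Illustrative numbers only; DARPA's actual sensor designs are open research.
FOCAL_LENGTH_PX = 1000.0   # assumed focal length, in pixels
BASELINE_M = 0.5           # assumed distance between the two cameras

def depth_from_disparity(disparity_px: float) -> float:
    """Depth z = f * B / d for a rectified stereo pair."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

print(depth_from_disparity(25.0))  # a 25-pixel disparity -> 20 m away
```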

Mar 4, 2020

Musician uses algorithm to generate every possible melody to prevent copyright lawsuits

Posted in category: information science

Catalogue of 68 billion tunes contains ‘every melody that’s ever existed and ever can exist’
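
The headline number is consistent with straightforward brute-force enumeration: 8 pitch choices per position over 12-note melodies gives 8^12 = 68,719,476,736 combinations, matching the reported 68 billion. Below is a minimal sketch of that enumeration; the pitch names and parameters are illustrative assumptions, not the project’s exact settings:

```python
# A minimal sketch of brute-force melody enumeration, assuming the
# reported scheme of 8 pitch choices per note over 12-note melodies
# (8**12 = 68,719,476,736, which matches the "68 billion" figure).
from itertools import product, islice

PITCHES = ["C", "D", "E", "F", "G", "A", "B", "C'"]  # one octave, 8 choices
MELODY_LENGTH = 12

def all_melodies():
    """Yield every possible melody, one at a time (lazy, never stored)."""
    return product(PITCHES, repeat=MELODY_LENGTH)

print(len(PITCHES) ** MELODY_LENGTH)        # 68719476736 total melodies
for melody in islice(all_melodies(), 3):    # first three, for illustration
    print(" ".join(melody))
```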

Mar 4, 2020

Unveiling Biology with Deep Microscopy

Posted in categories: biotech/medical, finance, information science, military, robotics/AI, space

The scientific revolution was ushered in at the beginning of the 17th century with the development of two of the most important inventions in history — the telescope and the microscope. With the telescope, Galileo turned his attention skyward, and advances in optics led Robert Hooke and Antonie van Leeuwenhoek toward the first use of the compound microscope as a scientific instrument, circa 1665. Today, we are witnessing an information technology-era revolution in microscopy, supercharged by deep learning algorithms that have propelled artificial intelligence to transform industry after industry.

One of the major breakthroughs in deep learning came in 2012, when the performance superiority of a deep convolutional neural network combined with GPUs for image classification was revealed by Hinton and colleagues [1] for the ImageNet Large Scale Visual Recognition Challenge (ILSVRC). In AI’s current innovation and implementation phase, deep learning algorithms are propelling nearly all computer vision-intensive applications, including autonomous vehicles (transportation, military), facial recognition (retail, IT, communications, finance), biomedical imaging (healthcare), autonomous weapons and targeting systems (military), and automation and robotics (military, manufacturing, heavy industry, retail).
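
For readers unfamiliar with the architecture behind that 2012 result, the sketch below shows a minimal deep convolutional classifier in PyTorch. It is a toy in the same spirit — stacked convolution and pooling layers feeding a classifier — not a reconstruction of the original network, and the layer sizes are arbitrary:

```python
# A minimal sketch of a deep convolutional image classifier of the kind
# described above (not the 2012 network itself), using PyTorch.
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = nn.Sequential(      # stacked conv + pool layers
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 56 * 56, num_classes)

    def forward(self, x):
        x = self.features(x)                # (N, 64, 56, 56) for 224x224 input
        return self.classifier(x.flatten(1))  # class scores

logits = SmallConvNet()(torch.randn(1, 3, 224, 224))  # one fake RGB image
print(logits.shape)  # torch.Size([1, 1000])
```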

It should come as no surprise that the field of microscopy would be ripe for transformation by artificial intelligence-aided image processing, analysis and interpretation. In biological research, microscopy generates prodigious amounts of image data; a single experiment with a transmission electron microscope can generate a data set containing over 100 terabytes’ worth of images [2]. The myriad instruments and image processing techniques available today can resolve structures ranging in size across nearly 10 orders of magnitude, from single molecules to entire organisms, and capture spatial (3D) as well as temporal (4D) dynamics on time scales from femtoseconds to seconds.

Mar 3, 2020

Google algorithm teaches robot how to walk in mere hours

Posted in categories: information science, robotics/AI

A new robot has overcome a fundamental challenge of locomotion by teaching itself how to walk.

Researchers from Google developed algorithms that helped the four-legged bot learn to walk across a range of surfaces within just hours of practice, annihilating the record times set by its human overlords.
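
The article doesn’t detail the method, but work like this rests on deep reinforcement learning: the robot tries actions, is rewarded for forward progress, and shifts its policy toward whatever earned reward. The toy REINFORCE-style loop below illustrates only that principle, on a two-action problem rather than a simulated robot:

```python
# A minimal REINFORCE-style policy-gradient loop on a toy two-action task.
# This illustrates the learning principle only; the Google work used far
# more sophisticated deep RL on a physical quadruped.
import numpy as np

rng = np.random.default_rng(0)
logits = np.zeros(2)                 # policy parameters for 2 actions

def reward(action):                  # toy stand-in for "distance walked"
    return 1.0 if action == 1 else 0.0

for step in range(500):
    probs = np.exp(logits) / np.exp(logits).sum()   # softmax policy
    action = rng.choice(2, p=probs)
    r = reward(action)
    grad = -probs                                   # d log pi / d logits ...
    grad[action] += 1.0                             # ... = one_hot(a) - probs
    logits += 0.1 * r * grad                        # REINFORCE update

print(np.exp(logits) / np.exp(logits).sum())  # probability mass -> action 1
```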


Mar 3, 2020

Honeywell says it will soon launch the world’s most powerful quantum computer

Posted in categories: computing, information science, quantum physics

“The best-kept secret in quantum computing.” That’s what Cambridge Quantum Computing (CQC) CEO Ilyas Khan called Honeywell’s efforts to build the world’s most powerful quantum computer. In a race where most of the major players are vying for attention, Honeywell has quietly worked on its effort for the last few years (and under strict NDAs, it seems). But today, the company announced a major breakthrough that it claims will allow it to launch the world’s most powerful quantum computer within the next three months.

Honeywell also announced today that it has made strategic investments in CQC and Zapata Computing, both of which focus on the software side of quantum computing. The company has partnered with JPMorgan Chase to develop quantum algorithms for its machine, and it recently announced a partnership with Microsoft as well.

Mar 3, 2020

SLIDE algorithm for training deep neural nets faster on CPUs than GPUs

Posted in categories: information science, robotics/AI

Computer scientists from Rice, supported by collaborators from Intel, will present their results today at the Austin Convention Center as part of MLSys, the machine learning systems conference.

Many companies are investing heavily in GPUs and other specialized hardware to implement deep learning, a powerful form of artificial intelligence that’s behind digital assistants like Alexa and Siri, facial recognition, product recommendation systems and other technologies. For example, Nvidia, the maker of the industry’s gold-standard Tesla V100 Tensor Core GPUs, recently reported a 41% increase in its fourth quarter revenues compared with the previous year.

Rice researchers created a cost-saving alternative to GPUs: an algorithm called the “sub-linear deep learning engine” (SLIDE) that uses general-purpose central processing units (CPUs) without specialized acceleration hardware.
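
As reported, SLIDE’s core trick is to avoid computing every neuron in a wide layer: locality-sensitive hash tables retrieve just the neurons whose weight vectors are likely to align with the current input, so each pass touches only a small fraction of the network. The numpy sketch below illustrates that selection step with random-hyperplane (SimHash) hashing; the constants and details are illustrative, not the paper’s exact scheme:

```python
# A hedged sketch of LSH-based neuron selection, the core idea behind SLIDE:
# hash neurons' weight vectors up front, then for each input compute dot
# products only for neurons landing in the same buckets as the input.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
DIM, NEURONS = 128, 10_000
TABLES, BITS = 8, 8                      # 8 hash tables, 8-bit signatures each

W = rng.standard_normal((NEURONS, DIM))  # weights of one wide layer
planes = rng.standard_normal((TABLES, BITS, DIM))  # random hyperplanes

def signature(t, v):
    """SimHash: which side of each hyperplane the vector falls on."""
    return tuple((planes[t] @ v > 0).astype(int))

tables = [defaultdict(list) for _ in range(TABLES)]  # bucket -> neuron ids
for i, w in enumerate(W):                # index every neuron once, up front
    for t in range(TABLES):
        tables[t][signature(t, w)].append(i)

x = rng.standard_normal(DIM)
active = set()                           # union of matching buckets
for t in range(TABLES):
    active.update(tables[t][signature(t, x)])

activations = W[list(active)] @ x        # dot products for a few % of neurons
print(f"computed {len(active)} of {NEURONS} activations")
```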

Mar 2, 2020

Novel camera calibration algorithm aims at making autonomous vehicles safer

Posted in categories: information science, robotics/AI, transportation

Some autonomous vehicles watch the road ahead using built-in cameras, so in such systems, maintaining accurate camera orientation while driving is key to letting these vehicles out on public roads. Now, scientists from Korea have developed what they say is an accurate and efficient camera-orientation estimation method to enable such vehicles to navigate safely over long distances.


A fast camera-orientation estimation algorithm that pinpoints vanishing points could make self-driving cars safer.

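The mention of vanishing points suggests the standard geometry: the vanishing point of lines parallel to the road back-projects through the camera intrinsic matrix K to the road’s 3D direction in camera coordinates, from which pan and tilt angles follow. Here is a sketch under assumed intrinsics — the textbook recipe, not necessarily the Korean team’s exact algorithm:

```python
# A hedged sketch of orientation-from-vanishing-point estimation: the
# vanishing point of road-parallel lines back-projects through assumed
# camera intrinsics K to the road's 3-D direction, giving pan and tilt.
import numpy as np

K = np.array([[800.0,   0.0, 640.0],    # assumed intrinsics: focal 800 px,
              [  0.0, 800.0, 360.0],    # principal point at image center
              [  0.0,   0.0,   1.0]])

def pan_tilt_from_vanishing_point(u, v):
    d = np.linalg.inv(K) @ np.array([u, v, 1.0])    # back-projected ray
    d /= np.linalg.norm(d)
    pan  = np.arctan2(d[0], d[2])                   # left/right off road axis
    tilt = np.arctan2(-d[1], np.hypot(d[0], d[2]))  # up/down
    return np.degrees(pan), np.degrees(tilt)

# Vanishing point 40 px right of and 15 px above image center:
print(pan_tilt_from_vanishing_point(680.0, 345.0))  # ~ (2.9, 1.1) degrees
```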


Mar 1, 2020

How China is using AI and big data to combat coronavirus outbreak

Posted in categories: biotech/medical, information science, robotics/AI, surveillance

Authorities in China step up surveillance and roll out new artificial intelligence tools to fight deadly epidemic.

Mar 1, 2020

Meet Xenobot, an Eerie New Kind of Programmable Organism

Posted in categories: bioengineering, information science

Under the watchful eye of a microscope, busy little blobs scoot around in a field of liquid—moving forward, turning around, sometimes spinning in circles. Drop cellular debris onto the plain and the blobs will herd it into piles. Flick any blob onto its back and it’ll lie there like a flipped-over turtle.

Their behavior is reminiscent of a microscopic flatworm in pursuit of its prey, or even a tiny animal called a water bear—a creature complex enough in its bodily makeup to manage sophisticated behaviors. The resemblance is an illusion: These blobs consist of only two things, skin cells and heart cells from frogs.

Writing today in the Proceedings of the National Academy of Sciences, researchers describe how they’ve engineered so-called xenobots (named for Xenopus laevis, the frog species whence their cells came) with the help of evolutionary algorithms. They hope that this new kind of organism—contracting cells and passive cells stuck together—and its eerily advanced behavior can help scientists unlock the mysteries of cellular communication.
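
An evolutionary algorithm in this setting repeatedly mutates candidate body plans, scores them, and keeps the best. The minimal loop below captures that shape; the voxel representation and toy fitness function are stand-ins for the researchers’ actual physics simulator:

```python
# A minimal evolutionary-algorithm loop of the kind used to design body
# plans in simulation. The fitness function here is a toy stand-in; the
# actual work scored candidate designs in a physics simulator.
import numpy as np

rng = np.random.default_rng(0)
POP, GENS, SHAPE = 50, 100, (4, 4, 4)    # 4x4x4 voxel body plans

def fitness(body):                       # toy: reward an even mix of passive
    return -abs(body.mean() - 0.5)       # (0) and contractile (1) cells

def mutate(body):
    child = body.copy()
    i = tuple(rng.integers(0, 4, size=3))
    child[i] = 1 - child[i]              # flip one voxel's cell type
    return child

population = [rng.integers(0, 2, SHAPE) for _ in range(POP)]
for gen in range(GENS):
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP // 2]               # keep the best half
    population = survivors + [mutate(p) for p in survivors]

print(max(fitness(p) for p in population))  # best design's score, near 0.0
```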

Feb 28, 2020

AI Is an Energy-Guzzler. We Need to Re-Think Its Design, and Soon

Posted in categories: information science, robotics/AI

Of course, the computers and data centers that support AI’s complex algorithms are very much dependent on electricity. While that may seem pretty obvious, it may be surprising to learn that AI can be extremely power-hungry, especially when it comes to training the models that enable machines to recognize your face in a photo or Alexa to understand a voice command.

The scale of the problem is difficult to measure, but there have been some attempts to put hard numbers on the environmental cost.

For instance, one paper published on the open-access repository arXiv claimed that the carbon emissions for training a basic natural language processing (NLP) model—algorithms that process and understand language-based data—are equal to the CO2 produced by the average American lifestyle over two years. A more robust model required the equivalent of about 17 years’ worth of emissions.
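
Those figures trace to Strubell et al. (2019). Taking their reported numbers — roughly 626,155 lbs of CO2-equivalent to train a Transformer with neural architecture search, against about 36,156 lbs per average American per year — reproduces the “about 17 years” ratio:

```python
# Reproducing the "about 17 years" ratio from the figures reported in
# Strubell et al. (2019), the arXiv paper referenced above.
NAS_TRANSFORMER_LBS_CO2E = 626_155   # Transformer + neural architecture search
AMERICAN_LBS_CO2E_PER_YEAR = 36_156  # average per-capita annual emissions

print(NAS_TRANSFORMER_LBS_CO2E / AMERICAN_LBS_CO2E_PER_YEAR)  # ~17.3 years
```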