Archive for the ‘supercomputing’ category: Page 62

May 17, 2020

Hacked European Supercomputers Turned to Crypto Mining

Posted by in categories: bitcoin, cybercrime/malcode, supercomputing

Several supercomputers in Europe have been hacked in the past few days. The attackers are thought to have used these machines to mine Monero (XMR).

A large-scale attack was carried out on supercomputers based in Germany, the UK and Switzerland. The incidents first surfaced on Monday with an announcement from the University of Edinburgh, which explained that a vulnerability had been detected on the login nodes of the supercomputer known as ARCHER and that the system had been taken offline. Administrators also had to reset SSH passwords to contain the attack.

The attacks were not limited to the UK. The German organization bwHPC also issued a statement on Monday, announcing that five supercomputers in Germany had been shut down due to “vulnerabilities” similar to those found in the UK.

May 15, 2020

Meet the Intern Using Quantum Computing to Study the Early Universe

Posted by in categories: cosmology, education, nanotechnology, quantum physics, supercomputing

With the help of the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, Juliette Stecenko is exploring cosmology—a branch of astronomy that investigates the origin and evolution of the universe, from the Big Bang to today and into the future. As an intern through DOE’s Science Undergraduate Laboratory Internships (SULI) program, administered at Brookhaven by the Office of Educational Programs (OEP), Stecenko is using modern supercomputers and quantum computing platforms to perform astronomy simulations that may help us better understand where we came from.

Stecenko works under the guidance of Michael McGuigan, a computational scientist in the quantum computing group at Brookhaven’s Computational Science Initiative. The two have been collaborating on simulating Casimir energy—a small force that two electrically neutral surfaces held a tiny distance apart will experience from quantum fluctuations in the vacuum of space. The vacuum energy of the universe and the Casimir pressure of this energy could be a possible explanation of the origin and evolution of the universe, as well as a possible cause of its accelerated expansion.

“Casimir energy is something scientists can measure in the laboratory and is especially important for nanoscience, or in cosmology, in the very early universe when the universe was very small,” McGuigan said.
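
For a sense of scale, the textbook idealized result for two perfectly conducting parallel plates separated by a distance d gives a Casimir pressure of P = −π²ħc/(240 d⁴). The short sketch below evaluates that formula with standard physical constants; the one-micrometer separation is an illustrative value and this is not a quantity computed in the Brookhaven work.

    import math

    # Physical constants (SI units)
    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    c = 2.99792458e8         # speed of light, m/s

    def casimir_pressure(d):
        """Idealized (attractive) Casimir pressure between two perfectly
        conducting parallel plates separated by distance d, in meters."""
        return -math.pi**2 * hbar * c / (240.0 * d**4)

    def casimir_energy_per_area(d):
        """Corresponding vacuum energy per unit plate area, in J/m^2."""
        return -math.pi**2 * hbar * c / (720.0 * d**3)

    d = 1e-6  # illustrative separation of 1 micrometer
    print(f"Casimir pressure at d = 1 um: {casimir_pressure(d):.3e} Pa")
    print(f"Energy per unit area at d = 1 um: {casimir_energy_per_area(d):.3e} J/m^2")

At a micrometer the pressure comes out to roughly a millipascal, which is why the effect matters at the nanoscale and in a very small early universe but is negligible at everyday distances.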

May 11, 2020

Supercomputer Simulations Identify Several Drugs as Potential Candidates Against COVID-19

Posted by in categories: biotech/medical, supercomputing

Drugs used to treat hepatitis C might also help against COVID-19 / World Health Organization publishes paper presented by researchers from Mainz University.

Several drugs approved for treating hepatitis C viral infection were identified as potential candidates against COVID-19, the disease caused by the SARS-CoV-2 coronavirus. This is the result of research based on extensive calculations using the MOGON II supercomputer at Johannes Gutenberg University Mainz (JGU). One of the most powerful computers in the world, MOGON II is operated by JGU and the Helmholtz Institute Mainz.

As the JGU researchers explained in their paper recently published on the World Health Organization (WHO) website, they had simulated the way that about 42,000 different substances listed in open databases bind to certain proteins of SARS-CoV-2, with the aim of identifying compounds that could inhibit the virus from penetrating human cells or from replicating.
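
The general approach is a form of virtual screening: each candidate compound is scored against a viral target protein, and the library is then ranked by that score. The toy sketch below illustrates only the ranking step; the compound names, scores, and cutoff are entirely hypothetical placeholders, not the actual MOGON II workflow, which would obtain binding scores from docking or molecular-dynamics software.

    # Toy illustration of ranking a compound library by a predicted binding score.
    # All values below are hypothetical placeholders; a real pipeline would get
    # them from docking calculations against a SARS-CoV-2 protein structure.

    candidate_scores = {
        # hypothetical compound -> hypothetical binding energy in kcal/mol
        # (more negative means stronger predicted binding)
        "compound_A": -9.1,
        "compound_B": -6.4,
        "compound_C": -8.7,
        "compound_D": -5.2,
    }

    def top_candidates(scores, cutoff=-8.0):
        """Return compounds whose predicted binding energy beats the cutoff,
        strongest predicted binders first."""
        hits = [(name, s) for name, s in scores.items() if s <= cutoff]
        return sorted(hits, key=lambda pair: pair[1])

    for name, score in top_candidates(candidate_scores):
        print(f"{name}: {score:.1f} kcal/mol")

The expensive part in practice is computing the scores for tens of thousands of compounds, which is why a supercomputer like MOGON II is used; the ranking itself is trivial.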

May 5, 2020

Four years of calculations lead to new insights into muon anomaly

Posted by in categories: particle physics, supercomputing

Two decades ago, an experiment at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory pinpointed a mysterious mismatch between established particle physics theory and actual lab measurements. When researchers gauged the behavior of a subatomic particle called the muon, the results did not agree with theoretical calculations, posing a potential challenge to the Standard Model—our current understanding of how the universe works.

Ever since then, scientists around the world have been trying to verify this discrepancy and determine its significance. The answer could either uphold the Standard Model, which defines all of the known subatomic particles and how they interact, or introduce the possibility of entirely new physics. A multi-institutional research team (including Brookhaven, Columbia University, RIKEN, and the universities of Connecticut, Nagoya, and Regensburg) has used Argonne National Laboratory’s Mira supercomputer to help narrow down the possible explanations for the discrepancy, delivering a newly precise theoretical calculation that refines one piece of this very complex puzzle. The work, funded in part by the DOE’s Office of Science through its Office of High Energy Physics and Advanced Scientific Computing Research programs, has been published in the journal Physical Review Letters.

A muon is a heavier version of the electron and has the same electric charge. The measurement in question is of the muon’s magnetic moment, which defines how the particle wobbles when it interacts with an external magnetic field. The earlier Brookhaven experiment, known as Muon g-2, examined muons as they interacted with an electromagnet storage ring 50 feet in diameter. The experimental results diverged from the value predicted by theory by an extremely small amount measured in parts per million, but in the realm of the Standard Model, such a difference is big enough to be notable.
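
To put “parts per million” in perspective, the quantity being compared is the muon’s anomalous magnetic moment, a_mu = (g − 2)/2. The sketch below uses approximate, rounded published values for the BNL measurement and a representative Standard Model prediction (illustrative figures, not results from the paper discussed above) to show how the gap works out to a few parts per million and a few standard deviations.

    # Approximate values of the muon anomalous magnetic moment a_mu = (g - 2)/2.
    # These are rounded illustrative figures, not results from the paper above.
    a_mu_experiment = 116_592_089e-11   # BNL E821 measurement (approximate)
    sigma_experiment = 63e-11           # quoted experimental uncertainty (approximate)
    a_mu_theory = 116_591_810e-11       # representative Standard Model value (approximate)
    sigma_theory = 43e-11               # theoretical uncertainty (approximate)

    delta = a_mu_experiment - a_mu_theory
    combined_sigma = (sigma_experiment**2 + sigma_theory**2) ** 0.5

    print(f"Discrepancy:   {delta:.2e}")
    print(f"Relative size: {delta / a_mu_experiment * 1e6:.1f} parts per million")
    print(f"Significance:  {delta / combined_sigma:.1f} standard deviations")

On these rough numbers the discrepancy is only a couple of parts per million of a_mu, yet several times larger than the combined uncertainties, which is why theorists keep refining individual pieces of the Standard Model prediction.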

May 2, 2020

Could Photonic Chips Outpace the Fastest Supercomputers?

Posted by in categories: encryption, quantum physics, robotics/AI, supercomputing

There’s been a lot of talk about quantum computers being able to solve far more complex problems than conventional supercomputers. The authors of a new paper say they’re on the path to showing an optical computer can do so, too.

The idea of using light to carry out computing has a long pedigree, and it has gained traction in recent years with the advent of silicon photonics, which makes it possible to build optical circuits using the same underlying technology used for electronics. The technology shows particular promise for accelerating deep learning, and is being actively pursued by Intel and a number of startups.

Now Chinese researchers have put a photonic chip to work tackling a fiendishly complex computer science challenge called the subset sum problem, in a paper in Science Advances. It has some potential applications in cryptography and resource allocation, but primarily it’s used as a benchmark to test the limits of computing.
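
For reference, the subset sum problem asks whether some subset of a list of integers adds up exactly to a given target. A standard classical baseline is dynamic programming, whose cost grows with the list length times the target value, which is what makes large instances a useful benchmark. Below is a minimal classical sketch with illustrative numbers; the photonic chip in the paper takes a different, hardware-based approach.

    def subset_sum(numbers, target):
        """Classical dynamic-programming check: can some subset of `numbers`
        (non-negative integers) sum exactly to `target`?"""
        reachable = [True] + [False] * target   # reachable[s]: some subset sums to s
        for n in numbers:
            # iterate downward so each number is used at most once
            for s in range(target, n - 1, -1):
                if reachable[s - n]:
                    reachable[s] = True
        return reachable[target]

    # Illustrative instance
    print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True  (4 + 5 = 9)
    print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False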

May 2, 2020

Transhumanism 2.0 (Full Documentary)

Posted by in categories: cybercrime/malcode, cyborgs, education, Elon Musk, genetics, neuroscience, quantum physics, robotics/AI, supercomputing, transhumanism

TABLE OF CONTENTS
0:00–15:11: Introduction
15:11–36:12: CHAPTER 1: POSTHUMANISM
a. Neurotechnology b. Neurophilosophy c. Teilhard de Chardin and the Noosphere.

POSTHUMAN TECHNOLOGY


Apr 28, 2020

World’s first 3D simulations of superluminous supernovae

Posted by in categories: cosmology, physics, supercomputing

For most of the 20th century, astronomers have scoured the skies for supernovae—the explosive deaths of massive stars—and their remnants in search of clues about the progenitor, the mechanisms that caused it to explode, and the heavy elements created in the process. In fact, these events create most of the cosmic elements that go on to form new stars, galaxies, and life.

Because no one can actually see a supernova up close, researchers rely on simulations to give them insights into the physics that ignites and drives the event. Now, for the first time ever, an international team of astrophysicists has simulated the three-dimensional (3D) physics of superluminous supernovae—which are about a hundred times more luminous than typical supernovae. They achieved this milestone using Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) CASTRO code and supercomputers at the National Energy Research Scientific Computing Center (NERSC). A paper describing their work was published in the Astrophysical Journal.

Astronomers have found that these superluminous events occur when a magnetar—the rapidly spinning corpse of a massive star whose magnetic field is trillions of times stronger than Earth’s—is in the center of a young supernova. Radiation released by the magnetar is what amplifies the supernova’s luminosity. But to understand how this happens, researchers need multidimensional simulations.

Apr 17, 2020

Artificial Intelligence as a Godlike Tool for Experimentation

Posted by in categories: habitats, robotics/AI, supercomputing

When we think of the interaction between mankind and any type of artificial intelligence in mythology, literature, and pop culture, the outcomes are always negative for humanity, if not apocalyptic. In Greek mythology, the blacksmith god Hephaestus created automatons who served as his attendants, and one of them, Pandora, unleashed all the evils into the world. Mary Shelley wrote the character of the Monster in her 1818 novel Frankenstein as the product of the delusions of grandeur of a scientist named Victor Frankenstein. In pop culture, the most notable cases of a once-benign piece of technology running amok are the supercomputer HAL in 2001: A Space Odyssey and the intelligent machines that overthrow mankind in The Matrix. Traditionally, our stories about the god-like creative impulse of man bring about something that will overthrow the creators themselves.

The artificial intelligence-powered art exhibition Forging the Gods, curated by Julia Kaganskiy and currently on view at Transfer Gallery, attempts to portray the interaction between humans and machines in a more nuanced manner, showcasing how this relationship already permeates our everyday lives. The exhibition also shows how this relationship is, indeed, fully reflective of the human experience — meaning that machines are no more or less evil than we actually are.

Lauren McCarthy, with her works “LAUREN” (2017) and its follow-up “SOMEONE” (2019), riffs on the trend of smart homes: in the former, she installs and controls remote-controlled networked devices in the homes of some volunteers and plays a human version of Alexa, reasoning that she will be better than Amazon’s virtual assistant because, being a human, she can anticipate people’s needs. The follow-up “SOMEONE” was originally a live media performance consisting of a four-channel video installation (made to look like a booth one can find at The Wing) where gallery-goers would play human versions of Alexa themselves in the homes of some volunteers, who would have to call for “SOMEONE” if they needed something from their smart-controlled devices. Unfortunately, what we see at Forging the Gods is recorded footage of the original run of the performance, so we have to forgo playing God by, say, making someone’s lighting system annoyingly flicker on and off.

Apr 15, 2020

Folding@home is now 15 times faster than any current supercomputer

Posted by in categories: biotech/medical, supercomputing

By itself, your PC is not anywhere near as powerful as a supercomputer. Don’t worry, neither is mine, or anyone else’s I know. But while none of us has the computing resources to single-handedly unlock the secrets of a virus, there is strength in numbers. As such, the collective efforts of PC users far and wide have propelled the Folding@home project to crunch data at a pace that is 15 times faster than IBM’s Summit, the top supercomputer in the world.

The developers of Folding@home have been posting periodic updates on Twitter, and according to the latest one, the distributed computing project is currently cranking out around 2.4 exaFLOPs of computational power.

With our collective power, we are now at ~2.4 exaFLOPS (faster than the top 500 supercomputers combined)! We complement supercomputers like IBM Summit, which runs short calculations using 1000s of GPUs at once, by spreading longer calculations around the world in smaller chunks! pic.twitter.com/fdUaXOcdFJ April 13, 2020
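
As a rough sanity check on the headline figure, Summit’s measured Linpack performance at the time was around 148.6 petaFLOPS (an approximate published benchmark number, and Linpack and Folding@home workloads are not directly comparable), so 2.4 exaFLOPS works out to roughly fifteen to sixteen times that.

    # Order-of-magnitude comparison of Folding@home's reported throughput
    # with Summit's Linpack benchmark. Figures are approximate and the
    # workloads differ, so this is only an indicative check.
    folding_at_home_flops = 2.4e18    # ~2.4 exaFLOPS reported by the project
    summit_linpack_flops = 148.6e15   # ~148.6 petaFLOPS (approximate TOP500 figure)

    ratio = folding_at_home_flops / summit_linpack_flops
    print(f"Folding@home is roughly {ratio:.0f}x Summit's Linpack performance")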

Apr 14, 2020

Supercomputing future wind power rise

Posted by in categories: energy, supercomputing, sustainability

Wind power surged worldwide in 2019, but can that growth be sustained? More than 340,000 wind turbines generated over 591 gigawatts globally. In the U.S., wind powered the equivalent of 32 million homes and sustained 500 U.S. factories. What’s more, in 2019 wind power grew by 19 percent, thanks to booming offshore and onshore projects in the U.S. and China.

A study by Cornell University researchers used supercomputers to look into how wind power could make an even bigger jump in its share of U.S. electricity generation.

“This research is the first detailed study designed to develop scenarios for how wind energy can expand from the current levels of seven percent of U.S. electricity supply to achieve the 20 percent by 2030 goal outlined by the U.S. Department of Energy National Renewable Energy Laboratory (NREL) in 2014,” said study co-author Sara C. Pryor, a professor in the Department of Earth and Atmospheric Sciences at Cornell University. Pryor and her co-authors published the study in Scientific Reports in February 2020.
