The Qualcomm Snapdragon 855 is a high-end chip that powers phones such as the Galaxy S10 and OnePlus 7 Pro. Qualcomm isn't ready to move on to its next-generation design just yet, but it is turbocharging the 855 for upcoming phones. The aptly named 855+ is a Snapdragon 855 with just a little more power.
The 855 included some notable changes from Qualcomm's previous SoC, and the 855+ builds on them. Like the 855, the 855+ is an octa-core chip with the traditional split between four high-power cores and four low-power cores. One of those faster cores is the Prime core, which runs at a higher clock speed than the others to improve single-threaded performance. The Snapdragon 855+ still uses Kryo 485 cores derived from the ARM Cortex-A76, but the Prime core has been bumped to 2.96 GHz, up from 2.84 GHz in the 855. The clock speed of the other high-power cores is unchanged at 2.42 GHz, and the slower A55-based cores still run at 1.8 GHz.
Qualcomm's Adreno GPUs are a bit of a black box, but the 855+'s Adreno 640 is apparently faster as well. All Qualcomm will say is that it's 15% faster than the Snapdragon 855's Adreno 640. Presumably, the clock speed has been raised from 585 MHz to somewhere in the upper 600s. The chip also has the same X24 LTE modem as the old 855.
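Qualcomm's 15% claim does line up with a clock in the upper 600s. A quick back-of-the-envelope check (our extrapolation from the article's numbers, not an official Qualcomm spec):

```python
# Estimate the 855+'s Adreno 640 clock from the 855's 585 MHz base
# clock and Qualcomm's claimed 15% GPU speedup.
base_clock_mhz = 585        # Adreno 640 clock in the Snapdragon 855
uplift = 0.15               # claimed GPU performance gain in the 855+

estimated_clock_mhz = base_clock_mhz * (1 + uplift)
print(f"Estimated 855+ GPU clock: {estimated_clock_mhz:.0f} MHz")
# 585 * 1.15 = 672.75, so a clock somewhere around 670 MHz
```

This assumes the entire 15% comes from clock speed alone, which is plausible for a mid-cycle refresh but not confirmed.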
The 855+ will be the company's go-to 5G chip for now, but only because it's a little faster than the old 855. The 5G setup is unchanged: OEMs still have to opt for the separate X50 5G modem, which connects to the 855+. The SoC itself only has built-in LTE, but Qualcomm's next-generation flagship mobile processor will almost certainly integrate the X55 modem, putting LTE and 5G on a single chip. That should save power and reduce the cost of 5G phones.
Qualcomm says we will start seeing phones with the Snapdragon 855+ in the second half of 2019. The first will apparently be the gaming-focused Asus ROG Phone 2, which has not been fully detailed yet; it launches on July 23rd. Meanwhile, the Samsung Galaxy Note 10 will be unveiled on August 7th, and that phone will likely sport the 855+ as well. Expect more "gaming" phones launched in 2019 to make a big deal of the overclocked 855+, but Qualcomm's 2020 chip is the one to watch.
Humans tend to think they have a pretty good handle on how the physical world works, but things get unspeakably strange at very small scales. Particles are not always particles, and sometimes these particles (or waves) behave in bizarre and counterintuitive ways. Quantum entanglement is one of the strangest aspects of physics, and scientists at the University of Glasgow have just captured the first image demonstrating the effect.
When two particles or molecules become entangled at the quantum level, they share one or more properties such as spin, polarization, or momentum. The effect persists even if you move one of the entangled objects far away from the other. Einstein famously called entanglement "spooky action at a distance," and he felt its existence meant there were gaping holes in the theory of quantum mechanics.
Scientists have successfully demonstrated quantum entanglement with photons, electrons, molecules of various sizes, and even very small diamonds. The University of Glasgow study is the first to capture visual evidence of the effect. The experiment used entangled pairs of photons and measured the phase of the particles, a setup known as a Bell entanglement.
The team produced photons with an ultraviolet laser fired through a crystal that entangled some of them. A beam splitter divided the beam into two equal "arms," with the members of each entangled pair taking different paths. Because they were entangled, they continued to share the same phase even after being separated.
One of the photons passed through a liquid crystal material that shifted it through four phase transitions (0, 45, 90, and 135 degrees). The team used an extremely sensitive camera to capture images of the other entangled photon, which had not passed through the filter. It nevertheless exhibited the same phase transitions as its partner. The image above shows an entangled pair at a 45-degree phase.
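Bell tests of this kind are conventionally quantified with the CHSH form of the Bell inequality. The bounds below are textbook values, not figures from the Glasgow paper:

```latex
% CHSH form of the Bell inequality. E(a,b) is the measured correlation
% between the two arms with analyser settings a and b.
S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
% Any local hidden-variable theory obeys |S| \le 2.
% Quantum mechanics permits up to |S| \le 2\sqrt{2} \approx 2.83
% (Tsirelson's bound), so a measured S > 2 certifies entanglement.
```

In other words, whenever the measured correlations push S above 2, no classical "local" explanation can account for them, and the pair must be entangled.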
Scientists believe quantum entanglement could have applications in quantum computing, data transmission, and even teleportation. For any of that to work, we need to study entanglement in much more detail. The University of Glasgow experiment could pave the way for new types of imaging that help us get a handle on this spooky action at a distance.
The conventional wisdom is that most major galaxies harbor supermassive black holes at their centers. Scientists also believed that only so-called "active" galaxies should have a visible accumulation of matter, but the Hubble Space Telescope has found a disk of material around a black hole with unusually low brightness. This galaxy may break the rules a bit, but it offers an opportunity to study how the theory of relativity applies in the real world.
NGC 3147 is a large spiral galaxy a little smaller than our own Milky Way, about 120 million light-years away; you've probably seen pictures of it because it's quite stunning. Active galaxies like quasars are easy to spot: the material falling into them produces emissions across the entire electromagnetic spectrum, and their accretion disks are clearly visible. Everyone thought NGC 3147 was far too dim to have a disk of its own, but a new analysis from an international team suggests otherwise.
Hubble collected data on the central black hole of NGC 3147, which is about 250 million times the mass of our sun. The object turns out to have a thin disk of material similar to what you would find around an active galactic nucleus. Hubble's observations show that the disk rotates at about 10% of the speed of light. Researcher Stefano Bianchi of Università degli Studi Roma Tre in Italy explains that this discovery shows current models of low-luminosity galaxies have "clearly failed."
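For a sense of scale, the Schwarzschild radius of a black hole this massive is a standard back-of-the-envelope calculation (our illustration using the 250-million-solar-mass figure from the article, not a number from the Hubble study):

```python
# Schwarzschild radius r_s = 2GM/c^2 for NGC 3147's central black hole.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg
AU = 1.496e11          # astronomical unit, m

M = 2.5e8 * M_sun                  # black hole mass in kg
r_s = 2 * G * M / c**2             # event-horizon radius in meters
print(f"Schwarzschild radius: {r_s / AU:.1f} AU")   # roughly 4.9 AU
```

That event horizon is comparable in radius to Jupiter's orbit around the Sun (about 5.2 AU), which gives some idea of the scale of the gravitational field the disk sits in.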
This discovery could have significant implications for how we model galaxies and black holes, and it could also provide a better understanding of the physics underlying supermassive objects. The fast-moving disk, combined with NGC 3147's apparently low luminosity, could allow tests of both general relativity and special relativity.
General relativity deals with the mechanics of gravity in the universe, and special relativity describes the relationship between space and time. Since NGC 3147 is dimmer than a typical active galaxy, the accretion disk surrounding its black hole can be observed much more clearly. The disk sits deep inside the black hole's powerful gravitational field, where scientists can study how that gravity affects the light the disk emits.
With the NGC 3147 findings in hand, astronomers can now go looking for other weakly active galaxies. Some may even be closer to Earth, where we could make even more accurate measurements to test Einstein's theories.
Google's DeepMind division has put considerable effort into applying artificial intelligence to problems such as computer vision and climate change, but it still makes time for gaming. DeepMind first dominated the game of Go, then moved on to StarCraft II, beating some of the best players in the world earlier this year. Now you could have a chance to play against the AlphaStar AI, but you will probably be destroyed.
Before challenging professional players, DeepMind simulated more than 200 years of StarCraft II gameplay to train the bot. It's a convolutional neural network that began by absorbing replays of StarCraft II pro matches. By pitting models against one another, DeepMind trained several "agents" capable of building and fighting as well as human players (better, in fact). AlphaStar won 10 of 11 matches against professionals.
Those earlier matches were an impressive demonstration of AI prowess. AlphaStar had a better grasp of resource allocation, map coverage, and unit micromanagement than most human players. Naturally, its ability to seamlessly control multiple groups of units helped as well, although it did not need as many actions as human players to win.
The experiment will run on Blizzard's European servers, where a small number of humans will be matched against AlphaStar in 1v1 games. Players have to opt in for a chance to compete with the AI, but they will not know when they are facing it. Unfortunately, there is no way to guarantee you will get to play against AlphaStar.
Blizzard will let DeepMind run several different agents on Battle.net, and they will operate differently from the earlier demo. The new AlphaStar can play as, and against, any of the three StarCraft II races (previously it was Protoss only). It also relies on a normal camera view of the game, whereas the earlier version used a bird's-eye view of the entire map. DeepMind has also limited AlphaStar's actions per minute (APM).
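DeepMind has not published the exact throttling mechanism for these ladder agents, but an APM cap can be sketched as a simple sliding-window rate limiter. This is purely illustrative; the class name and the cap value are made up for the example:

```python
from collections import deque

class APMLimiter:
    """Rejects actions once a cap is hit within a sliding 60-second window.
    Illustrative sketch only; AlphaStar's real throttling is not public."""

    def __init__(self, max_apm):
        self.max_apm = max_apm
        self.timestamps = deque()   # times of recently allowed actions

    def allow(self, now):
        # Discard allowed actions that fell out of the 60-second window.
        while self.timestamps and now - self.timestamps[0] >= 60.0:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_apm:
            self.timestamps.append(now)
            return True
        return False

limiter = APMLimiter(max_apm=300)   # hypothetical cap of 300 APM
# Attempt 10 actions per second for one minute (600 attempts total).
allowed = sum(limiter.allow(t * 0.1) for t in range(600))
print(allowed)   # only 300 of the 600 attempted actions get through
```

The point of such a cap is fairness: it forces the agent to win through decision quality rather than superhuman mechanical speed.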
DeepMind is primarily interested in testing AlphaStar in matches where players change their strategies, and keeping the matchups anonymous ensures a controlled test. After being presumably crushed by the AI, players will see their rankings affected as if they had played a human opponent. DeepMind will use the results of this test to inform future AI research, and the outcomes of the matches will appear in a forthcoming scientific paper.
The Japanese probe Hayabusa2 has just made history, again. The probe collected samples from the surface of the asteroid Ryugu earlier this year, then bombed the asteroid a few months later. Now the probe has made another trip to the surface to pick up pristine material that was buried beneath it.
The Japan Aerospace Exploration Agency (JAXA) launched Hayabusa2 in 2014, and it took four years to rendezvous with Ryugu, which orbits some 300 million kilometers from Earth. The probe began by deploying barrel-shaped hopping robots to get the lay of the land, then descended to collect its first sample in February 2019.
Hayabusa2 carries several tantalum slugs, which are fired into the surface during each touchdown. The impact kicks particles up into the probe's collection horn; the first touchdown gathered material from the asteroid's top layer. The most recent collection is crucial because it contains material from deeper inside Ryugu.
In April, Hayabusa2 deployed its Small Carry-on Impactor (SCI). The SCI used a shaped charge of HMX explosive to fire a 2.5-kilogram (5.5-pound) copper impactor at Ryugu at more than 2 kilometers per second. The result was a small artificial crater. The regolith on Ryugu's surface has been bombarded by solar radiation for eons, but just below it is pristine material that has barely changed since the birth of the solar system. That's what Hayabusa2 was after this time.
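With a 2.5-kilogram projectile moving at 2 km/s, the impact energy is easy to estimate. This is a rough illustration from the article's figures; the actual velocity at impact may have differed:

```python
# Kinetic energy of the SCI impactor: KE = 1/2 * m * v^2.
mass_kg = 2.5            # impactor mass, from the article
velocity_ms = 2000.0     # "more than 2 km per second", per the article

ke_joules = 0.5 * mass_kg * velocity_ms**2
print(f"Impact energy: {ke_joules / 1e6:.1f} MJ")       # 5.0 MJ
print(f"TNT equivalent: {ke_joules / 4.184e6:.2f} kg")  # roughly 1.2 kg of TNT
```

Five megajoules is modest by terrestrial standards, but on a low-gravity rubble pile like Ryugu it was enough to excavate a crater several meters across.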
(PPTD) These images were taken before and after touchdown by the small monitor camera (CAM-H). The first is 4 seconds before touchdown, the second is at the moment of touchdown, and the third is 4 seconds after. In the third image, you can see the amount of rock kicked up. pic.twitter.com/ssZU5TV3x9
– HAYABUSA2@JAXA (@haya2e_jaxa) July 11, 2019
This touchdown required careful preparation; any problem could have meant losing the first sample still aboard Hayabusa2. The probe touched down about 20 meters from the center of the crater, using its second tantalum slug to collect the ejecta. Visual examination of the area showed its color differed from other parts of the asteroid, so the team is confident the collected material came from inside Ryugu.
Hayabusa2 will remain in orbit around Ryugu for several more months, departing in November or December. The spacecraft will return to Earth by the end of 2020, after which scientists can examine the Ryugu samples up close. JAXA does not know exactly how much of Ryugu is coming back to Earth, but it hopes to recover up to 100 mg of asteroid material. That does not sound like much, but it will be the only perfectly preserved solar system sample from an object farther away than the moon.
Cars are starting to drive themselves, but what about planes? Autopilot has been around for years, but a team of researchers at the Technical University of Munich (TUM) has now demonstrated what may be the most autonomous aircraft landing to date. The automatic landing system, called C2Land, uses computer vision to identify the runway and guide the plane to a safe landing.
Commercial planes currently use the Instrument Landing System (ILS), which helps when pilots cannot make visual contact with a runway. It relies on ground-based radio signals and onboard receivers to determine the aircraft's position; C2Land does not need that infrastructure. The costly ILS equipment will never be installed at every runway, but C2Land needs nothing on the ground except the runway itself. That could make it practical at smaller, more remote airports where expensive ILS gear isn't feasible.
C2Land uses GPS for flight control along with a computer vision system. The cameras operate in both the visible and infrared ranges, allowing the system to work even in low-visibility conditions. The computer finds the outline of the runway and determines where the plane should touch down. Combined with the plane's current speed and altitude, the system can calculate a virtual descent path for a perfect landing.
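The final step, turning altitude and distance-to-touchdown into a descent path, is simple trigonometry. This is a toy sketch under hypothetical approach numbers, not the actual C2Land algorithm, which also fuses GPS and vision data:

```python
import math

def glide_angle_deg(altitude_m, distance_m):
    """Angle of the straight-line descent path to the touchdown point."""
    return math.degrees(math.atan2(altitude_m, distance_m))

# Hypothetical final-approach geometry: 300 m altitude, 5.7 km from touchdown.
angle = glide_angle_deg(300.0, 5700.0)
print(f"Required glide angle: {angle:.1f} degrees")
# Close to the standard 3-degree glideslope used on most ILS approaches.
```

A real system would recompute this continuously as the vision pipeline refines its estimate of the touchdown point.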
The team described the system in a series of three papers, but nothing beats a real-world test. They equipped a Diamond DA42 propeller plane with the C2Land system and put a test pilot at the controls. Pilot Thomas Wimmer flew the plane into the vicinity of the runway, but from there everything was up to the computer.
You can see the entire landing, along with the computer's image recognition, in the video above. Wimmer never needed to take control of the plane during the landing. It's not technically the first autonomous landing ever; Boeing accomplished that with a prototype passenger air vehicle. However, that was a VTOL (vertical take-off and landing) craft. The TUM system works with the kinds of aircraft in service today.
Vision-assisted navigation systems like C2Land may become standard in the future, but for now this is still an experiment. Before that happens, researchers need to be confident enough to put a human in the aircraft and hand the controls over to an algorithm.