Categories
Technology

China claims it built a quantum computer that’s 10 billion times faster than Google’s

State researchers in China recently developed a quantum computing system that is said to be 10 billion times faster than Google’s “Sycamore” machine. If so, it would be a significant milestone for the field.

Up front: As far as we can tell, it’s true. After a quick look at the research paper and at the peer response, it’s evident that the Chinese team managed to do something extraordinary here.

Here is the jargon from the paper itself:

Quantum computers promise to perform certain tasks that are believed to be intractable for classical computers. Boson sampling is one such task and is considered a strong candidate to demonstrate the quantum computational advantage. We performed Gaussian boson sampling by sending 50 indistinguishable single-mode squeezed states into a 100-mode ultralow-loss interferometer with full connectivity and random matrix (the whole optical setup is phase-locked) and sampling the output using 100 high-efficiency single-photon detectors.

Basically, the researchers say they built a quantum computing machine that uses light to perform one very specific task (boson sampling) in order to demonstrate and measure its effectiveness.
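To give a sense of why boson sampling is believed to be classically hard: the probability of each output photon pattern is governed by the permanent of a matrix, and the best known exact classical algorithms for the permanent, such as Ryser’s formula, take exponential time. A minimal illustrative sketch (not from the paper itself):

```python
from itertools import combinations

def permanent(matrix):
    """Exact permanent of an n x n matrix via Ryser's formula.

    Unlike the determinant, no polynomial-time algorithm is known;
    this runs in O(2^n * n^2), which is why sampling from
    permanent-weighted distributions chokes classical machines."""
    n = len(matrix)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in matrix:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

permanent([[1, 2], [3, 4]])  # → 10.0 (the determinant would be -2)
```

Even at 50 photons, the 2^n term makes a direct classical simulation astronomically expensive, which is the heart of the advantage claim.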

Background: The reason this is important is that, in theory, quantum computers can solve really difficult problems. We’re talking about the kind of problems that physicists and computer scientists estimate would take a classical machine thousands of years to solve.

In 2019, Google claimed it had developed the first machine to demonstrate “quantum supremacy.” This just means that, supposedly, a quantum computer was able to do something that a classical computer either couldn’t do at all or couldn’t do in a reasonable amount of time.

Google claimed that its system, a 53-qubit machine built around a quantum chip called “Sycamore,” could solve a specific problem that a supercomputer couldn’t. Unfortunately for Google, IBM was quick to dispute this claim. According to Big Blue, it could solve the same problem on one of its classical supercomputers in just a few days – using algorithms that already existed at the time of Google’s announcement.

Quick take: What China has done is quite different from what Google did. In essence, China has built a machine that can only perform the experiment demonstrating quantum advantage. In other words, it doesn’t actually solve any problems, which makes calling it a computer a bit of a stretch.

The Google machine, on the other hand, is “programmable”. This means that in theory it could be tweaked to solve one or more problems.

That doesn’t mean what China did isn’t a breakthrough. Pushing the boundaries of what quantum science can achieve is the goal of everyone working in this field. The various laboratories around the world working on building quantum computing machines use different approaches because while the future holds promise for the field, we are still taking the first theoretical steps towards useful quantum computers.

China’s methods may have produced the most recent breakthrough, but as Lu Chaoyang, the professor who led the experiment, told the Financial Times:

Building a quantum computer is a race between humans and nature, not between countries.

You can find more information on quantum computing in our primer here.

Published on December 4, 2020 – 18:47 UTC


Who’s liable if a self-driving car has an accident?

As self-driving cars gain momentum in today’s automotive landscape, the issue of legal liability in the event of an accident has become more relevant.

Studies of the interaction between humans and vehicles have shown time and again that even driving automation systems – such as adaptive cruise control, which keeps the vehicle at a set speed and distance from the car ahead – are anything but error-free.

Recent evidence suggests that drivers have a limited understanding of what these systems can and cannot do (also known as their mental models), which contributes to system misuse.

The video above is a webinar on the dangers of advanced driver-assistance systems.

There are many issues worrying the self-driving car world, including imperfect technology and the lukewarm public acceptance of autonomous systems. There is also the question of legal liability. In particular, what are the legal responsibilities of the human driver and the automaker who built the self-driving car?

Trust and Accountability

In a study recently published in Humanities and Social Sciences Communications, the authors approach the problem of excessive driver trust, and the system misuse that results from it, from a legal perspective. They examine what self-driving car manufacturers should be legally required to do to ensure that drivers understand how to use their vehicles appropriately.

One solution suggested in the study is to require buyers to sign end-user license agreements (EULAs), similar to the terms users must accept when setting up a new computer or software product. To obtain consent, manufacturers could use the ubiquitous touchscreens installed in most new vehicles.

The problem is that this is far from ideal, or even safe. The interface may also fail to give the driver enough information, creating confusion about the nature of the consent forms and their implications.

There’s another problem: most end users don’t read EULAs. A 2017 Deloitte study shows that 91% of people agree to them without reading; among younger people, the share is even higher, with 97% agreeing without reviewing the terms.

Unlike using a smartphone app, operating a car carries considerable safety risks, whether the driver is a human or software. Human drivers must consent to take responsibility for the outcomes of the software’s and hardware’s actions.

“Warning fatigue” and distracted driving are also causes for concern. For example, a driver annoyed by continuous warnings might choose to simply ignore the messages, and a message displayed while driving can itself be a distraction.

Given these limitations and concerns, the manner in which this consent is to be obtained is unlikely to fully protect automakers from their legal liability in the event the system fails or an accident occurs.

Driver training for self-driving vehicles could help drivers fully understand system capabilities and limitations. This needs to go beyond the point of purchase – recent evidence shows that even the information provided by the dealership won’t necessarily answer drivers’ questions.

Against this background, the road ahead for self-driving cars is unlikely to be a smooth one.

This article by Francesco Biondi, Assistant Professor of Human Kinetics at the University of Windsor, is republished from The Conversation under a Creative Commons license. Read the original article.


SHIFT is brought to you by Polestar. It’s time to accelerate the transition to sustainable mobility. That’s why Polestar combines electric driving with state-of-the-art design and exciting performance. Find out how.

Published on December 5, 2020 – 16:00 UTC


The best cheap drone deals for December 2020: DJI and Parrot

Have you always wanted to fly drones but held back because of the cost? Drone prices keep falling as more of them take to the skies. Affordable drones are now within reach of most people and are no longer just a toy for those with high disposable incomes. These days, you can find good but cheap drones for less than $500 – sometimes far less – if you grab one while it’s on sale. To make that easier, we’ve rounded up the best cheap drone deals available right now.


How to choose a drone

The right drone for you depends on what you want to do with it. A cheap drone – especially one under $250 – has the fewest features, and the ones it lacks may make it a poor fit for your needs.

In general (and this is not a hard and fast rule), a cheap drone will usually shoot 720p video at a relatively low frame rate, typically 30 frames per second. While that’s fine for most of us, the video will lack a certain cinematic smoothness. Stepping up to a mid-range drone (usually in the $250 to $750 range) gets you 1080p video, and often 60 fps, resulting in higher-quality footage.

Most modern high-end drones offer 4K video, but you’ll likely have to spend more than $1,000 to get 4K at 60 frames per second. If video quality is a primary concern, expect to pay more.

A cheap drone also lacks other useful features, such as subject tracking and video and flight stabilization. In ideal flight conditions this isn’t a problem, but you’ll want these features if you plan to fly in varied conditions or operate the drone with minimal user input.

We’ve also found that many cheaper drones either don’t avoid obstacles at all or don’t do it as well as more expensive drones. If you fly in wide-open spaces, this isn’t a big issue. However, if you plan to fly in areas with obstacles nearby, make sure the drone you choose has adequate obstacle avoidance capabilities.

Do drones make noises?

All drones make noise. Most describe it as a buzzing sound, similar to a bee. The propellers spin at extremely high speeds, and that’s what produces the noise. It’s most noticeable when you’re close to the drone, but you’ll hardly hear it once it climbs higher into the air.

Can you fly a drone at night?

Most drones can be flown at night, although we would not recommend doing so until you have a lot of experience. We recommend keeping your drone in view when flying at night. Drones that can be deployed at night have lights so that they can be seen during night flight.

Remember that most drones don’t have night vision capabilities. Therefore, the video you record during the night flight will only be lit by available ambient lights like moonlight, street lights, etc.

Can you fly a drone in the rain?

Most drone manufacturers advise against flying your drone in the rain. Most cheap drones are neither waterproof nor water-resistant, so even small amounts of water can damage your drone, especially the motor and battery. Moisture can cause a short circuit that makes your drone stop working with little or no warning. As a rule, then, do not fly your drone in rain, fog, or excessively humid conditions.

If you get caught in these conditions, land as soon as possible, move your drone to a dry place, disconnect the battery, and let it dry. You may also want to gently shake the drone to get water out of the inner casing and let that dry as well – techniques similar to drying out a wet phone work here too. The next time you fly it, take some time to test the drone at a low altitude before returning to normal flying.

Do drones have to be registered?

The Federal Aviation Administration requires you to register your drone, depending on how you plan to fly it, and to mark your drone with the registration number you’re issued. Any drone weighing more than 250 grams and less than 55 pounds must be registered, which covers nearly every drone currently on the market. The registration must be renewed every three years.

You must also carry proof of registration with you at all times during a flight, and under a recreational registration you are not allowed to fly for commercial purposes. Drone flights are only permitted below 400 feet and in Class G (uncontrolled) airspace. It is your responsibility to follow these rules.

In some states, drone pilots must comply with additional regulations. So be sure to check the laws of the state in which you are flying before you take off.
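As a quick sanity check, the weight rule above boils down to a single comparison. A minimal sketch using the thresholds stated here (the helper name and examples are illustrative, not an official FAA tool):

```python
GRAMS_PER_POUND = 453.592

def needs_faa_registration(weight_grams):
    """Apply the weight rule described above: drones over 250 g
    and under 55 lb must be registered with the FAA."""
    return 250 < weight_grams < 55 * GRAMS_PER_POUND

# A 249 g ultralight drone squeaks under the threshold:
needs_faa_registration(249)  # → False
# A typical ~900 g camera drone does not:
needs_faa_registration(905)  # → True
```

This is why several popular consumer drones are marketed at exactly 249 grams.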

Do you need a license to fly a drone?

No license is currently required for recreational or hobby drone pilots. However, additional certifications may be required to fly for commercial purposes.

We strive to help our readers find the best deals on quality products and services, and we carefully and independently select what we cover. The prices, details and availability of the products and offers in this post are subject to change at any time. Make sure they are still valid before making a purchase.

Digital Trends can earn commissions on products purchased through our links, which supports the work we do for our readers.





Mapping the entire ocean floor – from the sky?


A friend of mine who works in game design recently showed me a 3D model of the Earth, rendered in great detail from topographically accurate satellite data, that let us fly at high speed through canyons and over our respective neighborhoods like a pair of superhumans. “Let’s see if we can go underwater,” he said enthusiastically as we soared over the Pacific.

We could not. The model, amazingly accurate on land, apparently had no data with which to render the underwater environment. There was an unrendered void beneath the glassy surface of the water, as if this were a subaquatic version of The Truman Show and we had reached the end of the world.

Neither of us was particularly surprised. The shock would have been if the oceans had been rendered. Where would that information have come from? And how accurate would it have been? It would have meant that the makers of the model knew something that even the world’s leading oceanographers don’t.

Despite all the justified excitement surrounding space exploration in the 2020s (Elon Musk is “very confident” that humans will be heading to Mars by 2026), our planet’s oceans remain a largely unknown and unexplored domain that lies much closer to home. Water covers roughly 71 percent of the Earth’s surface, with the fresh water we drink making up a tiny 3 percent of it, little more than a rounding error. And the vast majority of Earth’s oceans – up to 95 percent – are an unexplored mystery.

While we’re still far from a Google Street View equivalent for the underwater world, a new project carried out by researchers at Stanford University could pave the way for something like it in the future – and much more. Imagine being able to fly an airplane over a body of water and see with absolute clarity what’s hiding beneath the waves.

It sounds impossible. As it turns out, it’s just very, very difficult.

The problem with lidar, the problem with sonar

“Mapping underwater environments from an airborne system is a challenging task, but it has many potential applications,” Aidan James Fitzpatrick, a PhD student in the electrical engineering department at Stanford University, told Digital Trends.

The obvious candidate for this imaging job is lidar, the bounced-laser technology best known for helping autonomous vehicles (Tesla excepted) perceive the world around them. It sends out pulsed waves of light and then measures how long they take to bounce off objects and return to the sensor. From that, the sensor can calculate how far each light pulse has traveled and build up an image of the world around it. Self-driving cars remain the most popular use of lidar, but it can also serve as a powerful mapping tool in other contexts. In 2016, for example, researchers used it to uncover a long-lost city hidden under dense jungle canopy in Cambodia.
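The arithmetic behind that time-of-flight principle is simple: the pulse covers the distance to the target twice, so halve the round trip. A minimal sketch (the constant and function are illustrative, not from any particular lidar SDK):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s in vacuum; close enough in air

def lidar_range(round_trip_seconds):
    """Range to a target from a lidar pulse's round-trip time.

    The pulse travels out and back, so the one-way distance
    is half the total path the light covered."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 1 microsecond hit something ~150 m away:
r = lidar_range(1e-6)  # ≈ 149.9 m
```

Real systems layer beam steering, noise filtering, and multi-return processing on top, but the core range equation is exactly this.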

But lidar is not suited to this particular assignment. Although advanced, high-powered lidar systems perform well in extremely clear water, much of the ocean – especially coastal waters – is cloudy and opaque. According to Fitzpatrick, most underwater imaging to date has instead relied on in-water sonar systems, which use sound waves that travel easily through murky water.

Unfortunately, there’s a catch here too. In-water sonar systems are mounted on, or towed behind, a slow-moving boat. Airborne imaging from an aircraft would be far more effective, since it could cover a much larger area in less time. But that’s a non-starter with sonar alone, because sound waves cannot pass from air into water and back again without losing 99.9999 percent of their energy.

Coming to PASS

While lidar and radar systems have mapped virtually the entire landscape of the Earth (emphasis on the “land”), only about 5 percent of global waters have been subject to similar imaging and mapping. That’s the equivalent of a world map that shows only Australia and leaves the rest dark, like an unexplored Age of Empires map.

“Our goal is to propose a technology that can be mounted on a flying vehicle to provide extensive coverage, while using an imaging technique that is robust in murky water,” said Fitzpatrick. “For this purpose, we have designed a photoacoustic airborne sonar system. PASS uses the advantages of light propagation in air and sound propagation in water to map underwater environments from an airborne system.”


PASS works as follows: first, a special custom laser system fires a burst of infrared light that is absorbed by the first centimeter of water. Once that absorption occurs, the water thermally expands, creating sound waves that travel down into the water.

“These sound waves now act as a sonar signal in the water that was generated remotely by the laser,” continued Fitzpatrick. “The sound waves are reflected by underwater objects and travel back to the surface of the water. Some of this sound – only about 0.06 percent – passes through the air-water interface and travels toward the airborne system. Highly sensitive sound receivers, or transducers, record these sound waves. The transducers [then] convert the sound energy into electrical signals, which can be passed through image reconstruction algorithms to create a viewable image.”
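As a back-of-the-envelope illustration of the sonar leg of that chain: once the laser has generated a sound pulse at the surface, depth follows directly from the acoustic round-trip time and the speed of sound in water. A minimal sketch (the nominal 1,500 m/s figure and the function are my illustration, not from the paper):

```python
SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, a typical nominal value

def depth_from_echo(round_trip_seconds):
    """Depth of a reflector given the two-way travel time of the
    laser-generated sound pulse through the water column."""
    return SPEED_OF_SOUND_SEAWATER * round_trip_seconds / 2.0

# An echo returning after 40 ms implies a reflector roughly 30 m down:
depth = depth_from_echo(0.040)  # ≈ 30 m
```

The hard part of PASS is not this arithmetic but recovering the faint fraction of sound that makes it back across the air-water interface.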

The things that are below

PASS is still a work in progress. The team has demonstrated high-resolution three-dimensional imaging in a controlled laboratory environment. Fitzpatrick acknowledged that this took place in a “container the size of a large aquarium,” although the technology is now “close to the stage” where it could be deployed over a large swimming pool.


There is, of course, a slight difference between a large swimming pool and all of the world’s oceans, and bridging it will take a lot more work. Specifically, one major challenge that must be resolved before testing in larger, less controlled environments is how to image through water with turbulent surface waves. Fitzpatrick said this is a head-scratcher, but one that “certainly has workable solutions,” some of which are already being worked on.

“PASS could be used to map the depths of uncharted waters, study biological environments, search for lost shipwrecks, and possibly much more,” he said. “Isn’t it strange,” he added, “that we have yet to explore all of the Earth we live on? Maybe PASS can change that.”

Combining light and sound to crack the air-water interface problem would be a game changer. And then? Bring on the mapping-drone army to finally show us what lies beneath the ocean’s surface.

An article describing the PASS project was recently published in the journal IEEE Access.
