Categories
Technology

This AI film is a glimpse into the future of text-to-video generators

If you’re impressed by the recent spate of text-to-image generators, get ready for the next step in AI artistry: text-to-video.

While the huge compute costs and scarcity of text-to-video datasets have stunted the technique’s growth, recent research has brought the promise closer to reality.

A computer artist called Glenn Marshall has given a glimpse at the potential.

The Belfast-based composer recently won the Jury Award at the Cannes Short Film Festival for his AI film The Crow.

Marshall had previously earned plaudits for an AI-generated Daft Punk video, but he applied a different approach to The Crow.

While his earlier technique turned text into random visual mutations, The Crow uses an underlying film as an image reference.

“I had been heavily getting into the idea of AI style transfer using video footage as a source,” Marshall told TNW.

“So every day I would be looking for something on YouTube or stock video sites, and trying to make an interesting video by abstracting it or transforming it into something different using my techniques.

“It was during this time I discovered Painted on YouTube — a short live action dance film — which would become the basis of The Crow.”

Marshall fed the video frames of Painted to CLIP, a neural network created by OpenAI.

He then prompted the system to generate a video of “a painting of a crow in a desolate landscape.”
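CLIP itself doesn’t generate images; it scores how well an image matches a piece of text, and a CLIP-guided generator nudges its output to raise that score. As a rough illustration of the guidance signal, here is a minimal sketch using OpenAI’s open-source CLIP to rate video frames against Marshall’s prompt. This is not his actual pipeline, and the frame paths are hypothetical:

```python
import torch
import clip  # OpenAI's CLIP: pip install git+https://github.com/openai/CLIP.git
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Hypothetical frames extracted from the source video "Painted"
frame_paths = ["frames/frame_0001.png", "frames/frame_0002.png"]
prompt = clip.tokenize(["a painting of a crow in a desolate landscape"]).to(device)

with torch.no_grad():
    text_features = model.encode_text(prompt)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    for path in frame_paths:
        image = preprocess(Image.open(path)).unsqueeze(0).to(device)
        image_features = model.encode_image(image)
        image_features /= image_features.norm(dim=-1, keepdim=True)
        # Cosine similarity: higher means the frame better matches the prompt.
        # A CLIP-guided generator iteratively updates each frame to raise this score.
        score = (image_features @ text_features.T).item()
        print(path, round(score, 4))
```

Generators such as VQGAN are commonly paired with this score: the image is updated step by step so that the similarity climbs, which is what made “CLIP-guided” art possible.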

My AI film ‘The Crow’ wins Jury Award at Cannes!

— Glenn Marshall (@GlennIsZen) August 24, 2022

Marshall says the outputs required little cherry-picking. He attributes this to the similarity between the prompt and underlying video, which depicts a dancer in a black shawl mimicking the movements of a crow.

“It’s this that makes the film work so well, as the AI is trying to make every live action frame look like a painting with a crow in it, so I’m meeting it half way, and the film becomes kind of a battle between the human and the AI — with all the suggestive symbolism.”

In the future, Marshall wants to add 3D animation to his AI creations. He’s also exploring CLIP-guided video generation, which can add detailed text-based directions, such as specific camera movements.

That could lead to entire feature films produced by text-to-video systems. Yet Marshall believes even his current techniques could attract mainstream recognition.

He says The Crow is now eligible for submission to the prestigious BAFTA Awards.

“I haven’t got a speech prepared, but I fantasize about collecting an award, in the role of a herald of AI, and proclaiming to the star-studded audience that [for] each and every one of you, actor, director, set designer, costume designer, artist, composer… AI is coming, and you’ll find yourself in a very different job soon — or out of a job altogether.”

Categories
Technology

Nvidia RTX 4090: news, specs, price, release date, and more

If you’ve been following the GPU market these past couple of years, it’s been a wild ride. But now that we’re at the tail end of a massive spike in GPU prices, Nvidia is set to release its next big flagship GPU, the RTX 4090.

Details and leaks about the upcoming 40-series have been all over the place, with some extreme rumors about power consumption and price. Either way, it’s likely to take a spot among the most powerful GPUs you can buy. We don’t know what will end up being true, but here’s everything we’ve heard so far.

Release date

We won’t know any of the final details or pricing until Nvidia officially launches the RTX 4090. But we do know that Nvidia traditionally drops new graphics cards every two years in the fall, which makes sometime between mid-September and the end of October the most likely scenario.

The last time around was at the height of a global health and supply crisis. While things are definitely chaotic this year, chip supply chains, at least, are in much better shape than they were in 2020. This means the RTX 4090 could get into consumers’ hands much faster than the RTX 3090.

To add more support to an upcoming fall release, Nvidia leaked a testing schedule back in late May and included August and September for mass production of the unit.

[Image: Nvidia’s development schedule. Credit: Igor’s Lab]

Of course, this could still be off. After all, a rumor back in May pegged the RTX 4090 release date for mid-July. Obviously, this didn’t happen, and it doesn’t fit with Nvidia’s customary fall release.

On the other hand, it is possible we could see Nvidia delay the event into early 2023. The reason is simple: They have too many RTX 3000-series chips on hand. Logistics bottlenecks during the pandemic meant chips piled up in warehouses, and Nvidia may hope to offload some more of these 2020 GPUs before dumping a new version onto the market.

Right now, rumors suggest that Nvidia will release only the RTX 4090 and hold off on the mid-range 4000-series cards while they offload surplus stock. This would help bring down pricing on older, but still capable, chips, and put them more in line with AMD chips.

Price

The current RTX 3090, arguably the king of graphics cards right now, dropped from a high of $1,999 and has settled around $1,499. The RTX 4090 will probably end up north of that number, although we can’t see it reaching $2,000 territory.

Pricing is important right now, especially with AMD nipping at Nvidia’s heels. The AMD Radeon RX 7000 series of GPUs promise to be beasts as well, but at a lower price. The current-gen AMD Radeon RX 6900 XT retails for $900, or $500 less than the current Nvidia RTX 3090 Ti. Nvidia’s recent financial woes in the gaming sphere could even prompt Nvidia to get more aggressive with pricing, but we’ll have to wait and see.

Architecture rumors

According to the rumors, the Nvidia RTX 4090 will most likely include 16,384 CUDA cores, 144 ray tracing cores, and 576 tensor cores. Potentially, this could give it a sizzling 40% performance increase over the RTX 3090.

It will also include 24GB of VRAM, the same as the 3090. However, Twitter leaker Kopite7kimi says the RTX 4090 will have a base clock speed of 2,235MHz and can boost up to 2,520MHz, well above the RTX 3090’s 1,394MHz base and 1,695MHz boost clocks.

Spec: RTX 4090 vs. RTX 3090
CUDA cores: 16,384 vs. 10,496
Ray tracing cores: 144 vs. 82
Tensor cores: 576 vs. 328
VRAM: 24GB GDDR6X vs. 24GB GDDR6X
Base clock speed: 2,235MHz vs. 1,394MHz
Boost clock speed: 2,520MHz vs. 1,695MHz
Bus width: 384-bit vs. 384-bit

Power draw

If rumors are to be believed, the RTX 4090 will be a power-chewing monster with an unquenchable appetite for electricity. Put more succinctly, it will require up to 600 watts of power when pushed to its maximum. With the way some are already downclocking their GPUs to keep heat and noise down, this amount of power could be a problem.

Although it will come with a standard 12-pin PCIe Gen 5 power connector, chances are there will need to be some kind of adapter in order to draw that kind of power. It could be built in or may need to be purchased separately. Nvidia usually doesn’t nickel-and-dime consumers, so we don’t see the latter being likely.

Still, 600 watts is a lot of power. The previous-gen RTX 3090 gulps up to 385 watts, which is already a lot. The RTX 4090 will move enough energy to heat a small room, meaning it’ll be well past time to upgrade your power supply.

[Image: Gigabyte Aorus P1200W power supply. Credit: Jacob Roach / Digital Trends]

According to tech watchdog Igor’s Lab, the RTX 4090 will need 450 watts just to power the GPU. The VRAM, fans, inductor, caps, and MOSFET will need the rest.

Because of this massive need for power, the RTX 4090 will get hot. We’re talking “surface of the sun” levels of heat, at least by GPU standards, and enough to throttle the card under sustained load. It will require a massive cooling unit to keep temperatures under control.

We reported on just such a thing back in June. The RTX 4090 has a three-slot BFGPU cooler much like the RTX 3090 Ti, but with more fins and more heat sink surface. It will likely keep the same dual flow-through design as the RTX 3090, and the logo will be LED-lit.

Performance

Power doesn’t necessarily translate into performance, but the RTX 4090 needs to find a way to convert that energy into pure graphical fulfillment. There’s no denying that high-end Nvidia GPUs are among the best in the world, even if they’re overpriced and hard to get. But the RTX 4090 could end up being on a different level altogether.

For starters, the card will supposedly be able to achieve a score of 19,000 on a Time Spy Extreme benchmark test, according to kopite7kimi. This would make it the fastest GPU ever built and nearly double the RTX 3090’s score of 10,000. Other rumors suggest the RTX 4090 can hit 4K 160 fps with both ray tracing and DLSS enabled. Gamers are going to love this card, if these rumors prove correct.

But more shocking is the raw performance. The RTX 4090 is set to become the first GPU to break the 100 teraflops bar. These are truly phenomenal numbers, but let’s be real: This is an enthusiast card only. Performance such as this sits on the edges of consumer needs.
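For context on that 100-teraflop claim, peak FP32 throughput is commonly estimated as two floating-point operations (one fused multiply-add) per CUDA core per clock. A quick back-of-the-envelope check against the rumored specs:

```python
# Rough rule of thumb: peak FP32 TFLOPS = 2 ops per core per clock
# (one fused multiply-add) x CUDA cores x clock speed in GHz.
cuda_cores = 16_384       # rumored RTX 4090 core count
boost_clock_ghz = 2.520   # rumored boost clock

tflops = 2 * cuda_cores * boost_clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS")  # ~82.6
```

By that rule of thumb, the rumored boost clock yields roughly 83 TFLOPS, so breaking 100 TFLOPS would take sustained clocks near 3GHz; treat the round number as an aggressive rumor.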

Gamers will be able to play Cyberpunk 2077 or Elden Ring at full settings. The only problem could be what your electricity bill will look like later.

Categories
Sport

Baltimore Ravens mascot, Poe, carted off field after being injured during halftime youth football game

BALTIMORE — The Baltimore Ravens’ bad run of luck on injuries apparently did not end last season.

Poe, the Ravens’ mascot, was carted off the field during a mascot vs. youth football game at halftime of Baltimore’s preseason finale against the Washington Commanders on Saturday night. Poe had to be lifted into the cart by three people and had to have his left leg stabilized.

The injury occurred when Poe was caught from behind and got pushed to the ground by a teenage football player.

Ravens coach John Harbaugh walked onto the field and checked on Poe, who lay on the field for nearly five minutes because many were unsure whether he was really injured. Poe’s injury nearly delayed the start of the second half. He was taken off the field about a minute before the start of the third quarter.

Asked for an injury update on Poe, Harbaugh said, “I knew you guys were going to ask me that.”

Harbaugh then added in jest, “No updates on that. There will be an MRI tomorrow, for sure.”

It was a surprisingly rough game between the mascots and youth football players. One of the players got shoved off his feet by a bear mascot as he tried to get into the end zone.

A replacement Poe emerged midway through the third quarter, which drew cheers from fans at M&T Bank Stadium. He gave high fives to fans.

Despite the loss of Poe, the Ravens were able to hold off Washington to win their 23rd straight preseason game. Injuries derailed Baltimore’s season last season, when 25 players were placed on injured reserve at some point. But this marked the first injured mascot for the Ravens.

With a smile, a Ravens spokesman said after the game that there will be no injury updates until Week 1 of the regular season.

Categories
Entertainment

Oscar Mayer Introduces Hot Dog Popsicles

If you enjoy a hot dog, you can have it as a popsicle, thanks to Oscar Mayer, CNN reports.

The company announced it will introduce the “Cold Dog,” a frozen popsicle that tastes like its hot dog, with “both refreshing and smokey, umami notes” and a mustard swirl.

The popsicle comes after the brand polled lovers of the food, and followers of its Instagram account declared the idea “genius.”

“After the overwhelming fan excitement for our beloved Cold Dog, it was a no-brainer to make this hot dog-inspired frozen pop a reality,” said Anne Field, an Oscar Mayer spokesperson, in a press release.

The frozen wiener isn’t available nationwide just yet. Right now, you can find it in Long Beach, New York City, New Orleans, and Alpharetta, GA, at Popbar locations for just $2.

Oscar Mayer is known for making shocking food items. Earlier this year, the Kraft-owned company also introduced a bologna face mask that sold out on Amazon.

Roomies, what do you think of this?

Categories
Health

CDC cautiously optimistic monkeypox outbreak could be slowing

The Centers for Disease Control and Prevention is cautiously optimistic that the US is slowing the spread of monkeypox as new cases fall in several major cities.

“We’re watching this with cautious optimism, and really hopeful that many of our harm-reduction messages and our vaccines are getting out there and working,” CDC Director Dr. Rochelle Walensky told reporters Friday during an update on the monkeypox outbreak.

Although monkeypox cases are still increasing nationally, the speed of the outbreak appears to be slowing, Walensky said. The US has reported nearly 17,000 monkeypox cases since May, more than any other country in the world, according to CDC data.

In New York City, which has reported more infections than any other jurisdiction, new monkeypox cases have dropped from more than 70 per day on average to nine as of Thursday, according to data from the city health department.

Dr. Ashwin Vasan, the city health commissioner, said earlier this week the outbreak has slowed due to increased vaccination and community outreach efforts. New York City has reported a total of 2,888 monkeypox cases.

In Chicago, another major epicenter of the outbreak, new cases have dropped from 141 during the week ended July 30 to 74 for the week ended Aug. 20, according to that city’s health department. Chicago has reported a total of 807 cases.

“We’re not seeing the potentially exponential growth that we were seeing early on so that is reassuring,” said Dr. Allison Arwady, Chicago’s public health commissioner, during a Facebook live event earlier this week. “Too early to say things look really good, but definitely some signs of slowing of cases.”

The US is nearing the point where the entire community of gay and bisexual men who currently face the greatest health risk from monkeypox will have access to two doses of the monkeypox vaccine, according to Dawn O’Connell, head of the office responsible for the national stockpile at the Health and Human Services Department.

The CDC previously estimated that up to 1.7 million gay and bisexual men who are HIV-positive or are eligible for medicine to reduce their chance of contracting HIV face the greatest health risk from monkeypox.

The US has distributed 1.5 million doses of the monkeypox vaccine so far, and more than 3 million doses should be available by the time the latest distribution round is complete, according to O’Connell.

To date, the outbreak is disproportionately affecting Black and Hispanic men. About 30% of monkeypox patients are white, 32% are Hispanic and 23% are Black, according to CDC data. Whites make up about 59% of the US population while Hispanics and Blacks account for 19% and 13%, respectively.

The monkeypox vaccine, called Jynneos in the US, is administered in two doses 28 days apart. The patients will not have full protection from the vaccine until two weeks after the second dose is administered, according to the CDC. Data from 19 jurisdictions show that nearly 97% of the shots administered so far were first doses, according to Walensky.

About 94% of monkeypox cases are associated with sexual contact and nearly all of the people who have contracted the virus are men who have sex with men, according to Demetre Daskalakis, the deputy head of the White House monkeypox response team.

A CDC survey of 824 gay and bisexual men found that 48% of respondents have reduced their number of sexual partners and 50% have reduced one-time sexual encounters during the current outbreak. A separate CDC study found that a 40% decrease in one-time sexual encounters would reduce the final percentage of gay and bisexual men infected with monkeypox by up to 31%.

“We’re actually seeing vaccine get out, behaviors change, harm reduction messages being heard and implemented,” Walensky said. “And all of that working together to bend the curve.”

Categories
Technology

Meta wants to supercharge Wikipedia with an AI upgrade

Wikipedia has a problem. And Meta, the not-too-long-ago rebranded Facebook, may just have the answer.

Let’s back up. Wikipedia is one of the largest-scale collaborative projects in human history, with more than 100,000 volunteer human editors contributing to the construction and maintenance of a mind-bogglingly large, multi-language encyclopedia consisting of millions of articles. Upward of 17,000 new articles are added to Wikipedia each month, while tweaks and modifications are continuously made to its existing corpus of articles. The most popular Wiki articles have been edited thousands of times, reflecting the very latest research, insights, and up-to-the-minute information.

The challenge, of course, is accuracy. The very existence of Wikipedia is proof positive that large numbers of humans can come together to create something positive. But in order to be genuinely useful and not a sprawling graffiti wall of unsubstantiated claims, Wikipedia articles must be backed up by facts. This is where citations come in. The idea – and for the most part this works very well – is that Wikipedia users and editors alike can confirm facts by adding or clicking hyperlinks that track statements back to their source.

Citation needed

Say, for example, I want to confirm the entry in President Barack Obama’s Wikipedia article stating that Obama traveled to Europe and then Kenya in 1988, where he met many of his paternal relatives for the first time. All I have to do is look at the citations for the sentence and, sure enough, there are three separate book references that seem to confirm that the fact checks out.

By contrast, the words “citation needed” are probably the two most damning in all of Wikipedia, precisely because they suggest there’s no evidence the author didn’t conjure the claim out of the digital ether. Affixed to a Wikipedia statement, they’re the equivalent of telling someone a fact while making finger quotes in the air.

Citations don’t tell us everything, though. If I were to tell you that, last year, I was the 23rd highest-earning tech journalist in the world and that I once gave up a lucrative modeling career to write articles for Digital Trends, it appears superficially plausible because there are hyperlinks to support my delusions.

The fact that the hyperlinks don’t support my alternative facts at all, but rather lead to unrelated pages on Digital Trends, is only revealed when you click them. The 99.9 percent of readers who have never met me might leave this article with a slew of false impressions, not the least of which is the surprisingly low barrier to entry into the world of modeling. In a hyperlinked world of information overload, in which we increasingly splash around in what Nicholas Carr refers to as “The Shallows,” the mere existence of citations appears to be a factual endorsement.

Meta wades in

But what if citations are added by Wikipedia editors even when they don’t link to pages that actually support the claims? As an illustration, a recent Wikipedia article on Blackfeet Tribe member Joe Hipp described how Hipp was the first Native American boxer to challenge for the WBA World Heavyweight title and linked to what seemed to be an appropriate webpage. However, the webpage in question mentioned neither boxing nor Joe Hipp.

In the case of the Joe Hipp claim, the Wikipedia factoid was accurate, even if the citation was inappropriate. Nonetheless, it’s easy to see how this could be used, either deliberately or otherwise, to spread misinformation.

[Image: Mark Zuckerberg introduces Facebook’s new name, Meta.]

It’s here that Meta thinks it’s come up with a way to help. Meta AI (the AI research and development lab for the social media giant) has developed what it claims is the first machine learning model able to automatically scan hundreds of thousands of citations at once to check whether they support the corresponding claims. While this would be far from the first bot Wikipedia uses, it could be among the most impressive — although it’s still in the research phase, and not in use on actual Wikipedia.

“I think we were driven by curiosity at the end of the day,” Fabio Petroni, research tech lead manager for the FAIR (Fundamental AI Research) team of Meta AI, told Digital Trends. “We wanted to see what was the limit of this technology. We were absolutely not sure if [this AI] could do anything meaningful in this context. No one had ever tried to do something similar [before].”

Understanding meaning

Trained using a dataset consisting of 4 million Wikipedia citations, Meta’s new tool is able to effectively analyze the information linked to a citation and then cross-reference it with the supporting evidence. And this isn’t just a straightforward text string comparison, either.

“There is a component like that, [looking at] the lexical similarity between the claim and the source, but that’s the easy case,” Petroni said. “With these models, what we have done is to build an index of all these webpages by chunking them into passages and providing an accurate representation for each passage … That is not representing word-by-word the passage, but the meaning of the passage. That means that two chunks of text with similar meanings will be represented in a very close position in the resulting n-dimensional space where all these passages are stored.”
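To make that concrete, here’s a minimal sketch of the general technique (dense passage embeddings compared by cosine similarity) using the open-source sentence-transformers library. This illustrates the idea, not Meta’s actual model, and the example passages are invented:

```python
from sentence_transformers import SentenceTransformer, util

# Any sentence-embedding model works for illustration; Meta's system
# uses its own, much larger retrieval models.
model = SentenceTransformer("all-MiniLM-L6-v2")

claim = "Joe Hipp was the first Native American to challenge for the WBA heavyweight title."
passages = [
    "Hipp fought Bruce Seldon in 1995 for the WBA World Heavyweight championship.",
    "The Blackfeet Nation's reservation is located in northwestern Montana.",
    "Apple pie is a popular dessert in the United States.",
]

claim_vec = model.encode(claim, convert_to_tensor=True)
passage_vecs = model.encode(passages, convert_to_tensor=True)

# Cosine similarity in the embedding space: passages that *mean* something
# close to the claim score higher, even with little word overlap.
scores = util.cos_sim(claim_vec, passage_vecs)[0]
for passage, score in zip(passages, scores):
    print(f"{score.item():.3f}  {passage}")
```

Scaled up to an index of millions of web pages, that same nearest-neighbor lookup is what lets a model retrieve candidate evidence for a claim.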

[Image: A single-pane comic from xkcd about Wikipedia citations. Credit: xkcd]

Just as impressive as the ability to spot fraudulent citations, however, is the tool’s potential for suggesting better references. Deployed as a production model, this tool could helpfully suggest references that would best illustrate a certain point. While Petroni balks at it being likened to a factual spellcheck, flagging errors and suggesting improvements, that’s an easy way to think about what it might do.

But as Petroni explains, there is still much more work to be done before it reaches this point. “What we have built is a proof of concept,” he said. “It’s not really usable at the moment. In order for this to be usable, you need to have a fresh index that indexes much more data than what we currently have. It needs to be constantly updated, with new information coming every day.”

This could, at least in theory, include not just text, but multimedia as well. Perhaps there’s a great authoritative documentary that’s available on YouTube the system could direct users toward. Maybe the answer to a particular claim is hidden in an image somewhere online.

A question of quality

There are other challenges, too. Notable in its absence, at least at present, is any attempt to independently grade the quality of sources cited. This is a thorny area in itself. As a simple illustration, would a brief, throwaway reference to a subject in, say, the New York Times prove a more suitable, high-quality citation than a more comprehensive, but less-renowned source? Should a mainstream publication rank more highly than a non-mainstream one?

Google’s trillion-dollar PageRank algorithm – certainly the most famous algorithm ever built around citations – had this built into its model by, in essence, equating a high-quality source with one that had a high number of incoming links. At present, Meta’s AI has nothing like this.
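For comparison, the heart of PageRank is small enough to sketch: a page’s score is the probability that a random link-following surfer lands on it, computed by power iteration. A toy version, with an invented four-page link graph:

```python
import numpy as np

# Toy link graph: adjacency[i][j] = 1 if page i links to page j
adjacency = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 0],
], dtype=float)

n = adjacency.shape[0]
damping = 0.85  # probability the surfer follows a link vs. jumping anywhere

# Row-normalize into a transition matrix (dangling pages would need handling)
transition = adjacency / adjacency.sum(axis=1, keepdims=True)

rank = np.ones(n) / n
for _ in range(100):  # power iteration converges quickly on small graphs
    rank = (1 - damping) / n + damping * transition.T @ rank

print(rank.round(3))  # page 2, with the most incoming links, ranks highest
```

Meta’s model, as described, scores semantic support rather than link structure; machinery like the above is roughly what a source-trustworthiness signal could add on top.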

If this AI were to work as an effective tool, it would need to have something like that. As a very obvious example of why, imagine that one were to set out to “prove” the most egregious, reprehensible opinion for inclusion on a Wikipedia page. If the only evidence needed to confirm that something is true is whether similar sentiments can be found published elsewhere online, then virtually any claim could prove technically correct — no matter how wrong it might be.

“[One area we are interested in] is trying to model explicitly the trustworthiness of a source, the trustworthiness of a domain,” Petroni said. “I think Wikipedia already has a list of domains that are considered trustworthy, and domains that are considered not. But instead of having a fixed list, it would be nice if we can find a way to promote these algorithmically.”

Categories
Technology

Scientist says greedy physicists have overhyped quantum tech

Nikita Gourianov, a physicist at Oxford University, yesterday published a scathing article full of wild, damning claims about the field of quantum computing and the scientists who work in it.

According to Gourianov, the quantum computing industry has been led astray by greedy physicists who’ve hyped up the tech’s possibilities in order to rip off VCs and get paid private-sector salaries for doing academic research.

Double, double

Per Gourianov’s article, the real problems started in the 2010s after investors started taking notice of the hype surrounding quantum physics:

As more money flowed in, the field grew, and it became progressively more tempting for scientists to oversell their results. With time, salesman-type figures, typically without any understanding of quantum physics, entered the field, taking senior positions in companies and focusing solely on generating fanfare. After a few years of this, a highly exaggerated perspective on the promise of quantum computing reached the mainstream, leading to a greed and misunderstanding taking hold and the formation of a classical bubble.

Gourianov’s entire premise seems to hinge on their assertion that “despite years of effort nobody has yet come close to building a quantum machine that is actually capable of solving practical problems.”

They illustrate their argument by pointing out that Rigetti, IonQ, and D-Wave (three popular quantum computing companies) combined have failed to turn a sufficient profit.

According to Gourianov:

The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money. The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about “how quantum computers will help their business”, as opposed to genuinely harnessing any advantages that quantum computers have over classical computers.

Finally, Gourianov’s conclusion leaves no doubt as to their feelings on the subject:

Well, when exactly the bubble will pop is difficult to say, but at some point the claims will be found out and the funding will dry up. I just hope that when the music stops and the bubble pops, the public will still listen to us physicists.

Toil and trouble

In the words of the great Jules Winnfield, Samuel L. Jackson’s character from the classic film Pulp Fiction, “Well, allow me to retort.”

I have but five words I’d like to say to Gourianov, and they are: IBM, Google, Amazon, Microsoft, and Intel.

I don’t think we need to do a deep dive into big tech’s balance sheets to explain that none of those companies are in any financial danger. Yet, each of them is developing quantum computers.

It’s unclear why Dr. Gourianov would leave big tech out of the argument entirely. There are dozens upon dozens of papers from Google and IBM alone demonstrating breakthrough after breakthrough in the field.

Gourianov’s primary argument against quantum computing appears, inexplicably, to be that they won’t be very useful for cracking quantum-resistant encryption. With respect, that’s like saying we shouldn’t develop surgical scalpels because they’re practically useless against chain mail armor.

Per Gourianov’s article:

Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.
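For readers unfamiliar with why Shor’s algorithm spooked security agencies: a quantum computer can efficiently find the period r of a^x mod N, and the rest of the attack on RSA-style keys is easy classical arithmetic. A toy sketch, with the quantum step replaced by brute force:

```python
from math import gcd

# Classical post-processing of Shor's algorithm. The quantum computer's only
# job is finding the period r of f(x) = a^x mod N; here we brute-force that
# step (the part that is exponentially hard classically).
N = 15   # toy semiprime, 3 * 5
a = 7    # base coprime to N; Shor picks this at random

r = 1
while pow(a, r, N) != 1:   # find the multiplicative order of a mod N
    r += 1                 # r = 4 here

# If r is odd or a^(r/2) = -1 mod N, Shor retries with a new base.
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1

x = pow(a, r // 2, N)                # 7^2 mod 15 = 4
print(gcd(x - 1, N), gcd(x + 1, N))  # -> 3 5: the factors of N
```

The caveat Gourianov points to is real: post-quantum schemes, lattice-based cryptography for instance, don’t rely on factoring or period finding at all.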

Here, Gourianov appears to suggest that at least some physicists have pulled a bait-and-switch on governments and investors by convincing everyone that we need quantum computers for security.

This argument feels a bit juvenile and like a borderline conspiracy theory. Governments around the world have been working in tandem with experts from companies such as Google spinout SandboxAQ and IBM for several years to address the encryption issue.

No serious person involved in the decision-making is going to be confused about how math works because of crappy marketing hype or a misleading headline.

Gourianov’s rhetoric reaches a peak as they appear to accuse physicists of manipulating the hype around quantum computing out of sheer greed:

Some physicists believe, in private, that there is no problem here: why not take advantage of the situation while it lasts, and take the easy money from the not-too-sophisticated investors? Earning a private-sector level salary whilst doing essentially academic research is a pretty good deal, after all.

That’s quite the accusation.

Quantum computing bubble?

On the whole, however, it feels like Gourianov’s chief complaint isn’t that quantum computers don’t work; it’s that they aren’t very useful. Dr. Gourianov isn’t wrong. The technology is far from mature.

But make no mistake, today’s quantum computing systems do work. They just don’t work well enough to replace classical computers for many useful functions — yet.

[Image: IBM’s quantum roadmap infographic. Credit: IBM]

Looking at the above roadmap, keep in mind that IBM was founded in 1911. It didn’t build the IBM 5150, the company’s first PC, until 1981.

Along the way, a lot of reputable scientists said the PC market was a bubble. The naysayers claimed it was not only pointless for consumers, but that there were just too many problems to overcome in order to make computers affordable and useful for personal use.

We all know how that worked out for IBM. Do we even need to get into what Intel, Microsoft, Amazon, and Google have accomplished over the course of their ventures? They each have their own roadmaps concerning how they’re approaching the STEM challenges involved in quantum computing. So do MIT, Harvard, Oxford, and myriad other universities.

I’m not a big tech shill by any means, but there’s a lot to be said about a fistful of companies, each worth somewhere around the one-trillion-dollar mark, deciding that a future-facing technology vertical is worth wagering their bankbooks on.

It’s beyond the scope of this article to address every non-trillion dollar company in the quantum computing field. But, having spoken to dozens of people working at various quantum computing companies, including the ones Gourianov mentioned, it’s clear to me that nobody building quantum computers has any misconceptions about their capabilities — not even the C-suite executives.

If VCs are confused and media hype is distorting the tech’s possibilities, I’d call that par for the course.

I can’t think of a single modern technology that mainstream journalism gets right all the time. And a significant portion of wealthy VCs are going to be both eager and ignorant about any given tech — shall we discuss AI or Web3 investments too?

The future is now

In my opinion, it would take a scientific shocker on par with discovering a viable antithesis for Newton’s laws for the bottom to fall out of the quantum computing industry. We’re not talking about a theoretical technology, we’re talking about a nascent one.

Quantum computers are here now. But like the IBM 5150 in 1981, they don’t really do anything that regular computers of their day can’t already do.

Still, I’d be interested in hearing what anyone who said the PC market was a bubble back in 1981 has to say about it now.

I imagine we’ll all see quantum computers differently in 40 years.

Perhaps quantum computing is a bubble market for VCs looking for short-term ROI projects, but the technology isn’t going anywhere.

There’s overwhelming evidence that today’s quantum computing technology is rapidly advancing to the point where it can help us solve problems that are infeasible for classical computation.

Maybe there are a bunch of greedy scientists out there peddling unwarranted optimism to VCs and entrepreneurs. But I’d wager that the curious scientists and engineers who chose this field because they actually want to build quantum computers outnumber them.

Categories
Science

The Science of Snowfall and Climate Snowjobs – Watts Up With That?

Jim Steele

As scientists who study what controls snowfall admit, there are “no easy answers” to the question of climate change and snow. Nonetheless, clickbait media doesn’t hesitate to fearmonger that we are on the verge of the “end of snow.”

However, the science of snowfall reveals how natural weather oscillations affect the transport of moisture, which determines changes in regional snowpack, and each region experiences unique conditions.

Jim Steele is director emeritus of San Francisco State University’s Sierra Nevada Field Campus, author of Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism, and a proud member of the CO2 Coalition.

Transcript below.

Welcome back everyone.

This video addresses another atrocious media claim, this time by Bloomberg Green, that very soon there will be an end to snow. This fear mongering has been pushed for over a decade now.

In 2000, Dr. David Viner, then a senior research scientist at the Climatic Research Unit of the University of East Anglia, predicted that within a few years winter snowfall would become “a very rare and exciting event” and that “children just aren’t going to know what snow is.”

But a quick review should show why such fear mongering is not supported by the science of snowfall.

Bloomberg’s journalists clearly do not understand the difference between natural weather oscillations and climate change. They compared the Sierra Nevada’s heavy snowfall in 2019 with low snowfall in 2022 as evidence of a declining trend in snowfall.

But they NEVER addressed the well-known effects of El Nino and the Pacific Decadal Oscillation that cause such variations.

Studies detailing 3-fold changes in Sierra Nevada snowfall over the years have been published, such as Christy’s 2010 research paper.

By ignoring a wealth of snowfall science, Bloomberg’s so-called journalists will be better known for their ridiculous end of snow predictions.

Both the media and alarmist scientists are guilty of cherry-picking just the decline in springtime snow extent to push their end-of-snow fears. But during the winter, snowfall has increased, and autumn snowfall has also increased.

Such contrasting trends again suggest that snow extent is not being controlled by global warming.

In an interview, Dr David Robinson, New Jersey’s state climatologist and head of the Rutgers University Global Snow Lab stated,

There are “no easy answers” to the question of climate change and snow.

Regions that cover less than 6% of the northern hemisphere explain 62% to 92% of the interannual variance in snow extent across the continents.

Snow will change in most places as the climate continues to warm, but exactly how and why may be among the most challenging questions about weather and climate change.

So, ignore the doomsayers! Let’s quickly examine why there are indeed “no easy answers.”

As temperatures fall, significant snowfall happens each winter in the northern two thirds of the United States. The locations illustrated here in red experience the heaviest snowfall and are governed by very different moisture transport dynamics, prohibiting any one size fits all analysis of changes in global snowfall.

First remember what every elementary school child is told; no two snowflakes are the same. That may not be entirely true, but it speaks to the varying conditions of temperature and moisture that control snowflake formation, creating a huge spectrum of differing snowflake crystals that produce different snowpacks, from heavy wet snow to dry powdery snow.

So, it could also be argued that no two snowpacks are exactly alike. The density, and thus the water content, of snowpacks may vary 3-fold, and depending on when the snow falls, the snowpack can become denser over time. Simply measuring the extent of snow cover from satellites therefore fails to determine how much water fell as snow. So, scientists use snow water equivalent measurements, but those measurements require time-consuming efforts and thus provide a very limited sample size of snow conditions.
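Snow water equivalent is simply snow depth times the ratio of snow density to water density, expressed as a depth of liquid water. A quick sketch of why extent and depth alone mislead (the density figures are typical textbook values):

```python
def swe_mm(depth_cm: float, density_kg_m3: float) -> float:
    """Snow water equivalent in mm of liquid water:
    depth x (snow density / water density)."""
    return depth_cm * 10 * density_kg_m3 / 1000.0  # cm -> mm; water ~1000 kg/m3

# Two snowpacks with the same satellite-visible extent and the same depth:
print(swe_mm(100, 100))  # fresh dry powder (~100 kg/m3) -> 100 mm of water
print(swe_mm(100, 300))  # dense, aged pack (~300 kg/m3) -> 300 mm, 3x the water
```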

Still, we do know that El Nino cycles and the Pacific Decadal Oscillation have very significant impacts on the snow water equivalent. Thus, the effects of natural oscillations must be considered for any analysis of snowfall trends.

Temperature and moisture have opposing effects on snowfall. As temperatures decrease to the point where snowflakes can form and reach the ground, the amount of moisture in the air decreases, reducing the amount of snow.

The northern hemisphere’s latitudes with greatest snowfall are also regions with the lowest winter atmospheric moisture.

Predictions that global warming will melt more snow, suggest the biggest declines in snow extent will happen at the relatively warmer southerly edge of the northern hemisphere’s snow extent.

But as surveyed by Kunkel 2016, the pattern of decreasing and increasing snowfall does not fit global warming expectations, again suggesting that the varied dynamics of moisture transport are the key to understanding snowfall variations.

Furthermore, many studies unscientifically simply assume a global average temperature affects all regions equally. But as illustrated by Cohen 2014, much of the mid latitudes have experienced winter cooling for the past 2 decades.

And in contrast to global warming hypotheses, despite cooling over most of Eurasia, that region has experienced less snowfall.

Warmer air holds more moisture. And it is the transport of that moisture to cooler regions that provides enough water vapor for significant snowfall.

If warm air at 20 degrees Celsius is cooled to the freezing point, it will precipitate over 60% of its moisture. Typically, atmospheric rivers bringing moisture from the warm tropics will dump the most snow when making landfall further northward.
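That 60% figure follows from the roughly exponential drop in saturation vapor pressure with temperature. A quick check using the Tetens approximation:

```python
import math

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    """Tetens approximation for saturation vapor pressure over water, in hPa."""
    return 6.1078 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

e20 = saturation_vapor_pressure_hpa(20.0)  # ~23.4 hPa
e0 = saturation_vapor_pressure_hpa(0.0)    # ~6.1 hPa

# Fraction of moisture that condenses out when saturated 20 C air cools to 0 C
print(f"{1 - e0 / e20:.0%}")  # ~74%, comfortably "over 60%"
```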

In contrast, cold Arctic air masses, averaging minus 10 to minus 30 degrees Celsius, hold insignificant amounts of moisture and cannot bring significant snowfall directly to the regions they pass over.

Nonetheless that cold air can cause warm air water vapor to precipitate as snow.

Typically, all moisture at higher altitudes forms snow, but if it descends through a warmer air mass, it turns to rain.

If that rain then falls through a colder air mass nearer the ground it forms sleet or freezing surface rain.

Snow accumulation only happens where the air is cold enough all the way to the surface to prevent melt.

At weather fronts, cold air will force warm, moist air to rise to altitudes where temperatures are cold enough to initiate snowflake formation.

As storms move across the land, the counterclockwise motion of the winds pulls cold air down from the north to interact with warm moist air being drawn northward.

Thus, the more northerly latitude of winter storm tracks will more likely produce the cold air required for snow accumulation. However, as storm tracks move northward, snowfall could be reduced further south.

Yet studies find that although storm track latitudes have varied over the past 300 years, there is no apparent trend as expected from global warming theories.

Mountains have a tremendous effect on snowfall. Moist air forced upslope to cooler altitudes is the reason the greatest snowfall in the United States is found in mountainous regions.

Although snow rarely falls over the west coast flatlands, just a hundred miles further east, heavy snow falls in the Sierra Nevada and Cascade mountain ranges.

The amount of snow is governed by El Nino cycles. El Nino brings warm moist air to the southern USA. Accordingly, studies such as Lute 2014 have detailed how El Nino years bring high snowfalls to the Sierra Nevada, but reduced snowfall to the Pacific northwest.

A swing to La Nina-like conditions brings dryness to California and the southern United States. The reduced Sierra Nevada snowfall of 2022, fearmongered by Bloomberg’s “End of Snow” clickbait atrocity, was the result of reduced moisture transport associated with current La Nina conditions and a negative Pacific Decadal Oscillation.

La Nina, however, directs more moisture northward, causing more snow in the Pacific northwest and the northern United States.

Other studies have shown oscillating years of much more vs much less snowfall in the Sierra Nevada for the last seven decades.

In the northern Sierra Nevada, there was an insignificant decreasing trend.

But an insignificant increasing trend in the southern Sierra Nevada.

And at low elevations in the southern Sierra Nevada, where global warming hypotheses expect the greatest loss of snow, there has been an insignificant increase in snowfall.

The heavy snowfall in the Rocky Mountains is also partially determined by El Nino cycles. However, the westerly winds that carry moisture from the Pacific during the winter lose much of that moisture before reaching Colorado, resulting in the dry powdery snow that is so favored by skiers.

But that changes in the spring!

A low-pressure system settles in to the west during the spring, causing easterly winds to carry moisture from the Gulf of Mexico. These dynamics deliver wetter snow, and as much if not more snow than falls during the winter.

Despite the lack of mountains, the midwestern USA experiences heavy snowfall from lake-effect snow. Cold, dry Arctic air absorbs copious moisture as it passes over the relatively warmer Great Lakes and then dumps it inland.

Studies associated with NOAA have mapped out the contributions from lake-effect snow. They reported that while non-lake-effect snowfall has decreased in Illinois, Indiana and Ohio, lake-effect snow has increased.

They attributed the increase in lake-effect snow to declines in ice cover caused by global warming. But that is inaccurate, and not very truthful for a government-sponsored scientific study.

It has been well established that lake-effect snow is governed by many variables: besides ice cover, wind speed and wind direction have major impacts. When the winds blow along the long axis of a lake, more moisture is absorbed, and greater snowfall occurs.

Indeed, ice cover does have a major effect. However, only Lake Erie completely freezes over each year, while most of the deeper lakes maintain large areas of open water, illustrated here by dark purple colors.

Although ice cover declined from 1975 to 2000, as NOAA noted, ice cover then increased from 2000 onward, contrary to global warming predictions of declining ice cover.

Cold, dry winds blowing from Siberia absorb moisture as they cross the Sea of Japan. Upslope snowfall then deposits great amounts of snow on the mountain tops, leaving very little moisture to reach Japan’s east coast.

The strength and direction of those winds change as the high-pressure system over Siberia varies.

The strength and location of the Aleutian low pressure system, which is altered by El Nino cycles and the Pacific Decadal oscillation, also alters the pressure gradient which controls the strength and direction of the winds and thus the amount of sea-effect snow accumulation.

Due to such variability, Japan’s local snow accumulation has exhibited no trend in one location,

increasing snowfall trends in others,

and decreasing trends in still others.

The last region of high snowfall in the United States is the northeast. Moisture from the Atlantic is delivered via winter storms known as “nor’easters” and dumped in the higher elevations of the Green or White Mountains further inland.

Snowfall here is largely governed by moisture transport that varies with the Atlantic Multidecadal Oscillation and the North Atlantic Oscillation.

The related Arctic Oscillation determines where and how much cold Arctic air moves southward to interact with relatively warm moist air flowing from the Atlantic.

The many possible interactions amongst the natural oscillations, described in this video, have huge effects on moisture transport and thus snow accumulation.

So snowpacks will naturally ebb and flow accordingly.

Thus, the great complexities governing snowfall across the northern hemisphere indeed provide no easy answers regarding the effect of climate change.

So don’t believe the doomsayers. The science has yet to support their fearmongering.

And don’t hesitate to buy your children’s winter sports equipment. There will be plenty of snow most years for them to enjoy.

And I am so confident of the science of snowfall that I promise to reimburse everyone’s winter sports expenditures if “the end of snow” ever really happens in our lifetimes!

This video will be added to our Videos page.
