Categories
Science

A Novel Perspective on the Greenhouse Effect – Watts Up With That?

Guest essay by: Thomas E. Shula
Rancho Mirage, CA

The figure below, published by NASA, is one example of many that attempt to visualize the various factors in the “Energy Budget” of the Earth.  The yellow arrows on the left depict the incoming solar radiation. It is in part absorbed by the atmosphere, partially reflected into space by clouds and the atmosphere, partially reflected by the surface of the Earth, and a bit less than 50% is absorbed by the surface of the Earth and converted into heat. On the right, the red arrows depict the paths that transport the energy from the Earth’s surface to space, as postulated by the greenhouse effect.  This model of the “Energy Budget” is the basis of climate models attempting to predict the effects of hypothesized Anthropogenic Global Warming (AGW) from greenhouse gases. 

Diagram Courtesy of NASA

As the inset paragraph in the NASA diagram states, “On average, and over the long term, there is a balance at the top of the atmosphere.”

The values associated with each of the arrows in the diagram are the corresponding energy fluxes in Watts/m2.  These values are derived in different ways, some of which are relevant to this exposition and will be described below.  They are used in climate models and may vary as the models evolve, though typically not by much.  Some typical values from a NASA document can be found on page 16 HERE.[1]  Certain assumptions led to the development of the Greenhouse Gas Theory.  One of the conclusions explained at Earth Temperature without GHGs – Energy Education[2] is that without greenhouse gases the Earth would be approximately 33 C cooler, an average temperature well below freezing.  This is the result of treating both the Earth and its atmosphere as blackbodies following the Stefan-Boltzmann Law, as discussed in this VIDEO[3] from an online course about climate modeling.
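For readers who want to check it, the 33-degree figure follows from a simple Stefan-Boltzmann balance. Here is a minimal sketch, using commonly quoted values for the solar constant and Bond albedo (both are assumptions supplied here, not values from this article):

```python
# Standard "effective temperature" estimate behind the 33-degree claim.
S = 1361.0        # W/m^2, solar constant (assumed textbook value)
a = 0.30          # Earth's Bond albedo (assumed textbook value)
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

# Absorbed flux averaged over the sphere (factor 4 = sphere area / disk area)
absorbed = S * (1 - a) / 4

# Effective blackbody temperature from sigma * T^4 = absorbed
T_eff = (absorbed / sigma) ** 0.25
print(f"T_eff = {T_eff:.0f} K; observed 288 K is {288 - T_eff:.0f} K warmer")
```

With these inputs the balance gives roughly 255 K, about 33 K below the observed 288 K average.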

From the energy budget diagram, there are four red arrows corresponding to the (average) upward and downward energy fluxes.  They are as follows:

  • 398.2 Watts/m2 longwave radiation upwelling from the surface
  • 18.4 Watts/m2 upward from conduction/convection
  • 86.4 Watts/m2 upward from evapotranspiration
  • 340.3 Watts/m2 longwave radiation downwelling from the atmosphere as Back Radiation

According to the greenhouse effect, it is the downwelling Back Radiation that “traps” the heat in the atmosphere to keep the Earth warm.

For purposes of this exposition, we will consider only the first two components above, as we will be investigating the relationship between upwelling longwave radiation and conduction/convection at the Earth’s surface.  According to the model explained above, upwelling longwave radiation at 398.2 W/m2 accounts for approximately 95.5% of this combined heat transport, and conduction/convection for approximately 4.5%.
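The quoted split can be reproduced directly from the two diagram values:

```python
# Share of combined surface heat transport, per the two diagram fluxes.
lw_up = 398.2   # W/m^2, upwelling surface longwave (from the diagram)
cc_up = 18.4    # W/m^2, conduction/convection (from the diagram)
total = lw_up + cc_up

print(f"radiation: {100 * lw_up / total:.1f}%  "
      f"conduction/convection: {100 * cc_up / total:.1f}%")
```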

How might we measure this?  We know that there are three mechanisms for transport of heat energy: conduction, convection, and radiation.  One needs to design an experiment that can discern the proportion of heat loss due to radiation versus the heat loss due to conduction and convection.  It so happens that there is a common instrument that has been in use for over 100 years that does precisely this.

The Pirani Gauge

This image was provided with permission by MKS Instruments, Inc. (Andover, MA)

The modern Pirani Gauge is used to measure vacuum in the range from 760 Torr to 10⁻⁴ Torr, though some are designed to measure higher pressures up to 1000 Torr.  It was invented in 1906 by Marcello Pirani, a German physicist working for Siemens & Halske, and has been used in a myriad of applications for over 100 years.  The operating principle of the gauge is simple.  Inside the gauge body there is a filament that is heated and maintained at a constant temperature.  The energy going into the filament is controlled via the current flowing through it.  Energy can be dissipated from the filament in four ways:

  • Gas Conduction
  • Gas Convection
  • Radiation
  • End Losses (i.e., conduction of heat from the filament to its support structure.)

The Radiation and End Losses are constant and can be measured by creating an adequate vacuum inside the gauge so that losses from conduction and convection are negligible.  When gas is introduced to the enclosure, heat is removed from the filament via conduction and convection.  The input power required to maintain the temperature of the filament will depend on how much energy is being removed via conduction and convection by the gas.  In summary, the Pirani gauge tells us the relative contributions to heat transport by radiation versus conduction/convection as a function of gas pressure for an object (the filament in this case) held at a constant temperature.  Referring to the paragraph preceding the above image, this is exactly the measurement we are looking for.

The response curve for a typical gauge is shown in next illustration.  Both illustrations can be found in the Technical Note “Introduction to Vacuum Pressure Measurement” published by MKS Instruments, and the specific gauge illustrated in the figure is an MKS Instruments convection enhanced Pirani gauge.

This image was provided with permission by MKS Instruments, Inc. (Andover, MA)

The red line in the chart represents the (constant) total radiative and end losses of approximately 0.4 mW.  The blue line represents the power loss due to gas only, and the green curve that flattens out on the two ends represents the total loss, i.e., the total energy input required to maintain the temperature of the filament as a function of pressure.  At atmospheric pressure, 760 Torr, the power required to maintain the temperature of the filament is 100 mW.  Since the radiative and end losses are 0.4 mW, this means that the heat transport by gas is 99.6%, with only 0.4% due to radiative and end losses.  This should not be surprising, because all gas molecules can transport heat via conduction and convection, not just the tiny fraction that constitute the so-called “greenhouse gases.”

We can also consider the case of a vacuum pressure of 10 Torr, the equivalent of about 110,000 feet above sea level.  In this case, about 60 mW of power is required to maintain the filament temperature, so the gas is still accounting for about 99.3% of heat transport, with radiative and end losses only 0.7%.  As one goes higher in altitude a larger proportion of the heat transport is attributable to radiation, and that is how all the heat eventually returns to space in the extreme upper atmosphere.  The crossover point, where gas losses are equal to radiative and end losses, is at about 200 milliTorr (0.2 Torr), equivalent to an altitude beyond 250,000 feet.  The response of the Pirani gauge is independent of the enclosure it is in, or the lack thereof.  If we took a “naked” Pirani gauge to an altitude where the atmospheric pressure is 10 Torr, the response would be the same as if it were attached to a vacuum system at a pressure of 10 Torr.  Pirani gauges have been made in many different sizes and configurations, some with radiative losses on the order of 0.1% at standard atmospheric pressure.[4]
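The percentages quoted in the last two paragraphs follow directly from the figures in the MKS chart. A short sketch, assuming the 0.4 mW radiative-plus-end-loss figure and the quoted total input powers:

```python
# Relative heat-transport shares for the Pirani gauge figures quoted in the text.
RAD_END_MW = 0.4  # mW, constant radiative + end losses (from the text)

def gas_share(total_mw):
    """Fraction of filament heat loss carried away by the gas."""
    return (total_mw - RAD_END_MW) / total_mw

for torr, total_mw in [(760, 100.0), (10, 60.0)]:
    g = gas_share(total_mw)
    print(f"{torr:>4} Torr: gas {100 * g:.1f}%, radiative+end {100 * (1 - g):.1f}%")
```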

The filament in the Pirani gauge is analogous to the surface of the Earth.  The gas molecules collide with the surface and absorb energy raising their effective temperature (conduction).  A “bubble” of this warmer gas then rises relative to the cooler gas around it as the cooler gas drops to the surface and repeats the cycle continuously (convection).  This cools the surface and is perfectly illustrated by the response of the Pirani gauge. This is well understood by those who have worked with high temperature processes in vacuum systems, and no doubt by many others.   The author can only speculate regarding why this has not been given consideration earlier.

Conclusions

The Pirani gauge provides a method to measure the relative contributions of radiation vs. conduction/convection to heat transport in a gaseous environment as a function of pressure.  At pressures relevant to the lower atmosphere (troposphere + stratosphere), radiation accounts for less than 1% of the upward heat transport.  This does not refute the existence of such radiation in the lower atmosphere; it only demonstrates experimentally that its role in upward heat transport is insignificant.

It has been demonstrated via the Pirani gauge operating principle that upward heat transport via radiation plays an insignificant role in the transport of heat at atmospheric pressures from the surface to the upper stratosphere.  The greenhouse effect, if it exists, is based on upward heat transport via radiation in the lower atmosphere.  Therefore the greenhouse effect, if it exists, plays an insignificant role in heat transfer and, by extension, the energy balance of the atmosphere.

Contemporary climate models are based on energy balance models of the type depicted in the NASA diagram at the beginning of this paper.  It is clear from the NASA diagram, as well as similar diagrams from other sources, that the fundamental assumption of these models is that radiation is the primary driver of upward heat transport in the lower atmosphere.  Because radiation is an insignificant driver of upward heat transport in the lower atmosphere, these models are based on a false assumption and are therefore invalid.  Finally, because the models are generally intended to support the theory of Anthropogenic Global Warming via the greenhouse effect, there is no scientific evidence for the greenhouse effect or Anthropogenic Global Warming.

The radiation energy that the Earth absorbs from the Sun arrives at the speed of light.  The Earth loses heat at a speed driven by convection in a process we call “weather.”  Weather is the chaotic process by which the Earth’s atmosphere continuously tries to reach thermal equilibrium but never succeeds.  The convection takes place continuously, but the speed at which heat is transported by convection is MUCH slower than the speed of light.  This means that heat energy leaves the Earth more slowly than it arrives, and that is why the Earth is warmer than predicted by the Stefan-Boltzmann Law. 

Appendix

(April 11, 2023)

How did “Climate Science” get this so wrong?

The two fundamental assumptions leading to the Greenhouse Effect are that 1) the primary mechanism by which the surface of the Earth loses heat is radiation, and 2) based on the Stefan-Boltzmann Law, the temperature of the Earth’s surface should be 33K cooler than what we observe.

The Stefan-Boltzmann Law (SBL) defines a blackbody (which is an idealized object that does not exist in nature) as having the following characteristics:

  1. It exists in an environment at 0K, i.e., a perfect vacuum.
  2. It is in equilibrium with its environment.
  3. It is a perfect absorber of radiation.

With certain adjustments such as emissivity, the SBL provides a convenient means of estimating the temperature of an object from its emitted radiation even in non-ideal environments: estimating the temperature of stars, for example, or using infrared cameras to detect “hot spots.”  One must be careful, however, to keep in mind that it is only the “idealized” blackbody that behaves strictly according to the SBL.

The Earth and its atmosphere do not satisfy any of the conditions of SBL.   Additionally, it has become common to ignore condition number 1 above.  If one looks up the definition of a blackbody, a reference to the 0K (perfect vacuum) condition is often not mentioned.  This typically has little effect when it comes to temperature measurements using optical techniques, but it is extremely important in understanding the dynamics of heat transfer in, for example, terrestrial conditions.

Climate models neglect this.  They assume that with a surface temperature of 288K the power radiated upward from the surface is 398 Watts/m2, and that it is all longwave IR radiation.  It then becomes necessary to “balance” that upwelling radiation with “back radiation” to obtain “radiative balance” in the atmosphere.

What is happening is quite different.  At a temperature of 288K, the photon flux (generously assuming it is all at 15 micron wavelength to maximize the number of IR-active photons) is approximately 3×10²² photons/sec-m2.   That is a lot of photons, and if the surface were in a perfect vacuum that radiative flux would be the only way for the surface to release energy.
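That flux estimate can be reproduced by dividing the quoted 398 W/m2 by the energy of a single 15-micron photon:

```python
# Rough photon-flux estimate behind the ~3e22 figure: take the quoted
# 398 W/m^2 and (generously) assume every photon is emitted at 15 microns.
h = 6.626e-34     # Planck constant, J*s
c = 2.998e8       # speed of light, m/s
wavelength = 15e-6  # m, single assumed wavelength
flux_w = 398.0      # W/m^2, upwelling longwave from the diagram

e_photon = h * c / wavelength  # energy per 15-micron photon, ~1.3e-20 J
photons = flux_w / e_photon    # photons per second per square meter
print(f"{photons:.1e} photons/sec-m2")
```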

But we have an atmosphere.  At standard temperature and pressure, air has some very interesting properties.   It is much denser than we typically imagine.

  • Average molecular velocity: approximately 470 m/sec (1050 mph, supersonic at the macro level)
  • Molecular collision frequency (each molecule with the others): approximately 7,000,000,000 collisions/sec (7 GHz)
  • Mean free path: approximately 70 nm (about 1/10 the wavelength of visible light)
  • Frequency of collisions with an ideal planar surface: approximately 3×10²⁷ collisions/sec-m2

To put this in perspective, the last number is quite useful.   The average surface area of an adult human is around a square meter.  That means that each second about 100 lbs. of air molecules collide with each of us with an average speed of about 1050 mph.  More importantly, given the photon flux at 288K this means that approximately 100,000 air molecules collide with the surface for each potential infrared photon emitted.  Because the energy transfer from collisions will change the equilibrium at the surface by removing energy through conduction, it is likely that the actual emitted photon flux will be even less.  To believe that radiative transfer is the primary mechanism for upward heat transfer at the Earth’s surface would mean that one IR photon would transfer more energy than 100,000 molecular collisions.  These numbers are for a perfectly smooth planar surface.  The actual surface area at an atomic level can be much greater. 
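The kinetic-theory figures above can be reproduced from standard textbook relations. A sketch, assuming typical values for the average molecular mass and effective diameter of air (values supplied here, not given in the text):

```python
import math

# Kinetic-theory estimates for air near the surface.
k = 1.381e-23   # Boltzmann constant, J/K
T = 288.0       # K, surface temperature used in the text
P = 101325.0    # Pa, standard pressure (assumed)
m = 4.81e-26    # kg, average air molecule mass (~29 amu, assumed)
d = 3.7e-10     # m, effective molecular diameter (assumed)

n = P / (k * T)                                    # number density, 1/m^3
v_mean = math.sqrt(8 * k * T / (math.pi * m))      # mean speed (~460 m/s at 288 K;
                                                   # ~470 m/s near 300 K, as quoted)
mfp = k * T / (math.sqrt(2) * math.pi * d**2 * P)  # mean free path, m
coll_freq = v_mean / mfp                           # collisions/sec per molecule
wall_flux = n * v_mean / 4                         # wall collisions/sec per m^2

photon_flux = 3e22  # photons/sec-m2, the text's 288 K estimate
print(f"mean speed      ~{v_mean:.0f} m/s")
print(f"mean free path  ~{mfp * 1e9:.0f} nm")
print(f"collision rate  ~{coll_freq:.1e} /sec")
print(f"wall flux       ~{wall_flux:.1e} /sec-m2")
print(f"collisions per potential photon ~{wall_flux / photon_flux:.0e}")
```

The outputs land close to the figures quoted above, including roughly 10⁵ molecular collisions per potential emitted photon.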

Clearly, the interface between the surface of the Earth and the atmosphere is an extremely chaotic place at the atomic level.  This gives perspective to explain what we see in the operation of the Pirani gauge as explained in the body of this paper.

References:

[1] Kelly, Schmidt, et al, GISS-E2.1: Configurations and Climatology

[2] Earth Temperature without GHGs – Energy Education

[3] Climate Dynamics Lecture 02: Energy and the Earth System – YouTube

[4] Jung, “Fabrication of thermal-based vacuum gauge,” Micro & Nano Letters, 2014


Categories
Sport

Jalen Hurts underwent surgery to remove “hardware” from his ankle, sources say

PHILADELPHIA — Philadelphia Eagles quarterback Jalen Hurts, who recently became the highest-paid player in NFL history, underwent right ankle surgery earlier this offseason, league sources told ESPN.

The procedure consisted of removing “hardware” that had been inserted into his ankle after Hurts suffered a severe ankle sprain while playing for the University of Alabama in an October 2018 game against Tennessee, sources said. Hurts was sidelined for almost a month before playing again in November of that season.

The surgery to remove the hardware took place in February and was considered minor, according to sources, who added that Hurts returned to his off-season training routine not long after the procedure.

Sources say Hurts will be a full participant in the Eagles’ offseason training program, which begins next week.

Hurts, 24, also underwent surgery on his left ankle in February 2022. In November 2021 against the New York Giants, he suffered a serious ankle sprain, missed a game, and returned down the stretch to lead Philadelphia to a playoff spot in his first season as a full-time starter.

Hurts put together an MVP-caliber campaign last season, compiling a 14-1 record as a starter while throwing for 22 touchdowns and rushing for 13 more. He saved one of his best performances for the biggest stage, piling up 374 yards with four touchdowns and a costly turnover in a 38-35 Super Bowl loss to the Kansas City Chiefs.

According to ESPN’s Adam Schefter, Hurts and Philadelphia on Monday agreed to a five-year, $255 million extension that includes over $179 million in guarantees. It is the largest contract by average annual value in NFL history at $51 million per year and includes a no-trade clause — the first in Eagles history.

Hurts has missed a total of three games as a professional. He sat out two games late in the 2022 season with a right shoulder sprain, but returned for the regular-season finale against the Giants and played through discomfort in the postseason, leading the Eagles to the Super Bowl.

Categories
Health

Covid vaccine sales fueled J&J’s earnings beat, but no sales lie ahead

Johnson & Johnson’s Janssen COVID-19 vaccine.

Allen J. Schaben | Los Angeles Times | Getty Images

International sales of its Covid vaccine helped Johnson & Johnson beat revenue and profit expectations on Tuesday, but the company said it doesn’t expect further sales from the shot going forward.

“Regarding our Covid-19 vaccine, we do not expect any significant sales beyond those recorded in the first quarter as our contractual obligations are fulfilled,” Chief Financial Officer Joseph Wolk said during a conference call on Tuesday.

Those obligations include external production exit costs and clinical trial expenses, the consumer goods giant said in its first-quarter earnings release.

This marks the end of three difficult years for J&J’s Covid vaccine, despite it being one of the first vaccines to hit the US market during the pandemic. The vaccine, originally billed as a single-dose shot, has long been overshadowed by the slightly more potent shots from Pfizer and Moderna, owing in part to a rare but serious risk of a blood clotting disorder.

J&J’s unpopular shot appeared to bear its final fruit Tuesday, contributing $747 million in sales for the three months ended March 23. That helped drive strong growth in the company’s pharmaceuticals business, which reported a revenue increase of more than 4% compared with the same period last year.

Notably, all of its Covid vaccine revenue for the quarter came from outside the United States. It is unclear which countries contributed to the sales.

That number beat Wall Street analysts’ estimates.

Bank of America analyst Geoff Meacham had expected the shot to generate $150 million in revenue during the quarter. Wells Fargo analysts had “expected no Covid sales in the first quarter” but noted some contracted revenue could “come through.”

After J&J reported the earnings, SVB Securities analyst David Risinger also noted that sales of the vaccine beat a consensus estimate by more than $500 million. JP Morgan analyst Chris Schott added that J&J’s first-quarter hit was partly “bumped up by the Covid vaccine.”

First-quarter sales of the Covid vaccine are also up from the $544 million it raked in during the previous quarter and the $457 million the company reported a year ago.

The last time J&J reported U.S. sales of the vaccine was in the second quarter of 2022, which ended weeks after a decision by the Food and Drug Administration that severely restricted who can get the vaccine. The agency said the vaccine could only be given to adults who specifically request it or who cannot get another vaccination, citing the risk of blood clots.

Earlier this year, the drugmaker also announced that it had scaled back production of the shot amid slumping demand.

While J&J’s vaccine fell out of favor in the US and other wealthy countries, developing countries continue to rely on it. As a one-time injection, the vaccine is less expensive and easier to distribute to hard-to-reach populations.

Categories
Entertainment

Shay Mitchell reacts to her brand BÉIS’s association with Scandoval

No lie here: Shay Mitchell loves a marketing moment.

BÉIS, the Pretty Little Liars star’s luggage company, had a viral moment on social media recently when Vanderpump Rules star Rachel Leviss was photographed with one of the brand’s weekender bags as she exited castmate Tom Sandoval’s home amid news of their affair. On March 30, the brand shared an unsponsored image of Rachel loading the large tote bag into the trunk of her car, alongside the caption, “We provide the bag, not the baggage.”

Now, in an exclusive interview with E! News’ The Rundown, Shay weighed in on the lively post.

“I have a great team at BÉIS,” she told host Erin Lim Rhodes while celebrating Coachella at the Revolve Festival, jokingly remarking that people “bring the baggage and we’ll provide the bags.”

The actress added that BÉIS has a lot of VPR fans on the team who “tell me everything,” although she hasn’t personally kept up with Scandoval.

Categories
Technology

Deal drops the price of this 3D printer from $500 to $200

If you’re interested in giving 3D printing a try, here’s an offer to help jumpstart your new endeavor – the Monoprice Maker Ultimate 2 for a very affordable $200. Monoprice has dropped its original $500 price tag by $300, which is a discount you’ll rarely see on a 3D printer. You’ll need to act fast if you want to get the machine at this great price though, as we’re pretty sure it’s drawing a lot of attention, so supplies could run out at any moment.

Why you should buy the Monoprice Maker Ultimate 2 3D printer

Monoprice, a popular 3D printer brand that has launched well-reviewed products like the Monoprice Maker Select Mini and Monoprice Maker Select Plus, will not disappoint 3D printing beginners and veterans alike with the Monoprice Maker Ultimate 2. It’s a fully enclosed 3D printer, which helps maintain internal temperatures and ensures environmental conditions don’t affect your project, and internal lighting makes it easy to monitor progress without the need for an external light source.

The Monoprice Maker Ultimate 2 features a removable glass build plate that offers the flattest possible surface, and an underlying aluminum plate with a built-in inductive sensor that automatically levels the print bed. The glass build plate can be heated to 100 degrees Celsius, preventing the first few layers from cooling and warping during printing, and a filament detector automatically pauses the project when the filament runs out so you can load more. Its maximum build volume of 200 x 150 x 150 millimeters, larger than that of some of the best 3D printers, opens up many projects for you – you are only limited by your own creativity.

For those looking to buy a 3D printer on the cheap, it will be difficult to find a better deal than Monoprice’s $300 rebate on the Monoprice Maker Ultimate 2. The machine that originally cost $500 can be yours for only $200. This deal won’t last long – in fact, it could be gone as soon as tomorrow – so if you don’t want to miss out on getting the Monoprice Maker Ultimate 2 3D printer for less than half its original price, I highly recommend that you buy it now.

Categories
Technology

The EU backs over 100 deep tech startups founded by women

The European Commission has announced the results of the second round of Women TechEU, a program designed to help deep tech startups founded by women to scale.

The round, which has a budget of €10 million, saw applications from 467 women-founded deep tech startups from across Europe, of which 134 were selected to participate. It builds on a successful 2021 pilot with 50 startups.

The startups selected for the second round will each receive individual funding of €75,000. The founders are also offered mentoring and coaching within the framework of the European Innovation Council (EIC) Women Leadership Programme, and get access to networking opportunities in the EU.

The startups are active in 16 different deep tech sectors and have developed solutions ranging from new medicines and carbon capture technologies to digital learning and autonomous robotics.

Participants include Sweden-based Norbit, which uses insects to recycle plastic waste; Netherlands-based Agurotech, which digitizes agriculture using AI; and Lithuania-based Inobiostar, which has developed a waste paper-based material for removing oil spills.

“By combining innovative ideas, female entrepreneurship and excellent research and development, this year’s companies selected for WomenTechEU will contribute to improving the quality of life of citizens in the EU and beyond,” said the European Innovation Council.

Deep tech accounts for over a quarter of the European startup ecosystem, with European deep tech companies valued at an estimated total of €700 billion in 2021.

Yet women remain chronically underrepresented: last year, only 3% of VC funding in the European deep tech space went to startups founded by women.

These injustices are pervasive across the industry, but are amplified in the deep tech sector. Deep tech startups tend to have longer R&D cycles and require higher capital expenditures than traditional startups, making fundraising even harder for women-led and women-founded teams.

Not only is this bad practice, it’s also bad for business. According to consulting firm McKinsey, the European tech ecosystem will only remain competitive if it manages to attract and retain more female talent.

And investors seem to agree: “Diversity of thought, opinion and creativity is essential for our deep tech ecosystem to thrive,” said Christina Franzeskides, a deep tech investor at Lakestar, in the 2023 European Deep Tech Report.

“To that end, we must strive for inclusion across all backgrounds and genders, so that the space can reach its full potential,” she added.

The European Commission believes that the right early support of, and investment in, female-founded startups can help bridge the deep tech gender gap and strengthen the ecosystem as a whole.

Categories
Science

Undeterred and Unmoved by Failed Concepts – Watts Up With That?

From Climate Etc.

by Planning Engineer  (Russ Schussler)

“Green” ideas and their proponents can create problems.  Like the antagonist in Terminator 2, green arguments and proponents don’t go down easily.  When seriously challenged, they retreat, sometimes hibernate, morph, transform and come back.  It’s hard to argue with many “green” energy ideas.  They are often huge in scope but severely limited in detail, focusing on a couple of key factors and ignoring, or leaving to be worked out later, much of the rest.  They are painfully naïve about, or unaware of, many of the factors associated with the provision of energy, feedback, and often even human behavior.  They see the flaws in current efforts, but are blind to the drawbacks which will necessarily emerge from their own proposals.  They offer conjectures with a lot of dots still to be connected.  They speak of things that may be possible, without any handle on the probabilities.

Usually, “green” ideas are packaged with threats of doom, promises of superior technology, or both.  The media are drawn to both those themes, and many policy makers are attracted as well.  Attention is a great thing for new ideas.  The themes of urgency and the scope of change give these ideas more weight and seeming gravitas.  Unfortunately, the needed incentives to dig down and look critically at these ideas are generally lacking.  Woefully, those promulgating “green” ideas don’t have much incentive for engaging with their critics or broadening their understandings.  They generate the feeling that we need to move forward with the big, new, important thing – no time for distractions.

Death of the Grid

Consider the following example.  Predictions of the death of the grid have held some prominence during the last decade.  It started around 2012 with forecasts of “death spirals” for utilities.  The theory was that as customers found self-generation options preferable, more and more would leave the grid, thus raising costs for those who remained.  This grid defection or load defection would lead to rising costs, which would lead to further load/grid defection.  Searching “grid defection” and/or “load defection” brings up a host of warnings proclaiming a coming green energy transition which would be accompanied by the demise of the grid.

Financial analysts joined in and issued warnings as well:

  • Morgan Stanley, Clean Tech, Utilities & Autos [March 2014] “Our analysis suggests utility customers may be positioned to eliminate their use of the power grid.”
  • Barclays, Utilities Credit Strategy Analyst Report [May 2014] “We see near-term risks to credit from regulators and utilities falling behind the solar + storage adoption curve and long-term risks from a comprehensive re-imagining of the role utilities play in providing electric power.”
  • Goldman Sachs, Analyst note on Tesla stock [March 2014] “…decreased reliability from an aging distribution infrastructure, a broadening desire to reduce the carbon footprint, and perhaps most importantly, the reduction of solar panel and battery costs could also work together to make grid independence a reality for many customers one day”

Creating Challenges for Transmission Project Approval

This “idea” or “forecast” of potential grid obsolescence caused challenges in the real world of electric utility planning.  At the time, I was seeking the approval of annual grid construction budgets running into the hundreds of millions of dollars per year.  My Board asked: why are we putting so much into a grid that Morgan Stanley and others say might go away?  I shared my perspectives with the board, arguing the need for continued grid expansion.  Some of those perspectives can be read in these two articles I co-authored some years later, titled Reports of the Electric Grid’s Death Have Been Greatly Exaggerated and The Grid End Game.

At the time, our Board (and many others) was in a tough position.  Who are you going to believe?  Academics, government experts, renewable specialists and recognized financial experts, or your own local guy?  From my perspective, I had a strong understanding of electric supply, consumer needs, and issues around availability and deliverability, and I worked hard to understand the arguments of the other “experts”.  The renewables people seemed to have so much faith in themselves that they didn’t feel the need to be bothered with the details of providing electrical service, or with understanding why their predictions might be wrong.  Financial experts were relying on renewable experts without paying attention to many of the broader issues involved in power delivery.  While it seems clear to me that considerable respect should go to those in the field versus the potential disruptors, that has historically been a hard argument to make.  Despite their poor record of forecasting in the past, those who’ve made bad predictions continue to gain considerable attention and respect.

What did we do to help our board?  At strategic planning meetings we took the other side.  We assumed the need for the grid would wither away.  We looked at what might happen to our billions in investment.  We argued that our resources would still have value.  For example, some of our transmission ties would be valuable for energy exchanges between distributed networks.  Many of our transmission substations could house batteries and serve to support smaller networks.  Other rights of way we owned might have value as communication pathways, pipelines, roadways or the like.  That provided enough comfort to go forward with continued transmission investment in the interim.

Overwhelmingly, it’s a good thing that many entities continued to build transmission despite the dire warnings of grid obsolescence.  Less optimal results likely ensued where project support was stymied by the cautions of “experts”.  The “green” consensus now seems to be that enhanced, robust grids are essential to increasing the penetration of renewables.  The existing grid elements, including projects completed back then despite the warnings, are foundational to any serious efforts at expanding renewable resources.

Experts at Conferences

Back then, there were various conferences, symposiums and working groups centered around the demise of the grid. I went to several to make sure I was aware of their best arguments and well informed on recent and potential developments. At one sponsored by the Department of Energy in 2013, Ernest Moniz, then US Secretary of Energy, welcomed us.  Unfortunately, such gatherings usually failed to provide significant platforms for dissenting views and were a little heavy-handed in touting grid fears. My experience with one large “working group” illustrates how these meetings would generally go.  Here, to the best of my memory, is what happened at a working group held at Duke University, which had around 100 participants, government sponsorship, and high-priced consultants running it. I asked questions suggesting the grid had a lot of value and that distributed “green” resources would struggle mightily in its absence.  Those on the agenda were supremely confident; they had it all figured out.  Those questioning the “wisdom” were seen as oddballs, but some people would come up and whisper to me during breaks that they were wondering the same things.

One task introduced for the large working group in attendance was figuring out what we might do to make the grid more relevant as demand for the grid decreased. I sensed a disconnect: if the group felt the grid did not have value, why work to preserve it? I passionately explained, “I work for a transmission-only entity.  I believe the grid has great value and will continue to provide great value. But if you are right, perhaps the grid should be allowed to fade away.”  I explained that “my goal is to meet the needs of our distribution customers and end-use consumers.  If they have better options than retaining the grid, I would encourage them to use those options.” I asked, “If you think the grid is not needed, why do you care about its continuance?  What’s the purpose of this working group? Why isn’t our goal to help the transition?” The room got silent, and eventually the facilitator noted that was an interesting perspective worthy of consideration.

What the group decided to do (likely pre-ordained by the facilitators) was model a number of different future generation scenarios showing where new generation would come from, to see what they implied about timing and the need for the grid.  There were several scenarios proposed, some dominated by large distant wind, others more supported by dispersed solar, and so on.  All potential scenarios were heavily or exclusively renewables-based.   I asked whether we shouldn’t have one scenario where new natural gas plants played some role (much like what has actually played out in the last decade).   The leaders quickly came back and said, “NO, fracking might be banned! So gas scenarios may be worthless.”   I replied that I certainly understood that as a possibility, but that every other scenario suggested faced similar challenges and roadblocks. Wouldn’t a scenario showing some addition of natural gas plants be worthwhile for comparison purposes?  When we broke into smaller working groups with differing tasks, I wasn’t assigned to the one refining and selecting the scenarios. Not surprisingly, additional natural gas resources were not included in any of the scenarios.  (I suppose I don’t need to tell readers that additional nuclear wasn’t represented as a possibility in any of the scenarios either.)

Real work responsibilities prevented me from attending the follow-up sessions. While I looked forward to reading the reports that came out of the group, no reports or formal outputs ever materialized. By the time they were finishing up, I suspect the handwriting was on the wall and it had become clear enough that the findings they originally anticipated would not be defensible.  Unfortunately, it’s often the case that when these types of groups don’t find the results they want, they don’t admit mistakes or publish lessons learned from their endeavors. They just move on to something else.

Deja Vu: The Ideas Changed but the Same Experts Remained 

I recognized many of the individuals and groups who were pushing the end of the grid from various conferences, symposiums and working groups I had attended years earlier on the topic of Integrated Resource Planning (IRP).  It was like seeing the same actors in a slightly different play: reading new scripts, but still ushering in “green” change and creating problems for those actually trying to support the grid.

One of the entities involved in both was the Rocky Mountain Institute.   They, like many of the other “experts” pushing the demise of the grid, had earlier been busy pushing Integrated Resource Planning. The Rocky Mountain Institute touted the great value of negawatts (a unit of electricity saved through conservation).  They characterized the traditional utility approach to planning as blindly looking at load growth and building resources as needed.  They proposed that considerable benefits would accrue from treating load, generation, efficiency and distributed resources on an equal footing in all stages of planning.  They argued that utilities could see significant savings by paying customers to improve efficiency, thus lowering the need for costly infrastructure improvements, and that buying negawatts should be a prime option for addressing system needs and avoiding infrastructure.

They encouraged the expectation that forecasts of expensive transmission upgrades should preferably be addressed by targeted, localized efficiency programs. But it’s hard to estimate potential efficiency gains on a system-wide basis, let alone in targeted load areas.  Deploying programs with such precision is a huge problem because of all the uncertainty in load growth, efficiency program impacts and other interrelated factors.  Due to the complexity and unknowns, it was likely impossible for any utility to defer individual projects using the recommended IRP approaches.
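The difficulty can be illustrated with a toy Monte Carlo sketch (all numbers here are hypothetical, not from any actual utility study): even a sizable targeted efficiency program only defers a line upgrade meaningfully when load growth happens to come in at the low end of its uncertainty range.

```python
import random

def upgrade_year(peak_mw, limit_mw, growth, efficiency_mw=0.0, horizon=20):
    """First year the peak load (net of efficiency savings) exceeds the
    line limit, or None if it never does within the planning horizon."""
    load = peak_mw - efficiency_mw
    for year in range(1, horizon + 1):
        load *= 1 + growth
        if load > limit_mw:
            return year
    return None

def deferral_odds(trials=10_000, seed=42):
    """How often does a 5 MW targeted efficiency program defer a 100 MW
    line upgrade by 3+ years, when annual load growth is only known to
    lie somewhere between 1% and 4%?"""
    rng = random.Random(seed)
    deferred = 0
    for _ in range(trials):
        growth = rng.uniform(0.01, 0.04)  # the key unknown
        base = upgrade_year(95.0, 100.0, growth)
        with_ee = upgrade_year(95.0, 100.0, growth, efficiency_mw=5.0)
        if base is not None and (with_ee is None or with_ee - base >= 3):
            deferred += 1
    return deferred / trials
```

With these made-up numbers, the program buys a three-year deferral well under half the time: the same uncertainty that makes efficiency savings hard to estimate makes the deferral impossible to bank on.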

Back in the mid-90s, regulators would ask if you had looked at delaying a transmission uprate by implementing a program to incentivize replacement of older refrigerators with more efficient ones.  They were not impressed when you told them this did not seem like a workable solution.  All these experts were telling everyone utilities should do this, but looking across the nation (and globally), no one had achieved any kind of success suggesting it was remotely possible.  I was very pleased when I heard the Electric Power Research Institute (EPRI) was undertaking a huge program to demonstrate the state of the art in how such things could be done.

EPRI selected a community in Oregon and was going to follow the best advice of “experts” to demonstrate the proposed concepts. I naively felt that either the project would give us guidance as to how this might realistically be accomplished, or, more likely, it would force them to publicize the limitations of such approaches.  I expected they would encounter numerous unwieldy real-world challenges.  The program was launched with great fanfare and a considerably large budget.  I followed the early efforts as implementation began. The early documentation was frequent and very impressive, explaining the great things being undertaken.   Then, just as results should have been emerging, there was silence. I heard the program was having some trouble, but nothing was being published.  I searched and searched over time.  Finally, years later, I found a comprehensive listing of cancelled EPRI projects.  The targeted efficiency program got only about two lines in that listing: it stated the project name and said only that the project was cancelled because the target city had become the wind surfing capital of the east coast and the resultant load growth in the area had made the project infeasible.

That’s the way the world works most of the time.  Something big comes along that you didn’t anticipate, or many small things, or a combination of factors.  Having overly complicated plans dependent on getting multiple variables right is not a good recipe for success. I wish EPRI had provided some follow-up.  With all the investment and effort put into place before they realized their hoped-for plans were dashed, they could have documented the challenges and successes (if any) they encountered before the project “blew up”.  But unfortunately, it is not common for promoters to write of the demise of their cherished ideas. They just withdraw and let their dreams hibernate, to maybe come back another day. The obvious lessons aren’t learned. The experts who pushed these ideas found a new horse to hitch their wagons to, and for many of the IRP/negawatt experts it was the idea of grid defection.

It’s a Game

What was gained by forecasting the death of the grid?  What was gained by making utilities prioritize negawatts?   Claiming disaster, or a superior new approach, grabs attention, and extreme criticism of existing approaches can get attention as well.   This attention can help entities promote other related objectives. Predicting the end of the grid is pretty bold, and it attracted a lot of press.  It helped focus attention on “green” projects and industries and no doubt helped their funding.  If the claims are bold and the consequences large, it seems that the strength of the supporting evidence is irrelevant.

Historically we’ve had an excellent power system, but there will always be emerging needs and challenges.  Arguing for continued incremental improvements makes sense.  Saying the grid is worthwhile and will be needed for a long while, though, is not as exciting as forecasting the grid’s end.  Looking at the world more realistically is suitable for boring articles in the trade publications. Talk of enhancing existing technology while carefully nurturing new technology is not nearly as exciting as most “green” proposals.  It perhaps should not be surprising that such plans do not garner as much attention or support.  But that is unfortunate, because projects conceived with such understanding have proven, and will likely continue to prove, to be the best options for the future.

When green ideas seem credible to unquestioning minds, they have shown that they can attract crowds, attention and money.  With political support, their proponents can avoid engagement with critics.  When the real world intrudes and some ideas seem less credible, the appropriate lessons aren’t learned; rather, the same flawed ideas merely hibernate.  Those pushing the discarded ideas then find new ideas to push. Sometimes “green” advocates switch gears to advocate renewable energy ideas that directly contradict what they were advancing before. That sort of thing goes on without observation or notice.

Where are We Now?

Most “green” entities now see the grid as central to achieving CO2 goals.  The Rocky Mountain Institute is currently much less bullish on grid defection than it was before, now observing that “the grid has been growing in importance for decades as a driver of economic growth, and recently as a key enabler for meeting economy-wide decarbonization targets through electrification with renewable energy.”  However, they note that “historical approaches to ensuring grid security in the United States are proving to be poorly suited to the emerging, catastrophic threats facing the grid.” Now they warn that “A grid outage can mean not being able to access critical health services, water supply, communications, and more, negatively affecting people’s well-being and our country’s economic growth.”

By now almost all “green” advocates have figured out that the grid is central to allowing the increased penetration of renewable resources. Rather than proclaiming the death of the grid, they now see the grid as needing their help. They don’t praise the grid for what it has done, but rather are critical of its supposed shortcomings. They speak of modern grids as “third world grids”. They insist that new ideas are needed, and they encourage the expansion of the grid with the development of enhanced capabilities.  Suddenly they are the defenders of the grid and the experts who know what must be done with it to protect us from looming crises.

The truth is that integrating increasing amounts of solar and wind is complicated, expensive, and poses reliability risks.  Renewable advocates want to blame the grid for the problems inherent in asynchronous, intermittent wind and solar generation.  Their ideas for the future grid are more about transferring and hiding costs than about providing technical solutions to the problems posed by integrating wind and solar.
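A simple residual-load calculation, using purely hypothetical hourly numbers, illustrates why intermittency (not the wires) is the hard part: wind can supply a large share of energy while barely reducing the dispatchable capacity the system must keep available.

```python
def residual_stats(load_mw, wind_mw):
    """Residual load = load minus wind output, floored at zero
    (any surplus wind is assumed curtailed)."""
    residual = [max(l - w, 0.0) for l, w in zip(load_mw, wind_mw)]
    return max(residual), sum(residual) / len(residual)

# Hypothetical 8-hour snapshot: ~100 MW load, gusty wind averaging 30 MW.
load_mw = [100, 105, 110, 108, 102, 98, 95, 100]
wind_mw = [60, 5, 0, 20, 80, 45, 10, 20]

peak_residual, avg_residual = residual_stats(load_mw, wind_mw)
# Wind supplies roughly 30% of the energy, yet the peak residual load
# (110 MW, hit in the hour when wind output is zero) equals the original
# peak: the dispatchable capacity requirement is unchanged.
```

In this sketch the average residual load drops substantially, but the peak does not; it is that peak, not the average, that drives how much backup capacity must be built and paid for.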

The grid has seen substantial changes over the years.  It has become stronger and more robust, and it continues to use new technology to enhance its functioning.  The grid is “smart” now, it was “smart” in the past, and it will continue to be “smart” in the future.    Nevertheless, integrating large amounts of wind and solar will create significant problems for the power system.  Changes to the grid can help integrate more wind and solar, but only at increasingly greater cost and with increasing reliability concerns. It’s not an exciting message, but it’s one that should be heard. We shouldn’t let talk of emerging technological breakthroughs or apocalyptic threats distract us from serious considerations.  The grid should grow and evolve as it always has, by balancing economics, reliability and public responsibility.   That will likely happen slowly and bit by bit, not by a top-down, politically mandated grand redesign.

Postscript: Just after completing this post, it was reported that California is considering moving to fixed-rate billing (based on income), which would completely uncouple electric consumers from usage concerns.   I remember that once upon a time, smart meters giving real-time data to customers, paired with real-time pricing, were the key to efficiency and better use of distributed resources.  In fact, RMI wrote in 2015 that:

“The grid of the future will be centered on the customer, enabling customers to understand and manage their energy use more efficiently. Personalized, transparent, and actionable data availability to customers and to the marketplace is a key factor enabling that transition… (P)ersonalized feedback has been described as the “holy grail” of energy efficiency, and yields the greatest percentage of customer responses and energy savings.”

I’m afraid that emerging problems triggered by California’s “green” efforts are behind this terribly ill-conceived proposal.  I am waiting to see how RMI and other “green” advocates will react to California’s fixed-cost proposals.  My guess is that they may like fixed-cost billing, because consumers can be completely separated from the consequences of their personal energy use, allowing “green” energy initiatives to be pursued with less transparency and interference.

Moderna shares fall despite promising cancer vaccine data

SOPA Images | LightRocket | Getty Images

Shares of Moderna fell on Monday as Wall Street mulled over new trial results on the personalized cancer vaccine the company is developing.

Merck shares were essentially flat.

The experimental mRNA vaccine, when combined with Merck’s blockbuster drug Keytruda, reduced the risk of recurrence of melanoma, a form of skin cancer, by 44% compared with Keytruda alone, the companies said on Sunday in their first detailed presentation of results from a key Phase 2 study.

Almost 80% of participants who received both the vaccine and Keytruda remained cancer-free at 18 months, compared with 62% of participants who received only Keytruda, the companies said. They added that the vaccine’s side effects were generally mild, with fatigue being the most common.
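As a rough arithmetic cross-check (the reported 44% comes from a hazard-ratio analysis, which a simple rate ratio only approximates), the 18-month recurrence-free figures imply a similar relative risk reduction:

```python
def relative_risk_reduction(rate_treated, rate_control):
    """Relative risk reduction = 1 - (treated event rate / control event rate)."""
    return 1 - rate_treated / rate_control

# Recurrence rates implied by the reported recurrence-free figures:
# ~80% recurrence-free with vaccine + Keytruda, 62% with Keytruda alone.
rrr = relative_risk_reduction(1 - 0.80, 1 - 0.62)
# rrr of roughly 0.47, in the same ballpark as the reported 44% reduction
```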

These results, presented at a meeting of the American Association for Cancer Research in Florida, add to initial results on the treatment combination published in December.

The results suggest that the vaccine, when combined with Keytruda, “could be a novel means of potentially prolonging the lives of patients with high-risk melanoma,” said Dr. Kyle Holen, director of development, therapeutics and oncology at Moderna, in a press release. Moderna and Merck said they will initiate a Phase 3 trial in 2023 and will “rapidly expand” their research to look at the treatment’s effect on additional tumor types, including an important type of lung cancer.

Wall Street greeted the news with a mixture of cautious optimism and doubt.

Analysts from SVB Securities said the results suggest the personalized cancer vaccine shows promise. But they also wrote in a Sunday note that the treatment’s path to approval is new and untested, adding that the company doesn’t see accelerated approval as an option.

The Food and Drug Administration’s Accelerated Approval designation is intended to allow faster approval of medicines for serious conditions that address an unmet medical need.

A Monday note from Wolfe Research analyst Tim Anderson said many Moderna and Merck stakeholders remain “cautiously optimistic at best” about the possibilities of the cancer vaccine-Keytruda combination.

He said expectations for the treatment combination were quite high at the start of the weekend, but noted there are still many skeptics about cancer vaccines due to a “long history of failures in this area”.

Wells Fargo analyst Mohit Bansal also expressed “cautious optimism” about the treatment combination. In a Sunday note, Bansal pointed to “trial imbalances” that may have led to more favorable outcomes for the personalized cancer vaccine.

He said these imbalances warrant waiting for more data on the treatment.

Washington Commanders sale: What are the top priorities for the new owners?

Shortly after buying the NHL’s New Jersey Devils, Josh Harris explained why.

“What we’ve done is buy good companies,” he told NJ.com in 2014, “good franchises that have a reason to exist but for whatever reason may have some financial difficulties or need some new leadership.”

Harris is doing the same thing by reaching an agreement to purchase the Washington Commanders, once a marquee NFL franchise but one that fell on hard times during Dan Snyder’s ownership.

The deal, with a sale price of $6.05 billion, must be finalized — another bidder, possibly Canadian billionaire Steve Apostolopoulos, could conceivably make another offer — then approved by the NFL. The earliest Harris could be voted in would be at the league meetings May 22-24 in Minneapolis.

If Harris indeed wins the bidding, he would then inherit a franchise that went 164-220-2 in Snyder’s 24 seasons; only five teams had worse records during that period.

But Harris will need to revive the organization in more ways than winning. One team employee said workers have been beaten down because of the heavy dose of negative attention over the past several years, in particular, stemming first from an NFL investigation into Washington’s workplace culture followed by a congressional investigation. Multiple state attorneys general are looking into allegations of financial improprieties.

The new ownership group will have a number of priorities, from needing to find a site for a new stadium to winning back fans to eventually deciding on an organizational power structure: Will the Commanders move forward with team president Jason Wright, general manager Martin Mayhew and coach Ron Rivera? If handled right, Washington’s franchise could be reinvigorated.

Harris also owns the NBA’s Philadelphia 76ers, who, after six consecutive losing campaigns, have been one of the top teams in the Eastern Conference over the past six seasons. The Devils, after lean years, had the second-best regular-season record in the Eastern Conference this season.

At the NFL annual meeting in Phoenix last month, Wright said, “There’s nothing but upside on the other side of this, and that’s important.”

Commanders coach Ron Rivera is in the fourth year of a five-year contract with Washington. Patrick Semansky

Decide on the coach and the general manager

It’s too late for the Commanders to change the power structure on the football side for the 2023 season. Rivera is the coach, Mayhew is the general manager, and it’s not expected that will change, team and league sources said.

However, Rivera initially signed a five-year contract, and he is entering Year 4. It is a pivotal year regardless of the ownership change.

Considering Rivera is the main decision-maker on the football side — he hired Mayhew and assistant general manager Marty Hurney in 2021 — the new owners will decide whether they want to keep Rivera beyond 2023. It stands to reason any determination on Rivera impacts his GM’s future, as well.

The Denver Broncos represent an example of an ownership group that made a change at coach one season after buying the team and assessing the situation. The Broncos fired coach Nathaniel Hackett after a 5-12 season in 2022 but retained general manager George Paton.

Washington’s new owners need to be judicious in making sure the organization is set up properly for the future. It is possible they want a fresh start, and Rivera couldn’t do much to change such a desire. In three seasons under Rivera, Washington has a 22-27-1 record with one division title. The Commanders were 8-8-1 last season and finished last in a loaded NFC East.

Keep in mind: None of the seven coaches hired by Dan Snyder finished with a winning record. Of that group, Rivera has had the most power, being allowed to select his own front office.

Only one coach, Jay Gruden, lasted more than four seasons under Snyder. And Gruden was the lone coach to receive an extension. Gruden, who had a 35-49-1 record in five-plus seasons with the team, was fired after an 0-5 start to the 2019 season. He is the only coach under Snyder who had consecutive winning campaigns.

Rivera has been through this, having experienced an ownership change when he was the coach of the Carolina Panthers. Rivera had been more successful with the Panthers than he has been with the Commanders, guiding Carolina to a 76-63-1 regular-season record, a Super Bowl appearance during the 2015 campaign and an 11-5 record two years later. David Tepper bought the team in 2018, and following a 5-7 start, he fired Rivera during his ninth season with Carolina.

Rivera learned from the experience, saying he wishes he had explained the rationale for roster decisions better and treated his sessions with Tepper like an extended job interview.

“It’s all about presentation and what you’re doing going forward and understanding the reasons why you did things,” Rivera said last month in Phoenix. “I don’t want anyone to think I’m doing anything because I’m desperate; I’m doing things because this is the right way. If they’re not happy and want to let me go? Great. But … I’m not worried about what the decision is after a year. What I’m worried about is making sure this roster is being built, that it’s in place, and if I leave, I’ll leave it in a good position.”

Television ratings and game-day attendance have declined in Washington over the past decade. AP Photo/Patrick Semansky

Win back the fans

Washington has played in eight playoff games since the 1993 season, last winning one in 2005. The once-proud franchise won three Super Bowls in a decade, reached five from the 1972 season to the 1991 campaign and went 18-10 in the postseason during that stretch.

The fan base has seen more name changes in the past five years (two) than playoff games (one) — not to mention the numerous investigations and off-field issues.

“The first thing is winning,” Wright said. “They can be back again; these fans want us to do well.”

Winning will help, but it won’t cure all. A top goal needs to be regaining the fans’ trust.

“Trust is the perfect word,” said Grant Paulsen, a Washington, D.C., sports radio host. “There’s not a lot of trust in the football operation and not trust in the business operation, and some of it is not fair. … That’s a reality they deal with, and it won’t change the day Snyder is out. But they will get more benefit of the doubt. It will take time. They have to prove the witch is dead.”

The team’s local TV ratings steadily declined over the past decade. According to the Sports Business Journal, Washington ranked 24th in the league in 2021 with an average local rating of 16.64. In 2012, Robert Griffin III’s rookie year, the franchise ranked 16th at 27.5.

Team attendance has continued to slide, as well. In 2019, Washington ranked 30th in percentage of seats sold and 20th in average attendance per game. Last season, the Commanders ranked last in both. The last time Washington finished higher than 20th in percentage of seats sold was 2007 (second), though with a seating capacity that once topped 90,000, it ranked among the top five in attendance from 2006 to 2014.

Longtime D.C. sports talk show host Kevin Sheehan said it’ll be hard to know how many fans will return with new ownership.

“Five years ago, if you said Dan was leaving, that was the only thing that mattered and everyone would have been back,” Sheehan said. “But the last five years, this acceleration of shenanigans and reporting and investigations and [congressional] oversight committees and lawsuits, there has been a further wearing down and erosion of people ready to jump back on board.”

Paulsen said the Commanders can’t take the fan base — or the business it creates — for granted. The new ownership group must be proactive in winning fans back.

“They have lived off their name for so long that they stopped growing their brand or feeling like [they] had to satisfy the base,” Paulsen said. “They have to view themselves as needing the business.”

Washington has been looking for a new stadium site for several years. For new owners, a stadium would help boost the franchise value, giving them a return on their investment. AP Photo/Susan Walsh, File

Build a stadium

The Washington franchise has been looking for a new stadium site for several years in Maryland, Virginia and the District of Columbia, increasing the intensity of the pursuit over the past two years, in particular. For new owners, a stadium would help boost the franchise value and, therefore, give them a return on their investment.

“It would increase it dramatically,” said Marc Ganis, a sports marketing expert and president of SportsCorp Ltd., a sports financing consulting firm. “It would also be a new start for the team, which they really do need.”

A stadium would have been a boon for Snyder too.

Among the places considered: next door to the Commanders’ current site; 15 miles away in Oxon Hill, Maryland, on federal land and across from the MGM Casino and National Harbor area; their old home where RFK Stadium still needs to be torn down; and multiple sites in Virginia, including one approximately 10 miles from their current practice facility.

Multiple sources said Snyder was a hindrance in these talks, that numerous politicians did not want to do business with him or the team until the investigations ended.

“The owner definitely made it a nonstarter,” said one person with knowledge of the discussions in Maryland.

The same source said the hard part for a while was not knowing who to deal with from the franchise; until perhaps within the last year, it could be one of three different groups within the organization, including Snyder. Eventually, it was Wright and vice president of public affairs Joe Maloney. Nonetheless, the source contrasted that with the Baltimore Ravens, who used the same person when talking about stadium issues or needs.

Under Snyder, Washington wanted to move into a new stadium by the 2028 season. The current contract at FedEx Field ends early in the 2027 season, though it can be renewed to extend the stay.

Former owner Jack Kent Cooke paid for FedEx Field in Landover, Maryland, with his own money, but it was a poor sequel to RFK Stadium, the team’s home until 1997. RFK Stadium, which was in bad shape when the team left, was considered more intimate, with a seating capacity of 56,692 as compared to FedEx Field’s 78,270, which eventually grew to 90,000. RFK Stadium was accessible by the Washington Metro system; the closest such stop to FedEx Field is approximately one mile away. And because the main way to access FedEx Field is off Interstate 495, traffic congestion was an early issue for fans.

Current executives say the organization did not invest enough in maintaining FedEx Field, leading to failing pipes — that sometimes doused fans with water — and other problems. Players have complained there is no place to meet with their families after games nor is there any day care for their kids — as other franchises offer on game day.

In August, USA Today ranked FedEx Field as the NFL’s worst stadium.

“A new venue changes the fortune of a franchise,” Wright said, “in the revenue growth, but also in the fan experience. We definitely have to have the vision of a new ownership team. However, all the research and work done to date is additive; it’s not wasted. It’s all beneficial and will allow us to move quickly.”

The new owners must determine where they want to build: D.C., Maryland or Virginia. The RFK Stadium site remains a fan favorite. Three Super Bowl winners played at RFK Stadium, enabling fans to maintain a connection to the past. A source familiar with the stadium talks said people who represent Harris’ group have reached out to local politicians to assess the situation.

D.C. Mayor Muriel Bowser has said multiple times over the past year — and beyond — she wants the team back in the city. In the fall, she told reporters that ownership, meaning Snyder, was an obstacle.

However, because that land is owned by the federal government, the government would first have to either sell or lease it to the District of Columbia. Then the D.C. Council would have to approve any plans. Another fear among some team officials is that some D.C. residents would oppose a stadium because they don’t want it publicly financed, desire different projects, or worry about increased traffic.

With Snyder, those involved in the talks say, the team had no chance to build at the RFK Stadium site. But even without him, they say it will be difficult.

“You have to let [the fans] know you tried everything in the District,” one person involved in the situation said. “They still have to do the work; they owe it to the fans.”

It’s possible, one person involved in the stadium talks said, that the new venue could be located in one state — or district — with the practice facility remaining in Virginia, whether near the current one in Loudoun County or elsewhere.

The Commanders’ practice facility, built in 1992, is considered outdated, in part because it is small and would be expensive to expand.

In May 2022, the organization acquired the rights to purchase 200 acres of property in Virginia’s Prince William County; but multiple sources say if the team builds a stadium in Virginia, it would be in Loudoun, a growing county that has the nation’s highest per-household income.

Maryland also has options, including the Commanders’ current stadium site in Landover. The state has committed $400 million to redevelop the area around FedEx Field. Another site that has been mentioned sits on federal government land.

Matt Rogers, chief of staff for Loudoun County Supervisor Phyllis Randall, said “there’s no question” having Snyder out of the picture will help the process.

“There is a spring of opportunity,” he said. “We needed to have a clean slate here. We would have been further along had this not been the case for some time.”

According to sources, Dan and Tanya Snyder have reached an agreement to sell the Commanders for $6.05 billion. Geoff Burke-USA TODAY Sports

Establish a philosophy

When Snyder took over in 1999, he set an immediate tone for his ownership by firing dozens of employees within a couple of months and, one former member of the organization said, imploring the front office to chase big-name players, whether it was realistic or not. As one former employee said early in Snyder's tenure, his group operated under a "fire, aim, ready" philosophy.

“He was a fan,” the former team employee said. “I would think a guy [Harris] who owns the Sixers and Devils would be smarter than that. Dan had a million people in his ear, and he didn’t know who to believe. A guy that’s been an owner? That won’t be a problem.”

Others who know Snyder well say he also hired, and trusted, the wrong people. Both things set the tone for his tenure as owner.

That's why the first priority for Harris, if the deal indeed goes through, would be to establish a culture that sets the organization up for success. One former member of the front office said the Commanders need a clear philosophy when it comes to organizational structure, something the source said the team lacked under Snyder.

Harris said in a 2015 article in Bethesda Magazine, “Your most important hire as an owner is your general manager and then your coach.” In a later interview with NJ.com after buying the Devils, Harris said it was important to “provide the resources and hold them accountable.”

Harris also doesn't have the reputation of a meddler, another label Snyder wore throughout his tenure. Snyder went years without taking questions at a news conference; he last answered some at a 2014 ceremony. Harris, meanwhile, has typically addressed the media a couple of times a year, with other periodic interviews during a season.

“It’s picking the right people and developing a team,” former Washington coach Joe Gibbs once told ESPN. “That doesn’t change.”

It's what helped Washington be successful under late owner Jack Kent Cooke, who oversaw three Super Bowl champions. Many employees were entrenched in the organization for decades.

One former Washington assistant coach said when his staff started, the members were struck by the unhappiness of many in the building. Scouts valued by the franchise left because of the low pay. Multiple team sources over the past several years said internal surveys conducted by teams showed Washington's support staff workers ranked at or near the bottom of the pay scale compared with other teams, despite being in an expensive market. In January, Consumer Affairs ranked Washington, D.C., as the seventh-most expensive place to live in the United States, based on housing costs. Multiple D.C. suburbs in both Virginia and Maryland have been on previous lists.

The malaise wasn’t just about Snyder. Bruce Allen, the team president from 2010 to 2019, was described as “stingy” by a former front-office employee.

That has improved over the past three years, current employees say, but more work remains.

“The first thing is understanding what we need to do to fuel a championship,” Wright said. “It has to be about winning. The best thing for us to do is listen to ownership, understand why they bought the team, what is their aspiration for this, so we can align the way we work.”

The Commanders like what little they’ve seen from Sam Howell but haven’t ruled out drafting a QB later this month. AP Photo/Alex Brandon

Find a quarterback

Washington’s staff has expressed excitement about quarterback Sam Howell starting in 2023, but the Commanders haven’t shut the door on drafting a signal-caller. Tennessee quarterback Hendon Hooker is scheduled for a visit.

Maybe Howell will be the guy. The Commanders would celebrate if that happened. But he is a fifth-round pick who has attempted 19 career NFL passes. There’s a lot to learn. This season will dictate whether he can be the solution. If not, they have to keep trying until they get it right.

A big part of Washington's problems over the past 25 years, predating Snyder, stems from the quarterback position. The franchise has started 34 quarterbacks since it last won the Super Bowl at the conclusion of its 1991 campaign. In Rivera's first three seasons, eight quarterbacks started at least one game. A different quarterback has started each of the past five season openers. This year will make it six.

The team has drafted four quarterbacks in the first round since 2002: Patrick Ramsey (2002), Jason Campbell (2005), Griffin (2012) and the late Dwayne Haskins (2019). None became the long-term starter. It has traded for veterans such as Mark Brunell, Donovan McNabb, Alex Smith and Carson Wentz. None was the answer.

Washington ranks 24th in the NFL in total QBR since 2000, 29th over the past 10 years and 31st over the past five seasons.

Too often, according to multiple former and current team sources, Snyder inserted himself into quarterback decisions. Whether pushing for a trade, free agent signing or draft pick, his fingerprints were often on the moves — sometimes going against the recommendations of football decision-makers.

“The owner needs to immediately sort of communicate, ‘I’m not the football person here. I will hire the best possible people and give them total autonomy to build a winner for all of you and me,'” Sheehan said.

In part because that didn’t happen in the past, Washington’s quest for a quarterback hasn’t ended. During the fall, Rivera caused a minor firestorm when asked the difference between the Commanders and other teams in the division. He answered with one word: quarterback. He meant that the other teams had more stable quarterback situations.

If the Commanders ever find that, it would help any new owner.

“The truth is, this is a quarterback-driven league,” Rivera said in October. “The teams that have been able to sustain success, they’ve been able to build it around a specific quarterback.”

Lil Durk Donates $350,000 in Scholarships to Howard University

Lil Durk has announced the development of the Durk Banks Endowment Fund in partnership with Amazon Music. In addition, the rapper is providing two Chicago students with a $50,000 scholarship to attend Howard University.

Photo Credit: Courtesy of the Durk Banks Endowment Fund

The first two recipients of this grant will be announced today during Lil Durk’s performance at Howard University’s Springfest 2023.

The two students were selected from 20 participants in his Neighborhood Heroes HBCU College & Career Readiness Cohort Program.

Lil Durk donates $250,000 to Howard’s GRACE Grant

In addition to the two individual $50,000 grants, Lil Durk will also donate $250,000 to Howard’s GRACE Grant, a program created to help students who need tuition assistance, according to the university.

The grant was established by Howard University President Wayne A. I. Frederick in 2014 to target students with the greatest financial hardship who wanted to stay in school and graduate with their class.

Since the inception of the GRACE Grant, recipients have seen a 15 percent increase in student retention and a 78 percent four-year graduation rate, 32 percentage points higher than that of students who did not receive funding from the program.

I have received the GRACE grant in my last 2 years with Howard and it has been a BLESSING! https://t.co/9P0XSU03Sw

– Key. (@keptlikeki) October 13, 2021

Lil Durk continues to give back through his Neighborhood Heroes Foundation

Lil Durk is no stranger to giving back and holding it down for his community! In 2022, he announced the launch of his Career Readiness program through his Neighborhood Heroes Foundation. The aim was to change the lives of 20 students.

Through this program, students took an HBCU college tour, visiting institutions in Atlanta and Alabama. Some were also invited to New York City to meet executives from Sony, Alamo Records and the New York Knicks and to spend time alongside veteran professionals.

One of the students, Semaje Vaughn, gave Durk his flowers. Vaughn said he had no hope for his future until this program.

“I didn’t believe in anything, but having opportunities like this is a huge boost from my previous position.”

Lil Durk and Neighborhood Heroes gave away nearly 30,000 bottles of hand sanitizer to Illinois inmates and staff

If that's not enough, Lil Durk also partnered with Chicago Votes through his Neighborhood Heroes Foundation. The two organizations distributed 29,000 bottles of hand sanitizer to the Illinois Department of Corrections (IDOC), as previously reported by The Shade Room. The distribution helped inmates and facility staff keep their hands clean as COVID-19 rates rose again.

The donation followed a report of an acute water crisis at one of Illinois' largest prisons, a problem that had reportedly been going on for months.

If you know Durk, you know that giving back is in his blood, from supplying Chicago’s essential frontline workers during COVID to providing school supplies to elementary school kids. He’s definitely a valued voice of this generation.