eleitl

joined 4 years ago
[–] eleitl@lemmy.ml 8 points 4 months ago

No, that captures just the neuroanatomy. Not properties like the density and type of ion channels, the weight of each synapse, and all the things we don't know yet.

[–] eleitl@lemmy.ml 22 points 4 months ago (2 children)

You seem to trust Nvidia. I don't.

[–] eleitl@lemmy.ml 20 points 4 months ago

They write

"Of course, AMD is trying to get into the AI training and inferencing game itself with the Instinct MI300 chip. And that, perhaps, is the main if modest cause for hope. If AMD can gain some traction in that huge market, it will not only be making lots of money, it will be in a position to do a similar thing to Nvidia and push some of that technology across into its gaming GPUs."

which strikes me as incorrect: AMD's Instinct MI series is already widespread in HPC. With margins lower in the consumer market, it makes sense for AMD to focus on HPC.

[–] eleitl@lemmy.ml 3 points 4 months ago

I would look into thin clients and used Lenovo (and similar) Tiny PCs for office use on eBay. I run old low-power, low-noise rackmount Supermicros, which are nice but hard to find at low prices.

[–] eleitl@lemmy.ml 2 points 4 months ago

Nicely put. I fully agree with this description.

[–] eleitl@lemmy.ml 1 point 4 months ago (3 children)

Yes, orders of magnitude, but not too many of them. The real estate of a 300 mm wafer is limited, feature-size shrink is saturating, and you can't stack too many layers. You still need a packet-switched network on the wafer even if the rest is mostly analog. Perhaps spintronics can limit the power requirements too.
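For a sense of how limited that wafer real estate is, here is a minimal sketch using the standard first-order dies-per-wafer estimate; the 600 mm² die size is a hypothetical figure for a large accelerator-class die, not a number from the discussion:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order gross-die estimate for a circular wafer:
    wafer area / die area, minus an edge-loss correction term."""
    d = wafer_diameter_mm
    s = die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

# A 300 mm wafer holds fewer than a hundred large (~600 mm^2) dies:
print(dies_per_wafer(300, 600))  # 90
```

The correction term accounts for partial dies lost at the wafer edge; real yields are lower still once defects are factored in.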

[–] eleitl@lemmy.ml 1 point 4 months ago (2 children)

I think we are already at the catabolic collapse stage, where the elites cannibalize the periphery before the heartland. On the other hand, the US itself has also been stripped bare in patches for a while now. So it is a simultaneous, progressive process.

[–] eleitl@lemmy.ml 2 points 4 months ago (1 children)

And these power lines into the EU are very easy to disrupt, and to disrupt again after repairs.

[–] eleitl@lemmy.ml 5 points 4 months ago (2 children)

Factor power bills, heat and noise into your calculations.
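A minimal sketch of the power-bill part of that calculation; the electricity tariff is a placeholder, so substitute your own:

```python
def annual_running_cost(watts: float, eur_per_kwh: float = 0.35) -> float:
    """Electricity cost of running a box 24/7 for a year.
    The default tariff is a hypothetical European rate."""
    return watts / 1000 * 24 * 365 * eur_per_kwh

# An old 150 W rackmount server vs. a 15 W tiny PC, run around the clock:
print(round(annual_running_cost(150)))  # 460 (EUR/year)
print(round(annual_running_cost(15)))   # 46 (EUR/year)
```

Over a few years, the difference in running cost can exceed the purchase price of the hardware, which is why idle power draw matters more than sticker price for always-on homelab gear.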

[–] eleitl@lemmy.ml 8 points 4 months ago

"Secure" on its own has no meaning.

16
submitted 4 months ago* (last edited 4 months ago) by eleitl@lemmy.ml to c/collapse@lemmy.ml
 

Abstract

Tropical Cyclones (TCs) inflict substantial coastal damage, making it pertinent to understand changing storm characteristics in the important nearshore region. Past work examined several aspects of TCs relevant for impacts in coastal regions. However, few studies explored nearshore storm intensification and its response to climate change at the global scale. Here, we address this using a suite of observations and numerical model simulations. Over the historical period 1979–2020, observations reveal a global mean TC intensification rate increase of about 3 kt per 24 hr in regions close to the coast. Analysis of the observed large-scale environment shows that stronger decreases in vertical wind shear and larger increases in relative humidity relative to the open oceans are responsible. Further, high-resolution climate model simulations suggest that nearshore TC intensification will continue to rise under global warming. Idealized numerical experiments with an intermediate complexity model reveal that decreasing shear near coastlines, driven by amplified warming in the upper troposphere and changes in heating patterns, is the major pathway for these projected increases in nearshore TC intensification.
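The intensification rate in the abstract is a 24-hour change in maximum sustained wind. A minimal sketch with made-up 6-hourly track data, not the study's actual pipeline:

```python
def intensification_rates(winds_kt: list[float], step_h: int = 6) -> list[float]:
    """24-hour changes in maximum sustained wind (kt) from a
    6-hourly best-track-style series."""
    lag = 24 // step_h  # number of records spanning 24 h
    return [winds_kt[i + lag] - winds_kt[i] for i in range(len(winds_kt) - lag)]

# Hypothetical 6-hourly intensities for a storm approaching the coast:
track = [45, 50, 60, 70, 85, 95, 100, 100]
print(intensification_rates(track))  # [40, 45, 40, 30]
```

Against rates like these, the reported historical signal of about 3 kt per 24 hr is a shift in the mean, not in any single storm's behaviour.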

 

Abstract

Large pulses of tree mortality have ushered in a major reorganization of Europe’s forest ecosystems. To initiate a robust next generation of trees, the species that are planted today need to be climatically suitable throughout the entire twenty-first century. Here we developed species distribution models for 69 European tree species based on occurrence data from 238,080 plot locations to investigate the option space for current forest management in Europe. We show that the average pool of tree species continuously suitable throughout the century is smaller than that under current and end-of-century climate conditions, creating a tree species bottleneck for current management. If the need for continuous climate suitability throughout the lifespan of a tree planted today is considered, climate change shrinks the tree species pool available to management by between 33% and 49% of its current values (40% and 54% of potential end-of-century values), under moderate (Representative Concentration Pathway 2.6) and severe (Representative Concentration Pathway 8.5) climate change, respectively. This bottleneck could have strong negative impacts on timber production, carbon storage and biodiversity conservation, as only 3.18, 3.53 and 2.56 species of high potential for providing these functions remain suitable throughout the century on average per square kilometre in Europe. Our results indicate that the option space for silviculture is narrowing substantially because of climate change and that an important adaptation strategy in forestry—creating mixed forests—might be curtailed by widespread losses of climatically suitable tree species.
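The "continuously suitable" criterion amounts to a set intersection across climate snapshots. A toy sketch with hypothetical species sets for a single grid cell (the species names and counts are illustrative, not the paper's data):

```python
# Species climatically suitable in one cell under three climate snapshots:
current = {"beech", "oak", "spruce", "pine", "fir"}
mid_century = {"beech", "oak", "pine", "fir"}
end_century = {"oak", "pine", "douglas_fir"}

# The bottleneck: only species suitable in ALL snapshots can be
# planted today and still fit the climate at harvest age.
continuously_suitable = current & mid_century & end_century
print(sorted(continuously_suitable))  # ['oak', 'pine']

shrinkage = 1 - len(continuously_suitable) / len(current)
print(f"{shrinkage:.0%}")  # 60%
```

The intersection is necessarily no larger than the smallest snapshot's pool, which is why the continuously suitable set is smaller than both the current and the end-of-century pools.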

 

Abstract

Mammals have dominated Earth for approximately 55 Myr thanks to their adaptations and resilience to warming and cooling during the Cenozoic. All life will eventually perish in a runaway greenhouse once absorbed solar radiation exceeds the emission of thermal radiation in several billions of years. However, conditions rendering the Earth naturally inhospitable to mammals may develop sooner because of long-term processes linked to plate tectonics (short-term perturbations are not considered here). In ~250 Myr, all continents will converge to form Earth’s next supercontinent, Pangea Ultima. A natural consequence of the creation and decay of Pangea Ultima will be extremes in CO2 due to changes in volcanic rifting and outgassing. Here we show that increased CO2, solar energy (F⨀; approximately +2.5% W m−2 greater than today) and continentality (larger range in temperatures away from the ocean) lead to increasing warming hostile to mammalian life. We assess their impact on mammalian physiological limits (dry bulb, wet bulb and Humidex heat stress indicators) as well as a planetary habitability index. Given mammals’ continued survival, predicted background CO2 levels of 410–816 ppm combined with increased F⨀ will probably lead to a climate tipping point and their mass extinction. The results also highlight how global landmass configuration, CO2 and F⨀ play a critical role in planetary habitability.
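Of the heat stress indicators mentioned, wet-bulb temperature can be estimated from air temperature and relative humidity. A sketch using Stull's (2011) empirical approximation, which is one common formula and not necessarily the study's exact method:

```python
import math

def wet_bulb_stull(temp_c: float, rh_pct: float) -> float:
    """Stull (2011) empirical wet-bulb approximation, valid roughly
    for RH 5-99% and T -20..50 C at standard sea-level pressure."""
    t, rh = temp_c, rh_pct
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# Stull's own worked example: 20 C at 50% RH gives about 13.7 C wet bulb.
print(round(wet_bulb_stull(20, 50), 1))  # 13.7
```

Sustained wet-bulb temperatures approaching the mid-30s °C are widely cited as the physiological limit for mammals, which is why this indicator features in habitability assessments.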

 

Abstract

Global aviation emissions have been growing despite international efforts to limit climate change. Quantifying the status quo of domestic and international aviation emissions is necessary for establishing an understanding of current emissions and their mitigation. Yet, a majority of the United Nations Framework Convention on Climate Change (UNFCCC)-ratifying parties have infrequently disclosed aviation emissions within the international framework, if at all. Here, we present a set of national aviation emission and fuel burn inventories for all 197 individual parties, as calculated by the high-resolution Aviation Transport Emissions Assessment Model (AviTeam). In addition to CO2 emissions, the AviTeam model calculates pollutant emissions, including NOx, SOx, unburnt hydrocarbons, black carbon, and organic carbon. Emission inventories are created in aggregated and gridded format and rely on Automatic Dependent Surveillance–Broadcast combined with schedule data. The cumulative global fuel burn is estimated at 291 Tg for the year 2019. This corresponds to CO2 emissions of 920 Tg, with 306 Tg originating from domestic aviation. We present emissions from 151 countries that have yet to report their emissions for 2019, which sum to 417 TgCO2. The improved availability of national emissions data facilitated by this inventory could support mitigation efforts in developed and developing countries and shows that such tools could bolster sector reporting to the UNFCCC.
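The fuel-to-CO2 ratio implied by the abstract matches the standard jet-fuel emission index of roughly 3.16 kg CO2 per kg of fuel burned. A quick consistency check on the reported figures:

```python
# Standard emission index for Jet A-1 combustion:
EI_CO2 = 3.16  # kg CO2 emitted per kg fuel burned

fuel_burn_tg = 291          # global fuel burn, 2019 (from the abstract)
co2_tg = fuel_burn_tg * EI_CO2
print(round(co2_tg))        # 920 -- matches the reported 920 Tg CO2
```

The factor follows from the stoichiometry of burning kerosene: each kilogram of fuel carries about 0.86 kg of carbon, which oxidizes to about 3.16 kg of CO2.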

11
submitted 4 months ago* (last edited 4 months ago) by eleitl@lemmy.ml to c/collapse@lemmy.ml
 

A power substation near a data center in Ashburn, Virginia. Photographer: Nathan Howard/Bloomberg By Josh Saul

May 2, 2024 at 5:10 PM UTC

Data center developers in Northern Virginia are asking utility Dominion Energy Inc. for as much power as several nuclear reactors can generate, in the latest sign of how artificial intelligence is helping drive up electricity demand. Dominion regularly fields requests from developers whose planned data center campuses need as much as “several gigawatts” of electricity, Chief Executive Officer Bob Blue said Thursday during the company’s first-quarter earnings call. A gigawatt is roughly the output of a nuclear reactor and can power 750,000 homes.

Electric utilities are facing the biggest demand jump in a generation. Along with data centers to run AI computing, America’s grid is being strained by new factories and the electrification of everything from cars to home heating. The surge in demand is complicating utility efforts to turn off carbon-emitting power plants and meet their climate goals.

Over the past five years, Dominion has connected 94 data centers that, together, consume about four gigawatts of electricity, Blue said. That means that just two or three of the data center campuses now being planned could require as much electricity as all the centers Dominion hooked up since about 2019.
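A back-of-envelope check on the article's numbers; reading "several gigawatts" as 2 GW per planned campus is an assumption, since the article gives no exact figure:

```python
# Figures from the article:
homes_per_gw = 750_000
connected_centers = 94      # data centers Dominion connected over five years
total_gw_connected = 4.0    # their combined draw, in gigawatts

# Average draw of an existing data center:
avg_mw_per_center = total_gw_connected * 1000 / connected_centers
print(round(avg_mw_per_center))  # 43 (MW per existing center)

# Two hypothetical 2 GW campuses already match everything
# Dominion has connected since about 2019:
print(2 * 2.0 >= total_gw_connected)  # True
```

The contrast is the point of the article: a single planned campus at gigawatt scale draws tens of times what a typical existing data center does.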

8
A world without growth (consciousnessofsheep.co.uk)
4
Playing seesaw (consciousnessofsheep.co.uk)